Human Generated Data

Title

Untitled (two photographs: men playing cards at table; cars parked in lot)

Date

c. 1945, printed later

People

Artist: Harry Annas, American, 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6786

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 99.6
Person 99.6
Person 99.6
Poster 99.5
Advertisement 99.5
Collage 99.5
Person 99.4
Person 98.9
Person 98.7
Person 98.4
Person 96.9
Person 95.1
Nature 91.3
Person 89.5
Outdoors 89
Building 85.6
Person 84.3
Countryside 82.2
Person 80.9
Hut 75
Rural 75
Shack 73.6
Person 65.8
Person 65.6
Dugout 59.3
Prison 59.3
People 58.6
Person 46.9
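
The Amazon tags above are object and scene labels of the kind returned by AWS Rekognition's label-detection API. A minimal sketch of how such labels could be requested with boto3 follows; the image file name, region, and confidence threshold are illustrative assumptions, not values recorded on this page.

    # Minimal sketch: request object/scene labels for a scanned print with
    # AWS Rekognition. The file name and MinConfidence value are assumptions
    # for illustration only.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("annas_untitled_scan.jpg", "rb") as f:  # hypothetical file name
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,
        MinConfidence=40,  # a low threshold keeps weak labels such as "Person 46.9"
    )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')
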

Clarifai
created on 2019-11-16

people 99.6
group 99.3
many 98.7
group together 98.4
man 97.2
adult 97.2
woman 94
room 91.6
child 89
crowd 88.6
several 88.3
movie 88.1
furniture 85
war 82.7
wear 82.5
television 82.5
indoors 81.6
monochrome 81
education 80.7
audience 79.9

Imagga
created on 2019-11-16

barbershop 25.5
shop 22.1
window 21.1
old 19.5
building 18.9
city 18.3
structure 17.3
vehicle 16.9
billboard 16.8
wheeled vehicle 16.8
mercantile establishment 16.8
conveyance 16.7
street 16.6
snow 15.6
architecture 15.6
transportation 15.2
travel 14.8
tramway 14
black 13.8
winter 13.6
car 13.3
train 12.5
signboard 12
transport 11.9
streetcar 11.4
urban 11.4
door 11.3
cold 11.2
place of business 11.2
wall 11.1
house 10.9
public 10.7
people 10
passenger car 9.6
silhouette 9.1
road 9
station 8.8
man 8.7
lamp 8.7
light 8.7
glass 8.6
dirty 8.1
scene 7.8
windows 7.7
sky 7.6
decoration 7.6
sign 7.5
passenger 7.4
equipment 7.4
track 7.3
tourist 7.2
history 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 95.5
clothing 94.9
person 87.3
indoor 85.6
man 74.6
gallery 67.7
people 62.8
old 51
room 44.8

Color analysis

Face analysis

Amazon

AWS Rekognition

Age 50-68
Gender Male, 50.4%
Sad 50%
Angry 49.5%
Calm 49.8%
Happy 49.6%
Surprised 49.5%
Confused 49.5%
Disgusted 49.5%
Fear 49.5%

AWS Rekognition

Age 50-68
Gender Male, 50.5%
Calm 49.8%
Confused 49.5%
Happy 49.9%
Disgusted 49.5%
Surprised 49.5%
Sad 49.6%
Fear 49.5%
Angry 49.6%

AWS Rekognition

Age 20-32
Gender Male, 50.1%
Sad 50%
Surprised 49.5%
Angry 49.5%
Confused 49.5%
Fear 49.9%
Happy 49.5%
Calm 49.6%
Disgusted 49.5%

AWS Rekognition

Age 12-22
Gender Male, 50.2%
Sad 50.1%
Fear 49.5%
Disgusted 49.5%
Angry 49.5%
Happy 49.6%
Confused 49.5%
Calm 49.8%
Surprised 49.5%

AWS Rekognition

Age 26-40
Gender Female, 50.1%
Disgusted 49.5%
Sad 50.3%
Confused 49.5%
Happy 49.5%
Fear 49.5%
Surprised 49.5%
Calm 49.6%
Angry 49.5%

AWS Rekognition

Age 37-55
Gender Male, 50.1%
Disgusted 49.5%
Fear 49.6%
Calm 49.7%
Surprised 49.5%
Angry 49.6%
Confused 49.5%
Happy 49.9%
Sad 49.6%

AWS Rekognition

Age 20-32
Gender Male, 50.3%
Confused 49.5%
Calm 49.5%
Surprised 49.5%
Fear 49.5%
Happy 49.5%
Angry 49.5%
Sad 50.4%
Disgusted 49.5%

AWS Rekognition

Age 32-48
Gender Male, 50.5%
Happy 49.5%
Disgusted 49.5%
Sad 50.2%
Angry 49.5%
Surprised 49.5%
Fear 49.5%
Calm 49.8%
Confused 49.5%

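Each face block above corresponds to one entry in the FaceDetails list that Rekognition's face-detection API returns when all facial attributes are requested. A hedged sketch of that call is below; it reuses the image_bytes variable from the label-detection sketch and simply prints the same fields shown in the blocks (age range, gender, emotion scores).

    # Minimal sketch: age range, gender, and per-emotion confidences for each
    # detected face. Reuses image_bytes from the label-detection sketch above.
    faces = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # required to receive AgeRange, Gender, and Emotions
    )

    for face in faces["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
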
Feature analysis

Amazon

Person 99.6%

Categories

Imagga

paintings art 52.9%
food drinks 44.9%

Text analysis

Amazon

Smpscige
Srpeion

Google

Stupecior Spscier
Stupecior
Spscier
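
The strings listed under Text analysis are raw OCR detections, apparently garbled readings of lettering visible in the photographs, and are reproduced as returned by each service. As one example of how such word-level detections could be produced, a sketch using Rekognition's text-detection call is given below, again reusing image_bytes from the first sketch; the Google results would come from the separate Cloud Vision API, which is not shown here.

    # Minimal sketch: OCR word detections from Rekognition. Only word-level
    # detections are printed, matching the single-word strings listed above.
    text = rekognition.detect_text(Image={"Bytes": image_bytes})

    for detection in text["TextDetections"]:
        if detection["Type"] == "WORD":
            print(detection["DetectedText"])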