Human Generated Data

Title

Untitled (women and nuns seated at banquet table)

Date

1939

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4089

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 99.7
Human 99.7
Person 98.9
Helmet 98.5
Clothing 98.5
Apparel 98.5
Helmet 98.4
Person 98.3
Person 96.8
Person 96.3
Person 92.9
Person 88.5
Helmet 87.1
Sailor Suit 84.6
Face 78.3
Accessory 78.3
Accessories 78.3
Sunglasses 78.3
Person 78.1
Food 74.5
Meal 74.5
Chef 72.5
Hat 67.8
Hat 65.8
Photography 65.1
Portrait 65.1
Photo 65.1
Dessert 62.4
Creme 62.4
Icing 62.4
Cream 62.4
Cake 62.4
Hat 61.4
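
The label/confidence pairs above are the shape of output returned by a label-detection API such as AWS Rekognition's `detect_labels`. As a minimal illustrative sketch (the helper name and the sample response are hypothetical, not the actual API output for this photograph), flattening such a response into the lines shown above might look like:

```python
# Hypothetical sketch: flattening a detect_labels-style response into
# "Name confidence" lines like those listed above. The sample response
# is illustrative only.

def format_labels(response, min_confidence=60.0):
    """Return 'Name score' lines for labels at or above min_confidence."""
    lines = []
    for label in response.get("Labels", []):
        conf = label["Confidence"]
        if conf >= min_confidence:
            lines.append(f"{label['Name']} {conf:.1f}")
    return lines

# Sample values copied from the tag list above.
sample = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.7},
        {"Name": "Helmet", "Confidence": 98.5},
        {"Name": "Hat", "Confidence": 61.4},
    ]
}

print("\n".join(format_labels(sample)))
```

With a live call, the response would come from `boto3`'s Rekognition client (`detect_labels`) rather than the hard-coded sample.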

Clarifai
created on 2019-06-01

people 99.7
group 98.7
adult 98.2
man 97.2
group together 95.5
woman 93.6
uniform 93
military 91.7
wear 87.7
medical practitioner 86.9
leader 86.7
administration 86
several 85.5
veil 85.4
war 84.9
child 82.3
outfit 81.3
hospital 80.8
many 78.2
facial expression 77.7

Imagga
created on 2019-06-01

nurse 47.9
man 46.3
surgeon 35.5
male 34.7
person 32.7
people 29.5
patient 26.5
medical 25.6
work 24.3
adult 24
doctor 23.5
professional 21.8
worker 21.4
men 20.6
hospital 19.4
specialist 18.6
room 18
office 17
job 16.8
working 16.8
home 16.7
indoors 16.7
health 16.7
team 16.1
medicine 15.8
occupation 15.6
coat 15.2
sitting 14.6
smiling 14.5
30s 14.4
clinic 13.9
happy 13.8
business 13.4
businessman 13.2
senior 13.1
case 12.7
meeting 12.2
teamwork 12
student 11.7
lab coat 11.6
education 11.2
industry 11.1
laptop 11.1
portrait 11
surgery 10.7
care 10.7
desk 10.5
table 10.5
businesspeople 10.4
clothing 10.4
day 10.2
uniform 9.9
sick person 9.8
equipment 9.8
scientist 9.8
interior 9.7
lab 9.7
laboratory 9.6
test 9.6
looking 9.6
couple 9.6
illness 9.5
women 9.5
clothes 9.4
casual 9.3
bright 9.3
20s 9.2
hand 9.1
confident 9.1
businesswoman 9.1
operation 8.9
to 8.8
half length 8.8
40s 8.8
colleagues 8.7
chemistry 8.7
mid adult 8.7
daytime 8.7
iron lung 8.7
serious 8.6
research 8.6
adults 8.5
horizontal 8.4
human 8.2
group 8.1
computer 8
science 8
together 7.9
happiness 7.8
days 7.8
exam 7.7
elderly 7.7
profession 7.7
talking 7.6
biology 7.6
instrument 7.6
classroom 7.5
color 7.2
kitchen 7.2
smile 7.1
face 7.1
look 7

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

person 95.2
clothing 89.6
black and white 81.7
man 72.7
human face 53.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 45-63
Gender Male, 95.5%
Happy 0.7%
Confused 0.9%
Disgusted 0.7%
Sad 1.8%
Calm 93.7%
Angry 1%
Surprised 1.3%

AWS Rekognition

Age 23-38
Gender Female, 50.8%
Disgusted 3%
Surprised 8.1%
Angry 3.9%
Confused 3.7%
Sad 6.8%
Calm 63.4%
Happy 11%

AWS Rekognition

Age 26-43
Gender Female, 53.8%
Sad 50.6%
Happy 45.1%
Surprised 45.2%
Calm 48.5%
Disgusted 45%
Confused 45.4%
Angry 45.3%

AWS Rekognition

Age 27-44
Gender Male, 98.8%
Confused 1.9%
Surprised 2.2%
Happy 4.3%
Calm 86.1%
Sad 3%
Disgusted 1.3%
Angry 1.3%
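
Each face block above lists per-emotion confidence scores, as returned by a face-analysis API such as AWS Rekognition's `detect_faces`. A minimal sketch of reducing such a list to its dominant emotion (the helper name is hypothetical; the sample copies the first face's scores and is not a live API response):

```python
# Hypothetical sketch: picking the dominant emotion from a
# detect_faces-style emotion list. Sample scores are copied from the
# first face block above.

def dominant_emotion(face):
    """Return (type, confidence) for the highest-scoring emotion."""
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

sample_face = {
    "Emotions": [
        {"Type": "HAPPY", "Confidence": 0.7},
        {"Type": "CONFUSED", "Confidence": 0.9},
        {"Type": "DISGUSTED", "Confidence": 0.7},
        {"Type": "SAD", "Confidence": 1.8},
        {"Type": "CALM", "Confidence": 93.7},
        {"Type": "ANGRY", "Confidence": 1.0},
        {"Type": "SURPRISED", "Confidence": 1.3},
    ]
}

print(dominant_emotion(sample_face))
```

For the first face above, this yields Calm as the dominant emotion, matching its 93.7% score.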

Feature analysis

Amazon

Person 99.7%
Helmet 98.5%
Sunglasses 78.3%
Hat 67.8%

Categories

Imagga

people portraits 81.9%
events parties 17.1%