Human Generated Data

Title

Untitled (family portrait)

Date

c. 1945

People

Artist: Harry Annas, American, 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2360

Machine Generated Data

Tags

Amazon
created on 2022-01-30

Human 99.7
Person 99.7
Person 99.6
Person 99.5
Person 98.9
Person 94.8
People 94
Apparel 91.8
Clothing 91.8
Face 85.3
Accessories 85
Sunglasses 85
Accessory 85
Family 79.8
Person 77.7
Sleeve 75.6
Female 65.7
Long Sleeve 64.8
Photography 63.5
Photo 63.5
Smile 58.9
Suit 58
Coat 58
Overcoat 58
Shirt 55.9
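
The Amazon tags above are label/confidence pairs of the kind returned by Amazon Rekognition's DetectLabels operation. A minimal sketch of how such a tag list could be produced with boto3 follows; the local file name and the MaxLabels/MinConfidence values are illustrative assumptions, not part of this record.

```python
# Minimal sketch: produce label/confidence pairs like the Amazon tag list
# above using AWS Rekognition DetectLabels via boto3. The file name and the
# MaxLabels/MinConfidence values are illustrative assumptions.
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("family_portrait.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55.0,
)

# Print "Label confidence" lines in the same shape as the record above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```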

Imagga
created on 2022-01-30

man 47
person 40.7
male 35.4
people 31.8
senior 25.3
portrait 23.9
adult 23.6
couple 21.8
men 21.5
nurse 20.4
looking 20
work 19.6
happy 19.4
shower cap 18.8
coat 18.5
businessman 18.5
handsome 17.8
clothing 17.8
face 17.7
professional 16.7
occupation 16.5
cap 16.4
specialist 16.3
old 16
mature 15.8
smile 15.7
lab coat 15.1
medical 15
worker 14.3
business 14
elderly 13.4
job 13.3
patient 13.3
doctor 13.1
together 13.1
student 13.1
headdress 13
education 13
office 12.8
human 12.7
team 12.5
smiling 12.3
sitting 12
love 11.8
suit 11.7
holding 11.5
surgeon 11.5
black 11.4
attractive 11.2
teamwork 11.1
two 11
happiness 11
teacher 10.8
hand 10.6
medicine 10.6
married 10.5
clinic 10.5
success 10.4
standing 10.4
health 10.4
lifestyle 10.1
indoor 10
husband 9.9
equipment 9.9
scientist 9.8
serious 9.5
table 9.5
glasses 9.2
cheerful 8.9
group 8.9
look 8.8
case 8.7
retired 8.7
chemistry 8.7
laboratory 8.7
mask 8.6
tie 8.5
care 8.2
confident 8.2
covering 8.2
garment 8.2
fan 8
science 8
home 8
to 8
hair 7.9
guy 7.8
lab 7.8
older 7.8
retirement 7.7
casual 7.6
biology 7.6
wife 7.6
drink 7.5
manager 7.4
technology 7.4
lady 7.3
room 7.3
groom 7.2
women 7.1
life 7.1
working 7.1
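
The Imagga tags above follow the same tag/confidence shape. A sketch of fetching them through Imagga's v2 REST tagging endpoint follows, assuming HTTP basic auth with an API key/secret pair; the credentials and image URL are placeholders.

```python
# Sketch: fetch tag/confidence pairs like the Imagga list above from the
# Imagga v2 /tags endpoint. Key, secret, and image URL are placeholders.
import requests

IMAGGA_API_KEY = "your_api_key"        # placeholder
IMAGGA_API_SECRET = "your_api_secret"  # placeholder
IMAGE_URL = "https://example.org/family_portrait.jpg"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_API_KEY, IMAGGA_API_SECRET),
    timeout=30,
)
response.raise_for_status()

for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")
```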

Google
created on 2022-01-30

Microsoft
created on 2022-01-30

human face 95.5
text 95.3
person 94.7
posing 93.1
window 89.1
clothing 81
smile 77.2
man 70.3
old 41.3
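
The Microsoft tags above can be reproduced with the Azure Computer Vision "Analyze Image" REST API. A sketch assuming the v3.2 endpoint follows; the resource endpoint, subscription key, and image URL are placeholders, and Azure reports confidences on a 0-1 scale, so they are rescaled to percentages here.

```python
# Sketch: produce tag/confidence pairs like the Microsoft list above with the
# Azure Computer Vision Analyze Image API (v3.2). Endpoint, key, and image URL
# are placeholders.
import requests

AZURE_ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
AZURE_KEY = "your_subscription_key"                                     # placeholder
IMAGE_URL = "https://example.org/family_portrait.jpg"                   # placeholder

response = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
response.raise_for_status()

# Azure reports confidence in [0, 1]; rescale to percentages as in the record.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```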

Face analysis

AWS Rekognition

Age 38-46
Gender Male, 97.5%
Calm 97.6%
Surprised 0.7%
Confused 0.6%
Sad 0.3%
Fear 0.2%
Angry 0.2%
Disgusted 0.2%
Happy 0.1%

AWS Rekognition

Age 39-47
Gender Female, 86.8%
Calm 95%
Surprised 4.7%
Happy 0.2%
Confused 0%
Disgusted 0%
Sad 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 35-43
Gender Male, 98.3%
Calm 94.1%
Surprised 5.1%
Happy 0.3%
Confused 0.2%
Disgusted 0.1%
Sad 0.1%
Angry 0%
Fear 0%

AWS Rekognition

Age 42-50
Gender Male, 96.3%
Calm 99.7%
Surprised 0.2%
Sad 0.1%
Fear 0%
Angry 0%
Confused 0%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 45-53
Gender Male, 79.2%
Calm 94.1%
Sad 2.6%
Happy 2%
Confused 0.4%
Surprised 0.3%
Disgusted 0.3%
Angry 0.3%
Fear 0.1%
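
The five AWS Rekognition blocks above (age range, gender, and an emotion distribution per detected face) correspond to the output of Rekognition's DetectFaces operation with all attributes requested. A minimal sketch with boto3 follows; the local file name is an illustrative assumption.

```python
# Sketch: per-face age/gender/emotion estimates like the AWS Rekognition
# blocks above, using DetectFaces with Attributes=["ALL"]. The file name is an
# illustrative assumption.
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("family_portrait.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are returned with confidences; sort descending to match the record.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
    print()
```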

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
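
The Google Vision block above reports per-face likelihood buckets (Very unlikely through Very likely) rather than percentages. A sketch using the google-cloud-vision client library follows; the local file name is an illustrative assumption and credentials are taken from the environment.

```python
# Sketch: face-likelihood output like the Google Vision block above, using the
# google-cloud-vision client library. The file name is an illustrative
# assumption.
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application default credentials

with open("family_portrait.jpg", "rb") as f:  # hypothetical local copy of the photograph
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

for face in response.face_annotations:
    # Likelihood fields are enum values such as VERY_UNLIKELY or LIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```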

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a group of people posing for a photo 96.8%
a group of people posing for the camera 96.7%
a group of people posing for a picture 96.6%
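
The three Microsoft captions above are ranked caption candidates of the kind returned by Azure Computer Vision's "Describe Image" operation. A sketch assuming the v3.2 endpoint follows; the endpoint, key, image URL, and candidate count are placeholders.

```python
# Sketch: ranked caption candidates like the Microsoft captions above, using
# the Azure Computer Vision Describe Image API (v3.2). Endpoint, key, and
# image URL are placeholders.
import requests

AZURE_ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
AZURE_KEY = "your_subscription_key"                                     # placeholder
IMAGE_URL = "https://example.org/family_portrait.jpg"                   # placeholder

response = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": "3"},
    headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
response.raise_for_status()

for caption in response.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")
```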

Text analysis

Google

AND
AND
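
The Google text result above comes from optical character recognition over the print. A sketch with google-cloud-vision text detection follows; the local file name is an illustrative assumption. The first annotation returned is the full detected text block and the remaining annotations are the individual words, which is why a short string such as "AND" can appear more than once in the record.

```python
# Sketch: detected-text output like the Google text analysis above, using
# google-cloud-vision. The file name is an illustrative assumption.
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application default credentials

with open("family_portrait.jpg", "rb") as f:  # hypothetical local copy of the photograph
    content = f.read()

response = client.text_detection(image=vision.Image(content=content))

# The first annotation is the full text block; the rest are individual words.
for annotation in response.text_annotations:
    print(annotation.description)
```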