Human Generated Data

Title

Untitled (two women holding small children, seated, half-length)

Date

c. 1940

People

Artist: Michael Disfarmer, American, 1884–1959

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Susan and Neal Yanofsky, 2007.288

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Apparel 99.8
Clothing 99.8
Person 99.2
Human 99.2
Person 99.1
Baby 98.9
Newborn 98.9
People 92.8
Person 89
Face 87.4
Smile 76.5
Family 71.2
Photography 66.1
Portrait 66.1
Photo 66.1
Hat 64
Bonnet 57.7

Imagga
created on 2022-03-05

nurse 55.8
mother 39.9
parent 37.5
people 31.8
home 30.3
family 30.3
man 29.6
male 29.6
child 28.1
happy 27.6
kin 26.1
smiling 26.1
daughter 26
father 25.9
indoors 22.9
adult 21.5
dad 19.8
love 19.7
happiness 19.6
senior 18.7
women 18.2
cheerful 17.9
smile 17.8
group 17.7
together 17.5
portrait 17.5
lifestyle 17.4
men 17.2
mature 16.7
couple 16.6
sibling 15.6
husband 15.3
kid 15.1
medical 15
baby 14.9
clothing 14.5
30s 14.4
wife 14.2
doctor 14.1
room 14
office 13.7
face 13.5
care 13.2
hospital 13.1
business 12.8
team 12.5
childhood 12.5
elderly 12.4
togetherness 12.3
son 12.3
boy 12.2
patient 12.1
sitting 12
person 12
businessman 11.5
professional 11.3
meeting 11.3
health 11.1
casual 11
two 11
children 10.9
businesswoman 10.9
holding 10.7
thirties 10.7
colleagues 10.7
uniform 10.7
fun 10.5
indoor 10
two people 9.7
married 9.6
table 9.5
businesspeople 9.5
work 9.4
cute 9.3
old 9.1
blond 9
military uniform 9
interior 8.8
working 8.8
computer 8.8
grandmother 8.8
grandfather 8.8
talking 8.6
attractive 8.4
horizontal 8.4
teamwork 8.3
color 8.3
success 8.1
little 8
40s 7.8
discussion 7.8
brother 7.7
eating 7.6
leisure 7.5
inside 7.4
20s 7.3
girls 7.3
domestic 7.2
looking 7.2
to 7.1
medicine 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

baby 99.5
wall 99.3
text 98.9
human face 98.9
person 98.7
toddler 98.1
clothing 97.8
old 94.3
child 94.1
smile 93.3
black 90.1
posing 86.5
boy 72.1
woman 63
group 60.9
vintage 35.6
picture frame 18.2

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 23-33
Gender Female, 99.9%
Calm 80%
Happy 6.8%
Confused 5.1%
Surprised 3.4%
Disgusted 1.4%
Angry 1.4%
Fear 1%
Sad 0.8%

AWS Rekognition

Age 30-40
Gender Female, 99.5%
Calm 99.8%
Angry 0.1%
Sad 0%
Confused 0%
Happy 0%
Surprised 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 0-4
Gender Female, 99.6%
Confused 78.2%
Calm 9.3%
Sad 4.8%
Surprised 4.5%
Fear 1.3%
Disgusted 1%
Angry 0.6%
Happy 0.3%

AWS Rekognition

Age 0-3
Gender Female, 99.9%
Surprised 81.7%
Calm 8.9%
Confused 2.7%
Disgusted 2.7%
Fear 2.1%
Angry 1%
Sad 0.6%
Happy 0.4%

Microsoft Cognitive Services

Age 33
Gender Female

Microsoft Cognitive Services

Age 1
Gender Female

Microsoft Cognitive Services

Age 43
Gender Female

Microsoft Cognitive Services

Age 1
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft

a vintage photo of a group of people posing for the camera 97.7%
a vintage photo of a group of people sitting posing for the camera 97%
a vintage photo of a group of people posing for a picture 96.9%