Human Generated Data

Title

Untitled (studio portrait of grandfather and young girl, both looking left, girl seated)

Date

c. 1945

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6939

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 98.5
Person 98.5
Person 97.9
Clothing 96.2
Apparel 96.2
Suit 96.2
Overcoat 96.2
Coat 96.2
Face 83.1
People 73.9
Home Decor 71.4
Performer 59.8
Photography 56.6
Photo 56.6
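
Each Amazon tag above is paired with a confidence score. As a minimal sketch (assuming the scores are percentages, as Rekognition reports them), such a list can be thresholded the way a MinConfidence cutoff would filter it:

```python
# Sketch: filter machine-generated tags by confidence, mirroring the
# MinConfidence cutoff that services like Amazon Rekognition apply.
# The (tag, score) pairs are a subset of the Amazon section above.
tags = [
    ("Human", 98.5), ("Person", 98.5), ("Clothing", 96.2),
    ("Suit", 96.2), ("Face", 83.1), ("People", 73.9),
    ("Home Decor", 71.4), ("Performer", 59.8), ("Photography", 56.6),
]

def confident_tags(pairs, min_confidence=80.0):
    """Keep only tags at or above the confidence threshold."""
    return [tag for tag, score in pairs if score >= min_confidence]

print(confident_tags(tags))  # tags scored at 80 or above
```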

Imagga
created on 2022-01-23

man 45.1
male 38.1
adult 31.1
people 30.2
happy 29.5
suit 29.3
kin 29.2
portrait 28.5
couple 27.9
smiling 26.8
person 26.7
businessman 26.5
bow tie 22.7
business 22.5
standing 20.9
love 20.5
handsome 20.5
attractive 20.3
smile 20
necktie 19.2
executive 18.8
professional 18.7
family 18.7
mature 18.6
tie 18.1
mother 17.5
parent 17.1
two 17
senior 16.9
happiness 16.5
men 16.3
father 16.2
dad 15.7
office 15.3
together 14.9
corporate 14.6
confident 14.6
group 14.5
lifestyle 14.5
brother 13.6
looking 13.6
boy 13.1
successful 12.8
cheerful 12.2
guy 12.2
fun 12
expression 12
sibling 11.9
casual 11.9
relationship 11.3
home 11.2
businesswoman 10.9
face 10.7
boss 10.5
husband 10.5
success 10.5
old 10.5
businesspeople 10.5
women 10.3
team 9.9
fashion 9.8
black 9.8
daughter 9.6
meeting 9.4
work 9.4
manager 9.3
garment 9
job 8.9
indoors 8.8
brunette 8.7
partner 8.7
adolescent 8.7
buddy 8.6
elderly 8.6
clothing 8.6
child 8.6
serious 8.6
adults 8.5
togetherness 8.5
teamwork 8.4
alone 8.2
lady 8.1
romance 8
sexy 8
grandfather 8
hair 7.9
businessmen 7.8
two people 7.8
colleagues 7.8
party 7.7
30s 7.7
studio 7.6
human 7.5
shirt 7.5
camera 7.4
friendly 7.3
romantic 7.1
juvenile 7.1

Google
created on 2022-01-23

Photograph 94.2
Smile 86.1
Gesture 84.4
Vintage clothing 76.7
Tints and shades 76
Blazer 75.3
Picture frame 74.8
Snapshot 74.3
Classic 73.9
Collar 72.1
Suit 70.6
Art 70
Formal wear 68.5
Stock photography 64.6
Room 64.6
Event 64.1
Sitting 63.4
Happy 62
Monochrome 61
Child 58.9

Microsoft
created on 2022-01-23

person 98.9
human face 98.3
clothing 96.3
wall 95.4
smile 91.6
indoor 89.2
text 89.1
suit 67.3
portrait 56.1

Face analysis

AWS Rekognition

Age 54-64
Gender Male, 99.9%
Confused 91.9%
Calm 4.3%
Sad 1.3%
Surprised 1%
Disgusted 0.5%
Fear 0.5%
Angry 0.4%
Happy 0.2%

AWS Rekognition

Age 6-14
Gender Female, 100%
Fear 52%
Calm 30.4%
Sad 9.2%
Surprised 2.2%
Happy 2.1%
Confused 1.6%
Disgusted 1.5%
Angry 1%
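
Rekognition reports emotion as a score distribution rather than a single label; for the girl's face above, Fear dominates at 52%. A small sketch of reducing such a distribution to its top entry:

```python
# Sketch: pick the dominant emotion from a Rekognition-style score table.
# The percentages are those reported for the seated girl above.
emotions = {
    "Fear": 52.0, "Calm": 30.4, "Sad": 9.2, "Surprised": 2.2,
    "Happy": 2.1, "Confused": 1.6, "Disgusted": 1.5, "Angry": 1.0,
}

def dominant_emotion(scores):
    """Return the highest-scoring emotion and its confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

print(dominant_emotion(emotions))  # -> ('Fear', 52.0)
```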

Microsoft Cognitive Services

Age 11
Gender Female

Microsoft Cognitive Services

Age 58
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
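
Unlike the percentage scores above, Google Vision reports face attributes as likelihood buckets. Mapping the buckets onto an ordinal scale makes the two faces comparable; the ordering below is an assumption based on the Vision API's Likelihood enum (the "Possible", "Likely", and "Very likely" buckets do not appear in this record):

```python
# Sketch: rank Google Vision likelihood buckets on an ordinal scale.
# Ordering assumed from the Vision API Likelihood enum.
LIKELIHOOD = ["Very unlikely", "Unlikely", "Possible", "Likely", "Very likely"]

def likelihood_rank(label):
    """Ordinal position of a likelihood bucket (0 = Very unlikely)."""
    return LIKELIHOOD.index(label)

# The first face's Joy ("Unlikely") outranks the second's ("Very unlikely").
print(likelihood_rank("Unlikely") > likelihood_rank("Very unlikely"))  # True
```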

Feature analysis

Amazon

Person 98.5%
Suit 96.2%

Captions

Microsoft

a man and a woman sitting on a table 75%
a man and a woman sitting at a table 74.9%
a man and woman sitting on a table 70.1%