Human Generated Data

Title

Untitled (family portrait of eight on front porch steps, two oldest seated on chairs at front)

Date

c. 1940

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11291

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 99.3
Person 99.3
Person 98.2
Person 98.1
Accessories 97.8
Accessory 97.8
Tie 97.8
Person 95.6
Person 95.1
Person 94.6
Person 94.4
People 93.8
Overcoat 91.3
Clothing 91.3
Suit 91.3
Coat 91.3
Apparel 91.3
Person 90.7
Tie 84.9
Family 84.4
Suit 73
Chair 72.4
Furniture 72.4
Suit 71.6
Shoe 62.9
Footwear 62.9
Face 60.4
Tie 55.2
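The Amazon list above is a flat sequence of label/confidence pairs, with some labels repeated at different confidences (one entry per detected instance, e.g. eight "Person" hits). A minimal Python sketch of how such a list might be deduplicated and thresholded, keeping the highest confidence per label; the values are copied from the list above, and the helper name is illustrative, not part of any API:

```python
# Label/confidence pairs as listed above; duplicates are separate detections.
labels = [
    ("Person", 99.3), ("Person", 98.2), ("Tie", 97.8),
    ("Suit", 91.3), ("Suit", 73.0), ("Chair", 72.4),
    ("Shoe", 62.9), ("Tie", 55.2),
]

def top_labels(pairs, threshold=90.0):
    """Keep the highest confidence seen for each label, then filter."""
    best = {}
    for name, conf in pairs:
        best[name] = max(conf, best.get(name, 0.0))
    return {n: c for n, c in best.items() if c >= threshold}

print(top_labels(labels))  # only Person, Tie, and Suit survive a 90.0 cutoff
```

Lowering the threshold surfaces the weaker detections (Chair, Shoe) at the cost of more noise, which is roughly the trade-off behind the long tail of low-confidence tags above.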

Imagga
created on 2022-01-23

kin 100
man 32.9
male 28.4
people 27.3
happy 26.9
family 24
couple 20.9
group 20.1
men 18.9
adult 18.8
person 18.8
home 18.3
portrait 18.1
smiling 17.3
business 17
businessman 16.8
happiness 15.7
mother 15.3
team 15.2
senior 15
women 14.2
together 14
child 13
father 12.8
smile 12.8
love 12.6
indoors 12.3
businesswoman 11.8
black 11.5
30s 11.5
standing 11.3
office 11.2
clothing 11.2
professional 11.1
lifestyle 10.8
uniform 10.8
job 10.6
teamwork 10.2
attractive 9.8
cheerful 9.7
interior 9.7
nurse 9.7
daughter 9.5
casual 9.3
face 9.2
military uniform 9.1
suit 9
boy 8.7
married 8.6
elderly 8.6
husband 8.6
relationship 8.4
inside 8.3
fun 8.2
success 8
handsome 8
looking 8
kid 8
room 7.8
son 7.8
corporate 7.7
partnership 7.7
old 7.7
youth 7.7
two 7.6
businesspeople 7.6
tie 7.6
wife 7.6
females 7.6
meeting 7.5
doctor 7.5
joy 7.5
camera 7.4
executive 7.4
aged 7.2
worker 7.2
holiday 7.2
working 7.1
medical 7.1

Google
created on 2022-01-23

Hair 98.3
Coat 88.1
Chair 85.4
Classic 77
Suit 75
Snapshot 74.3
Event 73.4
Vintage clothing 73.3
Formal wear 66.6
Team 65.2
History 64.9
Monochrome 62.9
Sitting 57
Retro style 54.8
Family 52.6
Photography 52.2
Uniform 51.4
Room 50.6
Blazer 50.6

Microsoft
created on 2022-01-23

person 99.9
posing 99.7
clothing 98.3
smile 98.3
group 96
woman 92.8
standing 88.9
suit 84.2
man 76.3
text 64.1
human face 62.9
dress 61.2
team 41.3
crowd 0.6

Face analysis

AWS Rekognition

Age 40-48
Gender Female, 100%
Happy 76.5%
Confused 7.2%
Calm 6.5%
Surprised 2.9%
Disgusted 2.3%
Sad 1.6%
Fear 1.5%
Angry 1.5%

AWS Rekognition

Age 62-72
Gender Female, 96.9%
Calm 96.6%
Angry 1.3%
Sad 0.7%
Confused 0.5%
Disgusted 0.4%
Surprised 0.2%
Fear 0.2%
Happy 0.1%

AWS Rekognition

Age 50-58
Gender Female, 96.3%
Confused 46.4%
Calm 43.1%
Happy 5.2%
Angry 1.9%
Disgusted 1.4%
Sad 0.8%
Surprised 0.7%
Fear 0.4%

AWS Rekognition

Age 53-61
Gender Female, 99.7%
Calm 99.7%
Angry 0.1%
Sad 0.1%
Confused 0.1%
Disgusted 0%
Surprised 0%
Fear 0%
Happy 0%

AWS Rekognition

Age 38-46
Gender Female, 59.7%
Calm 98.8%
Angry 0.3%
Confused 0.3%
Happy 0.2%
Sad 0.1%
Surprised 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 45-51
Gender Male, 100%
Calm 99.9%
Confused 0%
Sad 0%
Angry 0%
Surprised 0%
Happy 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 37-45
Gender Male, 100%
Calm 82.8%
Angry 15.2%
Confused 0.7%
Sad 0.5%
Disgusted 0.3%
Fear 0.2%
Surprised 0.2%
Happy 0.2%

AWS Rekognition

Age 59-69
Gender Male, 100%
Angry 48.7%
Calm 43.6%
Sad 3.8%
Confused 1.5%
Fear 1%
Happy 0.5%
Disgusted 0.5%
Surprised 0.5%
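Each AWS Rekognition face record above pairs an age range and gender guess with an emotion distribution summing to roughly 100%. Reading off the dominant emotion is an argmax, but the last record (Angry 48.7% vs. Calm 43.6%) shows why the margin over the runner-up is worth reporting alongside the winner. A sketch in Python, using that record's values; the function is illustrative, not part of the Rekognition API:

```python
# Emotion distribution from the final AWS Rekognition record above.
emotions = {
    "Angry": 48.7, "Calm": 43.6, "Sad": 3.8, "Confused": 1.5,
    "Fear": 1.0, "Happy": 0.5, "Disgusted": 0.5, "Surprised": 0.5,
}

def dominant_emotion(dist):
    """Return (top emotion, margin over runner-up in percentage points)."""
    ranked = sorted(dist.items(), key=lambda kv: kv[1], reverse=True)
    top, runner = ranked[0], ranked[1]
    return top[0], round(top[1] - runner[1], 1)

print(dominant_emotion(emotions))  # ('Angry', 5.1)
```

A margin of about five points, as here, is a near-tie; contrast the fourth record above, where Calm wins at 99.7% with effectively no runner-up.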

Microsoft Cognitive Services

Age 54
Gender Female

Microsoft Cognitive Services

Age 44
Gender Female

Microsoft Cognitive Services

Age 61
Gender Male

Microsoft Cognitive Services

Age 47
Gender Male

Microsoft Cognitive Services

Age 64
Gender Male

Microsoft Cognitive Services

Age 48
Gender Female

Microsoft Cognitive Services

Age 44
Gender Male

Microsoft Cognitive Services

Age 54
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
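Unlike the percentage scores above, Google Vision reports face attributes as ordinal likelihood strings (its Likelihood enum runs from VERY_UNLIKELY to VERY_LIKELY; this page renders them as "Very unlikely" through "Very likely"). Comparing or filtering records like the eight above therefore means mapping the strings to ranks first. A minimal sketch, assuming the page's rendered strings; the helper is illustrative:

```python
# Ordinal ranks for Google Vision's Likelihood scale, as rendered on this page.
LIKELIHOOD = ["Very unlikely", "Unlikely", "Possible", "Likely", "Very likely"]
RANK = {name: i for i, name in enumerate(LIKELIHOOD)}

def at_least(verdict, floor="Likely"):
    """True if a likelihood string meets or exceeds the given floor."""
    return RANK[verdict] >= RANK[floor]

# First face above reports Joy "Likely"; the third reports Joy "Unlikely".
print(at_least("Likely"))    # True
print(at_least("Unlikely"))  # False
```

Applied to the records above, only the first face clears a "Likely" floor for Joy; everything else, for every attribute, sits at "Unlikely" or below.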

Feature analysis

Amazon

Person 99.3%
Tie 97.8%
Suit 91.3%
Shoe 62.9%

Captions

Microsoft

Oveta Culp Hobby, Eva Hoffman posing for a photo 99%
Oveta Culp Hobby, Eva Hoffman posing for the camera 98.9%
Oveta Culp Hobby, Eva Hoffman posing for a picture 98.8%

Text analysis

Google

TOM
TOM