Human Generated Data

Title

Untitled (woman standing up in audience, DAR)

Date

1950, printed later

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.209

Machine Generated Data

Tags (label and confidence score, in percent)

Amazon
created on 2021-12-14

Person 99.4
Human 99.4
Person 99.4
Person 99.2
Face 98.5
Person 98.1
Clothing 97.9
Apparel 97.9
Person 96.7
Person 96.6
Person 95.4
Person 92.4
People 77.3
Portrait 72.3
Photography 72.3
Photo 72.3
Performer 72.2
Female 70.8
Hat 67.6
Head 64.6
Text 62.7
Girl 60
Sleeve 59
Costume 57.3
Person 43.3
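
The scores above are Rekognition confidence values on a 0-100 scale. As a rough illustration only (not the museum's actual pipeline), labels of this kind can be fetched with boto3; the file name and credential setup below are placeholders:

import boto3

# Assumes AWS credentials are configured in the environment and that
# photo.jpg is a local copy of the image (both are placeholders).
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=40,  # the list above bottoms out near 43
    )

for label in response["Labels"]:
    # Each label carries a name and a 0-100 confidence score.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')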

Clarifai
created on 2023-10-25

people 99.9
lid 99.9
woman 99.2
portrait 99.1
veil 99
adult 98.7
three 98.5
man 98.3
group 98.2
two 98
retro 97.9
wear 96
one 95.5
elderly 95
child 94.9
nostalgia 94.1
four 93.2
costume 89.8
cap 89.3
outerwear 88.7
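
Clarifai's general model returns concepts with 0-1 confidence values, evidently scaled to percentages in the list above. A hedged sketch against its v2 REST predict endpoint; the URL shape, model ID, and personal access token follow Clarifai's public docs and are assumptions, not details from this record:

import requests

resp = requests.post(
    "https://api.clarifai.com/v2/users/clarifai/apps/main"
    "/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_PAT"},  # placeholder token
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Concept values are 0-1; scale to match the percentages shown above.
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')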

Imagga
created on 2021-12-14

man 24.2
person 23.2
male 20.2
people 19.5
portrait 19.4
vintage 18.1
couple 16.5
smile 16.4
happy 16.3
dress 16.3
adult 15.3
attractive 14.7
bride 13.8
black 13.7
love 13.4
old 13.2
office 13
fashion 12.8
business 12.1
pretty 11.9
retro 11.5
child 11.4
brunette 11.3
sexy 11.2
blackboard 10.8
lady 10.5
tie 10.4
happiness 10.2
professional 10.2
smiling 10.1
wedding 10.1
holding 9.9
romantic 9.8
style 9.6
hair 9.5
sitting 9.4
money 9.4
studio 9.1
family 8.9
women 8.7
party 8.6
world 8.6
room 8.6
marriage 8.5
groom 8.5
kin 8.4
glasses 8.3
suit 8.1
currency 8.1
looking 8
antique 8
businessman 7.9
look 7.9
standing 7.8
face 7.8
wife 7.6
jacket 7.4
indoor 7.3
home 7.2
holiday 7.2
paper 7.1
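
Imagga's v2 /tags endpoint produces the same tag-plus-confidence shape. A minimal sketch assuming HTTP Basic auth with an API key/secret pair (all placeholders); the response layout follows Imagga's published docs:

import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},  # placeholder
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # placeholder credentials
)
resp.raise_for_status()

for entry in resp.json()["result"]["tags"]:
    # Each entry pairs a confidence score with a localized tag name.
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')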

Google
created on 2021-12-14 (no tags recorded)

Microsoft
created on 2021-12-14

text 97.1
human face 94.8
clothing 94.1
person 94
drawing 87.7
posing 83.2
sketch 74.9
cartoon 68.2
old 67.4
man 54.8
vintage 31.7
picture frame 26.5
screenshot 17.6
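
Tags like these come from Azure Computer Vision's image-tagging operation. A sketch using the legacy Python SDK; resource endpoint, key, and image URL are placeholders, and the SDK's 0-1 confidences are scaled to match the percentages above:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key for an Azure Computer Vision resource.
client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

result = client.tag_image("https://example.com/photo.jpg")  # placeholder URL
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")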

Color Analysis

(no data recorded)

Face analysis

AWS Rekognition

Age 49-67
Gender Female, 94%
Surprised 91.8%
Fear 7.8%
Angry 0.2%
Calm 0.1%
Happy 0.1%
Sad 0%
Confused 0%
Disgusted 0%

AWS Rekognition

Age 44-62
Gender Female, 53.5%
Calm 96.8%
Angry 2.6%
Sad 0.2%
Surprised 0.2%
Happy 0.1%
Disgusted 0.1%
Confused 0%
Fear 0%

AWS Rekognition

Age 51-69
Gender Female, 76.9%
Calm 92.4%
Angry 1.9%
Sad 1.8%
Fear 1.3%
Surprised 0.8%
Disgusted 0.7%
Confused 0.7%
Happy 0.4%

AWS Rekognition

Age 43-61
Gender Female, 96.9%
Calm 76.9%
Disgusted 13.4%
Angry 4.6%
Happy 2.5%
Sad 1.3%
Confused 0.7%
Surprised 0.4%
Fear 0.1%

AWS Rekognition

Age 58-76
Gender Female, 91.9%
Calm 86.7%
Happy 8.8%
Surprised 1.4%
Sad 0.9%
Disgusted 0.8%
Angry 0.7%
Fear 0.4%
Confused 0.2%

AWS Rekognition

Age 43-61
Gender Female, 99.8%
Calm 82.1%
Happy 6.9%
Sad 3.3%
Surprised 2.8%
Angry 1.5%
Confused 1.5%
Fear 1%
Disgusted 0.9%

AWS Rekognition

Age 49-67
Gender Female, 69%
Calm 98.2%
Happy 0.6%
Surprised 0.4%
Disgusted 0.2%
Sad 0.2%
Angry 0.2%
Confused 0.1%
Fear 0.1%

AWS Rekognition

Age 51-69
Gender Female, 59.8%
Calm 86.6%
Sad 5.5%
Angry 4.5%
Confused 1%
Surprised 0.9%
Fear 0.8%
Happy 0.4%
Disgusted 0.2%

AWS Rekognition

Age 34-50
Gender Female, 98.4%
Calm 50.7%
Sad 44.7%
Angry 1.1%
Fear 0.9%
Confused 0.9%
Happy 0.8%
Surprised 0.6%
Disgusted 0.2%

AWS Rekognition

Age 24-38
Gender Female, 99.5%
Fear 57.7%
Sad 38.7%
Calm 1.7%
Happy 0.5%
Angry 0.5%
Confused 0.4%
Surprised 0.3%
Disgusted 0.2%

AWS Rekognition

Age 26-40
Gender Female, 72.7%
Sad 53.5%
Calm 34.5%
Fear 6.1%
Confused 2.2%
Angry 1.3%
Surprised 0.9%
Disgusted 0.8%
Happy 0.7%
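
All eleven blocks above follow the shape of Rekognition's DetectFaces response: an estimated age range, a gender guess with its confidence, and eight emotion scores. A minimal sketch of the call; credentials and file name are placeholders:

import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder file
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # needed for age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions arrive unsorted; list the strongest first, as above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')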

Microsoft Cognitive Services

Age 61
Gender Female

Microsoft Cognitive Services

Age 58
Gender Female

Microsoft Cognitive Services

Age 59
Gender Female
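
These single-point age and gender estimates came from the legacy Face API, whose age and gender attributes Microsoft retired in 2022. A historical sketch of the REST call; endpoint, key, and file name are placeholders:

import requests

with open("photo.jpg", "rb") as f:  # placeholder file
    resp = requests.post(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/face/v1.0/detect",
        params={"returnFaceAttributes": "age,gender"},
        headers={
            "Ocp-Apim-Subscription-Key": "YOUR_KEY",  # placeholder key
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )
resp.raise_for_status()

for face in resp.json():
    attrs = face["faceAttributes"]
    print(f'Age {attrs["age"]:.0f}')
    print(f'Gender {attrs["gender"].capitalize()}')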

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Possible
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Categories

Imagga

paintings art 97.9%
people portraits 1.6%