Human Generated Data

Title

Untitled (portrait of nine men and seven women)

Date

c. 1920-1925, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5851

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 98.8
Person 98.8
Person 98.5
Person 97.7
Person 96.9
Person 95.6
Person 94.7
Tie 94
Accessories 94
Accessory 94
Person 92.8
Crowd 91.3
Audience 91.3
Person 90.5
Sitting 90
Person 89.1
Person 89.1
Person 88.8
Person 88.2
Person 85.4
Apparel 82.8
Clothing 82.8
Person 82.6
Indoors 78.7
Room 78.7
People 72.2
Person 71.5
School 68
Classroom 68
Face 62.3
Speech 57.5
Female 57.2
Chair 57
Furniture 57
Person 47.4
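
The Amazon tags above are the kind of scored labels returned by AWS Rekognition's DetectLabels operation. A minimal sketch using boto3, assuming a local copy of the image and an illustrative confidence threshold:

```python
# Hedged sketch: produce Rekognition label tags for a local image file.
# The file name and MinConfidence value are assumptions for illustration.
import boto3

rekognition = boto3.client("rekognition")

with open("durette_studio_portrait.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=45,  # keep low-confidence labels such as "Person 47.4"
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```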

Clarifai
created on 2019-11-16

people 99.7
group 99.4
many 98.4
group together 97.6
woman 96.7
adult 96.3
music 95.4
musician 94.6
man 93.2
portrait 92.6
wear 92.3
child 92
singer 90.8
outfit 89
leader 88.7
retro 86.4
administration 85.2
several 84.9
education 81.1
actress 80.9
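
The Clarifai concepts above can be requested from Clarifai's v2 REST API with its general model. A hedged sketch; the endpoint, model ID, and key-based auth header reflect the v2 API as commonly documented around 2019 and should be verified against current Clarifai documentation:

```python
# Hedged sketch: request concept tags from Clarifai's v2 "general" model.
# API key, model ID, and image URL are placeholders/assumptions.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "aaa03c23b3724a16a56b629203edc62c"  # historical public ID of the general model

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/portrait.jpg"}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')  # value is 0-1
```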

Imagga
created on 2019-11-16

people 30.1
person 27.3
man 24.2
adult 23.9
black 23.6
business 21.2
clothing 19.8
male 18.4
portrait 18.1
couple 17.4
happy 16.3
dress 16.2
musical instrument 15.9
businessman 15.9
fashion 15.1
sexy 14.4
model 14
elegant 13.7
suit 13.6
love 13.4
attractive 13.3
blackboard 13
party 12.9
women 12.6
happiness 12.5
dark 12.5
military uniform 12.3
wind instrument 12
professional 12
room 11.9
elegance 11.7
pretty 11.2
men 11.2
style 11.1
night 10.6
face 10.6
lady 10.5
group 10.5
fun 10.5
office 10.4
celebration 10.4
lifestyle 10.1
hand 9.9
success 9.6
hair 9.5
uniform 9.4
relationship 9.4
smile 9.3
indoor 9.1
silhouette 9.1
gorgeous 9.1
pose 9.1
posing 8.9
hands 8.7
smiling 8.7
corporate 8.6
tie 8.5
expression 8.5
youth 8.5
two 8.5
clothes 8.4
studio 8.4
executive 8.3
dance 8.2
cheerful 8.1
romantic 8
family 8
musician 8
holiday 7.9
boss 7.6
singer 7.6
brass 7.6
passion 7.5
human 7.5
holding 7.4
outfit 7.4
spectator 7.4
businesswoman 7.3
home 7.2
together 7
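
The Imagga tags above correspond to Imagga's tagging endpoint, which returns scored tags for an image URL. A hedged sketch; the v2 endpoint and basic-auth scheme are assumptions to check against Imagga's current documentation:

```python
# Hedged sketch: fetch scored tags from Imagga's /v2/tags endpoint.
# Credentials and image URL are placeholders.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/portrait.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),  # HTTP basic auth
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```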

Google
created on 2019-11-16

Photograph 96.2
People 93.1
Snapshot 82.5
Black-and-white 74.4
Photography 67.8
Room 65.7
Picture frame 62.5
Monochrome 60.1
Family 59.2
Team 52.3
Art 50.2
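
The Google tags above match the output of Google Cloud Vision label detection. A minimal sketch using the google-cloud-vision Python client (2.x-style constructors); the file name is an assumption:

```python
# Hedged sketch: run Google Cloud Vision label detection on a local image.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("durette_studio_portrait.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")  # score is 0-1
```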

Microsoft
created on 2019-11-16

posing 97.7
person 97.6
text 97.5
clothing 97.2
man 88.1
smile 87
group 81.6
white 78.7
black 78.2
standing 78
black and white 67.9
woman 65.2
old 55.2
suit 50.1
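
The Microsoft tags above are the kind returned by the Azure Computer Vision tagging API. A hedged sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders, and method names should be checked against the SDK version in use:

```python
# Hedged sketch: request image tags from Azure Computer Vision.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_KEY"),
)

result = client.tag_image("https://example.org/portrait.jpg")
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")  # confidence is 0-1
```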

Face analysis

AWS Rekognition

Age 20-32
Gender Female, 50.2%
Angry 45.8%
Disgusted 45%
Surprised 45%
Sad 45.5%
Confused 45.1%
Calm 53.5%
Fear 45%
Happy 45%

AWS Rekognition

Age 19-31
Gender Female, 51%
Disgusted 45%
Calm 55%
Angry 45%
Confused 45%
Fear 45%
Sad 45%
Surprised 45%
Happy 45%

AWS Rekognition

Age 25-39
Gender Female, 54.7%
Sad 45.2%
Disgusted 45.2%
Fear 45.2%
Angry 45.3%
Happy 45.1%
Surprised 45.4%
Calm 53.1%
Confused 45.5%

AWS Rekognition

Age 23-35
Gender Male, 53.7%
Calm 54.3%
Sad 45.1%
Fear 45%
Disgusted 45%
Confused 45.1%
Surprised 45.1%
Happy 45.3%
Angry 45.2%

AWS Rekognition

Age 17-29
Gender Male, 54.7%
Fear 45%
Sad 45.7%
Disgusted 45%
Surprised 45%
Calm 54.2%
Happy 45%
Angry 45%
Confused 45%

AWS Rekognition

Age 23-35
Gender Female, 53.8%
Happy 45%
Sad 45.1%
Confused 45%
Disgusted 45%
Angry 45.2%
Surprised 45%
Calm 54.5%
Fear 45%

AWS Rekognition

Age 21-33
Gender Male, 55%
Calm 54.8%
Sad 45%
Angry 45.1%
Disgusted 45%
Happy 45.1%
Surprised 45%
Fear 45%
Confused 45%

AWS Rekognition

Age 13-25
Gender Male, 54.3%
Fear 45%
Sad 45.1%
Happy 45.1%
Disgusted 45%
Calm 47.9%
Confused 45.4%
Angry 51.4%
Surprised 45.1%

AWS Rekognition

Age 20-32
Gender Male, 54.1%
Angry 45.3%
Calm 54.6%
Sad 45%
Happy 45%
Fear 45%
Confused 45%
Surprised 45%
Disgusted 45%

AWS Rekognition

Age 21-33
Gender Female, 53.8%
Fear 45%
Angry 45%
Sad 45%
Happy 45%
Calm 54.9%
Confused 45%
Surprised 45%
Disgusted 45%

AWS Rekognition

Age 22-34
Gender Male, 53.9%
Calm 54.4%
Confused 45.2%
Fear 45%
Sad 45.1%
Angry 45.2%
Surprised 45.1%
Disgusted 45%
Happy 45%

AWS Rekognition

Age 20-32
Gender Male, 54.9%
Confused 45%
Fear 45%
Angry 45.3%
Sad 45%
Disgusted 45%
Surprised 45%
Happy 45%
Calm 54.5%

AWS Rekognition

Age 13-25
Gender Male, 54.6%
Angry 45%
Happy 45%
Disgusted 45%
Fear 45%
Confused 45%
Surprised 45%
Calm 55%
Sad 45%

AWS Rekognition

Age 24-38
Gender Male, 54.2%
Surprised 45.1%
Calm 45.6%
Happy 45%
Angry 52.8%
Fear 45.3%
Sad 46%
Confused 45.1%
Disgusted 45%

AWS Rekognition

Age 21-33
Gender Female, 54.7%
Angry 45%
Happy 45%
Disgusted 45%
Sad 45.7%
Calm 54.2%
Surprised 45%
Confused 45%
Fear 45%

AWS Rekognition

Age 18-30
Gender Male, 54.6%
Angry 45.1%
Surprised 45%
Sad 45.1%
Fear 45%
Calm 54.7%
Happy 45%
Confused 45%
Disgusted 45%
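
The per-face age ranges, gender estimates, and emotion scores above come from AWS Rekognition face detection. A minimal sketch of the DetectFaces call using boto3, assuming a local copy of the image:

```python
# Hedged sketch: detect faces and report age range, gender, and emotion scores.
import boto3

rekognition = boto3.client("rekognition")

with open("durette_studio_portrait.jpg", "rb") as f:  # hypothetical file name
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}  '
          f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'  {emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```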

Microsoft Cognitive Services

Age 32
Gender Male

Microsoft Cognitive Services

Age 30
Gender Male

Microsoft Cognitive Services

Age 44
Gender Female

Microsoft Cognitive Services

Age 46
Gender Female

Microsoft Cognitive Services

Age 50
Gender Male

Microsoft Cognitive Services

Age 35
Gender Male

Microsoft Cognitive Services

Age 30
Gender Female

Microsoft Cognitive Services

Age 29
Gender Male

Microsoft Cognitive Services

Age 32
Gender Female

Microsoft Cognitive Services

Age 43
Gender Female
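
The Microsoft age and gender estimates above correspond to face-attribute detection in the Azure Face API. A hedged sketch using the azure-cognitiveservices-vision-face SDK; endpoint, key, and URL are placeholders, and Microsoft has since restricted access to these facial attributes, so treat this as historical:

```python
# Hedged sketch: detect faces and report estimated age and gender (Azure Face API).
from azure.cognitiveservices.vision.face import FaceClient
from azure.cognitiveservices.vision.face.models import FaceAttributeType
from msrest.authentication import CognitiveServicesCredentials

face_client = FaceClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_KEY"),
)

faces = face_client.face.detect_with_url(
    url="https://example.org/portrait.jpg",
    return_face_attributes=[FaceAttributeType.age, FaceAttributeType.gender],
)

for face in faces:
    attrs = face.face_attributes
    print(f"Age {attrs.age:.0f}, Gender {attrs.gender}")
```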

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
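
The Google Vision blocks above report per-face likelihoods (joy, sorrow, anger, surprise, headwear, blur) from Cloud Vision face detection. A minimal sketch using the google-cloud-vision Python client (2.x-style); the file name is an assumption:

```python
# Hedged sketch: run Cloud Vision face detection and print per-face likelihoods.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("durette_studio_portrait.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    print(
        "Joy:", vision.Likelihood(face.joy_likelihood).name,
        "| Sorrow:", vision.Likelihood(face.sorrow_likelihood).name,
        "| Anger:", vision.Likelihood(face.anger_likelihood).name,
        "| Surprise:", vision.Likelihood(face.surprise_likelihood).name,
        "| Headwear:", vision.Likelihood(face.headwear_likelihood).name,
        "| Blurred:", vision.Likelihood(face.blurred_likelihood).name,
    )
```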

Feature analysis

Amazon

Person 98.8%
Tie 94%

Categories

Imagga

people portraits 97.5%
events parties 1.2%