Human Generated Data

Title

Untitled (studio portrait of two boys in religious attire)

Date

c. 1935, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6083

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.3
Human 99.3
Person 98.7
Chair 98.5
Furniture 98.5
Person 98.4
Person 97
Clothing 92
Apparel 92
Shoe 87.4
Footwear 87.4
Sitting 67.4
Shoe 67
Leisure Activities 61.8
Photography 61.4
Photo 61.4
Face 61
Portrait 61
Performer 59.2
Female 58.2
Dress 57.9
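The Amazon (Rekognition) tags above pair each label with a confidence score. As a minimal sketch of how such a list might be post-processed — the data is transcribed from the record above, but the helper and the 90% threshold are illustrative assumptions, not part of the record:

```python
# Hypothetical filtering of the Rekognition label list transcribed above;
# the 90.0 threshold is an assumption chosen for illustration.
labels = [
    ("Person", 99.3), ("Human", 99.3), ("Chair", 98.5),
    ("Furniture", 98.5), ("Clothing", 92.0), ("Apparel", 92.0),
    ("Shoe", 87.4), ("Footwear", 87.4), ("Sitting", 67.4),
    ("Photography", 61.4), ("Portrait", 61.0), ("Female", 58.2),
]

def confident(tags, threshold=90.0):
    """Keep only tag names at or above the confidence threshold."""
    return [name for name, score in tags if score >= threshold]

print(confident(labels))
# → ['Person', 'Human', 'Chair', 'Furniture', 'Clothing', 'Apparel']
```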

Clarifai
created on 2019-11-16

people 99.8
group 96.9
wear 96.3
woman 96.1
man 95.4
adult 94.1
room 93.3
music 92.9
movie 92.4
indoors 89.3
group together 85.8
child 85.4
theater 84.8
musician 84.1
outfit 81.3
actor 81
education 77.4
opera 77.4
actress 76.5
leader 74.2

Imagga
created on 2019-11-16

man 30.2
people 27.3
male 22.7
businessman 21.2
person 21.1
business 20
group 19.3
silhouette 19
television 17.4
men 16.3
professional 15.3
adult 14.1
executive 13.9
couple 13.1
photographer 12.9
office 12.1
corporate 12
team 11.6
job 11.5
groom 11.3
blackboard 11.2
black 11.1
telecommunication system 10.6
teacher 10.5
communication 10.1
employee 9.9
wind instrument 9.8
fashion 9.8
success 9.6
musical instrument 9.6
meeting 9.4
happiness 9.4
happy 9.4
manager 9.3
suit 9.2
chair 8.7
women 8.7
boss 8.6
career 8.5
elegance 8.4
teamwork 8.3
human 8.2
indoor 8.2
dress 8.1
clothing 8
spectator 8
love 7.9
standing 7.8
world 7.8
youth 7.7
building 7.5
future 7.4
art 7.4
worker 7.3
looking 7.2
portrait 7.1
information 7.1
work 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

clothing 98.5
wall 98.5
man 96.8
text 96.1
person 94
standing 90.8
posing 89.3
black 76.1
suit 63.6
old 60.3
black and white 50.9

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 26-42
Gender Male, 54.5%
Calm 54.2%
Surprised 45.4%
Disgusted 45%
Happy 45%
Angry 45.1%
Confused 45.2%
Sad 45.1%
Fear 45%

AWS Rekognition

Age 31-47
Gender Female, 50.8%
Disgusted 45%
Calm 52.8%
Fear 45%
Happy 46.1%
Confused 45.1%
Sad 45.8%
Surprised 45.1%
Angry 45%

AWS Rekognition

Age 3-9
Gender Female, 54.6%
Calm 53.1%
Sad 46.8%
Angry 45%
Disgusted 45%
Happy 45%
Surprised 45%
Fear 45%
Confused 45%

AWS Rekognition

Age 27-43
Gender Female, 52.6%
Sad 46.1%
Fear 45%
Disgusted 45%
Surprised 45%
Angry 48%
Calm 50.7%
Happy 45%
Confused 45.1%
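In each AWS Rekognition face block above, most emotion scores sit near a ~45% floor, with one value standing out. A minimal sketch of reading such a block — the scores are transcribed from the first face above; the helper function is illustrative, not part of the Rekognition API:

```python
# Hypothetical summary of the first Rekognition face block above;
# with most emotions near a ~45% baseline, the maximum is taken as dominant.
face_1 = {
    "Calm": 54.2, "Surprised": 45.4, "Disgusted": 45.0, "Happy": 45.0,
    "Angry": 45.1, "Confused": 45.2, "Sad": 45.1, "Fear": 45.0,
}

def dominant_emotion(scores):
    """Return the emotion with the highest confidence score."""
    return max(scores, key=scores.get)

print(dominant_emotion(face_1))  # → Calm
```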

Microsoft Cognitive Services

Age 42
Gender Male

Microsoft Cognitive Services

Age 36
Gender Male

Microsoft Cognitive Services

Age 33
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Shoe 87.4%

Categories

Imagga

people portraits 72.8%
paintings art 25.1%
food drinks 1.2%