Human Generated Data

Title

Untitled (group of six women and two boys in front of brick building)

Date

c. 1905-1915, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6000

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 99.6
Person 99.6
Person 99.3
Person 99.3
Person 98.9
Person 98.6
Person 98.3
Apparel 96.9
Clothing 96.9
Home Decor 88.8
Crowd 77.7
Advertisement 72.6
Sleeve 66.9
Long Sleeve 64.6
People 62.6
Coat 61.6
Overcoat 61.6
Suit 59.5
Collage 58.3
Poster 54.5
Person 51.5

Clarifai
created on 2019-11-16

people 100
adult 98.8
one 98.7
leader 98
wear 96.8
two 95.8
portrait 95.8
man 95.5
group 95.4
administration 95
outfit 94
woman 91.6
monochrome 91.4
facial expression 91.1
group together 88.1
outerwear 87.9
actor 86.5
actress 85.1
musician 84.4
music 84

Imagga
created on 2019-11-16

newspaper 30.8
pay-phone 28.5
old 24.4
telephone 23.9
product 23.7
call 20.3
vintage 19
creation 18.6
electronic equipment 18
book jacket 17.2
art 16.9
ancient 16.4
portrait 15.5
jacket 15.3
city 15
man 14.8
building 14.6
sculpture 14.3
people 13.9
black 13.8
statue 13.6
antique 13
person 12.6
history 12.5
architecture 12.5
male 12.1
aged 11.8
wall 11.1
symbol 10.8
equipment 10.7
one 10.5
culture 10.3
wrapping 10.2
religion 9.9
office 9.9
grunge 9.4
stone 9.2
house 9.2
outdoors 9
artistic 8.7
historical 8.5
dirty 8.1
business 7.9
urban 7.9
paintings 7.8
museum 7.8
travel 7.7
door 7.7
room 7.5
human 7.5
monument 7.5
tourism 7.4
covering 7.4
structure 7.4
historic 7.3
window 7.3
figure 7.2
face 7.1
businessman 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

clothing 99
text 98.8
person 98.2
window 85.7
man 82.8
old 77.3
white 69.3
black 68.8
human face 68.7
black and white 50.8
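Each tag list above pairs a label with a confidence score (0–100) from the named service. As a minimal sketch of how such output might be consumed, the hypothetical helper below filters label/confidence pairs by a minimum threshold; the function name and structure are illustrative, not part of any provider's API.

```python
# Hypothetical helper: keep only machine-generated (label, confidence)
# pairs that meet a minimum confidence, sorted from most to least
# confident. Sample data is transcribed from the Amazon list above.

def filter_tags(tags, min_confidence=90.0):
    """Return tags at or above the threshold, highest confidence first."""
    kept = [(label, conf) for label, conf in tags if conf >= min_confidence]
    return sorted(kept, key=lambda pair: pair[1], reverse=True)

amazon_tags = [
    ("Human", 99.6), ("Person", 99.6), ("Apparel", 96.9),
    ("Clothing", 96.9), ("Home Decor", 88.8), ("Crowd", 77.7),
]

print(filter_tags(amazon_tags))
# "Home Decor" (88.8) and "Crowd" (77.7) fall below the 90.0 cutoff.
```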

Face analysis

AWS Rekognition

Age 42-60
Gender Male, 53.4%
Calm 45.2%
Sad 52.7%
Angry 45.1%
Disgusted 45.2%
Happy 46.6%
Surprised 45%
Fear 45.1%
Confused 45%

AWS Rekognition

Age 22-34
Gender Male, 51.2%
Angry 45.6%
Calm 53.6%
Confused 45.1%
Disgusted 45%
Surprised 45%
Happy 45%
Sad 45.6%
Fear 45%

AWS Rekognition

Age 32-48
Gender Female, 96.5%
Sad 4.3%
Fear 0.7%
Disgusted 1.7%
Surprised 3.6%
Angry 1.2%
Calm 60.1%
Happy 26.7%
Confused 1.6%

AWS Rekognition

Age 21-33
Gender Female, 51.2%
Disgusted 45.1%
Fear 45%
Confused 45%
Calm 54.2%
Surprised 45%
Angry 45%
Sad 45.6%
Happy 45%

AWS Rekognition

Age 23-37
Gender Female, 53.6%
Angry 45%
Surprised 45%
Sad 45%
Happy 54.9%
Disgusted 45%
Confused 45%
Calm 45%
Fear 45%

AWS Rekognition

Age 21-33
Gender Male, 52.3%
Disgusted 45.1%
Calm 54.2%
Confused 45%
Sad 45.1%
Happy 45.3%
Angry 45.2%
Fear 45%
Surprised 45%

AWS Rekognition

Age 19-31
Gender Male, 54.1%
Confused 45%
Surprised 45%
Sad 45%
Disgusted 45%
Calm 45.3%
Fear 45%
Angry 45%
Happy 54.7%

AWS Rekognition

Age 19-31
Gender Male, 54.7%
Disgusted 45%
Fear 45%
Confused 45.1%
Angry 45%
Calm 45.7%
Surprised 45%
Sad 45.1%
Happy 54.1%

AWS Rekognition

Age 13-25
Gender Female, 54.5%
Disgusted 46.4%
Happy 45.9%
Surprised 45%
Sad 47.6%
Fear 45.1%
Calm 49.7%
Angry 45.2%
Confused 45.1%

Microsoft Cognitive Services

Age 44
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
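Each AWS Rekognition block above reports a confidence for every emotion on one detected face; several faces show a near-uniform floor around 45%, so the relative ordering appears more informative than the absolute values. The hypothetical helper below picks the dominant emotion from one face's scores; names are illustrative and not the Rekognition API itself.

```python
# Hypothetical helper: given one face's per-emotion confidences (as in
# the AWS Rekognition blocks above), return the most confident emotion.
# Sample scores are transcribed from the third face in the list.

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(scores.items(), key=lambda item: item[1])

face = {
    "Sad": 4.3, "Fear": 0.7, "Disgusted": 1.7, "Surprised": 3.6,
    "Angry": 1.2, "Calm": 60.1, "Happy": 26.7, "Confused": 1.6,
}

print(dominant_emotion(face))  # ('Calm', 60.1)
```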

Feature analysis

Amazon

Person 99.6%
Poster 54.5%

Captions

Microsoft

a vintage photo of a person 93.6%
a black and white photo of a person 91.1%
an old black and white photo of a person 90.5%

Text analysis

Amazon

runls

Google

AL
AL