Human Generated Data

Title

Untitled (slightly blurred studio portrait of young child seated in chair)

Date

c. 1905-1915, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6030

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Apparel 99.1
Clothing 99.1
Human 98.7
Person 98.7
Person 97.2
Person 97
Person 96.7
Overcoat 90.1
Footwear 87.3
Shoe 87.3
Suit 86.5
Coat 78.7
Sleeve 71.8
Long Sleeve 71
Animal 66.9
Bird 66.9
Accessories 62.3
Tie 62.3
Accessory 62.3
Advertisement 60.6
Collage 57.6
Door 56.1
Poster 55.2
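
The label-and-confidence pairs above are the standard output shape of Amazon Rekognition's DetectLabels API. A minimal sketch of such a call with boto3 follows; the file name photo.jpg, the region, and the thresholds are illustrative assumptions, not values from this record.

```python
import boto3

# Rekognition client; region and credentials are assumed to be configured.
client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,      # cap on returned labels
        MinConfidence=55,  # drop low-confidence labels, as in the list above
    )

# Print "Label confidence" pairs in the same shape as the tag list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```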

Clarifai
created on 2019-11-16

people 99.9
group 98.9
wear 97.3
adult 96.8
many 96.1
man 95.8
one 94.5
woman 94.1
two 93.9
outfit 92.4
administration 91.3
monochrome 90.2
vehicle 89.1
several 87.7
military 87.4
child 87.2
room 86.3
group together 86.2
furniture 85.7
outerwear 85.7
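
Concepts like these come from Clarifai's general prediction model. A sketch using the 2019-era Python client (clarifai 2.x) is below; the API key and file name are placeholders.

```python
from clarifai.rest import ClarifaiApp

app = ClarifaiApp(api_key="YOUR_API_KEY")          # placeholder key
model = app.public_models.general_model            # Clarifai's general model

response = model.predict_by_filename("photo.jpg")  # hypothetical file

# Concept values arrive in [0, 1]; scale to match the list above.
for concept in response["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))
```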

Imagga
created on 2019-11-16

barbershop 25.8
shop 24.1
black 21.7
window 19.1
man 18.8
mercantile establishment 18.3
person 16.1
old 16
vintage 15.7
architecture 15.6
male 15.6
building 14.8
people 13.9
chair 12.7
interior 12.4
room 12.3
place of business 12.2
wall 11.1
business 10.9
house 10.9
office 10.8
history 10.7
art 10.4
style 10.4
fashion 9.8
one 9.7
indoors 9.7
urban 9.6
home 9.6
light 9.4
glass 9.3
device 9.2
city 9.1
working 8.8
antique 8.7
ancient 8.6
dark 8.3
retro 8.2
dirty 8.1
adult 7.8
men 7.7
statue 7.6
television 7.6
historical 7.5
human 7.5
furniture 7.4
inside 7.4
historic 7.3
detail 7.2
dress 7.2
religion 7.2
portrait 7.1
businessman 7.1
travel 7
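
Imagga exposes tagging as a plain REST endpoint with HTTP basic auth. A sketch with requests is below; the key, secret, and file name are placeholders.

```python
import requests

API_KEY = "YOUR_KEY"       # placeholder credentials
API_SECRET = "YOUR_SECRET"

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

# Each entry carries a confidence and a language-keyed tag name.
for tag in response.json()["result"]["tags"]:
    print(tag["tag"]["en"], round(tag["confidence"], 1))
```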

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 98.4
black and white 93.2
clothing 92
person 89
man 74.5
gallery 62.4
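
Tags of this shape come from the Azure Computer Vision Analyze endpoint (v2.0 was current in 2019). A sketch of the raw REST call follows; the endpoint and key are placeholders.

```python
import requests

ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com"  # placeholder
KEY = "YOUR_KEY"                                              # placeholder

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = requests.post(
        f"{ENDPOINT}/vision/v2.0/analyze",
        params={"visualFeatures": "Tags"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )

# Confidences come back in [0, 1]; scale to match the list above.
for tag in response.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))
```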

Color analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 3-9
Gender Male, 95.5%
Disgusted 0.7%
Confused 2.7%
Angry 11.4%
Sad 4.9%
Happy 53.6%
Surprised 0.5%
Calm 25.8%
Fear 0.4%

AWS Rekognition

Age 22-34
Gender Male, 54.7%
Angry 45%
Calm 45.5%
Sad 54.4%
Happy 45%
Fear 45%
Confused 45%
Surprised 45%
Disgusted 45%

AWS Rekognition

Age 25-39
Gender Male, 53.9%
Fear 45.1%
Calm 45.2%
Disgusted 45.1%
Angry 45.1%
Surprised 45%
Happy 45.2%
Sad 54.2%
Confused 45.1%

AWS Rekognition

Age 25-39
Gender Female, 51.4%
Disgusted 45%
Sad 45.5%
Confused 45.1%
Happy 45.1%
Fear 45.1%
Surprised 45.2%
Calm 50.7%
Angry 48.4%

AWS Rekognition

Age 39-57
Gender Female, 50.7%
Fear 45.1%
Happy 51.3%
Calm 47.1%
Surprised 45.2%
Disgusted 45.6%
Angry 45.3%
Sad 45.3%
Confused 45.1%
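
The age range, gender, and per-emotion confidences in the five blocks above match the FaceDetails structure returned by Rekognition's DetectFaces when all attributes are requested. A minimal sketch, with a hypothetical file name:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )

# One FaceDetails entry per detected face, as in the five blocks above.
for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```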

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
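
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why these two blocks look different from the Rekognition ones. A sketch with the google-cloud-vision client (v2+ API surface); the file name is a placeholder.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # credentials assumed configured

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face annotation carries bucketed likelihoods, as listed above.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```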

Feature analysis

Amazon

Person 98.7%
Shoe 87.3%
Coat 78.7%
Bird 66.9%
Tie 62.3%
Poster 55.2%
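
This shorter list appears to be the tag set filtered to detections Rekognition can localize: labels whose response entries include Instances with bounding boxes. A sketch of that filtering, under the same assumptions as the earlier Rekognition example:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    labels = client.detect_labels(Image={"Bytes": f.read()})["Labels"]

# Only labels with Instances carry per-object bounding boxes, which is
# what a feature-analysis view can draw on the image.
for label in labels:
    if label.get("Instances"):
        print(f"{label['Name']} {label['Confidence']:.1f}%")
```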

Categories