Human Generated Data

Title

Untitled (two photographs: man with chest of drawers on dolly, seen from behind; three boys in living room, blurred, all with arms crossed)

Date

c. 1935, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6065

Machine Generated Data

Tags

Amazon
created on 2019-05-30

Person 99.5
Human 99.5
Person 99.5
Apparel 98.1
Clothing 98.1
Person 97.9
Person 90.8
Person 90.1
Face 76.7
Coat 69.8
Overcoat 69.8
People 66.4
Pants 64.5
Wheel 62.2
Machine 62.2
Suit 60.6
Sleeve 60.4
Workshop 55.9
Screen 55.8
LCD Screen 55.8
Monitor 55.8
Electronics 55.8
Display 55.8

Clarifai
created on 2019-05-30

people 100
group 98.8
adult 98.2
woman 96.7
room 96.6
group together 96.3
man 96.1
two 95.1
furniture 93.9
child 92.4
wear 92
administration 90.4
street 89.8
four 89.4
three 88.8
war 88.8
actor 86.2
indoors 85.7
music 85.5
several 85.4

Imagga
created on 2019-05-30

shop 34.5
mercantile establishment 24.4
barbershop 21.6
people 20.6
man 19.5
city 19.1
stall 18.9
black 17.5
urban 16.6
men 16.3
place of business 16.3
window 14.7
passenger 14.4
adult 14.2
building 14.1
business 14
old 13.2
street 12.9
person 12.3
safety 12
walking 11.4
architecture 10.9
male 10.6
human 10.5
life 10.4
art 10.4
motion 10.3
industry 10.2
travel 9.9
interior 9.7
statue 9.6
vintage 9.1
industrial 9.1
one 9
shoe shop 8.8
walk 8.6
inside 8.3
tourism 8.2
clothing 8.2
establishment 8.2
sidewalk 8.1
worker 8
indoors 7.9
work 7.8
modern 7.7
crowd 7.7
grunge 7.7
fashion 7.5
blur 7.4
light 7.3
metal 7.2
dirty 7.2
religion 7.2
history 7.2
portrait 7.1
women 7.1
working 7.1
steel 7.1

Google
created on 2019-05-30

Microsoft
created on 2019-05-30

clothing 98.4
person 97.7
man 87.1
footwear 85.2
black and white 60.5
store 54.8
old 45.2
posing 36.8

Face analysis

AWS Rekognition

Age 9-14
Gender Female, 53.6%
Happy 54.5%
Disgusted 45%
Confused 45%
Calm 45.1%
Angry 45%
Sad 45.3%
Surprised 45.1%

AWS Rekognition

Age 20-38
Gender Male, 53.6%
Disgusted 45.1%
Surprised 45.3%
Angry 45.6%
Sad 45.9%
Happy 45.1%
Confused 45.3%
Calm 52.7%

AWS Rekognition

Age 6-13
Gender Male, 50.7%
Disgusted 45.1%
Calm 46%
Confused 45.1%
Surprised 45.2%
Angry 45.3%
Sad 53.1%
Happy 45.2%

Microsoft Cognitive Services

Age 20
Gender Female

Feature analysis

Amazon

Person 99.5%
Wheel 62.2%