Human Generated Data

Title

Untitled (double reversed portrait of man standing against brick wall)

Date

c. 1905-1915, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6002

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Clarifai
created on 2019-11-16

people 99.6
adult 97.3
man 95.2
group 94.9
room 94
furniture 93.1
woman 92.3
wear 89.6
child 88.8
one 86.7
indoors 85.9
two 85.5
chair 83.9
group together 82.2
vehicle 78.1
actor 76.9
music 76.4
boy 74.2
family 73.9
sit 73.5

Imagga
created on 2019-11-16

banjo 56.5
stringed instrument 47.8
musical instrument 36.5
room 21
toilet 19.1
old 15.3
man 14.8
dirty 14.5
black 14.4
window 13.3
people 12.3
building 11.9
person 11.6
urban 11.4
grunge 11.1
danger 10.9
dark 10.9
city 10.8
light 10.7
male 10.6
wall 10.3
street 10.1
vintage 9.9
art 9.1
one 9
mask 8.7
architecture 8.6
space 8.5
portrait 8.4
adult 8.4
future 8.4
house 8.4
silhouette 8.3
water 8
body 8
women 7.9
shovel 7.8
youth 7.7
outdoor 7.6
damaged 7.6
human 7.5
wheeled vehicle 7.4
sport 7.4
door 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 95.5
black and white 79.8
cartoon 77.5
clothing 56.8
footwear 55.8

Color Analysis

Face analysis

AWS Rekognition

Age 31-47
Gender Male, 95.9%
Disgusted 1.3%
Calm 86.4%
Angry 1.7%
Confused 2.6%
Fear 2.2%
Sad 2%
Surprised 1.5%
Happy 2.2%

AWS Rekognition

Age 32-48
Gender Male, 54.8%
Confused 45%
Fear 45%
Calm 54.6%
Happy 45.1%
Sad 45.1%
Angry 45.1%
Surprised 45%
Disgusted 45%

AWS Rekognition

Age 22-34
Gender Male, 53.8%
Disgusted 45%
Calm 54.2%
Confused 45.1%
Sad 45.1%
Happy 45.2%
Angry 45.2%
Fear 45%
Surprised 45.1%

Microsoft Cognitive Services

Age 28
Gender Male

Feature analysis

Amazon

Person 98.1%

Categories

Imagga

paintings art 97.4%
interior objects 1.4%