Human Generated Data

Title

Untitled (two photographs: studio portrait of man standing behind woman seated in chair with fur stole; studio portrait of three girls in black dresses and large white hair bows)

Date

c. 1935, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6108

Machine Generated Data

Tags (model confidence scores out of 100)

Amazon
created on 2019-11-16

Human 99.5
Person 99.5
Person 98.6
Person 98.5
Person 98.5
Person 97
Clothing 95.4
Coat 95.4
Apparel 95.4
Footwear 83.1
Shoe 83.1
Worker 78.8
People 78.3
Overcoat 75.5
Performer 72.6
Suit 70.9
Photo 64.2
Photography 64.2
Chair 63.8
Furniture 63.8
Hairdresser 63.1
Portrait 62.8
Face 62.8
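
The label list above matches the output of Amazon Rekognition's DetectLabels operation. A minimal sketch of reproducing such tags with boto3 follows; the image file name is hypothetical and AWS credentials are assumed to be configured in the environment.

import boto3

# Call DetectLabels on local image bytes; MinConfidence trims weak labels,
# comparable to the ~60+ scores shown in the list above.
client = boto3.client("rekognition")
with open("durette_studio_portrait.jpg", "rb") as f:  # hypothetical file name
    response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=60)
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))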

Clarifai
created on 2019-11-16

people 99.9
group 99.1
woman 98
man 96.8
room 95.4
adult 95.1
music 94.8
many 93.9
furniture 93.6
family 93.3
movie 92.7
wear 92
child 91.3
group together 90.1
theater 89.7
indoors 89.5
actor 87.4
outfit 87.2
several 86.9
opera 86.8
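
Clarifai concept tags like those above can be requested through its v2 predict REST endpoint, which returns confidences on a 0-1 scale (rendered above as percentages). A hedged sketch; the API key, model name, and image URL are all placeholders.

import requests

# POST an image URL to a Clarifai model and list the returned concepts.
resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_API_KEY"},  # placeholder key
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))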

Imagga
created on 2019-11-16

musical instrument 31
people 22.3
person 21.1
man 20.2
stringed instrument 19.3
kin 19.2
sax 17.9
wind instrument 15.8
male 15.6
adult 15
women 14.2
black 14.1
business 14
keyboard instrument 13.3
silhouette 13.2
businessman 12.4
window 12.2
fashion 12.1
room 12
men 12
urban 11.4
office 11.3
bowed stringed instrument 11.3
sexy 11.2
city 10.8
light 10.7
cello 10.7
accordion 10.6
body 10.4
outfit 10.3
sitting 10.3
dark 10
building 9.9
old 9.8
interior 9.7
chair 9.4
percussion instrument 9.4
portrait 9.1
human 9
one 9
style 8.9
couple 8.7
clothing 8.5
suit 8.2
indoor 8.2
dirty 8.1
looking 8
working 8
hair 7.9
love 7.9
walking 7.6
world 7.6
symbol 7.4
piano 7.4
lady 7.3
protection 7.3
danger 7.3
group 7.3
art 7.2
posing 7.1
architecture 7
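
Imagga reports tags such as these through its v2 tagging endpoint, authenticated with an API key/secret pair over HTTP Basic auth; confidences are already on a 0-100 scale. A sketch with placeholder credentials and image URL:

import requests

# GET tags for a hosted image; the auth values are placeholders.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)
for item in resp.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))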

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

clothing 95.3
wall 95.2
text 92.8
person 90.3
footwear 84.1
standing 84.1
old 78
white 75.8
black 75.5
man 74.3
posing 60.8
coat 52.1
clothes 20
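
Tags in this style come from Azure Computer Vision's analyze operation, which returns 0-1 confidences (shown above as percentages). A sketch assuming a provisioned resource; the endpoint and key are placeholders.

import requests

# Request only the Tags visual feature from the v3.2 analyze endpoint.
endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
resp = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},  # placeholder
    json={"url": "https://example.com/photo.jpg"},
)
for tag in resp.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))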

Color Analysis

Face analysis

AWS Rekognition

Age 8-18
Gender Female, 54.6%
Fear 45.1%
Confused 45.1%
Happy 45%
Calm 49.4%
Surprised 45%
Sad 50.2%
Angry 45.2%
Disgusted 45%

AWS Rekognition

Age 12-22
Gender Female, 53.2%
Calm 54.7%
Angry 45.1%
Disgusted 45%
Happy 45%
Sad 45.2%
Confused 45%
Fear 45%
Surprised 45%

AWS Rekognition

Age 9-19
Gender Female, 54.8%
Angry 45.1%
Calm 51%
Fear 45%
Disgusted 45%
Surprised 45%
Happy 45%
Sad 48.8%
Confused 45.1%

AWS Rekognition

Age 17-29
Gender Male, 52.5%
Disgusted 45.1%
Fear 45.1%
Confused 45.4%
Calm 47.9%
Surprised 45.1%
Angry 51.2%
Sad 45.1%
Happy 45%

AWS Rekognition

Age 22-34
Gender Male, 54.7%
Disgusted 45%
Surprised 45%
Happy 45%
Sad 45%
Angry 45%
Fear 45%
Calm 55%
Confused 45%
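
The five per-face readouts above, each with an age range, a gender guess, and one confidence score per emotion, match Rekognition's DetectFaces operation with full attributes enabled. A minimal sketch; the file name is hypothetical.

import boto3

# DetectFaces with Attributes=["ALL"] returns AgeRange, Gender, and Emotions
# for every face found in the image.
client = boto3.client("rekognition")
with open("durette_studio_portrait.jpg", "rb") as f:  # hypothetical file name
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])
for face in response["FaceDetails"]:
    print("Age {Low}-{High}".format(**face["AgeRange"]))
    print("Gender {Value}, {Confidence:.1f}%".format(**face["Gender"]))
    for emotion in face["Emotions"]:
        print(emotion["Type"].capitalize(), f"{emotion['Confidence']:.1f}%")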

Microsoft Cognitive Services

Age 38
Gender Male

Microsoft Cognitive Services

Age 28
Gender Female
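
The two single-number age and gender estimates above are characteristic of the Azure Face API detect call with face attributes requested, as it worked when this record was generated in 2019 (Microsoft has since restricted these attributes). A hedged sketch with placeholder endpoint and key:

import requests

# Detect faces and request the age and gender attributes.
endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
resp = requests.post(
    f"{endpoint}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},  # placeholder
    json={"url": "https://example.com/photo.jpg"},
)
for face in resp.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}, Gender {attrs['gender'].capitalize()}")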

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
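
Google Cloud Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores, which is why the four faces above read "Very unlikely" across the board. A sketch using the google-cloud-vision client; the file name is hypothetical and application credentials are assumed to be configured.

from google.cloud import vision

# Run face detection and print the likelihood bucket for each attribute.
client = vision.ImageAnnotatorClient()
with open("durette_studio_portrait.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())
response = client.face_detection(image=image)
for face in response.face_annotations:
    for name, value in [
        ("Surprise", face.surprise_likelihood),
        ("Anger", face.anger_likelihood),
        ("Sorrow", face.sorrow_likelihood),
        ("Joy", face.joy_likelihood),
        ("Headwear", face.headwear_likelihood),
        ("Blurred", face.blurred_likelihood),
    ]:
        print(name, vision.Likelihood(value).name)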

Feature analysis

Amazon

Person 99.5%
Coat 95.4%
Shoe 83.1%
Chair 63.8%
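
The short feature list above plausibly corresponds to the subset of DetectLabels results that carry bounding-box Instances, that is, objects Rekognition localized in the frame rather than scene-level tags. A sketch of filtering for them; the file name is hypothetical.

import boto3

# Keep only labels that come with at least one bounding-box instance.
client = boto3.client("rekognition")
with open("durette_studio_portrait.jpg", "rb") as f:  # hypothetical file name
    labels = client.detect_labels(Image={"Bytes": f.read()})["Labels"]
for label in labels:
    if label.get("Instances"):
        print(f"{label['Name']} {label['Confidence']:.1f}%")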