Human Generated Data

Title

Untitled (studio portrait of young girl in black outfit with white belt, leaning back against chair)

Date

c. 1935, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6059

Machine Generated Data

Tags (label and confidence score, 0-100)

Amazon
created on 2019-05-30

Clothing 98.5
Apparel 98.5
Human 97.5
Person 97.5
Chair 94.9
Furniture 94.9
Poster 93.8
Advertisement 93.8
Collage 93.8
Person 93.6
Female 86.4
Person 83.5
Overcoat 80.7
Coat 80.7
Woman 73.1
Dress 68.3
Military Uniform 65.2
Military 65.2
Suit 61.3
Photography 55.3
Face 55.3
Photo 55.3
Portrait 55.3
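
A minimal sketch of how a label list like the Amazon tags above could be produced with Amazon Rekognition's DetectLabels operation via boto3; the bucket name and object key are placeholders, not the actual location of this image.

```python
# Sketch: request label/confidence pairs from Amazon Rekognition via boto3.
# Bucket name and object key are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "image.jpg"}},
    MaxLabels=25,
    MinConfidence=55.0,  # the list above bottoms out around 55
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```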

Clarifai
created on 2019-05-30

people 99.9
woman 97.7
wedding 96.4
child 95.7
wear 95.4
adult 94.5
furniture 94
movie 93.2
actress 92.8
group 92.7
monochrome 92.6
man 92.5
room 91.8
dress 91.8
seat 91.6
chair 90.5
veil 89.9
portrait 88.5
one 88.5
actor 88.2
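
A comparable sketch for the Clarifai concepts above, posting to Clarifai's v2 predict endpoint over HTTP; the model name, payload shape, and API key are assumptions based on Clarifai's public REST API, not a record of how these tags were actually generated.

```python
# Sketch: request concept/confidence pairs from Clarifai's general model.
# Model name, payload shape, API key, and image URL are placeholder assumptions.
import requests

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_CLARIFAI_API_KEY"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/image.jpg"}}}]},
)
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai returns values in 0-1; scale to match the 0-100 list above.
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```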

Imagga
created on 2019-05-30

wind instrument 45.6
cornet 45.1
musical instrument 43.2
brass 42
black 24.2
person 20.4
accordion 20
man 18.8
adult 17.5
keyboard instrument 16.3
portrait 16.2
fashion 15.8
people 15.1
male 14.9
dark 14.2
mask 13.4
posing 13.3
style 12.6
clothing 12.5
city 12.5
device 12.4
sexy 12
urban 11.4
elegance 10.9
dress 10.8
silhouette 10.8
one 10.4
sitting 9.4
model 9.3
bride 9.1
sax 9.1
sensuality 9.1
dance 8.8
couple 8.7
art 8.7
elegant 8.6
old 8.4
action 8.3
human 8.2
body 8
pretty 7.7
studio 7.6
fun 7.5
danger 7.3
pose 7.2
dirty 7.2
hair 7.1
night 7.1
weapon 7.1
modern 7
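
A similar sketch for the Imagga tags above, based on Imagga's public v2 tagging endpoint; the API credentials and image URL are placeholders.

```python
# Sketch: request tag/confidence pairs from Imagga's /v2/tags endpoint.
# API key, API secret, and image URL are placeholders.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/image.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # HTTP Basic auth
)
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```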

Google
created on 2019-05-30

Microsoft
created on 2019-05-30

wedding dress 94.3
dress 94.1
clothing 93.7
person 89.6
bride 87.4
black and white 85
woman 81.3
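
A sketch of how tags like the Microsoft list above could be requested from the Azure Computer Vision REST API; the resource endpoint, API version, and subscription key are placeholders, and the service's 0-1 confidences are scaled to match the list.

```python
# Sketch: request tag/confidence pairs from Azure Computer Vision.
# Endpoint host, API version, subscription key, and image URL are placeholders.
import requests

endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
resp = requests.post(
    f"{endpoint}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.org/image.jpg"},
)
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```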

Face analysis

AWS Rekognition

Age 4-7
Gender Female, 81.2%
Angry 0.6%
Calm 96.9%
Confused 0.4%
Disgusted 0.3%
Happy 0.5%
Sad 0.8%
Surprised 0.5%

AWS Rekognition

Age 26-43
Gender Female, 51.5%
Angry 45.2%
Surprised 45.1%
Confused 45.1%
Calm 53.9%
Happy 45%
Disgusted 45.1%
Sad 45.7%

AWS Rekognition

Age 23-38
Gender Male, 54.9%
Calm 53.6%
Disgusted 45.1%
Angry 45.2%
Happy 45.2%
Surprised 45.1%
Sad 45.5%
Confused 45.2%
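
The age-range, gender, and per-emotion estimates in the three blocks above are the shape of output Amazon Rekognition's DetectFaces operation returns when all face attributes are requested. A minimal sketch, with placeholder bucket and key names:

```python
# Sketch: request face attributes (age range, gender, emotions) from Rekognition.
# Bucket name and object key are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")
faces = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "image.jpg"}},
    Attributes=["ALL"],
)
for face in faces["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```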

Microsoft Cognitive Services

Age 4
Gender Female
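
A sketch of how an age and gender estimate like the one above could be requested from the Azure Face REST API of that era; the endpoint host, subscription key, and image URL are placeholders.

```python
# Sketch: request age and gender estimates from the Azure Face API (v1.0).
# Endpoint host, subscription key, and image URL are placeholders.
import requests

endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
resp = requests.post(
    f"{endpoint}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.org/image.jpg"},
)
for face in resp.json():
    attrs = face["faceAttributes"]
    print(f'Age {attrs["age"]:.0f}')
    print(f'Gender {attrs["gender"].title()}')
```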

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
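
The likelihood ratings above match the categorical values the Google Cloud Vision face-detection API returns per face. A minimal sketch using the google-cloud-vision client library (2.x-style calls; the image URI is a placeholder):

```python
# Sketch: request per-face likelihood ratings from Google Cloud Vision.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(source=vision.ImageSource(image_uri="https://example.org/image.jpg"))

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```

The API reports these as enum names such as VERY_UNLIKELY, which correspond to the "Very unlikely" ratings listed above.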

Feature analysis

Amazon

Person 97.5%
Chair 94.9%
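
The Person and Chair entries under Feature analysis look like Rekognition labels that are returned with localized bounding-box instances; filtering DetectLabels output for labels that include instances is one plausible way to reproduce such a list, though that interpretation is an assumption.

```python
# Sketch: keep only labels that Rekognition localizes with bounding-box instances.
# Bucket name and object key are placeholders; the filtering rule is an assumption.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")
response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "image.jpg"}}
)
for label in response["Labels"]:
    if label["Instances"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}%')
```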