Human Generated Data

Title

Untitled (two photographs: studio portrait of man in chair with book, woman standing; studio portrait of girl with baby doll in carriage)

Date

c. 1935, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6064

Machine Generated Data

Tags

Each entry below pairs a label with the model's confidence score (0-100).

Amazon
created on 2019-05-30

Furniture 100
Chair 100
Person 99.2
Human 99.2
Person 98.3
Person 96.1
Clothing 93.6
Apparel 93.6
Person 91
Coat 76.4
Footwear 68.4
Shoe 68.4
Face 67.9
Portrait 61.9
Photography 61.9
Photo 61.9
Indoors 61.7
Overcoat 56.8
Bicycle 56.4
Vehicle 56.4
Transportation 56.4
Bike 56.4
Room 55.1
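
Label/score lists like the Amazon block above are typical DetectLabels output from AWS Rekognition. As a minimal sketch of how such tags could be produced (assuming boto3 is installed, AWS credentials are configured, and photo.jpg is a stand-in for the catalogued image):

```python
import boto3

# Rekognition client; the region is an assumption.
client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # suppress low-confidence labels
    )

# Each label pairs a name with a 0-100 confidence score,
# matching the "Furniture 100 ... Room 55.1" list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```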

Clarifai
created on 2019-05-30

people 99.9
group 99.5
woman 99
adult 98
music 96.7
man 95.9
wear 95.7
room 95.1
furniture 95
musician 92.2
group together 91.8
child 91.4
movie 89.3
many 88.6
family 88.3
actress 87.9
portrait 87.7
stage 87.3
outfit 86.5
three 86.1

Imagga
created on 2019-05-30

man 27
people 23.4
male 22.7
business 22.5
black 21.8
person 21.3
businessman 21.2
adult 20.8
office 16.9
suit 16.1
attractive 14.7
laptop 13.6
portrait 13.6
fashion 13.6
briefcase 13.1
couple 13.1
chair 12.9
sexy 12.9
love 12.6
model 12.4
room 12.4
corporate 12
professional 11.9
women 11.9
silhouette 11.6
executive 11.4
lady 11.4
passion 11.3
looking 11.2
style 11.1
job 10.6
computer 10.5
one 10.5
boy 10.4
career 10.4
body 10.4
sitting 10.3
work 10.2
lifestyle 10.1
indoor 10
human 9.7
success 9.7
tie 9.5
men 9.4
perfume 9.3
support 9.3
elegance 9.2
device 9.2
dark 9.2
alone 9.1
pretty 9.1
posing 8.9
working 8.8
happy 8.8
hair 8.7
clothing 8.7
elegant 8.6
relax 8.4
world 8.3
sensual 8.2
businesswoman 8.2
light 8
smile 7.8
window 7.8
youth 7.7
boss 7.7
toiletry 7.6
call 7.6
telephone 7.3
dress 7.2
romance 7.1
face 7.1
interior 7.1
happiness 7.1
building 7

Google
created on 2019-05-30

Photograph 95.3
Black-and-white 78.1
Photography 70.6
Art 67.7
Room 65.7
Gentleman 57.3
Visual arts 55
Sitting 54.1

Microsoft
created on 2019-05-30

clothing 97.2
person 94.9
black and white 94.6
furniture 89
man 88.1
footwear 83.5
chair 70.1
posing 61.2
suit 57.3
monochrome 54.5

Face analysis

AWS Rekognition

Age 11-18
Gender Female, 51.7%
Happy 45%
Disgusted 45%
Surprised 45.1%
Confused 45.1%
Sad 45.5%
Angry 45.1%
Calm 54.1%

AWS Rekognition

Age 20-38
Gender Male, 55%
Calm 53.7%
Sad 45.3%
Angry 45.2%
Confused 45.5%
Disgusted 45%
Surprised 45.1%
Happy 45.1%

AWS Rekognition

Age 17-27
Gender Male, 54.5%
Surprised 45.3%
Disgusted 45.2%
Angry 45.9%
Sad 46.3%
Happy 45%
Calm 51.8%
Confused 45.4%

AWS Rekognition

Age 20-38
Gender Female, 54.8%
Disgusted 45%
Surprised 45.2%
Sad 50.1%
Happy 45%
Calm 49%
Confused 45.2%
Angry 45.4%
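
The four AWS Rekognition blocks above follow the shape of the DetectFaces response: one entry per detected face, each with an age range, a gender estimate, and per-emotion confidences. A minimal sketch under the same boto3 and credential assumptions as before:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

# One FaceDetails entry per face; the fields mirror the blocks above.
for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:  # types come back uppercase, e.g. HAPPY
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```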

Microsoft Cognitive Services

Age 28
Gender Male

Microsoft Cognitive Services

Age 25
Gender Female

Microsoft Cognitive Services

Age 22
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
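
Google Vision reports likelihood buckets (VERY_UNLIKELY through VERY_LIKELY, rendered above as "Very unlikely") rather than numeric scores. A minimal sketch using the google-cloud-vision client library, assuming its credentials are configured and photo.jpg is again a placeholder:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# One annotation per face; the likelihood enums match the fields listed above.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```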

Feature analysis

Amazon

Person 99.2%
Shoe 68.4%
Bicycle 56.4%

Categories

Imagga

interior objects 70.3%
food drinks 29%