Human Generated Data

Title

Untitled (two photographs: studio portrait of woman holding flowers, seen from rear three-quarter angle; studio portrait of two women seated around standing woman)

Date

c. 1935, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6067

Machine Generated Data

Tags

Amazon
created on 2019-05-30

Human 99.3
Person 99.3
Person 98.9
Person 98.7
Person 97.6
Apparel 94.7
Clothing 94.7
Shop 82.8
Window Display 81.2
Coat 70
Overcoat 70
People 68.6
Suit 57.8
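
The Amazon tags above are consistent with AWS Rekognition label detection, which returns label/confidence pairs on a 0-100 scale. A minimal sketch of how similar labels could be produced with boto3 follows; the local file name and the MinConfidence threshold are illustrative assumptions, not values taken from this record.

import boto3

# Assumes AWS credentials are already configured in the environment.
client = boto3.client("rekognition")

# Hypothetical local file name; the museum image itself is not bundled here.
with open("durette_studio_portrait.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50,  # keep only labels scored at 50 or higher
)

# Print each label with its confidence, mirroring the "Person 99.3" style above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")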

Clarifai
created on 2019-05-30

people 100
group 99.3
wear 98.8
adult 97.7
outfit 97.4
uniform 96.1
woman 95.8
man 95.8
military 95.7
music 94.9
group together 94.3
furniture 93.3
room 93.2
musician 93.2
military uniform 93
administration 92
actress 90.9
several 90.3
medical practitioner 88.9
two 88
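
The Clarifai concepts above resemble the output of Clarifai's public general prediction model. A rough sketch against Clarifai's v2 REST API follows; the API key, the image URL, and the model ID are placeholders or assumptions, not values documented in this record.

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # placeholder credential
IMAGE_URL = "https://example.org/durette_studio_portrait.jpg"  # hypothetical URL

# "aaa03c23b3724a16a56b629203edc62c" is the ID of Clarifai's public general
# model; whether that exact model produced the tags above is an assumption.
MODEL_URL = "https://api.clarifai.com/v2/models/aaa03c23b3724a16a56b629203edc62c/outputs"

resp = requests.post(
    MODEL_URL,
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
resp.raise_for_status()

# Concepts carry a 0-1 score; scale by 100 to match the "people 100" style above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")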

Imagga
created on 2019-05-30

musical instrument 34.4
brass 31.8
wind instrument 31.3
people 20.6
man 19.5
fashion 16.6
adult 16
black 15.1
person 15
old 14.6
male 14.2
interior 14.1
window 14
dress 13.5
cornet 12.6
room 12
men 12
kin 11.9
style 11.9
portrait 11.6
couple 11.3
business 10.9
model 10.9
vintage 10.7
outfit 10.5
women 10.3
elegance 10.1
chair 10
stringed instrument 9.9
building 9.8
human 9.7
sexy 9.6
hair 9.5
device 9.4
happiness 9.4
indoor 9.1
trombone 9
posing 8.9
businessman 8.8
urban 8.7
wall 8.5
culture 8.5
art 8.5
pretty 8.4
happy 8.1
ancient 7.8
banjo 7.8
modern 7.7
attractive 7.7
bride 7.7
dark 7.5
clothes 7.5
city 7.5
clothing 7.4
silhouette 7.4
holding 7.4
inside 7.4
office 7.3
smile 7.1
indoors 7
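
The Imagga tags above have the shape of Imagga's /v2/tags endpoint, which returns tag/confidence pairs on a 0-100 scale over HTTP basic auth. A minimal sketch follows; the key, secret, and image URL are placeholders.

import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder credentials
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/durette_studio_portrait.jpg"  # hypothetical URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # basic auth with the Imagga key/secret pair
    timeout=30,
)
resp.raise_for_status()

# Each entry carries a 0-100 confidence, matching "musical instrument 34.4" above.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")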

Google
created on 2019-05-30

Microsoft
created on 2019-05-30

wall 97.7
clothing 96.3
person 96.1
man 75.9
old 74.1
sketch 67.8
gallery 62.5
drawing 60.4
footwear 54.3
store 30.5
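
The Microsoft tags above match the shape of the Azure Computer Vision tagging endpoint, which returns tag names with 0-1 confidences. A rough REST sketch follows; the resource endpoint, key, API version, and image URL are placeholders and may differ from whatever produced this record.

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder
IMAGE_URL = "https://example.org/durette_studio_portrait.jpg"   # hypothetical URL

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
resp.raise_for_status()

# Scale the 0-1 confidence to match the "wall 97.7" style above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")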

Color Analysis

Face analysis

AWS Rekognition

Age 26-43
Gender Female, 52.8%
Disgusted 45.1%
Happy 45.1%
Sad 45.4%
Calm 53.1%
Angry 45.2%
Surprised 45.5%
Confused 45.6%

AWS Rekognition

Age 20-38
Gender Female, 54.9%
Disgusted 45%
Calm 54.6%
Confused 45.1%
Surprised 45.1%
Angry 45.1%
Sad 45.1%
Happy 45%

AWS Rekognition

Age 20-38
Gender Female, 53.9%
Happy 45.2%
Surprised 45.6%
Angry 45.7%
Disgusted 45.2%
Sad 47.1%
Confused 45.7%
Calm 50.5%

AWS Rekognition

Age 20-38
Gender Male, 52%
Calm 45.8%
Sad 52.9%
Angry 45.8%
Confused 45.2%
Disgusted 45.1%
Surprised 45.1%
Happy 45%
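
The four AWS Rekognition face readings above (age range, gender, and per-emotion scores) correspond to Rekognition's DetectFaces operation with all facial attributes requested. A minimal boto3 sketch follows; the local file name is an illustrative assumption.

import boto3

client = boto3.client("rekognition")

# Hypothetical local file; the museum image itself is not included here.
with open("durette_studio_portrait.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

# Report each detected face in the same shape as the entries above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")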

Microsoft Cognitive Services

Age 37
Gender Female

Microsoft Cognitive Services

Age 26
Gender Female

Microsoft Cognitive Services

Age 30
Gender Female

Microsoft Cognitive Services

Age 28
Gender Female
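
The four Microsoft Cognitive Services readings above (estimated age and gender per face) match the Face API detect operation with age and gender attributes requested. A rough REST sketch follows; the endpoint, key, and image URL are placeholders, and newer Face API versions restrict access to these attributes.

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder
IMAGE_URL = "https://example.org/durette_studio_portrait.jpg"   # hypothetical URL

resp = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
resp.raise_for_status()

# Each detected face carries an estimated age and gender, as listed above.
for face in resp.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}")
    print(f"Gender {attrs['gender'].capitalize()}")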

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
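
The Google Vision ratings above (surprise, anger, sorrow, joy, headwear, blur, each as a likelihood bucket) are what Cloud Vision face detection returns per face. A minimal sketch with the google-cloud-vision client follows; the local file name is an illustrative assumption and credentials are assumed to be configured in the environment.

from google.cloud import vision

# Assumes GOOGLE_APPLICATION_CREDENTIALS points at a valid service account key.
client = vision.ImageAnnotatorClient()

# Hypothetical local file; the museum image itself is not included here.
with open("durette_studio_portrait.jpg", "rb") as f:
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

# Likelihoods are enum values such as VERY_UNLIKELY, matching the
# "Very unlikely" ratings listed above.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)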

Feature analysis

Amazon

Person 99.3%

Categories

Imagga

interior objects 98.7%