Human Generated Data

Title

Untitled (old woman receiving award in living room)

Date

August 1956

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18091

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2019-11-16

Human 98.7
Person 98.7
Sitting 98.5
Person 97.9
Apparel 97.5
Clothing 97.5
Person 94.6
Coat 77.1
Overcoat 77.1
Accessories 71.4
Accessory 71.4
Tie 71.4
Furniture 68.3
Couch 67.6
Suit 66.9
Electronics 64.9
Screen 64.9
Indoors 64.4
Room 64.4
Crowd 64.1
Floor 62.4
Monitor 61.8
Display 61.8
People 59.2
Face 58.7
Female 57.4
Footwear 57.1
Shoe 57.1
Evening Dress 56.4
Gown 56.4
Fashion 56.4
Robe 56.4
Bar Counter 55.8
Pub 55.8

Clarifai
created on 2019-11-16

people 99.9
group 98.6
woman 98.5
adult 97.8
wear 95.9
man 95.5
outfit 94.1
actress 93.8
two 91.9
group together 91.7
dress 90.4
actor 89.4
wedding 89.4
chair 88.4
music 88.4
movie 87.6
furniture 85.7
administration 84.7
leader 84.6
child 83.6

Imagga
created on 2019-11-16

man 25.5
people 22.9
clothing 20.6
person 19.6
fashion 18.1
black 16.2
adult 16
military uniform 15.1
musical instrument 14.7
uniform 14.4
male 14.2
women 13.4
men 12.9
business 12.8
style 12.6
city 12.5
covering 11.9
wind instrument 11.8
outfit 11.7
dark 11.7
group 11.3
art 11.1
musician 11
performer 10.6
urban 10.5
mask 9.8
fun 9.7
businessman 9.7
sexy 9.6
world 9.2
silhouette 9.1
standing 8.7
youth 8.5
portrait 8.4
studio 8.4
consumer goods 8.3
back 8.3
human 8.2
music 8.2
dress 8.1
body 8
hair 7.9
accordion 7.8
model 7.8
play 7.8
sitting 7.7
hand 7.6
legs 7.5
vintage 7.4
street 7.4
protection 7.3
sensuality 7.3
team 7.2
oboe 7.1
face 7.1
singer 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

person 99.8
clothing 95.9
dress 91.8
text 90
woman 85.7
people 67.6
smile 58.1
wedding 51.8
man 51.4
dressed 47.4
old 42.7
clothes 25.8

Face analysis

AWS Rekognition

Age 55-73
Gender Female, 51.2%
Angry 45.3%
Happy 45.1%
Disgusted 45.1%
Fear 45.1%
Confused 46.5%
Surprised 45.2%
Calm 51.9%
Sad 45.9%

AWS Rekognition

Age 19-31
Gender Male, 85.8%
Fear 0.3%
Calm 7.7%
Angry 17%
Disgusted 0.3%
Confused 0.4%
Surprised 0.1%
Sad 73.8%
Happy 0.4%

AWS Rekognition

Age 30-46
Gender Female, 92.3%
Disgusted 5.3%
Angry 2.5%
Fear 1.2%
Happy 12.4%
Confused 2.3%
Sad 65.8%
Calm 10%
Surprised 0.5%

Microsoft Cognitive Services

Age 69
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.7%
Tie 71.4%
Suit 66.9%
Shoe 57.1%
