Human Generated Data

Title

Untitled (woman chewing bubblegum, street scene)

Date

1970s

People

Artist: Leon Levinstein, American, 1910–1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Susan and Neal Yanofsky, P2001.145

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Hair 99.6
Human 99.5
Person 99.5
Person 97.6
Person 97.2
Clothing 87
Apparel 87
Shoe 66.6
Footwear 66.6
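
The label/confidence pairs above are the kind of output returned by Amazon Rekognition's DetectLabels operation. A minimal sketch of how such tags might be regenerated with boto3; the bucket and object names below are hypothetical placeholders, not taken from this record:

    # Sketch: label/confidence pairs like "Hair 99.6" via Amazon Rekognition
    # DetectLabels. Bucket and object key are hypothetical placeholders.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "levinstein-untitled.jpg"}},
        MaxLabels=20,
        MinConfidence=50,
    )

    for label in response["Labels"]:
        # Prints lines in the same "Name Confidence" form as the list above.
        print(f'{label["Name"]} {label["Confidence"]:.1f}')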

Clarifai
created on 2023-10-26

people 100
portrait 98.5
adult 98.2
man 97.7
wear 97.5
music 97.5
street 97.2
outfit 96.9
costume 96.7
group 95.2
two 94.2
musician 93.9
one 93.8
outerwear 93
woman 90
movie 87.2
actor 86.8
three 86.5
veil 85.8
leader 84.9

Imagga
created on 2022-01-22

man 30.9
male 27.7
person 26
people 23.4
armor 23.1
soldier 22.5
adult 22.2
protective covering 22.1
military 21.2
weapon 21
clothing 20.7
protection 20
body armor 18.9
mask 18.6
chain mail 18.5
war 17.3
black 15.8
gun 15.2
fashion 15.1
city 15
portrait 14.9
urban 14.8
men 14.6
safety 13.8
covering 13.7
danger 13.6
industrial 13.6
human 13.5
street 12.9
helmet 12.1
world 11.9
shield 11.7
warrior 11.7
breastplate 11.4
women 11.1
uniform 11
dress 10.8
style 10.4
armor plate 10.3
industry 10.2
sword 9.8
camouflage 9.8
two 9.3
power 9.2
statue 9.1
holding 9.1
dirty 9
outdoors 9
job 8.8
body 8.8
hair 8.7
head 8.4
old 8.4
costume 8.3
suit 8.1
musician 8.1
history 8
metal 8
business 7.9
battle 7.8
standing 7.8
disaster 7.8
army 7.8
rifle 7.7
outdoor 7.6
equipment 7.6
shop 7.6
dark 7.5
leisure 7.5
smoke 7.4
occupation 7.3
stylish 7.2
sexy 7.2
lifestyle 7.2
work 7.1
travel 7
architecture 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

person 99.6
clothing 98.7
outdoor 94.9
text 94.6
black and white 88
human face 87.1
woman 85.5
footwear 69.7
man 62.7
smile 57.9

Color Analysis

Face analysis

AWS Rekognition

Age 49-57
Gender Male, 100%
Calm 85.6%
Angry 9%
Confused 2.6%
Fear 0.8%
Surprised 0.7%
Sad 0.7%
Happy 0.4%
Disgusted 0.2%

AWS Rekognition

Age 23-33
Gender Female, 99.8%
Calm 58.3%
Angry 11.4%
Disgusted 9.3%
Fear 7.7%
Happy 4.6%
Sad 4.1%
Surprised 3.1%
Confused 1.5%
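
The two AWS Rekognition face records above (age range, gender, ranked emotion scores) follow the shape of the DetectFaces response. A minimal sketch, assuming a local copy of the image under a hypothetical file name:

    # Sketch: age range, gender, and emotion scores via Amazon Rekognition
    # DetectFaces. The file name is a hypothetical placeholder.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("levinstein-untitled.jpg", "rb") as f:
        response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions come back unordered; sort by confidence to match the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')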

Microsoft Cognitive Services

Age 29
Gender Female

Microsoft Cognitive Services

Age 58
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
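
The likelihood ratings above (Very unlikely, Unlikely, and so on) correspond to the enum values returned by Google Cloud Vision face detection. A minimal sketch with the Python client; the file name is a hypothetical placeholder:

    # Sketch: per-face likelihoods (surprise, anger, sorrow, joy, headwear, blur)
    # via Google Cloud Vision face detection. The file name is a placeholder.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("levinstein-untitled.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Each likelihood is an enum such as VERY_UNLIKELY or UNLIKELY.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)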

Feature analysis

Amazon

Person 99.5%
Shoe 66.6%

Categories

Text analysis

Amazon

TEL
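
The detected string above ("TEL") is the kind of result returned by Amazon Rekognition's DetectText operation. A minimal sketch, again with a hypothetical file name:

    # Sketch: detected text lines such as "TEL" via Amazon Rekognition DetectText.
    # The file name is a hypothetical placeholder.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("levinstein-untitled.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], f'{detection["Confidence"]:.1f}%')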