Human Generated Data

Title

Untitled (three pedestrians in profile)

Date

c. 1955 - c. 1975

People

Artist: Leon Levinstein, American, 1910 - 1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Susan and Neal Yanofsky, P2001.144

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Apparel 100
Clothing 100
Person 99.4
Human 99.4
Person 99.3
Person 97.7
Face 96.4
Hat 85.8
Cap 83.4
Sleeve 81.3
Coat 70.4
Overcoat 69.2
Photography 67.4
Portrait 67.4
Photo 67.4
Baseball Cap 64.4
Man 62.2
Beanie 60.7

Imagga
created on 2022-01-22

person 39.9
man 38.3
male 30
black 28.8
people 27.9
adult 25.9
portrait 20.7
clothing 16.4
business 15.8
suit 15.1
fashion 15.1
men 14.6
professional 14
guy 13.5
businessman 13.2
model 13.2
face 12.8
dress 12.7
attractive 12.6
handsome 12.5
pose 11.8
dark 11.7
studio 11.4
couple 11.3
human 11.2
style 11.1
love 11.1
casual 11
lifestyle 10.8
hand 10.6
hat 10.6
looking 10.4
performer 10.3
standing 9.6
hands 9.6
expression 9.4
alone 9.1
one 9
attendant 8.9
office 8.8
jacket 8.8
indoors 8.8
world 8.6
comedian 8.6
tie 8.5
clothes 8.4
modern 8.4
pretty 8.4
mask 8.2
indoor 8.2
sexy 8
planner 8
posing 8
smiling 8
happiness 7.8
boy 7.8
corporate 7.7
shirt 7.7
stand 7.6
coat 7.5
action 7.4
life 7.3
room 7.3
exercise 7.3
success 7.2
body 7.2

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

person 99.5
text 99
clothing 97.5
human face 94.7
man 89.3
standing 85.8
black and white 72.9
white 67.4
smile 62
fashion accessory 60.8
hat 51.6

Face analysis

AWS Rekognition

Age 31-41
Gender Male, 99.8%
Calm 98.2%
Angry 0.6%
Sad 0.4%
Surprised 0.3%
Disgusted 0.2%
Confused 0.1%
Happy 0.1%
Fear 0.1%

AWS Rekognition

Age 19-27
Gender Female, 98%
Calm 99.9%
Sad 0%
Surprised 0%
Confused 0%
Angry 0%
Disgusted 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 31-41
Gender Male, 99.9%
Calm 53.1%
Happy 19%
Surprised 18.5%
Confused 2.8%
Sad 2.7%
Angry 1.8%
Disgusted 1.4%
Fear 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

a couple of people that are standing in front of a window 76.6%
a group of people standing in front of a window 76.5%
a couple of people that are standing in a room 76.4%

Text analysis

Amazon

VEHICLE
REP
FLIME
NO.
I.G.RO.

Google

HO.
1.G.RO.
VCHICL
VCHICL HO. 1.G.RO. FLIM
FLIM