Human Generated Data

Title

Untitled (unidentified man)

Date

c. 1945-c. 1965

People

Artist: Sol Libsohn, American, 1914-2001

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1973.89

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Human 99.7
Person 99.7
Person 99.7
Person 98.9
Indoors 96.5
Room 96.5
Interior Design 96.4
Crowd 94
Audience 94
Person 91.7
Apparel 89.4
Overcoat 89.4
Suit 89.4
Clothing 89.4
Coat 89.4
Person 88.4
Court 68.9
Finger 66.8
Speech 66.8
Classroom 62.3
School 62.3
People 62.2
Lecture 56
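Each Amazon tag above pairs a label with a confidence score, and repeated labels such as "Person" typically correspond to separate detected instances. As a minimal illustrative sketch (using a plain list of pairs copied from this record, not the Rekognition API itself), such a list can be reduced to the unique labels above a chosen confidence threshold:

```python
# (label, confidence) pairs copied from the Amazon tag list above;
# the 90.0 threshold is an arbitrary choice for illustration.
labels = [
    ("Human", 99.7), ("Person", 99.7), ("Person", 99.7), ("Person", 98.9),
    ("Indoors", 96.5), ("Room", 96.5), ("Interior Design", 96.4),
    ("Crowd", 94.0), ("Audience", 94.0), ("Person", 91.7),
    ("Suit", 89.4), ("Court", 68.9), ("Lecture", 56.0),
]

def confident_labels(pairs, threshold=90.0):
    """Return the unique label names at or above the threshold,
    preserving their first-seen order."""
    seen = []
    for name, score in pairs:
        if score >= threshold and name not in seen:
            seen.append(name)
    return seen

print(confident_labels(labels))
# ['Human', 'Person', 'Indoors', 'Room', 'Interior Design', 'Crowd', 'Audience']
```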

Imagga
created on 2022-01-22

man 50.5
male 42.6
person 40.6
businessman 33.6
portrait 31.7
people 27.9
suit 27.4
business 26.1
adult 25.9
handsome 25
microphone 24.3
face 22
phone 21.2
office 20.9
looking 20.8
harmonica 20.7
professional 20.6
corporate 20.6
human 19.5
black 19.4
executive 19.4
one 17.9
manager 16.8
free-reed instrument 16.5
tie 16.1
mobile 16
world 16
call 15.6
men 15.5
expression 15.4
talking 15.2
communication 15.1
wind instrument 15
cellphone 14.6
happy 14.4
serious 14.3
look 14
shirt 14
smiling 13.8
confident 13.7
attractive 13.3
senior 13.1
mature 13
speaker 12.9
guy 12.7
old 12.6
groom 12.5
cell 12.4
telephone 12.4
smile 12.1
success 12.1
hair 11.9
work 11.8
lifestyle 11.6
holding 11.6
hand 11.4
sitting 11.2
casual 11
successful 11
alone 11
job 10.6
couple 10.5
eyes 10.3
scholar 10.3
love 10.3
happiness 10.2
musical instrument 10.1
dark 10
electrical device 9.9
modern 9.8
articulator 9.8
talk 9.6
eye 8.9
businesspeople 8.6
head 8.4
outdoors 8.2
intellectual 8.2
device 8.2
building 8
outside 7.7
jacket 7.7
close 7.4
beard 7.3
friendly 7.3
communicator 7.3
indoor 7.3
worker 7.1
model 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

person 99.9
man 99.6
human face 98.4
wall 96.9
indoor 95.8
text 95.1
glasses 94.5
clothing 92.2
looking 83.9
electronics 81
suit 79.3
staring 17.8
crowd 0.9

Face analysis

Amazon

Google

AWS Rekognition

Age 50-58
Gender Male, 100%
Calm 84.5%
Surprised 12.9%
Fear 0.8%
Sad 0.5%
Confused 0.5%
Disgusted 0.4%
Angry 0.3%
Happy 0.1%

AWS Rekognition

Age 24-34
Gender Female, 64.1%
Calm 82.7%
Sad 5%
Disgusted 4.7%
Surprised 2.2%
Fear 2.2%
Happy 1.6%
Angry 1.4%
Confused 0.2%
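Each AWS Rekognition face entry above reports a full emotion distribution rather than a single label. A minimal sketch of reducing such a distribution to its dominant emotion (the percentages are copied from the first face entry in this record; the helper is illustrative, not part of the Rekognition API):

```python
# Emotion percentages copied from the first AWS Rekognition face entry.
face_1 = {
    "Calm": 84.5, "Surprised": 12.9, "Fear": 0.8, "Sad": 0.5,
    "Confused": 0.5, "Disgusted": 0.4, "Angry": 0.3, "Happy": 0.1,
}

def dominant_emotion(emotions):
    """Return (name, confidence) for the highest-scoring emotion."""
    return max(emotions.items(), key=lambda item: item[1])

print(dominant_emotion(face_1))
# ('Calm', 84.5)
```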

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Suit 89.4%

Captions

Microsoft

a man standing in front of a mirror 75%
a man standing in front of a mirror posing for the camera 63.7%
a man that is standing in front of a mirror 63.6%