Human Generated Data

Title

Untitled (unidentified man)

Date

c. 1945-c. 1965

People

Artist: Sol Libsohn, American, 1914–2001

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1973.89

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.7
Human 99.7
Person 99.7
Person 98.9
Room 96.5
Indoors 96.5
Interior Design 96.4
Audience 94
Crowd 94
Person 91.7
Suit 89.4
Clothing 89.4
Coat 89.4
Overcoat 89.4
Apparel 89.4
Person 88.4
Court 68.9
Finger 66.8
Speech 66.8
Classroom 62.3
School 62.3
People 62.2
Lecture 56

Clarifai
created on 2023-10-26

people 99.9
adult 99
portrait 98.8
group 98.7
leader 98.6
man 97.9
administration 97.1
microphone 94.7
chair 92.2
league 91.1
room 90.2
two 90
one 89.3
speaker 89.2
meeting 87.4
furniture 87.1
stump 86.7
three 86.1
group together 83.1
politician 82.7

Imagga
created on 2022-01-22

man 50.5
male 42.6
person 40.6
businessman 33.6
portrait 31.7
people 27.9
suit 27.4
business 26.1
adult 25.9
handsome 25
microphone 24.3
face 22
phone 21.2
office 20.9
looking 20.8
harmonica 20.7
professional 20.6
corporate 20.6
human 19.5
black 19.4
executive 19.4
one 17.9
manager 16.8
free-reed instrument 16.5
tie 16.1
mobile 16
world 16
call 15.6
men 15.5
expression 15.4
talking 15.2
communication 15.1
wind instrument 15
cellphone 14.6
happy 14.4
serious 14.3
look 14
shirt 14
smiling 13.8
confident 13.7
attractive 13.3
senior 13.1
mature 13
speaker 12.9
guy 12.7
old 12.6
groom 12.5
cell 12.4
telephone 12.4
smile 12.1
success 12.1
hair 11.9
work 11.8
lifestyle 11.6
holding 11.6
hand 11.4
sitting 11.2
casual 11
successful 11
alone 11
job 10.6
couple 10.5
eyes 10.3
scholar 10.3
love 10.3
happiness 10.2
musical instrument 10.1
dark 10
electrical device 9.9
modern 9.8
articulator 9.8
talk 9.6
eye 8.9
businesspeople 8.6
head 8.4
outdoors 8.2
intellectual 8.2
device 8.2
building 8
outside 7.7
jacket 7.7
close 7.4
beard 7.3
friendly 7.3
communicator 7.3
indoor 7.3
worker 7.1
model 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

person 99.9
man 99.6
human face 98.4
wall 96.9
indoor 95.8
text 95.1
glasses 94.5
clothing 92.2
looking 83.9
electronics 81
suit 79.3
staring 17.8
crowd 0.9

Color Analysis

Face analysis

AWS Rekognition

Age 50-58
Gender Male, 100%
Calm 84.5%
Surprised 12.9%
Fear 0.8%
Sad 0.5%
Confused 0.5%
Disgusted 0.4%
Angry 0.3%
Happy 0.1%

AWS Rekognition

Age 24-34
Gender Female, 64.1%
Calm 82.7%
Sad 5%
Disgusted 4.7%
Surprised 2.2%
Fear 2.2%
Happy 1.6%
Angry 1.4%
Confused 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Suit 89.4%

Categories