Human Generated Data

Title

Romana Javitz and an unidentified woman [either at the New York Public Library or the residence of Ben Shahn and Bernarda Bryson Shahn, Roosevelt, New Jersey]

Date

June 1964

People

Artist: Sol Libsohn, American, 1914-2001

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1973.77

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.6
Human 99.6
Piano 98.9
Musical Instrument 98.9
Leisure Activities 98.9
Person 98.2
Officer 82.7
Military Uniform 82.7
Military 82.7
Text 82.1
Clothing 78.1
Apparel 78.1
Overcoat 71
Coat 71

Clarifai
created on 2023-10-26

people 100
adult 98.9
administration 97.2
portrait 96.8
two 96.5
wear 95.9
man 95.2
three 93.3
furniture 93.3
group 93
woman 91.7
leader 90
one 89.9
outfit 89.3
room 88.7
concentration 87.2
war 87.1
group together 82.9
monochrome 82.9
medical practitioner 82.4

Imagga
created on 2022-01-22

military uniform 100
uniform 100
clothing 95.1
covering 65.5
consumer goods 65.4
commodity 32.1
man 31.6
male 25.5
people 25.1
work 22.7
adult 18.7
person 18.4
suit 17.1
portrait 15.5
business 15.2
job 15
black 15
home 14.3
smile 14.2
attractive 14
men 13.7
lifestyle 13.7
happy 13.1
professional 12.6
working 12.4
smiling 12.3
corporate 11.2
love 11
holding 10.7
fashion 10.5
human 10.5
pretty 10.5
helmet 10.2
occupation 10.1
laptop 10
worker 9.9
handsome 9.8
sexy 9.6
hand 9.1
office 8.8
computer 8.8
body 8.8
statue 8.6
face 8.5
room 8.4
one 8.2
equipment 8.2
businesswoman 8.2
industrial 8.2
looking 8
couple 7.8
happiness 7.8
model 7.8
industry 7.7
old 7.7
house 7.5
dark 7.5
technology 7.4
executive 7.4
success 7.2
dress 7.2
women 7.1

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

person 99.4
text 98.8
clothing 97.7
wall 96.4
man 92.3
posing 71.6
black and white 62.5

Face analysis

AWS Rekognition

Age 38-46
Gender Female, 98.6%
Sad 97.1%
Calm 2.2%
Fear 0.3%
Angry 0.3%
Confused 0.1%
Disgusted 0.1%
Happy 0%
Surprised 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Piano 98.9%

Text analysis

Amazon

P