Human Generated Data

Title

Untitled (women watching man with snake around his neck)

Date

1952

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10630

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.6
Human 99.6
Person 99.1
Clothing 98.6
Apparel 98.6
Person 96.6
Hat 90.2
Female 83.1
Door 76.7
Shorts 73.5
Face 72.4
Dress 66.8
Woman 64.7
Girl 63.8
People 63.6
Shelter 63
Nature 63
Outdoors 63
Countryside 63
Rural 63
Building 63
Urban 59
Photography 55.7
Photo 55.7

Clarifai
created on 2023-10-25

people 99.8
adult 97.2
monochrome 96.6
woman 96.3
two 96.2
administration 94.7
three 94.5
group 93.8
child 93.8
group together 93.6
furniture 89.7
man 89.5
room 87.9
war 86.3
one 84.1
sit 82.2
wear 81.1
leader 80.7
boy 79.8
four 78.1

Imagga
created on 2022-01-09

newspaper 24.4
man 21.5
people 20.6
male 19.2
product 17.6
barbershop 17.2
person 16.9
home 15.9
adult 15.6
chair 14.9
shop 13.8
creation 13.6
old 13.2
smiling 12.3
couple 12.2
portrait 11.6
smile 11.4
happy 11.3
sitting 11.2
mercantile establishment 11
barber chair 10.9
room 10.7
face 10.7
ancient 10.4
mature 10.2
seat 10.2
two 10.2
dress 9.9
hospital 9.9
family 9.8
hairdresser 9.8
lady 9.7
work 9.7
patient 9.7
sculpture 9.6
love 9.5
happiness 9.4
senior 9.4
city 9.1
art 9.1
vintage 9.1
lifestyle 8.7
building 8.6
house 8.4
human 8.2
life 7.8
mask 7.8
statue 7.7
men 7.7
health 7.6
casual 7.6
hand 7.6
wife 7.6
head 7.6
place of business 7.6
monument 7.5
care 7.4
alone 7.3
industrial 7.3
mother 7.2
history 7.2
romantic 7.1
nurse 7.1
working 7.1
medical 7.1
child 7.1
indoors 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 97.2
outdoor 95.3
clothing 83.5
person 81.4
black and white 77.3
grave 75
cemetery 54.5

Face analysis

AWS Rekognition

Age 31-41
Gender Female, 99.3%
Happy 90.1%
Calm 3.8%
Surprised 3.8%
Sad 1%
Disgusted 0.4%
Angry 0.4%
Confused 0.3%
Fear 0.2%

AWS Rekognition

Age 16-22
Gender Female, 96%
Happy 82.9%
Calm 10.1%
Sad 3.4%
Fear 1.2%
Surprised 0.9%
Angry 0.5%
Confused 0.5%
Disgusted 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Hat 90.2%

Text analysis

Amazon

MEXICO
FROM MEXICO
MUNDI
34610
COATI MUNDI
FROM
COATI
BAUGER
RE
FLA RE
FLA
AGOUTI
VI33A2
MAGOY
DOENT

Google

COATI MUNDI
COATI
MUNDI