Human Generated Data

Title

Untitled (trainer seated with gorilla in cage)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7093

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 99.2
Human 99.2
Person 99.1
Clothing 78.4
Apparel 78.4
Face 63.6
Appliance 56.8
Dishwasher 56.8
Door 55.3

Imagga
created on 2021-12-15

portrait 27.2
person 26.9
man 25.5
male 23.5
people 20.6
love 20.5
grandfather 20.3
adult 20.1
face 19.2
home 19.1
lifestyle 18.1
newspaper 17.9
hair 17.4
happy 16.3
couple 15.7
shower cap 15.6
smile 13.5
family 13.3
negative 13.3
senior 13.1
clothing 13
attractive 12.6
smiling 12.3
product 12.2
cap 12.1
black 12
human 12
pretty 11.9
skin 11.8
happiness 11.7
film 11.5
married 11.5
husband 11.4
patient 11.3
looking 11.2
mature 11.2
mother 11.1
blond 10.5
child 10.5
women 10.3
relaxation 10
girls 10
care 9.9
creation 9.6
together 9.6
body 9.6
headdress 9.4
dress 9
one 9
photographic paper 8.9
sibling 8.9
hug 8.7
eyes 8.6
loving 8.6
wife 8.5
expression 8.5
room 8.4
old 8.4
house 8.4
hand 8.4
parent 8.2
lady 8.1
sexy 8
cute 7.9
spectator 7.8
bonding 7.8
retired 7.8
sitting 7.7
retirement 7.7
bride 7.7
casual 7.6
hospital 7.6
bed 7.6
females 7.6
fashion 7.5
covering 7.5
fun 7.5
vintage 7.4
back 7.3
sensuality 7.3
grandma 7.2
gray 7.2
kid 7.1
daughter 7.1
indoors 7
look 7

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 99.9
person 97.2
man 95.2
human face 91
clothing 82.3
black and white 70

Face analysis

AWS Rekognition

Age 23-37
Gender Male, 93.5%
Surprised 63.7%
Calm 12.4%
Confused 8%
Angry 6.8%
Fear 4.5%
Happy 2.7%
Disgusted 0.9%
Sad 0.9%

AWS Rekognition

Age 13-25
Gender Female, 60.7%
Sad 42.2%
Calm 38.9%
Happy 16.9%
Confused 0.7%
Surprised 0.4%
Fear 0.4%
Angry 0.3%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft

a man sitting on a newspaper 58.1%
a man in a newspaper 58%
a man standing in front of a newspaper 55.9%

Text analysis

Amazon

16130.
a
16130

Google

16130. XAGON-YT3RA2- MAMTZA 3 16130•
XAGON-YT3RA2-
MAMTZA
3
16130.
16130•