Human Generated Data

Title

Untitled (gorilla seated on swing; trainer on one knee next to gorilla)

Date

c. 1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4879

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Apparel 99.6
Clothing 99.6
Human 97.1
Person 97.1
Female 88.6
Dress 81.4
Face 77.9
Photo 71.4
Photography 71.4
Portrait 71.4
Woman 71.2
Girl 69
Person 68.1
Door 60
Kid 59.3
Blonde 59.3
Teen 59.3
Child 59.3
Fashion 57.6
Gown 57.6
Veil 55.5
Floor 55.4

Imagga
created on 2022-01-23

person 23.6
adult 22
people 21.7
man 21.5
face 20.6
portrait 20
hair 19
male 18.5
device 17.5
human 17.2
black 16.8
lifestyle 15.2
women 15
love 15
sexy 14.4
dress 14.4
groom 13.9
bride 13.4
pretty 13.3
wedding 12.9
fashion 12.8
happy 12.5
couple 11.3
men 11.2
two 11
model 10.9
harp 10.8
smile 10.7
attractive 10.5
body 10.4
happiness 10.2
cute 10
life 9.9
shower 9.8
modern 9.8
stringed instrument 9.5
youth 9.4
skin 9.3
head 9.2
makeup 9.1
city 9.1
musical instrument 9.1
mother 9.1
sensuality 9.1
one 8.9
romantic 8.9
luxury 8.6
blond 8.5
umbrella 8.3
mask 8.3
style 8.1
gorgeous 8.1
lady 8.1
room 8.1
celebration 8
look 7.9
child 7.8
day 7.8
megaphone 7.8
eyes 7.7
energy 7.6
fun 7.5
traditional 7.5
equipment 7.4
alone 7.3
music 7.2
bathroom 7.2
looking 7.2
acoustic device 7.2

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

person 99.3
text 97.7
window 91.5
black and white 86.9
wedding dress 84.9
bride 57.9

Face analysis

AWS Rekognition

Age 41-49
Gender Male, 93.9%
Calm 99.9%
Fear 0%
Surprised 0%
Sad 0%
Angry 0%
Confused 0%
Disgusted 0%
Happy 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.1%

Captions

Microsoft

a person standing in front of a window 58.9%
a man and a woman standing in front of a window 29%
a person standing in front of a window 28.9%

Text analysis

Amazon

16035
16035.
a
.5E091
MAOOX-Y
2-NAMT2A3

Google

2-NAMTZA 16035. 16035.
16035.
2-NAMTZA