Human Generated Data

Title

Untitled (young boy holding fish as seen through man's legs)

Date

c. 1955

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7994

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Shorts 99
Clothing 99
Apparel 99
Human 96.9
Person 96.9
Person 82.8
Back 62
Playground 57.6
Play Area 57.6
Toy 56.7
Flooring 56.1

Clarifai
created on 2023-10-25

people 99.9
wedding 99.4
bride 99.4
monochrome 98.6
woman 98.2
child 97.7
man 96.1
two 96
girl 95.9
dress 95.8
adult 95.4
street 94.8
veil 94.8
groom 93.1
wear 91.9
dancing 91.8
portrait 91.7
art 88.3
dancer 87.2
bridal 86.8

Imagga
created on 2022-01-09

swing 24.6
adult 22
plaything 20.2
mechanical device 20.1
portrait 18.8
sexy 18.5
attractive 18.2
people 17.8
person 17.6
man 16.8
body 16.8
fashion 16.6
black 15.1
dark 15
mechanism 14.9
dress 14.5
male 14.3
model 14
wall 12.7
posing 12.4
world 12.2
lady 12.2
one 11.9
hair 11.9
sensuality 11.8
mosquito net 11.8
face 11.4
guillotine 11.3
style 11.1
love 11
skin 11
lifestyle 10.8
sport 10.8
water 10.7
happy 10.6
silhouette 9.9
device 9.9
wet 9.8
column 9.8
pretty 9.8
cool 9.8
human 9.7
protective covering 9.7
building 9.6
women 9.5
erotic 9.4
sitting 9.4
casual 9.3
instrument of execution 9.2
studio 9.1
light 9.1
pose 9.1
fitness 9
child 9
architecture 8.8
rain 8.5
art 8.5
vintage 8.3
sensual 8.2
splashes 7.8
naked 7.7
performer 7.7
concrete 7.7
seductive 7.6
covering 7.6
passion 7.5
dancer 7.4
support 7.4
alone 7.3
gorgeous 7.2
dirty 7.2
active 7.2
step 7.1
happiness 7
modern 7

Microsoft
created on 2022-01-09

text 99.2
outdoor 91.4
black and white 90.4
person 90.1
clothing 90
fog 51.8
posing 46.4
old 43.9

Face analysis

AWS Rekognition

Age 37-45
Gender Male, 87.5%
Calm 29.9%
Surprised 28.9%
Angry 12.6%
Fear 10.3%
Happy 5.8%
Sad 5.5%
Disgusted 3.6%
Confused 3.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 96.9%

Captions

Microsoft
created on 2022-01-09

an old photo of a person 79.2%
old photo of a person 77.2%
a man and woman posing for a photo 33.4%

Text analysis

Amazon

88
MJI7--YT37A*2 4 29 88
MJI7--YT37A*2
4 29

Google

MJI7--YT3RA°2 N 88
N
MJI7--YT3RA°2
88