Human Generated Data

Title

Untitled (young boy holding fish as seen through man's legs)

Date

c. 1955

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7992

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 97.6
Person 97.1
Acrobatic 83.9
Dance 83.4
Female 83
Blonde 79.7
Teen 79.7
Kid 79.7
Child 79.7
Girl 79.7
Woman 79.7
Leisure Activities 74.7
Dance Pose 69.4
Portrait 63.3
Face 63.3
Photography 63.3
Photo 63.3
Ballet 59.5
Clothing 58
Apparel 58
Art 57.7
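
For reference, below is a minimal sketch of how label/confidence pairs like those above are typically produced with Amazon Rekognition's DetectLabels API via boto3. The bucket and object names are hypothetical placeholders, not details of the museum's actual pipeline.

import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz.jpg"}},
    MaxLabels=20,
    MinConfidence=50.0,
)

# Each label carries a name and a 0-100 confidence score,
# matching entries such as "Human 97.6" above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')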

Clarifai
created on 2023-10-25

people 99.8
child 99.8
monochrome 98.4
street 97
boy 96.9
girl 96.3
two 95.6
family 94
art 93.6
man 93.4
woman 93
wear 88.7
shadow 88.6
one 88.4
fun 87.9
adult 87.9
baby 87.2
dancing 86.3
portrait 85.5
wedding 84.3

Imagga
created on 2022-01-09

man 22.2
person 19.9
adult 16.8
column 16.5
male 15.7
silhouette 13.2
smoke 13
people 12.8
dirty 12.6
portrait 12.3
black 12.2
destruction 11.7
dark 11.7
mask 11.7
light 11.5
art 11.4
protection 10.9
sensuality 10.9
industrial 10.9
danger 10
dress 9.9
posing 9.8
statue 9.5
symbol 9.4
water 9.3
sport 9.2
travel 9.1
fashion 9
human 9
sexy 8.8
body 8.8
horror 8.7
world 8.6
model 8.6
grunge 8.5
beach 8.4
building 8.3
one 8.2
style 8.2
sunset 8.1
clothing 8
performer 7.9
device 7.8
nuclear 7.8
gas 7.7
attractive 7.7
structure 7.6
power 7.6
face 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 99.4
black and white 93.7
person 72.8
clothing 67
statue 64.8
monochrome 64.8
posing 38.1

Face analysis

AWS Rekognition

Age 41-49
Gender Male, 88.5%
Calm 95.6%
Sad 4.1%
Angry 0.1%
Surprised 0.1%
Happy 0%
Disgusted 0%
Confused 0%
Fear 0%
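
A minimal sketch of how the age range, gender, and ranked emotion scores above are commonly obtained from Rekognition's DetectFaces API. The image location is again a placeholder assumption.

import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz.jpg"}},
    Attributes=["ALL"],  # request age, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions come back as a list of type/confidence pairs.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')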

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
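
Google Cloud Vision reports graded likelihood buckets rather than percentages, as the entries above show. A minimal sketch of how those enum values are read with the google-cloud-vision client follows; the local file path is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    for attr in ("surprise", "anger", "sorrow", "joy", "headwear", "blurred"):
        value = getattr(face, f"{attr}_likelihood")
        # Enum names such as VERY_UNLIKELY map to "Very unlikely" above.
        name = vision.Likelihood(value).name.replace("_", " ").capitalize()
        print(attr.capitalize(), name)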

Feature analysis

Amazon

Person 97.1%

Captions

Microsoft
created on 2022-01-09

an old photo of a person 72.7%
old photo of a person 69.7%
a person posing for a photo 62.5%
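
A minimal sketch of how ranked caption candidates with confidence scores, like those above, are returned by Azure's Computer Vision Describe Image operation. The endpoint and key are placeholders.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<your-key>"),
)

with open("steinmetz.jpg", "rb") as f:
    result = client.describe_image_in_stream(f, max_candidates=3)

# Confidence is reported on a 0-1 scale; scale to match the list above.
for caption in result.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")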

Text analysis

Amazon

4
MJ17--YT37A-A

Google

MJI7--YT3RA°2-A
89
MJI7--YT3RA°2-A 89
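
A minimal sketch of how strings like these, OCR readings of the negative's film-edge markings, can be extracted with Rekognition's DetectText API. The image location is a placeholder.

import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz.jpg"}}
)

# LINE detections give whole strings such as "MJ17--YT37A-A";
# WORD detections give the individual tokens.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}%')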