Human Generated Data

Title

Untitled (women in pool with feet on inner tube)

Date

c. 1945

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7674

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 98.6
Human 98.6
Person 95.3
Water 90.5
Pool 78.5
Fish 67.3
Animal 67.3
Bird 65.8
Outdoors 64
Person 62.4
Nature 61.4
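
For context, label lists like the Amazon set above are the output of an image-labeling API. The snippet below is a minimal sketch of how comparable labels could be requested with AWS Rekognition's DetectLabels call via boto3; the image file name is hypothetical, and configured AWS credentials are assumed.

# Minimal sketch (not the museum's actual pipeline): request object labels
# for a scanned photograph with AWS Rekognition DetectLabels.
# Assumes boto3 is installed, AWS credentials are configured, and the
# JPEG file name below stands in for the real image.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_pool.jpg", "rb") as image_file:
    response = rekognition.detect_labels(
        Image={"Bytes": image_file.read()},
        MaxLabels=20,        # return at most 20 labels
        MinConfidence=60,    # drop labels scored below 60%
    )

# Print "Label confidence" pairs in the same shape as the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')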

Clarifai
created on 2023-10-25

people 99.5
water 98
beach 96.2
child 95.6
one 94.8
art 93.9
monochrome 92.3
man 91.9
adult 91.8
two 91.4
sea 91.2
swimming 90.6
ocean 88.6
woman 88.3
wear 87
nude 86.8
recreation 86.7
action 85.1
river 84.6
swimming pool 83.9

Imagga
created on 2022-01-08

water 22.7
mousetrap 18.4
dark 18.4
trap 15.7
beach 15.2
sea 15
ocean 14.9
device 14.8
black 14.4
man 13.4
silhouette 13.2
chest 13
adult 13
light 12
sunset 11.7
television 11.6
body 11.2
people 11.1
person 11.1
fashion 10.5
sexy 10.4
box 10.3
container 10
night 9.8
male 9.2
outdoors 9.1
sensuality 9.1
art 8.6
model 8.5
travel 8.4
portrait 8.4
color 8.3
fun 8.2
one 8.2
sand 8.1
wet 8
posing 8
hair 7.9
fishing 7.7
happy 7.5
landscape 7.4
style 7.4
alone 7.3
coast 7.2
sky 7

Microsoft
created on 2022-01-08

text 99.2
water 97.5
swimming 87
black and white 85.8
fish 56
wave 24.6
picture frame 14.1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 35-43
Gender Female, 87.8%
Calm 74.2%
Sad 20.1%
Fear 2%
Happy 1.8%
Angry 0.6%
Disgusted 0.5%
Surprised 0.5%
Confused 0.3%

AWS Rekognition

Age 20-28
Gender Male, 92.2%
Calm 75%
Sad 8.5%
Happy 5.7%
Disgusted 5.1%
Confused 3.4%
Surprised 1.3%
Angry 0.8%
Fear 0.2%
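
The age, gender, and emotion estimates above are the kind of per-face attributes returned by a face-detection API. The sketch below shows how similar values could be retrieved with AWS Rekognition's DetectFaces call via boto3; the file name is hypothetical, and configured AWS credentials are assumed.

# Minimal sketch (not the museum's actual pipeline): per-face age range,
# gender, and emotion scores from AWS Rekognition DetectFaces.
# Attributes=["ALL"] is required for these fields; the file name is a stand-in.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_pool.jpg", "rb") as image_file:
    response = rekognition.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back unordered; list them from highest to lowest confidence.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')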

Feature analysis

Amazon

Person 98.6%
Fish 67.3%
Bird 65.8%