Human Generated Data

Title

Untitled (four women in pool with heads on inner tube)

Date

c. 1945

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7683

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 90.8%
Human 90.8%
Monitor 90%
Display 90%
Electronics 90%
Screen 90%
Outdoors 87.5%
Water 87.3%
Person 75.6%
Painting 64.4%
Art 64.4%
TV 61.7%
Television 61.7%
Person 59.6%
Nature 58.3%
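
These label/confidence pairs have the shape of Amazon Rekognition's detect_labels output. A minimal sketch of how such tags are produced, assuming configured boto3 credentials; the file name is a hypothetical stand-in for the photograph, not part of this record:

    import boto3

    rekognition = boto3.client("rekognition")

    # "steinmetz_pool.jpg" is a hypothetical local copy of the image.
    with open("steinmetz_pool.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=50,  # drop low-confidence guesses
        )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")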

Clarifai
created on 2023-10-25

monochrome 98%
art 94.6%
people 93.7%
texture 93.2%
man 92.4%
sea 90.7%
abstract 90.3%
sand 90.3%
nature 89.4%
desert 88.1%
wear 87.3%
black and white 85.7%
ocean 85.6%
beach 83.7%
abstraction 83.4%
rock 83.2%
pattern 82.8%
shadow 82.6%
dark 81%
no person 80.9%
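
Clarifai serves comparable concept scores through its REST API. A rough sketch against the public general-image-recognition model; the access token and image URL below are illustrative placeholders:

    import requests

    # Placeholder personal access token and image URL.
    resp = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers={"Authorization": "Key YOUR_PAT"},
        json={"inputs": [{"data": {"image": {"url": "https://example.com/pool.jpg"}}}]},
    )

    # Concept values are 0-1; scale to match the percentages above.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(concept["name"], round(concept["value"] * 100, 1))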

Imagga
created on 2022-01-09

device 37.8%
black 27.6%
sea turtle 18%
man 17.5%
fan blade 16.6%
model 16.3%
sexy 15.3%
instrument 15.1%
business 14.6%
binoculars 14.1%
person 13.9%
hand 13.7%
fashion 13.6%
key 13.5%
male 13.5%
turtle 13.4%
blade 13.4%
adult 12.9%
clothing 12%
optical instrument 11.2%
people 11.2%
style 11.1%
portrait 11%
face 10.6%
human 10.5%
one 10.4%
rotating mechanism 10%
close 9.7%
technology 9.6%
covering 9.6%
mechanism 9.3%
attractive 9.1%
equipment 9%
digital 8.9%
posing 8.9%
money 8.5%
pretty 8.4%
metal 8%
silver 8%
men 7.7%
modern 7.7%
expression 7.7%
electric fan 7.6%
studio 7.6%
mask 7.6%
camera 7.5%
guy 7.5%
gold 7.4%
musical instrument 7.3%
shirt 7.3%
dress 7.2%
lifestyle 7.2%
looking 7.2%
body 7.2%
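
Imagga's tags come from a single authenticated GET against its tagging endpoint; in this sketch the API key, secret, and image URL are placeholders:

    import requests

    # Imagga uses HTTP Basic auth with an API key/secret pair (placeholders here).
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/pool.jpg"},
        auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
    )

    for item in resp.json()["result"]["tags"]:
        print(item["tag"]["en"], round(item["confidence"], 1))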

Microsoft
created on 2022-01-09

text 99.2%
book 93.8%
black and white 70.3%
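
Microsoft's tags match the output of the Azure Computer Vision tagging endpoint; a sketch using the Python SDK, with a placeholder resource endpoint, key, and image URL:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder Azure resource endpoint and key.
    client = ComputerVisionClient(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_KEY"),
    )

    result = client.tag_image("https://example.com/pool.jpg")
    for tag in result.tags:
        print(tag.name, round(tag.confidence * 100, 1))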

Face analysis

AWS Rekognition

Age 36-44
Gender Female, 77.8%
Calm 83.4%
Surprised 15.7%
Fear 0.5%
Happy 0.2%
Disgusted 0.1%
Sad 0.1%
Angry 0%
Confused 0%

AWS Rekognition

Age 20-28
Gender Female, 99.6%
Sad 84.3%
Calm 13.2%
Fear 1.4%
Angry 0.4%
Surprised 0.2%
Happy 0.2%
Disgusted 0.2%
Confused 0.1%
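
Each block above corresponds to one entry in the FaceDetails list that Rekognition's detect_faces returns when all attributes are requested. A sketch, again with a hypothetical file name:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_pool.jpg", "rb") as f:  # hypothetical file name
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")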

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
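
Google reports per-face likelihood buckets rather than percentages, one block per detected face. A sketch using the Cloud Vision Python client, assuming application default credentials and a hypothetical local file:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_pool.jpg", "rb") as f:  # hypothetical file name
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Each field is a Likelihood enum such as VERY_UNLIKELY.
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)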

Feature analysis

Amazon

Person 90.8%
Painting 64.4%

Text analysis

Amazon

NEW
NEW YO
YO
S
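
Fragments like these are typical of Rekognition's detect_text on a small scan: it emits LINE detections along with the WORD detections inside them. A sketch:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_pool.jpg", "rb") as f:  # hypothetical file name
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        # Both LINE and WORD entries appear in the output.
        print(detection["Type"], detection["DetectedText"])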

Google

NIVES NEW YO
NIVES
NEW
YO
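
Google's list follows the Cloud Vision text_detection convention: the first annotation is the full detected string ("NIVES NEW YO") and the rest are the individual words. A standalone sketch, with the same hypothetical file name as above:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()  # assumes application default credentials

    with open("steinmetz_pool.jpg", "rb") as f:  # hypothetical file name
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)

    # The first annotation is the full string; later ones are individual words.
    for annotation in response.text_annotations:
        print(annotation.description)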