Human Generated Data

Title

Untitled (people on beach under palm trees)

Date

1955

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8874

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.2
Human 99.2
Person 99
Person 94.9
Nature 94.2
Outdoors 91.3
Building 85.5
Tree 74.5
Plant 74.5
Architecture 71.7
Shelter 70.3
Rural 70.3
Countryside 70.3
Weather 67.1
Urban 64.6
Ice 61.3
Housing 59.6
Tower 56.7
Spire 55.5
Steeple 55.5
Apparel 55.5
Clothing 55.5
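The Amazon tags above are label/confidence pairs of the kind returned by Amazon Rekognition label detection. A minimal sketch follows (not the museum's actual pipeline); the bucket and object key are placeholders, not the real image location.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    # Placeholder S3 location; a local image could be passed as Image={"Bytes": ...} instead.
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-beach.jpg"}},
    MaxLabels=25,
    MinConfidence=55,
)

for label in response["Labels"]:
    # Prints e.g. "Person 99.2", matching the tag/score format listed above.
    print(f"{label['Name']} {label['Confidence']:.1f}")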

Clarifai
created on 2023-10-26

people 99.8
group 99.1
many 98.4
adult 98.3
group together 95.6
art 95.4
monochrome 94.8
man 94.3
wear 94
woman 92.1
vehicle 90.2
furniture 89
child 88.3
no person 86.4
home 82.5
seat 82.4
several 82.2
administration 81.6
recreation 81.4
crowd 80.1
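The Clarifai concepts above can be reproduced with Clarifai's general image-recognition model. The sketch below is hedged: the API key, model id, and image URL are placeholders, and the exact endpoint and auth scheme depend on the Clarifai API version in use.

import requests

MODEL_ID = "general-image-recognition"   # assumed public general model id
API_KEY = "YOUR_CLARIFAI_API_KEY"        # placeholder credential

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/image.jpg"}}}]},
    timeout=30,
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Concept values come back in [0, 1]; scaled to match the 0-100 scores above.
    print(f"{concept['name']} {concept['value'] * 100:.1f}")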

Imagga
created on 2022-01-15

fountain 25
sky 23.1
structure 21.6
stage 19.1
park 18.8
sunset 18
night 16.9
tree 15.3
silhouette 14.9
landscape 14.9
platform 14.7
water 14.7
black 14.4
sunrise 14.1
old 13.9
sun 13.7
light 13.5
architecture 13.4
city 13.3
building 12.9
cloud 12.9
travel 12.7
winter 10.2
dark 10
outdoor 9.9
dawn 9.7
negative 9.6
lighting 9.6
scene 9.5
smoke 9.3
power 9.2
piano 9.1
film 9
horizon 9
history 8.9
river 8.9
trees 8.9
fog 8.7
pollution 8.7
dusk 8.6
grand piano 8.5
evening 8.4
ocean 8.4
summer 8.4
color 8.3
outdoors 8.3
industrial 8.2
dirty 8.1
percussion instrument 7.8
sea 7.8
space 7.8
beach 7.7
grunge 7.7
lake 7.5
lights 7.4
vacation 7.4
street 7.4
scenery 7.2
holiday 7.2
apparatus 7.1
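The Imagga tags above correspond to Imagga's tagging endpoint, sketched below under assumptions: the credentials and image URL are placeholders, and the response field names follow Imagga's documented v2 format.

import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/image.jpg"},  # placeholder image URL
    auth=("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET"),  # placeholder credentials
    timeout=30,
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    # Prints e.g. "fountain 25.0", matching the tag/score format listed above.
    print(f"{item['tag']['en']} {item['confidence']:.1f}")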

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 99.8
black and white 80.8
cemetery 76.6
grave 75.9
tree 62.7
sky 62.6
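The Microsoft tags above are the kind returned by the Azure Computer Vision "analyze" operation. A hedged sketch follows; the endpoint, key, and image URL are placeholders, and the v3.2 REST shape is assumed.

import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder endpoint
KEY = "YOUR_AZURE_CV_KEY"                                       # placeholder key

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/image.jpg"},
    timeout=30,
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    # Confidences are returned in [0, 1]; scaled to the 0-100 values listed above.
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")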

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 29-39
Gender Male, 96.2%
Calm 87.4%
Sad 7%
Angry 1.7%
Surprised 1.1%
Confused 1%
Happy 0.8%
Disgusted 0.5%
Fear 0.5%

AWS Rekognition

Age 27-37
Gender Male, 91.8%
Happy 78.4%
Calm 15.3%
Sad 3.6%
Angry 0.8%
Disgusted 0.8%
Surprised 0.6%
Confused 0.5%
Fear 0.2%

AWS Rekognition

Age 31-41
Gender Male, 94.4%
Calm 95.5%
Sad 2.8%
Happy 1.1%
Fear 0.2%
Confused 0.1%
Disgusted 0.1%
Surprised 0%
Angry 0%
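The per-face age ranges, gender estimates, and emotion scores above come from Amazon Rekognition face attributes. A minimal sketch, with placeholder bucket and key names:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-beach.jpg"}},
    Attributes=["ALL"],  # request age range, gender, emotions, and other attributes
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are listed with confidences, highest first, as in the record above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")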

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft
created on 2022-01-15

an old photo of a person 61.3%
an old photo of a person 56.2%
old photo of a person 56.1%
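The caption candidates above match the output of the Azure Computer Vision "describe" operation. A hedged sketch, with placeholder endpoint, key, and image URL, assuming the v3.2 REST shape:

import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder endpoint
KEY = "YOUR_AZURE_CV_KEY"                                       # placeholder key

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": "3"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/image.jpg"},
    timeout=30,
)
resp.raise_for_status()

for caption in resp.json()["description"]["captions"]:
    # Prints caption text with its confidence, e.g. "an old photo of a person 61.3%".
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")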

Text analysis

Amazon

41102
JJS

Google

YTRA°2-XAGO
YTRA°2-XAGO
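The detected strings above ("41102", "JJS", and the garbled Google readings) are OCR output. A minimal sketch of the Amazon side using Rekognition text detection follows; the bucket and key are placeholders, and the Google strings would come from its own Vision API rather than this call.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-beach.jpg"}}
)

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip word-level duplicates of each detected line
        print(detection["DetectedText"])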