Human Generated Data

Title

Untitled (Sunshine Springs Beach scene)

Date

1958

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7922

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 98.3
Human 98.3
Person 97.6
Person 97.3
Person 97.1
Person 96.6
Person 96.1
Person 92.1
Person 91.5
Vessel 90.6
Transportation 90.6
Watercraft 90.6
Vehicle 90.6
Person 90.2
Person 89.3
Shorts 83.4
Clothing 83.4
Apparel 83.4
Person 80.2
Art 69.4
Water 68.9
People 68.6
Boat 63
Waterfront 58.3
Photography 58
Photo 58
Waterfowl 57.4
Animal 57.4
Bird 57.4
Pier 55.6
Dock 55.6
Port 55.6

Clarifai
created on 2023-10-25

people 99.9
group together 99.2
adult 98.8
group 98.6
many 98.2
watercraft 97.5
monochrome 96.1
vehicle 96.1
woman 95
man 94.5
child 93.5
recreation 92.9
rowboat 91.7
wear 91.3
military 86.2
several 86.1
transportation system 84.4
spectator 82.3
war 81.6
enjoyment 75.3

Imagga
created on 2022-01-09

billboard 26.2
sunset 26.1
silhouette 25.7
sky 25.1
beach 23.8
signboard 21.2
structure 21.2
water 19.3
landscape 18.6
ocean 18.5
sea 18.4
sun 16.1
travel 15.5
clouds 15.2
black 15
night 14.2
summer 14.1
dusk 13.3
tourism 13.2
sunrise 13.1
coast 12.6
people 12.3
evening 12.1
sand 12.1
negative 11.5
art 11
film 10.4
waves 10.2
architecture 10.2
man 10.1
vacation 9.8
stage 9.8
outdoors 9.7
male 9.2
orange 9.2
leisure 9.1
horizon 9
history 8.9
lake 8.8
building 8.7
dawn 8.7
light 8.7
wave 8.6
cloud 8.6
outdoor 8.4
famous 8.4
shore 8.4
old 8.4
city 8.3
park 8.2
recreation 8.1
boat 8.1
platform 7.9
scenic 7.9
twilight 7.8
men 7.7
dark 7.5
fun 7.5
monument 7.5
landmark 7.2
active 7.2
holiday 7.2

Microsoft
created on 2022-01-09

text 99.7
drawing 85.1
old 72.6
person 71.4
cartoon 56.9
sketch 56.2
vintage 46.7

Face analysis

AWS Rekognition

Age 21-29
Gender Female, 85.5%
Sad 54.3%
Calm 43.6%
Happy 0.6%
Fear 0.6%
Confused 0.3%
Disgusted 0.3%
Angry 0.2%
Surprised 0.1%

AWS Rekognition

Age 38-46
Gender Female, 86.1%
Calm 97.3%
Sad 0.9%
Happy 0.9%
Angry 0.4%
Disgusted 0.2%
Confused 0.2%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 18-26
Gender Female, 78.8%
Surprised 34.1%
Calm 33.2%
Happy 16.2%
Disgusted 9.7%
Confused 3.3%
Fear 1.4%
Sad 1.2%
Angry 0.9%

AWS Rekognition

Age 33-41
Gender Female, 60.9%
Sad 33.7%
Calm 20.7%
Fear 20.4%
Happy 15.5%
Angry 3.2%
Confused 2.8%
Disgusted 1.9%
Surprised 1.8%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.3%

Text analysis

Amazon

42898
M13--YT33-X

Google

428468