Human Generated Data

Title

Untitled (Sunshine Springs Beach scene)

Date

1958

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7920

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.2
Human 99.2
Person 98.2
Person 98.1
Person 97.6
Person 96.5
Person 96.3
Person 94.4
Person 93.8
Person 86.4
Person 81.1
Art 80
Photography 62.4
Photo 62.4
People 61.6
Outdoors 60.6
Portrait 60.5
Face 60.5
Sculpture 59.7
Water 58.4
Nature 57.7
Painting 56.5
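
These label/confidence pairs are presumably the raw output of Amazon Rekognition's DetectLabels operation. Below is a minimal sketch of how such tags could be reproduced with boto3, assuming configured AWS credentials; the file name is a placeholder, not the museum's actual asset.

import boto3

# Rekognition client; region and credentials come from the local AWS config.
client = boto3.client("rekognition")

# Placeholder path for a local copy of the photograph.
with open("steinmetz_4.2002.7920.jpg", "rb") as f:
    image_bytes = f.read()

# Request labels above a 50% confidence floor, mirroring the list above.
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,
    MinConfidence=50,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")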

Clarifai
created on 2023-10-25

people 99.9
group 98.8
group together 98.4
adult 98.2
many 97.8
watercraft 97.3
monochrome 96
man 94.8
woman 94
vehicle 93.3
war 90.2
wear 89
military 88.8
rowboat 85.6
administration 84
recreation 83.1
art 82.2
several 81
crowd 79.8
child 79.2
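
Clarifai scores concepts in the 0-1 range; the list above shows them as percentages. A hedged sketch against Clarifai's v2 REST predict endpoint, assuming its general image-recognition model; the API key and image URL are placeholders.

import requests

# Placeholder credentials; Clarifai's v2 REST API authenticates with an API key.
API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"  # Clarifai's general concept model

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
)

# Concepts are scored 0-1; scale to match the percentages above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")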

Imagga
created on 2022-01-09

billboard 27.3
signboard 22.1
silhouette 20.7
structure 20.1
black 19.2
sky 19.2
water 15.3
sunset 15.3
art 14.9
beach 14.5
ocean 13.7
landscape 13.4
travel 13.4
people 12.8
old 11.8
vintage 11.6
tourism 11.5
sea 11.3
summer 10.9
vacation 10.6
grunge 10.2
night 9.8
fun 9.7
design 9.6
space 9.3
television 9.2
city 9.1
color 8.9
sun 8.9
cloud 8.6
screen 8.6
background 8.5
evening 8.4
dark 8.4
texture 8.3
pattern 8.2
retro 8.2
coast 8.1
man 8.1
building 8
holiday 7.9
architecture 7.8
wave 7.8
element 7.4
sand 7.3
child 7.3
group 7.3
active 7.2
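
Imagga exposes its tagger over a single REST call with HTTP Basic auth. A minimal sketch, assuming an Imagga key/secret pair and a placeholder image URL.

import requests

# Placeholder Imagga key/secret pair, sent as HTTP Basic auth.
auth = ("acc_xxxxxxxxxx", "xxxxxxxxxxxxxxxx")

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=auth,
)

for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")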

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 99.2
person 80.8
old 72.2
clothing 56.2
drawing 50.8
vintage 42.3
posing 36.3
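
Microsoft's tags presumably come from the Azure Computer Vision Analyze Image operation. A minimal sketch against the v3.2 REST endpoint; the resource endpoint, key, and image URL are placeholders.

import requests

# Placeholder Azure resource endpoint and subscription key.
ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_KEY"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/photo.jpg"},
)

# Tag confidences are 0-1; scale to match the percentages above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")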

Face analysis

AWS Rekognition

Age 23-31
Gender Female, 82.6%
Calm 86.8%
Happy 10.7%
Sad 0.8%
Confused 0.6%
Disgusted 0.4%
Angry 0.3%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 16-24
Gender Male, 67.9%
Calm 91.8%
Fear 4.8%
Surprised 1.2%
Sad 1%
Happy 0.6%
Angry 0.4%
Disgusted 0.2%
Confused 0.1%

AWS Rekognition

Age 21-29
Gender Female, 54.6%
Calm 76.9%
Sad 8%
Angry 7%
Fear 4.7%
Disgusted 1.2%
Happy 0.9%
Confused 0.8%
Surprised 0.6%

AWS Rekognition

Age 27-37
Gender Female, 99.7%
Happy 35%
Sad 18.8%
Surprised 16.7%
Fear 10.9%
Disgusted 6.4%
Angry 5.2%
Calm 4.5%
Confused 2.5%
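
Each block above is one face returned by Rekognition's DetectFaces operation with full attributes. A minimal sketch with boto3, assuming configured AWS credentials; the file name is a placeholder.

import boto3

client = boto3.client("rekognition")

# Placeholder path for a local copy of the photograph.
with open("steinmetz_4.2002.7920.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] adds age range, gender, and per-emotion confidences.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    low, high = face["AgeRange"]["Low"], face["AgeRange"]["High"]
    print(f"Age {low}-{high}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")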

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
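
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A minimal sketch with the google-cloud-vision client, assuming application default credentials; the image URL is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Placeholder image URL; raw bytes also work via vision.Image(content=...).
image = vision.Image()
image.source.image_uri = "https://example.org/photo.jpg"

response = client.face_detection(image=image)

# Attributes come back as likelihood enums, not percentages.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)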

Feature analysis

Amazon

Person 99.2%

Text analysis

Amazon

42901

Google

42901 JI7--YT33A°2-- XAGON
42901
JI7--YT33A°2--
XAGON
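
The detected strings appear to be frame or edge markings on the negative rather than scene content, which would explain the garbled Google lines. A minimal sketch of the Amazon side with Rekognition's DetectText, assuming configured AWS credentials; the file name is a placeholder.

import boto3

client = boto3.client("rekognition")

# Placeholder path for a local copy of the photograph.
with open("steinmetz_4.2002.7920.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# DetectText returns LINE and WORD detections; print the lines only.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])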