Human Generated Data

Title

Untitled (family fishing from jetty)

Date

c. 1959

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8081

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2022-01-15

Person 99.7
Human 99.7
Person 99.6
Person 98.6
Person 93.9
Nature 88
Outdoors 82
Standing 73.9
Soil 69.8
People 67.6
Flooring 66.6
Urban 66.4
Building 66
Shorts 65.9
Clothing 65.9
Apparel 65.9
Portrait 64.3
Face 64.3
Photography 64.3
Photo 64.3
Landscape 64.1
Text 60

Clarifai
created on 2023-10-26

people 99.7
adult 98.4
group 98
man 97.1
many 96.2
group together 95.6
monochrome 95
administration 93.8
woman 93.6
vehicle 92.1
war 90.7
wear 88.5
military 84.7
child 83.4
interaction 80.9
watercraft 76.1
street 75.8
art 70.8
soldier 69.9
leader 69.5

Imagga
created on 2022-01-15

shopping cart 77.1
handcart 67.9
wheeled vehicle 52.1
container 33.3
sunset 31.4
water 30.7
beach 30.7
man 30.2
sky 30
sea 29
ocean 27
silhouette 26.5
male 21.3
people 20.6
landscape 20.1
sun 18.6
conveyance 17.3
sunrise 16.9
pier 16.7
coast 16.2
summer 16.1
sand 15.9
shore 15
travel 14.8
clouds 13.5
relax 13.5
outdoors 13.4
men 12.9
light 12.7
horizon 12.6
person 12.5
evening 12.1
waves 12.1
fisherman 11.3
outdoor 10.7
device 10.6
support 10.6
love 10.3
lake 10.1
leisure 10
tourism 9.9
sport 9.9
river 9.8
dusk 9.5
dark 9.2
island 9.2
adult 9.1
vacation 9
boy 8.7
holiday 8.6
life 8.5
boat 8.4
holding 8.3
fun 8.2
freedom 8.2
danger 8.2
recreation 8.1
lifestyle 7.9
day 7.8
couple 7.8
destruction 7.8
disaster 7.8
cloud 7.7
fishing 7.7
tropical 7.7
barrow 7.6
musical instrument 7.6
coastline 7.5
holidays 7.5
building 7.3
calm 7.3
romantic 7.1

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 99.8
black and white 80
person 72
old 65.5

Color Analysis

Face analysis

AWS Rekognition

Age 21-29
Gender Female, 98.3%
Calm 99.3%
Happy 0.3%
Surprised 0.1%
Fear 0.1%
Sad 0.1%
Confused 0.1%
Disgusted 0.1%
Angry 0.1%

AWS Rekognition

Age 29-39
Gender Female, 82.9%
Calm 97.2%
Sad 2.2%
Angry 0.1%
Disgusted 0.1%
Happy 0.1%
Surprised 0.1%
Confused 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft
created on 2022-01-15

an old photo of a person 72.2%
an old photo of a person 72.1%
old photo of a person 69.5%

Text analysis

Amazon

48901.

Google

Y0901.
Y0901.