Human Generated Data

Title

Untitled (man and woman fishing in the ocean)

Date

1957

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7414

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Water 99.4
Human 99.4
Person 99.4
Person 99.3
Outdoors 98.8
Clothing 95.6
Apparel 95.6
Female 94.3
Nature 87.4
Sea 87.2
Ocean 87.2
Woman 81.1
Dress 80.6
Girl 69.8
Fishing 67
Leisure Activities 65.5
Shorts 61
Hat 60.6
Sea Waves 58.3
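
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels operation. A minimal sketch of how comparable output could be produced with boto3; the file name and confidence threshold are illustrative assumptions, not part of the original record:

    import boto3

    # Hypothetical local copy of the photograph; not part of the museum record.
    IMAGE_PATH = "steinmetz_fishing.jpg"

    client = boto3.client("rekognition")

    with open(IMAGE_PATH, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=50,  # assumed threshold; the settings used for this record are unknown
        )

    # Print label/confidence pairs in the same shape as the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")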

Clarifai
created on 2023-10-25

water 99
people 98.8
sea 98.4
ocean 97
beach 96.9
fisherman 96.8
monochrome 96.5
girl 96.1
lake 94.3
fish 92.7
man 92.5
sport 92.4
child 92
rod 91.2
motion 90.1
landscape 89.4
river 88.9
fun 88.8
action 88.2
nature 87.7

Imagga
created on 2022-01-08

fisherman 100
sea 38.3
beach 35.6
sport 34.8
water 34
iceberg 31.2
man 30.9
fishing 29.8
summer 28.9
ocean 28.8
sky 24.9
rod 24.6
leisure 24.1
outdoors 23.9
male 23.4
recreation 22.4
sand 21
reel 20.5
hobby 19.9
activity 18.8
coast 18
people 16.7
pole 16.5
fish 16.5
sunset 16.2
active 16.2
vacation 15.6
travel 15.5
sun 15.3
outdoor 15.3
relaxation 15.1
person 14.8
adult 14.2
fun 14.2
line 13.7
landscape 13.4
coastline 13.2
lake 12.8
river 12.5
silhouette 12.4
lifestyle 12.3
wave 12.1
vessel 12
freedom 11.9
angler 11.9
exercise 11.8
waves 11.2
angling 9.9
catch 9.8
surf 9.7
boy 9.6
hook 9.4
holiday 9.3
shore 9.3
gear 9.2
calm 9.2
danger 9.1
horizon 9
casting 8.9
catching 8.9
healthy 8.8
day 8.6
cloud 8.6
device 8.5
tropical 8.5
vacations 8.5
clouds 8.5
relax 8.4
equipment 8.4
fly 8.4
hat 8.2
relaxing 8.2
ship 8.1
suit 8.1
fishing gear 8.1
boat 8.1
hydrofoil 8
cast 7.9
dawn 7.7
recreational 7.7
retirement 7.7
dusk 7.6
sunrise 7.5
resort 7.5
action 7.4
tranquil 7.2
copy space 7.2
happiness 7.1

Face analysis

AWS Rekognition

Age 43-51
Gender Female, 81.6%
Happy 30.6%
Fear 16%
Calm 15.8%
Angry 13.2%
Sad 8.7%
Surprised 7.4%
Confused 5.6%
Disgusted 2.6%
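
The age range, gender, and emotion scores above follow the shape of AWS Rekognition's DetectFaces response. A hedged sketch, assuming a local copy of the image and default settings:

    import boto3

    client = boto3.client("rekognition")

    with open("steinmetz_fishing.jpg", "rb") as f:  # hypothetical filename
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions are reported per face; list them highest-confidence first.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")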

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
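
The "Very unlikely" / "Unlikely" ratings above correspond to the Likelihood values that Google Cloud Vision face detection reports per face. A minimal sketch with the Python client, again assuming a local copy of the image:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_fishing.jpg", "rb") as f:  # hypothetical filename
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # One annotation per detected face; two faces yield two blocks like those above.
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)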

Feature analysis

Amazon

Person 99.4%

Categories

Imagga

paintings art 99.1%

Text analysis

Amazon

LSEZH
MU3-VT330°2

Google

42357.
42357.
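
The strings above are text detections read off the print itself: the Amazon values come from Rekognition's DetectText operation and the Google values from Cloud Vision text detection. A hedged sketch of both calls, assuming a local image file:

    import boto3
    from google.cloud import vision

    IMAGE_PATH = "steinmetz_fishing.jpg"  # hypothetical filename

    with open(IMAGE_PATH, "rb") as f:
        image_bytes = f.read()

    # Amazon Rekognition returns line- and word-level detections with confidences.
    rekognition = boto3.client("rekognition")
    for detection in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])

    # Google Cloud Vision lists the full detected text first, then individual words,
    # which is why the same string can appear more than once in the output above.
    vision_client = vision.ImageAnnotatorClient()
    response = vision_client.text_detection(image=vision.Image(content=image_bytes))
    for annotation in response.text_annotations:
        print(annotation.description)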