Human Generated Data

Title

Untitled (man fishing in ocean from shore, boy holding fish)

Date

c. 1955

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7993

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Clothing 100
Shorts 100
Apparel 100
Human 99.6
Person 99.6
Person 99
Nature 92.3
Outdoors 87.9
Plant 87.6
Tree 87.6
Tent 84
Land 81
Vegetation 80.5
Guitar 65.6
Leisure Activities 65.6
Musical Instrument 65.6
Coast 58
Ocean 58
Water 58
Sea 58
Shoreline 58
Beach 58
Furniture 57.9
Chair 57.9

Imagga
created on 2022-01-09

barrow 28.5
handcart 23.8
people 22.9
wheeled vehicle 19.2
man 18.8
adult 18.8
person 18.6
beach 17.9
male 17.4
sunset 17.1
sky 16.6
sun 16.1
summer 16.1
outdoor 16.1
silhouette 14.9
bench 14.7
water 14
outdoors 13.6
sea 13.3
sand 13.2
vacation 13.1
chair 12.9
landscape 12.6
seat 12.6
sax 12.4
portrait 12.3
lifestyle 12.3
relax 11.8
day 11.8
vehicle 11.7
couple 11.3
musical instrument 11.1
dark 10.9
wind instrument 10.4
swing 10.3
sitting 10.3
ocean 10
park 10
park bench 10
child 9.7
sexy 9.6
boy 9.6
clouds 9.3
lake 9.2
leisure 9.1
danger 9.1
attractive 9.1
black 9
recreation 9
destruction 8.8
body 8.8
brass 8.6
outside 8.6
model 8.6
tree 8.5
evening 8.4
relaxation 8.4
peaceful 8.2
protection 8.2
dirty 8.1
world 7.9
holiday 7.9
disaster 7.8
nuclear 7.8
sunny 7.7
dusk 7.6
happy 7.5
life 7.5
shore 7.4
holding 7.4
environment 7.4
safety 7.4
freedom 7.3
lady 7.3
trombone 7.2
grass 7.1
mechanical device 7.1
travel 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 99.8
black and white 95.9
person 80.6
monochrome 80.5
white 61.2
old 42.6

Face analysis

AWS Rekognition

Age 23-31
Gender Male, 84.2%
Calm 99.6%
Happy 0.2%
Sad 0.1%
Surprised 0%
Angry 0%
Confused 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 34-42
Gender Female, 50.8%
Calm 99.9%
Sad 0%
Surprised 0%
Happy 0%
Confused 0%
Disgusted 0%
Angry 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Tent 84%
Guitar 65.6%

Captions

Microsoft

a vintage photo of a person 85.6%
a vintage photo of a person 81.6%
a vintage photo of a man and a woman posing for a picture 60.3%

Text analysis

Amazon

2
"
لا " 2 MJ17--YT37AS
لا
MJ17--YT37AS

Google

MJI7--YT33A 2 2
MJI7--YT33A
2