Human Generated Data

Title

Untitled (people watching small sailboats launching from dock, Mantoloking, NJ)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8504

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 98.1
Person 98.1
Water 91.8
Person 90
Person 89
Person 86.3
Building 86.1
Transportation 85.1
Boat 85.1
Vehicle 85.1
Pool 80.6
Person 80.1
Person 79.7
Person 78.8
Boat 77.5
Person 76.4
Architecture 73.2
Person 70.9
Person 67.3
Sea Life 66.1
Animal 66.1
Waterfront 65.3
Person 65.2
Swimming Pool 64.6
Person 64.4
Mammal 62.8
Vessel 59.5
Watercraft 59.5
Dock 56.4
Pier 56.4
Port 56.4
Person 52.7
Person 52.6
Person 50.8

Imagga
created on 2022-01-09

water 31.4
sky 31
sea 27.7
travel 27.5
architecture 23.6
ocean 21.7
landscape 20.1
ship 19.7
summer 18.7
boat 18.6
lake 18.4
tourism 18.2
sailboat 17.7
coast 17.1
beach 16.8
city 16.6
clouds 15.2
building 14.8
bay 14.4
river 14.2
old 13.9
structure 13.6
shore 13.2
vacation 13.1
island 12.8
landmark 12.6
port 12.5
cityscape 12.3
vessel 12.3
center 12.2
catamaran 11.7
sunset 11.7
tower 11.6
history 11.6
silhouette 11.6
sand 11.2
village 11.1
house 10.9
scenery 10.8
holiday 10.8
light 10.7
boats 10.7
jigsaw puzzle 10.5
scenic 10.5
sun 10.5
skyline 10.5
shopping cart 10.3
construction 10.3
town 10.2
transport 10.1
wheeled vehicle 9.9
monitor 9.8
rural 9.7
harbor 9.6
fishing 9.6
urban 9.6
resort 9.6
cloud 9.5
buildings 9.5
wind 9.4
waves 9.3
historic 9.2
seaside 9.1
work 8.8
cruise 8.8
marina 8.6
culture 8.6
tropical 8.5
winter 8.5
coastline 8.5
pier 8.5
handcart 8.4
hill 8.4
dark 8.4
puzzle 8.3
sailing vessel 8.3
tourist 8.2
reflection 8.1
transportation 8.1
craft 8
night 8
body of water 7.9
electronic equipment 7.8
scene 7.8
sail 7.8
downtown 7.7
panorama 7.6
dusk 7.6
energy 7.6
power 7.6
boathouse 7.5
outdoors 7.5
vintage 7.4
waterfront 7.4
church 7.4
calm 7.3
black 7.2
recreation 7.2
platform 7.2

Microsoft
created on 2022-01-09

text 99.8
ship 99.4
watercraft 97.6
boat 96.3
water 95.4
black 82.6
vehicle 79.3
black and white 77.4
white 76.2
old 71
lake 68.6
harbor 57.7

Face analysis

AWS Rekognition

Age 23-33
Gender Male, 94.6%
Calm 60.1%
Happy 24.4%
Disgusted 4.8%
Sad 3.6%
Angry 3.5%
Confused 1.8%
Surprised 1.1%
Fear 0.7%

AWS Rekognition

Age 23-31
Gender Male, 90.5%
Calm 99.2%
Sad 0.4%
Angry 0.1%
Fear 0.1%
Confused 0.1%
Surprised 0.1%
Happy 0.1%
Disgusted 0%

AWS Rekognition

Age 34-42
Gender Male, 94.8%
Calm 65.8%
Happy 24.3%
Sad 3.3%
Disgusted 2.2%
Surprised 1.8%
Fear 1.1%
Angry 0.8%
Confused 0.7%

AWS Rekognition

Age 22-30
Gender Male, 98.2%
Calm 73.1%
Sad 15.7%
Angry 3.9%
Disgusted 2.5%
Surprised 1.5%
Confused 1.3%
Happy 1.3%
Fear 0.7%

AWS Rekognition

Age 7-17
Gender Female, 67.6%
Calm 84.3%
Fear 6.2%
Sad 3.7%
Happy 1.9%
Angry 1.6%
Disgusted 1.5%
Surprised 0.5%
Confused 0.4%

AWS Rekognition

Age 22-30
Gender Male, 99.9%
Calm 90.4%
Happy 4.3%
Sad 3.4%
Angry 1.4%
Disgusted 0.2%
Fear 0.2%
Surprised 0.1%
Confused 0.1%

AWS Rekognition

Age 21-29
Gender Male, 99.6%
Calm 86.3%
Sad 5%
Angry 3.2%
Happy 2.9%
Disgusted 1%
Fear 0.7%
Confused 0.6%
Surprised 0.3%

AWS Rekognition

Age 25-35
Gender Female, 58.2%
Calm 86.8%
Disgusted 3.3%
Angry 3%
Happy 2%
Sad 1.9%
Fear 1.4%
Confused 1%
Surprised 0.5%

AWS Rekognition

Age 20-28
Gender Male, 96.6%
Sad 27.3%
Angry 26.2%
Calm 23%
Fear 8.9%
Confused 7.5%
Surprised 3.1%
Disgusted 2.9%
Happy 1.1%

AWS Rekognition

Age 27-37
Gender Male, 97.9%
Happy 34.8%
Sad 15.4%
Calm 13.9%
Surprised 10.6%
Confused 9%
Disgusted 7.1%
Angry 6%
Fear 3.2%

AWS Rekognition

Age 4-12
Gender Male, 59.3%
Calm 42.2%
Sad 27.5%
Happy 11%
Fear 5.2%
Disgusted 4.8%
Confused 3.6%
Surprised 3.1%
Angry 2.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.1%
Boat 85.1%

Captions

Microsoft

a vintage photo of a boat next to a body of water 83.9%
a vintage photo of a small boat in a body of water 79.1%
a vintage photo of a boat in a body of water 79%

Text analysis

Amazon

17344
or
17344.

Google

םאצה
רו
17344.
I
םאצה רו 田 I日 17344.