Human Generated Data

Title

Untitled (people watching small sailboat launching from dock, Mantoloking, NJ)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8502

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.3
Human 99.3
Person 99.2
Person 98.9
Person 98.8
Person 98.3
Person 97.7
Person 97
Person 96.7
Person 96
Person 95.8
Person 94.9
Person 93.8
Boat 93.7
Transportation 93.7
Vehicle 93.7
Watercraft 92.6
Vessel 92.6
Person 89.9
Person 88.2
Rowboat 86
Water 78.4
Clothing 74
Apparel 74
Person 73.1
Dinghy 63.4
People 62.9
Person 62.9
Outdoors 61.8
Person 59.9
Waterfront 56.4
Oars 55.9
Person 45.9

Imagga
created on 2022-01-09

musical instrument 22.3
architecture 17.9
travel 16.9
old 16.7
trombone 15.9
water 15.3
brass 15.3
city 14.9
wind instrument 14.7
scene 14.7
percussion instrument 14
history 13.4
light 12.8
black 12.6
building 12.5
river 12.4
night 12.4
tourism 12.4
people 12.3
statue 11.7
religion 11.6
man 11.5
sky 11.5
sculpture 11.4
urban 10.5
monument 10.3
famous 10.2
landmark 9.9
art 9.5
symbol 9.4
stone 9.3
church 9.2
dark 9.2
historic 9.2
color 8.9
faith 8.6
person 8.6
construction 8.5
reflection 8.5
business 8.5
park 8.5
historical 8.5
landscape 8.2
tourist 8.1
new 8.1
catholic 7.9
adult 7.8
menorah 7.5
vintage 7.4
clothing 7.4
dirty 7.2
gray 7.2
women 7.1
businessman 7.1
day 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 98.9
ship 88.7
water 86.4
black 85.1
white 77.2
watercraft 70.1
old 69.3
boat 66.6
person 60.7
line 21
several 11

Face analysis

Amazon

AWS Rekognition

Age 29-39
Gender Male, 50.9%
Happy 86.2%
Sad 8.1%
Calm 1.9%
Confused 1.6%
Angry 0.9%
Surprised 0.7%
Disgusted 0.4%
Fear 0.2%

AWS Rekognition

Age 28-38
Gender Female, 80.6%
Calm 72.3%
Confused 11%
Happy 6.1%
Sad 3.7%
Surprised 3.3%
Disgusted 1.3%
Angry 1.2%
Fear 1%

AWS Rekognition

Age 41-49
Gender Male, 99.3%
Calm 85.8%
Happy 5.1%
Sad 3.9%
Confused 3.1%
Surprised 0.8%
Angry 0.5%
Fear 0.3%
Disgusted 0.3%

AWS Rekognition

Age 36-44
Gender Male, 99.3%
Sad 60.5%
Calm 30.7%
Confused 4%
Happy 2.9%
Surprised 0.8%
Angry 0.4%
Disgusted 0.3%
Fear 0.3%

AWS Rekognition

Age 34-42
Gender Male, 66.4%
Calm 70%
Happy 28.1%
Sad 0.9%
Surprised 0.2%
Disgusted 0.2%
Confused 0.2%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 43-51
Gender Male, 89.3%
Calm 83.2%
Sad 7.6%
Happy 5.7%
Confused 1.4%
Surprised 0.7%
Disgusted 0.7%
Angry 0.4%
Fear 0.3%

AWS Rekognition

Age 38-46
Gender Male, 91.2%
Calm 83.3%
Sad 8.4%
Happy 6.8%
Fear 0.8%
Confused 0.2%
Angry 0.2%
Disgusted 0.2%
Surprised 0.1%

AWS Rekognition

Age 31-41
Gender Female, 82.3%
Happy 90.8%
Fear 3.4%
Calm 2.6%
Sad 1%
Disgusted 0.9%
Confused 0.6%
Surprised 0.5%
Angry 0.3%

AWS Rekognition

Age 29-39
Gender Male, 89.1%
Happy 60.9%
Calm 15.1%
Sad 9.8%
Surprised 5.5%
Angry 3.4%
Disgusted 2.2%
Fear 2%
Confused 1.1%

Feature analysis

Amazon

Person 99.3%

Captions

Microsoft

a group of people in an old photo of a man 68.9%
an old photo of a group of people standing in front of a store 57.1%
an old photo of a man 57%

Text analysis

Amazon

17339.
KODVK

Google

.33 3רנ al 339 רל ידוי>
.33
3רנ
al
339
רל
ידוי>