Human Generated Data

Title

Untitled (men standing next to tipped boat on land, Mantoloking, NJ)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8518

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.6
Human 99.6
Person 99.6
Interior Design 99.2
Indoors 99.2
Room 97.2
Person 94.6
Person 93
Person 92.3
Person 90.9
Person 87.1
Court 67.7
Airplane 67.1
Transportation 67.1
Vehicle 67.1
Aircraft 67.1
Leisure Activities 65
People 63.8
Theater 55.5
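
The Amazon entries above are label detections with confidence scores out of 100. A minimal sketch of how such labels can be requested from Amazon Rekognition with boto3 follows; the bucket, object key, and thresholds are placeholders, not the actual pipeline used for this record.

```python
# Minimal sketch (placeholders, not the actual pipeline): label detection with
# Amazon Rekognition via boto3. Assumes AWS credentials are already configured.
import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-8518.jpg"}},
    MaxLabels=20,
    MinConfidence=50,
)

# Each label has a Name and a 0-100 Confidence, matching entries such as
# "Person 99.6" and "Airplane 67.1" above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```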

Clarifai
created on 2023-10-25

people 99.3
man 96.9
vehicle 96
adult 95.7
watercraft 95.3
group 94.1
group together 93.8
woman 92
transportation system 91.4
boat 89.2
many 88.5
travel 88.3
aircraft 86.8
monochrome 86.5
water 85.3
one 85.1
no person 84.3
airplane 84.2
recreation 81.3
city 79.8
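
The Clarifai concepts above carry confidences on the same 0-100 scale. A hedged sketch of a Clarifai v2 REST request with the general recognition model is below; the model id, API key, and image URL are illustrative assumptions.

```python
# Hedged sketch of a Clarifai v2 REST call; model id, key, and URL are placeholders.
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"          # placeholder credential
MODEL_ID = "general-image-recognition"     # assumed public general model id

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/steinmetz-8518.jpg"}}}]},
)

# Concepts come back with a name and a 0-1 value; scaling by 100 gives
# figures comparable to "people 99.3" above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))
```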

Imagga
created on 2022-01-09

chair 44.5
furniture 32.2
table 29.6
room 29.2
seat 26.7
interior 20.3
folding chair 20.3
travel 16.9
chairs 15.7
device 15.5
water 15.3
desk 15.2
modern 14.7
sky 14.7
empty 14.6
sea 14.5
relaxation 14.2
business 14
technology 13.4
classroom 13.4
equipment 12.4
vacation 12.3
ocean 11.6
boat 11.6
summer 11.6
architecture 11.1
inside 11
monitor 10.7
computer 10.7
office 10.5
education 10.4
row 10.2
glass 10.1
relax 10.1
house 10
tourism 9.9
conference 9.8
furnishing 9.6
hotel 9.5
center 9.5
beach 9.5
light 9.4
wood 9.2
skeleton 9
indoors 8.8
class 8.7
building 8.7
cockpit 8.6
luxury 8.6
meeting 8.5
floor 8.4
island 8.2
electronic equipment 8.2
home 8
lifestyle 7.9
design 7.9
tables 7.9
scene 7.8
bay 7.7
outdoor 7.6
deck 7.6
learning 7.5
resort 7.5
outdoors 7.5
restaurant 7.3
data 7.3
school 7.3
support 7.3
hall 7.3
sun 7.2
black 7.2
vehicle 7.2
window 7
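
The Imagga tags above follow the same label-plus-confidence pattern. Below is a sketch of a request against Imagga's public tagging endpoint, assuming basic authentication with an API key and secret (both placeholders).

```python
# Hedged sketch of an Imagga tagging request; endpoint shape, response layout,
# and credentials are assumptions for illustration only.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/steinmetz-8518.jpg"},
    auth=("API_KEY", "API_SECRET"),  # placeholder key/secret pair
)

# Each tag pairs an English label with a 0-100 confidence, as in "chair 44.5".
for tag in resp.json()["result"]["tags"]:
    print(tag["tag"]["en"], round(tag["confidence"], 1))
```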

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 98.4
ship 97.8
person 94.9
black and white 86.5
watercraft 72.8
boat 72.6
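
The Microsoft tags above can be reproduced in outline with the Azure Computer Vision SDK; the endpoint, key, and image URL below are placeholders, and the call is only an assumed equivalent of the pipeline used for this record.

```python
# Hedged sketch: image tagging with the Azure Computer Vision SDK.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_KEY"),                # placeholder key
)

result = client.tag_image("https://example.org/steinmetz-8518.jpg")

# Each tag has a name and a 0-1 confidence; scaled by 100 it matches
# entries such as "ship 97.8" above.
for tag in result.tags:
    print(tag.name, round(tag.confidence * 100, 1))
```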

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 41-49
Gender Male, 95.5%
Calm 96.2%
Sad 1.6%
Confused 0.9%
Happy 0.4%
Surprised 0.4%
Angry 0.3%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 51-59
Gender Male, 90.5%
Calm 63.2%
Happy 29.3%
Sad 2.7%
Disgusted 2.2%
Surprised 0.8%
Fear 0.8%
Confused 0.5%
Angry 0.4%
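
The two AWS Rekognition face records above (age range, gender, and per-emotion confidences) have the shape returned by Rekognition's face detection. A minimal boto3 sketch, with placeholder bucket and key:

```python
# Minimal sketch of face attribute estimation with Amazon Rekognition (boto3).
import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-8518.jpg"}},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```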

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
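
The Google Vision face results above report likelihood buckets ("Very unlikely" through "Very likely") rather than percentages. A sketch with the google-cloud-vision client; the image URI is a placeholder.

```python
# Sketch of face detection with the google-cloud-vision client library.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.org/steinmetz-8518.jpg"  # placeholder

response = client.face_detection(image=image)

# Likelihood enum values index into this tuple (pattern from the official samples).
likelihoods = ("Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely")

for face in response.face_annotations:
    print("Surprise", likelihoods[face.surprise_likelihood])
    print("Anger", likelihoods[face.anger_likelihood])
    print("Sorrow", likelihoods[face.sorrow_likelihood])
    print("Joy", likelihoods[face.joy_likelihood])
    print("Headwear", likelihoods[face.headwear_likelihood])
    print("Blurred", likelihoods[face.blurred_likelihood])
```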

Feature analysis

Amazon

Person 99.6%
Airplane 67.1%

Captions

Microsoft
created on 2022-01-09

an old photo of a man 57.2%
old photo of a man 55.1%
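
The candidate captions above pair a sentence with a confidence. A hedged sketch of caption generation with the Azure Computer Vision SDK's describe_image call follows; endpoint, key, and image URL are placeholders.

```python
# Hedged sketch: image captioning with the Azure Computer Vision SDK.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_KEY"),                # placeholder key
)

description = client.describe_image(
    "https://example.org/steinmetz-8518.jpg", max_candidates=2
)

# Each candidate caption carries text and a 0-1 confidence, as in
# "an old photo of a man 57.2%".
for caption in description.captions:
    print(caption.text, round(caption.confidence * 100, 1))
```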

Text analysis

Amazon

17348
MOOOH
YT37A8-1A2A7
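
The strings above are raw OCR detections. A minimal sketch of text detection with Amazon Rekognition via boto3, with placeholder bucket and key:

```python
# Minimal sketch of text detection with Amazon Rekognition (boto3).
import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-8518.jpg"}}
)

# LINE detections give whole strings; WORD detections repeat the pieces.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```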

Google

28 ר בי3 ר1
28
ר
בי3
ר1
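
The Google entries above are likewise raw OCR output: the first annotation is the full detected string and the rest are its individual pieces. A sketch with the google-cloud-vision client, image URI as a placeholder:

```python
# Sketch of text (OCR) detection with the google-cloud-vision client library.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.org/steinmetz-8518.jpg"  # placeholder

response = client.text_detection(image=image)

# The first annotation is the full text block; subsequent ones are its components.
for annotation in response.text_annotations:
    print(annotation.description)
```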