Human Generated Data

Title

Untitled (man, woman and dog docked in small motor boat)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8954

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Chair 99.6
Furniture 99.6
Person 97.5
Human 97.5
Person 97.2
Shelter 83.3
Nature 83.3
Outdoors 83.3
Countryside 83.3
Rural 83.3
Building 83.3
Clothing 79.1
Apparel 79.1
Face 76.5
Portrait 67.3
Photography 67.3
Photo 67.3
Spoke 66.5
Machine 66.5
Vehicle 66.3
Transportation 66.3
Housing 65.7
Female 65.2
Table 64.2
Shorts 61.4
Bed 60.8
Boat 58.0
Wood 55.8
Canvas 55.7
Desk 55.4
Watercraft 55.2
Vessel 55.2

Clarifai
created on 2023-10-25

people 99.3
monochrome 98.4
adult 96.4
woman 96.3
watercraft 96.1
vehicle 95.4
two 94.7
man 93.5
wear 92.3
transportation system 91.8
one 91.5
boat 90.9
reflection 89.8
group 88.7
indoors 87.6
design 86
wedding 85.4
girl 85.3
mirror 84.3
model 84

Imagga
created on 2022-01-09

device 29.5
seat 25.6
car 24.8
vehicle 24
engine 22.1
transportation 21.5
metal 20.1
support 18.2
automobile 18.2
machine 17.4
auto 17.2
steel 16.8
equipment 16.3
modern 14
transport 13.7
industrial 13.6
industry 12.8
ejection seat 12.4
home appliance 11.9
technology 11.9
dishwasher 11.8
man 11.4
white goods 11
work 11
motor 10.9
people 10.6
engineering 10.5
power 10.1
appliance 10
pipe 9.7
mechanical 9.7
iron 9.7
war 9.6
clothing 9
uniform 8.9
vessel 8.9
hood 8.8
machinery 8.8
glass 8.5
clean 8.3
tool 8.1
new 8.1
light 8
interior 8
business 7.9
station 7.7
military 7.7
factory 7.7
broken 7.7
fuel 7.7
repair 7.7
old 7.7
helmet 7.3
building 7.2
turbine 7.2
shiny 7.1
medical 7.1
car seat 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

ship 99.6
text 98.3
black and white 90
boat 86.6
watercraft 85.6
vehicle 71.4

Face analysis

Amazon

AWS Rekognition

Age 21-29
Gender Female, 98.4%
Sad 98.3%
Calm 1.2%
Happy 0.1%
Confused 0.1%
Surprised 0.1%
Fear 0.1%
Angry 0.1%
Disgusted 0.1%

Feature analysis

Amazon

Person 97.5%
Boat 58%

Categories

Imagga

interior objects 98.9%

Text analysis

Amazon

42305

Google

42 305
42
305