Human Generated Data

Title

Untitled (family riding in the back of a motor boat)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10566

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 99
Person 99
Person 98.6
Clothing 95.6
Apparel 95.6
Person 95.3
Chair 94.6
Furniture 94.6
Person 93.1
People 86.1
Transportation 76.3
Boat 76.3
Vehicle 76.3
Shorts 75.4
Person 69
Table 68.9
Face 66.3
Meal 62.8
Food 62.8
Bed 60.9
Female 60.3
Advertisement 57.1
Girl 55.4
Person 42.2
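
These Amazon tags have the shape of Amazon Rekognition label-detection output: a label name paired with a confidence score in percent. A minimal sketch of how such pairs could be produced with boto3 is below; the file name, label cap, and confidence threshold are illustrative assumptions, not the settings used to generate this record.

    import boto3

    rekognition = boto3.client("rekognition")

    # Assumed local copy of the photograph; the source image is not part of this record.
    with open("steinmetz_4.2002.10566.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=25,        # illustrative cap
            MinConfidence=40.0,  # illustrative threshold
        )

    # Each label carries a name and a confidence in percent, matching the
    # "Human 99", "Person 98.6", ... pairs listed above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')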

Clarifai
created on 2023-10-25

people 99.8
adult 99.3
man 98.7
woman 97.3
group together 96
monochrome 95.9
sitting 95.2
group 95.1
wear 93.5
watercraft 91.6
transportation system 91.2
vehicle 89.1
many 86.9
sit 86.9
several 86.8
recreation 83.2
chair 82.9
facial expression 82.4
administration 81.6
furniture 79.7
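
The Clarifai concepts follow the same name/score pattern, with values reported here as percentages. A hedged sketch against Clarifai's v2 predict REST endpoint is below; the API key, model ID, and image URL are placeholders, and newer Clarifai deployments may require personal access tokens and user/app identifiers instead.

    import requests

    CLARIFAI_API_KEY = "YOUR_API_KEY"               # placeholder
    MODEL_ID = "general-image-recognition"          # assumed general concept model
    IMAGE_URL = "https://example.com/photo.jpg"     # placeholder image location

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
        timeout=30,
    )
    response.raise_for_status()

    # Concept values come back on a 0-1 scale; scale by 100 to match the
    # "people 99.8", "adult 99.3", ... figures listed above.
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')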

Imagga
created on 2022-01-09

ballplayer 28.3
person 25.9
athlete 23.5
daily 23.4
player 22.6
water 22
planner 20.7
sea 20.6
ocean 19.2
contestant 18
beach 17.8
travel 17.6
sky 17.2
vacation 15.5
people 15
summer 14.8
coast 13.5
tourism 13.2
negative 12.4
man 12.1
adult 11.7
transportation 11.6
holiday 11.5
ship 11.4
black 11.4
sport 11.3
boat 11.3
speed 11
film 10.9
liner 10.8
sand 10.7
bathing cap 10.3
newspaper 10.2
lifestyle 10.1
outdoor 9.9
silhouette 9.9
outdoors 9.7
landscape 9.7
motion 9.4
waves 9.3
male 9.2
clothing 9.2
peaceful 9.1
recreation 9
passenger ship 8.9
men 8.6
luxury 8.6
tropical 8.5
exercise 8.2
tourist 8.1
light 8
product 8
bay 7.9
cap 7.9
headdress 7.9
vessel 7.5
coastline 7.5
island 7.3
calm 7.3
design 7.3
sun 7.2
active 7.2
activity 7.2
river 7.1
day 7.1
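
Imagga's tagging endpoint returns a similar ranked list of tags with confidences. A small sketch against the v2 tags endpoint, assuming placeholder API credentials and a publicly reachable image URL:

    import requests

    IMAGGA_KEY = "YOUR_API_KEY"                  # placeholder credentials
    IMAGGA_SECRET = "YOUR_API_SECRET"
    IMAGE_URL = "https://example.com/photo.jpg"  # placeholder image location

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
        timeout=30,
    )
    response.raise_for_status()

    # Each tag carries a confidence and a language-keyed name,
    # e.g. "ballplayer 28.3" as listed above.
    for tag in response.json()["result"]["tags"]:
        print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')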

Microsoft
created on 2022-01-09

text 99.9
boat 94.5
ship 92.6
drawing 79.8
sketch 77.1
person 69.2
clothing 62.3
old 50.9
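
The Microsoft tags resemble output from the Azure Computer Vision analyze operation with the Tags feature enabled. A hedged REST sketch follows; the endpoint, subscription key, and image URL are placeholders.

    import requests

    AZURE_ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
    AZURE_KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder
    IMAGE_URL = "https://example.com/photo.jpg"                           # placeholder

    response = requests.post(
        f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
        json={"url": IMAGE_URL},
        timeout=30,
    )
    response.raise_for_status()

    # Confidences are 0-1 floats; scale by 100 to match "text 99.9", "boat 94.5", ...
    for tag in response.json()["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')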

Face analysis

Amazon

AWS Rekognition

Age 28-38
Gender Female, 62.2%
Calm 99.5%
Sad 0.2%
Confused 0.1%
Disgusted 0%
Surprised 0%
Angry 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 30-40
Gender Female, 65.4%
Calm 69%
Happy 13.5%
Sad 7%
Confused 6.3%
Angry 1.7%
Disgusted 1.2%
Surprised 1.2%
Fear 0.1%

AWS Rekognition

Age 34-42
Gender Female, 95.6%
Happy 82.1%
Fear 7.6%
Calm 4.4%
Sad 3.5%
Angry 1%
Disgusted 0.8%
Confused 0.4%
Surprised 0.3%

AWS Rekognition

Age 29-39
Gender Male, 95.2%
Calm 94.1%
Surprised 2.2%
Confused 1.3%
Sad 1.2%
Fear 0.4%
Disgusted 0.3%
Angry 0.2%
Happy 0.2%
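
The four face records above have the shape of Amazon Rekognition face-detection output: an estimated age range, a gender guess with its confidence, and emotion scores in percent. A minimal sketch, reusing the assumed local image file from the tagging example:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_4.2002.10566.jpg", "rb") as f:  # assumed local copy
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, emotions, etc.
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions are already percentages; sort to mirror the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')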

Feature analysis

Amazon

Person 99%
Chair 94.6%
Boat 76.3%
Bed 60.9%
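
The feature-analysis entries (Person, Chair, Boat, Bed) correspond to labels for which Rekognition can also return located instances with bounding boxes. A sketch of reading those instances from the same kind of detect_labels response used for the tags:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_4.2002.10566.jpg", "rb") as f:  # assumed local copy
        labels = rekognition.detect_labels(Image={"Bytes": f.read()})["Labels"]

    # Some labels carry per-instance detections with relative bounding boxes
    # in addition to the image-level confidence shown above.
    for label in labels:
        for instance in label.get("Instances", []):
            box = instance["BoundingBox"]
            print(f'{label["Name"]} {instance["Confidence"]:.1f}% '
                  f'left={box["Left"]:.2f} top={box["Top"]:.2f} '
                  f'width={box["Width"]:.2f} height={box["Height"]:.2f}')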

Text analysis

Amazon

SIESTA
FLA.
J.
SARASOTA,
J. J. STEINMETZ, SIESTA KEY. SARASOTA, FLA.
STEINMETZ,
KEY.
21987.
KODAK-SLA
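
The Amazon text results match Rekognition text detection, which returns whole lines as well as individual words from the photographer's caption strip. A minimal sketch, again assuming a local copy of the image:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_4.2002.10566.jpg", "rb") as f:  # assumed local copy
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # LINE detections give the full caption ("J. J. STEINMETZ, SIESTA KEY. ...");
    # WORD detections give the individual fragments listed above.
    for detection in response["TextDetections"]:
        print(f'{detection["Type"]}: {detection["DetectedText"]} '
              f'({detection["Confidence"]:.1f}%)')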

Google

J.
STEINMETZ,
SARASOTA,
FLA.
21987.
J. J. STEINMETZ, SIESTA KEY, SARASOTA, FLA. 21987.
SIESTA
KEY,
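
The Google results have the shape of Cloud Vision OCR output, where the first annotation is the full detected string and the remaining annotations are individual tokens. A sketch using the google-cloud-vision client; the file name is again a placeholder.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_4.2002.10566.jpg", "rb") as f:  # assumed local copy
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)

    # The first annotation is the whole caption
    # ("J. J. STEINMETZ, SIESTA KEY, SARASOTA, FLA. 21987.");
    # the rest are the individual words shown above.
    for annotation in response.text_annotations:
        print(annotation.description)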