Human Generated Data

Title

Untitled (family riding in the back of a motor boat)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10567

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (each tag is followed by its detection confidence, %)

Amazon
created on 2022-01-09

Person 99
Human 99
Person 99
Person 98.9
Person 97.7
Chair 96.7
Furniture 96.7
Apparel 96.2
Clothing 96.2
Shorts 89.3
Bed 88.8
Person 87.1
Wood 71.8
Female 68.2
Table 66.8
Outdoors 66.2
Flooring 65.9
Leisure Activities 63.5
Portrait 63.2
Photography 63.2
Face 63.2
Photo 63.2
Pants 61.6
Stage 58.6
Dining Table 58.2
Girl 57.6
Waterfront 56.5
Water 56.5
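
The entries above are object labels paired with detection confidence scores. A minimal sketch of how comparable output could be produced with AWS Rekognition's DetectLabels API via boto3; the region, bucket, object key, and thresholds are placeholders, not details documented in this record:

    import boto3

    # Placeholder client and image location; the actual pipeline behind this record is not documented here.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz_10567.jpg"}},
        MaxLabels=50,        # return up to 50 labels
        MinConfidence=55.0,  # drop labels below roughly the cutoff seen in the list above
    )

    # Print "Label confidence" pairs, mirroring the format of this record.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')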

Imagga
created on 2022-01-09

ballplayer 85.3
athlete 72
player 69.7
contestant 52.1
person 37.9
man 30.9
sport 24.9
male 19.9
people 19.5
adult 17
outdoor 15.3
fun 15
vehicle 13.8
lifestyle 13.7
sky 12.7
transportation 12.5
professional 12.2
beach 11.8
uniform 11.2
action 11.1
leisure 10.8
outdoors 10.4
guy 10.4
play 10.3
men 10.3
women 10.3
outside 10.3
protection 10
exercise 10
recreation 9.9
travel 9.9
summer 9.6
bobsled 9.4
water 9.3
speed 9.2
weapon 9.1
ocean 9.1
black 9
clothing 8.9
body 8.8
active 8.2
human 8.2
competition 8.2
sand 8.2
game 8
helmet 8
equipment 7.9
model 7.8
astronaut 7.8
military 7.7
motion 7.7
extreme 7.7
health 7.6
car 7.5
sled 7.5
one 7.5
occupation 7.3
freedom 7.3
suit 7.2
love 7.1
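
A minimal sketch of requesting comparable tags from Imagga's /v2/tags REST endpoint over HTTP basic auth; the API key, secret, and image URL are placeholders:

    import requests

    # Placeholder credentials and image URL; substitute real values.
    api_key = "YOUR_IMAGGA_API_KEY"
    api_secret = "YOUR_IMAGGA_API_SECRET"
    image_url = "https://example.org/steinmetz_10567.jpg"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=(api_key, api_secret),
        timeout=30,
    )
    response.raise_for_status()

    # Each result carries a 0-100 confidence score and an English tag name,
    # comparable to the "ballplayer 85.3" style entries above.
    for result in response.json()["result"]["tags"]:
        print(f'{result["tag"]["en"]} {result["confidence"]:.1f}')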

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 96.7
clothing 94.6
person 93.4
window 89.5
dance 88.9
black and white 69.3
man 66.5
player 61.9
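
A minimal sketch of requesting comparable tags from Azure Computer Vision with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders, and the 0-1 confidence floats are scaled to match the percentages above:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint and key.
    client = ComputerVisionClient(
        "https://example.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_AZURE_KEY"),
    )

    analysis = client.analyze_image(
        "https://example.org/steinmetz_10567.jpg",
        visual_features=[VisualFeatureTypes.tags],
    )

    # Scale tag confidences from 0-1 to percentages.
    for tag in analysis.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")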

Face analysis

Amazon

AWS Rekognition

Age 28-38
Gender Male, 94.7%
Calm 53%
Surprised 15.2%
Fear 9.7%
Confused 8.5%
Happy 5.9%
Sad 3%
Disgusted 2.7%
Angry 2%

AWS Rekognition

Age 25-35
Gender Female, 57.3%
Surprised 47%
Calm 31.3%
Happy 15.5%
Disgusted 1.9%
Sad 1.5%
Angry 1.4%
Confused 0.9%
Fear 0.6%

AWS Rekognition

Age 27-37
Gender Female, 50.8%
Calm 64.8%
Surprised 18.8%
Happy 10.8%
Disgusted 1.9%
Fear 1.4%
Confused 1%
Angry 0.7%
Sad 0.7%
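
The age-range, gender, and emotion estimates above follow the shape of AWS Rekognition's DetectFaces response. A minimal sketch via boto3, with a placeholder S3 location:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz_10567.jpg"}},
        Attributes=["ALL"],  # request age range, gender, and emotion estimates
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions come back unsorted; sort by confidence to match the ordering above.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')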

Feature analysis

Amazon

Person 99%
Bed 88.8%

Captions

Microsoft

a group of people sitting in front of a window 45.8%
a group of people standing in front of a window 45.7%
a group of people in front of a window 45.6%
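
A minimal sketch of requesting comparable caption candidates from Azure Computer Vision's describe operation; the endpoint, key, and image URL are placeholders:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://example.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_AZURE_KEY"),
    )

    # Ask for several candidate captions; each comes with a 0-1 confidence score.
    description = client.describe_image(
        "https://example.org/steinmetz_10567.jpg",
        max_candidates=3,
    )

    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")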

Text analysis

Amazon

21990.
KODVK-SVEELA
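
A minimal sketch of reproducing line-level text detection with AWS Rekognition's DetectText API via boto3; the S3 location is a placeholder:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_text(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz_10567.jpg"}}
    )

    # LINE-level detections yield strings such as the edge markings listed above;
    # WORD-level detections are skipped here.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])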

Google

21990.
21990.
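
A minimal sketch of the equivalent request against the Google Cloud Vision API, assuming the google-cloud-vision client library (v2 or later) and a placeholder image URL. The first annotation is the full detected text and later entries repeat individual words or blocks, which may be why the same string appears twice above:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    # Placeholder image source; a GCS URI or raw bytes would also work.
    image = vision.Image()
    image.source.image_uri = "https://example.org/steinmetz_10567.jpg"

    response = client.text_detection(image=image)

    for annotation in response.text_annotations:
        print(annotation.description)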