Human Generated Data

Title

Untitled (women and men playing and watching shuffleboard)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10738

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (each label is followed by the service's confidence score, in percent)

Amazon
created on 2022-01-15

Person 94.9
Human 94.9
Person 93
Road 90.6
Person 90.6
Person 90
Person 89.8
Person 87.7
Person 87.4
Person 86.8
Person 85.8
Person 77.7
Person 73.4
Person 73.3
Tarmac 68.8
Asphalt 68.8
Person 67.5
Transportation 66.8
Vehicle 65.8
Freeway 63.8
Person 62.6
Person 60.8
Highway 59.5
Crowd 58
Pedestrian 57.4
Nature 56.8
Person 50.5
Person 48.3
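
The label-and-confidence pairs above match the output format of Amazon Rekognition's DetectLabels operation. A minimal sketch of how such tags could be produced, assuming AWS credentials are configured and the photograph is available as a local JPEG (the filename is a placeholder):

```python
import boto3

# Minimal sketch: request scene/object labels from Amazon Rekognition.
# AWS credentials are assumed to be configured in the environment;
# "steinmetz_shuffleboard.jpg" is a hypothetical local copy of the photograph.
rekognition = boto3.client("rekognition")

with open("steinmetz_shuffleboard.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=40,  # the list above includes labels down to ~48%
)

for label in response["Labels"]:
    # Each label has a name and a confidence score in percent,
    # the same "label  confidence" pairing shown above.
    print(f"{label['Name']} {label['Confidence']:.1f}")
```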

Clarifai
created on 2023-10-26

people 99.9
street 99
group together 98.2
many 98.1
man 97
monochrome 96.7
group 96.4
adult 95.9
transportation system 94.6
vehicle 94.5
woman 89.2
two 88.6
seat 88.4
furniture 87.9
recreation 87.3
crowd 86.3
city 85.3
road 84.5
bench 83.7
music 83.3
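
The Clarifai tags resemble predictions from a Clarifai general image-recognition model. A rough sketch against Clarifai's v2 predict REST endpoint, assuming an API key and a general model ID (both placeholders); the exact model name and authentication scheme depend on the account setup:

```python
import base64
import requests

# Rough sketch of a Clarifai v2 predict call. CLARIFAI_API_KEY and
# GENERAL_MODEL_ID are placeholders; current Clarifai accounts may instead
# use a personal access token plus user/app IDs.
CLARIFAI_API_KEY = "YOUR_API_KEY"
GENERAL_MODEL_ID = "general-image-recognition"

with open("steinmetz_shuffleboard.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{GENERAL_MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Concept values are probabilities in [0, 1]; scale to percent to match
    # the percentages listed above.
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```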

Imagga
created on 2022-01-15

city 39.1
architecture 33.3
building 30.3
urban 28
cityscape 27.4
sky 22.3
travel 21.1
bridge 20.2
negative 18.5
water 18
structure 17.9
town 17.6
landmark 17.2
tower 17
construction 16.3
skyline 16.1
film 16
buildings 15.1
night 15.1
landscape 14.9
tourism 14.8
old 14.6
river 14.2
famous 14
scene 13.8
street 13.8
snow 13.3
light 12.7
business 11.5
downtown 11.5
modern 11.2
winter 11.1
photographic paper 10.9
road 10.8
transportation 10.8
aerial 10.7
panorama 10.5
skyscraper 10.2
house 10.2
billboard 10.1
park 9.9
tourist 9.7
outdoor 9.2
history 8.9
color 8.9
district 8.7
cloud 8.6
roof 8.6
signboard 8.4
transport 8.2
reflection 8.2
skyscrapers 7.8
capital 7.6
evening 7.5
church 7.4
historic 7.3
center 7.3
new 7.3
photographic equipment 7.2
black 7.2
summer 7.1
day 7.1
sea 7
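
The Imagga tags are consistent with Imagga's auto-tagging endpoint. A hedged sketch against the v2 tags REST API, assuming an API key and secret plus a publicly reachable image URL (all placeholders):

```python
import requests

# Hedged sketch of an Imagga auto-tagging request. The key, secret, and
# image URL are placeholders.
IMAGGA_API_KEY = "YOUR_API_KEY"
IMAGGA_API_SECRET = "YOUR_API_SECRET"
IMAGE_URL = "https://example.org/steinmetz_shuffleboard.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_API_KEY, IMAGGA_API_SECRET),  # HTTP Basic auth
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    # Confidence is already expressed as a percentage, as in the list above.
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```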

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 99.5
skyscraper 95.6
black and white 91.5
building 87.6
city 74
sky 70.8
white 60.4
old 50.7
vintage 36.2
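
The Microsoft tags look like output from the Azure Computer Vision image-analysis API. A hedged sketch against the v3.2 REST endpoint, assuming a Computer Vision resource endpoint and key (both placeholders):

```python
import requests

# Hedged sketch of an Azure Computer Vision "analyze" call requesting tags.
# AZURE_CV_ENDPOINT and AZURE_CV_KEY are placeholders for a real resource.
AZURE_CV_ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
AZURE_CV_KEY = "YOUR_KEY"

with open("steinmetz_shuffleboard.jpg", "rb") as f:
    image_bytes = f.read()

resp = requests.post(
    f"{AZURE_CV_ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": AZURE_CV_KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    # Tag confidences are in [0, 1]; scale to percent to match the list above.
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```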

Color Analysis

Face analysis (per detected face: estimated age range in years, gender, and emotion scores; confidences in percent)

Amazon

AWS Rekognition

Age 22-30
Gender Male, 97.6%
Happy 62.6%
Calm 29%
Sad 3%
Angry 2.7%
Fear 1%
Surprised 0.7%
Disgusted 0.5%
Confused 0.5%

AWS Rekognition

Age 20-28
Gender Female, 88.1%
Calm 69%
Fear 11.4%
Disgusted 8.1%
Sad 4.6%
Confused 3%
Happy 2.1%
Angry 1.4%
Surprised 0.4%

AWS Rekognition

Age 25-35
Gender Male, 99.8%
Sad 93.6%
Happy 3.2%
Calm 2.3%
Fear 0.4%
Confused 0.2%
Angry 0.1%
Surprised 0.1%
Disgusted 0.1%

AWS Rekognition

Age 22-30
Gender Male, 75.8%
Happy 64.4%
Calm 25.2%
Sad 8.2%
Disgusted 0.6%
Confused 0.6%
Angry 0.4%
Surprised 0.4%
Fear 0.3%

AWS Rekognition

Age 19-27
Gender Female, 94%
Sad 49.3%
Calm 40.2%
Happy 7.6%
Fear 0.9%
Angry 0.7%
Disgusted 0.6%
Confused 0.4%
Surprised 0.3%

AWS Rekognition

Age 28-38
Gender Male, 68%
Calm 49.6%
Sad 25.4%
Happy 5.8%
Fear 5%
Disgusted 4.9%
Surprised 4.2%
Angry 3.7%
Confused 1.4%

AWS Rekognition

Age 11-19
Gender Male, 99.8%
Happy 68.7%
Sad 18.9%
Calm 5.8%
Fear 2.9%
Angry 1.5%
Surprised 1%
Disgusted 0.6%
Confused 0.5%

AWS Rekognition

Age 14-22
Gender Male, 95.8%
Calm 93.6%
Sad 2.8%
Angry 1.4%
Fear 0.8%
Happy 0.7%
Disgusted 0.3%
Confused 0.3%
Surprised 0.2%
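
Each block above (age range, gender, and an emotion distribution per detected face) mirrors the FaceDetails structure returned by Amazon Rekognition's DetectFaces operation. A minimal sketch, again assuming configured AWS credentials and a hypothetical local copy of the image:

```python
import boto3

# Minimal sketch: per-face age range, gender, and emotion estimates from
# Amazon Rekognition. Filename and credentials are assumptions.
rekognition = boto3.client("rekognition")

with open("steinmetz_shuffleboard.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back with per-emotion confidences, as in the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```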

Feature analysis

Amazon

Person 94.9%

Text analysis (strings detected in the image by each service's OCR, reproduced verbatim)

Amazon

36195
10
IS3
OFF
KODVK-EVEELA
a

Google

10 OFF 36195 YS3
10
OFF
36195
YS3
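
The Amazon strings above are raw detections of the kind returned by Rekognition's DetectText operation, while the Google list follows Cloud Vision's convention of reporting the full detected string first and then the individual words. A minimal Rekognition sketch, with the same assumptions as above:

```python
import boto3

# Minimal sketch: OCR-style text detection with Amazon Rekognition.
# Filename and credentials are assumptions, as above.
rekognition = boto3.client("rekognition")

with open("steinmetz_shuffleboard.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    # Detections are returned as LINEs and the WORDs inside them; the raw
    # strings are kept verbatim, exactly as listed above.
    print(f"{detection['Type']}: {detection['DetectedText']}")
```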