Human Generated Data

Title

Untitled (baseball players sitting in dugout)

Date

c. 1945

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7259

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.5
Human 99.5
Person 98.5
Nature 97.8
Person 96
Person 95
Outdoors 94.7
Person 93.5
Person 93.5
Person 87.7
Person 79.9
Person 78.4
Person 77.9
Person 76.5
Person 76
Person 76
Shack 73.8
Countryside 73.8
Hut 73.8
Rural 73.8
Building 73.8
Person 71.3
Dugout 70.5
Person 69.8
Person 67.5
Crowd 61.9
Bus Stop 58.4
Person 42.7
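
The Amazon tags above have the shape of output from AWS Rekognition's DetectLabels operation: label names paired with confidence scores on a 0-100 scale. A minimal sketch of how such labels might be retrieved with boto3 follows; the image path and the MaxLabels/MinConfidence values are illustrative assumptions, not part of this record.

import boto3

# Hypothetical local path to the digitized photograph; adjust as needed.
IMAGE_PATH = "steinmetz_dugout.jpg"

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with confidence scores (0-100),
# comparable to the "Person 99.5" and "Dugout 70.5" values listed above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=30,       # assumed cap on returned labels
    MinConfidence=40,   # assumed threshold; the list above bottoms out near 42.7
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')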

Clarifai
created on 2023-10-25

people 99.6
tent 98.8
many 98.2
group together 96.9
crowd 92.1
man 91.3
group 90.1
uniform 88.3
military 84.8
soldier 83.7
war 80.2
camp 79.8
adult 77.5
campsite 75.5
spectator 73.7
transportation system 68.9
outfit 65.6
child 64.6
wear 63.3
family 60.9

Imagga
created on 2022-01-08

stage 53
platform 42.3
city 20.8
sky 20.4
structure 16.6
urban 14.8
architecture 14.1
travel 13.4
skyline 13.3
ski slope 12.8
landscape 12.6
building 12.3
black 12
water 12
old 11.8
power 11.7
scene 11.2
slope 11.2
construction 11.1
grunge 11.1
sea 10.9
industrial 10.9
ocean 10.8
tower 10.7
light 10.7
night 10.6
business 10.3
equipment 10.1
landmark 9.9
history 9.8
monitor 9.1
man 8.8
port 8.7
downtown 8.6
male 8.5
electronic equipment 8.4
dark 8.3
island 8.2
symbol 8.1
transportation 8.1
day 7.8
sunny 7.7
geological formation 7.7
factory 7.7
tree 7.7
industry 7.7
winter 7.7
house 7.5
silhouette 7.4
smoke 7.4
person 7.4
countryside 7.3
transport 7.3
snow 7.3
people 7.2

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 99.6
outdoor 88.7
standing 79.2
black and white 78.2
black 73.2
people 62.5

Color Analysis

Face analysis

AWS Rekognition

Age 23-33
Gender Male, 74.2%
Calm 85.8%
Disgusted 5.6%
Happy 5.3%
Surprised 1.4%
Confused 0.9%
Fear 0.4%
Sad 0.3%
Angry 0.2%

AWS Rekognition

Age 18-26
Gender Male, 76.9%
Calm 50.9%
Sad 18.9%
Happy 12.3%
Angry 8%
Fear 3.4%
Disgusted 3%
Surprised 2.1%
Confused 1.5%

AWS Rekognition

Age 26-36
Gender Female, 51.8%
Calm 93.3%
Sad 3.1%
Confused 2%
Happy 0.8%
Disgusted 0.3%
Surprised 0.3%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 28-38
Gender Male, 95.6%
Happy 42.7%
Sad 17.7%
Calm 15.1%
Surprised 9.8%
Fear 5.2%
Confused 3.7%
Angry 3%
Disgusted 2.9%

AWS Rekognition

Age 18-24
Gender Male, 68.8%
Calm 97.2%
Sad 1.3%
Angry 0.8%
Surprised 0.2%
Confused 0.1%
Happy 0.1%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 23-31
Gender Male, 74.4%
Calm 99.3%
Sad 0.4%
Happy 0.2%
Disgusted 0%
Surprised 0%
Angry 0%
Fear 0%
Confused 0%
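
The per-face age, gender, and emotion estimates above match the structure returned by AWS Rekognition's DetectFaces operation when all facial attributes are requested. A minimal sketch, assuming the same hypothetical local image file as in the labeling example:

import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_dugout.jpg", "rb") as f:  # hypothetical path
    image_bytes = f.read()

# Attributes=["ALL"] requests AgeRange, Gender, and Emotions for each detected
# face, matching the per-face blocks listed above.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')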

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely
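
The Google Vision entries above report likelihood categories (Very unlikely through Very likely) rather than numeric scores; this is the form returned by the Cloud Vision face detection API. A minimal sketch using the google-cloud-vision client library, with a hypothetical image path and the usual application-default credentials assumed:

from google.cloud import vision

with open("steinmetz_dugout.jpg", "rb") as f:  # hypothetical path
    content = f.read()

client = vision.ImageAnnotatorClient()
response = client.face_detection(image=vision.Image(content=content))

# Each FaceAnnotation carries likelihood enums (VERY_UNLIKELY ... VERY_LIKELY)
# for the attributes listed above: surprise, anger, sorrow, joy, headwear, blur.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)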

Feature analysis

Amazon

Person 99.5%

Text analysis

Amazon

SARASOTA,
STEINMETZ,
STEINMETZ, SARASOTA, FLORIDA
FLORIDA
25757
KODOK-EVEELA
IOS
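
The Amazon text readings above, including the photographer's stamp and negative number, have the shape of output from AWS Rekognition's DetectText operation, which reports both full lines and individual words. A minimal sketch, again assuming a hypothetical local copy of the image:

import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_dugout.jpg", "rb") as f:  # hypothetical path
    image_bytes = f.read()

# DetectText returns both LINE and WORD detections, which is why the list
# above mixes a full line ("STEINMETZ, SARASOTA, FLORIDA") with single words.
response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])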

Google

25757 STEINMETZ, SARASOTA, FLORIDA
STEINMETZ,
FLORIDA
25757
SARASOTA,