Human Generated Data

Title

Untitled (baseball players standing behind string apparatus)

Date

c. 1945

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7279

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.7
Human 99.7
Person 99.6
Person 99.2
Shoe 93.3
Footwear 93.3
Clothing 93.3
Apparel 93.3
Building 77.2
Outdoors 76.2
Play Area 75.7
Playground 75.7
Shorts 70.7
Nature 68.1
People 67.7
Sphere 64.6
Shoe 59.2
Car 57.5
Transportation 57.5
Vehicle 57.5
Automobile 57.5
Portrait 56.3
Photography 56.3
Face 56.3
Photo 56.3

Clarifai
created on 2023-10-26

people 99.9
group together 99.3
child 97.9
two 93.9
uniform 92.5
baseball 92.5
three 91.5
boy 91.3
sports equipment 90.9
man 90.3
adult 90
many 89.3
group 89.1
outfit 88.7
recreation 88.2
several 87.5
wear 86.8
four 84.4
competition 79.1
nostalgia 75.3

Imagga
created on 2022-01-15

horizontal bar 100
gymnastic apparatus 100
sports equipment 92
equipment 54.5
parallel bars 39
man 19.5
person 17.6
male 16.3
people 15.6
adult 14.2
outdoors 13.4
water 13.3
sport 11.8
active 10.8
success 10.5
portrait 10.3
hand 9.9
human 9.7
men 9.4
happy 9.4
city 9.1
park 9.1
black 9
activity 8.9
body 8.8
happiness 8.6
play 8.6
walk 8.6
outside 8.5
smile 8.5
walking 8.5
business 8.5
outdoor 8.4
summer 8.4
street 8.3
holding 8.2
one 8.2
wet 8
athlete 8
businessman 7.9
building 7.9
day 7.8
net 7.8
sky 7.6
field 7.5
sign 7.5
dark 7.5
fun 7.5
exercise 7.3
fitness 7.2
transportation 7.2
wall 7.1

Microsoft
created on 2022-01-15

outdoor 98.4
text 98.3
clothing 73.9
person 67.8
baseball 61
old 59
child 51
posing 36.4
vintage 34.7

Face analysis

AWS Rekognition

Age 25-35
Gender Female, 99.7%
Calm 47.9%
Fear 22.4%
Happy 11.4%
Surprised 8.5%
Sad 4.6%
Angry 2.9%
Confused 1.4%
Disgusted 0.9%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Shoe 93.3%
Car 57.5%

Text analysis

Amazon

STEINMETZ,
SARASOTA,
STEINMETZ, SARASOTA, FLORIDA
FLORIDA
25732
Brooklyr

Google

M. 25732 STEINMETZ, SARASOTA, FLORIDA
M.
25732
STEINMETZ,
SARASOTA,
FLORIDA