Human Generated Data

Title

Untitled (baseball player hitting a ball off a tee, Dodger training quarters at Vero Beach)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5582

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 99.6
Human 99.6
Person 96
Female 69.3
Face 66.4
People 62.5
Sport 59.5
Sports 59.5
Nature 56.7
Cricket 55.4

Clarifai
created on 2023-10-15

people 99.1
man 95.1
sport 94.4
recreation 93.8
competition 91.4
adult 89.1
fun 87.3
one 87
athlete 86.1
woman 86.1
exercise 83.1
field 81.5
street 80
desktop 79.9
child 79.9
young 78
gameplan (sports) 77.7
active 76.6
beach 76.4
game 74.9

Imagga
created on 2021-12-15

cleaner 59
sport 42.1
golf 37.3
man 32.9
golfer 30.7
swing 27.7
course 27.7
club 26.4
ball 25.7
cleaning implement 25.3
leisure 24.1
rake 23.4
male 23.4
play 23.3
active 22.5
grass 22.1
crutch 22
tool 21.6
game 21.4
playing 20.1
tee 19.5
outdoors 19.4
exercise 19.1
golfing 18.6
people 17.9
staff 17.7
outdoor 17.6
recreation 17
stick 17
player 16.4
lifestyle 15.9
adult 15.6
person 15.4
putting 14.7
outside 14.6
hole 14.4
fun 14.2
senior 14.1
action 13.9
hobby 13.3
walking 13.3
competition 12.8
putt 12.8
squeegee 12.4
driver 12.3
summer 12.2
men 12
hit 11.7
activity 11.6
sports 11.1
iron 11.1
broom 10.9
sky 10.8
fairway 10.8
one 10.5
happy 10
fitness 9.9
putter 9.9
vacation 9.8
retirement 9.6
sunny 9.5
old 9.1
sports equipment 9
boy 8.7
swab 8.5
equipment 8.3
park 8.2
trees 8
cricket equipment 7.9
par 7.9
smile 7.8
standing 7.8
retired 7.8
practice 7.7
athlete 7.7
health 7.6
drive 7.6
enjoy 7.5
flag 7.3
snow 7.3
day 7.1
travel 7

Microsoft
created on 2021-12-15

text 97.7
outdoor 96.8
black and white 70.1
drawing 69.3
person 63.3
posing 43.7
net 17.9

Face analysis

AWS Rekognition

Age 36-54
Gender Male, 88.3%
Calm 76.3%
Sad 21.1%
Happy 1.4%
Angry 0.3%
Confused 0.3%
Fear 0.2%
Disgusted 0.2%
Surprised 0.2%

AWS Rekognition

Age 4-14
Gender Female, 62.6%
Happy 50.7%
Calm 36.2%
Sad 4.6%
Angry 3.6%
Confused 2.1%
Disgusted 1.7%
Fear 0.5%
Surprised 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%

Categories

Imagga

paintings art 100%

Text analysis

Amazon

M-25671

Google

M-25676
M-25676