Human Generated Data

Title

Untitled (Cardinal baseball player hitting ball; Stan Musial)

Date

1954

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14456

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 98.2
Human 98.2
Person 98.1
Person 94.4
People 86.4
Shorts 76.4
Clothing 76.4
Apparel 76.4
Crowd 69
Duel 57.9
Ballplayer 57.1
Team 57.1
Team Sport 57.1
Sport 57.1
Baseball 57.1
Athlete 57.1
Sports 57.1
Softball 57.1
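
The label lists above follow the shape of the response documented for Amazon Rekognition's DetectLabels API. As a minimal sketch, assuming a hypothetical sample payload in that `{"Labels": [{"Name": ..., "Confidence": ...}]}` shape, filtering such tags by a confidence threshold might look like:

```python
# Sketch: filtering machine-generated labels by confidence.
# SAMPLE_RESPONSE is hypothetical; it mirrors the documented
# DetectLabels response shape and reuses a few tags from above.

SAMPLE_RESPONSE = {
    "Labels": [
        {"Name": "Person", "Confidence": 98.2},
        {"Name": "Ballplayer", "Confidence": 57.1},
        {"Name": "Crowd", "Confidence": 69.0},
    ]
}

def labels_above(response, threshold):
    """Return (name, confidence) pairs at or above the threshold."""
    return [
        (label["Name"], label["Confidence"])
        for label in response["Labels"]
        if label["Confidence"] >= threshold
    ]

print(labels_above(SAMPLE_RESPONSE, 60.0))
```

At a 60% threshold this keeps only the Person and Crowd tags, which is why lists like the one above are often truncated to high-confidence entries.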

Clarifai
created on 2023-10-29

people 99.9
group together 99
athlete 98.7
baseball 98.7
sports equipment 97.9
uniform 96.4
many 96.3
wear 95.5
competition 95.2
man 95.1
adult 94.8
outfit 93
motion 92.1
batter 91.6
monochrome 91
pitcher 89.2
two 88.6
baseball bat 88.4
one 88.2
action 87.6

Imagga
created on 2022-01-29

silhouette 29
person 24.9
sport 24.8
man 22.2
people 21.8
sunset 18.9
sky 18.5
stage 18
athlete 17.6
newspaper 17.4
player 16.3
world 15.3
outdoors 15
male 14.9
product 14.6
summer 13.5
activity 13.4
adult 13
platform 13
sun 12.9
creation 12.6
power 12.6
leisure 12.5
light 12.3
smoke 12.1
freedom 11.9
danger 11.8
action 11.5
success 11.3
human 11.2
clothing 11.2
men 10.3
black 10.3
symbol 10.1
relaxation 10.1
active 10
exercise 10
posing 9.8
businessman 9.7
sea 9.6
body 9.6
rock 9.6
lifestyle 9.4
energy 9.3
dark 9.2
travel 9.2
fun 9
one 9
ballplayer 9
destruction 8.8
happy 8.8
women 8.7
water 8.7
day 8.6
happiness 8.6
sunrise 8.4
beach 8.4
structure 8.3
billboard 8.3
sign 8.3
skateboard 8.3
industrial 8.2
dirty 8.1
art 7.9
business 7.9
disaster 7.8
model 7.8
cloud 7.8
chemical 7.7
motion 7.7
gas 7.7
outdoor 7.6
vehicle 7.6
wheeled vehicle 7.6
clouds 7.6
relax 7.6
serene 7.5
landscape 7.4
park 7.4
television 7.4
vacation 7.4
protection 7.3
sculpture 7.2
mountain 7.1

Microsoft
created on 2022-01-29

text 99.7
outdoor 92.1
black and white 73.7
clothing 60.9

Face analysis

AWS Rekognition

Age 6-14
Gender Female, 77.1%
Calm 49.8%
Sad 14.4%
Happy 12%
Angry 10.8%
Surprised 3.9%
Disgusted 3.8%
Fear 3.4%
Confused 2%

AWS Rekognition

Age 18-24
Gender Male, 83.9%
Calm 87.7%
Sad 3.1%
Happy 2.9%
Angry 2.2%
Confused 2.1%
Fear 0.9%
Surprised 0.7%
Disgusted 0.5%

AWS Rekognition

Age 21-29
Gender Female, 60.6%
Calm 89.8%
Sad 4.7%
Confused 2.2%
Surprised 1%
Disgusted 1%
Happy 0.6%
Angry 0.5%
Fear 0.3%

AWS Rekognition

Age 18-24
Gender Male, 73.9%
Calm 96.7%
Sad 1.7%
Happy 0.4%
Confused 0.4%
Angry 0.4%
Disgusted 0.2%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 22-30
Gender Female, 89.4%
Calm 93%
Sad 5.7%
Confused 0.4%
Angry 0.2%
Happy 0.2%
Disgusted 0.2%
Fear 0.2%
Surprised 0.1%

AWS Rekognition

Age 25-35
Gender Female, 78.1%
Calm 82.8%
Angry 6%
Happy 4.1%
Sad 1.7%
Surprised 1.7%
Confused 1.6%
Disgusted 1.1%
Fear 1%

AWS Rekognition

Age 14-22
Gender Female, 93.6%
Sad 54.6%
Calm 38%
Happy 2.5%
Surprised 1.6%
Fear 1.4%
Confused 0.9%
Angry 0.6%
Disgusted 0.5%
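
Each block above reports one detected face with an age range, a gender estimate, and a ranked emotion list, matching the FaceDetails structure documented for Amazon Rekognition's DetectFaces API. A minimal sketch of extracting the dominant emotion, using a hypothetical sample face in that shape:

```python
# Sketch: reading the dominant emotion from face-analysis output.
# SAMPLE_FACE is hypothetical; it mirrors the FaceDetails shape
# (AgeRange, Gender, Emotions) and reuses values from one of the
# blocks above.

SAMPLE_FACE = {
    "AgeRange": {"Low": 18, "High": 24},
    "Gender": {"Value": "Male", "Confidence": 83.9},
    "Emotions": [
        {"Type": "CALM", "Confidence": 87.7},
        {"Type": "SAD", "Confidence": 3.1},
        {"Type": "HAPPY", "Confidence": 2.9},
    ],
}

def dominant_emotion(face):
    """Return the highest-confidence emotion label for one face."""
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

print(dominant_emotion(SAMPLE_FACE))
```

Note that the emotion confidences are relative rankings over a fixed label set, not calibrated probabilities, so a "Calm 38%" face can still be listed under a "Sad 54.6%" majority as in the last block above.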

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 98.2%
Person 98.1%
Person 94.4%

Text analysis

Amazon

6

Google

MJI7-- YT37A°2 -- XAG
MJI7--
YT37A°2
--
XAG