Human Generated Data

Title

Untitled (high school football players in scrimmage formation)

Date

1939

People

Artist: Harris & Ewing, American 1910s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.22275

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 99.5
Human 99.5
Person 99.4
Person 99.3
People 98.8
Person 98.3
Person 98.1
Field 96.9
Building 96.1
Person 95.9
Football 94.9
Sport 94.9
Team 94.9
Sports 94.9
Team Sport 94.9
Person 86.9
Person 86.5
Stadium 86
Arena 86
Person 82.3
Person 68.2
Footwear 65
Clothing 65
Apparel 65
Shoe 65
Person 63.6
Soccer 57.4
Football Field 55.8

Imagga
created on 2022-03-11

athlete 37.1
sport 34
sports equipment 34
cricket equipment 30.8
man 30.2
ballplayer 30.1
wicket 29.9
player 28.8
active 26.4
beach 26
person 25.1
outdoor 24.5
contestant 22.5
people 22.3
equipment 19
leisure 18.3
sand 18
male 17.7
outdoors 17.3
recreation 17
sky 16.6
lifestyle 15.9
vacation 15.6
summer 15.4
sea 14.9
adult 14.3
grass 14.2
ocean 14.2
football helmet 13.8
travel 13.4
competition 12.8
exercise 12.7
helmet 12.7
water 12.7
activity 12.5
runner 12.1
sports 12
outside 12
sunset 11.7
mountain 11.7
athletic 11.5
walking 11.4
ball 11.3
action 11.1
playing 10.9
field 10.9
silhouette 10.8
track 10.6
couple 10.5
landscape 10.4
play 10.3
fitness 9.9
course 9.8
fun 9.7
run 9.6
running 9.6
golf 9.6
shore 9.5
hobby 9.5
sun 8.9
stick 8.8
day 8.6
cricket bat 8.6
men 8.6
home plate 8.6
winter 8.5
relax 8.4
headdress 8.3
park 8.2
golfer 8.1
game 8
clothing 7.8
race 7.6
enjoy 7.5
hill 7.5
waves 7.4
mountains 7.4
snow 7.4
horse 7.4
freedom 7.3
coast 7.2
happiness 7.1

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

outdoor 99.4
black and white 84.7
person 84.1
text 83.2
black 79.3
white 73.8
sport 70.4
footwear 70
baseball 56.2
image 33.3

Face analysis

AWS Rekognition

Age 29-39
Gender Female, 52.8%
Calm 97.3%
Sad 1.3%
Surprised 0.7%
Happy 0.4%
Disgusted 0.1%
Confused 0.1%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 37-45
Gender Male, 99.4%
Calm 68.2%
Happy 12.8%
Sad 6.6%
Fear 4.6%
Confused 4.2%
Disgusted 1.5%
Surprised 1.3%
Angry 0.8%

AWS Rekognition

Age 25-35
Gender Male, 85.1%
Happy 97.2%
Calm 1.2%
Surprised 0.4%
Confused 0.4%
Fear 0.3%
Disgusted 0.2%
Sad 0.2%
Angry 0.2%

AWS Rekognition

Age 31-41
Gender Male, 89.8%
Calm 95.1%
Happy 4.6%
Surprised 0.1%
Sad 0.1%
Disgusted 0.1%
Angry 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 43-51
Gender Male, 76.7%
Calm 78.6%
Sad 19.4%
Confused 1.2%
Happy 0.2%
Surprised 0.2%
Disgusted 0.2%
Fear 0.1%
Angry 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Likely

Feature analysis

Amazon

Person 99.5%
Shoe 65%

Captions

Microsoft

a vintage photo of a group of people running on a field 88.8%
a vintage photo of a group of people on a field 88.7%
a vintage photo of a group of people in a field 88.6%