Human Generated Data

Title

Untitled (finish line of a children’s race)

Date

1947

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18334

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Field 98.9
Person 98.3
Human 98.3
People 98.3
Person 97.1
Person 97.1
Football 96.6
Team 96.6
Sports 96.6
Team Sport 96.6
Sport 96.6
Person 93.7
Person 93.1
Person 92.2
Person 91.8
Building 91.8
Tarmac 90.8
Asphalt 90.8
Person 90.6
Person 89
Person 87.6
Arena 86.9
Stadium 86.9
Person 86.2
Person 85.1
Road 83.3
Person 83.1
Person 77.4
Mammal 72.2
Horse 72.2
Animal 72.2
Person 70.2
Football Field 58.5
American Football 58.5
Urban 56.1

Imagga
created on 2022-03-04

dark 18.4
sky 17.4
man 16.8
structure 16.4
tree 15.6
trees 15.1
athlete 14.9
landscape 14.9
people 14.5
sunset 14.4
person 14
park 14
grass 13.4
silhouette 12.4
bench 12.2
sun 12.2
park bench 11.8
night 11.6
contestant 11.2
fountain 10.9
destruction 10.8
sport 10.7
world 10.7
environment 10.7
male 10.7
runner 10.6
urban 10.5
outdoors 10
danger 10
outdoor 9.9
road 9.9
old 9.8
forest 9.6
building 9.5
scene 9.5
evening 9.3
water 9.3
travel 9.2
protection 9.1
black 9
summer 9
recreation 9
nuclear 8.7
boy 8.7
light 8.7
walk 8.6
winter 8.5
field 8.4
history 8.1
snow 8
ball 8
adult 7.9
gymnasium 7.9
couple 7.8
disaster 7.8
explosion 7.7
beach 7.7
player 7.7
house 7.6
sports equipment 7.5
city 7.5
seat 7.5
equipment 7.4
peaceful 7.3
industrial 7.3
child 7.3
portrait 7.1

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

outdoor 98.4
black and white 97.1
text 96.2
tree 81.2
black 66.9
white 65.5
person 56.4
monochrome 54.6
old 42.3

Face analysis

AWS Rekognition

Age 14-22
Gender Female, 69.1%
Sad 57.3%
Calm 39.4%
Disgusted 1.1%
Confused 1%
Happy 0.4%
Angry 0.3%
Fear 0.2%
Surprised 0.2%

AWS Rekognition

Age 23-31
Gender Male, 57.9%
Fear 71.2%
Calm 16.7%
Happy 7.7%
Surprised 1.6%
Disgusted 1.2%
Angry 0.5%
Confused 0.5%
Sad 0.4%

AWS Rekognition

Age 6-14
Gender Female, 75.9%
Calm 73.8%
Sad 17.4%
Happy 3.7%
Fear 2.2%
Disgusted 1%
Confused 0.8%
Surprised 0.6%
Angry 0.6%

AWS Rekognition

Age 18-24
Gender Male, 98.2%
Calm 64.4%
Happy 11.2%
Sad 7.7%
Fear 5.2%
Surprised 3.9%
Confused 3.1%
Angry 2.3%
Disgusted 2.2%

AWS Rekognition

Age 35-43
Gender Female, 67.7%
Happy 54.1%
Surprised 29.2%
Calm 5.6%
Confused 3.8%
Sad 3.4%
Fear 2%
Disgusted 1.3%
Angry 0.7%

AWS Rekognition

Age 19-27
Gender Male, 99.7%
Calm 51.6%
Sad 33.5%
Fear 8.2%
Happy 2.6%
Angry 2.6%
Disgusted 0.6%
Confused 0.5%
Surprised 0.3%

AWS Rekognition

Age 19-27
Gender Male, 99.1%
Calm 92.2%
Disgusted 1.8%
Sad 1.7%
Angry 1.5%
Surprised 0.8%
Fear 0.7%
Confused 0.7%
Happy 0.6%

AWS Rekognition

Age 37-45
Gender Male, 75%
Calm 59.6%
Confused 12.1%
Sad 9.1%
Happy 6.2%
Fear 5.3%
Disgusted 3.3%
Angry 2.4%
Surprised 2%

AWS Rekognition

Age 16-22
Gender Male, 99.7%
Calm 60%
Sad 13.4%
Fear 11.8%
Confused 6.7%
Angry 4.1%
Disgusted 2.3%
Happy 1.2%
Surprised 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Feature analysis

Amazon

Person 98.3%
Horse 72.2%

Captions

Microsoft

a vintage photo of a horse 89.3%
a vintage photo of a person riding a horse in front of a building 70.6%
a vintage photo of a group of people standing in front of a building 70.5%

Text analysis

Amazon

7