Human Generated Data

Title

Untitled (male athletes running toward pile of shoes)

Date

c. 1960

People

Artist: John Howell, American, active 1930s-1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21608

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Shorts 99.8
Clothing 99.8
Apparel 99.8
Person 99.6
Human 99.6
Person 99.6
Person 99.5
Person 99.5
Person 99.4
Person 99.4
Person 99.1
Person 98.9
Person 98.4
Person 98.3
Person 96.3
People 95.9
Person 95.3
Person 94.8
Field 90.9
Sport 90.8
Sports 90.8
Team Sport 76
Team 76
Crowd 73.3
Football 67.2
Horse 66.3
Animal 66.3
Mammal 66.3
Person 62.3
Marching 58.1
Croquet 56.2
Cricket 55.9
Person 55
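
The Amazon labels and confidence scores above are the kind of output returned by AWS Rekognition's DetectLabels operation. Below is a minimal sketch using boto3; the file name and client configuration are assumptions for illustration, not part of the original record.

```python
# Sketch: retrieving image labels with AWS Rekognition DetectLabels (boto3).
# "21608.jpg" is a placeholder file name; credentials and region come from
# the standard AWS environment configuration.
import boto3

rekognition = boto3.client("rekognition")

with open("21608.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=50,
)

# Each label carries a name and a confidence percentage,
# e.g. "Shorts 99.8", "Person 99.6" as listed above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```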

Clarifai
created on 2023-10-22

people 99.9
many 99.4
group together 98.7
group 98.3
adult 98
child 95.2
woman 94.3
man 93.3
wear 92.2
crowd 91.9
dancing 89.9
recreation 87.7
music 87.2
several 85.6
dancer 85.2
outfit 83.5
ceremony 82.4
veil 82
monochrome 79.7
athlete 79.1

Imagga
created on 2022-03-05

sand 29
travel 21.1
beach 19.5
landscape 19.3
camel 19.2
sky 18.5
group 17.7
dancer 16.8
outdoors 15.9
sea 15.6
people 15.6
summer 15.4
farm 15.2
vacation 14.7
outdoor 14.5
person 14.3
rural 14.1
park 14
performer 13.5
mountain 13.3
tourism 13.2
country 13.2
rock 13
coast 12.6
shore 12.1
water 11.3
sun 11.3
outside 11.1
sport 10.8
crowd 10.6
resort 10.4
entertainer 10.3
animals 10.2
male 10.1
silhouette 9.9
herd 9.8
scenic 9.7
horse 9.6
grass 9.5
walking 9.5
sunny 9.5
man 9.4
lake 9.2
ocean 9.1
national 9.1
livestock 8.8
swimsuit 8.6
cattle 8.6
desert 8.5
clouds 8.4
swimming trunks 8.3
scenery 8.1
cow 8
agriculture 7.9
holiday 7.9
stone 7.8
earth 7.8
ungulate 7.6
field 7.5
leisure 7.5
adult 7.5
natural 7.4
white 7.3
garment 7.3
sunset 7.2
meadow 7.2
team 7.2
soil 7.2

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

outdoor 98.7
text 86.2
black and white 84.9
white 68.9
person 66.7
horse 19.9

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 25-35
Gender Male, 99.2%
Fear 73.7%
Calm 10.7%
Sad 7%
Happy 5.8%
Confused 0.9%
Disgusted 0.8%
Surprised 0.8%
Angry 0.5%

AWS Rekognition

Age 19-27
Gender Male, 91.7%
Calm 59.3%
Happy 25.5%
Fear 9.2%
Sad 2%
Confused 1.5%
Disgusted 1.1%
Surprised 1%
Angry 0.4%

AWS Rekognition

Age 22-30
Gender Female, 57.1%
Calm 61.8%
Happy 22.6%
Sad 9%
Surprised 1.7%
Angry 1.5%
Disgusted 1.5%
Fear 1%
Confused 0.9%

AWS Rekognition

Age 24-34
Gender Male, 78.7%
Disgusted 71.4%
Confused 8.7%
Happy 8.1%
Surprised 3.7%
Angry 3%
Calm 2.6%
Sad 2.2%
Fear 0.4%

AWS Rekognition

Age 6-16
Gender Female, 68.8%
Surprised 41.6%
Fear 17.5%
Happy 11.8%
Confused 11.7%
Calm 9.4%
Sad 4.1%
Disgusted 2%
Angry 2%

AWS Rekognition

Age 28-38
Gender Male, 99.4%
Surprised 65.9%
Happy 9%
Calm 8.9%
Confused 5.7%
Disgusted 3.8%
Fear 3%
Sad 2.3%
Angry 1.4%

AWS Rekognition

Age 18-26
Gender Male, 99.2%
Calm 58.5%
Disgusted 14.7%
Happy 10.5%
Fear 4.8%
Surprised 4%
Confused 3.2%
Angry 2.3%
Sad 2%

AWS Rekognition

Age 14-22
Gender Female, 94.6%
Fear 89.5%
Surprised 9.1%
Calm 0.5%
Confused 0.4%
Sad 0.2%
Happy 0.1%
Angry 0.1%
Disgusted 0.1%

AWS Rekognition

Age 18-26
Gender Female, 82.1%
Sad 48.8%
Angry 23.8%
Calm 17.6%
Surprised 2.7%
Disgusted 2.5%
Fear 2.1%
Confused 1.7%
Happy 0.9%

AWS Rekognition

Age 20-28
Gender Female, 94.8%
Fear 98.7%
Happy 0.4%
Sad 0.3%
Angry 0.2%
Surprised 0.1%
Calm 0.1%
Disgusted 0.1%
Confused 0%

AWS Rekognition

Age 20-28
Gender Male, 99.1%
Fear 78.1%
Disgusted 5.7%
Happy 5.4%
Sad 4.6%
Calm 2.9%
Angry 1.3%
Surprised 1%
Confused 0.9%

AWS Rekognition

Age 20-28
Gender Male, 99.8%
Fear 62.4%
Calm 17.7%
Happy 15.7%
Sad 1.4%
Disgusted 0.9%
Surprised 0.7%
Angry 0.6%
Confused 0.5%

AWS Rekognition

Age 23-31
Gender Female, 98.1%
Confused 41.7%
Happy 32.2%
Calm 18.6%
Surprised 2.1%
Sad 1.9%
Angry 1.4%
Fear 1.1%
Disgusted 1%
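
The age ranges, gender estimates, and emotion percentages in the AWS Rekognition blocks above correspond to the FaceDetails returned by the DetectFaces operation when all attributes are requested. A hedged sketch with boto3 follows; the file name is a placeholder.

```python
# Sketch: face attributes with AWS Rekognition DetectFaces (boto3).
# "21608.jpg" is a placeholder file name.
import boto3

rekognition = boto3.client("rekognition")

with open("21608.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    # Emotions come back with confidences that sum to roughly 100%.
    emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in emotions:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```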

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely
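
The Google Vision entries above report likelihood buckets (Very unlikely, Possible, Likely, Very likely) rather than percentages. A minimal sketch using the google-cloud-vision Python client is shown below; the file name is a placeholder.

```python
# Sketch: face likelihoods with the Google Cloud Vision API.
# "21608.jpg" is a placeholder file name; the client reads credentials
# from GOOGLE_APPLICATION_CREDENTIALS.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("21608.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY or LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```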

Feature analysis

Amazon

Person
Horse
Person 99.6%
Person 99.6%
Person 99.5%
Person 99.5%
Person 99.4%
Person 99.4%
Person 99.1%
Person 98.9%
Person 98.4%
Person 98.3%
Person 96.3%
Person 95.3%
Person 94.8%
Person 62.3%
Person 55%
Horse 66.3%
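
The per-instance Person and Horse scores above can be read from the Instances field that DetectLabels returns for certain object labels. A sketch, assuming the same boto3 client and image bytes as the earlier label-detection sketch:

```python
# Sketch: per-instance detections from DetectLabels (boto3).
# Assumes "rekognition" and "image_bytes" as set up in the earlier sketch.
response = rekognition.detect_labels(Image={"Bytes": image_bytes})

for label in response["Labels"]:
    if label["Name"] in ("Person", "Horse"):
        # Each detected instance has its own bounding box and confidence.
        for instance in label["Instances"]:
            print(f'{label["Name"]} {instance["Confidence"]:.1f}%')
```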

Categories

Imagga

paintings art 96.2%
text visuals 3.4%

Text analysis

Amazon

82A
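
The detected text above ("82A") is consistent with output from AWS Rekognition's DetectText operation. A hedged sketch, assuming the same client and image bytes as the earlier sketches:

```python
# Sketch: text detection with AWS Rekognition DetectText (boto3).
# Assumes "rekognition" and "image_bytes" from the earlier sketches.
response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])  # e.g. "82A"
```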