Human Generated Data

Title

Untitled (Horse Dance, Java)

Date

January 26, 1960-February 2, 1960

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5125

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Adult 99.2
Male 99.2
Man 99.2
Person 99.2
Person 98.9
Person 98.5
People 89.6
Face 77
Head 77
Weapon 70.4
Musical Instrument 63.2
Leisure Activities 55.9
Music 55.9
Musician 55.9
Performer 55.9
Drum 55
Percussion 55

Clarifai
created on 2018-05-10

people 100
group together 99.4
adult 98.9
group 98.4
man 98.3
military 97.4
war 96
many 95.7
soldier 94
uniform 92.8
weapon 92.6
several 92.3
wear 92.1
four 91.4
three 91
skirmish 90.3
child 90
two 87.9
five 87.2
woman 85.8

Imagga
created on 2023-10-07

stretcher 92.8
litter 74
conveyance 61.2
man 26.9
barrow 25.7
outdoors 25.5
male 22
vehicle 21
handcart 19.8
person 19.4
people 19
wheeled vehicle 17.2
outdoor 16.8
travel 16.2
active 16.2
adventure 16.1
sport 15.7
extreme 15.3
dirt 15.3
summer 14.8
old 14.6
adult 14.3
vacation 13.1
boy 13
speed 12.8
two 12.7
cowboy 12.6
recreation 12.5
mountain 12.5
sand 12.3
competition 11.9
riding 11.7
together 11.4
fun 11.2
sports 11.1
grass 11.1
transport 11
child 10.9
road 10.8
leisure 10.8
backpack 10.7
horse 10.7
hiking 10.6
desert 10.3
outside 10.3
action 10.2
lifestyle 10.1
happy 10
father 10
couple 9.6
sky 9.6
day 9.4
mountains 9.3
mother 9.1
danger 9.1
tourist 9.1
animal 9
activity 9
rural 8.8
work 8.6
race 8.6
men 8.6
walking 8.5
field 8.4
farm 8
sunlight 8
working 8
women 7.9
love 7.9
racing 7.8
western 7.7
helmet 7.7
beach 7.7
sun 7.2
plow 7.2
portrait 7.1
family 7.1
country 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 99.6
person 94.4
player 85.7
old 79.7
white 65.2
group 58.1
posing 53.3
team 35.3
vintage 33.8

Face analysis

AWS Rekognition

Age 23-31
Gender Male, 99.9%
Confused 76.3%
Calm 20.9%
Surprised 6.5%
Fear 6.1%
Sad 2.2%
Disgusted 0.9%
Angry 0.6%
Happy 0.2%

AWS Rekognition

Age 19-27
Gender Female, 91.8%
Calm 97.9%
Surprised 6.3%
Fear 5.9%
Sad 2.7%
Confused 0.1%
Angry 0.1%
Disgusted 0.1%
Happy 0.1%

AWS Rekognition

Age 16-22
Gender Male, 91.6%
Sad 69.1%
Calm 47.6%
Fear 7.8%
Surprised 7.4%
Confused 6.4%
Disgusted 2.6%
Angry 2.2%
Happy 2.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.2%
Male 99.2%
Man 99.2%
Person 99.2%