Human Generated Data

Title

Untitled (Horse Dance, Java)

Date

January 26, 1960 – February 2, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5130

Machine Generated Data

Tags

Each tag below is listed with the generating service's confidence score (0–100 scale); a brief sketch of how such labels can be produced with AWS Rekognition follows the Amazon list.

Amazon
created on 2023-10-05

Adult 98.2
Male 98.2
Man 98.2
Person 98.2
Adult 96.7
Male 96.7
Man 96.7
Person 96.7
Person 96.3
Adult 96.3
Male 96.3
Man 96.3
Person 96.3
People 95.6
Adult 94.3
Male 94.3
Man 94.3
Person 94.3
Person 93.9
Adult 88.6
Male 88.6
Man 88.6
Person 88.6
Person 86.5
Person 84.7
Person 83.8
Person 83.8
Adult 83.7
Person 83.7
Female 83.7
Woman 83.7
Gun 83
Weapon 83
Face 79
Head 79
Person 77.6
Outdoors 62.5
Clothing 61.6
Shorts 61.6
Animal 56.5
Bull 56.5
Mammal 56.5
Tent 55.5
Sword 55.2
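
The Amazon list above pairs each detected label with a Rekognition confidence score. As a rough illustration only (not the museum's actual pipeline), the Python sketch below calls AWS Rekognition's DetectLabels via boto3 on a local copy of the photograph; the file name, region, and thresholds are placeholder assumptions.

    # Hypothetical sketch: produce label tags in the format shown above
    # using AWS Rekognition DetectLabels. The image path, region, and
    # thresholds are illustrative placeholders.
    import boto3

    def detect_labels(image_path, max_labels=50, min_confidence=55.0):
        client = boto3.client("rekognition", region_name="us-east-1")
        with open(image_path, "rb") as f:
            image_bytes = f.read()
        response = client.detect_labels(
            Image={"Bytes": image_bytes},
            MaxLabels=max_labels,
            MinConfidence=min_confidence,
        )
        # Each returned label has a name and a 0-100 confidence score,
        # matching the "label score" lines listed above.
        return [(label["Name"], round(label["Confidence"], 1))
                for label in response["Labels"]]

    if __name__ == "__main__":
        for name, score in detect_labels("horse_dance_java.jpg"):
            print(name, score)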

Clarifai
created on 2018-05-10

people 100
group together 98
many 97.6
group 97.4
adult 94.2
man 93.5
military 90.6
several 88.1
war 87.7
wear 83.4
soldier 79.5
recreation 76.8
print 75.8
child 75.3
skirmish 74.4
weapon 74.1
leader 73.6
outfit 71.5
four 71.4
woman 68.7

Imagga
created on 2023-10-05

man 32.3
outdoors 27.5
male 24.2
adult 20.1
sport 18.9
people 17.8
person 17.3
outdoor 16.8
carriage 16.7
wheeled vehicle 16.6
cart 15.9
vehicle 15.3
day 13.3
outside 12.8
vacation 12.3
travel 12
grass 11.9
musical instrument 11.6
summer 11.6
mountain 11.6
playing 10.9
leisure 10.8
park 10.5
old 10.4
chair 10.3
sitting 10.3
winter 10.2
horse cart 10.2
happy 10
recreation 9.9
sky 9.6
snow 9.6
adventure 9.5
child 9.4
relax 9.3
wheelchair 9.1
active 9
fun 9
equipment 9
landscape 8.9
game 8.9
weapon 8.9
working 8.8
couple 8.7
cold 8.6
sunny 8.6
wagon 8.5
horizontal 8.4
wind instrument 8.3
tourist 8.1
building 8
holiday 7.9
climb 7.8
high 7.8
walk 7.6
field 7.5
barrow 7.4
gun 7.3
transport 7.3
work 7.2
activity 7.2
smile 7.1
country 7
season 7

Microsoft
created on 2018-05-10

text 99.8
book 94.7
outdoor 90.9
person 87.2
people 84.5
group 69.6
old 51.8

Face analysis

AWS Rekognition

Age 19-27
Gender Female, 97.1%
Calm 99.6%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0.1%
Angry 0.1%
Disgusted 0.1%
Happy 0%

AWS Rekognition

Age 25-35
Gender Male, 97.5%
Calm 90%
Surprised 6.6%
Fear 6.5%
Sad 3.2%
Happy 3.2%
Disgusted 0.8%
Angry 0.6%
Confused 0.3%

AWS Rekognition

Age 6-14
Gender Male, 94.4%
Calm 69.8%
Sad 9.5%
Surprised 7.4%
Fear 7.3%
Happy 5.2%
Angry 4.8%
Disgusted 2.3%
Confused 1.3%

AWS Rekognition

Age 18-26
Gender Male, 79.1%
Calm 92.9%
Surprised 6.3%
Fear 5.9%
Happy 5.5%
Sad 2.3%
Confused 0.5%
Disgusted 0.1%
Angry 0.1%
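
Each AWS Rekognition block above reports, for one detected face, an estimated age range, a gender prediction with its confidence, and per-emotion confidences. As a minimal sketch, under the assumption that these values come from Rekognition's DetectFaces API (the file name and region below are placeholders), the same structure can be retrieved with boto3 as follows.

    # Hypothetical sketch: per-face age range, gender, and emotion scores
    # via AWS Rekognition DetectFaces. Path and region are placeholders.
    import boto3

    def analyze_faces(image_path):
        client = boto3.client("rekognition", region_name="us-east-1")
        with open(image_path, "rb") as f:
            image_bytes = f.read()
        response = client.detect_faces(
            Image={"Bytes": image_bytes},
            Attributes=["ALL"],  # include age range, gender, and emotions
        )
        faces = []
        for face in response["FaceDetails"]:
            faces.append({
                "age_range": (face["AgeRange"]["Low"], face["AgeRange"]["High"]),
                "gender": (face["Gender"]["Value"], face["Gender"]["Confidence"]),
                # Emotions carry 0-100 confidences; sort high to low,
                # as in the listings above.
                "emotions": sorted(
                    ((e["Type"], e["Confidence"]) for e in face["Emotions"]),
                    key=lambda item: item[1],
                    reverse=True,
                ),
            })
        return faces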

Microsoft Cognitive Services

Age 37
Gender Male

Feature analysis

Amazon

Adult 98.2%
Male 98.2%
Man 98.2%
Person 98.2%
Female 83.7%
Woman 83.7%
Gun 83%
