Human Generated Data

Title

Untitled (Horse Dance, Java)

Date

January 26, 1960-February 2, 1960

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5172

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Person 99
Person 98.7
Clothing 98.4
Shorts 98.4
Person 98.2
Person 98.1
Person 94.4
Person 93.2
City 91.9
Road 91.9
Street 91.9
Urban 91.9
Person 91.3
Outdoors 88.8
People 87.7
Footwear 77.3
Shoe 77.3
Face 77
Head 77
Nature 76.6
Hat 73
Stilts 69.7
Bicycle 66.2
Transportation 66.2
Vehicle 66.2
Firearm 64.9
Gun 64.9
Rifle 64.9
Weapon 64.9
Shoe 58.5
Cycling 57
Sport 57
Skirt 56.8
Walking 56.2
Motorcycle 56
Accessories 55.6
Bag 55.6
Handbag 55.6
Dress 55.5
Slum 55.2

Clarifai
created on 2018-05-10

people 100
group 99.4
group together 98.9
adult 97.7
man 96.3
child 94.7
many 94.7
cavalry 94
military 93.7
canine 91.8
wear 91.6
soldier 90.4
three 89.3
vehicle 89.3
several 89
administration 87.5
war 87.2
transportation system 86.2
four 85.6
outfit 84.8

Imagga
created on 2023-10-06

crutch 45.2
staff 35.1
stick 29.8
weapon 18.1
horse 17.2
man 16.8
male 15.6
outdoors 15
engineer 14.6
travel 13.4
old 13.2
bow and arrow 12.8
uniform 12.3
people 12.3
person 11.7
sport 11.7
military 11.6
sand 11.4
day 11
danger 10.9
animal 10.8
farm 10.7
outdoor 10.7
grass 10.3
history 9.8
military uniform 9.8
desert 9.3
tourist 9.2
road 9
clothing 9
mountain 8.9
rural 8.8
soldier 8.8
tree 8.7
instrument 8.7
war 8.7
two 8.5
leisure 8.3
landscape 8.2
equipment 8
country 7.9
boy 7.8
architecture 7.8
summer 7.7
extreme 7.7
house 7.5
snow 7.5
brown 7.4
rifle 7.2
building 7.1
pedestrian 7.1
camel 7
tool 7
sky 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 99.8
tree 97.8
posing 85.3
old 58.5
group 57
people 56.2

Face analysis

Amazon

AWS Rekognition

Age 19-27
Gender Male, 81.8%
Calm 98.4%
Surprised 6.4%
Fear 5.9%
Sad 2.2%
Confused 0.8%
Happy 0.1%
Disgusted 0.1%
Angry 0%

AWS Rekognition

Age 47-53
Gender Female, 64%
Sad 99.6%
Happy 15.5%
Surprised 7.3%
Fear 6.9%
Calm 6.3%
Angry 3.9%
Confused 2.8%
Disgusted 1.5%

AWS Rekognition

Age 19-27
Gender Female, 99.7%
Sad 100%
Calm 15.1%
Surprised 6.5%
Fear 5.9%
Happy 3.3%
Disgusted 0.4%
Angry 0.2%
Confused 0%

Feature analysis

Amazon

Person 99%
Shoe 77.3%