Human Generated Data

Title

Untitled (Horse Dance, Java)

Date

January 26, 1960-February 2, 1960

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5164

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 99.6
Shorts 99.6
Adult 98.5
Male 98.5
Man 98.5
Person 98.5
Person 98.3
Person 97.7
Adult 97.5
Male 97.5
Man 97.5
Person 97.5
Person 97.2
Male 95.7
Person 95.7
Boy 95.7
Child 95.7
People 94.2
Person 91.9
Person 90.2
Person 89.7
Person 86.6
Person 83.3
Footwear 78.7
Shoe 78.7
Helmet 74.9
Shoe 74
Face 72.4
Head 72.4
Person 61.6
Person 61.4
Shoe 59.7
Shoe 57.7
Transportation 56.3
Vehicle 56.3
Outdoors 55.8
Back 55.3
Body Part 55.3

Clarifai
created on 2018-05-10

people 100
many 99.5
group 99.1
group together 98
adult 96.7
child 94.8
man 94.5
military 92.4
wear 87.8
woman 86.1
several 83
war 83
spectator 81.1
administration 80.7
crowd 79.1
dancing 78.5
recreation 75.5
boy 74
soldier 71.7
veil 64.4

Imagga
created on 2023-10-06

man 26.2
horse 25.4
outdoors 21.3
animal 20.8
male 19.9
people 19.5
sand 19.3
person 18.8
sport 18.7
planner 16.9
riding 16.6
travel 16.2
beach 14.7
sky 14
summer 13.5
vacation 13.1
harness 12.6
couple 12.2
happy 11.9
ride 11.6
outdoor 11.5
boy 11.3
pedestrian 11.3
speed 11
cowboy 10.9
horses 10.7
uniform 10.7
desert 10.4
sea 10.2
tourism 9.9
active 9.5
lifestyle 9.4
holiday 9.3
leisure 9.1
ocean 9.1
sunset 9
transportation 9
sunlight 8.9
together 8.8
day 8.6
men 8.6
outside 8.6
tourist 8.4
competition 8.2
transport 8.2
mountain 8
water 8
family 8
country 7.9
bike 7.8
adult 7.8
sunny 7.7
running 7.7
old 7.7
dirt 7.6
kin 7.6
child 7.6
walking 7.6
work 7.5
landscape 7.4
animals 7.4
vehicle 7.3
recreation 7.2
farm 7.1
rural 7
camel 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 99.1
outdoor 96.6
group 90.6
old 82
people 74.8
posing 74.2
white 62.2
team 35.5

Face analysis

AWS Rekognition

Age 23-31
Gender Male, 99.1%
Calm 96.5%
Surprised 6.3%
Fear 5.9%
Confused 2.6%
Sad 2.2%
Disgusted 0.2%
Angry 0.2%
Happy 0.1%

AWS Rekognition

Age 23-33
Gender Male, 98.1%
Calm 100%
Surprised 6.3%
Fear 5.9%
Sad 2.1%
Happy 0%
Confused 0%
Angry 0%
Disgusted 0%

AWS Rekognition

Age 31-41
Gender Male, 99.7%
Calm 98.5%
Surprised 6.4%
Fear 5.9%
Sad 2.3%
Angry 0.3%
Disgusted 0.2%
Happy 0.1%
Confused 0.1%

AWS Rekognition

Age 4-12
Gender Female, 99.6%
Fear 47.1%
Calm 24.9%
Sad 13.2%
Confused 9.8%
Surprised 7.7%
Happy 4.5%
Angry 3.4%
Disgusted 1.7%

Microsoft Cognitive Services

Age 33
Gender Male

Feature analysis

Amazon

Adult 98.5%
Male 98.5%
Man 98.5%
Person 98.5%
Boy 95.7%
Child 95.7%
Shoe 78.7%
Helmet 74.9%

Categories

Imagga

paintings art 98.7%