Human Generated Data

Title

Untitled (Horse Dance, Java)

Date

January 26, 1960 – February 2, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5165

Machine Generated Data

Tags

Amazon
created on 2023-10-07

People 99.8
Person 98.9
Person 98.4
Person 98.1
Person 97.9
Person 97.2
Person 96.1
Person 95.5
Person 85.9
Adult 85.9
Male 85.9
Man 85.9
Person 84.1
Outdoors 80.9
Face 76.1
Head 76.1
Person 75.8
Clothing 75.4
Skirt 75.4
Machine 74.8
Wheel 74.8
Bicycle 71.3
Transportation 71.3
Vehicle 71.3
Nature 65.4
Wheel 60.9
Plant 58
Vegetation 58
Firearm 56.1
Gun 56.1
Rifle 56.1
Weapon 56.1
Dancing 56
Leisure Activities 56
Samurai 55.9
Shorts 55.7
Walking 55.7

Clarifai
created on 2018-05-10

people 100
many 99.3
group 98.2
group together 98
adult 96.7
man 96
military 94.5
child 92.3
uniform 89.5
soldier 89.1
war 86.7
administration 86.4
crowd 85.3
woman 84.1
recreation 83.1
wear 82.1
outfit 81.8
several 80.6
vehicle 78.4
leader 77.1

Imagga
created on 2023-10-07

pedestrian 46.1
man 29.5
sport 25
outdoors 23.2
male 22.7
people 21.7
person 21.6
planner 20.6
recreation 17
uniform 16
outdoor 15.3
brass 15.1
wind instrument 14.3
travel 13.4
active 12.9
military 12.5
mountain 12.4
men 12
vacation 11.4
horse 11.4
clothing 11
day 11
military uniform 11
bicycle 11
helmet 10.8
transportation 10.7
backpack 10.7
boy 10.4
walking 10.4
musical instrument 10.4
adult 10.4
summer 10.3
sky 10.2
lifestyle 10.1
leisure 10
bike 9.8
old 9.7
two 9.3
park 9
fun 9
activity 9
cycling 8.8
soldier 8.8
building 8.7
couple 8.7
smiling 8.7
carriage 8.6
outside 8.5
hill 8.4
city 8.3
street 8.3
countryside 8.2
wheelchair 8.2
protection 8.2
danger 8.2
road 8.1
sax 8
trombone 8
tourist 7.9
animal 7.8
ride 7.8
extreme 7.7
adventure 7.6
private 7.3
exercise 7.3
grass 7.1
family 7.1
architecture 7
together 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 99.5
person 97.4
people 85.3
group 84.7
white 62.6
old 61.7
posing 40.3
crowd 1.1

Color Analysis

Face analysis

AWS Rekognition

Age 20-28
Gender Female, 50.5%
Calm 97.7%
Surprised 6.5%
Fear 6.1%
Sad 2.2%
Confused 0.8%
Disgusted 0.2%
Angry 0.1%
Happy 0.1%

AWS Rekognition

Age 18-24
Gender Male, 78%
Calm 99.1%
Surprised 6.4%
Fear 5.9%
Sad 2.2%
Confused 0.3%
Disgusted 0.1%
Angry 0.1%
Happy 0%

AWS Rekognition

Age 23-31
Gender Male, 83.6%
Calm 99.9%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0%
Happy 0%
Disgusted 0%
Angry 0%

AWS Rekognition

Age 6-12
Gender Female, 92.8%
Fear 85.4%
Calm 14.1%
Sad 7.6%
Surprised 6.4%
Angry 5.1%
Happy 2.4%
Disgusted 1%
Confused 0.4%

AWS Rekognition

Age 13-21
Gender Female, 54.2%
Surprised 90.2%
Happy 21.2%
Fear 6.8%
Sad 6.1%
Calm 4.6%
Angry 3.3%
Disgusted 2.4%
Confused 1.4%

AWS Rekognition

Age 9-17
Gender Male, 97%
Calm 64.9%
Fear 10.9%
Angry 10.5%
Surprised 7%
Sad 4.5%
Happy 3.4%
Disgusted 2.3%
Confused 2%

AWS Rekognition

Age 50-58
Gender Male, 55.1%
Calm 97.6%
Surprised 6.3%
Fear 5.9%
Sad 2.3%
Happy 1.5%
Confused 0.1%
Disgusted 0.1%
Angry 0.1%

AWS Rekognition

Age 12-20
Gender Female, 88.1%
Calm 56.7%
Sad 19.8%
Happy 12.8%
Fear 8.7%
Surprised 7.1%
Angry 2.5%
Disgusted 2.3%
Confused 0.9%

Microsoft Cognitive Services

Age 32
Gender Male

Microsoft Cognitive Services

Age 31
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.9%
Adult 85.9%
Male 85.9%
Man 85.9%
Wheel 74.8%
Bicycle 71.3%

Text analysis

Amazon

DJENEAN