Human Generated Data

Title

Untitled (Barong Dance, Bali)

Date

February 2, 1960 - February 17, 1960

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5203

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Person 97.6
Person 96.6
Person 96.1
Person 95.4
Person 91.4
Person 88.6
Car 86.1
Transportation 86.1
Vehicle 86.1
Person 86
Animal 79.9
Horse 79.9
Mammal 79.9
Person 76.4
Outdoors 75.3
Machine 63.5
Wheel 63.5
Person 63.2
Person 58.5
Nature 57.1
Zoo 56.4
Bull 55.6
Camel 55.5
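
The label list above has the shape of output from Amazon Rekognition's DetectLabels API: a label name plus a 0-100 confidence score. A minimal sketch of how comparable tags could be generated with boto3; the file name and MinConfidence threshold are assumptions, not the museum's documented pipeline:

    import boto3

    # Hypothetical local copy of the photograph; Rekognition also accepts S3 objects.
    client = boto3.client("rekognition")
    with open("shahn_barong_dance.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # assumed cutoff; the lowest tag above scores 55.5
        )

    # Print each label as "Name confidence", matching the listing above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")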

Clarifai
created on 2018-05-10

people 100
group 99.3
adult 99.1
group together 97.9
vehicle 97.2
print 96.8
transportation system 96.8
man 95.9
child 95.9
campsite 94.2
three 93.8
home 92.5
two 91.9
veil 91.7
many 91.7
tent 91.1
wear 90.7
wagon 88.3
several 88.1
offspring 88.1
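
Clarifai concepts like those above are served by its public v2 "outputs" endpoint, which scores each concept from 0 to 1. A hedged sketch using the requests library; the API key, image URL, and model name are placeholders, and the 2018 run shown here may have used an older client:

    import requests

    # Placeholder credentials, model name, and image URL.
    resp = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers={"Authorization": "Key YOUR_API_KEY"},
        json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
        timeout=30,
    )
    # Scale the 0-1 concept values to percentages to match the listing above.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(concept["name"], round(concept["value"] * 100, 1))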

Imagga
created on 2023-10-06

resort area 21.5
area 21.4
swing 19.5
travel 18.3
structure 18.3
tree 18
sky 17.2
landscape 17.1
old 16.7
trees 16
snow 15.9
vehicle 15.2
house 15
region 14.8
building 14.1
winter 13.6
road 13.5
mechanical device 13
seller 13
park 12.7
plaything 12.6
history 12.5
wood 12.5
architecture 12.5
wheeled vehicle 11.8
city 11.6
vacation 11.5
tourist 11.2
construction 11.1
outdoors 10.8
tourism 10.7
outdoor 10.7
scene 10.4
street 10.1
historic 10.1
danger 10
horse cart 9.7
mechanism 9.6
holiday 9.3
transportation 9
sand 9
new 8.9
destruction 8.8
forest 8.7
ancient 8.6
cold 8.6
stall 8.5
location 8.5
power 8.4
cart 8.3
religion 8.1
night 8
hut 7.9
day 7.8
sea 7.8
accident 7.8
world 7.8
industry 7.7
beach 7.6
wheel 7.5
child 7.5
environment 7.4
man 7.4
window 7.3
transport 7.3
industrial 7.3
national 7.2
mobile home 7.2
summer 7.1
rural 7
wooden 7
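
Imagga returns the same kind of scored tags from its v2 /tags endpoint over HTTP basic auth. A minimal sketch; the key, secret, and image URL are placeholders:

    import requests

    # Placeholder credentials; Imagga reports confidence on a 0-100 scale.
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photo.jpg"},
        auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
        timeout=30,
    )
    for item in resp.json()["result"]["tags"]:
        print(item["tag"]["en"], item["confidence"])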

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 99.7
tree 99.3
horse 99.3
drawn 98.1
carriage 96.5
pulling 91.8
cart 88.4
old 77.2
pulled 58
horse-drawn vehicle 52.1
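
The Microsoft tags correspond to the Azure Computer Vision "tag" operation, which reports confidence as a 0-1 float. A sketch against the REST endpoint; the resource name, key, file name, and API version are assumptions:

    import requests

    # Hypothetical Azure resource endpoint and subscription key.
    endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com/vision/v3.2/tag"
    with open("shahn_barong_dance.jpg", "rb") as f:
        resp = requests.post(
            endpoint,
            headers={
                "Ocp-Apim-Subscription-Key": "YOUR_KEY",
                "Content-Type": "application/octet-stream",
            },
            data=f.read(),
            timeout=30,
        )
    # Scale to percentages to match the listing above.
    for tag in resp.json()["tags"]:
        print(tag["name"], round(tag["confidence"] * 100, 1))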

Face analysis

Amazon

AWS Rekognition

Age 11-19
Gender Female, 55.5%
Sad 95.2%
Calm 35.4%
Surprised 7.4%
Angry 6.6%
Fear 6.2%
Confused 4.1%
Disgusted 1.8%
Happy 1.3%

AWS Rekognition

Age 19-27
Gender Male, 98.8%
Calm 64%
Surprised 9.5%
Confused 8%
Fear 6.3%
Happy 6.1%
Sad 5.6%
Angry 5.4%
Disgusted 3.2%

AWS Rekognition

Age 10-18
Gender Male, 99.6%
Fear 57.6%
Angry 34.3%
Calm 10.8%
Surprised 6.5%
Sad 4.4%
Confused 1.3%
Disgusted 1.1%
Happy 0.9%

AWS Rekognition

Age 14-22
Gender Female, 97%
Sad 80.3%
Calm 48.1%
Surprised 6.7%
Angry 6.4%
Fear 6.2%
Disgusted 3.6%
Confused 2.5%
Happy 1.1%

AWS Rekognition

Age 25-35
Gender Male, 90.2%
Calm 79.4%
Surprised 7.1%
Fear 6.3%
Happy 6%
Sad 3.9%
Angry 3.7%
Disgusted 2.2%
Confused 1.9%
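
Each face block above (age range, gender, emotions ranked by confidence) matches the shape of Rekognition's DetectFaces response when all facial attributes are requested. A minimal sketch, with the file name assumed:

    import boto3

    client = boto3.client("rekognition")
    with open("shahn_barong_dance.jpg", "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # required for age, gender, and emotion estimates
        )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions arrive unsorted; rank by confidence as in the listing above.
        for emo in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f"{emo['Type'].capitalize()} {emo['Confidence']:.1f}%")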

Feature analysis

Amazon

Person 97.6%
Car 86.1%
Horse 79.9%
Wheel 63.5%
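
These feature scores repeat the top per-instance confidences from the DetectLabels output: objects such as Person, Car, Horse, and Wheel come back with bounding-box "instances", each carrying its own score. A sketch of extracting them, using the same hypothetical file as above:

    import boto3

    client = boto3.client("rekognition")
    with open("shahn_barong_dance.jpg", "rb") as f:
        response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=55)

    # Only some labels are localized; those carry an Instances list.
    for label in response["Labels"]:
        for instance in label.get("Instances", []):
            box = instance["BoundingBox"]
            print(f"{label['Name']} {instance['Confidence']:.1f}% "
                  f"(left={box['Left']:.2f}, top={box['Top']:.2f})")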