Human Generated Data

Title

Untitled (Yogyakarta, Java)

Date

January 26 – February 2, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5063

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Horse Cart 99.1
Transportation 99.1
Vehicle 99.1
Wagon 99.1
Adult 97.7
Female 97.7
Person 97.7
Woman 97.7
Machine 97.6
Wheel 97.6
Person 97.4
Person 95.9
Wheel 95.7
Wheel 95.7
Animal 95.5
Horse 95.5
Mammal 95.5
Person 95.4
Person 95.4
Person 95.2
Person 94.2
Person 94
Person 92.6
Person 92
Person 91.9
Wheel 91.8
Person 89.8
Wheel 87
Person 85.2
Bicycle 81.2
Person 78.3
Carriage 77.5
Bicycle 77
Person 76.8
Wheel 75.1
Wheel 72.6
Face 72.4
Head 72.4
Wheel 72.1
Person 71
Bicycle 68.8
Wheel 68.4
Person 67.6
Bicycle 64.3
Bicycle 60.5
Spoke 57.9
Wheel 56.2

Clarifai
created on 2018-05-10

people 99.9
transportation system 99.6
carriage 99.6
group together 99.5
vehicle 99.5
driver 98.5
cavalry 98.5
group 98.4
many 98.2
cart 98.1
wagon 97.6
street 96.8
man 96.6
adult 95.4
road 89.4
seated 87.5
woman 87.3
stagecoach 86.2
war 86
crowd 85.9

Imagga
created on 2023-10-07

horse cart 100
carriage 100
cart 100
wagon 81.6
wheeled vehicle 47.3
horse 38
transportation 32.3
vehicle 31.7
transport 25.6
old 25.1
wheel 20.7
animal 19.7
horses 18.5
city 16.6
street 16.6
travel 14.1
antique 13.9
bicycle 13.7
bike 13.7
riding 13.6
vintage 13.2
rural 13.2
ride 12.6
man 12.1
historic 11.9
harness 11.5
historical 11.3
road 10.8
history 10.7
farm 10.7
urban 10.5
outdoors 10.5
grass 10.3
tourist 10
tourism 9.9
driving 9.7
retro 9
wheels 8.8
outdoor 8.4
speed 8.2
sport 8.2
mammal 8.1
brown 8.1
people 7.8
color 7.8
scene 7.8
ancient 7.8
driver 7.8
sky 7.7
head 7.6
classic 7.4
aged 7.2
black 7.2
architecture 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

carriage 100
drawn 100
horse 100
pulling 99.9
outdoor 99.9
road 98.2
cart 97.5
pulled 96.5
transport 93
street 89.6
horse-drawn vehicle 85.8
driving 73.1
old 68.5
people 67.4
fashioned 54.7
family 15.9

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-36
Gender Male, 89.6%
Calm 93.6%
Surprised 6.7%
Fear 6.1%
Sad 2.6%
Confused 1.1%
Happy 1.1%
Disgusted 0.8%
Angry 0.4%

AWS Rekognition

Age 28-38
Gender Male, 95.2%
Disgusted 69.7%
Surprised 9.4%
Fear 7%
Happy 6.3%
Angry 5.9%
Sad 4.6%
Confused 2.8%
Calm 1.9%

AWS Rekognition

Age 19-27
Gender Male, 98.2%
Calm 99.2%
Surprised 6.4%
Fear 5.9%
Sad 2.2%
Angry 0.1%
Happy 0.1%
Disgusted 0.1%
Confused 0%

AWS Rekognition

Age 30-40
Gender Male, 99.4%
Calm 66.5%
Surprised 9%
Angry 8.8%
Disgusted 7.9%
Fear 6.2%
Confused 4.4%
Sad 3.6%
Happy 3.3%

AWS Rekognition

Age 19-27
Gender Male, 99.8%
Calm 54.5%
Sad 41.1%
Angry 8.7%
Surprised 6.8%
Happy 6.2%
Fear 6%
Confused 3.4%
Disgusted 1.7%

AWS Rekognition

Age 16-24
Gender Male, 98.7%
Sad 89.1%
Calm 45.9%
Surprised 8%
Fear 6.2%
Angry 4.2%
Happy 2.8%
Disgusted 0.9%
Confused 0.9%

AWS Rekognition

Age 22-30
Gender Female, 75.6%
Sad 44.5%
Calm 33.1%
Fear 12%
Surprised 11.9%
Disgusted 10.4%
Angry 5%
Happy 4.9%
Confused 1.5%

AWS Rekognition

Age 16-24
Gender Female, 53.7%
Calm 73%
Fear 18.4%
Surprised 6.5%
Sad 3.3%
Happy 2.1%
Angry 1.3%
Disgusted 0.5%
Confused 0.3%

Feature analysis

Amazon

Adult 97.7%
Female 97.7%
Person 97.7%
Woman 97.7%
Wheel 97.6%
Horse 95.5%
Bicycle 81.2%