Human Generated Data

Title

Untitled (Yogyakarta, Java)

Date

January 26, 1960-February 2, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4539.3

Machine Generated Data

Tags (label, confidence)

Amazon
created on 2023-10-06

Transportation 98.6
Vehicle 98.6
Wagon 98.6
Person 98.3
Machine 98.2
Wheel 98.2
Person 97.9
Wheel 97.8
Wheel 97.8
Person 97.8
Horse Cart 97
Person 96.3
Person 95.9
Person 95.8
Bicycle 95.7
Person 95.4
Person 94.8
Spoke 94.3
Animal 94.2
Horse 94.2
Mammal 94.2
Person 94
Adult 94
Male 94
Man 94
Carriage 93
Person 91.7
Person 90.6
Person 90.6
Wheel 87.7
Person 87.3
Person 87.2
Wheel 86.2
Wheel 82.1
Wheel 79.8
Person 79.1
Light 78.8
Traffic Light 78.8
Bicycle 78.3
Bicycle 74.4
Wheel 74.2
Person 73
Bicycle 71.1
Person 69.6
Face 67.4
Head 67.4
Bicycle 66.5
Person 57.9
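
The list above is shaped like the output of AWS Rekognition's DetectLabels operation. A minimal sketch of how such tags can be produced with boto3, assuming configured AWS credentials; the region and filename are illustrative placeholders, not details from the museum's pipeline:

```python
import boto3

# Assumes AWS credentials are already configured (env vars, profile, etc.).
# Region and filename are placeholders.
client = boto3.client("rekognition", region_name="us-east-1")

with open("yogyakarta_java.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,        # the record above lists roughly 50 labels
    MinConfidence=50.0,  # Rekognition confidences are percentages (0-100)
)

# Print "Label confidence" pairs in the same layout as the record above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```

The repeated Person and Wheel entries above most likely correspond to individual detected instances, which DetectLabels reports with their own confidences under each label's Instances array.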

Clarifai
created on 2018-05-10

people 99.8
group together 98.7
transportation system 98.3
cavalry 97.7
carriage 97.4
vehicle 96.9
adult 94.9
group 94.4
man 92.4
driver 91.4
many 90.5
street 86.8
competition 85.1
cart 84.9
woman 83.6
road 83.6
crowd 83.2
seated 78.1
monochrome 76.5
race 76.1
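
Clarifai's tags have the same label-plus-confidence shape, with probabilities reported on a 0-1 scale (shown ×100 above). A hedged sketch against Clarifai's v2 REST API; the API key, image URL, and model ID are placeholders, and the 2018 tags above would have come from an earlier model version:

```python
import requests

# Placeholders: Clarifai API key, public general model ID, and image URL.
CLARIFAI_API_KEY = "YOUR_API_KEY"
URL = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"

payload = {
    "inputs": [
        {"data": {"image": {"url": "https://example.com/yogyakarta_java.jpg"}}}
    ]
}
headers = {"Authorization": f"Key {CLARIFAI_API_KEY}"}

resp = requests.post(URL, json=payload, headers=headers, timeout=30)
resp.raise_for_status()

# Concepts carry 0-1 probabilities; multiply by 100 to match the record above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```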

Imagga
created on 2023-10-06

carriage 100
horse cart 39.9
cart 38.6
horse 35.1
wagon 27.6
transportation 23.3
old 18.8
transport 17.3
animal 17.3
travel 16.2
horses 15.6
wheeled vehicle 15.3
city 15
ride 14.9
vehicle 14.7
urban 12.2
man 12.1
street 12
wheel 11.3
people 10.6
historic 10.1
speed 10.1
history 9.8
bicycle 9.8
riding 9.7
antique 9.5
historical 9.4
road 9
wheelchair 8.9
bike 8.8
sky 8.3
tourism 8.2
tourist 8.2
rural 7.9
architecture 7.8
black 7.8
driver 7.8
culture 7.7
chair 7.6
traditional 7.5
vintage 7.4
park 7.4
harness 7.2
celebration 7.2
male 7.1
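
Imagga's tags can be reproduced with a single authenticated GET against its v2 tagging endpoint. A minimal sketch; the key/secret pair and image URL are placeholders:

```python
import requests

# Placeholders: Imagga API key/secret (HTTP basic auth) and image URL.
IMAGGA_KEY, IMAGGA_SECRET = "YOUR_KEY", "YOUR_SECRET"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/yogyakarta_java.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
resp.raise_for_status()

# Each tag carries a 0-100 confidence, as in the list above.
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```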

Microsoft
created on 2018-05-10

drawn 99.7
carriage 99.7
horse 99.5
pulling 99.1
outdoor 98.5
road 97.2
transport 92.5
horse-drawn vehicle 91.5
cart 83.3
pulled 64.9
old 41.5
family 34.6
team 32.5
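
Microsoft's tags map onto the Azure Computer Vision "tag" operation. A sketch against the current REST surface (v3.2; the 2018 tags above predate this version), with endpoint, key, and image URL as placeholders:

```python
import requests

# Placeholders: Azure Computer Vision endpoint, subscription key, image URL.
AZURE_ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com"
AZURE_KEY = "YOUR_KEY"

resp = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
    json={"url": "https://example.com/yogyakarta_java.jpg"},
    timeout=30,
)
resp.raise_for_status()

# Tags carry 0-1 confidences; multiply by 100 to match the record above.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```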

Face analysis

Amazon

AWS Rekognition

Age 25-35
Gender Male, 99.1%
Happy 95.2%
Surprised 6.4%
Fear 5.9%
Calm 3.7%
Sad 2.2%
Angry 0.3%
Disgusted 0.1%
Confused 0.1%

AWS Rekognition

Age 21-29
Gender Male, 99.2%
Fear 67.3%
Sad 22.3%
Happy 19.5%
Surprised 6.9%
Calm 5.1%
Angry 2.8%
Disgusted 0.9%
Confused 0.5%

AWS Rekognition

Age 25-35
Gender Male, 100%
Calm 99.4%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Happy 0.3%
Confused 0.1%
Angry 0%
Disgusted 0%

AWS Rekognition

Age 18-26
Gender Male, 98.1%
Calm 99.6%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0.1%
Disgusted 0%
Confused 0%
Happy 0%

AWS Rekognition

Age 26-36
Gender Male, 78.4%
Happy 78.8%
Fear 7%
Angry 6.9%
Surprised 6.6%
Calm 4.4%
Disgusted 4.3%
Sad 2.5%
Confused 0.9%
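
Each face block above matches the shape of Rekognition's DetectFaces output when all attributes are requested: an estimated age range, a gender guess with confidence, and a confidence-ranked emotion list (the per-emotion scores are scored independently, which is why they do not sum to 100%). A minimal sketch, with region and filename again as placeholders:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("yogyakarta_java.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, emotions, etc. per face.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Sort emotions by confidence, highest first, to match the record above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```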

Feature analysis

Amazon

Person 98.3%
Wheel 98.2%
Bicycle 95.7%
Horse 94.2%
Adult 94%
Male 94%
Man 94%

Text analysis

Amazon

College
Art
and
(Harvard
Fellows
of
Museums)
Harvard
University
President
© President and Fellows of Harvard College (Harvard University Art Museums)
PA
©
P1970.4539.0003
BY
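
The Amazon entries above have the structure of Rekognition's DetectText output, which returns whole LINE detections alongside their constituent WORD detections; that is why the full copyright line appears together with its individual words. A sketch, with the usual placeholder region and filename:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("yogyakarta_java.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# DetectText mixes LINE and WORD detections, as in the record above.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])
```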

Google

O President and Fellows of Harvard College (Harvard University Art Museums) P1970.4539.0003
O
President
and
Fellows
of
Harvard
College
(Harvard
University
Art
Museums)
P1970.4539.0003
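
Google's output has the same full-text-plus-words structure (the leading "O" is evidently a misread of the "©" stamp). A minimal sketch with the google-cloud-vision client library, assuming application-default credentials; the filename is a placeholder:

```python
from google.cloud import vision

# Assumes Google application-default credentials are configured.
client = vision.ImageAnnotatorClient()

with open("yogyakarta_java.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected text; the remaining entries are
# the individual words, matching the layout of the record above.
for annotation in response.text_annotations:
    print(annotation.description)
```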