Human Generated Data

Title

Untitled (Yogyakarta, Java)

Date

January 26 – February 2, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4539.5

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Person 98.9
Person 98.5
Bicycle 98.1
Transportation 98.1
Vehicle 98.1
Person 97.1
Carriage 97
Person 96.9
Person 96.4
Person 95.6
Person 95.6
Person 94.9
Machine 94.7
Wheel 94.7
Person 94.2
Person 94
Bicycle 93
Person 92.3
Person 91.8
Person 91.8
Bicycle 90.8
Person 90.1
Wagon 88
Animal 87.7
Horse 87.7
Mammal 87.7
Person 87.4
Person 86.9
Person 86.2
Wheel 85.7
Person 85.2
Bicycle 85.2
Wheel 84.9
Person 83.9
Wheel 83.7
Wheel 83.4
Bicycle 78.6
Wheel 74.9
Head 70.7
Bicycle 70.4
Wheel 69.2
Person 69.2
Horse Cart 66.5
Car 66.1
Person 65.9
Light 65.1
Traffic Light 65.1
Wheel 63.3
Bicycle 59.3
Road 57
Spoke 56.9
Tricycle 55.7

Clarifai
created on 2018-05-10

people 99.9
group together 99.3
adult 98.5
monochrome 98.4
street 98.2
many 97.9
transportation system 97.6
group 97.1
vehicle 96
man 95.8
child 90.6
woman 90.3
crowd 89.9
road 89.8
several 85
cavalry 83.9
wear 81.5
motion 77.9
recreation 77.8
administration 77.7

Imagga
created on 2023-10-06

shopping cart 100
wheeled vehicle 88.2
handcart 84.7
container 40.3
conveyance 36.9
tricycle 30
vehicle 22.4
city 13.3
urban 13.1
chair 13.1
black 12
transportation 11.6
sky 11.5
water 11.3
people 11.1
old 11.1
building 11.1
man 10.1
transport 10
snow 9.7
business 9.7
technology 9.6
scene 9.5
men 9.4
sea 9.4
architecture 9.4
winter 9.4
power 9.2
work 9.2
travel 9.1
industrial 9.1
seat 9
metal 8
construction 7.7
industry 7.7
grunge 7.7
structure 7.6
dark 7.5
street 7.4
wheelchair 7.2
person 7.2
beach 7.1
day 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 85.3
old 57.4
drawn 40.8
several 16.1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-33
Gender Male, 99.2%
Happy 70.5%
Calm 22%
Surprised 7%
Fear 6.5%
Sad 3.1%
Angry 0.6%
Disgusted 0.6%
Confused 0.5%

AWS Rekognition

Age 25-35
Gender Male, 99.5%
Calm 96%
Surprised 6.3%
Fear 6.1%
Sad 2.5%
Happy 1.1%
Angry 0.6%
Disgusted 0.2%
Confused 0.1%

AWS Rekognition

Age 45-51
Gender Female, 71.5%
Calm 99%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Disgusted 0.3%
Angry 0.1%
Happy 0.1%
Confused 0%

AWS Rekognition

Age 21-29
Gender Male, 88.3%
Calm 88%
Fear 6.4%
Surprised 6.3%
Sad 4.8%
Happy 3.2%
Disgusted 0.6%
Angry 0.6%
Confused 0.3%

AWS Rekognition

Age 22-30
Gender Male, 81.5%
Calm 93.2%
Surprised 6.7%
Fear 6%
Sad 2.8%
Angry 1.7%
Happy 1%
Disgusted 0.5%
Confused 0.3%

AWS Rekognition

Age 24-34
Gender Female, 88.6%
Happy 93.6%
Surprised 6.6%
Fear 6.1%
Calm 2.5%
Sad 2.3%
Disgusted 1.1%
Angry 0.7%
Confused 0.2%

AWS Rekognition

Age 16-24
Gender Male, 99.8%
Happy 28.6%
Angry 27.7%
Calm 26.7%
Fear 8.3%
Surprised 8%
Sad 3.8%
Disgusted 2.1%
Confused 2%

Feature analysis

Amazon

Person 98.9%
Bicycle 98.1%
Wheel 94.7%
Horse 87.7%
Car 66.1%

Text analysis

Amazon

College
and
Art
Fellows
(Harvard
Museums)
of
Harvard
University
President
© President and Fellows of Harvard College (Harvard University Art Museums)
P1970.4539.0005
©
BA

Google

C President and Fellows of Harvard College (Harvard University Art Museums) P1970.4539.0005
C
President
and
Fellows
of
Harvard
College
(Harvard
University
Art
Museums)
P1970.4539.0005