Human Generated Data

Title

Untitled (Indian temple, Singapore)

Date

February 17–20, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5373

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Person 97.8
Architecture 97.4
Building 97.4
Person 96.7
Crypt 93.2
Arch 85.1
Person 81.5
City 71.8
Person 69
Face 62.4
Head 62.4
Floor 57.5
Urban 57.5
Road 57.2
Street 57.2
Flooring 57
Corridor 56.5
Indoors 56.5
Monastery 56.5
Alloy Wheel 56.3
Car 56.3
Car Wheel 56.3
Machine 56.3
Spoke 56.3
Tire 56.3
Transportation 56.3
Vehicle 56.3
Wheel 56.3
Person 55.7
Art 55.4
Painting 55.4
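
The tags above have the shape of Amazon Rekognition label-detection output (a label name plus a 0–100 confidence score). As a minimal sketch only, not the actual pipeline behind this record, the following boto3 call would produce a list in this format; the file name "photo.jpg" and the MinConfidence floor are assumptions:

    # Sketch: generating name/confidence tags like those above with Amazon Rekognition.
    # Assumes AWS credentials are configured; "photo.jpg" is a placeholder file name.
    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as image_file:
        response = client.detect_labels(
            Image={"Bytes": image_file.read()},
            MinConfidence=55,  # assumed floor; the lowest tag listed above is 55.4
        )

    # Print "Name Confidence" pairs, mirroring the tag list format above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")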

Clarifai
created on 2018-05-10

no person 96.8
people 96.7
room 95.7
indoors 94.9
door 94.6
doorway 93.2
furniture 93.2
adult 87.2
house 86.3
window 85.4
home 85.1
architecture 83.8
family 83.4
man 83
two 82.3
one 80.6
wear 80
monochrome 78
street 76.2
wall 75.3

Imagga
created on 2023-10-05

forklift 84.7
wheeled vehicle 53.9
vehicle 45.6
stall 26.1
building 25.7
architecture 21.9
old 21.6
conveyance 17.7
house 16.8
city 16.6
industry 15.4
town 14.8
street 14.7
scale 13.7
construction 13.7
ancient 13
travel 12.7
transportation 12.6
window 11.9
transport 11.9
truck 11.7
history 11.6
urban 11.4
measuring instrument 11.2
instrument 10.9
industrial 10.9
historical 10.3
equipment 10.3
machine 10.3
vintage 9.9
door 9.6
work 9.4
wall 9.4
tourism 9.1
religion 9
home 8.8
entrance 8.7
antique 8.7
heavy 8.6
monument 8.4
wood 8.3
traditional 8.3
exterior 8.3
seller 8.1
steel 8
loader 7.9
machinery 7.8
scene 7.8
cargo 7.8
stone 7.7
lamp 7.6
landscape 7.4
style 7.4
historic 7.3
metal 7.2
dirty 7.2
structure 7.1

Microsoft
created on 2018-05-10

furniture 30.8

Face analysis

Amazon

AWS Rekognition

Age 21-29
Gender Female, 99.9%
Disgusted 96.2%
Fear 6.5%
Surprised 6.4%
Sad 2.3%
Confused 0.5%
Angry 0.4%
Calm 0.4%
Happy 0.2%

AWS Rekognition

Age 16-24
Gender Female, 99.9%
Calm 34.4%
Angry 27.1%
Happy 12.9%
Fear 11.4%
Surprised 6.7%
Sad 6.1%
Disgusted 5.4%
Confused 1%

AWS Rekognition

Age 18-24
Gender Female, 99.5%
Calm 62.4%
Surprised 13.8%
Fear 10%
Angry 6.4%
Sad 4.9%
Happy 2.5%
Confused 2.4%
Disgusted 1.7%
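
The blocks above match the shape of AWS Rekognition face-detection results: one block per detected face, each with an estimated age range, a gender estimate, and per-emotion confidence scores. A minimal sketch of retrieving such details with boto3, again with a placeholder file name:

    # Sketch: retrieving face details like the blocks above via AWS Rekognition.
    # "photo.jpg" is a placeholder; the actual pipeline is not documented here.
    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as image_file:
        response = client.detect_faces(
            Image={"Bytes": image_file.read()},
            Attributes=["ALL"],  # request age, gender, and emotion estimates
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        # Emotions arrive unsorted; sort descending to match the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")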

Feature analysis

Amazon

Person 97.8%
Building 97.4%

Text analysis

Amazon

BOX
CHARITY
2_
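
The detected words above are consistent with Amazon Rekognition's text-detection output. A minimal sketch, assuming the same placeholder file name as before:

    # Sketch: extracting word-level text detections with AWS Rekognition.
    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as image_file:
        response = client.detect_text(Image={"Bytes": image_file.read()})

    # Keep word-level detections, mirroring the list above.
    for detection in response["TextDetections"]:
        if detection["Type"] == "WORD":
            print(detection["DetectedText"])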

Google

CHARITY 0
CHARITY
0