Human Generated Data

Title

Untitled (Kyoto, Japan)

Date

March 14, 1960 - April 22, 1960

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5530

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2023-10-05

City 100
Road 100
Street 100
Urban 100
Clothing 99.9
Alley 99.9
Path 99.3
Person 99
Neighborhood 98.7
Person 97.6
Architecture 94.7
Building 94.7
Bicycle 91.4
Transportation 91.4
Vehicle 91.4
Bicycle 91.3
Machine 91
Wheel 91
Person 86.4
Walking 83
Wheel 78.5
Overcoat 74
Coat 73.2
Accessories 70.6
Bag 70.6
Handbag 70.6
Face 64.1
Head 64.1
Sidewalk 57.7
Outdoors 56.7
Pedestrian 55.8
Walkway 55.1
Shelter 55.1
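
The label/confidence pairs above match the shape of AWS Rekognition's DetectLabels response. As a minimal sketch only, here is how tags like these could be regenerated with boto3 for a local copy of the photograph; the filename is a placeholder, not part of the record:

import boto3

# Placeholder path to a local copy of the image (not part of the record).
with open("kyoto_1960.jpg", "rb") as f:
    image_bytes = f.read()

rekognition = boto3.client("rekognition")

# DetectLabels returns label names with confidence percentages,
# which is the form the tag list above takes.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # the lowest score in the list above is about 55
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')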

Clarifai
created on 2018-05-10

people 99.8
group 98.9
group together 96.8
street 96.3
adult 96.2
home 95.5
vehicle 95.2
two 94.8
transportation system 93.9
many 93.4
man 91.7
war 89.3
one 89.2
four 87.4
train 87.1
three 83.4
several 83
wear 82.5
room 81.8
military 81.3

Imagga
created on 2023-10-05

factory 44
building 31.6
old 30.6
architecture 28.9
plant 25.9
house 25.9
structure 23.4
street 21.2
pay-phone 20.3
city 19.9
construction 19.7
industry 17.9
industrial 17.2
gas pump 17.2
travel 16.2
sky 15.9
wall 15.7
telephone 15.7
building complex 15.6
equipment 15.3
urban 14.9
pump 14.6
abandoned 13.7
steel 13.3
brick 13.2
town 13
vehicle 12.9
transportation 12.6
broken 12.5
wheeled vehicle 12.4
tourism 12.4
electronic equipment 12.3
metal 12.1
stone 11.8
train 11.5
environment 11.5
machine 11.1
door 11.1
power 10.9
dirty 10.8
wood 10.8
road 10.8
mechanical device 10.5
home 10.4
energy 10.1
transport 10
track 10
history 9.8
destruction 9.8
wooden 9.7
device 9.5
ancient 9.5
generator 9.5
exterior 9.2
vintage 9.1
forklift 9
black 9
landscape 8.9
rail 8.8
railway 8.8
ruin 8.8
waste 8.8
station 8.7
windows 8.6
roof 8.6
buildings 8.5
window 8.2
mechanism 8.1
sea 7.8
machinery 7.8
scene 7.8
car 7.8
empty 7.7
container 7.7
line 7.7
fuel 7.7
boat 7.7
pollution 7.7
outdoors 7.5
light 7.4
water 7.3
tourist 7.2

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 95.1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-31
Gender Male, 99.7%
Sad 100%
Surprised 6.8%
Fear 6.8%
Confused 0.7%
Disgusted 0.5%
Calm 0.5%
Angry 0.2%
Happy 0.1%

AWS Rekognition

Age 18-26
Gender Female, 93%
Fear 92.3%
Calm 14.8%
Surprised 6.6%
Sad 3.1%
Confused 1.3%
Happy 1.3%
Disgusted 0.5%
Angry 0.4%

AWS Rekognition

Age 16-24
Gender Male, 85.7%
Sad 53.1%
Calm 36%
Confused 18.1%
Surprised 9.5%
Fear 7.8%
Angry 5.3%
Disgusted 1.9%
Happy 1.4%
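
The age range, gender, and emotion scores in the three blocks above follow the structure of Rekognition's DetectFaces response when full attributes are requested. A hedged sketch, again using a placeholder image path:

import boto3

# Placeholder path; any local copy of the image would do.
with open("kyoto_1960.jpg", "rb") as f:
    image_bytes = f.read()

rekognition = boto3.client("rekognition")

# Attributes=["ALL"] adds AgeRange, Gender, and Emotions to each face,
# matching the fields listed in the face-analysis blocks above.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')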

Feature analysis

Amazon

Person 99%
Building 94.7%
Bicycle 91.4%
Wheel 91%
Coat 73.2%
Handbag 70.6%

Categories

Text analysis

Amazon

***
las
so
POSTA
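
The short strings above are consistent with line-level detections from Rekognition's DetectText call. A minimal sketch, with the same placeholder image path as in the earlier examples:

import boto3

with open("kyoto_1960.jpg", "rb") as f:
    image_bytes = f.read()

rekognition = boto3.client("rekognition")

# DetectText returns LINE and WORD detections; printing only the lines
# yields short strings like those listed above.
response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])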

Google

-水た4%
-
4
%