Human Generated Data

Title

Untitled (Urbana, Ohio)

Date

August 1938, printed later

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Museum Acquisition, P1970.3362

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Neighborhood 100
City 99.2
Road 99.2
Street 99.2
Urban 99.2
Adult 98.9
Male 98.9
Man 98.9
Person 98.9
Person 98.8
Person 98.2
Child 98.2
Female 98.2
Girl 98.2
Path 92.6
Sidewalk 92.6
Person 92
Person 89.7
Person 86.7
Intersection 85.1
Furniture 82.4
Car 81.2
Transportation 81.2
Vehicle 81.2
Machine 80
Wheel 80
Clothing 76.6
Footwear 76.6
Shoe 76.6
Person 75
Chair 72.4
Face 69.7
Head 69.7
Light 64.8
Traffic Light 64.8
Architecture 63.9
Building 63.9
Motorcycle 62.6
Stroller 57.8
Wheelchair 57.4
Tarmac 57.2
Bicycle 56.6
Cycling 56.6
Sport 56.6
Person 56.2
Building 56.1
Car 55.2
Pedestrian 55.1
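
The label/confidence pairs above are the kind of output returned by Amazon Rekognition's label-detection call. A minimal sketch follows, assuming boto3 is installed, AWS credentials are configured, and a local copy of the photograph exists at the hypothetical path shown; it is not the pipeline used to generate this page.

```python
# Minimal sketch (assumed setup, hypothetical image path): producing
# label/confidence pairs like the Amazon list above with AWS Rekognition.
import boto3


def detect_labels(image_path: str, min_confidence: float = 55.0):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=min_confidence,  # drop labels scored below this threshold
    )
    # Each entry carries a label name and a confidence score, e.g. ("Street", 99.2)
    return [(label["Name"], round(label["Confidence"], 1))
            for label in response["Labels"]]


if __name__ == "__main__":
    for name, confidence in detect_labels("urbana_ohio_1938.jpg"):
        print(name, confidence)
```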

Clarifai
created on 2018-05-10

people 99.8
street 99.4
group together 99.4
man 98.4
group 97.8
vehicle 97.5
adult 96.8
transportation system 95.3
two 94.9
three 94.7
home 93.5
four 92.7
administration 92.2
woman 90.2
monochrome 90.1
several 89.4
road 86.5
many 86.4
war 85.5
police 83.4

Imagga
created on 2023-10-06

wheeled vehicle 43.5
pedestrian 40.3
city 29.1
vehicle 23.9
tricycle 22.3
shopping cart 21.7
street 20.2
building 19.9
travel 19.7
conveyance 19.3
handcart 18.6
urban 18.3
sky 17.2
road 17.2
architecture 16.5
outdoors 15.9
people 15.6
man 15.5
wheelchair 15.1
water 14.7
sea 13.3
outdoor 13
town 13
sport 12.7
house 11.7
tourism 11.5
old 11.1
chair 10.9
boat 10.5
cityscape 10.4
sunset 9.9
transportation 9.9
seat 9.8
container 9.5
beach 9.3
ocean 9.1
summer 9
person 9
tourist 9
tower 9
lifestyle 8.7
male 8.7
day 8.6
world 8.6
bridge 8.5
buildings 8.5
tree 8.5
church 8.3
square 8.1
landmark 8.1
coast 8.1
light 8
river 8
traffic 7.6
park 7.5
leisure 7.5
board 7.4
skateboard 7.3
sun 7.2
trees 7.1
to 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 99.6
road 99.1
street 79.3
white 75.7
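
Tag lists like the Microsoft entries above can be obtained from the Azure Computer Vision service. The sketch below uses the current Python SDK rather than whatever 2018-era endpoint produced these tags; the endpoint, key, and image path are placeholders.

```python
# Minimal sketch (placeholder endpoint, key, and image path): tag/confidence
# output comparable to the Microsoft list above via the Azure Computer Vision SDK.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials


def tag_image(image_path: str, endpoint: str, key: str):
    client = ComputerVisionClient(endpoint, CognitiveServicesCredentials(key))
    with open(image_path, "rb") as image_stream:
        result = client.tag_image_in_stream(image_stream)
    # Each tag has a name and a confidence in [0, 1]; scale to percent
    return [(tag.name, round(tag.confidence * 100, 1)) for tag in result.tags]
```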

Color Analysis

Face analysis

Amazon

Microsoft

AWS Rekognition

Age 34-42
Gender Male, 98.2%
Disgusted 75.1%
Angry 11.3%
Surprised 6.9%
Fear 6.2%
Sad 3.5%
Calm 3.1%
Happy 3.1%
Confused 1.7%

AWS Rekognition

Age 23-31
Gender Male, 99.5%
Calm 94.1%
Surprised 6.5%
Fear 5.9%
Confused 3.6%
Sad 2.4%
Angry 0.4%
Disgusted 0.4%
Happy 0.1%

AWS Rekognition

Age 4-10
Gender Male, 82.3%
Sad 98.1%
Calm 22%
Angry 19.4%
Surprised 6.4%
Fear 6%
Happy 2.3%
Confused 0.8%
Disgusted 0.7%

Microsoft Cognitive Services

Age 28
Gender Male
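
Age-range, gender, and emotion estimates like the AWS Rekognition face entries above come from Rekognition's face-detection call with full attributes requested. A minimal sketch under that assumption, with a placeholder image path:

```python
# Minimal sketch (placeholder image path): per-face attributes like the
# AWS Rekognition entries above, via detect_faces with Attributes=["ALL"].
import boto3


def analyze_faces(image_path: str):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )
    faces = []
    for face in response["FaceDetails"]:
        emotions = sorted(face["Emotions"],
                          key=lambda e: e["Confidence"], reverse=True)
        faces.append({
            "age_range": (face["AgeRange"]["Low"], face["AgeRange"]["High"]),
            "gender": (face["Gender"]["Value"], face["Gender"]["Confidence"]),
            # emotion scores are independent, so they need not sum to 100
            "emotions": [(e["Type"], round(e["Confidence"], 1)) for e in emotions],
        })
    return faces
```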

Feature analysis

Amazon

Adult 98.9%
Male 98.9%
Man 98.9%
Person 98.9%
Child 98.2%
Female 98.2%
Girl 98.2%
Car 81.2%
Wheel 80%
Shoe 76.6%
Building 63.9%

Categories

Text analysis

Amazon

55
DRUGS
54
29
WALL
68
PAPER
IMPERIAL
-
SPRING -
35
UNCH
CAN
GETTYSOURS
TAGY
THE
SPRING
USED
BABISO
27
WILSON
400
are
TEXA
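
Fragments like those above are word-level OCR detections; distant or partially occluded signage commonly yields partial strings such as "UNCH" or "TEXA". A minimal sketch of how such output can be produced with AWS Rekognition's text detection, assuming the same placeholder setup as the earlier examples:

```python
# Minimal sketch (placeholder image path): word-level OCR output like the
# Amazon text-analysis list above, via AWS Rekognition detect_text.
import boto3


def detect_text(image_path: str):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_text(Image={"Bytes": image_bytes})
    # Keep WORD-level detections; LINE entries group words together
    return [d["DetectedText"] for d in response["TextDetections"]
            if d["Type"] == "WORD"]
```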