Human Generated Data

Title

Untitled (Marked Tree, Arkansas?)

Date

October 1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1207

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Adult 99.5
Male 99.5
Man 99.5
Person 99.5
Adult 96.4
Male 96.4
Man 96.4
Person 96.4
Person 95
Awning 92.9
Canopy 92.9
Person 89
Person 87.9
Architecture 84
Building 84
Face 81
Head 81
City 75.7
Shop 75.3
Urban 68.3
Indoors 61.1
Restaurant 61.1
Cafeteria 56
Road 55.7
Street 55.7
Newsstand 55.3

Clarifai
created on 2018-05-11

people 99.9
adult 99.3
two 99
one 98.7
man 98.4
woman 96.5
four 96.2
monochrome 95
group together 94.7
group 94.6
three 93.7
street 92.7
wear 91.5
child 89.1
home 87.4
several 85.6
administration 85.5
vehicle 83.7
five 79.5
music 76.1

Imagga
created on 2023-10-06

building 25.6
architecture 23.5
door 20.1
old 18.8
window 17.7
wall 17.6
structure 16.3
city 15.8
shop 14.4
house 14.2
barbershop 13.9
travel 12.7
ancient 12.1
device 12
glass 11.7
tourism 11.5
mercantile establishment 11.5
urban 11.4
support 11
history 10.7
construction 10.3
stone 10.1
bridge 10.1
landmark 9.9
vintage 9.9
sky 9.6
buildings 9.4
light 9.4
jamb 9.3
historic 9.2
black 9.1
outside 8.6
wood 8.3
street 8.3
park 8.2
outdoors 8.2
office 8.2
brick 8.2
home 8
upright 7.7
windows 7.7
grunge 7.7
place of business 7.7
roof 7.6
man 7.4
exterior 7.4
gate 7.4
garage 7.3
industrial 7.3
aged 7.2
dirty 7.2
structural member 7.1
adult 7.1
night 7.1
male 7.1

Microsoft
created on 2018-05-11

building 99.7
outdoor 99.6
white 94.5
black 89.9
person 88.7
old 69.5

Face analysis

AWS Rekognition

Age 40-48
Gender Male, 99.9%
Calm 93.3%
Surprised 6.5%
Fear 6%
Sad 3.1%
Confused 2.2%
Angry 0.5%
Happy 0.3%
Disgusted 0.2%

AWS Rekognition

Age 36-44
Gender Male, 100%
Sad 99.9%
Calm 18.8%
Surprised 6.5%
Fear 6%
Confused 3.5%
Disgusted 0.5%
Angry 0.2%
Happy 0.2%

AWS Rekognition

Age 23-33
Gender Male, 99.9%
Calm 90%
Surprised 10.3%
Fear 5.9%
Sad 2.8%
Confused 0.6%
Happy 0.4%
Disgusted 0.4%
Angry 0.3%

Microsoft Cognitive Services

Age 40
Gender Male

Feature analysis

Amazon

Adult 99.5%
Male 99.5%
Man 99.5%
Person 99.5%
Building 84%

Text analysis

Amazon

STAIRS
OFFICE
OFFICE UP STAIRS
LAWYER
UP
C.T.CARPENTER

Google

C.T. CARPENTER LAWYER OFFICE UP STAIRS
LAWYER
OFFICE
UP
STAIRS
C.T.
CARPENTER