Human Generated Data

Title

Untitled (Greenbelt, Maryland)

Date

November 1936

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1913

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Carpenter 100
Person 99.4
Adult 99.4
Male 99.4
Man 99.4
Clothing 95.8
Hardhat 95.8
Helmet 95.8
Wood 94.7
Face 86.2
Head 86.2
Worker 78.7
Person 64.1
Device 57.7
Hat 57.4
Construction 57.1

Clarifai
created on 2018-05-11

people 99.9
adult 98.9
one 97.1
man 95.7
two 95.6
group together 93.9
woman 93.5
war 92.2
group 91.5
wear 91.3
raw material 89.9
military 86.3
three 84
waste 82.9
administration 81.9
several 79.3
vehicle 78.5
four 78.3
transportation system 75.8
furniture 74.5

Imagga
created on 2023-10-06

factory 57.3
plant 33.8
tool 28.2
construction 26.5
carpenter 26.3
building complex 25.9
working 23
work 22.7
machine 22
man 21.5
industry 21.3
worker 21.3
builder 17.7
building 17.4
industrial 17.2
equipment 16.6
shovel 16.3
structure 16.2
people 16.2
job 15.9
steel 15.9
wood 15
power saw 14.9
labor 14.6
old 13.9
men 13.7
power 13.4
outdoor 13
vehicle 12.3
metal 12.1
male 12
occupation 11.9
heavy 11.4
barrow 11.4
power tool 11.4
water 11.3
site 11.3
outdoors 11.2
chain saw 11.1
safety 11
manual 10.7
person 10.3
iron 10.3
action 10.2
helmet 9.6
dirt 9.5
heat 9.2
house 9.2
hand tool 9.1
handcart 9.1
adult 9
sexy 8.8
saw 8.7
plow 8.6
tools 8.5
build 8.5
device 8.2
rock 7.8
black 7.8
wheeled vehicle 7.8
skill 7.7
attractive 7.7
orange 7.7
repair 7.7
jeans 7.6
business 7.3
sawmill 7.3
home 7.2
activity 7.2
grass 7.1
to 7.1
summer 7.1
wooden 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 87.4

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 100%
Sad 100%
Calm 10.1%
Fear 6.6%
Surprised 6.4%
Confused 2.8%
Disgusted 0.7%
Happy 0.5%
Angry 0.3%

Microsoft Cognitive Services

Age 31
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Adult 99.4%
Male 99.4%
Man 99.4%

Categories

Imagga

paintings art 96.9%