Human Generated Data

Title

Untitled (U.S. Highway 40, central Ohio)

Date

July 1938-August 1938

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.900

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Worker 90
Face 81.9
Head 81.9
Clothing 64.6
Hat 64.6
Glove 62.6
Mining 57.3
Bucket 56.7
Water 56.3
Architecture 55.1
Fountain 55.1
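
The label-and-confidence pairs above have the shape of an image-recognition API response. As a rough, hedged sketch (the museum's actual request is not documented here, and the filename is hypothetical), a comparable tag list could be produced with AWS Rekognition's DetectLabels operation via boto3:

```python
# Hedged sketch: Amazon-style tags via AWS Rekognition DetectLabels.
# The filename is hypothetical; any local copy of the photograph would do.
import boto3

rekognition = boto3.client("rekognition")

with open("shahn_us_highway_40.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # cap the length of the tag list
    MinConfidence=55.0,  # the list above bottoms out around 55
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```

Confidence values returned this way are percentages (0-100), which is how the numbers in the tag lists should be read.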

Clarifai
created on 2018-05-11

people 100
adult 99.8
two 99.1
one 99
group 98.6
man 97.7
group together 97.1
three 96.7
wear 96.3
four 93.9
bucket 93.1
military 92.7
child 92.4
war 92
watercraft 91.6
woman 91
portrait 90.9
veil 89.4
vehicle 88.9
home 88.8

Imagga
created on 2023-10-06

barrow 31.9
vessel 26.3
handcart 25.9
shovel 25.5
container 24.6
tool 22.7
wheeled vehicle 20.1
bucket 19.2
man 17.5
hand tool 16.8
old 16
person 15.7
vehicle 14.1
pot 13.2
people 12.8
outdoors 12.7
building 11.9
architecture 11.7
adult 11.6
outdoor 11.5
male 11.3
lawn mower 11
cooking utensil 10.4
men 9.4
attractive 9.1
fashion 9
black 9
standing 8.7
sculpture 8.6
industrial 8.2
cleaner 8.1
wall 7.7
industry 7.7
device 7.6
statue 7.6
power 7.6
garden 7.5
city 7.5
monument 7.5
can 7.4
dirty 7.2
lifestyle 7.2
garden tool 7.2
conveyance 7.2
history 7.1
steel 7.1
summer 7.1
work 7.1
milk can 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 99.9
man 96.8
old 94.3
person 91.5
standing 83.4
white 60.1

Color Analysis

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 99.6%
Calm 71.2%
Confused 22.4%
Surprised 6.5%
Fear 6%
Sad 3.4%
Angry 1%
Disgusted 0.9%
Happy 0.7%
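
These age, gender, and emotion figures follow the field layout of Rekognition's DetectFaces operation. A minimal, hedged sketch under the same assumptions as the DetectLabels example above (hypothetical filename, default AWS credentials):

```python
# Hedged sketch: face attributes in the style of the AWS Rekognition block above.
import boto3

rekognition = boto3.client("rekognition")

with open("shahn_us_highway_40.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotion scores
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```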

Microsoft Cognitive Services

Age 38
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
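
The "Very unlikely" / "Unlikely" / "Very likely" wording corresponds to the likelihood enum returned by the Google Cloud Vision face detector. A hedged sketch of how such values could be read back, again with a hypothetical filename:

```python
# Hedged sketch: likelihood-style face attributes via the Google Cloud Vision API.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("shahn_us_highway_40.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

likelihood = vision.Likelihood  # enum ranging from VERY_UNLIKELY to VERY_LIKELY
for face in response.face_annotations:
    print("Surprise", likelihood(face.surprise_likelihood).name)
    print("Anger", likelihood(face.anger_likelihood).name)
    print("Sorrow", likelihood(face.sorrow_likelihood).name)
    print("Joy", likelihood(face.joy_likelihood).name)
    print("Headwear", likelihood(face.headwear_likelihood).name)
    print("Blurred", likelihood(face.blurred_likelihood).name)
```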

Feature analysis

Amazon

Adult 99.1%
Male 99.1%
Man 99.1%
Person 99.1%
Hat 64.6%
Glove 62.6%

Captions