Human Generated Data

Title

Untitled (U.S. Highway 40, central Ohio)

Date

July 1938-August 1938, printed later

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Museum Acquisition, P1970.3346

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Adult 99.2
Male 99.2
Man 99.2
Person 99.2
Worker 98.7
Garden 81.2
Gardener 81.2
Gardening 81.2
Nature 81.2
Outdoors 81.2
Face 78.6
Head 78.6
Bucket 70
Clothing 69.4
Glove 69.4
Bathing 67.4
Water 66.8
Glove 58.5
Soil 57.6
Tar 57.1
Washing 55.8
Mining 55.8
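
The label-and-confidence pairs above match the output format of Amazon Rekognition's DetectLabels API, which the record credits for these tags. A minimal sketch of how such a tag list could be generated, assuming a local copy of the digitized photograph and default AWS credentials (the file name and confidence threshold are illustrative, not the museum's actual pipeline):

```python
# Minimal sketch: label/confidence tags via Amazon Rekognition DetectLabels.
# The image path and MinConfidence value are assumptions for illustration.
import boto3

client = boto3.client("rekognition")

with open("shahn_us_highway_40_ohio.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # roughly the lowest score shown in the list above
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```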

Clarifai
created on 2018-05-10

people 100
adult 99.5
man 98.5
one 97.6
two 96.8
group 96.2
woman 92.2
bucket 91.4
war 91.3
group together 91
wear 90.9
military 88.4
portrait 88.2
child 88.1
three 87.8
administration 86.6
four 85.4
soldier 85
home 79.4
furniture 75.2

Imagga
created on 2023-10-07

barrow 23.1
man 22.8
container 21.5
handcart 20.5
people 20.1
person 19.1
bucket 18.4
vessel 16.9
wheeled vehicle 16.2
male 15.6
television 14.2
adult 13.6
outdoors 13.4
sunset 11.7
holding 11.5
businessman 11.5
black 11.4
telecommunication system 11.1
business 10.9
old 10.4
work 10.4
outdoor 9.9
silhouette 9.9
portrait 9.7
summer 9.6
vehicle 9.3
briefcase 9.1
park 9
dirty 9
suit 9
couple 8.7
love 8.7
men 8.6
child 8.4
clothing 8.4
fashion 8.3
working 7.9
autumn 7.9
standing 7.8
sitting 7.7
walking 7.6
guy 7.4
can 7.4
water 7.3
lady 7.3
protection 7.3
danger 7.3
lifestyle 7.2
world 7.1
women 7.1
grass 7.1
shovel 7.1
day 7.1
milk can 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 99.2
standing 84.3
old 66.2
white 62.6
posing 47.1

Color Analysis

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 98.6%
Calm 75.3%
Confused 11.6%
Surprised 6.8%
Sad 6.2%
Fear 6.1%
Angry 1.6%
Happy 1%
Disgusted 1%
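
The age range, gender, and emotion percentages above correspond to the FaceDetails structure returned by Rekognition's DetectFaces API, which the record names as the source. A minimal sketch of how such a breakdown could be produced, again assuming a local image file (names are illustrative):

```python
# Minimal sketch: face attributes via Amazon Rekognition DetectFaces.
# Attributes=["ALL"] requests age range, gender, and emotion scores.
import boto3

client = boto3.client("rekognition")

with open("shahn_us_highway_40_ohio.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```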

Microsoft Cognitive Services

Age 47
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.2%
Male 99.2%
Man 99.2%
Person 99.2%
Glove 69.4%

Captions