Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

August 1938, printed later

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Museum Acquisition, P1970.3517

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Carpenter 99.9
Architecture 99.5
Building 99.5
Factory 99.5
Person 99.1
Manufacturing 97.4
Clothing 96.9
Face 85.7
Head 85.7
Workshop 72.6
Hat 67.4
Machine 57
Wood 56.8

Clarifai
created on 2018-05-10

people 100
one 99.7
adult 99.5
man 98.1
two 97
wear 95.3
production 95.2
artisan 94.4
grinder 93.7
furniture 92.7
group 91.9
concentration 91.3
room 90.3
military 89.3
war 88.6
woman 88
raw material 86.8
administration 86.6
industry 84.2
three 84.1

Imagga
created on 2023-10-05

man 24.2
industrial 21.8
industry 20.5
metal 19.3
work 19
steel 18.6
worker 17.8
working 17.7
male 17
people 16.2
person 15.9
shop 15.6
old 15.3
factory 15
job 14.1
machinist 14.1
equipment 13.6
men 12.9
black 12.7
repair 12.4
machine 12.2
safety 12
adult 11.6
building 11.2
occupation 11
iron 10.9
power 10.9
manufacturing 10.7
cowboy boot 10.7
skill 10.6
construction 10.3
hot 10
protection 10
labor 9.7
business 9.7
stall 9.7
tools 9.5
boot 9.4
fire 9.4
light 9.4
meat hook 9.1
vehicle 9.1
human 9
welder 8.9
welding 8.9
mechanic 8.8
device 8.7
helmet 8.7
footwear 8.4
danger 8.2
dirty 8.1
interior 8
mercantile establishment 7.9
craft 7.8
gloves 7.8
hook 7.7
car 7.6
fashion 7.5
house 7.5
tool 7.4
heat 7.4
art 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 98.5
black 69.2

Face analysis

AWS Rekognition

Age 16-24
Gender Female, 51.4%
Calm 99.1%
Surprised 6.3%
Fear 5.9%
Sad 2.3%
Happy 0.3%
Angry 0.1%
Confused 0.1%
Disgusted 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%
Hat 67.4%

Text analysis

Amazon

MOTO
NATION
Of