Human Generated Data

Title

Untitled (tenant farmer, Arkansas)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1582

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Face 99.8
Head 99.8
Photography 99.8
Portrait 99.8
Person 99.1
Adult 99.1
Male 99.1
Man 99.1
Wood 98.1
Hand 83.2
Body Part 83.2
Finger 83.2
Carpenter 74.4
Firearm 68.6
Weapon 68.6
Gun 57.7
Outdoors 57.6
Worker 57
Handgun 56.4
Countryside 56.2
Nature 56.2
Architecture 55.6
Building 55.6
Hut 55.6
Rural 55.6
Plywood 55.2
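
Scores in the tag lists above and below are confidence values on a 0-100 scale. For reference, a minimal sketch of how labels like Amazon's could be produced with the AWS Rekognition DetectLabels API through boto3 follows; the local file name is a placeholder rather than the museum's actual asset, and AWS credentials are assumed to be configured.

    import boto3

    client = boto3.client("rekognition")

    # Placeholder file name standing in for a local copy of the photograph.
    with open("shahn_tenant_farmer.jpg", "rb") as image_file:
        response = client.detect_labels(
            Image={"Bytes": image_file.read()},
            MaxLabels=30,        # upper bound on labels returned
            MinConfidence=55.0,  # roughly the lowest score shown above
        )

    # Print each label and confidence in the same "Name score" format used above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")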

Clarifai
created on 2018-05-11

people 100
one 99.7
adult 99.2
two 98.7
administration 98.4
man 97.2
group 95.5
leader 94.6
three 93.5
portrait 91.3
wear 86.6
music 86
furniture 85.7
writer 84.8
war 84.3
woman 83.5
group together 82.5
sit 81.6
profile 80.5
home 79.7

Imagga
created on 2023-10-05

upright 26
person 26
man 22.8
adult 20.1
male 19.9
percussion instrument 19.4
people 19
working 17.7
musical instrument 17.7
piano 17.5
old 16
stringed instrument 14.3
portrait 13.6
building 13.5
worker 13.4
keyboard instrument 13
happy 12.5
smiling 12.3
business 12.1
home 12
attractive 11.9
work 11.8
job 11.5
sitting 11.2
hair 11.1
lifestyle 10.8
device 10.7
lady 10.5
one 10.4
alone 10
scholar 9.8
black 9.6
outside 9.4
smile 9.3
office 9.2
face 9.2
house 9.2
pretty 9.1
hands 8.7
men 8.6
construction 8.6
senior 8.4
hand 8.3
looking 8
professional 7.9
intellectual 7.8
corporate 7.7
serious 7.6
casual 7.6
head 7.6
fashion 7.5
wood 7.5
cheerful 7.3
confident 7.3
student 7.2
cute 7.2
handsome 7.1
clothing 7.1
indoors 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

building 99.3
person 98.5
man 94.1
outdoor 91.9

Face analysis

AWS Rekognition

Age 45-53
Gender Male, 100%
Calm 91.6%
Surprised 6.4%
Confused 6.3%
Fear 6%
Sad 2.2%
Disgusted 0.6%
Angry 0.4%
Happy 0.1%
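
The age range, gender, and emotion percentages above match the output shape of the AWS Rekognition DetectFaces API when all facial attributes are requested. A hedged sketch under the same assumptions as the earlier example (boto3, configured credentials, placeholder file name):

    import boto3

    client = boto3.client("rekognition")

    with open("shahn_tenant_farmer.jpg", "rb") as image_file:  # placeholder path
        response = client.detect_faces(
            Image={"Bytes": image_file.read()},
            Attributes=["ALL"],  # request age range, gender, emotions, etc.
        )

    # Report each detected face in roughly the layout used above.
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")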

Microsoft Cognitive Services

Age 61
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%
Adult 99.1%
Male 99.1%
Man 99.1%
