Human Generated Data

Title

Untitled (tenant farmer, Arkansas)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2531

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Adult 99.2
Male 99.2
Man 99.2
Person 99.2
Face 99.1
Head 99.1
Photography 99.1
Portrait 99.1
Wood 90.1
Body Part 57.4
Finger 57.4
Hand 57.4
Carpenter 57.1
Gun 56.8
Weapon 56.8
Worker 56.3

Clarifai
created on 2018-05-10

people 100
one 99.7
adult 99.3
two 98.5
man 97.7
administration 97.1
group 95.4
music 95.3
portrait 93.3
three 92.3
wear 92.2
musician 88.4
actor 88.1
writer 85.6
leader 85.2
furniture 84.8
outfit 82.4
book series 80.2
movie 77.9
group together 77.9

Imagga
created on 2023-10-05

man 32.2
male 29.1
person 24.6
musical instrument 23.8
portrait 22
adult 20.1
people 20.1
electronic instrument 19.4
device 19
working 15.9
blackboard 14.7
work 14.1
black 13.5
business 13.4
attractive 12.6
smiling 12.3
men 12
happy 11.9
upright 11.7
percussion instrument 11.6
sexy 11.2
alone 10.9
model 10.9
smile 10.7
face 10.6
guy 10.5
worker 10.4
hair 10.3
lifestyle 10.1
hand 9.9
pretty 9.8
old 9.7
job 9.7
one 9.7
looking 9.6
office 9
human 9
depression 8.7
professional 8.6
serious 8.6
expression 8.5
head 8.4
emotion 8.3
fashion 8.3
piano 8.3
lady 8.1
computer 8.1
building 8
indoors 7.9
love 7.9
urban 7.9
brunette 7.8
sitting 7.7
sad 7.7
outside 7.7
two 7.6
room 7.5
senior 7.5
help 7.4
world 7.4
cheerful 7.3
laptop 7.3
stringed instrument 7.3

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 99.5
man 92.9
outdoor 87.3

Color Analysis

Face analysis

AWS Rekognition

Age 47-53
Gender Male, 100%
Calm 91.9%
Surprised 6.7%
Fear 6.1%
Confused 3.7%
Sad 2.5%
Disgusted 0.9%
Angry 0.6%
Happy 0.1%

Microsoft Cognitive Services

Age 58
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.2%
Male 99.2%
Man 99.2%
Person 99.2%