Human Generated Data

Title

Untitled (Arkansas, Kentucky, or Tennessee?)

Date

October 1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1123

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (label, confidence score in %)

Amazon
created on 2023-10-06

Photography 100
Face 100
Head 100
Portrait 100
Person 98.3
Adult 98.3
Male 98.3
Man 98.3
Accessories 95.7
Clothing 95
Formal Wear 95
Suit 95
Hat 94.3
Reading 91.6
Glasses 79.3
Coat 73.9
Body Part 73
Finger 73
Hand 73
Home Decor 56.5
Tie 56.2
Baseball Cap 55.9
Cap 55.9
Sunglasses 55.7
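
These label/score pairs have the shape of output from AWS Rekognition's DetectLabels API. The sketch below shows, under stated assumptions, how such tags could be produced with boto3; the file name and MinConfidence threshold are illustrative and not taken from the catalog record.

import boto3

# Minimal sketch (assumptions: boto3 credentials are configured and a local
# copy of the photograph exists under the illustrative name below).
client = boto3.client("rekognition")

with open("shahn_P1970-1123.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # lowest score in the listing above is about 55.7
    )

for label in response["Labels"]:
    # Each entry corresponds to one "tag  score" line in the listing above.
    print(f"{label['Name']} {label['Confidence']:.1f}")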

Clarifai
created on 2018-05-11

people 99.8
adult 98.5
man 97.1
one 95.8
lid 94.1
portrait 93.5
wear 92.8
veil 91.7
military 91.5
vehicle 90.5
outfit 89.5
two 89.3
war 88.7
transportation system 87.5
administration 87.2
uniform 86.2
offense 84.4
aircraft 80.8
street 78.7
elderly 78.6

Imagga
created on 2023-10-06

bow tie 69.3
necktie 57.1
man 42.3
male 33.3
garment 30.6
person 29.7
suit 24.8
businessman 23.8
clothing 23.3
face 22.7
business 21.9
people 20.6
adult 19.5
portrait 18.1
black 16.8
looking 16
office 14.6
expression 14.5
hand 14.4
handsome 14.3
dark 14.2
men 13.7
human 13.5
work 12.6
tie 12.3
glasses 12
corporate 12
success 11.3
guy 11.2
manager 11.2
hat 11
holding 10.7
executive 10.7
serious 10.5
one 10.4
senior 10.3
eye 9.8
hair 9.5
lifestyle 9.4
close 9.1
old 9.1
worker 9.1
job 8.8
working 8.8
look 8.8
modern 8.4
safety 8.3
metal 8
light 8
smile 7.8
television 7.8
eyes 7.7
boss 7.7
formal 7.6
head 7.6
phone 7.4
musical instrument 7.3
professional 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 87.1
window 80.2
picture frame 6.3

Color Analysis

Face Analysis

Amazon

AWS Rekognition

Age 53-61
Gender Male, 99.7%
Calm 53.4%
Angry 21.7%
Sad 18.2%
Surprised 7.4%
Fear 6.4%
Disgusted 2.4%
Confused 1.6%
Happy 1.4%
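
The age range, gender, and emotion scores above match the shape of output from AWS Rekognition's DetectFaces API when all facial attributes are requested. A minimal sketch, again assuming boto3 credentials and an illustrative local file name:

import boto3

client = boto3.client("rekognition")

with open("shahn_P1970-1123.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    gender = face["Gender"]
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotion types are returned in uppercase (e.g. CALM, ANGRY).
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")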

Feature Analysis

Amazon

Person 98.3%
Adult 98.3%
Male 98.3%
Man 98.3%
Glasses 79.3%