Human Generated Data

Title

Untitled (Arkansas)

Date

October 1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2754

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (label confidence, %)

Amazon
created on 2023-10-05

Face 100
Head 100
Photography 100
Portrait 100
Person 99.7
Adult 99.7
Male 99.7
Man 99.7
Body Part 94.5
Finger 94.5
Hand 94.5
Reading 91
Electronics 79.8
Phone 79.8
Mobile Phone 71.9
Happy 67.1
Smile 67.1
Text 66.7
Art 62.2
Painting 62.2
Senior Citizen 57.6
Beverage 57.1
Coffee 57.1
Coffee Cup 57.1
Accessories 57.1
Glasses 57.1
Clothing 55.9
Shirt 55.9
Coat 55.5
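
The Amazon tags above are the kind of labels returned by AWS Rekognition's label-detection API. Below is a minimal sketch of how comparable labels could be fetched with boto3; the image path, region, and threshold are placeholder assumptions, not a record of how these tags were actually generated.

# Minimal sketch: image labels via AWS Rekognition (boto3).
# "arkansas_1935.jpg" and the region are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("arkansas_1935.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=30,        # cap the number of labels returned
    MinConfidence=50.0,  # assumed threshold; drops low-confidence labels
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')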

Clarifai
created on 2018-05-10

people 99.9
portrait 99.6
one 99.4
man 98.6
adult 97.8
elderly 96.4
administration 95.9
sit 94.7
book series 93.5
wear 92.7
monochrome 91.5
old 90.7
leader 90.2
writer 88.1
lid 87.5
scientist 84
war 80.3
art 79.2
cap 75.8
facial expression 75.8
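
The Clarifai tags are concept predictions from a general-purpose image model. The sketch below shows one way such predictions could be requested from Clarifai's v2 REST API; the model name, API key, and image URL are assumed placeholders, and the exact model behind the 2018 tags is not recorded here.

# Rough sketch: concept predictions from Clarifai's v2 REST API.
# "general-image-recognition" is an assumed public model name;
# CLARIFAI_API_KEY and IMAGE_URL are placeholders.
import os
import requests

CLARIFAI_API_KEY = os.environ["CLARIFAI_API_KEY"]
IMAGE_URL = "https://example.org/arkansas_1935.jpg"  # placeholder

response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')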

Imagga
created on 2023-10-05

man 38.3
person 37.5
male 36.4
monk 35.2
adult 33.7
portrait 26.5
people 24
sitting 18.9
happy 18.8
attractive 17.5
face 16.3
handsome 16
smiling 15.9
model 15.5
kimono 13.8
human 13.5
hair 13.5
garment 13.3
old 13.2
lifestyle 13
smile 12.8
fashion 12.8
youth 12.8
clothing 12.7
head 12.6
guy 12.3
senior 12.2
juvenile 12.1
black 12
looking 12
casual 11.9
businessman 11.5
office 11.2
business 10.9
serious 10.5
one 10.4
robe 10.4
work 10.4
professional 10.3
mature 10.2
mother 10
holding 9.9
lady 9.7
computer 9.7
men 9.4
traditional 9.1
shirt 9
religion 9
child 8.9
working 8.8
indoors 8.8
executive 8.8
look 8.8
standing 8.7
boy 8.7
suit 8.2
style 8.2
sexy 8
home 8
women 7.9
jacket 7.9
vertical 7.9
hands 7.8
eyes 7.7
pretty 7.7
culture 7.7
laptop 7.6
clothes 7.5
outdoors 7.5
glasses 7.4
pen 7.1
love 7.1
job 7.1
adolescent 7.1
happiness 7.1
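
The Imagga tags follow the shape of Imagga's image-tagging service. A small sketch against the v2 /tags REST endpoint follows, assuming an API key and secret for HTTP Basic auth and a placeholder image URL.

# Small sketch: tagging an image with Imagga's v2 /tags endpoint.
# IMAGGA_KEY, IMAGGA_SECRET, and the image URL are placeholders.
import os
import requests

IMAGGA_KEY = os.environ["IMAGGA_KEY"]
IMAGGA_SECRET = os.environ["IMAGGA_SECRET"]
IMAGE_URL = "https://example.org/arkansas_1935.jpg"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
response.raise_for_status()

for tag in response.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')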

Microsoft
created on 2018-05-10

person 99.7
man 99.5
old 86.1
older 60.8

Face analysis

Amazon

AWS Rekognition

Age 61-71
Gender Male, 99.7%
Calm 74.7%
Happy 12.9%
Surprised 6.8%
Fear 6.2%
Sad 5.6%
Confused 1.8%
Disgusted 0.9%
Angry 0.8%
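
The age range, gender, and emotion scores above match the shape of output from AWS Rekognition's face-detection API. A minimal sketch with boto3 follows; the image path and region are placeholder assumptions.

# Minimal sketch: face attributes (age range, gender, emotions) via
# AWS Rekognition DetectFaces. The image path is a placeholder.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("arkansas_1935.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')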

Feature analysis

Amazon

Person 99.7%
Adult 99.7%
Male 99.7%
Man 99.7%

Captions

Microsoft
created on 2018-05-10

a man sitting on a table 86.4%
a man sitting at a table 86.3%
an old photo of a man 86.2%
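
The Microsoft captions resemble output from the Azure Computer Vision Describe Image operation. A hedged sketch against the REST endpoint follows; the endpoint, key, image URL, and API version are assumptions, and the 2018-era captions above may have come from an older version of the service.

# Sketch: image captions via Azure Computer Vision's Describe Image
# operation. Endpoint, key, and image URL are placeholders.
import os
import requests

AZURE_ENDPOINT = os.environ["AZURE_VISION_ENDPOINT"]  # e.g. https://<resource>.cognitiveservices.azure.com
AZURE_KEY = os.environ["AZURE_VISION_KEY"]
IMAGE_URL = "https://example.org/arkansas_1935.jpg"  # placeholder

response = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.2/describe",
    headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
    params={"maxCandidates": 3},
    json={"url": IMAGE_URL},
    timeout=30,
)
response.raise_for_status()

for caption in response.json()["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')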