Human Generated Data

Title

Untitled (sharecropper family, Little Rock, Arkansas)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2774

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Adult 98.7
Male 98.7
Man 98.7
Person 98.7
Person 98.5
Architecture 97.1
Building 97.1
Hospital 97.1
Face 94.1
Head 94.1
People 78.2
Photography 74.2
Portrait 74.2
Childbirth 69.1
Clinic 55.8
Hairdresser 55.1
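
Tags like the Amazon list above arrive as label–confidence pairs, and sites typically display only those above a confidence threshold. A minimal sketch of that filtering step, using a subset of the tags above (the `filter_tags` helper and the 90-point threshold are illustrative assumptions, not part of this record):

```python
# A subset of the Amazon tags above, as (label, confidence) pairs.
tags = [
    ("Adult", 98.7), ("Male", 98.7), ("Man", 98.7), ("Person", 98.7),
    ("Architecture", 97.1), ("Childbirth", 69.1), ("Clinic", 55.8),
]

def filter_tags(tags, threshold=90.0):
    """Keep tags at or above the confidence threshold, highest first."""
    kept = [(label, conf) for label, conf in tags if conf >= threshold]
    return sorted(kept, key=lambda t: t[1], reverse=True)

high = filter_tags(tags)  # drops low-confidence guesses like "Clinic"
```

With the default threshold, the low-confidence labels ("Childbirth", "Clinic") are dropped and the remaining five are returned in descending confidence order.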

Clarifai
created on 2018-05-10

people 100
adult 99.3
group 98.8
woman 97.3
two 96.9
man 95.9
wear 95.3
sit 94.8
portrait 93.3
furniture 92.3
administration 92
three 92
actor 91.8
leader 91.2
group together 91
one 89.6
music 89.2
four 89.2
several 88.2
vehicle 88.1

Imagga
created on 2023-10-06

man 51.7
male 41.8
person 36
people 33.4
senior 22.5
adult 22.4
men 21.5
professional 20.2
work 19.6
portrait 18.8
mature 18.6
couple 18.3
worker 18.2
smiling 18.1
hat 17.7
happy 17.5
sitting 16.3
job 15.9
together 15.8
elderly 15.3
home 15.1
working 15
lifestyle 14.4
casual 14.4
indoors 13.2
smile 12.8
business 12.7
businessman 12.4
nurse 12.3
uniform 12.1
old 11.8
love 11.8
looking 11.2
outdoors 11.2
two 11
occupation 11
retired 10.7
retirement 10.6
attractive 10.5
glasses 10.2
camera 10.2
face 9.9
hand 9.9
office 9.6
clothing 9.3
20s 9.2
laptop 9.1
husband 8.9
medical 8.8
scholar 8.7
married 8.6
happiness 8.6
industry 8.5
wife 8.5
meeting 8.5
communication 8.4
leisure 8.3
jacket 8.3
grandfather 8
handsome 8
computer 8
family 8
helmet 8
musical instrument 7.8
device 7.8
table 7.8
father 7.6
reading 7.6
talking 7.6
relaxed 7.5
patient 7.5
holding 7.4
safety 7.4
cheerful 7.3
industrial 7.3
building 7.2
black 7.2
guy 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 99.4
man 92
text 86.5
black 67.7

Face analysis

AWS Rekognition

Age 39-47
Gender Male, 76.6%
Sad 97.1%
Surprised 15.7%
Disgusted 14%
Calm 11.4%
Fear 6.6%
Confused 4.5%
Angry 3.6%
Happy 1.3%

AWS Rekognition

Age 23-33
Gender Male, 99.9%
Sad 99.8%
Calm 22.3%
Surprised 6.5%
Fear 6.3%
Disgusted 3.4%
Happy 1.2%
Angry 1%
Confused 0.7%
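
Note that the emotion scores above are independent per-emotion confidences and need not sum to 100. The displayed dominant emotion is simply the highest-scoring one, which can be sketched as (variable names here are illustrative):

```python
# Scores from the second AWS Rekognition face block above.
emotions = {
    "Sad": 99.8, "Calm": 22.3, "Surprised": 6.5, "Fear": 6.3,
    "Disgusted": 3.4, "Happy": 1.2, "Angry": 1.0, "Confused": 0.7,
}

# Dominant emotion = argmax over the per-emotion confidences.
dominant = max(emotions, key=emotions.get)
```

For this face, the argmax picks "Sad" at 99.8%.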

Microsoft Cognitive Services

Age 76
Gender Male

Feature analysis

Amazon

Adult 98.7%
Male 98.7%
Man 98.7%
Person 98.7%

Categories

Imagga

paintings art 98.3%
people portraits 1.5%

Captions

Microsoft
created on 2018-05-10

a black and white photo of a man 90.2%
an old photo of a man 90.1%
a man holding a dog 61.4%