Human Generated Data

Title

Untitled (sharecroppers, Marked Tree, Arkansas)

Date

October 1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2586

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (label with confidence score, 0-100)

Amazon
created on 2023-10-06

Brick 100
Face 100
Head 100
Photography 100
Portrait 100
Architecture 100
Building 100
Wall 100
Clothing 99.9
Person 99.7
Adult 99.7
Male 99.7
Man 99.7
Person 99.4
Adult 99.4
Male 99.4
Man 99.4
Cap 99.1
Person 99
Adult 99
Female 99
Woman 99
Coat 98.4
Baseball Cap 98.2
Accessories 89.7
Jewelry 87.2
Necklace 87.2
Shirt 76.6
Jacket 68.3
Hat 65.8
Glasses 64.7
Happy 57.1
Smile 57.1
Sunglasses 56.1
Sun Hat 55.8
Bonnet 55.5
People 55.2
Blouse 55.2
Body Part 55
Neck 55
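
The Amazon tags above pair each detected label with a 0-100 confidence score, which is the shape of output returned by AWS Rekognition's DetectLabels operation. A minimal sketch of how tags like these could be produced with boto3 (the file name, region, and threshold values are illustrative assumptions, not the museum's actual pipeline):

```python
# Sketch: label tags via AWS Rekognition DetectLabels (boto3).
# File name, region, and thresholds are assumed for illustration.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("shahn_marked_tree.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,        # assumed cap on returned labels
    MinConfidence=55.0,  # assumed cutoff; the lowest score listed above is ~55
)

# Each label carries a name and a 0-100 confidence, matching the rows above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```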

Clarifai
created on 2018-05-10

people 99.9
two 99
adult 98.6
man 98.4
portrait 97.9
three 97
group 96.1
wear 94.3
administration 92.9
woman 90.9
four 90.7
child 89.8
group together 89.4
several 88.6
uniform 87.6
veil 86.3
lid 85.4
leader 84.4
five 84.4
one 84

Imagga
created on 2023-10-06

man 34.9
person 30.8
male 27.5
people 21.2
adult 18.2
portrait 16.2
mask 15.9
black 15.7
men 15.4
face 13.5
work 11.8
wall 11.8
looking 11.2
old 11.1
holding 10.7
hair 10.3
world 10
happy 10
worker 9.8
human 9.7
brick 9.7
serious 9.5
clothing 9.2
city 9.1
one 9
urban 8.7
building 7.9
standing 7.8
guy 7.7
child 7.7
attractive 7.7
industry 7.7
covering 7.6
art 7.5
fashion 7.5
dark 7.5
outdoors 7.5
teen 7.3
hat 7.3
teenager 7.3
smiling 7.2
home 7.2
religion 7.2
uniform 7

Microsoft
created on 2018-05-10

person 97.6
man 96.1
outdoor 95.5

Face analysis

AWS Rekognition

Age 25-35
Gender Female, 51.2%
Fear 89.6%
Calm 17.3%
Surprised 6.4%
Sad 5.5%
Confused 1.2%
Angry 0.4%
Happy 0.3%
Disgusted 0.3%
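
The age range, gender, and ranked emotion scores above match the attribute output of AWS Rekognition's DetectFaces operation. A hedged sketch of one way to obtain them (the file name and region are assumptions):

```python
# Sketch: face attributes via AWS Rekognition DetectFaces (boto3).
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("shahn_marked_tree.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions arrive as type/confidence pairs; sort to list the strongest first
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```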

Microsoft Cognitive Services

Age 32
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely
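
The Google Vision rows report likelihood ratings rather than numeric scores; the Cloud Vision face-detection API returns these as Likelihood enum values (VERY_UNLIKELY through VERY_LIKELY). A minimal sketch, assuming a local copy of the image and default credentials:

```python
# Sketch: face likelihoods via the Google Cloud Vision API (google-cloud-vision).
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("shahn_marked_tree.jpg", "rb") as f:  # hypothetical local copy
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

for face in response.face_annotations:
    # Each attribute is a Likelihood enum, rendered in the record above as
    # human-readable strings such as "Very unlikely" and "Likely".
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```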

Feature analysis

Amazon

Person 99.7%
Adult 99.7%
Male 99.7%
Man 99.7%
Female 99%
Woman 99%
Necklace 87.2%
Hat 65.8%
Glasses 64.7%

Categories

Imagga

paintings art 98.8%