Human Generated Data

Title

Untitled (cotton pickers, Alexander Plantation, Pulaski County, Arkansas)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2492

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Clothing 100
Hat 100
People 99.9
Person 99.6
Adult 99.6
Male 99.6
Man 99.6
Person 99.5
Adult 99.5
Male 99.5
Man 99.5
Person 99.5
Person 99.4
Adult 99.4
Male 99.4
Man 99.4
Coat 95.5
Coat 91.7
Outdoors 91.5
Footwear 87.7
Shoe 87.7
Face 79.6
Head 79.6
Shoe 79.4
Nature 78.4
Shorts 74.3
Shoe 68.6
Architecture 57.9
Building 57.9
Shelter 57.9
Pants 57.7
Skirt 56.7
Countryside 56.5
Cap 56.1
Shirt 55.9
Dress 55.8
Yard 55.4
Sun Hat 55.3
Agriculture 55.2
Field 55.2
Grass 55
Plant 55

Clarifai
created on 2018-05-10

people 100
group 99.6
adult 99.6
group together 99.4
man 98.2
administration 97.7
several 97.5
many 96.4
war 94.6
military 94
five 93.7
three 93.3
leader 92.8
woman 92.8
wear 92.4
child 89.8
two 89.6
four 87.4
uniform 86.7
outfit 86.4

Imagga
created on 2023-10-05

man 27.6
people 22.3
male 18.4
person 17.8
uniform 14.8
clothing 14.6
weapon 14
men 13.7
adult 11.9
old 11.8
war 11.6
hat 11.5
culture 11.1
sport 10.9
travel 10.6
standing 10.4
two 10.2
outdoor 9.9
statue 9.9
history 9.8
boy 9.6
home 8.8
musical instrument 8.8
military 8.7
gun 8.5
industry 8.5
mask 8.5
house 8.4
leisure 8.3
outdoors 8.2
religion 8.1
worker 8
love 7.9
together 7.9
work 7.8
building 7.8
device 7.8
soldier 7.8
architecture 7.8
portrait 7.8
life 7.7
horse 7.6
hand 7.6
walking 7.6
happy 7.5
style 7.4
historic 7.3
active 7.3
protection 7.3
game 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 100
grass 99.5
outdoor 99.1
man 94.1
standing 92.2
group 84.5
old 83.7
posing 75.9
black 75.2

Face analysis

AWS Rekognition

Age 27-37
Gender Female, 86.2%
Happy 96.8%
Surprised 7.6%
Fear 5.9%
Sad 2.2%
Angry 0.2%
Disgusted 0.1%
Confused 0.1%
Calm 0%

AWS Rekognition

Age 18-26
Gender Male, 99.5%
Sad 90.2%
Calm 52%
Surprised 6.6%
Fear 6%
Confused 2.6%
Disgusted 0.9%
Angry 0.7%
Happy 0.4%

AWS Rekognition

Age 23-33
Gender Female, 82%
Calm 84.6%
Sad 15.3%
Surprised 6.3%
Fear 5.9%
Confused 0.2%
Disgusted 0.1%
Happy 0.1%
Angry 0.1%

AWS Rekognition

Age 31-41
Gender Male, 100%
Calm 70.2%
Sad 55.5%
Surprised 6.3%
Fear 6%
Happy 0.7%
Disgusted 0.2%
Angry 0.1%
Confused 0.1%

Microsoft Cognitive Services

Age 36
Gender Female

Microsoft Cognitive Services

Age 35
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Adult 99.6%
Male 99.6%
Man 99.6%
Coat 95.5%
Shoe 87.7%

Categories

Imagga

paintings art 70.4%
people portraits 28.3%