Human Generated Data

Title

Cotton pickers ready for a day's work, 6:30 a.m., Pulaski County, Arkansas

Date

1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3055

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Clothing 100
Apparel 100
Hat 99.9
Person 99.3
Human 99.3
Person 98.6
Sun Hat 90.9
Person 86.5
Cowboy Hat 63
Person 56.7
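
These label-and-score pairs follow the shape of the response returned by AWS Rekognition's DetectLabels operation. A minimal sketch of how comparable tags could be produced with boto3 is shown below; the image filename and the MinConfidence threshold are illustrative assumptions, not values taken from this record.

    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical local copy of the digitized photograph.
    with open("shahn_cotton_pickers.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=50,  # assumed cutoff; the list above bottoms out near 56.7
        )

    # Print each label with its confidence, mirroring the tag/score layout above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")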

Clarifai
created on 2023-10-15

people 99.9
two 99.6
portrait 99.6
lid 99.1
adult 98.9
three 98.7
man 98.5
woman 97
veil 96.5
wear 95.3
group 94.9
retro 93.1
vintage 93
four 92.5
monochrome 91.9
elderly 91.8
son 90.6
group together 90.3
family 88.2
one 87.7

Imagga
created on 2021-12-15

hat 69.1
headdress 37.4
sombrero 37.3
clothing 31
person 31
man 27.5
male 25.7
cowboy hat 24
people 22.3
portrait 22
adult 18.8
happy 17.5
fashion 16.6
hair 15.8
cowboy 15.7
attractive 15.4
smile 15
seller 14.6
guy 14.4
outdoors 14.2
covering 13.8
smiling 13.7
love 13.4
model 13.2
look 13.1
shirt 13.1
hand 12.9
expression 12.8
casual 12.7
consumer goods 12.6
umbrella 12.5
style 11.9
handsome 11.6
face 11.4
sexy 11.2
looking 11.2
posing 10.7
outside 10.3
two 10.2
old 9.7
western 9.7
couple 9.6
lifestyle 9.4
senior 9.4
pretty 9.1
dress 9
worker 8.9
together 8.8
work 8.7
standing 8.7
married 8.6
happiness 8.6
sitting 8.6
outdoor 8.4
dark 8.3
emotion 8.3
vintage 8.3
holding 8.2
pose 8.1
job 8
boy 7.8
oriental 7.7
youth 7.7
enjoy 7.5
relaxed 7.5
human 7.5
hold 7.4
child 7.3
cheerful 7.3
lady 7.3
business 7.3
relaxing 7.3
black 7.2
working 7.1

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 99.4
clothing 98.7
person 97.4
human face 96.7
hat 88.1
fashion accessory 87.9
man 81.4
smile 74
drawing 69.4
black and white 69
woman 66.2
people 60.6
old 57.5
posing 35.7

Color Analysis

Face analysis

AWS Rekognition

Age 43-61
Gender Female, 98.7%
Confused 59.3%
Sad 16.9%
Calm 8.4%
Angry 5.8%
Surprised 3.1%
Fear 2.7%
Disgusted 2.2%
Happy 1.6%

AWS Rekognition

Age 22-34
Gender Female, 96.6%
Calm 37.9%
Sad 25.5%
Surprised 14.1%
Fear 14%
Confused 5.7%
Angry 1.1%
Happy 1.1%
Disgusted 0.5%
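
The age range, gender, and ranked emotion scores above are the fields AWS Rekognition's DetectFaces operation returns when the full attribute set is requested. A minimal sketch with boto3 follows; the image filename is a hypothetical stand-in for the digitized photograph.

    import boto3

    rekognition = boto3.client("rekognition")

    with open("shahn_cotton_pickers.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # required to get AgeRange, Gender, and Emotions
        )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions are printed highest-confidence first, as in the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")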

Microsoft Cognitive Services

Age 45
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely
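
The likelihood ratings above correspond to the enum values returned by Google Cloud Vision face detection. A minimal sketch of reading them with the google-cloud-vision client library follows, again assuming a hypothetical local image file.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("shahn_cotton_pickers.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Map the Likelihood enum (0-5) to the display strings used above.
    likelihood = ("Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely")

    for face in response.face_annotations:
        print("Surprise", likelihood[face.surprise_likelihood])
        print("Anger", likelihood[face.anger_likelihood])
        print("Sorrow", likelihood[face.sorrow_likelihood])
        print("Joy", likelihood[face.joy_likelihood])
        print("Headwear", likelihood[face.headwear_likelihood])
        print("Blurred", likelihood[face.blurred_likelihood])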

Feature analysis

Amazon

Hat 99.9%
Person 99.3%

Categories

Text analysis

Amazon

LITTLE
ARVIN
ARVIN LITTLE MANUFACTURED ROCK, HU
HU
ROCK,
MANUFACTURED
CAM
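
Detections like these are the output shape of AWS Rekognition's DetectText operation, which returns whole LINE detections (for example "ARVIN LITTLE MANUFACTURED ROCK, HU") alongside the individual WORD detections. A minimal sketch with boto3, with the image path again assumed:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("shahn_cotton_pickers.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # Each detection carries the recognized string, its type (LINE or WORD),
    # and a confidence score.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"],
              round(detection["Confidence"], 1))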

Google

KAM VIN HU LE ROCK
KAM
VIN
HU
LE
ROCK