Human Generated Data

Title

Untitled (cotton pickers, Alexander Plantation, Pulaski County, Arkansas)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2607

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Adult 99.2
Male 99.2
Man 99.2
Person 99.2
Clothing 99.2
Hat 99.2
Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Adult 99
Male 99
Man 99
Person 99
Adult 98.9
Male 98.9
Man 98.9
Person 98.9
Adult 98.9
Male 98.9
Man 98.9
Person 98.9
Male 98.4
Person 98.4
Boy 98.4
Child 98.4
Adult 98
Person 98
Female 98
Woman 98
Machine 97.7
Wheel 97.7
Wheel 96
Transportation 95.4
Vehicle 95.4
Wagon 95.4
Adult 94.8
Male 94.8
Man 94.8
Person 94.8
Face 86.4
Head 86.4
Animal 75
Bull 75
Mammal 75
Spoke 70.4

Clarifai
created on 2018-05-10

people 100
group 99.9
group together 99.9
vehicle 99.8
many 99.7
transportation system 99.4
adult 98.9
several 98.9
man 97.9
military 96.7
watercraft 96.3
administration 95.8
war 95.7
four 94.2
five 94
soldier 94
driver 92
leader 91.9
wear 91.5
wagon 90.5

Imagga
created on 2023-10-05

vehicle 54.3
half track 38.7
military vehicle 33.5
tracked vehicle 33.2
crate 30.2
man 29.5
box 27.7
male 26.9
wheeled vehicle 25
container 20.1
machine 20
conveyance 19.3
outdoor 17.6
transportation 17
transport 15.5
people 15
landscape 14.9
sky 14.7
old 14.6
person 14
adult 13.2
gun 12.9
men 12.9
cart 12.7
wagon 12.1
outdoors 11.9
military 11.6
industrial 10.9
cannon 10.9
farm 10.7
rural 10.6
sitting 10.3
industry 10.2
work 10.2
field 10
weapon 9.6
dirt 9.5
grass 9.5
danger 9.1
sport 9.1
tractor 9
antique 8.6
outside 8.5
horse 8.5
car 8.5
uniform 8.2
hay 8.2
handsome 8
mountain 8
working 7.9
wheels 7.8
travel 7.7
summer 7.7
tank 7.6
wheel 7.5
thresher 7.5
technology 7.4
speed 7.3
road 7.2
sunset 7.2
boy 7.2
businessman 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 96.6
person 93.2
transport 76.2
old 72.1
cart 66.8
pulling 57.6

Color Analysis

Face analysis

AWS Rekognition

Age 24-34
Gender Female, 95.9%
Calm 84%
Surprised 6.6%
Sad 6.6%
Fear 6.5%
Confused 2.7%
Happy 1.4%
Angry 1%
Disgusted 0.7%

AWS Rekognition

Age 21-29
Gender Male, 75.2%
Happy 60.8%
Surprised 34%
Confused 7.1%
Fear 6.1%
Calm 4.5%
Sad 2.5%
Angry 1.4%
Disgusted 0.7%

AWS Rekognition

Age 36-44
Gender Female, 91.4%
Calm 58.3%
Happy 32%
Fear 6.6%
Surprised 6.5%
Sad 4.5%
Disgusted 0.9%
Angry 0.6%
Confused 0.4%

AWS Rekognition

Age 21-29
Gender Female, 91.1%
Surprised 96.6%
Happy 28.4%
Fear 5.9%
Sad 2.2%
Calm 0.5%
Confused 0.4%
Angry 0.2%
Disgusted 0.2%

AWS Rekognition

Age 24-34
Gender Male, 99.8%
Sad 99.9%
Calm 18.5%
Surprised 7.1%
Fear 6%
Confused 4.6%
Angry 1%
Disgusted 0.4%
Happy 0.4%

Microsoft Cognitive Services

Age 24
Gender Male

Microsoft Cognitive Services

Age 39
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.2%
Male 99.2%
Man 99.2%
Person 99.2%
Boy 98.4%
Child 98.4%
Female 98%
Woman 98%
Wheel 97.7%

Categories

Text analysis

Amazon

CREST