Human Generated Data

Title

Untitled (unemployed trappers, Plaquemines Parish, Louisiana)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1312

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Clothing 100
Sun Hat 99.7
Adult 99.5
Female 99.5
Person 99.5
Woman 99.5
Adult 99.4
Person 99.4
Male 99.4
Man 99.4
Adult 98.6
Person 98.6
Male 98.6
Man 98.6
Hat 96.8
Face 92.7
Head 92.7
Photography 90.9
Portrait 90.9
Footwear 90.5
Shoe 90.5
Jeans 87.9
Pants 87.9
Sitting 87.7
Outdoors 80.2
Wood 72.6
Shoe 71.2
Nature 69.1
Furniture 57.8
Bench 57.3
Firearm 56.7
Gun 56.7
Rifle 56.7
Weapon 56.7

Clarifai
created on 2018-05-11

people 100
adult 99.6
two 99.1
group together 98.8
group 98.7
three 98.3
man 97.9
veil 97.4
four 96.7
vehicle 95.4
wear 93.1
several 93
lid 92.4
elderly 92.2
five 92.1
transportation system 90.6
woman 89.3
one 88.2
administration 86.6
military 86.4

Imagga
created on 2023-10-07

man 37
seller 36.6
male 35.7
hat 34.4
person 26.6
worker 24.7
people 24.5
working 20.3
building 19.9
job 19.5
work 18.9
construction 18
builder 17.9
adult 17.5
industry 17.1
outdoors 15.7
occupation 15.6
men 14.6
uniform 14.6
happy 14.4
clothing 13.7
industrial 12.7
child 12.7
helmet 12.6
outdoor 12.2
hand 12.2
cowboy hat 12
site 11.3
equipment 11.2
cowboy 11
hardhat 10.8
smile 10.7
portrait 10.4
two 10.2
contractor 9.7
bench 9.6
outside 9.4
alone 9.1
old 9.1
looking 8.8
lifestyle 8.7
sitting 8.6
professional 8.5
house 8.4
street 8.3
park 8.2
engineer 8
guy 8
handsome 8
scholar 8
headdress 8
home 8
love 7.9
together 7.9
boy 7.8
labor 7.8
lonely 7.7
repair 7.7
casual 7.6
hard 7.6
senior 7.5
leisure 7.5
holding 7.4
sport 7.4
world 7.4
intellectual 7.4
children 7.3
business 7.3
travel 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 98.8
man 97.2
person 95.9
sitting 94

Face analysis

AWS Rekognition

Age 52-60
Gender Male, 92.2%
Happy 94.3%
Surprised 7.4%
Fear 5.9%
Calm 2.6%
Sad 2.2%
Confused 0.4%
Angry 0.2%
Disgusted 0.1%

AWS Rekognition

Age 23-31
Gender Male, 100%
Sad 98.1%
Calm 39.8%
Surprised 7%
Fear 6%
Confused 1.5%
Disgusted 1%
Happy 0.9%
Angry 0.7%

AWS Rekognition

Age 41-49
Gender Male, 99.6%
Happy 50.1%
Calm 22.6%
Surprised 9.4%
Disgusted 9%
Fear 7.2%
Angry 4.4%
Sad 3.9%
Confused 1.3%

Microsoft Cognitive Services

Age 31
Gender Male

Microsoft Cognitive Services

Age 42
Gender Male

Microsoft Cognitive Services

Age 78
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.5%
Female 99.5%
Person 99.5%
Woman 99.5%
Male 99.4%
Man 99.4%
Hat 96.8%
Shoe 90.5%
Jeans 87.9%

Categories

Imagga

paintings art 98.8%