Human Generated Data

Title

Untitled (unemployed trappers, Plaquemines Parish, Louisiana)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1313

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 100
Hat 100
Adult 99.5
Male 99.5
Man 99.5
Person 99.5
Face 99.1
Head 99.1
Photography 99.1
Portrait 99.1
Male 98.8
Person 98.8
Boy 98.8
Child 98.8
Adult 98.4
Male 98.4
Man 98.4
Person 98.4
Bench 96.6
Furniture 96.6
Coat 93.2
Person 92.2
Sitting 91.7
Footwear 91.3
Shoe 91.3
People 89.5
Shoe 81
Wood 79.1
Outdoors 73.3
Pants 72.2
Nature 57.8
Couch 57.6
Bus Stop 56.4
Food 56.3
Fruit 56.3
Plant 56.3
Produce 56.3
Sun Hat 55.5
Pottery 55.2
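
The label-and-score format above matches what AWS Rekognition's DetectLabels operation returns. A minimal sketch of how such tags could be reproduced with boto3 follows; the image path and the MinConfidence threshold are assumptions, not part of the original record.

```python
# Minimal sketch: reproduce Amazon-style label tags with AWS Rekognition.
# Assumes AWS credentials are configured and "photo.jpg" (hypothetical path)
# is a local copy of the image.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # assumed threshold; the tags above bottom out around 55
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```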

Clarifai
created on 2018-05-11

people 100
adult 99.1
group 98.6
group together 98.2
two 98
three 97.6
man 97.1
military 97.1
four 96.3
war 95.7
child 94.9
wear 92.3
woman 91.5
several 91.5
five 91.1
outfit 90.9
administration 90.6
soldier 90.2
sit 88.6
uniform 87.5
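
The concept scores above are the kind of output returned by Clarifai's v2 predict endpoint. A hedged sketch follows; the API key and image URL are placeholders, and the general-model ID is an assumption.

```python
# Minimal sketch: fetch Clarifai-style concepts from the Clarifai v2 predict
# endpoint with the "general" model.
import requests

API_KEY = "your_clarifai_api_key"  # hypothetical credential
MODEL_ID = "general-v1.3"          # assumed ID for the general model
IMAGE_URL = "https://example.org/photo.jpg"  # placeholder

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai reports confidence as a 0-1 value; the listing above uses percent.
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```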

Imagga
created on 2023-10-06

man 30.9
male 25.6
people 25.1
kin 24.8
person 23.7
old 19.5
adult 18.9
sitting 18.9
seller 18
happy 15
working 15
religion 14.3
home 13.6
couple 13.1
lifestyle 13
men 12.9
worker 12.5
portrait 12.3
work 12.2
casual 11.9
smile 11.4
senior 11.2
religious 11.2
child 10.7
job 10.6
indoors 10.5
two 10.2
passenger 10.1
outdoors 9.7
together 9.6
day 9.4
mature 9.3
face 9.2
horizontal 9.2
family 8.9
sculpture 8.8
stretcher 8.8
smiling 8.7
love 8.7
industry 8.5
travel 8.4
black 8.4
20s 8.2
building 8.1
patient 8.1
computer 8
room 8
businessman 7.9
business 7.9
boy 7.8
color 7.8
ancient 7.8
pray 7.8
elderly 7.7
statue 7.6
house 7.5
laptop 7.5
one 7.5
dad 7.4
occupation 7.3
hat 7.3
looking 7.2
history 7.2
mother 7.1
architecture 7
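
The tags above resemble the response of Imagga's v2 tagging endpoint. A sketch under that assumption follows; the credentials and image URL are placeholders.

```python
# Minimal sketch: fetch Imagga-style tags from the Imagga v2 REST API.
import requests

API_KEY = "your_imagga_api_key"       # hypothetical credentials
API_SECRET = "your_imagga_api_secret"
IMAGE_URL = "https://example.org/photo.jpg"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # HTTP basic auth with key/secret
)
resp.raise_for_status()

# Imagga reports confidence on a 0-100 scale, matching the listing above.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```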

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 99.8
outdoor 97.7
sitting 94.3
subway 10.6
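
These tags are consistent with the Microsoft Computer Vision analyze endpoint of that era. A sketch follows; the region, API version, key, and image URL are assumptions.

```python
# Minimal sketch: tag an image with the Microsoft Computer Vision REST API
# (v2.0-era endpoint, matching the 2018 creation date above).
import requests

ENDPOINT = "https://westus.api.cognitive.microsoft.com/vision/v2.0/analyze"  # assumed region/version
KEY = "your_subscription_key"  # hypothetical credential
IMAGE_URL = "https://example.org/photo.jpg"  # placeholder

resp = requests.post(
    ENDPOINT,
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    # The API reports confidence as a 0-1 fraction; the listing above uses percent.
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```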

Face analysis

AWS Rekognition

Age 20-28
Gender Female, 100%
Calm 94.5%
Surprised 6.4%
Fear 6%
Sad 3.5%
Happy 0.5%
Confused 0.3%
Angry 0.3%
Disgusted 0.1%

AWS Rekognition

Age 52-60
Gender Male, 99.9%
Calm 52.2%
Surprised 39.7%
Happy 16.1%
Fear 6.4%
Sad 2.5%
Angry 1.2%
Confused 0.9%
Disgusted 0.5%

AWS Rekognition

Age 43-51
Gender Male, 98.8%
Calm 99.8%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Happy 0.1%
Confused 0%
Disgusted 0%
Angry 0%

AWS Rekognition

Age 14-22
Gender Female, 97.1%
Calm 64.8%
Surprised 8.8%
Confused 8.4%
Fear 7.9%
Angry 7.2%
Sad 6.3%
Happy 1.5%
Disgusted 1.1%
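
The age ranges, gender calls, and emotion percentages above follow the shape of AWS Rekognition's DetectFaces response. A minimal boto3 sketch follows; the image path is a placeholder.

```python
# Minimal sketch: reproduce the per-face age/gender/emotion estimates with
# AWS Rekognition DetectFaces. Assumes configured AWS credentials and a
# local "photo.jpg" (hypothetical path).
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required for AgeRange, Gender, and Emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```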

Microsoft Cognitive Services

Age 46
Gender Male

Microsoft Cognitive Services

Age 10
Gender Female
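
The single age and gender estimates above match the Microsoft Face detect endpoint with age and gender attributes requested. A sketch follows; the region, key, and image URL are placeholders, and the v1.0-era attribute set is an assumption.

```python
# Minimal sketch: estimate age and gender with the Microsoft Face REST API
# (v1.0-era endpoint).
import requests

ENDPOINT = "https://westus.api.cognitive.microsoft.com/face/v1.0/detect"  # assumed region
KEY = "your_subscription_key"  # hypothetical credential
IMAGE_URL = "https://example.org/photo.jpg"  # placeholder

resp = requests.post(
    ENDPOINT,
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
resp.raise_for_status()

for face in resp.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}")
    print(f"Gender {attrs['gender'].title()}")
```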

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely
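
The likelihood ratings above mirror the face-annotation fields of the Google Cloud Vision client. A sketch follows; credentials and the local image path are assumptions.

```python
# Minimal sketch: reproduce the Google Vision face likelihoods with the
# google-cloud-vision client. Assumes application default credentials are
# configured and "photo.jpg" (hypothetical path) is a local copy of the image.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each likelihood is an enum such as VERY_UNLIKELY or VERY_LIKELY,
# matching the ratings listed above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```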

Feature analysis

Amazon

Adult 99.5%
Male 99.5%
Man 99.5%
Person 99.5%
Boy 98.8%
Child 98.8%
Shoe 91.3%

Categories

Imagga

paintings art 98.4%