Human Generated Data

Title

Untitled (unemployed trappers, Plaquemines Parish, Louisiana)

Date

October 1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1346

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-18

Face 100
Head 100
Person 99.5
Adult 99.5
Male 99.5
Man 99.5
Person 99.3
Adult 99.3
Male 99.3
Man 99.3
Photography 99.3
Portrait 99.3
Person 98.2
Male 98.2
Boy 98.2
Child 98.2
Person 96.5
Clothing 96.2
Person 95.7
Male 95.7
Boy 95.7
Child 95.7
Sad 95
Hat 92.9
Smoke 57.7
Happy 56.5
Smile 56.5
Body Part 55.5
Finger 55.5
Hand 55.5
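
The machine-generated tags above are label/confidence pairs, with some labels repeated because they were detected on more than one region of the photograph. As a minimal sketch (the data subset and the 90% threshold are illustrative choices, not part of the record), such output might be deduplicated and filtered like this:

```python
# A few of the Amazon tags listed above, as (label, confidence) pairs.
amazon_tags = [
    ("Person", 99.5), ("Adult", 99.5), ("Male", 99.5), ("Man", 99.5),
    ("Person", 99.3), ("Portrait", 99.3), ("Boy", 98.2), ("Child", 98.2),
    ("Hat", 92.9), ("Smoke", 57.7), ("Happy", 56.5), ("Smile", 56.5),
]

def best_tags(tags, threshold=90.0):
    """Keep each label's highest confidence, then drop low-confidence guesses."""
    best = {}
    for label, conf in tags:
        best[label] = max(conf, best.get(label, 0.0))
    return {label: conf for label, conf in best.items() if conf >= threshold}

print(best_tags(amazon_tags))
```

With this subset, duplicate "Person" entries collapse to the 99.5 detection, and tags such as "Smoke" and "Smile" fall below the cutoff.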

Clarifai
created on 2023-10-18

people 100
adult 99.6
portrait 99.5
two 99.2
man 98.5
wear 95.3
group 94.1
three 93.8
one 93.5
administration 90.2
veil 88
facial expression 87.8
music 87.6
actor 86.8
elderly 86
leader 83.3
musician 82.6
group together 81.6
four 81
facial hair 80.9

Imagga
created on 2018-12-27

man 53.8
male 46.2
person 39.2
adult 31.8
portrait 29.1
businessman 28.3
call 25.7
senior 25.3
people 25.1
handsome 23.2
looking 22.4
face 22
business 21.3
phone 21.2
serious 20
expression 17.1
mature 16.7
old 16.7
suit 16.4
communication 16
hand 16
grandfather 15.8
telephone 15.1
happy 15
black 15
one 14.9
guy 14.7
office 13.8
mobile 13.2
smile 12.8
confident 12.7
talking 12.4
shirt 12.1
sitting 12
alone 11.9
hat 11.7
thoughtful 11.7
smiling 11.6
glasses 11.1
professional 11.1
work 11
cellphone 10.7
job 10.6
elderly 10.5
human 10.5
thinking 10.4
corporate 10.3
men 10.3
outdoors 9.7
look 9.6
standing 9.6
happiness 9.4
casual 9.3
emotion 9.2
holding 9.1
technology 8.9
worried 8.8
depression 8.8
retired 8.7
lifestyle 8.7
god 8.6
tie 8.5
executive 8.5
boy 8.3
performer 8
love 7.9
beard 7.9
couple 7.8
hands 7.8
pray 7.8
prayer 7.7
sad 7.7
modern 7.7
hope 7.7
attractive 7.7
think 7.7
spiritual 7.7
cell 7.7
businesspeople 7.6
dark 7.5
manager 7.5
clothing 7.4
indoor 7.3
hair 7.1
monk 7.1

Google
created on 2018-12-27

Microsoft
created on 2018-12-27

person 99.3
man 98.2
outdoor 93.1
old 41.1
crowd 1.1
street 1.1
black and white 0.4
hat 0.2

Face analysis

AWS Rekognition

Age 21-29
Gender Male, 100%
Calm 78.2%
Sad 25.8%
Surprised 6.3%
Fear 6%
Confused 1.1%
Angry 0.4%
Disgusted 0.3%
Happy 0.2%

AWS Rekognition

Age 53-61
Gender Male, 99.5%
Calm 95.2%
Surprised 7.1%
Fear 6%
Sad 2.4%
Angry 0.9%
Confused 0.9%
Disgusted 0.2%
Happy 0.1%

AWS Rekognition

Age 16-24
Gender Female, 99.8%
Calm 91.2%
Surprised 6.6%
Fear 5.9%
Sad 5%
Angry 0.8%
Disgusted 0.4%
Confused 0.4%
Happy 0.2%

AWS Rekognition

Age 12-20
Gender Female, 96.6%
Calm 43.2%
Confused 31%
Angry 9.7%
Surprised 7.4%
Fear 6.6%
Sad 6.4%
Happy 3.5%
Disgusted 0.6%

Microsoft Cognitive Services

Age 29
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Adult 99.5%
Male 99.5%
Man 99.5%
Boy 98.2%
Child 98.2%
Hat 92.9%

Categories

Imagga

people portraits 97.6%
paintings art 1.9%

Captions

Microsoft
created on 2018-12-27

a man wearing a hat 93.5%
a man wearing a black hat 91.4%
a man looking at the camera 91.3%