Human Generated Data

Title

Untitled (unemployed trappers, Plaquemines Parish, Louisiana)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2581

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Photography 100
Face 100
Head 100
Portrait 100
Person 99.5
Adult 99.5
Male 99.5
Man 99.5
Person 99.5
Adult 99.5
Male 99.5
Man 99.5
Clothing 99.2
Coat 99.2
Jacket 95.6
Body Part 57.5
Hand 57.5
Outdoors 55.3

Clarifai
created on 2018-05-10

people 99.9
adult 98.9
portrait 98.4
man 98.2
monochrome 97.8
two 96.9
one 96.3
wear 95.8
street 93.7
coat 92
outerwear 91.8
group 86.7
administration 84.7
veil 80.2
outfit 79.6
three 79.1
group together 78.3
leader 77
black and white 76.8
woman 76.1

Imagga
created on 2023-10-07

man 53.8
male 46.9
person 43.7
people 36.3
businessman 36.2
business 30.4
suit 29.9
office 29.5
senior 28.1
adult 27.1
corporate 26.6
professional 25.8
portrait 25.3
smiling 24.6
face 24.2
executive 23.6
mature 22.3
elderly 22
grandfather 21.9
old 21.6
men 21.5
happy 20.7
work 19.6
handsome 19.6
sitting 17.2
job 16.8
smile 16.4
communication 16
computer 15.3
worker 15.2
tie 15.2
working 15
glasses 14.8
retired 14.6
looking 14.4
together 14
couple 13.9
expression 13.7
retirement 13.5
scholar 13.1
manager 13
lifestyle 13
success 12.9
casual 12.7
aged 12.7
busy 12.5
businesspeople 12.3
meeting 12.3
pensioner 12.1
laptop 12.1
confident 11.8
head 11.8
bow tie 11.7
confidence 11.5
serious 11.4
intellectual 11.4
black 11.4
table 11.3
occupation 11
gray 10.8
older 10.7
indoors 10.5
talking 10.5
clothing 10.3
jacket 10.3
one 9.7
group 9.7
look 9.6
hair 9.5
happiness 9.4
guy 9.4
two 9.3
teamwork 9.3
human 9
team 9
cheerful 8.9
world 8.8
day 8.6
desk 8.5
attractive 8.4
hand 8.4
grandma 8.1
necktie 8
building 8
planner 8
women 7.9
love 7.9
employee 7.8
partner 7.7
corporation 7.7
telephone 7.7
profession 7.7
boss 7.7
formal 7.6
age 7.6
mobile 7.5
horizontal 7.5
garment 7.5
beard 7.4
phone 7.4
inside 7.4
active 7.3
alone 7.3
indoor 7.3
businesswoman 7.3
home 7.2
spectator 7.1
family 7.1
father 7.1
modern 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 99.8
man 99.3
black 80.8
old 79.4
white 69.8
posing 57.1
older 48.6

Face analysis

AWS Rekognition

Age 24-34
Gender Male, 100%
Calm 95.7%
Surprised 6.4%
Fear 6%
Confused 2.3%
Sad 2.2%
Angry 0.6%
Disgusted 0.4%
Happy 0.2%

AWS Rekognition

Age 25-35
Gender Male, 99.7%
Calm 98.1%
Surprised 6.4%
Fear 5.9%
Sad 2.4%
Confused 0.2%
Angry 0.2%
Happy 0.1%
Disgusted 0.1%

Microsoft Cognitive Services

Age 30
Gender Male

Microsoft Cognitive Services

Age 38
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Adult 99.5%
Male 99.5%
Man 99.5%

Categories

Imagga

paintings art 99.4%

Captions