Human Generated Data

Title

Untitled (Dyess Colony, Mississippi County, Arkansas)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1190

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Clothing 100
Adult 99.5
Male 99.5
Man 99.5
Person 99.5
Adult 99.3
Person 99.3
Female 99.3
Woman 99.3
Person 98.9
Baby 98.9
Hat 92.1
Food 91.7
Meal 91.7
Face 87.6
Head 87.6
Cooking Pan 82.2
Cookware 82.2
Indoors 71.4
Kitchen 71.4
Sun Hat 71
Dish 57.6
Cafeteria 56.8
Restaurant 56.8
Pot 56.4
Stew 55.2

Clarifai
created on 2018-05-11

people 100
adult 99.2
group 98.9
group together 98.1
man 98.1
woman 97.8
several 96.6
two 96.5
wear 95.3
merchant 94.8
three 94.7
lid 94.6
four 93.8
five 87.8
child 85.7
commerce 84.1
veil 83.3
furniture 82.8
one 82.7
sit 82.3

Imagga
created on 2023-10-05

seller 52.5
man 31.6
person 27.1
male 26.4
home 26.3
kitchen 25
people 24.5
happy 23.8
smiling 22.4
adult 20.9
cooking 20.1
food 19.3
lifestyle 18.8
indoors 15.8
cook 15.6
portrait 15.5
sitting 15.5
cheerful 15.4
working 15
standing 14.8
chef 14.6
work 14.3
smile 14.2
stall 14
couple 13.9
percussion instrument 13.9
preparing 13.7
dinner 13.6
women 13.4
senior 13.1
steel drum 13.1
restaurant 13.1
men 12.9
hat 12.8
attractive 12.6
family 12.4
interior 12.4
meal 12.3
business 12.1
occupation 11.9
musical instrument 11.5
job 11.5
child 11.4
lunch 11.1
happiness 11
clothing 10.9
domestic 10.8
handsome 10.7
boy 10.4
children 10
school 10
professional 9.7
elderly 9.6
profession 9.6
education 9.5
counter 9.5
day 9.4
casual 9.3
two 9.3
eating 9.3
face 9.2
worker 9.2
student 9.2
house 9.2
one 9
together 8.8
preparation 8.6
husband 8.6
pretty 8.4
hand 8.4
color 8.3
20s 8.2
outdoors 8.2
60s 7.8
building 7.8
outside 7.7
wife 7.6
stove 7.6
mature 7.4
holding 7.4
glasses 7.4
teen 7.3
indoor 7.3
industrial 7.3
looking 7.2
mother 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 99.4
indoor 91.5
old 44.6

Face analysis

AWS Rekognition

Age 25-35
Gender Female, 92.9%
Calm 72.7%
Sad 46.3%
Surprised 6.4%
Fear 6%
Confused 0.4%
Angry 0.2%
Happy 0.2%
Disgusted 0.1%

AWS Rekognition

Age 2-10
Gender Male, 96.6%
Calm 38.3%
Confused 24.3%
Angry 14.1%
Disgusted 10.4%
Sad 7.5%
Surprised 7.3%
Fear 6.3%
Happy 0.8%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.5%
Male 99.5%
Man 99.5%
Person 99.5%
Female 99.3%
Woman 99.3%
Baby 98.9%
Hat 92.1%
