Human Generated Data

Title

Untitled (Dyess Colony, Mississippi County, Arkansas)

Date

October 1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1186

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Adult 99.2
Male 99.2
Man 99.2
Person 99.2
Architecture 99.1
Building 99.1
Factory 99.1
Adult 98.9
Male 98.9
Man 98.9
Person 98.9
Adult 98
Person 98
Female 98
Woman 98
Clothing 97
Person 96.2
Baby 96.2
Face 89
Head 89
Hat 85.9
Indoors 79.8
Kitchen 79.8
Manufacturing 79.2
Cafeteria 76.7
Restaurant 76.7
Cookware 76
Pot 76
Food 64.6
Meal 64.2
Shop 62.6
Bowl 57.4
Assembly Line 57.2
Dish 56.1
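The Amazon tags above are (label, confidence) pairs. A minimal sketch of how one might filter such machine-generated tags by a minimum confidence score — the `labels` list is a subset transcribed from the values above, and the 90.0 threshold is an arbitrary assumption, not part of the record:

```python
# Hypothetical sketch: filtering confidence-scored tags like those above.
# The (label, confidence) pairs are a subset transcribed from the Amazon list.
labels = [
    ("Adult", 99.2), ("Male", 99.2), ("Man", 99.2), ("Person", 99.2),
    ("Architecture", 99.1), ("Building", 99.1), ("Factory", 99.1),
    ("Kitchen", 79.8), ("Cafeteria", 76.7), ("Food", 64.6),
    ("Bowl", 57.4), ("Dish", 56.1),
]

def confident(tags, threshold=90.0):
    """Keep only tag names at or above the confidence threshold."""
    return [name for name, score in tags if score >= threshold]

print(confident(labels))  # high-confidence subset only
```

With the default threshold, only the labels scored above 90 (e.g. "Factory") survive, while lower-confidence guesses such as "Dish" are dropped.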

Clarifai
created on 2018-05-11

people 100
group 99.7
adult 99.6
group together 99.1
several 98.7
man 98.2
woman 97.4
merchant 97.1
four 96.7
two 96.4
three 96
employee 95.5
furniture 94
room 92.5
many 92.3
bar 90.5
commerce 90.2
wear 89.3
recreation 88.4
container 88

Imagga
created on 2023-10-06

stall 53.3
restaurant 37.3
smiling 26
man 25.5
adult 24.7
food 24.6
people 24.5
kitchen 24.2
person 24.1
home 23.1
happy 21.9
male 21.3
cooking 19.2
shop 19.2
couple 19.2
lifestyle 18.8
waiter 18.5
indoors 18.4
meal 17.9
dinner 17.6
sitting 16.3
counter 15.9
smile 15.7
family 15.1
lunch 15
standing 14.8
cheerful 14.6
chef 14.5
drink 13.4
business 13.4
portrait 12.9
men 12.9
steel drum 12.9
cook 12.8
building 12.7
women 12.6
eating 12.6
senior 12.2
percussion instrument 12.1
day 11.8
dining-room attendant 11.7
30s 11.5
working 11.5
together 11.4
employee 11.3
outdoors 11.2
attractive 11.2
cafeteria 10.9
holding 10.7
drinking 10.5
pretty 10.5
table 10.5
friends 10.3
inside 10.1
color 10
preparing 9.8
interior 9.7
mid adult 9.6
work 9.5
mercantile establishment 9.4
store 9.4
musical instrument 9.1
bakery 9
seller 9
worker 8.8
looking 8.8
child 8.8
middle aged 8.8
happiness 8.6
elderly 8.6
domestic 8.6
casual 8.5
cake 8.5
coffee 8.3
occupation 8.2
20s 8.2
cup 8.1
clothing 8.1
job 8
boy 7.8
professional 7.8
outside 7.7
looking camera 7.7
profession 7.7
hotel 7.6
gourmet 7.6
enjoying 7.6
adults 7.6
horizontal 7.5
structure 7.5
wine 7.5
one 7.5
service 7.4
children 7.3
beverage 7.3
plate 7.2
cute 7.2

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 98.6
preparing 62.4
cooking 34.1

Color Analysis

Face analysis

AWS Rekognition

Age 30-40
Gender Female, 100%
Calm 92.7%
Surprised 7.6%
Fear 6%
Sad 2.4%
Happy 1.4%
Angry 1.1%
Confused 0.7%
Disgusted 0.5%

AWS Rekognition

Age 28-38
Gender Male, 62.3%
Sad 100%
Surprised 6.3%
Fear 5.9%
Calm 0.1%
Disgusted 0.1%
Confused 0.1%
Happy 0%
Angry 0%

Microsoft Cognitive Services

Age 38
Gender Male

Microsoft Cognitive Services

Age 12
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.2%
Male 99.2%
Man 99.2%
Person 99.2%
Female 98%
Woman 98%
Baby 96.2%
Hat 85.9%

Categories