Human Generated Data

Title

Untitled (Circleville, Ohio)

Date

July 1938-August 1938

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.23

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Adult 99.6
Female 99.6
Person 99.6
Woman 99.6
Person 99.4
Shop 98.4
Person 96.7
Person 95.7
Clothing 95.4
Dress 95.4
Accessories 95.4
Bag 95.4
Handbag 95.4
Person 94.5
Person 88.3
Person 85.1
Face 83.7
Head 83.7
Hat 63.3
Footwear 63.2
Shoe 63.2
Person 61.1
Market 57.8
Blouse 57.7
Grocery Store 57.5
Photography 57.1
Portrait 57.1
Lady 56.9
Newsstand 56.9
Coat 56.4
Skirt 56
Fashion 55.7
Formal Wear 55.7
Gown 55.7
City 55.2
Kiosk 55.1
Indoors 55
Supermarket 55
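
The label/confidence pairs above are the kind of output returned by an image-labeling API. A minimal sketch of how such a list could be produced, assuming the Amazon tags come from the AWS Rekognition DetectLabels operation called through boto3 (the file name and the 55% confidence floor are illustrative placeholders, not values from this record):

```python
import boto3

# Assumption: labels like those above come from Rekognition's DetectLabels.
# The file name and thresholds below are placeholders, not taken from this record.
rekognition = boto3.client("rekognition")

with open("circleville_1938.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,      # cap on the number of labels returned
    MinConfidence=55,  # the lowest scores in the list above sit around 55%
)

# Print "Label Confidence" pairs, highest confidence first,
# mirroring the layout of the tag list in this record.
for label in sorted(response["Labels"], key=lambda l: l["Confidence"], reverse=True):
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```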

Clarifai
created on 2018-05-11

people 100
group 99.2
group together 98.8
adult 98.8
two 96.8
many 95.6
several 95.2
merchant 95.2
leader 95.2
administration 94.9
woman 94.5
man 94.5
three 92.5
four 91.6
military 87.9
five 87.5
wear 86.8
street 85
child 84.8
commerce 84

Imagga
created on 2023-10-05

shop 58.8
mercantile establishment 42.3
place of business 27.9
old 22.3
newspaper 21.8
tobacco shop 20
product 17.8
barbershop 17.5
religion 16.1
architecture 14.8
city 14.1
establishment 13.5
man 13.4
people 13.4
traditional 13.3
antique 13.2
creation 13.1
building 12.9
sculpture 12.6
statue 12.3
ancient 12.1
stall 12
travel 12
historic 11.9
art 11.8
vintage 11.6
case 11.5
stone 11.1
tradition 11.1
toyshop 10.8
counter 10.8
shoe shop 10.6
monument 10.3
culture 10.2
tourism 9.9
history 9.8
retro 9.8
street 9.2
supermarket 8.8
business 8.5
window 8.4
house 8.3
dirty 8.1
adult 8
interior 8
urban 7.9
black 7.8
male 7.8
men 7.7
door 7.6
temple 7.6
historical 7.5
buy 7.5
religious 7.5
tourist 7.4
new 7.3
aged 7.2
family 7.1
person 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

store 84.7
standing 79.3
case 61.3
posing 43.1
sale 8

Face analysis

AWS Rekognition

Age 56-64
Gender Female, 100%
Happy 90%
Calm 7.9%
Surprised 6.3%
Fear 5.9%
Sad 2.5%
Angry 0.3%
Disgusted 0.2%
Confused 0.2%

AWS Rekognition

Age 57-65
Gender Female, 77.9%
Calm 86.5%
Surprised 6.5%
Fear 6.3%
Sad 5.3%
Angry 2.8%
Confused 1.6%
Happy 0.5%
Disgusted 0.5%

AWS Rekognition

Age 23-31
Gender Male, 87.9%
Sad 53.4%
Calm 52.1%
Confused 13.5%
Surprised 6.4%
Fear 6.3%
Happy 3.8%
Disgusted 1%
Angry 0.4%

AWS Rekognition

Age 31-41
Gender Male, 99.9%
Calm 86.9%
Surprised 6.4%
Fear 6.1%
Sad 4.8%
Confused 4.2%
Angry 0.9%
Happy 0.7%
Disgusted 0.5%

AWS Rekognition

Age 21-29
Gender Female, 53.3%
Calm 84.4%
Sad 11%
Surprised 6.8%
Fear 6%
Angry 0.7%
Confused 0.6%
Disgusted 0.4%
Happy 0.4%
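
Each AWS Rekognition block above reports an estimated age range, a gender guess with confidence, and a ranked set of emotion scores for one detected face. A minimal sketch of how per-face results like these could be retrieved, assuming Rekognition's DetectFaces operation via boto3 (the file name is a placeholder; the response fields shown are the standard ones for Attributes=["ALL"]):

```python
import boto3

rekognition = boto3.client("rekognition")

with open("circleville_1938.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

# Attributes=["ALL"] asks for age range, gender, and emotion estimates
# in addition to the default bounding-box data.
response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive unordered; sort by confidence to mirror the ranked lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
    print()
```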

Microsoft Cognitive Services

Age 76
Gender Female

Microsoft Cognitive Services

Age 18
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.6%
Female 99.6%
Person 99.6%
Woman 99.6%
Hat 63.3%
Shoe 63.2%

Text analysis

Amazon

CARE
NOT CARE
CREAMS
NOT
BEAUTY
Each
H
IC
H Co
IO
TOILET
من
10
WOCDBURY
Co
World
من SM
ELLEREX
SM
Sample
KODER
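
The strings in the Amazon text block are raw OCR detections from signage in the photograph, including partial and misread words. A minimal sketch, assuming they come from Rekognition's DetectText operation and keeping only line-level detections (the file name is a placeholder):

```python
import boto3

rekognition = boto3.client("rekognition")

with open("circleville_1938.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# DetectText returns both LINE and WORD detections; keeping only lines
# yields a list of strings comparable to the one above. Misreads such as
# "WOCDBURY" are printed exactly as the service returns them.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```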