Human Generated Data

Title

Untitled (Calumet, Pennsylvania)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1299

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 100
Dress 100
Architecture 100
Building 100
House 100
Housing 100
Porch 100
Face 99.9
Head 99.9
Photography 99.9
Portrait 99.9
Person 99.4
Person 99.1
Child 99.1
Female 99.1
Girl 99.1
Person 98.1
Wood 91
Deck 89.5
Footwear 80.5
Shoe 80.5
Door 72.1
Shorts 71.1
Skirt 63.1
Formal Wear 61.2
Lady 57.6
Outdoors 57.4
Hat 56.6
Pants 56.2
Fashion 56.1
Gown 56.1
Staircase 56.1
People 55.7
Home Decor 55.7
Linen 55.7
Blouse 55.7
Coat 55.6
Accessories 55.4
Bag 55.4
Handbag 55.4
Sandal 55.4
Art 55.2
Painting 55.2
Furniture 55.2

Clarifai
created on 2018-05-11

people 100
child 99.3
two 99
three 97.2
adult 97.2
group 96.9
actress 95.2
sibling 94.8
offspring 94.7
group together 94.3
woman 94.3
administration 93.7
wear 92.3
family 90.7
several 90.6
four 89.9
home 88.1
actor 87.7
step 87.1
five 86.2

Imagga
created on 2023-10-06

kin 23.6
people 21.2
statue 19.8
person 19.7
man 17.5
old 17.4
male 16.4
sculpture 15.4
adult 15.2
portrait 14.2
city 14.1
child 13.7
crutch 13.2
monument 13.1
world 13
dress 12.7
face 12.1
travel 12
stone 11.8
couple 11.3
ancient 11.2
staff 11.1
culture 11.1
women 11.1
historic 11
religion 10.8
human 10.5
art 10.4
mother 10.4
men 10.3
street 10.1
vintage 9.9
tourism 9.9
fashion 9.8
family 9.8
outdoors 9.7
love 9.5
architecture 9.4
traditional 9.2
history 8.9
lady 8.9
religious 8.4
park 8.2
landmark 8.1
looking 8
posing 8
happiness 7.8
standing 7.8
pretty 7.7
bride 7.7
stick 7.6
two 7.6
pedestrian 7.6
walking 7.6
historical 7.5
groom 7.4
black 7.2
building 7.1
day 7.1
parent 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 99.6
outdoor 97.1
posing 41.2
step 26.2

Color Analysis

Face analysis

AWS Rekognition

Age 31-41
Gender Male, 90.6%
Disgusted 87.3%
Angry 6.9%
Surprised 6.7%
Fear 6.1%
Sad 2.4%
Calm 1.6%
Confused 1.5%
Happy 0.8%

AWS Rekognition

Age 54-64
Gender Female, 100%
Happy 96%
Surprised 6.5%
Fear 6%
Sad 2.3%
Confused 0.8%
Disgusted 0.6%
Angry 0.6%
Calm 0.5%

AWS Rekognition

Age 6-12
Gender Female, 99.8%
Happy 98.6%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0.5%
Calm 0.4%
Disgusted 0.1%
Angry 0%

Microsoft Cognitive Services

Age 31
Gender Male

Microsoft Cognitive Services

Age 44
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Child 99.1%
Female 99.1%
Girl 99.1%

Categories

Text analysis

Amazon

APH