Human Generated Data

Title

Untitled (Omar, Scotts Run, West Virginia)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1641

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Handrail 100
Architecture 99.5
Building 99.5
House 99.5
Housing 99.5
Staircase 99.5
Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Person 99
Person 97.4
Photography 96.3
Person 95.2
Person 93.1
Person 92.5
Clothing 92.4
Coat 92.4
Face 90.5
Head 90.5
Person 87
Person 83.8
Person 81.6
Person 78.7
Portrait 77.4
Person 75.3
Railing 75.1
Person 73.8
Outdoors 70.4
Person 67.9
Boardwalk 67.9
Bridge 67.9
Person 66.5
Person 63.1
Person 60.1
Path 57.7
Knitwear 57.6
Sweater 57.6
Jacket 56.8
Art 56.8
Collage 56.8
Pants 56.1
Reading 55.7
City 55.6
Soil 55.3
Sidewalk 55.1
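
These labels match the response format of Amazon Rekognition's DetectLabels API (Rekognition is named explicitly in the face analysis section below). A minimal sketch of how comparable tags could be generated with the boto3 client follows; the file name is hypothetical and the museum's actual pipeline is not documented here.

    import boto3

    # A sketch, not the museum's pipeline: assumes configured AWS credentials
    # and a hypothetical local copy of the photograph.
    client = boto3.client("rekognition")
    with open("omar_scotts_run.jpg", "rb") as f:
        response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=55)

    # One line per label, mirroring the tag/confidence layout above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')

Repeated labels such as Person appear once per detected instance, which is why the list above contains many Person entries at different confidences.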

Clarifai
created on 2018-05-11

people 100
child 99.8
adult 98.6
two 98.4
monochrome 97.3
group 96.9
group together 96.7
step 95.2
one 95.2
street 95.1
boy 94.5
three 94.2
man 94.2
woman 93.1
offspring 90.5
four 89.7
war 89.2
wear 88.6
vehicle 88.1
son 87.3

Imagga
created on 2023-10-06

world 27.3
people 19.5
man 17.5
black 17.4
kin 16.2
city 15.8
portrait 15.5
adult 14.9
person 13.8
face 13.5
urban 13.1
male 12.7
parent 12.1
dad 12
old 11.8
child 11.8
sculpture 11.5
father 11.4
men 11.2
statue 10.5
outdoors 10.4
culture 10.3
architecture 10.1
human 9.7
mask 9.6
street 9.2
outdoor 9.2
dirty 9
building 9
women 8.7
art 8.5
head 8.4
step 8.3
mother 8.2
danger 8.2
dress 8.1
newspaper 8
hair 7.9
day 7.8
happiness 7.8
ancient 7.8
sepia 7.8
travel 7.7
sitting 7.7
fashion 7.5
traditional 7.5
monument 7.5
vintage 7.4
lady 7.3
detail 7.2
suit 7.2
religion 7.2
family 7.1
love 7.1
product 7

Google
created on 2018-05-11 (no tags recorded)

Microsoft
created on 2018-05-11

person 95.8

Face analysis

AWS Rekognition

Age 6-14
Gender Female, 85.8%
Sad 100%
Fear 6.6%
Surprised 6.3%
Calm 2%
Confused 0.1%
Disgusted 0.1%
Happy 0.1%
Angry 0.1%

AWS Rekognition

Age 18-24
Gender Female, 88.6%
Happy 38.3%
Sad 36.6%
Calm 18.2%
Disgusted 10.3%
Fear 7.7%
Surprised 7.1%
Confused 2.5%
Angry 2.1%

AWS Rekognition

Age 29-39
Gender Female, 71.1%
Surprised 98.1%
Happy 12.2%
Disgusted 6.8%
Fear 5.9%
Sad 2.7%
Calm 1%
Angry 0.5%
Confused 0.5%

AWS Rekognition

Age 18-24
Gender Female, 82.4%
Calm 97.1%
Surprised 6.3%
Fear 6.2%
Sad 2.3%
Happy 1%
Disgusted 0.2%
Angry 0.2%
Confused 0.1%

AWS Rekognition

Age 12-20
Gender Male, 77.4%
Surprised 79.9%
Calm 18.6%
Fear 14%
Sad 5.7%
Angry 4.5%
Happy 4.4%
Confused 2.6%
Disgusted 1.1%

AWS Rekognition

Age 23-31
Gender Male, 96.6%
Calm 96.6%
Surprised 6.3%
Fear 6.2%
Sad 2.5%
Happy 0.4%
Angry 0.4%
Disgusted 0.3%
Confused 0.1%

AWS Rekognition

Age 20-28
Gender Female, 53%
Calm 91.3%
Fear 6.8%
Surprised 6.5%
Sad 2.8%
Happy 2.2%
Confused 0.7%
Angry 0.6%
Disgusted 0.3%
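
The blocks above follow the shape of Amazon Rekognition's DetectFaces response: one entry per detected face, each with an estimated age range, a gender guess with confidence, and per-emotion scores. The emotion confidences are independent scores, not a distribution, so they need not sum to 100%. A minimal sketch with boto3, again with a hypothetical file name:

    import boto3

    client = boto3.client("rekognition")
    with open("omar_scotts_run.jpg", "rb") as f:  # hypothetical file name
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    # One FaceDetails entry per detected face, mirroring the blocks above.
    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotion confidences are independent per-emotion scores.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')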

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely
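
Unlike Rekognition, Google Vision reports face attributes as likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal sketch using the google-cloud-vision Python client, with a hypothetical file name:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("omar_scotts_run.jpg", "rb") as f:  # hypothetical file name
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)

    # Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY).
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)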

Feature analysis

Amazon

Adult 99.1%
Male 99.1%
Man 99.1%
Person 99.1%

Categories

Imagga

pets animals 99.5%

Text analysis

Amazon

ROCK

Google

ROCK
ROCK
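
Both services read the word "ROCK" off the photograph, presumably from signage; Google listing it twice is typical of text detection returning both a block-level and a word-level annotation for the same text. A minimal sketch of comparable OCR output via Rekognition's DetectText API, with a hypothetical file name:

    import boto3

    client = boto3.client("rekognition")
    with open("omar_scotts_run.jpg", "rb") as f:  # hypothetical file name
        response = client.detect_text(Image={"Bytes": f.read()})

    # LINE entries give whole lines; WORD entries repeat the same text word
    # by word, which is why a single sign can show up more than once.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])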