Human Generated Data

Title

Untitled (Natchez, Mississippi)

Date

October 1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1453

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

City 99.7
Road 99.7
Street 99.7
Urban 99.7
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Adult 98.4
Male 98.4
Man 98.4
Person 98.4
Clothing 98.3
Adult 97
Male 97
Man 97
Person 97
Person 96.3
Person 95.6
Person 90.1
Sitting 86.4
Bus Stop 86.3
Outdoors 86.3
Furniture 86.2
Shorts 85.5
Footwear 81.1
Shoe 81.1
Bench 76.2
Shoe 75.1
Person 73.6
Hat 73.4
Head 73.1
Shoe 70.9
Coat 67.8
Path 67.4
Face 64.9
Shoe 64.9
Person 64.3
Door 61.1
Cap 60.8
Sidewalk 57.4
Accessories 57.2
Bag 57.2
Handbag 57.2
Chair 56.6
Formal Wear 56
Suit 56
Bicycle 56
Transportation 56
Vehicle 56
Hairdresser 55.4

Clarifai
created on 2018-05-11

people 100
group together 99.7
adult 97.9
child 97.8
group 97.8
man 97
two 96.6
uniform 96
military 94.3
many 94.1
three 92.5
four 92.3
street 91.6
wear 91.4
several 91.2
administration 89.3
soldier 88.7
war 87.2
woman 86.9
boy 86.7

Imagga
created on 2023-10-06

man 27.5
musical instrument 25.3
accordion 20.5
person 18.4
adult 18.1
people 16.7
city 16.6
keyboard instrument 16.4
wind instrument 15.8
street 14.7
male 14.6
portrait 14.2
barbershop 13.8
swing 12.3
outdoor 12.2
black 12
shop 11.9
urban 11.4
outdoors 11.2
building 11
business 10.9
sport 10.9
wall 10.3
leisure 10
suit 9.9
pretty 9.8
world 9.6
mechanical device 9.6
athlete 9.5
men 9.4
player 9.4
happy 9.4
plaything 9.1
attractive 9.1
sidewalk 9.1
dirty 9
couple 8.7
standing 8.7
mercantile establishment 8.7
dark 8.3
fashion 8.3
child 8.3
dress 8.1
businessman 7.9
women 7.9
parent 7.6
human 7.5
cricket bat 7.5
mother 7.5
one 7.5
alone 7.3
danger 7.3
lifestyle 7.2
active 7.2
hair 7.1
mechanism 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

building 99.8
outdoor 99.7
black 71.6

Color Analysis

Face analysis

Amazon

Microsoft

AWS Rekognition

Age 26-36
Gender Male, 94.2%
Calm 68.2%
Sad 21.9%
Angry 7%
Surprised 6.7%
Fear 6.2%
Confused 2.9%
Disgusted 1.3%
Happy 1.1%

AWS Rekognition

Age 28-38
Gender Male, 100%
Calm 98.2%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Happy 1.6%
Angry 0.1%
Disgusted 0%
Confused 0%

AWS Rekognition

Age 20-28
Gender Male, 72.8%
Surprised 62.3%
Happy 38.1%
Disgusted 9%
Fear 6.9%
Sad 4.5%
Angry 3.1%
Calm 2.5%
Confused 2.1%

AWS Rekognition

Age 40-48
Gender Male, 99.3%
Calm 33.1%
Surprised 22.1%
Confused 20.7%
Sad 11.6%
Happy 11%
Fear 6.4%
Disgusted 3.2%
Angry 1.2%

AWS Rekognition

Age 25-35
Gender Male, 99.9%
Angry 62.8%
Sad 24.3%
Surprised 17.6%
Fear 6%
Confused 2.2%
Calm 1.5%
Disgusted 0.5%
Happy 0.2%

AWS Rekognition

Age 24-34
Gender Female, 95.8%
Happy 60.3%
Fear 13%
Surprised 9.8%
Sad 8.9%
Confused 4%
Calm 3.3%
Angry 1.7%
Disgusted 1.6%

AWS Rekognition

Age 21-29
Gender Male, 77.1%
Happy 42.9%
Angry 23.2%
Calm 14.7%
Disgusted 7.6%
Surprised 7.2%
Fear 6.6%
Sad 4.1%
Confused 3.3%

Microsoft Cognitive Services

Age 37
Gender Male

Microsoft Cognitive Services

Age 19
Gender Male

Feature analysis

Amazon

Adult 99.3%
Male 99.3%
Man 99.3%
Person 99.3%
Shoe 81.1%
Hat 73.4%
Coat 67.8%

Categories

Text analysis

Amazon

for