Human Generated Data

Title

Untitled (New York City)

Date

1932-1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2935

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 100
Furniture 99.9
City 99.9
Boy 99.4
Child 99.4
Male 99.4
Person 99.4
Person 99.3
Person 98.8
Urban 98.6
Person 98
Person 97.9
Person 97.6
Person 94.3
Person 94
Person 92.8
Footwear 92.2
Shoe 92.2
Accessories 91.2
Bag 91.2
Handbag 91.2
Road 91.1
Street 91.1
Person 88.7
Person 86.3
Shorts 81.6
Outdoors 79.2
Face 78.6
Head 78.6
Shoe 78
Shoe 75.4
Path 69.9
Sidewalk 69.9
Person 67.3
Bench 66.4
Person 65.9
Person 60.8
Person 58.1
Back 57.9
Body Part 57.9
Coat 56.8
Neighborhood 56.5
Grass 56.4
Nature 56.4
Park 56.4
Plant 56.4
Walking 55.6
Sitting 55.2
Metropolis 55.2
Barefoot 55.2

Clarifai
created on 2018-05-10

people 99.9
group together 99.6
adult 97.4
group 95.5
many 95.2
three 94.9
two 94
man 93.5
child 92.8
several 91.5
four 90.6
athlete 87
one 85.6
administration 85.4
woman 84.9
recreation 83.6
five 83.1
competition 82.6
wear 82.2
sports equipment 82.2

Imagga
created on 2023-10-06

skateboard 81.4
wheeled vehicle 66.9
board 59.2
vehicle 52.8
conveyance 40
sport 28.2
people 21.2
person 18.9
adult 17.5
man 17.5
outdoors 17.3
male 16.3
outdoor 16.1
beach 15.2
leisure 14.1
street 13.8
exercise 13.6
sand 13.3
lifestyle 13
sports equipment 12.9
youth 12.8
fun 12.7
fitness 12.6
city 12.5
vacation 12.3
urban 12.2
men 12
one 11.9
athlete 11.6
boy 11.3
travel 11.3
outside 11.1
speed 11
summer 10.9
recreation 10.8
park 10.7
race 10.5
sea 10.2
ocean 10
road 9.9
car 9.7
body 9.6
women 9.5
sitting 9.4
legs 9.4
day 9.4
water 9.3
training 9.2
competition 9.2
pretty 9.1
active 9
activity 9
sky 8.9
happy 8.8
skate 8.8
running 8.6
cute 8.6
smile 8.5
casual 8.5
attractive 8.4
action 8.3
sports 8.3
runner 8
equipment 7.9
building 7.9
child 7.9
model 7.8
play 7.8
run 7.7
mat 7.5
sexy 7.2
black 7.2
coast 7.2
portrait 7.1
stretcher 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 98.1

Color Analysis

Face analysis

Amazon

Microsoft

AWS Rekognition

Age 31-41
Gender Female, 88.5%
Happy 59.5%
Calm 36.6%
Surprised 6.4%
Fear 6%
Sad 2.4%
Angry 1.9%
Disgusted 0.4%
Confused 0.4%

AWS Rekognition

Age 11-19
Gender Male, 94.7%
Sad 36.9%
Fear 23.8%
Calm 17.7%
Surprised 15.5%
Happy 8.3%
Confused 6.5%
Disgusted 6.1%
Angry 2.7%

AWS Rekognition

Age 37-45
Gender Male, 98.7%
Calm 90.9%
Surprised 6.3%
Fear 5.9%
Sad 4.4%
Angry 1.6%
Happy 1.4%
Confused 0.4%
Disgusted 0.2%

AWS Rekognition

Age 37-45
Gender Male, 87.6%
Happy 39.7%
Fear 14.8%
Calm 12%
Disgusted 10.8%
Surprised 10.3%
Sad 7.2%
Confused 4%
Angry 3.1%

AWS Rekognition

Age 12-20
Gender Male, 93.3%
Calm 86.2%
Surprised 6.7%
Fear 6.4%
Sad 3.8%
Angry 3.7%
Disgusted 1.7%
Happy 1.4%
Confused 0.8%

AWS Rekognition

Age 23-33
Gender Male, 76.7%
Calm 58.6%
Happy 32.2%
Surprised 7%
Fear 6.2%
Sad 3.4%
Angry 2.3%
Confused 0.8%
Disgusted 0.7%

Microsoft Cognitive Services

Age 15
Gender Female

Feature analysis

Amazon

Boy 99.4%
Child 99.4%
Male 99.4%
Person 99.4%
Shoe 92.2%
Handbag 91.2%
Shorts 81.6%
Bench 66.4%