Human Generated Data

Title

Untitled (New York City)

Date

1932-1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3157

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Adult 99.7%
Female 99.7%
Person 99.7%
Woman 99.7%
Face 90.4%
Head 90.4%
People 89%
Fire Hydrant 87.5%
Hydrant 87.5%
Bag 73.3%
Box 69.9%
Shop 57.9%
Market 57.8%
Body Part 57.8%
Finger 57.8%
Hand 57.8%
Bazaar 56.1%
Photography 55.6%
Portrait 55.6%
Clothing 55.4%
Coat 55.4%
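
These labels come from Amazon Rekognition. As a minimal sketch, assuming boto3 with configured AWS credentials and a hypothetical local copy of the photograph (the museum's actual pipeline is not documented here), label/confidence pairs in this form can be retrieved like so:

```python
import boto3

client = boto3.client("rekognition")

# Hypothetical filename for a local copy of the photograph.
with open("untitled_new_york_city.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,
    MinConfidence=55,  # the lowest score listed above is about 55%
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}%")
```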

Clarifai
created on 2018-05-10

people 99.9%
one 99.8%
adult 99.6%
woman 98.2%
portrait 97.6%
two 97.4%
wear 97.2%
veil 95.7%
man 94.9%
recreation 94.3%
watercraft 92.3%
monochrome 91.5%
transportation system 88.8%
group 86.9%
three 86.5%
water 86.3%
child 86.1%
street 85.7%
vehicle 84.7%
beach 83.6%
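
Clarifai concepts like these can be fetched over its v2 REST API. A hedged sketch using the requests library; the model alias, API key, and image URL are placeholder assumptions:

```python
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"    # placeholder credential
MODEL = "general-image-recognition"  # assumed alias of the general model

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
resp.raise_for_status()

# Concept values are probabilities in [0, 1]; scale to match the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}%")
```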

Imagga
created on 2023-10-05

seller 36.6%
person 27.1%
silhouette 22.3%
man 22.2%
scholar 21.5%
sunset 19.8%
outdoors 18.1%
water 18%
people 17.3%
intellectual 17.2%
male 17.1%
black 16.7%
musical instrument 15.9%
ocean 14.1%
adult 13.7%
beach 13.5%
lifestyle 12.3%
outdoor 12.2%
sky 12.1%
relax 11.8%
sea 11.7%
sun 11.3%
sitting 11.2%
outside 11.1%
calm 11%
travel 10.6%
fashion 10.6%
youth 10.2%
laptop 10%
park 9.9%
business 9.7%
portrait 9.7%
computer 9.6%
relaxation 9.2%
alone 9.1%
attractive 9.1%
old 9.1%
one 9%
landscape 8.9%
women 8.7%
day 8.6%
shore 8.4%
hand 8.4%
dark 8.3%
leisure 8.3%
holding 8.3%
technology 8.2%
lady 8.1%
suit 8.1%
love 7.9%
sand 7.9%
work 7.8%
happiness 7.8%
standing 7.8%
newspaper 7.8%
world 7.7%
wind instrument 7.5%
tourism 7.4%
boat 7.4%
vacation 7.4%
street 7.4%
peaceful 7.3%
horizon 7.2%
body 7.2%
fisherman 7.2%
coast 7.2%
mask 7.1%
summer 7.1%
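
Imagga exposes tagging through a v2 REST endpoint with HTTP basic auth. A minimal sketch; the credentials and image URL are placeholders:

```python
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # placeholder credentials
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}%")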

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 90.2%
black 70%
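
These resemble tags from Azure's Computer Vision analyze endpoint. A hedged sketch; the resource endpoint, key, API version, and image URL are all placeholder assumptions:

```python
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.com/photo.jpg"},
)
resp.raise_for_status()

# Confidences come back in [0, 1]; scale to match the list above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}%")
```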

Face analysis

AWS Rekognition

Age 49-57
Gender Female, 98.2%
Calm 94%
Surprised 6.3%
Fear 5.9%
Happy 4.4%
Sad 2.2%
Confused 0.6%
Angry 0.3%
Disgusted 0.2%
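
Age range, gender, and emotion scores in this form come from Rekognition's DetectFaces when all attributes are requested. A minimal sketch, again assuming boto3 credentials and a hypothetical local image file:

```python
import boto3

client = boto3.client("rekognition")

with open("untitled_new_york_city.jpg", "rb") as f:  # hypothetical path
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # the default returns only a minimal attribute set
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions arrive unsorted; sort by confidence to match the list above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```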

Microsoft Cognitive Services

Age 58
Gender Female
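
A hedged sketch of a 2018-era Azure Face API call requesting age and gender; the endpoint and key are placeholders, and Microsoft has since restricted access to these attributes:

```python
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder

resp = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.com/photo.jpg"},
)
resp.raise_for_status()

for face in resp.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}")
    print(f"Gender {attrs['gender'].title()}")
```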

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Possible
Blurred Very unlikely
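
Ratings such as "Very unlikely" correspond to the Likelihood enum returned by Google Cloud Vision face detection. A minimal sketch, assuming the google-cloud-vision client with configured credentials and a hypothetical local image file:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("untitled_new_york_city.jpg", "rb") as f:  # hypothetical path
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```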

Feature analysis

Amazon

Adult 99.7%
Female 99.7%
Person 99.7%
Woman 99.7%
Fire Hydrant 87.5%

Text analysis

Amazon

247
Rerach
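
The detected strings are consistent with Rekognition's DetectText (OCR) output; "Rerach" is simply what the model read from the photograph. A minimal sketch under the same boto3 assumptions as above:

```python
import boto3

client = boto3.client("rekognition")

with open("untitled_new_york_city.jpg", "rb") as f:  # hypothetical path
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# LINE detections aggregate WORD detections; print line-level reads.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```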