Human Generated Data

Title

Untitled (New York Public Library, Forty-second Street and Fifth Avenue, New York City)

Date

1932-1935

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4252

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Clothing 99.8
Apparel 99.8
Person 98.9
Human 98.9
Person 98.4
Person 98.1
Person 96.7
Person 96.5
Coat 91.7
Tarmac 87.5
Asphalt 87.5
Road 86.1
Overcoat 85.3
Hat 70.8
Zebra Crossing 59.4
Advertisement 56.5
Text 56.5
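
The label/confidence pairs above are the kind of output produced by Amazon Rekognition's DetectLabels operation. The following is a minimal sketch of how such a list might be generated with boto3; the image file name and the confidence threshold are assumptions, not values taken from this record.

# Minimal sketch: label/confidence pairs like the list above, via
# Amazon Rekognition DetectLabels (boto3). File name and threshold
# are assumed, not taken from this record.
import boto3

def detect_labels(image_path: str, min_confidence: float = 55.0):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=min_confidence,
        )
    # Each label carries a name and a confidence score in percent,
    # matching entries such as "Clothing 99.8" above.
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

if __name__ == "__main__":
    for name, confidence in detect_labels("shahn_nypl.jpg"):  # hypothetical file
        print(f"{name} {confidence:.1f}")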

Clarifai
created on 2023-10-25

people 99.8
lid 99.6
wear 97.5
portrait 97.4
veil 97.2
two 97.1
man 96.5
three 96.3
adult 95.4
train 95.1
group 94.7
four 92.9
cowboy hat 90.3
one 89.3
administration 89
fedora 88.4
woman 87.9
railway 87
retro 86.6
group together 86.6

Imagga
created on 2022-01-08

man 32.9
male 27.7
person 23.9
people 22.9
black 21.8
business 20
bow tie 17.9
portrait 17.5
face 17
hat 16.7
clothing 15.7
adult 15.5
necktie 15.5
suit 15.4
businessman 15
office 13.8
one 12.7
old 11.8
scholar 11.5
couple 11.3
men 11.2
computer 10.9
silhouette 10.8
looking 10.4
corporate 10.3
work 10.2
happy 10
hand 9.9
working 9.7
device 9.5
executive 9.4
headdress 9.3
professional 9.3
head 9.2
musical instrument 9.2
intellectual 9.2
alone 9.1
human 9
world 9
garment 8.9
product 8.9
job 8.8
newspaper 8.7
hands 8.7
love 8.7
laptop 8.7
smile 8.6
tie 8.5
vintage 8.3
holding 8.3
indoor 8.2
cowboy hat 8
handsome 8
television 7.9
happiness 7.8
creation 7.8
washboard 7.7
sitting 7.7
attractive 7.7
guy 7.7
serious 7.6
keyboard 7.5
web site 7.4
covering 7.1
uniform 7.1
conceptual 7.1

Google
created on 2022-01-08

Outerwear 95.2
Photograph 94.2
Hat 93.2
Coat 92.2
Fedora 89.8
Sleeve 87.2
Gesture 84.8
Sun hat 84.7
Font 81.4
Blazer 76.9
Snapshot 74.3
Collar 73.7
Suit 73.5
Vintage clothing 71.3
Trench coat 63.8
Overcoat 61.2
White-collar worker 60.5
Cowboy hat 58.1
Room 57.1
Formal wear 55.3
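
The Google tags above match the shape of the Cloud Vision API's label detection results. Below is a minimal sketch using the google-cloud-vision client (version 2 or later); the image path is an assumption, and scores, which the API returns in the 0-1 range, are scaled to percent to match the list.

# Minimal sketch: label detection with the Google Cloud Vision client.
# The image path is an assumption; scores are scaled from 0-1 to percent.
from google.cloud import vision

def label_image(image_path: str):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    return [(label.description, label.score * 100) for label in response.label_annotations]

if __name__ == "__main__":
    for description, score in label_image("shahn_nypl.jpg"):  # hypothetical file
        print(f"{description} {score:.1f}")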

Microsoft
created on 2022-01-08

person 98.9
text 98.8
clothing 97.8
hat 93.9
man 93.2
fedora 85.3
fashion accessory 82.2
coat 81.3
cowboy hat 63.4
suit 61.3
old 47.1

Face analysis

AWS Rekognition

Age 21-29
Gender Female, 93.7%
Calm 98.7%
Confused 0.4%
Surprised 0.3%
Sad 0.2%
Fear 0.2%
Angry 0.1%
Disgusted 0.1%
Happy 0%

AWS Rekognition

Age 20-28
Gender Male, 98.3%
Sad 40.7%
Calm 37.4%
Surprised 8.6%
Angry 6.6%
Confused 2.4%
Fear 2.3%
Disgusted 1.4%
Happy 0.6%

AWS Rekognition

Age 29-39
Gender Male, 93.7%
Calm 36.1%
Happy 21%
Sad 10.9%
Surprised 9.3%
Angry 9%
Confused 6.7%
Fear 5.5%
Disgusted 1.6%

AWS Rekognition

Age 36-44
Gender Male, 100%
Calm 75.1%
Sad 9.3%
Angry 7%
Confused 6.4%
Fear 0.8%
Surprised 0.7%
Disgusted 0.5%
Happy 0.2%

AWS Rekognition

Age 24-34
Gender Male, 99%
Calm 99.9%
Angry 0%
Surprised 0%
Confused 0%
Happy 0%
Sad 0%
Disgusted 0%
Fear 0%
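
The age range, gender, and emotion percentages in the AWS Rekognition blocks above correspond to the FaceDetails returned by the DetectFaces operation when all attributes are requested. The following is a minimal sketch with boto3; the image path is an assumption.

# Minimal sketch: face attributes (age range, gender, emotions) via
# Amazon Rekognition DetectFaces (boto3). The image path is assumed.
import boto3

def detect_faces(image_path: str):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, and emotions
        )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions arrive unordered; sort by confidence as in the blocks above.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")

if __name__ == "__main__":
    detect_faces("shahn_nypl.jpg")  # hypothetical file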

Microsoft Cognitive Services

Age 33
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
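
The Surprise/Anger/Sorrow/Joy/Headwear/Blurred ratings above follow the likelihood scale reported by Google Cloud Vision face detection. Below is a minimal sketch with the google-cloud-vision client; the image path is an assumption.

# Minimal sketch: per-face likelihood ratings (joy, sorrow, anger, surprise,
# headwear, blur) via Google Cloud Vision face detection. Image path assumed.
from google.cloud import vision

LIKELIHOODS = ("Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely")

def face_likelihoods(image_path: str):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Likelihood enums are small integers; map them to the labels used above.
        print("Surprise", LIKELIHOODS[face.surprise_likelihood])
        print("Anger", LIKELIHOODS[face.anger_likelihood])
        print("Sorrow", LIKELIHOODS[face.sorrow_likelihood])
        print("Joy", LIKELIHOODS[face.joy_likelihood])
        print("Headwear", LIKELIHOODS[face.headwear_likelihood])
        print("Blurred", LIKELIHOODS[face.blurred_likelihood])

if __name__ == "__main__":
    face_likelihoods("shahn_nypl.jpg")  # hypothetical file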

Feature analysis

Amazon

Person 98.9%

Categories

Imagga

paintings art 99.3%

Text analysis

Amazon

BNCHEOWAIC
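
The string above is raw OCR output of the kind returned by Amazon Rekognition's DetectText operation, likely a partial misread of lettering visible in the photograph. The following is a minimal sketch with boto3; the image path is an assumption.

# Minimal sketch: extracting detected text lines via Amazon Rekognition
# DetectText (boto3). The image path is assumed; the operation can return
# misreads such as the string above when the source lettering is unclear.
import boto3

def detect_text(image_path: str):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})
    # Keep only LINE-level detections; WORD-level entries repeat the same text.
    return [d["DetectedText"] for d in response["TextDetections"] if d["Type"] == "LINE"]

if __name__ == "__main__":
    for line in detect_text("shahn_nypl.jpg"):  # hypothetical file
        print(line)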