Human Generated Data

Title

Untitled (Washington Square, New York City)

Date

1932-1934

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2997

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2022-06-25

Clothing 99.5
Apparel 99.5
Person 99
Human 99
Person 98.7
Person 98.6
Wheel 98.3
Machine 98.3
Person 97.3
Face 89.6
Tie 87.3
Accessories 87.3
Accessory 87.3
Overcoat 79.1
Coat 79.1
Photographer 76.2
People 75
Photography 72
Photo 72
Hat 68.3
Portrait 67
Suit 66
Tree 65
Plant 65
Sunglasses 63.2
Cap 62.4
Female 62.2

Clarifai
created on 2023-10-29

people 99.9
two 98.7
group together 98.6
adult 98.3
man 96.6
group 96.5
street 96.3
administration 95.9
three 95.7
woman 94.9
vehicle 92.6
several 92.6
four 92.2
child 90.6
wear 89.7
many 89.2
music 88
one 85.7
veil 85.7
military 85.2

Imagga
created on 2022-06-25

city 35.7
people 26.8
street 26.7
urban 23.6
world 23.3
person 22.2
adult 21.4
man 20.2
building 19.5
architecture 16.5
travel 16.2
old 16
male 15.7
life 15
walking 14.2
business 14
tourist 13.4
black 12.7
tourism 12.4
town 12.1
women 11.9
men 11.2
garment 11.1
suit 10.8
outdoor 10.7
lifestyle 10.1
clothing 10.1
outdoors 9.8
human 9.7
looking 9.6
wall 9.5
buildings 9.5
stole 9.3
attractive 9.1
portrait 9.1
road 9
one 9
cheerful 8.9
sky 8.9
office 8.9
corporate 8.6
two 8.5
scarf 8.4
covering 8.4
holding 8.3
time 8.2
happy 8.1
device 8.1
businessman 7.9
happiness 7.8
scene 7.8
sitting 7.7
leisure 7.5
sidewalk 7.4
occupation 7.3
lady 7.3
businesswoman 7.3
work 7.2
jacket 7.2
hair 7.1
smile 7.1

Google
created on 2022-06-25

Microsoft
created on 2022-06-25

outdoor 99.8
person 98.9
black and white 97.1
clothing 94.9
street 94.2
monochrome 88.1
text 87.8
coat 84
jacket 73.2
man 63.8
way 49.4
sidewalk 28.4

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 98.9%
Calm 59.5%
Confused 24.4%
Surprised 6.7%
Fear 6.6%
Sad 5.6%
Disgusted 2.7%
Angry 2.4%
Happy 1.3%

AWS Rekognition

Age 52-60
Gender Male, 99.9%
Calm 54.1%
Happy 19.5%
Surprised 9.8%
Confused 9.2%
Angry 8.2%
Fear 6.3%
Sad 2.6%
Disgusted 0.8%

AWS Rekognition

Age 18-26
Gender Male, 95.9%
Calm 96.9%
Surprised 6.5%
Fear 5.9%
Sad 2.7%
Angry 0.7%
Confused 0.3%
Disgusted 0.1%
Happy 0%

AWS Rekognition

Age 21-29
Gender Female, 99.2%
Calm 99.6%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Happy 0%
Disgusted 0%
Confused 0%
Angry 0%

AWS Rekognition

Age 39-47
Gender Male, 84.7%
Calm 85.5%
Surprised 7.1%
Fear 6.1%
Angry 5.5%
Sad 3%
Happy 2.3%
Confused 1.5%
Disgusted 0.7%

Microsoft Cognitive Services

Age 36
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%
Wheel 98.3%
Tie 87.3%
Sunglasses 63.2%

Text analysis

Amazon

BUSES
HERE
START HERE
START
2
N° 2
no
Jestin
ICHA
LAND ICHA
LAND

Google

N° 2 5 & 7 AVE BUSES START HERE
N
°
2
5
&
7
AVE
BUSES
START
HERE