Human Generated Data

Title

Untitled (Washington Square, New York City)

Date

1932-1934

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.914

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2022-06-25

Wheel 99.2
Machine 99.2
Person 99.1
Human 99.1
Person 97.8
Person 97.7
Person 97.3
Tire 89.1
Clothing 88.8
Apparel 88.8
Overcoat 86.9
Coat 86.9
Suit 84.5
Spoke 82.3
Car Wheel 75.9
Alloy Wheel 70.6
People 69.5
Transportation 68.5
Vehicle 67.3
Car 59.3
Automobile 59.3
Photography 58.8
Photo 58.8
Military Uniform 58
Military 58
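
The label/confidence pairs above are the kind of output Amazon Rekognition's DetectLabels operation returns. A minimal sketch of such a call with boto3; the bucket and file names are placeholders, not values from this record:

```python
import boto3

# Assumes AWS credentials are configured; the S3 location is a hypothetical placeholder.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "washington-square.jpg"}},
    MaxLabels=50,
    MinConfidence=50,
)

# Each label carries a name and a confidence score, as in the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```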

Clarifai
created on 2023-10-29

people 100
street 99.3
adult 99
two 98
group together 97.9
man 97.9
vehicle 96.9
woman 95.6
transportation system 94.6
monochrome 94.2
group 93.8
administration 92.4
three 91.9
several 89.3
wear 89
four 89
child 88.8
five 84.4
many 83.8
one 83.1
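
Clarifai's general-recognition model is typically queried over REST. A sketch assuming the classic v2 endpoint shape; the access token and image URL are placeholders:

```python
import requests

# Hypothetical personal access token and image URL; assumes Clarifai's
# classic v2 REST shape for the public general-image-recognition model.
headers = {"Authorization": "Key YOUR_CLARIFAI_PAT"}
payload = {"inputs": [{"data": {"image": {"url": "https://example.org/image.jpg"}}}]}

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers=headers,
    json=payload,
)

# Concepts come back with names and 0-1 values; scaled by 100 they read
# like the percentages above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))
```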

Imagga
created on 2022-06-25

man 21.5
city 19.9
world 19.1
people 16.7
urban 16.6
person 15.9
old 15.3
male 14.9
street 14.7
life 13
cemetery 12.9
stone 12.8
instrument 12.5
black 12.1
building 11.6
outdoors 11.2
device 10.4
standing 10.4
wall 10.4
adult 9.9
portrait 9.7
couple 9.6
guillotine 9.5
architecture 9.5
walking 9.5
sitting 9.4
travel 9.1
statue 8.9
sculpture 8.7
scene 8.6
tourism 8.2
park 8.2
time 8.2
dress 8.1
instrument of execution 7.8
art 7.8
sad 7.7
outdoor 7.6
light 7.3
instrument of punishment 7.3
lady 7.3
business 7.3
lifestyle 7.2
religion 7.2
love 7.1
groom 7
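
Imagga offers a comparable tagging endpoint. A sketch assuming its v2 /tags API; the key, secret, and image URL are placeholders:

```python
import requests

# Hypothetical credentials; assumes Imagga's v2 /tags endpoint, which
# authenticates with an API key/secret pair over HTTP basic auth.
auth = ("YOUR_API_KEY", "YOUR_API_SECRET")
params = {"image_url": "https://example.org/image.jpg"}

resp = requests.get("https://api.imagga.com/v2/tags", auth=auth, params=params)

# Tags arrive with English names and 0-100 confidences like those above.
for tag in resp.json()["result"]["tags"]:
    print(tag["tag"]["en"], round(tag["confidence"], 1))
```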

Google
created on 2022-06-25

Microsoft
created on 2022-06-25

outdoor 99.9
person 99.7
clothing 91.8
text 91.5
black and white 84.4
street 80
sidewalk 70
man 65.1
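
Tags like these can be requested from Azure Computer Vision's Analyze API. A sketch assuming the v3.2 REST endpoint; the resource name, key, and image URL are placeholders:

```python
import requests

# Hypothetical Azure resource and key; assumes the Computer Vision v3.2 Analyze API.
endpoint = "https://YOUR-RESOURCE.cognitiveservices.azure.com/vision/v3.2/analyze"
headers = {"Ocp-Apim-Subscription-Key": "YOUR_KEY"}
params = {"visualFeatures": "Tags"}
body = {"url": "https://example.org/image.jpg"}

resp = requests.post(endpoint, headers=headers, params=params, json=body)

# Confidences are 0-1 floats; scaled by 100 they match the list above.
for tag in resp.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))
```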

Color analysis

Face analysis

AWS Rekognition

Age 19-27
Gender Male, 98.4%
Calm 59.2%
Confused 15.4%
Angry 11.2%
Surprised 8.4%
Fear 6.4%
Sad 5.8%
Disgusted 1.5%
Happy 0.4%

AWS Rekognition

Age 35-43
Gender Male, 99.4%
Calm 73%
Fear 7.6%
Surprised 7%
Angry 6.7%
Disgusted 5.8%
Confused 4%
Sad 3.6%
Happy 1.4%

AWS Rekognition

Age 39-47
Gender Male, 99.9%
Happy 73.9%
Calm 16.3%
Surprised 8.5%
Fear 6.3%
Sad 2.4%
Angry 1.7%
Confused 1.5%
Disgusted 0.9%

AWS Rekognition

Age 24-34
Gender Female, 67.6%
Calm 99.7%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Happy 0.1%
Confused 0%
Disgusted 0%
Angry 0%

AWS Rekognition

Age 28-38
Gender Female, 92.6%
Sad 98.3%
Angry 23.4%
Calm 10.8%
Surprised 6.8%
Fear 6.3%
Disgusted 4.7%
Happy 2.3%
Confused 1.7%
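
The five age/gender/emotion blocks above match the shape of Rekognition's DetectFaces output when all attributes are requested. A minimal boto3 sketch; the S3 location is a placeholder:

```python
import boto3

# Assumes configured AWS credentials; the S3 location is hypothetical.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "washington-square.jpg"}},
    Attributes=["ALL"],  # age range, gender, and emotions, not just bounding boxes
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions are scored independently, which is why the percentages in a
    # block above need not sum to 100.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```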

Microsoft Cognitive Services

Age 39
Gender Female

Microsoft Cognitive Services

Age 30
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely
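
The ratings above (Very unlikely through Very likely) reflect how Google Vision reports face attributes: as likelihood enums rather than percentages. A sketch using the google-cloud-vision client; the local file name is a placeholder:

```python
from google.cloud import vision

# Assumes Google Cloud credentials are configured; the file name is hypothetical.
client = vision.ImageAnnotatorClient()
with open("washington-square.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face reports likelihood enums (VERY_UNLIKELY ... VERY_LIKELY),
# matching the five blocks above.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```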

Feature analysis

Amazon

Wheel 99.2%
Person 99.1%
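
Instance-level features such as Wheel and Person come from the same DetectLabels response used for the tags above: a label may carry an Instances list, each entry holding its own bounding box and confidence. A self-contained sketch with placeholder S3 names:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")
response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "washington-square.jpg"}},
    MinConfidence=50,
)

# Only some labels (typically objects, not scenes) include instances.
for label in response["Labels"]:
    for instance in label.get("Instances", []):
        print(label["Name"], f'{instance["Confidence"]:.1f}%', instance["BoundingBox"])
```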

Text analysis

Amazon

BUSES
STAUT
N° 2
2
AVE
STAUT HERE
HERE
FOR
"
" - A7
CREAM
- A7

Google

Nº 2 7 AVE BUSES START HERE
2
7
AVE
BUSES
START
HERE
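
Both text blocks are OCR output: Amazon's from Rekognition DetectText, Google's from Vision text detection; note the two engines disagree on the sign's wording ("STAUT" vs. "START"). A sketch of the Rekognition call with a placeholder S3 location:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "washington-square.jpg"}}
)

# LINE detections give whole phrases; WORD detections give the individual
# tokens listed above, each with its own confidence.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], f'{detection["Confidence"]:.1f}%')
```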