Human Generated Data

Title

Untitled (New York City)

Date

1932-1934

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4261

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Human 99.1
Person 99.1
Person 99
Person 98.8
Person 98.3
Person 95.9
Face 86.6
Pedestrian 86.1
Clothing 77.6
Apparel 77.6
Urban 68.9
Building 67.7
Town 67.7
City 67.7
People 66.9
Person 66
Vehicle 65.1
Transportation 65.1
Road 62.8
Street 60.8
Photo 60.6
Photography 60.6
Automobile 58.9
Car 58.9
Military Uniform 58.4
Military 58.4
Coat 57
Overcoat 57
Suit 57
Officer 55.6

Imagga
created on 2022-01-08

building 40.6
architecture 40.2
city 35.7
mortarboard 31.1
university 29.1
academic gown 26.8
cap 24
travel 21.8
clothing 21.5
gown 21.1
tourism 20.6
monument 19.6
headdress 18.9
statue 18.3
history 17.9
house 16.9
town 16.7
street 16.6
palace 16.4
outerwear 16
people 15.1
landmark 14.4
urban 14
sculpture 13.4
old 13.2
college 13
scholar 12.5
arch 12.3
tourist 12.1
covering 12
stone 12
historic 11.9
consumer goods 11.9
square 11.8
england 11.4
buildings 11.3
famous 11.2
church 11.1
structure 11
person 10.8
exterior 10.1
intellectual 10
art 9.8
column 9.7
sky 9.6
historical 9.4
residence 9.3
window 9.2
day 8.6
culture 8.5
capital 8.5
business 8.5
outdoors 8.2
road 8.1
man 8.1
tower 8.1
night 8
home 8
tourists 7.8
male 7.8
royal 7.7
castle 7.7
park 7.4
religion 7.2
river 7.1
facade 7
memorial 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

building 99.1
text 98.2
street 96.5
clothing 95
person 94.4
man 88.1
human face 79.5
black and white 77.5
monochrome 69.7
vehicle 58.3
car 52.3
city 51.7

Face analysis

AWS Rekognition

Age 58-66
Gender Male, 83.3%
Calm 73.1%
Happy 9.2%
Surprised 4.3%
Confused 3.2%
Angry 3%
Disgusted 2.8%
Fear 2.5%
Sad 1.9%

AWS Rekognition

Age 47-53
Gender Male, 100%
Calm 83%
Angry 8%
Happy 5.2%
Sad 1.8%
Confused 0.9%
Disgusted 0.5%
Surprised 0.4%
Fear 0.2%

AWS Rekognition

Age 42-50
Gender Male, 100%
Calm 99.1%
Angry 0.5%
Happy 0.1%
Disgusted 0.1%
Confused 0.1%
Surprised 0.1%
Sad 0.1%
Fear 0%

AWS Rekognition

Age 20-28
Gender Female, 72.6%
Calm 43.8%
Sad 20.4%
Surprised 14.5%
Confused 13.6%
Fear 2.5%
Happy 2.2%
Angry 1.5%
Disgusted 1.4%

AWS Rekognition

Age 24-34
Gender Female, 63.1%
Sad 63.3%
Calm 24.8%
Fear 3.5%
Angry 2.6%
Surprised 1.7%
Happy 1.5%
Disgusted 1.4%
Confused 1.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%

Captions

Microsoft

a group of people sitting at a bus stop 71.8%
a group of people walking in front of a store 71.7%
a group of people sitting in front of a building 71.6%

Text analysis

Amazon

5
AUTO
LOANS
831
25
NUTS
DASHEW
25 on.5
KOSSES
صمك
...
on.5
MARINE
٠...LWA
19747
ORBORA

Google

Fir m 831
831
Fir
m