Human Generated Data

Title

Untitled (New York City)

Date

1932-1934

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4261

Machine Generated Data

Tags (confidence scores, 0-100)

Amazon
created on 2022-01-08

Person 99.1
Human 99.1
Person 99
Person 98.8
Person 98.3
Person 95.9
Face 86.6
Pedestrian 86.1
Clothing 77.6
Apparel 77.6
Urban 68.9
City 67.7
Building 67.7
Town 67.7
People 66.9
Person 66
Vehicle 65.1
Transportation 65.1
Road 62.8
Street 60.8
Photography 60.6
Photo 60.6
Car 58.9
Automobile 58.9
Military Uniform 58.4
Military 58.4
Suit 57
Coat 57
Overcoat 57
Officer 55.6
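
These labels follow the shape of AWS Rekognition's DetectLabels output. A minimal sketch of how such a list could be reproduced with boto3, assuming configured AWS credentials; the image file name and thresholds are hypothetical:

import boto3

# A sketch, not the pipeline that produced the data above.
rekognition = boto3.client("rekognition")

with open("untitled_new_york_city.jpg", "rb") as f:  # hypothetical file name
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=55.0,  # the list above bottoms out near 55
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')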

Clarifai
created on 2023-10-25

people 99.7
street 99.5
two 95.9
monochrome 95.8
adult 95.7
portrait 95.4
man 94.3
group 93.6
woman 93.4
wear 93.3
one 91.8
train 91.5
analogue 90.8
administration 88.8
wait 87.8
city 87.3
police 86.9
war 86.3
boy 85.9
transportation system 84.3
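
Clarifai's general recognition model returns concepts with probabilities like those above. A hedged sketch against its v2 REST API; the model ID, key, and image URL are assumptions, and the response shape should be checked against current Clarifai documentation:

import requests

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_API_KEY"},  # hypothetical key
    json={"inputs": [{"data": {"image": {"url": "https://example.org/shahn.jpg"}}}]},
)
resp.raise_for_status()
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai reports probabilities in 0-1; the list above scales them to 0-100.
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')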

Imagga
created on 2022-01-08

building 40.6
architecture 40.2
city 35.7
mortarboard 31.1
university 29.1
academic gown 26.8
cap 24
travel 21.8
clothing 21.5
gown 21.1
tourism 20.6
monument 19.6
headdress 18.9
statue 18.3
history 17.9
house 16.9
town 16.7
street 16.6
palace 16.4
outerwear 16
people 15.1
landmark 14.4
urban 14
sculpture 13.4
old 13.2
college 13
scholar 12.5
arch 12.3
tourist 12.1
covering 12
stone 12
historic 11.9
consumer goods 11.9
square 11.8
england 11.4
buildings 11.3
famous 11.2
church 11.1
structure 11
person 10.8
exterior 10.1
intellectual 10
art 9.8
column 9.7
sky 9.6
historical 9.4
residence 9.3
window 9.2
day 8.6
culture 8.5
capital 8.5
business 8.5
outdoors 8.2
road 8.1
man 8.1
tower 8.1
night 8
home 8
tourists 7.8
male 7.8
royal 7.7
castle 7.7
park 7.4
religion 7.2
river 7.1
facade 7
memorial 7
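
Imagga exposes this kind of tagging through a REST endpoint authenticated with an API key/secret pair. A hedged sketch from memory; the endpoint path, auth scheme, and response shape should be verified against Imagga's documentation, and the image URL and credentials are placeholders:

import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/shahn.jpg"},  # placeholder image URL
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # HTTP basic auth
)
resp.raise_for_status()
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')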

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

building 99.1
text 98.2
street 96.5
clothing 95
person 94.4
man 88.1
human face 79.5
black and white 77.5
monochrome 69.7
vehicle 58.3
car 52.3
city 51.7
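
These tags match the output of Azure's Computer Vision image-tagging operation. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK, assuming an Azure resource; the endpoint, key, and image URL are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_KEY"),              # placeholder key
)
result = client.tag_image("https://example.org/shahn.jpg")
for tag in result.tags:
    # The SDK reports confidence in 0-1; scale to match the 0-100 list above.
    print(f"{tag.name} {tag.confidence * 100:.1f}")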

Face analysis

AWS Rekognition

Age 58-66
Gender Male, 83.3%
Calm 73.1%
Happy 9.2%
Surprised 4.3%
Confused 3.2%
Angry 3%
Disgusted 2.8%
Fear 2.5%
Sad 1.9%

AWS Rekognition

Age 47-53
Gender Male, 100%
Calm 83%
Angry 8%
Happy 5.2%
Sad 1.8%
Confused 0.9%
Disgusted 0.5%
Surprised 0.4%
Fear 0.2%

AWS Rekognition

Age 42-50
Gender Male, 100%
Calm 99.1%
Angry 0.5%
Happy 0.1%
Disgusted 0.1%
Confused 0.1%
Surprised 0.1%
Sad 0.1%
Fear 0%

AWS Rekognition

Age 20-28
Gender Female, 72.6%
Calm 43.8%
Sad 20.4%
Surprised 14.5%
Confused 13.6%
Fear 2.5%
Happy 2.2%
Angry 1.5%
Disgusted 1.4%

AWS Rekognition

Age 24-34
Gender Female, 63.1%
Sad 63.3%
Calm 24.8%
Fear 3.5%
Angry 2.6%
Surprised 1.7%
Happy 1.5%
Disgusted 1.4%
Confused 1.2%
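
Each block above corresponds to one entry in the FaceDetails list that Rekognition's DetectFaces returns when all attributes are requested. A minimal sketch with boto3; the file name is hypothetical:

import boto3

rekognition = boto3.client("rekognition")
with open("untitled_new_york_city.jpg", "rb") as f:  # hypothetical file name
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    gender = face["Gender"]
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')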

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely
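
The Google Vision results above are likelihood enums rather than numeric scores. A minimal sketch with the google-cloud-vision client, assuming application default credentials; the file name is a placeholder:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("untitled_new_york_city.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each likelihood is an enum from VERY_UNLIKELY to VERY_LIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)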

Feature analysis

Amazon

Person 99.1%

Text analysis

Amazon

5
AUTO
LOANS
831
25
NUTS
DASHEW
25 on.5
KOSSES
صمك
...
on.5
MARINE
٠...LWA
19747
ORBORA
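
Strings like these come from OCR over the photograph's signage; Rekognition's DetectText returns them as LINE and WORD detections. A minimal sketch with boto3; the file name is hypothetical:

import boto3

rekognition = boto3.client("rekognition")
with open("untitled_new_york_city.jpg", "rb") as f:  # hypothetical file name
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip the per-word duplicates
        print(detection["DetectedText"])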

Google

Fir m 831
Fir
m
831
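
In Google Vision's text detection the first annotation is the full detected block ("Fir m 831"), followed by the individual words, which matches the list above. A minimal sketch; credentials and the file name are assumed:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("untitled_new_york_city.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
for annotation in response.text_annotations:
    print(annotation.description)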