Human Generated Data

Title

Untitled (Civil Works Administration demonstration, New York City)

Date

December 1933-March 1934

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4244

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.2
Human 99.2
Person 98.8
Military 98.6
Person 98.6
Military Uniform 98
Person 97.8
Person 97.1
Person 96.2
Person 95.6
Person 95.2
People 95.1
Army 95
Armored 95
Person 94.5
Person 93.5
Troop 91.6
Person 90.9
Person 89.8
Person 87.9
Soldier 85.9
Crowd 82.6
Person 80.6
Person 78.7
Person 77.1
Person 71.8
Person 69.2
Text 61.2
Person 60.9
Officer 56.9

Clarifai
created on 2023-10-25

people 99.9
group 99.1
many 98.4
street 97.7
wear 97
group together 96.8
man 96.7
train 96.4
adult 95.5
woman 95.4
railway 95.4
transportation system 92.7
art 89.8
child 89.5
two 88.9
several 88.4
cavalry 88.2
locomotive 87.2
war 87
crowd 84.3

Imagga
created on 2022-01-08

billboard 55.2
signboard 44.8
freight car 44.2
structure 35.3
car 34.8
wheeled vehicle 27
old 22.3
vehicle 19.7
architecture 16.4
art 16.3
black 16.2
grunge 16.2
television 16
vintage 15.7
city 15
building 13.9
shop 13.5
history 13.4
tourism 13.2
travel 12.7
retro 12.3
dirty 11.7
frame 11.6
antique 11.2
texture 11.1
film 11
historical 10.3
paint 9.9
landmark 9.9
business 9.7
landscape 9.7
barbershop 9.6
design 9.6
ancient 9.5
grungy 9.5
mercantile establishment 9.4
entertainment 9.2
rough 9.1
aged 9
broadcasting 9
transportation 9
sky 8.9
night 8.9
conveyance 8.9
web site 8.7
movie 8.7
screen 8.6
construction 8.5
culture 8.5
winter 8.5
snow 8.3
historic 8.2
transport 8.2
border 8.1
digital 8.1
symbol 8.1
graffito 8
graphic 8
scratches 7.9
urban 7.9
paper 7.8
frames 7.8
scratch 7.8
scene 7.8
animal 7.8
space 7.8
collage 7.7
edge 7.7
decoration 7.6
finance 7.6
house 7.5
equipment 7.5
destination 7.5
famous 7.4
water 7.3

Google
created on 2022-01-08

Photograph 94.3
Black 89.6
Line 81.7
Font 80.5
Display device 76.3
Rectangle 75.1
Snapshot 74.3
Electronic device 74
Crowd 71.8
Art 71.1
Suit 68.4
Advertising 68.2
Fun 67.8
Event 65.7
History 65.4
Vintage clothing 65.4
Stock photography 63.8
Visual arts 63.6
Paper product 63.5
Monochrome 61.8

Microsoft
created on 2022-01-08

text 99.9
person 96.1
clothing 94
outdoor 86
man 74.8
poster 62.2
several 11.4

Face analysis

Amazon

AWS Rekognition

Age 18-24
Gender Female, 97.4%
Calm 96.8%
Angry 1.5%
Surprised 0.8%
Confused 0.3%
Disgusted 0.2%
Sad 0.2%
Happy 0.2%
Fear 0.1%

AWS Rekognition

Age 29-39
Gender Male, 99.7%
Calm 67.9%
Angry 21.3%
Surprised 3.6%
Fear 3.3%
Happy 1.4%
Sad 1.3%
Disgusted 0.7%
Confused 0.5%

AWS Rekognition

Age 21-29
Gender Female, 93.2%
Calm 81.7%
Sad 6.8%
Surprised 3.8%
Happy 2%
Fear 2%
Confused 1.7%
Angry 1.3%
Disgusted 0.8%

AWS Rekognition

Age 12-20
Gender Female, 77.5%
Calm 68.2%
Angry 18.3%
Disgusted 7.1%
Confused 2.4%
Surprised 2.1%
Sad 0.9%
Happy 0.6%
Fear 0.4%

Feature analysis

Amazon

Person 99.2%

Text analysis

Amazon

BRU
BRI
-
DEMAND
g
WCCA.A
VAGE CER
LEITORES
120g

Google

BVNCHB OWVIIC PE BRU NAMAND NOCWA BRI
BVNCHB
OWVIIC
PE
BRU
NAMAND
NOCWA
BRI
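Note on reproducing the machine-generated data: the label, face, and text results above come from commercial vision services, and the Amazon rows resemble the output of AWS Rekognition's detect_labels, detect_faces, and detect_text operations. The Python sketch below shows how comparable confidence-scored results could be generated with boto3. It is an illustration only, not the pipeline actually used for this record; the image filename is hypothetical, and valid AWS credentials are assumed.

import boto3

client = boto3.client("rekognition")

# Hypothetical local copy of the photograph
with open("shahn_cwa_demonstration.jpg", "rb") as f:
    image_bytes = f.read()

# Object and scene labels with confidence scores (e.g. "Person 99.2")
labels = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55)
for label in labels["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))

# Face analysis: estimated age range, gender, and emotion confidences
faces = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
for face in faces["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}", face["Gender"]["Value"])
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(" ", emotion["Type"].title(), round(emotion["Confidence"], 1))

# Text detection (OCR) for signs and placards in the scene
text = client.detect_text(Image={"Bytes": image_bytes})
for detection in text["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], round(detection["Confidence"], 1))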