Human Generated Data

Title

Untitled (Artists' Union demonstrators, New York City)

Date

1934-1935

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4226

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Human 98.2
Person 98.2
Person 97.8
Person 87.2
Person 86.9
Home Decor 85.7
Person 84.2
Outdoors 84.2
Transportation 84
Vehicle 84
Mammal 82.8
Horse 82.8
Animal 82.8
Automobile 82.6
Wheel 80.9
Machine 80.9
Nature 80
Person 74.8
Person 71.4
Car 69.9
Person 67.2
Person 65.9
Pedestrian 62.2
Road 58.5
Sports Car 57.9
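
The Amazon label scores above (0-100 confidence values) are the kind of output returned by AWS Rekognition's DetectLabels operation. A minimal sketch of how such tags could be generated with boto3 follows; the image path, region, and confidence threshold are placeholders, not part of this record.

import boto3

# Placeholder path to a digitized copy of the photograph (not part of this record).
IMAGE_PATH = "shahn_artists_union_demonstrators.jpg"

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open(IMAGE_PATH, "rb") as image_file:
    response = rekognition.detect_labels(
        Image={"Bytes": image_file.read()},
        MinConfidence=50,
    )

# Print each label with its confidence, matching the "label score" layout above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
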

Imagga
created on 2022-01-08

old 29.2
paper 27
grunge 26.4
retro 21.3
texture 20.8
vintage 18.5
architecture 17.4
wall 15.8
design 15.7
brown 15.4
art 15.2
hole 14.7
ancient 13.8
wallpaper 13.8
aged 13.6
antique 13.2
construction 12.8
dirty 12.6
business 12.1
blank 12
note 11.9
textured 11.4
room 11.1
money 11
pattern 10.9
container 10.9
drawing 10.8
currency 10.8
frame 10.5
building 10.4
finance 10.1
house 10.1
cash 10.1
home 9.6
floor 9.3
page 9.3
wood 9.2
map 9.1
empty 9.1
modern 9.1
tile 9
material 9
card 9
wooden 8.8
bill 8.6
industry 8.5
grungy 8.5
document 8.3
city 8.3
toilet 8.2
office 8
device 8
artistic 7.8
scrapbook 7.8
worn 7.6
bathroom 7.6
sign 7.5
backdrop 7.4
style 7.4
letter 7.3
color 7.2
work 7.2
equipment 7.2
surface 7
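
The Imagga tags above are typical of Imagga's /v2/tags endpoint. A hedged sketch using the requests library is below; the API key, secret, and image path are placeholder values, and the response layout in the comments follows Imagga's documented JSON format.

import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder credentials
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder credentials
IMAGE_PATH = "shahn_artists_union_demonstrators.jpg"  # placeholder path

with open(IMAGE_PATH, "rb") as image_file:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": image_file},
    )
response.raise_for_status()

# Each entry carries a confidence score and a tag keyed by language code.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
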

Google
created on 2022-01-08

Photograph 94.3
Wheel 93.3
White 92.2
Product 90.8
Hat 89.6
Organism 85.7
Motor vehicle 85.3
Photographic film 84.7
Working animal 83.9
Font 83
Fedora 82.4
Adaptation 79.4
Tints and shades 77
Sun hat 74.8
Snapshot 74.3
Vehicle 74.2
Pack animal 72.9
Rectangle 72.1
Tire 71.9
Design 68.8
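
The Google labels above correspond to Cloud Vision label detection, which reports scores between 0 and 1 (scaled to 0-100 in this record). A minimal sketch with the google-cloud-vision client library, assuming application-default credentials and a placeholder image path:

from google.cloud import vision

IMAGE_PATH = "shahn_artists_union_demonstrators.jpg"  # placeholder path

client = vision.ImageAnnotatorClient()
with open(IMAGE_PATH, "rb") as image_file:
    image = vision.Image(content=image_file.read())

response = client.label_detection(image=image)

# Scores come back in the 0-1 range; scale to match the 0-100 values above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
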

Microsoft
created on 2022-01-08

vehicle 98.8
land vehicle 98
car 95.6
text 93.4
wheel 87.1
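
The Microsoft tags above match the output of the Azure Computer Vision Tag operation. A sketch against the v3.2 REST endpoint is below; the endpoint URL, subscription key, and image path are placeholders.

import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
SUBSCRIPTION_KEY = "YOUR_AZURE_KEY"                             # placeholder
IMAGE_PATH = "shahn_artists_union_demonstrators.jpg"            # placeholder

with open(IMAGE_PATH, "rb") as image_file:
    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/tag",
        headers={
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_file.read(),
    )
response.raise_for_status()

# Confidences are reported 0-1; scale to the 0-100 values used above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
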

Face analysis

AWS Rekognition

Age 22-30
Gender Male, 98.1%
Happy 49.5%
Calm 36%
Sad 4.4%
Confused 3%
Fear 2.9%
Angry 1.9%
Disgusted 1.4%
Surprised 0.8%

AWS Rekognition

Age 28-38
Gender Female, 66%
Calm 99.8%
Happy 0.1%
Surprised 0%
Disgusted 0%
Angry 0%
Confused 0%
Sad 0%
Fear 0%

AWS Rekognition

Age 19-27
Gender Female, 82.9%
Calm 89.5%
Sad 5%
Happy 2%
Angry 1.1%
Fear 0.9%
Confused 0.5%
Disgusted 0.5%
Surprised 0.4%

AWS Rekognition

Age 16-22
Gender Female, 99.1%
Calm 70.8%
Sad 17.9%
Disgusted 3%
Angry 2.9%
Happy 2%
Surprised 1.6%
Fear 1.2%
Confused 0.6%
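
The four AWS Rekognition face records above (age range, gender, and per-emotion confidences) have the shape of DetectFaces output when all attributes are requested. A minimal boto3 sketch, with a placeholder image path and region:

import boto3

IMAGE_PATH = "shahn_artists_union_demonstrators.jpg"  # placeholder path

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open(IMAGE_PATH, "rb") as image_file:
    response = rekognition.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive as a list of {Type, Confidence} pairs.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
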

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely
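
The Google Vision face entries above report likelihood buckets (Very unlikely through Very likely) rather than percentages, matching Cloud Vision face detection. A sketch with the google-cloud-vision client, again with a placeholder image path:

from google.cloud import vision

IMAGE_PATH = "shahn_artists_union_demonstrators.jpg"  # placeholder path

client = vision.ImageAnnotatorClient()
with open(IMAGE_PATH, "rb") as image_file:
    image = vision.Image(content=image_file.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum such as VERY_UNLIKELY or VERY_LIKELY.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
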

Feature analysis

Amazon

Person 98.2%
Horse 82.8%
Wheel 80.9%
Car 69.9%

Text analysis

Amazon

PRODUCERS
GUTTENBE
are
LOUIS GUTTENBE
LOUIS
BR
ARTISTS
Costume
A
E
Chealn
OCADO
A.S.A
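
The Amazon text fragments above (sign and banner lettering picked up from the crowd) are the kind of result DetectText returns; partial words such as "GUTTENBE" reflect OCR on a low-contrast print rather than errors in the record. A minimal boto3 sketch, with a placeholder image path and region:

import boto3

IMAGE_PATH = "shahn_artists_union_demonstrators.jpg"  # placeholder path

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open(IMAGE_PATH, "rb") as image_file:
    response = rekognition.detect_text(Image={"Bytes": image_file.read()})

# WORD-level detections roughly match the fragment list above;
# LINE-level detections group them into full banner lines.
for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}")
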

Google

ON
RICAN
are
Ghealn
estume
ON RICAN ARIST are PRODUCERS NOUIS GUTTENB Ghealn estume
PRODUCERS
NOUIS
ARIST
GUTTENB
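
The Google fragments above, including the long concatenated line, match Cloud Vision text detection, whose first annotation is the entire detected text and whose remaining annotations are individual words. A sketch with the google-cloud-vision client and a placeholder image path:

from google.cloud import vision

IMAGE_PATH = "shahn_artists_union_demonstrators.jpg"  # placeholder path

client = vision.ImageAnnotatorClient()
with open(IMAGE_PATH, "rb") as image_file:
    image = vision.Image(content=image_file.read())

response = client.text_detection(image=image)

# text_annotations[0].description is the full detected text;
# the remaining entries are the individual words.
for annotation in response.text_annotations:
    print(annotation.description)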