Human Generated Data

Title

Untitled (And a good '77 for you)

Date

1976

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5113

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Human 98.8
Person 98.8
Person 94.8
Art 91.8
Graffiti 91.6
Mural 85.5
Painting 85.5
Person 82.1
Person 78.9
Wall 68.9
Animal 63.5
Bird 63.5
Drawing 60.4
Transportation 55.4
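
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition label detection. The snippet below is a minimal, hypothetical sketch of such a call with boto3; the file name "photo.jpg" and the thresholds are illustrative assumptions, not the museum's actual pipeline.

# Hypothetical sketch: produce label/confidence pairs like the Amazon tags above.
import boto3

rekognition = boto3.client("rekognition")   # assumes AWS credentials are configured

with open("photo.jpg", "rb") as f:          # illustrative local copy of the image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,          # cap on the number of labels returned
    MinConfidence=50.0,    # drop low-confidence labels
)

# Print "Label confidence" pairs in the same style as the listing above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')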

Clarifai
created on 2019-11-15

vehicle 99.5
people 99.2
transportation system 98.8
train 97.9
group 96.8
group together 96.5
railway 96.4
adult 95.2
war 94.7
two 93.2
man 93.2
many 89.4
calamity 89.3
three 88.9
military 86.9
engine 86.6
wreckage 86.2
no person 85.1
accident 84.7
skirmish 82.7

Imagga
created on 2019-11-15

billboard 46.4
signboard 37.5
car 37
structure 36.9
freight car 36.5
wheeled vehicle 31.8
conveyance 25.4
vehicle 25.2
transportation 17
city 15.8
graffito 15.5
travel 14.1
tramway 13.4
urban 13.1
building 12.7
old 12.5
snow 11.9
winter 11.9
transport 11.9
railway 11.8
sky 11.5
black 11.4
decoration 11.2
train 10.6
landscape 10.4
construction 10.3
industry 10.2
grunge 10.2
street 10.1
silhouette 9.9
tower 9.8
station 9.7
dark 9.2
industrial 9.1
dirty 9
road 9
trees 8.9
scene 8.7
line 8.6
track 8.5
tree 8.5
outdoor 8.4
vintage 8.3
park 8.2
light 8
railroad 7.9
rail 7.9
forest 7.8
architecture 7.8
high 7.8
season 7.8
cold 7.7
sign 7.5
frame 7.5
electric 7.5
business 7.3
steel 7.2

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

drawing 98.2
text 97.7
black and white 86.3
sketch 75.5
tree 69.1
person 64.2
cartoon 54.7

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-37
Gender Female, 54.8%
Fear 45%
Sad 45%
Disgusted 45%
Happy 54.8%
Angry 45%
Calm 45.1%
Surprised 45%
Confused 45%

AWS Rekognition

Age 38-56
Gender Male, 53.6%
Calm 46%
Confused 45.4%
Happy 45.2%
Surprised 45.2%
Sad 52.4%
Fear 45.7%
Angry 45.2%
Disgusted 45.1%

AWS Rekognition

Age 3-11
Gender Female, 54.2%
Fear 45.3%
Disgusted 45%
Calm 49.6%
Angry 45.1%
Sad 45.1%
Surprised 49.3%
Happy 45.4%
Confused 45.1%
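
The age range, gender, and emotion figures above follow the shape of AWS Rekognition face detection output. A minimal sketch, assuming the same illustrative image file as above (not the museum's actual workflow):

# Hypothetical sketch: retrieve per-face age range, gender, and emotion scores.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:   # illustrative file name, not the actual asset
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],              # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]                   # {"Low": ..., "High": ...}
    gender = face["Gender"]                  # {"Value": ..., "Confidence": ...}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:         # one confidence score per emotion type
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')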

Feature analysis

Amazon

Person 98.8%
Bird 63.5%

Categories

Imagga

cars vehicles 98.3%

Captions

Text analysis

Amazon

An
thID
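
Detected strings such as those above can be extracted with AWS Rekognition text detection. A minimal, hypothetical sketch; the file name is an illustrative assumption:

# Hypothetical sketch: list text detected in the image.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:   # illustrative local copy of the image
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# LINE entries group whole lines of text; WORD entries are individual tokens.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])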