Human Generated Data

Title

Untitled (Rome)

Date

1978

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5131

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Human 97.4
Person 97.4
Transportation 96.2
Vehicle 96.2
Automobile 96.2
Car 96.2
Car 95.5
Traffic Light 72.7
Light 72.7
Car 51
Person 49.9

Clarifai
created on 2019-11-15

people 98.6
monochrome 98
street 97.9
group 96.9
group together 95.9
vehicle 94
transportation system 90.5
man 90.3
many 89.3
town 87.2
black and white 86.4
war 85.7
no person 85.3
city 85
old 83.5
vintage 82.9
architecture 82.8
adult 82.8
road 82.6
home 82.1

Imagga
created on 2019-11-15

snow 85.8
weather 42.8
city 35.8
building 31.4
architecture 30.7
winter 23
street 22.1
old 21.6
urban 21
travel 18.3
cold 18.1
wheeled vehicle 16.9
scene 16.4
black 14.4
tourism 14
road 13.6
drawing 13.5
sky 13.4
tricycle 13.4
window 13.3
sketch 13
vehicle 12.5
wall 12.4
structure 11.6
exterior 11.1
landscape 10.4
buildings 10.4
tourist 10.3
grunge 10.2
house 10
light 10
conveyance 10
snowing 9.8
history 9.8
trees 9.8
frozen 9.6
park 9.5
construction 9.4
season 9.4
stone 9.3
town 9.3
outdoor 9.2
historic 9.2
vintage 9.1
outdoors 9
facade 8.9
forest 8.7
antique 8.7
architectural 8.7
lamp 8.6
roof 8.6
brick 8.5
tree 8.5
wood 8.3
texture 8.3
church 8.3
landmark 8.1
sidewalk 7.9
people 7.8
ancient 7.8
cityscape 7.6
transportation 7.2
day 7.1

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

text 97.7
black and white 88.9
street 83.9
vehicle 77.4
drawing 70.3
city 63.7
building 62.1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-42
Gender Female, 50.4%
Surprised 49.6%
Confused 49.6%
Sad 49.5%
Disgusted 49.5%
Angry 50.1%
Calm 49.6%
Fear 49.6%
Happy 49.5%

Feature analysis

Amazon

Person 97.4%
Car 96.2%

Captions

Microsoft
created on 2019-11-15

a group of people on a city street 64.9%

Text analysis

Amazon

INARO
INARO 9
IJ11
9

Google

WA LEH
WA
LEH