Human Generated Data

Title

Untitled (Honolulu)

Date

1977

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5119

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Transportation 99.9
Car 99.9
Automobile 99.9
Vehicle 99.9
Tire 90.7
Asphalt 90
Tarmac 90
Machine 87.6
Spoke 87.6
Truck 86.4
Alloy Wheel 84.3
Car 82.8
Car Wheel 78.7
Road 77.6
Wheel 66.4
Urban 63.9
Roof Rack 57.9
Intersection 57
Car 56.2

Clarifai
created on 2019-11-15

car 99.5
monochrome 99.5
street 99.1
vehicle 98.7
transportation system 97.1
people 94.7
vintage 93.8
travel 93.6
old 92.9
city 92.8
black and white 92
road 91.4
classic 90.7
urban 87.3
no person 86.4
outdoors 85.7
public show 82
wheel 80.9
luxury 80.5
light 78.9

Imagga
created on 2019-11-15

car 100
limousine 100
motor vehicle 98.7
beach wagon 61.5
wheeled vehicle 33.7
travel 28.1
city 24.1
sky 23.6
vehicle 23
architecture 22.6
building 21.6
transportation 21.5
tourism 20.6
road 19.9
urban 19.2
automobile 19.1
auto 18.2
water 17.3
house 16.7
street 16.6
vacation 16.4
drive 16.1
landscape 15.6
transport 15.5
ocean 14.9
sea 14.8
town 14.8
speed 13.7
old 13.2
structure 12.9
clouds 12.7
coast 12.6
buildings 12.3
tourist 11.8
tree 11.6
harbor 11.5
destination 11.2
beach 11
expensive 10.5
billboard 10.5
luxury 10.3
island 10.1
cars 9.8
motor 9.7
roof 9.5
cityscape 9.5
palm 9.4
resort 9.3
holiday 9.3
boat 9.3
summer 9
signboard 8.7
driving 8.7
light 8.7
engine 8.7
tropical 8.5
bay 8.5
vacations 8.5
fast 8.4
modern 8.4
park 8.3
land 8.3
style 8.2
landmark 8.1
river 8
trees 8
sand 7.9
bridge 7.8
pier 7.8
village 7.7
outdoor 7.6
traffic 7.6
wheel 7.5
outdoors 7.5
classic 7.4
new 7.3
sun 7.2
black 7.2
home 7.2
day 7.1
scenic 7

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

text 97.7
land vehicle 96.6
car 96.4
vehicle 96.2
street 96.1
outdoor 95.2
black and white 88.1
wheel 87.3
old 62.5

Face analysis

Amazon

AWS Rekognition

Age 23-37
Gender Female, 54.5%
Happy 45.1%
Sad 47.7%
Disgusted 45.7%
Surprised 45.3%
Fear 49.7%
Angry 45.9%
Confused 45.3%
Calm 45.2%

Feature analysis

Amazon

Car 99.9%
Truck 86.4%
Wheel 66.4%

Captions

Microsoft

a white car in front of a building 91.5%
a car parked in front of a building 90%
an old photo of a car 89.9%

Text analysis

Amazon

B
B CORPORATION
CORPORATION

Google

শ B'GLAD CORPORATION FE
B'GLAD
CORPORATION
FE