Human Generated Data

Title

Untitled (Panama)

Date

1979

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5167

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Transportation 99.5
Vehicle 99.5
Truck 99.5
Car 99
Automobile 99
Van 95.3
Wheel 93
Machine 93
Human 84.6
Person 84.6
Wheel 78
Person 74.3
Bike 70
Bicycle 70
Urban 68.6
Caravan 67.5
Building 64.6
Rv 57.8
Wheel 55.1
Bicycle 54.3
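
The Amazon tags above pair each label with a confidence score, which is the form of output returned by the AWS Rekognition DetectLabels operation. The sketch below is illustrative only, assuming boto3 is installed with credentials configured; the filename is a hypothetical placeholder, not the museum's actual asset path.

    import boto3

    # Minimal sketch: send a local copy of the photograph to AWS Rekognition
    # and print label/confidence pairs like the list above.
    rekognition = boto3.client("rekognition")

    with open("untitled_panama_1979.jpg", "rb") as f:  # hypothetical filename
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=50,
    )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')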

Clarifai
created on 2019-11-15

street 99.5
monochrome 99.2
architecture 98.2
urban 96.5
building 96.5
city 96.2
house 96.1
window 95.7
no person 94.6
family 93.6
apartment 93.5
car 91.6
road 89.8
modern 89.7
black and white 89.1
transportation system 88.4
expression 87.7
old 87.4
business 86.6
balcony 86.4

Imagga
created on 2019-11-15

building 97.9
cinema 86.6
theater 70.3
architecture 58.2
structure 54.2
house 41.2
home 33.6
city 25.8
window 23
old 23
facade 22.4
roof 22
sky 21.7
estate 20.9
construction 20.6
residential 20.1
urban 20.1
town 19.5
windows 19.2
tourism 18.2
residence 17.8
balcony 17
modern 16.8
property 16.5
travel 16.2
brick 16
office 15.5
apartment 15.4
real 15.2
exterior 14.8
housing 14.7
street 13.8
new 13.8
historic 13.8
public house 13.4
buildings 13.3
dwelling 12.9
houses 12.6
living 12.3
wall 12.1
palace 11.9
real estate 11.7
door 11.4
historical 11.3
luxury 11.2
stone 11
glass 10.9
history 10.8
ancient 10.4
center 10.2
garage 9.9
scene 9.5
monument 9.4
style 8.9
apartments 8.9
mortgage 8.8
contemporary 8.5
famous 8.4
tile 8.4
vacation 8.2
square 8.2
landmark 8.1
yard 7.8
architectural 7.7
downtown 7.7
tower 7.2
summer 7.1

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

building 99.8
road 98.4
vehicle 94.9
outdoor 94.9
land vehicle 92.9
car 91.3
text 90.2
house 80.3
street 75.1
wheel 66.6
window 50.2
apartment building 31.7
sign 16.6

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 6-16
Gender Male, 50%
Angry 49.5%
Surprised 49.6%
Disgusted 49.5%
Confused 49.6%
Fear 49.8%
Calm 49.8%
Happy 49.6%
Sad 49.6%
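
The age range, gender estimate, and per-emotion confidences above follow the shape of the FaceDetails structure returned by Rekognition's DetectFaces operation. A minimal, self-contained sketch under the same assumptions as the label example (boto3 configured, hypothetical filename):

    import boto3

    # Sketch only: request full face attributes (age range, gender, emotions)
    # for the same image and print them in the format used above.
    rekognition = boto3.client("rekognition")

    with open("untitled_panama_1979.jpg", "rb") as f:  # hypothetical filename
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.0f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')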

Feature analysis

Amazon

Truck 99.5%
Car 99%
Wheel 93%
Person 84.6%
Bicycle 70%

Categories

Text analysis

Google

EHEVROLET
EHEVROLET
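
The repeated "EHEVROLET" entries are raw OCR output of the kind produced by Google Cloud Vision text detection, where the first annotation is the full detected string and later annotations are its individual words. A minimal sketch, assuming the google-cloud-vision client library is installed with credentials configured and using a hypothetical filename:

    from google.cloud import vision

    # Sketch only: run Google Cloud Vision text detection on a local copy of
    # the photograph and print each detected text annotation.
    client = vision.ImageAnnotatorClient()

    with open("untitled_panama_1979.jpg", "rb") as f:  # hypothetical filename
        content = f.read()

    image = vision.Image(content=content)
    response = client.text_detection(image=image)

    for annotation in response.text_annotations:
        print(annotation.description)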