Human Generated Data

Title

Untitled (Knotts Berry Farm)

Date

1977

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5128

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Person 99.1
Human 99.1
Amusement Park 95.2
Theme Park 95.2
Boat 82.3
Vehicle 82.3
Transportation 82.3
Person 77.5
Apparel 73.7
Clothing 73.7
Plant 69.3
Tree 69.3
Boat 61.5
Building 55.8
Shorts 55
Boat 53.6
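
The Amazon tags above have the shape of AWS Rekognition label-detection output: an object name paired with a confidence score from 0 to 100. A minimal sketch of requesting comparable labels with boto3 follows; the file name, region, label cap, and confidence threshold are illustrative assumptions, not values taken from this record.

```python
import boto3

# Hypothetical client and image file, shown only to illustrate the API shape.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_knotts_berry_farm.jpg", "rb") as f:  # assumed local copy of the image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # cap the number of returned labels
    MinConfidence=50.0,  # drop labels scored below 50
)

# Each label carries a name and a confidence score, matching the
# "Person 99.1", "Boat 82.3" style of the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```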

Clarifai
created on 2019-11-15

people 97
group 94
travel 91.8
water 90.7
no person 88.6
chair 87.9
street 85.5
monochrome 85.5
city 85.4
vehicle 85.3
tree 84.8
transportation system 84
watercraft 83
man 81.6
luxury 81.2
architecture 80.9
many 80.3
house 79.4
black and white 79.4
group together 78.3

Imagga
created on 2019-11-15

architecture 37.8
temple 29.6
building 28.1
structure 25.8
tree 23
travel 22.5
fountain 22.5
city 22.4
water 22
sky 20
tourism 19
culture 17.1
old 16.7
landscape 16.4
tower 16.1
history 16.1
night 16
construction 15.4
landmark 15.3
ancient 14.7
park 14.2
tourist 13.7
house 13
cemetery 13
symbol 12.8
religion 12.5
design 12.4
monument 12.1
palace 11.7
reflection 11.6
bridge 11.6
river 11.6
bonsai 11.2
lake 10.9
silhouette 10.8
outdoor 10.7
art 10.7
famous 10.2
historic 10.1
china 9.6
historical 9.4
religious 9.4
stone 9.3
summer 9
village 9
sculpture 8.8
woody plant 8.8
vehicle 8.8
urban 8.7
scene 8.7
day 8.6
cityscape 8.5
buildings 8.5
lights 8.3
vacation 8.2
light 8
bumper car 7.9
capital 7.6
oriental 7.6
carousel 7.4
exterior 7.4
flag 7.3
peaceful 7.3
sunset 7.2
bright 7.1
romantic 7.1
sea 7
scenic 7

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

text 97.9
black and white 84.8
ship 61.1
tree 58.6

Color Analysis

Face Analysis

Amazon

AWS Rekognition

Age 28-44
Gender Male, 50.5%
Angry 49.5%
Surprised 49.5%
Sad 49.8%
Fear 50%
Calm 49.6%
Happy 49.5%
Confused 49.5%
Disgusted 49.5%

AWS Rekognition

Age 41-59
Gender Male, 50.2%
Surprised 49.7%
Sad 49.6%
Confused 49.5%
Calm 49.7%
Happy 49.5%
Angry 49.7%
Fear 49.6%
Disgusted 49.6%
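
The age range, gender, and per-emotion confidence rows above match the shape of AWS Rekognition face-detection output when full attributes are requested. A minimal sketch with boto3 is shown below; the file name and region are illustrative assumptions, not part of the record.

```python
import boto3

# Hypothetical client and image file, shown only to illustrate the API shape.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_knotts_berry_farm.jpg", "rb") as f:  # assumed local copy of the image
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]    # e.g. {'Low': 28, 'High': 44}
    gender = face["Gender"]   # e.g. {'Value': 'Male', 'Confidence': 50.5}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:  # one confidence score per emotion class
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```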

Feature Analysis

Amazon

Person 99.1%
Boat 82.3%

Captions