Human Generated Data

Title

Untitled (Bogota)

Date

1978

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5158

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Furniture 99.9
Person 99.6
Human 99.6
Person 98.8
Person 96.2
Shoe 95
Apparel 95
Clothing 95
Footwear 95
Person 92.2
Bench 91
Person 87.4
Person 86.2
Park Bench 85.3
Truck 78.7
Transportation 78.7
Vehicle 78.7
Person 76.2
Car 67.6
Automobile 67.6
Person 65.8
People 62.1
Person 61.5
Car 58.9
Shorts 58.2
Shoe 57.3
Person 55.6
Sitting 55.3
Shoe 50.5
Person 48.3

Clarifai
created on 2019-11-15

people 99.6
street 98.9
monochrome 97.4
child 97.2
man 96.2
adult 95.9
two 94.7
dog 94.7
group 93.9
woman 92.9
one 92.3
portrait 91.9
group together 91.7
girl 89.6
couple 87.9
black and white 87.2
city 87.1
park 85.3
three 84.4
boy 84.1

Imagga
created on 2019-11-15

bench 71.2
park bench 66.3
seat 41
park 27.2
furniture 26.2
tree 20.7
outdoors 20.4
trees 18.7
snow 16.2
sliding door 15.7
landscape 14.1
autumn 14.1
door 14
outdoor 13.8
winter 13.6
furnishing 13.1
forest 13.1
people 12.3
scene 12.1
structure 11.9
sitting 11.2
portrait 11
fall 10.9
couple 10.5
old 10.5
summer 10.3
sky 10.2
city 10
building 10
road 9.9
history 9.8
love 9.5
path 9.4
garden 9.4
movable barrier 9.4
day 9.4
man 9.4
male 9.2
wood 9.2
travel 9.2
statue 8.7
child 8.6
cold 8.6
sculpture 8.6
architecture 8.6
stone 8.3
street 8.3
barrier 8.3
memorial 8.3
fun 8.2
happy 8.1
sun 8
cemetery 7.9
outside 7.7
culture 7.7
track 7.5
light 7.4
water 7.3
lifestyle 7.2
black 7.2
adult 7.1

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

tree 98.1
outdoor 98
street 97.7
black and white 94.5
person 94
text 92.8
monochrome 91.2
clothing 87.9
footwear 87.4
playground 63.3
city 52.9

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 22-34
Gender Male, 50.4%
Surprised 49.5%
Confused 49.5%
Calm 50.5%
Happy 49.5%
Sad 49.5%
Angry 49.5%
Disgusted 49.5%
Fear 49.5%

Feature analysis

Amazon

Person 99.6%
Shoe 95%
Truck 78.7%
Car 67.6%

Text analysis

Amazon

A3ADER

Google

ASADERO
ASADERO