Human Generated Data

Title

Untitled (Bogota)

Date

1978

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5160

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Person 99.7
Human 99.7
Person 99.4
Person 98.7
Apparel 97.8
Clothing 97.8
Person 96.8
Person 96.1
Person 95.5
Person 94.3
Person 92.3
Person 88.2
Overcoat 87.1
Coat 87.1
Person 87
Suit 85
Sports 83.4
Sport 83.4
People 82.7
Pedestrian 81.7
Face 62.8
Ice Skating 62.3
Skating 62.3
Person 59.8
Person 48.2

Clarifai
created on 2019-11-15

street 99.8
people 99.3
city 97.7
business 96
monochrome 95.7
group together 95.7
man 95.1
many 94.6
group 94.4
urban 92.1
adult 91
crowd 90.1
road 89.8
pavement 88.4
woman 87.4
black and white 84.7
luggage 83.2
railway 82.7
building 81.8
bus 80.9

Imagga
created on 2019-11-15

gymnastic apparatus 27.5
horizontal bar 26.7
sports equipment 24.2
equipment 23.9
city 19.1
sport 18.9
man 18.1
percussion instrument 17.3
people 17.3
male 17
urban 15.7
musical instrument 15.2
building 13.7
travel 13.4
architecture 12.5
chime 12.4
person 11.6
silhouette 11.6
vacation 11.4
business 10.9
construction 10.3
tourist 10
road 9.9
sky 9.6
gate 9.4
transportation 9
men 8.6
walking 8.5
swing 8.4
turnstile 8.3
street 8.3
industrial 8.2
life 8
lamp 7.8
weapon 7.8
scene 7.8
adult 7.8
old 7.7
parallel bars 7.6
power 7.6
outdoors 7.5
human 7.5
landscape 7.4
tourism 7.4
device 7.3
transport 7.3
danger 7.3
game equipment 7.2
activity 7.2
rope 7.1
summer 7.1

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

text 97.2
outdoor 94.2
street 87.2
black and white 83
tree 71.4
people 59.5
playground 57.9
city 52.9

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 25-39
Gender Female, 53.9%
Surprised 45%
Confused 45%
Calm 45.1%
Happy 45%
Sad 54.4%
Angry 45%
Disgusted 45%
Fear 45.5%

AWS Rekognition

Age 36-54
Gender Male, 50.5%
Happy 49.5%
Angry 49.7%
Fear 49.5%
Surprised 49.7%
Calm 50%
Sad 49.5%
Disgusted 49.5%
Confused 49.5%

Feature analysis

Amazon

Person 99.7%

Categories

Text analysis

Amazon

COPERE
FLLSR COPERE
Py
Poieco Py
FLLSR
Poieco

Google

VorRES EN (A FILAR COOPERST U
VorRES
EN
(A
FILAR
U
COOPERST