Human Generated Data

Title

Untitled (Bogota)

Date

1978

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5154

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Person 99.5
Human 99.5
Person 99.4
Person 98.7
Person 98.3
Person 98.2
Person 97.3
Person 96.4
Person 95.5
Person 95.4
Person 93.4
Person 90.3
Person 84.6
Person 82.7
Person 82.5
Road 81.8
Person 81.6
Transportation 78.5
Vehicle 75.6
Urban 72.7
Pedestrian 72.4
Car 70.9
Automobile 70.9
Tarmac 70.9
Asphalt 70.9
Truck 69.8
Tower 66.1
Architecture 66.1
Spire 66.1
Building 66.1
Steeple 66.1
Wheel 64.4
Machine 64.4
Town 61.4
City 61.4
Person 60.8
Person 60.6
Text 60.4
Food 60
Meal 60
Person 59.7
Person 56.7
Downtown 55.8
Outdoors 55.8
Person 43.1
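
The rows above pair each Amazon label with a confidence score, expressed in percent. As a minimal sketch (not part of this record), labels of this kind can be requested from AWS Rekognition with boto3; the local file name "photo.jpg" and the 40-percent minimum-confidence threshold are illustrative assumptions:

import boto3

# Minimal sketch: object/scene labels from AWS Rekognition.
# "photo.jpg" and MinConfidence=40 are placeholder assumptions.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=40,
    )

# Print "Label Confidence" rows, mirroring the list above (e.g. "Person 99.5").
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')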

Clarifai
created on 2019-11-15

people 99.4
group 98.3
street 98.3
monochrome 98
many 97.2
man 96.8
crowd 96.3
adult 96.3
city 94.9
group together 93.2
outdoors 91
architecture 91
woman 88
business 86.3
square 86.2
transportation system 84.8
travel 83.8
vehicle 83
building 82.2
town 80.6

Imagga
created on 2019-11-15

architecture 41.1
building 38.4
snow 37.5
sketch 32.8
city 32.4
wagon 32.3
wheeled vehicle 29
drawing 25.9
old 22.3
travel 21.8
house 21.4
winter 21.3
street 21.2
representation 18.6
sky 17.9
window 17.7
history 17
weather 16.8
tourism 16.5
container 15.2
vintage 14.1
ancient 13.8
historic 13.8
landmark 13.5
cold 12.9
tower 12.5
antique 12.2
town 12.1
culture 12
station 11.4
urban 11.4
historical 11.3
church 11.1
religion 10.8
vehicle 10.6
structure 10.3
balcony 10.3
wall 10.3
grunge 10.2
square 10
road 9.9
lamp 9.5
brick 9.4
season 9.4
stone 9.3
exterior 9.2
temple 9.2
tourist 9.1
trees 8.9
home 8.8
conveyance 8.6
destination 8.4
famous 8.4
landscape 8.2
black 7.8
door 7.7
train 7.7
village 7.7
outdoors 7.5
palace 7.4
new 7.3
aged 7.2
dirty 7.2
facility 7

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

text 94.4
black and white 83.6
vehicle 81.6
street 78.4
land vehicle 66.2
building 59.8
people 55.4
old 40.1
picture frame 7.7

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 22-34
Gender Male, 54.5%
Angry 53.1%
Disgusted 45%
Happy 45%
Calm 46.6%
Sad 45.1%
Surprised 45%
Fear 45%
Confused 45.1%

AWS Rekognition

Age 26-42
Gender Female, 54.6%
Sad 46%
Confused 45.2%
Fear 46.6%
Happy 45.2%
Surprised 45.2%
Angry 45.6%
Calm 51.1%
Disgusted 45.1%

AWS Rekognition

Age 5-15
Gender Female, 50.1%
Fear 51.3%
Calm 46.2%
Angry 45.2%
Disgusted 45.1%
Confused 45.1%
Surprised 47.1%
Sad 45.1%
Happy 45%

AWS Rekognition

Age 14-26
Gender Male, 50.1%
Calm 50.1%
Disgusted 49.5%
Happy 49.5%
Fear 49.5%
Surprised 49.5%
Sad 49.7%
Confused 49.6%
Angry 49.5%

AWS Rekognition

Age 19-31
Gender Male, 50.5%
Surprised 49.6%
Sad 49.5%
Confused 49.5%
Happy 49.5%
Disgusted 49.5%
Fear 49.6%
Angry 50%
Calm 49.8%

AWS Rekognition

Age 15-27
Gender Female, 50.3%
Angry 49.6%
Happy 49.5%
Disgusted 49.5%
Surprised 49.6%
Fear 50.3%
Sad 49.5%
Confused 49.5%
Calm 49.5%

AWS Rekognition

Age 7-17
Gender Female, 50%
Confused 49.5%
Calm 49.5%
Surprised 49.5%
Fear 49.9%
Happy 49.5%
Angry 49.5%
Sad 49.9%
Disgusted 49.6%

AWS Rekognition

Age 27-43
Gender Male, 50.5%
Surprised 49.5%
Happy 49.5%
Confused 49.5%
Sad 49.5%
Angry 50.4%
Fear 49.5%
Disgusted 49.5%
Calm 49.5%

AWS Rekognition

Age 8-18
Gender Male, 50.4%
Angry 49.7%
Disgusted 49.5%
Surprised 49.7%
Fear 49.9%
Confused 49.6%
Happy 49.5%
Sad 49.5%
Calm 49.6%
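
Each block above reports, for one detected face, an estimated age range, a gender guess with its confidence, and confidence scores across eight emotions. A sketch of the corresponding AWS Rekognition call, under the same placeholder-file assumption as before:

import boto3

# Minimal sketch: per-face attributes (age range, gender, emotions) from AWS Rekognition.
# "photo.jpg" is a placeholder assumption.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')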

Feature analysis

Amazon

Person 99.5%
Car 70.9%
Truck 69.8%
Wheel 64.4%

Categories

Text analysis

Amazon

LAS
LINpEr
H

Google

EL2A LAS 5
EL2A
LAS
5
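
The fragments above are raw OCR detections (isolated words such as "LAS") rather than readable sentences. As a sketch of how the Amazon side of this output could be produced with Rekognition's text detection, again with a placeholder file name:

import boto3

# Minimal sketch: word/line text detection from AWS Rekognition.
# "photo.jpg" is a placeholder assumption.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# Each detection is a LINE or WORD with the recognized string and a confidence.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], round(detection["Confidence"], 1))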