Human Generated Data

Title

Untitled (Bogota)

Date

1978

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5156

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Person 99.5
Human 99.5
Person 99.4
Person 99.3
Person 99.3
Person 99.2
Person 98.9
Person 98.8
Person 98.6
Urban 98.2
Building 98.2
City 98.2
Downtown 98.2
Town 98.2
Person 97.4
Person 97.3
Architecture 95.2
Military Uniform 93.6
Military 93.6
Person 91.3
Pedestrian 88.2
Person 84.3
Crowd 80.1
Armored 79.5
Army 79.5
People 78.8
Person 76.6
Person 75.4
Person 70.9
Person 69.2
Road 66.8
Person 65.1
Musical Instrument 63.8
Musician 63.8
Soldier 63.2
Officer 62.7
Metropolis 62.7
Person 62.5
Person 62.4
Marching 57.1
Office Building 56.7
Person 43.5
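Each machine-generated tag above is a label followed by a confidence score, and the same label (e.g. "Person") can recur once per detected instance. A minimal parsing sketch, assuming the plain `Label Confidence` line format shown here, that collapses repeats to each label's highest confidence:

```python
def parse_tags(lines):
    """Parse 'Label Confidence' lines (e.g. 'Person 99.5') into a dict
    mapping each label to its highest reported confidence.

    rpartition splits on the last space, so multi-word labels such as
    'Military Uniform' are kept intact."""
    scores = {}
    for line in lines:
        line = line.strip()
        if not line:
            continue
        label, _, score = line.rpartition(" ")
        scores[label] = max(scores.get(label, 0.0), float(score))
    return scores

tags = parse_tags(["Person 99.5", "Human 99.5", "Person 99.4",
                   "Military Uniform 93.6"])
# 'Person' keeps only its highest score, 99.5
```

The same routine works for the Clarifai, Imagga, and Microsoft lists below, since they share the label-then-score layout.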

Clarifai
created on 2019-11-15

people 99.3
man 95.5
many 93.6
city 91.9
business 91.4
crowd 90.6
woman 89.7
group together 88.4
exhibition 88
adult 87.9
group 87.1
street 85.5
travel 84.2
monochrome 81.6
commercial 79.5
urban 78.6
outdoors 77.2
museum 74.3
wear 73.8
architecture 70.6

Imagga
created on 2019-11-15

people 32.9
city 25
crowd 23.1
architecture 22.8
silhouette 21.5
urban 20.1
business 20.1
travel 19
group 16.1
sky 16.1
building 15.8
tourist 15.2
walking 15.2
man 14.8
gymnasium 14
men 13.7
transport 13.7
billboard 13.6
male 13.5
airport 12.7
sport 12.3
vacation 12.3
person 12.3
gate 11.9
transportation 11.7
station 11.5
window 11.5
athletic facility 11.4
reflection 11.4
motion 11.1
signboard 11
beach 10.8
structure 10.7
tourism 10.7
modern 10.5
pier 10.5
bridge 10.5
summer 10.3
facility 10.2
office 9.8
businessman 9.7
move 9.6
walk 9.5
scene 9.5
women 9.5
work 9.4
journey 9.4
sea 9.4
water 9.3
glass 9.3
ocean 9.2
passenger 9.2
hall 8.9
river 8.9
volleyball net 8.7
silhouettes 8.7
snow 8.7
net 8.6
meeting 8.5
clouds 8.5
center 8.4
landscape 8.2
spectator 8.2
flag 8.1
subway 7.9
support 7.9
corridor 7.9
entrance 7.7
construction 7.7
world 7.7
blurred 7.7
trip 7.5
time 7.3
design 7.3
device 7.3
landmark 7.2
sunset 7.2
activity 7.2
adult 7.1
weather 7.1
interior 7.1
country 7

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

person 95.8
text 94.4
people 88.9
sky 88.5
black and white 85.1
man 81.1
clothing 76.5
group 69.7
old 63.9
city 55.8
picture frame 12.2

Face analysis

Amazon

AWS Rekognition

Age 22-34
Gender Male, 52.3%
Angry 49.9%
Surprised 45.1%
Sad 45.2%
Happy 45%
Calm 48.7%
Fear 46%
Confused 45%
Disgusted 45%

AWS Rekognition

Age 37-55
Gender Male, 50.4%
Angry 49.9%
Happy 49.5%
Fear 49.5%
Disgusted 49.6%
Sad 49.6%
Calm 49.8%
Surprised 49.5%
Confused 49.7%

AWS Rekognition

Age 10-20
Gender Male, 50.3%
Confused 49.5%
Fear 49.6%
Calm 49.6%
Happy 49.5%
Sad 50%
Angry 49.6%
Surprised 49.5%
Disgusted 49.5%

AWS Rekognition

Age 28-44
Gender Male, 50%
Calm 50.1%
Fear 49.6%
Confused 49.6%
Happy 49.5%
Angry 49.5%
Disgusted 49.5%
Sad 49.7%
Surprised 49.5%

AWS Rekognition

Age 17-29
Gender Male, 50.4%
Calm 50.5%
Surprised 49.5%
Angry 49.5%
Confused 49.5%
Happy 49.5%
Fear 49.5%
Sad 49.5%
Disgusted 49.5%

AWS Rekognition

Age 33-49
Gender Male, 50.5%
Calm 50%
Angry 50%
Disgusted 49.5%
Happy 49.5%
Sad 49.5%
Confused 49.5%
Fear 49.5%
Surprised 49.5%

AWS Rekognition

Age 45-63
Gender Female, 50%
Happy 49.5%
Surprised 49.5%
Angry 49.8%
Fear 49.5%
Sad 49.8%
Confused 49.7%
Calm 49.6%
Disgusted 49.5%

AWS Rekognition

Age 32-48
Gender Male, 50.5%
Calm 50%
Surprised 49.6%
Disgusted 49.5%
Happy 49.5%
Angry 49.7%
Confused 49.5%
Sad 49.7%
Fear 49.5%

AWS Rekognition

Age 8-18
Gender Male, 50.1%
Fear 50.3%
Confused 49.5%
Calm 49.5%
Sad 49.5%
Disgusted 49.6%
Happy 49.5%
Surprised 49.5%
Angry 49.5%
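Each Rekognition face record above pairs an age range and gender estimate with eight emotion confidences. A hypothetical sketch of picking the dominant emotion from one such record (the `face` dict mirrors the first record's values):

```python
def dominant_emotion(emotions):
    """Return the (emotion, confidence) pair with the highest confidence.

    `emotions` maps emotion names to percentages, as in the
    face-analysis records above."""
    return max(emotions.items(), key=lambda kv: kv[1])

face = {"Angry": 49.9, "Surprised": 45.1, "Sad": 45.2, "Happy": 45.0,
        "Calm": 48.7, "Fear": 46.0, "Confused": 45.0, "Disgusted": 45.0}
# dominant_emotion(face) → ("Angry", 49.9)
```

Since several records above report near-uniform scores clustered around 49.5%, the margin between the top two scores can be as informative as the maximum itself.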

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a group of people walking in front of a crowd 96.9%
a group of people standing in front of a crowd 95%
an old photo of a group of people standing in front of a crowd 93.9%