Human Generated Data

Title

Untitled (Mexico City)

Date

1975

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5100

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Human 99.3%
Person 99.3%
Person 98.5%
Transportation 97.2%
Car 97.2%
Vehicle 97.2%
Automobile 97.2%
Apparel 89.9%
Clothing 89.9%
Pedestrian 89.5%
Person 81.5%
Person 81.3%
Person 79.6%
Home Decor 69.8%
Person 69.3%
Meal 62%
Food 62%
Face 61%
Restaurant 58.2%
Person 57.6%
People 57.2%
Person 42.4%
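
The label names and scores above follow the output format of Amazon Rekognition's DetectLabels API: label names with 0-100 confidence scores. A minimal sketch of how such tags could be reproduced with the boto3 client; the image file name is a hypothetical placeholder:

    import boto3

    # Rekognition client; assumes AWS credentials are configured in the environment.
    client = boto3.client("rekognition")

    # Hypothetical local copy of the photograph.
    with open("untitled_mexico_city.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=40,  # keep low-confidence labels such as "Person 42.4%"
    )

    # Each label carries a name and a 0-100 confidence score.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}%')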

Clarifai
created on 2019-11-15

people 99.6%
vehicle 99.1%
transportation system 98.5%
car 97.9%
street 97.2%
adult 95.7%
man 95.6%
two 94.3%
group 92.8%
one 91.7%
monochrome 91.1%
woman 88.5%
administration 87.3%
train 85.8%
group together 85.5%
police 83.6%
road 83.5%
commerce 79.2%
seat 79.1%
military 79%
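
Clarifai reports concepts with values between 0 and 1, which the listing above shows rescaled to percentages. A sketch against Clarifai's v2 REST predict endpoint; the model name, API key, and image URL are assumptions for illustration:

    import requests

    # Hypothetical API key and image URL; "general-image-recognition" is the
    # public general model's name (an assumption for this sketch).
    resp = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers={"Authorization": "Key YOUR_CLARIFAI_API_KEY"},
        json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
    )
    resp.raise_for_status()

    # Concepts carry a name and a 0-1 value; rescale to match the listing.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}%')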

Imagga
created on 2019-11-15

tramway 64%
conveyance 56.5%
car 41.3%
vehicle 36.4%
wheeled vehicle 32%
streetcar 29.9%
transportation 27.8%
city 24.1%
passenger 22.8%
transport 21%
urban 21%
street 20.2%
travel 19.7%
building 19.1%
road 19%
architecture 18.7%
automobile 16.3%
traffic 16.1%
industry 15.4%
motor vehicle 15%
model t 14.8%
auto 13.4%
light 13.4%
people 12.8%
driving 12.6%
drive 12.3%
industrial 11.8%
old 11.1%
town 10.2%
factory 10%
equipment 9.8%
business 9.7%
buildings 9.4%
motion 9.4%
man 9.4%
truck 9.3%
power 9.2%
inside 9.2%
speed 9.2%
modern 9.1%
bus 8.7%
station 8.7%
heavy 8.6%
sky 8.3%
tourism 8.2%
landmark 8.1%
reflection 8.1%
night 8%
structure 7.9%
work 7.8%
construction 7.7%
lamp 7.6%
engineering 7.6%
cockpit 7.6%
fast 7.5%
tourist 7.4%
safety 7.4%
new 7.3%
ride 7.3%
steel 7.1%
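
Imagga's tagging endpoint returns the same tag-plus-confidence pairs shown above. A sketch against its v2 REST API, which authenticates with HTTP basic auth; the credentials and image URL are placeholders:

    import requests

    # Placeholder credentials and image URL.
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photo.jpg"},
        auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
    )
    resp.raise_for_status()

    # Each entry nests the tag text under a language code; confidence is 0-100.
    for tag in resp.json()["result"]["tags"]:
        print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}%')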

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

street 98.4%
person 95.6%
black and white 95.1%
vehicle 94.2%
land vehicle 93.5%
car 92.7%
outdoor 89.8%
people 89%
text 85.7%
man 81.6%
city 81.4%
clothing 81.1%
monochrome 72.8%
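
The Microsoft tags correspond to Azure Computer Vision's Analyze Image operation, which scores tags between 0 and 1. A sketch against the v2.0 REST endpoint (current at the 2019 creation date of these tags); the region, key, and image URL are placeholders:

    import requests

    # Placeholder endpoint region and subscription key.
    endpoint = "https://westus.api.cognitive.microsoft.com"
    resp = requests.post(
        f"{endpoint}/vision/v2.0/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": "YOUR_AZURE_KEY"},
        json={"url": "https://example.com/photo.jpg"},
    )
    resp.raise_for_status()

    # Tags carry a name and a 0-1 confidence; rescale to match the listing.
    for tag in resp.json()["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}%')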

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 32-48
Gender Female, 54.2%
Confused 45.1%
Surprised 45.1%
Fear 53%
Happy 45.1%
Disgusted 45%
Angry 45.3%
Sad 46.3%
Calm 45.1%

AWS Rekognition

Age 39-57
Gender Male, 50.3%
Angry 49.5%
Happy 49.5%
Disgusted 49.5%
Fear 49.6%
Confused 49.5%
Surprised 49.5%
Calm 49.8%
Sad 50%

AWS Rekognition

Age 40-58
Gender Male, 50.4%
Happy 49.5%
Disgusted 49.5%
Calm 49.5%
Sad 49.6%
Surprised 49.5%
Confused 49.5%
Fear 49.5%
Angry 50.4%

AWS Rekognition

Age 21-33
Gender Female, 50.1%
Disgusted 49.5%
Sad 50.3%
Calm 49.5%
Fear 49.5%
Happy 49.5%
Confused 49.5%
Angry 49.6%
Surprised 49.5%
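
The four face records above match the FaceDetails structure returned by Rekognition's DetectFaces API when all attributes are requested: an estimated age range, a gender guess with confidence, and a confidence score for each emotion. A sketch of the call; the file name is a hypothetical placeholder:

    import boto3

    client = boto3.client("rekognition")
    with open("untitled_mexico_city.jpg", "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # include age range, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            # Emotion types arrive upper-case, e.g. "HAPPY".
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')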

Feature analysis

Amazon

Person 99.3%
Car 97.2%

Text analysis

Amazon

Qath
mofi Qath
ehiver
mofi
T
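
The fragments above ("Qath", "mofi", etc.) are raw OCR output from signage in the photograph, in the shape returned by Rekognition's DetectText API. A sketch; the file name is again a placeholder:

    import boto3

    client = boto3.client("rekognition")
    with open("untitled_mexico_city.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    # DetectText returns both LINE and WORD detections; print the lines.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])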

Google

Patt
Patt
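
The Google results ("Patt", twice) would come from an OCR service such as Cloud Vision's text detection. A sketch using the google-cloud-vision client library, assuming application-default credentials; the file name is a placeholder:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("untitled_mexico_city.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)
    # text_annotations[0] is the full detected block; the rest are fragments.
    for annotation in response.text_annotations[1:]:
        print(annotation.description)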