Human Generated Data

Title

Untitled (Richmond, Calif.)

Date

1982

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5230

Copyright

© Bill Dane

Machine Generated Data

Tags (confidence scores on a 0-100 scale)

Amazon
created on 2019-11-15

Tarmac 99.9
Asphalt 99.9
Road 99.8
Human 98.1
Person 98.1
Person 98.1
Person 98
Person 97.1
Zebra Crossing 95.4
Person 94.4
Person 94.3
Person 93.5
Person 90
Person 71
Clothing 68.7
Sleeve 68.7
Long Sleeve 68.7
Apparel 68.7
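
The label/confidence pairs above have the shape returned by AWS Rekognition's DetectLabels API. A minimal sketch with boto3, assuming a copy of the photograph stored in S3 (the bucket and key names here are hypothetical placeholders):

# Minimal sketch: producing label/confidence pairs like the Amazon list
# above with AWS Rekognition's DetectLabels. Bucket and key are
# hypothetical placeholders, not the museum's actual storage.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "dane-richmond-1982.jpg"}},
    MinConfidence=50,  # drop labels scored below 50, cf. "Apparel 68.7" above
)

# Each entry pairs a label name with a 0-100 confidence score,
# matching the "Tarmac 99.9" style of the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')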

Clarifai
created on 2019-11-15

people 99.8
group together 98.1
chair 97.4
many 97.3
group 97.2
adult 96.9
man 96.9
woman 95
crowd 93.4
furniture 93
administration 89.5
street 89.4
leader 88.3
vehicle 87.3
seat 83.4
transportation system 79.3
monochrome 78.4
spectator 78.1
music 76.8
military 76.4

Imagga
created on 2019-11-15

shop 74.2
barbershop 67.4
mercantile establishment 56.9
place of business 37.4
architecture 33
building 32.5
city 29.1
travel 22.5
salon 19.6
urban 18.3
establishment 17.8
tourism 17.3
people 17.3
window 17.2
house 15.9
history 15.2
historic 14.7
old 14.6
business 13.4
bakery 13.2
transportation 12.5
home 12
landmark 11.7
structure 11.7
monument 11.2
art 11.1
style 11.1
wall 11.1
street 11
facade 11
sculpture 10.8
interior 10.6
famous 10.2
stone 10.2
inside 10.1
light 10
tourist 10
modern 9.8
station 9.7
palace 9.6
design 9.6
ancient 9.5
capital 9.5
historical 9.4
boutique 9.4
culture 9.4
town 9.3
religion 9
life 8.8
crowd 8.6
motion 8.6
temple 8.5
office 8.5
exterior 8.3
supermarket 8.2
road 8.1
stall 8.1
night 8
antique 7.8
glass 7.8
door 7.8
construction 7.7
residential 7.7
lamp 7.6
roof 7.6
statue 7.6
center 7.6
restaurant 7.5
destination 7.5
church 7.4
transport 7.3
square 7.2
hall 7.2
indoors 7
sky 7
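
Imagga's 0-100 tag confidences above can be fetched through its public /v2/tags REST endpoint. A minimal sketch with the requests library; the API credentials and image URL are hypothetical placeholders:

# Minimal sketch: fetching Imagga tags like the list above via the
# v2 tags endpoint. Credentials and image URL are placeholders.
import requests

API_KEY = "your_imagga_api_key"        # placeholder
API_SECRET = "your_imagga_api_secret"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/dane-richmond-1982.jpg"},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

# Imagga returns 0-100 confidence scores, matching entries such as "shop 74.2".
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')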

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

person 77.6
text 76.3
black and white 73.6
funeral 68.1
table 51.8
altar 16.1

Face analysis

Amazon

AWS Rekognition

Age 31-47
Gender Male, 50.4%
Disgusted 49.5%
Fear 49.5%
Calm 50.2%
Surprised 49.5%
Angry 49.7%
Confused 49.5%
Happy 49.5%
Sad 49.6%

AWS Rekognition

Age 11-21
Gender Female, 50.5%
Fear 49.5%
Sad 50.3%
Angry 49.5%
Calm 49.6%
Confused 49.5%
Disgusted 49.5%
Happy 49.5%
Surprised 49.5%

AWS Rekognition

Age 23-35
Gender Female, 50.3%
Happy 49.5%
Fear 49.5%
Sad 49.5%
Confused 49.5%
Disgusted 49.5%
Surprised 49.5%
Calm 50.5%
Angry 49.5%

AWS Rekognition

Age 51-69
Gender Male, 50.3%
Angry 49.6%
Surprised 49.5%
Sad 49.8%
Fear 49.6%
Calm 50%
Happy 49.5%
Confused 49.5%
Disgusted 49.5%

AWS Rekognition

Age 48-66
Gender Male, 55%
Happy 45%
Confused 45.1%
Calm 54.7%
Angry 45.1%
Fear 45%
Disgusted 45%
Sad 45.1%
Surprised 45%

AWS Rekognition

Age 39-57
Gender Male, 50.4%
Calm 49.8%
Disgusted 49.5%
Sad 50.1%
Happy 49.5%
Angry 49.5%
Fear 49.5%
Surprised 49.5%
Confused 49.5%

AWS Rekognition

Age 22-34
Gender Male, 50.5%
Confused 49.5%
Calm 50.5%
Sad 49.5%
Surprised 49.5%
Happy 49.5%
Disgusted 49.5%
Fear 49.5%
Angry 49.5%

AWS Rekognition

Age 47-65
Gender Male, 50.5%
Calm 49.6%
Sad 49.5%
Angry 50.3%
Disgusted 49.5%
Happy 49.5%
Surprised 49.5%
Fear 49.5%
Confused 49.5%
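
Each block above (age range, gender, one confidence score per emotion label) matches the shape of Rekognition's DetectFaces output when all facial attributes are requested. A minimal boto3 sketch, again assuming a hypothetical S3 bucket and key:

# Minimal sketch: per-face age, gender, and emotion scores as in the
# blocks above, via Rekognition DetectFaces with Attributes=["ALL"].
# Bucket and key are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "dane-richmond-1982.jpg"}},
    Attributes=["ALL"],  # request age, gender, emotions, not just bounding boxes
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # One confidence score per emotion label, as in the blocks above.
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')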

Feature analysis

Amazon

Person 98.1%

Categories

Imagga

cars vehicles 96.9%
interior objects 2.8%
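
The category scores above ("cars vehicles", "interior objects") come from an Imagga categorizer; both names appear in Imagga's personal_photos category set, exposed at the /v2/categories endpoint. A minimal sketch under that assumption, with hypothetical credentials and image URL:

# Minimal sketch: category scores like those above via Imagga's
# /v2/categories endpoint, assuming the personal_photos categorizer.
# Credentials and image URL are placeholders.
import requests

API_KEY = "your_imagga_api_key"        # placeholder
API_SECRET = "your_imagga_api_secret"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": "https://example.org/dane-richmond-1982.jpg"},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

for cat in resp.json()["result"]["categories"]:
    print(f'{cat["name"]["en"]} {cat["confidence"]:.1f}%')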

Text analysis

Amazon

BOOKSELLEA
B Dalion BOOKSELLEA
B
Dalion

Google

BOalton BOOKSEL L ER
BOalton
ER
BOOKSEL
L
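
The OCR strings above are reproduced verbatim, misreadings included ("BOOKSELLEA", "B Dalion", "BOalton"). Output of this shape comes from Rekognition's DetectText, which returns both whole LINE detections and the individual WORD detections inside them, which is why short fragments like "B" repeat. A minimal boto3 sketch with a hypothetical bucket and key:

# Minimal sketch: OCR results like the Amazon text analysis above via
# Rekognition DetectText. Bucket and key are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "dane-richmond-1982.jpg"}}
)

# DetectText returns both full LINE detections and individual WORD
# detections; the misread text is printed exactly as the service sees it.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])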