Human Generated Data

Title

Untitled (Tahoe)

Date

1978

People

Artist: Bill Dane, American (born 1938)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5144

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Person 99.7
Human 99.7
Person 99.6
Person 99.4
Pet 93.3
Animal 93.3
Mammal 93.3
Dog 93.3
Canine 93.3
Apparel 83.6
Clothing 83.6
Musical Instrument 79.9
Musician 79.9
Person 73.1
Leisure Activities 72.3
Pants 69.6
Person 61.5
Crowd 58.7
Sleeve 58.1
Long Sleeve 58.1
Home Decor 56.8
Toy 55.5
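
The Amazon tags above (and the "Feature analysis" scores at the end of this record) are the kind of output returned by AWS Rekognition label detection, where each tag carries a confidence percentage. The following is a minimal sketch, assuming a local copy of the image and configured AWS credentials, of how such labels could be requested with boto3; the file name is hypothetical, not part of this record.

import boto3

def detect_labels(image_path, min_confidence=55.0):
    # Return (label name, confidence %) pairs for a local image file.
    client = boto3.client("rekognition")  # assumes AWS credentials are configured
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=min_confidence,
        )
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

for name, confidence in detect_labels("untitled_tahoe.jpg"):  # hypothetical file name
    print(f"{name} {confidence:.1f}")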

Clarifai
created on 2019-11-15

people 99.7
street 98.6
man 96.8
group 96
group together 94.5
child 94.4
adult 93.8
many 90.2
woman 90.1
wear 87.7
city 86.8
two 85.2
road 84.5
boy 84
one 81
vehicle 80.2
recreation 79.1
transportation system 79.1
movie 77.4
three 76.3

Imagga
created on 2019-11-15

punching bag 46.6
city 28.2
game equipment 27.9
people 24
man 22.8
equipment 19.6
stall 19.4
urban 17.5
business 16.4
adult 15.6
street 14.7
seller 14.6
person 14
male 13.5
men 12.9
women 12.6
shop 12.6
black 12.6
walking 12.3
fashion 11.3
travel 11.3
life 11.1
walk 10.5
bag 10.3
suit 9.9
activity 9.8
businessman 9.7
group 9.7
style 9.6
musical instrument 9.5
holiday 9.3
outdoors 9.1
clothing 8.7
store 8.5
clothes 8.4
portrait 8.4
old 8.3
shopping 8.2
human 8.2
girls 8.2
building 8
architecture 7.8
crowd 7.7
sale 7.4
tradition 7.4
vacation 7.4
transport 7.3
lifestyle 7.2
road 7.2
transportation 7.2
market 7.1
day 7.1
modern 7

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

text 95.1
clothing 92.6
footwear 92.2
person 91.2
street 90.7
black and white 86.1
monochrome 64.6
man 56.1
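
The Microsoft tags come from the Azure Computer Vision service. A minimal sketch, assuming the v2.0 REST "analyze" endpoint with the Tags visual feature; the endpoint region, subscription key, and file name are placeholders, not values taken from this record:

import requests

endpoint = "https://westus.api.cognitive.microsoft.com"  # placeholder endpoint
subscription_key = "YOUR_KEY"                            # placeholder key

with open("untitled_tahoe.jpg", "rb") as f:              # hypothetical file name
    response = requests.post(
        f"{endpoint}/vision/v2.0/analyze",
        params={"visualFeatures": "Tags"},
        headers={
            "Ocp-Apim-Subscription-Key": subscription_key,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )
response.raise_for_status()
for tag in response.json()["tags"]:
    # The service reports confidence on a 0-1 scale; the record above shows percentages.
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")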

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 17-29
Gender Male, 52.6%
Disgusted 45%
Angry 45.9%
Fear 45%
Happy 45%
Confused 45.1%
Sad 45.1%
Calm 53.6%
Surprised 45.3%

AWS Rekognition

Age 13-23
Gender Male, 50.4%
Fear 49.5%
Happy 49.5%
Angry 49.6%
Disgusted 49.5%
Sad 49.9%
Surprised 49.5%
Calm 49.9%
Confused 49.5%

AWS Rekognition

Age 23-35
Gender Female, 50.5%
Happy 45.1%
Fear 45.7%
Angry 45.7%
Confused 45.1%
Surprised 45.1%
Disgusted 45%
Calm 48.6%
Sad 49.7%
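
The three face records above follow the shape of AWS Rekognition face detection output: an estimated age range, a gender guess with confidence, and a confidence score for each emotion. A minimal sketch, assuming boto3, configured AWS credentials, and a local image file (the file name is hypothetical):

import boto3

def analyze_faces(image_path):
    client = boto3.client("rekognition")  # assumes AWS credentials are configured
    with open(image_path, "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, and emotions
        )
    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")

analyze_faces("untitled_tahoe.jpg")  # hypothetical file name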

Feature analysis

Amazon

Person 99.7%
Dog 93.3%