Human Generated Data

Title

Untitled (Oakland)

Date

1979

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5199

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Human 97%
Person 97%
Person 94.9%
Person 91.7%
Transportation 80.1%
Aircraft 79.2%
Helicopter 79.2%
Vehicle 79.2%
Outdoors 66.8%
People 60.9%
Apparel 58.6%
Clothing 58.6%
Nature 58.3%
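
These label scores are the kind of output returned by Amazon Rekognition's DetectLabels operation, with confidence reported in percent. A minimal sketch in Python with boto3, assuming configured AWS credentials and a local copy of the photograph; the filename and the MinConfidence threshold are placeholders, not values taken from this record:

import boto3

# Assumes AWS credentials are configured; "untitled_oakland.jpg" is a
# hypothetical local copy of the photograph.
client = boto3.client("rekognition")

with open("untitled_oakland.jpg", "rb") as f:
    image_bytes = f.read()

# Each returned label carries a Name and a Confidence in percent,
# matching entries such as "Helicopter 79.2%" above.
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50.0,
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}%')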

Clarifai
created on 2019-11-15

people 99.4%
street 96.2%
man 95.3%
adult 94%
one 93.7%
monochrome 91.7%
group together 91.6%
two 90.4%
woman 87%
child 86.1%
recreation 81.5%
three 80.7%
transportation system 80.2%
boy 78.9%
action 76.3%
group 76.1%
vehicle 76%
athlete 74.4%
four 69.6%
wear 68.9%
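
Clarifai's concept list can be reproduced through its v2 predict REST API. A hedged sketch with requests, based on the endpoint and payload shape of the 2019-era API; the API key and model identifier are placeholders:

import base64
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # placeholder credential
MODEL_ID = "general"               # hypothetical model identifier

with open("untitled_oakland.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("ascii")

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)
resp.raise_for_status()

# Concept values come back in [0, 1]; scale to percent to match the list.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}%')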

Imagga
created on 2019-11-15

chairlift 70.5%
ski tow 56.6%
conveyance 47.1%
sport 20.5%
swing 20%
man 18.1%
fun 17.9%
leisure 17.4%
sky 17.3%
landscape 17.1%
outdoor 16.8%
vacation 16.4%
beach 15.2%
outdoors 14.4%
person 14.3%
summer 14.1%
sea 14.1%
recreation 13.4%
activity 13.4%
travel 13.4%
sand 13.1%
active 12.8%
mechanical device 11.9%
adult 11%
ocean 10.9%
plaything 10.8%
equipment 10.7%
people 10.6%
vehicle 10.5%
mechanism 10.4%
outside 10.3%
park 10.1%
playing 10%
water 10%
sun 9.7%
high 9.5%
grass 9.5%
lifestyle 9.4%
clouds 9.3%
male 9.2%
hat 9.2%
field 9.2%
road 9%
mountain 8.9%
sexy 8.8%
play 8.6%
wheeled vehicle 8.5%
adventure 8.5%
old 8.4%
danger 8.2%
exercise 8.2%
color 7.8%
extreme 7.7%
youth 7.7%
relax 7.6%
holidays 7.5%
action 7.4%
competition 7.3%
transportation 7.2%
game 7.1%
happiness 7%
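
Imagga's tags come from its v2 tagging endpoint, which reports confidence directly in percent. A minimal sketch, assuming an API key/secret pair and a publicly reachable URL for the image; both are placeholders:

import requests

auth = ("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET")  # placeholders
image_url = "https://example.com/untitled_oakland.jpg"    # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=auth,  # Imagga uses HTTP basic auth with the key/secret pair
)
resp.raise_for_status()

# Each entry pairs a localized tag name with a confidence in percent,
# e.g. "chairlift 70.5%" at the top of the list above.
for entry in resp.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}%')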

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

outdoor 97.7%
person 87.2%
text 82%
umbrella 81.2%
black and white 79.3%
footwear 61.6%
clothing 61.1%
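
Microsoft's tags match the output of the Azure Computer Vision analyze operation. A hedged sketch, assuming the v2.0 API that was current in 2019; the endpoint host and subscription key are placeholders:

import requests

ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com"  # placeholder
KEY = "YOUR_AZURE_CV_KEY"                                     # placeholder

with open("untitled_oakland.jpg", "rb") as f:
    image_bytes = f.read()

resp = requests.post(
    f"{ENDPOINT}/vision/v2.0/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
resp.raise_for_status()

# Confidence is reported in [0, 1]; scale to percent as listed above.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}%')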

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 22-34
Gender Male, 52.5%
Sad 51.9%
Fear 47.5%
Disgusted 45%
Surprised 45.1%
Angry 45.1%
Calm 45.4%
Happy 45.1%
Confused 45%

AWS Rekognition

Age 22-34
Gender Female, 54.2%
Disgusted 45.4%
Happy 45.3%
Angry 46%
Fear 47.9%
Sad 46.1%
Surprised 45.9%
Confused 45.6%
Calm 47.9%
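
The age range, gender, and per-emotion confidences above are the shape of what Rekognition's DetectFaces returns for each detected face. A minimal sketch, under the same assumptions as the DetectLabels example (configured credentials, hypothetical filename):

import boto3

client = boto3.client("rekognition")

with open("untitled_oakland.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] requests the full attribute set, including
# AgeRange, Gender, and the eight emotion confidences listed above.
response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')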

Feature analysis

Amazon

Person 97%
Helicopter 79.2%

Text analysis

Amazon

QUNCE

Google

CON 3OUNC
CON
3OUNC
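
Fragments such as "QUNCE" are raw OCR detections from the photograph itself; on the Amazon side they correspond to Rekognition's DetectText operation. A minimal sketch, under the same assumptions as the earlier Rekognition examples:

import boto3

client = boto3.client("rekognition")

with open("untitled_oakland.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# LINE detections are whole lines of text; WORD detections are the
# individual words within them, much as the Google list above shows
# "CON 3OUNC" alongside its parts "CON" and "3OUNC".
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          f'{detection["Confidence"]:.1f}%')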