Human Generated Data

Title

Untitled (Marin, Calif)

Date

1979

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5206

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Home Decor 97.3
Person 96.3
Human 96.3
Pet 91.7
Canine 91.7
Dog 91.7
Animal 91.7
Mammal 91.7
Transportation 89.6
Vehicle 88.7
Clothing 70.9
Apparel 70.9
Truck 61.1
Tire 56.3
Pickup Truck 56.1
Furniture 55.8
Tarmac 55.8
Asphalt 55.8
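
The Amazon tags above follow the shape of AWS Rekognition label-detection output: a label name plus a confidence score. As a sketch only (the museum's actual pipeline is not documented here; the bucket name and image key below are placeholders), comparable output can be produced with the boto3 Rekognition client:

import boto3

# Placeholder bucket and key for illustration; the museum's real storage layout is not documented in this record.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-images", "Name": "bill-dane-untitled-marin-1979.jpg"}},
    MaxLabels=20,
    MinConfidence=55,  # the lowest scores listed above sit in the mid-50s
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")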

Clarifai
created on 2019-11-15

people 99.9
adult 98.2
street 98.1
vehicle 97.9
group 97.5
woman 97.2
man 97.2
transportation system 95.7
group together 95
child 93.1
two 92.6
canine 92.3
dog 90.5
one 90.1
monochrome 89.3
administration 87
road 86.3
home 83.5
wear 82.9
analogue 82.4

Imagga
created on 2019-11-15

vehicle 46
half track 40.9
military vehicle 34.2
tracked vehicle 32.9
wheeled vehicle 20.7
car 17.5
conveyance 16.7
device 16.4
man 13.6
transportation 13.4
travel 13.4
sewing machine 13.3
old 13.2
machine 13.1
people 12.3
decoration 12.3
box 12.1
transport 11.9
home appliance 11.6
home 11.2
room 10.8
textile machine 10.7
male 10.6
art 10.4
container 10.3
design 10.1
tire 10.1
black 9.6
wheel 9.5
holiday 9.3
building 8.8
ancient 8.6
architecture 8.6
men 8.6
luxury 8.6
appliance 8.5
modern 8.4
antique 8.4
tourism 8.2
iron lung 8.2
style 8.1
interior 7.9
wall 7.7
traditional 7.5
silhouette 7.4
detail 7.2
celebration 7.2

Google
created on 2019-11-15

Photograph 95.7
Snapshot 85.9
Car 72.6
Vehicle 71.7
Photography 67.8
Art 65.5
Black-and-white 56.4
Visual arts 55

Microsoft
created on 2019-11-15

text 97.6
drawing 90.7
black and white 81.8
street 62.7
clothing 62
old 42.5
picture frame 6.7

Face analysis

Amazon

AWS Rekognition

Age 19-31
Gender Male, 51.2%
Angry 45%
Happy 45.2%
Disgusted 45%
Confused 45%
Surprised 45.1%
Calm 54.1%
Fear 45.1%
Sad 45.4%

AWS Rekognition

Age 28-44
Gender Female, 53.7%
Angry 45%
Disgusted 45%
Surprised 45%
Sad 52.8%
Confused 45.1%
Calm 46.7%
Fear 45.1%
Happy 45.3%

AWS Rekognition

Age 7-17
Gender Female, 50.1%
Angry 45.4%
Disgusted 45.1%
Happy 45%
Calm 48.3%
Surprised 45.2%
Fear 45.4%
Confused 45.7%
Sad 49.9%

AWS Rekognition

Age 32-48
Gender Male, 54.7%
Disgusted 45%
Happy 45%
Angry 54.5%
Fear 45%
Calm 45.3%
Confused 45%
Surprised 45%
Sad 45%

AWS Rekognition

Age 21-33
Gender Male, 53.4%
Confused 45%
Calm 54.3%
Surprised 45%
Fear 45%
Happy 45%
Angry 45.1%
Sad 45.5%
Disgusted 45%

AWS Rekognition

Age 14-26
Gender Female, 53.7%
Calm 46.2%
Sad 53.2%
Happy 45.4%
Angry 45%
Fear 45%
Confused 45.1%
Surprised 45%
Disgusted 45%
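
Each AWS Rekognition block above describes one detected face: an estimated age range, a gender guess with its confidence, and a confidence score for each emotion. A minimal sketch of requesting those per-face attributes, again assuming boto3 and a placeholder image key, looks like this:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# "bill-dane-untitled-marin-1979.jpg" is a placeholder key, not the museum's actual file name.
response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-images", "Name": "bill-dane-untitled-marin-1979.jpg"}},
    Attributes=["ALL"],  # include age range, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}, Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"  {emotion['Type'].title()} {emotion['Confidence']:.1f}%")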

Feature analysis

Amazon

Person 96.3%
Dog 91.7%

Captions

Microsoft
created on 2019-11-15

an old photo of a person 74.1%
an old photo of a person 72%
old photo of a person 71.5%
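
The Microsoft captions above are ranked candidates with confidence scores, which matches the shape of an Azure Computer Vision describe-image response. A hedged sketch, assuming the azure-cognitiveservices-vision-computervision Python SDK and placeholder endpoint, key, and image URL (none of which come from this record):

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint, key, and image URL for illustration only.
client = ComputerVisionClient(
    "https://example-region.api.cognitive.microsoft.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

description = client.describe_image(
    "https://example.org/bill-dane-untitled-marin-1979.jpg",
    max_candidates=3,  # the record lists three caption candidates
)

for caption in description.captions:
    print(caption.text, f"{caption.confidence * 100:.1f}%")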