Human Generated Data

Title

Untitled (S. F.)

Date

1982

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5242

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Human 97.8
Person 97.8
Art 80.6
Plant 72.5
Painting 72
Home Decor 59.7
Sphere 58.7
People 56.8
Sculpture 56.2
Clothing 56.2
Apparel 56.2
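
The Amazon tags above are label/confidence pairs (confidence on a 0-100 scale). As an illustration only — the `parse_tags` helper and the plain "label score" line format are my own assumptions, not part of this record — such a listing could be parsed and filtered like this:

```python
def parse_tags(lines, threshold=0.0):
    """Parse 'Label 97.8'-style lines into {label: confidence}.

    Assumes the last whitespace-separated token is a numeric
    confidence (0-100) and everything before it is the label,
    which may itself contain spaces (e.g. 'Home Decor 59.7').
    """
    tags = {}
    for line in lines:
        line = line.strip()
        if not line:
            continue
        label, _, score = line.rpartition(" ")
        try:
            value = float(score)
        except ValueError:
            continue  # skip lines without a trailing number
        if label and value >= threshold:
            tags[label] = value
    return tags

raw = ["Human 97.8", "Person 97.8", "Art 80.6", "Home Decor 59.7"]
high_confidence = parse_tags(raw, threshold=70)
# keeps only the tags at or above 70% confidence
```

Filtering at a threshold like 70 mirrors how low-confidence tags (e.g. "Apparel 56.2") are often discarded before display.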

Clarifai
created on 2019-11-15

people 99.6
one 99
adult 97.1
portrait 96.8
indoors 93.2
man 90.2
group 87.3
woman 87.3
food 87.1
wear 86.3
art 85.8
room 84.4
door 82.6
two 82
vintage 80.4
no person 79.1
leader 79
administration 79
music 77.1
child 75.9

Imagga
created on 2019-11-15

ball and chain 61.7
shackle 49.3
restraint 37
device 32.8
food 23.1
windowsill 18.8
harvest 18.8
vegetable 18
sill 17.5
pumpkin 17.4
autumn 15.8
agriculture 15.8
old 15.3
fresh 15
structural member 13.3
season 13.2
healthy 13.2
decoration 13.2
brown 12.5
natural 12
fall 11.8
organic 11.7
farm 11.6
holiday 11.5
basket 11.4
seasonal 11.4
produce 10.6
support 10.1
traditional 10
orange 10
yellow 9.9
market 9.8
health 9.7
close 9.7
wooden 9.7
country 9.7
vintage 9.3
grain 9.2
wood 9.2
vegetables 9.1
meal 8.9
gourd 8.9
rustic 8.6
squash 8.5
eat 8.4
garden 8.4
gold 8.2
aged 8.1
cooking 7.9
black 7.8
round 7.8
outside 7.7
dirt 7.6
field 7.5
dark 7.5
one 7.5
outdoors 7.5
festive 7.4
earth 7.3
color 7.2
china 7.2
cute 7.2
kitchen 7.2
breakfast 7.1
container 7
garlic 7
ingredient 7
plant 7

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

indoor 92.7
black and white 85.9
window 80.4
clothing 69
person 66.8
text 64.8

Face analysis

AWS Rekognition

Age 23-35
Gender Female, 54.8%
Disgusted 45.1%
Calm 53.6%
Angry 45.1%
Confused 45.1%
Fear 45.1%
Sad 45.4%
Surprised 45.3%
Happy 45.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
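
Unlike AWS Rekognition's percentages, Google Vision reports face attributes as likelihood buckets. A minimal sketch of comparing them — note that the numeric 0-4 scale below is an arbitrary convention of mine for sorting, not a value the API returns:

```python
# Ordinal mapping for Google Vision's likelihood buckets.
# The numbers are an assumed convention for ranking only.
LIKELIHOOD = {
    "Very unlikely": 0,
    "Unlikely": 1,
    "Possible": 2,
    "Likely": 3,
    "Very likely": 4,
}

def rank(attributes):
    """Sort attribute/likelihood pairs, most likely first."""
    return sorted(attributes.items(),
                  key=lambda kv: LIKELIHOOD[kv[1]], reverse=True)

face = {"Surprise": "Very unlikely", "Sorrow": "Unlikely",
        "Joy": "Very unlikely"}
# For this record, "Sorrow" (Unlikely) outranks the others,
# which are all "Very unlikely".
```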

Feature analysis

Amazon

Person 97.8%
Painting 72%

Text analysis

Amazon

NANO
1382
n

Google

MANG
MANG