Human Generated Data

Title

Untitled (Bogota - Wedding)

Date

1978

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5157

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Person 99.6
Human 99.6
Person 99.3
Person 99.1
Clothing 98.1
Apparel 98.1
Person 93.8
Plant 88.2
Tree 86
Shoe 82.4
Footwear 82.4
Outdoors 81.4
Person 81.1
Shorts 77.3
Shoe 62.7
Person 62.2
Garden 60.3
Arbour 57.2
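
The label/score pairs above are the kind of output Amazon Rekognition's DetectLabels operation returns, where each score is a confidence value from 0 to 100. A minimal sketch of how such tags could be generated with boto3; the file name, region, and thresholds are placeholders, not part of the catalog record:

```python
# Sketch: producing Rekognition label tags like the ones listed above.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

with open("bill_dane_bogota_wedding_1978.jpg", "rb") as f:  # hypothetical local scan of the print
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,
    MinConfidence=55,  # lowest score above is 57.2, so a threshold near this
)

# Each label carries a name and a confidence score (0-100),
# matching the "Person 99.6", "Tree 86", ... pairs listed above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```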

Clarifai
created on 2019-11-15

people 99.7
adult 98.3
monochrome 97.9
man 97.6
street 97.1
two 96.6
woman 96
one 95.8
girl 92.2
child 91.7
portrait 90.6
group 88.8
black and white 84.9
couple 84.4
boy 83.6
art 81.4
three 80.9
wear 80.4
group together 79.3
nature 77.4
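
The Clarifai concepts above read as predictions from a general image-recognition model, again scored 0 to 100. A rough sketch of a call to Clarifai's v2 predict endpoint; the API key, model ID, and image URL below are placeholders:

```python
# Sketch: requesting Clarifai concept tags comparable to the list above.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"                       # assumption
MODEL_ID = "general-image-recognition"                  # assumption: public general model
IMAGE_URL = "https://example.org/bogota_wedding.jpg"    # hypothetical image URL

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

# Concepts come back with a value in [0, 1]; multiplying by 100 puts them
# on the same scale as "people 99.7", "adult 98.3", ...
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```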

Imagga
created on 2019-11-15

snow 40.6
bench 31.4
park bench 31
tree 26.8
landscape 26
trees 24.9
winter 24.7
water 24
shovel 23.7
seat 20.8
park 19.8
forest 19.2
cold 18.9
old 18.8
weather 18.7
sky 18.5
travel 17.6
hand tool 16.2
river 16
outdoor 15.3
building 14.9
tool 14.9
season 14.8
rural 14.1
vacation 13.9
furniture 13.4
outdoors 13.4
structure 13.2
scenic 13.2
light 12.7
house 12.5
wood 12.5
mountain 12.5
cemetery 12.5
swing 12.4
scene 12.1
peaceful 11
scenery 10.8
holiday 10.8
snowy 10.7
architecture 10.2
natural 10
country 9.7
woods 9.6
boat 9.5
garden 9.4
lake 9.2
vintage 9.1
black 9
wooden 8.8
abandoned 8.8
mechanical device 8.8
seasonal 8.8
frozen 8.6
dusk 8.6
path 8.5
plaything 8.5
summer 8.4
dark 8.4
city 8.3
tourism 8.3
calm 8.2
sunset 8.1
sun 8.1
stone 8
autumn 7.9
sea 7.8
ice 7.5
ocean 7.5
environment 7.4
countryside 7.3
new 7.3
fall 7.2
beach 7.2
home 7.2
day 7.1
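
Many of the Imagga tags (snow, bench, winter) score low and appear to be misreadings of the black-and-white scene. A minimal sketch of the kind of request that produces such tags, using Imagga's /v2/tags endpoint; the credentials and image URL are placeholders:

```python
# Sketch: fetching Imagga tags comparable to the list above.
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"         # assumption
API_SECRET = "YOUR_IMAGGA_API_SECRET"   # assumption
IMAGE_URL = "https://example.org/bogota_wedding.jpg"  # hypothetical image URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # HTTP basic auth with key and secret
)
resp.raise_for_status()

# Each entry has a confidence score (0-100) and a tag name per language.
for entry in resp.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')
```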

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

tree 98.3
clothing 95.1
text 91.3
person 89.9
footwear 65.9
black and white 65.2
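
The Microsoft tags follow the shape of Azure Computer Vision's image-analysis output. A sketch using the REST interface; the endpoint, key, API version, and image URL are placeholders:

```python
# Sketch: requesting Azure Computer Vision tags comparable to the list above.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"   # assumption
KEY = "YOUR_AZURE_KEY"                                            # assumption
IMAGE_URL = "https://example.org/bogota_wedding.jpg"              # hypothetical image URL

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",     # API version is an assumption
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
resp.raise_for_status()

# Tags carry a confidence in [0, 1]; scaled by 100 they line up with
# "tree 98.3", "clothing 95.1", ...
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```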

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 21-33
Gender Female, 50.2%
Fear 49.5%
Sad 49.7%
Confused 49.5%
Angry 49.5%
Happy 49.5%
Calm 50.1%
Surprised 49.6%
Disgusted 49.5%

AWS Rekognition

Age 6-16
Gender Male, 50.3%
Happy 49.5%
Disgusted 49.5%
Fear 49.5%
Calm 49.6%
Angry 49.5%
Confused 49.5%
Sad 50.3%
Surprised 49.5%

AWS Rekognition

Age 12-22
Gender Male, 53.9%
Confused 45.9%
Fear 45.4%
Angry 45.7%
Sad 47.1%
Disgusted 46.9%
Surprised 45.5%
Happy 47.3%
Calm 46.1%
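
Each face record above (age range, gender, and eight emotion scores) matches the structure of Amazon Rekognition's DetectFaces response when full attributes are requested. A minimal sketch, with the image file name as a placeholder:

```python
# Sketch: reproducing the per-face age, gender, and emotion readout above.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

with open("bill_dane_bogota_wedding_1978.jpg", "rb") as f:  # hypothetical local scan
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # required to get AgeRange, Gender, and Emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```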

Feature analysis

Amazon

Person 99.6%
Shoe 82.4%
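
The Feature analysis entries correspond to labels for which Rekognition also reports per-instance bounding boxes. A short sketch of extracting those instances from a DetectLabels response; the file name and region are placeholders:

```python
# Sketch: listing labels that include located instances (e.g. Person, Shoe).
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

with open("bill_dane_bogota_wedding_1978.jpg", "rb") as f:  # hypothetical local scan
    response = rekognition.detect_labels(Image={"Bytes": f.read()})

# Only some labels carry per-instance boxes; those are the "features" above.
for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]  # Left/Top/Width/Height as fractions of the image
        print(f'{label["Name"]} {instance["Confidence"]:.1f}% at {box}')
```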