Human Generated Data

Title

Untitled (S.F. area "Great America")

Date

1979

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5176

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Handrail 99.9
Banister 99.9
Human 99.9
Person 99.9
Person 99.7
Railing 99.6
Person 72.9
Porch 71.1
Machine 61.9
Spoke 61.9
Building 56.5

Clarifai
created on 2019-11-15

people 99.6
man 98.4
adult 97.5
street 95.4
woman 95.3
group together 94
two 93.3
one 92.7
monochrome 92.5
bench 90.8
park 90.4
child 90.1
recreation 90
group 88.2
boy 86.7
fence 79.3
music 79.3
portrait 77.5
summer 77.2
outdoors 76.7

Imagga
created on 2019-11-15

chair 46.9
treadmill 41.5
exercise device 40.2
device 35.5
seat 27.9
rocking chair 25.3
people 21.7
furniture 21.3
man 20.8
lifestyle 15.9
machine 15.9
sitting 14.6
male 14.2
indoors 14.1
business 14
home 13.6
travel 13.4
work 13.3
person 13.3
office 13.2
modern 12.6
table 12.5
interior 12.4
exercise bike 10.7
job 10.6
adult 10.5
equipment 10.3
women 10.3
laptop 10.2
patient 10.1
house 10
health 9.7
building 9.7
day 9.4
smiling 9.4
outdoors 9.4
communication 9.2
transportation 9
technology 8.9
computer 8.9
happy 8.8
full length 8.7
couple 8.7
gym 8.6
portrait 8.4
exercise 8.2
furnishing 8
water 8
worker 8
structure 8
body 8
working 8
wheelchair 8
architecture 7.8
window 7.5
holding 7.4
vacation 7.4
inside 7.4
security 7.3
cheerful 7.3
indoor 7.3
recreation 7.2
businessman 7.1
happiness 7
room 7

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

skating 98.3
footwear 93.9
playground 92.6
outdoor 89.7
person 89.7
black and white 86.2
tree 84.7
text 83.8
clothing 82.7
park 67.8
trick 64.5
doing 61.6
street 61.2
concrete 25.2

Face analysis

Amazon

AWS Rekognition

Age 47-65
Gender Male, 52.2%
Fear 47.6%
Sad 51.9%
Disgusted 45%
Surprised 45.1%
Calm 45.2%
Happy 45%
Angry 45.1%
Confused 45.1%

Feature analysis

Amazon

Person 99.9%

Text analysis

Amazon

OEY
84B

Google

BIB MAR OMAY
BIB
MAR
OMAY