Human Generated Data

Title

Untitled (School's in)

Date

1976

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5109

Copyright

© Bill Dane

Machine Generated Data

Tags (scores are model confidence, in percent)

Amazon
created on 2019-11-15

Clothing 98.5
Apparel 98.5
Human 97.9
Person 97.9
Person 97.6
Person 97.1
Sport 77.5
Skateboard 77.5
Sports 77.5
Person 77
Person 74.8
Back 73.9
Person 71.8
Hat 70.7
Roof 66
Person 62.1
Skateboard 61.1
Hair 59.1
Person 56.3
Sitting 56
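
Tags like these come from AWS Rekognition's DetectLabels API. A minimal sketch using boto3; the file name, MaxLabels, and MinConfidence values below are placeholders, not the settings used for this record:

import boto3

client = boto3.client("rekognition")

with open("untitled_schools_in.jpg", "rb") as f:  # hypothetical local file
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # cap on returned labels (assumed)
    MinConfidence=50.0,  # drop low-confidence labels (assumed)
)

# Each label carries a name and a 0-100 confidence score, matching the
# label/score pairs above (repeated names such as "Person" likely reflect
# per-instance detections).
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')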

Clarifai
created on 2019-11-15

people 99.8
group together 98.3
group 98.1
adult 98
man 96.5
woman 94.8
vehicle 94.6
war 93.7
street 93.1
monochrome 92.4
child 91.8
many 90
one 89.5
administration 89.3
military 88.1
music 86.5
two 85.4
transportation system 84.2
soldier 82.7
police 82.6
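
The Clarifai concepts above can be reproduced with Clarifai's v2 predict endpoint. A rough sketch; the API key and image URL are placeholders, and the public "general" model ID is an assumption that may differ by account and API version:

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder
MODEL_ID = "general-image-recognition"  # assumed public model ID

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/image.jpg"}}}]},
)
resp.raise_for_status()

# Concepts come back with a 0-1 value; multiplying by 100 gives the
# percentages shown above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')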

Imagga
created on 2019-11-15

gun 21.1
building 19.1
cannon 18.8
person 17.3
city 15.8
people 15.6
weapon 14
office 13.7
architecture 13.4
business 13.4
man 12.8
percussion instrument 12.2
protection 11.8
world 11.5
industrial 10.9
male 10.6
travel 10.6
machine 10.4
equipment 10.1
outdoor 9.9
musical instrument 9.8
old 9.7
structure 9.7
military 9.7
war 9.5
grass 9.5
industry 9.4
device 9
sky 8.9
working 8.8
marimba 8.6
men 8.6
adult 8.5
portrait 8.4
danger 8.2
landscape 8.2
urban 7.9
work 7.8
soldier 7.8
destruction 7.8
black 7.8
nuclear 7.8
outside 7.7
tree 7.7
power 7.5
field 7.5
laptop 7.4
safety 7.4
dirty 7.2
television camera 7.2
suit 7.2
shadow 7.2
history 7.2
mask 7.1
job 7.1
summer 7.1
day 7.1
rural 7
modern 7
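
Imagga's tag list maps to its /v2/tags endpoint, which uses HTTP Basic auth. A minimal sketch with placeholder credentials and image URL:

import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/image.jpg"},  # placeholder URL
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),           # placeholder credentials
)
resp.raise_for_status()

# Each entry pairs an English tag with a 0-100 confidence score.
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')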

Microsoft
created on 2019-11-15

sky 99.3
outdoor 96
person 92.3
clothing 90.2
text 90.2
black and white 90.1
man 79
street 66.7
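
The Microsoft tags correspond to the Azure Computer Vision "analyze" operation. A sketch with a placeholder endpoint region and key; the v2.0 API path matches the 2019 creation date, but the exact version used here is an assumption:

import requests

ENDPOINT = "https://westus.api.cognitive.microsoft.com"  # placeholder region
KEY = "YOUR_AZURE_CV_KEY"                                # placeholder key

resp = requests.post(
    f"{ENDPOINT}/vision/v2.0/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/image.jpg"},  # placeholder image URL
)
resp.raise_for_status()

# Tag confidences are returned on a 0-1 scale.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')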

Face analysis (age is an estimated range in years; gender and emotion values are model confidence, in percent)

Amazon

AWS Rekognition

Age 50-68
Gender Female, 50.4%
Calm 54.8%
Fear 45%
Confused 45%
Happy 45%
Angry 45%
Disgusted 45%
Sad 45.1%
Surprised 45%

AWS Rekognition

Age 34-50
Gender Male, 50.4%
Happy 49.5%
Calm 49.8%
Fear 49.9%
Confused 49.5%
Angry 49.5%
Surprised 49.8%
Disgusted 49.5%
Sad 49.5%

AWS Rekognition

Age 12-22
Gender Male, 50.3%
Calm 49.9%
Angry 49.6%
Fear 49.7%
Happy 49.5%
Sad 49.7%
Confused 49.5%
Disgusted 49.5%
Surprised 49.5%

AWS Rekognition

Age 45-63
Gender Male, 50.4%
Disgusted 49.5%
Confused 49.5%
Angry 49.5%
Happy 49.5%
Sad 50.1%
Calm 49.8%
Surprised 49.5%
Fear 49.5%

AWS Rekognition

Age 23-35
Gender Male, 51.2%
Angry 46.2%
Calm 50.2%
Confused 45.2%
Disgusted 45.1%
Surprised 46.3%
Happy 46%
Sad 45.4%
Fear 45.5%

AWS Rekognition

Age 19-31
Gender Female, 51%
Calm 45%
Angry 45%
Surprised 45%
Confused 45%
Disgusted 45%
Happy 45%
Sad 45%
Fear 55%

AWS Rekognition

Age 49-67
Gender Female, 50.2%
Fear 49.5%
Sad 50.4%
Happy 49.5%
Disgusted 49.5%
Calm 49.5%
Confused 49.5%
Angry 49.5%
Surprised 49.5%
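
Each block above is one face returned by Rekognition's DetectFaces API; the age range, gender, and per-emotion confidences all come from a single call with Attributes=["ALL"]. A minimal sketch (the file name is hypothetical):

import boto3

client = boto3.client("rekognition")

with open("untitled_schools_in.jpg", "rb") as f:  # hypothetical local file
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # needed for age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        # Emotion types arrive uppercase, e.g. "CALM".
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')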

Feature analysis

Amazon

Person 97.9%
Skateboard 77.5%

Text analysis

Amazon

REST
REST ROOM
ROOM
ROOMS
REST ROOMS
WOMEN
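
Rekognition's DetectText API returns both LINE and WORD detections, which is why "REST ROOM" and its individual words each appear above. A minimal sketch (file name hypothetical):

import boto3

client = boto3.client("rekognition")

with open("untitled_schools_in.jpg", "rb") as f:  # hypothetical local file
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# Type is "LINE" or "WORD"; lines and their component words are listed
# separately, producing overlapping entries like those above.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])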

Google

REST ROOM wOM EN ROAMS REST
REST
ROOM
wOM
EN
ROAMS
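
Google Cloud Vision's text detection returns one annotation for the full detected block followed by one per token, matching the pattern above (a concatenated line, then single words such as "wOM" and "EN"). A minimal sketch, assuming credentials are configured in the environment and a hypothetical local file:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("untitled_schools_in.jpg", "rb") as f:  # hypothetical local file
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full text; the rest are individual tokens.
for annotation in response.text_annotations:
    print(annotation.description)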