Human Generated Data

Title

Untitled (Cloverdale, Calif.)

Date

1982

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5234

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Human 99.8
Person 99.8
Person 99.6
Person 98.9
Person 98.2
Person 98
Person 97
Clothing 96.9
Apparel 96.9
Sitting 90.7
Person 89.7
Automobile 80.9
Transportation 80.9
Car 80.9
Vehicle 80.9
Person 80
Footwear 76.2
Shoe 76.2
Furniture 74.8
Chair 74.8
Undershirt 65.3
People 60.8
Musical Instrument 56.9
Musician 56.9
Monitor 56
Electronics 56
Screen 56
Display 56
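
The tag and confidence pairs above are the kind of output Amazon Rekognition's label detection returns. A minimal sketch of that call follows, assuming a local copy of the photograph; the file name, AWS region, and 55-point confidence floor are illustrative placeholders, not values taken from this record.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local file name for the photograph.
with open("cloverdale_1982.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # assumed threshold; the record does not state one
)

# Print name/confidence pairs in the same shape as the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')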

Clarifai
created on 2019-11-15

people 99.6
monochrome 98.9
adult 97.4
street 96.6
man 96.4
group together 96
woman 95.7
group 94.5
child 90.6
two 90.2
recreation 89.3
portrait 89.2
several 86
one 85
wear 84.4
three 83.7
boy 83.3
girl 82.4
family 81.8
black and white 81.4

Imagga
created on 2019-11-15

person 41.2
man 37.6
musical instrument 29.6
people 24.5
male 24.3
adult 22.4
patient 18.1
men 18
portrait 16.8
wheelchair 16.3
planner 15.7
accordion 14.9
case 14.8
wind instrument 14.6
sick person 14.1
music 13.8
guitar 13.7
outdoors 13.4
youth 12.8
black 12.6
stringed instrument 12.2
musician 12.1
keyboard instrument 11.9
chair 11.9
happy 11.9
lifestyle 11.6
old 11.1
helmet 10.7
outdoor 10.7
job 10.6
one 10.4
boy 10.4
day 10.2
sport 10.2
smiling 10.1
occupation 10.1
teenager 10
seat 9.9
uniform 9.8
backpack 9.8
device 9.7
snow 9.6
building 9.6
couple 9.6
cold 9.5
smile 9.3
city 9.1
holding 9.1
family 8.9
crutch 8.9
together 8.8
concert 8.7
urban 8.7
rock 8.7
player 8.7
play 8.6
winter 8.5
hand 8.4
clothing 8.3
banjo 8.3
equipment 8.1
cool 8
to 8
love 7.9
guitarist 7.9
stretcher 7.9
work 7.8
forest 7.8
active 7.7
attractive 7.7
musical 7.7
walk 7.6
professional 7.6
stand 7.6
instrument 7.6
leisure 7.5
park 7.4
worker 7.3
playing 7.3
danger 7.3
vehicle 7.2
recreation 7.2
art 7.2
grass 7.1
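
The Imagga tags above look like the response of Imagga's v2 tagging endpoint. A hedged sketch of such a request follows; the endpoint, query parameter, and response shape reflect Imagga's documented v2 API as best I can reconstruct it, and the image URL and credentials are placeholders, so treat the exact names as assumptions.

import requests

API_KEY = "imagga_api_key"        # placeholder credential
API_SECRET = "imagga_api_secret"  # placeholder credential
IMAGE_URL = "https://example.org/cloverdale_1982.jpg"  # hypothetical image URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # HTTP basic auth with key and secret
    timeout=30,
)
resp.raise_for_status()

# Each entry carries an English tag name and a confidence score.
for entry in resp.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')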

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

person 99.4
clothing 99
tree 98.3
text 96.6
outdoor 96.5
man 90.5
woman 84.7
human face 81.7
smile 76.1
black and white 75.6
posing 70.4
baby 65.6
people 62
footwear 55.2
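
The Microsoft tags above resemble the output of Azure Computer Vision's image tagging. A sketch using the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and image URL are placeholders, and the exact service version behind this record is unknown, so treat this as an approximation rather than the museum's pipeline.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://example.cognitiveservices.azure.com/"  # placeholder endpoint
KEY = "azure_computer_vision_key"                          # placeholder key
IMAGE_URL = "https://example.org/cloverdale_1982.jpg"      # hypothetical image URL

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# Tag the image and scale confidences to percentages to match the list above.
result = client.tag_image(IMAGE_URL)
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")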

Face analysis

AWS Rekognition

Age 10-20
Gender Male, 53.6%
Sad 46.6%
Angry 45.4%
Happy 51.4%
Calm 45.3%
Fear 45.7%
Disgusted 45.2%
Confused 45.2%
Surprised 45.2%

AWS Rekognition

Age 12-22
Gender Male, 98.2%
Angry 6%
Fear 10.3%
Confused 3.6%
Calm 16.9%
Sad 61.5%
Disgusted 0.4%
Surprised 1.2%
Happy 0.1%

AWS Rekognition

Age 43-61
Gender Male, 54.9%
Fear 45.4%
Calm 46%
Angry 45.2%
Happy 47.3%
Confused 45.3%
Disgusted 45%
Sad 45.1%
Surprised 50.7%

AWS Rekognition

Age 23-35
Gender Female, 54.7%
Fear 45.2%
Disgusted 45.4%
Surprised 45.1%
Sad 49.4%
Happy 45.2%
Confused 45.3%
Angry 45.4%
Calm 49%

AWS Rekognition

Age 3-9
Gender Male, 50.1%
Surprised 45.1%
Angry 45.4%
Sad 45.2%
Fear 45%
Disgusted 45.1%
Happy 45.1%
Calm 54.1%
Confused 45%

AWS Rekognition

Age 51-69
Gender Female, 54.1%
Happy 45.1%
Disgusted 45.1%
Confused 45.9%
Calm 52.2%
Fear 45.1%
Sad 46.4%
Angry 45.2%
Surprised 45.2%
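
Each AWS Rekognition block above (age range, gender, and per-emotion percentages) matches the per-face output of Rekognition face detection. A minimal sketch follows, assuming a local copy of the photograph; the file name and region are placeholders.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local file name for the photograph.
with open("cloverdale_1982.jpg", "rb") as f:
    image_bytes = f.read()

faces = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotions
)

for face in faces["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')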

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely
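
The Google Vision row reports likelihood buckets rather than percentages. A hedged sketch using the google-cloud-vision client follows; the file name is a placeholder, and each likelihood prints as an enum name such as VERY_UNLIKELY or VERY_LIKELY.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical local file name for the photograph.
with open("cloverdale_1982.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Print the same six attributes reported above for each detected face.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)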

Feature analysis

Amazon

Person 99.8%
Car 80.9%
Shoe 76.2%

Text analysis

Amazon

5
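
The single detected string, "5", is consistent with Amazon Rekognition text detection. A minimal sketch follows, again assuming a local copy of the photograph; the file name and region are placeholders.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local file name for the photograph.
with open("cloverdale_1982.jpg", "rb") as f:
    image_bytes = f.read()

detections = rekognition.detect_text(Image={"Bytes": image_bytes})

# LINE-level entries avoid repeating each word of a detected line.
for text in detections["TextDetections"]:
    if text["Type"] == "LINE":
        print(text["DetectedText"], f'{text["Confidence"]:.1f}%')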