Human Generated Data

Title

Untitled (Arcadia, Calif.)

Date

1980

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5220

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Person 99.9
Human 99.9
Person 99.7
Person 99.6
Person 99.4
Person 98.3
Person 97.4
Apparel 92.8
Clothing 92.8
People 90.8
Person 90.6
Tree 87.6
Plant 87.6
Footwear 81.3
Shoe 81.3
Family 61.6
Conifer 58.5
Abies 57.3
Fir 57.3
Shorts 57.2
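
The label/confidence pairs above are typical output of an automated image-labeling service. Purely as an illustration (not the museum's actual pipeline), a minimal sketch of how similar tags could be produced with Amazon Rekognition's DetectLabels operation through boto3 follows; the file name untitled_arcadia_1980.jpg and the thresholds are hypothetical placeholders.

    import boto3

    # Sketch only: send a local image to Amazon Rekognition and print
    # label/confidence pairs like the list above. Assumes AWS credentials
    # are already configured in the environment.
    rekognition = boto3.client("rekognition")

    with open("untitled_arcadia_1980.jpg", "rb") as f:  # placeholder file name
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,       # roughly the number of tags shown above
        MinConfidence=55,   # hypothetical cutoff
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")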

Clarifai
created on 2019-11-15

people 99.9
child 99.2
group 97.8
group together 97.8
man 96.3
street 96.2
adult 94.9
boy 94.6
monochrome 93
woman 91.5
many 90.2
family 89.2
several 88.7
wear 86.3
administration 85
three 82.8
home 82.6
recreation 81.8
war 81.1
offspring 80.5

Imagga
created on 2019-11-15

park bench 51.6
bench 47.8
seat 31
kin 26.9
swing 24.5
man 20.8
furniture 19.6
people 19.5
plaything 19.3
mechanical device 19.2
outdoors 15.1
mechanism 14.3
male 14.3
love 14.2
park 14
portrait 13.6
adult 13.6
child 12.9
person 12.5
couple 12.2
outdoor 11.5
old 11.1
building 10.9
danger 10.9
history 10.7
architecture 10.2
furnishing 10.2
dark 10
protection 10
tree 10
sunset 9.9
autumn 9.7
resort area 9.6
sculpture 9.6
day 9.4
youth 9.4
winter 9.4
summer 9
sky 8.9
trees 8.9
happy 8.8
lifestyle 8.7
scene 8.7
snow 8.6
statue 8.6
travel 8.4
area 8.3
holding 8.3
dirty 8.1
religion 8.1
family 8
soldier 7.8
sepia 7.8
horror 7.8
serenity 7.8
mask 7.8
parent 7.8
dangerous 7.6
world 7.6
fun 7.5
city 7.5
silhouette 7.5
peaceful 7.3
black 7.2
face 7.1
together 7

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

tree 99.9
outdoor 98.9
clothing 98
person 97.2
ground 96.6
text 94.6
footwear 84.3
man 70.2
child 67.9
boy 67.2
people 66.1
group 61.3
woman 56
black and white 55
house 54.7
smile 51.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 31-47
Gender Male, 53.7%
Happy 45%
Angry 55%
Disgusted 45%
Confused 45%
Sad 45%
Calm 45%
Surprised 45%
Fear 45%

AWS Rekognition

Age 31-47
Gender Male, 53.2%
Calm 45.5%
Happy 45%
Angry 45.3%
Disgusted 45%
Surprised 45%
Fear 45.1%
Sad 54.1%
Confused 45.1%

AWS Rekognition

Age 13-25
Gender Female, 54.4%
Angry 46%
Disgusted 47.9%
Happy 45.1%
Calm 48.1%
Sad 45.8%
Surprised 45.3%
Fear 45.1%
Confused 46.6%

AWS Rekognition

Age 16-28
Gender Female, 54.9%
Fear 45.5%
Disgusted 45.1%
Surprised 45.1%
Sad 50.9%
Happy 45.9%
Confused 45.7%
Angry 45.1%
Calm 46.6%

AWS Rekognition

Age 20-32
Gender Male, 50.2%
Confused 49.5%
Surprised 49.5%
Fear 49.5%
Happy 49.5%
Disgusted 49.5%
Angry 49.5%
Sad 49.5%
Calm 50.5%

AWS Rekognition

Age 23-35
Gender Male, 50.3%
Surprised 49.8%
Confused 49.5%
Calm 50.1%
Happy 49.5%
Sad 49.5%
Angry 49.5%
Disgusted 49.5%
Fear 49.5%

AWS Rekognition

Age 11-21
Gender Female, 50.3%
Fear 49.6%
Disgusted 49.5%
Calm 49.5%
Happy 50%
Surprised 49.5%
Angry 49.5%
Sad 49.8%
Confused 49.5%
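
Each block above gives an estimated age range, a gender guess with its confidence, and a confidence score for each emotion, for one detected face. A minimal, hypothetical sketch of how such per-face values could be read from Amazon Rekognition's DetectFaces operation, under the same assumptions as the labeling example above:

    import boto3

    rekognition = boto3.client("rekognition")  # assumes configured AWS credentials

    with open("untitled_arcadia_1980.jpg", "rb") as f:  # placeholder file name
        image_bytes = f.read()

    # Attributes=["ALL"] requests age range, gender, emotions, etc. per face.
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")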

Feature analysis

Amazon

Person 99.9%
Shoe 81.3%

Categories

Text analysis

Amazon

BASEBALL
TEMPLE

Google

TEMPLE BASEBALL
TEMPLE
BASEBALL
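
The words above were read from lettering visible in the photograph itself. As a final hedged sketch under the same assumptions as the earlier examples, text like this could be extracted with Amazon Rekognition's DetectText operation:

    import boto3

    rekognition = boto3.client("rekognition")  # assumes configured AWS credentials

    with open("untitled_arcadia_1980.jpg", "rb") as f:  # placeholder file name
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # DetectText returns both LINE and WORD detections; print each word once.
    for detection in response["TextDetections"]:
        if detection["Type"] == "WORD":
            print(detection["DetectedText"])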