Human Generated Data

Title

Untitled (Berkeley)

Date

1979

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5200

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Face 100
Human 100
Person 99.3
Person 99
Person 99
Person 98.4
Clothing 98.2
Apparel 98.2
Beard 97.9
Person 96.1
Transportation 95.7
Vehicle 95.7
Car 95.7
Automobile 95.7
Footwear 65.3
Shoe 65.3
Shoe 58.4
Hat 58
Coat 55.4
Machine 55.3
Spoke 55.3
Shoe 51.3
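
Label/confidence pairs in the form listed above can be produced with AWS Rekognition's DetectLabels API. The sketch below is a minimal illustration using boto3; the image file name, region, and MinConfidence threshold are assumptions for the example, not documentation of the museums' actual tagging pipeline.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local copy of the photograph
with open("untitled_berkeley_1979.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # the tag list above bottoms out near 51%
)

for label in response["Labels"]:
    # Prints e.g. "Person 99.3" -- the same tag/confidence format used above
    print(f'{label["Name"]} {label["Confidence"]:.1f}')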

Clarifai
created on 2019-11-15

people 99.7
street 98.4
man 97.4
adult 96.9
monochrome 96.8
vehicle 96.4
transportation system 95
portrait 94.7
car 94.2
group 94.1
woman 94.1
two 92.9
one 92.1
child 89.8
black and white 88.9
girl 88.8
couple 87.8
boy 87.2
group together 86.9
city 84.7

Imagga
created on 2019-11-15

people 29
adult 26.5
man 24.9
person 24.7
male 20.6
portrait 18.8
world 18.7
passenger 18.2
women 16.6
urban 16.6
men 16.3
city 15.8
one 15.7
face 13.5
black 13.3
fashion 12.8
outdoors 12.8
fun 12.7
two 12.7
business 12.1
sitting 12
happy 11.9
hair 11.9
love 11.8
human 11.2
street 11
road 10.8
life 10.8
clothing 10.7
attractive 10.5
couple 10.4
model 10.1
sport 10
outdoor 9.9
pretty 9.8
looking 9.6
smiling 9.4
lifestyle 9.4
youth 9.4
smile 9.3
car 9
stretcher 8.9
color 8.9
train 8.8
casual 8.5
vehicle 8.3
transport 8.2
girls 8.2
dirty 8.1
sexy 8
businessman 7.9
together 7.9
head 7.6
leisure 7.5
helmet 7.4
park 7.4
back 7.3
cheerful 7.3
lady 7.3
cute 7.2
child 7.2
litter 7.1
working 7.1
day 7.1
happiness 7
autumn 7

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

person 98.8
outdoor 98.6
clothing 98.2
text 97.9
street 96.7
vehicle 93.7
black and white 93.4
car 92
man 91
land vehicle 87.8
monochrome 85.8
footwear 84.6
people 80.1

Face analysis

AWS Rekognition

Age 39-57
Gender Male, 98.2%
Angry 87.6%
Sad 5.9%
Confused 0.5%
Disgusted 0.2%
Surprised 0.1%
Happy 0%
Fear 0.6%
Calm 5%
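
Age-range, gender, and emotion estimates like those above are the kind of output returned by AWS Rekognition's DetectFaces API when all facial attributes are requested. The sketch below, using boto3, is a minimal illustration; the file name and region are placeholders.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local copy of the photograph
with open("untitled_berkeley_1979.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        # Emotion types are reported in uppercase (e.g. ANGRY) with a confidence score
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')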

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
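
The likelihood ratings above (Surprise, Anger, Sorrow, Joy, Headwear, Blurred) match the fields returned by Google Cloud Vision face detection. A minimal sketch follows, assuming the google-cloud-vision client library (version 2.0 or later) and a hypothetical local image file.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical local copy of the photograph
with open("untitled_berkeley_1979.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each field is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY),
    # matching the ratings reported above
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)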

Feature analysis

Amazon

Person 99.3%
Car 95.7%
Shoe 65.3%