Human Generated Data

Title

Untitled (Berkeley)

Date

1983

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Machine Generated Data

Tags

Amazon

Leisure Activities 98
Dance Pose 98
Person 97.2
Human 97.2
Flooring 96.7
Indoors 95.8
Interior Design 95.8
Floor 94
Wood 90.8
Apparel 88
Clothing 88
Person 86.7
Hardwood 78.9
Stage 74.1
Costume 71.7
Face 66.5
Photo 62
Photography 62
Dance 61.4
Female 60.3
Room 58.7
Girl 57.5
Screen 56.4
Electronics 56.4

Clarifai

people 98.7
man 97.2
woman 96.7
girl 96.2
indoors 93.8
adult 93.4
music 93
dancing 91.3
fashion 91.1
portrait 90.4
wear 90.2
television 89.2
dancer 89.2
model 87.6
couple 87
young 86
public show 85.5
performance 85
stage 84.6
collection 83.4

Imagga

teacher 52.8
educator 43.4
professional 37.1
person 36.4
dancer 35.7
adult 34.9
people 30.1
performer 27.2
man 22.8
happy 19.4
entertainer 19.1
male 18.4
dress 18.1
women 17.4
portrait 16.8
attractive 15.4
couple 14.8
fashion 14.3
love 14.2
happiness 14.1
pretty 14
dance 12.7
two 12.7
lifestyle 12.3
together 12.3
art 12.2
standing 12.2
smile 12.1
culture 12
old 11.8
business 11.5
interior 11.5
men 11.2
style 11.1
elegance 10.1
human 9.7
lady 9.7
corporate 9.4
church 9.2
black 9.2
indoor 9.1
girls 9.1
posing 8.9
businessman 8.8
clothing 8.7
dancing 8.7
weapon 8.6
model 8.5
walking 8.5
life 8.4
holding 8.2
sword 8.2
costume 8.1
activity 8.1
group 8.1
smiling 8
holiday 7.9
work 7.8
bags 7.8
full length 7.8
party 7.7
luxury 7.7
faith 7.7
customer 7.6
bag 7.6
hand 7.6
shop 7.6
adults 7.6
groom 7.5
entertainment 7.4
shopping 7.3
teenager 7.3
children 7.3
new 7.3
success 7.2
sexy 7.2

Google

Snapshot 84.3
Fun 76.8
Photography 67.8
Room 65.7
Art 65.5
Visual arts 55
Dance 51.5
Performance 50.9

Microsoft

floor 99.5
dance 98.4
indoor 98
clothing 96.7
person 91.4
footwear 91.1
text 79.3
flat 36.3

Face analysis

AWS Rekognition

Age 5-15
Gender Female, 54.1%
Surprised 45%
Fear 45%
Disgusted 45%
Calm 45%
Angry 45%
Confused 45%
Happy 55%
Sad 45%

AWS Rekognition

Age 5-15
Gender Female, 54.9%
Disgusted 45%
Happy 45%
Confused 45.2%
Angry 45.5%
Fear 45.1%
Sad 50.9%
Calm 48.1%
Surprised 45%

Microsoft Cognitive Services

Age 21
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Likely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.2%

Captions

Microsoft

a person standing in front of a flat screen tv 74.5%
a person standing in front of a flat screen television 62.1%
a person standing in front of a television 62%