Human Generated Data

Title

Untitled (Pismo Beach, Cal)

Date

1978

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Machine Generated Data

Tags

Amazon

Person 99.7
Human 99.7
Person 99.5
Person 99.4
Person 99.1
Apparel 94.2
Clothing 94.2
Person 93.8
Sitting 88.9
Automobile 78.8
Car 78.8
Vehicle 78.8
Transportation 78.8
Footwear 74.2
Shoe 74.2
Person 66.3
Outdoors 60.9
Sleeve 59.6
Long Sleeve 59.6
Shorts 58.8
Coat 58.7
Overcoat 58.7
People 58
Pedestrian 57.5
Patio 57.2

Clarifai

people 99.8
group together 98.2
group 98
street 97.9
adult 97.4
woman 97
man 96.9
monochrome 96.6
two 92.8
vehicle 92
transportation system 89.6
three 88.9
several 86
four 85.1
airport 84.4
child 83.9
many 80.5
five 79.7
wear 78.6
music 78.4

Imagga

trombone 33.3
man 30.9
brass 30.7
people 29.6
business 28.5
wind instrument 27.7
musical instrument 24.5
male 24.1
adult 24
person 20
work 19.6
businessman 18.5
office 18.3
corporate 18
men 15.4
group 15.3
building 15.1
job 14.1
city 14.1
meeting 14.1
modern 14
standing 13.9
women 13.4
professional 13.2
urban 13.1
worker 12.7
room 12.4
indoors 12.3
happy 11.9
laptop 11.8
suit 11.7
attractive 11.2
occupation 11
communication 10.9
businesswoman 10.9
team 10.7
working 10.6
chair 10.5
looking 10.4
smiling 10.1
engineer 10.1
clothing 10
smile 10
holding 9.9
pretty 9.8
portrait 9.7
black 9.6
education 9.5
walking 9.5
executive 9.4
happiness 9.4
teamwork 9.3
life 9.2
shop 9.2
travel 9.1
fashion 9
interior 8.8
equipment 8.6
walk 8.6
career 8.5
casual 8.5
student 8.4
manager 8.4
hand 8.4
shopping 8.3
cheerful 8.1
success 8
computer 8
couple 7.8
table 7.8
two 7.6
seller 7.6
businesspeople 7.6
school 7.6
store 7.6
bag 7.5
buy 7.5
clothes 7.5
one 7.5
teacher 7.4
inside 7.4
light 7.3
window 7.3
hall 7.3
indoor 7.3
industrial 7.3
board 7.2
lifestyle 7.2
employee 7.1
crutch 7.1
day 7.1

Microsoft

clothing 98.5
person 97.8
text 95.9
man 90
woman 88.6
black and white 82
footwear 78.2
people 63.7

Face analysis

Amazon

AWS Rekognition

Age 31-47
Gender Female, 54.3%
Calm 51.9%
Confused 45.1%
Fear 45.2%
Sad 46.9%
Angry 45.5%
Surprised 45.1%
Disgusted 45.2%
Happy 45%

AWS Rekognition

Age 33-49
Gender Male, 54.8%
Sad 45.1%
Calm 54.1%
Happy 45%
Surprised 45%
Disgusted 45.5%
Confused 45.1%
Angry 45.2%
Fear 45%

AWS Rekognition

Age 3-11
Gender Female, 51.2%
Calm 45.1%
Fear 51.5%
Disgusted 45.1%
Surprised 46.2%
Happy 45%
Angry 45.3%
Confused 45.1%
Sad 46.8%

AWS Rekognition

Age 10-20
Gender Male, 51.7%
Sad 52.4%
Surprised 45%
Confused 45.1%
Angry 45.8%
Calm 46.3%
Fear 45.1%
Happy 45.2%
Disgusted 45%

Feature analysis

Amazon

Person 99.7%
Car 78.8%
Shoe 74.2%

Captions

Microsoft

a group of people standing in front of a store 92.5%
a group of people standing in front of a shop 92.4%
a group of people standing in front of a building 92.2%

Text analysis

Amazon

TAFFY
EMADE CANDIES
WARER TAFFY
WARER
O EMADE CANDIES
O
TAVSIRSS

Google

TAFFY
HAWURGERS
CANDIES
WATER
EMADE
HAWURGERS EMADE CANDIES WATER TAFFY