Human Generated Data

Title

Untitled (L. A.)

Date

1981

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Machine Generated Data

Tags

Amazon

Human 99.8
Person 99.8
Person 98.9
Leisure Activities 80.5
Musical Instrument 80.5
Piano 80.5
Wheel 74.4
Machine 74.4
Furniture 67.1
Wood 67
Plywood 67
Flooring 66.1
Table 60.9
Floor 59.6
Clothing 58.4
Apparel 58.4
Studio 56.9

Clarifai

indoors 99.2
people 99
room 98.7
man 94.8
furniture 94
adult 92.8
desk 89.7
two 88.7
woman 86.7
one 85.4
hospital 83.7
group 82.1
education 81.3
group together 80.8
exhibition 80
weapon 79.5
military 77.8
machine 77.3
gun 74.6
inside 74.2

Imagga

interior 50.4
room 45.8
circular saw 36.3
modern 34.4
home 31.9
table 31.6
furniture 29.8
power saw 29.2
house 28.4
indoors 27.2
kitchen 27
inside 25.8
chair 23.9
power tool 22.2
device 22.1
apartment 22
decor 21.2
machine 20
design 18.6
wood 18.3
window 18.3
indoor 18.3
equipment 18.2
light 17.4
stove 16.9
architecture 16.5
luxury 16.3
living 16.1
floor 15.8
lamp 15.7
wall 14.8
oven 14.7
domestic 14.5
man 14.1
glass 14
3d 13.9
people 13.9
residential 13.4
steel 13.3
cabinet 12.8
business 12.8
office 12.7
work 12.6
sofa 11.5
classroom 11.1
cook 11
instrument 10.9
sink 10.8
lighting 10.6
cooking 10.5
hospital 10.5
musical instrument 10.4
appliance 10.3
empty 10.3
decoration 10.1
lifestyle 10.1
refrigerator 10.1
counter 10
male 9.9
plotter 9.6
contemporary 9.4
clean 9.2
food 9.1
person 8.8
living room 8.8
stainless 8.7
comfortable 8.6
structure 8.6
estate 8.5
real 8.5
microwave 8.5
style 8.2
life 8.1
new 8.1
worker 8
tool 8
smiling 8
women 7.9
building 7.9
electronic instrument 7.9
urban 7.9
chairs 7.8
men 7.7
mirror 7.6
lights 7.4
center 7.3
color 7.2
wooden 7

Google

Photograph 96.4
Snapshot 87.7
Standing 83.6
Room 80.8
Photography 75.7
Furniture 61.5
Machine 59.5
Stock photography 59.4
Building 50.8

Microsoft

indoor 98.2
person 86.2
ceiling 77.9
black and white 75.4
clothing 71.4
furniture 67.8
table 61.3
text 59
man 50.5

Face analysis

Amazon

AWS Rekognition

Age 27-43
Gender Male, 54.8%
Surprised 45.3%
Calm 51.6%
Fear 45.3%
Disgusted 45.5%
Happy 45.6%
Angry 46%
Sad 45.6%
Confused 45.1%

AWS Rekognition

Age 45-63
Gender Male, 51.9%
Angry 45.2%
Disgusted 45%
Surprised 45.1%
Sad 52.8%
Confused 45.1%
Calm 46.3%
Fear 45.5%
Happy 45%

Feature analysis

Amazon

Person 99.8%
Piano 80.5%
Wheel 74.4%

Captions

Microsoft

a group of people standing in a kitchen 89.3%
a man standing in a kitchen 88.5%
a man that is standing in the kitchen 86.4%

Text analysis

Amazon

Eo
EA