Human Generated Data

Title

Untitled ("Great America")

Date

1979

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Machine Generated Data

Tags

Amazon

Human 99.8
Person 99.8
Person 99.8
Pedestrian 97.5
Furniture 94.6
Bench 94.6
Outdoors 88.5
Plant 88
Tree 88
Home Decor 78.9
Asphalt 78.1
Tarmac 78.1
Bench 77.6
Bench 73.6
Urban 73.1
Shoe 71.9
Clothing 71.9
Apparel 71.9
Footwear 71.9
Abies 71.3
Fir 71.3
Road 70.9
Building 70.3
Architecture 68.7
Town 65.7
City 65.7
Window 65.5
Handrail 62.3
Banister 62.3
Nature 58.6
Ice 56.3
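
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's label detection. A minimal sketch of how such a list could be produced with boto3 (the image filename and the MinConfidence threshold are illustrative assumptions, not values documented for this record):

import boto3

rekognition = boto3.client("rekognition")

# Placeholder path; the photograph itself is not distributed with this record.
with open("untitled_great_america_1979.jpg", "rb") as f:
    image_bytes = f.read()

# detect_labels returns label names with confidence scores on a 0-100 scale,
# matching the "Name score" pairs listed above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # illustrative cutoff, not necessarily the one used here
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")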

Clarifai

people 99.7
street 97.2
airport 96.1
adult 96
man 95.8
group 94.1
business 93.8
group together 93.8
silhouette 93.3
monochrome 92
woman 91.6
city 90.6
one 90
vehicle 89.1
transportation system 87.2
building 87
two 86.8
shadow 86.6
music 85.5
luggage 85.2

Imagga

chairlift 100
ski tow 88.1
conveyance 69.1
cable 39.9
sky 28.7
trees 22.2
landscape 21.6
wire 21.2
electricity 18.9
industry 17.9
silhouette 17.4
power 16.8
snow 16.4
industrial 16.3
electric 15.9
city 15.8
urban 15.7
winter 15.3
high 14.7
tower 14.3
park 14
forest 13.9
voltage 13.7
metal 13.7
steel 13.6
energy 13.5
outdoor 13
line 13
cold 12.9
station 12.6
environment 12.3
tree 11.8
scenery 11.7
outdoors 11.3
clouds 11
lines 10.8
electrical 10.5
structure 10.5
swing 10
building 10
sunset 9.9
cables 9.8
business 9.7
day 9.4
sun 9.4
sport 9.2
road 9
distribution 8.8
scenic 8.8
scene 8.7
construction 8.6
travel 8.5
wood 8.3
new 8.1
recreation 8.1
transportation 8.1
river 8
grass 7.9
wires 7.9
supply 7.7
old 7.7
equipment 7.5
technology 7.4
town 7.4
mountains 7.4
man 7.4
light 7.4
transport 7.3
summer 7.1
season 7

Microsoft

tree 99.5
text 98.4
street 96.2
outdoor 96.1
black and white 94.5
monochrome 89.9
person 88
plant 64.4
playground 59.1
bench 56
footwear 51.9
sign 16.4

Face analysis

Amazon

AWS Rekognition

Age 40-58
Gender Male, 54.4%
Calm 50.4%
Fear 45.5%
Sad 45.6%
Happy 46%
Disgusted 45.3%
Surprised 45.6%
Confused 45.3%
Angry 46.3%
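
The age range, gender, and emotion scores above are the per-face attributes AWS Rekognition reports. A minimal sketch of the corresponding call, under the same assumptions as the label-detection example (placeholder filename; request parameters for this record are not documented):

import boto3

rekognition = boto3.client("rekognition")

with open("untitled_great_america_1979.jpg", "rb") as f:  # placeholder path
    image_bytes = f.read()

# Attributes=["ALL"] asks Rekognition to include age range, gender, and emotions.
response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")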

Feature analysis

Amazon

Person 99.8%
Bench 94.6%
Shoe 71.9%

Captions

Microsoft

a person standing in front of a window 81%
a person standing in front of a building 80.9%
a person holding a sign 73.2%