Human Generated Data

Title

Untitled (customs officers searching airplane compartments, Miami International Airport)

Date

1951, printed later

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Machine Generated Data

Tags

Amazon

Person 99.8
Human 99.8
Person 99.4
Military 95.4
Officer 95.4
Military Uniform 95.4
Sailor Suit 78.8
Captain 71.2
Window 71
Helmet 70.5
Clothing 70.5
Apparel 70.5
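
The Amazon labels above are the kind of output returned by AWS Rekognition's DetectLabels operation. A minimal sketch with boto3, assuming AWS credentials are configured and that a digitized copy of the print is available locally (the filename is a placeholder):

```python
import boto3

rekognition = boto3.client("rekognition")  # region and credentials come from the AWS config

# Placeholder filename for a scan of the photograph
with open("steinmetz_miami_1951.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with confidence scores on a 0-100 scale
response = rekognition.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=70)
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```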

Clarifai

people 99.9
vehicle 99.3
one 99.2
adult 99.1
train 99
transportation system 98.9
two 98.5
woman 97.5
wear 97.4
man 97.2
three 95.9
railway 93.1
aircraft 92.5
outfit 90.1
administration 89.6
indoors 89.5
portrait 89.4
military 87.9
group together 87.2
elderly 87.2
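
The Clarifai concepts could be reproduced with a request like the sketch below against Clarifai's v2 REST API; the personal access token is a placeholder, and the general-model ID and image URL are assumptions for illustration only.

```python
import requests

CLARIFAI_PAT = "YOUR_PAT"               # placeholder personal access token
MODEL_ID = "general-image-recognition"  # assumed ID of Clarifai's general model

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_PAT}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/steinmetz_miami_1951.jpg"}}}]},
)
# Each concept carries a name and a 0-1 score; scale to match the percentages above
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))
```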

Imagga

passenger 50.1
car 33.7
vehicle 30.4
transportation 26.9
man 24.9
driver 23.3
machine 22.7
adult 22
person 21.2
automobile 21.1
people 20.6
male 19.9
truck 19.4
transport 19.2
device 18
inside 14.7
business 14.6
van 14.3
portrait 14.2
shop 14.2
work 14.1
barbershop 13.9
smile 13.5
pump 13.1
sitting 12.9
men 12.9
equipment 12.7
happy 12.5
working 12.4
road 11.7
motor vehicle 11.6
driving 11.6
gas pump 11.6
moving van 11.4
drive 11.3
jukebox 10.7
worker 10.7
travel 10.6
indoors 10.5
auto 10.5
new 10.5
one 10.4
industry 10.2
attractive 9.8
job 9.7
interior 9.7
smiling 9.4
lifestyle 9.4
service 9.3
safety 9.2
occupation 9.2
mercantile establishment 9
black 9
outdoors 9
color 8.9
emergency 8.7
record player 8.6
professional 8.4
horizontal 8.4
city 8.3
street 8.3
looking 8
to 8
bus 7.8
modern 7.7
health 7.6
wheel 7.5
mature 7.4
indoor 7.3
metal 7.2
women 7.1
office 7.1
ambulance 7.1
train 7
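
The Imagga tags correspond to its v2 tagging endpoint, which accepts a public image URL and HTTP Basic authentication; the credentials and URL below are placeholders.

```python
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/steinmetz_miami_1951.jpg"},  # placeholder URL
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),                          # placeholder credentials
)
# Tags come back with English names and 0-100 confidence values
for item in resp.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))
```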

Google

Microsoft

person 99.7
man 98.8
clothing 96.8
outdoor 94.2
black and white 88.9
train 77.1
vehicle 68.2
bus 59.5

Face analysis

Amazon

AWS Rekognition

Age 30-46
Gender Male, 98.3%
Angry 60.1%
Calm 23.8%
Surprised 1.3%
Confused 0.9%
Disgusted 1.3%
Happy 0.6%
Sad 8.1%
Fear 3.9%

AWS Rekognition

Age 34-50
Gender Male, 98.4%
Surprised 0.8%
Calm 91.2%
Fear 0.2%
Disgusted 0.9%
Happy 0.4%
Angry 1.7%
Sad 3.5%
Confused 1.3%
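
The two face records above follow the shape of AWS Rekognition's DetectFaces response when all attributes are requested. A minimal sketch, again assuming boto3 and a placeholder filename:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_miami_1951.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

# Attributes=["ALL"] adds age range, gender, and emotion estimates to each detected face
response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```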

Feature analysis

Amazon

Person 99.8%
Helmet 70.5%

Captions

Microsoft

a man sitting on a bench 92.7%
a man sitting on a bus 69.8%
a man sitting on a bench looking at the camera 69.7%
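
The Microsoft captions match the output of Azure Computer Vision's Describe Image operation; the resource endpoint, key, and image URL below are placeholders, and the v3.2 API path is an assumption.

```python
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder resource endpoint
KEY = "YOUR_KEY"                                                # placeholder subscription key

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",  # assumed API version
    params={"maxCandidates": 3},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": "https://example.com/steinmetz_miami_1951.jpg"},  # placeholder URL
)
# Captions carry a 0-1 confidence; scale to match the percentages above
for caption in resp.json()["description"]["captions"]:
    print(caption["text"], round(caption["confidence"] * 100, 1))
```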

Text analysis

Amazon

XONOTOPEN
XONOTOPEN DOOR
DOOR
T
&
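
The Amazon text fragments are raw output of AWS Rekognition's DetectText operation, which reports both full lines and individual words. A minimal boto3 sketch (placeholder filename):

```python
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_miami_1951.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})
for detection in response["TextDetections"]:
    # Type is either "LINE" or "WORD"; both appear in raw dumps like the one above
    print(detection["Type"], detection["DetectedText"], f"{detection['Confidence']:.1f}")
```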

Google

3O NOTOPEN DOOR WHEN LIGHTS ON
3O
WHEN
LIGHTS
NOTOPEN
ON
DOOR
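
The Google results resemble the text_annotations returned by the Cloud Vision API, where the first annotation is the full detected string and the remaining entries are individual words. A minimal sketch using the google-cloud-vision client, assuming credentials are configured via GOOGLE_APPLICATION_CREDENTIALS and using a placeholder filename:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_miami_1951.jpg", "rb") as f:  # placeholder filename
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
# text_annotations[0] holds the full string; the rest are per-word detections
for annotation in response.text_annotations:
    print(annotation.description)
```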