Human Generated Data

Title

Untitled (passengers exiting National Airlines plane, Miami International Airport)

Date

1951, printed later

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12229

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.3
Human 99.3
Person 99
Person 98.6
Person 98.2
Person 97.7
Person 96.5
Transportation 84.4
Helicopter 84.4
Aircraft 84.4
Vehicle 84.4
Machine 84.1
Person 82.6
Train 67.8
Airplane 66.5
Person 65.3
People 62.1
Tarmac 61.7
Asphalt 61.7
Tire 59.6
Flooring 59.1
Airport 58.6
Airfield 58.6
Motorcycle 58.5
Building 57.8
Banister 55.3
Handrail 55.3

Clarifai
created on 2019-11-16

people 98.5
vehicle 98.4
transportation system 95.6
military 93.5
war 93.2
group together 92.8
adult 90
man 89.7
aircraft 89
airplane 87.1
train 86.9
street 86.2
group 84
weapon 83.7
skirmish 82.7
gun 82.4
soldier 78.8
monochrome 76.4
two 75
many 73.5

Imagga
created on 2019-11-16

vehicle 44.5
cannon 33.1
artillery 30.3
high-angle gun 27.5
military vehicle 26.6
armament 24.2
tank 21.9
tracked vehicle 20.4
machine 19.5
gun 18.2
conveyance 18.1
military 16.4
weaponry 16.4
transportation 16.1
war 15.4
old 15.3
weapon 15
field artillery 14.5
half track 14.1
wheeled vehicle 13.7
equipment 13.2
sky 12.7
soldier 12.7
power 12.6
device 12
industry 11.9
industrial 11.8
car 11.6
protection 10.9
danger 10.9
travel 10.6
metal 10.5
camouflage 9.9
building 9.9
transport 9.1
armored vehicle 9.1
machinery 8.9
technology 8.9
steel 8.8
destruction 8.8
army 8.8
man 8.7
male 8.5
city 8.3
environment 8.2
road 8.1
light 8
architecture 7.9
work 7.8
black 7.8
train 7.7
truck 7.7
smoke 7.4
rifle 7.3
uniform 7.1
indoors 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

sky 98.6
text 95
ship 87.4
white 77.3
vehicle 75.3
black and white 56.7

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 32-48
Gender Female, 51.7%
Sad 53.3%
Fear 45.3%
Angry 46.1%
Surprised 45%
Calm 45.2%
Disgusted 45.1%
Confused 45.1%
Happy 45%

AWS Rekognition

Age 42-60
Gender Male, 54.7%
Angry 51.6%
Sad 46.8%
Calm 45.6%
Fear 45.3%
Happy 45%
Surprised 45.1%
Disgusted 45.3%
Confused 45.3%

AWS Rekognition

Age 36-52
Gender Male, 54.7%
Confused 47.2%
Surprised 45.1%
Disgusted 45.1%
Happy 45.1%
Calm 52%
Sad 45.2%
Fear 45%
Angry 45.5%

AWS Rekognition

Age 23-35
Gender Male, 54.8%
Surprised 45.1%
Sad 46.2%
Calm 49%
Happy 45%
Confused 45.6%
Disgusted 46.1%
Fear 45.2%
Angry 47.7%

AWS Rekognition

Age 36-52
Gender Male, 50.3%
Sad 49.6%
Disgusted 49.5%
Happy 49.5%
Confused 49.6%
Fear 49.9%
Angry 49.9%
Calm 49.5%
Surprised 49.5%

AWS Rekognition

Age 22-34
Gender Male, 50.4%
Fear 49.5%
Confused 49.5%
Happy 49.5%
Angry 49.5%
Disgusted 49.5%
Calm 50.3%
Surprised 49.5%
Sad 49.6%

AWS Rekognition

Age 36-52
Gender Male, 50.4%
Sad 49.5%
Fear 49.5%
Disgusted 49.5%
Surprised 49.5%
Angry 49.5%
Calm 50.2%
Happy 49.7%
Confused 49.5%

Feature analysis

Amazon

Person 99.3%
Helicopter 84.4%
Train 67.8%
Airplane 66.5%
Motorcycle 58.5%

Categories

Text analysis

Amazon

46
TIONA
ARI
CAD

Google

AION N
AION
N