Human Generated Data

Title

Untitled (inspector shining flashlight on airplane seat)

Date

1951, printed later

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12227

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Cushion 100
Sitting 97.5
Human 97.5
Person 95.2
Person 90.4
Transportation 80.8
Headrest 80.7
Vehicle 79.8
Person 74.6
Apparel 71.9
Clothing 71.9
Van 69.8
Automobile 59.6
Car 59.6
Caravan 56.7

Clarifai
created on 2019-11-16

monochrome 99.5
people 99.3
street 98.8
vehicle 98.4
transportation system 98.1
car 97.8
movie 97.5
train 97.4
man 96.9
seat 96.9
locomotive 95.4
airplane 94.7
vehicle window 94.6
indoors 93
subway system 92.8
adult 92.6
woman 92.4
black and white 91.3
chair 91.1
portrait 89.8

Imagga
created on 2019-11-16

passenger 83.4
car 61.9
seat 40.5
vehicle 39.4
driver 38.9
automobile 34.5
sitting 33.5
plane seat 31.9
person 31.2
man 30.3
transportation 28.7
adult 28.5
support 28.2
people 27.9
auto 27.8
driving 26.1
motor vehicle 24.5
smiling 23.2
business 23.1
happy 22.6
drive 20.8
male 19.9
device 19.8
smile 19.3
transport 19.2
attractive 18.2
businessman 16.8
seat belt 16.7
golf equipment 16.4
inside 15.7
professional 15.2
face 14.2
holding 14
pretty 14
portrait 13.6
work 13.4
wheel 13.2
cheerful 13
looking 12.8
travel 12.7
safety belt 12.6
happiness 12.6
sports equipment 12.3
women 11.9
businesswoman 11.8
working 11.5
office 11.4
businesspeople 11.4
new 11.3
equipment 11.2
steering 10.9
road 10.8
hand 10.6
job 10.6
outdoors 10.5
corporate 10.3
executive 10.1
lifestyle 10.1
20s 10.1
communication 10.1
worker 9.9
human 9.8
wheeled vehicle 9.7
one 9.7
indoors 9.7
restraint 9.5
men 9.5
laptop 9.4
occupation 9.2
window 9.2
suit 9.1
brunette 8.7
outside 8.6
traffic 8.6
modern 8.4
rest 8.3
computer 8.1
headrest 8.1
interior 8
hair 7.9
couple 7.8
two people 7.8
employee 7.8
luxury 7.7
talking 7.6
horizontal 7.5
smart 7.5
phone 7.4
guy 7.4
back 7.4
lady 7.3
black 7.2
handsome 7.1
model 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

black and white 94.8
clothing 92.5
person 90.8
vehicle 90.2
bus 86.9
land vehicle 79.8
man 58.3
train 55.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 27-43
Gender Male, 54.8%
Happy 45%
Angry 45.2%
Disgusted 45%
Surprised 45%
Calm 53.5%
Sad 46.2%
Fear 45%
Confused 45.1%

AWS Rekognition

Age 23-35
Gender Male, 54.6%
Calm 48.8%
Disgusted 45.7%
Surprised 46.7%
Sad 45.8%
Fear 46.1%
Happy 45.1%
Angry 46.3%
Confused 45.4%
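
The emotion scores above follow the schema of AWS Rekognition's DetectFaces response, where each detected face carries an `Emotions` list of `{"Type", "Confidence"}` entries. As a minimal sketch (using the values from the first face record above, not a live API call), a catalog pipeline might pick the dominant emotion like this:

```python
# Emotion entries copied from the first AWS Rekognition face record above,
# in Rekognition's FaceDetail "Emotions" format.
face_emotions = [
    {"Type": "HAPPY", "Confidence": 45.0},
    {"Type": "ANGRY", "Confidence": 45.2},
    {"Type": "DISGUSTED", "Confidence": 45.0},
    {"Type": "SURPRISED", "Confidence": 45.0},
    {"Type": "CALM", "Confidence": 53.5},
    {"Type": "SAD", "Confidence": 46.2},
    {"Type": "FEAR", "Confidence": 45.0},
    {"Type": "CONFUSED", "Confidence": 45.1},
]

# The dominant emotion is simply the entry with the highest confidence.
dominant = max(face_emotions, key=lambda e: e["Confidence"])
print(dominant["Type"], dominant["Confidence"])  # CALM 53.5
```

This reflects how the "Calm 53.5%" reading stands out in the listing: all eight emotion confidences are reported per face, and a consumer keeps the maximum.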

Feature analysis

Amazon

Person 95.2%

Categories

Imagga

interior objects 98%
food drinks 1.3%

Captions

Microsoft
created on 2019-11-16

a man sitting on a bus 33.8%
a man sitting at a train station 33.7%
a man sitting on a train 33.6%

Text analysis

Amazon

39