Human Generated Data

Title

Untitled (Passenger #12)

Date

1995

People

Artist: John Schabel, American, born 1957

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Schneider/Erdman Printer's Proof Collection, partial gift, and partial purchase through the Margaret Fisher Fund, 2011.427

Copyright

© John Schabel

Machine Generated Data

Tags

Amazon
created on 2019-04-08

Human 93.3
Person 93.3
Vehicle 84.4
Train 84.4
Transportation 84.4
Window 81
Porthole 66.6
Mirror 63.9
Silhouette 62.3
Terminal 57.2
Train Station 57.2
Electronics 56.7
Screen 55.5

Clarifai
created on 2018-03-23

people 99.6
vehicle 98.8
adult 96.3
one 96.1
technology 95.5
monochrome 92.9
wear 92.9
medicine 92.3
man 90.8
access 90.1
ailment 89.7
safety 89.3
computer 89.3
desktop 88.7
transportation system 88.6
seat 87.9
symbol 87
safe 85.9
anatomy 85.8
security 84.9

Imagga
created on 2018-03-23

device 38
fastener 34.5
keyboard 30.5
computer 29.7
push button 29.2
key 29.1
button 26.3
restraint 26.1
buckle 24.6
hole 23.7
technology 21.5
business 20.6
black 20.4
sign 18.8
type 18.2
metal 17.7
number 16.8
close 16.5
data 15.5
loudspeaker 15.4
icon 15
closeup 14.8
keypad 14.6
word 14.1
letter 13.8
vintage 13.2
text 13.1
old 12.5
texture 12.5
equipment 12.5
electronics 12.3
office 12
control 11.9
keys 11.7
symbols 11.5
symbol 11.4
leather 11.4
electronic 11.2
against 11
mechanism 10.9
nobody 10.9
information 10.6
silver 10.6
electrical device 10.4
container 10.3
object 10.3
open 9.9
cruise control 9.9
input 9.8
brown 9.6
accessory 9.5
case 9.4
screw 9.4
lock 9.3
metallic 9.2
security 9.2
web 8.5
telephone jack 8.5
design 8.4
sound 8.4
communication 8.4
safety 8.3
connection 8.2
retro 8.2
music 8.1
wallet 8
material 8
work 7.8
plug 7.8
letters 7.7
modern 7.7
closed 7.7
navigation 7.7
money 7.7
jack 7.6
textile 7.6
pattern 7.5
notebook 7.5
laptop 7.3
detail 7.2
clothing 7.2

Google
created on 2018-03-23

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 51%
Sad 16.1%
Surprised 1%
Calm 77.4%
Happy 1.6%
Angry 1.8%
Disgusted 0.7%
Confused 1.5%

AWS Rekognition

Age 26-43
Gender Female, 57.2%
Calm 1.7%
Disgusted 1%
Angry 5.5%
Surprised 1.8%
Confused 3%
Sad 84.5%
Happy 2.6%

Feature analysis

Amazon

Person 93.3%

Captions

Microsoft

a close up of a car 43.5%
a black car 28.1%
a close up of a black suitcase 28%