Human Generated Data

Title

Untitled (Passenger #7)

Date

1995

People

Artist: John Schabel, American, born 1957

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Schneider/Erdman Printer's Proof Collection, partial gift, and partial purchase through the Margaret Fisher Fund, 2011.422

Copyright

© John Schabel

Machine Generated Data

Tags

Amazon
created on 2019-04-08

Human 97.2
Person 97.2
Face 83.1
Window 81.3
Porthole 70.3
Finger 67.3
Art 66.4
Painting 66.4
Photo 56.8
Photography 56.8
Portrait 56.8
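
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's label detection. A minimal Python sketch of how such a list could be reproduced with the boto3 client follows; the file name, region, and confidence threshold are illustrative assumptions, not details of the museum's actual pipeline.

```python
import boto3

# Hypothetical local copy of the image; not the museum's actual asset path.
IMAGE_PATH = "untitled_passenger_7.jpg"

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# DetectLabels returns objects and concepts with confidence scores on a 0-100 scale.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```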

Clarifai
created on 2018-03-23

people 99.6
one 99.2
adult 98.7
facial expression 96.1
wear 95.3
portrait 94.8
man 92.5
furniture 90.7
vehicle 89.9
art 87.7
ailment 86.1
painting 85
veil 84.9
leader 83.6
side view 82.9
woman 81.5
music 80.2
print 79.4
illustration 79.3
illness 78.1

Imagga
created on 2018-03-23

fastener 46.9
device 42.9
buckle 40.5
restraint 35.9
push button 28.8
computer 25.6
keyboard 23.6
button 23.3
business 21.9
sign 21.1
black 21
key 21
wallet 18.7
data 18.3
metal 17.7
object 17.6
close 16.5
equipment 16.3
technology 16.3
container 15.8
old 15.3
information 15.1
type 14.4
number 14
bag 13.9
closeup 13.5
icon 13.5
vintage 13.2
case 13.2
texture 13.2
office 12.8
money 12.8
keypad 12.6
nobody 12.4
leather 12.3
lock 12.1
against 11.9
letter 11.9
symbols 11.5
accessory 11.4
paper 11
symbol 10.8
text 10.5
electronics 10.4
word 10.4
brown 10.3
finance 10.1
security 10.1
wealth 9.9
retro 9.8
hole 9.7
storage 9.5
box 9.3
plug 9.3
black box 9.1
single 9
input 8.8
design 8.4
savings 8.4
metallic 8.3
currency 8.1
detail 8
purse 8
financial 8
copy 8
silver 8
steel 8
pocket 7.8
keys 7.8
navigation 7.7
personal 7.6
web 7.6
recorder 7.6
textile 7.6
clothing 7.5
safety 7.4
cash 7.3
connection 7.3
digital 7.3
gray 7.2
safe 7.1

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

indoor 87.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 84.2%
Happy 1%
Angry 4.3%
Disgusted 1.4%
Calm 56.4%
Sad 23.1%
Surprised 5.8%
Confused 8%
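
The age range, gender, and emotion estimates above match the structure of AWS Rekognition's face-detection output. A hedged sketch of retrieving them with boto3 is shown below; the file name and region are assumptions, not part of the documented workflow.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local copy of the image.
with open("untitled_passenger_7.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, emotions, and other attributes.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```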

Feature analysis

Amazon

Person 97.2%
Painting 66.4%

Captions

Microsoft
created on 2018-03-23

a close up of a car 27.7%
close up of a car 26.5%
a black car 12.8%
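
The captions above come from Microsoft's Computer Vision service. A sketch of requesting caption candidates through the current Analyze REST endpoint is shown below; the endpoint URL, key, API version, and file name are placeholders, and the 2018 captions were presumably produced by an earlier version of the service.

```python
import requests

# Placeholder endpoint and key; substitute a real Azure Computer Vision resource.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "<subscription-key>"

with open("untitled_passenger_7.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

# Request the Description feature, which returns ranked caption candidates.
response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Description"},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
response.raise_for_status()

for caption in response.json()["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')
```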