Human Generated Data

Title

Track Three

Date

2000

People

Artist: Laura Blacklow, American, born 1945

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Purchase through the generosity of Alvin Winder, P2002.3

Machine Generated Data

Tags

Amazon
created on 2019-04-02

Human 99.5
Person 99.5
Person 96.9
Transportation 95.8
Vehicle 95.8
Train 95.8
Art 67.3

Clarifai
created on 2018-03-23

people 97.6
man 95.4
adult 95.4
vehicle 94.6
offense 91.6
woman 90.8
portrait 90.2
street 88
battle 87
business 86.4
transportation system 86.2
military 86
police 85.9
car 84.8
administration 84.7
war 84
city 81.9
two 81.3
wear 80.1
technology 79.9

Imagga
created on 2018-03-23

kitchen appliance 40.4
microwave 40.3
man 37.6
car 33.3
home appliance 30.2
male 26.9
vehicle 23.8
people 23.4
person 22.9
transportation 21.5
appliance 20.7
automobile 19.1
support 18.9
adult 17.4
auto 17.2
business 17
headrest 16.8
driver 16.5
work 16.5
happy 16.3
engineer 16.1
device 15.6
sitting 15.4
worker 15.1
senior 15
portrait 14.9
drive 14.2
working 14.1
mature 13.9
rest 13.6
equipment 13.4
looking 12.8
cathode-ray tube 12.8
old 12.5
job 12.4
wheel 12.3
black 12
inside 12
transport 11.9
professional 11.8
businessman 11.5
face 11.4
computer 11.3
travel 11.3
men 11.2
industrial 10.9
smiling 10.8
smile 10.7
gas-discharge tube 10.2
seat 10.1
occupation 10.1
screen 10
hat 10
uniform 9.8
handsome 9.8
driving 9.7
durables 9.6
industry 9.4
hand 9.1
road 9
one 9
technology 8.9
toaster 8.8
elderly 8.6
one person 8.5
tube 8.4
attractive 8.4
holding 8.2
room 8.2
office 8
lifestyle 7.9
indoors 7.9
middle aged 7.8
modern 7.7
repair 7.7
traffic 7.6
monitor 7.6
power 7.5
outdoors 7.5
safety 7.4
light 7.3
television 7.3
alone 7.3
protection 7.3
active 7.2
plane seat 7.2
steel 7.1

Google
created on 2018-03-23

transport 84
purple 83.5
automotive design 75.8
vehicle 75.1
art 61.2
mural 57.3
public transport 55.3
car 54.3
painting 53.7

Microsoft
created on 2018-03-23

person 98

Face analysis

Amazon

AWS Rekognition

Age 60-90
Gender Male, 96.6%
Disgusted 0.6%
Sad 44.3%
Surprised 0.7%
Confused 3.4%
Calm 43.7%
Happy 0.4%
Angry 6.8%

AWS Rekognition

Age 20-38
Gender Female, 95.7%
Sad 85.2%
Angry 1.5%
Happy 0.3%
Surprised 0.8%
Disgusted 0.9%
Calm 9.5%
Confused 1.8%

Feature analysis

Amazon

Person 99.5%
Train 95.8%

Captions

Microsoft

a person standing in front of a bus 58.7%
a man and a woman standing in front of a bus 35.5%
a group of people standing in front of a bus 35.4%

Text analysis

Amazon

SHEBEGS
FATHER
SHEBEGS HER
HER
FATHER TO
TO
TRACK
KNOWSHE
MUST
STAY
STAr U STAY
STAr U
TRACK E
E

Google

TR SHE BEGS HER FATHER TO MUST GO
BEGS
FATHER
GO
SHE
TR
TO
MUST
HER