Human Generated Data

Title

Untitled (four men outside service station, two in uniform)

Date

c. 1955, printed later

People

Artist: Harry Annas, American, 1897-1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6703

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.4
Human 99.4
Person 99.3
Tie 99.2
Accessories 99.2
Accessory 99.2
Person 99.2
Person 98.5
Tie 97
Wheel 86.4
Machine 86.4
Clothing 86.3
Apparel 86.3
Footwear 65
Shoe 65
People 64.2
Overcoat 61.7
Coat 61.7
Food 61.4
Meal 61.4
Sleeve 58.4
Long Sleeve 58.4
Crowd 57.4
Stage 55

Clarifai
created on 2019-11-16

people 99.3
man 95.8
adult 94.5
group 94.2
woman 93.1
wear 91.3
vehicle 91
street 89.5
two 89.2
transportation system 88.1
group together 87.6
one 82.8
business 82.6
room 82.5
television 80.6
indoors 80.2
fashion 80.1
industry 79.2
many 77.4
administration 76.4

Imagga
created on 2019-11-16

gas pump 25.7
transportation 22.4
pump 22.3
vehicle 19.3
transport 19.2
old 17.4
industrial 17.2
device 16.9
mechanical device 16.3
industry 16.2
machine 15.3
building 15.3
architecture 14.8
equipment 14.2
mechanism 14.2
factory 14.1
metal 13.7
car 13.4
shop 13.1
track 12.9
railway 12.7
train 12.7
travel 12.7
power 12.6
engine 12.5
wheeled vehicle 12.3
window 12.2
barbershop 12.1
locomotive 12
steel 11.6
station 11.5
black 10.8
railroad 10.8
mercantile establishment 10.4
house 10
light 10
technology 9.6
city 9.1
freight car 9.1
interior 8.8
urban 8.7
fuel 8.7
structure 8.3
machinery 8.1
work 8
antique 7.9
design 7.9
steam 7.8
pollution 7.7
engineering 7.6
energy 7.6
wheel 7.5
vintage 7.5
wood 7.5
heat 7.4
environment 7.4
street 7.4
inside 7.4
home 7.2
conveyance 7.1
modern 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 98.6
black and white 83.9
street 80.9
store 73
person 71.3
clock 67.9
clothing 62.5
man 57.9

Face analysis

Amazon

AWS Rekognition

Age 32-48
Gender Male, 54.7%
Confused 45.9%
Surprised 45.3%
Calm 49.4%
Fear 45.7%
Disgusted 45.4%
Happy 45.2%
Sad 47.6%
Angry 45.5%

AWS Rekognition

Age 29-45
Gender Male, 54.5%
Disgusted 45.1%
Calm 51.4%
Fear 45.1%
Happy 45%
Confused 45.2%
Sad 45.5%
Surprised 45.1%
Angry 47.6%

AWS Rekognition

Age 32-48
Gender Male, 54.8%
Disgusted 45.3%
Sad 45.5%
Fear 45.8%
Angry 45.7%
Confused 46.4%
Happy 45.1%
Calm 49.5%
Surprised 46.8%

AWS Rekognition

Age 57-75
Gender Male, 54.9%
Sad 47.1%
Fear 45.1%
Angry 45.7%
Surprised 45.1%
Calm 51.2%
Disgusted 45.3%
Confused 45.1%
Happy 45.3%

AWS Rekognition

Age 44-62
Gender Male, 54.8%
Angry 45.1%
Fear 45.1%
Calm 45.4%
Sad 54%
Disgusted 45.1%
Happy 45%
Confused 45.3%
Surprised 45%

Feature analysis

Amazon

Person 99.4%
Tie 99.2%
Wheel 86.4%
Shoe 65%

Captions

Microsoft

a person standing in front of a store window 81.4%
a group of people standing in front of a store window 70.7%
a person standing in front of a store window 70.6%

Text analysis

Amazon

SMOKING
NO SMOKING
ANTI-KNOCK
NO
100%
OUR
PART
NRA
Goodrich
WEDD OUR PART
FIRECHIEF
Coodrie
WEDD
RAVOLINE
STKTIOS
TIRES
F RAVOLINE FIRECHIEF
Mied me
F

Google

NO SMOKING NRA WE DO OUR PART Meel me HONOR NAVONEFIRE-CHIEF STATION I00% ANTI-KNOCK CENTINAE Goodrick Coodrieh TIRES TIRES
NRA
WE
Meel
TIRES
NO
SMOKING
me
STATION
Coodrieh
NAVONEFIRE-CHIEF
DO
PART
HONOR
I00%
ANTI-KNOCK
CENTINAE
Goodrick
OUR