Human Generated Data

Title

Untitled (employees from E.N. Lanouette concessions behind display counter)

Date

1939

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4041

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 99.6
Human 99.6
Person 98.3
Person 95.6
Shop 94.6
Newsstand 91.4
Kiosk 75.6
Apparel 60.9
Clothing 60.9
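
The labels above follow the name-plus-confidence format of the AWS Rekognition DetectLabels API. A minimal sketch of how comparable tags could be generated with boto3 (the filename and region are assumptions, not part of this record):

    import boto3

    # Rekognition client; assumes AWS credentials are configured locally.
    client = boto3.client("rekognition", region_name="us-east-1")

    # "4.2002.4041.jpg" is a hypothetical scan of this photograph.
    with open("4.2002.4041.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns label names with 0-100 confidence scores,
    # e.g. "Person 99.6", "Shop 94.6" as listed above.
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,
        MinConfidence=60,
    )
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")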

Clarifai
created on 2019-06-01

people 99.4
monochrome 98.5
group together 97.7
adult 97
many 95.4
group 95.2
furniture 91.5
man 90.8
street 89.8
administration 88.8
several 88.5
stock 83.4
indoors 83.4
vehicle 83.2
two 83.2
war 80.9
room 80.6
woman 80.4
four 80
three 77.4

Imagga
created on 2019-06-01

stall 38.2
negative 36.8
film 29.6
architecture 27.3
house 25.1
snow 23.8
building 21.8
sketch 21.3
city 20.8
photographic paper 20.6
drawing 20.4
modern 19.6
design 19.1
construction 18.8
newspaper 18.6
home 17.7
urban 16.6
plan 16.1
interior 15.9
structure 15.5
blueprint 14.7
product 14.3
wall 14
photographic equipment 13.7
office 13.7
winter 13.6
architect 13.5
business 13.4
street 12.9
creation 12.2
floor 12.1
frame 11.8
project 11.5
old 11.1
scene 10.4
room 10.3
window 10.2
town 10.2
district 9.7
residential 9.6
engineering 9.5
industry 9.4
weather 9.4
grunge 9.4
development 8.8
furniture 8.7
architectural 8.7
cold 8.6
glass 8.6
skyline 8.5
art 8.5
black 8.4
vintage 8.3
table 8.2
new 8.1
lines 8.1
paper 7.8
3d 7.7
travel 7.7
designer 7.7
entrance 7.7
built 7.7
pencil 7.6
cityscape 7.6
technology 7.4
retro 7.4
metal 7.2
square 7.2
sky 7

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

clothing 91.7
person 89.3
black and white 78.9
man 67.3
refrigerator 63.9
snow 59.7

Face analysis

Amazon

AWS Rekognition

Age 26-44
Gender Female, 52.4%
Calm 50.1%
Disgusted 45.3%
Sad 45.6%
Surprised 45.3%
Happy 47%
Confused 46.3%
Angry 45.4%

AWS Rekognition

Age 26-43
Gender Male, 50.1%
Confused 45.9%
Surprised 45.6%
Sad 46%
Angry 45.4%
Happy 45.2%
Calm 51.6%
Disgusted 45.1%

AWS Rekognition

Age 48-68
Gender Female, 54.3%
Confused 45.4%
Happy 45.3%
Calm 51.1%
Disgusted 45.1%
Sad 47.4%
Angry 45.2%
Surprised 45.4%
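
Each face record above matches the shape of AWS Rekognition DetectFaces output when all facial attributes are requested: an estimated age range, a gender guess with confidence, and a confidence score per emotion. A minimal sketch under the same assumptions as the labeling example:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")
    with open("4.2002.4041.jpg", "rb") as f:  # hypothetical filename
        image_bytes = f.read()

    # Attributes=["ALL"] adds age range, gender, and emotions to each face.
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")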

Feature analysis

Amazon

Person 99.6%

Text analysis

Amazon

ABAG
POPCORN
10
Fresh
ABX
LAnOuT
LAnOuT TE
TE
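
Amazon's text results mix whole lines with their component words, which is characteristic of the AWS Rekognition DetectText API: each detection is typed LINE or WORD, so strings such as "LAnOuT TE" and "TE" both appear. A minimal sketch, again with a hypothetical filename:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")
    with open("4.2002.4041.jpg", "rb") as f:  # hypothetical filename
        image_bytes = f.read()

    # Each detection carries a Type of "LINE" or "WORD", so a line and
    # its constituent words are reported separately.
    response = client.detect_text(Image={"Bytes": image_bytes})
    for detection in response["TextDetections"]:
        print(f"{detection['Type']}: {detection['DetectedText']}")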

Google

E.D ESSEM PCORN ABOX A BAG
E.D
ESSEM
PCORN
ABOX
A
BAG
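
Google's list shows the same pattern in Cloud Vision terms: the first text annotation is the full detected string ("E.D ESSEM PCORN ABOX A BAG") and the remaining annotations are the individual words. A minimal sketch using the google-cloud-vision client (filename assumed):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("4.2002.4041.jpg", "rb") as f:  # hypothetical filename
        content = f.read()

    # text_annotations[0] holds the full detected string; the remaining
    # entries are the individual words.
    response = client.text_detection(image=vision.Image(content=content))
    for annotation in response.text_annotations:
        print(annotation.description)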