Human Generated Data

Title

Untitled (woman posing in front of cosmetic counter)

Date

c. 1935

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4302

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Shop 95.7
Person 88.3
Human 88.3
Indoors 85.2
Interior Design 85.2
Window Display 75.9
Room 59.8
Bakery 58.6
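
The Amazon labels above, each paired with a confidence score, are the kind of output returned by Rekognition's DetectLabels operation. A minimal sketch of such a call, assuming boto3 credentials are already configured and using a hypothetical local file name based on the accession number:

import boto3

# Assumption: AWS credentials and region are configured for boto3.
rekognition = boto3.client("rekognition")

# Hypothetical local copy of the photograph.
with open("4.2002.4302.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with confidence scores,
# comparable to the "Shop 95.7", "Person 88.3" pairs listed above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")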

Clarifai
created on 2019-06-01

people 99.5
adult 97.8
indoors 97.6
furniture 97
man 96.3
room 95.2
monochrome 93.3
chair 93.2
one 92.9
inside 92.2
woman 91.6
two 91.3
table 89.5
group 89.1
business 86.5
sit 84.9
technology 84.7
stock 84.2
architecture 84.1
counter 83.8

Imagga
created on 2019-06-01

grunge 23.8
design 21.9
vintage 18.2
old 18.1
business 17
dirty 16.3
art 15.6
black 15.6
retro 15.6
drawing 15.4
antique 15
architecture 14.9
frame 14.7
pattern 13.7
paper 13.4
graphic 13.1
border 12.7
modern 12.6
stall 12.5
building 12.3
home 12.2
computer 12.1
paint 11.8
texture 11.1
decoration 10.8
digital 10.5
screen 10.5
room 10.5
damaged 10.5
negative 10.4
grungy 10.4
window 10.2
house 10
film 9.5
space 9.3
finance 9.3
rough 9.1
silhouette 9.1
bank 9
currency 9
people 8.9
blackboard 8.9
structure 8.8
urban 8.7
money 8.5
dollar 8.3
color 8.3
banking 8.3
technology 8.2
style 8.2
symbol 8.1
office 7.9
life 7.9
text 7.9
laptop 7.8
frames 7.8
wall 7.8
scratch 7.8
messy 7.7
collage 7.7
web site 7.6
horizontal 7.5
sign 7.5
cash 7.3
success 7.2
reflection 7.2
material 7.1

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

cartoon 92

Color Analysis

Feature analysis

Amazon

Person 88.3%

Categories

Imagga

paintings art 52.4%
interior objects 43.3%
text visuals 3.2%

Captions

Text analysis

Amazon

MAX
FACIOR MAX
FACIOR
AVITTS
A
LEAVITIS
Mar Tndter
Mar Tndter Mla
Mla
QA
--0o
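
The Amazon text lines above resemble raw output from Rekognition's DetectText operation, which returns detected words and lines with confidence scores; fragments such as "FACIOR" and "LEAVITIS" are typical OCR readings of signage in a period photograph. A minimal sketch, assuming the same boto3 setup and hypothetical file name as before:

import boto3

rekognition = boto3.client("rekognition")

with open("4.2002.4302.jpg", "rb") as f:
    image_bytes = f.read()

# DetectText returns both LINE and WORD detections.
response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], round(detection["Confidence"], 1))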

Google

MAX NIT EAVIT
MAX
NIT
EAVIT
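
The Google text results above are consistent with output from the Cloud Vision API's text detection feature. A minimal sketch, assuming Google Cloud credentials are configured and reusing the same hypothetical file name:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("4.2002.4302.jpg", "rb") as f:
    content = f.read()

image = vision.Image(content=content)
response = client.text_detection(image=image)

# The first annotation is the full detected text block ("MAX NIT EAVIT");
# the remaining annotations are the individual words.
for annotation in response.text_annotations:
    print(annotation.description)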