Human Generated Data

Title

Untitled (women looking into mirror)

Date

c. 1966

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19255

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.7
Human 99.7
Person 96.3
Person 91.9
Advertisement 73.9
Poster 72.1
Face 71.5
Text 70.8
Cat 68.5
Animal 68.5
Mammal 68.5
Pet 68.5
Collage 65.4
Female 60.4
Clothing 59
Apparel 59
Finger 56.7

Clarifai
created on 2023-10-22

people 99.6
monochrome 98.9
adult 97.8
woman 96.6
man 94.1
one 93
portrait 92.8
analogue 92.3
indoors 91.7
room 88.5
art 88
child 87.6
technology 86
two 84.8
music 84.3
window 83.9
family 81.9
display 79.8
vintage 79.1
furniture 78.8

Imagga
created on 2022-03-05

people 24
computer 19.8
negative 19.8
portrait 19.4
laptop 19.3
office 18.6
person 18.4
work 18
adult 17.7
film 17.6
home 17.5
happy 17.5
man 16.8
pretty 16.8
business 16.4
looking 16
black 15
smile 15
attractive 14.7
smiling 13.7
student 13.6
hair 13.5
interior 13.3
cute 12.9
photographic paper 12.9
face 12.8
women 12.7
technology 12.6
house 12.5
lifestyle 12.3
professional 12.2
happiness 11.8
room 11.6
male 11.3
modern 11.2
one 11.2
corporate 11.2
sitting 11.2
casual 11
device 11
call 10.8
lady 10.5
child 10.3
human 9.7
blackboard 9.7
job 9.7
working 9.7
indoors 9.7
career 9.5
businesswoman 9.1
photographic equipment 9
monitor 8.9
sexy 8.8
businessman 8.8
brunette 8.7
desk 8.7
sad 8.7
eyes 8.6
expression 8.5
relaxation 8.4
blond 8.3
notebook 8.3
indoor 8.2
worker 8
sofa 7.8
youth 7.7
head 7.6
lying 7.5
study 7.5
future 7.4
alone 7.3
girls 7.3
relaxing 7.3
art 7.2
love 7.1
table 7.1

Microsoft
created on 2022-03-05

text 99.1
person 98
drawing 97.4
indoor 92.9
sketch 92.7
human face 90.7
painting 89.8
black and white 88.5
clothing 75.2
woman 67.8

Face analysis

AWS Rekognition

Age 37-45
Gender Female, 99.9%
Happy 83%
Surprised 5%
Fear 2.6%
Calm 2.5%
Confused 2.2%
Sad 1.9%
Angry 1.9%
Disgusted 0.8%

Feature analysis

Amazon

Person 99.7%
Cat

Text analysis

Amazon

3
KLEENEX
KLEENEX TISSUES
TISSUES
в
t
TELA
MAGOM
XAOOX
20% t E L A tira
2017 TELA kira
E L A
20%
tira
2017
kira
ханатия

Google

KODYK 20rEEIA EIrn
KODYK
20rEEIA
EIrn