Human Generated Data

Title

Untitled (patient having feet treated)

Date

c. 1945

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19144

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Furniture 99.4
Person 97.4
Human 97.4
Chair 91.2
Clothing 86
Apparel 86
Crib 70.6
Wood 64
Room 63.7
Indoors 63.7
Portrait 63.3
Photography 63.3
Face 63.3
Photo 63.3
Bed 55.6
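
The Amazon tags above are label names paired with confidence scores on a 0-100 scale. A minimal sketch of how comparable labels could be generated for a local copy of the image with the boto3 Rekognition client is shown below; the file name and the MinConfidence threshold are illustrative assumptions, not values taken from this record.

    import boto3

    # Assumed local copy of the photograph; not part of this record.
    IMAGE_PATH = "photo.jpg"

    rekognition = boto3.client("rekognition")

    with open(IMAGE_PATH, "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # illustrative cutoff; the lowest score listed above is 55.6
        )

    # Print label names with confidences rounded to one decimal,
    # mirroring the "Label score" layout used in this record.
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))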

Clarifai
created on 2023-10-22

people 99.8
monochrome 99.4
portrait 98.5
one 97.5
adult 97.2
art 96.3
man 94.7
woman 92.8
model 92
street 90.6
black and white 89.4
wear 88.3
music 86
indoors 86
analogue 84.1
room 83.6
nude 83.1
vintage 83
seat 82.7
furniture 82.6

Imagga
created on 2022-03-05

sexy 31.3
adult 30.4
device 29.1
fashion 25.6
support 25
model 24.9
body 24
attractive 23.1
hair 23
seat 22.7
pretty 21.7
person 21.5
rest 19.9
armrest 19.5
blond 19.2
posing 18.6
people 18.4
lady 17.8
skin 17.8
portrait 17.5
armchair 16.8
studio 16.7
face 16.3
erotic 16.3
black 16
style 14.8
sensual 14.5
sensuality 14.5
elegance 14.3
one 14.2
lingerie 14.1
pose 13.6
clothing 13.3
legs 13.2
cute 12.9
human 12.7
interior 11.5
elegant 11.1
casual 11
long 11
treadmill 10.2
happy 10
smile 10
gorgeous 10
looking 9.6
brunette 9.6
women 9.5
sitting 9.4
lips 9.2
head 9.2
inside 9.2
makeup 9.1
leg 9.1
exercise 9.1
furniture 8.9
look 8.8
hands 8.7
lifestyle 8.7
sexual 8.7
eyes 8.6
seductive 8.6
passion 8.5
exercise device 8.4
slim 8.3
nice 8.2
dress 8.1
indoors 7.9
sculpture 7.8
happiness 7.8
underwear 7.7
statue 7.7
healthy 7.6
floor 7.4
man 7.4
plane seat 7.4
smiling 7.2
fitness 7.2
night 7.1
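
Imagga's tags (again, a tag name plus a confidence score) come from its REST tagging service. A hedged sketch using the requests library is below; the endpoint path, response shape, image URL, and credentials are assumptions based on Imagga's public v2 API and should be checked against current documentation.

    import requests

    # Placeholder credentials and image location; all assumed.
    API_KEY = "your_imagga_api_key"
    API_SECRET = "your_imagga_api_secret"
    IMAGE_URL = "https://example.com/photo.jpg"

    # Assumed Imagga v2 tagging endpoint with HTTP basic auth.
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )
    response.raise_for_status()

    # Assumed response shape: result.tags -> [{"confidence": ..., "tag": {"en": ...}}]
    for item in response.json()["result"]["tags"]:
        print(item["tag"]["en"], round(item["confidence"], 1))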

Microsoft
created on 2022-03-05

person 98.8
text 96.3
clothing 90.1
dance 68.8
man 64.9

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 24-34
Gender Male, 99.9%
Surprised 76.2%
Calm 17.3%
Sad 2.5%
Disgusted 1.7%
Happy 0.8%
Confused 0.6%
Angry 0.4%
Fear 0.4%
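
The age range, gender, and emotion percentages above correspond to the output of Amazon Rekognition's DetectFaces operation. A minimal boto3 sketch is below; the image file name is an assumption.

    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # assumed local copy of the image
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, emotions, etc.
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        # Emotions are returned unsorted; sort to mirror the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")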

Feature analysis

Amazon

Person
Person 97.4%

Categories

Captions

Microsoft
created on 2022-03-05

a man standing next to a window 55.2%
a man standing in front of a window 55.1%
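
The Microsoft captions and their confidences are the kind of output returned by the Azure Computer Vision describe-image operation. A hedged sketch with the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and file name are placeholders, and the exact client method and package may differ between SDK versions.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder credentials; not part of this record.
    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
    KEY = "your_computer_vision_key"

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    with open("photo.jpg", "rb") as f:  # assumed local copy of the image
        description = client.describe_image_in_stream(f, max_candidates=3)

    # Azure reports confidence on a 0-1 scale; scale to percent as listed above.
    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")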

Text analysis

Amazon

YT37A2
M_M7 YT37A2
M_M7
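
The strings listed under Amazon are text detections (film-edge and notation markings read from the print) of the kind returned by Rekognition's DetectText operation, which reports both full lines and individual words. A minimal boto3 sketch is below; the file name is assumed.

    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # assumed local copy of the image
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # Both LINE and WORD detections are returned, which is why the list above
    # repeats the same strings in combined and separate form.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"], round(detection["Confidence"], 1))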

Google

MME YT33A2 032MA
MME
YT33A2
032MA
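
The Google results have the same full-line-plus-words structure and could be reproduced with the Google Cloud Vision text detection API; a minimal sketch follows, with the file name assumed.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:  # assumed local copy of the image
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)

    # The first annotation is the full detected text; the rest are individual words.
    for annotation in response.text_annotations:
        print(annotation.description)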