Human Generated Data

Title

Untitled (semi-nude woman sitting on chair)

Date

c. 1950

People

Artist: Boston Herald

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19463

Machine Generated Data

Tags (label and confidence score, %)

Amazon
created on 2022-03-05

Chair 99.9
Furniture 99.9
Human 98.5
Person 98.5
Indoors 93.6
Room 93.6
Apparel 88.4
Clothing 88.4
Living Room 75.4
Cabinet 73.4
Underwear 67.7
Lingerie 67.7
Table 60.5
Bedroom 56.6
Bed 55.8
Female 55.4
Person 47.2
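
Label and confidence pairs like the ones above are the shape of output that AWS Rekognition's DetectLabels API produces. A minimal sketch with boto3, assuming configured AWS credentials and a hypothetical local file scan.jpg (the record does not show the museum's actual pipeline):

import boto3

IMAGE_PATH = "scan.jpg"  # hypothetical file name

client = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},  # inline bytes; an S3Object reference also works
        MaxLabels=20,
        MinConfidence=40,  # assumed threshold; the lowest score above is 47.2
    )

# Each label pairs a name with a 0-100 confidence, e.g. ("Chair", 99.9)
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')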

Imagga
created on 2022-03-05

adult 30
person 27
sexy 24.9
fashion 24.9
model 21.8
attractive 21.7
portrait 21.3
people 20.6
chair 18.9
lady 18.7
black 18.2
sitting 18
musical instrument 17
posing 16.9
elegance 16.8
hair 15.8
device 15.6
style 15.6
pretty 15.4
dress 15.3
elegant 14.6
human 13.5
accordion 13.1
brunette 13.1
cute 12.9
clothing 12.8
face 12.8
sensual 12.7
one 12.7
wind instrument 12.3
stylish 11.7
keyboard instrument 11.3
seat 10.9
lifestyle 10.8
man 10.7
furniture 10.4
body 10.4
women 10.3
smiling 10.1
happy 10
sensuality 10
smile 10
indoors 9.7
looking 9.6
seductive 9.6
casual 9.3
male 9.2
makeup 9.1
modern 9.1
vogue 8.7
youth 8.5
erotic 8.5
legs 8.5
support 8.5
passion 8.5
studio 8.4
dark 8.3
dog 8.3
sport 8.2
teenager 8.2
make 8.2
gorgeous 8.2
pose 8.1
teacher 8.1
look 7.9
luxury 7.7
expression 7.7
head 7.6
relaxation 7.5
city 7.5
hound 7.4
vintage 7.4
holding 7.4
color 7.2
interior 7.1
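
The Imagga scores above match the response shape of Imagga's documented v2 /tags endpoint. A minimal sketch with the requests library, assuming hypothetical credentials and a hypothetical image URL:

import requests

API_KEY = "your_api_key"        # hypothetical credentials
API_SECRET = "your_api_secret"
IMAGE_URL = "https://example.org/scan.jpg"  # hypothetical image location

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # Imagga uses HTTP Basic auth with key/secret
)

# Each entry pairs an English tag with a 0-100 confidence, e.g. ("adult", 30)
for entry in response.json()["result"]["tags"]:
    print(entry["tag"]["en"], entry["confidence"])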

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 97
furniture 77.7
black and white 69.5
chair 59.4

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 99.3%
Happy 66.2%
Surprised 17.1%
Sad 6.2%
Calm 3.4%
Confused 2.3%
Angry 2.2%
Disgusted 2%
Fear 0.4%
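
The age range, gender, and per-emotion percentages above follow the shape of Rekognition's DetectFaces output when all attributes are requested. A sketch, again assuming the hypothetical scan.jpg:

import boto3

client = boto3.client("rekognition")

with open("scan.jpg", "rb") as f:  # hypothetical file name
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, and emotions, not just bounding boxes
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]   # e.g. {"Low": 30, "High": 40}
    gender = face["Gender"]  # e.g. {"Value": "Male", "Confidence": 99.3}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:  # one entry per emotion type
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')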

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
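
Google Vision reports face attributes as likelihood buckets rather than percentages, which is why the values above read "Very unlikely". A sketch with the google-cloud-vision client library, assuming application default credentials and the same hypothetical file:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("scan.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihoods are enum buckets: UNKNOWN, VERY_UNLIKELY ... VERY_LIKELY
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)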

Feature analysis

Amazon

Person 98.5%

Captions

Microsoft

a person with a dog in a cage 51.7%
a dog in a cage 44.2%
a person and a dog in a cage 41.2%
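
Candidate captions with confidences like those above are what Microsoft's Computer Vision "describe" operation returns; its confidences are on a 0-1 scale, so 51.7% corresponds to roughly 0.517. A minimal REST sketch, assuming a hypothetical Azure resource endpoint and key:

import requests

ENDPOINT = "https://example.cognitiveservices.azure.com"  # hypothetical resource endpoint
KEY = "your_subscription_key"                             # hypothetical key

with open("scan.jpg", "rb") as f:  # hypothetical file name
    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/describe",
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        params={"maxCandidates": 3},
        data=f.read(),
    )

# Candidate captions, e.g. ("a person with a dog in a cage", 0.517)
for caption in response.json()["description"]["captions"]:
    print(caption["text"], f'{caption["confidence"] * 100:.1f}%')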

Text analysis

Amazon

53

Google

NAGON YT37A2-NAMTZA3
NAGON
YT37A2-NAMTZA3
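
Detected strings such as "53" are the kind of result Rekognition's DetectText API returns; Google Vision's text detection similarly reports the full text plus individual tokens, which explains the repeated lines above. A sketch of the Amazon side, assuming the hypothetical scan.jpg:

import boto3

client = boto3.client("rekognition")

with open("scan.jpg", "rb") as f:  # hypothetical file name
    response = client.detect_text(Image={"Bytes": f.read()})

# DetectText returns both LINE and WORD detections with confidences
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], f'{detection["Confidence"]:.1f}%')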