Human Generated Data

Title

Untitled (two women sitting on love seat)

Date

c. 1950

People

Artist: Boston Herald

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19450

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Chair 100
Furniture 100
Person 99.5
Human 99.5
Person 98.4
Couch 97.1
Clothing 97.1
Apparel 97.1
Interior Design 89.9
Indoors 89.9
Sitting 86.4
Female 85.1
Swimwear 84.3
Face 72.6
Table 70.9
Flooring 68.1
Girl 67.5
Water 67.4
Woman 65.9
Skin 63.9
Leisure Activities 63.6
Bikini 63.3
Portrait 62.8
Photography 62.8
Photo 62.8
Room 62.5
Armchair 58
Floor 57.7
Dress 56.5
Shorts 56.2
Living Room 55.7
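
The Amazon tags above, each paired with a confidence score from 0 to 100, are the kind of output AWS Rekognition's DetectLabels operation returns. A minimal sketch of such a call, assuming a hypothetical S3 bucket and key (the museum's actual pipeline is not documented here):

```python
import boto3

# Hypothetical region and image location.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
    MaxLabels=50,
    MinConfidence=55,  # the list above bottoms out near 55%
)

# Each label pairs a name with a 0-100 confidence score,
# matching rows like "Chair 100" and "Living Room 55.7".
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```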

Imagga
created on 2022-03-05

maillot 60.8
swimsuit 48.8
clothing 46.7
garment 38.2
sitting 34.4
sexy 32.9
attractive 32.2
adult 30.1
people 25.1
person 24.8
covering 24.8
model 24.1
portrait 24
pretty 23.8
fashion 22.6
lifestyle 21.7
happy 21.3
hair 20.6
lady 20.3
laptop 19.9
sofa 18.4
body 18.4
beachwear 17.1
relaxation 16.8
chair 16.3
blond 15.8
sensual 15.5
smiling 15.2
home 15.2
leisure 14.9
indoors 14.9
brunette 14.8
consumer goods 14.8
studio 14.4
smile 14.3
women 14.2
one 14.2
legs 14.2
man 13.4
computer 13.1
dress 12.7
couch 12.6
erotic 12.4
cute 12.2
looking 12
room 11.9
casual 11.9
relaxing 11.8
sensuality 11.8
elegance 11.8
cheerful 11.4
face 11.4
elegant 11.1
style 11.1
skin 11
relax 11
posing 10.7
life 10.6
black 10.2
happiness 10.2
gorgeous 10
lovely 9.8
luxury 9.4
leg 9.3
nice 9.2
indoor 9.1
make 9.1
fun 9
seat 8.7
expression 8.5
youth 8.5
enjoyment 8.4
modern 8.4
human 8.3
20s 8.2
technology 8.2
pose 8.2
bathing cap 8
interior 8
look 7.9
couple 7.8
seated 7.8
eyes 7.7
desire 7.7
seductive 7.7
hairstyle 7.6
pleasure 7.5
hot 7.5
lying 7.5
outdoors 7.5
lips 7.4
inside 7.4
long 7.3
business 7.3
newspaper 7.3
male 7.2
cap 7.1
working 7.1
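
Imagga scores its tags on the same 0-100 scale. A sketch of an equivalent request against Imagga's public /v2/tags endpoint, assuming hypothetical credentials and image URL:

```python
import requests

# Hypothetical credentials and image URL.
API_KEY = "your_api_key"
API_SECRET = "your_api_secret"
IMAGE_URL = "https://example.org/photo.jpg"

# The /v2/tags endpoint uses HTTP Basic auth with the key/secret pair.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

# Each entry pairs a tag (per language) with a confidence score,
# e.g. "maillot 60.8" at the top of the list above.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```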

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 96.4
furniture 96.3
person 95.1
chair 95
black and white 67.7
table 60.9
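
The Microsoft tags read like output from Azure's Computer Vision tagging endpoint, which reports confidence on a 0-1 scale (rendered above as percentages). A sketch assuming a hypothetical endpoint and key:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Hypothetical endpoint and key.
client = ComputerVisionClient(
    "https://example.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("your_key"),
)

result = client.tag_image("https://example.org/photo.jpg")

# Azure reports confidence as 0-1; scale to match rows like "text 96.4".
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```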

Face analysis

AWS Rekognition

Age 29-39
Gender Female, 60.3%
Calm 46.5%
Sad 42.4%
Surprised 3.9%
Happy 2.3%
Confused 1.6%
Angry 1.5%
Disgusted 1.2%
Fear 0.5%

AWS Rekognition

Age 38-46
Gender Male, 61.4%
Happy 94.2%
Surprised 2.1%
Calm 1.7%
Sad 0.8%
Disgusted 0.3%
Confused 0.3%
Fear 0.3%
Angry 0.2%
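
The two blocks above, one per detected face, carry the fields AWS Rekognition's DetectFaces operation returns when all attributes are requested. A sketch with a hypothetical image location:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Attributes=["ALL"] requests age range, gender, and emotion scores.
response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; sort to match the descending lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```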

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
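
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why both faces above read "Very unlikely" across the board. A sketch assuming a hypothetical image URI:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical image source.
image = vision.Image(source=vision.ImageSource(image_uri="https://example.org/photo.jpg"))
response = client.face_detection(image=image)

# Each attribute is a Likelihood enum, printed here by name.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```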

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a person sitting on a bench 38.1%
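
Azure's describe endpoint generates ranked caption candidates with confidences, like the single low-confidence caption above. A sketch reusing the hypothetical endpoint and key from the tagging example:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Hypothetical endpoint and key.
client = ComputerVisionClient(
    "https://example.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("your_key"),
)

description = client.describe_image("https://example.org/photo.jpg", max_candidates=3)

# Confidence is 0-1; scaled to match "a person sitting on a bench 38.1%".
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```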

Text analysis

Amazon

28
MAMTSA3
HAGOX-YT37A2 MAMTSA3
HAGOX-YT37A2
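
The strings above are raw OCR output; AWS Rekognition's DetectText operation returns them as line and word detections. A sketch with a hypothetical image location:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}}
)

# LINE detections give whole lines; WORD detections give the pieces.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```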

Google

YT33A2
HAGOX
MIT a8 HAGOX一YT33A2 4AMT2Aョ
MIT
a8
4AMT2A
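
Google Vision's text detection returns the full recognized block first, followed by individual words, which is how the fragments above would arrive. A sketch assuming a hypothetical image URI:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical image source.
image = vision.Image(source=vision.ImageSource(image_uri="https://example.org/photo.jpg"))
response = client.text_detection(image=image)

# The first annotation is the full text; the rest are individual words.
for annotation in response.text_annotations:
    print(annotation.description)
```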