Human Generated Data

Title

Untitled (woman on chair in bedroom)

Date

c. 1950

People

Artist: Boston Herald

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19432

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2022-03-05

Furniture 99.9
Chair 99.9
Person 95.1
Human 95.1
Apparel 85.9
Clothing 85.9
Indoors 79.7
Room 79.7
Interior Design 76.6
Person 74.6
Shorts 71.3
Mammal 68.3
Cat 68.3
Animal 68.3
Pet 68.3
Den 67
Living Room 66.3
Dog House 55.8
Footwear 55.4
Shoe 55.4

Imagga
created on 2022-03-05

sexy 24.9
adult 24.9
person 23.9
portrait 20.7
device 20.3
people 19.5
model 17.9
black 17.6
lifestyle 17.3
attractive 16.8
body 15.2
fashion 15.1
hair 15.1
dress 14.4
chair 14.3
sitting 13.7
sensual 13.6
one 13.4
pretty 13.3
lady 13
exercise bike 12.9
face 12.8
man 12.8
human 12.7
gorgeous 12.7
equipment 12.6
blond 12.1
fitness 11.7
exercise device 11.5
indoors 11.4
studio 11.4
erotic 11.3
elegance 10.9
style 10.4
strength 10.3
women 10.3
smile 10
brunette 9.6
smiling 9.4
happy 9.4
skin 9.3
lips 9.2
training 9.2
male 9.2
seat 9.2
interior 8.8
glamorous 8.7
exercising 8.7
happiness 8.6
elegant 8.6
legs 8.5
furniture 8.4
head 8.4
dark 8.3
health 8.3
leisure 8.3
vintage 8.3
nice 8.2
sensuality 8.2
clothing 8.2
exercise 8.2
pose 8.1
cheerful 8.1
cute 7.9
men 7.7
luxury 7.7
sport 7.7
gym 7.6
club 7.5
fun 7.5
room 7.5
holding 7.4
entertainment 7.4
child 7.1
posing 7.1
look 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

Face analysis

Amazon

Google

AWS Rekognition

Age 45-53
Gender Female, 65.7%
Happy 98.7%
Calm 0.5%
Surprised 0.3%
Sad 0.2%
Fear 0.1%
Confused 0.1%
Disgusted 0.1%
Angry 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 95.1%
Cat 68.3%

Captions

Microsoft

a person standing in a room 56%
a man and a woman standing in a room 38%
a person that is standing in a room 37.9%