Human Generated Data

Title

Untitled (woman on chair under painting)

Date

1946

People

Artist: Samuel Cooper, American, active 1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20290

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Dress 99.5
Clothing 99.5
Apparel 99.5
Human 95.3
Female 95.3
Robe 92.3
Gown 92.3
Evening Dress 92.3
Fashion 92.3
Person 88
Person 87.6
Person 85.9
Woman 85.5
Furniture 79.3
Room 68.7
Indoors 68.7
Girl 65.9
Portrait 61.8
Face 61.8
Photo 61.8
Photography 61.8
People 60.5
Chair 59.9
Sitting 57.1
Overcoat 56.9
Suit 56.9
Coat 56.9

Imagga
created on 2022-03-05

newspaper 29.3
man 24.9
person 22.9
product 20.1
male 18.5
people 18.4
chair 18
room 18
sitting 16.3
lifestyle 15.9
adult 15.8
creation 15.6
alone 14.6
indoors 14.1
black 13.2
portrait 12.9
women 12.7
dress 12.6
interior 12.4
business 12.1
blackboard 11.8
smiling 11.6
businessman 11.5
men 11.2
work 10.6
working 10.6
attractive 10.5
couple 10.5
seat 10.4
happiness 10.2
casual 10.2
light 10
happy 10
harp 9.9
holding 9.9
groom 9.8
painter 9.8
human 9.7
sad 9.6
home 9.6
fashion 9
one 9
cheerful 8.9
love 8.7
furniture 8.7
luxury 8.6
model 8.6
life 8.5
old 8.4
style 8.2
computer 8.1
looking 8
wall 7.7
bride 7.7
call 7.7
finance 7.6
wedding 7.4
indoor 7.3
lady 7.3
relaxing 7.3
danger 7.3
dirty 7.2
building 7.2
hair 7.1
smile 7.1
worker 7.1
posing 7.1
face 7.1
barber chair 7
modern 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 97.2
indoor 93.4
furniture 73.1
person 70.9
clothing 66.5
drawing 56.9
dress 54.8
chair 54.3
woman 51.8

Face analysis

AWS Rekognition

Age 38-46
Gender Female, 50.1%
Happy 52.3%
Calm 36.2%
Surprised 5.7%
Confused 2%
Sad 1.2%
Fear 1.1%
Disgusted 1.1%
Angry 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 88%

Captions

Microsoft

a person sitting on a bed 44.8%
a person sitting in a room 44.7%
a person sitting on a bed 28.2%

Text analysis

Amazon

a
570
60