Human Generated Data

Title

Untitled (young woman seated at desk reading letter)

Date

c. 1940, printed later

People

Artist: Paul Gittings, American, 1900-1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12923

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 98.8
Chair 98.4
Furniture 98.4
Person 97.1
Flooring 96.8
Sitting 96.3
Floor 76.7
Person 72.2
Game 60.9
Chess 60.9
Cabinet 58.9
Indoors 55.2
Interior Design 55.2
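The Amazon tags above have the shape of an AWS Rekognition DetectLabels response: each label carries a name and a confidence score. A minimal sketch of how such a list could be flattened and filtered is below; the sample response dict copies a few values from the tags above and is illustrative only, not the museum's actual pipeline.

```python
# Hedged sketch: parse a Rekognition DetectLabels-style response into
# (name, confidence) pairs above a confidence floor. The "Labels" /
# "Name" / "Confidence" keys follow the Rekognition JSON shape; the
# sample values are copied from the tag list above.

def top_labels(response, min_confidence=55.0):
    """Return (name, confidence) pairs at or above min_confidence,
    sorted by confidence descending."""
    pairs = [(label["Name"], label["Confidence"])
             for label in response.get("Labels", [])
             if label["Confidence"] >= min_confidence]
    return sorted(pairs, key=lambda pair: pair[1], reverse=True)

sample = {
    "Labels": [
        {"Name": "Human", "Confidence": 98.8},
        {"Name": "Chair", "Confidence": 98.4},
        {"Name": "Chess", "Confidence": 60.9},
        {"Name": "Indoors", "Confidence": 55.2},
    ]
}

print(top_labels(sample, min_confidence=90.0))
```

With the 55-point default floor, all four sample labels survive; raising the floor to 90 keeps only the high-confidence Human and Chair tags.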

Clarifai
created on 2019-11-16

people 100
one 99.1
adult 98.7
two 98.4
seat 98
furniture 97.6
room 97.1
chair 96.6
man 96.5
easy chair 94.4
sit 94
administration 93.8
woman 92.9
indoors 92.2
home 91.5
leader 88.5
group 87.3
music 86.3
vehicle 83.1
three 82.5

Imagga
created on 2019-11-16

barber chair 79.2
chair 72.2
seat 53.3
barbershop 43.7
shop 37.3
furniture 34.9
architecture 28.1
old 27.9
mercantile establishment 26.6
building 24
window 19.3
ancient 19
history 18.8
city 18.3
place of business 17.7
furnishing 17.3
interior 16.8
arch 16.5
religion 15.2
tourism 14.8
historic 14.7
people 14.5
man 14.1
church 13.9
salon 13.6
inside 12.9
wall 12.8
cathedral 12.5
culture 12
art 11.7
vintage 11.6
door 11.5
historical 11.3
travel 11.3
monument 11.2
home 11.2
landmark 10.8
light 10.7
male 10.6
indoors 10.5
stone 10.1
house 10
style 9.6
palace 9.6
black 9.6
sculpture 9.6
men 9.4
religious 9.4
fashion 9
establishment 8.9
love 8.7
antique 8.7
luxury 8.6
room 8.4
famous 8.4
indoor 8.2
dress 8.1
catholic 8
column 7.9
person 7.8
entrance 7.7
holy 7.7
architectural 7.7
hairdresser 7.6
street 7.4
tourist 7.2
adult 7.2

Google
created on 2019-11-16

(no tags returned)

Microsoft
created on 2019-11-16

indoor 98.9
wall 97.7
living 92.9
room 85.9
black and white 74.5
person 72.6
chair 64.7
furniture 59
clothing 58.7
bedroom 30.1

Face analysis

Amazon

AWS Rekognition

Age 13-25
Gender Female, 50.3%
Happy 45.3%
Calm 54.5%
Fear 45%
Confused 45%
Angry 45%
Surprised 45.1%
Disgusted 45%
Sad 45.1%

AWS Rekognition

Age 28-44
Gender Female, 51.9%
Fear 45.5%
Sad 45.6%
Disgusted 45.1%
Surprised 49.3%
Calm 49.1%
Happy 45%
Angry 45.2%
Confused 45.3%
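The two face records above follow the Rekognition DetectFaces "Emotions" shape, where every emotion type gets a confidence score and most sit near a common floor (here around 45%). A small sketch for picking the dominant emotion from such a list follows; the sample values are copied from the first face and the code is illustrative only.

```python
# Hedged sketch: select the highest-confidence emotion from a
# Rekognition DetectFaces-style "Emotions" list. The "Type" /
# "Confidence" keys follow the Rekognition JSON shape; sample
# values come from the first face above.

def dominant_emotion(emotions):
    """Return the (Type, Confidence) pair with the highest confidence."""
    return max(((e["Type"], e["Confidence"]) for e in emotions),
               key=lambda pair: pair[1])

face_1 = [
    {"Type": "HAPPY", "Confidence": 45.3},
    {"Type": "CALM", "Confidence": 54.5},
    {"Type": "FEAR", "Confidence": 45.0},
    {"Type": "SAD", "Confidence": 45.1},
]

print(dominant_emotion(face_1))
```

For the first face this yields CALM at 54.5%, matching the strongest emotion listed above.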

Feature analysis

Amazon

Chair 98.4%
Person 97.1%

Captions

Microsoft

a black and white photo of a living room 87.6%
an old photo of a living room 87.5%
a person in a white room 87.4%

Text analysis

Google

EirW
79LY
79LY MDVLE EirW
MDVLE
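The four strings above resemble the "description" fields of a Google Cloud Vision text-detection response, where by convention one annotation holds the full detected block and the others hold individual tokens. A hedged sketch of splitting such a list follows; the field name and the full-block-first convention are assumptions about the Vision API shape, and the sample simply reuses the strings above.

```python
# Hedged sketch: split a Google Cloud Vision text_detection-style
# annotation list into the full detected block and its tokens. The
# Vision API convention places the full text first; the "description"
# key follows the Vision JSON shape. Sample strings are taken from
# the text-analysis output above.

def split_full_and_tokens(annotations):
    """Return (full_text, token_list) assuming annotations[0] is the
    whole detected block and the rest are individual words."""
    if not annotations:
        return "", []
    full = annotations[0]["description"]
    tokens = [a["description"] for a in annotations[1:]]
    return full, tokens

sample = [
    {"description": "79LY MDVLE EirW"},
    {"description": "EirW"},
    {"description": "79LY"},
    {"description": "MDVLE"},
]

print(split_full_and_tokens(sample))
```

The detected strings themselves are likely OCR noise from the photograph (lettering on the letter or desk), which is common in machine text analysis of archival prints.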