Human Generated Data

Title

Untitled (woman seated in armchair with elbows on arms of chair)

Date

c. 1940, printed later

People

Artist: Paul Gittings, American 1900 - 1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12971

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2019-11-16

Jar 97.5
Pottery 97.5
Vase 97.5
Human 97
Person 96.9
Interior Design 96.7
Indoors 96.7
Ornament 96.3
Ikebana 96.3
Art 96.3
Plant 96.3
Flower Arrangement 96.3
Blossom 96.3
Flower 96.3
Living Room 94.2
Room 94.2
Table Lamp 92.4
Lamp 92.4
Home Decor 87.3
Furniture 80.3
Couch 80.3
Potted Plant 73.4
Person 60.8
Sitting 59.3
Flooring 58
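
The label scores above are in the form returned by the AWS Rekognition DetectLabels API. Below is a minimal sketch of how comparable tags could be generated, assuming boto3 is installed, AWS credentials are configured, and a local copy of the digitized print exists; the file name is hypothetical.

    import boto3

    # Hypothetical local file name for the digitized print.
    IMAGE_PATH = "gittings_untitled_woman_in_armchair.jpg"

    client = boto3.client("rekognition")
    with open(IMAGE_PATH, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=30,
            MinConfidence=55,
        )

    # Each label carries a name and a confidence score on a 0-100 scale.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")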

Clarifai
created on 2019-11-16

people 99.9
adult 98.1
street 97.6
one 96.9
man 96.7
two 94.1
monochrome 93.3
woman 92.4
portrait 92.1
administration 89.7
room 87.7
furniture 87.4
wear 86.8
sit 85.5
actor 85.1
chair 83.7
music 81.6
group 80.6
movie 77.1
indoors 76.1

Imagga
created on 2019-11-16

barbershop 78.1
shop 60.2
mercantile establishment 47
hairdresser 42
place of business 31.3
chair 30.3
barber chair 26.8
man 22.8
window 17.5
male 17
seat 16.9
black 16.9
interior 16.8
person 16.2
people 16.2
establishment 15.6
inside 15.6
adult 14.9
room 14.9
furniture 14.4
men 13.7
old 13.2
building 12.9
home 12.8
salon 11.8
house 11.7
indoors 11.4
fashion 11.3
sexy 11.2
style 11.1
vintage 10.7
architecture 10.1
indoor 10
light 10
portrait 9.7
urban 9.6
door 9.5
dark 9.2
attractive 9.1
retro 9
human 9
one 9
work 8.6
city 8.3
looking 8
body 8
ancient 7.8
model 7.8
sitting 7.7
wall 7.7
pretty 7.7
elegance 7.6
office 7.5
equipment 7.3
alone 7.3
lady 7.3
sensual 7.3
sensuality 7.3
dress 7.2
hair 7.1
love 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

wall 95.6
indoor 94.8
vase 93.3
black and white 92.9
person 91.7
man 90.3
text 83.5
furniture 83.1
monochrome 65.9
clothing 58.4

Color Analysis

Face analysis

AWS Rekognition

Age 39-57
Gender Female, 96.2%
Confused 1.6%
Calm 39.6%
Sad 2.2%
Surprised 1.3%
Happy 51.6%
Disgusted 1.3%
Fear 0.6%
Angry 1.7%
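
The age range, gender, and emotion scores above match the structure of an AWS Rekognition DetectFaces response when all facial attributes are requested. A minimal sketch, under the same assumptions as the DetectLabels example (hypothetical file name, configured credentials):

    import boto3

    # Hypothetical local file name for the digitized print.
    IMAGE_PATH = "gittings_untitled_woman_in_armchair.jpg"

    client = boto3.client("rekognition")
    with open(IMAGE_PATH, "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, emotions, etc.
        )

    face = response["FaceDetails"][0]
    print(f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")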

Microsoft Cognitive Services

Age 46
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely
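
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the values above read "Very unlikely" and "Possible". A minimal sketch using the google-cloud-vision client, again with a hypothetical file name and assuming credentials are configured:

    from google.cloud import vision

    # Hypothetical local file name for the digitized print.
    IMAGE_PATH = "gittings_untitled_woman_in_armchair.jpg"

    client = vision.ImageAnnotatorClient()
    with open(IMAGE_PATH, "rb") as f:
        image = vision.Image(content=f.read())

    face = client.face_detection(image=image).face_annotations[0]

    # Each attribute is a Likelihood enum value, e.g. POSSIBLE or VERY_UNLIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)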

Feature analysis

Amazon

Person 96.9%

Categories