Human Generated Data

Title

Untitled (studio portrait of woman seated reading from book)

Date

c. 1905-1915, printed c. 1970

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5987

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.4
Human 99.4
Interior Design 98.9
Indoors 98.9
Accessories 96.9
Accessory 96.9
Tie 96.9
Chair 95
Furniture 95
Sitting 93.7
Suit 91.9
Clothing 91.9
Overcoat 91.9
Coat 91.9
Apparel 91.9
Person 88.8
Room 85.7
Face 79.1
Person 78.3
Photography 65.7
Portrait 65.7
Photo 65.7
Performer 59.5
Theater 57.9
Musician 57
Musical Instrument 57
Electronics 56.7
Screen 56.7
Bedroom 55.2
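
The Amazon tags above are the kind of labels returned by Amazon Rekognition's label-detection call. A minimal sketch of retrieving them with boto3, assuming configured AWS credentials and a local copy of the image (the filename photo.jpg is hypothetical):

```python
# Sketch: fetch object/scene labels with Amazon Rekognition.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=55,  # the lowest score in the list above is in the mid-50s
    )

# Each label has a name and a 0-100 confidence score.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```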

Clarifai
created on 2019-11-16

people 99.8
adult 96.9
monochrome 95.9
man 95.6
music 93.1
group 92.8
room 92.5
movie 91.7
portrait 91
woman 89.6
two 88.6
indoors 88
musician 86.5
furniture 85
actor 84.6
chair 83.6
sit 81.1
wear 80.7
outfit 80.1
family 77.6
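
The Clarifai tags above correspond to output from Clarifai's public general model. A minimal sketch using the legacy clarifai 2.x Python client (contemporary with the 2019 date above); the API key and image URL are placeholders:

```python
# Sketch: tag an image with Clarifai's public general model (legacy 2.x client).
from clarifai.rest import ClarifaiApp

app = ClarifaiApp(api_key="YOUR_API_KEY")
model = app.public_models.general_model  # Clarifai's public "general" model

response = model.predict_by_url("https://example.org/photo.jpg")

# Concepts come back with a name and a 0-1 confidence value.
for concept in response["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```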

Imagga
created on 2019-11-16

musical instrument 28.6
window 22.6
accordion 20.6
keyboard instrument 18.2
chair 18.1
blackboard 17.2
black 16.9
wind instrument 16.3
man 16.1
old 14.6
male 12.8
room 12.5
interior 12.4
building 12.3
light 12
person 12
barbershop 11.8
people 10.6
device 10.3
wall 10.3
dark 10
silhouette 9.9
shop 9.9
art 9.8
men 9.4
architecture 9.4
seat 9.2
business 9.1
dirty 9
grunge 8.5
house 8.4
history 8
home 8
adult 7.9
urban 7.9
antique 7.8
glass 7.8
human 7.5
vintage 7.4
historic 7.3
design 7.3
indoor 7.3
body 7.2
mercantile establishment 7.1
working 7.1
businessman 7.1
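
The Imagga tags above match the shape of Imagga's tagging endpoint, which returns English tag names with 0-100 confidences. A minimal sketch using requests; the API credentials and image URL are placeholders:

```python
# Sketch: request tags for an image from the Imagga /v2/tags endpoint.
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/photo.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # HTTP Basic auth with key/secret
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```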

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

wall 97.6
text 95.6
person 94.8
furniture 89.8
black and white 89.4
clothing 83.7
man 79.3
human face 71.3
chair 70.9
table 69.7
monochrome 51.8
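
The Microsoft tags above are the kind of output produced by the Azure Computer Vision tagging call. A minimal sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders:

```python
# Sketch: tag an image with Azure Computer Vision.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

result = client.tag_image("https://example.org/photo.jpg")

# Each tag has a name and a 0-1 confidence score.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```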

Color Analysis

Face analysis

AWS Rekognition

Age 21-33
Gender Female, 54.8%
Fear 45%
Sad 45.1%
Angry 45.1%
Calm 54.4%
Confused 45.1%
Disgusted 45.1%
Happy 45.3%
Surprised 45%

AWS Rekognition

Age 38-56
Gender Male, 61.9%
Happy 0.2%
Fear 0.2%
Sad 36.9%
Confused 0.8%
Disgusted 0.5%
Angry 2%
Calm 59.2%
Surprised 0.2%

AWS Rekognition

Age 37-55
Gender Male, 97.4%
Calm 0.1%
Happy 0%
Disgusted 0%
Surprised 0%
Angry 99.7%
Sad 0%
Fear 0%
Confused 0.1%
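
The per-face age range, gender, and emotion estimates above come from Amazon Rekognition face detection. A minimal sketch with boto3, assuming configured AWS credentials and a local copy of the image (photo.jpg is a hypothetical filename):

```python
# Sketch: detect faces and their attributes with Amazon Rekognition.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```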

Microsoft Cognitive Services

Age 50
Gender Male

Microsoft Cognitive Services

Age 58
Gender Female

Microsoft Cognitive Services

Age 33
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
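
The likelihood buckets above (Very unlikely through Very likely) are how Google Cloud Vision reports face attributes. A minimal sketch using the google-cloud-vision client, assuming a local copy of the image (photo.jpg is a hypothetical filename):

```python
# Sketch: run Google Cloud Vision face detection and print likelihood buckets.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```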

Feature analysis

Amazon

Person 99.4%
Tie 96.9%
Chair 95%
Suit 91.9%

Categories

Imagga

interior objects 68%
paintings art 17.9%
food drinks 12.9%
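
The Imagga categories above resemble output from Imagga's categorization endpoint; the categorizer name personal_photos below is an assumption about which built-in categorizer was used, and the credentials and image URL are placeholders:

```python
# Sketch: categorize an image with an Imagga built-in categorizer.
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/photo.jpg"

response = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",  # assumed categorizer
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

for item in response.json()["result"]["categories"]:
    print(f"{item['name']['en']} {item['confidence']:.1f}")
```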