Human Generated Data

Title

Untitled (nun with two women, Manchester, New Hampshire)

Date

1932, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.80

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.1
Human 99.1
Furniture 99.1
Person 97.6
Living Room 93.7
Room 93.7
Indoors 93.7
Person 90.7
Interior Design 81.5
Painting 69.3
Art 69.3
Bedroom 65.7
Advertisement 63.9
Ceiling Fan 63.6
Appliance 63.6
Couch 57.5
Bed 56.7
Poster 55.8
Person 44.8
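
The labels above, each paired with a confidence percentage, have the shape of output from Amazon Rekognition's DetectLabels API. A minimal sketch in Python using boto3; the file name, confidence threshold, and credentials setup are illustrative assumptions, not part of this record:

# Hedged sketch: label tags like those above, produced by Rekognition's
# DetectLabels API. Assumes AWS credentials are already configured and a
# hypothetical local copy of the photograph.
import boto3

client = boto3.client("rekognition")

with open("durette_nun.jpg", "rb") as f:  # placeholder file name
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=40,  # assumption; the record lists tags down to ~44.8
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")

MinConfidence simply truncates the list, and the 44.8 "Person" entry above suggests the tagging pipeline used a fairly low threshold.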

Clarifai
created on 2023-10-15

people 99.9
furniture 98.9
room 98.9
indoors 98.3
chair 96.8
woman 96.6
adult 96.3
seat 96.3
home 95.7
leader 95.3
group 95.1
man 94.2
monochrome 93.9
family 93.3
two 92.6
mirror 91.4
sit 91.3
art 90.5
portrait 90
painting 87.6

Imagga
created on 2021-12-14

barbershop 60.6
shop 46.3
mercantile establishment 36.6
television 25.3
place of business 24.6
black 21
man 20.1
sitting 18
people 17.8
person 17.4
male 17
window 16.4
telecommunication system 16.1
adult 14.4
hair 14.3
one 14.2
room 14
home 12.8
silhouette 12.4
establishment 12.3
chair 12.2
love 11.8
portrait 11.6
business 11.5
light 11.4
sexy 11.2
indoor 10.9
office 10.9
lifestyle 10.8
looking 10.4
body 10.4
dark 10
interior 9.7
women 9.5
pretty 9.1
attractive 9.1
fashion 9
businessman 8.8
model 8.5
skin 8.5
relax 8.4
relaxation 8.4
child 8.3
happy 8.1
lady 8.1
posing 8
bed 8
working 8
indoors 7.9
happiness 7.8
house 7.5
passion 7.5
technology 7.4
musical instrument 7.4
style 7.4
alone 7.3
sensuality 7.3
smiling 7.2
dress 7.2
romance 7.1
modern 7
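
Tags in this shape, including the WordNet-style hypernyms such as "mercantile establishment" and "place of business", can be fetched from Imagga's v2 tagging endpoint over plain HTTP. A hedged sketch using requests; the key, secret, and image URL are placeholders:

# Hedged sketch: image tags from Imagga's v2 /tags endpoint.
import requests

API_KEY = "your_imagga_key"        # placeholder
API_SECRET = "your_imagga_secret"  # placeholder
IMAGE_URL = "https://example.org/durette_nun.jpg"  # hypothetical location

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")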

Microsoft
created on 2021-12-14

text 100
person 80
black and white 77.1
house 76.6
screen 74.9
clothing 59
drawing 51
image 34.8
picture frame 8.3

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-37
Gender Female, 99.5%
Calm 68.5%
Happy 25.8%
Angry 1.9%
Disgusted 1.8%
Sad 1%
Confused 0.5%
Surprised 0.4%
Fear 0.2%

AWS Rekognition

Age 26-42
Gender Female, 97.8%
Happy 98.5%
Calm 0.4%
Surprised 0.4%
Confused 0.2%
Angry 0.2%
Fear 0.1%
Disgusted 0.1%
Sad 0.1%

AWS Rekognition

Age 21-33
Gender Female, 94.7%
Calm 52.5%
Disgusted 26.7%
Happy 7.1%
Angry 5.8%
Sad 4.2%
Fear 1.6%
Surprised 1.4%
Confused 0.6%

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Likely
Blurred Very unlikely
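
The three AWS Rekognition blocks above (an age range, a gender call, and ranked emotion scores per face) match the per-face records returned by Rekognition's DetectFaces API when all attributes are requested. A minimal sketch, again assuming boto3 and a hypothetical local copy of the image:

# Hedged sketch: per-face attributes from Rekognition's DetectFaces API.
# Attributes=["ALL"] yields the age range, gender, and emotion scores
# shown above; the file name is a placeholder.
import boto3

client = boto3.client("rekognition")

with open("durette_nun.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")

The Google Vision rows express the same judgments as coarse likelihood buckets (for example, the joy_likelihood field on each entry of face_annotations in the google-cloud-vision client) rather than percentages.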

Feature analysis

Amazon

Person 99.1%
Painting 69.3%
Ceiling Fan 63.6%
Couch 57.5%

Text analysis

Amazon

112
67%
150
George
08
New
1932
George Durette Studio
Durette
Studio
Hampshire
Manchester, New Hampshire
Manchester,

Google

112 67% 08 150 George Durette Studio Manchester, New Hampshire 1932
112
67%
150
George
Manchester,
New
Hampshire
08
Durette
Studio
1932
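
Under Amazon, the mix of single tokens ("1932", "Durette") and longer phrases ("George Durette Studio") matches the WORD and LINE detections returned by Rekognition's DetectText API; under Google, the first row looks like the full-image description that google-cloud-vision returns as text_annotations[0], followed by one entry per word. A minimal Rekognition sketch, with the file name again a placeholder:

# Hedged sketch: OCR results in the shape of the Amazon list above,
# using Rekognition's DetectText API.
import boto3

client = boto3.client("rekognition")

with open("durette_nun.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # LINE entries cover phrases such as "George Durette Studio";
    # WORD entries cover single tokens such as "1932".
    print(detection["Type"], detection["DetectedText"])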