Human Generated Data

Title

Untitled (nun standing next to altar with cross and palm plant on stand)

Date

c. 1905-1915, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6017

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Accessories 99.1
Bag 99.1
Accessory 99.1
Handbag 99.1
Purse 92.4
Art 84.1
Painting 84.1
Shop 74.8
Window Display 74.8
Furniture 64.2
Tote Bag 59.5
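The Amazon tags above are the kind of label/confidence pairs returned by AWS Rekognition's DetectLabels API. As a minimal sketch of how such a list might be filtered from a DetectLabels-style response, the snippet below runs against a hand-built sample dict shaped like Rekognition's documented JSON and seeded only with values from the list above (a real call would go through `boto3`'s `rekognition` client):

```python
# Sketch: filtering a Rekognition DetectLabels-style response into
# "Name  Confidence" pairs like the tag list above. sample_response is a
# hand-built stand-in for a real rekognition.detect_labels(...) result.
sample_response = {
    "Labels": [
        {"Name": "Bag", "Confidence": 99.1},
        {"Name": "Purse", "Confidence": 92.4},
        {"Name": "Painting", "Confidence": 84.1},
        {"Name": "Tote Bag", "Confidence": 59.5},
    ]
}

def tags_at_or_above(response, min_confidence=60.0):
    """Return (name, confidence) pairs meeting the threshold,
    sorted by descending confidence."""
    pairs = [(label["Name"], label["Confidence"])
             for label in response["Labels"]
             if label["Confidence"] >= min_confidence]
    return sorted(pairs, key=lambda p: -p[1])

print(tags_at_or_above(sample_response))
# -> [('Bag', 99.1), ('Purse', 92.4), ('Painting', 84.1)]
```

Lowering `min_confidence` to 50 would also keep the "Tote Bag 59.5" entry, mirroring how the services above report everything down to fairly weak guesses.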

Clarifai
created on 2019-11-16

furniture 99.7
people 99.7
room 99.2
seat 98.9
print 98.7
chair 98.4
group 97.9
illustration 97.8
adult 97.3
two 96.8
wear 96.1
no person 94
indoors 93.7
man 93.6
outfit 91.1
one 91.1
art 88.4
three 87.9
woman 87.3
vehicle 86.6

Imagga
created on 2019-11-16

television 41.7
broadcasting 21.6
design 18
home theater 17.2
window 16.8
telecommunication 16.1
modern 16.1
house 15.9
building 15.8
theater 15.2
interior 15
art 15
black 14.4
laptop 14.3
desk 14.1
office 13.6
business 13.3
silhouette 13.2
home 12.8
night 12.4
structure 12.3
people 12.3
monitor 12.2
furniture 12.2
room 12.1
celebration 12
computer 11.9
man 11.4
male 11.3
medium 10.7
cartoon 10.7
table 10.7
moon 10.6
indoors 10.5
life 10.2
architecture 10.1
decoration 10.1
elegance 10.1
dark 10
chair 10
person 9.6
technology 9.6
urban 9.6
light 9.3
card 9.3
holiday 9.3
blackboard 9.2
indoor 9.1
equipment 9.1
working 8.8
autumn 8.8
happy 8.8
graphic 8.8
horror 8.7
screen 8.7
drawing 8.7
wall 8.5
grunge 8.5
telecommunication system 8.5
sofa 8.3
color 7.8
scary 7.7
men 7.7
pattern 7.5
personal computer 7.3
shop 7.1
businessman 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

drawing 96.4
wall 95.7
text 94.8
chair 91.6
indoor 91.6
cartoon 85.8
black and white 84.1
table 80.7
sketch 61.7
painting 59.6
clothing 55.3
furniture 29.1

Face analysis

Amazon

AWS Rekognition

Age 42-60
Gender Male, 53.7%
Surprised 45.3%
Sad 45.6%
Confused 45.8%
Calm 50%
Happy 45%
Angry 47.7%
Fear 45.5%
Disgusted 45.2%

AWS Rekognition

Age 19-31
Gender Male, 53%
Happy 45%
Angry 45.1%
Disgusted 45%
Sad 54.4%
Surprised 45%
Calm 45.4%
Fear 45.1%
Confused 45.1%

AWS Rekognition

Age 45-63
Gender Male, 54.5%
Calm 55%
Surprised 45%
Angry 45%
Confused 45%
Happy 45%
Fear 45%
Sad 45%
Disgusted 45%
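The three face readings above follow the shape of AWS Rekognition's DetectFaces output, where each detected face carries an age range, a gender estimate, and a distribution of emotion confidences. As a small sketch of how the headline emotion might be pulled from one DetectFaces-style `FaceDetails` entry, the `face` dict below is a hypothetical stand-in seeded with the third face's values:

```python
# Sketch: picking the dominant emotion from a Rekognition
# DetectFaces-style FaceDetails entry. `face` is a hand-built stand-in
# using the third face's values above, not a live API response.
face = {
    "AgeRange": {"Low": 45, "High": 63},
    "Gender": {"Value": "Male", "Confidence": 54.5},
    "Emotions": [
        {"Type": "CALM", "Confidence": 55.0},
        {"Type": "SURPRISED", "Confidence": 45.0},
        {"Type": "SAD", "Confidence": 45.0},
        {"Type": "DISGUSTED", "Confidence": 45.0},
    ],
}

def dominant_emotion(face_details):
    """Return (type, confidence) of the highest-confidence emotion."""
    best = max(face_details["Emotions"], key=lambda e: e["Confidence"])
    return best["Type"], best["Confidence"]

print(dominant_emotion(face))  # -> ('CALM', 55.0)
```

Note how close the runners-up sit to the winner here (45% vs. 55%): these per-emotion confidences are soft scores, which is why the listings above report the full distribution rather than a single label.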

Feature analysis

Amazon

Painting 84.1%

Captions

Microsoft

an old photo of a living room 80.3%
a black and white photo of a living room 71.4%
old photo of a living room 71.3%

Text analysis

Google

P
P