Human Generated Data

Title

Untitled (religious statue in garden)

Date

1964

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19184

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Poster 94.7
Advertisement 94.7
Text 91.1
Person 86.8
Human 86.8
Flyer 83.3
Brochure 83.3
Paper 83.3

Clarifai
created on 2023-10-22

people 98.8
wear 97.9
art 97.5
one 97.1
woman 97.1
portrait 96.4
dress 95.9
painting 95.8
adult 94.2
print 92.4
child 92.1
picture frame 89.5
wedding 88.5
no person 88.3
two 86.6
retro 86
bride 85.4
old 84.8
fashion 82.5
window 81.5

Imagga
created on 2022-02-25

book jacket 46.3
jacket 37.9
bookend 27.7
wrapping 27.4
vintage 27.3
old 23.7
frame 23.4
support 22.9
retro 22.1
device 20.5
covering 20.4
paper 20.4
antique 19.9
black 18.6
blackboard 18.5
texture 18
blank 18
grunge 17.9
film 16.3
empty 14.6
wall 14.5
art 14.3
book 14
ancient 13.8
design 13.5
page 13
letter 12.8
border 12.7
culture 12
structure 11.3
negative 11.1
note 11
aged 10.9
office 10.7
style 10.4
business 10.3
brown 10.3
symbol 10.1
element 9.9
album 9.7
photograph 9.7
bookmark 9.7
text 9.6
sheet 9.4
space 9.3
dirty 9
material 8.9
product 8.9
printed 8.8
slide 8.8
textured 8.8
graphic 8.7
stamp 8.7
memory 8.7
decoration 8.7
damaged 8.6
post 8.6
grungy 8.5
wood 8.3
sign 8.3
envelope 8.3
template 8.3
historic 8.2
building 8.1
icon 7.9
wooden 7.9
masterpiece 7.9
known 7.9
postmark 7.9
paintings 7.8
spot 7.7
mail 7.7
worn 7.6
unique 7.6
desk 7.5
one 7.5
room 7.4
ornate 7.3
global 7.3
people 7.2
painter 7.1
interior 7.1

Google
created on 2022-02-25

Microsoft
created on 2022-02-25

text 99.4
tree 98.8
handwriting 96.1
picture frame 63.7
illustration 55.1

Face analysis

AWS Rekognition

Age 21-29
Gender Male, 64.3%
Calm 37.5%
Sad 20.7%
Surprised 12.3%
Fear 9.9%
Disgusted 7.8%
Confused 6.4%
Angry 4.1%
Happy 1.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 86.8%

Captions

Microsoft
created on 2022-02-25

a close up of a sign 80.6%
a sign for a photo 65.8%
a sign on a wall 65.7%

Text analysis

Amazon

64
DEC
Salette
1964
130
14
La Salette 130
La

Google

1964 DEC 64 Ca Salette 20
1964
DEC
64
Ca
Salette
20