Human Generated Data

Title

The Creation III

Date

c. 1875-c. 1877

People

Artist: Frederick Hollyer, British, 1837 - 1933

Artist after: Edward Burne-Jones, British, 1833 - 1898

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Alvin Whitley Fund, P1994.10

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Art 94.9
Painting 93
Person 89.4
Human 89.4
Person 79.8
Wood 79.4
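
The Amazon tags above are label-detection confidences of the kind AWS Rekognition returns. Below is a minimal sketch of how such labels can be retrieved with the boto3 SDK; the local file name and the MaxLabels/MinConfidence settings are illustrative assumptions, not part of this record.

# Minimal sketch: label detection with AWS Rekognition via boto3.
# The file name and thresholds are placeholder assumptions.
import boto3

rekognition = boto3.client("rekognition")

with open("creation_iii.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=10,
    MinConfidence=70,
)

# Print labels in the same "Name Confidence" form used above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")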

Clarifai
created on 2023-10-26

art 99.3
no person 98
painting 96.8
people 96
Renaissance 95.2
man 95.1
ancient 94.8
veil 94.6
print 94.2
retro 93.5
old 92.7
carpentry 92
one 92
sculpture 91.1
illustration 90.1
adult 88.3
woman 87.9
religion 87.9
antique 87.5
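
The Clarifai concepts above are the kind produced by its general image-recognition model. The following is a hedged sketch against Clarifai's public v2 predict REST endpoint using the requests library; the API key, model name, and image URL are placeholders assumed for illustration.

# Hedged sketch: concept tagging via Clarifai's v2 predict REST endpoint.
# API key, model name, and image URL are placeholder assumptions.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"                      # placeholder
MODEL = "general-image-recognition"                    # assumed general model name
IMAGE_URL = "https://example.org/creation_iii.jpg"     # placeholder image URL

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
resp.raise_for_status()

# Concepts carry a 0-1 "value"; scale to match the percentages listed above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")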

Imagga
created on 2021-12-14

carving 99.6
sculpture 84.2
plastic art 51.3
art 44
old 30.7
pattern 27.4
ancient 26
texture 25.7
figure 22.9
design 22.8
temple 20.6
wood 20
wall 19.7
detail 18.5
stone 18.3
architecture 18.1
close 17.7
rough 17.3
statue 16.9
brown 16.9
surface 16.8
wooden 16.7
religion 16.1
structure 16.1
history 16.1
material 16.1
antique 14.9
oak 14.7
grunge 14.5
dirty 14.5
textured 14
plank 13.7
carved 13.7
dark 12.5
face 12.1
board 11.6
pine 11.6
panel 11.3
religious 11.2
hardwood 10.8
vintage 10.8
travel 10.6
backgrounds 10.5
construction 10.3
wallpaper 10
element 9.9
tree 9.5
color 9.5
closeup 9.4
culture 9.4
cemetery 9.4
frame 9.2
effect 9.1
tourist 9.1
aged 9.1
decoration 9
retro 9
building 8.7
mystery 8.7
obsolete 8.6
god 8.6
rusty 8.6
monument 8.4
floor 8.4
grain 8.3
bark 8.1
fantasy 8.1
natural 8
light 8
lumber 7.9
carve 7.8
carpentry 7.8
century 7.8
timber 7.8
golden 7.7
weathered 7.6
covering 7.6
decorative 7.5
tourism 7.4
style 7.4
gold 7.4
exterior 7.4
historic 7.3
sketch 7.2
paper 7.1
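
The Imagga tags above correspond to its automatic tagging service. A short sketch against Imagga's documented /v2/tags endpoint follows; the API credentials and image URL are placeholder assumptions.

# Hedged sketch: automatic tagging with Imagga's /v2/tags endpoint.
# API key/secret and image URL are placeholder assumptions.
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/creation_iii.jpg"     # placeholder image URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
resp.raise_for_status()

# Each entry carries a confidence score and a language-keyed tag name.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")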

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 88.9
painting 87.7
art 85.5
drawing 80.7
sketch 62.2
stone 7.7

Face analysis

AWS Rekognition

Age 12-22
Gender Female, 94.7%
Calm 81.8%
Sad 13.3%
Angry 3.1%
Confused 0.6%
Happy 0.5%
Fear 0.3%
Disgusted 0.3%
Surprised 0.1%

AWS Rekognition

Age 12-22
Gender Female, 98.7%
Sad 57.5%
Calm 34.9%
Happy 2.4%
Angry 2.3%
Fear 1.6%
Confused 0.9%
Surprised 0.2%
Disgusted 0.2%

AWS Rekognition

Age 13-25
Gender Female, 96.7%
Angry 71.5%
Calm 26.5%
Sad 0.9%
Disgusted 0.4%
Confused 0.3%
Surprised 0.1%
Fear 0.1%
Happy 0.1%
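
The three AWS Rekognition records above (age range, gender, and per-emotion confidences) match the fields of Rekognition's face-analysis output. A minimal boto3 sketch follows; the local file name is a placeholder and the print formatting simply mirrors the fields shown above.

# Minimal sketch: face analysis with AWS Rekognition via boto3.
# The local file name is a placeholder assumption.
import boto3

rekognition = boto3.client("rekognition")

with open("creation_iii.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")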

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely
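
The Google Vision entries above report likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal sketch using the google-cloud-vision client library follows; the local file name is a placeholder assumption.

# Minimal sketch: face detection with the google-cloud-vision client library.
# The local file name is a placeholder assumption.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("creation_iii.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihoods are enum values such as VERY_UNLIKELY, UNLIKELY, POSSIBLE, ...
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)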

Feature analysis

Amazon

Painting 93%
Person 89.4%

Captions

Microsoft
created on 2021-12-14

an old photo of a person 87.7%
an old photo of a girl 72.7%
old photo of a person 72.6%
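
The Microsoft tags and captions above correspond to the "tag" and "describe" operations of Azure Computer Vision. Below is a hedged sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholder assumptions.

# Hedged sketch: captions and tags from Azure Computer Vision.
# Endpoint, key, and image URL are placeholder assumptions.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com/"  # placeholder
KEY = "YOUR_AZURE_KEY"                                           # placeholder
IMAGE_URL = "https://example.org/creation_iii.jpg"               # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# Captions, e.g. "an old photo of a person", with confidences.
description = client.describe_image(IMAGE_URL)
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")

# Tags such as "painting", "art", "drawing".
tags = client.tag_image(IMAGE_URL)
for tag in tags.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}%")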

Text analysis

Google