Human Generated Data

Title

Over the Ultimate

Date

1926

People

Artist: Rockwell Kent, American, 1882–1971

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Paul J. Sachs, M3333

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Human 97.3
Art 77.1
Hand 67.6
Finger 61
Acrobatic 59.8
Advertisement 57.3
Drawing 56.5
Poster 50.3
Person 47.1

Imagga
created on 2022-03-04

blackboard 58.3
frame 38.2
vintage 33.9
chalkboard 33.3
grunge 30.6
chalk 27.3
texture 27.1
old 26.5
retro 26.2
design 25.3
board 24.7
school 22.4
blank 22.3
black 21.4
card 21.2
education 20.8
message 19.2
border 19
text 18.3
paper 17.5
graphic 17.5
classroom 17.5
aged 17.2
background 16.6
antique 16.4
art 16.2
drawing 15.6
note 15.6
symbol 14.8
teach 14.6
learn 14.2
element 14.1
screen 14
study 14
business 14
wallpaper 13.8
floral 13.6
pattern 13
envelope 12.9
empty 12.9
lesson 12.7
space 12.4
write 12.2
wall 12
ancient 11.2
display 11.2
letter 11
billboard 11
decoration 10.9
wood 10.8
silhouette 10.8
class 10.6
word 10.4
writing 10.3
leaf 10.1
communication 10.1
flower 10
idea 9.8
notice 9.7
invitation 9.6
college 9.5
label 9.4
swirl 9.2
banner 9.2
backdrop 9.1
gold 9
sign 9
wooden 8.8
photograph 8.8
celebration 8.8
monitor 8.6
container 8.5
finance 8.4
classic 8.4
artwork 8.2
brown 8.1
object 8.1
notebook 8
chocolate 8
icon 7.9
book 7.9
album 7.8
reminder 7.8
modern 7.7
university 7.7
rusty 7.6
elegance 7.6
traditional 7.5
greeting 7.4
style 7.4
template 7.3
ornate 7.3
dirty 7.2
holiday 7.2
science 7.1
copy 7.1
creative 7.1

Google
created on 2022-03-04

Elbow 84
Knee 81.8
Art 81.6
Font 75.2
Rectangle 75.1
Balance 70.5
Marine mammal 68.1
Recreation 67.5
Tail 66.9
Wrist 65.3
Illustration 64.1
Visual arts 60.5
Vintage clothing 59.5
Advertising 55.7
Photographic paper 50.7

Microsoft
created on 2022-03-04

text 93.5
gallery 87.9
room 78.7
scene 71.8
drawing 71.5
dance 53.3
painting 51

Feature analysis

Amazon

Poster 50.3%
Person 47.1%

Captions

Microsoft

a screen shot of a person 28.3%
a photo of a person 28.2%

Text analysis

Amazon

and