Human Generated Data

Title

Reading

Date

1896

People

Artist: Eugène Carrière, French, 1849 - 1906

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Arthur K. and Mariot F. Solomon Collection, 2021.255

Machine Generated Data

Tags

Amazon
created on 2022-07-05

Art 97
Person 79.3
Human 79.3
Painting 71

Imagga
created on 2022-07-05

window screen 73.8
screen 59.7
protective covering 48.6
vintage 46.3
old 43.2
grunge 41.7
blackboard 41.7
texture 40.3
frame 38.4
covering 37.7
blank 35.2
wall 29.9
chalkboard 29.4
aged 28.1
retro 27.9
black 27.1
antique 26.8
empty 24.9
board 23.2
dirty 22.6
chalk 21.4
paper 21.2
textured 20.2
grungy 19.9
television 19.3
rough 19.1
art 19.1
space 18.6
ancient 18.2
border 18.1
school 18
education 17.3
design 17
classroom 16.5
wood 15.9
pattern 15.7
message 15.5
gray 15.3
wooden 15
text 14.8
letter 14.7
film 14.6
telecommunication system 14.6
concrete 14.4
worn 14.3
symbol 14.1
note 13.8
material 13.4
rusty 12.4
copy 12.4
learn 12.3
wallpaper 12.3
lesson 11.7
nobody 11.7
reminder 11.6
binding 11.6
surface 11.5
damaged 11.5
study 11.2
communication 10.9
photograph 10.8
backdrop 10.7
teach 10.7
decoration 10.5
write 10.4
sheet 10.3
icon 10.3
cover 10.2
global 10
dark 10
decorative 10
album 9.7
business 9.7
close 9.7
notice 9.7
announcement 9.7
stucco 9.7
stain 9.6
painted 9.5
weathered 9.5
writing 9.4
stone 9.4
stamp 8.9
museum 8.9
brown 8.8
book jacket 8.7
rust 8.7
stained 8.7
aging 8.6
obsolete 8.6
card 8.5
word 8.5
drawing 8.3
paint 8.2
closeup 8.1
metal 8.1
billboard 8
structure 8
interior 8
printed 7.9
instant 7.9
postage 7.9
book 7.9
scratched 7.8
paintings 7.8
envelope 7.8
render 7.8
jacket 7.8
spot 7.7
mail 7.7
post 7.6
age 7.6
photography 7.6
element 7.4
page 7.4
metallic 7.4
idea 7.1
work 7.1

Google
created on 2022-07-05

Microsoft
created on 2022-07-05

text 98.8
painting 97.8
drawing 96.2
monitor 92.8
gallery 91.2
sketch 90.9
person 89.7
human face 88.8
room 82.5
picture frame 27.2

Face analysis

AWS Rekognition

Age 22-30
Gender Female, 97.6%
Calm 88.9%
Happy 7.9%
Surprised 6.3%
Fear 6%
Sad 2.8%
Disgusted 0.4%
Confused 0.3%
Angry 0.1%

Microsoft Cognitive Services

Age 31
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 79.3%
Painting 71%

Text analysis

Amazon

tires
tyma. tires
tyma.
Original

Google

gent Connere tson on a tirer p Sugona Classione
gent
Connere
tson
on
a
tirer
p
Sugona
Classione