Human Generated Data

Title

Study for "Man and Woman on a Bed"

Date

c. 1880-82

People

Artist: John Singer Sargent, American, 1856-1925

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Mrs. Francis Ormond, 1937.7.27.3.A

Machine Generated Data

Tags

Amazon
created on 2020-04-25

Art 94.2
Human 92.8
Person 92.7
Painting 89.4
Drawing 88
Sketch 82.5
Photo 58.1
Photography 58.1
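
The Amazon tags above are confidence scores from the Rekognition label-detection API. A minimal sketch of reproducing such a tag list with boto3 follows; the file name sargent_study.jpg and the region are placeholders, not the museum's actual asset or configuration.

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("sargent_study.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50,  # the lowest score listed above is ~58
)

for label in response["Labels"]:
    # Prints pairs like "Art 94.2", matching the format of the tag list
    print(f"{label['Name']} {label['Confidence']:.1f}")
```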

Clarifai
created on 2020-04-25

art 99
people 98.9
wear 98.7
man 96.3
adult 96.2
painting 95.5
retro 95.4
vintage 94.3
old 94.1
portrait 92.4
illustration 91.6
sepia pigment 91.6
paper 91.3
antique 91.1
print 89.5
one 87.1
two 86
artistic 85.1
veil 82.3
ancient 81.1
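
Clarifai's scores are likewise percentage confidences, here from its general-concept model. A hedged sketch using the legacy clarifai 2.x Python client (contemporaneous with the 2020-04-25 run); the API key and file name are placeholders.

```python
from clarifai.rest import ClarifaiApp

app = ClarifaiApp(api_key="your_clarifai_key")  # placeholder credential
model = app.public_models.general_model

response = model.predict_by_filename("sargent_study.jpg")

# Concept values are in [0, 1]; scaling by 100 matches the list above
for concept in response["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))
```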

Imagga
created on 2020-04-25

sketch 100
drawing 100
representation 100
vintage 33.9
grunge 30.7
paper 30.6
retro 30.3
old 29.3
antique 26
aged 21.7
ancient 21.6
texture 21.5
art 20.8
money 19.6
design 16.3
currency 16.2
frame 15.8
cash 15.6
stamp 15.5
banking 13.8
mail 13.4
grungy 13.3
graphic 13.1
letter 12.8
postmark 12.8
finance 12.7
material 12.5
damaged 12.4
business 12.2
border 11.8
pattern 11.6
financial 11.6
wallpaper 11.5
textured 11.4
rough 10.9
space 10.9
dirty 10.9
postage 10.8
wealth 10.8
bank 10.8
worn 10.5
exchange 10.5
old fashioned 10.5
brown 10.3
savings 10.3
dollar 10.2
page 10.2
decorative 10
paint 10
history 9.8
postal 9.8
market 9.8
detail 9.7
parchment 9.6
post 9.5
symbol 9.4
floral 9.4
note 9.2
global 9.1
decoration 9.1
black 9
envelope 9
philately 8.9
bills 8.7
us 8.7
pay 8.6
empty 8.6
blank 8.6
close 8.6
wall 8.6
canvas 8.5
flower 8.5
rich 8.4
element 8.3
investment 8.3
backdrop 8.3
backgrounds 8.1
collection 8.1
grime 7.8
color 7.8
states 7.7
dollars 7.7
decay 7.7
edge 7.7
card 7.7
one 7.5
economy 7.4
style 7.4
structure 7
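
Imagga exposes its tagger as a plain REST endpoint, so a sketch needs only the requests library; the credentials and file name below are placeholders. The long tail of money, stamp, and postage tags suggests the model reads the sepia sheet as aged paper ephemera rather than as a figure drawing.

```python
import requests

API_KEY = "your_imagga_key"        # placeholder credentials
API_SECRET = "your_imagga_secret"

with open("sargent_study.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

# Each entry pairs an English tag with a 0-100 confidence, as listed above
for entry in response.json()["result"]["tags"]:
    print(entry["tag"]["en"], entry["confidence"])
```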

Google
created on 2020-04-25

Microsoft
created on 2020-04-25

drawing 99.8
sketch 99.7
painting 92.5
child art 85.7
person 79.8
woman 77
art 74.2
illustration 72.6
human face 62.9
text 59.4
clothing 56.1
cartoon 50.2
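
Microsoft's tags map to the Azure Computer Vision tagging operation. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK, assuming a placeholder endpoint and key:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Endpoint and key are placeholders for an Azure Computer Vision resource
client = ComputerVisionClient(
    "https://your-resource.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("your_azure_key"),
)

with open("sargent_study.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

# Confidences are returned in [0, 1]; scale by 100 to match the list above
for tag in result.tags:
    print(tag.name, round(tag.confidence * 100, 1))
```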

Color Analysis

Face analysis

AWS Rekognition

Age 22-34
Gender Female, 93.3%
Happy 4.1%
Calm 85.3%
Disgusted 1.4%
Surprised 5.2%
Confused 0.3%
Angry 3.3%
Fear 0.2%
Sad 0.3%

AWS Rekognition

Age 12-22
Gender Male, 87.9%
Happy 0.1%
Surprised 0%
Confused 0.1%
Disgusted 0%
Calm 99.1%
Sad 0.5%
Fear 0%
Angry 0.2%
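
Both blocks above are per-face results from Rekognition's DetectFaces operation; the age range, gender, and emotion scores only appear when all facial attributes are requested. A minimal boto3 sketch (placeholder file name):

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("sargent_study.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required for AgeRange, Gender, and Emotions
    )

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```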

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
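
Google Vision reports face attributes as likelihood buckets rather than percentages, which is why this block reads differently from Rekognition's. A sketch with the google-cloud-vision 2.x client, assuming credentials are configured via GOOGLE_APPLICATION_CREDENTIALS:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("sargent_study.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY)
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```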

Feature analysis

Amazon

Person 92.7%

Categories

Imagga

paintings art 100%

Captions

Microsoft
created on 2020-04-25

an old photo of a person 55.5%
old photo of a person 48.4%
a photo of a person 48.3%
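
The ranked captions come from the Computer Vision describe operation, which returns several candidate sentences with confidences. A sketch, again with placeholder endpoint and key:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Endpoint and key are placeholders for an Azure Computer Vision resource
client = ComputerVisionClient(
    "https://your-resource.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("your_azure_key"),
)

with open("sargent_study.jpg", "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=3)

# Candidates arrive ranked, e.g. "an old photo of a person 55.5%"
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```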