Human Generated Data

Title

Tête-à-tête

Date

1895

People

Artist: Edvard Munch, Norwegian, 1863-1944

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Lynn and Philip A. Straus, Class of 1937, M21538

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2019-10-30

Art 99.1
Human 94.9
Drawing 94.9
Painting 88.6
Sketch 86.2
Modern Art 82.4
Person 79.7
Canvas 63
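
The Amazon tags above are the kind of output returned by AWS Rekognition label detection. Below is a minimal sketch of such a call using boto3; the local image path, region, and thresholds are placeholders, not values from this record.

# Sketch: label detection with AWS Rekognition via boto3.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("tete-a-tete.jpg", "rb") as f:  # hypothetical local copy of the image
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=60,
    )

for label in response["Labels"]:
    # Confidence values are percentages, matching the "Art 99.1" style above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')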

Clarifai
created on 2019-10-30

illustration 99
art 98.9
painting 98.7
people 98.3
print 96.8
adult 95.6
one 92.7
retro 92.6
vintage 92.4
man 92.3
wear 90.9
old 90.6
symbol 89.7
picture frame 89.4
portrait 89
desktop 88.9
antique 88.8
furniture 87.5
paper 87
text 86.8
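
A hedged sketch of a Clarifai v2 predict request that could yield tags like those above; the API key, model ID, and image URL are placeholders, and the general-model assumption is not stated in this record.

import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"      # placeholder
GENERAL_MODEL_ID = "general-model-id"  # placeholder for Clarifai's public general model

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{GENERAL_MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/tete-a-tete.jpg"}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai reports values in 0-1; scale to match the percentages listed above.
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')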

Imagga
created on 2019-10-30

sketch 46.2
paper 43.1
money 39.1
currency 37.7
drawing 36.2
cash 31.1
old 30
dollar 28.8
bill 27.6
representation 27.5
finance 26.2
vintage 24.8
business 24.3
grunge 23.8
banking 22.1
wealth 20.6
bank 20.6
savings 20.5
financial 20.5
note 20.2
antique 20.1
dollars 19.3
retro 18.8
exchange 18.1
ancient 16.4
pay 16.3
close 16
texture 16
hundred 15.5
book 15.5
aged 15.4
bills 13.6
us 13.5
book jacket 13.4
investment 12.8
one 12.7
dirty 12.6
banknote 12.6
history 12.5
payment 12.5
blank 12
clock 12
states 11.6
worn 11.5
stamp 11.3
covering 11.2
page 11.1
frame 10.8
symbol 10.8
binding 10.8
art 10.7
design 10.6
sign 10.5
old fashioned 10.5
jacket 10.5
instrument 10.2
success 9.7
object 9.5
empty 9.4
card 9.4
rich 9.3
container 9.3
economy 9.3
wrapping 8.9
binder 8.9
brown 8.8
closeup 8.8
finances 8.7
loan 8.6
post 8.6
united 8.6
wallpaper 8.4
device 8.4
letter 8.3
historic 8.2
message 8.2
style 8.2
black 7.8
income 7.8
parchment 7.7
profit 7.7
save 7.6
pattern 7.5
wood 7.5
stock 7.5
measuring instrument 7.4
document 7.4
protective covering 7.4
blackboard 7.4
collection 7.2
travel 7
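
A hedged sketch of Imagga's v2 tagging endpoint, which returns tag/confidence pairs of the kind listed above; the credentials and image URL are placeholders.

import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/tete-a-tete.jpg"},  # placeholder URL
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),                 # placeholder credentials
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    # Imagga returns confidences as percentages, e.g. "sketch 46.2" above.
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')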

Google
created on 2019-10-30

Microsoft
created on 2019-10-30

drawing 99.8
sketch 99.4
painting 98.9
art 94.9
text 93.8
gallery 88.1
scene 83
room 81.5
child art 77.8
person 53.6
illustration 52.3
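
A hedged sketch of the Azure Computer Vision "analyze" call that could produce the Microsoft tags above; the endpoint, key, and image URL are placeholders, and the v2.0 API version is an assumption based on the 2019 creation date.

import requests

ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                 # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v2.0/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/tete-a-tete.jpg"},      # placeholder image URL
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    # Confidences come back in 0-1; scale to match the listing above.
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')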

Color Analysis

Face analysis

AWS Rekognition

Age 39-57
Gender Male, 92.8%
Happy 0.1%
Confused 0.1%
Disgusted 0%
Sad 2.3%
Angry 1.4%
Fear 0.1%
Calm 95.7%
Surprised 0.3%

AWS Rekognition

Age 4-12
Gender Female, 59.3%
Sad 41.7%
Happy 0.2%
Angry 0.2%
Confused 0.1%
Disgusted 0.1%
Fear 4.2%
Calm 53.5%
Surprised 0%

AWS Rekognition

Age 22-34
Gender Female, 96.2%
Sad 12.5%
Happy 58.5%
Angry 0.4%
Confused 2.9%
Disgusted 0.3%
Fear 1.3%
Calm 19.4%
Surprised 4.7%
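
The three blocks above correspond to AWS Rekognition face detection, which reports an age range, a gender estimate, and per-emotion confidences for each detected face. A minimal sketch of that call follows; the image path is a placeholder.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("tete-a-tete.jpg", "rb") as f:  # hypothetical local copy of the image
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')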

Microsoft Cognitive Services

Age 31
Gender Female
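
A hedged sketch of a Microsoft Face API detect call that could yield an age and gender estimate like the one above; the endpoint, key, and image URL are placeholders, and the age/gender attributes reflect the 2019-era API.

import requests

ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                 # placeholder

resp = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/tete-a-tete.jpg"},      # placeholder image URL
)
resp.raise_for_status()

for face in resp.json():
    attrs = face["faceAttributes"]
    print(f'Age {attrs["age"]:.0f}')
    print(f'Gender {attrs["gender"].title()}')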

Feature analysis

Amazon

Painting 88.6%
Person 79.7%

Categories

Captions

Microsoft
created on 2019-10-30

a room with art on the wall 60.5%
a room with pictures on the wall 59.6%
a picture of a room 54.5%
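
The captions above could come from the same Computer Vision analyze endpoint sketched earlier, with the Description visual feature instead of Tags; endpoint, key, and image URL remain placeholders.

import requests

resp = requests.post(
    "https://YOUR_REGION.api.cognitive.microsoft.com/vision/v2.0/analyze",  # placeholder endpoint
    params={"visualFeatures": "Description"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_SUBSCRIPTION_KEY"},         # placeholder key
    json={"url": "https://example.org/tete-a-tete.jpg"},                    # placeholder image URL
)
resp.raise_for_status()

for caption in resp.json()["description"]["captions"]:
    # Confidence is 0-1; scale to match "a room with art on the wall 60.5%".
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')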