Human Generated Data

Title

[Pen drawing by Lyonel Feininger, 1902]

Date

1940s-1950s

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.1016.125

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-05-29

Human 94.3%
Person 94.3%
Face 71.4%
Figurine 55.7%
Clothing 55.4%
Apparel 55.4%
Wall 55.2%

Clarifai
created on 2019-05-29

people 97.8%
wear 96.8%
retro 95.4%
one 94.8%
adult 94.7%
man 92.9%
woman 90.2%
art 89%
painting 88.6%
vintage 87.9%
portrait 83.5%
old 81.1%
door 80.9%
no person 76.8%
bill 75.5%
paper 75.2%
card 74.1%
antique 73.7%
illustration 71.6%
wall 71.5%

Imagga
created on 2019-05-29

push button 65.2%
door 33.3%
old 30.6%
device 30.3%
wall 27.4%
architecture 21.1%
building 18.2%
wood 16.7%
mechanical device 16.4%
ancient 14.7%
winder 14.5%
house 14.2%
vintage 14%
antique 13.8%
entrance 13.5%
wooden 13.2%
buzzer 12.9%
lock 12.9%
mechanism 12.3%
key 12.2%
detail 12.1%
grunge 11.9%
window 11.9%
texture 11.8%
telephone 11.4%
call 11.3%
home 11.2%
dirty 10.8%
metal 10.5%
iron 9.9%
signaling device 9.9%
sculpture 9.7%
dial telephone 9.7%
style 9.6%
urban 9.6%
weathered 9.5%
street 9.2%
city 9.1%
black 9.1%
posing 8.9%
fastener 8.8%
rust 8.7%
handle 8.6%
statue 8.6%
padlock 8.5%
stone 8.4%
fashion 8.3%
rough 8.2%
electronic equipment 8.1%
paint 8.1%
aged 8.1%
history 8%
closed 7.7%
decoration 7.6%
brown 7.4%
equipment 7%
modern 7%

Microsoft
created on 2019-05-29

drawing 99.2%
painting 98.2%
art 96.4%
old 93.9%
sketch 92.4%
clothing 86.9%
cartoon 83.8%
person 81.8%
black 72.7%
child art 68%
white 64.7%
dirty 11.8%

Face analysis

AWS Rekognition

Age 35-52
Gender Female, 82.1%
Sad 92.3%
Happy 0.9%
Disgusted 0.9%
Surprised 0.8%
Calm 1.4%
Angry 1.8%
Confused 1.9%

AWS Rekognition

Age 48-68
Gender Male, 84.4%
Confused 56.2%
Disgusted 2.2%
Angry 7.9%
Happy 3.4%
Calm 23.3%
Sad 2.6%
Surprised 4.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 94.3%

Captions

Microsoft
created on 2019-05-29

a vintage photo of an old building 42.4%