Human Generated Data

Title

[Untitled]

Date

1972

People

Artist: Brandt & Lenze

Classification

Prints

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, The Willy and Charlotte Reber Collection, Gift of Charlotte Reber, 1996.152.31

Machine Generated Data

Tags

Amazon
created on 2019-11-09

Advertisement 100
Poster 100
Human 99.6
Person 99.6
Person 99.5
Person 99.3
Person 99.2
Person 98.5
Person 98.4
Person 97.2
Military 84.1
Military Uniform 84.1
Person 80.1
Person 75.2
People 72.3
Officer 64.6
Soldier 62.6
Art 60.5
Sculpture 60.5
Army 60.2
Armored 60.2
Musical Instrument 57.7
Musician 57.7
Person 56.8
Duel 55.6

Clarifai
created on 2019-11-09

people 99.5
illustration 99.3
adult 97.2
man 97.2
art 94.9
vertical 94.8
woman 91.5
print 90.7
bill 89.8
wear 86.3
portrait 85.2
leader 84.6
text 83.5
indoors 82.8
monochrome 80
group 76.8
military 76.3
administration 75.2
sword 74.6
lithograph 73.2

Imagga
created on 2019-11-09

stringed instrument 38.2
musical instrument 36.8
guitar 31.5
statue 28.2
electric guitar 24.3
sculpture 22.8
black 20.4
art 19.1
old 18.1
architecture 18
religion 17
ancient 16.4
travel 14.8
person 14.7
culture 14.5
brass 14.4
building 14.3
wind instrument 14.2
device 14.1
monument 14
weapon 13.9
trombone 13.3
bowed stringed instrument 13
instrument 11.9
city 11.6
tourism 11.5
bass 11.4
face 11.4
sexy 11.2
history 10.7
fashion 10.6
body 10.4
sword 10.3
historic 10.1
man 10.1
people 10
landmark 9.9
music 9.9
vintage 9.9
violin 9.8
night 9.8
temple 9.6
sax 9.5
light 9.4
entertainment 9.2
metal 8.9
ornament 8.6
golden 8.6
religious 8.4
portrait 8.4
dark 8.4
church 8.3
tourist 8.2
antique 7.8
catholic 7.8
worship 7.7
artist 7.7
pretty 7.7
grunge 7.7
performance 7.7
historical 7.5
stage 7.5
traditional 7.5
gold 7.4
adult 7.4
performer 7.3
musician 7.2
hand 7.2
hair 7.1
male 7.1

Google
created on 2019-11-09

Poster 91.2
Illustration 70.1
Art 58.1

Microsoft
created on 2019-11-09

text 100
book 98.2
person 89.5
clothing 85.4
statue 85.4
old 80.5
black 77.7
man 65.9
posing 63.2
vintage 28.9

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 28-44
Gender Male, 54.9%
Happy 45%
Calm 52.1%
Angry 46.9%
Sad 45.1%
Surprised 45.5%
Confused 45%
Disgusted 45%
Fear 45.3%

AWS Rekognition

Age 36-52
Gender Male, 53.6%
Disgusted 45.1%
Confused 45%
Surprised 45.1%
Fear 45%
Sad 45.4%
Happy 45%
Calm 53.7%
Angry 45.6%

AWS Rekognition

Age 32-48
Gender Male, 54.9%
Happy 45%
Calm 53.9%
Surprised 45%
Angry 45%
Sad 46.1%
Fear 45%
Confused 45%
Disgusted 45%

AWS Rekognition

Age 32-48
Gender Male, 55%
Angry 45.7%
Surprised 45%
Calm 54.2%
Confused 45%
Sad 45%
Fear 45%
Happy 45%
Disgusted 45%

AWS Rekognition

Age 37-55
Gender Male, 54.5%
Happy 45%
Sad 45.1%
Surprised 45.1%
Fear 51.4%
Angry 48.3%
Confused 45%
Disgusted 45.1%
Calm 45%

AWS Rekognition

Age 35-51
Gender Male, 54.4%
Happy 45.8%
Fear 45.2%
Surprised 45.3%
Disgusted 45.1%
Angry 47.7%
Sad 45.4%
Confused 45.1%
Calm 50.4%

AWS Rekognition

Age 44-62
Gender Male, 53.8%
Happy 45%
Angry 45%
Fear 45%
Surprised 45.1%
Calm 54.6%
Sad 45.2%
Disgusted 45%
Confused 45%

AWS Rekognition

Age 54-72
Gender Male, 54.6%
Disgusted 45.1%
Fear 45%
Surprised 45%
Confused 45%
Angry 45.1%
Happy 45%
Sad 45.1%
Calm 54.7%

AWS Rekognition

Age 45-63
Gender Male, 54.6%
Calm 47%
Disgusted 45%
Sad 45.1%
Happy 51.5%
Confused 45%
Fear 45.2%
Angry 45.6%
Surprised 45.6%

AWS Rekognition

Age 37-55
Gender Male, 53.5%
Calm 53.9%
Angry 45%
Sad 46.1%
Disgusted 45%
Fear 45%
Confused 45%
Happy 45%
Surprised 45%

Feature analysis

Amazon

Poster 100%
Person 99.6%

Categories

Text analysis

Amazon

1933
RETTUNG
VATERLANDES
DES VATERLANDES
SAMMLUNGSBEWEGUNG
DES
SAMMLUNGSBEWEGUNG ZUR RETTUNG
ZUR
1971
ns

Google

1933 1971 SAMMLUNGSBEWEGUNG ZUR RETTUNG DES VATERLANDES A d
1933
1971
SAMMLUNGSBEWEGUNG
ZUR
RETTUNG
DES
VATERLANDES
A
d