Human Generated Data

Title

INVESTMENTFOND, from the portfolio "Box 1"

Date

1970

People

Artist: Otto Muehl, Austrian, 1925–2013

Publisher: Edition Hundertmark, Berlin

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, The Willy and Charlotte Reber Collection. Gift of Charlotte Reber, 1996.151.7

Copyright

© Archives Otto Muehl / Artists Rights Society (ARS), New York

Machine Generated Data

Tags

Amazon
created on 2019-04-08

Human 98.9
Person 98.9
Person 96.2
Furniture 90.2
Sitting 85.1
Chair 84.6
Stroller 82.9
Clothing 80.7
Apparel 80.7
Floor 69.4
Coat 64.5
Overcoat 64.5
Flooring 62.4
Photography 60
Photo 60
Suit 59.6
Undershirt 58.2
Art 57.8
Machine 57.2
Spoke 57.2
Face 56.6
Portrait 56.6
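Each machine-generated tag above pairs a label with a confidence score, which lends itself to simple post-processing such as thresholding. A minimal Python sketch: the threshold value and the subset of tags reproduced here are illustrative choices, not part of the record.

```python
# Illustrative sketch: filtering label/confidence pairs by a threshold.
# The tags below are a subset copied from the Amazon list above;
# the default threshold of 80.0 is an arbitrary example value.

tags = [
    ("Human", 98.9), ("Person", 98.9), ("Furniture", 90.2),
    ("Sitting", 85.1), ("Chair", 84.6), ("Stroller", 82.9),
    ("Clothing", 80.7), ("Floor", 69.4), ("Portrait", 56.6),
]

def filter_tags(tags, threshold=80.0):
    """Keep only tags whose confidence meets or exceeds the threshold."""
    return [(label, score) for label, score in tags if score >= threshold]

high_confidence = filter_tags(tags)
print(high_confidence)
```

A higher threshold trades recall for precision: at 80.0, low-confidence labels such as "Floor" and "Portrait" are dropped while the strongly detected person- and furniture-related labels remain.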

Clarifai
created on 2018-02-09

people 99.6
two 97.2
adult 97
woman 96.2
man 95.3
one 95.2
wear 91.8
indoors 82.4
administration 77.9
music 77.1
three 76.3
family 75.6
home 75.2
group 74.9
child 74.4
recreation 74
portrait 73.8
love 73.4
street 72.7
four 72.4

Imagga
created on 2018-02-09

black 19.8
man 19.5
adult 17.8
portrait 16.8
person 16.8
male 15
device 14.3
people 12.3
fashion 12.1
wall 11.1
model 10.9
equipment 10.5
urban 10.5
body 10.4
city 10
cleaner 9.9
sexy 9.6
sport 9.5
call 9.5
window 9.2
sensuality 9.1
holding 9.1
dress 9
human 9
building 8.9
telephone 8.8
hair 8.7
women 8.7
face 8.5
grunge 8.5
business 8.5
crutch 8.4
elegance 8.4
street 8.3
alone 8.2
style 8.2
dirty 8.1
suit 8.1
posing 8
businessman 7.9
play 7.8
men 7.7
concrete 7.7
newspaper 7.6
hand 7.6
mask 7.5
one 7.5
room 7.4
light 7.4
danger 7.3
stick 7.2
lifestyle 7.2
active 7.2
clothing 7.2

Google
created on 2018-02-09

Microsoft
created on 2018-02-09

wall 97.5
gallery 96.2
man 94.6
scene 80.2
room 78.4

Face analysis

Amazon

AWS Rekognition

Age 16-27
Gender Female, 99.6%
Sad 20.5%
Calm 68.8%
Confused 2.9%
Angry 1.2%
Happy 2.1%
Disgusted 3.1%
Surprised 1.3%

AWS Rekognition

Age 30-47
Gender Male, 77.3%
Sad 8.4%
Calm 41.6%
Angry 18.8%
Surprised 7.7%
Confused 10%
Happy 2.2%
Disgusted 11.3%

Feature analysis

Amazon

Person 98.9%

Captions

Microsoft

a black and white photo of a man 62.4%
a man standing in a room 62.3%
a man jumping in the air 45.1%

Text analysis

Amazon

otlo
INVESTMENTFO/)
Mar2
muehl INVESTMENTFO/) Mar2 /9Fo
muehl
/9Fo