Human Generated Data

Title

Transvestite performer in dressing room at a performance of the “Cockettes,” New York City

Date

1971

People

Artist: Leonard Freed, American, 1929–2006

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Anonymous Gift, 2014.457

Copyright

© Leonard Freed/Magnum Photos

Machine Generated Data

Tags

Amazon
created on 2019-04-08

Skin 99.1
Furniture 98.1
Person 95.4
Human 95.4
Back 93.7
Table 89.9
Desk 88.8
Screen 82.8
Display 82.8
LCD Screen 82.8
Electronics 82.8
Monitor 82.8
Indoors 81.5
Face 71
Room 70.9
Pc 62.3
Computer 62.3
Finger 58
Beard 56.5
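
The label/confidence pairs above are the kind of output returned by Amazon Rekognition's DetectLabels operation. A minimal sketch of such a call using boto3 follows; the client configuration and the local image filename are illustrative assumptions, not the museum's documented pipeline.

    import boto3

    # Illustrative Rekognition client; region is an assumption.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Hypothetical local copy of the photograph.
    with open("freed_cockettes_1971.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns label names with confidence scores,
    # matching the form of the list above.
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,        # the list above holds roughly 20 labels
        MinConfidence=55.0,  # lowest score shown above is 56.5
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")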

Clarifai
created on 2018-02-10

people 99.9
one 99.1
adult 99
two 98.6
group 98.1
vehicle 97.1
administration 95.8
man 94.9
wear 94.5
group together 94.2
woman 93.1
military 91.8
many 91.4
war 91
three 89.9
transportation system 87.2
recreation 86.8
veil 86.8
music 86.8
leader 86.2

Imagga
created on 2018-02-10

person 20.7
man 20.1
work 19.9
people 19.5
adult 18.5
hair 17.4
device 17
worker 16.1
industry 15.4
salon 14.6
appliance 14.4
human 13.5
job 13.3
black 13.2
hand 12.9
male 12.8
attractive 12.6
factory 12.5
sexy 12
looking 12
body 12
women 11.9
equipment 11.3
metal 11.3
manufacturing 10.7
steel 10.6
working 10.6
computer 10.5
technology 10.4
machine 10.3
industrial 10
welder 9.9
studio 9.9
fashion 9.8
mask 9.8
labor 9.7
portrait 9.7
skin 9.6
repair 9.6
light 9.3
blond 9.2
occupation 9.2
close 9.1
business 9.1
tool 8.9
welding 8.9
manufacture 8.8
durables 8.7
brunette 8.7
hands 8.7
lifestyle 8.7
skill 8.7
craft 8.6
sitting 8.6
men 8.6
model 8.5
face 8.5
dryer 8.5
blower 8.5
safety 8.3
sensuality 8.2
home 8
interior 8
weld 7.9
laptop 7.9
pretty 7.7
health 7.6
workplace 7.6
book 7.6
relax 7.6
head 7.6
relaxation 7.5
instrument 7.5
happy 7.5
care 7.4
protection 7.3
student 7.2
music 7.2
office 7.2
smile 7.1
table 7.1
indoors 7

Google
created on 2018-02-10

Microsoft
created on 2018-02-10

person 98.3
indoor 88.8

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 17-27
Gender Female, 94.7%
Happy 0.6%
Sad 5.2%
Surprised 0.6%
Confused 0.6%
Calm 90%
Angry 1.6%
Disgusted 1.3%

AWS Rekognition

Age 26-43
Gender Female, 81.1%
Calm 39.7%
Surprised 1.7%
Sad 39.8%
Angry 6.4%
Disgusted 8.7%
Confused 1.8%
Happy 1.8%
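
The age range, gender, and emotion scores above correspond to Rekognition's DetectFaces operation with all facial attributes requested. A hedged sketch, reusing the hypothetical client and image bytes from the label example above:

    # DetectFaces with Attributes=["ALL"] adds AgeRange, Gender, and
    # Emotions to each detected face, in the form reported above.
    faces = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in faces["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            # Emotion types are returned uppercase (e.g. "CALM").
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")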

Feature analysis

Amazon

Person 95.4%

Captions

Microsoft
created on 2018-02-10

a person holding a cat 48.4%
a person standing next to a cat 48.3%
a person holding a cat 43.2%

Text analysis

Amazon

ME
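
The detected string "ME" is the sort of result Rekognition's DetectText operation produces. A minimal sketch, again reusing the hypothetical client and image bytes from the examples above:

    # DetectText returns LINE- and WORD-level detections; the LINE
    # entries correspond to short strings like the one reported above.
    text = rekognition.detect_text(Image={"Bytes": image_bytes})

    for detection in text["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], f"{detection['Confidence']:.1f}")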