Human Generated Data

Title

Untitled from the photo-romance Phantom Lady or Kismet

Date

1996-1998

People

Artist: Pushpamala N., Indian, born 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Purchase through the generosity of Elizabeth C. Lyman, Francis H. Burr Memorial Fund, Gweneth Knight Memorial Fund, Ann B. Goodman, Jose Soriano, Arthur Pardee and Ann Goodman, Lotty Zucker Foundation, 2010.515.12

Copyright

© Pushpamala N.

Machine Generated Data

Tags

Amazon
created on 2019-04-05

Human 99.3
Person 99.3
Person 98.4
Cafe 96.2
Restaurant 96.2
Sitting 95.7
Person 87
Person 84.5
Apparel 83
Clothing 83
Cafeteria 67.7
Flooring 66.3
Studio 63
Person 63
Leisure Activities 62.5
Silhouette 61.3
Photography 59.3
Photo 59.3
Person 58.3
Indoors 58.3
Room 57.5
Furniture 56.8
Couch 56.8
Person 49.6
Person 46.3
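The Amazon tags above follow the shape of an AWS Rekognition `DetectLabels` response: each label is a name paired with a confidence score. A minimal sketch of how such a listing might be rendered, using a hypothetical sample response in place of a live API call:

```python
# Sketch: formatting a Rekognition-style detect_labels response into
# "Name Confidence" lines like those above. `response` is a hypothetical
# sample; a live call would look roughly like:
#   import boto3
#   client = boto3.client("rekognition")
#   response = client.detect_labels(Image={"Bytes": image_bytes},
#                                   MinConfidence=40)
response = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.3},
        {"Name": "Cafe", "Confidence": 96.2},
        {"Name": "Sitting", "Confidence": 95.7},
    ]
}

def format_labels(resp):
    """Return 'Name Confidence' lines, confidence rounded to one decimal."""
    return [f'{lab["Name"]} {round(lab["Confidence"], 1)}'
            for lab in resp["Labels"]]

for line in format_labels(response):
    print(line)
```

Note that Rekognition may return the same label (e.g. "Person") once per detected instance, which is why it repeats in the list above.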

Clarifai
created on 2018-03-22

people 99.9
group 98.2
music 97.2
man 97
furniture 96.9
room 96.6
woman 96.5
adult 96.2
chair 96
musician 94
street 93.4
child 93.2
instrument 92.8
guitar 91.8
seat 90.3
monochrome 90.2
group together 90.1
sit 89.2
easy chair 87.1
family 85.6

Imagga
created on 2018-03-22

grand piano 50.3
piano 44.5
stringed instrument 36.4
keyboard instrument 33.7
percussion instrument 31.8
musical instrument 31.4
silhouette 28.1
man 23.5
people 20.1
adult 17.4
kin 16.3
sax 15.9
male 15.7
window 15
person 14.8
sunset 14.4
black 14
couple 13.1
teacher 12.9
business 12.8
life 11.9
women 11.1
urban 10.5
boy 10.4
men 10.3
love 10.3
city 10
educator 9.8
interior 9.7
professional 9.6
together 9.6
mother 9.6
child 9.6
indoor 9.1
world 9.1
travel 8.4
building 8.4
evening 8.4
music 8.1
wind instrument 8.1
shadow 8.1
home 8
portrait 7.8
play 7.8
youth 7.7
sky 7.7
walk 7.6
dark 7.5
future 7.4
vacation 7.4
body 7.2
transportation 7.2
family 7.1
summer 7.1
businessman 7.1

Google
created on 2018-03-22

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 51.5%
Confused 45.4%
Sad 51.8%
Angry 45.3%
Happy 45.3%
Disgusted 45.1%
Surprised 45.1%
Calm 47%

AWS Rekognition

Age 48-68
Gender Male, 54.8%
Angry 46%
Confused 45.1%
Disgusted 45%
Surprised 45.1%
Calm 53.1%
Sad 45.3%
Happy 45.4%

AWS Rekognition

Age 26-44
Gender Female, 52.7%
Angry 45.2%
Happy 45.1%
Sad 51.1%
Disgusted 45.1%
Surprised 45.2%
Calm 48.2%
Confused 45.1%

AWS Rekognition

Age 20-38
Gender Female, 54.6%
Disgusted 45.2%
Surprised 45.3%
Angry 45.2%
Confused 45.2%
Calm 50.4%
Sad 46%
Happy 47.6%
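Each face block above matches the layout of an AWS Rekognition `DetectFaces` result: an estimated age range, a gender guess with confidence, and per-emotion confidences. A sketch of summarizing one `FaceDetails` entry into that layout, again using a hypothetical sample in place of a live call:

```python
# Sketch: rendering one Rekognition-style FaceDetails entry in the layout
# used above. `face` is a hypothetical sample; a live call would resemble:
#   import boto3
#   resp = boto3.client("rekognition").detect_faces(
#       Image={"Bytes": image_bytes}, Attributes=["ALL"])
#   face = resp["FaceDetails"][0]
face = {
    "AgeRange": {"Low": 26, "High": 43},
    "Gender": {"Value": "Male", "Confidence": 51.5},
    "Emotions": [
        {"Type": "SAD", "Confidence": 51.8},
        {"Type": "CALM", "Confidence": 47.0},
    ],
}

def summarize_face(face):
    """Return the Age / Gender / Emotion lines for one FaceDetails entry."""
    lines = [
        f'Age {face["AgeRange"]["Low"]}-{face["AgeRange"]["High"]}',
        f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]}%',
    ]
    for emo in face["Emotions"]:
        lines.append(f'{emo["Type"].capitalize()} {emo["Confidence"]}%')
    return lines

for line in summarize_face(face):
    print(line)
```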

Feature analysis

Amazon

Person 99.3%

Captions

Microsoft

a group of people sitting in front of a window 82.9%
a group of people in a room 82.8%
a group of people sitting in a chair 82.7%