Human Generated Data

Title

Untitled from the photo-romance Phantom Lady or Kismet

Date

1996-1998

People

Artist: Pushpamala N., Indian, born 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Purchase through the generosity of Elizabeth C. Lyman, Francis H. Burr Memorial Fund, Gweneth Knight Memorial Fund, Ann B. Goodman, Jose Soriano, Arthur Pardee and Ann Goodman, Lotty Zucker Foundation, 2010.515.17

Copyright

© Pushpamala N.

Machine Generated Data

Tags

Amazon
created on 2019-04-05

Human 98.5
Person 98.5
Person 98.1
Person 96.6
Person 92.5
Clothing 76.3
Apparel 76.3
Room 74.5
Indoors 74.5
Person 68.4
Housing 66
Building 66
Door 61.3
Prison 58.3
Workshop 56.5

Clarifai
created on 2018-03-22

people 100
adult 99.2
one 98.5
man 97.9
group 97.9
two 97.4
woman 97.4
group together 95.2
indoors 95
vehicle 93.3
administration 91.4
transportation system 91.2
military 90.2
wear 89.9
furniture 88.5
four 88.3
offense 88
train 87.8
three 87.7
room 87.3

Imagga
created on 2018-03-22

pay-phone 38.5
shop 37.3
telephone 31.8
elevator 30.1
device 26.6
mercantile establishment 26.6
barbershop 26.3
electronic equipment 23.6
lifting device 23.3
equipment 19.6
man 18.1
old 18.1
place of business 17.7
call 15.7
shoe shop 15.5
building 15.2
people 14.5
male 14.2
steel 14.1
urban 14
industrial 13.6
interior 13.3
black 13.2
men 12.9
industry 12.8
window 12
factory 11.8
metal 11.3
person 11.2
architecture 10.9
worker 10.7
structure 10.6
fashion 10.5
door 9.9
portrait 9.7
establishment 9.6
power 9.2
wood 9.2
machine 9.1
adult 9
light 8.7
work 8.6
sitting 8.6
glass 8.6
business 8.5
modern 8.4
energy 8.4
house 8.4
city 8.3
instrument of execution 8.2
history 8
working 8
machinery 7.8
electric chair 7.8
station 7.7
fuel 7.7
historic 7.3
sexy 7.2
home 7.2
travel 7
life 7

Google
created on 2018-03-22

Microsoft
created on 2018-03-22

person 93.9
black 73.4
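
The tag lists above pair a label with a confidence score (0-100) from each tagging service. As an illustration only, not the pipeline actually used to generate this record, label/confidence pairs of this shape can be obtained from Amazon Rekognition with a call like the following sketch (the boto3 client, local image path, and confidence threshold are all assumptions):

```python
# Minimal sketch: Rekognition object labels as (name, confidence) pairs,
# matching the "Person 98.5" style of the Amazon tag list above.
# Assumes boto3 is configured with AWS credentials and the image is a local file.
import boto3

def rekognition_labels(image_path, min_confidence=55.0):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=min_confidence,
        )
    return [(label["Name"], round(label["Confidence"], 1))
            for label in response["Labels"]]
```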

Color Analysis

Face analysis

AWS Rekognition

Age 29-45
Gender Male, 91.1%
Calm 16%
Disgusted 2.1%
Surprised 0.9%
Happy 1%
Sad 77.5%
Angry 1.2%
Confused 1.2%

AWS Rekognition

Age 26-43
Gender Female, 82.8%
Sad 0.8%
Confused 0.4%
Surprised 0.8%
Angry 0.5%
Disgusted 0.2%
Happy 0.3%
Calm 97.1%

AWS Rekognition

Age 35-52
Gender Female, 51.2%
Disgusted 45.1%
Sad 45.2%
Confused 45%
Angry 45.1%
Calm 48.2%
Happy 51.3%
Surprised 45.1%

Microsoft Cognitive Services

Age 48
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
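
The face records above (age range, gender, and per-emotion confidence scores) have the shape of Amazon Rekognition face detection output with all attributes requested. A minimal sketch under the same assumptions as above (boto3 credentials and a local image file, neither specified in this record):

```python
# Minimal sketch: Rekognition face attributes (age range, gender, emotions),
# one dictionary per detected face, as in the AWS Rekognition listings above.
import boto3

def rekognition_faces(image_path):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, emotions, etc.
        )
    faces = []
    for face in response["FaceDetails"]:
        faces.append({
            "age": (face["AgeRange"]["Low"], face["AgeRange"]["High"]),
            "gender": (face["Gender"]["Value"], face["Gender"]["Confidence"]),
            "emotions": {e["Type"]: e["Confidence"] for e in face["Emotions"]},
        })
    return faces
```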

Feature analysis

Amazon

Person 98.5%

Text analysis

Amazon

6
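
The single detected string above ("6") is the kind of result returned by Amazon Rekognition text detection. A minimal sketch, with the same caveats as the earlier examples (the image source is an assumption):

```python
# Minimal sketch: Rekognition text detection, returning detected line strings
# such as the lone "6" recorded above.
import boto3

def rekognition_text(image_path):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})
    return [d["DetectedText"] for d in response["TextDetections"]
            if d["Type"] == "LINE"]
```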