Human Generated Data

Title

Genita and Kim

Date

1994

People

Artist: Nicholas Nixon, American, born 1947

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of the artist, P2001.177

Copyright

© Nicholas Nixon

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 98.7
Human 98.7
Person 96.8
Face 96.3
Clothing 91.7
Apparel 91.7
Finger 89
Person 82.9
Sleeve 66.9
Portrait 62.6
Photography 62.6
Photo 62.6
Skin 60.4
Home Decor 56.8

Clarifai
created on 2023-10-25

monochrome 99.9
portrait 99.8
people 99.6
girl 99.4
woman 98.9
black and white 97.8
adult 97.2
street 96.8
son 96.3
model 96.3
two 96.1
sepia 95.7
man 95.3
fashion 95.1
smile 94.8
couple 94.5
love 94.5
beautiful 92.4
child 90.4
one 90.1

Imagga
created on 2022-01-09

portrait 43.4
adult 37.5
face 34.1
person 34.1
attractive 31.5
black 31.4
sexy 26.5
people 26.2
pretty 25.9
fashion 24.9
hair 23.8
model 23.4
women 21.4
happy 21.3
brunette 20.1
skin 18
cute 18
smile 17.8
looking 17.6
lips 17.6
sensual 17.3
sensuality 17.3
world 16.7
lifestyle 16.6
eyes 16.4
expression 16.2
human 15.8
child 15.1
lady 14.6
elegance 14.3
love 14.2
man 14.1
smiling 13.8
close 13.7
make 13.6
head 13.4
eye 13.4
style 13.4
look 13.2
male 13.1
makeup 12.9
youth 12.8
two 12.7
couple 12.2
dress 11.8
dark 11.7
posing 11.6
one 11.2
elegant 11.1
happiness 11
girls 10.9
20s 10.1
seat belt 10
studio 9.9
monochrome 9.8
adolescent 9.8
juvenile 9.7
body 9.6
hands 9.6
car 9.2
long 9.2
cover girl 9.1
teenager 9.1
hand 9.1
cheerful 8.9
blond 8.9
closeup 8.8
bride 8.6
bow tie 8.5
casual 8.5
mouth 8.5
cosmetics 8.4
joy 8.4
emotion 8.3
nice 8.3
safety belt 8
erotic 7.9
together 7.9
clothing 7.8
married 7.7
gorgeous 7.3
lovely 7.1
restraint 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

human face 98.9
text 97.4
person 93.9
clothing 86.3
black and white 76
fashion accessory 59.4
woman 56.7
portrait 55.2
hair 48.1

Color Analysis

Face analysis

AWS Rekognition

Age 20-28
Gender Female, 81.1%
Calm 53.1%
Sad 25.3%
Happy 9.8%
Fear 3%
Confused 3%
Surprised 2.2%
Disgusted 2.1%
Angry 1.6%

AWS Rekognition

Age 20-28
Gender Female, 85.9%
Angry 33.3%
Sad 28.7%
Calm 24%
Disgusted 7.5%
Fear 3.2%
Surprised 1.4%
Confused 1.2%
Happy 0.7%

Microsoft Cognitive Services

Age 19
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.7%

Categories

Imagga

paintings art 72.4%
people portraits 26.9%

Captions

Microsoft
created on 2022-01-09

a man and a woman taking a selfie 28.7%