Human Generated Data

Title

Pam and Marigny, New Orleans

Date

1985

People

Artist: Sarah Benham, American, born 1941

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of the Artist, P1989.4

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 97.4
Human 97.4
Leisure Activities 87.9
Finger 84.1
Musician 82.2
Musical Instrument 82.2
Clothing 71.5
Apparel 71.5
Cat 66.6
Animal 66.6
Mammal 66.6
Pet 66.6
Sleeve 57.8
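The "tag confidence" pairs above follow the shape of an AWS Rekognition DetectLabels response (each label carries a `Name` and a percentage `Confidence`). As a minimal sketch, the helper below renders such a response into the list format used on this page; the `sample` fragment reuses three labels from the list above for illustration and is not the actual API output for this photograph (a real call would go through boto3's `detect_labels`).

```python
def format_labels(response, threshold=55.0):
    """Render a DetectLabels-style response as 'Name confidence' lines,
    highest confidence first, dropping labels below the threshold."""
    labels = [
        (label["Name"], label["Confidence"])
        for label in response["Labels"]
        if label["Confidence"] >= threshold
    ]
    labels.sort(key=lambda pair: pair[1], reverse=True)
    return [f"{name} {confidence:.1f}" for name, confidence in labels]

# Illustrative response fragment echoing three labels from the list above.
sample = {
    "Labels": [
        {"Name": "Cat", "Confidence": 66.6},
        {"Name": "Person", "Confidence": 97.4},
        {"Name": "Sleeve", "Confidence": 57.8},
    ]
}
print(format_labels(sample))  # → ['Person 97.4', 'Cat 66.6', 'Sleeve 57.8']
```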

Clarifai
created on 2023-10-26

monochrome 99.7
portrait 99.5
girl 99.3
people 99.1
analogue 96.7
baby 96.1
self 95.8
child 95.4
art 94.2
light 94
woman 94
music 93.9
man 93.9
studio 93.6
shadow 93.3
black and white 93.2
model 93.1
one 92.7
son 92.6
adult 91.8

Imagga
created on 2022-01-09

black 37.2
person 34.3
hair 28.5
portrait 27.8
adult 26.6
people 23.4
fashion 23.4
model 23.3
attractive 21.7
sexy 21.7
human 18.8
studio 18.2
face 17.8
pretty 17.5
man 17.5
male 17.1
one 14.9
style 14.8
lady 14.6
skin 14.5
cute 14.4
dark 14.2
posing 13.3
make 12.7
women 12.7
dress 12.7
lips 12
body 12
expression 11.9
sensuality 11.8
performer 11.7
brunette 11.3
hands 11.3
eyes 11.2
makeup 11
elegance 10.9
clothing 10.9
comedian 10.8
looking 10.4
television 10.3
blond 10.2
sitting 9.5
youth 9.4
sensual 9.1
look 8.8
hat 8.8
nude 8.7
love 8.7
hand 8.6
teenager 8.2
smiling 8
covering 7.9
smile 7.8
child 7.7
erotic 7.7
desire 7.7
thinking 7.6
happy 7.5
entertainer 7.4
light 7.4
long 7.3
pose 7.3
lifestyle 7.2
eye 7.2
night 7.1
device 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 98.1
human face 96.4
person 92.7
black and white 92
monochrome 68.8
clothing 63.3
girl 56.3
portrait 54.2
screenshot 22.4
picture frame 12.5

Color Analysis

Face analysis

AWS Rekognition

Age 30-40
Gender Female, 87.4%
Sad 72.2%
Calm 27.6%
Confused 0.1%
Angry 0.1%
Disgusted 0.1%
Fear 0%
Surprised 0%
Happy 0%
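The age range, gender, and emotion lines above match the shape of a Rekognition DetectFaces `FaceDetail` (an `AgeRange` with `Low`/`High`, a `Gender` value with confidence, and an `Emotions` list). A minimal sketch of how such a record maps to the lines shown, using an illustrative fragment that reuses values from above (a real call would be `detect_faces` with `Attributes=["ALL"]`):

```python
def format_face(detail):
    """Render one FaceDetail-style record as the lines shown above:
    age range, gender, then emotions by descending confidence."""
    lines = [f"Age {detail['AgeRange']['Low']}-{detail['AgeRange']['High']}"]
    gender = detail["Gender"]
    lines.append(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(
        detail["Emotions"], key=lambda e: e["Confidence"], reverse=True
    ):
        # Rekognition reports emotion types in upper case (e.g. "SAD").
        lines.append(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
    return lines

# Illustrative FaceDetail fragment echoing the values above.
sample = {
    "AgeRange": {"Low": 30, "High": 40},
    "Gender": {"Value": "Female", "Confidence": 87.4},
    "Emotions": [
        {"Type": "CALM", "Confidence": 27.6},
        {"Type": "SAD", "Confidence": 72.2},
    ],
}
print(format_face(sample))
```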

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.4%
Cat 66.6%

Categories

Imagga

paintings art 96.4%
pets animals 3.4%

Captions