Human Generated Data

Title

Untitled (studio portrait of woman with baby in dress)

Date

1942

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9040

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 97.2
Person 97.2
Person 95.8
Door 87.6
Clothing 85.5
Apparel 85.5
Face 83.4
Art 72.7
Portrait 70
Photography 70
Photo 70
Baby 55
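
The label list above is consistent with output from the AWS Rekognition DetectLabels API. A minimal sketch of how comparable tags could be generated with boto3 follows; the image filename, region, and threshold values are assumptions, not part of the original record.

    # Sketch: Rekognition-style labels for a scan of the photograph.
    # "photo.jpg" is a hypothetical filename; AWS credentials are assumed configured.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,         # cap comparable to the list above
        MinConfidence=55,     # lowest score shown above is Baby 55
    )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')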

Clarifai
created on 2023-10-26

monochrome 99.6
portrait 99
people 98.5
black and white 96.5
man 95.5
girl 95
mask 94.9
face 94.3
woman 94.1
Halloween 93.8
art 93.7
adult 92.4
studio 92.2
costume 90.6
model 90.1
person 89.3
guy 86.7
dress 85.8
lid 84.7
music 82.4
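
Tags of this kind typically come from Clarifai's general image-recognition model via its v2 predict endpoint. The sketch below uses the public REST API; the API key placeholder, model identifier, and image URL are assumptions, and the exact auth scheme depends on the account setup.

    # Sketch: request Clarifai general-model concepts for an image URL.
    # Key, model ID, and URL are placeholders.
    import requests

    MODEL_ID = "general-image-recognition"   # assumed model identifier
    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": "Key YOUR_CLARIFAI_API_KEY"},
        json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
    )
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')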

Imagga
created on 2022-01-23

sexy 36.9
person 35.6
adult 31.2
fashion 30.9
black 29
model 28.8
attractive 28.7
portrait 26.5
body 26.4
hair 23
face 22.7
people 21.2
lady 20.3
style 20
pretty 19.6
sensual 19.1
make 19.1
human 18
studio 17.5
sensuality 17.3
dark 16.7
brunette 16.6
posing 16
male 15.6
microphone 15.2
elegance 15.1
expression 14.5
looking 13.6
mask 13.5
man 13.4
device 13
gorgeous 12.7
women 12.7
vogue 12.6
seductive 12.4
brass 12.3
skin 11.9
erotic 11.7
lifestyle 11.6
costume 11.5
one 11.2
dress 10.8
art 10.8
lovely 10.7
modern 10.5
elegant 10.3
lamp 10
clothing 10
music 10
covering 9.9
stylish 9.9
nude 9.7
spotlight 9.7
hairstyle 9.5
musician 9.2
musical instrument 9.2
emotion 9.2
slim 9.2
makeup 9.2
blond 8.9
performer 8.8
love 8.7
wind instrument 8.7
cute 8.6
eyes 8.6
sitting 8.6
smoke 8.4
singer 8.2
retro 8.2
pose 8.2
life 8
warrior 7.8
desire 7.7
dance 7.6
source of illumination 7.6
passion 7.5
figure 7.5
vintage 7.4
holding 7.4
lips 7.4
entertainment 7.4
protective covering 7.3
danger 7.3
weapon 7.2
bass 7
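
A tag list like this resembles the output of Imagga's /v2/tags endpoint, which uses HTTP basic auth with an API key and secret. A hedged sketch follows; the credentials and image URL are placeholders.

    # Sketch: fetch Imagga tags for an image URL via the /v2/tags endpoint.
    # API key, secret, and image URL are placeholders.
    import requests

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/photo.jpg"},
        auth=("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET"),
    )
    for item in resp.json()["result"]["tags"]:
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')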

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.2
person 89.2
black and white 85.6
human face 79.1
clothing 68.9
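
These tags are the kind returned by the Azure Computer Vision analyze operation. The REST sketch below is an assumption-heavy illustration; the endpoint host, subscription key, API version, and image URL are all placeholders.

    # Sketch: request Azure Computer Vision tags for an image URL.
    # Endpoint host, key, API version, and image URL are assumptions.
    import requests

    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": "YOUR_AZURE_KEY"},
        json={"url": "https://example.org/photo.jpg"},
    )
    for tag in resp.json()["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')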

Face analysis

Amazon

AWS Rekognition

Age 20-28
Gender Female, 78.6%
Happy 97.2%
Calm 1.3%
Surprised 1%
Sad 0.2%
Confused 0.1%
Angry 0.1%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 18-24
Gender Female, 98.8%
Calm 80.9%
Surprised 13%
Fear 4.1%
Happy 0.6%
Sad 0.6%
Disgusted 0.5%
Angry 0.3%
Confused 0.1%
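
The two age, gender, and emotion blocks above match the per-face output of the Rekognition DetectFaces API when all attributes are requested. A minimal boto3 sketch is shown below; the filename is a placeholder.

    # Sketch: per-face age range, gender, and emotions via Rekognition DetectFaces.
    # "photo.jpg" is a hypothetical filename; AWS credentials are assumed configured.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")
    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')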

Feature analysis

Amazon

Person 97.2%

Text analysis

Amazon

محجد
VT27A2
MUJ VT27A2 محجد
MUJ
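
The detected strings above (including the raw, untranslated Arabic fragment) are consistent with Rekognition's DetectText output, which returns both line and word detections. A brief sketch of the call follows; the filename is a placeholder.

    # Sketch: extract text detections (lines and words) via Rekognition DetectText.
    # "photo.jpg" is a hypothetical filename.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")
    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        print(detection["DetectedText"], detection["Type"], f'{detection["Confidence"]:.1f}')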