Human Generated Data

Title

Untitled (three debutantes)

Date

1965

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19276

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 99.9
Apparel 99.9
Dress 99.3
Person 98.2
Human 98.2
Evening Dress 98
Fashion 98
Gown 98
Robe 98
Person 97.5
Person 97.3
Female 95.1
Woman 86.6
Girl 56.4

Clarifai
created on 2023-10-22

people 99.5
woman 98.1
adult 97
man 95.6
monochrome 95.4
portrait 94.8
art 94.4
wear 92.9
dress 92
indoors 90.9
group 86.5
music 85.6
two 81.8
actress 80.9
girl 79.7
three 79.5
musician 79
couple 78
costume 77.8
one 77.3

Imagga
created on 2022-03-05

robe 52
clothing 41.6
garment 41.5
person 34.7
dress 27.1
adult 24
people 24
black 22.8
fashion 18.8
male 17.7
covering 17.1
model 17.1
man 16.8
attractive 15.4
sexy 14.4
dark 14.2
silhouette 14.1
consumer goods 13.9
attendant 13.6
teacher 12.9
portrait 12.3
lady 12.2
business 12.1
face 12.1
professional 12
bride 11.9
pretty 11.9
women 11.9
love 11.8
clothes 11.2
body 11.2
figure 10.9
pose 10.9
standing 10.4
elegant 10.3
outfit 10.3
wedding 10.1
elegance 10.1
brunette 9.6
celebration 9.6
party 9.5
men 9.4
happy 9.4
expression 9.4
kin 9.2
style 8.9
couple 8.7
youth 8.5
costume 8.4
educator 8.3
groom 8
posing 8
dancer 8
businessman 7.9
hair 7.9
ceremony 7.8
eyes 7.7
luxury 7.7
dance 7.7
performer 7.7
sword 7.6
mannequin 7.6
holding 7.4
planner 7.3
make 7.3
gorgeous 7.2
lifestyle 7.2
looking 7.2
happiness 7
modern 7

Microsoft
created on 2022-03-05

text 97.7
dress 97.7
clothing 86.9
woman 84.8
black and white 82.7
person 71.3
posing 58.7
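Each tagging service above reports (label, confidence) pairs, and some labels repeat (e.g. the three "Person" entries in the Amazon list). A minimal sketch, not tied to any service's actual API, of filtering such pairs by a confidence threshold and keeping only the highest score per label:

```python
# Hedged sketch: filter machine-generated (tag, confidence) pairs by a
# threshold and deduplicate labels, keeping the highest score for each.
# The data is a subset of the Amazon tags listed above; the function name
# and threshold are illustrative assumptions, not part of any service API.
def top_tags(tags, threshold=95.0):
    best = {}
    for label, score in tags:
        if score >= threshold and score > best.get(label, 0.0):
            best[label] = score
    # Sort by descending confidence for display.
    return sorted(best.items(), key=lambda kv: -kv[1])

amazon_tags = [
    ("Clothing", 99.9), ("Apparel", 99.9), ("Dress", 99.3),
    ("Person", 98.2), ("Human", 98.2), ("Evening Dress", 98.0),
    ("Fashion", 98.0), ("Gown", 98.0), ("Robe", 98.0),
    ("Person", 97.5), ("Person", 97.3), ("Female", 95.1),
    ("Woman", 86.6), ("Girl", 56.4),
]

print(top_tags(amazon_tags))
```

With the default threshold of 95, the low-confidence "Woman" and "Girl" tags drop out and the duplicate "Person" entries collapse to the single highest score.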

Face analysis

Amazon

AWS Rekognition

Age 37-45
Gender Female, 63.9%
Happy 51.7%
Calm 22%
Surprised 16.3%
Confused 3.4%
Disgusted 3.1%
Sad 1.9%
Fear 0.9%
Angry 0.5%

AWS Rekognition

Age 45-53
Gender Male, 86.5%
Happy 59%
Calm 22.3%
Surprised 6.5%
Fear 5.6%
Sad 3.1%
Disgusted 1.7%
Confused 1.2%
Angry 0.7%
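Each AWS Rekognition face record above lists emotion percentages that sum to roughly 100. A minimal sketch, assuming the scores are held in a plain dict (the field names are taken from the listing above, not from Rekognition's actual response schema), of picking the dominant emotion and flagging ambiguous calls where the runner-up is close behind:

```python
# Hedged sketch: given per-face emotion scores like those reported above,
# return the dominant emotion, its percentage, and whether the call is
# ambiguous (runner-up within `margin` percentage points). Illustrative
# helper only; not an AWS Rekognition API.
def dominant_emotion(scores, margin=30.0):
    ranked = sorted(scores.items(), key=lambda kv: -kv[1])
    (top, top_pct), (_, second_pct) = ranked[0], ranked[1]
    return top, top_pct, (top_pct - second_pct) < margin

# Scores for the first face listed above (age 37-45).
face_1 = {
    "Happy": 51.7, "Calm": 22.0, "Surprised": 16.3, "Confused": 3.4,
    "Disgusted": 3.1, "Sad": 1.9, "Fear": 0.9, "Angry": 0.5,
}
print(dominant_emotion(face_1))
```

For this face, "Happy" leads "Calm" by 29.7 points, so under a 30-point margin the call would be flagged as ambiguous.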

Feature analysis

Amazon

Person
Person 98.2%
Person 97.5%
Person 97.3%

Categories

Imagga

events parties 91.8%
text visuals 6.5%

Text analysis

Amazon

4
MJIR
с
g
KODYK
MJIR YT37A2 MAGOM
MAGOM
KODYK EIRN
YT37A2
EIRN

Google

C. MJIR YT37 A2 XAGOX MJIR Y T37 A°2 XAGOX
C.
MJIR
YT37
A2
XAGOX
Y
T37
A°2