Human Generated Data

Title

Untitled (four women in dresses)

Date

1964

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19193

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.5
Human 99.5
Clothing 99.3
Apparel 99.3
Person 98.7
Person 98.4
Person 98.2
Evening Dress 90.5
Fashion 90.5
Gown 90.5
Robe 90.5
Female 89
Mannequin 87.1
Dress 84.2
Woman 78.3
Room 73.4
Indoors 73.4
Door 68.8
Flooring 66
Sleeve 64.3
Floor 60.3
People 60.3
Shop 57.7
Furniture 55.2
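
The Amazon figures above are Rekognition label confidences (percentages). A minimal sketch of how such a tag list could be produced with boto3, assuming the image is stored in S3; the bucket and object names below are placeholders:

```python
# Minimal sketch: reproducing Rekognition label tags with boto3.
# The bucket and key are placeholders, not the museum's actual storage.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "19193.jpg"}},
    MaxLabels=50,
    MinConfidence=55,  # the list above bottoms out around 55
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```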

Clarifai
created on 2023-10-22

people 99.7
woman 97.8
man 97.5
two 96.9
adult 96.5
monochrome 95.5
wear 94.6
wedding 91.8
dress 91.2
group 90.4
group together 88.1
indoors 86.2
room 86.1
three 86.1
street 85.5
bride 85.4
four 82.3
family 81.5
dressing room 78.9
child 78.1
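
Clarifai's tags come from its general image-recognition model. A hedged sketch of the v2 predict endpoint over plain HTTP; the personal access token, model id, and image URL are placeholders, and newer PAT-based auth may also require user and app ids in the request path:

```python
# Hedged sketch of Clarifai's v2 predict endpoint for the tags above.
# PAT, model id, and image URL are placeholders/assumptions.
import requests

PAT = "your_clarifai_pat"                    # placeholder
MODEL_ID = "general-image-recognition"       # assumed public model id
IMAGE_URL = "https://example.org/19193.jpg"  # placeholder

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {PAT}",
             "Content-Type": "application/json"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
resp.raise_for_status()

# Clarifai reports concept values on a 0-1 scale; the list above
# appears to be the same values expressed as percentages.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```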

Imagga
created on 2022-03-05

people 27.9
person 27.7
women 25.3
adult 23.9
man 23.5
pretty 22.4
door 20.6
lifestyle 20.2
portrait 20.1
dress 19.9
modern 18.9
attractive 18.2
fashion 18.1
business 17.6
office 17.6
interior 16.8
window 16.6
men 16.3
happy 16.3
lady 16.2
sexy 16.1
male 14.9
room 14.8
hair 14.3
model 14
clothing 13.9
corporate 13.7
black 13.3
sliding door 13.2
indoors 13.2
standing 13
professional 13
movable barrier 12.2
couple 12.2
building 12.1
two 11.9
work 11.8
businessman 11.5
casual 11
smiling 10.8
human 10.5
boutique 10.4
body 10.4
suit 10.3
elegant 10.3
house 10
smile 10
attendant 9.9
worker 9.6
looking 9.6
home 9.6
light 9.4
face 9.2
shop 9.1
sensual 9.1
garment 9.1
success 8.8
luxury 8.6
gate 8.6
communication 8.4
elegance 8.4
barrier 8.3
training 8.3
inside 8.3
makeup 8.2
indoor 8.2
style 8.2
group 8.1
job 8
cute 7.9
love 7.9
doorway 7.9
brunette 7.8
happiness 7.8
skin 7.7
dancer 7.6
hand 7.6
businesspeople 7.6
executive 7.6
togetherness 7.6
shopping 7.3
sensuality 7.3
gorgeous 7.2
turnstile 7.2
team 7.2
lovely 7.1
look 7
glass 7
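
Imagga exposes its tagger over plain HTTP. A hedged sketch of the /v2/tags endpoint; the credentials and image URL are placeholders, and the response shape shown in the loop is an assumption:

```python
# Hedged sketch of Imagga's /v2/tags endpoint via plain HTTP.
# API credentials and the image URL are placeholders; the response
# structure used below is an assumption and may differ.
import requests

API_KEY = "your_imagga_api_key"        # placeholder
API_SECRET = "your_imagga_api_secret"  # placeholder
IMAGE_URL = "https://example.org/19193.jpg"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
resp.raise_for_status()

for item in resp.json().get("result", {}).get("tags", []):
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```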

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

dress 99.6
text 99.1
woman 95.7
clothing 94.8
person 91.7
outdoor 85.5
black and white 83.2
wedding dress 62.3
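
The Microsoft tags match Azure Computer Vision's tagging output (Azure reports confidence on a 0-1 scale, shown above as percentages). A hedged sketch against the v3.2 REST endpoint; the endpoint, key, API version, and image URL are placeholders or assumptions:

```python
# Hedged sketch of the Azure Computer Vision tag endpoint.
# Endpoint, key, and image URL are placeholders; the API version
# (v3.2) is an assumption.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_azure_key"                                            # placeholder
IMAGE_URL = "https://example.org/19193.jpg"                       # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY,
             "Content-Type": "application/json"},
    json={"url": IMAGE_URL},
    timeout=30,
)
resp.raise_for_status()

# Confidence is 0-1 in the response; scale to percentages for display.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```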

Color Analysis

Face analysis

AWS Rekognition

Age 28-38
Gender Male, 98.4%
Happy 84%
Confused 5.6%
Sad 3.1%
Disgusted 2.4%
Surprised 1.9%
Calm 1.6%
Fear 0.9%
Angry 0.5%

AWS Rekognition

Age 47-53
Gender Male, 78.9%
Calm 76.8%
Happy 11.4%
Confused 5.1%
Sad 4.3%
Disgusted 0.9%
Angry 0.6%
Surprised 0.6%
Fear 0.3%

AWS Rekognition

Age 36-44
Gender Male, 86.8%
Happy 55.3%
Calm 42.2%
Surprised 1.2%
Disgusted 0.3%
Fear 0.3%
Confused 0.3%
Sad 0.2%
Angry 0.2%

AWS Rekognition

Age 28-38
Gender Female, 70.3%
Calm 61.1%
Disgusted 13.3%
Sad 7.7%
Fear 6.5%
Confused 4.3%
Surprised 3%
Happy 2.8%
Angry 1.1%
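
The four AWS Rekognition blocks above correspond to the four detected faces. A minimal boto3 sketch of the detect_faces call that returns the age range, gender, and emotion estimates; the bucket and object name are placeholders:

```python
# Minimal sketch of the face estimates above, using Rekognition's
# detect_faces with Attributes=["ALL"] (age range, gender, emotions).
# Bucket and key are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "19193.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back unsorted; sort by confidence as in the lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```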

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
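
Google Vision reports face attributes as likelihood buckets rather than percentages, which matches the "Very unlikely" rows above. A hedged sketch using the google-cloud-vision client; the file path is a placeholder and application-default credentials are assumed to be configured:

```python
# Hedged sketch of the Google Vision face results above.
from google.cloud import vision

# Likelihood enum values map to these bucket names.
LIKELIHOOD = ("UNKNOWN", "VERY_UNLIKELY", "UNLIKELY",
              "POSSIBLE", "LIKELY", "VERY_LIKELY")

client = vision.ImageAnnotatorClient()

with open("19193.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", LIKELIHOOD[face.surprise_likelihood])
    print("Anger", LIKELIHOOD[face.anger_likelihood])
    print("Sorrow", LIKELIHOOD[face.sorrow_likelihood])
    print("Joy", LIKELIHOOD[face.joy_likelihood])
    print("Headwear", LIKELIHOOD[face.headwear_likelihood])
    print("Blurred", LIKELIHOOD[face.blurred_likelihood])
```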

Feature analysis

Amazon

Person
Person 99.5%
Person 98.7%
Person 98.4%
Person 98.2%

Categories

Imagga

events parties 96.2%
people portraits 2.9%

Text analysis

Amazon

6
KODYK
A°2
MJIR
YTEA A°2 KODYK EIEW
YTEA
MJIR YT37A2 MAGOM
MAGOM
YT37A2
EIEW
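
These strings are Rekognition's raw text detections; the garbled tokens are consistent with mirrored "KODAK SAFETY FILM" edge printing on the negative, though that reading is an inference rather than part of the record. A minimal boto3 sketch of the call, with placeholder bucket and object name:

```python
# Minimal sketch of Rekognition text detection, the likely source of
# the strings above. Bucket and key are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "19193.jpg"}}
)

for detection in response["TextDetections"]:
    # LINE entries aggregate WORD entries; both appear in the output.
    print(detection["Type"], detection["DetectedText"],
          f'{detection["Confidence"]:.1f}%')
```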

Google

KODYK 2.rEEJAEirn KODYK
KODYK
2.rEEJAEirn
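
A hedged sketch of the Google text results using the google-cloud-vision client; the file path is a placeholder and credentials are assumed to be configured:

```python
# Hedged sketch of Google Vision OCR for the strings above.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("19193.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected block; the rest are
# individual tokens, which matches the repeated "KODYK" entries above.
for annotation in response.text_annotations:
    print(annotation.description)
```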