Human Generated Data

Title

Untitled (two women in dresses)

Date

c. 1966

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19252

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Person 99.3
Human 99.3
Person 98.7
Clothing 96.1
Apparel 96.1
Home Decor 92
Shoe 90.9
Footwear 90.9
Door 87.2
Floor 80.3
Overcoat 65.1
Coat 65.1
Flower 59.7
Plant 59.7
Blossom 59.7
Evening Dress 57.2
Fashion 57.2
Gown 57.2
Robe 57.2
Text 55.7

Clarifai
created on 2023-10-22

people 99.7
woman 97.9
two 97.2
family 95.6
wedding 95
adult 94.7
door 94.6
doorway 94.4
man 90.8
wear 89.1
dress 87.7
street 87.3
love 86.8
house 86.3
portrait 83.6
home 79.8
window 79.5
groom 78.8
indoors 78
child 77.9

Imagga
created on 2022-02-25

barbershop 54.7
shop 46.5
mercantile establishment 33.1
people 23.4
adult 23.4
home 22.3
place of business 22.1
man 20.8
interior 19.5
person 18.8
telephone 18.2
male 17.7
call 17.6
hairdresser 17.1
portrait 16.8
indoors 16.7
dress 15.4
happy 15
couple 14.8
lady 14.6
salon 14.3
smile 14.3
pretty 14
fashion 13.6
pay-phone 13.5
house 12.5
family 12.5
holding 12.4
cleaner 12.4
inside 12
kitchen 11.6
business 11.5
cheerful 11.4
brunette 11.3
standing 11.3
old 11.1
establishment 11
happiness 11
indoor 11
room 10.6
attractive 10.5
office 10.2
smiling 10.1
window 10.1
electronic equipment 10
child 10
equipment 9.7
life 9.6
black 9.6
professional 9.4
model 9.3
two 9.3
20s 9.2
style 8.9
posing 8.9
women 8.7
men 8.6
sitting 8.6
vintage 8.3
human 8.2
alone 8.2
one 8.2
sexy 8
job 8
businessman 7.9
chair 7.9
urban 7.9
building 7.7
bride 7.7
city 7.5
children 7.3
new 7.3
lifestyle 7.2
cute 7.2
hair 7.1
lovely 7.1
face 7.1
working 7.1
modern 7

Google
created on 2022-02-25

Microsoft
created on 2022-02-25

text 98.5
clothing 95.6
person 89.1
dress 89
woman 83.3
smile 61.9

Color analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 45-51
Gender Female, 100%
Happy 99%
Surprised 0.5%
Fear 0.2%
Angry 0.1%
Disgusted 0.1%
Confused 0.1%
Sad 0%
Calm 0%

Feature analysis

Amazon

Person
Shoe
Person 99.3%

Categories

Text analysis

Amazon

133
16

Google

133 ..... ..... ..... ..... ..... .....
133
.....