Human Generated Data

Title

Untitled (woman and girl in costume)

Date

1948

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19122

Machine Generated Data

Tags

Amazon
created on 2022-03-12

Apparel 99.7
Clothing 99.7
Furniture 99.3
Chair 99.3
Person 98
Human 98
Costume 89.6
Female 77.5
Skirt 68.8
Plaid 66
Tartan 66
Photo 64.2
Photography 64.2
Food 63.8
Dish 63.8
Meal 63.8
Shorts 63.1
Coat 62
Woman 61.4
Face 60.5
Suit 59.5
Overcoat 59.5
Floor 55.7
Dining Table 55.1
Table 55.1

Imagga
created on 2022-03-12

fashion 30.9
person 27.5
adult 27.2
people 26.8
attractive 26.6
outfit 23.3
women 22.9
pretty 21.7
portrait 20.7
city 19.1
walking 18
street 17.5
urban 17.5
clothing 17.5
black 17.2
lady 17
lifestyle 16.6
shopping 16.5
man 16.1
cute 15.8
human 15.7
smile 15.7
standing 15.6
business 15.2
male 14.9
dress 14.5
happy 14.4
model 14
coat 13.6
musician 13.4
style 13.3
legs 13.2
holding 13.2
singer 13
tartan 12.9
men 12.9
performer 12.2
sexy 12
one 11.9
expression 11.9
hair 11.9
sensuality 11.8
bag 11.7
professional 11.5
brunette 11.3
elegance 10.9
smiling 10.8
shop 10.7
lovely 10.7
walk 10.5
looking 10.4
corporate 10.3
two 10.2
fabric 9.9
bags 9.7
mall 9.7
group 9.7
body 9.6
clothes 9.4
casual 9.3
student 9.2
stylish 9
suit 9
posing 8.9
high 8.7
jacket 8.7
costume 8.6
happiness 8.6
luxury 8.6
talking 8.6
wall 8.5
fashionable 8.5
stand 8.5
store 8.5
modern 8.4
building 8.3
sale 8.3
full 8.2
girls 8.2
office 8.1
raincoat 8
face 7.8
motion 7.7
skirt 7.6
retail 7.6
studio 7.6
trench coat 7.6
joy 7.5
boutique 7.4
20s 7.3
cheerful 7.3
pose 7.2
hat 7.2
bright 7.1
garment 7.1

Google
created on 2022-03-12

Microsoft
created on 2022-03-12

clothing 89.7
footwear 77.4
text 74.1
person 72.2
chair 66.1
black and white 55.3
furniture 54.3

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 100%
Calm 87.1%
Happy 10.9%
Surprised 0.7%
Angry 0.5%
Confused 0.3%
Disgusted 0.2%
Fear 0.2%
Sad 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Chair 99.3%
Person 98%

Captions

Microsoft

a person standing in a room 86.4%
a person standing in front of a building 74.6%
a person standing next to a building 70.9%

Text analysis

Amazon

MJIR
MJIR YE3RAS ACHAA
YE3RAS
ACHAA

Google

YT3RA
MJIR
A
MJIR YT3RA A