Human-Generated Data

Title

Untitled (woman in hat and striped coat posed with Vogue magazine, dog at her feet)

Date

1962

People

Artist: Curtis Studio, American, active 1891–1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13023

Machine-Generated Data

Tags

Amazon
created on 2022-02-05

Human 98.3
Person 98.3
Clothing 92.2
Apparel 92.2
Room 79.2
Indoors 79.2
Advertisement 72.3
Poster 68.9
Paper 67.5
Reading 59.8
Flyer 59.6
Brochure 59.6
Female 59
Footwear 58.2
Shoe 58.2
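
These labels are the output of Amazon Rekognition's DetectLabels operation, reported as label/confidence pairs. A minimal sketch of how such tags could be reproduced with boto3; the image path, region, and thresholds are assumptions, not part of this record:

import boto3

# Hypothetical region and local image path.
rekognition = boto3.client("rekognition", region_name="us-east-1")
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with 0-100 confidence scores,
# matching the label/confidence pairs listed above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50,
)
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')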

Imagga
created on 2022-02-05

perfume 31.9
negative 31.1
toiletry 26
sexy 25.7
fashion 25.6
film 25.5
model 24.9
body 24
attractive 23.1
bottle 22.4
adult 21.4
pretty 21
person 20.6
people 19.5
photographic paper 19
portrait 18.1
water bottle 17.9
dress 17.2
posing 16.9
studio 16
sensual 15.5
elegance 14.3
style 14.1
sensuality 13.6
human 13.5
container 13.5
black 13.4
standing 13
lady 13
cute 12.9
photographic equipment 12.6
health 12.5
slim 12
vessel 11.7
dance 11.4
face 11.4
brunette 11.3
hair 11.1
skin 11
clothing 10.4
elegant 10.3
women 10.3
dancer 10
gorgeous 10
pose 10
fitness 9.9
fashionable 9.5
man 9.4
clothes 9.4
20s 9.2
modern 9.1
art 8.9
lovely 8.9
healthy 8.8
seductive 8.6
wearing 8.6
fit 8.3
stylish 8.1
lifestyle 8
medical 7.9
ballet 7.9
fresh 7.8
life 7.8
male 7.8
erotic 7.6
happy 7.5
one 7.5
makeup 7.3
exercise 7.3
make 7.3
looking 7.2
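
Imagga exposes a comparable tagging service over REST. A minimal sketch against its v2 /tags endpoint, assuming a hypothetical API key/secret pair and local image file:

import requests

# Hypothetical credentials; Imagga authenticates with HTTP Basic auth.
API_KEY = "your_api_key"
API_SECRET = "your_api_secret"

with open("photo.jpg", "rb") as f:
    resp = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

# Each returned tag carries a confidence score, as in the list above.
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')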

Google
created on 2022-02-05

(no tags returned)

Microsoft
created on 2022-02-05

text 91.9
clothing 88.7
person 80.2
black and white 77.5
footwear 63
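
These tags correspond to the tagging operation of Azure's Computer Vision service. A sketch using the azure-cognitiveservices-vision-computervision SDK, with a hypothetical endpoint and key:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Hypothetical resource endpoint and key.
client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("your_key"),
)

with open("photo.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

# Confidence is reported on a 0-1 scale; scale by 100 to match the list above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")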

Face analysis

AWS Rekognition

Age 22-30
Gender Female, 78.4%
Calm 96.4%
Surprised 1%
Sad 0.8%
Fear 0.5%
Happy 0.4%
Disgusted 0.4%
Confused 0.3%
Angry 0.2%
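
The age range, gender estimate, and emotion scores above are the standard fields of Rekognition's DetectFaces response. A minimal sketch, again with an assumed local image:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # region assumed
with open("photo.jpg", "rb") as f:  # hypothetical path
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and per-emotion scores.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    gender = face["Gender"]
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')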

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
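
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why this block carries no numeric scores. A sketch using the google-cloud-vision client, assuming application-default credentials and a local file:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:  # hypothetical path
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum value such as VERY_UNLIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)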

Feature analysis

Amazon

Person 98.3%

Captions

Microsoft

a person holding a racket 35.2%
a person with a racket 35.1%
a person is holding a racket 33%
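
The candidate captions with confidences match the describe operation of the same Azure Computer Vision service. A sketch with the same hypothetical endpoint and key as in the tagging example above:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",  # hypothetical
    CognitiveServicesCredentials("your_key"),
)

with open("photo.jpg", "rb") as f:
    result = client.describe_image_in_stream(f, max_candidates=3)

# Several candidate captions are returned, each with a confidence score.
for caption in result.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")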

Text analysis

Amazon

vopi
01-88362
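
The detected strings ("vopi" is plausibly a misread of the magazine's cover text) match the output of Rekognition's DetectText operation. A minimal sketch, with the same assumed image bytes as above:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # region assumed
with open("photo.jpg", "rb") as f:  # hypothetical path
    image_bytes = f.read()

# DetectText returns LINE and WORD detections with the raw strings.
response = rekognition.detect_text(Image={"Bytes": image_bytes})
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])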