Human Generated Data

Title

Untitled (woman trying on fur coat)

Date

1950s

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1531

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Clothing 99.9
Apparel 99.9
Person 99.1
Human 99.1
Person 98.7
Plant 98.7
Suit 98.4
Coat 98.4
Overcoat 98.4
Person 95.6
Blossom 93.5
Flower 93.5
Flower Bouquet 93.2
Flower Arrangement 93.2
Chair 90.1
Furniture 90.1
Robe 83.3
Fashion 83.3
Gown 78.9
Tuxedo 76.6
Female 74.7
Face 74.6
Wedding 72
Shoe 71.9
Footwear 71.9
Portrait 68.7
Photography 68.7
Photo 68.7
Shoe 68.2
Wedding Gown 65.3
People 64.5
Woman 61.9
Home Decor 60.3
Man 58.8
Dress 58.6
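The label/confidence pairs above are typical of image-tagging services such as Amazon Rekognition. As a minimal sketch (working from a hand-copied subset of the tags listed above, not calling any API), tags can be filtered to a confidence threshold like so:

```python
# A subset of the Amazon-generated tags above as (label, confidence) pairs.
labels = [
    ("Clothing", 99.9), ("Apparel", 99.9), ("Person", 99.1),
    ("Coat", 98.4), ("Flower", 93.5), ("Chair", 90.1),
    ("Gown", 78.9), ("Woman", 61.9), ("Dress", 58.6),
]

def confident_labels(pairs, threshold=90.0):
    """Keep only labels at or above the threshold, sorted high to low."""
    return sorted(
        [(name, score) for name, score in pairs if score >= threshold],
        key=lambda p: p[1],
        reverse=True,
    )

print(confident_labels(labels))
```

The 90% cutoff here is an arbitrary illustration; the services themselves report all labels above their own internal minimums.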

Imagga
created on 2021-12-14

musical instrument 49.9
accordion 49.1
keyboard instrument 39.3
wind instrument 30.4
wheeled vehicle 28.4
people 24
man 23.5
person 20.9
tricycle 18.3
male 17.7
adult 16.2
shopping cart 14.4
happy 14.4
vehicle 14
outdoors 13.4
lady 13
handcart 13
fashion 12.8
conveyance 12.8
city 12.5
park 12.3
chair 12.3
portrait 12.3
couple 12.2
suit 12
business 11.5
walking 11.4
attractive 11.2
smiling 10.8
youth 10.2
street 10.1
lifestyle 10.1
dress 9.9
child 9.7
urban 9.6
women 9.5
world 9.4
teen 9.2
outdoor 9.2
pretty 9.1
human 9
boy 8.7
container 8.7
cute 8.6
walk 8.6
smile 8.5
culture 8.5
clothing 8.5
summer 8.4
fun 8.2
skateboard 8.1
success 8
parent 7.9
day 7.8
face 7.8
model 7.8
outside 7.7
tree 7.7
casual 7.6
building 7.6
friends 7.5
clothes 7.5
friendship 7.5
holding 7.4
style 7.4
pedestrian 7.4
holiday 7.2
seat 7.1
businessman 7.1
modern 7

Google
created on 2021-12-14

Flowerpot 86.9
Plant 86.2
Gesture 85.2
Rectangle 78.2
Houseplant 78.1
Tints and shades 75.3
Font 73.5
Monochrome photography 71.9
Chair 71.5
Monochrome 70.7
Hat 68.5
Room 68.5
Pattern 67.9
Vintage clothing 67.8
Stock photography 63.6
Art 63
History 61.3
Visual arts 58.5
Sitting 54.4
Square 54.1

Microsoft
created on 2021-12-14

clothing 95.1
text 94.2
person 89.9
black and white 88.9
footwear 75
furniture 69.9
woman 69.3
dress 58.3
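Since four services tagged the same image, one simple consistency check is to intersect their tag vocabularies. A minimal sketch, using a lowercased hand-copied subset of the tags listed above:

```python
# Representative tag subsets from three of the services above, lowercased.
amazon = {"clothing", "person", "coat", "chair", "woman", "dress", "footwear"}
google = {"plant", "chair", "hat", "room", "sitting", "monochrome"}
microsoft = {"clothing", "person", "footwear", "furniture", "woman", "dress"}

# Tags that at least two services agree on.
agreed = (amazon & google) | (amazon & microsoft) | (google & microsoft)
print(sorted(agreed))
```

For this record, Amazon and Microsoft agree closely on the clothing-related tags, while Imagga's top labels (e.g. "accordion") disagree with all of the others.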

Face analysis

Amazon

Google

AWS Rekognition

Age 47-65
Gender Male, 70.4%
Calm 52.7%
Sad 23.2%
Angry 19.7%
Confused 1.4%
Surprised 0.9%
Happy 0.9%
Fear 0.7%
Disgusted 0.5%

AWS Rekognition

Age 33-49
Gender Female, 67.9%
Sad 65.6%
Calm 27.5%
Happy 6.5%
Confused 0.2%
Angry 0.1%
Surprised 0%
Fear 0%
Disgusted 0%
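Rekognition reports a confidence score per emotion rather than a single label; the headline emotion is simply the highest-scoring entry. A minimal sketch, using the scores for the second face copied from the output above:

```python
# Emotion scores for the second face, from the AWS Rekognition output above.
emotions = {
    "Sad": 65.6, "Calm": 27.5, "Happy": 6.5, "Confused": 0.2,
    "Angry": 0.1, "Surprised": 0.0, "Fear": 0.0, "Disgusted": 0.0,
}

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(scores.items(), key=lambda item: item[1])

print(dominant_emotion(emotions))
```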

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%
Shoe 71.9%

Captions

Microsoft

a person standing in front of a building 76.7%
a person standing in front of a building 62.6%
a man and a woman standing in front of a building 44%

Text analysis

Amazon

YT33A2
M_117 YT33A2
M_117
A

Google

MJ17 YT3RA2 002MA
MJ17
002MA
YT3RA2