Human Generated Data

Title

Untitled (girl sitting by window with dolls)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17354

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.5
Human 99.5
Musical Instrument 88.8
Musician 88.8
Leisure Activities 88.7
Performer 88.1
Guitarist 88.1
Guitar 88.1
Apparel 81
Clothing 81
Suit 72
Overcoat 72
Coat 72
Female 71.4
Girl 66.6
Photography 60.7
Photo 60.7
Dog 60.5
Mammal 60.5
Animal 60.5
Canine 60.5
Pet 60.5
Person 60.3
Shorts 57.9
Door 57.8
Smoke 57.7
Face 57.4
Floor 57
Brick 56.1

Imagga
created on 2022-02-26

man 24.2
person 22.4
people 21.7
men 18.9
adult 18.8
world 15.3
black 15
negative 14.9
male 14.3
portrait 13.6
urban 13.1
business 12.7
city 12.5
film 12
women 11.9
danger 11.8
worker 11.6
smile 11.4
life 11.2
work 10.3
happy 10
human 9.7
group 9.7
building 9.1
modern 9.1
suit 9
photographic paper 8.6
casual 8.5
attractive 8.4
house 8.3
silhouette 8.3
speed 8.2
light 8.2
dirty 8.1
professional 8.1
home 8
smiling 8
working 7.9
couple 7.8
happiness 7.8
industry 7.7
crowd 7.7
grunge 7.7
walking 7.6
power 7.6
fashion 7.5
style 7.4
instrument 7.4
sport 7.3
mask 7.2
lifestyle 7.2
music 7.2
transportation 7.2
team 7.2
activity 7.2
gun 7.1

Google
created on 2022-02-26

Window 90.6
Black 89.7
Black-and-white 85.5
Style 84
Plant 84
Tree 82.2
Door 78.5
Monochrome 77.9
Tints and shades 76.8
Monochrome photography 76.4
Art 71.3
Glass 69.2
Room 67.5
Wood 65.1
Visual arts 63.9
Vintage clothing 63.6
Toddler 62.3
Sitting 59.4
Still life photography 58.8
Fun 56.7

Microsoft
created on 2022-02-26

black and white 92.6
text 92.4
person 91
snow 73.4
monochrome 61.7
clothing 51.9

Face analysis

Amazon

AWS Rekognition

Age 35-43
Gender Male, 95.6%
Calm 63.3%
Sad 19.6%
Happy 9%
Surprised 3.6%
Disgusted 2%
Angry 0.9%
Fear 0.9%
Confused 0.6%

Feature analysis

Amazon

Person 99.5%
Dog 60.5%

Captions

Microsoft

a man and a woman standing in front of a window 69.6%
a man and a woman standing next to a window 69.5%
a person standing next to a window 69.4%

Text analysis

Amazon

VISRAS
VISRAS HOADOY
HOADOY