Human Generated Data

Title

Untitled (two girls and dollhouse)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17111

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Furniture 100
Chair 100
Human 99.6
Person 99.6
Person 96.4
Apparel 89.7
Clothing 89.7
Indoors 81.2
Living Room 81.2
Room 81.2
Shelf 75.1
Table 65.9
Child 65.9
Kid 65.9
Couch 65.8
Person 62.4
Photography 62.1
Photo 62.1
Portrait 62.1
Face 62.1
Cabinet 59.8
Bed 56.7
Electronics 56.7
Display 56.7
Screen 56.7
Monitor 56.7
Baby 56.2
Sitting 55.7
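The numbers beside each label above are confidence scores (percentages) reported by the vision service. Such scores are typically filtered at a display threshold before being shown. A minimal sketch of that post-processing step, using a hypothetical `filter_labels` helper and a handful of the labels listed above (this is illustrative only, not the actual Rekognition API call):

```python
# A few Rekognition-style (label, confidence %) pairs taken from the list above.
labels = [
    ("Furniture", 100.0), ("Chair", 100.0), ("Person", 99.6),
    ("Apparel", 89.7), ("Living Room", 81.2), ("Couch", 65.8),
    ("Monitor", 56.7), ("Sitting", 55.7),
]

def filter_labels(pairs, threshold=80.0):
    """Return the names of labels whose confidence meets the threshold."""
    return [name for name, score in pairs if score >= threshold]

print(filter_labels(labels))
```

Lowering the threshold admits more speculative labels (e.g. "Monitor" at 56.7), which is why low-scored tags in lists like this one often contradict the image.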

Imagga
created on 2022-02-26

people 29.5
person 29
man 25.5
shop 24.5
male 19.8
adult 19.1
home 17.5
mercantile establishment 16.9
senior 16.9
musical instrument 16.1
smiling 15.9
indoors 15.8
world 15.2
happy 15
patient 14.8
toyshop 14.6
television 13.5
wind instrument 13.2
portrait 12.9
men 12.9
concertina 12.8
old 12.5
room 12
sitting 12
place of business 11.9
holding 11.5
blackboard 11.4
free-reed instrument 11.1
lifestyle 10.8
holiday 10.7
retirement 10.6
child 10.4
occupation 10.1
telecommunication system 10
case 9.9
working 9.7
business 9.7
retired 9.7
looking 9.6
negative 9.2
cheerful 8.9
equipment 8.6
elderly 8.6
casual 8.5
camera 8.3
film 8.3
accordion 8.2
playing 8.2
childhood 8.1
computer 8
family 8
worker 8
couple 7.8
happiness 7.8
black 7.8
education 7.8
class 7.7
school 7.7
fun 7.5
mature 7.4
technology 7.4
sick person 7.4
back 7.3
classroom 7.3
teacher 7.2
cute 7.2
smile 7.1
face 7.1
kid 7.1
interior 7.1
work 7.1

Google
created on 2022-02-26

Black 89.6
Black-and-white 85.2
Style 84.1
Table 76.4
Flash photography 75.9
Monochrome photography 73.1
Monochrome 72.7
Eyewear 71.6
Sitting 69.3
Room 68.8
T-shirt 64.8
Stock photography 63.7
Hat 61.3
House 60.6
Recreation 59.7
Vintage clothing 58.8
Font 58.7
Street 55.9
Child 55.6
Chair 55.5

Microsoft
created on 2022-02-26

text 94.1
person 91.5
indoor 88.6
black and white 78.9
clothing 68.1

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 95%
Happy 77.9%
Calm 8.7%
Surprised 8.4%
Fear 2.6%
Sad 1%
Angry 0.6%
Disgusted 0.4%
Confused 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a person sitting on a table 84.2%
a boy and a girl sitting on a table 40.8%
a small child sitting on a table 40.7%

Text analysis

Amazon

DEFEE
been