Human Generated Data

Title

Untitled (baby looking at mirrored table)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17159

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 98
Human 98
Banister 81.1
Handrail 81.1
Door 74.5
Floor 72.2
Teen 71.9
Blonde 71.9
Kid 71.9
Girl 71.9
Child 71.9
Female 71.9
Woman 71.9
Finger 66
Portrait 64.4
Photography 64.4
Photo 64.4
Face 64.4
Flooring 57.1

Imagga
created on 2022-02-26

bathroom 37.7
washbasin 37
vessel 34
basin 33.6
interior 30
room 26.7
dishwasher 25
modern 23.1
home 21.5
bathtub 20.5
house 19.2
white goods 18.9
clean 17.5
furniture 17.3
container 17.1
home appliance 15.9
sink 15.9
toilet 15.7
appliance 15.3
bath 15.2
inside 14.7
tub 13.8
shower 13.6
architecture 12.8
faucet 12.8
wash 12.5
design 12.4
decor 12.4
person 12.3
contemporary 12.2
wall 12
kitchen 11.7
domestic 11
lifestyle 10.8
man 10.7
indoors 10.5
building 10.4
window 10.3
luxury 10.3
3d 10.1
washing 9.7
apartment 9.6
equipment 9.6
people 9.5
glass 9.4
water 9.3
style 8.9
metal 8.8
steel 8.8
light 8.7
decoration 8.7
floor 8.4
health 8.3
technology 8.2
machine 8.1
washstand 8.1
tile 8.1
women 7.9
black 7.8
male 7.8
high 7.8
tiles 7.8
stainless 7.7
mirror 7.6
business 7.3
stylish 7.2
work 7.2
device 7.1
science 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

person 88.4
black and white 87.8
text 61.9

Face analysis

AWS Rekognition

Age 2-8
Gender Female, 71.8%
Calm 51.6%
Sad 47.2%
Angry 0.3%
Surprised 0.3%
Fear 0.2%
Happy 0.1%
Disgusted 0.1%
Confused 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98%

Captions

Microsoft

a person sitting in a cage 69.5%
a person sitting in a cage 62.7%
a person in a cage 62.6%

Text analysis

Amazon

FAS