Human Generated Data

Title

Untitled (baby looking at mirrored table)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17159

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2022-02-26

Person 98
Human 98
Handrail 81.1
Banister 81.1
Door 74.5
Floor 72.2
Blonde 71.9
Female 71.9
Teen 71.9
Woman 71.9
Kid 71.9
Girl 71.9
Child 71.9
Finger 66
Portrait 64.4
Face 64.4
Photography 64.4
Photo 64.4
Flooring 57.1
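
These labels are consistent with the output of AWS Rekognition's label-detection endpoint, which returns each label name with a confidence score in percent. A minimal sketch of such a call with boto3, assuming configured AWS credentials; the file name and request parameters are placeholders, since the record does not state how the call was made:

    import boto3

    client = boto3.client("rekognition")

    # "photo.jpg" is a hypothetical local copy of the photograph.
    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,        # the list above contains 19 labels
            MinConfidence=50,    # the lowest score above is 57.1
        )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')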

Clarifai
created on 2023-10-29

people 99.8
child 99.3
one 97.9
recreation 97.7
boy 96.9
monochrome 95
movie 94.5
wear 94
adult 93.6
outfit 92.3
man 92
fun 90.8
facial expression 90
enjoyment 89.4
vehicle 89.4
ladder 88.2
girl 87.5
two 86.6
actor 86.6
indoors 85.8
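
Clarifai's general image-recognition model returns concepts with values between 0 and 1, rendered above as percentages. A hedged sketch against Clarifai's v2 REST endpoint; the personal access token and file name are placeholders, and the public general model id is an assumption:

    import base64
    import requests

    PAT = "YOUR_CLARIFAI_PAT"  # hypothetical personal access token
    url = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"

    with open("photo.jpg", "rb") as f:  # hypothetical local file
        image_b64 = base64.b64encode(f.read()).decode()

    resp = requests.post(
        url,
        json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
        headers={"Authorization": f"Key {PAT}"},
    )

    # Concept values are 0-1; multiply by 100 to match the percentages above.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')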

Imagga
created on 2022-02-26

bathroom 37.7
washbasin 37
vessel 34
basin 33.6
interior 30
room 26.7
dishwasher 25
modern 23.1
home 21.5
bathtub 20.5
house 19.2
white goods 18.9
clean 17.5
furniture 17.3
container 17.1
home appliance 15.9
sink 15.9
toilet 15.7
appliance 15.3
bath 15.2
inside 14.7
tub 13.8
shower 13.6
architecture 12.8
faucet 12.8
wash 12.5
design 12.4
decor 12.4
person 12.3
contemporary 12.2
wall 12
kitchen 11.7
domestic 11
lifestyle 10.8
man 10.7
indoors 10.5
building 10.4
window 10.3
luxury 10.3
3d 10.1
washing 9.7
apartment 9.6
equipment 9.6
people 9.5
glass 9.4
water 9.3
style 8.9
metal 8.8
steel 8.8
light 8.7
decoration 8.7
floor 8.4
health 8.3
technology 8.2
machine 8.1
washstand 8.1
tile 8.1
women 7.9
black 7.8
male 7.8
high 7.8
tiles 7.8
stainless 7.7
mirror 7.6
business 7.3
stylish 7.2
work 7.2
device 7.1
science 7.1
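
The Imagga list follows the response shape of Imagga's /v2/tags endpoint, which nests each tag's English label under "tag" and reports confidence in percent. A minimal sketch, assuming an API key/secret pair (both placeholders) and HTTP Basic auth, which that endpoint uses:

    import requests

    API_KEY, API_SECRET = "YOUR_KEY", "YOUR_SECRET"  # hypothetical credentials

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/photo.jpg"},  # placeholder URL
        auth=(API_KEY, API_SECRET),
    )

    for item in resp.json()["result"]["tags"]:
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')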

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

person 88.4
black and white 87.8
text 61.9

Face analysis

AWS Rekognition

Age 2-8
Gender Female, 71.8%
Calm 51.6%
Sad 47.2%
Angry 0.3%
Surprised 0.3%
Fear 0.2%
Happy 0.1%
Disgusted 0.1%
Confused 0.1%
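
The age range, gender estimate, and ranked emotion scores match what Rekognition returns when all facial attributes are requested. A sketch under the same assumptions as the label example above:

    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # hypothetical local file
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        # Emotions arrive unordered; sort to match the descending list above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')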

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
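
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why this block reads differently from the Rekognition one. A minimal sketch with the google-cloud-vision client, assuming application default credentials; the file name is a placeholder:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:  # hypothetical local file
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Each attribute is a Likelihood enum value.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)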

Feature analysis

Amazon

Person
Person 98%

Categories

Imagga

paintings art 97.2%
interior objects 1.7%

Captions

Microsoft
created on 2022-02-26

a person sitting in a cage 69.5%
a person sitting in a cage 62.7%
a person in a cage 62.6%
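
Repeated caption candidates with descending confidences are characteristic of Azure's image-description feature, which returns several ranked captions per image. A hedged sketch with the azure-cognitiveservices-vision-computervision SDK; endpoint, key, and file name are placeholders:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/",  # hypothetical endpoint
        CognitiveServicesCredentials("YOUR_KEY"),              # hypothetical key
    )

    with open("photo.jpg", "rb") as f:  # hypothetical local file
        analysis = client.describe_image_in_stream(f, max_candidates=3)

    for caption in analysis.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")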

Text analysis

Amazon

FAS
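
A single detected token like "FAS" is typical of Rekognition's text-detection call picking up stray lettering in a photograph. A minimal sketch, under the same assumptions as the other Rekognition examples:

    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # hypothetical local file
        response = client.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":  # skip per-word duplicates
            print(detection["DetectedText"])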