Human Generated Data

Title

Untitled (women and baby at vanity looking into mirror)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17628

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 97.7
Human 97.7
Person 95.8
Clothing 94.6
Apparel 94.6
Person 94.4
Person 89.2
Elevator 87.5
Person 81.1
Person 72.4
Female 62.1
Portrait 59.3
Photography 59.3
Photo 59.3
Face 59.3
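
The Amazon tags above pair a label with a confidence score, which is the shape of output AWS Rekognition's label-detection call returns. Below is a minimal sketch of such a call, assuming configured boto3 credentials and a placeholder file name; the parameters actually used for this record are not documented here.

```python
import boto3

# Sketch of an AWS Rekognition label-detection request that yields
# (label, confidence) pairs like the Amazon tags listed above.
# "photo.jpg" is a placeholder path; credentials come from the
# standard boto3 configuration.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=55)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```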

Clarifai
created on 2023-10-29

people 99.6
monochrome 99.3
wedding 99.1
window 98.1
mirror 97.1
woman 96.3
bride 95.1
veil 94.5
adult 94.2
couple 92.8
groom 92.7
indoors 92.4
two 90
man 89.8
family 89
wear 88.4
girl 86.9
street 86.9
group 84.1
marriage 83.6
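
The Clarifai concepts above resemble the response of Clarifai's v2 predict endpoint for its general image-recognition model. The following is a minimal REST sketch with placeholder API key, model ID, and image URL; the exact model and API version behind these 2023 tags are not recorded here.

```python
import requests

# Sketch of a Clarifai v2 predict request for a general
# image-recognition model. API_KEY, MODEL_ID, and IMAGE_URL are
# placeholders; concept values are returned on a 0-1 scale.
API_KEY = "YOUR_CLARIFAI_KEY"
MODEL_ID = "general-image-recognition"
IMAGE_URL = "https://example.org/photo.jpg"

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```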

Imagga
created on 2022-02-26

china cabinet 49.7
sliding door 44.5
cabinet 42.6
door 39
case 35.7
furniture 35.4
window 27.5
movable barrier 26.7
furnishing 20.5
man 20.1
people 18.9
barrier 17.7
interior 17.7
room 17.5
indoors 16.7
person 14.7
home 14.3
male 14.2
shop 12.8
casual 12.7
hospital 12.3
adult 12.3
lifestyle 12.3
modern 11.9
technology 11.9
black 11.4
doctor 11.3
business 10.9
house 10.8
worker 10.7
men 10.3
industry 10.2
architecture 10.1
glass 10.1
human 9.7
working 9.7
medical 9.7
medicine 9.7
illness 9.5
women 9.5
work 9.4
happiness 9.4
smile 9.2
environment 9
dress 9
office 9
obstruction 8.9
looking 8.8
surgery 8.8
light 8.7
patient 8.6
surgeon 8.5
store 8.5
design 8.4
communication 8.4
health 8.3
happy 8.1
retail 7.6
professional 7.6
fashion 7.5
child 7.5
floor 7.4
style 7.4
building 7.4
inside 7.4
smiling 7.2
nurse 7.2
hair 7.1
framework 7.1
life 7
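
The Imagga tags above match the structure of Imagga's v2 tagging endpoint, which reports an English tag name with a confidence score. A minimal sketch, assuming placeholder API credentials and an image URL:

```python
import requests

# Sketch of an Imagga v2 tagging request. API_KEY, API_SECRET, and
# IMAGE_URL are placeholders; each tag's English name is nested under
# tag -> en alongside a 0-100 confidence score.
API_KEY = "YOUR_IMAGGA_KEY"
API_SECRET = "YOUR_IMAGGA_SECRET"
IMAGE_URL = "https://example.org/photo.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)

for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```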

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

black and white 93.2
person 92.2
window 85.4
text 80
clothing 66.6
monochrome 62.9
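
The Microsoft tags above look like the output of Azure's Computer Vision image-analysis API. A minimal REST sketch follows, assuming a placeholder endpoint, subscription key, and image URL; the service version used when these tags were created in 2022 may differ.

```python
import requests

# Sketch of an Azure Computer Vision "analyze" request that returns
# image tags with confidences. ENDPOINT, KEY, and IMAGE_URL are
# placeholders; confidences are reported on a 0-1 scale.
ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_KEY"
IMAGE_URL = "https://example.org/photo.jpg"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)

for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```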

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 43-51
Gender Female, 86.6%
Calm 96%
Surprised 3.3%
Fear 0.2%
Sad 0.2%
Disgusted 0.1%
Happy 0.1%
Confused 0.1%
Angry 0.1%
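
The age range, gender, and emotion confidences above correspond to the face attributes that AWS Rekognition's face-detection call can return. A minimal sketch, again using a placeholder file name:

```python
import boto3

# Sketch of an AWS Rekognition face-detection call that returns an
# estimated age range, gender, and per-emotion confidences like those
# listed above. "photo.jpg" is a placeholder path.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```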

Feature analysis

Amazon

Person
Person 97.7%
Person 95.8%
Person 94.4%
Person 89.2%
Person 81.1%
Person 72.4%

Categories

Imagga

interior objects 99.2%

Text analysis

Amazon

9
ITC
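
The detected strings above ("9", "ITC") are the kind of result AWS Rekognition's text-detection call returns for lettering visible in a photograph. A minimal sketch with a placeholder file name:

```python
import boto3

# Sketch of an AWS Rekognition text-detection call that extracts
# visible lettering such as the strings listed above.
# "photo.jpg" is a placeholder path.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')
```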