Human Generated Data

Title

Untitled (man, woman, and girl looking into mirror)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17091

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Human 89.8
Face 77.7
Furniture 74.4
Person 71.5
Texture 70.1
Photography 61.2
Photo 61.2
Clothing 58.6
Apparel 58.6

Clarifai
created on 2023-10-29

people 99.4
child 97.5
family 96.9
woman 94.8
adult 94.6
monochrome 94.6
indoors 94.2
one 93.6
love 93.1
two 91.9
girl 91.1
man 90.3
house 87.3
pain 85.7
room 85.6
baby 85.3
son 83.3
little 81.3
sit 80.7
person 79.8

Imagga
created on 2022-02-26

person 28.7
people 27.3
happy 26.9
grandma 25.3
smiling 24.6
portrait 23.9
child 23.5
adult 23.5
senior 23.4
home 20.7
elderly 20.1
mature 18.6
baby bed 18.2
happiness 18
blond 17.3
smile 17.1
old 16.7
retired 16.5
alone 16.4
sitting 16.3
women 15.8
furniture 15.4
retirement 15.4
male 15.3
cute 15.1
cradle 14.3
face 14.2
man 14.1
indoors 14.1
lady 13.8
lifestyle 13.7
looking 13.6
one 13.4
daughter 13.4
pretty 13.3
holding 13.2
cheerful 13
love 12.6
reading 12.4
wife 12.3
business 12.1
human 12
attractive 11.9
day 11.8
age 11.4
student 11
kid 10.6
couple 10.5
office 10.4
hair 10.3
expression 10.2
furnishing 9.7
together 9.6
bride 9.6
husband 9.5
study 9.3
wedding 9.2
mother 9.2
book 9.1
room 9.1
dress 9
outdoors 9
pensioner 8.9
family 8.9
little 8.8
grandmother 8.8
older 8.7
holiday 8.6
life 8.6
hand 8.4
glasses 8.3
clothing 8.2
school 8.2
aged 8.1
worker 8
working 7.9
vertical 7.9
veil 7.8
education 7.8
table 7.8
two 7.6
head 7.6
fashion 7.5
house 7.5
groom 7.3
work 7.1
interior 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 95
person 92.5
human face 78.7
dress 66.9
clothing 62.2
woman 61.2
black and white 56.1
open 50

Face analysis

Amazon

AWS Rekognition

Age 48-56
Gender Male, 99.5%
Confused 46.3%
Surprised 30.3%
Sad 6.5%
Happy 6%
Calm 5.3%
Disgusted 3.3%
Fear 1.3%
Angry 1%

AWS Rekognition

Age 51-59
Gender Male, 99.9%
Happy 84.5%
Calm 6.5%
Sad 2.6%
Confused 2.2%
Angry 1.5%
Surprised 1.3%
Disgusted 0.9%
Fear 0.5%

Feature analysis

Amazon

Person
Person 71.5%

Categories

Imagga

paintings art 98.3%

Text analysis

Amazon

VI72A2
VI72A2 - 10004
- 10004