Human Generated Data

Title

Untitled (woman and baby looking into mirror, seated)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16325

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Person 99
Human 99
Clothing 95.7
Apparel 95.7
Indoors 92.8
Room 92.6
Person 91.2
Person 87.1
Dressing Room 77.9
Mirror 72.4
People 62.7
Robe 57
Fashion 57

Clarifai
created on 2023-10-28

people 99.3
mirror 99
monochrome 97.4
woman 97.4
wedding 97.3
man 93.7
retro 93.6
adult 93.4
indoors 90.5
veil 90.1
bride 89.5
two 86.9
love 86.7
group 86.1
science 85.9
square 85.8
illustration 85.5
ball-shaped 85.4
luxury 84.6
couple 81.9

Imagga
created on 2022-02-11

chandelier 78.8
lighting fixture 62.3
fixture 46.6
globe 25.9
gong 22.3
world 21.8
earth 21
planet 18.8
percussion instrument 18.2
global 17.3
digital 14.6
musical instrument 14.2
case 14
map 14
business 14
art 13.1
people 12.8
light 12.7
technology 12.6
celebration 12
adult 11.7
cap 11.6
person 11.5
shower cap 11.4
decoration 10.8
man 10.7
science 10.7
design 10.1
holiday 10
medicine 9.7
color 9.4
happy 9.4
device 9.4
finance 9.3
communication 9.2
male 9.2
connection 9.1
sky 8.9
shape 8.9
style 8.9
graphic 8.7
continent 8.7
geography 8.7
work 8.6
international 8.6
space 8.5
3d 8.5
travel 8.4
modern 8.4
network 8.3
health 8.3
gramophone 8.3
human 8.2
night 8
medical 7.9
conceptual 7.9
happiness 7.8
black 7.8
headdress 7.8
wallpaper 7.7
horizontal 7.5
ocean 7.5
symbol 7.4
time 7.3
computer 7.2
star 7.2
portrait 7.1
face 7.1

Google
created on 2022-02-11

Microsoft
created on 2022-02-11

text 97.5
wall 97.2
person 97.1
indoor 93.4
wedding 87.1
wedding dress 82.7
vase 76.3
mirror 70.5
old 59.7
flower 59.2
bride 59.2
clothing 50.9
posing 45.7

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 28-38
Gender Male, 81%
Surprised 77.7%
Happy 19.3%
Fear 1.2%
Sad 0.6%
Disgusted 0.3%
Angry 0.3%
Confused 0.3%
Calm 0.2%

AWS Rekognition

Age 33-41
Gender Male, 92.7%
Happy 84.4%
Surprised 8.4%
Calm 3.3%
Confused 1.2%
Disgusted 1.1%
Sad 0.9%
Angry 0.4%
Fear 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Feature analysis

Amazon

Person
Person 99%
Person 91.2%
Person 87.1%

Categories

Text analysis

Amazon

183
EF
Fire
LAIN EF
Mount
and Mount Fire
and
LAIN