Human Generated Data

Title

Untitled (bride seated at mirror having hair done)

Date

c. 1945

People

Artist: Jack Rodden Studio, American, 1914 - 2016

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1467

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 98.7
Human 98.7
Person 97.2
Person 96.6
Person 94.6
Person 94.1
Painting 93.8
Art 93.8
Person 92.8
Furniture 77.9
Clothing 76.8
Apparel 76.8
Bed 68.8
Figurine 64.3
Cake 57.6
Food 57.6
Dessert 57.6
Home Decor 57.4
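
The Amazon tags above follow the label/confidence format returned by AWS Rekognition's label-detection operation. A minimal sketch of how such labels could be reproduced with boto3, assuming a local copy of the photograph (the file name and confidence threshold are illustrative, not part of this record):

    import boto3

    IMAGE_PATH = "bride_at_mirror.jpg"  # assumed local copy of the image

    rekognition = boto3.client("rekognition")

    with open(IMAGE_PATH, "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,
            MinConfidence=55,  # roughly matches the lowest score listed above
        )

    # Each label carries a name and a confidence in percent,
    # mirroring the "Person 98.7", "Painting 93.8", ... entries above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')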

Clarifai
created on 2023-10-26

people 99.9
woman 98.3
child 97.9
group 97.5
adult 96.5
wear 95.1
two 94
art 93.6
man 92.9
family 91.7
son 91
monochrome 90.9
room 89.7
medical practitioner 89.4
three 88.6
retro 87.2
dressing room 87.2
furniture 86.5
portrait 85.9
painting 85.4

Imagga
created on 2022-01-23

home 29.5
person 23.3
adult 22.7
bedroom 22.3
man 22.2
people 21.7
room 21.4
bed 21.1
family 18.7
indoors 18.4
male 17.1
baby bed 16.8
furniture 16.5
happy 15.7
love 15
happiness 14.9
iron lung 14.8
breathing device 14.7
smiling 13.7
hospital 13.7
cradle 13.6
portrait 13.6
smile 13.5
house 13.4
clothing 13.2
couple 13.1
brother 12.6
child 12.5
lying 12.2
women 11.9
respirator 11.8
dress 11.7
covering 11.7
pajama 11.6
interior 11.5
resting 11.4
lady 11.4
pretty 11.2
device 11
mother 11
relaxing 10.9
furnishing 10.8
kid 10.6
females 10.4
attractive 9.8
patient 9.7
couch 9.7
bride 9.6
adults 9.5
bathrobe 9.4
lifestyle 9.4
indoor 9.1
care 9.1
sexy 8.8
brunette 8.7
cute 8.6
daughter 8.6
chair 8.5
black 8.4
relaxation 8.4
fashion 8.3
wedding 8.3
human 8.2
girls 8.2
childhood 8.1
hair 7.9
garment 7.8
face 7.8
sitting 7.7
nurse 7.7
break 7.6
senior 7.5
vintage 7.4
nightwear 7.4
inside 7.4
20s 7.3
oxygen mask 7.3
clock 7.2
holiday 7.2
life 7.2
together 7
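
The Imagga scores above use the same tag-plus-confidence pattern. A hedged sketch against Imagga's public v2 tagging endpoint; the endpoint path, credentials, response shape, and image URL below are assumptions, not details from this record:

    import requests

    API_KEY = "your_api_key"        # placeholder credentials
    API_SECRET = "your_api_secret"
    IMAGE_URL = "https://example.org/bride_at_mirror.jpg"  # placeholder URL

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )
    resp.raise_for_status()

    # Tags arrive as {"tag": {"en": ...}, "confidence": ...} records,
    # matching the "home 29.5", "person 23.3", ... listing above.
    for item in resp.json()["result"]["tags"]:
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')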

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

wedding dress 97.8
bride 94.5
indoor 94.1
text 90.6
clothing 89.6
dress 86.9
woman 85.9
person 85.5
painting 67.8
human face 60.7
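
The Microsoft tags resemble the output of Azure Computer Vision's image-tagging operation. A minimal sketch against the v3.2 REST endpoint; the resource endpoint, key, image URL, and API version are placeholders and assumptions:

    import requests

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
    KEY = "your_subscription_key"                                     # placeholder
    IMAGE_URL = "https://example.org/bride_at_mirror.jpg"             # placeholder

    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/tag",
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": IMAGE_URL},
    )
    resp.raise_for_status()

    # Each tag has a name and a 0-1 confidence; scaled to percent it matches
    # the "wedding dress 97.8", "bride 94.5", ... figures above.
    for tag in resp.json()["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')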

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 21-29
Gender Female, 99.9%
Sad 26.1%
Happy 26%
Angry 17.3%
Surprised 11%
Calm 8.8%
Fear 5.3%
Disgusted 3.3%
Confused 2.2%

AWS Rekognition

Age 20-28
Gender Female, 61.2%
Calm 70.8%
Confused 5.3%
Disgusted 5.1%
Angry 4.9%
Happy 4.6%
Surprised 4.5%
Sad 2.9%
Fear 1.9%

AWS Rekognition

Age 23-33
Gender Female, 100%
Disgusted 99.8%
Confused 0.1%
Calm 0%
Angry 0%
Happy 0%
Sad 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 23-33
Gender Female, 100%
Happy 99.7%
Surprised 0.1%
Sad 0%
Calm 0%
Angry 0%
Confused 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 21-29
Gender Male, 99.8%
Calm 88.9%
Sad 8.7%
Confused 0.9%
Fear 0.5%
Happy 0.3%
Angry 0.2%
Surprised 0.2%
Disgusted 0.2%
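
The age ranges, gender estimates, and emotion scores above have the shape of AWS Rekognition's face-detection output. A sketch of how those per-face attributes could be produced with boto3 (the file name is illustrative):

    import boto3

    IMAGE_PATH = "bride_at_mirror.jpg"  # assumed local copy of the image

    rekognition = boto3.client("rekognition")

    with open(IMAGE_PATH, "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # include age range, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotion confidences, sorted highest to lowest as in the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')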

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
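
The Google Vision rows report likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal sketch with the google-cloud-vision client, assuming a local copy of the image:

    from google.cloud import vision

    IMAGE_PATH = "bride_at_mirror.jpg"  # assumed local copy of the image

    client = vision.ImageAnnotatorClient()

    with open(IMAGE_PATH, "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Likelihood enums map to the buckets shown above
    # (VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, VERY_LIKELY).
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)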

Feature analysis

Amazon

Person
Painting
Person 98.7%
Person 97.2%
Person 96.6%
Person 94.6%
Person 94.1%
Person 92.8%
Painting 93.8%

Categories

Captions