Human Generated Data

Title

Untitled (woman seated in chair holding mocking bird)

Date

1930s

People

Artist: C. Bennette Moore, American, 1879–1939

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12392

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2022-01-22

Human 99.2
Person 99.2
Indoors 83.4
Interior Design 83.4
Camera 83.1
Electronics 83.1
Furniture 79.8
Table Lamp 76.1
Lamp 69.4
Person 67.6
Room 58
Living Room 58
Photographer 55.6
Clothing 55.2
Apparel 55.2
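
These label/confidence pairs match the output shape of Amazon Rekognition's DetectLabels operation. The minimal Python sketch below shows how such tags could be generated; it assumes boto3 is installed with AWS credentials configured, and "image.jpg" is a hypothetical local copy of the photograph.

import boto3

client = boto3.client("rekognition")

# Read the photograph as raw bytes (hypothetical local file).
with open("image.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,         # cap the number of labels returned
        MinConfidence=50.0,   # drop labels scored below 50%
    )

# Emit "Label Confidence" pairs in the same shape as the record above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")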

Clarifai
created on 2023-10-26

people 99.9
two 98.5
adult 97.9
furniture 97.5
woman 97.2
room 96.7
portrait 96.7
one 96.1
seat 95.2
vintage 94.6
sepia 94.5
retro 93.9
man 93.9
sit 92.7
monochrome 90.1
family 89.6
wear 89.2
chair 86.3
nostalgia 85.2
music 81.7
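
Clarifai exposes similar concept/confidence output through its v2 REST API; the sketch below is one plausible way to reproduce such tags. The API key is a placeholder and the general-model identifier is an assumption; Clarifai scores concepts on a 0-1 scale, so values are scaled to match the 0-100 scores above.

import base64
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"        # placeholder credential
MODEL_ID = "general-image-recognition"   # assumed id of Clarifai's general model

# Base64-encode the photograph (hypothetical local file).
with open("image.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode()

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
    timeout=30,
)
resp.raise_for_status()

# Scale the 0-1 concept values to match the record above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")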

Imagga
created on 2022-01-22

chair 31
salon 31
home 30.3
man 28.9
person 26.5
people 25.6
male 24.9
indoors 22.8
senior 21.5
lifestyle 20.9
happy 20.7
sitting 20.6
barber chair 19.6
adult 19.5
seat 18.4
portrait 18.1
hat 17.5
room 16.9
hairdresser 16.6
smiling 15.9
couple 15.7
happiness 15.7
old 15.3
face 14.9
mature 14.9
indoor 14.6
elderly 14.4
cheerful 13.8
worker 13.3
holding 13.2
clothing 13.1
casual 12.7
retirement 12.5
family 12.4
interior 12.4
furniture 12.3
smile 12.1
alone 11.9
love 11.8
lady 11.4
domestic 11.1
gramophone 11.1
aged 10.9
leisure 10.8
retired 10.7
loving 10.5
men 10.3
armchair 10.3
machine 10.2
day 10.2
two 10.2
book 10.1
relaxing 10
dress 9.9
fashion 9.8
attractive 9.8
pensioner 9.8
child 9.7
together 9.6
looking 9.6
reading 9.5
women 9.5
living 9.5
enjoyment 9.4
relaxation 9.2
house 9.2
blond 9
one 9
record player 8.9
work 8.9
older 8.7
couch 8.7
device 8.6
husband 8.6
relaxed 8.4
horizontal 8.4
hand 8.3
color 8.3
fun 8.2
cup 8.1
medical 7.9
call 7.9
good mood 7.8
living room 7.8
boy 7.8
nurse 7.8
mother 7.7
drinking 7.7
only 7.6
joy 7.5
father 7.5
camera 7.4
20s 7.3
children 7.3
shop 7.2
grandfather 7
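
Imagga's tagging endpoint returns the same tag/confidence structure. A hedged sketch against its v2 REST API follows, with placeholder credentials and a hypothetical public image URL.

import requests

AUTH = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")   # placeholder credentials
IMAGE_URL = "https://example.org/image.jpg"      # hypothetical public URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=AUTH,   # Imagga uses HTTP basic auth with key/secret
    timeout=30,
)
resp.raise_for_status()

# Imagga already reports confidence on a 0-100 scale.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")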

Google
created on 2022-01-22

(no tags recorded)

Microsoft
created on 2022-01-22

person 96.2
indoor 96
fashion accessory 87.9
clothing 82
text 76.7
furniture 69.7
hat 65.5
old 55.6
woman 53.2
lamp 50.8
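
Microsoft's tags come from the Azure Computer Vision image-tagging endpoint. Below is a sketch of one way to call it, with a placeholder resource endpoint and key; Azure reports confidence on a 0-1 scale, scaled here to match the record.

import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder

# Send the raw image bytes to the v3.2 tagging endpoint.
with open("image.jpg", "rb") as f:
    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/tag",
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
        timeout=30,
    )
resp.raise_for_status()

# Scale the 0-1 tag confidences to match the record above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")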

Face analysis

AWS Rekognition

Age 34-42
Gender Female, 99.5%
Calm 97.4%
Fear 0.5%
Sad 0.5%
Angry 0.5%
Happy 0.4%
Confused 0.3%
Disgusted 0.3%
Surprised 0.1%

AWS Rekognition

Age 13-21
Gender Male, 52%
Calm 97.4%
Sad 0.8%
Surprised 0.6%
Fear 0.4%
Angry 0.3%
Confused 0.3%
Disgusted 0.2%
Happy 0.1%

AWS Rekognition

Age 18-26
Gender Male, 59.1%
Calm 56.2%
Sad 26.2%
Disgusted 6.2%
Surprised 4%
Fear 2.7%
Angry 2.6%
Confused 1.1%
Happy 0.9%
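
Each of the three blocks above has the shape of an Amazon Rekognition DetectFaces result: an estimated age range, a gender guess with its confidence, and independent per-emotion confidences for one detected face. A sketch of how such results could be produced, assuming configured boto3 credentials and a hypothetical local image file:

import boto3

client = boto3.client("rekognition")

with open("image.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],   # request age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are scored independently; sort highest first as in the record.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")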

Microsoft Cognitive Services

Age 31
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
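
Google Vision reports face attributes as likelihood buckets (Very unlikely through Very likely) rather than percentages. A sketch using the google-cloud-vision client, assuming application credentials are configured and the same hypothetical local file:

from google.cloud import vision

# Assumes GOOGLE_APPLICATION_CREDENTIALS points at a service-account key.
client = vision.ImageAnnotatorClient()

with open("image.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    for label, likelihood in [
        ("Surprise", face.surprise_likelihood),
        ("Anger", face.anger_likelihood),
        ("Sorrow", face.sorrow_likelihood),
        ("Joy", face.joy_likelihood),
        ("Headwear", face.headwear_likelihood),
        ("Blurred", face.blurred_likelihood),
    ]:
        # Enum names like VERY_UNLIKELY become "Very unlikely".
        print(label, likelihood.name.replace("_", " ").capitalize())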

Feature analysis

Amazon

Person 99.2%
Lamp 69.4%
