Human Generated Data

Title

Untitled (mourners beside open casket)

Date

c. 1920

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1854

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Furniture 99.9
Person 98.8
Human 98.8
Person 98.4
Cradle 96.9
Person 96.3
Tie 90.1
Accessories 90.1
Accessory 90.1
Person 86.7
Bed 85.6
Tie 62.9
Drawing 61.3
Art 61.3

Clarifai
created on 2023-10-25

chair 98.4
people 98.2
adult 95.2
man 95.2
furniture 94.3
sit 90.3
indoors 90.3
room 88.5
group 86.5
illustration 85.3
woman 84.7
two 82.3
monochrome 81.8
seat 79.6
leader 75.9
one 75.5
uniform 74.0
portrait 72.5
armchair 70.5
table 68.6

Imagga
created on 2021-12-14

home 39.3
room 38.6
chair 31.8
interior 30.9
furniture 29.6
sitting 24
indoors 23.7
house 23.4
person 22.8
man 22.2
seat 20.8
living 20.8
people 20.1
table 19.5
relax 19.4
adult 19.3
modern 18.9
happy 18.2
indoor 16.4
male 16.3
comfortable 16.2
decor 15.9
lifestyle 15.9
laptop 15.9
computer 15.5
couch 15.4
smiling 15.2
sofa 14.9
luxury 14.6
casual 14.4
lamp 14.3
armchair 14.1
design 13.5
grand piano 13.5
apartment 13.4
day 13.3
bedroom 13.2
business 12.7
style 12.6
relaxation 12.6
wood 12.5
decoration 12.4
sit 12.3
couple 12.2
piano 11.9
pillow 11.8
happiness 11.7
mid adult 11.6
bed 11.5
businessman 11.5
portrait 11
office 10.8
smile 10.7
family 10.7
looking 10.4
mature 10.2
domestic 10.2
grandfather 10.2
inside 10.1
20s 10.1
light 10
color 10
work 10
cheerful 9.7
sleep 9.7
working 9.7
keyboard instrument 9.7
antique 9.7
stringed instrument 9.6
women 9.5
senior 9.4
percussion instrument 9.3
bright 9.3
clothing 9.2
window 9.2
leisure 9.1
relaxing 9.1
old 9
stylish 9
30s 8.7
bench 8.6
hotel 8.6
face 8.5
horizontal 8.4
floor 8.4
desk 8.2
living room 7.8
rest 7.7
wall 7.7
elderly 7.7
notebook 7.6
executive 7.6
one 7.5
mother 7.4
alone 7.3
musical instrument 7.3
love 7.1
life 7.1
job 7.1
architecture 7
cushion 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

furniture 88.8
text 81.5
old 70.6
black and white 64.3

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 20-32
Gender Male, 77.6%
Calm 70.5%
Angry 19.9%
Surprised 6.3%
Sad 1.1%
Confused 0.8%
Fear 0.6%
Disgusted 0.5%
Happy 0.4%

AWS Rekognition

Age 39-57
Gender Male, 74.4%
Calm 95.5%
Sad 2.6%
Happy 1%
Surprised 0.2%
Angry 0.2%
Confused 0.2%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 12-22
Gender Female, 73.8%
Calm 71%
Happy 19.8%
Sad 4.5%
Confused 1.8%
Angry 1.1%
Surprised 0.8%
Fear 0.6%
Disgusted 0.4%

AWS Rekognition

Age 26-40
Gender Male, 67%
Fear 45.7%
Calm 23.3%
Happy 17.5%
Sad 8.3%
Angry 2.1%
Surprised 1.5%
Confused 1.1%
Disgusted 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.8%
Tie 90.1%
Bed 85.6%

Categories

Imagga

paintings art 99.9%

Text analysis

Amazon

KODAK-1TW