Human Generated Data

Title

Mr. deRham and an unidentified woman [possibly Charles deRham and Laura Fredericka Schmidt deRham]

Date

c. 1857

People

Artist: Mathew B. Brady & Studio, American, active 1844 - 1895

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from Special Collections, Fine Arts Library, Harvard College Library, Bequest of Evert Jansen Wendell, 2010.42

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 99
Chair 98.7
Furniture 98.7
Person 98.5
Person 95.6
Game 87.5
Chess 68.6
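
The Amazon tags above have the shape of output from AWS Rekognition's DetectLabels operation (a label name with a confidence score). A minimal sketch of how such tags could be produced, assuming boto3 is installed, AWS credentials are configured, and the photograph is available locally at the hypothetical path "image.jpg":

    # Sketch: generate label tags similar to the Amazon list above.
    # "image.jpg" is a hypothetical local copy of the photograph.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("image.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=25,        # cap the number of returned labels
        MinConfidence=50.0,  # drop low-confidence labels
    )

    # Print "Label confidence" pairs, e.g. "Chair 98.7"
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")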

Clarifai
created on 2023-10-25

people 99.9
two 99
adult 98.3
group 98.1
woman 97.9
man 97.6
portrait 97.5
three 96
wear 93
leader 92.8
art 92.4
elderly 91.4
sit 90.9
furniture 90.2
chair 90.1
outfit 88.5
sepia 87.7
retro 86.8
recreation 86.2
administration 84.8

Imagga
created on 2022-01-09

man 27.5
person 27.2
male 22
adult 20.7
people 19.5
portrait 18.7
fashion 15.1
human 15
room 14
musical instrument 13.7
black 13.5
antique 13.3
wind instrument 13.3
old 13.2
senior 13.1
interior 12.4
looking 12
device 11.6
vintage 11.6
posing 11.5
brass 11.4
suit 11.2
sexy 11.2
attractive 11.2
sitting 11.2
expression 11.1
sculpture 11
alone 10.9
dress 10.8
hair 10.3
model 10.1
pretty 9.8
family 9.8
lady 9.7
one 9.7
home 9.6
happy 9.4
art 9.3
indoor 9.1
clothing 9
retro 9
style 8.9
businessman 8.8
chair 8.8
ancient 8.6
men 8.6
luxury 8.6
elegant 8.6
smile 8.5
face 8.5
business 8.5
make 8.2
teacher 8.2
pose 8.1
history 8
office 8
mask 7.9
love 7.9
couple 7.8
happiness 7.8
elderly 7.7
health 7.6
grandfather 7.3
stylish 7.2
body 7.2
to 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

person 99
text 97.8
wall 95.5
clothing 95.3
man 84.7
old 76.1
black 65.9
group 63.8
people 60.8
human face 50.3
vintage 27.9
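
The Microsoft tags above correspond to the tagging feature of the Azure Computer Vision service. A rough sketch, assuming the azure-cognitiveservices-vision-computervision SDK, a hypothetical endpoint and key, and the same local image file as above:

    # Sketch: produce image tags like the Microsoft list above with Azure
    # Computer Vision. The endpoint, key, and "image.jpg" path are assumptions.
    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # hypothetical
    KEY = "<your-key>"                                                 # hypothetical

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    with open("image.jpg", "rb") as f:
        result = client.tag_image_in_stream(f)

    # Each tag carries a name and a 0-1 confidence; scale to percent
    # to mirror the "person 99" style of the list above.
    for tag in result.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")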

Color Analysis

Face analysis

AWS Rekognition

Age 39-47
Gender Male, 76.9%
Sad 51.9%
Calm 41%
Confused 2.9%
Fear 1.8%
Disgusted 1.1%
Surprised 0.6%
Angry 0.5%
Happy 0.2%

AWS Rekognition

Age 76-86
Gender Male, 99.2%
Calm 97.3%
Sad 1.9%
Confused 0.4%
Happy 0.1%
Disgusted 0.1%
Fear 0.1%
Angry 0.1%
Surprised 0%
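
The two AWS Rekognition readings above (an age range, a gender estimate, and an emotion distribution per face) match the output of Rekognition's DetectFaces operation. A minimal sketch, under the same boto3 and image-path assumptions as the label example earlier:

    # Sketch: face attributes like the AWS Rekognition readings above.
    # Assumes boto3 is configured and the photo is at the hypothetical "image.jpg".
    import boto3

    rekognition = boto3.client("rekognition")

    with open("image.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come back unsorted; list them highest-confidence first.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")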

Microsoft Cognitive Services

Age 69
Gender Male

Microsoft Cognitive Services

Age 36
Gender Female

Feature analysis

Amazon

Chair 98.7%
Person 98.5%
Chess 68.6%

Categories