Human Generated Data

Title

Rufus Choate (1799-1859)

Date

c. 1883

People

Artist: Southworth & Hawes, American, active 1843-1863

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Sam Pratt, P2003.303

Machine Generated Data

Tags

Amazon
created on 2022-05-28

Person 97.6
Human 97.6
Art 93.4
Clothing 89.7
Apparel 89.7
Painting 86.3
Face 84.1
Portrait 67.7
Photography 67.7
Photo 67.7
Text 62.7
Coat 61.9
People 61.1
Drawing 56.7
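
Labels with confidence scores like those above are the kind of output returned by the AWS Rekognition DetectLabels operation. The sketch below is a minimal, hypothetical example of producing such tags with boto3; the image filename, region, and thresholds are assumptions, and the actual pipeline and parameters used for this record are not documented here.

# Minimal sketch: image labels via AWS Rekognition DetectLabels (boto3).
# Assumptions: local file "rufus_choate.jpg", region "us-east-1".
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("rufus_choate.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,       # cap on the number of labels returned
    MinConfidence=50,   # drop low-confidence labels
)

# Print each label with its confidence score, e.g. "Person 97.6"
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")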

Clarifai
created on 2023-10-30

people 100
portrait 99.9
adult 99.3
one 99.3
art 99
man 98.7
facial hair 95.4
leader 95.3
music 94.6
wear 94.5
elderly 94.2
engraving 94
sit 93.1
two 92.3
sepia 91.9
print 91.2
mustache 91.1
writer 90.1
outerwear 89.8
neckwear 89.7

Imagga
created on 2022-05-28

person 37
portrait 36.3
adult 33.1
male 29.2
people 29
attractive 28.7
man 28.2
hair 23.8
fashion 22.6
face 22
pretty 21
expression 20.5
bow tie 20
lady 18.7
sexy 18.5
black 17.9
model 17.9
comedian 17.4
human 17.3
suit 17.2
performer 16.8
lifestyle 15.9
necktie 15.5
body 15.2
handsome 15.2
happy 15
dress 14.5
smile 13.5
sensual 12.7
smiling 12.3
eyes 12.1
fun 12
entertainer 11.9
style 11.9
casual 11.9
love 11.8
posing 11.6
business 11.5
cute 11.5
serious 11.4
couple 11.3
makeup 11.2
lips 11.1
clothing 11.1
sensuality 10.9
make 10.9
device 10.6
businessman 10.6
elevator 10.5
looking 10.4
youth 10.2
dark 10
room 9.9
hand 9.9
brunette 9.6
elegant 9.4
alone 9.1
garment 9
one 9
together 8.8
urban 8.7
wig 8.6
tie 8.5
two 8.5
lifting device 8.4
guy 8.3
mother 8.2
world 7.8
erotic 7.8
old 7.7
skin 7.6
passion 7.5
relationship 7.5
call 7.5
office 7.3
women 7.1
family 7.1
interior 7.1
happiness 7.1
modern 7
look 7

Google
created on 2022-05-28

Microsoft
created on 2022-05-28

text 99.6
person 98.9
man 97.3
clothing 97.2
wall 97.2
human face 94.8
old 87
portrait 67.8
photograph 59.1
posing 51.9
vintage 31

Color Analysis

Face analysis

AWS Rekognition

Age 56-64
Gender Male, 100%
Sad 100%
Surprised 6.3%
Fear 5.9%
Confused 0.2%
Angry 0.1%
Calm 0%
Disgusted 0%
Happy 0%
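
The age range, gender, and emotion scores above correspond to the fields exposed by the AWS Rekognition DetectFaces operation. A minimal sketch of how such values could be read with boto3 follows; the filename and region are assumptions, as in the earlier labeling sketch.

# Minimal sketch: face attributes via AWS Rekognition DetectFaces (boto3).
# Assumptions: local file "rufus_choate.jpg", region "us-east-1".
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("rufus_choate.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],   # include age, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]      # e.g. {'Low': 56, 'High': 64}
    gender = face["Gender"]     # e.g. {'Value': 'Male', 'Confidence': 100.0}
    emotions = sorted(face["Emotions"],
                      key=lambda e: e["Confidence"], reverse=True)
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
    for emotion in emotions:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")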

Microsoft Cognitive Services

Age 66
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.6%
Painting 86.3%

Categories

Imagga

people portraits 77.2%
paintings art 20.6%
food drinks 1.4%