Human Generated Data

Title

R. A. Cram

Date

1916

People

Artist: John H. Garo, American, 1868–1939

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.831

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2022-01-08

Person 97.4
Human 97.4
Graduation 78.1
Portrait 63.8
Photography 63.8
Face 63.8
Photo 63.8

Clarifai
created on 2023-10-25

people 100
portrait 99.9
one 99.5
man 99.4
adult 98.8
monochrome 97.1
music 94.6
wear 93
profile 89.1
writer 88.5
retro 88.4
musician 86.5
cigar 84.9
light 83.7
mustache 83.5
side view 83.3
art 83.2
vintage 83.2
eyewear 82.8
two 82.8

Imagga
created on 2022-01-08

harmonica 89.2
free-reed instrument 71.3
wind instrument 61.6
musical instrument 42.7
man 38.3
male 33.4
person 28.2
black 27.3
portrait 24.6
people 22.9
suit 22.2
adult 22
handsome 18.7
businessman 17.7
looking 17.6
face 17.1
call 16.8
hand 16.7
business 16.4
men 15.5
serious 15.3
dark 15
human 15
love 14.2
couple 13.9
expression 13.7
hair 13.5
attractive 13.3
senior 13.1
sexy 11.3
one 11.2
old 11.2
device 11
performer 10.6
thinking 10.5
pretty 9.8
corporate 9.5
sitting 9.5
model 9.3
manager 9.3
guy 9.3
romance 8.9
office 8.8
lifestyle 8.7
work 8.6
elegant 8.6
youth 8.5
professional 8.4
relationship 8.4
fashion 8.3
executive 8.3
kazoo 8.1
eye 8
posing 8
happiness 7.8
criminal 7.8
hands 7.8
depression 7.8
emotional 7.8
eyes 7.8
girlfriend 7.7
two 7.6
passion 7.5
smoke 7.4
emotion 7.4
lady 7.3
confident 7.3
sensual 7.3
singer 7.3
romantic 7.1
job 7.1
together 7
look 7

Google
created on 2022-01-08

Glasses 96.7
Jaw 87.8
Vision care 87.4
Sleeve 87.2
Gesture 85.3
Collar 81.7
Tints and shades 77.1
Blazer 76.4
Suit 72.4
Formal wear 72.1
Vintage clothing 69.7
Eyewear 68.7
Art 67.9
Hat 66.7
Rectangle 66.3
Monochrome photography 64.4
Stock photography 64.3
Monochrome 64
Room 62.3
Sitting 60.2

Microsoft
created on 2022-01-08

person 99.8
human face 98.5
text 98.4
man 98.2
clothing 91
portrait 87.9
black and white 84
glasses 76.5
monochrome 65.4
image 39.6

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 99.8%
Confused 54.2%
Calm 28.3%
Sad 10.6%
Disgusted 5.2%
Surprised 0.6%
Angry 0.6%
Happy 0.3%
Fear 0.3%

Microsoft Cognitive Services

Age 42
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.4%

Captions

Microsoft
created on 2022-01-08

a man sitting on a bed 65.6%
a man sitting on a couch 65.5%
a man sitting in a chair 65.4%

Text analysis

Amazon

1916