Human Generated Data

Title

Untitled (photograph of portrait of man in old-fashioned tuxedo sitting in elaborately carved chair)

Date

c. 1930

People

Artist: Curtis Studio, American, active 1891–1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13156

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 98.6
Human 98.6
Tie 92
Accessories 92
Accessory 92
Art 89.8
Clothing 88.7
Apparel 88.7
Painting 78.2
Drawing 66.4
Text 64.7
Face 59.7
Coat 56.5
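
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels operation. A minimal sketch using boto3 follows; the filename and credential setup are placeholders, not details from this record.

```python
# Sketch: fetching label/confidence pairs with AWS Rekognition via boto3.
# Assumes AWS credentials are configured; "photograph.jpg" is a hypothetical local copy.
import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50,
)

for label in response["Labels"]:
    # Prints lines such as "Person 98.6" or "Tie 92.0"
    print(f"{label['Name']} {label['Confidence']:.1f}")
```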

Clarifai
created on 2023-10-26

people 99.9
portrait 99.9
art 99.4
one 99
adult 98.5
man 98.5
painting 98.1
wear 96.8
print 93.9
music 93.4
facial hair 91.5
vintage 91
mustache 89.8
two 88.4
old 88.4
scientist 87.1
leader 85.9
illustration 85.8
retro 83
veil 82.4

Imagga
created on 2022-01-22

person 36.8
man 35.6
male 32.7
disk jockey 30.1
adult 29.7
call 26.3
people 25.7
broadcaster 24.1
portrait 23.3
black 22
communicator 19.3
serious 17.2
face 17
expression 16.2
attractive 16.1
fashion 15.8
business 14.6
human 14.2
one 14.2
suit 13.5
telephone 13.3
model 13.2
couple 13.1
guy 13
looking 12.8
pretty 12.6
sexy 12
dark 11.7
posing 11.5
hand 11.4
happy 11.3
style 11.1
sensuality 10.9
hat 10.8
lifestyle 10.8
handsome 10.7
businessman 10.6
office 10.4
men 10.3
love 10.3
casual 10.2
professional 10.1
holding 9.9
hair 9.5
jacket 9.4
room 9.4
device 9.3
vintage 9.1
lady 8.9
cool 8.9
body 8.8
eyes 8.6
youth 8.5
vessel 8.4
glasses 8.3
phone 8.3
clothing 8.2
retro 8.2
job 8
old 7.7
outdoors 7.5
alone 7.3
romance 7.1
women 7.1
work 7.1
modern 7
look 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 98.9
clothing 96.4
human face 94.1
man 92
person 91.6
drawing 79.8
sketch 75.9
portrait 58.3
old 58
retro 53.3
posing 48.1
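
The Microsoft tags above resemble the output of Azure Computer Vision's image-tagging operation. The sketch below uses the azure-cognitiveservices-vision-computervision client; the endpoint, key, and filename are placeholders, and Azure reports confidence on a 0–1 scale, so values are multiplied by 100 to match the listing.

```python
# Sketch: image tagging with Azure Computer Vision (placeholder endpoint/key/filename).
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<your-key>"),
)

with open("photograph.jpg", "rb") as image_stream:
    result = client.tag_image_in_stream(image_stream)

for tag in result.tags:
    # Azure returns confidence in 0-1; scale to percent to match the figures above.
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```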

Color Analysis

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 99.9%
Calm 98.8%
Confused 0.4%
Sad 0.3%
Happy 0.1%
Disgusted 0.1%
Angry 0.1%
Surprised 0.1%
Fear 0.1%
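
The age range, gender, and emotion scores above correspond to AWS Rekognition's DetectFaces operation with all facial attributes requested. A minimal boto3 sketch, with the filename as a placeholder:

```python
# Sketch: face attributes (age range, gender, emotions) with AWS Rekognition.
# "photograph.jpg" is a hypothetical local copy of the image.
import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # the default attribute set omits age, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```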

Microsoft Cognitive Services

Age 44
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
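
The ratings above match Google Cloud Vision face detection, which reports these attributes as likelihood enums (Very unlikely through Very likely) rather than percentages. A sketch using the google-cloud-vision client library; credentials and the filename are placeholders.

```python
# Sketch: face likelihoods with Google Cloud Vision.
# Assumes application default credentials; "photograph.jpg" is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photograph.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihoods are enum values such as VERY_UNLIKELY, POSSIBLE, VERY_LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```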

Feature analysis

Amazon

Person 98.6%
Tie 92%

Categories

Imagga

paintings art 99.8%