Human Generated Data

Title

William Leonard Langer (1896-1977)

Date

1952

People

Artist: Umberto Romano, American, 1905-1984

Sitter: William Leonard Langer, 1896-1977

Classification

Paintings

Credit Line

Harvard University Portrait Collection, Bequest of Rowena Morse Langer, 1990, H704

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Art 100
Painting 100
Face 99.8
Head 99.8
Photography 99.8
Portrait 99.8
Person 99.4
Adult 99.4
Male 99.4
Man 99.4
Clothing 83.4
Coat 83.4
Accessories 71.5
Formal Wear 71.5
Tie 71.5
Drawing 57.9

Clarifai
created on 2018-05-10

people 100
one 99.7
portrait 99.3
adult 99.2
leader 98.6
man 98.2
sit 97.5
book series 97.4
wear 95.9
administration 95.8
gown (clothing) 93
art 92.8
print 92.6
engraving 92.5
writer 91.4
seat 90.9
furniture 89.7
facial hair 88.9
lawyer 88.1
scientist 87.9

Imagga
created on 2023-10-06

person 42.7
man 33.6
male 33.3
suit 31.9
adult 28.6
portrait 28.5
scholar 26.3
black 24.8
people 24
business 23.1
intellectual 21
attractive 20.3
sexy 20.1
businessman 19.4
fashion 18.9
face 17.8
model 17.1
looking 16.8
elegant 15.4
handsome 15.2
hair 15.1
dark 15
one 14.9
lady 14.6
expression 14.5
musical instrument 14.2
posing 14.2
pretty 14
guy 13.9
love 13.4
style 13.4
professional 13.2
office 13.2
stringed instrument 13.2
serious 12.4
lifestyle 11.6
couple 11.3
human 11.2
manager 11.2
corporate 11.2
room 11.1
makeup 11
sensual 10.9
studio 10.6
look 10.5
tie 10.4
passion 10.3
sitting 10.3
executive 10.1
confident 10
sensuality 10
hand 9.9
boss 9.6
men 9.4
work 9.4
newspaper 9.2
alone 9.1
bowed stringed instrument 9.1
holding 9.1
dress 9
clothing 8.9
success 8.9
happy 8.8
erotic 8.5
jacket 8.5
cello 8.1
night 8
body 8
working 8
singer 7.9
smile 7.8
boy 7.8
performer 7.8
wind instrument 7.7
youth 7.7
seductive 7.6
adolescent 7.6
businesspeople 7.6
vintage 7.4
glasses 7.4
church 7.4
product 7.4
successful 7.3
make 7.3
stylish 7.2
home 7.2
cool 7.1
interior 7.1
job 7.1
modern 7

Microsoft
created on 2018-05-10

text 92.9
person 88.2
black 70.6
white 64
old 54.1

Face analysis

AWS Rekognition

Age 49-57
Gender Male, 100%
Confused 45.7%
Sad 28%
Calm 21.6%
Surprised 6.9%
Fear 6.8%
Disgusted 4.2%
Angry 3.9%
Happy 0.8%

Microsoft Cognitive Services

Age 70
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Adult 99.4%
Male 99.4%
Man 99.4%
Tie 71.5%

Text analysis

Amazon

TURKEY
SYRIA
UND
Black
you
Dollars