Human Generated Data

Title

Untitled (man with pipe)

Date

1968

People

Artist: Barbara Norfleet, American, born 1926

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1960

Copyright

© Barbara Norfleet

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Human 98
Person 97.8
Jacket 96.5
Clothing 96.5
Coat 96.5
Apparel 96.5
Smoke Pipe 91.4
Smoking 71.4
Smoke 71.4

Clarifai
created on 2023-10-25

portrait 99.6
people 99.6
one 98.4
man 97.7
adult 97.4
music 96.8
smoke 96.8
pipe 94.4
musician 94.2
street 92.4
smoker 90.1
monochrome 86.8
wear 84.2
cigar 81.9
festival 80.7
actor 80.4
facial expression 78.2
tobacco 76.4
concert 75.2
mustache 75

Imagga
created on 2022-01-08

portrait 31.7
world 31
person 29.9
people 27.9
adult 27.8
man 26.2
male 24
looking 20.8
attractive 20.3
face 19.2
wind instrument 17.6
happy 17.5
pretty 17.5
hair 16.6
fashion 16.6
handsome 16
smiling 15.9
couple 15.7
sexy 15.3
musical instrument 15.1
smile 15
cute 14.3
love 14.2
black 14.1
lady 13.8
eyes 12.9
women 12.7
head 12.6
lifestyle 12.3
outdoor 12.2
brunette 12.2
human 12
one 11.9
outdoors 11.9
model 11.7
child 11.4
hand 11.4
sitting 11.2
phone 11.1
casual 11
happiness 11
look 10.5
serious 10.5
boy 10.4
weapon 10.4
expression 10.2
youth 10.2
holding 9.9
romance 9.8
talk 9.6
glasses 9.3
bow and arrow 9
style 8.9
microphone 8.8
pipe 8.7
business 8.5
instrument 8.4
guy 8.4
summer 8.4
studio 8.4
teen 8.3
device 8.3
telephone 8.2
family 8
businessman 7.9
together 7.9
life 7.8
men 7.7
relationship 7.5
fun 7.5
teenager 7.3
blond 7.2

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

outdoor 98
music 94.6
human face 92.5
black and white 86.7
person 73.5

Color Analysis

Face analysis

AWS Rekognition

Age 34-42
Gender Male, 99.9%
Sad 48.5%
Calm 27.3%
Happy 10.9%
Fear 5.7%
Surprised 2.3%
Disgusted 2.1%
Angry 1.8%
Confused 1.5%

Microsoft Cognitive Services

Age 50
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.8%
Jacket 96.5%

Categories

Imagga

paintings art 75.8%
people portraits 22%

Captions