Human Generated Data

Title

Untitled (man in doorway)

Date

1968

People

Artist: Barbara Norfleet, American, born 1926

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1945

Copyright

© Barbara Norfleet

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.4
Human 99.4
Face 99.1
Beard 74.3
Leisure Activities 72.8
Man 70.5
Musician 70.2
Musical Instrument 70.2
Portrait 68.6
Photography 68.6
Photo 68.6
Performer 58.9
Door 55.6
Skin 55.6
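
The labels above pair a tag name with a confidence score from 0 to 100. A minimal sketch of how comparable labels could be regenerated with the AWS Rekognition DetectLabels API, assuming configured AWS credentials and a hypothetical local filename for the photograph:

import boto3

def detect_labels(path="untitled_man_in_doorway.jpg", min_confidence=55.0):
    # Print Rekognition labels for a local image in the same name/score style as above.
    client = boto3.client("rekognition")
    with open(path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=min_confidence,  # drop labels scored below this threshold
        )
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")

if __name__ == "__main__":
    detect_labels()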

Clarifai
created on 2023-10-25

people 99.9
portrait 99.8
one 99.7
adult 99.1
man 98.7
facial hair 93.1
wear 88.4
administration 85.8
street 85.4
actor 83.8
art 82.3
leader 81.9
offense 81
music 80
veil 77.4
monochrome 76.5
military 75.4
musician 72.5
war 72.2
two 71.8

Imagga
created on 2022-01-08

man 49
male 42.7
person 29.6
portrait 27.8
elevator 27.1
lifting device 21.7
adult 21.3
face 21.3
people 20.1
black 20
device 19.5
handsome 18.7
men 18
happy 14.4
attractive 14
smiling 13.7
guy 13
expression 12.8
serious 12.4
juvenile 11.2
beard 11.1
smile 10.7
boy 10.6
one 10.4
model 10.1
color 9.5
work 9.4
businessman 8.8
looking 8.8
home 8.8
child 8.7
muscular 8.6
sitting 8.6
casual 8.5
emotion 8.3
inside 8.3
confident 8.2
sexy 8
hair 7.9
look 7.9
monk 7.7
head 7.6
human 7.5
suit 7.5
mature 7.4
happiness 7
indoors 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

person 99.4
human face 98.9
wall 98.8
clothing 96
man 96
black and white 94.1
indoor 91.1
human beard 84.6
portrait 77.8
monochrome 76.7
text 62.7
old 44.8

Color Analysis

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 100%
Fear 64%
Sad 30.6%
Calm 2.1%
Confused 1.2%
Disgusted 0.8%
Angry 0.7%
Happy 0.5%
Surprised 0.2%
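
The age range, gender, and emotion percentages above follow the shape of Rekognition's face-detection output. A minimal sketch, again assuming configured AWS credentials and a hypothetical local filename:

import boto3

def detect_face_attributes(path="untitled_man_in_doorway.jpg"):
    # Request the full attribute set so age range, gender, and emotions are returned.
    client = boto3.client("rekognition")
    with open(path, "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],
        )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        gender = face["Gender"]
        print(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")

if __name__ == "__main__":
    detect_face_attributes()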

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
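
Google Vision reports face attributes as likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal sketch using the google-cloud-vision client, assuming credentials are configured and the same hypothetical filename:

from google.cloud import vision

def detect_face_likelihoods(path="untitled_man_in_doorway.jpg"):
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Likelihood enum names correspond to the "Very unlikely" ... "Very likely" labels above.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)

if __name__ == "__main__":
    detect_face_likelihoods()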

Feature analysis

Amazon

Person 99.4%

Categories

Imagga

pets animals 98.4%