Human Generated Data

Title

Untitled (girl with injuries to face)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17890

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Clothing 98.3
Apparel 98.3
Person 98.2
Human 98.2
Face 83.7
Door 71.8
Head 68.2
Portrait 66.1
Photography 66.1
Photo 66.1
Glasses 62
Accessories 62
Accessory 62
Finger 61.4
Curtain 56
Window 56

Clarifai
created on 2023-10-29

monochrome 99.8
portrait 99.6
people 99.4
studio 94.4
black and white 93.6
adult 93.5
mask 92.7
art 92.2
costume 91.6
man 90.9
model 90.6
music 89.9
fashion 89.5
girl 89.3
Halloween 89.2
face 89.1
vintage 88.4
one 87.7
woman 85.8
retro 85.4

Imagga
created on 2022-02-26

mask 84
negative 68.6
film 53.3
covering 52.5
disguise 46.5
photographic paper 41.2
attire 31.2
man 28.2
photographic equipment 27.5
portrait 24.6
face 23.4
black 22.9
person 22.8
clothing 22.8
male 20.6
danger 18.2
military 15.4
people 15.1
soldier 14.7
eyes 13.8
adult 13.6
lady 13
costume 12.7
war 12.5
weapon 11.9
protection 11.8
toxic 11.7
chemical 11.6
gun 11.6
gas 11.6
human 11.2
art 11.1
safety 11
dark 10.9
fantasy 10.8
horror 10.7
fashion 10.6
sexy 10.4
model 10.1
consumer goods 10.1
head 10.1
vintage 9.9
studio 9.9
radiation 9.8
army 9.7
one 9.7
pollution 9.6
hair 9.5
makeup 9.2
attractive 9.1
old 9.1
dress 9
disaster 8.8
warrior 8.8
protective 8.8
carnival 8.8
death 8.7
uniform 8.7
mystery 8.6
culture 8.5
make 8.2
respirator 7.9
warfare 7.9
camouflage 7.8
skull 7.8
evil 7.8
chemistry 7.7
fear 7.7
hairstyle 7.6
hand 7.6
traditional 7.5
equipment 7.4
looking 7.2
work 7.1

Microsoft
created on 2022-02-26

text 98.8
wall 98.6
human face 98.2
person 97.1
clothing 87.1
black and white 72.1
woman 68.2
drawing 51.3

Face analysis

AWS Rekognition

Age 37-45
Gender Female, 59.3%
Calm 72.9%
Happy 13.2%
Surprised 10.3%
Sad 1.4%
Confused 0.9%
Angry 0.6%
Disgusted 0.4%
Fear 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 98.2%

Text analysis

Amazon

YT3RAS
YT3RAS NACO>
NACO>

Google

NACON
NACON