Human Generated Data

Title

Untitled (two nuns wearing habits)

Date

c. 1950

People

Artist: John Howell, American, active 1930s-1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21737

Machine Generated Data

Tags

Each tag below is followed by the service's confidence score (%).

Amazon
created on 2022-03-11

Clothing 99.8
Apparel 99.8
Person 98.2
Human 98.2
Person 97.6
Head 94.9
Face 92.4
Hood 91.4
Jaw 86
Helmet 79.1
Portrait 66.5
Photography 66.5
Photo 66.5
Sweatshirt 58.6
Sweater 58.6
Veil 58.2
Hat 57.1
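
A minimal sketch, assuming a local copy of the photograph and configured AWS credentials, of how label tags like those above can be generated with AWS Rekognition's DetectLabels API; the file name and MinConfidence threshold are placeholders, not values from this record:

    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical local copy of the photograph; not a path from this record.
    with open("untitled_two_nuns.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=50,  # the tags above bottom out near 57%
        )

    # Print each label with its confidence, matching the list format above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")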

Clarifai
created on 2023-10-22

portrait 99.7
monochrome 99.5
girl 98.7
art 98.5
model 98.4
people 98.1
man 97.7
black and white 97.2
studio 96.9
face 96.9
lid 95.6
smile 94.5
beautiful 94
cap 93.7
wedding 93.5
self 92.5
sepia 92.4
eye 92.1
veil 91.6
shadow 91.2
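
A minimal sketch of Clarifai's v2 REST API using the public general-image-recognition model; the personal access token and image URL are placeholders, not values from this record:

    import requests

    PAT = "YOUR_CLARIFAI_PAT"  # hypothetical credential
    resp = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers={"Authorization": f"Key {PAT}"},
        json={
            "user_app_id": {"user_id": "clarifai", "app_id": "main"},
            "inputs": [
                # Hypothetical image URL standing in for this photograph.
                {"data": {"image": {"url": "https://example.org/photo.jpg"}}}
            ],
        },
    )

    # Clarifai returns concept values in 0-1; scale to match the list above.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")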

Imagga
created on 2022-03-11

man 30.9
attendant 30.4
covering 26
mask 25.8
face 24.1
bow tie 22.2
portrait 22
adult 22
clothing 21.3
male 21.3
hat 20.3
person 20.3
people 20.1
attractive 19.6
disguise 18.7
necktie 18.1
car 17.5
seat 16.7
attire 16.6
plane seat 16.1
smile 15.7
black 15.1
modern 14.7
device 14.3
model 14
sexy 13.6
expression 13.6
support 13.6
fashion 13.5
pretty 13.3
smiling 12.3
suit 12.1
happy 11.9
hair 11.9
style 11.9
head 11.7
guy 11.5
business 11.5
women 11.1
lady 10.5
looking 10.4
garment 10.3
men 10.3
love 10.2
work 10.2
transportation 9.9
driver 9.7
hands 9.5
sitting 9.4
headdress 9.3
professional 9.3
transport 9.1
studio 9.1
consumer goods 9
human 9
businessman 8.8
couple 8.7
cowboy hat 8.7
cute 8.6
drive 8.5
wheel 8.5
inside 8.3
20s 8.2
make 8.2
blond 8.1
vehicle 8
posing 8
interior 8
look 7.9
brunette 7.8
eyes 7.7
automobile 7.7
auto 7.6
two 7.6
hand 7.6
one 7.5
holding 7.4
closeup 7.4
office 7.4
businesswoman 7.3
passenger 7.1
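
A minimal sketch of Imagga's v2 tagging endpoint, which uses HTTP Basic auth; the key, secret, and image URL are placeholders:

    import requests

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/photo.jpg"},  # hypothetical URL
        auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),  # hypothetical credentials
    )

    # Imagga reports confidence directly as a percentage.
    for tag in resp.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")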

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

wall 98.9
person 98.8
human face 98.2
man 95.5
text 90.1
black and white 84
fashion accessory 82.7
standing 76.4
posing 75.7
white 72.9
clothing 54.4
hat 51.6
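
A minimal sketch of the Azure Computer Vision v3.2 analyze call behind tags like those above; the resource endpoint, key, and image URL are placeholders:

    import requests

    endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # hypothetical
    resp = requests.post(
        f"{endpoint}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},  # hypothetical credential
        json={"url": "https://example.org/photo.jpg"},  # hypothetical image URL
    )

    # Azure returns confidence in 0-1; scale to match the list above.
    for tag in resp.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")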

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 16-24
Gender Female, 98.6%
Calm 94.3%
Sad 3%
Surprised 0.8%
Happy 0.7%
Fear 0.4%
Angry 0.4%
Disgusted 0.2%
Confused 0.2%

AWS Rekognition

Age 19-27
Gender Female, 51.8%
Surprised 53.7%
Calm 41.8%
Disgusted 1.2%
Fear 1.1%
Happy 1%
Sad 0.4%
Angry 0.3%
Confused 0.3%
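
A minimal sketch of AWS Rekognition's DetectFaces API, which with Attributes=["ALL"] returns the age range, gender, and emotion scores shown above; the file name is a placeholder:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("untitled_two_nuns.jpg", "rb") as f:  # hypothetical local copy
        response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    # One FaceDetails entry per detected face, mirroring the two blocks above.
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")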

Feature analysis

Amazon

Person 98.2%
Person 97.6%
Helmet 79.1%

Categories

Imagga

interior objects 99.9%

Text analysis

Amazon

ГАД
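
A minimal sketch of AWS Rekognition's DetectText API, the service behind the detected string above; it returns both LINE and WORD level detections, and the file name is a placeholder:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("untitled_two_nuns.jpg", "rb") as f:  # hypothetical local copy
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # Keep line-level detections only; word-level entries repeat the same text.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])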

Google

VT37A°2-XAGO
VT37A°2-XAGO