Human Generated Data

Title

Untitled (two men and a woman)

Date

c. 1940

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1311

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 98.9
Person 98.4
Person 98.4
Person 98.2
People 97.8
Family 96.6
Face 65.9
Clothing 61.5
Apparel 61.5
Performer 57.1
Glasses 55.8
Accessories 55.8
Accessory 55.8
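
A minimal sketch of how label tags like those above can be requested from Amazon Rekognition with boto3. The image path ("photo.jpg") is hypothetical, AWS credentials are assumed to be configured, and the exact scores depend on the model version in use.

import boto3

# Sketch only: request image labels from AWS Rekognition.
rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=55,  # the listing above bottoms out around 55%
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")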

Clarifai
created on 2023-10-26

portrait 99.7
people 99.4
man 98.7
adult 96.4
group 94.8
three 94.6
two 94.4
wear 92.2
retro 90.4
group together 90
music 89.6
administration 89.5
four 86.2
musician 85.9
leader 85.9
actor 85.2
facial expression 80.4
outfit 79.4
five 78.2
movie 75.7

Imagga
created on 2022-01-23

kin 74.7
man 40.3
male 35.6
portrait 31
sibling 30.5
black 27
face 21.3
love 21.3
people 21.2
adult 20.7
couple 20
family 19.6
eyes 18.9
person 17.5
happy 17.5
dark 17.5
attractive 17.5
brother 16.6
expression 16.2
smile 15.7
together 14.9
grandma 14.5
studio 13.7
human 13.5
hair 13.5
husband 13.3
mother 13.3
smiling 13
world 12.8
father 12.7
one 12.7
model 12.4
relationship 12.2
two 11.9
serious 11.4
wife 11.4
child 11.3
pretty 11.2
happiness 11
guy 10.8
masculine 10.7
handsome 10.7
grandfather 10.6
look 10.5
lips 10.2
skin 10.2
looking 9.6
married 9.6
senior 9.4
head 9.2
close 9.1
hand 9.1
romance 8.9
sexy 8.8
parent 8.8
lifestyle 8.7
men 8.6
youth 8.5
lady 8.1
suit 8.1
light 8
daughter 8
kid 8
women 7.9
cute 7.9
hands 7.8
culture 7.7
nose 7.6
casual 7.6
mouth 7.5
bow tie 7.4
emotion 7.4
body 7.2
romantic 7.1
clothing 7.1

Google
created on 2022-01-23

Forehead 98.5
Outerwear 95.4
Coat 91
Black 89.5
Collar 81
People 79.3
Vintage clothing 74.9
Snapshot 74.3
Tints and shades 73.5
Formal wear 70.1
Stock photography 66.2
History 65.2
Suit 64.6
Event 62.5
Classic 62.4
Font 60.4
Uniform 60.1
Crew 59.5
Monochrome 57
Retro style 55.2
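
The Google labels above resemble output from the Cloud Vision label detection endpoint. A minimal sketch, assuming the google-cloud-vision client library is installed, application credentials are configured, and "photo.jpg" is a hypothetical local copy of the image.

from google.cloud import vision

# Sketch only: request image labels from Google Cloud Vision.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

for label in response.label_annotations:
    # Cloud Vision reports scores in [0, 1]; scale to match the listing above.
    print(label.description, round(label.score * 100, 1))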

Microsoft
created on 2022-01-23

text 99.4
human face 99.3
person 97.3
clothing 94.5
man 92.9
black 92.5
smile 89.1
standing 77.8
posing 70.1
white 64.9
portrait 57.9
glasses 56.7

Face analysis

AWS Rekognition

Age 50-58
Gender Male, 100%
Calm 44.8%
Disgusted 25.4%
Confused 14.8%
Angry 5.6%
Sad 3.8%
Fear 2%
Surprised 1.8%
Happy 1.7%
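
The age range, gender, and emotion scores above match the shape of Rekognition's face analysis output. A minimal sketch, under the same assumptions as the earlier Rekognition example (hypothetical "photo.jpg", credentials configured).

import boto3

# Sketch only: face attribute analysis with AWS Rekognition.
rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include age range, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")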

Feature analysis

Amazon

Person 98.4%
Glasses

Text analysis

Amazon

PORTER
S

Google

PORTER
PORTER
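
The detected words ("PORTER", "S") are consistent with optical text detection from both services. A minimal sketch of the Amazon side with Rekognition, under the same assumptions as the earlier examples.

import boto3

# Sketch only: detect text (e.g. the studio mark) with AWS Rekognition.
rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"], round(detection["Confidence"], 1))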