Human Generated Data

Title

Untitled (two men and a woman)

Date

c. 1940

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1309

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 99.5
Person 99.3
People 98.9
Family 98.9
Person 98.2
Person 97.1
Accessories 79.3
Accessory 79.3
Glasses 79.3
Face 74
Glasses 62.3
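
The Amazon tags above pair each detected label with a confidence score. A minimal sketch of how such label/confidence pairs can be produced with AWS Rekognition via boto3 follows; the filename and credential setup are placeholders and are not part of this record.

    import boto3

    rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

    # Hypothetical local copy of the photograph; the record itself does not name a file.
    with open("porter_studio_photo.jpg", "rb") as f:
        response = rekognition.detect_labels(Image={"Bytes": f.read()}, MinConfidence=60)

    # Prints pairs such as "Person 99.3" or "Glasses 79.3"
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')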

Imagga
created on 2022-01-23

kin 46.5
brother 45.6
male 36.1
portrait 33
man 32.2
black 30.6
adult 28.4
sibling 28.4
people 25.1
love 24.4
couple 24.4
family 22.2
person 20.6
child 19.8
studio 19.7
father 19.7
happy 19.4
attractive 18.2
buddy 17.8
dark 17.5
together 17.5
smiling 17.4
smile 17.1
happiness 16.4
face 16.3
relationship 15.9
eyes 15.5
mother 15.4
boy 14.8
looking 14.4
model 14
lifestyle 13.7
fashion 13.6
one 13.4
handsome 13.4
sexy 12.8
human 12.7
casual 12.7
two 12.7
brunette 12.2
cute 12.2
group 12.1
youth 11.9
hair 11.9
kid 11.5
parent 11.1
women 11.1
girls 10.9
sitting 10.3
men 10.3
expression 10.2
son 10.1
children 10
romance 9.8
pretty 9.8
cheerful 9.7
husband 9.7
look 9.6
wife 9.5
skin 9.3
teenager 9.1
holding 9.1
daughter 9
married 8.6
serious 8.6
togetherness 8.5
hand 8.4
alone 8.2
dad 8.2
guy 8.1
romantic 8
posing 8
body 8
masculine 7.8
loving 7.6
friends 7.5
style 7.4
adorable 7.4
emotion 7.4
sensuality 7.3
pose 7.2
world 7.2
cool 7.1
modern 7
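
The Imagga tags follow the same tag/confidence pattern. A minimal sketch of a request against Imagga's public v2 tagging endpoint is shown below; the endpoint, credentials, response shape, and image URL are assumptions drawn from Imagga's API documentation rather than from this record.

    import requests

    API_KEY = "your_api_key"        # placeholder credentials
    API_SECRET = "your_api_secret"

    # Hypothetical image URL; the v2 tags endpoint accepts image_url with basic auth.
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/porter_studio_photo.jpg"},
        auth=(API_KEY, API_SECRET),
    )

    # Prints pairs such as "kin 46.5" or "brother 45.6"
    for tag in response.json()["result"]["tags"]:
        print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')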

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.8
human face 99.6
smile 98.3
clothing 96.7
person 96.1
man 90.6
posing 86.7
black 83.3
old 66.4

Face analysis

AWS Rekognition

Age 51-59
Gender Male, 100%
Calm 50.7%
Happy 22.6%
Disgusted 7%
Sad 5.9%
Confused 5.6%
Angry 3.6%
Fear 2.5%
Surprised 2%

AWS Rekognition

Age 27-37
Gender Male, 100%
Happy 99.4%
Calm 0.2%
Confused 0.1%
Angry 0.1%
Sad 0.1%
Fear 0%
Disgusted 0%
Surprised 0%

AWS Rekognition

Age 43-51
Gender Female, 94.5%
Calm 86.6%
Disgusted 5.7%
Happy 2.6%
Confused 1.9%
Surprised 1%
Sad 0.9%
Fear 0.7%
Angry 0.5%
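
Each AWS Rekognition block above corresponds to one detected face, with an estimated age range, a gender call, and a full emotion distribution. A minimal boto3 sketch of how that per-face data can be retrieved follows; the filename is a placeholder.

    import boto3

    rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

    with open("porter_studio_photo.jpg", "rb") as f:  # hypothetical local filename
        response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        # Emotions come back as a confidence per label; list the strongest first
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')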

Microsoft Cognitive Services

Age 47
Gender Male

Microsoft Cognitive Services

Age 39
Gender Male

Microsoft Cognitive Services

Age 50
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very likely
Blurred Very unlikely
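
The Google Vision blocks report likelihood buckets (Very unlikely through Very likely) rather than numeric scores. A minimal sketch, assuming the google-cloud-vision Python client (v2 or later) with configured GCP credentials, shows where those values come from; the filename is a placeholder.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("porter_studio_photo.jpg", "rb") as f:  # hypothetical local filename
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY, POSSIBLE, VERY_LIKELY
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)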

Feature analysis

Amazon

Person 99.3%
Glasses 79.3%

Captions

Microsoft

a group of people posing for a photo 96.5%
a vintage photo of a group of people posing for the camera 94.5%
a vintage photo of a group of people posing for a picture 94.4%
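
The Microsoft captions above come with per-candidate confidences. A minimal sketch, assuming the Azure Computer Vision Python SDK (azure-cognitiveservices-vision-computervision) with a placeholder endpoint, key, and image URL, shows how ranked caption candidates can be requested.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
        CognitiveServicesCredentials("<your-key>"),              # placeholder key
    )

    result = client.describe_image(
        "https://example.org/porter_studio_photo.jpg",  # hypothetical image URL
        max_candidates=3,
    )

    # Prints captions such as "a group of people posing for a photo 96.5%"
    for caption in result.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")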

Text analysis

Amazon

PORTER
S

Google

PORTER
PORTER
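
Both services detect the word PORTER, the studio's stamp, in the image. A minimal boto3 sketch of the AWS Rekognition text-detection call follows; the filename is a placeholder.

    import boto3

    rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

    with open("porter_studio_photo.jpg", "rb") as f:  # hypothetical local filename
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # Word-level detections, e.g. "PORTER" and "S"
    for detection in response["TextDetections"]:
        if detection["Type"] == "WORD":
            print(detection["DetectedText"])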