Human Generated Data

Title

Untitled (people sitting in airport waiting area)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16059

Machine Generated Data

Tags

Amazon
created on 2022-03-25

Clothing 99.9
Apparel 99.9
Person 99.4
Human 99.4
Person 99.4
Shorts 98.5
Person 98.1
Person 97.8
Chair 96.5
Furniture 96.5
Female 95.2
Woman 85.4
Door 82.8
Sitting 76.2
Footwear 71.8
Shoe 71
Face 67.7
Dress 66.2
Girl 65.8
Photography 61.8
Photo 61.8
Portrait 61.8
Couch 61.7
Indoors 61.3
Monitor 59.2
Electronics 59.2
Display 59.2
Screen 59.2
Sleeve 56.8
Suit 56.2
Overcoat 56.2
Coat 56.2

Imagga
created on 2022-03-25

sexy 33.7
fashion 29.4
attractive 28.7
adult 26.8
hair 23.8
person 22.8
women 22.1
model 21.8
body 19.2
sensual 19.1
dress 19
pretty 18.2
gorgeous 18.1
portrait 18.1
lady 17
happy 16.9
clothing 16.8
people 16.7
lingerie 16.4
style 16.3
erotic 16.3
blond 16.2
sitting 15.5
posing 15.1
lifestyle 13.7
sensuality 13.6
skin 13.5
smile 13.5
man 13.4
happiness 13.3
interior 13.3
couple 13.1
one 12.7
smiling 12.3
legs 12.3
cute 12.2
love 11.8
pose 11.8
underwear 11.6
male 11.6
adults 11.4
looking 11.2
makeup 11
elegance 10.9
black 10.9
stylish 10.8
face 10.7
couch 10.6
indoors 10.5
together 10.5
casual 10.2
two 10.2
hot 10
sexual 9.6
brunette 9.6
home 9.6
elegant 9.4
bikini 8.8
glamorous 8.7
standing 8.7
glamor 8.6
luxury 8.6
fashionable 8.5
world 8.5
youth 8.5
passion 8.5
feminine 8.4
lips 8.3
mother 8.1
cheerful 8.1
covering 8.1
handsome 8
dancer 7.9
eyes 7.7
desire 7.7
bed 7.7
sofa 7.7
seductive 7.6
jeans 7.6
studio 7.6
chair 7.6
joy 7.5
human 7.5
leisure 7.5
vintage 7.4
slim 7.4
inside 7.4
room 7.3
performer 7.1
shop 7.1
lovely 7.1
look 7

Microsoft
created on 2022-03-25

person 99.8
text 97.2
sitting 97.1
clothing 96.8
smile 94.1
woman 92.4
dress 88.6
footwear 69.9
posing 56.2
human face 54.6
old 52.6

Face analysis

AWS Rekognition

Age 24-34
Gender Male, 73%
Calm 66.4%
Sad 27.1%
Surprised 2.5%
Confused 1.3%
Disgusted 0.9%
Angry 0.7%
Fear 0.7%
Happy 0.5%

AWS Rekognition

Age 42-50
Gender Female, 99.9%
Confused 63.1%
Disgusted 14.8%
Angry 8.7%
Sad 5.6%
Calm 3.9%
Happy 2.8%
Fear 0.7%
Surprised 0.4%

AWS Rekognition

Age 59-67
Gender Male, 82.1%
Calm 99.7%
Sad 0.1%
Confused 0.1%
Surprised 0%
Angry 0%
Fear 0%
Happy 0%
Disgusted 0%

Microsoft Cognitive Services

Age 45
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Shoe 71%

Captions

Microsoft

a vintage photo of a woman sitting on a bench posing for the camera 90.4%
a woman sitting on a bench posing for the camera 90.3%
a vintage photo of a woman sitting on a bench 89.8%

Text analysis

Amazon

KODAK
7E
KODAK SAFETY FILM
SAFETY
FILM
ILM

Google

S'AFETY
KODAK
FILM
ILM
7E ILM KODAK S'AFETY FILM
7E