Human Generated Data

Title

Untitled (three older women in dresses with corsages with drinks, cigarettes and

Date

c. 1955

People

Artist: Paul Gittings, American, 1900-1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12593

Machine Generated Data

Tags (tag name, confidence 0-100)

Amazon
created on 2022-02-04

Clothing 99.8
Apparel 99.8
Human 98.5
Person 98.5
Person 98
Person 97.3
Couch 86.7
Furniture 86.7
Gown 75.3
Robe 75.3
Evening Dress 75.3
Fashion 75.3
Female 75
Room 74.3
Living Room 74.3
Indoors 74.3
Hat 70.2
Leisure Activities 65.9
Musical Instrument 65.9
Piano 65.9
Woman 62.5
Costume 58
Face 57
Dress 56.2
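
These labels follow the output shape of Amazon Rekognition's label-detection call. A minimal sketch of how such a list can be produced, assuming boto3 is configured with credentials; the image file name is hypothetical:

```python
import boto3

# Rekognition returns (Name, Confidence) pairs, with confidence
# on a 0-100 scale, matching the list above.
client = boto3.client("rekognition")

with open("gittings_untitled.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,      # cap the number of labels returned
    MinConfidence=50,  # drop low-confidence labels
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```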

Imagga
created on 2022-02-04

man 28.9
person 28.1
people 27.3
male 25.5
table 21.6
couple 20
home 19.9
adult 19.8
sitting 19.7
teacher 17.7
lifestyle 17.3
musical instrument 17.1
room 16.6
women 16.6
indoor 16.4
chair 16.3
smiling 15.9
happiness 15.7
men 15.5
interior 15
happy 14.4
kin 13.9
two 13.5
business 13.4
family 13.3
businessman 13.2
indoors 13.2
cheerful 13
dress 12.6
groom 12.5
smile 12.1
love 11.8
bride 11.6
portrait 11
holding 10.7
fashion 10.6
scholar 10.5
work 10.3
mature 10.2
wedding 10.1
mother 10
professional 10
office 9.9
hand 9.9
group 9.7
together 9.6
bouquet 9.4
clothing 9.4
worker 9.3
laptop 9.1
pretty 9.1
student 9
educator 9
color 8.9
husband 8.7
couch 8.7
boy 8.7
education 8.7
classroom 8.6
enjoying 8.5
senior 8.4
modern 8.4
old 8.4
intellectual 8.4
house 8.4
technology 8.2
computer 8.1
looking 8
to 8
job 8
day 7.8
full length 7.8
class 7.7
married 7.7
drinking 7.7
stringed instrument 7.6
elegance 7.6
meeting 7.5
enjoyment 7.5
keyboard instrument 7.5
outdoors 7.5
phone 7.4
wind instrument 7.3
lady 7.3
new 7.3
celebration 7.2
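
Imagga's scores come from its v2 tagging endpoint, which also reports confidence on a 0-100 scale. A sketch using the requests library, assuming the current v2 REST interface with HTTP basic auth; the credentials and image URL are placeholders:

```python
import requests

API_KEY = "YOUR_IMAGGA_KEY"        # placeholder credentials
API_SECRET = "YOUR_IMAGGA_SECRET"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/gittings_untitled.jpg"},  # hypothetical URL
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each entry carries the tag text under tag.en and a confidence score.
for tag in response.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```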

Google
created on 2022-02-04

Microsoft
created on 2022-02-04

text 97.3
wall 95.4
dress 89.5
clothing 88.5
person 84.6
woman 79.8
furniture 65.6
black and white 65.5
old 43.8
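
The Microsoft tags match Azure Computer Vision's Tag Image operation, which reports confidences in [0, 1]. A sketch against the v3.2 REST endpoint; the endpoint, key, and image URL are placeholders:

```python
import requests

ENDPOINT = "https://<resource-name>.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                            # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/gittings_untitled.jpg"},  # hypothetical URL
)
response.raise_for_status()

# Azure scores are 0-1; scale by 100 to match the list above.
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```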

Face analysis

AWS Rekognition

Age 42-50
Gender Male, 99.8%
Calm 87.9%
Happy 8.4%
Sad 2.4%
Disgusted 0.4%
Confused 0.4%
Surprised 0.3%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 50-58
Gender Male, 100%
Calm 71.1%
Confused 13.5%
Happy 6.4%
Surprised 2.7%
Sad 2.2%
Disgusted 1.6%
Angry 1.5%
Fear 0.9%
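
Both age/gender/emotion blocks have the shape of Rekognition's face-detection output when all facial attributes are requested. A minimal sketch, reusing the hypothetical image file from the label example:

```python
import boto3

client = boto3.client("rekognition")
with open("gittings_untitled.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

# Attributes=["ALL"] adds the age range, gender estimate, and
# emotion distribution shown above to each detected face.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```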

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely
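
Google Vision reports face attributes as likelihood buckets rather than percentages, which is why these rows read "Very unlikely" or "Possible". A sketch with the google-cloud-vision client, one result block per detected face; the file name is hypothetical:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("gittings_untitled.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum: VERY_UNLIKELY .. VERY_LIKELY.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```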

Feature analysis

Amazon

Person 98.5%
Piano 65.9%

Captions

Microsoft

a vintage photo of a person 90%
a vintage photo of a man and a woman looking at the camera 65.4%
a vintage photo of a person 65.3%
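
Captions like these correspond to Azure Computer Vision's Describe Image operation, which returns ranked caption candidates with confidences in [0, 1]. A sketch against the v3.2 REST endpoint, again with placeholder endpoint, key, and URL:

```python
import requests

ENDPOINT = "https://<resource-name>.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                            # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": 3},  # ask for several ranked captions
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/gittings_untitled.jpg"},  # hypothetical URL
)
response.raise_for_status()

for caption in response.json()["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')
```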

Text analysis

Amazon

2
4.
RAS
RAS DOPUA
DOPUA
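
Fragments like "RAS DOPUA" are raw OCR hits, consistent with Rekognition's text-detection output on stylized or degraded print in a scanned photograph. A minimal sketch, with the same hypothetical local file:

```python
import boto3

client = boto3.client("rekognition")
with open("gittings_untitled.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# Detections come back as LINE and WORD entries; printing both
# reproduces overlapping fragments like "RAS" and "RAS DOPUA".
for detection in response["TextDetections"]:
    print(detection["DetectedText"], detection["Type"], f'{detection["Confidence"]:.1f}')
```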