Human Generated Data

Title

Untitled (woman seated on wicker bench holding baby)

Date

c. 1940

People

Artist: Paul Gittings, American, 1900–1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12460

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Furniture 99.3
Human 98.7
Person 98.7
Leisure Activities 92.5
Clothing 89.6
Apparel 89.6
Home Decor 79.2
Female 77.9
Musical Instrument 76.1
Sitting 67.5
Musician 65.7
Dress 63.4
Guitar 62.7
Photography 62.2
Photo 62.2
Portrait 62.2
Face 62.2
Viola 61.8
Violin 61.8
Fiddle 61.8
Girl 61.6
Woman 58.9
Chair 55

Imagga
created on 2022-01-29

man 23.5
person 21.8
net 16.7
male 16.4
people 16.2
stringed instrument 14.4
adult 14.3
lifestyle 13.7
groom 13.7
portrait 12.9
black 12
technology 11.1
musical instrument 11.1
sexy 10.4
boy 10.4
art 10.4
body 10.4
hair 10.3
future 10.2
youth 10.2
happy 10
work 9.8
fun 9.7
device 9.7
player 9.6
sport 9.4
newspaper 9.3
modern 9.1
attractive 9.1
child 8.9
world 8.7
light 8.7
play 8.6
sitting 8.6
model 8.6
smile 8.5
business 8.5
outdoors 8.5
chair 8.4
active 8.4
fashion 8.3
safety 8.3
holding 8.3
human 8.2
dress 8.1
digital 8.1
suit 8.1
looking 8
working 7.9
women 7.9
love 7.9
ball 7.9
athlete 7.7
outside 7.7
power 7.6
joy 7.5
product 7.5
smoke 7.4
danger 7.3
futuristic 7.2
worker 7.1
face 7.1
summer 7.1
clothing 7

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 94.1
guitar 92.5
concert 90.3
person 84.1
musical instrument 80.9
black and white 70.1
clothing 66.2
music 59.2

Face analysis

Amazon

Google

AWS Rekognition

Age 21-29
Gender Female, 83.8%
Calm 84.7%
Sad 12.6%
Confused 1.5%
Disgusted 0.3%
Happy 0.3%
Surprised 0.3%
Angry 0.2%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.7%
Chair 55%

Text analysis

Amazon

2
4
5
G
a
ХН 4 2 G 5
ХН
a 205H It
It
205H
2VLE1X-KODVK

Google

26
XH
4
XH 4 26 5 ZUCH
5
ZUCH