Human Generated Data

Title

Untitled (women holding baby seated on couch with younger woman and two boys)

Date

c. 1955

People

Artist: Paul Gittings, American, 1900–1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12546

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Human 97.9
Person 97.9
Person 97.7
Helmet 97.3
Clothing 97.3
Apparel 97.3
Shoe 96.3
Footwear 96.3
Person 94.7
Person 90.4
Helmet 89.1
People 86.9
Leisure Activities 86
Guitar 86
Musical Instrument 86
Shorts 75.5
Helmet 70.7
Musician 65.1
Shoe 55.9
Music Band 55.9

Imagga
created on 2022-01-29

accordion 100
keyboard instrument 100
musical instrument 100
wind instrument 96.5
man 20.8
people 20.6
male 19.1
adult 18.4
person 17.2
portrait 16.8
black 16.2
sexy 13.6
lifestyle 13
play 12.9
attractive 12.6
model 12.4
sport 12.4
happy 11.9
women 11.9
body 11.2
leisure 10.8
silhouette 10.8
fashion 10.5
fun 10.5
couple 10.5
sitting 10.3
summer 10.3
pretty 9.8
posing 9.8
outdoors 9.7
kin 9.5
happiness 9.4
art 9.2
face 9.2
joy 9.2
outdoor 9.2
music 9
lady 8.9
boy 8.7
smiling 8.7
men 8.6
youth 8.5
relaxation 8.4
dark 8.3
holding 8.3
human 8.2
park 8.2
girls 8.2
sensuality 8.2
dress 8.1
hair 7.9
standing 7.8
wall 7.7
culture 7.7
old 7.7
relax 7.6
energy 7.6
head 7.6
power 7.6
playing 7.3
business 7.3
child 7.3
sunset 7.2
smile 7.1
love 7.1
musician 7.1
sky 7

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

person 96.8
text 95.1
outdoor 87
black and white 67.5
clothing 65.1
man 60.9
footwear 59.6

Face analysis

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.9%
Helmet 97.3%
Shoe 96.3%
Guitar 86%

Captions

Microsoft

a group of people jumping in the air 77.3%
a group of people riding on top of a building 68.3%
a group of people riding on the back of a man 49.4%

Text analysis

Amazon

B B
a
ARDA
32
32 12 B B كَ a
Я Н
12
كَ