Human Generated Data

Title

Untitled (two young girls posing on front lawn with one girl in tree and the other with her hands over her eyes)

Date

1961

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9765

Machine Generated Data

Tags

Amazon
created on 2022-01-24

Person 98.8
Human 98.8
Shoe 93.5
Apparel 93.5
Footwear 93.5
Clothing 93.5
Grass 93.2
Plant 93.2
Yard 91.5
Nature 91.5
Outdoors 91.5
Person 87.6
People 86.9
Face 81.2
Shorts 75
Photography 66
Photo 66
Portrait 66
Female 60.8
Team Sport 59.9
Sport 59.9
Team 59.9
Sports 59.9
Shoe 59.8
Machine 59.6
Housing 59.2
Building 59.2
Child 57.1
Kid 57.1
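
The Amazon tags above follow the shape of AWS Rekognition DetectLabels output (label name plus confidence score). Below is a minimal sketch of how such a list can be produced with boto3; the image file, region, and thresholds are placeholders, not the settings used to generate this record.

```python
# Sketch: label detection with AWS Rekognition (boto3).
# File path, region, and thresholds are assumptions for illustration only.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # assumed region

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,         # assumed cap on returned labels
    MinConfidence=55.0,   # assumed threshold; the list above bottoms out near 57
)

for label in response["Labels"]:
    # Parent labels explain related pairs such as "Shoe" / "Footwear" above.
    parents = ", ".join(p["Name"] for p in label["Parents"])
    line = f'{label["Name"]} {label["Confidence"]:.1f}'
    if parents:
        line += f" (parents: {parents})"
    print(line)
```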

Imagga
created on 2022-01-24

musical instrument 35.5
harp 28.1
man 24.2
stringed instrument 19.1
wind instrument 18.4
device 17.4
people 16.7
accordion 16.6
instrument 16.4
black 15.6
male 15.6
adult 15.2
silhouette 14.9
person 14.8
keyboard instrument 14.3
weapon 13.3
sky 12.8
outdoors 12.7
sunset 11.7
musician 11.2
protection 10.9
danger 10.9
music 10.9
dark 10.9
holding 10.7
sport 10.7
water 10.7
guitar 10.6
light 10.5
musical 10.5
portrait 10.4
smoke 10.2
performer 10.1
happy 10
sax 10
destruction 9.8
swing 9.5
play 9.5
model 9.3
industrial 9.1
clothing 9.1
fashion 9
suit 9
bassoon 8.9
disaster 8.8
nuclear 8.7
outdoor 8.4
landscape 8.2
style 8.2
copy space 8.1
hair 7.9
radioactive 7.9
stage 7.8
radiation 7.8
standing 7.8
rock 7.8
protective 7.8
men 7.7
gas 7.7
industry 7.7
mask 7.7
mechanical device 7.5
one 7.5
bass 7.4
safety 7.4
playing 7.3
business 7.3
singer 7.3
bow 7.3
color 7.2
dirty 7.2
night 7.1
day 7.1
sword 7
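
The Imagga tags above are confidence-scored keywords of the kind returned by Imagga's v2 tagging endpoint. The sketch below shows one way to request such tags with the requests library; the credentials and image URL are placeholders, and the endpoint and response shape should be checked against Imagga's current documentation.

```python
# Sketch: image tagging via the Imagga v2 API.
# Credentials, image URL, and response handling are assumptions for illustration.
import requests

IMAGGA_KEY = "your_api_key"        # placeholder credential
IMAGGA_SECRET = "your_api_secret"  # placeholder credential

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # hypothetical image URL
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    # Each entry pairs a confidence score with a localized tag name,
    # matching the "tag score" lines listed above.
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```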

Google
created on 2022-01-24

Microsoft
created on 2022-01-24

Face analysis

Amazon

AWS Rekognition

Age 38-46
Gender Female, 89.5%
Calm 95.9%
Sad 1.2%
Happy 0.8%
Surprised 0.7%
Angry 0.7%
Disgusted 0.4%
Fear 0.2%
Confused 0.1%
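
The age range, gender, and emotion estimates above match the output of AWS Rekognition DetectFaces when full attributes are requested. A minimal sketch, again with a placeholder image source:

```python
# Sketch: face attribute estimation with AWS Rekognition DetectFaces.
# The image source is a placeholder; Attributes=["ALL"] is required to get
# AgeRange, Gender, and Emotions in the response.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        # Emotion types come back uppercase (e.g. "CALM"); capitalize for display.
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```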

Feature analysis

Amazon

Person 98.8%
Shoe 93.5%

Captions

Microsoft

a vase of flowers sitting on a swing 52.2%
a vase of flowers on a table 52.1%
a vase filled with flowers sitting on a swing 43.9%
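
The Microsoft captions above, each with a confidence score, resemble the output of the Azure Computer Vision "describe" operation. The sketch below shows one way to request several candidate captions over REST; the resource endpoint, key, image URL, and API version are placeholders to verify against current Azure documentation.

```python
# Sketch: image captioning via the Azure Computer Vision describe endpoint.
# Endpoint, key, image URL, and API version are assumptions for illustration.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"                                      # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    params={"maxCandidates": 3},  # request several candidate captions
    json={"url": "https://example.org/photo.jpg"},  # hypothetical image URL
    timeout=30,
)
resp.raise_for_status()

for caption in resp.json()["description"]["captions"]:
    # Confidence is reported on a 0-1 scale; scale to a percentage for display.
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')
```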

Text analysis

Amazon

RODVK
RODVK CoVEEIX--E1T
CoVEEIX--E1T
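
The detected strings above are raw OCR fragments of the kind returned by AWS Rekognition DetectText, which reports both whole lines and individual words. A minimal sketch with a placeholder image source:

```python
# Sketch: text detection with AWS Rekognition DetectText.
# The image source is a placeholder.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    # LINE entries group WORD entries, which is why the same fragment can
    # appear both on its own and inside a longer string above.
    print(f'{detection["Type"]}: {detection["DetectedText"]} '
          f'({detection["Confidence"]:.1f}%)')
```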