Human Generated Data

Title

Untitled (two girls in matching dresses, holding hands, standing on patio in yard)

Date

c. 1940

People

Artist: Paul Gittings, American, 1900-1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12508

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Apparel 99.8
Clothing 99.8
Shorts 99.8
Human 99.1
Person 99.1
Path 97.5
Nature 93.3
Person 92.1
Plant 91.5
Vegetation 91.5
Outdoors 86.7
Dress 83.6
Tree 83.6
Footwear 80.3
Shoe 80.3
Female 76.9
Forest 75.4
Land 75.4
Woodland 75.4
Walkway 75.3
Ice 69.8
People 68.2
Photo 64.6
Photography 64.6
Portrait 64.1
Face 64.1
Urban 62.8
Sidewalk 59.6
Pavement 59.6
Building 58.5
City 58.5
Town 58.5
Street 58.5
Road 58.5
Brick 58.2
Woman 57.8
Shoe 57.5
Water 56
Sea 55.6
Ocean 55.6
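
These label/score pairs match the output shape of AWS Rekognition's DetectLabels operation, with scores given as percent confidences. A minimal sketch of how such tags can be produced, assuming boto3 is configured with valid AWS credentials; the filename is illustrative, not the museum's actual pipeline:

```python
import boto3

# A sketch, not the museum's actual pipeline. Assumes AWS credentials are
# configured in the environment; "photo.jpg" is an illustrative filename.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # the lowest score above is 55.6, so a cutoff near 55 fits
    )

# Each label pairs a name with a percent confidence, as in the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```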

Imagga
created on 2022-01-29

wheeled vehicle 20.6
skateboard 19
people 17.9
man 17.5
person 16.9
vehicle 16.6
board 14.8
wall 14.7
water 14.7
outdoors 14.5
grunge 14.5
sport 13.8
adult 13
travel 12.7
snow 12.4
outdoor 12.2
old 11.8
danger 10.9
dirty 10.8
male 10.7
conveyance 10.6
sand 10.6
rock 10.4
action 10.2
dark 10
exercise 10
cool 9.8
beach 9.6
sky 9.6
stone 9.3
art 9.2
summer 9
vacation 9
river 8.9
building 8.8
jump 8.6
black 8.5
relax 8.4
ocean 8.4
attractive 8.4
fashion 8.3
style 8.2
dress 8.1
fitness 8.1
active 8.1
recreation 8.1
weapon 7.8
sea 7.8
architecture 7.8
color 7.8
portrait 7.8
jumping 7.7
fun 7.5
silhouette 7.5
landscape 7.4
tourism 7.4
weather 7.4
device 7.3
teenager 7.3
mountain 7.3
sunset 7.2
coast 7.2
love 7.1
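
Imagga exposes its tagger as a REST endpoint. A sketch against the v2 tagging API, assuming a key/secret pair and a publicly reachable image URL (all three are placeholders):

```python
import requests

API_KEY = "YOUR_KEY"        # placeholder credentials
API_SECRET = "YOUR_SECRET"
IMAGE_URL = "https://example.com/photo.jpg"  # placeholder URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # Imagga uses HTTP basic auth
)

# Each entry pairs an English tag with a 0-100 confidence, as listed above.
for tag in response.json()["result"]["tags"]:
    print(tag["tag"]["en"], round(tag["confidence"], 1))
```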

Google
created on 2022-01-29

(no tags returned)

Microsoft
created on 2022-01-29

text 97.9
wall 96
black and white 94.9
monochrome 75
old 64.5
image 35.2
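
These tags match the shape of Azure Computer Vision's image-tagging operation, which returns 0-1 confidences (shown above scaled to percentages). A sketch using the azure-cognitiveservices-vision-computervision package; the endpoint, key, and filename are placeholders:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com/"  # placeholder
KEY = "YOUR_KEY"                                                 # placeholder

cv_client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

with open("photo.jpg", "rb") as f:  # filename is illustrative
    result = cv_client.tag_image_in_stream(f)

# Confidences come back in 0-1; the list above shows them as percentages.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```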

Face analysis

AWS Rekognition

Age 25-35
Gender Female, 76.7%
Happy 79%
Calm 9.4%
Sad 9.3%
Surprised 0.6%
Angry 0.5%
Disgusted 0.4%
Confused 0.4%
Fear 0.3%
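
The age range, gender confidence, and ranked emotion scores above match Rekognition's DetectFaces output when all facial attributes are requested. A minimal sketch, continuing the boto3 setup from the tagging example:

```python
# Continues the boto3 client from the DetectLabels sketch above;
# "photo.jpg" remains an illustrative filename.
with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # age range, gender, and emotions require ALL
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```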

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Unlikely
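
Unlike Rekognition, Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the values above are qualitative. A sketch using the google-cloud-vision client, assuming Application Default Credentials are configured:

```python
from google.cloud import vision

vision_client = vision.ImageAnnotatorClient()  # assumes Application Default Credentials

with open("photo.jpg", "rb") as f:  # filename is illustrative
    image = vision.Image(content=f.read())

response = vision_client.face_detection(image=image)

# Attributes come back as Likelihood enum buckets, not numeric scores.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```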

Feature analysis

Amazon

Person 99.1%
Shoe 80.3%

Captions

Microsoft

a vintage photo of a person 78.9%
a vintage photo of a bench 71.7%
a vintage photo of a person 71.6%
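
Ranked candidate captions like these match Azure Computer Vision's "describe image" operation, which can return several alternatives with confidences. A sketch reusing the cv_client from the tagging example above:

```python
# Reuses cv_client from the Azure tagging sketch; filename remains illustrative.
with open("photo.jpg", "rb") as f:
    result = cv_client.describe_image_in_stream(f, max_candidates=3)

# Confidences are 0-1; the list above shows them as percentages.
for caption in result.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```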

Text analysis

Amazon

3
4
ХН
ХН 4 3 8 4L
8
TH
YPEN TH
YPEN
4L
PIXII
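
These fragments are consistent with Rekognition's DetectText run against the negative's edge markings rather than scene text. A minimal sketch, continuing the boto3 setup above:

```python
# Continues the boto3 client from the earlier sketches;
# "photo.jpg" remains an illustrative filename.
with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# WORD detections are individual fragments; LINE detections group them,
# which accounts for combined entries like "XH 4 3 8 4L" above.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          f'{detection["Confidence"]:.1f}')
```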

Google

4
YT3RA2-NAMTZA XH 4 3 4
YT3RA2-NAMTZA
XH
3
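
Google Vision's text detection returns similar fragments. The string "YT3RA2-NAMTZA" appears to be the mirror image of "EASTMAN-SAFETY", a standard edge print on Kodak safety film, which would explain the otherwise garbled readings from both services. A sketch reusing the vision client and image from the face-detection example:

```python
# Reuses vision_client and image from the face-detection sketch above.
response = vision_client.text_detection(image=image)

# The first annotation is the full detected block; the rest are single tokens.
for annotation in response.text_annotations:
    print(annotation.description)
```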