Human Generated Data

Title

Untitled (woman standing on brick patio under trees)

Date

1933

People

Artist: Curtis Studio, American, active 1891-1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13009

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Apparel 98.9
Clothing 98.9
Person 98.4
Human 98.4
Dress 97
Female 93.1
Woman 84
Outdoors 83
Tree 82.3
Plant 82.3
Nature 80.9
Gown 77
Evening Dress 77
Fashion 77
Robe 77
Standing 58.5
Face 58.2
Furniture 58.1
Home Decor 57

Imagga
created on 2022-02-05

grunge 43.4
fountain 41.7
old 36.2
structure 34
wall 28.4
vintage 28.1
texture 27.1
antique 26.8
aged 24.4
grungy 23.7
ancient 23.4
dirty 22.6
retro 21.3
rough 21
textured 19.3
pattern 18.5
paint 17.2
art 16.9
architecture 16.5
stucco 16.5
building 16.3
weathered 16.2
material 15.2
detail 14.5
border 14.5
paper 14.2
frame 14.2
space 14
blank 13.7
worn 13.4
damaged 13.4
text 13.1
brown 12.5
surface 12.4
canvas 12.3
black 12.1
history 11.6
close 10.8
aging 10.5
dirt 10.5
man 10.1
people 10
stone 10
rust 9.6
stained 9.6
parchment 9.6
design 9.6
rusty 9.5
screen 9.4
wallpaper 9.2
travel 9.2
tourism 9.1
landmark 9
scratch 8.8
torn 8.7
messy 8.7
artistic 8.7
stain 8.7
tree 8.5
dark 8.4
color 8.3
city 8.3
silhouette 8.3
backdrop 8.2
person 8.2
cell 8.2
groom 7.9
child 7.8
burned 7.8
burnt 7.8
concrete 7.7
old fashioned 7.6
decorative 7.5
park 7.4
exterior 7.4
world 7.2

Google
created on 2022-02-05

Photograph 94.1
Water 90.1
Black 89.8
Rectangle 86.8
Black-and-white 85.3
Style 83.8
Line 81.8
Adaptation 79.3
Tints and shades 77.4
Monochrome photography 76.7
Art 75.1
Snapshot 74.3
Monochrome 73
Visual arts 68.3
Room 67.3
Plant 63.9
Tree 63.7
Stock photography 63.5
History 63.1
Wood 57.3

Microsoft
created on 2022-02-05

text 96
black and white 93.4
statue 86.1
clothing 78.6
monochrome 74.9
person 74.6

Face analysis

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.4%

Captions

Microsoft

a person that is standing in the rain 43.6%
a person standing in front of water 42.5%
a person standing in front of a wall 42.4%

Text analysis

Amazon

oces

Google

AOON-YT3RA2-AMT2