Human Generated Data

Title

Untitled (close-up image of the hands of a bride and groom)

Date

1948-1950

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9292

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Clothing 100
Apparel 100
Gown 98.1
Evening Dress 98.1
Robe 98.1
Fashion 98.1
Female 95.3
Human 95.3
Woman 88.7
Person 86.1
Long Sleeve 65.9
Sleeve 65.9
Wedding Gown 60.3
Wedding 60.3

Imagga
created on 2022-01-23

clothing 21.8
people 15.1
home 14.4
adult 14.2
person 13.7
hand 13.7
body 12.8
laptop 12.4
black 12.2
covering 11.9
garment 11.8
bed 11.4
sexy 11.2
interior 10.6
man 10.1
male 9.9
satin 9.8
device 9.7
room 9.6
love 9.5
bedroom 9.4
house 9.2
brassiere 9.2
modern 9.1
pretty 9.1
fashion 9
human 9
consumer goods 9
lady 8.9
support 8.9
bag 8.8
fabric 8.8
happy 8.8
lifestyle 8.7
computer 8.7
care 8.2
medical 7.9
silk 7.9
hands 7.8
gift 7.7
monitor 7.5
lying 7.5
woman's clothing 7.3
undergarment 7.3
indoor 7.3
business 7.3
sensual 7.3
dress 7.2
celebration 7.2
hair 7.1
smile 7.1
women 7.1
portrait 7.1
working 7.1
indoors 7

Google
created on 2022-01-23

Black 89.7
Sleeve 87.2
Black-and-white 86.3
Organism 86.2
Gesture 85.3
Style 84
Line 81.8
Font 78.9
Tints and shades 76.5
Monochrome photography 76.3
Art 75.6
Beauty 75.4
Snapshot 74.3
Monochrome 74.2
Plant 73.6
Pattern 68.9
Design 68.3
Petal 64.3
Visual arts 64
Stock photography 63.4

Microsoft
created on 2022-01-23

Feature analysis

Amazon

Person 86.1%

Captions

Microsoft

a person posing for the camera 78.8%
an old photo of a person 60.3%
a person posing for a photo 55.8%

Text analysis

Amazon

real