Human Generated Data

Title

Untitled (people wearing long capes in procession, seen from above)

Date

c. 1950

People

Artist: C. Bennette Moore, American, 1879-1939

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21749

Machine Generated Data

Tags

Amazon
created on 2022-03-18

Person 97.9
Human 97.9
Furniture 97.7
Person 93.5
Person 90.8
Person 87.9
Interior Design 83.8
Indoors 83.8
Person 82.8
Person 81.6
Amusement Park 78.9
Theme Park 78.9
Person 78
Vehicle 73.5
Transportation 73.5
Automobile 73.5
Car 73.5
Person 71.1
Leisure Activities 69.5
Table 66.1
Person 65.2
People 65
Person 63
Person 61.8
Photo 61
Portrait 61
Photography 61
Face 61
Person 59.9
Room 59.5
Crowd 55.2
Person 45

Imagga
created on 2022-03-18

chandelier 74.5
lighting fixture 55.8
fixture 41.7
plaza 24.6
architecture 22.1
building 19.1
industrial 17.2
industry 16.2
equipment 15.8
modern 15.4
interior 13.3
ride 13.1
factory 12.5
heavy 12.4
inside 12
carousel 11.3
travel 11.3
decoration 10.9
light 10.7
steel 10.6
urban 10.5
engineering 10.5
metal 10.4
water 10
tower 9.8
park 9.8
power 9.2
structure 9
design 9
plant 8.9
manufacturing 8.8
mechanical device 8.6
business 8.5
house 8.5
city 8.4
old 8.4
technology 8.2
shop 8.1
glass 8
art 7.8
mechanical 7.8
lights 7.4
reflection 7.3
people 7.2
black 7.2
transportation 7.2
night 7.1
tract 7.1
decor 7.1
work 7.1
indoors 7

Google
created on 2022-03-18

White 92.2
Black 89.9
Black-and-white 85.7
Style 84.1
Art 76.8
Font 75.3
Monochrome photography 74.5
Snapshot 74.3
Monochrome 72.5
Event 67.6
Urban design 66.4
Visual arts 66.3
Stock photography 64.2
Room 63.6
Fun 59.7
City 59.7
Metal 58
History 55.3
Photography 52.9
Street 51.4

Microsoft
created on 2022-03-18

text 97.6
ship 92.3
black and white 77.6
old 45.8

Feature analysis

Amazon

Person 97.9%
Car 73.5%

Captions

Microsoft

a vintage photo of a group of people in a room 78.7%
a vintage photo of a person 66.8%
a vintage photo of some people in a room 66.7%

Text analysis

Amazon

BROS
ROS
KL
KLIEG
KL TEGL
KLIEGL
JJe
TEGL
KLIEGL RUS
RUS
YТ37°-А

Google

..... KL. IEGL AOS KLIEG BROS se
.....
KLIEG
se
IEGL
KL.
AOS
BROS