Human Generated Data

Title

Untitled (church floats in parade)

Date

c. 1935-1940

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4394

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 99
Human 99
Person 94.2
Drawing 82.7
Art 82.7
Building 82
Architecture 80.3
Spire 75.4
Steeple 75.4
Tower 75.4
Pedestrian 74.1
Person 73.3
Sketch 69.3
Apparel 69
Clothing 69
Outdoors 67.7
Nature 62.5
Shorts 56.6
Dome 56.1
Vehicle 55.2
Transportation 55.2

Clarifai
created on 2019-06-01

architecture 98.1
house 97.9
furniture 96.9
no person 95.8
building 95.5
city 94.6
home 93.4
room 93.3
people 93.2
window 92.8
street 92.8
chair 91.7
town 91.3
illustration 88.8
seat 87.1
travel 85.3
family 85.2
vehicle 84.7
design 83.6
urban 83.3

Imagga
created on 2019-06-01

sketch 100
representation 100
drawing 100
architecture 29.1
construction 26.6
design 26.5
house 25.9
plan 22.7
structure 21.1
building 20.6
project 18.3
art 17.6
architect 17.4
blueprint 15.7
business 15.2
city 15
home 14.7
pattern 13.7
engineering 13.3
silhouette 13.3
urban 13.1
modern 12.6
graphic 12.4
interior 12.4
retro 12.3
new 12.2
office 12.1
old 11.9
drafting 11.8
paper 11.8
architectural 11.5
line 11.1
industry 11.1
black 10.8
lines 10.8
grunge 10.2
element 9.9
tower 9.9
close 9.7
technology 9.7
built 9.7
decoration 9.4
finance 9.3
exterior 9.2
vintage 9.1
style 8.9
symbol 8.8
development 8.7
designer 8.7
diagram 8.6
outline 8.5
3d 8.5
tree 8.5
floor 8.4
investment 8.3
window 8.3
shape 8.2
backgrounds 8.1
idea 8
rural 7.9
draft 7.9
travel 7.8
scale 7.7
wall 7.7
texture 7.6
pencil 7.6
site 7.5
frame 7.5
industrial 7.3
growth 7

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

drawing 95.1
sketch 91.6
black and white 72.5
old 65.6
fog 52.6

Face analysis

Amazon

AWS Rekognition

Age 23-38
Gender Female, 50.3%
Happy 49.6%
Disgusted 49.7%
Sad 49.6%
Surprised 49.6%
Angry 49.7%
Calm 49.8%
Confused 49.6%

Feature analysis

Amazon

Person 99%

Captions

Microsoft

a vintage photo of an old building 79.5%
a vintage photo of a building 79.4%
a vintage photo of a person 73.2%