Human Generated Data

Title

Untitled (observation balloon)

Date

c. 1914-1918

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of Richard E. Bennink, 2.2002.5075

Machine Generated Data

Tags

Amazon
created on 2022-03-12

Person 99.5
Human 99.5
Person 99.2
Building 84.1
Architecture 76.8
Person 75.6
Person 73.5
Person 70.8
Person 68.2
Airplane 65.3
Transportation 65.3
Vehicle 65.3
Aircraft 65.3
Pillar 56.4
Column 56.4
Observatory 55.2
Person 54.2
Person 50.8
Person 49.3
Person 46.7
Person 45.9

Clarifai
created on 2023-10-22

people 98.1
beach 95.7
no person 94.3
art 92.9
sea 91.4
man 88.3
group 86.5
print 86.4
ocean 86.2
sepia 85.8
water 84.3
retro 84.2
winter 82.4
architecture 81.8
vintage 81.8
street 81.7
travel 80.4
seashore 80.3
outdoors 80.2
dawn 80.2

Imagga
created on 2022-03-12

old 38.3
grunge 36.6
vintage 34.7
book 34.4
antique 31.3
retro 31.1
frame 29.1
texture 26.4
product 23.8
ancient 23.3
paper 22.7
aged 22.6
damaged 21.9
empty 21.5
text 20.9
film 19.8
material 19.6
book jacket 19.1
border 19
dirty 19
grungy 18
structure 17.9
space 17.8
creation 17.7
graphic 17.5
art 16.9
slide 16.6
jacket 15.8
photographic 15.7
strip 15
negative 14.9
historic 14.7
design 14.6
rough 14.6
movie 14.5
screen 14.2
page 13.9
blank 13.7
pattern 13.7
dirt 13.4
brown 13.2
frames 12.7
stains 12.6
element 12.4
black 12
wall 12
binding 12
messy 11.6
your 11.6
rust 11.6
edge 11.5
weathered 11.4
textured 11.4
wrapping 11.3
style 11.1
grain 11.1
decoration 10.9
scratches 10.8
broad 10.8
grime 10.7
scratch 10.7
crumpled 10.7
decay 10.6
color 10.6
spot 10.5
backgrounds 10.5
roll 10.4
sheet 10.3
light 10.1
paint 10
designed 9.8
layered 9.8
covering 9.8
tracery 9.7
fracture 9.7
layer 9.7
collage 9.6
mask 9.6
parchment 9.6
building 9.5
decorative 9.2
digital 8.9
overlay 8.9
noisy 8.9
highly 8.9
mess 8.8
computer 8.8
noise 8.8
mottled 8.8
succulent 8.7
sepia 8.7
detailed 8.7
worn 8.6
camera 8.5
financial 8
architecture 7.9
burned 7.8
device 7.7
stained 7.7
great 7.7
money 7.7
wood 7.5
clip 7.4
bank 7.2

Google
created on 2022-03-12

Microsoft
created on 2022-03-12

outdoor 98.1
text 87.3
aircraft 67.9
dome 27.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 42-50
Gender Male, 97.4%
Calm 66.9%
Happy 11%
Angry 7.1%
Fear 6.1%
Surprised 2.9%
Disgusted 2.9%
Sad 1.8%
Confused 1.2%

AWS Rekognition

Age 23-31
Gender Female, 53.2%
Happy 90.5%
Angry 3.3%
Calm 2.3%
Sad 2.1%
Fear 0.9%
Surprised 0.3%
Confused 0.3%
Disgusted 0.3%

Feature analysis

Amazon

Person
Airplane
Person 99.5%
Person 99.2%
Person 75.6%
Person 73.5%
Person 70.8%
Person 68.2%
Person 54.2%
Person 50.8%
Person 49.3%
Person 46.7%
Person 45.9%
Airplane 65.3%

Categories

Text analysis

Amazon

BALLON
10.
S.T.L.
10. BALLON D'ORSERVATION
(DRACHEN)
D'ORSERVATION

Google

10. BALLON D'ORSERVATION (ORACHEN) S.T.L.
10.
BALLON
D'ORSERVATION
(ORACHEN)
S.T.L.