Human Generated Data

Title

Untitled (Harvard College Observatory group, California)

Date

1880s

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.2605

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 99.6
Human 99.6
Person 99.5
Person 98.2
Building 89.8
Person 87.7
Wood 87.4
Outdoors 87
Nature 84.5
Person 78.7
Housing 77.1
Person 73.9
Countryside 72.7
Person 69.4
Rural 63.2
Art 61.1
Workshop 60.9
Painting 60.8
Bunker 58.1
Flooring 56.4
Drawing 55.2

Clarifai
created on 2023-10-25

people 99.5
art 98.6
painting 98.2
vintage 98.1
sepia pigment 96.5
wear 96.5
collage 96.1
retro 95.5
adult 94.8
print 93.3
group 92.9
album 91.9
old 91.5
sepia 90.8
paper 89.5
picture frame 88.8
one 86.6
no person 86.2
man 85.7
antique 85.7

Imagga
created on 2021-12-15

structure 47
billboard 46.4
signboard 37.6
old 36.9
building 31.3
ancient 29.4
wall 27.8
architecture 27.7
stone 25.7
city 25
crate 24.1
box 23.5
container 21.9
vintage 20.7
town 20.4
sky 19.1
antique 19.1
historic 18.3
grunge 17.9
history 17
texture 16.7
tourism 16.5
retro 16.4
aged 16.3
travel 16.2
tower 16.1
landscape 15.6
construction 15.4
paper 14.9
church 13
exterior 12.9
house 12.7
landmark 12.6
frame 11.7
damaged 11.5
grain 11.1
brick 10.9
religion 10.8
roof 10.5
old fashioned 10.5
grungy 10.4
art 10.4
monument 10.3
material 9.9
freight car 9.8
crumpled 9.7
urban 9.6
great 9.6
empty 9.5
culture 9.4
facade 9.2
border 9.1
dirty 9
style 8.9
grime 8.8
abandoned 8.8
broken 8.7
village 8.7
sea 8.6
england 8.6
buildings 8.5
outdoors 8.2
car 8.2
scenery 8.1
scenic 7.9
design 7.9
text 7.9
scene 7.8
ruin 7.8
decay 7.7
obsolete 7.7
window 7.5
tall 7.5
cathedral 7.5
desert 7.5
boat 7.4
street 7.4
water 7.3
rough 7.3
industrial 7.3
tourist 7.3
rural 7.1
modern 7

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 98.3
old 78.5
white 65.7
different 33.5
vintage 27
picture frame 23.3

Face analysis

Amazon

AWS Rekognition

Age 21-33
Gender Male, 85.5%
Calm 85.6%
Happy 10.6%
Disgusted 2.8%
Angry 0.6%
Sad 0.2%
Surprised 0.1%
Confused 0%
Fear 0%

AWS Rekognition

Age 8-18
Gender Female, 84.9%
Sad 55.4%
Fear 27.5%
Calm 7.4%
Happy 3.9%
Surprised 2.6%
Angry 1.9%
Confused 0.9%
Disgusted 0.3%

AWS Rekognition

Age 43-61
Gender Male, 82.5%
Calm 78.4%
Happy 12%
Angry 4.5%
Confused 2.9%
Sad 1.2%
Surprised 0.6%
Disgusted 0.2%
Fear 0.1%

Feature analysis

Amazon

Person 99.6%
Painting 60.8%

Text analysis

Amazon

9
LOWS
California
YYAE LOWS
SIB
USE
YYAE
CAU
CAU озеть
озеть

Google

fornia 6.
fornia
6.