Human Generated Data

Title

Untitled (street artists eating in front of paintings hung on fence, wide shot)

Date

c. 1950

People

Artist: Mary Lowber Tiers, American, 1916–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15716

Machine Generated Data

Tags (label followed by confidence score, 0–100)

Amazon
created on 2022-02-05

Interior Design 96.8
Indoors 96.8
Clothing 91.9
Apparel 91.9
Human 90.5
Person 84.5
Living Room 80.2
Room 80.2
Furniture 79.7
Person 74.8
Face 71.6
Housing 67.7
Building 67.7
Art 66.3
Person 65
Couch 64.3
Text 62.9
Drawing 58.7
Floor 56.2
Staircase 56
Handrail 55.4
Banister 55.4
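
The Amazon tags above pair a label with a confidence score, and the repeated Person entries reflect per-instance detections. As an illustration only, not a record of how this dataset was built, here is a minimal sketch of producing such labels with Amazon Rekognition's DetectLabels operation via boto3, assuming AWS credentials are configured; the filename is hypothetical.

```python
import boto3

# Hypothetical local scan of the photograph; any JPEG/PNG bytes work.
IMAGE_PATH = "tiers_untitled_street_artists.jpg"

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,        # upper bound on labels returned
        MinConfidence=55.0,  # roughly the floor of the scores listed above
    )

# Each label carries a 0-100 confidence; objects such as Person also carry
# per-instance detections, which is why Person appears more than once above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
    for instance in label.get("Instances", []):
        print(f"  instance {instance['Confidence']:.1f}")
```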

Clarifai
created on 2023-10-29

people 99.9
home 98.5
group 98.5
adult 98.4
many 96.9
furniture 96.3
man 95
print 94.1
woman 94
room 93.4
administration 93.3
several 93
two 91.1
group together 90.8
leader 90.5
one 90.2
monochrome 89.7
war 88.2
child 87.3
family 86

Imagga
created on 2022-02-05

barbershop 80.6
shop 69.1
mercantile establishment 53.6
place of business 35.7
architecture 32.7
building 29.7
window 26.9
house 24.2
city 22.4
wall 20.5
old 18.8
establishment 17.7
home 17.5
urban 16.6
travel 16.2
balcony 14.9
facade 14.6
structure 14.5
street 13.8
room 12.8
modern 12.6
interior 12.4
light 12
glass 11.7
business 11.5
windows 11.5
ancient 11.2
town 11.1
stone 11.1
vintage 10.7
door 10.5
roof 10.5
brick 10.4
historic 10.1
office 9.9
history 9.8
scene 8.7
antique 8.7
decoration 8.6
construction 8.6
buildings 8.5
finance 8.4
sky 8.3
aged 8.1
apartment 7.7
grunge 7.7
texture 7.6
lamp 7.6
perspective 7.5
tourism 7.4
technology 7.4
furniture 7.4
back 7.3
design 7.3
indoor 7.3
road 7.2
dirty 7.2
black 7.2
working 7.1
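
The Imagga list mixes scene, architecture, and material terms, again as label–confidence pairs. A minimal sketch of querying Imagga's v2 tagging endpoint with the requests library, assuming the documented result.tags response shape; the credentials and image URL are placeholders.

```python
import requests

# Placeholder credentials and image URL.
API_KEY = "your_imagga_api_key"
API_SECRET = "your_imagga_api_secret"
IMAGE_URL = "https://example.org/tiers_untitled_street_artists.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # HTTP basic auth with key/secret
    timeout=30,
)
response.raise_for_status()

# Each entry pairs an English tag with a 0-100 confidence score,
# matching the label/score lines listed above.
for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")
```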

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 99.9
black and white 88.4
house 88.1
white 83.2
old 66.5

Color Analysis

Face analysis

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
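
The two Google Vision blocks above report per-face likelihood buckets (Very unlikely through Very likely) for emotion, headwear, and blur; two blocks correspond to two detected faces. A minimal sketch of obtaining the same fields with the google-cloud-vision client library, assuming application credentials are configured; the filename is hypothetical.

```python
from google.cloud import vision

IMAGE_PATH = "tiers_untitled_street_artists.jpg"  # hypothetical local scan

client = vision.ImageAnnotatorClient()

with open(IMAGE_PATH, "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# One FaceAnnotation per detected face; likelihood fields are enum buckets
# (VERY_UNLIKELY ... VERY_LIKELY), which is what the rows above report.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```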

Feature analysis

Amazon

Person
Person 84.5%
Person 74.8%
Person 65%

Categories

Imagga

interior objects 99.9%

Captions