Human Generated Data

Title

Untitled (group of people watching street portrait artist)

Date

c.1950

People

Artist: Mary Lowber Tiers, American 1916 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15715

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 99.6
Human 99.6
Person 99
Person 97.8
Person 97.6
Person 95.9
Person 94.6
Art 90.3
Person 88.9
Drawing 87.9
Person 83.5
Person 82.4
Wheel 72.9
Machine 72.9
Sketch 71.7
Clinic 69.7
People 68.1
Person 66
Car 62.9
Transportation 62.9
Vehicle 62.9
Automobile 62.9
Person 58.5
Nurse 57.2
Person 51.3
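
The Amazon tags above are label/confidence pairs of the kind returned by Amazon Rekognition's label detection. A minimal sketch of producing such tags with boto3 follows; the S3 location and thresholds are placeholders, not values taken from this record.

# Minimal sketch: label detection with Amazon Rekognition (boto3).
# Bucket, key, and thresholds below are illustrative placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example-image.jpg"}},
    MaxLabels=25,
    MinConfidence=50,
)

# Each label carries a name and a 0-100 confidence score, matching the
# "Label NN.N" pairs listed above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')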

Clarifai
created on 2023-10-29

people 100
adult 99.4
group 99.2
many 98.7
group together 98
man 97.3
woman 97.1
administration 95.7
music 95.2
wear 93.1
leader 93
furniture 92.3
boy 89.1
child 87.9
several 87.5
musician 86.9
chair 86.5
outfit 86.4
war 86.1
facial expression 85.9

Imagga
created on 2022-02-05

newspaper 100
product 82.7
creation 65.1
daily 22.7
old 19.5
building 17.2
window 17.1
art 16.9
vintage 15.7
architecture 15.6
house 15
design 14.1
negative 12.8
currency 12.6
film 12.5
sketch 12.2
drawing 12
money 11.9
paper 11.8
history 11.6
antique 11.2
home 11.2
dollar 11.1
wall 11.1
texture 11.1
grunge 11.1
finance 11
business 10.9
black 10.8
city 10.8
retro 10.6
dollars 10.6
ancient 10.4
banking 10.1
symbol 10.1
historic 10.1
cash 10.1
bank 9.8
glass 9.5
culture 9.4
aged 9
pattern 8.9
church 8.3
frame 8.3
paint 8.1
interior 8
travel 7.7
bill 7.6
sign 7.5
one 7.5
savings 7.5
photographic paper 7.4
detail 7.2
wealth 7.2
modern 7
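
The Imagga tags above pair an English keyword with a 0-100 confidence score, as returned by Imagga's tagging API. The sketch below shows one way such tags might be requested; the endpoint and response shape are assumptions based on Imagga's public v2 documentation, and the credentials and image URL are placeholders.

# Minimal sketch: image tagging via the Imagga v2 /tags endpoint (assumed).
import requests

API_KEY = "your_api_key"        # placeholder
API_SECRET = "your_api_secret"  # placeholder
IMAGE_URL = "https://example.org/image.jpg"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

# Each entry pairs an English tag with a 0-100 confidence score,
# matching the "tag NN.N" lines listed above.
for entry in resp.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')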

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 99.8
drawing 93.9
person 93.2
outdoor 90
clothing 84.3
sketch 64.4
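
The Microsoft tags above resemble output from the Azure Computer Vision tagging operation. A minimal sketch with the Azure Python SDK follows; the endpoint, subscription key, and image URL are placeholders and are not part of this record.

# Minimal sketch: image tagging with the Azure Computer Vision SDK.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://example.cognitiveservices.azure.com/",          # placeholder endpoint
    CognitiveServicesCredentials("your_subscription_key"),   # placeholder key
)

result = client.tag_image("https://example.org/image.jpg")   # placeholder URL

# Each tag carries a name and a 0-1 confidence, printed as a percentage
# to match the "tag NN.N" lines above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")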

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 41-49
Gender Male, 84%
Sad 74.1%
Calm 13%
Disgusted 3.9%
Confused 3.2%
Angry 2.5%
Surprised 2%
Fear 0.7%
Happy 0.6%
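
The face attributes above (age range, gender, per-emotion scores) are of the kind returned by Amazon Rekognition's face detection. A minimal sketch with boto3 follows; the S3 location is a placeholder, not taken from this record.

# Minimal sketch: face attribute detection with Amazon Rekognition (boto3).
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example-image.jpg"}},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.0f}%')
    # Emotions come back with per-emotion confidence scores, as listed above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')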

Feature analysis

Amazon

Person
Wheel
Car
Person 99.6%

Categories

Imagga

paintings art 99.8%