Human Generated Data

Title

Untitled (man and woman around grandstands and stairs)

Date

c.1950

People

Artist: Mary Lowber Tiers, American, 1916–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15889.4

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 99.8
Human 99.8
Person 99.5
Clothing 94.9
Apparel 94.9
Shorts 82.7
Female 74.7
Brick 73.2
Shelter 70.6
Nature 70.6
Countryside 70.6
Rural 70.6
Building 70.6
Outdoors 70.6
Sport 69.2
Sports 69.2
Photography 67.6
Photo 67.6
Architecture 66.5
Chair 66.2
Furniture 66.2
Portrait 65.4
Face 65.4
Sleeve 60.9
Woman 59.8
Dress 59.8
Pillar 57.1
Column 57.1
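
The label/confidence pairs above are the kind of output Amazon Rekognition returns from its object-and-scene detection API. As a minimal sketch of how such tags could be produced (the image file name and the MinConfidence threshold are illustrative assumptions, not taken from this record; standard AWS credentials are assumed to be configured):

```python
import boto3

# Client uses the standard AWS configuration chain (env vars, ~/.aws/config, etc.).
rekognition = boto3.client("rekognition")

# Illustrative file name; the record above does not specify the source image path.
with open("4.2002.15889.4.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,        # cap on returned labels (assumption)
    MinConfidence=55.0,  # drop low-confidence tags (assumption)
)

# Each label carries a name and a confidence score, comparable to the
# "Person 99.8", "Clothing 94.9", ... pairs listed above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```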

Clarifai
created on 2023-10-29

people 100
group together 97.9
two 97.8
adult 97.7
monochrome 97.5
music 96.6
man 95.6
group 94.2
vehicle 94
one 93.6
three 93.5
wear 93.5
outfit 92
child 91.6
woman 91.2
four 86.9
aircraft 86.8
veil 85.3
musician 85.2
several 84

Imagga
created on 2022-02-05

man 22.8
interior 21.2
male 19.8
people 19.5
business 18.8
work 16.7
modern 15.4
adult 14.3
person 14.2
men 13.7
window 13.6
architecture 13.6
businessman 13.2
inside 12.9
professional 12.9
corridor 12.8
silhouette 12.4
indoors 12.3
device 11.7
glass 11.7
building 11.7
floor 11.1
hall 11
light 10.7
job 10.6
room 10.5
gate 10.4
businesswoman 10
hallway 9.8
office 9.8
airport 9.8
wall 9.7
city 9.1
indoor 9.1
dress 9
black 9
design 9
transportation 9
group 8.9
urban 8.7
women 8.7
standing 8.7
corporate 8.6
walk 8.6
construction 8.5
walking 8.5
life 8.5
two 8.5
reflection 8.5
travel 8.4
human 8.2
equipment 8.2
newspaper 7.8
crowd 7.7
chair 7.7
move 7.7
happy 7.5
bride 7.2
furniture 7.2
worker 7.2
working 7.1

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 98.9
outdoor 87.7
old 75.3
clothing 70.1
person 61.9
posing 49.4
vintage 28.7

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 20-28
Gender Male, 97.4%
Calm 99.2%
Surprised 0.3%
Sad 0.2%
Happy 0.2%
Confused 0.1%
Disgusted 0%
Fear 0%
Angry 0%
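
The age range, gender, and emotion percentages above correspond to Rekognition's face analysis output. A hedged sketch of how such values could be obtained (same illustrative file name as before; requesting all attributes is an assumption):

```python
import boto3

rekognition = boto3.client("rekognition")

# Illustrative file name; the source image path is not given in this record.
with open("4.2002.15889.4.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include AgeRange, Gender, Emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]    # e.g. {'Low': 20, 'High': 28}
    gender = face["Gender"]   # e.g. {'Value': 'Male', 'Confidence': 97.4}
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back as a list of {Type, Confidence} entries,
    # matching the Calm/Surprised/Sad/... percentages listed above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```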

Feature analysis

Amazon

Person
Person 99.8%

Categories

Text analysis

Amazon

HARIE
07.05.co

Google

HARE
HARE
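
The strings above ("HARIE", "07.05.co", "HARE") are OCR candidates of the kind a text-detection call returns. A minimal sketch using Rekognition's detect_text, which could plausibly account for the Amazon results here (Google's results would come from its separate Cloud Vision API, not shown; the file name is again an illustrative assumption):

```python
import boto3

rekognition = boto3.client("rekognition")

# Illustrative file name; the source image path is not given in this record.
with open("4.2002.15889.4.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Each detection is either a LINE or a WORD; printing the detected
# strings yields candidates like "HARIE" or "07.05.co" above.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], f"{detection['Confidence']:.1f}")
```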