Human Generated Data

Title

Untitled (woman sitting in chair in living room with dog)

Date

1941

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21905

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 97.5
Human 97.5
Room 93.2
Indoors 93.2
Furniture 92.6
Sitting 83.2
Living Room 81.9
Face 74.1
Workshop 70.6
Clothing 66.8
Apparel 66.8
Portrait 64
Photography 64
Photo 64
Shelf 63.6
Person 61.6
Bedroom 57.5
Clinic 55.9
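
Tag/score pairs in this form are what image-labeling APIs such as Amazon Rekognition's DetectLabels return: a list of label names with confidence percentages. The sketch below is illustrative only — it formats a response dict in the boto3-style `Labels` shape (a real call would need an AWS client and credentials), using sample values copied from this record.

```python
# Illustrative sketch: rendering Rekognition-style label results as the
# "Name confidence" lines shown above. The dict mimics the boto3
# detect_labels response shape; values are taken from this record.
sample_response = {
    "Labels": [
        {"Name": "Person", "Confidence": 97.5},
        {"Name": "Room", "Confidence": 93.2},
        {"Name": "Furniture", "Confidence": 92.6},
    ]
}

def format_tags(response):
    """Render each label as 'Name score' with one decimal place."""
    return [f"{label['Name']} {label['Confidence']:.1f}"
            for label in response["Labels"]]

for line in format_tags(sample_response):
    print(line)
```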

Clarifai
created on 2023-10-22

people 100
one 98.8
adult 98.6
furniture 98.4
two 96.9
monochrome 96.3
room 95.7
merchant 95.5
street 95.1
wear 95
woman 94.8
group 94.6
home 92.7
sit 92.7
three 91.2
man 91.1
chair 90.4
seat 90.4
art 89.8
commerce 89.1

Imagga
created on 2022-03-11

barbershop 100
shop 100
mercantile establishment 78.9
place of business 52.6
establishment 26.3
old 18.1
man 14.1
chair 13.5
people 13.4
window 12.8
building 12.5
room 11.8
ancient 11.2
home 11.2
house 10.9
person 10.8
vintage 10.8
male 10.6
interior 10.6
wall 10.3
indoor 10
decoration 9.5
sitting 9.4
men 9.4
architecture 9.4
light 9.4
holiday 9.3
dark 9.2
dirty 9
retro 9
style 8.9
indoors 8.8
newspaper 8.3
business 7.9
antique 7.8
scene 7.8
portrait 7.8
city 7.5
seller 7.4
danger 7.3
adult 7.2
family 7.1
women 7.1

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

text 95.7
black and white 90.8
clothing 86.3
person 82.6
monochrome 59.9
man 52.1
old 47.4

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 16-22
Gender Female, 85.8%
Calm 98.3%
Surprised 0.9%
Sad 0.3%
Disgusted 0.1%
Angry 0.1%
Happy 0.1%
Confused 0.1%
Fear 0.1%

Feature analysis

Amazon

Person
Person 97.5%

Text analysis

Amazon

MJIR
MJIR YT3RAS А704
YT3RAS
А704