Human Generated Data

Title

Queen Mother Audley Moore

Date

1987-1988

People

Artist: Brian Lanker, American, 1947–2011

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Schneider/Erdman Printer's Proof Collection, partial gift, and partial purchase through the Margaret Fisher Fund, 2011.349

Copyright

© Brian Lanker

Machine Generated Data

Tags

Amazon
created on 2019-04-09

Furniture 99.3
Shelf 99
Person 98.7
Human 98.7
Person 96.9
Face 92.3
Bookcase 89.3
Person 88.1
Person 86.6
Couch 86
Apparel 84.8
Clothing 84.8
Person 80.1
Sitting 64.3
Portrait 64.2
Photography 64.2
Photo 64.2
Indoors 62.1
Skin 58.8
Home Decor 56.7
Room 56.5
Living Room 56.5
Screen 55.5
Monitor 55.5
Electronics 55.5
LCD Screen 55.5
Display 55.5

Clarifai
created on 2018-02-10

people 99.9
group 98.3
adult 97.7
furniture 96.4
man 96.1
woman 95.7
portrait 94.5
room 93
sit 92.8
one 92.4
seat 89.9
monochrome 89.8
wear 89.3
music 89.3
group together 88.7
leader 87.7
administration 87.3
chair 86.9
two 86.4
indoors 85.1

Imagga
created on 2018-02-10

city 18.3
building 17
accordion 15.9
window 15.1
architecture 14.8
device 14.5
machine 13.5
interior 13.3
shop 13.2
urban 13.1
work 13.1
art 12.4
equipment 12
old 11.8
furniture 11.6
chair 11.6
black 11.5
religion 10.7
wicker 10.6
barbershop 10.3
light 10
jukebox 9.9
glass 9.9
inside 9.2
house 9.2
room 9
retro 9
metal 8.8
structure 8.8
man 8.7
dishwasher 8.7
design 8.5
power 8.4
wood 8.3
historic 8.2
landmark 8.1
antique 8
home 8
business 7.9
factory 7.7
industry 7.7
vintage 7.5
silhouette 7.4
industrial 7.3
music 7.2
history 7.2
night 7.1
product 7.1
travel 7

Google
created on 2018-02-10

Microsoft
created on 2018-02-10

person 95.1
indoor 93.8

Face analysis

AWS Rekognition

Age 57-77
Gender Female, 54.3%
Happy 7.2%
Sad 5.9%
Calm 68.3%
Confused 2.2%
Disgusted 6.1%
Surprised 2.7%
Angry 7.7%

AWS Rekognition

Age 48-68
Gender Male, 90%
Confused 5.4%
Calm 4.4%
Surprised 6.6%
Sad 10.6%
Happy 6.9%
Disgusted 39.9%
Angry 26.3%

AWS Rekognition

Age 26-43
Gender Female, 97%
Happy 2.1%
Disgusted 3.3%
Angry 3.4%
Calm 79.3%
Surprised 2%
Sad 4.1%
Confused 5.8%

AWS Rekognition

Age 38-59
Gender Female, 57.1%
Surprised 6.2%
Calm 40.8%
Confused 30.3%
Happy 4.2%
Angry 7.9%
Disgusted 3.1%
Sad 7.4%

AWS Rekognition

Age 35-52
Gender Male, 54.9%
Happy 45.1%
Sad 52.8%
Surprised 45.1%
Confused 45.6%
Calm 45.6%
Angry 45.4%
Disgusted 45.3%

AWS Rekognition

Age 29-45
Gender Male, 50.9%
Surprised 45.6%
Calm 45.2%
Sad 46.4%
Confused 45.3%
Angry 47.8%
Happy 45.5%
Disgusted 49.2%

AWS Rekognition

Age 38-59
Gender Female, 50.4%
Calm 46.2%
Happy 46%
Disgusted 45.8%
Confused 45.2%
Angry 48.6%
Sad 47.4%
Surprised 45.8%

AWS Rekognition

Age 45-65
Gender Female, 66.3%
Disgusted 4.6%
Sad 67.5%
Happy 4.8%
Surprised 5.1%
Calm 5.3%
Angry 8.6%
Confused 4.1%

AWS Rekognition

Age 26-43
Gender Male, 89.9%
Sad 3.2%
Calm 80.1%
Disgusted 1.9%
Surprised 3.8%
Angry 2.5%
Happy 6%
Confused 2.5%

Microsoft Cognitive Services

Age 84
Gender Male

Microsoft Cognitive Services

Age 56
Gender Male

Microsoft Cognitive Services

Age 54
Gender Female

Microsoft Cognitive Services

Age 26
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.7%

Captions

Azure OpenAI

created on 2024-11-19

This black and white photograph depicts a person dressed in an ornate and patterned garment, adorned with an abundance of beads and bracelets. The attire suggests a cultural or traditional significance. They are seated in a room that appears to be filled with an eclectic collection of items, including books, artworks, photographs, and various decorative objects, which collectively create a rich, textured backdrop. The setting has the appearance of a personal space or a studio, reflecting a sense of individuality and perhaps artistic or intellectual pursuits, given the visible presence of books and artwork. There is also a visual contrast between the patterns and textures of the clothing and the surrounding environment, adding to the intricate composition of the image.

Anthropic Claude

created on 2024-11-19

The image depicts an elderly woman sitting in a chair in what appears to be her home. She is wearing traditional-looking jewelry and clothing, and the background shows bookshelves, artwork, and other personal effects, suggesting this is her living space. The image has a sense of depth and perspective, creating an intimate and introspective atmosphere. The black and white format adds a timeless quality to the scene.