Human Generated Data

Title

Untitled (woman on chair with granddaughter and dog)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17725

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Chair 100
Furniture 100
Person 96.5
Human 96.5
Clothing 90.1
Apparel 90.1
Shoe 80.4
Footwear 80.4
Shoe 71.4
Shorts 65.5
Grass 61.3
Plant 61.3
Portrait 61.1
Face 61.1
Photography 61.1
Photo 61.1
Kid 61.1
Child 61.1
Fence 59.8
Outdoors 57.7
Tabletop 56.4
Yard 55.8
Nature 55.8
Person 44.8
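
The Amazon labels above follow the pattern of AWS Rekognition's DetectLabels output: each label name paired with a confidence score from 0 to 100. A minimal boto3 sketch of such a call is below; the bucket name, object key, and thresholds are placeholders for illustration, not details of the museum's actual pipeline.

```python
import boto3

# Hedged sketch: produce label/confidence pairs like the Amazon list above.
# Bucket and object key are placeholders, not the museum's actual storage.
rekognition = boto3.client("rekognition")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.17725.jpg"}},
    MaxLabels=25,
    MinConfidence=40,
)

for label in response["Labels"]:
    # e.g. "Chair 100", "Person 96.5" in the listing above
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```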

Clarifai
created on 2023-10-29

people 99.8
two 98.9
child 98.8
chair 96.8
monochrome 96.3
seat 96
one 94.7
wear 94
man 93.5
furniture 93.1
nostalgia 92.9
sit 92
recreation 91.8
sitting 91.6
retro 91.1
vintage 90.4
movie 90.2
adult 89.6
three 88.5
portrait 87.2

Imagga
created on 2022-02-26

chair 38.1
brass 32.8
seat 27.4
wheelchair 26.9
wind instrument 25.1
musical instrument 17.8
man 16.1
people 15.6
person 15.1
adult 14.9
city 14.1
sunglasses 12.9
sitting 12.9
human 12.7
bench 11.3
furniture 11.2
youth 11.1
device 11
black 10.8
male 10.6
attractive 10.5
old 10.4
portrait 10.3
outdoor 9.9
park 9.9
summer 9.6
war 9.6
urban 9.6
wheel 9.6
clothing 9.6
scene 9.5
sport 9.3
dirty 9
transportation 9
sexy 8.8
body 8.8
military 8.7
vehicle 8.7
fashion 8.3
street 8.3
one 8.2
spectacles 8.1
building 8
cornet 7.8
soldier 7.8
model 7.8
death 7.7
wheeled vehicle 7.6
outdoors 7.5
vacation 7.4
protection 7.3
danger 7.3
tricycle 7.3
metal 7.2
lifestyle 7.2
holiday 7.2
gun 7.1
snow 7.1
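
The Imagga tags use the same label-plus-confidence pattern. A hedged sketch of a request to Imagga's public /v2/tags REST endpoint follows; the credentials and image URL are placeholders, and the response field names are assumed from Imagga's documented v2 format rather than taken from this record.

```python
import requests

# Hedged sketch: tag an image with Imagga's /v2/tags endpoint (HTTP basic auth).
# API key, secret, and image URL are placeholders.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/4.2002.17725.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
)

for item in response.json()["result"]["tags"]:
    # tag text and confidence, e.g. "chair 38.1" in the listing above
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```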

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

outdoor 97.3
text 79.7
black and white 70.2
black 66

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 54-62
Gender Female, 78.6%
Happy 56.9%
Calm 32.3%
Surprised 7.9%
Sad 1.4%
Confused 0.5%
Disgusted 0.3%
Angry 0.3%
Fear 0.3%
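
The age range, gender, and emotion scores above match the structure of AWS Rekognition's DetectFaces response when all facial attributes are requested. A minimal boto3 sketch, again with a placeholder image location:

```python
import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.17725.jpg"}},
    Attributes=["ALL"],  # required to get age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]      # e.g. {"Low": 54, "High": 62}
    gender = face["Gender"]     # e.g. {"Value": "Female", "Confidence": 78.6}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```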

Feature analysis

Amazon

Person 96.5%
Person 44.8%
Shoe 80.4%
Shoe 71.4%

Categories

Captions

Text analysis

Amazon

VALUX
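
The detected text ("VALUX") is the kind of result returned by AWS Rekognition's DetectText API. A short boto3 sketch, with a placeholder image location:

```python
import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.17725.jpg"}},
)

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":        # skip individual WORD detections
        print(detection["DetectedText"])   # e.g. "VALUX" in the record above
```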