Human Generated Data

Title

Untitled (man in top hat and coat, woman in fur shawl)

Date

1943, printed later

People

Artist: Mary Lowber Tiers, American, 1916–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.140

Machine Generated Data

Tags (label confidence, %)

Amazon
created on 2021-12-15

Car 98.5
Transportation 98.5
Vehicle 98.5
Automobile 98.5
Clothing 98.3
Apparel 98.3
Person 94.7
Human 94.7
Leisure Activities 91.1
Car 82.5
Car 82.1
Face 72.2
Musician 70.7
Musical Instrument 70.7
Piano 70.4
Performer 64.8
Pianist 63.6
Person 62.7
Overcoat 60.7
Coat 60.7
Violin 58.6
Fiddle 58.6
Viola 58.6
Home Decor 57.5
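
These Amazon tags have the shape of AWS Rekognition label-detection output: a label name plus a confidence score, with repeated labels such as Car corresponding to separate detections in the frame. A minimal sketch of how tags like these might be retrieved with boto3 follows; the S3 bucket and object name are hypothetical placeholders, not the actual source of this image.

    import boto3

    # Hypothetical image location; the real pipeline behind this record
    # is not documented here.
    client = boto3.client("rekognition")
    response = client.detect_labels(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
        MaxLabels=25,
        MinConfidence=55.0,
    )

    # Print "Name Confidence" rows like the tag list above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')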

Clarifai
created on 2023-10-15

people 99.9
monochrome 99.6
wear 99
woman 98.3
coat 98.2
street 98
adult 97.9
outerwear 97.4
man 97
two 95
stock 94.9
one 94.8
fur coat 94.5
child 92.1
portrait 92.1
shopping 92
winter 91.4
overcoat 90.4
veil 90
group 89.5
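
The Clarifai tags follow the same name-plus-confidence pattern, with scores returned on a 0-1 scale and displayed here as percentages. A minimal sketch against Clarifai's public v2 predict endpoint is below; the model name, image URL, and access token are placeholders, and the exact model used for this record is an assumption.

    import requests

    # Placeholder token and image URL; "general-image-recognition" is
    # Clarifai's public general model, assumed here.
    response = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers={"Authorization": "Key YOUR_PAT"},
        json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
    )

    # Concept values are 0-1; scale to match the percentages above.
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')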

Imagga
created on 2021-12-15

person 21.9
man 20.9
people 20.1
city 19.9
building 18.3
architecture 17.3
urban 15.7
old 15.3
male 14.9
business 14.6
street 13.8
clothing 13.6
sculpture 13.4
adult 12.9
landmark 11.7
ancient 11.2
sitting 11.2
tourism 10.7
travel 10.6
tourist 10.5
suit 10.4
sky 10.2
outdoor 9.9
statue 9.8
world 9.8
famous 9.3
stone 9.3
historic 9.2
transportation 9
history 8.9
briefcase 8.7
men 8.6
culture 8.5
walking 8.5
historical 8.5
black 8.4
portrait 8.4
town 8.3
life 7.7
uniform 7.6
casual 7.6
fashion 7.5
outdoors 7.5
holding 7.4
figure 7.2
device 7.2
art 7.2
religion 7.2
businessman 7.1
child 7

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 98.7
clothing 96
street 95.5
coat 90.2
black and white 90.2
person 89.7
footwear 86.8
monochrome 71
woman 68.7
people 50.1

Face analysis

Amazon

AWS Rekognition

Age 27-43
Gender Female, 70.3%
Calm 55.4%
Sad 42.1%
Angry 0.6%
Confused 0.6%
Happy 0.6%
Fear 0.4%
Surprised 0.3%
Disgusted 0.1%
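
The age range, gender estimate, and emotion scores above match the shape of Rekognition's face-detection output when all facial attributes are requested. A minimal sketch, again with a hypothetical image location:

    import boto3

    client = boto3.client("rekognition")
    response = client.detect_faces(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
        Attributes=["ALL"],  # request age, gender, and emotion estimates
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions arrive unsorted; list the strongest first.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')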

Feature analysis

Amazon

Car 98.5%
Person 94.7%

Text analysis

Amazon

891

Google

891
891
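
The text analysis rows record the string "891" detected in the photograph. A minimal sketch of text detection with Rekognition follows; note that text detectors commonly return both line-level and word-level hits, which is one plausible reason the same string appears twice under Google. The image location is again a placeholder.

    import boto3

    client = boto3.client("rekognition")
    response = client.detect_text(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}}
    )

    # Each detection is tagged LINE or WORD, so one visible string can
    # yield more than one entry.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"])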