Human Generated Data

Title

Untitled (two women in dresses and hats)

Date

c. 1950

People

Artist: Bachrach Studios, founded 1868

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18941

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.2
Human 99.2
Clothing 98.4
Apparel 98.4
Indoors 95.4
Person 95.2
Room 95.1
Furniture 92.1
Dressing Room 75.4
Cooktop 71.3
Living Room 70.8
People 64.4
Hat 62.9
Female 62.8
Dress 59.2
Bedroom 57.4
Dish 57
Food 57
Meal 57
Woman 55.8
Worker 55.3

Clarifai
created on 2023-10-22

people 99.9
two 98.5
furniture 98.5
adult 98.3
woman 97.9
group 97.7
man 96.6
room 95.6
wear 95.4
barber 93.1
monochrome 92.8
seat 92.4
one 92.2
veil 92.1
three 91.5
mirror 89.9
indoors 89.5
sit 89.1
chair 88.8
group together 88.6

Imagga
created on 2022-03-05

barbershop 76.1
shop 65.8
mercantile establishment 47
hairdresser 33.4
place of business 31.3
man 26.9
salon 25.2
people 22.3
chair 22.2
male 21.3
business 17.6
adult 17
barber chair 16.7
establishment 15.6
person 15.5
indoors 14.9
seat 14.6
professional 13.9
men 13.7
job 12.4
couple 12.2
office 12.2
occupation 11.9
work 11.8
room 11.2
women 11.1
portrait 11
window 11
dress 10.8
robe 10.7
interior 10.6
building 10.4
corporate 10.3
sitting 10.3
furniture 10.2
architecture 10.1
happy 10
suit 9.9
modern 9.8
businessman 9.7
home 9.6
house 9.2
indoor 9.1
group 8.9
church 8.3
fashion 8.3
worker 8.2
lady 8.1
religion 8.1
history 8
medical 7.9
garment 7.8
bride 7.7
city 7.5
inside 7.4
wedding 7.4
smiling 7.2
black 7.2
looking 7.2
holiday 7.2
smile 7.1
love 7.1
working 7.1
travel 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 94.2
indoor 92.5
black and white 91.7
furniture 83.4
dress 72
clothing 70.2
book 65.8
house 59.7
person 57.6

Feature analysis

Amazon

Person 99.2%
Cooktop 71.3%

Captions

Microsoft
created on 2022-03-05

a man sitting in a room 67.7%
a man standing in a room 67.6%
a man in a room 67.5%

Text analysis

Amazon

EO
E
G 17 EO
17
G
NAGOX
M.J.I NAGOX
M.J.I

Google

EO 2 8 8 4 9
EO
2
8
4
9