Human Generated Data

Title

Untitled (car with woman sitting in front seat, man and woman in back seat)

Date

c. 1950

People

Artist: Boston Herald

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19467

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Chair 98.8
Furniture 98.8
Person 98.5
Human 98.5
Person 88.3
Clothing 86.6
Apparel 86.6
Train 80.5
Transportation 80.5
Vehicle 80.5
Silhouette 80.5
Outdoors 76.8
Window 75
Nature 74.4
Meal 71.5
Food 71.5
Face 71.1
Door 69.3
Flooring 68.7
Floor 61.5
Ice 57.6
Outer Space 56.7
Space 56.7
Universe 56.7
Astronomy 56.7

Clarifai
created on 2023-10-22

people 99.7
street 98.8
monochrome 97.2
woman 96.3
adult 93.2
man 92.5
subway system 91.7
black and white 91.7
group 91.2
locomotive 90.9
train 90.7
two 89
art 88.1
transportation system 87.4
one 87.3
child 87
group together 84.3
veil 81.5
vehicle window 81.3
vintage 81.1

Imagga
created on 2022-03-05

passenger 53.7
man 26.2
people 20.6
male 19.1
adult 16.9
world 16.6
business 14.6
person 14.4
city 12.5
urban 12.2
black 12
men 11.2
working 10.6
window 10.5
one 10.4
sitting 10.3
happy 10
businessman 9.7
building 9.7
office 9.6
back 9.2
looking 8.8
smiling 8.7
work 8.6
architecture 8.6
glass 8.6
smile 8.5
color 8.3
street 8.3
human 8.2
women 7.9
holiday 7.9
old 7.7
fashion 7.5
car 7.5
leisure 7.5
hat 7.4
vehicle 7.3
transport 7.3
mask 7.3
group 7.2
worker 7.2
travel 7
indoors 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

black and white 96.7
text 93.2
clothing 92
person 85.5
monochrome 68
white 63.3
furniture 17.6

Feature analysis

Amazon

Person 98.5%
Person 88.3%
Train 80.5%

Text analysis

Amazon

MAOON
MAMT3AR