Human Generated Data

Title

Untitled (two women looking into shop window at mysterious vacuum display)

Date

1948

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14684

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99.6
Human 99.6
Person 97.8
Helmet 97.2
Clothing 97.2
Apparel 97.2
Person 97.1
Appliance 71.5
Person 68.4
Car 57
Transportation 57
Vehicle 57
Automobile 57

Clarifai
created on 2023-10-27

people 99.8
adult 96.3
vehicle 93.5
man 92.5
exploration 91
administration 90
room 89.7
monochrome 89.7
two 88.3
group 88.2
wear 87.6
group together 86.6
instrument 85.9
medical practitioner 85.1
furniture 84.9
three 83.7
science 80.4
outfit 80.1
scientist 79.6
one 78.8

Imagga
created on 2022-01-29

device 46.8
ventilator 26.1
interior 20.3
equipment 19.5
technology 17.1
electric fan 16.6
fan 14.4
man 13.4
work 13.3
male 12.8
home 12.7
machine 12.7
modern 12.6
indoors 12.3
people 12.3
metal 12.1
room 11.9
glass 11.8
industry 11.1
person 10.7
men 10.3
power 10.1
house 10
brass 10
working 9.7
business 9.7
inside 9.2
clean 9.2
adult 9.1
steel 8.8
light 8.7
black 8.4
industrial 8.2
computer 8
medical 7.9
wind instrument 7.9
urban 7.9
building 7.8
factory 7.7
old 7.7
seat 7.6
design 7.3
connection 7.3
kitchen 7.1
information 7.1
furniture 7
medicine 7
chair 7

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 99.6
person 89
clothing 79.7
cartoon 70.3
posing 70.1
man 50.4
old 41

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 28-38
Gender Male, 99.4%
Calm 45.6%
Happy 31.5%
Surprised 8.7%
Sad 5.2%
Fear 3%
Disgusted 2.8%
Confused 2.3%
Angry 0.9%

Feature analysis

Amazon

Person
Helmet
Car
Person 99.6%

Categories

Text analysis

Amazon

CAMELS
smoking
before
ever before
QUEEN
people
More people are smoking
FILTER QUEEN
are
ever
FILTER
More
WAY
MJI7
MJI7 VIERAL ОСРИА
MINGTON
ОСРИА
VIERAL

Google

emiNGTOr More people are smoting CAMELS ever before FILTER QUEEN
emiNGTOr
More
people
are
smoting
CAMELS
ever
before
FILTER
QUEEN