Human Generated Data

Title

Untitled (woman in industrial kitchen)

Date

1956

People

Artist: Samuel Cooper, American, active 1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19599

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 98.7
Human 98.7
Machine 90.4
Clothing 60.2
Apparel 60.2
Spoke 59.4
Workshop 57.9
Lab 57.1

Clarifai
created on 2023-10-22

people 99.9
adult 98.9
woman 96.3
one 95.9
employee 95.1
two 94.5
grinder 94.2
group 93.9
furniture 93
room 92.5
wear 92.5
man 91.9
industry 90.8
sewing machine 90.7
monochrome 89.3
indoors 87.8
vehicle 86.4
group together 83.6
dig 82.5
machine 81.6

Imagga
created on 2022-03-05

barbershop 100
shop 82.4
mercantile establishment 62.3
chair 59.9
barber chair 59.7
place of business 41.5
seat 38.9
furniture 31.6
man 26.2
people 22.3
establishment 20.7
room 18.4
indoors 18.4
business 18.2
male 17.7
interior 17.7
office 17.4
work 15.7
men 15.4
computer 15.2
modern 14.7
home 14.3
person 14.3
inside 13.8
lifestyle 13.7
indoor 13.7
working 13.3
table 13
adult 12.9
kitchen 12.5
steel 12.4
equipment 12.3
furnishing 11.9
technology 11.9
monitor 11.5
machine 10.9
music 10.8
old 10.4
sitting 10.3
industry 10.2
black 10.2
window 10.1
businessman 9.7
lamp 9.5
keyboard 9.4
light 9.4
house 9.2
wood 9.2
restaurant 9.2
worker 9.1
building 8.9
metal 8.8
life 8.6
back 8.3
industrial 8.2
job 8
architecture 7.8
glass 7.8
hand 7.6
salon 7.6
horizontal 7.5
senior 7.5
city 7.5
playing 7.3
transportation 7.2
piano 7.2
history 7.2
travel 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 99
black and white 89.8
train 58.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 51-59
Gender Male, 99.6%
Calm 99.8%
Surprised 0.1%
Happy 0%
Disgusted 0%
Confused 0%
Fear 0%
Angry 0%
Sad 0%

Feature analysis

Amazon

Person
Person 98.7%

Categories

Text analysis

Amazon

-
8
CATERER
METEL
- ЦЕНТИ
- -
N
CATEMER
THE
BILTHERN as METEL
E
as
FREE
на
на les FREE
WATER
BILTHERN
BROKEN WATER
ЦЕНТИ
GATERIA
BROKEN
les
Korea
100

Google

CATERE TERER CATERER
CATERE
TERER
CATERER