Human Generated Data

Title

Untitled (man taking woman's pulse inside medical trailer)

Date

1961

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9691

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Chair 98.2
Furniture 98.2
Person 98.2
Human 98.2
Person 97.4
Weapon 85.6
Weaponry 85.6
Gun 62.6
Armory 59.9

Clarifai
created on 2023-10-27

people 99.7
man 99
two 98.4
monochrome 97.1
indoors 96.9
adult 96
room 95.4
couple 92.2
woman 91.5
three 90.4
mirror 89.4
barber 89
profile 87.1
street 87.1
group 82.2
group together 82
dressing room 81
family 80.4
portrait 79.8
side view 79.6

Imagga
created on 2022-01-23

barbershop 100
shop 79.3
mercantile establishment 61.3
place of business 40.9
chair 28.5
man 26.2
people 24
business 22.5
seat 21.5
establishment 20.4
office 19.7
barber chair 18.8
work 17.3
men 16.3
male 16.3
adult 16.2
working 15.9
businessman 15.9
indoors 15.8
computer 15.2
travel 14.8
inside 13.8
person 13.7
worker 12.9
room 12.5
transportation 12.5
passenger 12.2
urban 12.2
laptop 12
hairdresser 11.7
black 11.5
interior 11.5
women 11.1
architecture 10.9
smiling 10.8
suit 10.3
lifestyle 10.1
happy 10
building 9.8
modern 9.8
equipment 9.8
corporate 9.4
light 9.3
city 9.1
businesswoman 9.1
furniture 9
technology 8.9
job 8.8
table 8.6
sitting 8.6
smile 8.5
desk 8.5
handsome 8
home 8
salon 7.8
portrait 7.8
industry 7.7
piano 7.6
back 7.3
occupation 7.3
transport 7.3
to 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

person 98.7
black and white 91.3
man 90.5
clothing 88.1

Color Analysis

Face analysis

AWS Rekognition

Age 36-44
Gender Female, 83%
Calm 43.7%
Surprised 27.9%
Sad 24.3%
Fear 1.3%
Confused 1%
Disgusted 0.9%
Angry 0.7%
Happy 0.2%

Feature analysis

Amazon

Person
Person 98.2%

Text analysis

Amazon

KODAK-2AI

Google

YT37A°2--XAGO
YT37A°2--XAGO