Human Generated Data

Title

Eyesight Test - Examining Colored Worsteds

Date

January 28, 1904

People

Artist: Paul Rowell, American, active 1898-1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the Massachusetts Bay Transportation Authority, Boston Transit Collection, 5.2002.609

Machine Generated Data

Tags

Amazon
created on 2022-06-04

Person 99.1
Human 99.1
Person 95.2
Sitting 75.8
Face 73.5
Interior Design 71.6
Indoors 71.6
Living Room 68
Room 68
Furniture 65.3
Clinic 64.9
Worker 63.1
People 62.7
Portrait 62.1
Photography 62.1
Photo 62.1
Table Lamp 57.7
Lamp 57.7
Spoke 55.6
Machine 55.6
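
These label/confidence pairs match the shape of output produced by Amazon Rekognition's label detection. As an illustration only, a minimal Python sketch of how such tags could be generated, assuming boto3 is configured with valid AWS credentials; the image file name is hypothetical:

```python
import boto3

# Minimal sketch: send the scanned photograph to Amazon Rekognition
# and print label/confidence pairs like those listed above.
rekognition = boto3.client("rekognition")

with open("eyesight_test_1904.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,        # cap the number of returned labels
    MinConfidence=55.0,  # drop low-confidence guesses
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```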

Imagga
created on 2022-06-04

barbershop 36.8
shop 33.5
chair 24.6
mercantile establishment 23.7
interior 23
furniture 19.4
room 19.2
home 19.1
sketch 19
drawing 18.5
man 16.8
person 16.7
place of business 15.9
indoors 15.8
barber chair 15.4
house 15
table 14.6
people 14.5
seat 14.3
male 14.2
lifestyle 13.7
adult 13.6
window 12.9
work 11.9
building 11.3
modern 11.2
kitchen 10.9
glass 10.9
black 10.8
men 10.3
architecture 10.1
representation 10
urban 9.6
office 9.2
hospital 9.1
indoor 9.1
portrait 9.1
old 9
equipment 8.9
job 8.8
decor 8.8
device 8.7
design 8.4
floor 8.4
inside 8.3
happy 8.1
light 8
medical 7.9
establishment 7.9
smile 7.8
sitting 7.7
business 7.3
new 7.3
life 7.2
women 7.1
working 7.1

Google
created on 2022-06-04

White 92.2
Black 89.7
Interior design 86.8
Black-and-white 84.8
Curtain 84.2
Style 83.9
Monochrome 75.4
Monochrome photography 75.1
Machine 70.4
Room 68.8
Wood 66.3
Cooking 64.6
Cabinetry 63.2
Service 58.5
Kitchen 54.7
Sitting 53.5
Tradesman 52.1
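
The Google tags above follow the format of Google Cloud Vision label detection (a description plus a 0-1 score). A hedged sketch, assuming application default credentials are set up and using a hypothetical file name:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("eyesight_test_1904.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

for label in response.label_annotations:
    # score is 0-1; the list above shows it scaled to a percentage
    print(f"{label.description} {label.score * 100:.1f}")
```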

Microsoft
created on 2022-06-04

indoor 96.5
text 94.3
black and white 94
person 87.2
furniture 62.9
man 58

Color Analysis

Face analysis

AWS Rekognition

Age 4-12
Gender Female, 84.7%
Calm 99.2%
Surprised 6.3%
Fear 5.9%
Sad 2.3%
Disgusted 0.1%
Confused 0%
Angry 0%
Happy 0%
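
The age range, gender estimate, and ranked emotion scores above correspond to the standard fields of an Amazon Rekognition face analysis. A sketch of how they could be read, again assuming configured boto3 credentials and a hypothetical file name:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("eyesight_test_1904.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, emotions, and more.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```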

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
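
The blocks above are per-face likelihood ratings of the kind returned by Google Cloud Vision face detection. A sketch, assuming default credentials and a hypothetical file name:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("eyesight_test_1904.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# One annotation per detected face, each with likelihood ratings
# such as VERY_UNLIKELY, as listed above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```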

Feature analysis

Amazon

Person 99.1%

Categories

Text analysis

Amazon

HXDT
3913
28,1984
Jan. 28,1984
Jan.
coloral
reslects
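
The fragments above, including the raw OCR misreadings of the handwritten caption, resemble the per-detection output of Amazon Rekognition text detection. A sketch, assuming configured boto3 credentials and a hypothetical file name:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("eyesight_test_1904.jpg", "rb") as f:  # hypothetical file name
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Rekognition returns both LINE and WORD detections; OCR errors in the
# source (e.g. misread handwriting) are passed through unchanged.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          f"{detection['Confidence']:.1f}")
```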

Google

HXDT Eyesight Test examine coloresleds Jan, 28,1904 3913
HXDT
Eyesight
Test
examine
coloresleds
Jan
,
28,1904
3913
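
The Google list above matches the structure of Google Cloud Vision text detection, where the first annotation is the full detected string and the remaining annotations are the individual tokens. A sketch, assuming default credentials and a hypothetical file name:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("eyesight_test_1904.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# text_annotations[0].description is the full string; the rest are tokens.
for annotation in response.text_annotations:
    print(annotation.description)
```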