Human Generated Data

Title

Charity, Tuberculosis: United States. Massachusetts. Rutland. Massachusetts State Sanatorium: Patients "at Camp"

Date

c. 1900

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Social Museum Collection, 3.2002.2769.2

Machine Generated Data

Tags

Amazon
created on 2019-06-05

Human 99.6
Person 99.6
Person 99.3
Person 99.2
Person 99.1
Person 99.1
Person 98.7
Vehicle 90.9
Transportation 90.9
Car 65.5
Automobile 65.5
People 63.6
Truck 58.3
Fire Truck 58.3
Van 57.2

Clarifai
created on 2019-06-05

people 100
group 99.9
group together 99.7
adult 99.6
many 99.5
man 98.2
child 97.3
several 97.3
vehicle 97.1
woman 95.3
war 95.1
soldier 94
military 93.9
transportation system 93.7
five 90.9
four 90.7
three 90.4
administration 87.5
wear 86.5
two 85.5

Imagga
created on 2019-06-05

barbershop 100
shop 100
mercantile establishment 92.1
place of business 61.4
establishment 30.8
old 22.3
building 21.6
dirty 17.2
architecture 15.6
vintage 14.1
street 13.8
industry 13.7
grunge 13.6
city 12.5
construction 12
wall 12
window 11.9
house 11.7
transportation 11.7
history 11.6
urban 11.4
travel 11.3
industrial 10.9
factory 10.8
destruction 10.7
snow 10.7
black 10.2
winter 10.2
work 10.2
door 9.5
empty 9.4
exterior 9.2
structure 9.2
historic 9.2
aged 9
road 9
scene 8.7
cold 8.6
men 8.6
dark 8.3
transport 8.2
chair 8
working 8
business 7.9
people 7.8
season 7.8
antique 7.8
ancient 7.8
broken 7.7
tree 7.7
windows 7.7
brick 7.5
wood 7.5
landscape 7.4
man 7.4
retro 7.4
light 7.4
danger 7.3
tourist 7.2
metal 7.2
trees 7.1
sky 7

Google
created on 2019-06-05

Vehicle 56.3
Building 50.8
Family 50.2

Microsoft
created on 2019-06-05

tree 98.3
person 98.1
clothing 96.9
outdoor 94.8
man 85.7
group 85.3
people 77.7
old 50.2
several 16.9

Face analysis

Amazon

AWS Rekognition

Age 38-57
Gender Male, 54.4%
Surprised 45.1%
Calm 47.8%
Happy 45.2%
Sad 51.2%
Angry 45.4%
Confused 45.2%
Disgusted 45.1%

AWS Rekognition

Age 35-52
Gender Male, 53.4%
Disgusted 49.1%
Sad 48%
Surprised 45.1%
Calm 45.3%
Happy 46%
Confused 45.4%
Angry 46.2%

AWS Rekognition

Age 48-68
Gender Male, 53.4%
Disgusted 47.3%
Sad 45.5%
Surprised 45.3%
Calm 51.1%
Happy 45.2%
Confused 45.2%
Angry 45.4%

AWS Rekognition

Age 29-45
Gender Male, 54.1%
Angry 45.2%
Calm 53.8%
Sad 45.4%
Confused 45.2%
Disgusted 45.1%
Happy 45.1%
Surprised 45.1%

AWS Rekognition

Age 29-45
Gender Female, 52%
Happy 45.2%
Disgusted 45.1%
Angry 49%
Surprised 45.1%
Calm 50.1%
Sad 45.4%
Confused 45.1%

AWS Rekognition

Age 20-38
Gender Male, 52.7%
Happy 45%
Disgusted 45%
Angry 45.1%
Surprised 45.1%
Calm 54.4%
Sad 45.3%
Confused 45.1%

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a vintage photo of a group of people sitting at a table 86.5%
a vintage photo of a group of people sitting around a table 86.4%
a vintage photo of a group of people sitting in a chair 86.3%

Text analysis

Amazon

ROCKSIDE.
OB

Google

ROCKSIDE
ROCKSIDE 1OB
1OB