Human Generated Data

Title

Social Settlements: Great Britain, England. London. Bermondsey Settlement: Bermondsey Settlement, London, Eng.: A talk on the theory of music. Drawing-room.

Date

c. 1903

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Social Museum Collection, 3.2002.2675.4

Machine Generated Data

Tags

Amazon
created on 2019-06-05

Person 98.5
Human 98.5
Person 96.2
Person 92.4
Person 91.9
Person 91.6
Person 85.3
Indoors 82
Room 82
Restaurant 80.8
Furniture 76.9
Meal 66.5
Food 66.5
Cafeteria 61.7
Cafe 59.4
Clinic 58.5
Crowd 58.1
Pub 56.4

Clarifai
created on 2019-06-05

people 99.9
group 98.1
adult 98
many 96.4
furniture 96.1
room 94.4
man 93.6
group together 93
home 91.6
woman 91.4
chair 90.1
several 88.8
administration 86.9
child 85.8
leader 85.3
indoors 84.2
war 81.8
wear 80.8
military 80.3
seat 79.9

Imagga
created on 2019-06-05

hall 21.5
business 18.2
room 17.8
grunge 16.2
paper 15.2
structure 14.3
old 13.9
house 13.4
silhouette 13.2
vintage 13.2
design 12.9
sketch 12.8
archive 12.4
art 12.1
modern 11.9
table 11.8
drawing 11.3
construction 11.1
glass 11.1
architecture 10.9
antique 10.7
sign 10.5
new 10.5
home 10.4
graphic 10.2
people 10
city 10
wallpaper 10
retro 9.8
interior 9.7
texture 9.7
decor 9.7
man 9.4
space 9.3
decorative 9.2
building 9.2
blackboard 9.1
decoration 9
work 8.8
urban 8.7
light 8.7
professional 8.5
happy 8.1
aged 8.1
backgrounds 8.1
symbol 8.1
financial 8
person 7.8
male 7.8
men 7.7
jigsaw puzzle 7.7
finance 7.6
grain 7.4
office 7.4
dirty 7.2
celebration 7.2
activity 7.2
flag 7.1
life 7.1
women 7.1
creative 7.1

Google
created on 2019-06-05

(no tags recorded)

Microsoft
created on 2019-06-05

clothing 95.2
person 95.2
outdoor 88.8
house 87.5
white 72.5
woman 71.7
room 56.3
old 56.2
furniture 50.4
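
The tag lists above come from several vision services, each scoring labels independently on a 0–100 confidence scale. A minimal sketch of how such lists could be combined, keeping the highest confidence seen for each label, is below. The scores are copied from this record; `merge_tags` is a hypothetical helper for illustration, not part of any vendor API.

```python
# Merge {label: confidence} tag dicts from multiple vision services,
# keeping the highest confidence per (case-insensitive) label.
# Scores below are taken from the Amazon, Clarifai, and Microsoft
# sections of this record; the helper itself is an assumption.

def merge_tags(*tag_sets):
    """Combine tag dicts; keep the max confidence for each label."""
    merged = {}
    for tags in tag_sets:
        for label, conf in tags.items():
            key = label.lower()
            merged[key] = max(merged.get(key, 0.0), conf)
    # Return as a list sorted by confidence, highest first.
    return sorted(merged.items(), key=lambda kv: -kv[1])

amazon = {"Person": 98.5, "Indoors": 82, "Room": 82, "Furniture": 76.9}
clarifai = {"people": 99.9, "room": 94.4, "furniture": 96.1, "chair": 90.1}
microsoft = {"person": 95.2, "room": 56.3, "furniture": 50.4}

for label, conf in merge_tags(amazon, clarifai, microsoft):
    print(f"{label} {conf}")
```

Note that near-synonyms from different services ("person" vs "people") remain separate entries; reconciling vocabulary across services would need an explicit mapping.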

Face analysis

Amazon

AWS Rekognition

Age 15-25
Gender Male, 53.1%
Confused 45.5%
Calm 45.5%
Angry 45.3%
Sad 53%
Happy 45.2%
Disgusted 45.2%
Surprised 45.2%

AWS Rekognition

Age 30-47
Gender Female, 51.1%
Calm 45.3%
Surprised 45.1%
Disgusted 45.2%
Confused 45.4%
Sad 53.6%
Happy 45.1%
Angry 45.3%

AWS Rekognition

Age 26-43
Gender Male, 54.7%
Disgusted 45.6%
Confused 47.6%
Angry 45.7%
Calm 47.4%
Surprised 45.5%
Happy 45.1%
Sad 48.1%

AWS Rekognition

Age 17-27
Gender Male, 50.4%
Disgusted 49.5%
Angry 49.7%
Sad 49.7%
Surprised 49.6%
Happy 49.5%
Calm 49.6%
Confused 49.9%

AWS Rekognition

Age 27-44
Gender Male, 50.5%
Calm 49.7%
Angry 49.6%
Disgusted 49.5%
Sad 49.6%
Surprised 49.6%
Happy 49.9%
Confused 49.6%
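
Each face block above reports a confidence score for every emotion; the dominant emotion is simply the one with the highest score. A small sketch, using the scores copied from the first face in this record (`top_emotion` is an illustrative helper, not a Rekognition API call):

```python
# Pick the dominant emotion from a Rekognition-style score dict.
# The scores below are copied from the first face analysis block
# in this record; the helper function is an assumption.

def top_emotion(emotions):
    """Return the (emotion, confidence) pair with the highest score."""
    return max(emotions.items(), key=lambda kv: kv[1])

face_1 = {
    "Confused": 45.5, "Calm": 45.5, "Angry": 45.3, "Sad": 53.0,
    "Happy": 45.2, "Disgusted": 45.2, "Surprised": 45.2,
}

emotion, conf = top_emotion(face_1)
print(f"Dominant emotion: {emotion} ({conf}%)")
```

For this face the scores cluster near 45% with a single higher value, so "Sad" at 53% is the dominant reading.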

Feature analysis

Amazon

Person 98.5%

Captions

Microsoft

a vintage photo of a person in a white room 77.2%
a vintage photo of a person 74.7%
a black and white photo of a person 69.4%