Human Generated Data

Title

Charity, Aged: Belgium. Louvain. Refuge de Charité: Public Charitable Institutions, Louvain, Belgium: Refuge of Charity: Warming Room.

Date

c. 1900

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Social Museum Collection, 3.2002.573.2

Machine Generated Data

Tags

Amazon
created on 2019-06-07

Human 98.8
Person 98.8
Audience 98.1
Crowd 98.1
Person 97
Person 96.8
Jury 95.6
Person 95.1
Person 94.2
Person 92.9
Person 88.9
Person 85.2
Clock Tower 80.9
Architecture 80.9
Tower 80.9
Building 80.9
Person 80.4
Indoors 76.4
Room 71.9
Person 70
Person 61.9
Court 61.8
People 60.6
Head 55.8
Person 45.1
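
The Amazon tags above follow the output shape of AWS Rekognition's label-detection API: one label name per line with a confidence score. A minimal sketch of such a call in Python, assuming configured AWS credentials and a hypothetical local copy of the photograph named warming_room.jpg (exact labels and scores depend on the model version, which has changed since the 2019 run):

import boto3

client = boto3.client("rekognition")

# Read the digitized photograph as raw bytes.
with open("warming_room.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=45,  # the list above bottoms out near 45%
)

# Print "Name Confidence" pairs, matching the layout of this record.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")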

Clarifai
created on 2019-06-07

people 100
many 99.4
group 98.7
adult 98.4
leader 96.5
group together 95.9
administration 95.8
man 93.8
wear 91.7
chair 91
woman 90.4
several 88.8
military 87.7
room 86.2
furniture 81.9
sit 76.5
war 76
child 75.6
uniform 75.1
seat 74.8
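
The Clarifai tags resemble predictions from Clarifai's general visual-recognition model. A hedged sketch against Clarifai's v2 REST API, where the API key and image URL are placeholders and the public model id ("general-image-recognition") is an assumption that may differ by API version:

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # placeholder
IMAGE_URL = "https://example.org/warming_room.jpg"  # placeholder

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
resp.raise_for_status()

# Concept values are 0-1; scale to percentages as shown above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")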

Imagga
created on 2019-06-07

brass 91.2
wind instrument 70.3
musical instrument 47.6
cornet 45.8
room 20.4
people 17.8
man 17.5
interior 16.8
baritone 14.6
adult 14.4
person 13.1
dress 12.6
window 12.1
elegance 11.7
male 11.3
art 10.5
couple 10.4
home 10.4
luxury 10.3
monument 10.3
city 10
old 9.7
indoors 9.7
antique 9.5
love 9.5
wall 9.4
fashion 9
decor 8.8
table 8.6
work 8.6
happiness 8.6
architecture 8.6
men 8.6
teacher 8.6
old fashioned 8.5
house 8.3
historic 8.2
tourism 8.2
trombone 8.2
retro 8.2
music 8.1
device 7.8
travel 7.7
chair 7.7
bride 7.7
comfortable 7.6
clothing 7.6
wood 7.5
building 7.4
vintage 7.4
famous 7.4
classic 7.4
style 7.4
bass 7.4
inside 7.4
decoration 7.3
business 7.3
uniform 7.2
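
The Imagga tags correspond to Imagga's v2 auto-tagging endpoint, which takes an image URL plus HTTP basic auth with an API key/secret pair. A minimal sketch; key, secret, and image URL are placeholders:

import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder
IMAGE_URL = "https://example.org/warming_room.jpg"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
resp.raise_for_status()

# Each entry carries an English tag name and a 0-100 confidence.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")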

Google
created on 2019-06-07

(no tags returned)

Microsoft
created on 2019-06-07

person 96.9
clothing 95.7
indoor 85.9
group 82.6
old 77.6
people 76.1
woman 69.7
human face 63
man 54.3
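
The Microsoft tags match the shape of Azure Computer Vision's image-analysis output. A hedged sketch against the REST Analyze endpoint; the resource endpoint, subscription key, and image URL are placeholders, and the API version segment (v3.2 here) postdates the 2019 run:

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_CV_KEY"  # placeholder
IMAGE_URL = "https://example.org/warming_room.jpg"  # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
resp.raise_for_status()

# Tag confidences are 0-1; scale to percentages as shown above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")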

Face analysis

Amazon

AWS Rekognition

Age 35-52
Gender Female, 51.9%
Disgusted 45.2%
Surprised 45.2%
Happy 45.2%
Calm 50.4%
Angry 45.4%
Sad 48.2%
Confused 45.4%

AWS Rekognition

Age 38-59
Gender Female, 53.5%
Happy 46.1%
Calm 48%
Disgusted 45.7%
Angry 45.9%
Surprised 45.9%
Sad 47.1%
Confused 46.3%

AWS Rekognition

Age 27-44
Gender Female, 53.1%
Disgusted 45.2%
Calm 50%
Confused 45.1%
Sad 46%
Angry 45.2%
Surprised 45.2%
Happy 48.2%

AWS Rekognition

Age 48-68
Gender Female, 51.3%
Sad 50.4%
Confused 45.8%
Calm 46.2%
Angry 45.9%
Disgusted 45.6%
Happy 45.4%
Surprised 45.7%

AWS Rekognition

Age 35-52
Gender Female, 50.8%
Angry 45.3%
Surprised 45.2%
Sad 53%
Disgusted 45.1%
Confused 45.3%
Happy 45.8%
Calm 45.3%

AWS Rekognition

Age 48-68
Gender Female, 52.8%
Sad 49.9%
Calm 46.2%
Angry 45.8%
Disgusted 45.4%
Confused 45.5%
Surprised 45.5%
Happy 46.7%

AWS Rekognition

Age 57-77
Gender Female, 53.1%
Angry 46%
Surprised 45.7%
Sad 46.2%
Confused 45.4%
Happy 49%
Calm 46.7%
Disgusted 46.1%

AWS Rekognition

Age 45-63
Gender Male, 50.5%
Surprised 49.5%
Disgusted 49.5%
Sad 49.7%
Angry 49.5%
Confused 49.5%
Calm 50.2%
Happy 49.6%

AWS Rekognition

Age 26-43
Gender Female, 50.1%
Angry 49.5%
Happy 49.7%
Confused 49.5%
Disgusted 49.5%
Sad 49.6%
Surprised 49.6%
Calm 50%

AWS Rekognition

Age 26-43
Gender Female, 50.2%
Disgusted 49.5%
Surprised 49.5%
Angry 49.5%
Happy 49.5%
Sad 49.5%
Calm 50.4%
Confused 49.5%

AWS Rekognition

Age 26-43
Gender Female, 54.1%
Angry 45.6%
Calm 47.8%
Confused 45.4%
Happy 45.7%
Surprised 45.5%
Disgusted 48.7%
Sad 46.3%

AWS Rekognition

Age 11-18
Gender Female, 50.3%
Angry 49.6%
Confused 49.6%
Disgusted 49.6%
Surprised 49.7%
Calm 49.6%
Sad 49.6%
Happy 49.8%
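
The per-face age ranges, gender estimates, and emotion scores above are the fields AWS Rekognition's face-detection API returns when all facial attributes are requested. A minimal sketch, reusing the hypothetical warming_room.jpg file from the label-detection example:

import boto3

client = boto3.client("rekognition")

with open("warming_room.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, emotions, etc.
response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

# One block per detected face, matching the record's layout.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")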

Feature analysis

Amazon

Person 98.8%
Clock Tower 80.9%

Categories

Imagga

interior objects 99.9%
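
The Imagga category ("interior objects") is the kind of result Imagga's v2 categorization endpoint returns when it scores an image against a pretrained categorizer. A hedged sketch; the personal_photos categorizer id is an assumption, and the credentials and image URL are placeholders:

import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder
IMAGE_URL = "https://example.org/warming_room.jpg"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",  # assumed categorizer id
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
resp.raise_for_status()

for category in resp.json()["result"]["categories"]:
    print(f"{category['name']['en']} {category['confidence']:.1f}")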