Is faith really a virtue, as many claim, or is it a fundamental flaw in humanity that must be rooted out at all costs to achieve a progressive society? This is a question I have come across all too often lately.

There are basically two camps that have sadly become polarized as the "religious" and the "non-religious" (though I would argue that some of the so-called religious are not and equally vice versa). In essence, the first perspective values faith as a benefit to life and to knowledge while the second degrades it as a crutch for the weak and an opponent to "real," testable knowledge. The former maintains faith as necessary for enlightenment while the latter calls it blind, stupid and ignorant. So who's right? Much more importantly, what's true?