Does Biblical Scholarship Destroy Faith?
Biblical scholars who approach the Bible from a historical perspective are often accused of working hard to deconvert the faithful. Is that true?
Do undergraduates widely abandon their faith once they learn the historical realities behind the Bible? Are professors and authors generally interested in urging their students and readers to abandon their religion? Can understanding historical scholarship yield any positive result for faith? And is understanding the Bible crucial to faith, or merely an unnecessary add-on?