A friend sent me a link to this article from the January 2015 issue of Newsweek about biblical illiteracy among evangelicals today.
The author argues that modern American evangelicalism (aka the popular conservative portrayal of Christianity many Americans have in mind) is quite at odds with what the Bible actually teaches, particularly on the inerrancy of the Bible, homosexuality, women’s roles in the church, the formation of the canon, and other issues. In fact, the article states, the Bible condemns the style of Christianity modern evangelicals are practicing now.