The End Of White Christian America

Jul 18, 2016

In his new book, The End of White Christian America, Robert P. Jones, CEO of the Public Religion Research Institute, challenges us to grasp the profound political and cultural consequences of a new reality—that America is no longer a majority white Christian nation.

For most of our nation’s history, White Christian America (WCA)—the cultural and political edifice built primarily by white Protestant Christians—set the tone for national policy and shaped American ideals. Since the 1990s, however, WCA has steadily lost influence as both its mainline and evangelical branches have declined. Today, America is no longer, demographically or culturally, a majority white Christian nation.