Two hundred forty years ago, America's Founding Fathers had a vision of a country where the God of the Bible was put first in every area of life and worshipped the way He deserved to be. Today, we can see only fading glimpses of their original intent. References to God and Christianity are being systematically removed from American culture. Is there a way to get back to our Founding Fathers' original intent? Is there a way to make a U-turn?