Christian Faith vs Christian Nationalism
It is likely that many of you have heard of “Christian Nationalism,” but you may not know what it means. Basically, Christian Nationalism is the idea that because America was founded on Christian principles, America should have Christianity as a “national religion.” In this scenario, Christians have a divine mandate to assume political power to […]