Feb 24 2025 58 mins
Are Christians supposed to take over the family, religion, education, media, arts and entertainment, business, and the government? Those who hold to Dominion Theology are convinced Christians are supposed to control these "seven mountains of dominion." This is not some odd theology on the fringes. Rather, it has found its way into mainstream right-wing Christianity and is currently the central belief operating in the US government through the newly created White House Faith Office. Nothing could be more antithetical to the Gospel of Jesus Christ.