posted on Mar, 5 2013 @ 01:02 AM
Since when did the US become a democracy?
I cringe every time I hear these words, especially from those who hold office. The United States was intended to be a constitutional republic.
Democracies are what happens when a country is taken over by black ops agents before an inside-attack government coup or false flag (whatever you want
to call it) happens, and then it becomes a stepping stone to sharia law.
In a democracy, they allow homosexuals and other strange minority groups to gain lots of power. But then the democracy changes into sharia law, and under
that set of laws they kill those same people.
You see, the liberals and so-called civil rights activists, or any activists in general, will all argue their way to democracy. Once they have it, no
particular group has an equal footing, because every group is divided amongst brothers and sisters. Every group seeks power or authority over the other
groups, and the end result is usually democide.
The civil rights movements, as well as many other movements, are a farce. Why can't we go back to settling things the old-fashioned way? You know, like
the wild west, when the cops weren't going to show up to save anyone, and not only was that accepted, it was expected?
What has happened to us as a culture, where we are so dependent on technology that we are spoon-fed everything from a freaking device of some kind,
where if some sort of EMP went off and didn't kill anybody but wiped out all of our tech, less than about 10% of the population would be able to
survive? (estimate)
What is wrong with us, besides technology and the arrogant, homicidal, power-tripping maniacs who run the government?