Why do educators and colleges claim to offer academic freedom and equality when all you are ever fed is a liberal ideology? I was a history major in
college, and since I'm a dedicated historian I have read a lot of American history. I took American history classes, and it seemed like almost
everything was distorted.
For instance, the South during the Civil War was well known to be Democratic, but the class taught that after the war the South was receptive to the
idea of ending slavery, that Southerners were simply men who were pro-state sovereignty, and that the war had nothing to do with their unwillingness
to end slavery.
Another example is how much the New Deal is praised while Reaganomics is detested. Yet the same economy that helped Clinton gets treated as something
totally different, which has been shown not to be true.
I could cite other examples from political science, economics, sociology, and philosophy classes. Professors were involved with the campus Democrats, but you
have to get someone from the community to advise the campus Republicans.
I just wanted to know what you think about the liberal bias in education.