posted on Aug, 26 2008 @ 06:52 PM
I was recently watching a program on Chinese military developments when they mentioned that the Chinese were the first to develop firearms. This initially struck me as absurd. I was taught in high school World History that, while the Chinese were the first to develop gunpowder, it was Europeans who turned beautiful fireworks into instruments of death.
The documentary made a very good case for it, though, including showing the earliest known "rifle", which was really just a shoulder-mounted cannon.
A little later I was reading "True Stories From The American Past", a book similar to A People's History of the United States, though it covers only a few specific episodes in history from the start of colonization to 1865 (I only have volume 1). As I read, something occurred to me.
Throughout grade school and high school I really enjoyed history, as I still do. Yet, as I've delved deeper and deeper into it, I've been discovering that what I was taught in school is not what is really believed. It's possible that some of it has become clearer over time as new artifacts and documents are discovered, but there was a theme.
The perception of history I came away with, though subtle, decidedly painted white males, specifically European white males, as not necessarily pure evil but certainly villainous, while everyone else was the victim of white male villainy. It wasn't as overt as A People's History of the United States or this True Stories From the American Past I'm reading, but the theme was that white males are trouble and every other race is typically peaceful. I say typically because the exception is the Japanese in World War II, though they corrected their ways, while white males are still extremely bad news.
For example, the Chinese developed fireworks -- white males got their hands on the technology and used it to kill folks.
White males committed unprovoked genocide in the "New World" and robbed land from a people innocent of any evil capitalist concepts such as land ownership. While it is true that Europeans did often instigate some of the wars that took place, many of the larger Native American tribes were very warlike and territorial. This was never taught.
White males created slavery. While this wasn't ever taught outright, the impression given was that slavery didn't exist before the slave trade out of Africa began, and that it was only white males who did all of it. Never was it mentioned that it was often other tribes that conducted the raids for slaves, and that this trade had existed long before the cotton trade in America began. It simply became more profitable for the tribal leaders to sell the slaves to Europeans rather than keep them.
Abe Lincoln was a white hero we learned about. He freed the slaves, but then a white man who hated the idea of blacks being free had to kill him. The irony of the act was driven home: Abe had several plans for reintegrating the South that died with him, leaving a far harsher path forward.
When the Revolution was covered, the Declaration of Independence was applauded, though the hypocrisy of "all men are created equal" was far more than just touched on. What could you expect, though? They were white males...
There were many other stories like these. The Japanese internment camps in the US during World War II were covered (with the teacher, as well as the textbook, pointing out that Germans were left alone because Americans identified more with them due to their skin color), with no mention of Japanese treatment of foreigners, just a passing line about the death march and nothing about their treatment of POWs.
Now make no mistake, the point I'm making is not that white males were innocent. Every one of these presentations of history holds truth, and Europeans, having generally been the most powerful people in the world for the past four or five hundred years, have led the charge on many atrocities. The point, rather, is the one-sided presentation, at least in my school years from about '85 to '97. The one exception to this rule was a U.S. History teacher named Dr. Epstein, who had a PhD in History and would give us all sides of a conflict and what was going on, even when it wasn't covered in the textbooks (he was the first to open my eyes to the fact that the Civil War wasn't only about slavery, and that was my junior year of high school).
What has this produced? In me, I know it produced utter revulsion at groups like the KKK, skinheads, Neo-Nazis, etc., but only a mild disgust with supremacist groups of other races, such as the Black Panthers. It also produced a contempt for capitalism as well as for white men.
I'd love to hear your thoughts on this, though. If you think this is the way history should be presented, please say so and why! If you were taught otherwise, say so! That was my education on the North Shore of Chicago; I'm sure it's different in other regions of the US.