posted on Jul, 20 2012 @ 09:07 PM
Please pardon me if this is located in the wrong area of the forum, but I suppose this counts as a conspiracy if it goes against the grain, and I have some faith in it one way or another, so I'll just get into it (if this needs to be moved, I apologize — please do so).
With all of the violence growing in the world, especially this latest shooting at the movie theater, and just the way the Earth as a whole is turning out lately, do you think our animalistic instincts are rising to the surface? Just hear me out if you will. If we truly evolved from apes, then we should still in some way carry some of our old instincts somewhere in us, be it in our DNA, our hindbrain, or wherever (sorry, I WAS a bio major but switched, so I'm quite rusty and this probably sounds more like science fiction). Do we all have instincts coming to the surface? Are we all nothing but beasts underneath this cover of humanity and intelligence that we've adapted?
If this is happening, here is my (probably crackpot) theory as to why: there are more and more people in the world today, and before long we might be struggling for resources — or at least some part of us senses that we will. So we are growing violent, harming others, and using segregation tactics for the better survival of ourselves and our close-knit groups. On a more crowded Earth, I think it is a survival instinct. Many may disagree with me, and that's perfectly okay.
What do you guys and gals think? I'm not saying I even fully believe this myself yet; I'm just throwing it out there for now. Are we all nothing but beasts deep down, waiting to come out from underneath our own skin? It's kind of a terrifying thought if you dwell on it. Are you really who you think you are?