(no subject)
Mar. 25th, 2008 01:17 pm

I was driving down the street, and got behind a truck that had two bumper stickers on it. One said "Save Idaho’s Wildlife, Kill a Wolf" and the other said "Jesus Died for Your Sins"...
I am not Christian. I was raised Protestant, but got over that by the time I was old enough to see the world for its many options. During the first twelve years of my life, though, I faithfully went to church every week with my grandmother - even after the rest of our household had stopped going. If there is one thing I learned during that time, it is that what the Christian church says it teaches and what the members of that church usually portray are two entirely different things.
If you are a "Christian", you are supposed to love and cherish and trust the earth that your "God" gave to you. You are not supposed to pass judgement on "His" creatures; you are supposed to leave the judgement and the condemnation up to "Him". So why does it appear that the "Christians" are the ones casting the most judgement upon the creatures and the people of this earth, and displaying the least compassion for the planet itself and its inhabitants? How could wolves be such evil creatures if your "God" made them? Don’t they realise the circle of life is there to keep balance? Yet we humans keep throwing it off balance, and then blaming the creatures whose homes we’ve destroyed.
We take and take and take - from the resources, the precious bits of land, the lives of the animals - and then we blame, persecute, and look down upon the animals that had survived for centuries before humans were about.
Are we really so glib and smug about our own existence that we think, just because we have the "upper hand", we are better than all of that?
I’d like to think not.