Thursday, March 8, 2012
The American Way
I've heard most of my life about the American way of life and the American Dream. But what does that even mean anymore? I know it once meant that if we worked hard enough, we could do or be anything we wanted. I was taught that the American way of life is to be honest and hardworking all through your life, supporting your family. But as I look at the world around me, I don't recognize the America I was taught about. It's like everyone wants everything for nothing. It's no longer the American Dream, but the American Expectation, an expectation without any prerequisite. I find it hard to believe that that kind of America is what is going to help us improve who we are. I feel like everyone in the news or in politics is always blaming someone else for their problems. I wish sometimes that someone would just come out and say that it was their fault, that they are to blame. Or that someone who receives help from the government would actually pay it back. I hope that I can teach my children about the America that used to be, and I hope that they somehow contribute to the change from the American Expectation back to the American Dream.