All I read about anymore is doom and gloom; everyone keeps saying we're doomed and there's no hope. Does everyone really think the USA is doomed, or do you still think there's hope? I have two little kids, and I hope they can still grow up in a good world. I want to have faith that we as a country will make it out of this. I know a lot of good people, and I believe there are many more out there. I guess I'd like to hear something positive for a change instead of all the negativity, but maybe I'm being naive.