#1 Sat, Mar 2, 2013 - 9:08am
What GOOD is going on in the United States?
Seems like everything I read about the current state of the land of the Untied Snakes falls into the hell-in-a-handbasket category. I haven't lived there in seven years; I did have a brief but lovely visit to NYC and New England last fall, without a single incident to stoke my 'you're-all-freaking-doomed!' disposition.
So what good or positive things do you see going on there?