As for myself, I think so, and I'm not entirely certain why. If it's true for you, what is it? What's happening, and why the shift? Is it shifting demographics? A changing culture? A Black president? The bad economy? America's global decline? Crime? Whining? Smug white liberals? Greedy big-business Republicans?