Hello
I'm a very stereotypical young American woman in that I hate all things British (accents, food, movies, TV, music, history, sports, the monarchy, etc.).
Most Americans would agree with my stance on the British, but is what I feel morally correct?
I'm very confused about this and I don't know what to think.
Is this healthy?