The trend I'm talking about is how everything seems to be gradually getting more demonic every year. I grew up in the 90s and I remember the music was beautiful and amazing, but now music is aggressive and full of self-glorification, sex, gore, and horrible stuff that turns me off. It seems to be this way with many things, that our culture is degrading.
I mean, look at video games and movies. They were amazing, but now most movies are pretty bad. Not to say they are all that way, but generally speaking, they are not the labours of love they once were. I don't know if it's because they are focusing so heavily on CGI or if it's a byproduct of the degradation of our culture. Anyone else feel that our culture is becoming more demonic and losing its soul?
They say that what you see is a reflection of what you are inside, so maybe I am the problem. What do you guys think?