It seems to me that people care less these days about traditional establishment celebrity, such as Hollywood. This was evident at this year's Oscars, where the "slap" from actor Will Smith at comedian Chris Rock garnered more attention than the ceremony itself. The ratings are plummeting as Hollywood becomes more woke. The Grammy Awards also seem outdated. I don't think this means entertainment will die, but perhaps we will stop having a celebrity culture and shift toward independent movies and music. It is my hope that Hollywood will die. Are we now "post-celebrity"? Does celebrity culture seem no longer relevant, or is it just me?