I’m a young guy, but I still feel that something bad happened to the Hollywood music industry at some point. In the 60’s and 70’s, many of the most popular artists preached peace, love, understanding, freedom, expansion of the mind, and so forth. The lyrics were often intelligent, and the values were totally different. The artists had real power and influence over people – in a mostly positive way. (The drug abuse is another story; that got out of hand.) Today, however, it seems that every artist out of Hollywood is all about individualism, materialism, consumerism, and egoism, and the lyrics are mostly repetitive and blunt. What happened to the music industry?