I’m often asked (usually by Christian media) whether I believe Hollywood is anti-Christian. I understand the question, because it’s easy to see that Judeo-Christian values aren’t exactly in vogue these days in the movie and television industries. And yet, making a blanket statement that Hollywood is the enemy is a big mistake. Recently, I discussed the issue with a major Christian media site, and here’s what I told them. I’d love to hear your comments on my answers:
What’s your response when you hear Christians complain that Hollywood is anti-Christian?
It certainly hasn’t been true in my experience. Obviously there are people in Hollywood who don’t like religion, just as there are attorneys, schoolteachers, plumbers, and store clerks across the country who don’t like religion. But in my experience, the vast majority of producers, actors, filmmakers, and studio executives in Hollywood are very open. In most cases (again, like the general culture), these men and women weren’t raised in Christian homes, so they’re largely unfamiliar with the Christian faith. But that doesn’t make them “anti-faith.” In fact, I’ve had some remarkable conversations with industry leaders about Christianity, and you’d be amazed at the number of highly placed entertainment and media professionals who are believers.
Charisma News report here