As people of faith, we like to think that we actually impact culture. But the truth is, historically speaking, it’s usually the other way around. You’ve no doubt read the classic quote: “Christianity began as a personal relationship with Jesus Christ. When it went to Athens, it became a philosophy. When it went to Rome, it became an organization. When it went to Europe, it became a culture. When it came to America, it became a business.” Over and over, it seems, Christianity absorbs the surrounding culture rather than transforming it.
Libraries have been written on that issue, and we can’t adequately cover it here. But I’m more interested in you. How has it worked in your own life? Everywhere I go, I see people who have filtered their faith through the lens of rock & roll, Hollywood, business, family values, patriotism, media, tradition, sports, and more. They live the life they want and simply surround themselves with a customized “lifestyle” edition of the Bible, a Christian t-shirt, or the celebrity pastor of the moment. They pick a local church based on “how much it ministers to me” and support whatever social cause is trendy.
Culture has changed them. Their faith is defined only in the context of the greater culture. And the minute a little persecution comes along, it’s the faith that gets tossed, not the rock & roll, the social cause, or the t-shirt.
I’m wondering what would happen to a generation that actually defined culture through the lens of their faith. What if we practiced what we preach and stopped worrying about whether the clothes we preached in were cool?
What if, instead of culture changing us, we actually changed culture?