Is the Christian faith a "blind" faith, or are there good, solid reasons to hold the beliefs that we do? Over a series of articles on some of the foundations of the Christian faith, I hope we can answer this question together.
Hardly a day goes by that we don't hear how man-caused climate change, or global warming, threatens to make our planet an inhospitable desert unfit for us, and especially for our children, to live on. How do we deal with this information?