America's Christian heritage is both rich and deep. Our forebears understood clearly what most historians and educators refuse to acknowledge: it was mostly Christians and churches that formed and shaped the new land that became known as the United States of America.
I guess we’re supposed to look at hospitals now and see nothing but gigantic butcher shops, filled with greedy doctors who are unnecessarily sawing off our feet and taking out our kids’ tonsils in order to pay their country club bills and add to their Mercedes collections.
“In 1984, just as my brother Alan … prepared himself to graduate from high school, a social worker visited our family’s home,” writes T.J. Boisseau, now an associate professor of history at the University of Akron in Ohio. “She was there to explain to my parents the sorts of programs for which Alan was eligible until he turned 22 because he was mentally retarded.” Predictably, then, she “expressed surprise, and dismay” at the news of Alan’s graduation because “if he received a diploma, Alan would not be eligible for any training programs or state-sponsored support later in life.”