Liberal Education: Not a Bad Thing at All

To me, the ironies of modern life in the United States, and our need to make sense of them or to alter them, are among the best reasons why a college-level liberal education is vital to virtually all Americans. By “liberal” I do not mean left-wing; I mean education that includes exposure to philosophy, ethics, history, the arts, language, and literature – collectively known as the humanities.