Can you recommend some good books about the Civil War to me?
June 26, 2012 9:24 AM Subscribe
I'm doing some research on the nineteenth century in the U.S. and I've realized I need to understand the Civil War better. Can you help me?
I basically need a couple of books: one that will explain why the Civil War happened and what its effects were on a national scale, and a couple that will acquaint me with the most recent scholarship in the area. I'm more interested in scholarly works than in pop histories. I'm totally uninterested in military histories that catalog the minute details of every battle. And I read way faster than I can watch any sort of screen, so I'm only interested in books for the moment.
In terms of specific subject matter, I'm most interested in something I've seen alluded to in a couple of books already: the idea that before the Civil War, culture in the U.S. was much more regionally focused and the Federal Government was weaker, but during the Civil War and afterwards, the North strengthened the Federal Government and began to create a more homogeneous national culture (it created new Federal holidays and monuments, nationalized the banks, etc.). My sense is that the purpose of this was first to strengthen Northern power and later to dissuade any other region from rebelling. I'm half-Yankee/half-Westerner myself, and I'm interested in the above argument for intellectual reasons, not because I'm harbouring resentment against the North or anything. Can you point me to some books that might be helpful?