I used to be impressed with how much people from southern states knew about the Civil War. I thought they were all hobbyists, but then I found out they'd been taught a lot of it in school, which takes less individual initiative. If you're comfortable doing so, let me know when and where you went to school. If you went to school in more than one place, did you notice differences in how the Civil War was taught? How much Civil War material was offered at your college?
I grew up in Delaware in the 60s, and I don't have the impression I was taught about it in much detail, but I hated history. It seemed like a combination of boredom and people hurting each other.
It's okay to tell me what you were told about the hot-button issues, like the causes of the Civil War (was it slavery or something else?), but I'm at least as interested in other aspects. Was it taught battle by battle? How much was your state or immediate region emphasized?
If you weren't educated in the US, I don't mind hearing what you were taught about the US Civil War, but I'm also interested in regional differences in what's taught about events in your own country rather than here.
There was a bit on NPR (sorry, I'm not sure when or which show, maybe This American Life) from a Southerner who was taught about the Civil War as a heroic endeavor and didn't find out until she was in college that the South had lost. If it was This American Life, they mix fact and fiction. Have you ever run across something that extreme?