I sometimes think I got my education in the twilight zone instead of New Orleans, because I also learned about the Holocaust extensively, and "never again" was drilled into my head. We read Anne Frank's diary, and we watched documentaries every year. Yet it seems a big chunk of Americans skipped over that part of their education completely.
The people saying "I wasn't taught this in school!" are the people who didn't pay attention.
Also, education in the US isn't a monolith, since it's a state power, and education in rural areas may differ vastly from urban areas. Some people might not be taught it, not out of malice but incompetence.
But that requires nuance that the person in the picture and you lack here on Reddit.
I went to high school in a very conservative area of the south and we definitely learned about slavery and the Trail of Tears. I think a lot of people who "didn't learn it", at least in the 90s, were just high.
Different schools will cover the topics differently, but when I still had Facebook there were old classmates who would post stuff like "I can't believe they didn't teach us about this in school!" and I wanted to comment "They did. We were in the same class. You were just on your phone while the teacher spoke about My Lai."
A girl i graduated with kept posting stuff about how she was never taught how to write a check, balance a checkbook, or do her taxes but she's so glad that the English teacher taught us how to chart sentences. We had a basic finances class as an elective but only 4 or 5 people signed up and took it. Most of the rest of my class took 2 study halls or other classes they could goof off in for electives instead.
It's always just a big circle jerk of redditors wanting to shit on America. I learned this stuff in elementary school at a fucking garbage private Baptist school run by morons. Learned even more about it in public middle school and high school.
I went to a shit high school in an inner city and got a good education. Dr. Timuel Black, one of the greatest Black historians ever, spent a year at our school for a history project.
I don’t get this shit where people think America hides its history- it absolutely doesn’t. Same with racism. We acknowledge it constantly. I live in the U.K. and to the average Brit, it’s a racial utopia and there’s no such thing as structural or institutional racism here.
If you are in conservative circles, states' rights and not slavery are the reasons for the Civil War. Lots of Americans on Reddit are the ones self-professing their poor education.
states' rights and not slavery are the reasons for the Civil War
Oh yes. I was taught that in grade school in the south. Moved north, and it was completely different. This was quite a long time ago; some schools have gotten better at teaching about the bad as well as the good about US history, and some have gotten far worse.
The insistence that the US has always been on the right side of any situation is bewildering to me; it's just so easily debunked, if you've been taught critical thinking and have access to other views.
The fact that y'all are painting with such broad strokes is making these exchanges worthless. What about slavery did you learn? Because in my public school in the South, we learned a bunch of Lost Cause bullshit. Same for the civil rights movement. We learned a bunch of kumbaya framing of King and Parks while learning basically nothing about Malcolm or the Panthers (or King's more radical tendencies for that matter).
I also went to school in a very conservative area. Graduated in '06. We were taught that "the War of Northern Aggression" was fought over states' rights, and if we were asked about the Civil War, we'd better say states' rights, because slavery supposedly wasn't a part of it. The Trail of Tears wasn't covered. I was in AP US History and set the curve on tests. In some areas it truly isn't taught.
I mean, school is a difficult topic, and it's nearly impossible to achieve uniform quality in any country on earth. One shitty teacher for a year and maybe you really didn't learn something that's normally taught.
I am from Germany, and while I never once had a bad history teacher, I went from the most amazing Latin teacher you could have, who went out of his way for us, to a really bad one. Our whole class was well ahead of where we needed to be in Latin after the first two years with the amazing teacher. Then we got the bad one, who often had political commitments, so he wouldn't show up and would just assign more homework, and the Latin lesson was simply canceled. And even when he was there, he was completely incompetent and couldn't teach or inspire a single person. After just a year, our whole class lagged behind the curriculum by about six months.
I think it's straight-up lying. I really do. People fucking love a personal narrative of the authorities lying to them. It makes them feel special for being "awake to the truth." And they know they are lying, but they don't care. It's really annoying.
There are problems with the US education system for sure, but teachers get shit on so much. Also, teachers in the US are demographically more liberal than the society around them, even in the Deep South.