Is this true? I heard someone claim that young people across the U.S. are being taught lies about American history in schools. The claim goes: "They're being taught that the U.S. Civil War was fought over slavery, but it's a lie," and "I talked to a teacher about it and they told me they know it's a lie, but they're doing it for the money and they don't want to get fired."
Because if so, then wow, damn...
p.s. I looked it up on Wikipedia too and it seems to say the same thing.