One lesson we did learn from the past was the post-WWI era. The European powers concluded that Germany was the main cause of the war, so, motivated by the appalling loss of life and property, they decided to make Germany pay reparations for those losses, billing it billions of dollars (which it didn't have) and taking its overseas colonies. As a result, Germany sought revenge for what it considered mistreatment by the Allies, and the Nazis played upon those emotions, fueling the rise of Nazism and fascism. Regarding the lost colonies and lands, Germany used the excuse of protecting German citizens and businesses to "reclaim" the Sudetenland region of what was then Czechoslovakia, sending in troops. Had Germany (which really wasn't the cause of the war) been allowed to rebuild, WWII might not have happened.
__________________
I'd rather be historically accurate than politically correct.