Was the German Army really defeated in WWI?
Or was it betrayed by politicians and stabbed in the back?
They were undone by their own navy at Kiel.
On the Western Front? Yes. Near the end there wasn't enough food or ammo to go around.
lack of will
The Germans were exhausted but I would have to say they handled it better than the Wehrmacht did. They took quite a beating and held strong considering the circumstances.
Defeated? Everyone lost to them in actual combat. The US getting involved was bullshit; after beating both the French and the English, Germany simply didn't have the resources to keep fighting once America joined in. The US had no business getting involved, and most Americans agreed with that at the time too.
The Brits failed miserably to push Germany out of France
Yes
Not at all. The Wehrmacht bled itself out.
They defeated Russia totally, then were poised to smash the French and the UK when the United States jumped in. After, what, four years of utter hell, it was just too much; they'd been bled white. It was an honorable and understandable defeat, but then they went and created the "stab in the back" myth, which blamed Jews, communists, and whoever else was to hand for Germany getting BTFO'd. And then Hitler got into power only to fuck up everything even harder. Deutsches Reich: not even once.