“The image of Germany as a sinister, predatory, warlike nation only took root in the twentieth century. Nineteenth century Germany, by contrast, was seen as a place of peace and enlightenment.”
Although this article suffers, in my estimation, from its none-too-subtle anti-Jewish bias, what the author has to say about the demonization of Germany is squarely on the mark.
I have shared elsewhere on this blog (“Who’s to blame for World War One?”) how the idea that Germany is solely or even primarily to blame for the First World War is entirely counter-factual, the result of Allied propaganda created to mask their own complicity and to justify a horrific and completely unnecessary war. It is widely recognized by sober and objective historians that the rise of Hitler and the Nazi party was a direct and perhaps inevitable result of the draconian punitive measures leveled against Germany by those same Allies, once victorious.
Prior to World War One, “Germany was admired by the world as a center of learning, for its high culture and for its achievements in every field; but also for its culture of honesty, hard work, orderliness and thrift, which existed even at the lowest level of society. British scholars and journalists had been very favorably disposed toward all things German, including their history, culture, and institutions throughout the nineteenth century,” and “British author Thomas Arnold (June 13, 1795 – June 12, 1842) saw Germany not as a nation with a unique predisposition toward authoritarianism and regimentation, but rather as a ‘cradle of law, virtue, and freedom,’ and considered it a ‘distinction of the first rank’ that the English belonged to the Germanic family of peoples.”