Recent Posts

The Lies America Tells Itself About the Middle East: As Its Influence Faded, Washington Dissembled and Denied Reality

On any given day during the long war in Gaza, a Biden administration official could be expected to assert any of the following: a cease-fire was around the corner, the United States was working tirelessly to achieve one, it cared equally about the Israelis and the Palestinians, a historic Saudi-Israeli …

Read More »

What Happened to “the West”? As America Drifts Away From Its Allies, a Less Peaceful World Awaits

It has become commonplace to speak of living in a “post-Western world.” Commentators typically invoke the phrase to herald the emergence of non-Western powers—most obviously China, but also Brazil, India, Indonesia, Turkey, and the Gulf states, among others. …

Read More »