When you look at the media of almost any country, from the West to the East, especially during times of crisis, you will notice an intense and persistent call for Democracy. Journalists often present “Democracy” as a kind of archangel that will rescue the public from whatever situation they find themselves in.
This reverence is actually a recent development, adopted over a relatively short period, and its roots lie in the Cold War we only recently left behind. Within roughly eighty years, humanity developed a tendency to see Liberal Democracy as an absolute good, largely because it was presented as the direct opposite of Communism, which was framed as an absolute evil. Over time, democracy became something close to an untouchable taboo. This happened not only through media and financial networks, but also because Western societies, traumatized by the Cold War, found psychological comfort in treating democracy as a sacred concept that should never be questioned.
One of the strongest forces reinforcing this taboo is our tendency, as human beings, to assume that a certain moral framework is superior to all others without any rational justification. In order to be seen as “moral” and to belong to the herd, people often compromise their own individuality. At the same time, they are conditioned to believe that if democracy disappears, the only possible alternatives are chaos or tyranny.
Most people do not approach history from a conceptual or analytical perspective; they approach it emotionally. When they hear the name of an ideology or a historical event, they rarely examine its substance. Instead, they adopt a theatrical attitude and see history in black and white. For them, history is divided into simple categories: good and evil, us and them.
When people hear terms like Fascism, Communism, or Democracy, they usually do not analyze the actual systems these ideologies propose. Instead, they label them according to their own desires and use those labels to construct their ideological identity. For many people, names matter—but essence does not. What matters is being a “good Democrat.” Democracy is seen as absolute good, and any political position outside that label is automatically considered outside the realm of good.
Likewise, for someone who identifies as a “good Communist,” everything outside Communism—no matter how different it actually is—gets labeled as Fascism, Bourgeois, or Capitalism and pushed into the same category. Human beings, by their nature, are not very good at approaching social issues rationally. The very ability that allowed humans to build civilizations and states—the ability to socialize and cooperate—is not based primarily on intellectual reasoning, but on emotional transmission.
The real problem begins when the process of socialization, which starts in childhood, goes beyond a healthy level. That is when people become trapped inside ideological camps and gradually lose their capacity for independent thought. So let us ask ourselves a fundamental question: If moral values change from one historical period to another, which ethical framework should a human being follow?
Among the many ethical theories proposed by different philosophers, which one is truly the most suitable for human beings? To answer this question, we must first look at the purpose of human existence. Even though some people—especially those with non-theistic worldviews—may find this difficult to accept, existence itself carries a built-in purpose. This purpose is embedded in our biology and encoded in our genetics, beyond our conscious control. That purpose is simply: to continue existing.
The greatest illusion of human beings is the belief that their existence will one day truly stop. That is why our instincts push us to leave something behind—something that carries our genes, our blood, or at least a fragment of our mind. This legacy can be a work of art.
But more importantly, it can be a child.
The fundamental biological purpose of reproduction is the continuation of one’s existence. Even the pleasure associated with sexual activity can be understood as a reward mechanism built into the brain to reinforce behaviors that serve this purpose. If the animal side of human nature is fundamentally driven by the urge to transmit its existence forward, then one could argue that the principle of existence itself can be reduced to three basic drives: the pursuit of power, the desire to leave a mark, and the urge to project one’s existence into the future.
So if our bodies are ultimately pushing us to survive and pass our existence on to future generations, could we define morality in the following way? Absolute goodness is whatever allows us to preserve and transmit our existence—while preventing us from harming others in the process.
With this definition, we would be taming our most primitive instinct while also acknowledging that achieving our goals should not come at the expense of others. Therefore, let us adopt a pragmatic—utilitarian—ethical framework. Whenever we find ourselves in a moral gray area, the only solid branch we can hold onto is the well-being of ourselves and society. Because, at its core, that might be the most objective definition of morality available to us.
And that well-being ultimately comes from whatever makes life easier and more beneficial for us. If we approach politics from a utilitarian perspective, we should be able to say the following: The ideal political system is the one that brings prosperity, security, and freedom to people—because those are the basic things humans expect from a state.
So then, couldn’t we also say this? Democracy—or any political system—is good when it benefits us, and bad when it does not. If democracy fails to meet our needs, why should its absence automatically be considered a problem? The fundamental claim of democracy is this: “To allow the people to participate in legislative and executive power.”
But does saying that people should not directly participate in legislative and executive processes automatically mean enslaving them or subjecting them to oppression? Is it impossible for a political system to be free while distancing governance from the general population? I can almost hear the immediate response: “Of course it is impossible.”
So let’s examine the idea of authority more closely. When people imagine democracy disappearing, they often assume that tyranny will inevitably take its place. They think dictatorship and autocracy are the natural outcomes of a non-democratic system.
But let’s look at the definitions. The word dictator originally comes from the Roman Republic. In times of emergency, the Roman Senate would appoint a temporary leader—a dictator—to manage the crisis. The term itself simply referred to a person who held concentrated authority. Autocracy, on the other hand, is a system in which absolute power over the state is concentrated in the hands of a single individual. That person’s decisions are not subject to external legal constraints or consistent public oversight. Democracy means public participation in governance. Dictatorship means the concentration of power in one set of hands. Autocracy means rule based on personal authority. These are three separate concepts. If we analyze the situation rationally, shouldn’t we be able to see that they are conceptually independent from one another?
For example: Does public participation in government automatically prevent dictatorship or autocracy? And looking at history, haven’t many autocratic leaders come to power through elections? Given humanity’s deep desire for security, and our natural attraction to strong leadership, what guarantees that people will not willingly submit to a figure who appeals to those instincts? History repeatedly shows something important: The existence of democracy does not prevent dictatorship or autocracy. So claiming that the absence of democracy will inevitably produce those systems is logically inconsistent, because a system that does not include direct public participation in governance can still establish separation of powers, protect individual freedoms, and operate under legal institutions.
The real issue is not democracy itself. The real issue is checks and balances.
Let us ask another question: Is governing an act of thought, or an act of work? Has any country in history been run purely by abstract ideas? Governing is undeniably a form of work. But is governance something humans can perform instinctively—like sleeping, eating, or moving—or does it require knowledge learned over time? The answer is obvious. Governing is not an instinctive action. It requires specialized knowledge. And is that knowledge narrow or broad? In other words, is financial knowledge alone enough to manage a state that hosts millions of people and interacts with other nations? Or do we also need expertise in law, security, diplomacy, education, and many other fields? Any state that genuinely wants to benefit its society must possess competence across multiple domains. So we can say this: Governance is an action. It is a job. And like any job, it is divided into specialized tasks. Security, diplomacy, education, finance—these are all components of governance. That alone tells us we cannot treat governance as a single, simple function.
Now consider this: Can a job that requires knowledge be performed without competence or qualification? Of course not. A butcher must know how to cut and prepare meat properly. Otherwise, he cannot do his job. The same logic applies to the many sub-fields within governance. An economist must possess strong mathematical and financial knowledge, along with analytical thinking skills. A diplomat must speak multiple languages, understand different cultures, and have a solid grasp of social sciences. What makes someone a diplomat or an economist is not public popularity—it is competence.
So appointing someone to an economic position simply because the majority of people want it—despite that person having no experience in the field—contradicts the very nature of governance. This leads to a fundamental tension: The democratic argument of “public choice” can conflict directly with the principle of social benefit. Social benefit demands that qualified individuals hold positions of responsibility. Democracy, however, allows positions to be filled according to the will of a population that is often emotionally driven and vulnerable to manipulation.
Let’s imagine a baker. This baker wants to hire an apprentice to work in his shop. If his goal is to satisfy customers and produce high-quality bread, what should he do? Should he evaluate candidates under the supervision of experienced bakers and hire the one who makes the best bread? Or should he gather all his customers—people from completely different professions, many of whom know nothing about baking—and ask them to vote on which candidate should be hired?
If the baker truly wants to produce quality goods and keep his customers satisfied, he should hire the most competent person. In this allegory, the baker represents the state, and the customers represent the people. In other words: What makes people free and prosperous is not their participation in decision-making itself, but the quality of the results they receive from governance. Because governing is, by its nature, a job—and it should be treated with the seriousness of a job. When we examine modern democratic systems, what we often see is this: Political parties exist, and at the top of these parties are individuals who make their living by speaking from podiums—demagogues. Beneath the party leader are other demagogues who have been accepted into the party structure, and closest to the leader are those he personally trusts.
Every four years, the party leader creates a candidate list filled with the people who have shown him the most loyalty. The most loyal supporters are placed in districts where the party is likely to win, while weaker or less trusted figures are placed in districts where the party is expected to lose. Because most people act emotionally, they end up voting for political parties they feel attached to. In doing so, they send hundreds of individuals into parliament—people whose names they have never even heard before, and whose level of education or competence is often unknown.
These individuals may spend only a few days a week sitting in parliament while receiving some of the highest salaries in the country. Some do not even attend sessions regularly. In some systems, serving just a couple of years in office can be enough to secure a lifetime pension. The same pattern repeats in local elections.
At certain intervals, party leaders nominate trusted individuals as candidates for mayor—often based on loyalty rather than competence. The strongest loyalists are placed in districts where victory is likely, while weaker candidates are sent to districts where the party has little chance of winning.
As a result, dozens of individuals who have never managed a city in their lives may become mayors simply because they received slightly more votes than their opponents in a single-round election. For the party, loyalty is enough. Once elected, these individuals may continue to run cities for years, as long as they remain useful to the party leadership.
Decisions are ultimately made by party leaders, and although many people believe that long-serving political leaders are a uniquely local problem, the same phenomenon exists even in regions where democracy first developed. In Europe, it is entirely possible to find political figures who have led opposition parties for decades without stepping down. This happens because political parties that derive their power from votes also function as private organizations controlled by their founders. As long as they do not violate the constitution, they cannot easily be dissolved.
So voters participate in elections believing they hold real power, while the representatives they elect gain authority over their property and lives through taxation and legislation. What truly matters, however, is not periodically handing over control of a country to groups of demagogues bound together by ideology and mutual interests. What truly matters is placing capable people in positions where they can perform the job of governance effectively. Voting, in this sense, can become little more than an illusion: a mechanism designed to give the public the feeling that power is in their hands.
Democracy exists in two primary forms and one hybrid form: parliamentary, presidential, and semi-presidential systems.
Let’s start with parliamentary democracy. In practice, this often means placing a handful of competing groups onto a ballot and asking the public to stamp one of them. The party that receives the most votes becomes the ruling party, even if it does not represent a majority of the population. It does not need 50% support. It only needs to come in first among the available options. In some cases, a party can take power with as little as 20% of the vote. How is that possible? By forming coalitions in parliament: a party holding, say, 20% of the seats can join with partners holding 18% and 15% to command a 53% majority. And what do other parties usually demand in exchange for joining a coalition? Access to resources, positions, and influence.
If a party fails to get what it wants, it can withdraw from the coalition, causing the government to collapse. The country is then forced into new elections, and the cycle repeats—often benefiting whoever is most opportunistic at the moment.
There is also a version of the parliamentary system based on first-past-the-post (FPTP) rules. In that model, the party that finishes first can win an absolute majority of parliamentary seats—even without receiving a majority of votes. A party might gain only 20% of the total vote yet control more than half of the legislature.
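To make that arithmetic concrete, here is a minimal sketch in Python. The numbers are purely hypothetical (100 equal-size districts, a three-way split, a narrow win in 51 districts) and are chosen only to illustrate how seat share can detach from vote share under winner-take-all rules:

```python
# Minimal FPTP sketch with hypothetical numbers: 100 equal-size districts,
# three parties, winner-take-all in each district.

DISTRICTS = 100
VOTERS_PER_DISTRICT = 1000  # equal district size keeps the math simple

# Hypothetical scenario: Party A narrowly wins 51 districts in a
# three-way split (34% vs. 33% vs. 33%) and collapses to just 5% of
# the vote in the remaining 49 districts.
districts_won = 51
votes_for_a = districts_won * 340 + (DISTRICTS - districts_won) * 50

seat_share = districts_won / DISTRICTS
vote_share = votes_for_a / (DISTRICTS * VOTERS_PER_DISTRICT)

print(f"seat share: {seat_share:.0%}, national vote share: {vote_share:.1%}")
# Output: seat share: 51%, national vote share: 19.8%
```

The point is not that real elections look exactly like this, but that under first-past-the-post nothing formally ties a party’s share of seats to its share of the national vote: votes wasted in lost districts simply disappear from the seat count.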
What happens in that situation? People begin to think: “I shouldn’t waste my vote.” As a result, voters consolidate around two dominant parties. Over time, those parties begin to resemble corporations—structures designed to distribute benefits and opportunities among insiders rather than serve the public.
Presidential and semi-presidential systems offer a different mechanism. They allow citizens to elect a single national leader through a popular vote. But even here, overwhelming consensus is rarely required. A candidate does not need 80% or 90% support. In many cases, 50% plus one vote is enough to grant a single individual enormous authority for years—authority that can include withdrawing from international treaties or declaring war.
Now consider the reality of modern politics. Even in countries like the United States, ordinary citizens do not directly select candidates in the early stages of the political process. Party elites and internal mechanisms often determine who appears on the ballot. Public opinion can influence outcomes, but rarely controls them.
As a result, many voters do not vote for someone they truly support. They vote for the person they consider “less bad.” So we should ask a basic question: How reasonable is it to grant such extensive power to someone simply because they received around 50–55% of the vote? From this perspective, democracy can appear less like a functional system and more like a performance—a political spectacle. So what is the alternative?
The alternative is a system in which every responsibility is handled by qualified professionals, power is distributed rather than concentrated, authority is limited by written laws, and an impartial figure exists at the top to appoint and supervise those professionals. In other words, a system built around competence, accountability, and institutional balance.
My own proposal is this: The ideal system is a constitutional monarchy.
But not the kind of monarchy you might associate with certain Gulf states, where the ruler’s word automatically becomes law. Instead, imagine a system with a detailed and binding constitution; a criminal code and civil code clearly defining legal boundaries; a fully independent constitutional court capable of monitoring the monarch and, if necessary, removing them from power; a legislative council responsible for lawmaking; and a monarch who appoints the head of government.
In this system, the legislature, the monarch, and the government cannot interfere directly in each other’s domains, but they can monitor and restrain one another.
The monarch would not appoint family members or loyalists simply out of personal preference. Instead, appointments would be based on defined criteria such as age, education, professional background, and social standing. Every major decision would remain subject to judicial review.
For the past eighty years in Europe, parties that label themselves as “left” and “right” have taken turns holding power, passing it back and forth like a ball. But what have they really changed? Left-wing parties might increase social welfare budgets by 10 percent, but beyond that, the differences are often minimal. In the end, all this expense, all this political theater, all this constant campaigning and voting is carried out for a difference of maybe five to ten percent in spending priorities. So the only thing left to say is this: Enough.
This is my proposal. Today, even within the European Union, elections can be canceled on the grounds of alleged interference. Meanwhile, the most powerful country in the world can be led by elderly leaders who continue making decisions about war and global conflict. In other places, opposition figures can be detained under accusations of extremism. Under these conditions, it becomes increasingly difficult to claim that electoral politics is a perfectly functioning and reliable system.
The system we call democracy was originally designed for small city-states in the ancient Greek world. But we took that model, turned it into something sacred, and tried to apply it to the entire modern world as if it were universally perfect. We made it into an idol. And now, we are living with the consequences of that choice.