Thursday, March 22, 2007
Strikes at Public Universities
Tibor R. Machan
In a free society, when employers and employees negotiate terms of trade, both sides are free to walk away from the table. Employers would then be left with the need to hire new staff, while employees would need to find new jobs. Neither is a welcome prospect, so both sides try to avoid it. But customers can live with this because other sources of goods and services usually exist in the marketplace where they can purchase what the negotiating parties offer.
When one turns to public employment, however, the situation is markedly different. That’s because customers are not free to refrain from purchasing the goods and services public institutions offer. So, for example, if the teachers at the California State Universities, who are threatening to strike if their terms are not met by the university system, walk off their jobs, those who pay their salaries and for the schools’ operations must keep paying. The paying customers, namely, California taxpayers, aren’t legally free to walk, whereas teachers are. And this is unjust.
The entire notion of striking is at home only in a free market system where all parties have alternatives. In a public service industry, however, those who pay for the service lack the freedom to seek other ways to spend their funds. Their funds are confiscated, no matter what.
Now if it were OK, which it isn’t, of course, to force customers of public services (mostly the parents of the students, in the case of CSU) to pay, it could be argued that it is also OK to require providers to put out the work for which these customers are legally required to pay. There is clearly an imbalance afoot—teachers may refuse to work, but those who pay them are not free to refuse to pay.
The lesson, of course, is that there should be no public employment other than what is required for the maintenance of justice—the courts, the military, and so forth. And those employees should not be able to go on strike, since their pay is secured by means of coercion and cannot be withheld.
In a free market of education, colleges and universities would be just like shoe stores or recreation facilities or weight loss centers—their provisions would be obtained with the full consent of all the parties involved in the exchange relationship. No one would be privileged, favored by government as against others involved in the provision of the service (in CSU’s case, education). Because no one’s resources could be obtained against his or her will, there would have to be serious, honest negotiations, with no one in the position to act like an extortionist.
With public service institutions, however, not all the parties are free to deal on their own terms. Taxpayers are stuck having to pay taxes, while teachers can refuse to teach. They can even shut down a university or the entire system while those who pay them will go to jail if they attempt to withhold payment of their taxes that go to the maintenance and administration of the system.
So perhaps all this is moot, since we do have a massive public service sector in this country, which is far from a free one, the rhetoric to the contrary notwithstanding. What is the right approach, given this plain enough fact?
Striking would have to be banned, just as refusing to pay taxes is banned. This is not a welcome option, of course, to anyone who believes that the flow of goods and services ought to be free. But when it isn’t free for customers, maybe it shouldn’t be free for employees either.
In some public service industries strikes are banned precisely for this reason. If a monopoly or near-monopoly has been established for the delivery of certain goods or services, so that it is nearly impossible to go elsewhere to gain what one wants (since the resources for this are conscripted and one could go to jail if one failed to provide them), then no one ought to make it seem that this is a free market in which all parties are free agents.
Tuesday, March 20, 2007
Iraq War Conservative Skepticism
Tibor R. Machan
Since talk of invading Iraq began, it has been my view that the invasion was a bad idea, in conflict with the principles of the foreign and military policy of a free society. I ended my first column written against the war, on September 9, 2002, as follows:
"Perhaps Iraq needs to be moved on and fast, to stop Saddam Hussein from destroying us and our friends abroad. Perhaps some people do have the needed information that would justify such a preemptive retaliation.
"But with all the evidence showing the lack of credibility of the U.S. Government in so many matters, and the evasion of the process of getting Congressional authorization, how can someone support a mission that involves such serious risks as a war does?"
Later I kept reiterating my skepticism, based mainly on the idea that the government of a free country is established so as to secure the rights of its citizens, not to solve problems abroad unless some very carefully drawn treaty has been entered into which requires getting involved there.
Slowly but surely quite a few early supporters or fellow travelers joined in, and by now many conservatives, such as Senator Chuck Hagel and William F. Buckley, Jr., have gone on record opposing President George W. Bush in his refusal to relinquish his irrational objective of building a functional constitutional democracy in Iraq. These are, it bears keeping in mind, not a bunch of America-hating Leftists. These are men and women who came to realize that there is no rational justification for America to be fighting this completely mad war, a war against an enemy that amounts to, as I recently put it, a deadly heavy fog with no clearly identifiable substance that could be construed as an enemy one might dispose of.
I admit that my opposition to the war was what some folks call "ideological"—in that tone that has surrounded this term ever since Karl Marx made it into something insidious. (An ideology, for Marx, was a simplistic rationalization for the ruling class’s efforts to make its exploitation of the people seem acceptable.) What, in fact, guided my thinking is the plain, unambiguous wording in the Declaration of Independence about the purpose of government in a country that is founded on the idea that human beings have unalienable rights to life, liberty and the pursuit of happiness.
If this is a sound idea—and it is eminently sound, in comparison to others on which political regimes are founded—then government is akin to a bodyguard one may hire to provide one with proper protection against aggression from others. The bodyguard isn’t supposed to go around looking for other people who may need help. The job is to protect the clients, and the clients of the government and military of the United States of America are its citizens.
Sure, sophists among us may scoff at this as simplistic—but basic principles are supposed to be clear, unambiguous, understandable by all to whom they apply, whose conduct they are supposed to guide. The complexities, and there will be plenty when those principles are applied in concrete situations, need to be worked out but never at the expense of those basics.
For this reason such apparent slogans—which are, in fact, sound, clearly articulated principles—as Benjamin Franklin’s observation that "Those who would sacrifice liberty for security deserve neither…" have been my guiding ideas, and all the talk about pragmatism and about how the world is too messy to stick it out with principled policies has never deterred me from my stance.
It is somewhat gratifying that early Bush loyalists are beginning to appreciate this, although it likely comes too late for all those who were sacrificed on the altar of hubris. But then this has now become clear about the American government, be it under the administration of Democrats or Republicans: its officials would not recognize a principle such as those laid out by the American founders if it came up and bit them in the face.
Dinesh D’Souza’s Amoralism
Tibor R. Machan
As Andrew Sullivan makes so evident in his review of Dinesh D’Souza’s controversial book, The Enemy at Home: The Cultural Left and its Responsibility for 9/11 (Doubleday, 2006), D’Souza thinks morality means being forced to follow God’s laws. In fact, of course, morality means voluntarily doing the right thing and refusing to do the wrong. Whatever moral or ethical system is involved, if one doesn’t follow it voluntarily, of one’s own free will, one isn’t being moral or ethical in the slightest. One then is an amoralist! Because of this gross mistake, D’Souza argues that there is a natural affinity between modern Christianity and radical Islam, so much so that despite some differences and excesses, the two are locked in a struggle with the real enemy of morality, namely, post-Enlightenment secularism.
For D’Souza, ethics "is based on the notion that there is a moral order in the universe, which establishes an enduring standard of right and wrong." So far so good, although D’Souza omits that crucial element, namely, that the order has to be embraced by a human being of his or her own free will. And that is so within the Western Christian ethical tradition, however much that might have been overlooked or perverted during the Holy Inquisition. Radical Islam, in contrast—especially its most dominant version today, the Wahhabi doctrine, as fashioned, mainly, by Sayyid Qutb—rejects the idea of free will. Instead, Muslim ethics or morality involves coercing people to the will of Allah, along the lines in which ethics is understood vis-à-vis little children in the West. In Muslim ethics, at least as Qutb saw it, every human being is like a child; the state, which is God’s instrument, must make the "child" comply with the moral order.
Why Dinesh D’Souza, who used to have a good grasp of the meaning of American liberty, all of a sudden forgets this essential difference between Western religion and radical Wahhabi Islam is quite perplexing. But perhaps it is his desperation to shore up old-fashioned, non-American conservatism that explains his current stance, a stance that he seems to see as the last hope for conservatives. The reason it seems to him to be the last hope is that he takes it to be the only alternative to the liberal moral position, which is concerned with "autonomy, individuality, and self-fulfillment as moral ideals." And, as Sullivan notes, for D’Souza this implies that "liberal morality … consists first of all in the right of the individual to choose for him- or herself what morality is."
Seeing things in this light lends D’Souza’s stance some credibility—what an individual chooses for him- or herself is no morality at all. The point of morality (or ethics) is to provide a standard by which individuals are guided in their conduct, not something they invent for themselves. An uneducated person may be excused for confusing "choosing to do the right thing" with "choosing what the right thing is," but D’Souza isn’t uneducated. So why does he make this mistake?
As with many traditional conservatives, D’Souza seems unable to abide the idea that individuals must choose to do the right thing, even though, of course, what the right thing is isn’t up to them in the slightest. Compare: in a free society people must choose to adhere to a diet or fitness program, yet clearly which diet or fitness program they ought to follow could be something entirely independent of their wishes or choices—it is, rather, what the sciences of nutrition and medicine identify. The same goes for ethics—right and wrong are indeed "based on the notion that there is a moral order in the universe." That much D’Souza got right. But an adult human being must choose to obey that order, and no moral credit comes from being coerced to follow it.
Yes, there is much talk in the West of moral skepticism, and there always has been, even back in the good old days conservatives claim to love. Socrates and Aristotle did battle with the moral skeptics. But the Left does not embrace moral skepticism. The Left considers the ethics of the Right wrong, but it has its own, which it thinks is correct. What really distinguishes the conservative’s and the radical Islamist’s moral stances is the issue of freedom of choice, not skepticism. In the end Dinesh D’Souza is propounding life without ethics at all, a life of dehumanized regimentation, which is to say, a life of amoralism.
Monday, March 19, 2007
What Ails the World
Tibor R. Machan
The ills of the world aren’t mostly medical, of course, but philosophical, moral, and cultural. And the main one is definitely something only straight thinking can hope to cure.
Now, mind you, straight thinking doesn’t produce results immediately. It’s like a fitness program, which needs to be followed rather strictly and over a good bit of time before its effects can be realized. But thinking well really is the only answer—it is, in fact, the only thing that’s under one’s control; the rest is about the forces of nature and the consequences of past religious, scientific, legal, and related thinking now embedded in the societies in which people live their lives. The mind is the free organ of the intact human agent from which the complex actions that can transform the world spring. (Yes, there is debate about this, but the skeptics refute themselves when they make their skeptical case with, you guessed it, their minds!)
So then what is the main malady, and how can it be fixed? First and foremost, the world needs to give up lumping people into groups. Asians, illegal Americans, refugees, blacks, whites, the middle class, politicians, merchants, and so forth—thinking about human beings as if they all managed to fit such groups, as if their identity consisted of their national, racial, ethnic, religious, or class membership or origins, is nearly always a bad idea. Now and then it is acceptable, as when some biological similarity can help predict how someone will fare medically or when people make commitments to be part of a group and others can infer how they will act as a result. But even here it is how they act as individuals that will make the greatest difference—whether they take the initiative to educate themselves, to work hard, to rethink the ideas they have accepted mostly unthinkingly, and so forth.
Tribalism is the term I prefer for the kind of “we”-think that so many folks practice, both when it comes to themselves and when it comes to other people. Women this, Southern Californians that, Europeans yet another thing, and all the rest that bury who a person is by virtue of his or her choices and decisions beneath layers of group identity. The practice is evident in the thinking done by the most unsophisticated as well as the most erudite folks one runs across.
Multiculturalism is one implication of this kind of constant classification of human individuals across the globe. Diversity programs, even at those bastions of supposed independent thought, namely, colleges and universities, focus on lumping students and faculty not by variety of thinking but by color, race, gender, national origin, and so forth, as if looking different from another, or hailing from a different place, necessarily meant one would think and look at the world differently.
Not only is this demeaning to people—regarding them as if they were made with cookie cutters and had no hand in directing their own lives (which can then actually become a self-fulfilling prophecy)—but it also renders most creativity difficult to get off the ground, especially about social and political issues. This way of thinking is difficult to escape. Yes, escape is possible, but tough, because most people do not cherish being deemed weird. Yet if one thinks for oneself, that’s often the result.
Tribal thinking has been around for most of human history and has had its periodic uses, too. But the damage it has wrought has been devastating (e.g., the Holocaust). Such a way to view people tends to make it appear they are replaceable, predictable, and interchangeable within the group. As if their individuality didn’t much matter.
Of course, this is way off—anyone knows that the death of a friend or loved one cannot be remedied by replacement. That’s because in their essence human beings are individual, unique, irreplaceable. But this doesn’t suit tribal policy-makers much, those who have plans for people as members of groups regardless of their individual choices and agendas.
For me, a first-generation American, it was American culture’s stress on the importance of the individual that held out the greatest hope. It was, even if only rhetorically, what gave the place its uniqueness among the societies of history and the world. And it is still, I am convinced, the major cure for what ails the world.