Professionalism, professionalization, and the professions are increasingly central to any grasp of modern societies, yet persistently elude proper understanding. On the one hand, the professions are old, and considerably antedate modern society, with its scientific base, its huge production of commodities, its mass education, and its complex division of labor—after all, theology, medicine, and law were already solidly established as professions requiring specialized training in the medieval universities. On the other hand, professional organization seems to be distinctively characteristic of modern societies. More and more occupations are organized as professions, and new occupations claiming to be professions regularly come into being. No definition, it seems, can possibly cover all these developments. Professions range from those most of us could accept as being based on scientific and specialized knowledge, demanding long training, to those that raise many questions as to whether they are truly “professions.” As they multiply, so do our doubts about them. We doubt that they are based on as much specialized knowledge as they claim to be; we doubt that they require as much training as they claim to; we suspect that their insistence on formal degrees and their efforts to bar the unqualified smack more of restrictive trade-union practice than of any altruistic concern for their clients or the public.
These suspicions are strongest when it comes to the newer professions—the ones that not too long ago were simply considered occupations, and whose claim to any sort of special scientific or technical knowledge seems dubious. Education and social work (and their various sub-specialties) fall into this category, along with child-care and care of the mentally ill, the retarded, and the elderly. All these were turned (or are in the process of being turned) into professions only recently, yet the trappings of professionalism are already securely in place: specialized courses of study in institutions of higher education; the professional associations limited to degree-holders; the competitive examinations and certification procedures under state auspices. (Not long ago, Robert Wood, a former professor of political science at MIT, a former secretary of the Department of Housing and Urban Development, and a former president of the University of Massachusetts, required a special exemption to become superintendent of schools in Boston—he did not have the proper course in education and administration.)
But the current attack on the professions, which derives in part from the consumer-advocacy movement, is also aimed at the older professions like law and medicine (theology, in our secular society, is safe because it is no longer important). They are criticized for monopolizing the services they perform, charging too much for them, exaggerating the degree of specialized knowledge they possess, and the length of time necessary to acquire it. Much of what they do, the argument continues, we could do ourselves, if the states and professional associations would dismantle their regulatory machinery, and allow those without the requisite credentials to perform these services for themselves if they want to, and for anyone else willing to accept them.
This argument has something to recommend it. A closer examination of the claims of the professions, their technical qualifications, and the objectives they pursue is certainly warranted, as is raising the question of whether achieving these objectives necessarily involves a lengthy period of education and specialized training. Every occupation tries to enhance its own prestige, and the impulse to emphasize the importance of what one does, to insist that it requires more than ordinary capacities and application, and to try to keep the doors closed to outsiders is only human. Yet there are also social objectives to be considered here, and we are entitled to ask whether specialized training and professional qualifications bolstered by the state are a better way to achieve them than fair and open competition in the marketplace.
If one major source of criticism of the professions has been consumer advocacy—the cry, “We can do it for ourselves”—another has been the new revisionist history of social institutions. The asylum, the school, the university, the jail, the social-service organization have all come under sharp examination by young historians in recent years, and their efforts have substantially revised the functional, and rather benign, explanation of how these institutions came about. Over and over again we have been hearing that the schools were not intended primarily to educate, the jails were not intended to protect the community from crime, the social-service agencies were not intended to help families with problems. Rather, all these institutions, to a lesser or greater extent, manipulated those in their charge, stripped immigrants of their culture, shaped a docile working class, and imposed alien middle-class values.
Certainly we have learned a great deal from this new history, and what we have learned has raised many questions about the professions. Thus, from Steven L. Schlossman’s scholarly and thoughtful study of the early treatment of juvenile delinquency in America,1 we learn that our enlightened, progressive approach to this problem—which involves housing children in institutions modeled as far as possible on the family—actually dates back to the early 19th century. In those days, children’s institutions were called Houses of Refuge or Reform, industrial schools, or reformatories, and none of these terms had the associations they now call up. They started out with the brightest of hopes as enlightened sanctuaries, only to degenerate into jails whose outrageous conditions led to recurrent explosions of public outrage. We learn too from Schlossman’s book that these earlier efforts to reform the treatment of the delinquent child were based, just like our own, on scientific tenets, though admittedly the science of that day could well have included phrenology. Still, since our own approach in these matters is not so different from the ones he is describing, Schlossman’s book inevitably raises the question of whether our own efforts to understand delinquency scientifically stand any better chance of success. If those earlier experts (they were not called professionals then) were so wrong about the efficacy of such things as family-style confinement centers, juvenile courts, etc., why should we do any better? Inevitably, the mood of skeptical caution induced by Schlossman’s study, and other studies of early reform institutions, carries over to influence our attitude toward our own present-day professionals and their claims.
Schlossman’s study is an example of the better kind of revisionist history. But unfortunately a good part of this history, influenced by Marxism, consists of a massive and undiscriminating criticism of the professions in which specific questions are all but lost in a flood of total, yet obscure, denunciation. It is one thing to point out the inevitable limitations of understanding of the professions, to criticize exaggerated claims, and to call attention to motives of self-interest. It is quite another to interpret professionalism as such as simply an example or tool of class domination, as both Burton Bledstein and David Noble do, the first in his study of higher education in America, and the second in his account of how the profession of scientific engineering was shaped by the needs of corporate capitalism.2 Both books are based on a great deal of research (Noble in particular opens up obscure sources) and both are very informative, but in a framework of interpretation so distorted that even a reader like myself, not qualified to fault their research, can easily see gaping holes in their efforts to construct an argument on the basis of that research.
Bledstein devotes much of his book to the rise of the American university, whose purpose, as he sees it, was to insure dominance for the children of the middle class by providing them with professions. He focuses on the group of men who created the modern American university in the latter part of the 19th century—Charles W. Eliot of Harvard, Daniel Coit Gilman of Johns Hopkins, Andrew Dickson White of Cornell, James B. Angell of Michigan—to tell a story that has already been told once (and with great authority and insight) by Laurence Veysey in his masterful The Emergence of the American University. But what a different version we get in Bledstein! What escapes almost completely from his account is the effort of these men, in varying degree, to raise the level of scholarship of the American university, and to find means of service, of connecting it to the life of the ongoing society. That they also created and led institutions and did what they had to do to expand them and establish them on firm foundations, and to attract support and students, goes without saying. But this is almost all one sees in Bledstein. One would not suspect that the early university presidents also spent a good deal of time in such pursuits as looking for scholars who would raise the reputation of their institutions, even though they were of little practical use when it came to such things as raising money or attracting undergraduate students.
Bledstein’s thesis is that “By and large, the American university came into existence to serve and promote professional authority in American society. More than in any Western country in the last century, the development of higher education in America made possible a social faith in merit, competence, discipline, and control that were basic to accepted conceptions of achievement and success.” He adds further that “. . . the culture of professionalism in America . . . has taken an inestimable toll of the integrity of individuals.” These sweeping if somewhat unclear judgments are made possible in part because Bledstein uses the term professionalism so loosely. The transformation of the undertaker into the mortician, the plumber into the sanitary engineer, the amateur historian into the academic historian—all this to him is part of the history of professionalism, as are changes in the training of doctors, increases in the number of specialties and specialists’ organizations, and the rise of academic and professional engineering societies. Even “professional” sports—where the term means something quite different—is swept into the large loose net of his generalizations.
Against all these professions and professionals, Bledstein displays an animus which is never explained. He speaks often of professional “pretensions,” and finds something “ominous” in the way the professions attempted to pass judgment on “specific skills and technical competence.” But what exactly is so suspicious about all this? After all, in some of the newer professions—like mining engineering, for example—there was obviously a basis of technical knowledge that justified claims to some sort of special competence. In others, there may not have been such a basis, but people honestly thought there was until the development of science (a dangerous phrase, but I stand by it) proved otherwise. In all of them there was undoubtedly some measure of self-delusion, self-enhancement, and undue exclusivity. But in the general attack on professionalism as securing middle-class privilege, these crucial distinctions disappear.
So, for example, we are told how doctors went about intimidating their clients through “rituals and ceremonies” involving the conspicuous display of “new tools and equipment.” So likely does this appear on the face of it that the reader finds himself nodding in agreement, until he realizes that the list drawn up by the author to make his point—“anesthetics, the laryngoscope, the microscope, the cardiograph, the thermometer, and the x-ray”—does not actually contain a single example of an instrument that is useless and serves only to mystify. Bledstein seems to be incapable of distinguishing among real advances in knowledge and competence, honest errors, and intended mystification, but unless one does, it would seem impossible to mount a legitimate critique of the pretensions of the professions and the universities that provided training for them.
The claim that the professions were monopolized by the middle classes is not in itself an unreasonable one. One can argue that professional status replaced land, or property, or social status as an indicator of achievement, and that this was accomplished through the new universities, which gave up the teaching of Latin, Greek, and classical mathematics in favor of more practical and usable instruction. Even so, to make this argument fairly, one would have to take into account the role of the merit-based professions in creating social mobility. It was not only the Anglo-Saxon middle class that found in the university an opportunity to enter the professions and thereby a way of improving its place in the world.
More important, Bledstein does not take into account the autonomous development of the professions themselves as they endeavored to deal with the problems that were then becoming manifest in the society around them. That the professions conferred social status and income may be granted, but they also played a very specific and practical role in 19th-century society. As Bledstein’s own list of medical instruments demonstrates, there were extremely useful advances made during this time in methods of diagnosing illness, even if it would take until well into the 20th century before specific drugs or procedures would be developed to deal effectively with diagnosed problems. Whatever else it may have done for the status and income of doctors, the absorption of the medical school into the university and the reduction in the number of proprietary and independent medical schools also improved medicine. It is this aspect of the development of professionalism that is ignored in Bledstein’s account, yet without it no real evaluation of the professions is possible.
David Noble’s argument is somewhat different. His main concern is not the role of professional engineering in raising personal status, and obviously he cannot discount the great technical feats of engineering as mere “mystification,” for after all, there are all those tunnels and bridges and great mining works around to prove otherwise. But he looks critically upon the process whereby engineering moved out of the hands of master craftsmen into those of technically-trained, and then university-trained, professionals. This change resulted, in his view, in placing a monopoly over science and technology in the hands of the middle class. Even worse, it led to the monopolization of technical knowledge in the service of the monster corporation.
The word “monopoly” is a great favorite of present-day Marxist historians, automatically eliminating the necessity for further analysis. Thus, Noble announces mysteriously that “a systematic monopolization of scientific knowledge by the professionals” took place in the late 1800’s, and we are duly impressed. All he seems to be saying by this, however, is that the development of science-based technical knowledge brought about the decline of the amateur—of the tinkerer, puttering away at his invention in the family barn—and this undoubtedly did occur in the late 19th and early 20th century. But in what sense can this legitimately be called “monopolization”? No law was passed prohibiting tinkering, or preventing tinkerers from getting patents. It simply grew harder to be an effective amateur as mechanical, chemical, and electrical processes became more and more inaccessible to those without scientific training, and as those who employed engineers asked for degrees, membership in professional societies, and the judgment of peers in order to make some assessment of competence. To talk of “monopoly” in this connection is rather like saying that trained mathematicians have brought about a monopolization of higher mathematics. It is not, after all, as though they (or for that matter the engineers) were engaging in efforts to prevent other people from studying mathematics.
The monopoly of “scientific engineering” by the professional engineers, Noble goes on to tell us, was “the reverse side of the monopolization of the scientific engineers by science-based corporations.” Once again, Noble makes free with the term “monopoly,” and even puts quotation marks around “scientific engineering,” to imply that it was not scientific. Yet in what sense can the corporations be said to have “monopolized” scientific engineers? There were also engineers who worked for the academy, engineers who worked for the government, and consulting engineers who worked for themselves. True, there was an increase after World War I, as Noble says, in the proportion of chemists working for industry, but why is this evidence of “monopolization” rather than of growth in the chemical industry?
We seem to be in the grip of a new kind of criticism which substitutes slogans for thinking and does so, moreover, as though it had scored some overwhelming point. Here is Noble, for instance, reflecting darkly on male chauvinism in the professions: “It should be emphasized that this enormously influential class of people was overwhelmingly male. . . . In addition to being a product of industrial capitalism, reflecting class control over the means of production, modern technology in America has been equally a product of a male-dominated culture reflecting male prerogatives and male preferences.” What is the reader to make of this mixture of the obvious and ridiculous? Just what would technology with 50 (or 90) percent female participation look like, and how would it be different from male technology? What is unargued in the critique of engineering is just how it could be different. Perhaps it could, under different financial and political arrangements, but the persistent blindness to the limits of this difference, to the range of possibilities offered by modern technologically-based societies, permits the illusion to persist that things could be very different, as different as anyone’s fantasies could envisage. The existing is contrasted with alternatives that are not explained or developed and that are very likely impossible.
Noble has certainly collected a great deal of information on the relations among engineering, large corporations, higher education, and government, but much of this material does not prove what he thinks it does. Thus, to demonstrate the enslavement of technology by capitalism, he quotes from the president of the Stevens Institute Alumni Association in 1896: “The financial side of engineering is always the most important; the sooner the young professional recedes from the notion that simply because he is a professional man, the position is paramount, the better it will be for him. He must always be subservient to those who represent the money invested in the enterprise.” Admittedly, the spirit of 1896 comes through here rather unadorned, but leaving aside the question of tone, what does it mean to say that the professional position is not paramount? Is it not simply to assert that engineers must take into account economic considerations, considerations of efficiency in the investment of capital? These may be expressed through the market, or through state judgment; they may be expressed in dollars or in rubles. But however they are expressed, they cannot be ignored.
Noble is not very clear about just what should guide and control professional engineering decisions—he knows only that if businessmen have any part in the process, it must be improper and destructive of humane ends. Thus, quoting from a patent attorney about how the American Telephone and Telegraph Company won out over its competitors, Noble sounds as triumphant as though he had unveiled a nefarious strategy: “. . . one of the first steps taken was to organize a corps of inventive engineers to perfect and improve the telephone system in all directions . . . that by securing accessory inventions, possession of the field might be retained as far as possible, and for as long a time as possible.” So? Would he have preferred uninventive engineers, and would he have preferred not to “perfect and improve” the system? The result of this strategy has after all been a rather good telephone service, no more expensive than that of other countries where the telephone is state-run (whether by a capitalist or a socialist state), treating its workers no worse than they are treated in these other systems, and admittedly also supplying substantial dividends, bond interest, and other returns of capital to many rich people and a good number of not-so-rich ones.
Noble’s tone here is that of a man convinced he has proved a point, but what point has he proved? That there should be no patent system at all? But this would discourage invention. That corporations should not be allowed to buy up rights to patents and control a complex enterprise like the modern telephone system? But this would make it very hard for owners of individual inventions to get together—and if they did, would the result be all that different from AT&T? That the state should own the patents? This is obviously a possibility, since state-owned telephone systems, and even socialist state-owned telephone systems, do exist. Presumably this is the solution Noble would prefer—but on what grounds, since it would not lead to better or cheaper service, or be less exploitative of labor? There is a certain advantage to presenting one’s facts with a sneer—it obviates the necessity of making a case. But the reader, honestly perplexed, is still left in the dark, wondering what, in the present system, seems so terrible to Noble and in what way another would be better.
One can raise the same questions in regard to Noble’s detailed account of the relations between business and the universities in developing engineering education. Science-based corporations needed technically-trained personnel, so they trained some themselves and hoped the universities would train others. They also developed partnerships with the universities, and provided equipment and supports for university education. Noble tells this interesting and important story as though its wickedness were self-evident, the automatic assumption being that the universities thereby became creatures of the corporations. The possibility that the universities were also making use of the corporations—to expand their educational offerings, and to assure their graduates of jobs—and thus strengthening themselves as autonomous institutions with separate interests of their own plays no role in Noble’s account. If universities keep records on students and their capacities, it is in order to aid industry, according to Noble. But does it not also aid students in getting jobs, and thus incidentally advertise the value of the university to other potential students? Anyone who has served on both sides of this particular divide must be aware that the university’s exploitation of the corporation is as likely—indeed, more likely—than the reverse. It is only a built-in bias that leads Noble to take for granted the second possibility rather than the first.
The same bias is to be detected in his discussion of the relationships between organized professional engineering and the federal government. The corporations, according to Noble, used government just as they used the universities, directly or through the professional engineering societies which were dominated by men who had either come out of the corporations, worked for them, or consulted for them. In short, not a move can be made but Noble sees in it the hidden hand of big business. If a Bureau of Standards is set up, it is merely another way to serve science-based industry, and never mind that it may also increase efficiency, improve America’s capacity to compete, reduce costs, and all the rest. “The spirit of standardization,” he writes portentously, “thus promised to weld science to power, and to lend the might of legal and moral authority the legitimacy of scientific truth. . . .” Is he then, one wonders, against standardization—preferring no bolt to match any nut but the one made for it—or is he in favor only of socialist standardization? That anyone can benefit from any rational or efficient action but the corporation seems beyond his imagination. If the federal government encourages research, it is because science-based industry wants it and is getting the people to pay for it. A likely possibility, but are there not other objectives to government-sponsored research? A stronger defense? A better internationally competitive position? Better products for consumers? That government can use industry is as unimaginable to Noble as that the university can use industry.
Stemming from the same Marxist tradition as the work of Bledstein and Noble, yet quite different in its complexity, is Magali Sarfatti Larson’s The Rise of Professionalism: A Sociological Analysis.3 She too emphasizes the role of the professions in enhancing and defending social status; she too explores their links with corporate capitalism and the university; and she too is overly attached to the term “monopoly.” But in her examination of the history of the professions she is consistently aware that there are conflicting interests even in capitalist society, and that the autonomy of the professions, of professional education, and of the university is not simply an illusion. The scientific or—to use her terminology—“cognitive” base of the professions, in which competence is in principle open to all, inevitably makes the professions more than an ideology to protect privilege.
Opposing the unequal results in income and status that emerge from an open educational system, Professor Larson nevertheless does see that the system supports legitimate values and real needs: “. . . experts and professionals do possess cognitive and technical competences which are important, if not always essential, for the social development of productive forces and the full satisfaction of human needs.” Her book maintains the Marxist tradition of criticizing the present from a future that is unexplicated and to me obscure, but its saving grace is that it recognizes the reality as well as the ideology of professional competence. “Today,” she writes, “knowledge is acquired and produced within educational and occupational hierarchies which are . . . inegalitarian, anti-democratic, and alienating.” But, she goes on, “these structures achieve a fusion between the progressive content of special competences and the requirements of a system of domination.” This is a good sample of the tensions within her approach, as well, alas, of the unrelentingly “professional” style that will make her book inaccessible to the general reader.
The most radical attack on professionalism comes from the remarkable Ivan Illich, who has brought a strikingly original vision, shaped by his work in developing countries, to the criticism of modern education, modern medicine, and most of the other professions that seek, through the use of advanced methods, to improve conditions of life in poor countries. Illich does not see the problem as primarily one of either the interests of the middle class or the power of business. He is as critical of professionalism in Russia—or China, or Cuba—as in the capitalist or developing world. Rather, the issue to him is how human competence is invariably reduced by professionalism, whether based on true science or false pretension. He has mounted an effective critique of advanced medicine and elaborate hospitals in developing countries, whose first need is for clean water. He has criticized extensive educational systems that serve to create from the upper-middle classes the professionals developing countries can do without, while neglecting to provide basic literacy.
Illich’s virtue is in his details. Thus, he points out how the poor manage well enough to house themselves in Latin American cities through their own talents and the use of scrap material. Unfortunately, the experts then arrive on the scene, and may convince the authorities (who are already convinced) that only concrete high-rises, beyond the capacity of the slum-dwellers to fabricate for themselves, and requiring advanced machinery and skilled labor, can solve their housing problems. More sophisticated experts turn up, impressed by what the poor have done for themselves, and wishing to learn from it. Eager to improve the self-help process, they begin suggesting improvements, which mean new machines, methods, and products that make the whole thing more expensive and less familiar. In a word, they professionalize it.
But Illich’s critique is not limited to professionalism in poor countries. In his newest book,4 he also questions what we in the advanced countries have done to ourselves, thanks to our inordinate fascination with technology. To what end, he asks, do we develop all our medical technology but to delay grotesquely deaths which are inevitable? To what end our staggeringly expensive hospitals, which produce as much ill-health as health? To what end our inefficient schooling systems, which cannot even teach simple reading to our children?
One may not agree with these positions, but Illich at least looks at things with a fresh eye, and a remarkable acuteness for contemporary ailments. He writes, in one inspired passage: “We talk about the manufacture of housing or the delivery of medical care; people are no longer regarded as fit to heal or house themselves. . . . Instead of learning how to nurse grandmother, the teenager learns to picket the hospital that does not admit her.”
Illich does not actually expect us to learn to house or heal ourselves the way modern builders or doctors do. The issue is not only that the professions are inadequate, but that modern society has set false goals—more commodities, more comfort, escape from pain. He argues rather that we must all, developed and developing countries alike, reduce our requirements and move toward ascetic subsistence.
No one can deny the attractiveness of this vision of self-dependence, in which the experts are dismissed, not because man has become so expert himself but because he has lowered his expectations to the point where he no longer has need of them. And certainly part of the vision we can all adopt and learn from. Who knows how many doctors have been made superfluous by jogging (though Illich would prefer the elimination of enough modern transportation so as to make bicycling or walking necessary)? But there would appear to be limits to this process of reduction of needs.
Take, for example, his image of the picketing teenager. Certainly she (or he) could nurse grandmother as well as—and perhaps even better than—the hospital or nursing home, despite their array of professionals, from doctors specializing in gerontological medicine to practical nurses. But what would convince her to do it, since it is no longer regarded as a duty, and she can assuage her guilt by assuring herself that others, paid for by the state, will do it better anyway? There seem now to be social limits to taking care of our own needs ourselves, or reducing those needs to the point where we can take care of them without experts. There may also be technical limits. It is not only that needs have expanded, but that old needs are met in new ways, and I for one find it difficult to see how, except for small groups, we can work our way back to becoming a more ascetic, needs-reducing society. Nor do I know how we could convince developing countries not to expand their needs, or, for that matter, how their present needs can be met without the help of those techniques we are being urged to abandon. Still, if Illich’s vision of a world without professionals seems impossible of attainment, it has contributed to a useful criticism of professionals.
None of the professions—not even doctors and engineers, with their seemingly secure scientific credentials—has escaped the current attack on professionalism. But the most hapless of all professionals in the present climate have been those belonging to the so-called “helping professions,” the social workers, psychologists, guidance counselors, and others who run the social-service agencies dealing with people in distress of one kind or another. The scientific basis of these professions, assuming there is one, is rather shaky, resting as it does on the social sciences, which offer only sandy footing, and developmental psychology and psychiatry, which are not much firmer. The present-day social services were started by high-status volunteers, proffering philanthropy. They are now staffed by full-time professionals doing a job. If there has always been some suspicion of the motives of volunteers—what secret prurience or lust for power would lead people to involve themselves in the troubles of others?—what can the present-day professional, working for wages, hope for in the way of understanding or sympathy? Not very much, we may conclude, from Doing Good: The Limits of Benevolence,5 a collection of essays on social policy by a psychiatrist, a literary critic, a historian, and a civil-rights lawyer.
This extremely interesting work, based—as the introduction tells us—on many long discussions among the participants and outsiders, is a kind of four-way panel discussion on the subject of what can be done, at this point in our history, to ameliorate the lot of the suffering and underprivileged in America. The psychiatrist, Willard Gaylin, and the lawyer, Ira Glasser, provide most of the polemics with their running debate on the respective virtues of law and love in protecting the helpless; historical perspective is provided by the critic, Steven Marcus, who demonstrates the dreadful effects of the English Poor Laws, as reflected in the work of Dickens and Wordsworth, and by the historian, David Rothman, author of the important book, The Discovery of the Asylum, which initiated the new criticism of early American social agencies.
Rothman contributes a thoughtful analysis of the recent dissolution of the Progressive consensus, which unquestioningly assumed that the state could benefit people in need. He points out that the Progressives, unlike us, did not perceive any conflicts of interest in dealing with the problems of delinquent or neglected children. They felt free to create new kinds of institutions for these children—in particular the juvenile court, in which the ordinary protections of criminal law were lifted—because they had no doubts whatsoever that the judge and the probation officer and the social worker would know what was best for the child, even if his parents disagreed. We, on the other hand, as Rothman points out, have lost that confidence. We see conflicts of interest everywhere—among teachers and college administrations and students, between wardens and prisoners, between psychiatrists and mental patients, between attendants and mentally retarded patients, between the juvenile court and the child, between the parent and the child; the list is endless. As Rothman puts it: “We are witnessing the dissolution of the Progressive version of the community as a viable concept. . . . The very notion of a harmony of interests seems deceptive and mischievous.”
This breakdown has led us to expand the protection that the law—in particular the Constitution, under those infinitely elastic phrases, “due process” and “equal protection”—offers to all those subject to governmental action: students, children, prisoners, the institutionalized mentally ill and mentally retarded. The new mood is aptly summarized by Rothman: “. . . better to trust to the skills of a lawyer in court than to the good intentions of the state. Presupposing conflict, let the battle be fought with both sides armed.” And given the recent reputation of the state, who can object? At this point, however, the psychiatrist on the panel, Willard Gaylin, speaking in a sense for all those who have the care of the dependent in their charge, voices certain reservations.
To Gaylin, love is not merely a sentiment, but rather a biological dictate, owing to the long period of helplessness of the human child. He acknowledges that this dictate is often violated—even in the paradigmatic relations of mother and child—and that it is the less likely to operate the farther we move from the mother-child relationship. We can expect less from the grown child caring for an aged parent, less from the professional or sub-professional caring for the institutionalized. Certainly Gaylin is fully aware of the horrors in institutions that called for the intervention of civil-rights lawyers like Ira Glasser. Yet it is clear which side he is on: however hard it may be to bring about a spirit of benevolence toward those who must be cared for, he is doubtful that the law can be of much help. He admits that “there will always be the need for vigilance in recognizing the limitations of government as surrogate parents.” Nevertheless, he would limit this vigilance. “The language of rights, with its litigious and paranoid assumption that good can only be received from others by pursuit and protection of the law, must also recognize that the good that can be received from others in that way is quite limited. We cannot, for example, ‘coerce’ a parent into caring for a child. . . .”
“Paranoid” is a fighting word, and Ira Glasser, the lawyer, rises to the provocation. In any case, we expect lawyers to be combative. The New York Civil Liberties Union, which Glasser heads, represented the plaintiffs in the recent Willowbrook case, in which the State of New York was required to improve and change its treatment of 5,300 retarded inmates, and Glasser cites the case as a demonstration of the efficacy of law where all else failed. He sees no problem in the aggressive defense of the rights of the mentally retarded to the point where, as is now becoming the norm, a lawyer or his equivalent must represent a mentally retarded child in any decision made for him by parents or professionals. If Gaylin argues that the care of the dependent must be rooted in love, Glasser retorts: “. . . who loves welfare recipients or the residents of public housing? . . . The record of public charity is an unloving record of punishments, degradation, humiliation, intrusion, and incarceration.” Glasser of course does not propose that the lawyer substitute for this absent love—who ever heard of a loving lawyer?—but he is absolutely certain that the lawyer is necessary to prevent abuse.
As for what is needed beyond preventing abuse, Glasser has nothing to say, and might well take refuge in the principle, if we cannot do good, at least let us not do harm. Gaylin, as befits his professional role, thinks it is still possible to do good, though he, too, has no answer. (In the course of the discussion that preceded this book, Rothman tells us, Gaylin announced that he “finally understood the motive impulse of the adversarial movement: to substitute for the hard-nosed, belligerent, and tough-minded psychiatrist the attention of the gentle, understanding, empathetic lawyer!”)
There are no winners and no losers in the debate. To care for the dependent we need both the love and benevolence specified by Gaylin, and the legal protections specified by Glasser. The dilemma, however, is that love cannot be evoked by court order. After the psychiatrist and the lawyer, representatives of two eminent professions, have crossed swords in this debate, the job of nurturing and caring remains in the hands of humbler professionals—social-work administrators, social workers, physical and occupational therapists, recreation specialists, nurses, teachers, and, below the very lowest rungs of the professional ladder, ward attendants and jailkeepers. What of all these, who bear the brunt of dealing with the dependent, the mentally ill and retarded, the neglected, and the criminal, people whom we refuse to keep in our houses—whether the state takes them away or not—and whom we often will not allow into our neighborhoods, to live in the sad imitations of “homes” that current professional wisdom now devises for them? Glasser can be belligerent in defending the dependent against the state, because no one really minds a tough stand against the state. But is there anyone coming to the defense of those hapless functionaries who must do the job, with all its day-to-day horrors? When it appeared that the many requirements of the court order were not being carried out properly at Willowbrook, the lawyers in the case went to court to attack the good will of the state. The state official responsible for mental retardation, who was not increasing the number of attendants as fast as the order called for, pointed out that it was not easy to get people to stick to a job which involved changing diapers on a forty-year-old man.
It seems to me that in its next stage, criticism of the professions should take account of the lesser professionals and their problems in taking care of those we entrust to them, or rather, shove off on them. At that point, one suspects, the certainties of well-meaning lawyers may be moderated by an understanding of the real complexity of things. Glasser claims in this book that it was not “a lack of money, a lack of knowledge, or a lack of administrative capacity” that impeded progress in the Willowbrook case. Rather, it was “a lack of will. . . .” The problem is not as simple as that, however. It is true that money has been available, but our knowledge is inadequate, administrative capacity is always in short supply, and as for will . . . whose will? That of Governor Carey? Commissioner Coughlin? The director of Willowbrook, with her 4,000 employees, or any of the other officials responsible for placing the mentally retarded in community settings? And how do we get in touch with the wills of the lower employees who deal directly with the mentally retarded, or of the people in the surrounding communities in which they are to be placed? To refer to lack of will in such a situation is tantamount to confessing that we do not know where to go from here. But wherever we do go in dealing with these intractable problems, we will not get very far by denouncing those who do what we—the lawyers and the critics alike—would not be willing to do.
1 Love and the American Delinquent: The Theory of “Progressive” Juvenile Justice, 1825-1920, University of Chicago Press, 303 pp., $15.00.
2 The Culture of Professionalism: The Middle Class and the Development of Higher Education in America, by Burton F. Bledstein, Norton, 354 pp., $12.95; America By Design: Science, Technology, and the Rise of Corporate Capitalism, by David F. Noble, Knopf, 384 pp., $12.95.
3 University of California Press, 309 pp., $14.95.
4 Toward a History of Needs, Pantheon, 143 pp., $7.95.
5 Pantheon, 171 pp., $8.95.
The Attack on the Professions
Must-Reads from Magazine
Smeared for doing the job.
When then-presidential candidate Donald Trump famously declared his intention to be a “neutral” arbiter of the conflict between Israel and the Palestinian territories and put the onus for resolving the conflict on Jerusalem, few observers could have predicted that Trump would run one of the most pro-Israel administrations in American history.
This year, the Trump administration began relocating the U.S. embassy in Israel to the nation’s capital city, fulfilling a promise that began in 1995 with the passage of a law mandating this precise course of action. The administration also declined to blame Israel for defending its Gaza border against a Hamas-led attack. Last week, the administration shuttered the PLO’s offices in Washington.
The Trump administration’s commitment to shedding the contradictions and moral equivalencies that have plagued past administrations has exposed anti-Zionism for what its critics so often alleged it to be.
This week, Department of Education Assistant Secretary of Education for Civil Rights Kenneth Marcus announced his intention to vacate an Obama-era decision that dismissed an alleged act of anti-Semitism at Rutgers University. Marcus’s decision to reopen that particularly deserving case has led the New York Times to publish an article by Erica L. Green full of misconceptions, myths, and dissimulations about the nature of the anti-Israel groups in question and the essential characteristics of anti-Semitism itself.
In reporting on Marcus’s move, Green declared the education activist and opponent of the Boycott, Divestment, and Sanctions (BDS) movement a “longtime opponent of Palestinian rights causes,” a designation the paper’s editor felt fine printing without any substantiating evidence. You could be forgiven for thinking that BDS itself constituted a cause of “Palestinian rights” and not an international effort to stigmatize and harm both Israel and its supporters. If you kept reading beyond that second paragraph, your suspicions were confirmed.
Green contended that Marcus’s decision has paved the way for the Education Department to adopt a “hotly contested definition of anti-Semitism” that includes: denying Jews “the right to self-determination,” claiming that the state of Israel is a “racist endeavor,” and applying a double standard to Israel not “expected or demanded of any other democratic nation.” As Jerusalem Post reporter and COMMENTARY contributor Lahav Harkov observed, this allegedly “hotly contested definition” is precisely the same definition used by the International Holocaust Remembrance Alliance. In 2010, the IHRA’s working definition was adopted almost in total by Barack Obama’s State Department.
Green went so far as to say that this not-so-new definition for anti-Semitism has, according to Arab-American activists, declared “the Palestinian cause anti-Semitic.” So that is the Palestinian cause? Denying Jews the right to self-determination, calling the state of Israel itself a racist enterprise, and holding it to nakedly biased double standards? So much for the two-state solution.
Perhaps the biggest tell in the Times piece was its reporters’ inability to distinguish between pro-Palestinian activism and anti-Israeli agitation. The complaint the Education Department is preparing to reinvestigate involves a 2011 incident in which an event hosted by the group Belief Awareness Knowledge and Action (BAKA) allegedly imposed an admissions fee on Jewish and pro-Israel activists after unexpected numbers arrived to protest the event. An internal email confirmed that the group only charged this fee because “150 Zionists” “just showed up,” but the Obama administration dismissed the claim, saying that the organization’s excuse—that it expected heftier university fees following greater-than-expected attendance—was innocuous enough.
Green did not dwell on the group, which allegedly discriminated against Jews and pro-Israeli activists. If she had, she’d have reported that, just a few weeks before this incident, BAKA staged another event on Rutgers’s campus—a fundraiser for the organization USTOGAZA, which provided aid to the campaign of “flotillas” challenging an Israeli blockade of Gaza. USTOGAZA’s links to the Turkey-based organization Insani Yardim Vakfi (IHH), which has long been associated with support for Hamas-led terrorist activities, rendered the money raised in this event legally suspect. Eventually, as Brooke Goldstein wrote for COMMENTARY, even BAKA conceded the point:
After community members demanded that Rutgers, a state-funded university, hold an investigation before handing over any money to USTOGAZA, the school responded by offering to keep the money raised in an escrow account until a suitable recipient could be found. In June 2011, BAKA sent out an e-mail admitting the University had, after “much deliberation” and despite their initial approval, “decided that they are not willing to release the funds to the US to Gaza effort” due to concerns of being found liable for violating the material-support statutes.
Rutgers prudently limited BAKA’s ability to participate in on-campus events after these incidents, but the organization that took their place—Students for Justice in Palestine (SJP)—is no better. The Times quoted officials with the Center for Law and Justice who praised Marcus’s move and cited SJP as a source of particular consternation, but the reporters did not delve into the group’s activities. If they had, they’d find that the organization’s activities—among them declaring that “Zionists are racists,” supporting anti-Zionist individuals despite credible accusations of child abuse, and endorsing Hamas’s governing platform, which labels the entire state of Israel “occupied territory”—fits any cogent definition of anti-Semitism. This is to say nothing of the abuse and harassment that American Jews experience on college campuses that play host to SJP’s regular “Israel apartheid weeks.”
Some might attribute the Times’ neutral portrayal of groups that tacitly support violence and people like Omar Barghouti—an activist who “will never accept a Jewish state in Palestine” and has explicitly endorsed “armed resistance” against Jews, who he insists are “not a people”—to ignorance, as though that would neutralize the harm this dispatch might cause. But the Times piece has emboldened those who see Israel’s Jewish character as a threat both to its political culture and our own. That worrying sentiment was succinctly expressed by New York Magazine’s Eric Levitz: “You don’t have to be a staunch supporter of the Palestinian cause to question Israel’s right to exist as a Jewish state.”
The benefit of the doubt only extends so far. Even the charitably inclined should have discovered its limits by now.
Choose your plan and pay nothing for six Weeks!
For a very limited time, we are extending a six-week free trial on both our subscription plans. Put your intellectual life in order while you can. This offer is also valid for existing subscribers wishing to purchase a gift subscription. Click here for more details.
A conservative rethinks race and policing.
A week since an off-duty Dallas police officer shot and killed her neighbor in his own home, numerous unanswered questions bedevil investigators. Among them: How and why did the officer, Amber Guyger, end up in a different flat than her own that night? Did she mistake his apartment for hers, as she has claimed, or did she force her way inside, as some eyewitness reports seem to suggest?
More questions: Did the two neighbors know each other? Was there bad blood between them from the past? Or were they like two strange vessels floating in dark waters, the one accidentally ramming the other and sinking it? What really transpired between shooter and victim in that bleak, brief, and irrevocable instant that extinguished the life of Botham Shem Jean—a professional, a stalwart of his church, a black man, a human being?
America’s adversarial system of justice will, I expect, answer most of these questions in due course. But one fact is already inescapable: Even within the four walls of his castle, his home, Jean was not safe from undue police violence. As a CNN observer argued recently, even “living while black” can, well, end black men’s lives. And this should impel those of us on the right to drop the tendency to reflexively rally behind law enforcers in such cases and our corresponding tendency to dismiss claims about racial injustice in our system.
The arguments in favor of these reflexes are well-known to me. I know that day-in, day-out, legions of American law enforcers risk their lives to protect and to serve. That the vast majority of these men and women aren’t power-tripping bigots or trigger-happy lunatics. That, on the contrary, these are well-trained but fallible human beings, whose job requires them to make snap judgments in which life and death are at stake.
I know, too, that street thuggery and “black-on-black crime” are, statistically speaking, the far greater menace to African-American lives than potentially fatal encounters with the police. That often the police officers doing the shootings, whether justly or unjustly, are themselves black or Hispanic. That family members of those unjustly shot by police have many legal means for making themselves whole.
I know that in some of the most notorious cases, the suspects increased the danger to their lives by acting foolishly, defying verbal commands, and so forth. And I know that if officers feel too hamstrung by litigation or public scrutiny, it may actually cause them to become less vigilant in enforcing the law, thereby putting yet more black lives at risk.
All of this is true. I know these arguments through and through, and I have often made them, in these pages and elsewhere. And yet, and yet, there is the inescapable fact that, one night, Botham Shem Jean came to his own apartment, probably seeking a few hours’ shuteye after a long day at work, only to be shot and killed by an off-duty officer with questionable, if not outright malicious, judgment. One minute, Botham Shem Jean was a beloved son, coworker, and church member. The next minute, he was dead. And for what? Who can say?
The typical arguments marshaled in favor of the policing status quo can’t, and shouldn’t, be used to justify or pooh-pooh the raw, awful reality of this violation. Neither procedural safeguards nor statistics about black criminality should deafen us to the cries for substantive justice that ring out from the African-American community when a black man is shot within the four walls of his own home by an intruder with a badge.
Nor should conservatives harden their hearts when African-Americans point to the persistence of a certain racial pattern in these violent encounters. Assuming Guyger’s account is true, for example, did she instantly assume she was facing a “burglar” owing to the color of Jean’s skin? If so, is that evidence that implicit bias exists? We can’t yet be sure. Officer fatigue, bad lighting, a misunderstanding, the coarseness and alienation of American urban life—all of these may have been a factor. All could mitigate or extenuate Guyger’s culpability.
But the point is this: After Botham Shem Jean, conservatives should be a little less quick to insist that we don’t have systemic problems. I know I will.
Choose your plan and pay nothing for six Weeks!
For a very limited time, we are extending a six-week free trial on both our subscription plans. Put your intellectual life in order while you can. This offer is also valid for existing subscribers wishing to purchase a gift subscription. Click here for more details.
The Elon Musk problem.
No one has ever mistaken me for a business writer. Show me a balance sheet or quarterly report, and my eyes will glaze over. Bring up “chasing alpha” at the bar, and I’ll ask for the check and give you the old Irish goodbye. Business chatter—the kind you can’t help but overhear from young stockjobbers at the gym and bloaty middle managers on the Acela—bores me to tears. I’m especially allergic to the idea of “The Market” as an autonomous, anthropomorphic entity with a unitary will and mind of its own.
But even I can tell you that Elon Musk is imploding.
The latest omen came Friday when footage of the South African-born magnate smoking a fat marijuana blunt dropped online. The video is worth watching; the Guardian has the key bits from the 150-minute interview (do people really watch interviews this long?).
Rogan, whose fame has been a mystery to many yet is an inescapable fact of our online lives, offers the joint to Musk but is quick to add: “You probably can’t [smoke it] because of stockholders, right?” (On second thought, I think I know why Rogan is famous—because he knows how to push his subjects’ buttons.)
“I mean it’s legal, right?” Musk replies.
And so Elon Musk—the founder of an electric-car company worth $50 billion and a rocket company worth $20 billion—presses the blunt between his lips and takes a drag. He washes it down with a sip of whiskey on the rocks.
“I’m not a regular smoker of weed,” Musk says a few minutes later. “I almost never [smoke it]. I mean, it’s it’s—I don’t actually notice any effect.” His speech by now is noticeably more halting than it has been earlier in the interview. “I know a lot of people like weed, and that’s fine. But I don’t find that it is very good for productivity.”
The Market was not amused. News of two senior Tesla executives quitting their jobs broke soon after the interview appeared. Tesla shares slid 8 percent. On Twitter, where he competes with President Trump for the World Megalomaniac Award, Musk tweeted out his Rogan interview, adding: “I am a business magnet.” Perhaps he was still coming down.
These disasters follow the summer’s going-private fiasco. In early August, Musk claimed he had secured the vast funding needed to take his company private and then did a switcheroo. Tesla short-sellers, whom Musk constantly tries to show up, were vindicated. The Market got angry; shares slid.
“Moving forward, we will continue to focus on what matters most,” Musk wrote in a statement to investors two weeks later, “building products that people love and that make a difference to the shared future of life on Earth. We’ve shown that we can make great sustainable energy products, and we now need to show that we can be sustainably profitable.”
That apparently entails shooting the THC-laden breeze with Joe Rogan for two and a half hours.
The question now is: How did Musk ever get so big in the first place? There were many Tesla-skeptics, of course, chief among them those very short-sellers. They were onto something, perhaps because they sensed that a sound inventor-investor-executive would be more concerned with producing a reliable, profitable, non-subsidized automobile than with . . . showing up short-sellers. Even so, Tesla shares climbed and climbed. Even now, after Friday’s Harold and Kumar routine, the stock is trading north of $260.
Two explanations come to mind. The first is that, after Steve Jobs’s death, Wall Street and Silicon Valley types were seeking the next Eccentric Visionary to whom they could hitch their dreams. And Musk was straight out of central casting for Eccentric Visionary. Ending climate change. Colonizing Mars. Super-trains linking cities across vast distances. Everything seemed possible with him. Who knows, maybe the hopes were well-placed at one point, and the adulation went to the man’s head?
The second explanation, which needn’t be mutually exclusive with the first, is ideology. So much of Musk’s business reputation rested on his claims of solving climate change and other planetary crises that loom large in the minds of the Davos crowd. Musk embodied the ideological proposition that no modern problem eludes solution by noble-minded technocratic elites. The Market, it turns out, was as prone to magical thinking as any of the rest of us.
Clarification: News of the Tesla executives’ departure broke following Musk’s pot-smoking interview, but at least one of the departures had been finalized earlier this week.
Choose your plan and pay nothing for six Weeks!
The course the West followed has been a disaster.
The West has squandered the last, best opportunity to rid the world of the criminal regime in Syria.
Damascus was designated a state sponsor of terrorism in 1979, and it has lived up to that title every year since. Syria’s descent into civil war presented several opportunities to dispense with the despot in Damascus and avert a crisis in the process, but they were all ignored. As I wrote for National Review, Syria is a case study in the perils of ideological non-interventionism. The results of the West’s over-reliance on covert action, outsourcing, and diplomacy in Syria is arguably the worst-case scenario.
Had Barack Obama not abandoned his infamous “red line” in 2013, the U.S. might have preserved the 100-year prohibition on the battlefield use of chemical weapons. The collapse of that taboo has been rapid and terrifying. In the years that followed, chemical arms have been regularly deployed in Syria, and rogue powers have been using complex nerve agents on foreign (even allied) soil in reckless state-sponsored assassination campaigns.
Ideological adherence to non-interventionism well after it had proven an untenable course of action allowed the flourishing of terrorist organizations. Some parties in the West with a political interest in isolationism deliberately confused these terrorist groups with secularist movements led by Assad regime defectors. In the years that followed, those moderate rebel factions were crushed or corrupted while Islamist terror networks, which provided a politically valuable contrast to the “civilized” regime in Damascus, were patronized and nurtured by Assad.
The incubation of terrorist organizations eventually necessitated the kind of American military intervention Obama had so desperately sought to avoid, but at a time and place not of America’s choosing and with a footprint too small to achieve any permanent solution to the crisis. All the while, a great human tide poured out from Syria in all directions, but especially into Europe. There, an influx of unassimilated migrants eroded the continent’s post-War political consensus and catalyzed the rise of illiberal populist factions.
Even as late as the summer of 2015, there was still time for the West to summon the courage to do what was necessary. In a stunning speech that summer, Assad himself admitted that Syrian forces suffered from “a lack of human resources” amid Western estimates that nearly half the 300,000-strong Syrian army had been killed, captured, or deserted. “Based on current trend lines, it is time to start thinking about a post-Assad Syria,” an intelligence source told the Washington Post’s David Ignatius. But Obama dithered still. Just a few short weeks later, Vladimir Putin, upon whom Obama relied to help him weasel out of his pledge to punish Assad for his crimes, intervened in Syria on Damascus’s behalf. That was when the greatest crimes began.
Russian intervention in Syria began not with attacks on “terrorists,” as Moscow claimed, but with attacks on covert CIA installations and arms depots; a dangerous campaign that continued well into the Trump era. The Syrian regime and its Iranian and Russian allies then embarked on a scorched-earth campaign. They bombed civilian neighborhoods and hospitals and maternity wards. They surrounded the liberated cities of Homs and Aleppo, barraging and starving their people into submission. They even targeted and destroyed a United Nations aid convey before it could relieve the famine imposed by Damascus. All the while, Moscow’s propagandists mocked reports of these atrocities, and the children who stumbled bloodied and ashen from the ruins of their homes were deemed crisis actors by Russian officials and their Western mouthpieces.
America’s strategic obligations in Syria did not diminish with Russian intervention. They increased, but so too did the danger. Early on, Russian forces concentrated not just on attacking Assad’s Western-backed enemies but on harassing NATO-aligned forces that were already operating in the Syrian theater. Russian warplanes harassed U.S. drones, painted allied assets with radar, conducted near-miss fly-bys of U.S. warships and airplanes in the region, and repeatedly violated Turkish airspace. This conduct was so reckless that, in November of 2015, NATO-allied Turkish anti-aircraft fire downed a Russian jet. On the ground, Moscow and Washington engaged in the kind of proxy fighting unseen since the collapse of the Soviet Union, as U.S.-manufactured armaments were routinely featured in rebel-made films of successful attacks on Russian tanks and APCs.
In the years that followed this intensely dangerous period, the Syrian state did not recover. Instead, Syrian forces withdrew to a narrow area along the coast and around the capital and left behind a vacuum that has been filled by competing great powers. Iran, Russia, Turkey, Jordan, Saudi Arabia, Qatar, the United Arab Emirates, Canada, the United Kingdom, France, Australia, and the United States, to say nothing of their proxy forces, are all competing to control and pacify portions of the country. Even if the terrorist threat is one day permanently neutralized in Syria—a prospect that today seems far off, considering these nations’ conflicting definitions of what constitutes a terrorist—the state of competition among these powers ensures that the occupation of Syrian territory will continue for the foreseeable future.
And now, the final battle is upon the rebels. On Friday, hundreds of Syrians waving the “independence flag” poured into the streets of Idlib, the last of the country’s free cities, begging the international community to spare them from the onslaught that has already begun. The United Nations has warned that up to 800,000 people could be displaced in Damascus’s efforts to retake the rebel-held enclave, and the worst of the seven-year war’s humanitarian disasters may be yet to come.
Over the last two weeks, the United States has issued some ominous warnings. Senior American officials have begun telling reporters that there is mounting evidence that Damascus is moving chemical munitions toward the frontlines with the intent of using them against civilians. Trump administration officials have announced in no uncertain terms that they would respond to another chemical attack with disproportionate force.
In response to these threats, Moscow has deployed its biggest naval task force off the Syrian coast since 2015. Simultaneously, Russia has warned of its intent to strike “militant” positions in the country’s southwest, where U.S. soldiers routinely patrol. American forces are holding firm, for now, and the Pentagon insists that uniformed personnel are at liberty to defend themselves if they come under assault. If there is a conflict, it wouldn’t be the first time Americans and Russians have engaged in combat in Syria.
In February, Russian mercenaries and Syrian soldiers, reinforced by columns of T-72 tanks and APCs armed with 125-millimeter guns, engaged a position just east of the Euphrates River held by American Green Berets and Marines. The four-hour battle that ensued resulted in hundreds of Russian fatalities, but it may only have been a terrible sign of things to come.
Of course, a Western-led intervention in the Syrian conflict would have come with setbacks of its own. What’s more, the political backlash and dysfunction that would have accompanied another difficult occupation in the Middle East perhaps presented policymakers with insurmountable obstacles. But the course the West followed instead has been a disaster.
The lessons of the Syrian civil war are clear: The U.S. cannot stay out of destabilizing conflicts in strategically valuable parts of the world, no matter how hard it tries. The humanitarian and political disasters that resulted from Western indifference to the Syrian plight are a grotesque crime that posterity will look upon with contempt. Finally, the failure to enforce prohibitions against chemical-weapons use on the battlefield has emboldened those who would use such weapons recklessly. American soldiers will suffer the most in a world in which chemical warfare is the battlefield status quo.
American interventionists are often asked by their opponents to reckon with the bloodshed and geopolitical instability their policies encourage. If only non-interventionists would do the same.
Earlier this week, my housekeeper, Mary, arrived at work decked out in a bright red T-shirt emblazoned with a photo of Philippine President Rodrigo Duterte, who came to Israel last Sunday for a three-day official visit.
Mary was at the Knesset on Monday, one of several hundred of the approximately 28,000 Filipino workers in Israel, enthusiastically cheering her strongman president.
I asked her what she thought of Duterte–a leader who makes President Trump seem eloquent and measured, by comparison–and I was taken aback by her effusive, unhesitating endorsement: “Oh,” she enthused, “he is a very good president! The best!”
“But,” I suggested, carefully, “he says and does some pretty extreme, crazy things. Does that concern you at all?”
“Oh, no!” she collapsed in laughter. “He doesn’t mean that. It’s just his style.”
Indeed, Duterte has “style”: bragging of his intent to kill millions of Filipino drug addicts and approvingly invoking Hitler’s genocidal rampage in that context; referring to President Obama as a “son of a whore”; boasting of his parsimony in keeping multiple mistresses in low-end hotels; condoning the sexual assault of women, particularly attractive ones. And then there was the outburst during the Pope’s visit to the very Catholic Philippines in 2015, when Duterte called him a “son of a bitch” for causing a traffic jam in Manila.
Mary is not a simple woman. She is university educated, hard-working, pleasant, and respectful. And whatever makes her overlook Duterte’s thuggish tendencies should interest us all, because there are many Marys the world over supporting populist leaders and governments. Mary admires Duterte’s strength of conviction in dealing with drug dealers, addicts, corruption and Islamic extremists.
Human rights activists and journalists, of course, see only a brute who visited Israel to shop for weapons and defense capabilities, which would be put to questionable use. Then again, Duterte is hardly the first and far from the only unsavory ruler to come shopping in Israel, America, or elsewhere, for arms.
Israel deftly managed the visit and its optics. While many were disgusted that the PM and President Rivlin gave Duterte an audience, according him a legitimacy and respect he does not deserve, their meetings were brief and their remarks carefully calibrated.
In addition to acknowledging his personal gratitude to the Filipino caregiver who was a companion to his father in his final years, Bibi reminded us all of the enduring friendship the Philippines has shown Israel, and Jews, for decades. Prior to WWII, then-President Manuel Quezon made 10,000 visas available as part of an “open door” policy to accommodate European Jewish refugees. Ultimately, only 1,300 were used, owing to the Japanese invasion, which closed off escape routes.
In 1947, the Philippines was the only Asian country to vote in support of the UN Partition Plan, providing critical momentum toward the creation and international acceptance of the Jewish state one year later. These are important historical events about which Bibi, quite rightly, chose to remind us all.
I am no cheerleader of dictators and thugs, but I do wonder why the morality of many who objected to the Duterte visit is so selective. Israel, like all Western nations, has relations and ties with many countries led by dictators and rulers far more brutal than the democratically elected Duterte.
Much ado has been made in recent months of Bibi’s meetings with a number of right-wing populists, and worse. Some link it to what they see as disturbing anti-democratic tendencies in his own leadership of late. Others, myself included, read it as a careful effort to maintain and cultivate as many international relationships as possible that may enhance Israel’s strategic and economic interests, particularly in this period of extreme global political, economic, and institutional instability.