Most professions rest on the expertise of their members. Professionals are professionals primarily because they know more than most of us about something of importance to our society or to many members of it. Professionals are given power, respect, prestige, and above-average incomes. If professionals are worthy of this status, it is largely because of their special knowledge and the way they use it. And if professionals have special rights and responsibilities, it is also primarily because of the social positions they occupy due to their presumed expertise.
I hope the broad claim that most professions rest on expertise is plausible. For I plan to embark on an exploration of the ethics of expertise under the assumption that the ethics of expertise will throw at least some light on professional ethics. But I cannot justify this basic assumption here, nor can I enter into a discussion of whether various professions really do have the knowledge on which to base their claims. Instead, I must simply state that I believe that understanding the expert-layperson relationship will provide an important part of the explanation of why we need to consider professional ethics at all. An ethics of expertise should also provide common denominators for the ethics of many different professions. Perhaps the differences between the ethics of different professions can also be traced in part to differences in various forms of expertise.
As my title indicates, I do not have a fully developed ethics of expertise. I will, however, offer some reflections on the form of an ethics of expertise and then propose maxims for inclusion in such an ethics. But before I can do either, I must explain what I take the expert-layperson relationship to be, since the features of an ethics of expertise will flow out of the nature of that relationship. So, bear with me while I discuss the epistemology of expertise. There will be an ethical payoff for these labors. Or that is my hope, at any rate.
I find myself believing all sorts of things for which I do not possess evidence: that acid rain and global warming are things to worry about; that my house will not be safe unless it is rewired; that we have no sure way to dispose of high-level nuclear waste; that my irregular heartbeat is a premature ventricular contraction and hence nothing to worry about; that my son's failure to do well in school is a sign of insecurity and hence is something to worry about; that mass media and increasing mastery of the techniques of persuasion threaten democracy; that money in my retirement account is safely invested so that it will be there when I retire.
The list of things I believe without having the evidence for them could be extended indefinitely. And I am finite. I might be able to gather the evidence necessary to support one of these beliefs. But I could not gather the evidence that supports all of them. Too much is known; the evidence is too extensive; much of it is available only to those with special aptitudes and skills honed over years of study and practice. And I lack the competence, the skills, the sheer intellectual capacity, as well as the time. Usually, I lack the ability to critically evaluate the merits of evidence presented to me. Often I cannot even understand it. Thus, if any of the beliefs I have just mentioned is a rational belief, it is not because I possess the evidence to justify it. It is because I believe, with good reason, that others possess the necessary evidence.
I am not alone in this. More is known than anyone can know by herself, and this forces intellectual specialization: those who invest the necessary time, energy, and focused attention in a particular area come to have better opinions than those who have not similarly invested themselves in that area. I do not see how we can deny the epistemic superiority of expert opinion without denying the efficacy of inquiry with respect to the relevant subject matter. If inquiry is efficacious, more inquiry is likely to lead to a more reliable opinion. And we generally do acknowledge our need for more reliable opinions than our own: this acknowledgement is the force that creates social support for all kinds of "knowledge industries" -- including this book.
As I understand it, then, the idea of expertise is based on an epistemological inequality between two opinions -- an expert opinion and a non-expert or lay opinion. (I will use "non-expert" and "layperson" interchangeably.) Within her area of expertise, an expert's opinion is better than a non-expert opinion. By "better," I mean more reliable, but not necessarily correct. Even though it is not necessarily correct, an expert opinion is still better than a lay opinion because it is less likely to be mistaken and likely to be less mistaken. Areas in which expert opinion exists and is available are areas in which one ought not to make up one's own mind -- without first becoming an expert. For, to insist on relying on the evidence that one can gather or even assess for oneself would be to hold relatively crude, unchecked, unreliable, and therefore irrational beliefs.
Although it matters greatly to me that many of my beliefs rest on trust in others, my beliefs about most matters are of no great import . . . except to me. But our leaders, our government officials, our public and corporate policy makers are in a similar epistemic position. They, too, must rely on expert opinion. They, too, are often not able to assess the merits of the arguments given them by experts. And that is a matter of much greater import. Trust in experts goes on -- and must go on -- even at the highest levels.
So far, I have been talking as if non-experts were the only ones who must trust the experts within a given field. But this is not the case. Expertise rests on specialization, so an expert about one thing will be a layperson about another. Even more important: experts become experts and remain current in their own fields primarily by assuming the role of laypersons and accepting the testimony of other experts, standardly in the form of books and journal articles. No one could be competent in, say, physics or automobile repair, medicine or psychology, if she insisted on accepting only propositions which she had verified for herself.
An attempt at epistemic self-reliance -- even by the experts within their own fields of expertise -- would be sheer folly. Experts, too, must rely on others to possess the evidence for their beliefs within their disciplines or even their subspecialties. In a culture as complex as ours, even knowledge is often unavoidably based at least partly on trust in the testimony of other experts.
I hasten to add that this does not mean that scientists and scholars have only a poor, second-best substitute for direct first-person evidence. It means that "hearsay evidence" is often the best evidence anyone could have. For example, an attempt to measure the lifespan of charm particles takes about 300 person-years and is such a complicated process that no one university has physicists with all the competences necessary to run the experiment. Any individual physicist attempting to measure the lifespan of charm particles without relying essentially on the testimony of others could produce only a hopelessly inept and crude measurement, no matter how brilliant and industrious she might be.
Even experts must, then, rely on what they are told by other experts, even within their own disciplines. This leaves us with a terminological difficulty: it seems perversely wrong-headed to call a full professor of chemistry at Harvard a layperson in chemistry. But all chemists can be competent only by accepting the opinions of other chemists and by deferring to their judgment -- i.e., by taking the role of laypersons. Even a top-flight chemist cannot validate for herself all of the opinions she accepts in chemistry, for reasons not only of time, but also of competence. Indeed, she standardly will not be able to validate for herself even the premises on which her own research rests. In the following, I will continue to talk in terms of "the expert-layperson relationship." But we will need to remind ourselves that the role of a layperson is not limited to those outside a field of expertise.
The relationship between expert and layperson is grounded on an epistemic inequality. The expert knows more than the layperson about matters within the scope of her expertise. And if the layperson appeals to the judgment of the expert, he usually does so because he acknowledges the superiority of the expert's judgment to his own. Thus, the epistemology of the expert-layperson relationship can be focused on the concept of rational deference to epistemic authority. This rational deference lies at the heart of the particular form of power that an expert has and at the center of the particular form of vulnerability in which each of us, as a layperson, stands.
The concept of rational deference presupposes that the layperson appeals to the expert and acknowledges the rational authority of the expert. Obviously, someone must appeal to the expert, some layperson must acknowledge the authority of the expert, or the expert's knowledge will be socially irrelevant and useless. So, the ideas of appealing to epistemic authority and rational deference may suffice for the epistemology of expertise. But when we turn to the ethics of expertise, we must not forget that often an expert's expertise is applied to those who have not appealed to the expert. They may well not even agree that the opinion in question represents a form of expertise. Indeed, expert opinion alters many people's lives without their knowledge, much less their consent.
If we limit ourselves to considering beliefs in propositions for which reasons can be given, the essential structure of rational appeals to the authority of experts is this:
If A has good reasons to believe that B has good reasons
to believe p, then A has good reasons to believe p.
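This schema can be compressed into a shorthand of my own (it is not a standard notation): write $R_x\,q$ for "$x$ has good reasons to believe $q$."

```latex
% Shorthand (my notation): R_x q abbreviates "x has good reasons to believe q"
% The structure of rational deference then reads:
R_A(R_B\,p) \;\Rightarrow\; R_A\,p
```

Nothing in the argument turns on the notation; it merely makes the nesting of reasons easy to keep track of in what follows.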
Let's look more closely at this structure of rational deference. A has good reasons to believe that B has good reasons to believe p. It is B's good reasons that make B an expert and distinguish her authority -- epistemic authority -- from other forms of authority and other grounds for deference.
And A defers to B because he believes that she has good reasons; it is these good reasons that make her opinion count in his eyes. In fact, A defers to B because he believes that she has better reasons than he has -- or even could have -- by himself. Clearly, the whole point of appealing to the testimony of others is that they know things we don't. If this were not the case, basing belief on the testimony of others would be pointless at best, hence nonrational or irrational. The appeal to B must be able to strengthen A's reasons for believing p.
Nor is this conclusion based entirely on the limited competence of A about p. As we have seen, even the most knowledgeable people in many specialties must base their claims to expertise and their own work primarily on appeals to the knowledge of others in their field. If the best reasons for believing p are sometimes primarily testimonial reasons, if knowing requires having the best reasons for believing, and if p can be known, then knowledge will also sometimes rest on testimony.
Basing belief on expert testimony has potential problems as well as strengths, however, and they arise from the same feature of testimony: in order for testimony to be useful, A cannot already have B's reasons. So, if A accepts p on B's say-so, those reasons (B's reasons) which are necessary to justify p and A's belief that p are reasons that A does not have. Sometimes it is feasible for B to share with A all the evidence necessary to justify the claim that p. But usually not. Indeed, if A and B come from different disciplines or even different specialties within the same discipline, A often will not know what B's reasons are, much less why they are good reasons for believing p.
Thus, there is a certain blindness in A's belief that p, since he lacks much or all of the evidence that justifies the claim that p. The vulnerability of laypersons who rely on experts thus grows out of the same features that justify appeals to experts in the first place: A doesn't know what B knows. And since he doesn't, his ability to check up on what she says, to rationally criticize her opinions, or even to answer back effectively is really quite limited. I may suspect that my cardiologist is incompetent, that she is not being honest with me, or that she is ordering unnecessary tests. But I would have to know much of what cardiologists know in order to rationally confirm or dispel my suspicion.
Given all this, what can we say about layperson A as he appeals to expert B? A's position is really this:
1) A knows that B says p.
2) A has good reasons to believe that B (unlike A) is in a position to know what would be good reasons to believe p and to have the needed reasons.
3) A believes (and has good reasons to believe?) that B is speaking truthfully, that B is saying what she believes.
4) A believes (and has good reasons to believe?) that B actually has good reasons for believing p when she thinks she does.
At this point, two things about B and her contribution to A's good reasons are obvious but important to note.
1) Unless B is likely to believe what she is saying, her knowledgeability about p will not give A good reasons to believe p. Thus A's reasons must include the (implicit) belief that B is truthful or at least being honest in this situation. This truthfulness is no simple or single thing to assess; it can involve a good deal of B's character. This becomes clear if we consider various reasons that an expert might not be honest. In certain situations, B cannot be relied upon to be honest if she is greedy, or cowardly, or too concerned about her status and reputation, or lazy, or too friendly with some who have important interests at stake in her testimony.
2) Even B's truthfulness will not give A good reasons to believe p if B believes she has good reasons when she does not. Experts are generally most knowledgeable about what constitutes good reasons in the domain of their expertise. So that's not the main issue. B must, of course, be competent, keep herself up to date, and she must have done her own work carefully and conscientiously. But the point I wish to emphasize is this: A's reasons must also include the (implicit) belief that B is not subject to epistemic vices -- the tendency to deceive herself about the extent of her knowledge, for example, its reliability, or its applicability to whether p.
If questions about B, about her truthfulness, her competence, or her appreciation of the limits of her knowledge, arise in A's mind, he can, to an extent, check up on B. He can, for example, find out about B's reputation in her field. But generally one must know what someone in B's field knows in order to know whether B is competent and whether her reasons are, in fact, good reasons to believe p. A, as a non-expert, cannot know this without first becoming an expert himself. Thus, although B's claims are subject to cross-checking, that cross-checking must generally occur within the community of experts. Therefore, A's attempt to check up on B cannot avoid reliance upon experts. It only refines and extends it. The way in which a layperson can check up on an expert can be expressed as follows: A has good reasons to believe that C has good reasons to believe that B has good reasons to believe p.
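In a shorthand of my own (writing $R_x\,q$ for "$x$ has good reasons to believe $q$" -- not a standard notation), the layperson's check on an expert only deepens the nesting of reasons:

```latex
% R_x q abbreviates "x has good reasons to believe q" (my shorthand)
% Checking up on B via a further expert C extends, rather than
% eliminates, reliance on experts:
R_A(R_C(R_B\,p)) \;\Rightarrow\; R_A\,p
```

The outermost reason remains A's trust in an expert; cross-checking refines that reliance but never discharges it.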
A's good reasons must include beliefs that B is of a certain character. A judgment about B's character, both ethical and epistemic, must be part of A's good reasons for believing p. (Or, if A appeals to C to check on B's personal reliability or professional competence, then A's good reasons must include a similar assessment of C's character. This becomes plain if we think about the phenomenon of physicians covering for incompetent or unethical physicians.)
Thus, even our attempts to be rational rest on an ineluctable element of trust. Trust in persons and the trustworthiness of those persons is epistemically basic and inescapable. Because more is known than anyone can know by himself, we must trust experts. And we must trust them as persons, not just rely on their knowledge, since their honesty and their lack of self-deception about the limits of their knowledge are always at issue. Those who do not trust cannot be fully rational -- they often cannot have the best evidence for their beliefs. Those who do not trust often cannot know. Those who do not trust usually cannot be experts; they cannot be competent members of most professions.
And yet, this trust in experts is, like any trust, subject to abuse. There are untrustworthy experts and also many who fraudulently or mistakenly claim more expertise than they possess. There are many cases where experts have proven to be inaccurate, biased, of limited vision, and even dishonest. And sometimes we are, in fact, so vulnerable that we can't even tell whether the experts are serving us well or merely enriching themselves, whether they are helping us or leading us down a path to very serious problems later on.
Because trust in persons is epistemically basic and yet can be abused, there must be an ethics of expertise. There must be principles or maxims for the expert to follow -- so as not to abuse the power inherent in expertise or to undermine the attempts of laypersons to be rational. Following the principles or maxims of this ethics would be part of what makes an expert trustworthy and her testimony reliable.
But the layperson is an agent, too, his vulnerability notwithstanding. So there is also an ethics of appealing to experts -- principles or maxims defining what constitutes an ethical appeal to experts. A complete "ethics of expertise" would, then, be composed of two branches -- maxims for experts and maxims for those appealing to experts. This ethics of expertise would have epistemological, as well as ethical import: in cases where our rationality rests on trust, the epistemological is not separable from the ethical. For these maxims would be maxims for arriving at rational belief in cases in which expert opinion exists.
There is obviously going to be some play in the connection between the ethics of expertise and the rationality of trusting experts. Occasional lapses from the practice of the ethics of expertise would not ordinarily make an expert's testimony completely untrustworthy. Similarly, a few untrustworthy experts would not undermine the rationality of appeals to the authority of that kind of expert, not even if the layperson cannot tell which experts are untrustworthy.
Nonetheless, we must recognize that the temptations to render untrustworthy testimony will often be greatest when the experts' testimony counts for most. Moreover, the result of widespread failure to practice the ethics of expertise would be a culture that would not be able to utilize its own knowledge, due to pervasive distrust of its knowers. If its knowers are indeed often untrustworthy, it might well be rational for a culture to refuse to rely on them. But that distrust would bar the culture from arriving at maximally rational beliefs and decisions.
What would an ethics of expertise look like? An ethics for experts must be an ethics which acknowledges that where there is expertise, knowledge is not in fact open and accessible to all, an ethics which recognizes that the expert's reasons cannot be checked by the layperson and often will not even be intelligible to him. It must be an ethics sensitive to a kind of power in knowing, a power unlike that of any of our epistemic peers, whose opinions we can usually test for ourselves. It must also, then, be an ethics sensitive to the very basic vulnerability that comes with deciding to let others make up our minds, for that kind of reliance on others undermines even the internal independence necessary to decide not to accept what the other says.
The ethics of expertise will not, then, rest on a metaethics applicable to relationships among equals. For example, a contract theory that presupposes that each party will look after her own interests will not be adequate to expert-layperson relationships. Since the layperson has laid down a portion of his autonomy in deferring to the expert, the ethics of expertise cannot be just an ethics of voluntary agreements, not even when expert and layperson begin their relationship with a voluntary agreement. The ethics of expertise must also acknowledge that laypersons often have no real choice but to rely on an expert -- e.g., when someone needs open heart surgery or when his car is broken down far from home.
There must be an element of benevolence in the ethics of expertise. It must be an ethics of fidelity to laypersons' interests, of service to non-experts. At the very least, it must be an ethics of fairness to the often unprotected interests of laypersons.
Despite this need for loyalty to the interests of laypersons, an ethics of fiduciary relationships is not adequate for the ethics of expertise. Medical ethics and legal ethics have usually been constructed on the basis of the claim that in exchange for a client's trust, a physician or lawyer is to pursue the best interests of that client and no one else. However, a strict fiduciary ethic requires that we be able to identify a single client to whom the expert owes fidelity and that this client's interests not conflict with others to whom the expert also owes moral consideration qua expert. Given the conflicting interests of various stakeholders, there is, for most experts, no single answer to the question, To whom is fidelity owed? Despite its widespread acceptance within medical ethics, a strict fiduciary model is not adequate even in medicine.
Finally, the ethics of expertise must resist a certain individualistic cast present in much popular American thought about ethics: an ethics for experts cannot be limited to concern for one's own conduct. Instead, experts must also be concerned about the conduct of other experts in their fields; they have a professional responsibility to "blow the whistle" on incompetent or unethical experts. This responsibility arises from the fact that often only another expert in the same discipline can really judge the trustworthiness of expert testimony.
The form of an ethics of expertise is still unclear. So, instead of pursuing these reflections on the form of this ethics, I will proceed to a list -- admittedly an unsystematic list -- of maxims for experts and for those who appeal to the authority of experts. The maxims may well be more important than the metaethical form, anyway. And the process of examining and refining a proposed set of maxims may also help to clarify the form required by an ethics of expertise.
Before moving to specific maxims, I must make two general points about maxims for an ethics of expertise. First, I do not intend a legalistic ethics; I think of the maxims as guidelines for responsible professional behavior, not as hard and fast principles specifying moral obligations. Second, I do not take these maxims to be new discoveries on my part. Rather, I am attempting to formulate some of the principles that already guide the conduct of responsible experts. Reflective experts may well find little new here.
MAXIMS FOR THE EXPERT:
1) Admit when you don't know, when you're guessing, and when your opinion is only a reasonable estimate. Don't overestimate the scope or certainty of your knowledge, or the inferences that can be validly drawn from it. Refuse to give opinions when you are being asked for opinions that are beyond the range of your expertise. Distinguish cases where no one knows from those where you don't know and make proper referrals in the latter cases.
This may be a difficult maxim for experts to obey, since their sense of their professional worth and also their social status depend on knowing more than others do. Also, loyalty to the community of experts is often combined with a sense that you are letting the community down if you admit the limits of the community's knowledge. But it is an ethical vice to pretend to know more than you do; it is an epistemic vice to believe that you know more than you do.
2) Tell the truth as you see it in your professional judgment, but don't give the impression that you speak for the community of experts when you do not. When the community of expert opinion is divided, there is an obligation to say that it is. When your opinion is a minority view within the community of experts, you should make that clear.
3) Tell the truth as you see it in your professional judgment, even if you have to tell your employers, clients, or those in power things they don't want to hear. The money you are paid must not be taken by either you or your employer to mean that you will tell her what she wants to hear. Nor that you will publicly support positions or propositions that you do not believe the evidence supports.
Truthfulness is important in any relationship. But truthfulness may be even more important for experts than it is among those who are epistemic peers. The arguments of those who are in positions of relatively equal expertise can more readily be tested and evaluated. Hence, it doesn't matter as much if they are lying, inquiring in bad faith, or misrepresenting what they know. We can usually check up on what they say, and so trust in their reports is less necessary. But if your lawyer says you should plea-bargain or a group of engineers says that a reactor has adequate safety devices, it is much more difficult to find out (soon enough) whether they are being truthful.
4) Recognize the human propensity to rationalize: you will be tempted to believe what your employers or those in power want to hear you say. Where possible, make allowances for this tendency by checking your opinion against that of other members of the community of experts who operate under fewer or different incentives, or against other communities of experts.
5) Consider the effects of your statements on those who are not your employers or clients, especially if they are likely to be put at risk by the application of your knowledge, and most especially if they are likely to be put at risk without their knowledge or consent. Obligations to employers or clients do not outweigh more basic considerations of justice. You must not use your knowledge to treat others unfairly or to arm your employers to do so. Remember that only organized interests can afford your services and that the least advantaged and also unorganized interests tend not to be able to make themselves either heard or considered.
Although failure to obey this maxim would not destroy the trust necessary for the expert-employer relationship, the belief that experts serve only the interests of their employers would undermine the trust of everyone else in the testimony of experts. We would all then have good reason to be suspicious of expert testimony about issues in which more than one party has an interest. If, for example, the researchers Dow-Corning hires to investigate the safety of silicone breast implants see themselves as responsible only to the interests of Dow-Corning, the rest of us will have reason to suspect a whitewash.
Nor will an ethics of advocacy silence these concerns. An expert cannot simply say, "I'll represent my clients' interests; let those with opposing interests hire their own experts," for that would leave the interests of the poor and disorganized defenseless. Business ethics recognizes that you never owe absolute fidelity to the desires or interests of those who employ you (or accord status to you, etc.). But we have given less thought to what a responsible chemist, psychologist, or political scientist can and cannot do. Perhaps we've thought less about this because we tend to think of the pursuit of knowledge as a purely theoretical quest for truth, thus often ignoring the uses to which it's put. But, as a friend of mine puts it, "when a layperson wants to know whether or not p, there's almost always a reason why he wants to know."
6) Know your own ethical limits. Try to avoid positions where you might not be able to obey the above principles because you are susceptible to the temptations of the position or too afraid of the possible costs of following them. Recognize that the use of experts is dependent on a climate of trust. Don't pollute the atmosphere of social trust by abusing it for personal gain, for increased respect, or for support for your discipline.
If an expert suspects that she will be giving biased answers, she should not be working on that set of questions. Biased research has come back to haunt Dow-Corning and the Department of Energy. And each new case of untrustworthy research that comes to light fuels public suspicion that scientists and other experts are distorting their messages to serve their own interests or those of special interest groups.
Failure to practice the preceding maxims would have the long-term effect of undermining the social trust needed to sustain the epistemic authority of experts. However, we must also acknowledge that these will often be very difficult maxims to obey. Moreover, one of the prices we pay for blindness to the ethics of expertise is that we have not selected graduate students and colleagues for their character, their ethical commitment, or their ability to withstand the temptations of their positions. Nor have we made any serious attempt to give them the ethical training they might need.
MAXIMS FOR THOSE APPEALING TO EXPERTS:
The ethics of expertise is not a one-way street. A layperson is usually not simply a passive recipient of expert activity. Even if he cannot very well evaluate the testimony of an expert, a layperson remains an agent, and an important part of the ethics of expertise is the ethics of one who appeals to experts. Indeed, there are ways in which laypersons can make it more likely that experts will offer trustworthy testimony.
I would argue, then, that we need to think much more about the ethics of the recipient of all kinds of professional services -- about the ethics of the patient, the person retaining a lawyer, the analysand; about the ethics of the company or government agency hiring an engineer or chemist, a firm to do an accounting audit or environmental impact statement. The ethics of the recipient of expert opinion is also an important part of a complete professional ethics.
1) Try to find the best-qualified expert and recognize that agreement with your values, desires, policies, plans, or hunches is not a qualification for an expert. Selecting an expert whom you think will likely support your position is an epistemic vice, a form of rationalization. Selecting an expert because you know she will support your position is a form of deliberate deception (or of self-deception) and hence an ethical vice.
Appealing to experts who will support the views we already hold is a common failing, but it defeats the rational purpose of appealing to experts. If A selects an expert because he knows she will support his original views, he presupposes that he already knows enough to be able to tell whether p is true. Thus, an environmentalist should seek out the best experts on the effects of dioxin, even if they maintain it is less dangerous than it was previously thought to be.
Of course, if A will be the only one affected by decisions based on the expert's opinion, selection of an expert who agrees with his basic beliefs and values may be acceptable. But even in this special case, it is often a form of rationalization. A common example is "doctor shopping" for a physician who will not object to one's smoking or unhealthy eating habits.
Where others will be affected, if experts are hand-picked for their support of antecedently held positions, cynicism about the whole process of appealing to experts is promoted. Suppose our interests conflict or we disagree about what should be done. Then, if you are not persuaded to accept my position on the basis of what I've said, why should you be persuaded by an expert I have selected if she has been selected because she will support my position? The tendency will be for you to select "your" expert who you know will support your position. You, then, refuse to accept anything my expert says, and I refuse to accept anything your expert says. We're then back to our original position of disagreement, except that we have undermined the credibility of those who might have been able to throw genuine light on the issues we confront. Unfortunately, the use of expert witnesses in legal cases provides many clear examples of this phenomenon.
We must distinguish, then, two very different appeals to the authority of experts: 1) appeals to experts in a context of genuine inquiry, where the layperson is trying to find out whether or not p, and 2) appeals to experts to promote acceptance by others -- or even by the layperson himself -- of a position he already holds. The first is rational, an essential ingredient in achieving rational belief. The second is rationalizing, a form of deception or self-deception. As clear as the distinction between these two may be in theory, in practice these two different appeals to experts probably represent opposite poles of a continuum. And important as the distinction is to the ethics of expertise, it is seldom a simple matter to know where either you yourself or others are on this continuum. For the experts who agree with my position will always seem to me to have the best arguments.
2) Although you appeal to experts to reduce your uncertainty and to enable you to act with greater assurance, recognize that what you would like to know simply may not be known. Recognize, too, that even the best experts may be divided. Do not generate pressure on experts to pretend to know more than they do, to overestimate the relevance of what they know, or to feign consensus within the community of experts where there is none.
We usually appeal to experts in the context of decisions, and decision makers have practical and often urgent needs for conclusions. Facing the need to decide about issues involving complex and technical matters and recognizing the insufficiency of our own knowledge, it is very difficult to refrain from trying to get more information or certainty from experts than they have to offer.
3) An expert's educated guess may or may not be a sufficient basis for action. Try to distinguish these two types of situations. When a decision must be made, the educated guess of the experts will be the best basis for a decision -- an educated guess is better than an uneducated guess. Also recognize, however, that even the best, most informed judgment can be mistaken. Where relevant expert opinion exists, it is the best opinion you have. But it is not infallible. Don't expect more than you can get from an expert or try to hold an expert accountable for more than you can reasonably expect.
Experts will have better reasons than laypersons (within their domains of expertise). But better reasons are not always sufficient reasons to act -- e.g., when taking no action is an option or the decision can be postponed; when other, less uncertain alternatives exist; or when the risks or costs of their having guessed wrong are great. It seems that we must act on the problem of the depletion of the ozone layer because the risks are so high. However, we could decide to do further research before permanently disposing of high-level radioactive waste. Sweden, for example, has decided that we do not now know enough to design long-term storage facilities for high-level radioactive waste and will store its waste above ground while trying to develop more certain science and technology.
4) Recognize that experts either directly or indirectly in your employ will be tempted to tell you what you want to hear and that those trained to be experts have not been selected for their courage or their ability to withstand the heat of non-academic battle. Make allowances for that. Try to make experts understand that you want their candid assessments, not support for your position.
I think we have tried to substitute good money for good character in the experts we appeal to. The assumption is that if the expert is well paid by us, she will have no reason to lie to us, and hence we need not worry about whether she might distort her testimony if she had reason to. But the ability of good pay to substitute for good character is limited. And it brings into play new incentives for the expert. Laypersons thus need to ask what a given expert has to gain by being believed.
MAXIMS FOR THE COMMUNITY OF EXPERTS:
First, a word about the phrase "community of experts." The "community," as I understand this term, is defined by field or subject matter. It is based on disciplined attention, sustained inquiry, and consequently, better knowledge of the relevant subject. The community of experts does not, then, rest upon or require agreement. However, contemporary philosophers of science in the Kuhnian tradition have made us sensitive to the degree of conformity in both procedure and belief that communities of experts often in fact require of their members.
1) Never use rewards and punishments to stifle dissent within the community of experts.
Rewarding mere conformity or punishing disagreement would seriously compromise the community's quest for truth and hence its claim to be a community of experts. It should be equally obvious, however, that there will be a temptation to encourage conformity, both because professional consensus presents a better face to the public and also because of our natural tendency to see those who agree with us as more competent and more ethical than those who disagree.
2) Beware the gap between social expectations of your community and what your members can in fact do. Combat unrealistic social expectations. Do not attempt to generate social support for your work by overestimating what is known, what is likely to become known, or the relevance and applicability of either to practical problems.
Social acceptance of the knowledge possessed by a community of experts depends on a climate of trust in that community. Trust that has been lost or destroyed is extremely difficult to reestablish. The short-term interests of the community (and its members) may be enhanced by unrealistic social expectations, by the appearance of consensus, or by the appearance of trustworthiness. But this comes at the cost of the long-term interests of the community, which suffer when those unrealistic expectations cannot be met, disagreements surface, or shoddy conduct comes to light.
3) Take steps to ensure that your members are worthy of the social trust placed in them. Take your responsibility for certifying experts seriously. Ensure that your members remain competent and current, discipline those who misuse their positions as experts, and decertify those who are no longer fit to be included in your community. Certification, disciplining, and decertification all involve an ethical component. Those who abuse the power inherent in their specialized knowledge must be censured, penalized, and ultimately excluded from the community of experts.
Granted, however, the lines between mistakes and culpable mistakes, between mistakes and incompetence, between incompetence and disagreement, and between disagreement and improper behavior will often not be easy to draw.
4) Recognize the obligation to be a "whistle-blower" and the sacrifices that often must be made by whistle-blowers. Resist the temptation to "circle the wagons" and defend the reputation of the community by withholding information about the misconduct of members of your community. Work for institutions and social settings that minimize temptations to abuse the power of expertise and that protect those who blow the whistle on untrustworthy members of the community.
This responsibility follows from the fact that one must usually know what an expert knows in order to be able to tell whether the expert is competent and working carefully and conscientiously. The idea that experts are responsible for the competence and conduct of other experts is foreign to our individualistic orientation -- "mind your own business," "don't rat on your buddies." The idea that peer review necessarily involves ethical review is especially troubling to many who believe that ethics is all subjective. Moreover, the work of monitoring one's peers seems irrelevant to the real task of the expert -- to advance or apply knowledge in her field. But these attitudes fail to take seriously the element of trust in appeals to experts and the dependency of the community of experts on that climate of trust. They also ignore the dependency of one expert on the integrity of others in her field.
In the "Baltimore case" involving the fraudulent medical research of Thereza Imanishi-Kari, many members of the scientific community rallied prematurely to the defense of one of their esteemed colleagues, charging a "witch hunt." By contrast, Margot O'Toole -- who blew the whistle and whose charges were eventually substantially upheld -- was very nearly driven out of the scientific community. She was fired, she was unable to find another research job, she lost her house, and for a time, she was reduced to answering the telephone for her brother's moving company. Arguably, she was more severely punished than the fraudulent researcher whose misconduct she reported.
MAXIMS FOR A SOCIETY OR GROUP THAT RELIES ON EXPERTS:
1) Create settings for experts that protect those who take responsible but unpopular positions and that minimize the temptations to abuse the power of expertise. Support those experts who are conscientiously engaged in peer review.
An example of an unpopular but responsible position is that of Jerry Szymanski, a geologist, who questioned the suitability of the Yucca Mountain site for the high-level radioactive waste repository. His report has gradually been given a full hearing, despite its controversial nature.
2) Do not permit expertise to be monopolized by the wealthy or powerful or to be used as a tool of oppression or exploitation.
The danger is that money and power will command expertise and thus an appearance of rationality so skillfully woven as to be virtually undetectable. Access to the information provided by experts can itself be a tool of injustice. Therefore, although this will be especially difficult:
3) There is a responsibility to finance the education and information (through experts) of opposing and potentially opposing groups.
Unless questions of truth are to be settled by money and power, the knowledge-buying power of different groups must not be grossly unequal. To avoid that danger and to quiet suspicions about possible bias, a Martinsville, Illinois group that opposes selection of a low-level radioactive waste disposal site in its vicinity has been given $500,000 to finance the development of its own expert knowledge. (However, the site's proponents have had many millions to finance their experts.)
Obviously, the last two maxims apply both to different groups within a country and to different countries within the international community.
These, then, are the maxims that I would propose as elements in the ethics/epistemology of expertise. I am confident that the present list represents a beginning, at best. Major corrections and additions will undoubtedly be necessary. Moreover, I have said nothing about how laypersons should proceed in conditions where experts are not trusted and very little about how a layperson is to determine whom to trust in a specific instance.
In closing, let us briefly recall the purpose of trying to construct such a list of maxims: in the long run, trust in experts will be well-placed and rational to the extent that these maxims are generally followed, and misplaced, often irrational, to the extent that they are not. Thus, if something like this set of maxims were found to be acceptable for an ethics of expertise, we would face a very important question of fact: to what extent do people in our society follow this ethics of expertise? I can't answer this question. But it is worth noting that one could read this paper as providing a strong argument for not trusting the members of our professions, on the grounds that those in professions in the United States do not even come close to following this ethics.
However that may be, if the main argument of this paper is acceptable --if, that is, deference to the authoritative opinions of experts is essential to our rationality and knowledge, and if that deference unavoidably rests on trust, not only in the competence, but also in the epistemic and ethical characters of our experts -- then it is high time that we get to work on the ethics of expertise. Indeed, it is past time.
* I wish to thank Joan Callahan, E. Roger Jones, John Nolt, Dan Wueste, the members of the philosophy department at East Tennessee State University, and especially Mary R. English for helpful discussion of many of the points in this paper. Research for this paper has been supported by the National Endowment for the Humanities through a Fellowship for College Teachers and Independent Scholars and by East Tennessee State University through a Tennessee Board of Regents grant-in-aid.
 I develop this point in: Hardwig JR. Epistemic Dependence. Journal of Philosophy. July 1985: 335-49.
 See Hardwig JR. The Role of Trust in Knowledge. Journal of Philosophy. 1991: 693-708.
 For a discussion of this example, see my paper, "Epistemic Dependence," referred to above.
 For example, Rustum Roy, head of a chemistry laboratory at Pennsylvania State University, said, in response to the suggestion that research directors should be responsible for the quality of work coming out of their own labs: "Fifteen years ago I would look at all the raw data myself. Now I'm not close enough. I wouldn't catch it if someone were deliberately trying to falsify results. Every head of a large lab is in the same position. I can't even run half of my machines." (Zurer PS. Misconduct in Research; It May Be More Widespread Than Most Chemists Like to Think. Chemical & Engineering News. 1987 [April 13]: 15.)
 In cases where B's expertise rests on what she has learned from others (e.g., in the literature), rather than on evidence she herself has gathered, the structure of an appeal to her authority is this: A has good reasons to believe that B has good reasons to believe that C (and D and E) have good reasons to believe p. This may seem baroque to the point of silliness. But it is not. The chain of authority implicit in most appeals to experts is even more complex and remote than this. Our rationality rests on trust in many (the C's, D's and E's) whom we have never met.
 The reason for the parenthetical question in this statement and the next is that I am unsure about what to say about implicit trust. If A trusts B implicitly, he will often not have or even feel the need to have good reasons to believe in B's honesty. Obviously, that can be dangerous. But I think communities in which a climate of implicit trust prevails also have some real advantages over those in which good reasons for believing in the truthfulness of its experts are felt to be needed and are then supplied.
 Often we try to substitute judgments about B's situation for judgments about her character -- "B would have no reason to lie in this situation." But as we shall see, this substitution cannot be completely adequate. There are almost always reasons for the expert to exaggerate, if not to lie.
 Hardwig JR. What About the Family? Hastings Center Report. 1990 (March/April): 5-10.