Regulating “Good” People in Subtle Conflicts of Interest Situations

  • Original Paper
  • Journal of Business Ethics

Abstract

Growing recognition in both the psychological and management literatures of the concept of "good people" has caused a paradigm shift in our understanding of wrongful behavior: wrongdoings that were previously assumed to be based on conscious choice (that is, deliberate decisions) are often the product of intuitive processes that prevent people from recognizing the wrongfulness of their behavior. Several leading scholars have dubbed this process an ethical "blind spot." This study explores the main implications of the good people paradigm for the regulation of employees' conflicts of interest. In two experiments, we examined the efficacy of traditional deterrence- and morality-based interventions in encouraging people to maintain their professional integrity and objectivity at the cost of their own self-interest. Results demonstrate that while the manipulated conflict was likely to "corrupt" people under an intuitive/automatic mindset (Experiment 1), explicit/deliberative mechanisms (both deterrence- and morality-based) had a much larger overall constraining effect on participants' judgment than did implicit measures, with no differences between deterrence and morality (Experiment 2). The findings demonstrate how little is needed to compromise employees' ethical integrity, but they also suggest that a modest explicit/deliberative intervention can easily prevent much of the wrongdoing that might otherwise result.


Notes

  1. We ignore here concepts such as mistakes, which are treated under the negligence doctrine.

  2. The "good people" argument does not use the term "good" to mean "moral" or "virtuous." Rather, the focus is on garden-variety individuals who might, in various organizational settings, end up behaving unethically without fully recognizing that what they do is unethical.

  3. Originally, most discussions of intrinsic motivation were framed within the context of interest in the task. See generally Deci, Koestner, and Ryan (1999), describing the research approach and results of a number of studies on intrinsic motivation; see also Kasser and Ryan (1996), examining the differences in individual well-being associated with focusing on extrinsic and intrinsic goals.

  4. These responses were identified based on duplicated IP addresses and GPS locations.

  5. In the original design, along with the material-based conflict of interest used in the current paper, we had five more groups of participants who went through an identity-based conflict of interest manipulation as an additional type of conflict of interest: one group with no intervention manipulation and four groups with the same intervention manipulations we used for the material-based conflict of interest. We did not find any effect of the identity-based conflict of interest on participants (as compared to the no-COI control group). Since the focus of the current experiment was to examine how we can regulate people's behavior in a conflict of interest situation, there was no point in including these conditions in the paper, so we focused only on the material-based conflict of interest conditions.

  6. By mistake, one of the items referring to the researchers in the center was worded in the opposite way to all other items (i.e., disagreement indicated a favorable evaluation of the research institute). Because this was the only item formulated in this way, and because including the reversed responses to this item reduced the Cronbach's alpha reliability of the researcher items from .83 (without the item) to .75 (with it), we excluded this item from further analysis. The pattern of the reported results was similar when this item was included in the analysis.
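For illustration only, the sketch below shows the kind of reliability comparison described in note 6: the mis-worded item is reverse-coded on the 1-6 scale and Cronbach's alpha for the researcher items is computed with and without it. This is not the authors' analysis code; the item names and responses are hypothetical.

```python
# Minimal sketch of the reliability check described in note 6 (hypothetical data).
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: (k / (k - 1)) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses to the scientist-focused items (1 = strongly disagree ... 6 = strongly agree).
df = pd.DataFrame({
    "s1": [5, 4, 6, 3, 5],
    "s2": [4, 4, 5, 3, 6],
    "s3": [5, 3, 6, 4, 5],
    "s4_reversed_wording": [2, 3, 1, 4, 2],  # the mis-worded (reversed) item
})

# Reverse-code the mis-worded item so that higher values again indicate a favorable evaluation.
df["s4_recoded"] = 7 - df["s4_reversed_wording"]

alpha_with = cronbach_alpha(df[["s1", "s2", "s3", "s4_recoded"]])
alpha_without = cronbach_alpha(df[["s1", "s2", "s3"]])
print(f"alpha with recoded item: {alpha_with:.2f}, without it: {alpha_without:.2f}")
```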

References

  • Achtziger, A., Alós-Ferrer, C., & Wagner, A. K. (2015). Money, depletion, and prosociality in the dictator game. Journal of Neuroscience, Psychology, and Economics, 8(1), 1.
  • Adams, J. S., Tashchian, A., & Shore, T. H. (2001). Codes of ethics as signals for ethical behavior. Journal of Business Ethics, 29(3), 199–211.
  • Alemanno, A., & Sibony, A. L. (Eds.). (2015). Nudge and the law: A European perspective. London: Bloomsbury Publishing.
  • Ayal, S., & Gino, F. (2011). Honest rationales for dishonest behavior. In The social psychology of morality: Exploring the causes of good and evil. Washington, DC: American Psychological Association.
  • Banaji, M. R., & Greenwald, A. G. (2013). Blindspot: Hidden biases of good people. New York, NY: Delacorte Press.
  • Bazerman, M. H., & Tenbrunsel, A. E. (2011). Blind spots: Why we fail to do what's right and what to do about it. Princeton, NJ: Princeton University Press.
  • Bereby-Meyer, Y., & Shalvi, S. (2015). Deliberate honesty. Current Opinion in Psychology, 6, 195–198.
  • Bersoff, D. M. (1999). Why good people sometimes do bad things: Motivated reasoning and unethical behavior. Personality and Social Psychology Bulletin, 25(1), 28–39.
  • Buhrmester, M. D., Kwang, T., & Gosling, S. D. (2011). Amazon's Mechanical Turk: A new source of inexpensive, yet high-quality data? Perspectives on Psychological Science, 6, 3–5.
  • Cain, D. M., Loewenstein, G., & Moore, D. A. (2005). The dirt on coming clean: Perverse effects of disclosing conflicts of interest. The Journal of Legal Studies, 34(1), 1–25.
  • Camerer, C. F., & Hogarth, R. M. (1999). The effects of financial incentives in experiments: A review and capital–labor–production framework. Journal of Risk and Uncertainty, 19(1–3), 7–42.
  • Che, Y. K. (1995). Revolving doors and the optimal tolerance for agency collusion. The Rand Journal of Economics, 26(2), 378–397.
  • Chugh, D., Bazerman, M. H., & Banaji, M. R. (2005). Bounded ethicality as a psychological barrier to recognizing conflicts of interest. In D. A. Moore, D. M. Cain, G. Loewenstein, & M. H. Bazerman (Eds.), Conflict of interest: Challenges and solutions in business, law, medicine, and public policy (pp. 74–95). New York, NY: Cambridge University Press.
  • Cornaggia, J., Cornaggia, K. J., & Xia, H. (2016). Revolving doors on Wall Street. Journal of Financial Economics, 120(2), 400–419.
  • Craswell, R., & Calfee, J. E. (1986). Deterrence and uncertain legal standards. Journal of Law, Economics and Organization, 2(2), 279–303.
  • Deci, E. L., Koestner, R., & Ryan, R. M. (1999). A meta-analytic review of experiments examining the effects of extrinsic rewards on intrinsic motivation. Psychological Bulletin, 125(6), 627–688.
  • Evans, J. S. B. (2003). In two minds: Dual-process accounts of reasoning. Trends in Cognitive Sciences, 7(10), 454–459.
  • Evans, J. S. B. (2008). Dual-processing accounts of reasoning, judgment, and social cognition. Annual Review of Psychology, 59, 255–278.
  • Feldman, Y. (2009). The expressive function of the trade secret law: Legality, cost, intrinsic motivation and consensus. Journal of Empirical Legal Studies, 6(1), 177–212.
  • Feldman, Y. (2011). The complexity of disentangling intrinsic and extrinsic compliance motivations: Theoretical and empirical insights from the behavioral analysis of law. Washington University Journal of Law and Policy, 35, 11–52.
  • Feldman, Y. (2014). Behavioral ethics meets behavioral law and economics. In E. Zamir & D. Teichman (Eds.), Oxford handbook of behavioral law and economics (pp. 213–241). Oxford University Press.
  • Feldman, Y., Gauthier, R., & Schuler, T. (2013). Curbing misconduct in the pharmaceutical industry: Insights from behavioral ethics and the behavioral approach to law. The Journal of Law, Medicine and Ethics, 41(3), 620–628.
  • Feldman, Y., & Lobel, O. (2015). Behavioral trade-offs: Beyond the land of nudges spans the world of law and psychology. In A. Alemanno & A. L. Sibony (Eds.), Nudge and the law: A European perspective. Oxford: Hart Publishing.
  • Friedberg, M., Saffran, B., Stinson, T. J., Nelson, W., & Bennett, C. L. (1999). Evaluation of conflict of interest in economic analyses of new drugs used in oncology. JAMA, 282(15), 1453–1457.
  • Gigerenzer, G., & Goldstein, D. G. (1996). Reasoning the fast and frugal way: Models of bounded rationality. Psychological Review, 103(4), 650–669.
  • Gino, F., & Desai, S. D. (2012). Memory lane and morality: How childhood memories promote prosocial behavior. Journal of Personality and Social Psychology, 102(4), 743–758.
  • Gino, F., Schweitzer, M., Mead, N., & Ariely, D. (2011). Unable to resist temptation: How self-control depletion promotes unethical behavior. Organizational Behavior and Human Decision Processes, 115(2), 191–203.
  • Gneezy, U., Meier, S., & Rey-Biel, P. (2011). When and why incentives (don't) work to modify behavior. The Journal of Economic Perspectives, 25(4), 191–209.
  • Gormley, W. T., Jr. (1979). A test of the revolving door hypothesis at the FCC. American Journal of Political Science, 23(4), 665–683.
  • Halali, E., Bereby-Meyer, Y., & Meiran, N. (2014). Between self-interest and reciprocity: The social bright side of self-control failure. Journal of Experimental Psychology: General, 143, 745–754.
  • Halali, E., Bereby-Meyer, Y., & Ockenfels, A. (2013). Is it all about the self? The effect of self-control depletion on ultimatum game proposers. Frontiers in Human Neuroscience, 7, 240.
  • Hillman, A. L. (1987). Financial incentives for physicians in HMOs: Is there a conflict of interest? The New England Journal of Medicine, 317(27), 1743–1748.
  • Hollis, J. (2008). Why good people do bad things: Understanding our darker selves. New York, NY: Gotham Books.
  • Jolls, C., Sunstein, C. R., & Thaler, R. (1998). A behavioral approach to law and economics. Stanford Law Review, 50(5), 1471–1550.
  • Kahneman, D. (2011). Thinking, fast and slow. London: Macmillan.
  • Kasser, T., & Ryan, R. M. (1996). Further examining the American dream: Differential correlates of intrinsic and extrinsic goals. Personality and Social Psychology Bulletin, 22(3), 280–287.
  • Kruglanski, A. W., & Gigerenzer, G. (2011). Intuitive and deliberate judgments are based on common principles. Psychological Review, 118(1), 97–109.
  • Kuder, G. F., & Richardson, M. W. (1937). The theory of the estimation of test reliability. Psychometrika, 2, 151–160.
  • Kunda, Z. (1990). The case for motivated reasoning. Psychological Bulletin, 108(3), 480–498.
  • Lessig, L. (2011). Republic, lost: How money corrupts Congress—and a plan to stop it. New York, NY: Hachette Digital Inc.
  • Mazar, N., Amir, O., & Ariely, D. (2008). The dishonesty of honest people: A theory of self-concept maintenance. Journal of Marketing Research, 45(6), 633–644.
  • Mead, N., Baumeister, R. F., Gino, F., Schweitzer, M., & Ariely, D. (2009). Too tired to tell the truth: Self-control resource depletion and dishonesty. Journal of Experimental Social Psychology, 45, 594–597.
  • Moore, D. A., & Loewenstein, G. (2004). Self-interest, automaticity, and the psychology of conflict of interest. Social Justice Research, 17(2), 189–202.
  • Moore, D. A., Tanlu, L., & Bazerman, M. H. (2010). Conflict of interest and the intrusion of bias. Judgment and Decision Making, 5(1), 37–53.
  • Norenzayan, A., & Shariff, A. F. (2008). The origin and evolution of religious prosociality. Science, 322(5898), 58–62.
  • Paolacci, G., & Chandler, J. (2014). Inside the Turk: Understanding Mechanical Turk as a participant pool. Current Directions in Psychological Science, 23(3), 184–188.
  • Pillutla, M. M. (2011). When good people do wrong: Morality, social identity, and ethical behavior. In D. De Cremer, R. van Dijk, & J. K. Murnighan (Eds.), Social psychology and organizations (pp. 353–370). New York, NY: Routledge.
  • Pittarello, A., Leib, M., Gordon-Hecker, T., & Shalvi, S. (2015). Justifications shape ethical blind spots. Psychological Science, 26(6), 794–804.
  • Rand, D. G., Greene, J. D., & Nowak, M. A. (2012). Spontaneous giving and calculated greed. Nature, 489(7416), 427–430.
  • Rodwin, M. A. (1989). Physicians' conflicts of interest: The limitations of disclosure. New England Journal of Medicine, 321(20), 1405–1409.
  • Rodwin, M. A. (2012). Conflicts of interest, institutional corruption, and pharma: An agenda for reform. The Journal of Law, Medicine and Ethics, 40(3), 511–522.
  • Sanfey, A. G., Rilling, J. K., Aronson, J. A., Nystrom, L. E., & Cohen, J. D. (2003). The neural basis of economic decision-making in the ultimatum game. Science, 300, 1755–1758.
  • Schwartz, M. S. (2002). A code of ethics for corporate code of ethics. Journal of Business Ethics, 41(1–2), 27–43.
  • Schweitzer, M. E., & Hsee, C. K. (2002). Stretching the truth: Elastic justification and motivated communication of uncertain information. Journal of Risk and Uncertainty, 25, 185–201.
  • Sezer, O., Gino, F., & Bazerman, M. H. (2015). Ethical blind spots: Explaining unintentional unethical behavior. Current Opinion in Psychology, 6, 77–81.
  • Shalvi, S., Dana, J., Handgraaf, M. J. J., & De Dreu, C. K. W. (2011). Justified ethicality: Observing desired counterfactuals modifies ethical perceptions and behavior. Organizational Behavior and Human Decision Processes, 115, 181–190.
  • Shalvi, S., Eldar, O., & Bereby-Meyer, Y. (2012). Honesty requires time (and lack of justifications). Psychological Science, 23, 1264–1270.
  • Shalvi, S., Gino, F., Barkan, R., & Ayal, S. (2015). Self-serving justifications: Doing wrong and feeling moral. Current Directions in Psychological Science, 24, 125–130.
  • Shenhav, A., Rand, D. G., & Greene, J. D. (2012). Divine intuition: Cognitive style influences belief in God. Journal of Experimental Psychology: General, 141, 423–428.
  • Shu, L. L., Mazar, N., Gino, F., Ariely, D., & Bazerman, M. H. (2012). Signing at the beginning makes ethics salient and decreases dishonest self-reports in comparison to signing at the end. Proceedings of the National Academy of Sciences, 109(38), 15197–15200.
  • Somers, M. J. (2001). Ethical codes of conduct and organizational context: A study of the relationship between codes of conduct, employee behavior and organizational values. Journal of Business Ethics, 30(2), 185–195.
  • Srull, T. K., & Wyer, R. S. (1979). The role of category accessibility in the interpretation of information about persons: Some determinants and implications. Journal of Personality and Social Psychology, 37(10), 1660–1672.
  • Stanovich, K. E. (1999). Who is rational? Studies of individual differences in reasoning. Mahwah, NJ: Erlbaum.
  • Stanovich, K. E., & West, R. F. (2000). Individual differences in reasoning: Implications for the rationality debate? Behavioral and Brain Sciences, 23(5), 645–665.
  • Stapenhurst, R., & Kpundeh, S. J. (Eds.). (1999). Curbing corruption: Toward a model for building national integrity. Washington, DC: World Bank Publications.
  • Stevens, B. (1994). An analysis of corporate ethical code studies: "Where do we go from here?" Journal of Business Ethics, 13(1), 63–69.
  • Thaler, R. H., & Sunstein, C. R. (2008). Nudge: Improving decisions about health, wealth, and happiness. New Haven, CT: Yale University Press.
  • Uziel, L., & Hefetz, U. (2014). The selfish side of self-control. European Journal of Personality, 28(5), 449–458.
  • Weaver, G. R. (1995). Does ethics code design matter? Effects of ethics code rationales and sanctions on recipients' justice perceptions and content recall. Journal of Business Ethics, 14(5), 367–385.
  • Wilson, T. D., & Schooler, J. W. (1991). Thinking too much: Introspection can reduce the quality of preferences and decisions. Journal of Personality and Social Psychology, 60(2), 181–192.
  • Xu, H., Bègue, L., & Bushman, B. J. (2012). Too fatigued to care: Ego depletion, guilt, and prosocial behavior. Journal of Experimental Social Psychology, 48(5), 1183–1186.
  • Zamir, E., & Sulitzeanu-Kenan, R. (2016). Explaining self-interested behavior of public-spirited policymakers. Hebrew University of Jerusalem Legal Research Paper No. 17–8. Available at https://ssrn.com/abstract=2876437.


Acknowledgements

We thank the Edmond J. Safra Center for the Study of Ethics, Harvard University (Grant No. 10), and the Jerusalem Crime Group for their financial support. We thank Dan Simon, Barak Ariel, Mahzarin Banaji, and Christoph Engel for their helpful comments.

Author information

Corresponding author

Correspondence to Yuval Feldman.

Appendix

The 18-Item Questionnaire

Please state your agreement or disagreement with the following statements, as objectively as possible, on a scale of 1 (strongly disagree) to 6 (strongly agree).

  1. Research conducted by this center is more important than most other research I'm familiar with in the Social Sciences [R].
  2. The research done by the center is more valuable than research done by other similar centers [R].
  3. Universities should divert funds for this center's research [R].
  4. There should be less scrutiny into the actions of members of this center [S].
  5. The center's research would provide useful information for the scientific community [R].
  6. It would be a valuable use of my time to read research about institutional corruption and how to increase public trust in institutions [R].
  7. Mistakes by researchers in this foundation should not be punished as harshly as other researchers [S].
  8. Government subsidies for this center are better investments than subsidies for other centers [R].
  9. Salaries of researchers in this center should be higher than other researchers' salaries [S].
  10. For this question, please answer with the "2" button.
  11. Research by this center is crucial for the future success of the international community [R].
  12. Governments should divert research funds from other areas to this area [R].
  13. International foundations should consider allocating funds to this center [R].
  14. Researchers in this center should have greater freedom in how they use public grants [S].
  15. Researchers at the Safra Center are more likely to donate to charity than other researchers [S].
  16. Researchers at the Safra Center are more concerned with helping people than researchers at other institutions [S].
  17. Researchers at the Safra Center are more likely to misuse funds than other researchers [S].
  18. Researchers at the Safra Center are less likely to plagiarize work than other researchers [S].

  • [R] Items focusing on the research conducted by the institute.

  • [S] Items focusing on the scientists working at the institute.
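The paper does not reproduce its scoring procedure; purely as an illustration, the sketch below shows how responses on the 1-6 scale could be aggregated into the research-focused [R] and scientist-focused [S] subscales, with item 10 used as an attention check. The column names (q1-q18) and the handling of item 17 are assumptions for this sketch, not the authors' code.

```python
# Illustrative scoring sketch only; column names are hypothetical.
import pandas as pd

R_ITEMS = ["q1", "q2", "q3", "q5", "q6", "q8", "q11", "q12", "q13"]  # research-focused [R] items
S_ITEMS = ["q4", "q7", "q9", "q14", "q15", "q16", "q18"]             # scientist-focused [S] items
# q17 is worded in the opposite direction to the other [S] items; note 6 reports that the
# reverse-worded scientist item was dropped from the scale, so it is omitted here as well.

def score(df: pd.DataFrame) -> pd.DataFrame:
    # Keep only responses passing the attention check (item 10: "please answer with the '2' button").
    passed = df[df["q10"] == 2].copy()
    passed["r_score"] = passed[R_ITEMS].mean(axis=1)  # mean favorability toward the center's research
    passed["s_score"] = passed[S_ITEMS].mean(axis=1)  # mean favorability toward the center's scientists
    return passed
```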

The Binomial Questionnaire

We would like to ask for your help in rating various statements the Safra Center could potentially use in a future fund-raising campaign. For each statement, please indicate whether it is accurate or inaccurate, whether you agree or disagree with it, whether you would or would not say it to potential donors, and whether you would or would not sign a petition containing it.

  • Research conducted by the Safra Center is crucial for the well-being of society.

    1. Accurate / Inaccurate
    2. Agree / Disagree
    3. Would say to potential donors / Would not say to potential donors
    4. Would sign a petition / Would not sign a petition

  • The Safra Center's research will change the way we look at public institutions.

    5. Accurate / Inaccurate
    6. Agree / Disagree
    7. Would say to potential donors / Would not say to potential donors
    8. Would sign a petition / Would not sign a petition

  • The Safra Center's mission is the first attempt ever to deal with one of our most important problems.

    9. Accurate / Inaccurate
    10. Agree / Disagree
    11. Would say to potential donors / Would not say to potential donors
    12. Would sign a petition / Would not sign a petition
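Because each of these items is scored dichotomously, reliability for scales of this kind is commonly estimated with the Kuder-Richardson formula (KR-20; Kuder & Richardson, 1937, cited above) rather than Cronbach's alpha. Whether the authors used this coefficient here is an assumption on our part; the formula is given only as background to that reference. For k dichotomous items,

$$\text{KR-20} = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} p_i (1 - p_i)}{\sigma_X^{2}}\right),$$

where \(p_i\) is the proportion of respondents endorsing item \(i\) in the keyed direction and \(\sigma_X^{2}\) is the variance of the total scores.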

The Objectivity Questionnaire

  1. Do you think you had any sort of influence while you were answering the questions?

     • Yes (if so, please state what you were influenced by)
       ––––––––––––––––––––––––––––––––––––––––
     • No

  2. Were you completely objective during this study?

     • Yes
     • No (if so, please state why you were not completely objective)
       ––––––––––––––––––––––––––––––––––––––––

  3. Did you think of any factor besides your best judgment while answering the questions?

     • Yes (if so, please state what else you used)
       ––––––––––––––––––––––––––––––––––––––––
     • No

The Explicit Deterrence Manipulation

Many countries have focused on cracking down on people and businesses who act unethically. Those who are involved in multiple interests and let one of those interests corrupt their actions are especially important targets. Global leaders have decided that such conflict of interest situations are intolerable. Governments around the world took action against hundreds of unethical individuals last week. As a result, both individuals and organizations must be extra cautious when doing business with the government. Otherwise, if they let conflict of interest situations influence their decisions, they will be heavily prosecuted.

In accordance with this worldwide trend, we believe that people who let their conflict of interest affect their objectivity and integrity when completing this survey should be penalized. Hence, participants who let their conflicting interests affect their judgment might lose some of their compensation for the work they do for us.

Who has decided that conflict of interest situations are intolerable?

  • Everyday people

  • Global leaders

  • Big business companies

What will happen to people if they let their conflict of interest situations influence their decisions?

  • They will receive a warning

  • They will be rewarded

  • They will be prosecuted

What will happen to participants in this survey if they are influenced by their conflict of interest when completing the survey?

  • Their compensation might be affected

  • Their reputation might be harmed

  • The validity of their answers might be affected

The Explicit Morality Manipulation

Conflict of interest situations are among the greatest problems the world faces today. A conflict of interest occurs when an individual or organization is involved in multiple interests, one of which could possibly corrupt the motivation for an act in the other. Such situations harm the public good, as the correct decision in a national dilemma may be rejected due to these corrupt individuals or organizations. Conflicts of interest also threaten the merit-based system, as individuals are chosen based on who they know, not what they know. These actions are immoral, so conscientious individuals should do everything in their power to avoid conflict of interest situations.

In accordance with this worldwide trend, we believe that people who let their conflict of interest affect their objectivity and integrity when completing this survey are not acting in a moral and ethical way. Hence, participants who let their conflicting interests affect their judgment might harm the public good.

What should conscientious individuals do in regard to conflict of interest situations?

  • Avoid them

  • Seek them out

  • Take advantage of them

What do conflict of interest situations harm?

  • A person’s feelings

  • The environment

  • The public good

What will happen to participants in this survey if they are influenced by their conflict of interest when completing the survey?

  • They might harm the public good

  • Their integrity might be harmed

  • The validity of their answers might be affected

The Implicit Deterrence Manipulation

c_ _ _uption (corruption)
jai_ (jail)
poli_ _ (police)
punish_ _ _t (punishment)
fin_ (fine)
_ubpoena (subpoena)
jud_e (judge)
in_ictm_nt (indictment)
in_ _st_gat_on (investigation)
br_be (bribe)
_uilt_ (guilty)
cro_ _ing (crossing)
rotat_ _ _ (rotation)
_ miling (smiling)
s_ll (sill)
fi_ _y (fiery)
flou_ (flour)
b_ld (bald)
r_ _t (root)
fe_er (fever)
w_ _ds (weeds)
fema_ _ (female)
_ _ gineer (engineer)
al_gn (align)
d_sconn_ _ted (disconnected)
catal_ _ (catalog)
_ orn (corn)
mer_ e (merge)
fantast_ _ (fantastic)
_uman (human)
exc_ll_nt (excellent)
cop_ _r (copier)
tra_ (trap)
bl_e (blue)
effic_ _nt (efficient)

The Implicit Morality Manipulation

integri_ _ (integrity)
_rust (trust)
mor_li_y (morality)
hon_sty (honesty)
objectivi_ _ (objectivity)
princi_ _es (principles)
_irtue (virtue)
t_ _th (truth)
_ _irness (fairness)
neut_ali_ _ (neutrality)
jus_ic_ (justice)
cro_ _ing (crossing)
rotat_ _ _ (rotation)
_ miling (smiling)
s_ll (sill)
fi_ _y (fiery)
flou_ (flour)
b_ld (bold)
r_ _t (root)
fe_er (fever)
w_ _ds (weeds)
fema_ _ (female)
_ _ gineer (engineer)
al_gn (align)
d_sconn_ _ted (disconnected)
catal_ _ (catalog)
_ orn (corn)
mer_ e (merge)
fantast_ _ (fantastic)
_uman (human)
exc_ll_nt (excellent)
cop_ _r (copper)
tra_ (trap)
bl_e (blue)
effic_ _nt (efficient)


Cite this article

Feldman, Y., Halali, E. Regulating “Good” People in Subtle Conflicts of Interest Situations. J Bus Ethics 154, 65–83 (2019). https://doi.org/10.1007/s10551-017-3468-8
