Lee, A. S. (1989). A scientific methodology for MIS case studies. MIS Quarterly, 13(1), 33-50.

In message 207 on Saturday, October 27, 2007 5:17pm, Charles Myers (cemyers) writes:
>Lee presents an argument against the use of a single case study for a research topic.

Actually, Lee is defending single-case-study research.

Key rhetorical strategy: Lee adopts the natural science model as the ideal for social science research, in part because critics of single-case studies often ground their criticism in natural science methodology; Lee thus intends to respond to the critics in their own language and show that single-case studies can satisfy natural science criteria [34].

The four problems of single-case studies:

>–Making controlled observations
>–Making controlled deductions
>–Allowing for replicability
>–Allowing for generalizability
>

Lee defends single-case studies against the charge that they can’t satisfy the above four criteria by appealing to four other standards laid out by Popper, who said you have a good theory when it is…

1. falsifiable
2. logically consistent
3. competitively predictive (it predicts stuff at least as well as competing theories)
4. durable (scientists test it, and it survives!)

Markus tests her three theories, falsifies two of them, and finds the third (the interaction theory) satisfies Popper’s conditions. Someone else could still construct a test to falsify it; it’s consistent with what Markus observed; it makes better predictions than two competing theories; and it survived this case study. That’s science!

>Seemingly, Allen both agreed with Markus and disagreed with Markus.
>In the end, I believe he states that the multiple case study approach is better
>for research because of the rigor and the potential for quantity and quality
>of data.

I don’t want to offer excuses for sloppy science — I’ll take multiple examples over one anecdote any day — but I might propose the 9/11 Theory of IS Research in support of single-case studies. The United States restructured the entire federal security apparatus, engaged in two significant military adventures, and radically changed the nature of defendants’ rights with the PATRIOT Act, all on the basis of a single event, one coordinated terrorist attack. Without commenting on the wisdom of any particular policy adopted, I think it is safe to say that scorn and derision would have met anyone who stood up on September 12 and said, “Hang on, before we form any theories or design any solutions, we’d better try replicating the experiment and controlling the variables. We can’t derive any useful knowledge from just one event.”

In practice, single cases form the basis of important decisions and life-lessons all the time.  Granted, single events also often form the basis of really bad decisions and prejudices. Scientists have an obligation to be circumspect, to look around for more than one case when possible to back up what they wish to argue. But sometimes a single case can provide such powerful, useful information that we’d be fools to ignore it.

Given that IS interacts so closely with the world of practice (we really are flying the plane, studying it, and swapping out bolts and rotors all at the same time!), maybe we IS researchers get special dispensation to do more single-case studies. Astrophysics and molecular biology don’t move fast like IS; we’ve got to provide folks some useful results before the technology we’re studying mutates into something completely unrecognizable. Maybe we need to make the most we can out of single-case studies, give business all the advice and best-guesses we can derive, and hope for the best as we scramble to study the next new thing.

Weber, R. (2003). Editor’s comments: Still desperately seeking the IT artifact. MIS Quarterly, 27(2), iii-xi.

Weber offers a forthright discussion of his views and the disagreement within the IS ranks. Of particular interest is Weber’s mention that Orlikowski and Iacono generated a great deal of discussion with their 2001 paper (on whose title Weber bases his own title here). Metaresearch like Orlikowski and Iacono (2001) might be considered navel-gazing, but in this case, their paper addressed and provoked sustained discussion about the nature and future of the discipline. Orlikowski and Iacono (2001) thus stands as an example of research performing an important service for the discipline.

Orlikowski, W. J., & Iacono, C. S. (2001). Research commentary: Desperately seeking the “IT” in IT research – a call to theorizing the IT artifact. Information Systems Research, 12(2), 121-134.

The study is itself a hardcore lit review: 188 papers, the full corpus of articles published in ISR in its first ten years. It’s “metaresearch” about the discipline itself.

Four flavors of the tool view:

  1. labor substitution,
  2. productivity enhancement,
  3. info processing,
  4. social relation changing

Three proxy views:

  1. perception (survey attitudes),
  2. diffusion (#/density of artifacts),
  3. capital (dollars spent)

O&I mention the “productivity paradox”: increasing investment in IT with little demonstrable return, recognized in the late 1980s and early 1990s

Four flavors of the ensemble view:

  1. development project (examines all forces that influenced how tech came to be — lots of field studies)
  2. production network (similar, but focuses less on indiv artifacts and more on entire industry, global network, use throughout society)
  3. embedded system (social influences on adoption & use of tech)
  4. structure (tech embodies social structures of builders, influences social structures of users)

Computational view: “traditional computer science approach” [129]; not oblivious to society, still interested in the capacity to solve problems, but focused on computational power

  1. algorithm: focus on building/improving IS
  2. model: focus on representing phenomena, making simulations

Nominal view: IS research in name only! Technology is absent.

Lee, A. S. (1999). Rigor and relevance in MIS research: Beyond the approach of positivism alone. MIS Quarterly, 23(1), 29-33.

Gold nugget from Lee: “I believe that there are often circumstances in which one of our responsibilities as academicians is to be the conscience for our practitioner colleagues and, indeed, for society in general” [31].

It is easy to imagine academics as the folks thinking about IS and practitioners as the folks doing things with IS. If that dichotomy is valid in any way, it makes sense that the role of conscience would fall to the academics. Assigning that role to academics doesn’t excuse practitioners to act with wanton disregard for the general welfare. All citizens have an obligation to act thoughtfully. But IS academics can position themselves uniquely to see beyond the confines of a single firm or industry (the normal realm of the practitioner) and recognize the broader social and moral ramifications of where practitioners are taking IS.

Of course, the role of conscience only increases the burden on IS academics to keep their perspectives and their entire field diverse. To answer not only “Can we do X?” but also “Should we do X?”, our worldviews must be able to encompass algorithms and allegories, management theories and moral precepts.

Lyytinen, K. (1999). Empirical research in information systems: On the relevance of practice in thinking of IS research. MIS Quarterly, 23(1), 25-27.

Lyytinen pricks the conscience of this English teacher:

Even so, I would not trade off academic writing style and demand that we write in a manner that is simple, concise, and clear, as Benbasat and Zmud suggest. Instead I would expect that we educate our practitioners to appreciate brilliant intellectual efforts! Several IS phenomena are hard to understand and may demand difficult and esoteric language because they cannot be couched in the “common language.” Still, the message of a text written in an esoteric language can be relevant. For example, the fashionable use of Heidegger in understanding design or use of IT is neither possible nor useful unless the reader can work through Heidegger’s thick concepts and ideas. My nightmare would be to emasculate Heidegger and dress him into the HBR format! [Lyytinen 27]

Lyytinen’s concern sounds like the same one I faced when asked by my superintendent to teach “modern language” — i.e., dumbed-down — versions of Romeo and Juliet and Hamlet. There are “brilliant” works in the English language that would lose value if couched in language deemed more “simple, concise, and clear.”

But just how esoteric is our field? How hard can it be to explain what’s happening with information systems? Lyytinen’s suggestion that “academic writing style” and simplicity, conciseness, and clarity lie at opposite ends of the spectrum is distressing. Simplicity, conciseness, and clarity should lie at the heart of advice to any academic writer. They are relative terms — “concise,” for instance, may mean 200 pages for some topics, and “simple” may still require a full semester course to understand.

Shakespeare gets off the hook for extravagant wordsmithery because he is seeking artistic and emotional effect along with understanding of the human condition. Academics have a more focused mission: the expansion of knowledge and understanding. We should welcome the occasional well-turned phrase, but our impact and our aesthetic rely on understanding. Our efforts do little good if they do not spread understanding.

When Senator Mundt taught English, he kept a sign on his classroom wall: “If you can’t say it, you don’t know it!” If we can’t phrase our findings relatively simply, concisely, and clearly, maybe we ourselves don’t fully understand our findings.

Davenport, T. H., & Markus, M. L. (1999). Rigor vs. relevance revisited: Response to Benbasat and Zmud. MIS Quarterly, 23(1), 19-23.

D&M do B&Z (1999) one better, arguing we need to remake the discipline to more closely resemble law and medicine, fields with much greater academic-practitioner interaction. D&M seem to argue that B&Z advocate too much the path of being all things to all people. D&M would have us pursue a different course, making our discipline over more in the image of practitioners rather than trying to fit the mold of hard-core academics.

Interesting market-based argument: if students are consumers, IS schools need to produce more practical research that those consumers can consume, “thereby increasing the audience of reflective practitioners” [20]. Business translation: produce more practical, relevant research, and we get more customers, more tuition, more funding.

But viewing students as consumers isn’t just about increasing our revenue. Thinking about what future practitioners (are the majority of students future practitioners rather than future academics?) want and need will drive us to produce more relevant research.

Evaluation and policy research — I hope we’ve made progress in accepting such research since D&M wrote in 1999. Otherwise, my dissertation is in trouble!

I would think the Board of Regents would completely support this approach of including practitioner journal publications in a professor’s tenure-review portfolio. The Regents are all about relevance now.

Reinig, B. A., & Shin, B. (2002). The dynamic effects of group support systems on group meetings. Journal of Management Information Systems, 19(2), 303-325.

More noticeably, most experiments employ a one-shot methodology with ad hoc groups [35, 58], completely ignoring the effects of time and history on GSS dynamics and performance. A number of studies (for example [17, 18, 58]), however, have demonstrated that outcomes differ for both GSS and face-to-face groups over time. That is, the results of a group effort in an initial meeting are not necessarily indicative of the results realized in later meetings. For example, in a study conducted by Chidambaram et al. [18], GSS participants reported less group cohesion than their face-to-face counterparts in initial meetings, but in subsequent meetings the GSS participants reported more group cohesion than the face-to-face participants did. Also, field research examining the use of GSS with naturally occurring groups over long periods have consistently reported positive results, including reductions in project completion times and labor costs savings [39, 66, 71, 83]. The positive findings of field research stand in stark contrast to the inconsistent findings from experimental research. [305]

Why pull out this paragraph? Reinig and Shin do something very important with their lit review here. They identify a failing in not just one study but a whole class of studies. They point out important effects, time and history, that researchers in this field must take into account and can capture only through field research with real groups, not artificial, ad hoc assemblages of experimental subjects. This point in the lit review not only justifies their own decision to use groups of students in “natural classroom settings” [311] but also gives future researchers something to watch out for as they review past research and plan their own projects.

Webster, J., & Watson, R. T. (2002). Analyzing the past to prepare for the future: Writing a literature review. MIS Quarterly, 26(2), xiii-xxiii.

“Because IS is an interdisciplinary field straddling other disciplines, you often must look not only within the IS discipline when reviewing and developing theory but also outside the field” [xvi].

Here’s where our interdisciplinary nature can bite us: to effectively review the literature, we can’t just read a few good IS journals or even scan all of the IS journals; we need to keep an eye out for relevant research in all related fields. Research on various IT-industry issues could pop up in any number of journals in economics, marketing, management, even engineering and psychology. If we’re really going to be interdisciplinary, we have to do a lot of reading. That just means we have to be smarter than everyone else.

Benbasat, I., & Zmud, R. W. (1999). Empirical research in information systems: The practice of relevance. MIS Quarterly, 23(1), 3-16.

[5] Just as important as, if not more important than, an article’s content is its style and tone. Stated simply, articles that are not read, regardless of their content, are not relevant. Articles that tend to be read by IS professionals are those that

• are shorter
• use more exhibits
• use everyday language, rather than esoteric or stilted language
• have less discussion of related literature
• have less discussion of a study’s methods
• have more contextual description
• have more prescriptions

Talk about a primer in how to write. Will any advisor or university encourage doctoral candidates to follow such guidelines in writing their dissertations? What if we rewrote the dissertation outline and, per Benbasat and Zmud’s suggestion, deferred the lit review and heavy methodology descriptions to appendices?

Let’s boil down the reasons for our seeming irrelevance:

  1. Emphasis on rigor over relevance: maybe we’re too busy trying to prove we’re real scientists and win academic respect instead of just studying problems and coming up with solutions.
  2. “lack of cumulative research tradition” — again, are we trying to make up for that shortcoming, our relative youth compared to other disciplines?
  3. Tech churn: the stuff we study changes so fast, our papers become irrelevant. How do we beat that? Seeking a higher level of abstraction might not help; it might instead only move us further into the realm of eggheaded navel-gazing that already turns off business folks.
  4. Academicians not working in industry, not seeing what’s happening on the job — hmm… I’m starting to wonder if IS academicians are being asked to be all things to all people. Do we really need to pursue all that relevance? Are we uniquely expected to be relevant in addition to rigorous?
  5. Constraints on freedom of action within academia — ah! See what the authors say on the previous cause, and apply it here. If we have to prepare set curricula that look good on WebCT or PowerPoint, we have a hard time conducting the free-flowing conversations that allow us to really get what’s going on in industry. We also can’t risk writing a paper that breaks the expected conventions of publication review boards, tenure review committees, grant-issuing agencies, etc. We can only safely take a swing at relevance if everyone else in the academy agrees to swing with us.

B&Z offer a number of recommendations for making IS research more relevant. B&Z themselves acknowledge that the simplest may be to write in a “clear, simple, and concise manner” that makes our research accessible to all the potential readership of a journal [12]. We’re academics; people know we’re smart. We don’t have to prove it with vocabulary; we should prove it with ideas.

Not that we should dumb ourselves down. We can still trot out some fifty-dollar words. But we should choose words as effective (and sometimes beautiful) conveyers of our thoughts, not as shibboleths.

Barua, A., Lee, C. H. S., & Whinston, A. B. (1996). The calculus of reengineering. Information Systems Research, 7(4), 409-428.

This article was assigned simply as a good example of a lit review, but it provides some good content for my general research area as well.

Note that right off, the authors find failure rates of 70% for reengineering projects. But this is 1996, before the PMP revolution, right? I look at Barua’s mention [412] of his own multi-stage business value models from 1989, 1991, and 1995 and wonder if they are still applicable. How odd to be working in a field where we automatically question whether research from just 10-20 years ago remains relevant to current research. But then Barua bases this research on complementarity, a theory Edgeworth put forth in 1881. We can fit modern technology into older theories. We certainly need to look for lasting theories that will keep our research relevant past our lifetimes.

Practical lit-review note: Barua et al. give a thorough review of the concept of supermodular functions, devoting a page and a half to a single article. They can justify this detailed attention to a single extradisciplinary article (if we are allowed to consider anything extradisciplinary to IS) by noting that the topic “has not been addressed in prior MIS research” [414]. Instead of guessing what their readers might know, Barua et al. let the existing literature direct their level of detail. MIS literature lacks information on the topic, so they take it upon themselves to introduce it to the MIS literature. A bare-bones sketch of the underlying math follows.
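For anyone else meeting supermodularity for the first time, here is a minimal sketch using the standard Topkis-style textbook definition (Barua et al.’s exact formalization may differ in notation and scope): a function f on a lattice is supermodular if

  f(x \vee y) + f(x \wedge y) \;\geq\; f(x) + f(y) \quad \text{for all } x, y

and, when f is twice differentiable on \mathbb{R}^n, this is equivalent to nonnegative cross-partial derivatives:

  \frac{\partial^2 f}{\partial x_i \, \partial x_j} \;\geq\; 0 \quad \text{for all } i \neq j

That second condition is the formal face of complementarity: increasing one design variable (say, IT spending) raises the marginal payoff of the others (process redesign, decision-rights changes), which is why complementary changes pay off most when made together.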

Did Thomas Friedman read this article before writing The World Is Flat? Barua builds on complementarity theory. He also talks about the move in reengineering away from “the Adam Smith division-of-labor attitude and to shift to a process oriented frame of thinking” and “structural changes such as replacing the conventional hierarchical structure with cross-functional teams… and decision making empowerment” [Barua (1996) 412]. Friedman talks about “the complementary convergence of the ten flatteners” [Friedman (2005) 177] (events and innovations that have made the world market more interconnected and accessible to all participants) and the shift of businesses and individuals “from largely vertical means of creating value to more horizontal ones” [Friedman (2005) 175]. Can we say that Friedman bears out, for the lay reader, the global-scale operation of the firm-level theory Barua introduces to IS? Not bad foresight on Barua’s part, for a theory and formula based on “anecdotal evidence available in the reengineering literature” [416].

Given all the talk about reengineering including big personnel cuts, do the profit and productivity benefits for each company translate into net benefits for the surrounding community? Increased productivity and profits for a company don’t mean much for us if we lose our jobs. Do the extra profits get plowed back into the economy the same way all those paychecks used to? Are those downsized workers freed up to engage in other business activities that they are better suited for? Reengineering might be useful for everyone with respect to state government — we downsize government agencies, provide more efficient service while enabling tax cuts or reprioritization of tax dollars for other programs. But all those job cuts in the private sector don’t look good for the community’s economic picture.

[421] IT fundamentally makes things more horizontal (Friedman’s flatness). More people at all levels, bosses and peons, have access to more information. To act on that information, we have to horizontalize the power structure as well. All the info in the world won’t help the peons if they still have to fill out requests in triplicate to be able to act on that info. Bosses don’t get to hog all the information; they can’t hog all the decision-making any more, either. IT drives Friedman flatness.

[421] But it looks like reengineering won’t save money unless it allows us to dump people. Salaries are the budget killer; IT needs to save us from the trouble of hiring people to work for us. Ugh. IT in itself doesn’t save the budget; shedding workers does.