Lee, A. S. (1989). A scientific methodology for MIS case studies. MIS Quarterly, 13(1), 33-50.

In message 207 on Saturday, October 27, 2007 5:17pm, Charles Myers (cemyers) writes:
>Lee presents an argument against the use of a single case study for a research topic.

Actually, Lee is defending single-case-study research.

Key rhetorical strategy: Lee adopts the natural science model as the ideal for social science, in part because critics of single-case studies often ground their criticism in natural science methodology; Lee thus intends to respond to the critics in their own language and show that single-case studies can satisfy natural science criteria [34].

The four problems of single-case studies:

>–Making controlled observations
>–Making controlled deductions
>–Allowing for replicability
>–Allowing for generalizability

To defend single-case studies against the charge that they can’t satisfy the above four criteria, Lee appeals to four other standards laid out by Popper, who said you have a good theory when it is…

1. falsifiable
2. logically consistent
3. competitively predictive (it predicts stuff at least as well as competing theories)
4. durable (scientists test it, and it survives!)

Markus tests her three theories, falsifies two of them, and finds the third (the interaction theory) satisfies Popper’s conditions. Someone else could still construct a test to falsify it; it’s consistent with what Markus observed; it makes better predictions than two competing theories; and it survived this case study. That’s science!

>Seemingly, Allen both agreed with Markus and disagreed with Markus.
>In the end I believe he states that the multiple case study approach is better
>for research because of the rigor and potential for quantity and quality
>of data.

I don’t want to offer excuses for sloppy science — I’ll take multiple examples over one anecdote any day — but I might propose the 9/11 Theory of IS Research in support of single-case studies. The United States restructured the entire federal security apparatus, engaged in two significant military adventures, and radically changed the nature of defendant rights with the PATRIOT Act, all on the basis of a single event, one coordinated terrorist attack. Without commenting on the wisdom of any particular policy adopted, I think it is safe to say that scorn and derision would have met anyone who stood up on September 12 and said, “Hang on, before we form any theories or design any solutions, we’d better try replicating the experiment and controlling the variables. We can’t derive any useful knowledge from just one event.”

In practice, single cases form the basis of important decisions and life-lessons all the time. Granted, single events also often form the basis of really bad decisions and prejudices. Scientists have an obligation to be circumspect, to look around for more than one case when possible to back up what they wish to argue. But sometimes a single case can provide such powerful, useful information that we’d be fools to ignore it.

Given that IS interacts so closely with the world of practice (we really are flying the plane, studying it, and swapping out bolts and rotors all at the same time!), maybe we IS researchers get special dispensation to do more single-case studies. Astrophysics and molecular biology don’t move fast like IS; we’ve got to provide folks some useful results before the technology we’re studying mutates into something completely unrecognizable. Maybe we need to make the most we can out of single-case studies, give business all the advice and best-guesses we can derive, and hope for the best as we scramble to study the next new thing.

Weber, R. (2003). Editor’s comments: Still desperately seeking the IT artifact. MIS Quarterly, 27(2), iii-xi.

Weber offers a forthright discussion of his views and the disagreement within the IS ranks. Of particular interest is Weber’s mention that Orlikowski and Iacono generated a great deal of discussion with their 2001 paper (on whose title Weber bases his own title here). Metaresearch like Orlikowski and Iacono (2001) might be considered navel-gazing, but in this case, their paper addressed and provoked sustained discussion about the nature and future of the discipline. Orlikowski and Iacono (2001) thus stands as an example of research performing an important service for the discipline.

Orlikowski, W. J., & Iacono, C. S. (2001). Research commentary: Desperately seeking the “IT” in IT research – a call to theorizing the IT artifact. Information Systems Research, 12(2), 121-134.

The study is itself a hardcore lit review: 188 papers, the full corpus of articles published in ISR in its first ten years. It’s “metaresearch” about the discipline itself.

Four flavors of the tool view:

  1. labor substitution,
  2. productivity enhancement,
  3. info processing,
  4. social relation changing

Three proxy views:

  1. perception (survey attitudes),
  2. diffusion (#/density of artifacts),
  3. capital (dollars spent)

O&I mention the “productivity paradox” — increasing investment in IT with little demonstrable return, recognized in the late 1980s and early 1990s

Four flavors of the ensemble view:

  1. development project (examines all forces that influenced how tech came to be — lots of field studies)
  2. production network (similar, but focuses less on indiv artifacts and more on entire industry, global network, use throughout society)
  3. embedded system (social influences on adoption & use of tech)
  4. structure (tech embodies social structures of builders, influences social structures of users)

Computational view: “traditional computer science approach” [129]; not attentive to society, but still interested in tech’s capacity to solve problems, with the focus on computational power

  1. algorithm: focus on building/improving IS
  2. model: focus on representing phenomena, making simulations

Nominal view: IS research in name only! Tech is absent.

Lee, A. S. (1999). Rigor and relevance in MIS research: Beyond the approach of positivism alone. MIS Quarterly, 23(1), 29-33.

Gold nugget from Lee: “I believe that there are often circumstances in which one of our responsibilities as academicians is to be the conscience for our practitioner colleagues and, indeed, for society in general” [31].

It is easy to imagine academics as the folks thinking about IS and practitioners as the folks doing things with IS. If that dichotomy is valid in any way, it makes sense that the role of conscience would fall to the academics. Assigning that role to academics doesn’t excuse practitioners to act with wanton disregard for the general welfare. All citizens have an obligation to act thoughtfully. But IS academics can position themselves uniquely to see beyond the confines of a single firm or industry (the normal realm of the practitioner) and recognize the broader social and moral ramifications of where practitioners are taking IS.

Of course, the role of conscience only increases the burden on IS academics to keep their perspectives and their entire field diverse. To answer not only “Can we do X?” but also “Should we do X?” we must be able in our worldviews to encompass algorithms and allegories, management theories and moral precepts.

Lyytinen, K. (1999). Empirical research in information systems: On the relevance of practice in thinking of IS research. MIS Quarterly, 23(1), 25-27.

Lyytinen pricks the conscience of this English teacher:

Even so, I would not trade off academic writing style and demand that we write in a manner that is simple, concise, and clear, as Benbasat and Zmud suggest. Instead I would expect that we educate our practitioners to appreciate brilliant intellectual efforts! Several IS phenomena are hard to understand and may demand difficult and esoteric language because they cannot be couched in the “common language.” Still, the message of a text written in an esoteric language can be relevant. For example, the fashionable use of Heidegger in understanding design or use of IT is neither possible nor useful unless the reader can work through Heidegger’s thick concepts and ideas. My nightmare would be to emasculate Heidegger and dress him into the HBR format! [Lyytinen 27]

Lyytinen’s concern sounds like the same one I faced when asked by my superintendent to teach “modern language” — i.e., dumbed-down — versions of Romeo and Juliet and Hamlet. There are “brilliant” works in the English language that would lose value if couched in language deemed more “simple, concise, and clear.”

But just how esoteric is our field? How hard can it be to explain what’s happening with information systems? Lyytinen’s suggestion that “academic writing style” and simplicity, conciseness, and clarity lie at opposite ends of the spectrum is distressing. Simplicity, conciseness, and clarity should lie at the heart of advice to any academic writer. They are relative terms — “concise,” for instance, may mean 200 pages for some topics, and “simple” may still require a full semester course to understand.

Shakespeare gets off the hook for extravagant wordsmithery because he is seeking artistic and emotional effect along with understanding of the human condition. Academics have a more focused mission: the expansion of knowledge and understanding. We should welcome the occasional well-turned phrase, but our impact and our aesthetic rely on understanding. Our efforts do little good if they do not spread understanding.

When Senator Mundt taught English, he kept a sign on his classroom wall: “If you can’t say it, you don’t know it!” If we can’t phrase our findings relatively simply, concisely, and clearly, maybe we ourselves don’t fully understand our findings.

Davenport, T. H., & Markus, M. L. (1999). Rigor vs. relevance revisited: Response to Benbasat and Zmud. MIS Quarterly, 23(1), 19-23.

D&M do B&Z (1999) one better, arguing that we need to remake the discipline to more closely resemble law and medicine, fields with much greater academic-practitioner interaction. D&M seem to argue that B&Z advocate too much of an all-things-to-all-people path. D&M would have us pursue a different course, making our discipline over more in the image of practitioners rather than trying to fit the mold of hard-core academics.

Interesting market-based argument: if students are consumers, IS schools need to produce more practical research that those consumers can consume, “thereby increasing the audience of reflective practitioners” [20]. Business translation: produce more practical, relevant research, and we get more customers, more tuition, more funding.

But viewing students as consumers isn’t just about increasing our revenue. Thinking about what future practitioners (are the majority of students future practitioners rather than future academics?) want and need will drive us to produce more relevant research.

Evaluation and policy research — I hope we’ve made progress in accepting such research since D&M wrote in 1999. Otherwise, my dissertation is in trouble!

I would think the Board of Regents would completely support this approach of including practitioner journal publications in a professor’s tenure-review portfolio. The Regents are all about relevance now.

Reinig, B. A., & Shin, B. (2002). The dynamic effects of group support systems on group meetings. Journal of Management Information Systems, 19(2), 303-325.

More noticeably, most experiments employ a one-shot methodology with ad hoc groups [35, 58], completely ignoring the effects of time and history on GSS dynamics and performance. A number of studies (for example [17, 18, 58]), however, have demonstrated that outcomes differ for both GSS and face-to-face groups over time. That is, the results of a group effort in an initial meeting are not necessarily indicative of the results realized in later meetings. For example, in a study conducted by Chidambaram et al. [18], GSS participants reported less group cohesion than their face-to-face counterparts in initial meetings, but in subsequent meetings the GSS participants reported more group cohesion than the face-to-face participants did. Also, field research examining the use of GSS with naturally occurring groups over long periods have consistently reported positive results, including reductions in project completion times and labor costs savings [39, 66, 71, 83]. The positive findings of field research stand in stark contrast to the inconsistent findings from experimental research. [305]

Why pull out this paragraph? Reinig and Shin do something very important with their lit review here. They identify a failing in not just one study but a whole class of studies. They point out important effects, time and history, that researchers in this field must take into account and can do so only through field research with real groups, not artificial, ad hoc assemblages of experimental subjects. This point in the lit review not only justifies their own decision to use groups of students in “natural classroom settings” [311] but also gives future researchers something to watch out for as they review past research and plan their own projects.