Benbasat, I., & Zmud, R. W. (1999). Empirical research in information systems: The practice of relevance. MIS Quarterly, 23(1), 3-16.

[5] Just as important as, if not more important than, an article’s content is its style and tone. Stated simply, articles that are not read, regardless of their content, are not relevant. Articles that tend to be read by IS professionals are those that

• are shorter
• use more exhibits
• use everyday language, rather than esoteric or stilted language
• have less discussion of related literature
• have less discussion of a study’s methods
• have more contextual description
• have more prescriptions

Talk about a primer on how to write. Will any advisor or university encourage doctoral candidates to follow such guidelines in writing their dissertations? What if we rewrote the dissertation outline and, per Benbasat and Zmud’s suggestion, deferred the lit review and heavy methodology descriptions to appendices?

Let’s boil down the reasons for our seeming irrelevance:

  1. Emphasis on rigor over relevance: maybe we’re too busy trying to prove we’re real scientists and win academic respect instead of just studying problems and coming up with solutions.
  2. “lack of cumulative research tradition” — again, are we trying to make up for that shortcoming, our relative youth compared to other disciplines?
  3. Tech churn: the stuff we study changes so fast that our papers become irrelevant. How do we beat that? Abstracting away from specific technologies might not help; it might instead only move us further into the realm of eggheaded navel-gazing that already turns off business folks.
  4. Academicians not working in industry, not seeing what’s happening on the job — hmm… I’m starting to wonder if IS academicians are being asked to be all things to all people. Do we really need to pursue all that relevance? Are we uniquely expected to be relevant in addition to rigorous?
  5. Constraints on freedom of action within academia — ah! See what the authors say about the previous cause and apply it here. If we have to prepare set curricula that look good on WebCT or PowerPoint, we have a hard time conducting the free-flowing conversations that allow us to really get what’s going on in industry. We also can’t risk writing a paper that breaks the expected conventions of publication review boards, tenure review committees, grant-issuing agencies, etc. We can only safely take a swing at relevance if everyone else in the academy agrees to swing with us.

B&Z offer a number of recommendations for making IS research more relevant. They themselves acknowledge that the simplest may be to write in a “clear, simple, and concise manner” that makes our research “accessible by all the potential readership of a journal” [12]. We’re academics; people know we’re smart. We don’t have to prove it with vocabulary; we should prove it with ideas.

Not that we should dumb ourselves down. We can still trot out some fifty-dollar words. But we should choose words as effective (and sometimes beautiful) conveyers of our thoughts, not as shibboleths.

Lee, A. S., Zmud, R. W., Robey, D., Watson, R., Zigurs, I., Wei, K.-K., et al. (2001). Editor’s comments: Research in information systems: What we haven’t learned. MIS Quarterly, 25(4).

Zmud notes that lots of research has looked at whether firms are getting a good return on investment in IT. That’s encouraging for my dissertation purposes — lots of literature to review. He also notes that much of the IT investment has been in small-return areas. Zmud directs us toward study of on-going IT portfolio management with an eye toward high-return areas.

Robey seems to embrace diversity in IS. He encourages us not to focus so exclusively on publication in MISQ and instead fire our missives out to other journals. Such encouragement fits with the outreach push discussed in an earlier article, the idea that we should evangelize our findings and our field by seeking publication in journals of related fields. Robey also asks for more “interesting” work — i.e., research with “unconventional departures from accepted wisdom” that opens “new and interesting avenues for inquiry.” That sort of “interesting” work will only come from diverse viewpoints, not a rigid delineation of what IS is and isn’t.

Lewis offers my kind of bullet list: six key principles, all tied to one core goal: improving organizational performance. But do these principles comprise a “good grand theory,” or are we just restating the same sensible goals that every organization has: do the job faster, better, cheaper? A grand theory for the field needs to be something more than a goal or value statement. I would assume we are pursuing something like the grand unifying theory of physics, a general theory that explains how things work, not just how we want them to work. Then again, how profound a result is it to say, “These systems work, these don’t”? Is that enough to make us more than a glorified vo-tech? Is our field so weird and diverse that its grand theory needs to be an odd merging of physics and philosophy? (Maybe I’m just grappling with the issue Webster refers to in this article as “meta-theory,” which she says IS hasn’t sufficiently tackled.)

Zigurs also beats the diversity drum. We have to make the expansion of knowledge and understanding our main goal, not our methodology or our personal definition of the field. We do not have to put our own approach to IS on a pedestal to advance the field. We need to keep the doors open to all comers. As long as researchers are interested in the core goal Lewis identified, improving organizational performance with IS, and pursue that goal with scientific rigor, we should communicate and work with them and keep our doors and journals open to them, regardless of theory, methodology, or specialization. We stand to learn from (and serve!) every approach.

Wei and I will be good friends, it appears, at least with respect to increasing the connections between IS and economics. I’m also intrigued by Wei’s suggestion that system development needs more research. We need to look not only at the impact of IS on everything else, but also at the impact of IS on IS. How do programmers, analysts, and other participants in the software development process get from “Gee, I wish we could…” to the shelves at Best Buy? That sounds like some navel-gazing that would be very interesting!

Myers — another vote for diversity, but with a caveat: we can have lots of researchers looking into lots of different problems, but we still need to be able to talk to, work with, and learn from each other. Again, if there is a Grand Unified Theory of IS, we will find it by comparing notes from all the different disciplines that IS overlaps and finding out what the IS-econ, IS-health-tech, IS-marketing, IS-behavioral, IS-etc. research has in common.

Sambamurthy: another new friend for my research area!

What business and IT capabilities, structure, and processes are associated with continued success in leveraging information technologies for superior performance through innovation, globalization, speed-to-market, operational excellence, cost leadership, and customer intimacy?

Add “in South Dakota” to the end of that question, and you’ve got my research area. I would spin one more alternative out of Sambamurthy’s recommendations: where Sambamurthy sees potential for global studies and for studies of specific IT capabilities, I would add the possibility of studying those capabilities in specific regions and cultures. Friedman tells us the world is flat, but differences in wealth, education, infrastructure, and who knows what else may produce differences in how those IT capabilities, structures, and processes work in different regions.

Sambamurthy will also be all over my intended research sources:

“…[R]elevant ideas about business or IT capabilities are not likely to emerge simply from a literature review. Researchers must develop their insights by examining the trade press, working with a few companies, talking to senior executives, and then blending these emerging insights with theory and prior literature.”

Expect to see Darin Namken in my bibliography….

Agarwal mentions “IT human capital,” a concept perhaps particularly relevant in South Dakota. As I research the impact of investment in IT, I will certainly want to consider measures of productivity and profit. But I may also want to look for a way to quantify the creation of IT human capital: students and workers who, because of the incorporation of IT in their schools and businesses, become experts (of varying degrees) in IT and can apply that knowledge themselves, not only within the innovating institution but in other pursuits. IT implementation and integration improves the performance of the specific institution being studied (we hope), but it also creates more IT human capital and thus a more marketable, valuable workforce for attracting other businesses. Then we can talk about balancing costs: turnover costs for specific firms versus overall benefits for the state economy.

Benbasat, I., & Zmud, R. W. (2003). The identity crisis within the IS discipline: Defining and communicating the discipline’s core properties. MIS Quarterly, 27(2), 183-194.

Cognitive legitimacy = it exists
Sociopolitical legitimacy = it should exist (we’ll fund it, publish it, etc.)

The first Benbasat article (with Zmud, above) must have worn me out. “The Identity Crisis…” didn’t add to my understanding of the problem; it only reframed the concerns of the previous article. The moral of this article: don’t forget the info systems in your info systems research.