…plus a quick gripe about IEEE style!

Here’s something to kick around: move all scholarly IS research, review, and publication online. We’re IS geeks; we ought to be able to use IS to optimize our own processes.

I’m turning a couple of previous posts (here and here) into actual homework for INFS 838. The topic bubbled to the top of the soup after Freiburg-im-Breisgau trumped me on ePB. Here’s a really rough draft: your comments and critique are welcome!

And my apologies for the citations: this is still a growing draft, so the citations are by author name rather than IEEE ref number. Has it occurred to anyone else that IEEE is a really inconvenient format for writing? Under APA, I can write a complete citation as I draft. (Heidelberger, 2009) — boom! I’m done. I can add all sorts of other references, and I don’t have to sweat changing the numbers. But in IEEE, as I’m drafting, I have to leave a placeholder for the in-text citation [Heidelberger], then go back and put in those citation numbers from the refs page after I’m all done with the paper. And then if I find one more really good source to add to the paper, and the author’s name starts with A, I have to change the whole numbering scheme. What was IEEE thinking?
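In fairness, the renumbering is mechanical enough to script away. Here is a rough Python sketch (the `number_citations` helper and the author-name placeholder convention are my own invention, not any real bibliography tool) showing how one new source whose author starts with A shifts every number:

```python
import re

def number_citations(text, authors):
    """Replace [AuthorName] placeholders with IEEE-style [n] numbers,
    where n is the author's position in the alphabetized reference list."""
    order = {name: i + 1 for i, name in enumerate(sorted(authors))}
    return re.sub(r"\[([A-Za-z]+)\]",
                  lambda m: "[%d]" % order[m.group(1)]
                  if m.group(1) in order else m.group(0),
                  text)

draft = "Peer review is slow [Heidelberger], and lag hurts [Arnott]."
print(number_citations(draft, ["Heidelberger", "Arnott"]))
# → Peer review is slow [2], and lag hurts [1].

# Add one source by "Anderson" and every number shifts:
print(number_citations(draft, ["Heidelberger", "Arnott", "Anderson"]))
# → Peer review is slow [3], and lag hurts [2].
```

(Of course, if the tooling has to renumber everything for you anyway, that rather proves the point about the format.)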

Besides, double-columns drive me nuts. When I’m reading a PDF, I have to keep scrolling back up to read column 2. Enough already! Down with IEEE!

O.K., let’s see a working draft:

—————————————-

Constant Peer Review: Improving Research
Through Online Writing, Review and Publication

1. Introduction

Research and publication are slow. The world is not. Information systems research in particular often lags behind the realities of a swiftly evolving Internet. When the time between inception and publication of a research project may be measured in years, information systems researchers face a significant challenge to producing results practitioners will find relevant.

Many scholars and institutions are turning to online publication to make academic research more easily accessible to colleagues across regional and disciplinary boundaries as well as to the general public. However, traditional journal publication remains the “gold standard” for researchers seeking prestige and promotion. Researchers have not given much attention to the question of how the application of IT beyond mere automation might affect the peer review process [Mandviwalla].

This research will propose a model for wiki-publication of scholarly research and a means of testing it. We propose to use as a testing ground the graduate program in Information Systems at a small American university. Masters and doctoral students will participate in a program of online, public composition, peer/instructor review, and publication of all research-related assignments. Program leaders will evaluate the quality of student online work against the benchmark of previous cohorts’ performance on similar assignments in the curriculum. It is hoped that these students will produce better research, engage their peers in more constructive discussion of each other’s work, and carry from their graduate school experience an attitude more open to constant online peer review of research in progress. That openness should in turn promote a more collaborative knowledge-sharing culture in academia and, ultimately, better, more relevant research.

2. Standard Journal Publication

Information systems research appears to be failing to meet its challenge to inform practitioners, students, and scholars in other disciplines [Gill]. Among other dismal indicators, practitioner contributions to MIS Quarterly have dwindled significantly in the last two decades, and MIS journals have the second-lowest frequency of citation in out-of-field journals [Gill].

Lag time between submission and publication has been cited as one reason information systems journals may fail to connect with broader audiences. A survey of decision support systems research found lag times of up to five years between submission and publication, a delay that undermines relevance [Arnott].

Scholars have attempted to widen the publication pipeline in various ways. Senior scholars engaged in a 2007 Delphi study to identify a larger number of “premier” IS journals, in the hope of establishing more avenues in which scholars might seek publication, not only for the higher goal of expanding scientific knowledge but also for the very practical goal of strengthening their vitae and earning tenure at their institutions [Saunders]. Another senior scholar committee in the same Delphi study recommended a number of changes to speed up the review process, including capping the number of reviewers at two, capping review lengths at four pages, charging submission fees, and using “online reviewing systems” to streamline workflow [Saunders].

3. Online Publishing: Current Practice

Movement toward online publishing appears to encounter an arguably ironic resistance from the information systems research community. A recent Delphi study asked senior IS scholars to rank 19 categories of suggested strategies for increasing the number of high-quality papers published. “Go electronic” ranked 18th in importance. Electronic publishing fared somewhat better among junior researchers in the study, ranking 11th out of 17 categories [Saunders].

Nonetheless, numerous scholars and organizations are attempting to establish open access publishing as a method of knowledge management for all academia. A meeting of scholars sponsored by the Open Society Institute in December 2001 gave birth to the Budapest Open Access Initiative, which calls for “world-wide electronic distribution of the peer-reviewed journal literature and completely free and unrestricted access to it by all scientists, scholars, teachers, students, and other curious minds” [Open Society Institute]. Starting with 16 signatories, the BOAI currently boasts 5014 individual and 489 organizational signatories worldwide [Open Society 2]. Research supports the idea that articles freely available online will be cited more frequently, indicating greater spread of knowledge [Lawrence].

Online publishing has taken different forms. Lund University Libraries in Lund, Sweden, maintains the Directory of Open Access Journals. DOAJ’s mission is to list online research journals following BOAI principles and exercising peer review or editorial quality control. DOAJ currently lists 4113 journals, 1484 of which are searchable by article. The service currently includes 274,273 articles [DOAJ]. The journals listed there publish articles online, but generally do not conduct open peer review or host reader commentary.

The Public Library of Science adopts the open access model in its publication of seven online journals. Six of the journals are dedicated to life/health science, while the seventh, PLoS ONE, is open to all scientific and medical research. PLoS maintains standard peer review practices, with perhaps the exception that the PLoS ONE reviewers determine only whether a submission is technically sound, not its importance. Once an article is published, it is subject to rating, comment, and discussion by all users, which the editors view as an ongoing determination of an article’s importance and contribution [PLoS]. Time from submission to publication appears to range from two to six months.

On the positive side, reader comment and discussion provides a useful supplement to the peer review process. Even peer reviewers can make mistakes and allow substandard work to pass; attentive readers can catch and flag such work with their comments and ratings [Anderson].

Where PLoS falls short of openness is in its business model: it charges authors for publication. Publication fees range from US$1300 for PLoS ONE to US$2850 for PLoS Biology and PLoS Medicine [PLoS]. Even online publishing costs money, and PLoS defrays costs by offering discounts for institutional members and fee waivers for researchers unable to afford publication costs. If the open access model gains favor in academia, institutions may be able to transfer expenses from journal subscriptions to support for online organizations like PLoS, but in the meantime, such high prices for online publication erect a barrier to participation for some researchers and cause the model to fall short of fully open access.

Cornell University’s arXiv is a “moderated repository,” not an online journal [arXiv Primer]. Started in 1991, arXiv provides open online access to a growing body of e-prints, currently over 536,000 scholarly articles in physics, mathematics, computer science, quantitative biology, quantitative finance, and statistics. arXiv conducts no peer review process. New users submitting their first paper and registered users submitting for the first time in a particular topical archive usually must receive endorsement from another arXiv user. Standards for endorsement are relatively light:

The endorsement process is not peer review [emphasis in original]. You should know the person that you endorse or you should see the paper that the person intends to submit. We don’t expect you to read the paper in detail, or verify that the work is correct, but you should check that the paper is appropriate for the subject area. You should not endorse the author if the author is unfamiliar with the basic facts of the field, or if the work is entirely disconnected with current work in the area [arXiv].

Without peer review, submissions generally achieve publication within a single business day. arXiv moderators review the repository to identify and suggest removal of articles based on inappropriate format, inappropriate topic, duplicated content, misuse of copyrighted material, and excessive submission rate [arXiv mod]. This moderation falls well short of the interaction and improvement possible under peer review. As with PLoS, interaction with the general community of scholars and readers takes place only after publication.

The Public Knowledge Project, a partnership of the University of British Columbia, Simon Fraser University, and Stanford University, has developed Open Journal Systems, an open-source management and publishing system for online journals. According to PKP figures, almost 3000 online journals worldwide were using OJS to manage or publish content as of March 2009 [PKP]. (In a breakdown by continent, North America has the second-highest number of journals using OJS, 833, surpassed, interestingly, by South America, with 877 journals using OJS.)

OJS supports some information systems publications, including the Electronic Journal of Information Systems in Developing Countries (EJISDC) and the Australasian Journal of Information Systems (AJIS). The Public Knowledge Project sees open-source, open access journal publication as an important means to support the development of independent scholarship in developing countries [Esseh].

Some research has been conducted on online publications. One trial found that while open access articles appear to attract more unique readers and more downloads of full article text, open access does not currently appear to correlate with increased citation rates in subsequent work [Davis]. However, comparisons of citation rates may be impacted by perceptions among researchers that online articles are inferior to those obtained from traditional sources [Odlyzko].

4. Model for Online Scholarly Research and Publication with Constant Pre- and Post-Publication Peer Review

Scholarly publication cannot be instant. Good research takes time. Responsible review of such research takes time. Removing the practical delays of producing a paper journal by moving publication online does not in itself reduce the delays inherent in a formal peer review process. Public access to journal articles is important, and the Internet is an obvious choice that many outlets are already making to provide that access faster. We may use Web 2.0 technology to enhance and expand peer review as well. An online scholarly research and publication system that follows the open access model at all stages will look like this:

  1. Researchers will present research ideas, raw data, working drafts, and accepted publications online in standard HTML format. File attachments in other formats are acceptable, but accessing research data should require as few distinct software applications as possible. The ideal is that all relevant information will be accessible with a basic Web browser.
  2. Citations will use hyperlinks to original sources and supplementary information wherever possible and appropriate.
  3. All online documents, working and final, will include comment sections in which other scholars, practitioners, students, and other interested readers may ask questions, critique the document, or submit other commentary pertaining to the document. This element of the model supports the constructivist principle that we build knowledge socially.
  4. The system should include RSS capability to allow interested readers to subscribe to specific research documents to receive updates on progress, revisions, and responses to comments and reviews [Mandviwalla]. That RSS capability should also permit users to subscribe by keyword and topic.
  5. Researchers will be able to clearly designate documents as “working” or “ready for full review” drafts. Documents ready for full review will be transmitted automatically to editors and submitted for transparent peer review. Documents undergoing full review will be open to comments only from designated peer reviewers.
  6. Peer reviews will appear online. Peer reviews will include names of reviewers, written comments, and numerical ratings akin to those on most current peer review forms. Identifying reviewers is a specific response to the concern expressed in [Saunders] that anonymity in the review process may cloak “a relatively inimical and unconstructive way of evaluating each other.”
  7. Upon completion of all assigned peer reviews, editors will change the status of reviewed documents from “ready for full review” to “reviewed.” A document of this status will feature an aggregate peer-review rating alongside the headline.
  8. Documents will support revision and discussion features to allow authors to incorporate recommendations from reviewers while preserving the original document.
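A minimal sketch of the document lifecycle in steps 5 through 7 above, written in Python, may clarify the intended workflow. The class, status strings, and method names below are illustrative assumptions, not the API of any existing platform:

```python
class ResearchDocument:
    """Sketch of the status workflow: working -> ready for full review -> reviewed."""

    TRANSITIONS = {
        "working": {"ready for full review"},
        "ready for full review": {"reviewed"},
    }

    def __init__(self, title):
        self.title = title
        self.status = "working"   # working drafts are open to public comment
        self.reviews = []         # (reviewer name, comments, numeric rating)

    def set_status(self, new_status):
        if new_status not in self.TRANSITIONS.get(self.status, set()):
            raise ValueError(f"cannot move from {self.status!r} to {new_status!r}")
        self.status = new_status

    def add_review(self, reviewer, comments, rating):
        # Reviews are signed: reviewer names appear alongside written
        # comments and numeric ratings (step 6).
        if self.status != "ready for full review":
            raise ValueError("document is not under full review")
        self.reviews.append((reviewer, comments, rating))

    def aggregate_rating(self):
        # Displayed alongside the headline once the status is "reviewed" (step 7).
        ratings = [r for _, _, r in self.reviews]
        return sum(ratings) / len(ratings) if ratings else None

doc = ResearchDocument("Constant Peer Review")
doc.set_status("ready for full review")
doc.add_review("A. Reviewer", "Tighten section 3.", 4)
doc.add_review("B. Reviewer", "Solid methodology.", 5)
doc.set_status("reviewed")
print(doc.aggregate_rating())  # → 4.5
```

The one-way transition table enforces that a document cannot silently slip back to “working” once reviews are on the record; revisions instead build on the preserved original (step 8).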

The research plan outlined below attempts to implement the key features of this model in a single software platform shared by all users. However, the model need not be executed with a centralized software scheme. Especially for pre-publication purposes of serving collaborative development of good research, researchers may carry out the drafting and comment solicitation aspects of the model on their own websites, with wiki, blog, or other content management software of their choosing.

5. Research Plan: Design, Implementation, Evaluation

Ideally this research would establish a scholarly research and publication clearinghouse with the participation of many active scholars and test the effectiveness of a system following the above model in actual scholarly publication. Creating such an online system and drawing a critical mass of scholarly participants is a daunting task. It seems reasonable, therefore, to try out such a system in a more controlled laboratory-like setting. This research will test the model of wiki-publication for scholarly work through integration in the curriculum of the graduate program in Information Systems at a small American university. It will investigate the following research questions:

  1. Does online drafting and publication promote increased research-related student interaction?
  2. Do students perceive online interaction in the course of their research and writing to be beneficial to their intellectual output?
  3. Does online drafting and publication improve the quality of student research and writing?
  4. Does online drafting and publication attract participation (readership or commentary) from individuals outside the immediate university community?

5.1 Prototype Academic Wiki-Publishing Tool

The online knowledge management tool deployed for this experiment will have the following features:

  • Secure registration and log-in for authorized users.
  • Administrative settings to allow flexible and multiple definitions of user roles and permissions to allow students varying levels of access to publish their own documents, comment on other students’ work, and, where appropriate, make revisions to documents created by other users.
  • Ability to set viewing permissions for specific documents to private; however, this feature should be used sparingly and in special cases only, as the purpose of the experiment is to expose the broadest range of scholarly activity to public view.
  • Ability to store all revisions and revert to any version of a document.
  • Discussion capability attached to each document.
  • RSS feeds customizable by author, topic, course number, etc. to allow all users, students and professors alike, to follow updates and discussion on specific assignments, activity by a specific author or set of authors, etc.
  • Search and tagging capability.
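One way to realize the flexible roles and permissions in the list above is a simple role/permission matrix. The sketch below is illustrative only; the role names and permission strings are assumptions for this experiment, not MediaWiki’s or Drupal’s actual permission scheme:

```python
# Illustrative role/permission matrix for the requirements above.
# Role names and permission strings are assumptions, not a real platform's API.

PERMISSIONS = {
    "student":    {"publish_own", "comment", "revise_own"},
    "instructor": {"publish_own", "comment", "revise_own",
                   "revise_any", "set_private"},
    "admin":      {"publish_own", "comment", "revise_own",
                   "revise_any", "set_private", "manage_roles"},
}

def can(role, action):
    """Check whether a role holds a given permission; unknown roles get none."""
    return action in PERMISSIONS.get(role, set())

assert can("student", "comment")
assert not can("student", "set_private")  # private pages: special cases only
```

Both MediaWiki and Drupal support group-based permission configuration of roughly this shape, which is one reason either platform can satisfy the requirements list.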

Numerous knowledge management tools can satisfy these requirements. The most logical choice is the well-tested MediaWiki platform, which supports Wikipedia, the largest and most successful collaborative knowledge management experiment on the Internet. While not strictly a wiki, Drupal content management software may also satisfy the requirements of this experiment. The Drupal installation may more easily facilitate auxiliary functions like discussions not directly related to research assignments; however, MediaWiki focuses more directly on the public collaborative scholarly production that is at the center of this project. Both platforms are open source and may be installed and maintained by existing staff and information systems students. As entirely Web-based software, both platforms will impose minimal technical requirements on students—nothing more than a functional Web browser, independent of any specific operating system or other software.

The Open Journal Systems software is clearly enjoying success among active publishing organizations and is a promising tool for large-scale online publishing. However, it appears to lack (at this time) the open discussion and review functions outlined in the requirements for this experiment. Plugins to support such features could be created, but at this time, MediaWiki and Drupal both appear to offer more functionality “out of the box” to satisfy both the requirements of this experiment and the adaptability necessary for the classroom.

5.2 Justification: Coherence with Current Practice and Educational Mission

The cooperating university in this experiment already uses proprietary curriculum management software to support distribution of course information and curricular materials, student discussion and collaboration, assignment submission, assessment activities, and grading. Various courses in the graduate program have used an assortment of online collaborative tools. Students are thus acquainted with using online tools for producing coursework.

Some students are already experimenting with publishing their coursework online through individual blogs and other online media. I have used a blog on WordPress to archive various assignments, take course notes, and record brainstorms and useful sources. I have also used this blog to post a dissertation proposal as an easy link to include in e-mails to potential committee members, fellow students, and the author best known for use of the planned methodology. Another doctoral student in the graduate program has incorporated commentary about his coursework and archived coursework on his own website alongside material relating to his other professional activities. After we found each other’s websites, the other student read my notes on a presentation he prepared for class and said he learned from reading an independent account of his own work. That student’s work also provided key sources and insights that formed the basis of a conference paper and other assignments that I have created. Both of us started putting content online in hopes of receiving input from other readers, and we have already benefited from such input. Our experience suggests a reasonable and pedagogically responsible justification for involving more students in the doctoral program in an expanded model of sharing scholarly work online.

The university at which this experiment will take place is in the process of building its doctoral program and expanding its graduate course offerings. Long known mostly as a teaching institution, the university is making great strides in building a complementary research culture. Faculty and students alike are being encouraged to conduct more original research and seek publication and conference presentation opportunities. This experiment fits with existing curriculum and research goals to promote scholarship that will withstand rigorous public scrutiny. As a side benefit, presenting student work (both in-progress and complete) offers an opportunity to raise the university’s public profile, another logical goal of the university.

5.3 Implementation

Doctoral courses at the participating university will migrate student coursework research and writing activities to the chosen software platform (referred to hereinafter as the wiki). These coursework activities will include student discussions, collaborative project development, multimedia presentations, and working and final drafts of written research reports. The existing curriculum management software will still be used for official assignment submissions, restricted quizzes and tests, and grading activities. In compliance with the Family Educational Rights and Privacy Act (FERPA), the institution will not post any grades or other official educational records on the wiki; those items will remain on the curriculum management system.

The university will phase in the wiki during its summer session, with full launch for use in all graduate courses in the fall semester. The graduate department will also encourage students working on conference and journal papers to draft and publish those materials on the wiki. Managers of the experiment will provide assistance to students and faculty in applying Creative Commons licensing and establishing copyright over their publicly available work.

5.4 Methodology: Data Collection and Assessment

This experiment in online composition, peer review, and publication will be evaluated by several sets of data over at least one full academic year (fall, spring, and summer terms). The wiki will receive a rigorous review from university faculty and officials independent of this investigation to determine whether to continue using this system or similar technology and pedagogical approach. The hope is that the program will prove sufficiently beneficial to continue its use in the university for at least three years, long enough to be used consistently by at least one cohort of students throughout their graduate course requirements, if not for their dissertation work.

5.4.1 Wiki content and usage.

The wiki itself will provide a wealth of publicly available data on the scholarly discourse supported by this system. The researchers will analyze the quantity and quality of student writing, editing, and interaction concurrently with student use of the system.

From a quantitative perspective, researchers will measure the frequency of student activity, the time distribution of activity (the extent to which students cluster activity around assignment due dates), and the distribution of activity among users (all students will submit assignments, but how many will engage spontaneously in conversation? How many will volunteer comments and criticisms on other students’ work beyond requirements imposed by instructors?). Researchers will also analyze the text of online discussions, commentary, and pages to determine the relative frequency of certain types of contributions (social, scholarly, task-oriented, etc.). These data may be compared with similar data from previous semesters’ student-generated content on the current curriculum management software.
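The due-date clustering measure can be computed directly from contribution timestamps. The following sketch uses invented sample dates; a real analysis would draw timestamps from the wiki’s revision history:

```python
from datetime import date

def days_before_nearest_deadline(activity_dates, deadlines):
    """For each contribution date, days remaining until the nearest
    upcoming assignment deadline. Assumes every activity date precedes
    at least one deadline (true for in-term coursework)."""
    return [min((d - a).days for d in deadlines if d >= a)
            for a in activity_dates]

# Invented sample data: two deadlines, three wiki edits.
deadlines = [date(2009, 10, 15), date(2009, 12, 10)]
edits = [date(2009, 10, 14), date(2009, 10, 15), date(2009, 11, 20)]
print(days_before_nearest_deadline(edits, deadlines))  # → [1, 0, 20]
```

A distribution concentrated near zero would indicate deadline-driven bursts of activity; a flatter distribution would suggest the sustained engagement the model hopes to encourage.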

The researchers will also compile statistics on usage of the wiki site. Relevant data will include number of visits and page views, number of unique IPs among visitors, and number of visits and contributions in comment sections attributable to students, faculty, and individuals outside the immediate university community.
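Tallying these usage statistics from a raw access log is straightforward. The log format below is an assumption for illustration; actual fields will depend on the Web server configuration:

```python
from collections import Counter

# Invented sample log entries: (visitor IP, page requested).
log = [
    ("203.0.113.5",  "/wiki/Draft_Proposal"),
    ("203.0.113.5",  "/wiki/Draft_Proposal"),
    ("198.51.100.7", "/wiki/Talk:Draft_Proposal"),
]

page_views = len(log)                           # total page views
unique_ips = len({ip for ip, _ in log})         # distinct visitors (by IP)
views_per_page = Counter(page for _, page in log)

print(page_views, unique_ips)  # → 3 2
```

Attributing comment-section contributions to students, faculty, or outside readers would additionally require joining these logs against the wiki’s registered-user accounts, with outside readers identified as the remainder.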

From a qualitative perspective, the researchers will analyze the evolving texts created and revised by students to determine the extent to which students engaged in ongoing peer review of each other’s work (both in compliance with and beyond minimum course requirements) and the extent to which that online review during the brainstorming, research, and writing process appears to have shaped and improved the students’ work.

5.4.2 Student data.

The researchers will work with university staff during the implementation of the wiki to field student questions and suggestions. This work will not only ease the burden on institution staff; it may also compel the researchers to make changes in the system to ensure the best possible educational experience for all students, up to and including the possibility that, if the wiki proves more of an obstacle than an enhancement to effective learning, it will be terminated. The educational mission of the university and its service to students always takes priority over the interests of the researchers in this project. However, empirical results have demonstrated that the use of online tools can support knowledge construction, creative expression, and accountability among students [Du], suggesting that the potential for educational detriment, while warranting vigilance, is minimal. As the researchers serve the institution as advisors on this implementation, they will also gain useful insights directly from students, in the form of online and in-person help requests, into the effectiveness of the wiki.

The researchers will solicit student comment via two methods. First, the wiki will include a clearly defined “About” section which explains clearly the objectives of the research project and invites student discussion about the wiki project itself. This “meta-discussion” area will allow anonymous comments, and researchers will respond to user questions and participate in further conversation. It will also provide students with e-mail addresses and phone numbers that will allow them to send comments privately and directly to the researchers. Second, the researchers will host “Graduate Social Hour” events during the first month of each semester. The researchers will conduct group discussions about the wiki, encouraging students to discuss their thoughts about the system. As motivation for students to participate, the on-campus events will also serve as social mixers to bring graduate students together for conversation and cookies. For off-campus students, the researchers will hold online versions of the Graduate Social Hour, conducted via Elluminate or other available online conferencing software. (The researchers will also send off-campus students mailable equivalents of the goodies served at the on-campus Social Hours to thank them for their participation.)

5.4.3 Faculty data.

Observations from faculty will be crucial to evaluating the effectiveness of the wiki as a model for scholarly discourse. Graduate faculty have experience in evaluating student work from both pedagogical and research perspectives. Faculty at this university can also provide perspective on the quality of student coursework and research relative to student performance in preceding academic years. The researchers will thus seek faculty input through a number of available channels.

The researchers will surely interact with faculty in their role as the main support personnel for the wiki. As convenient to the faculty members, the researchers will conduct ongoing conversations throughout the academic year with faculty about the progress (or lack thereof) students appear to be making under the wiki publication model. Faculty will have access to the wiki’s meta-discussion area and will be able to ask questions and guide the ongoing development of the system, just like their students. The researchers will participate as permitted in faculty meetings to discuss the system and gather further faculty impressions about the system. Again, the faculty mission to teach and mentor effective researchers comes first: if faculty input during the course of the experiment indicates that the experiment is damaging their ability to teach, the experiment will be immediately changed or, if necessary, terminated.

6. Conclusion

If, as argued in [Straub], Type II errors that reject useful papers are more serious than Type I errors that accept weak papers, online publication may provide a significant benefit to the academic community. Strong papers that receive low ratings from traditional peer reviewers will still be published online, where the realization by readers of the research’s utility can save it from obscurity, and the information systems field will benefit. At the same time, weak papers that slip by peer reviewers will still be available, just as in the current publishing model, but readers will be able to more directly and quickly rectify the reviewers’ error, minimizing the damage that may take place in the status quo.

This research attempts to make progress toward establishing the usefulness of ongoing online peer review and publication in supporting information systems research. If the experiment proves successful, it will have the added benefit of preparing graduate students for a developing publishing model that they will see gaining prominence as they move through their academic careers.

7. References

[1]         C. Anderson, “Technical solutions: Wisdom of the crowds,” Nature, 2006. Retrieved 2009.05.05 from http://www.nature.com/nature/peerreview/debate/nature04992.html

[2]         D. Arnott and G. Pervan, “Eight Key Issues for the Decision Support Systems Discipline,” Decision Support Systems, vol. 44, 2008, pp. 657–672.

[3]         arXiv.org, “The arXiv Endorsement System,” Jan. 2004. Retrieved 2009.05.05 from http://arxiv.org/help/endorsement

[4]         arXiv.org, “The arXiv Moderation System,” Mar. 2009. Retrieved 2009.05.05 from http://arxiv.org/help/moderation

[5]         arXiv.org, “arXiv Primer,” Feb. 2009. Retrieved 2009.05.05 from http://arxiv.org/help/primer

[6]         P.M. Davis, B.V. Lewenstein, D.H. Simon, J.G. Booth, and M.J.L. Connolly, “Open access publishing, article downloads, and citations: randomised controlled trial,” BMJ,  vol. 337, Jul. 2008, p. a568. Retrieved 2009.05.05 from http://www.bmj.com/cgi/content/full/337/jul31_1/a568

[7]         Directory of Open Access Journals, home page, May, 2009. Retrieved 2009.05.05 from http://www.doaj.org/

[8]         H.S. Du and C. Wagner, “Learning with Weblogs: An Empirical Investigation,” Proceedings of the 38th Annual Hawaii International Conference on Systems Sciences, 2005, p. 7b.

[9]         S. S. Esseh and J. Willinsky, “Strengthening Scholarly Publishing in Africa: Assessing the Potential of Online Systems.” In Workshop on Online Scholarly Journal Publishing, 2006, Africa. Retrieved 2009.05.05 from http://pkp.sfu.ca/files/AfricanWorkshops.pdf

[10]     G. Gill and A. Bhattacherjee, “Whom Are We Informing? Issues and Recommendations for MIS Research from an Informing Sciences Perspective,” MIS Quarterly, vol. 33, Jun. 2009, pp. 217–235.

[11]     S. Lawrence, “Online or Invisible?,” Nature,  vol. 411, 2001, p. 521. Retrieved 2009.05.05 from http://citeseer.ist.psu.edu/online-nature01/

[12]     M. Mandviwalla, R. Patnayakuni, and D. Schuff, “Improving the peer review process with information technology,” Decision Support Systems, vol. 46, Dec. 2008, pp. 29–40.

[13]     A. Odlyzko, “The rapid evolution of scholarly communication,” Learned Publishing,  vol. 15, Jan. 2002, pp. 7–19.

[14]     Open Society Institute, “Budapest Open Access Initiative,” Feb. 2002. Retrieved 2009.05.05 from http://www.soros.org/openaccess/read.shtml

[15]     Open Society Institute, “Budapest Open Access Initiative: View Signatures,” 2009. Retrieved 2009.05.05 from http://www.soros.org/openaccess/view.cfm

[16]     Public Knowledge Project, “Journals Using Open Journal Systems by Continent,” 2009. Retrieved 2009.05.04 from http://pkp.sfu.ca/ojs-geog

[17]     Public Library of Science, “PLoS ONE: Journal Information,” 2009. Retrieved 2009.05.05 from http://www.plosone.org/static/information.action

[18]     Public Library of Science, “Publication Fees,” 2009. Retrieved 2009.05.05 from http://www.plos.org/journals/pubfees.html

[19]     C. Saunders and I. Benbasat, “A Camel Going Through the Eye of a Needle,” MIS Quarterly,  vol. 31, 2007, pp. iv–xviii.

[20]     D.W. Straub, “Diamond Mining or Coal Mining? Which Reviewing Industry Are We In?” MIS Quarterly,  vol. 33, Jun. 2009, pp. iii–viii.
