WriteSim TCExam - An open source text simulation environment for training novice researchers in scientific writing

Abstract

Background

The ability to write clearly and effectively is of central importance to the scientific enterprise. Encouraged by the success of simulation environments in other biomedical sciences, we developed WriteSim TCExam, an open-source, Web-based, textual simulation environment for teaching effective writing techniques to novice researchers. We shortlisted and modified an existing open-source application, TCExam, to serve as the textual simulation environment. After testing usability internally within our team, we conducted formal field usability studies with novice researchers, followed by formal surveys of researchers fitting the roles of administrators and users (novice researchers).

Results

The development process was guided by feedback from usability tests within our research team. Online surveys and formal studies involving members of the Research on Research group and selected novice researchers show that the application is user-friendly. To date it has been used to train 25 novice researchers in scientific writing, with encouraging results.

Conclusion

WriteSim TCExam is the first Web-based, open-source textual simulation environment designed to complement traditional scientific writing instruction. While initial reviews by students and educators have been positive, a formal study is needed to measure its benefits in comparison to standard instructional methods.

Background

Biomedical researchers need strong writing skills to obtain funding and to communicate the results of their research [1–3]. The success of grant proposals and research manuscripts depends as much on the quality of the writing as on the promise of the research or the significance of the results. Yet relevant, extensive [4], and effective [5] teaching mechanisms in the area of scientific writing are lacking. When formal instruction is available, it is often reductive and mechanistic [6], and it fails to impart basic knowledge of rhetorical techniques, structure-content differentiation, style, clarity, and accuracy [7]. Researchers need a firm grounding in these concepts to participate in highly specialized scientific communities [8]. There is a clear need for instructional methodologies that incorporate hands-on experience, familiarization with existing literature, consistent practice, topical relevance, and explicit learning methods [9–12].

Innovations in information technology have given rise to new tools for science education. Increasingly sophisticated simulation environments, for example, have been used in a variety of disciplines, including engineering [13], economics [14], and physics (e.g., electrical circuits) [15]. Flight simulators are used to train pilots and astronauts [16–18], war games train military personnel [19], and management games train business managers and decision makers [20, 21]. More recently, simulations have been used in clinical settings, such as critical care [22] and emergency medicine [23], to train residents and medical students, and they have proven effective in teaching nurses how to respond to uncommon, composite clinical situations [24].

Simulation environments are designed to mimic real-world systems with sufficient accuracy to provide the rough equivalent of hands-on experience [25, 26]. They duplicate problems trainees will encounter in the real world, which improves problem-solving skills [27]; they can be programmed to simulate any situation; they standardize training routines; and they can be accessed at any time convenient to the user. Used consistently, they can substantially develop skills. Current practices in teaching scientific writing, by contrast, are largely based on trial and error, which discourages young researchers and makes poor use of their time, and there are currently no simulation environments for developing writing skills. Text structure templates that guide researchers on the role of each text block in a manuscript have previously been developed and tested by the Research on Research (RoR) group [28, 29]. However, during informal use of these templates while coaching researchers, our group noticed that researchers appreciated the templates but at times found it difficult to populate the text blocks with relevant content. This stimulated our interest in exploring the role of simulation environments in coaching researchers in manuscript writing.

Given their success in other biomedical sciences, and their proven record of honing necessary skills, we believe simulation environments can be a powerful tool for scientific writing instruction. This paper presents our contribution in the form of WriteSim TCExam, an open-source, Web-based, textual simulation environment.

Implementation

Design objectives

We set out to develop a simulation environment that helps novice researchers appreciate and use the proper structure, content, and style of scientific writing. We aimed to design a simulation environment that is: 1) Web-based; 2) open-source; 3) well-structured and documented; 4) user-friendly; and 5) amenable to a question-and-answer format. Rather than starting from scratch, we decided to modify an existing application. We reviewed GoVenture Simulation Designer [30], a commercial desktop application with which nontechnical users can build custom learning simulations, and TCExam [31, 32], a Web-based application that enables educators to create, schedule, deliver, and report on surveys, quizzes, and exams. We chose TCExam for its simple, intuitive interface and its open-source architecture.

We reviewed existing applications for examples of features that could be added to TCExam to enable it to function as a textual simulation tool. We assessed the feasibility of each feature against the scope of our objectives and the time and funds available. In the end, we decided to preserve the basic architecture and interface of TCExam [31], with the following modifications:

  1. End users would receive immediate feedback upon answering questions. Incorrect answers would produce detailed explanations, and the user would then select from the remaining answers.

  2. A grading mechanism would provide a summary of the user's performance, identifying areas for further improvement.

  3. Blogs and forums would enable mentoring relationships via interactions among participants and between participants and the administrator. This would facilitate the exchange of ideas and help answer participants' questions in a friendly environment.

  4. Persistent bugs in the existing version of TCExam would be corrected, and the user interface would be modified to make it more user-friendly.

After incorporating these changes, we renamed the application "Writesim TCExam." We maintained a list of further potential modifications that were too time-consuming or expensive, or beyond the current scope of the project, to be implemented later.

Characteristics of Writesim TCExam users

Writesim TCExam has two distinct interfaces, an admin interface and a user interface, each with a different set of uses. The admin interface is meant to be used by senior researchers, mentors, or course instructors (hereafter "administrators") involved in training novice researchers in scientific writing. They can design the simulation material, upload it, design the simulation test, and share it with novice researchers (hereafter "users"), who access it through the user interface. The user interface lets users access and interact with the simulation material developed by the administrators.

Steps to design and implement a textual simulation environment

Step 1: Design the simulation material

Writesim TCExam allows administrators to easily design and manage simulation material related to manuscript and grant writing. For the purposes of Writesim TCExam, we define simulation material as examples derived from previous publications and presented in a question - answer - answer key format.

By examining a series of such examples, novice researchers can learn to distinguish between good and poor scientific writing and understand the placement and flow of content in a manuscript. This equips them with the tools and perspective to evaluate and improve their own writing.

In parallel with modifying TCExam, we developed simulation material from 30 randomly selected peer-reviewed publications published in BioMed Central [33] and indexed in PubMed [34]. Next, we analyzed the structure of each manuscript and dissected it to identify the text blocks [28] that drive the flow of its argument. We used these text blocks to formulate 100 simulation questions focused on helping users understand the role of the various text blocks in a manuscript and how to populate them with relevant content.

We then prepared a list of possible answers to each question. Some of these answers were very close to the right answer, while others were distinctly unrelated. The underlying purpose was to expose users to a wide range of common errors and help them understand the distinction between ideal and less-than-ideal scientific writing. We also designed answer keys to be displayed in a pop-up window whenever a user selected a wrong answer. We believe the answer keys help users understand the reasoning behind correct and incorrect answers, which they can then apply in their subsequent scientific writing. Next, we classified the material into topics corresponding to the IMRD (Introduction, Methods, Results and Discussion) structure and uploaded it to Writesim TCExam. Since our initial purpose was to test the modified TCExam application, we randomly selected and retrieved publications from PubMed. In future practice, however, we propose developing simulation material from high-impact, peer-reviewed publications, which would enable users to appreciate the architecture of high-impact publications while learning the nuances of scientific writing.
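
To make the question - answer - answer key format concrete, the sketch below shows how a single simulation item built around a dissected Introduction text block might be represented. It is purely illustrative: the field names, question text, answer options, and keys are hypothetical examples written for this description, not TCExam's internal data format or actual Writesim content.

<?php
// Hypothetical example of one simulation item in the
// topic - question - answer - answer key format described above.
$simulationItem = [
    'topic'    => 'Introduction',   // IMRD section the dissected text block belongs to
    'question' => 'Which sentence best states the knowledge gap that motivates the study?',
    'answers'  => [
        [
            'text'    => 'Although X has been widely studied, no trial has compared A with B.',
            'correct' => true,
            'key'     => 'Correct: it states what is known and names the specific gap the study fills.',
        ],
        [
            'text'    => 'X is an important health problem worldwide.',
            'correct' => false,
            'key'     => 'A near miss: it restates background but does not identify a gap.',
        ],
        [
            'text'    => 'Our results show that A is superior to B.',
            'correct' => false,
            'key'     => 'Misplaced content: findings belong in the Results section, not the Introduction.',
        ],
    ],
];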

Step 2: Add the simulation material through Admin interface

Administrators log in to the admin interface, which has two major sections: a central work area and a navigation pane on the left. The navigation pane provides access to six major sections of the application: index, users, topics, tests, help, and info. The index section (displayed by default) provides a brief overview of the six sections and their respective subsections in the work area. Selecting a specific section or subsection displays a brief description of its purpose and functionality at the bottom of the screen. Detailed information about most features and functionality is available in the documentation on the TCExam website [32].

The following sub-steps describe how simulation material can be added to Writesim TCExam:

i. Define topics

Writesim TCExam follows a topic - question - answer - answer key hierarchy. After naming a topic, administrators can add a brief description and, if needed, an image (Figure 1).

Figure 1. Writesim TCExam - Workflow diagram for users.

ii. Add questions

Once the topics are defined, questions related to each topic can be uploaded in the question management ('questions') section, which can be accessed from an icon just below the topics page or from the navigation pane. Administrators can add the question description along with any images. Additional settings include the answer type, the difficulty level, and an enable/disable flag. Answer types include single answer, multiple answer, free answer, and ordered answer (Figure 2).

Figure 2. Users section of TCExam.

iii. Add answers and answer keys

For multiple-answer questions, administrators can add the answer options along with their keys in the multiple answer management ('answers') form. They can also upload images, mark answers as right or wrong, assign scores, set each answer's position in the list, and enable or disable individual answers. All uploaded material can be reviewed through the 'list' section in the navigation pane (Figure 3).

Figure 3. Adding answers and answer keys in Writesim TCExam.
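
The per-answer settings listed above (right/wrong flag, score, list position, enable/disable) determine what a user ultimately sees when a question is displayed. The fragment below is a minimal sketch of how those settings could be applied at display time, under assumed field names rather than TCExam's actual schema: disabled options are dropped and the rest are ordered by their configured position.

<?php
// Minimal sketch (assumed field names, not TCExam's schema): apply the
// per-answer settings an administrator configures before display.
function answersForDisplay(array $answers): array
{
    // Keep only the answers the administrator has enabled.
    $visible = array_filter($answers, fn(array $a): bool => $a['enabled']);

    // Order them by the position assigned in the answer management form.
    usort($visible, fn(array $a, array $b): int => $a['position'] <=> $b['position']);

    return $visible;
}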

Step 3: Implement simulation test

After uploading the simulation material, administrators can implement the simulation test for a single user or for a set of users ('groups') by following these sub-steps:

i. Add users and groups

Administrators can grant access to individual users or groups of users by filling out the 'User' form. A list of users can also be imported into Writesim TCExam (Figure 4).

Figure 4. Adding users and groups for a simulation test in Writesim TCExam.

ii. Design a simulation test

Administrators design a simulation test using the test management form, whose fields include the test name, a brief description of the test, the period of user access, and the total test time. They can choose to draw simulation questions randomly from the previously designed question list, define the score points for each question, and decide whether results should be displayed to users at the end of the test. Access to the simulation test can be granted to specific user groups. Finally, administrators can restrict the test to specific topics and set the number, type, and difficulty level of the questions. After designing the test, administrators can send users a link to the user interface [35] along with their login details (Figure 5).

Figure 5. Designing a simulation test in Writesim TCExam.
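
As a rough illustration of this test-design step, the sketch below gathers the fields just described (name, description, access period, time limit, groups, scoring, and a random draw of questions by topic and difficulty) into a hypothetical test definition. The structure and field names are assumptions made for illustration; they do not mirror TCExam's test management form or its database.

<?php
// Hypothetical test definition and random question draw (illustrative only).
$testConfig = [
    'name'           => 'Introduction Section',
    'description'    => 'Identifying the role of text blocks in an Introduction',
    'access_from'    => '2010-06-01 00:00',
    'access_until'   => '2010-06-30 23:59',
    'time_limit_min' => 45,
    'groups'         => ['novice-researchers'],
    'show_results'   => true,
    'selection'      => [   // one random draw per rule
        ['topic' => 'Introduction', 'difficulty' => 'medium', 'count' => 5, 'points' => 2],
        ['topic' => 'Methods',      'difficulty' => 'easy',   'count' => 5, 'points' => 1],
    ],
];

// Randomly pick the requested number of questions for one selection rule.
function drawQuestions(array $questionPool, array $rule): array
{
    $candidates = array_values(array_filter(
        $questionPool,
        fn(array $q): bool => $q['topic'] === $rule['topic']
                           && $q['difficulty'] === $rule['difficulty']
    ));
    shuffle($candidates);
    return array_slice($candidates, 0, $rule['count']);
}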

Steps to use the textual simulation environment

Users access Writesim TCExam by following the user interface link provided by the administrators. The simulation test displays a list of questions that they can work through one after another. Selecting a question displays its description along with the possible answer options, and users choose the option that they think best answers the question. If the answer is wrong, an answer key explaining why is displayed, and the user is given one more chance to choose the right answer; a second wrong choice displays the key for the correct answer. Users can answer a question or skip it, and they can leave comments on specific questions. Finally, they can answer all the questions in the list or end the test midway and submit it. The following figures show the user interface (Figure 6) and the user interface with an answer key (Figure 7).

Figure 6. User interface of Writesim TCExam.

Figure 7. User interface with key.
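
The two-attempt feedback cycle described above can be summarized in a few lines. The sketch below is not taken from Writesim TCExam's source; it assumes the hypothetical $simulationItem structure shown earlier and a user-interface callback ($askUser) that returns the index of the option the user selected.

<?php
// Sketch of the feedback cycle: a wrong first choice shows that option's
// explanatory key and allows one retry; a second wrong choice reveals the
// key for the correct option. (Illustrative only, not TCExam code.)
function answerQuestion(array $item, callable $askUser): array
{
    // Locate the correct option.
    $correctIndex = null;
    foreach ($item['answers'] as $i => $answer) {
        if ($answer['correct']) {
            $correctIndex = $i;
            break;
        }
    }

    for ($attempt = 1; $attempt <= 2; $attempt++) {
        $choice = $askUser($item);   // index of the option the user selected
        if ($choice === $correctIndex) {
            return ['correct' => true, 'attempts' => $attempt];
        }
        // Wrong answer: show the key explaining why this option falls short.
        echo $item['answers'][$choice]['key'], PHP_EOL;
    }

    // Both attempts used: reveal the key for the correct option.
    echo $item['answers'][$correctIndex]['key'], PHP_EOL;
    return ['correct' => false, 'attempts' => 2];
}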

Evaluation of simulation tests completed by users

While designing the simulation material, administrators can assign points to the questions so that scores are generated at the end of the simulation test. Once a simulation test has been completed, administrators can access each user's individual results and view the test score. They can also email the results to the user as a PDF file.
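
In essence, this grading step amounts to summing the points for correct responses and flagging the topics where questions were missed. The sketch below is a hypothetical illustration of such a summary; the response format and field names are assumptions, not Writesim TCExam's actual grading code or report layout.

<?php
// Illustrative scoring summary: total score plus a per-topic count of
// missed questions, pointing the user to areas for further improvement.
function summarizeResults(array $responses): array
{
    $summary = ['score' => 0, 'max_score' => 0, 'missed_by_topic' => []];

    foreach ($responses as $r) {   // each $r: ['topic' => ..., 'points' => ..., 'correct' => bool]
        $summary['max_score'] += $r['points'];
        if ($r['correct']) {
            $summary['score'] += $r['points'];
        } else {
            $summary['missed_by_topic'][$r['topic']] =
                ($summary['missed_by_topic'][$r['topic']] ?? 0) + 1;
        }
    }

    return $summary;
}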

How Writesim TCExam works as a simulation environment

TCExam's question-and-answer format, based on the computer-aided formative assessment method, is a promising foundation upon which to build a scientific writing simulation environment. Formative assessment aims to improve learning rather than grade it and is intimately related to instruction. Well-conceived and well-designed question and answer fields can present examples of text blocks from different sections of a manuscript, and instantaneous feedback clarifies concepts while providing positive reinforcement [36, 37]. The difficulty that users of Writesim TCExam experience in distinguishing between ideal and less-than-ideal text structures and content placement simulates the challenges novice scientific writers face when they write their first manuscripts. The power of reinforcement through immediate feedback has often been exploited by behavioral scientists in the design of computer-based instruction tools [38–41], and early process-based studies have also demonstrated the value of instant, corrective feedback [41, 42].

The success of this application depends on well-conceived and well-designed simulation material that helps students understand various aspects of scientific writing: for example, the role of a manuscript's various sections (see the templates [43] for a description), the proper framing of subsections, and content placement. The list of topics can be extended according to the scope and goals of scientific writing instruction. To develop this material, our group drew on its expertise in standardized training methods (such as manuscript dissection and templates) that help novice writers structure and organize their thoughts.

Blog and forum

Writesim TCExam's blog and forum can be accessed from both the admin and user interfaces. The blog is based on the WordPress software [44], and the forum is based on phpBB (PHP Bulletin Board) [45], an open-source forum package. The blog serves as a platform for administrators to post guidance material, videos, and slides on scientific writing. The forum serves as a platform for users to communicate their experiences, difficulties, or questions about the simulation test or about scientific writing in general. Administrators can comment and address difficulties at an individual level in the forum, and users can provide input and feedback on the simulation questions and test, which helps administrators improve the simulation material.

Proposed workflow

Administrators

After designing the simulation material, administrators can log in and upload it to the relevant sections of Writesim TCExam through the admin interface. They can then 1) design simulation tests, 2) define users, and 3) implement a test by providing access to the users. Administrators can complement existing courses and training programs in scientific writing with a textual simulation environment like Writesim TCExam customized to their needs. In preparation for the simulation test, administrators can provide users with study material on scientific writing, which may consist of 1) a list of peer-reviewed articles focused on scientific writing, for example "The Science of Scientific Writing" [46], 2) slides and videos on scientific writing prepared by the administrators, 3) material on the distinction between structure and content and scientific writing templates [28], and 4) recommended books on scientific writing. Administrators can also provide instructions on how to use the application through an in-class session, slides, or videos. Once the users have completed the simulation material, administrators can evaluate their test results to score or grade them and, if they choose, email the results to the users (Figure 8).

Figure 8. Admin user workflow.

Users

Before using the simulation environment, users are expected to review the reading or other guidance material on scientific writing shared with them. They can then use the simulation environment via the link and access credentials provided by the administrators. After using the simulation environment, they may receive their results for self-evaluation either immediately or later by email (Figure 9).

Figure 9. Simulation user workflow.

Field usability

Once the application was built, the first three months were devoted to field usability tests, which revealed major and minor problems with the software code, image uploads, logins, and navigation. After these problems were corrected and the navigation interface optimized, we conducted further tests using example simulation material contributed by senior researchers in the RoR group [47]. These tests revealed additional issues involving special characters, navigation, and display errors, which were also identified and corrected.

Usability

To date, Writesim TCExam has been used informally to train 15 novice researchers from our research group (RoR). Simulated topics include the role and framework of manuscript subsections and content placement in scientific manuscripts. Trainees report that the application was easy to use, helped them understand structure and content, and improved their overall writing skills.

User survey

Admin

Admin testing was performed by 15 junior and senior researchers who matched the 'administrator' profile described earlier and met minimum standards of computer literacy. They were shown a 15-minute tutorial [49], instructed to explore the application, and then completed an online survey (Additional file 1) using DADOS-Survey, a CHERRIES-compliant survey tool [48]. Their experiences are summarized in Table 1.

Table 1. Admin survey results

Users

User testing was performed by 14 novice researchers who matched the 'user' profile. They completed an online survey (Additional file 2) using DADOS-Survey, volunteering biographical information and their assessment of the user interface. Their experiences are summarized in Table 2.

Table 2. User survey results

Results

Early usage

Writesim TCExam currently contains simulation material designed by our research group to help novice researchers understand the role of the various text blocks in a manuscript and the placement of content. The application currently has 100 questions derived from 30 open-access articles published in BioMed Central journals. We have used the application with these 100 questions to train 30 novice researchers since 2007. These researchers came from a variety of backgrounds, including medical students, clinicians, and residents. Based on informal communication, we understand that they benefited from using the simulation environment.

Admin user survey

This survey was completed by 13 of the 15 participants (86.66%). A majority of the participants reported the application to be fast (agree = 53.33%, strongly agree = 26.66%), easy to learn (agree = 60%, strongly agree = 13%), easy to use (agree = 60%, strongly agree = 20%), easy to understand in every aspect of its functionality (agree = 46.66%, strongly agree = 13.33%), and easy to navigate (agree = 80%, strongly agree = 6.66%). Responses to questions regarding computer literacy suggest that most respondents were well versed in basic computer functions (Table 1).

User survey

This survey was completed by 10 of the 14 participants (71.42%). A majority of the participants were female (60%) and had postgraduate degrees (70%). In terms of past publication, most (80%) had never published a peer-reviewed manuscript, while the rest (20%) had published between 1 and 5 peer-reviewed manuscripts (Table 3).

Table 3. Demographic characteristics of user survey respondents

A majority of them reported the application to be fast (agree = 40%, strongly agree = 60%), intuitive to navigate (agree = 90%, strongly agree = 10%), and easy to use (agree = 60%, strongly agree = 40%). In addition, most felt that training with Writesim had improved their understanding of the functions of a manuscript's subsections (agree = 70%, strongly agree = 20%); that it improved their understanding of how content is divided into subsections (agree = 70%, strongly agree = 10%); that the answer keys providing feedback were highly beneficial (agree = 20%, strongly agree = 50%); that the overall experience helped them better understand scientific manuscript writing (agree = 80%, strongly agree = 20%); and that they would look forward to using Writesim TCExam in the future (Yes = 90%, No = 10%). Responses to questions regarding computer literacy suggest that most respondents were well equipped with basic computer skills.

Discussion

Summary

To create Writesim TCExam, we started with TCExam, an existing open-source, Web-based assessment application used by educators to design, schedule, execute, and report on assessment tests, and modified it to function as a textual simulation environment. Our decision not to use GoVenture stemmed from the fact that it is a commercial, closed-source application: its cost limits distribution and use, and because it is not open source we could not have made the modifications outlined earlier on our own. We named our adaptation "Writesim TCExam" to denote its status as a writing simulation environment. After the minor software problems were corrected, the user survey studies showed the application to be intuitive and easy to navigate and use.

Other instructional methods

The research community functions as a collective network, exploring, validating, and disseminating scientific ideas that benefit society. Effective scientific writing is fundamental to the progress of the scientific community and to the careers of individual scientists [1–3]; it is therefore essential that novice researchers develop their writing skills. As an added benefit, an in-depth understanding of the writing process can increase productivity [50].

Over the years, a great many methods for teaching writing skills have been explored, including traditional classroom instruction, seminars, workshops, certificate courses, distance learning, and mentoring. One method, collaborative learning, stresses collective problem-solving [51]. While it has shown some promise in teaching writing skills to researchers [52], its practical application is limited because scientific communication depends, in the end, on individual effort. Simulation environments can complement collaborative learning by helping researchers understand the flow of ideas in scientific manuscripts and the difference between structure and content. Studies have also noted that simulation environments often promote collaborative learning, which prepares students for peer criticism and group work [53].

An increasingly popular method, e-learning, makes use of the Internet [54] and other electronic resources, such as multimedia [55]. However, it often amounts to nothing more than the digitization and dissemination of existing educational materials, and so fails to take full advantage of new technologies while perpetuating inefficient and ineffective lesson plans. One example is the e-learning tool created by Dagmar Malikova, consisting of 11 self-study modules, which, although well designed, is not interactive [56]. Another innovative but limited use of digital technologies involves searching Internet biology forums for comprehensible examples of scientific writing and then using computerized retention strategies to produce "digital learning logs" that track common errors [57].

Other methods, such as group manuscript critiques [58], rewriting published manuscripts [12], manuscript editing [59], and journal clubs with letter-writing exercises [60], can help build writing skills, but they are insufficient on their own and must be combined with other methods. Similarly, traditional practice assignments alone have been shown to be insufficient for improving writing skills [61, 62]. Finally, studies evaluating the effectiveness of these and other approaches have yielded few important findings, and the findings are often contradictory [63].

The great variety of writing instruction programs attests to the diversity of settings and objectives that collectively serve to educate novice researchers. Whatever the training method or context, it is important to remember that writing is a dynamic, individualistic process to which each student brings their own perspectives and concerns [64], and that, where possible, training programs should be tailored to the specific needs of the various specialties [62].

Simulation environments

In comparison with the methods described above, simulation environments provide a realistic setting in which users can explore simplified versions of both realistic and highly hypothetical situations [65].

Researchers evaluating simulation-based approaches to second-language writing instruction, with an explicit focus on genre and genre analysis, cite numerous benefits. Students become increasingly aware of discipline-specific features, they develop competence in discourse, and they become more precise in their use of language [66]. Simulation also helps students overcome motivational and attitudinal problems, especially those related to collaborative learning [53, 67–69]. Other studies have shown that simulation environments increase opportunities for collaborative learning, which improves students' attitudes toward peer criticism and group work [53]. The many strengths of simulation environments speak to their great potential for scientific writing instruction.

We chose TCExam because it follows the computer-aided formative assessment method, which suits a simulation environment well: it encourages a reflective style of learning and enables consistent delivery and immediate feedback. Recent versions also allow the use of images and videos, making the application rich and interactive. Repeatability, flexibility of access, reliability, and a student-centred design are among its many advantages [70], and by improving student learning outcomes it fosters a positive attitude toward learning [71]. These benefits add to those of simulation environments. On the other hand, development time, risks related to the hardware, software, and administrative aspects of the application, and the need for users to be computer literate are among its disadvantages [72], which apply equally to simulation environments. With respect to feedback in assessment applications, immediate explanatory feedback on why an answer is incorrect is more beneficial to users than no feedback and leads to better performance [73]. Although Writesim TCExam is not focused on assessing users, feedback plays an important role in simulation environments: it facilitates understanding and retention of the concepts and various aspects of scientific writing. Additionally, the second-chance mechanism for choosing the correct answer is intended to encourage brainstorming and enhance the learning process.

The blog and forum are primarily aimed at improving and enhancing user-user and user-administrator communication. Because Writesim TCExam is an online application, users may not be located in the same place, which restricts group and collaborative discussion. The blog and forum address this issue by serving as a platform for voicing questions, finding solutions, and exchanging individual experiences.

We believe Writesim TCExam will be more accessible to the research community, owing to its open-source nature, than commercial simulation environments such as GoVenture [30]. Individuals involved in teaching and training novice researchers, such as mentors, course instructors, and program coordinators, can download Writesim TCExam [74] and install it at their institutions. They can modify the application if required and develop simulation material according to their needs. They can then administer the textual simulation environment by providing a link to the application along with instructions on how to use it, and end users follow the link to take the simulation test.

Limitations

Currently, Writesim TCExam follows a test-and-feedback, i.e. deterministic, mechanism that does not support real-time analysis of and feedback on the non-deterministic aspects of scientific writing. Structure and the semantic interconnections that help the reader map and understand the context of the content constitute the deterministic aspects of writing; structure persists across articles, whereas content changes with the topic and therefore constitutes the non-deterministic aspect. Gopen [46] argues that the deterministic aspects (structure) of written communication provide clues that enable readers to make important interpretive decisions about the content. Our application therefore focuses on mimicking the intricacies of the deterministic aspects of scientific writing. Additionally, its effectiveness depends heavily on the quality of the simulation material.

Current utilization

Writesim TCExam is currently used by the RoR group to train novice researchers and medical students in scientific writing. It will also be used in the Certificate Course in Outcomes Research [75], an eight-month course of study that will soon be implemented at the Duke-NUS Graduate Medical School in Singapore. The course trains healthcare professionals in every step of research publication, from generating a dataset to submitting to a high-quality journal; within it, Writesim TCExam will serve as a pre-class exercise to train participants in manuscript writing.

Potential uses

WriteSim TCExam is an inexpensive instructional tool that has the potential to significantly improve researchers' confidence and writing skills and to reduce the time required to produce high-quality manuscripts.

Conclusion

Writesim TCExam is a first-of-its-kind, Web-based, open-source textual simulation environment designed to complement traditional scientific writing instruction. While initial reviews have been positive, a formal comparative study is needed to measure the benefit to writing quality and related outcomes when compared with standard instructional methods alone.

Availability and requirements

Project name: Writesim TCExam

Project home page: http://www.ceso.duke.edu/tcexam

Operating systems: Linux and Windows

Programming language: PHP

Other requirements: Apache server, MySQL or PostgreSQL, XHTML, CSS, Sendmail, PHPMailer, TCPDF library, Barcode Render Class for PHP using the GD graphics library, and LaTeX Rendering Class v0.8.

License: GNU General Public License v.2. This license ensures that the source code can be freely distributed, modified, and sold, as long as the source code is provided with every copy of the application. The source code is available at no charge.

Restrictions to use by academics/non-academics: New users must email the Research on Research group for a user name and password.

Source code: http://www.ceso.duke.edu/tcexam/tcexam.tar.gz

Software links

1. Admin link: http://www.ceso.duke.edu/tcexam/admin/code/index.php

• user name: reviewer

• password: reviewer

• Description: Using the above link, login, and password, you obtain administrator rights. You can create users and groups, assign passwords, create tests, assign tests to specific groups, add topics, add questions, see participants' results, and perform many other admin functions.

2. User link: http://www.ceso.duke.edu/tcexam/public/code/index.php

• user name: reviewer

• password: reviewer

• Description: Users/participants can take the test assigned to them by using this link, login, and password. For example, a test called "Introduction Section" has already been created to give an idea of how the software functions.

Abbreviations

XHTML:

Extensible HyperText Markup Language

CSS:

Cascading Style Sheets

LaTeX:

Lamport TeX

PDF:

Portable Document Format

PHP:

Hypertext Preprocessor.

References

  1. Peat J, Elliott E, Baur L, Keena V: Scientific Writing: Easy When You Know How. 2002, London: BMJ Books.

  2. Hekelman FP, Gilchrist V, Zyzanski SJ, Glover P, Olness K: An educational intervention to increase faculty publication productivity. Fam Med. 1995, 27 (4): 255-9.

  3. Jones RF, Gold JS: Faculty appointment and tenure policies in medical schools: a 1997 status report. Acad Med. 1998, 73 (2): 212-9. 10.1097/00001888-199802000-00023.

  4. Yanoff KL, Burg FD: Types of medical writing and teaching of writing in U.S. medical schools. J Med Educ. 1988, 63 (1): 30-7.

  5. Nte AR, Awi DD: Writing a scientific paper: getting to the basics. Niger J Med. 2007, 16 (3): 212-8.

  6. Zamel V: Recent Research on Writing Pedagogy. TESOL Quarterly. 1987, 21 (4): 697-715. 10.2307/3586990.

  7. Regan M, Pietrobon R: A Conceptual Framework for Scientific Writing in Nursing. The Journal of Nursing Education. 2010, 1-7.

  8. Romberger J: Teaching Scientific Writing Conventions: Learning to Write is an Integral Part of Writing to Learn in the Sciences. Online Writing Lab. 2000, [http://owl.english.purdue.edu/owl/resource/671/05/]

  9. Day RA: How To Write & Publish a Scientific Paper. 1998, Phoenix, AZ.: Oryx Press, 280.

  10. Sbaih L: Helping students convert assignments into articles: tips for teachers. Accid Emerg Nurs. 1999, 7 (2): 112-3. 10.1016/S0965-2302(99)80032-0.

  11. McConnell CR: From idea to print: writing for a professional journal. Health Care Superv. 1999, 17 (3): 72-85.

  12. Tomaska L: Teaching How to Prepare a Manuscript by Means of Rewriting Published Scientific Papers. Genetics. 2007, 175 (1): 17-20. 10.1534/genetics.106.066217.

  13. de Jong T, Swaak J, Scott DM, Brough J: The use of simulations for training engineers in the process industry. Paper presented at the Conference of the European Association for Research on Learning and Instruction. 1995, Nijmegen, The Netherlands

  14. Shute VJ, Alley B, Blais J, Bonar J, Christal R, Katterman K, Kyllonen P, Lesgold A, Raghavan K, Regian W, Resnick P, Woltz D, Schultz J, Peterson A, Glaser R: A Large-Scale Evaluation of an Intelligent Discovery World: Smithtown. Interactive Learning Environments. 1990, 1 (1): 51-77. 10.1080/1049482900010104.

  15. White BY, Frederiksen JF: Causal models as intelligent learning environments for science and engineering education. Applied Artificial Intelligence. 1989, 3 (2 & 3): 167-190.

  16. Goodman W: The world of civil simulators. Flight International Magazine. 1978, 18: 435.

  17. Garrison P: Flying Without Wings. 1985, Blue Ridge Summit, Pa: TAB Books Inc, 1-31: 102-106.

  18. Rolfe JM, Staples KJ: Flight Simulation. 1986, Cambridge, England: Cambridge University Press, 232-249.

  19. Ressler EK, Armstrong JE, Forsythe GB: Military mission rehearsal. Innovative Simulations for Assessing Professional Competence. Edited by: Tekian A, McGuire C, McGaghie WC. 1999, Chicago, Ill: Dept of Medical Education, University of Illinois Medical Center, 157-174.

  20. Streufert S, Pogash R, Piasecki M: Simulation-based assessment of managerial competence: reliability and validity. Person Psych. 1988, 41: 537-557. 10.1111/j.1744-6570.1988.tb00643.x.

  21. Keys B, Wolfe J: The role of management games and simulations in education and research. J Manage. 1990, 16: 307-336. 10.1177/014920639001600205.

  22. Cooke JM, Larsen J, Hamstra SJ, Andreatta PB: Simulation Enhances Resident Confidence in Critical Care and Procedural Skills. Fam Med. 2008, 40 (3): 165-7.

  23. Binstadt ES, Walls RM, White BA, Nadel ES, Takayesu JK, Barker TD, Nelson SJ, Pozner CN: A Comprehensive Medical Simulation Education Curriculum for Emergency Medicine Residents. Annals of Emergency Medicine. 2007, 49 (4): 495-504.e11. 10.1016/j.annemergmed.2006.08.023.

  24. Corbridge SJ, McLaughlin R, Tiffen J, Wade L, Templin R, Corbridge TC: Using Simulation to Enhance Knowledge and Confidence. The Nurse Practitioner. 2008, 33 (6): 12-13.

  25. Theall M, Franklin J: What Have We Learned? A Synthesis and Some Guidelines for Effective Motivation in Higher Education. New Directions for Teaching and Learning. 1999, 78: 97-109. 10.1002/tl.7810.

  26. Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ: Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach. 2005, 27: 10-28. 10.1080/01421590500046924.

  27. Magee M: State of Field Review: Simulation in Education. Alberta Online Learning Consortium. Edited by: Calgary AB. 2006, [http://www.ccl-cca.ca/NR/rdonlyres/C8CB4C08-F7D3-4915-BDAA-C41250A43516/0/SFRSimulationinEducationJul06REV.pdf]

  28. Shah J, Shah A, Pietrobon R: Scientific writing of novice researchers: what difficulties and encouragements do they encounter?. Acad Med. 2009, 84 (4): 511-6. 10.1097/ACM.0b013e31819a8c3c.

  29. Phadtare A, Bahmani A, Shah A, Pietrobon R: Scientific writing: a randomized controlled trial comparing standard and on-line instruction. BMC Med Educ. 2009, 9: 27-10.1186/1472-6920-9-27.

  30. GoVenture Simulation Designer. [http://www.goventure.net/home.cfm?ID=2&go=site/products/sd]

  31. TCExam website. [http://www.tcexam.com]

  32. TC Exam general description. [http://www.tecnick.com/public/code/cp_dpage.php?aiocp_dp=tcexam_description]

  33. BioMed central website. [http://www.biomedcentral.com]

  34. Pubmed. [http://www.ncbi.nlm.nih.gov/sites/entrez?db=PubMed]

  35. Writesim TCExam. [http://www.ceso.duke.edu/tcexam/public/code/index.php]

  36. Bloom BS: Learning for mastery. Evaluation Comment. 1968, 1 (2): 112.

  37. Bloom BS, Hastings JT, Madaus G: Handbook on formative and summative evaluation of student learning. 1971, New York: McGraw-Hill

  38. Morton FR: The Teaching Machine and the Teaching of Languages: A Report on Tomorrow. PMLA, LXXV. 1960, 4 (2): 1-6. 10.2307/2699300.

  39. Skinner BF: Why We Need Teaching Machines. Harvard Education Review. 1961, 31 (4): 377-398.

  40. Rothwell KS: Programmed Learning: A Back Door to Empiricism in English Studies. College English. 1962, 23 (4): 245-250. 10.2307/373062.

  41. Bloom LZ, Martin B: The Teaching and Learning of Argumentative Writing. College English. 1967, 29: 128-135. 10.2307/374051.

  42. Porter D: The Behavioral Repertoire of Writing. College Composition and Communication. 1962, 13 (3): 14-17. 10.2307/354755.

  43. Manuscript writing templates. [http://researchonresearch.duhs.duke.edu/site/?page_id=969]

  44. Wordpress website. [http://wordpress.org/]

  45. PHP website. [http://www.phpbb.com/]

  46. Gopen GD, Swan JA: The Science of Scientific Writing. American Scientist. 1990, 78: 550-558.

  47. Research on Research group website. [http://researchonresearch.duhs.duke.edu/site/]

  48. Shah A, Jacobs D, Martins H, Harker M, Menezes A, McCready M, Pietrobon R: DADOS-Survey: an open-source application for CHERRIES-compliant Web surveys. BMC Medical Informatics and Decision Making. 2006, 6 (1): 34-10.1186/1472-6947-6-34.

  49. Writesim TCExam screencast. [http://researchonresearch.blogspot.com/2008/08/tc-exam-textual-simulation-environment.html]

  50. Neuhauser D, McEachern E, Zyzanski S, Focke S, Williams RI: Continuous quality improvement and the process of writing for academic publication. Qual Manage Health Care. 2000, 8: 65-73.

  51. Dillenbourg P: What do you mean by collaborative leraning?. Collaborative-learning: Cognitive and Computational Approaches. Edited by: Dillenbourg P. 1999, Oxford: Elsevier, 1-19.

  52. Huerta D, McMillan V: Collaborative Instruction by Writing and Library Faculty: A Two-Tiered Approach to the Teaching of Scientific Writing. Issues in Science & Technology Librarianship. 2000, 28: 1.

  53. Halleck GB, Moder CL, Damron R: Integrating a Conference Simulation into an ESL Class. Simulation Gaming. 2002, 33 (3): 330-344. 10.1177/104687810203300307.

  54. Gunasekaran A, McNeil CR, Shaul D: E-learning: research and applications. Industrial and Commercial Training. 2002, 34 (2): 44-53. 10.1108/00197850210417528.

  55. Drake A, Haka S, Ravenscroft S: An ABC Simulation Focusing on Incentives and Innovation. Issues in Accounting Education. 2001, 16 (3): 443-471. 10.2308/iace.2001.16.3.443.

  56. Malíková D: An Instruction Tool for Effective Technical/Scientific Writing in English. [http://www.kj.fme.vutbr.cz/leon2/pdf/iCEER2004_Malikova.pdf]

  57. Bowers R: A Computer-Mediated Scientific Writing Program. 1995, [http://tesl-ej.org/ej03/a3.html]

  58. Miller BK, Muhlenkamp A: Teaching students how to publish in nursing journals: a group approach. The Journal of Nursing Education. 1989, 28 (8): 379-81.

  59. Misak A, Marusić M, Marusić A: Manuscript editing as a way of teaching academic writing: experience from a small scientific journal. J BUON. 2006, 11 (2): 153-9.

  60. Edwards R, White M, Gray J, Fischbacher C: Use of a journal club and letter-writing exercise to teach critical appraisal to medical undergraduates. Med Educ. 2001, 35 (7): 691-4. 10.1046/j.1365-2923.2001.00972.x.

  61. Amirault M, Doherty A, LeBlanc A, Vickery A: The writing skills working group. Can Nurse. 2005, 101 (2): 8-9.

  62. Rawson RE, Quinlan KM, Cooper BJ, Fewtrell C, Matlow JR: Writing-skills development in the health professions. Teach Learn Med. 2005, 17 (3): 233-8. 10.1207/s15328015tlm1703_6.

  63. Zamel V: Teaching Composition in the ESL Classroom: What We Can Learn from Research in the Teaching of English. TESOL Quarterly. 1976, 10: 67-76. 10.2307/3585940.

  64. Bazerman C: The diversity of writing. The Quartely. 2002, 24 (2): 13-20.

  65. Ip A, Naidu S: Experienced-Based Pedagogical Designs for eLearning in Education Technology. Magazine for Managers of Change in Education. 2001, Educational Technology Publications, 41 (5): 53-58.

  66. Cheng A: Symposium article: Simulation-based L2 writing instruction: Enhancement through genre analysis. Simulation Gaming. 2007, 38 (1): 67-82. 10.1177/1046878106297879.

  67. Ince A: Motivating Students beyond Teacher Expectations. Simulation Gaming. 2002, 33 (4): 481-485. 10.1177/1046878102238611.

  68. Salies TG: Simulation/Gaming in the EAP Writing Class: Benefits and Drawbacks. Simulation Gaming. 2002, 33 (3): 316-329. 10.1177/104687810203300306.

  69. Tjie CB: Conflict and Roles in Simulations. Simulation Gaming. 2002, 33 (4): 486-489. 10.1177/1046878102238612.

  70. Charman D: Issues and impacts of using computer-based assessments (CBAs) for formative assessment. Computer-assisted Assessment of Students. Edited by: Brown S, Race P, Bull J. 1999, London: Kogan Page

  71. Peat M, Franklin SE: Supporting student learning: the use of computer based formative assessment modules. British Journal of Educational Technology. 2002, 33 (5): 515-523. 10.1111/1467-8535.00288.

  72. Jenkins M: Unfulfilled promise: formative assessment using computer-aided assessment. Learning and Teaching in Higher Education -- LATHE. 2004, 1: 67-80.

  73. Grant L, McAvoy R, Keenan JB: Prompting and feedback variables in concept programming. Teaching of Psychology. 1982, 9: 173-177. 10.1207/s15328023top0903_11.

  74. Writesim TCExam source code download link. [http://www.ceso.duke.edu/tcexam/tcexam.tar.gz]

  75. Pietrobon R, Rajgor D, Shah J, Ostbye T, Eisenstein EL, Tay SK, Krishnan R: Curriculum and strategy for a global problem-based training program in outcomes research. Medical Teacher. 2008.

  76. Pietrobon R, Guller U, Martins H, Menezes A, Higgins L, Jacobs D: A suite of web applications to streamline the interdisciplinary collaboration in secondary data analyses. BMC Medical Research Methodology. 2004, 4: 29-10.1186/1471-2288-4-29.

Acknowledgements

We acknowledge the use of the material and templates developed by the "Research on Research" team for manuscript writing [28] and literature search [76] while compiling this manuscript.

Author information

Corresponding author

Correspondence to Ricardo Pietrobon.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors' contributions

JS contributed to the development and testing of the application and wrote the manuscript. DR contributed to the testing of the application and reviewed the manuscript. MV contributed to the development and testing of the application and reviewed the manuscript. AP contributed to the development and testing of the application and reviewed the manuscript. SP contributed to the testing of the application and reviewed the manuscript. EC contributed to the testing of the application and reviewed the manuscript. RP contributed to the development and testing of the application and reviewed the manuscript. All authors read and approved the final manuscript.

Jatin Shah, Dimple Rajgor, Meenakshi Vaghasia, Amruta Phadtare, Shreyasee Pradhan, Elias Carvalho and Ricardo Pietrobon contributed equally to this work.

Rights and permissions

Open Access This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

About this article

Cite this article

Shah, J., Rajgor, D., Vaghasia, M. et al. WriteSim TCExam - An open source text simulation environment for training novice researchers in scientific writing. BMC Med Educ 10, 39 (2010). https://doi.org/10.1186/1472-6920-10-39
