
Grammar, Writing, and Technology: A Sample Technology-supported Approach to Teaching Grammar and Improving Writing for ESL Learners


VOLKER HEGELHEIMER
DAVID FISHER
Iowa State University
Abstract:
English language learners are frequently unable to benefit from the prevailing process-writing approaches due to a lack of grammar and vocabulary knowledge relevant to academic writing. This paper describes how the need for explicit grammar instruction as part of preparing students to write can be addressed by using a collection of learner texts and transforming that collection into an online grammar resource for intermediate nonnative speakers (NNS) of English. Drawing on research in grammar and writing, the use of learner texts, and online interactivity, we outline the development and the prototype of the Internet Writing Resource for the Innovative Teaching of English (iWRITE). We discuss how the judicious use of advanced technology (e.g., XML) facilitated the implementation of iWRITE, an example of one possible approach to embodying aspects of second language acquisition (SLA) theory while taking advantage of the Web's potential for interactivity.
INTRODUCTION
Despite participating in courses specifically aimed at improving the writing proficiency of English as a second language (ESL) learners, nonnative speakers (NNS) are frequently not prepared to produce acceptable academic writing (Hinkel, 2004). Hinkel (2002) points out that, among other problems, the relative absence of direct and focused grammar instruction, the lack of academic vocabulary development, and the exclusive use of a process-writing approach contribute to this problem. Even high-intermediate and advanced NNS do not have the grammatical and lexical wherewithal to benefit from process-writing teaching approaches. Thus, researchers (Hinkel, 2002, and others) recommend specifically including grammar and vocabulary relevant to academic writing in the curriculum of writing classes for NNS. The availability of advanced technology coupled with recent research dealing with learner texts allows for the creation of systems specifically designed to address learner needs (Kuo, Wible, Chen, Sung, Tsao, & Chio, 2002; Wible, Kuo, Chien, Liu, & Tsao, 2001). An ideal platform for implementing these recommendations in functional systems is the World Wide Web (WWW). In this paper, we draw on research in the area of grammar in writing approaches and suggest that technology can be instrumental in creating an innovative online grammar resource aimed at raising learner awareness of troublesome grammatical features. In particular, we show how, by harnessing the capabilities of technology and implementing the principles of computer-assisted language learning, learner texts can be transformed and integrated into an effective online resource.
In doing so, we proceed as follows: First, we reiterate and highlight the need for including grammar instruction as part of ESL writing courses, review work that has been done to date using learner corpora to assist with such instruction, suggest features to be included in a Web-based resource based on information derived from an interactionist view of second language acquisition (SLA), and review existing writing systems. Second, we outline four stages used in the development of the Internet Writing Resource for the Innovative Teaching of English (iWRITE), describe the system's components, and give examples of its pedagogical uses. In the last part, we propose empirical research to evaluate the usefulness of this Web application.
WRITING AND GRAMMAR
Hinkel (2004) points out the mismatch between what is taught and what can be accomplished by intermediate- and advanced-level ESL writers. Often, she argues, “intensive, individualized help with sentence-level syntax […]” is needed despite the explicit grammar instruction learners have received. Since learners frequently do not have the competence they need, they are required to enroll in ESL writing courses. However, even these courses fail to adequately prepare NNS for the academic writing expected of them. One important concern is that since the 1980s writing classes have shifted away from a product approach to embrace a process approach to writing (Hairston, 1982). While important for the personal development of the learners, “the new instructional methodology centered squarely and almost exclusively on the writing process that fundamentally overlooked the fact that NNS writers may simply lack the necessary language skills (e.g., vocabulary and grammar) to take advantage of the benefits of writing process instruction” (Hinkel, 2004, p. 9).
In addition to these concerns, it is the product, not the process, that is evaluated in academic testing situations in which students are asked to produce written texts, such as for assignments in most (if not all) higher-education classes—except writing classes. Strikingly, even in most placement test situations in English, only the product (i.e., the essay) is evaluated, while the teaching approach remains process oriented.
A distinct, yet related aspect of process-writing approaches is that they integrate peer editing. Research (e.g., Hyland, 2002; Hinkel, 2004) supports classroom experience that peer editing, while often perceived as helpful, may not lead students to improved error awareness and error recognition. Helping learners focus on errors typically committed by learners from a particular L1 can raise the awareness of such problem areas and facilitate the detection (and prevention) of certain error types. In fact, learners often want to focus on form and wish for a pedagogical tool to serve as a reference and an easy-to-use resource. Nevertheless, the exclusive use of model texts that are not accessible to students is viewed skeptically by students and may lead to unrealistic expectations.
What is needed is direct instruction coupled with explicitly pointing out mistakes in essays written by language learners. Hinkel (2004) calls for innovative ways of teaching rather than more of the same. Recent development in the area of corpus linguistics in general and in working with learner corpora in particular, as well as advances in technology, may be ideally suited to play a key role in reinventing (or at least supplementing) grammar teaching as part of a writing course. Each is discussed in turn below.
LEARNER CORPORA
Since being called a revolution in applied linguistics in the early 1990s (Granger, 1994), learner corpora have become a major source for learning about various errors, including L1 interference errors, particularly in ESL writing. One major project, the International Corpus of Learner English (ICLE), consisting of argumentative writings by ESL learners from different countries, provides learners with access not only to an error corpus but also to a comparison corpus consisting of essays written by native speakers (NS) of English (Virtanen, 1996). In order to transform these learner corpora into useful learning and teaching tools, we must draw on current research in CALL and online interactivity. The next section situates the interactionist theory of SLA within the more general discussions of online writing and pedagogical interactivity. In doing so, we provide a heuristic for the development and assessment of online tools.
CALL, WRITING SYSTEMS, AND WEB INTERACTIVITY
Phinney (1996) realized the importance of technology in writing and recognized the following paradigm shift: “As part of the changing culture of composition instruction, there is a new emphasis on de-centering authority, coupled with a recognition of the importance of collaborative learning, and a realization of the need for new models of writing and rhetoric” (p. 140). A gradual shift from word processing to collaborative writing in the late 1980s to mid-1990s necessitated the development of tools to accommodate this shift in pedagogy. However, writing systems were often developed by writing teachers in response to a lack of appropriate writing tools (Phinney, 1996). This led to the creation of more collaboratively oriented writing environments such as the Daedalus Integrated Writing System and Prep Editor. The focus of these tools was in line with the predominant process approach to writing and, therefore, teachers or peers used these tools mostly to make organizational and rhetorical comments.
One theoretical framework that can serve as a basis for the development and assessment of an online resource that integrates grammar, writing, and the use of learner corpora is the interactionist theory of SLA. Focusing mainly on the role input and interaction play in instructed (or classroom-based) settings (Pica, 1994; Long, 1996; Gass, 1997), the hypotheses in the interactionist theory are pertinent to the design of CALL activities and resources. Acquisition occurs only when linguistic input becomes intake, that is, when it is comprehended syntactically and semantically by the learner. Noticing linguistic input is viewed as a prerequisite for acquisition (Schmidt, 1990), and noticing is more likely to occur during interaction. Hence, software features that enhance noticing in general and that help the learner to focus on form (FoF) (Long, 1991) are viewed as beneficial.
Chapelle (1998) proposed seven criteria for the development of multimedia CALL based on hypotheses that derive from interactionist-based research:
1. make linguistic characteristics salient,
2. help learners comprehend semantic and syntactic aspects of input,
3. enable learners to produce output,
4. enable learners to notice errors in their output,
5. enable learners to correct their linguistic output,
6. make target language interactions modifiable for the negotiation of meaning, and
7. engage learners in L2 tasks designed to maximize opportunities for good interaction.
Chou (2003) sought to assist those developing what Maddux called Type II uses of technology—or what we can conceive of as interactionist learning systems—by providing a list of interactivity dimensions culled from the past 15 years of research on instructional design. These dimensions help us envision how Chapelle's interactionist criteria can be concretely embodied in a Web-based system while also providing a rubric of sorts for assessing such a system's level of interactivity (see Table 1). Guided by these considerations, we describe in the next part the development, implementation, and anticipated use of iWRITE.
RESOURCE DEVELOPMENT
In addition to the opportunity presented by the collection of genuine learner data in the form of placement essays, the advantages of learner corpora, and principles derived from SLA theory, the development of an appropriate Web-based resource must also take into account the Web environment itself in order to arrive at an application that truly transforms a learner corpus.
Project Development
Figure 1 provides an overview of the iWRITE system, which includes the learner corpus and the documents and activities that support student and instructor interaction with it. For clarity, we have divided the process into four stages, which correspond to the type of work undertaken on (or the instructional value we are adding to) the corpus. In each stage, the corpus remains at the center of the process, and the materials and activities that surround it serve to make the corpus useful to students and instructors by enabling the interactivity that characterizes the iWRITE interface.
Stage 1: Corpus and Database Design and Assembly
All essays selected for inclusion in the corpus were handwritten as part of an English placement test at Iowa State University on one of four different topics requiring expository writing. The essays were rated by two independent readers, both of whom agreed on the specific placement of the students. Perfect interrater reliability was the primary criterion for selection. Once typed, the total collection of learner texts amounted to 45 essays, or 12,839 words. In total, 1,268 errors were identified and marked. The following information was also captured and/or prepared for entry into the relational database:
1. nationality, TWE score, and TOEFL scores of the writers of the essays;
2. essay topic;
3. contexts, solutions, and corrected contexts (all described below) for marked errors; and
4. pointers to Flash movies, Word documents (marked during “filming” of Flash movies), and reference (“Additional Information”) files.
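The paper does not publish iWRITE's actual schema, but the fields listed above suggest a relational layout along these lines. The following sketch, with entirely hypothetical table and column names, shows how essay metadata and marked errors might be stored so that each error can later be keyed from the corpus:

```python
import sqlite3

# A minimal sketch of the kind of relational tables Stage 1 describes;
# all table and column names here are hypothetical, not taken from iWRITE.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE essay (
        essay_id     INTEGER PRIMARY KEY,
        nationality  TEXT,    -- writer's country of origin
        twe_score    REAL,    -- Test of Written English score
        toefl_score  INTEGER,
        topic        TEXT     -- one of the four expository prompts
    )""")
conn.execute("""
    CREATE TABLE error (
        error_id          TEXT PRIMARY KEY,  -- unique id, reusable in XML tags
        essay_id          INTEGER REFERENCES essay(essay_id),
        error_code        TEXT,  -- category from the coding scheme
        context           TEXT,  -- sentence containing the error
        solution          TEXT,  -- one possible correction
        corrected_context TEXT,  -- whole sentence with all errors fixed
        flash_movie       TEXT,  -- pointer to a Flash annotation movie
        reference_page    TEXT   -- pointer to an "Additional Information" file
    )""")
conn.execute("INSERT INTO essay VALUES (1, 'Korea', 4.0, 550, 'expository')")
conn.execute("""INSERT INTO error VALUES
    ('e1', 1, 'DET', 'He bought car yesterday.',
     'a car', 'He bought a car yesterday.',
     'movies/essay1.swf', 'refs/determiners.html')""")
row = conn.execute("SELECT solution FROM error WHERE error_id = 'e1'").fetchone()
print(row[0])
```

Keying each error by a unique id is what later allows the marked-up essays to link back to solutions stored here.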
Stage 2: Learner Text Mark Up and Solution Production
At first, five essays were analyzed in detail, and the initial error categories were modified according to the actual errors found in the essays. Subsequently, the remaining 40 essays were marked using the coding scheme outlined in the Appendix, resulting in marked-up essays like the one illustrated in Figure 2. The error codes were derived from error codes currently in use at the university and modified to fit the errors exhibited by the learners in this subsample. In addition to grammatical errors, lexical errors, which Santos (1988) found to be considered the most serious errors by professors who evaluated nonnative writers, were also included. The importance of focusing on both grammatical and lexical errors is also supported by findings reflected in other studies (Vann et al., 1984; Vann et al., 1991), in which lexical and semantic errors were found to be most problematic, particularly when committed by NNS. In subsequent versions of iWRITE, a display of errors based on error gravity will be considered, but the current incarnation does not assign weights to errors.
Database Build and Load
In the next step, each error was entered into a spreadsheet along with identifying information and one possible solution (see Table 2). However, sentences often contained multiple errors. Therefore, an error-free solution of the entire sentence (or context) was also entered into the spreadsheet. The marking and entering were done by two different members of the research team in order to minimize errors and to double-check the marking of the errors. After the marking was complete, the spreadsheet was loaded into a table in the relational database.
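The one-row-per-error layout described above can be illustrated with a small, invented example. The field names below are our assumption about the shape Table 2 suggests; note that a sentence with two errors yields two rows that share the same fully corrected context:

```python
import csv
import io

# Hypothetical spreadsheet rows: each marked error gets its own row with
# its own solution, but both rows carry the same error-free version of
# the whole sentence (the corrected context).
sheet = io.StringIO(
    "error_id,error_code,context,solution,corrected_context\n"
    'e7,SV,"She go to library every day.","goes",'
    '"She goes to the library every day."\n'
    'e8,DET,"She go to library every day.","to the library",'
    '"She goes to the library every day."\n'
)
rows = list(csv.DictReader(sheet))

# Both error rows point at the same corrected sentence.
assert rows[0]["corrected_context"] == rows[1]["corrected_context"]
print(len(rows), rows[0]["solution"])
```

Storing the corrected context redundantly with each error keeps every row self-contained when it is queried on its own.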
XML Mark Up: Creating Smart Documents
After the errors were uploaded into the database, the essays were marked up with tags developed using XML. A set of tags (technically known as elements within a document type definition) that represented each of the error categories (paragraph, sentence, word, determiner, and miscellaneous) was created. By identifying each error uniquely within the error-category tags, and therefore within the text of the corpus itself (i.e., by establishing the linkage between the corpus and the database), we were able to design iWRITE to
1. draw on the relational database table that contains one possible solution for the identified error as well as a corrected context, in which all of the errors in the text surrounding the marked error are corrected (these had been entered into spreadsheets and uploaded into the database as described above), and thus enable students to get solution information by clicking on a link in the essay; and
2. make available the “additional help” reference pages for each type of error from a variety of contexts.
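To make the linkage concrete, the following sketch shows what a marked-up sentence might look like and how the unique ids inside the error-category elements can be pulled out programmatically. The tag and attribute names are illustrative assumptions, not iWRITE's actual document type definition:

```python
import xml.etree.ElementTree as ET

# A hypothetical marked-up essay fragment: each error is wrapped in an
# element named after its category, carrying a unique id that keys into
# the error table of the relational database.
essay_xml = """
<essay id="1">
  <sentence>She <word id="e7" code="SV">go</word> to
  <determiner id="e8" code="DET">library</determiner> every day.</sentence>
</essay>
"""
root = ET.fromstring(essay_xml)

# Collect (id, category, flagged text) for every coded error in the essay.
errors = [(el.get("id"), el.tag, el.text)
          for el in root.iter() if el.get("code") is not None]
print(errors)
```

Because each id is unique within the corpus, a click on the corresponding link in the rendered essay can retrieve exactly one solution and corrected context from the database.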
Video Recording
The research team also annotated Word versions of placement essays using the “Track Changes” feature. This activity, along with oral comments made by an annotator, was recorded using Camtasia, a program that allows users to capture and replay motion that takes place on a computer monitor. These audio/video files were then transformed into Flash movies to permit speedier delivery over the Web. The annotator did not have access to the marked-up version of the text. Rather, 5 minutes were allotted to allow the annotator to glance at the essay before making suggestions and corrections, which were often more qualitatively oriented and included praise and constructive suggestions rather than only syntactic and lexical corrections, mimicking an interaction between a student and an instructor while reviewing an essay.
Reference Page Creation
After the major error types were identified, the team created a number of reference, or “Additional Information” pages. These pages contain detailed explanations of the error, examples of how to fix the error, and links to websites where students could go for more information.
Stage 3: Corpus Transformation
An important part of creating layered interactivity lies in providing students with the ability to query the essays in various ways. In essence, the XML tags encode some of the expertise that has traditionally resided in instructors and make it accessible to students.
XSL: Displaying Documents Smartly
XSL (eXtensible Stylesheet Language) transformations involve a marked-up document (like the learner corpus), a transformation stylesheet, and software that creates a new document out of the two. The stylesheets in iWRITE contain a set of instructions about how to display each element (i.e., error type) for which a tag has been defined. The transformation software creates a new document that renders the data associated with each tag in the way that the stylesheet instructs. In other words, the transformations that occur in iWRITE produce HTML documents that appear in the students' browsers with certain error types highlighted and linked to solutions.
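iWRITE performs this step with XSL stylesheets; since Python's standard library has no XSLT processor, the sketch below reproduces the effect of such a transformation by hand, turning each tagged error into an HTML link to its solution. The element names and URL scheme are illustrative assumptions:

```python
import xml.etree.ElementTree as ET

# Mimics what an XSL stylesheet would do: walk the marked-up document and
# emit HTML in which each selected error type becomes a highlighted link.
def to_html(xml_text, show={"word", "determiner"}):
    root = ET.fromstring(xml_text)
    parts = []

    def walk(el):
        parts.append(el.text or "")
        for child in el:
            if child.tag in show and child.get("id"):
                # Render the error as a link keyed by its unique id.
                parts.append('<a class="%s" href="solution?id=%s">%s</a>'
                             % (child.tag, child.get("id"), child.text or ""))
            else:
                walk(child)
            parts.append(child.tail or "")

    walk(root)
    return "".join(parts)

xml_text = ('<sentence>She <word id="e7">go</word> to '
            '<determiner id="e8">library</determiner> every day.</sentence>')
html = to_html(xml_text)
print(html)
```

Varying the `show` set corresponds to applying different stylesheets, so the same corpus document can be rendered with only word-level, sentence-level, or paragraph-level errors highlighted.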
Stage 4: Corpus Presentation: iWRITE, a Smart Corpus-based Prototype
The homepage of the iWRITE application gives learners access to five main components: Solutions, Essays, Practice, Marking, and Corpus, and a logout option.
Classroom Application
iWRITE has immediate pedagogical applications in that it can be used to raise learners' grammatical awareness, encourage learner autonomy, and help learners prepare for editing or peer editing. In this section, sample classroom applications of each of the four major sections of iWRITE are outlined.
First, iWRITE's Solutions section can be used to help learners understand the terminology (or metalanguage) necessary to begin to ask specific questions about grammar, which is one important aspect of becoming an autonomous learner. The Solutions section presents the error terms and examples using appropriate grammatical terminology. The Essays section allows learners to dissect essays in layers since they can look at different categories of errors at the word, sentence, or paragraph level. This section is ideally suited to classroom settings because it does not confront learners with an overwhelming number of errors at the same time. In addition, the essays are accessible by the writer's country of origin. Therefore, this section can be used to prepare for upcoming peer-editing sessions in that readers can review essays written by a writer from the same country as the one they will read during the peer-editing session. The Practice section can be used to generate worksheets as Word documents, which can be used in a small-group activity in which each group member is responsible for finding (and correcting) specific mistakes at the word, sentence, or paragraph level. Upon completion, the individual members can collectively correct the essay and compare the errors they detected with the ones accessible through iWRITE. The last major section, Marking, is aimed at encouraging learners to interact cognitively with the audio/video annotations of an essay. It can be used for peer-editing or error-detection exercises in which unmarked essays can be downloaded, marked up, and corrected by learners, who can then verify their choices using iWRITE.
These are just a few potential uses of applications like iWRITE. Future development of this application will need to include more learner texts so that multiple essays from learners of specific L1s can be made available.
CONCLUSIONS
Building collections of online resources that focus on the needs of users is not a simple process (Calverley & Shephard, 2003). We envision our effort, then, as an attempt to create a prototype of what Maddux (2002) called a Type II system in which pedagogical value is added to a learner corpus by providing a number of different kinds of interactivity. As we took up the challenge of creating a Type II system, we decided to use a browser interface and Web pages, rather than a more proprietary model that might have been housed on a few computers in our language-learning lab. We made this choice for two main reasons. First, Hillman, Willis, and Gunawardena (1994) noted that the “extent to which a learner is proficient with a specific medium correlates positively with the success the learner has in extracting the desired information” (p. 32). Many of the students who will be using iWRITE have a good deal of experience searching the Web and working with browsers and thus should be comfortable working with a system that uses familiar Web conventions (e.g., links and back buttons). Second, we hope eventually to make this resource available to a number of teachers and learners around the world at no or minimal cost, so the Web seemed the ideal medium. If readers are interested in using the system, they should contact Volker Hegelheimer at volkerh@iastate.edu.
Next, we worked to decide which kinds of interactivity would be most helpful in (a) enabling our students to achieve the learning goals set forth in the ESL class in which they would be using the system and by means identified in current SLA theory and (b) enabling us as researchers to determine how (or if) the system was effective in helping students with their language-learning efforts. Table 3, an expanded version of Table 1 above, relates Chou's (2003) interactivity dimensions to student needs and instructor goals and outlines how this is accomplished in iWRITE.
We view iWRITE as a prototype of the smart, dynamic, learner-corpus-based applications that will enhance language learning in the near future. In this paper, we illustrated one approach to transforming a learner corpus into a sound online resource using theory-supported design features and an iterative, dynamic approach. This incarnation of iWRITE deals with predefined syntactic problems. However, the underlying architecture of this program can be used to address other problems as well, be they more rhetorical aspects of writing or writings composed by NS on a variety of topics. While preliminary feedback from learners and teachers suggests that iWRITE is viewed as a potential asset for language learning, what needs to be examined in greater detail next is how language learners and language teachers perceive iWRITE in terms of its potential to transform learners' awareness of grammatical errors and their writing. Among the various notions driving this line of research, one ideal outcome would be to generate an automatic profile of a learner (e.g., Granger & Rayson, 1998).
Since the creation of the first version of iWRITE in June 2003, the resource has been used by approximately 200 learners in intermediate and high-intermediate academic-writing classes at Iowa State University. Indicative of how students perceive the resource is the following quote from one intermediate-level student: “When I revised my partner's essay I used iWRITE to help. We did it in class but I also did it outside of class. I think it helped, but I still think it's really hard to detect errors on my own.” The use of this resource also promises increased motivational appeal. During a semistructured interview, one student expressed his enthusiasm about the program by saying, “I particularly like the marking component of the program. I love it! It feels like my tutor is sitting beside me.” Another student's remark (“When I peer-edit I look at paragraph level, sentence level, [and] word level now.”) hints at a positive analytical development in that the notion of a layered approach towards peer editing seems to be taking hold. However, while these reactions are promising, more research (e.g., Hegelheimer, in press) is needed before conclusions can be drawn.


