Title of Invention

METHOD AND SYSTEM FOR COMPUTER BASED TESTING USING PLUGINS TO EXPAND FUNCTIONALITY OF A TEST DRIVER

Abstract

The present invention relates to a system for computer-based testing for at least one test, the at least one test having a presentation format and data content, comprising: a test driver, having an executable code that controls functionality that enables the test driver to deliver the at least one test to an examinee using a display device, manage the at least one test, control progression of the at least one test, control scoring of the at least one test, and control results reporting of the at least one test; a resource file, in operative data communication with the test driver, that stores information relating to the data content, the presentation format, progression, scoring, and results reporting of the at least one test, the information being accessible to the test driver to enable the functionality of the test driver; and an expansion module, in operative data communication with the test driver and the resource file, that retrieves the information relating to at least one of the data content, the presentation format, the progression, the scoring, and the results reporting of the at least one test from the resource file and provides the information to the test driver during delivery of the at least one test, the expansion module expanding the functionality of the test driver without necessitating modification to the executable code of the test driver. The present invention also relates to a method for computer-based testing.
Full Text


METHOD AND SYSTEM FOR COMPUTER BASED TESTING USING PLUGINS TO
EXPAND FUNCTIONALITY OF A TEST DRIVER
CROSS REFERENCE TO RELATED APPLICATIONS
This application is related to and claims the priority of U.S. Provisional Application Serial No. 60/331,228, filed November 13, 2001 and incorporated herein by reference, and is further related to: U.S. Patent Application entitled "EXTENSIBLE EXAM LANGUAGE (XXL) PROTOCOL FOR COMPUTER BASED TESTING" and having inventors Clarke Daniel Bowers, Tronster Maxwell Hartley, Kyle Michael Kvech, and William Howard Garrison (Docket No. 26119-146); U.S. Patent Application entitled "METHOD AND SYSTEM FOR COMPUTER BASED TESTING USING CUSTOMIZABLE TEMPLATES" and having inventor Clarke Daniel Bowers (Docket No. 26119-143); U.S. Patent Application entitled "METHOD AND SYSTEM FOR COMPUTER BASED TESTING USING A NON-DETERMINISTIC EXAM EXTENSIBLE LANGUAGE (XXL) PROTOCOL" and having inventor Clarke Daniel Bowers (Docket No. 26119-144); and U.S. Patent Application entitled "METHOD AND SYSTEM FOR COMPUTER BASED TESTING USING AN AMALGAMATED RESOURCE FILE" and having inventor Clarke Daniel Bowers (Docket No. 26119-145), all of which are being filed concurrently herewith and all of which are incorporated by reference in their entirety herein.
BACKGROUND OF THE INVENTION
FIELD OF THE INVENTION
The present invention generally relates to the field of computer-based testing, and in particular, the present invention relates to an expandable test driver used to deliver a computer-based test to an examinee, where plugins are used to expand the functionality of the test driver during delivery of the computer-based test.
BACKGROUND OF THE RELATED ART
For many years, standardized testing has been a common method of assessing examinees as regards educational placement, skill evaluation, etc. Due to the prevalence and mass distribution of standardized tests, computer-based testing has emerged as a superior method for providing standardized tests, guaranteeing accurate scoring, and ensuring prompt return of test results to examinees.
Tests are developed based on the requirements and particulars of test developers. Typically, test developers employ psychometricians or statisticians and psychologists to determine the requirements specific to human assessment. These experts often have their own, unique ideas regarding how a test should be presented and regarding the necessary contents of that test, including the visual format of the test as well as the data content of the test. Therefore, a particular computer-based test has to be customized to fulfill the client's requirements.

Figure 1 illustrates a prior art process for computerized test customization, denoted generally by reference numeral 10. First, a client details the desired test requirements and specifications, step 12. The computerized test publisher then creates the tools that allow the test publisher to author the items, presentations, etc., required to fulfill the requirements, step 14. The test publisher then writes an item viewer, which allows the test publisher to preview what is being authored, step 16.
An item presenter is then written to present the new item, for example, to the test driver, step 18. Presenting the new item to the test driver requires a modification of the test driver's executable code. The test driver must be modified so that it is aware of the new item and can communicate with the new item presenter, step 20. The test packager must then also be modified, step 22. The test packager, which may also be a compiler, takes what the test publisher has created and writes the result as new object codes for the new syntax. Subsequently, the scoring engine must also be modified to be able to score the new item type, step 24. Finally, the results processor must be modified to be able to accept the new results from the new item, step 26. This process requires no less than seven software creations or modifications to existing software.
U.S. Patent No. 5,827,070 (Kershaw et al.) and U.S. Patent No. 5,565,316 (Kershaw et al.) are incorporated herein by reference. The '070 and '316 patents, which have similar specifications, disclose a computer-based testing system comprising a test development system and a test delivery system. The test development system comprises a test document creation system for specifying the test contents, an item preparation system for computerizing each of the items in the test, a test preparation system for preparing a computerized test, and a test packaging system for combining all of the items and test components into a computerized test package. The computerized test package is then delivered to authorized examinees on a workstation by the test delivery system.
Figures 2A and 2B illustrate the test preparation process as disclosed in the '070 and '316 patents. Test developers assemble the test as shown at 32. As shown at 36, item selection is preferably automated (AIS) using the test development/document creation ("TD/DC") system or an equivalent test document creation system. Using "TD/DC", test developers enter the test specifications into the "TD/DC" system. Based on these specifications, "TD/DC" searches its central database for items which satisfy the test specification, e.g., 50 math questions, 25 of which are algebra problems and 25 of which are geometry problems. Then, the test developers review the items selected by "TD/DC" for sensitivity and overlap constraints described in the background section. If the test developer decides that the sensitivity or overlap constraints are not satisfied by the current selection of items, certain items may be designated to be replaced by another item from the database. In addition, test developers provide a test description specifying the directions, messages, timing of sections, number of sections of the test, etc. as shown at 42. If a computer adaptive test (CAT) is to be run, test developers may run a computer adaptive test simulation at 34, as is known to skilled test developers. Using the Test Preparation Tool (TPT) and TOOLBOOK 46, the test preparation system ("TPS") prepares the test level components as shown at 50. TOOLBOOK is commercially available from Asymetrix Corporation. The

test level components include scripts 66, item table block sets 56, general information screens 58, direction screens 60, message screens 62, and tutorial units 64. Each of the test components will be described in detail below. As the components are prepared, the TPT stores them in a TPS network directory 52. Then, the components are entered into the TPS Production database 54. The components stored in the TPS Production database 54 will be retrieved during test packaging.
U.S. Patent No. 5,513,994 (Kershaw et al.), which is incorporated herein by reference, discloses a centralized administrative system and method of administering standardized tests to a plurality of examinees. The administrative system is implemented on a central administration workstation and at least one test workstation located in different rooms at a test center. The administrative system software, which provides substantially administrative functions, is executed from the central administration workstation. The administrative system software, which provides functions carried out in connection with a test session, is executed from the testing workstations.
None of the Kershaw et al. patents appear to make any mention of how modifications may be made to the computer-based testing system to incorporate a particular client's test specification. What is required is a system that, for example, allows the test driver to be expanded to support new item types, scoring algorithms, etc., without making any changes to the test driver's executable or recompiling the test driver to support the new functionality, as described below in connection with the present invention. Other features and advantages in addition to the above, or in the alternative to the above, are described in the Summary of the Invention and the Detailed Description provided below.
SUMMARY OF THE INVENTION
It is one feature and advantage of the present invention to enable expansion of a test driver in a computer-based test delivery system without necessitating changes to an executable code of the test driver and without recompiling or re-linking the test driver.
It is another optional feature and advantage of the present invention to enable the test driver to support new item types, navigation algorithms, information displays, scoring algorithms, timing algorithms, test unit selection algorithms, results persistence reporting, printed score reporting, and/or helm types without change to the executable code of the test driver.
It is another optional feature and advantage of the present invention to facilitate expansion of the test driver using an expansion module.
It is another optional feature and advantage of the present invention that the expansion module is instantiated based on a class name within a test definition language based on eXtensible Markup Language format.
It is another optional feature and advantage of the present invention to use component object model interfaces to facilitate communication between the test driver and the expansion module such that the expansion module can enable expansion of the test driver.
It is another optional feature and advantage of the present invention that multiple types of expansion modules can implement a particular component object model interface.

It is another optional feature and advantage of the present invention that the expansion modules can be written after the test driver is built.
These and other features and advantages of the present invention are achieved in a system for computer-based testing for producing a test and delivering the test to an examinee. The test has a presentation format that determines the visual presentation of the test and data content that determines the functional properties of the test. The system includes a test driver that has an executable code that controls the test driver. Based on the executable code and a test specification and content defined by a test definition language, the test driver delivers the test to the examinee using a display device and manages the test. The test driver controls progression of the test based on the navigational aspects of the test specification and content. The test driver also controls scoring of the test and controls results reporting of the test, which optionally includes timing of units of the exam, item timing, item responses, exam unit score, candidate demographics, appointment information, item scores, etc.
The system also includes a resource file that stores information relating to data content, presentation format, progression, scoring, and/or results reporting of the test. The information is accessible to the test driver to enable the test driver to retrieve the test specification and content and to deliver the test to the examinee. The system further includes an expansion module that retrieves the information relating to the data content, the presentation format, the progression, the scoring, and/or the results reporting of the test from the resource file and provides the information to the test driver during delivery of the test. The expansion module advantageously expands the functionality of the test driver without necessitating modification to the executable code of the test driver. In an alternative embodiment, the expansion module is a plugin.
In an alternative embodiment, the system of the present invention further includes a source file storing information relating to the data content, the presentation format, the progression, the scoring, and/or the results reporting of the test. A test publisher authors the information stored in the source file. The system further includes a test packager retrieving the information from the source file. The resource file stores the information retrieved from the source file by the test packager. In one embodiment, the test packager passes the information from the source file to the expansion module to allow the expansion module to validate the test definition language stored in the source file. The system also includes an instance file storing examination state information comprising responses provided by the examinee to items presented to the examinee during the test. The examination state information is accessible to the expansion module to enable a restart of the test if the test is interrupted during delivery due to, for example, a power failure.
In another embodiment, the system of the present invention employs, for example, nine different expansion modules to enhance the functionality of the test driver. Each expansion module optionally controls a different feature of the test, retrieving different types of information from the resource file and providing the information to the test driver during the delivery of the test. A first expansion module retrieves information from the resource file that relates to non-interactive display

material. A second expansion module retrieves information relating to test navigation. A third expansion module retrieves information relating to test navigation controls. A fourth expansion module retrieves information relating to items, or questions, that are delivered to the examinee during the test. A fifth expansion module retrieves information relating to timing of the test. A sixth expansion module retrieves information relating to selection, where the selection information determines what items are presented to the examinee and in what order. A seventh expansion module retrieves information relating to scoring of the test. An eighth expansion module retrieves information relating to results. Finally, a ninth expansion module retrieves information relating to reporting, such as, printing the examinee's score report. Of course, the functionality of the expansion modules may be combined and/or separated in alternative embodiments.
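By way of illustration only, and forming no part of the claimed invention, the dispatch of feature areas to expansion modules described above may be sketched as follows. All class, function, and category names are invented for this sketch; the actual embodiment uses COM objects rather than Python classes.

```python
# Hypothetical sketch: a driver that delegates each of the nine
# feature areas named in the specification to a registered module.

class ExpansionModule:
    """Stand-in for one expansion module handling one feature area."""
    def __init__(self, category):
        self.category = category

    def load(self, resource_info):
        # A real module would interpret resource-file data here.
        self.info = resource_info
        return self.info

class TestDriver:
    """Minimal driver that hands resource data to matching modules."""
    def __init__(self):
        self.modules = {}

    def register(self, module):
        self.modules[module.category] = module

    def deliver(self, resources):
        # For each feature area, pass the matching resource data to
        # the module registered for that category.
        return {cat: mod.load(resources.get(cat))
                for cat, mod in self.modules.items()}

# The nine feature areas enumerated in the specification.
CATEGORIES = ["display", "navigation", "controls", "item", "timer",
              "selection", "scoring", "results", "report"]

driver = TestDriver()
for cat in CATEGORIES:
    driver.register(ExpansionModule(cat))

delivered = driver.deliver({cat: f"{cat}-data" for cat in CATEGORIES})
```

The point of the sketch is that adding a tenth feature area requires only registering another module; the driver's own code is untouched.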
In another embodiment of the present invention, a method of computer-based testing for a test is provided, where the test has a presentation format that determines the visual presentation of the test and data content that determines the functional properties of the test. Delivery of the test is controlled by a test driver that has an executable code that enables the test driver to deliver the test to an examinee using a display device, manage the test, control progression of the test, control scoring of the test, and control results reporting of the test.
The method includes the sequential, non-sequential, and/or sequence independent steps of instantiating an expansion module, providing to the expansion module a resource storage element within a resource file, and loading information from the resource storage element into the expansion module during delivery of the test. The information from the resource storage element relates to the data content, the presentation format, the progression, the scoring, and/or the results reporting of the test. The method also includes providing the information from the expansion module to the test driver during the delivery of the test. The expansion module expands the functionality of the test driver without necessitating programming changes to the executable code of the test driver. In one embodiment of the method, the expansion module is instantiated during delivery of the test.
In a further embodiment, a method of computer-based testing for a test includes instantiating an expansion module and loading information into the expansion module from a source file, where the information relates to the data content, presentation format, progression, scoring, and reporting of test results of the test. The method also includes validating the information from the source file and unloading the information from the validation expansion module into a resource storage element within a resource file. The expansion module expands the functionality of the test driver without necessitating programming changes to the executable code of the test driver. In an alternative embodiment of the method, the expansion module is instantiated during production of the test.
In another alternate embodiment of the present invention, a method of computer-based testing for a test includes instantiating an expansion module during production of the test and loading information into the expansion module from a source file, where the information relates to at least one of non-interactive display material, test navigation, test navigation controls, items, timing, selection.

better understood, and in order that the present contribution to the art may be better appreciated. There are, of course, additional features of the invention that will be described hereinafter and which will form the subject matter of the claims appended hereto.
In this respect, before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not limited in its application to the details of construction and to the arrangements of the components set forth in the following description or illustrated in the drawings. The invention is capable of other embodiments and of being practiced and carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein are for the purpose of description and should not be regarded as limiting.
As such, those skilled in the art will appreciate that the conception, upon which this disclosure is based, may readily be utilized as a basis for the designing of other structures, methods and systems for carrying out the several purposes of the present invention. It is important, therefore, that the claims be regarded as including such equivalent constructions insofar as they do not depart from the spirit and scope of the present invention.
Further, the purpose of the foregoing abstract is to enable the U.S. Patent and Trademark Office and the public generally, and especially the scientists, engineers and practitioners in the art who are not familiar with patent or legal terms or phraseology, to determine quickly from a cursory inspection the nature and essence of the technical disclosure of the application. The abstract is neither intended to define the invention of the application, which is measured by the claims, nor is it intended to be limiting as to the scope of the invention in any way.
These, together with other objects of the invention, along with the various features of novelty, which characterize the invention, are pointed out with particularity in the claims annexed to and forming a part of this disclosure. For a better understanding of the invention, its operating advantages and the specific objects attained by its uses, reference should be had to the accompanying drawings and descriptive matter in which there are illustrated preferred embodiments of the invention.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 is a flow diagram of a prior art method of computerized test customization;
Figures 2A and 2B are block diagrams of a prior art process for test production;
Figure 3 is a schematic diagram of a computer-based testing system according to the present invention;
Figure 4 is a block diagram illustrating different types of plugins that are used with the computer-based testing system according to the current invention;
Figure 5 illustrates various components that comprise an exam source file;
Figures 6A and 6B are a schematic illustrating the components, classes, and interfaces that comprise a test definition language compiler according to the present invention;
Figure 7 is a schematic illustrating the components that comprise a test driver and a test administration system according to the present invention;

Figures 8A and 8B are schematics illustrating the classes and interfaces that comprise the test driver;
Figure 9 illustrates the interfaces that comprise a structured storage according to the present invention;
Figures 10A and 10B are schematics illustrating the classes and interfaces that comprise the structured storage and associated operations;
Figure 11 is a block diagram of main storage branches of an exam resource file according to the present invention;
Figure 12 is a block diagram illustrating an exams branch of the exam resource file;
Figures 13A and 13B are block diagrams illustrating a forms branch of the exam resource file;
Figure 14 is a block diagram illustrating an items branch of the exam resource file;
Figure 15 is a block diagram illustrating a categories branch of the exam resource file;
Figure 16 is a block diagram illustrating a templates branch of the exam resource file;
Figure 17 is a block diagram illustrating a sections branch of the exam resource file;
Figure 18 is a block diagram illustrating a groups branch of the exam resource file;
Figures 19A, 19B, 19C, and 19D are block diagrams illustrating an events sub-branch of the groups branch of the exam resource file;
Figure 20 is a block diagram illustrating a plugins branch of the exam resource file;
Figure 21 is a block diagram illustrating a data branch of the exam resource file;
Figure 22 is a block diagram illustrating a formGroups branch of the exam resource file;
Figure 23 is a block diagram illustrating an attributes branch of the exam resource file;
Figure 24 is a block diagram illustrating a scripts branch of the exam resource file;
Figure 25 is a block diagram illustrating a message box branch of the exam resource file;
Figures 26A, 26B, 26C, and 26D are block diagrams of an exam instance file according to the present invention;
Figure 27 is a flow diagram of a method of computerized test customization according to the present invention;
Figure 28 is a flow chart of a method of test production and test delivery according to the present invention;
Figure 29 is a flow chart of a method for validation of test specification and content according to the present invention;
Figure 30 is a flow chart of a method for test delivery according to the present invention;
Figure 31 is a flow chart of a method of restarting a test after interruption according to the present invention;
Figure 32 is a diagram of a life cycle of a plugin according to the present invention;
Figure 33 is a flow diagram of a process for compiling plugins according to the present invention; and

Figures 34A, 34B, 34C, and 34D are flow diagrams of a process for delivering plugins to an examinee during a computer-based test.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Reference now will be made in detail to the presently preferred embodiments of the invention. Such embodiments are provided by way of explanation of the invention, which is not intended to be limited thereto. In fact, those of ordinary skill in the art may appreciate upon reading the present specification and viewing the present drawings that various modifications and variations can be made.
For example, features illustrated or described as part of one embodiment can be used on other embodiments to yield a still further embodiment. Additionally, certain features may be interchanged with similar devices or features not mentioned yet which perform the same or similar functions. It is therefore intended that such modifications and variations are included within the totality of the present invention.
The present invention discloses a system and method of computer-based testing using a test driver that is, for example, object-oriented and is architected to dynamically add functionality through, for example, the use of an expansion module, and preferably through the use of plugins. The test driver preferably references component object model servers using standard interfaces, and uses, for example, class names (that can be an Active Document) defined in a custom test definition language entitled eXtensible eXam Language ("XXL"), based on eXtensible Markup Language ("XML") format, to interact with existing applications while offering the flexibility of allowing development of new plugins. These new plugins can be customized to a client's needs without changing the core test driver. The specific format and protocol of XXL is also described in the co-pending application filed on the same date, entitled "EXTENSIBLE EXAM LANGUAGE (XXL) PROTOCOL FOR COMPUTER BASED TESTING," incorporated herein by reference.
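By way of a hedged illustration only, the class-name-driven instantiation described above may be sketched as follows. In the actual embodiment the class names resolve to COM servers; here a plain dictionary stands in for the COM class registry, and the XXL fragment and all names are invented for this sketch.

```python
# Hypothetical sketch: instantiate plugin objects from class names
# carried in an XML-based test definition, without the driver knowing
# the concrete classes in advance.
import xml.etree.ElementTree as ET

XXL_FRAGMENT = """
<exam>
  <plugin name="itemMultiChoice" class="MultiChoiceItem"/>
  <plugin name="timerStandard" class="StandardTimer"/>
</exam>
"""

class MultiChoiceItem:
    kind = "item"

class StandardTimer:
    kind = "timer"

# Stand-in for the COM class registry keyed by class name.
CLASS_REGISTRY = {
    "MultiChoiceItem": MultiChoiceItem,
    "StandardTimer": StandardTimer,
}

def instantiate_plugins(xxl_text):
    """Create one plugin object per <plugin> element, by class name."""
    root = ET.fromstring(xxl_text)
    return {p.get("name"): CLASS_REGISTRY[p.get("class")]()
            for p in root.iter("plugin")}

plugins = instantiate_plugins(XXL_FRAGMENT)
```

A new plugin class can thus be supplied after the driver ships: registering it under a new class name and naming it in the test definition suffices, with no change to the driver's code.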
The plugins advantageously enable the test driver to support, for example, new item types, navigation algorithms, information displays, scoring algorithms, timing algorithms, test unit selection algorithms, results persistence reporting, printed score reporting, and/or helm types without change to the test driver's executable. Plugins also allow expansion of the test driver's functionality without requiring the test driver to be recompiled or re-linked, and without requiring the test publisher to learn to program. Since plugins are written independently of the test driver, plugins can be written long after the test driver is built.
The client and the software developer can design and test the plugins and distribute the plugins to each test site. By using this method, large-scale regression testing of other examinations will not usually be necessary unless changes are made to the plugins that may be used by many examinations.
I. Overview of Computer-Based Test Delivery System
Figure 3 shows an overview of the software architecture for the computer-based test delivery system of the present invention, denoted generally by reference numeral 100. Test driver 110 is

responsible for controlling all aspects of the computer-based test. Test driver 110 identifies examinees scheduled to take the computer-based test and identifies and creates the appropriate test. Test driver 110 then presents all of the test components to examinees using a display device (not shown), such as a computer monitor, and enables examinees to enter responses to test questions through the use of an input device (not shown), such as a keyboard, a mouse, etc. Test driver 110 also monitors the security of the test. For example, test driver 110 can prevent access to the Internet and can validate examinees, although these functions are preferably performed by the test center administration system. Test driver 110 also monitors the timing of the test, providing relevant warnings to the examinee regarding the elapsed time of the test and the time remaining for a particular section of the test or for the entire test. Test driver 110 is also responsible for scoring the test, once the test is completed or while the test is in progress, and for reporting the results of the test by physical printout using printer 182 or in a file format using candidate exam results file 180. If the test is interrupted while in progress, for example, due to a power failure, test driver 110 restarts the test, preferably at the point at which the test was interrupted, as will be described subsequently in more detail. Finally, if the test is left incomplete, test driver 110 cleans up the incomplete test. An incomplete test will have an exam instance file in the examinee's directory but will not have created a results file. A results file is created even though generally the candidate will fail. The number of items delivered to the examinee is recorded in the results file. Test driver 110 picks up where the event was interrupted and invisibly delivers the rest of the units of the test.
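The restart behavior just described can be illustrated with the following hedged sketch, which forms no part of the disclosed implementation: the examination state (here only a next-item index and the recorded responses) is checkpointed to an instance file after each item, so a fresh driver process can resume where the interrupted one stopped. The file name, fields, and functions are invented for illustration.

```python
# Hypothetical sketch: checkpoint examination state after every item
# so an interrupted test can be restarted at the point of interruption.
import json
import os
import tempfile

def deliver(items, instance_path, answer_fn, stop_after=None):
    # Resume from a previous checkpoint if one exists.
    if os.path.exists(instance_path):
        with open(instance_path) as f:
            state = json.load(f)
    else:
        state = {"next": 0, "responses": {}}
    for i in range(state["next"], len(items)):
        if stop_after is not None and i >= stop_after:
            return state          # simulate a power failure mid-test
        state["responses"][items[i]] = answer_fn(items[i])
        state["next"] = i + 1
        with open(instance_path, "w") as f:
            json.dump(state, f)   # checkpoint after every item
    return state

path = os.path.join(tempfile.mkdtemp(), "exam.instance")
items = ["Q1", "Q2", "Q3", "Q4"]
deliver(items, path, lambda q: q.lower(), stop_after=2)  # interrupted run
final = deliver(items, path, lambda q: q.lower())        # restarted run
```

On restart, delivery begins at the first undelivered item rather than at the beginning of the test, mirroring the "picks up where the event was interrupted" behavior described above.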
A test specification is authored by a test publisher according to the specifications of the client and stored in exam source files 130. Exam source files 130 include data files 132, XXL files 134, multimedia files 136, and hypertext markup language ("HTML") files 138. XXL files 134 include the test specification, which contains the client's requirements for the test, a bank of test items or questions, templates that determine the physical appearance of the test, plugins, and any additional data necessary to implement the test. Additional data is also stored in data files 132. For example, an adaptive selection plugin may need a, b & c theta values. These values are stored in a binary file created by a statistical package.
HTML files 138 include, for example, any visual components of the test, such as the appearance of test items or questions, the appearance of presentations on the display device, the appearance of any client specified customizations, and/or the appearance of score reports. HTML files 138 preferably also include script, for example, VBScript and JScript, or JavaScript. HTML files 138 are preferably authored using Microsoft's FrontPage 2000. FrontPage 2000 is preferably also used to manage the source files in a hierarchy that is chosen by the test publisher. Multimedia files 136 include, for example, any images (.jpg, .gif, etc.) and/or sound files (.mp3, .wav, .au, etc.) that are used during the test.
XXL compiler 140 retrieves XXL files 134 from exam source files 130 using interface 190 and compiles the XXL test content stored in XXL files 134. XXL compiler 140 stores the compiled test

files in exam resource file 120. In another embodiment, exam source files 130 do not contain XXL files 134 and contain, for example, only multimedia files. In this embodiment, XXL compiler 140 is merely a test packager that writes the data directly to exam resource file 120 without modification or validation. The data appears in a stream under the "data" branch of exam resource file 120. The name of the stream is specified by the test author.
In a preferred embodiment, XXL files 134 also include XXL language that defines plugins 150, in which case, plugins 150 assist XXL compiler 140 in compiling XXL files 134. Test driver 110 preferably supports, for example, nine different types of plugins 150, including, for example: display plugin 152; helm plugin 154; item plugin 156; timer plugin 158; selection plugin 160; navigation plugin 162; scoring plugin 164; results plugin 166; and report plugin 168. Plugins 150, which are also included in XXL files 134, are the first XML files compiled into exam resource file 120.
Plugins 150 allow a test designer to customize the behavior of test driver 110 and are divided into two types, for example: visible plugins and invisible plugins, as shown in Figure 4. The visible plugins, which include display plugin 152, helm plugin 154, and item plugin 156, enable the test driver to control what is presented visually to an examinee on the display device. The invisible plugins, which include timer plugin 158, selection plugin 160, navigation plugin 162, scoring plugin 164, results plugin 166, and report plugin 168, enable the test driver to control more functional aspects of the test. Plugins 150 are used to validate data stored in exam source files 130 that is to be used by one of plugins 150 during delivery of the test to the examinee, as is described below in greater detail. Plugins 150 are, preferably, component object model ("COM") objects, as described below. Plugins 150 may also utilize a Java implementation. Plugins 150 are preferably written using Microsoft Visual C++ or Visual Basic 6.0 or any fully COM-enabled language. Plugins 150 may be in- or out-of-process, and, therefore, can exist as executable (".EXE") files or as dynamic link library (".DLL") files.
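The visible/invisible split described above may be sketched, purely for illustration, as two interface contracts. In the actual embodiment these contracts are COM interfaces; Python abstract base classes stand in for them here, and the method names are hypothetical.

```python
# Hypothetical sketch: two plugin families, distinguished by whether
# they draw on the display device or only affect test behavior.
from abc import ABC, abstractmethod

class VisiblePlugin(ABC):
    """Contract for plugins that present something on the display."""
    @abstractmethod
    def render(self):
        ...

class InvisiblePlugin(ABC):
    """Contract for plugins that affect behavior but draw nothing."""
    @abstractmethod
    def execute(self):
        ...

class ItemPlugin(VisiblePlugin):
    def render(self):
        return "item presented on screen"

class TimerPlugin(InvisiblePlugin):
    def execute(self):
        return "section clock updated"

def classify(plugin):
    """The driver can treat each family uniformly via its interface."""
    return "visible" if isinstance(plugin, VisiblePlugin) else "invisible"
```

Because the driver programs against the interface rather than the concrete class, multiple plugin implementations can satisfy the same contract, as the specification notes for COM interfaces.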
An application or component that uses objects provided by another component is called a client. Components are characterized by their location relative to clients. An out-of-process component is an .exe file that runs in its own process, with its own thread of execution. Communication between a client and an out-of-process component is therefore called cross-process or out-of-process communication.
An in-process component, such as a .dll or .ocx file, runs in the same process as the client. It provides the fastest way of accessing objects, because property and method calls don't have to be marshaled across process boundaries. However, an in-process component must use the client's thread of execution.
Exam resource file 120 receives the compiled test content from XXL compiler 140 and plugins 150, if applicable, and stores the compiled test content in an object-linking and embedding ("OLE") structured storage format, called POLESS, which is described in greater detail below. Other storage formats may optionally be used. OLE allows different objects to write information into the same file, for example, embedding an Excel spreadsheet inside a Word document. OLE supports two types of

structures, embedding and linking. In OLE embedding, the Word document of the example is a container application and the Excel spreadsheet is an embedded object. The container application contains a copy of the embedded object, and changes made to the embedded object affect only the container application. In OLE linking, the Word document of the example is the container application and the Excel spreadsheet is a linked object. The container application contains a pointer to the linked object and any changes made to the linked object change the original linked object. Any other applications that link to the linked object are also updated. POLESS supports structured storage such that only one change made to an object stored in exam resource file 120 is globally effective. Test driver 110 comprises Active Document container application 112 for the visible plugins, display plugin 152, helm plugin 154, and item plugin 156, which function as embedded objects, preferably COM objects.
Both XXL compiler 140 and plugins 150 are involved in storing the compiled test content into exam resource file 120, if any of plugins 150 are being used. Exam resource file 120 comprises, for example, a hierarchical storage structure, as will be described in further detail below. Other storage structures may optionally be used. XXL compiler 140 determines to which storage location a specific segment of the compiled test content is to be stored. However, if any of plugins 150 are used to validate any portion of the data from exam source files 130, then the plugins 150 store the data directly to the exam resource file, based upon directions from XXL compiler 140. XXL compiler 140 uses IPersistResource interface 192, co-located with IPlugin interface 167 in Figure 3, to control the persistence of the data to exam resource file 120. XXL compiler 140 and plugins 150 write the data to exam resource file 120 using POLESS interfaces 191.
Figure 5 illustrates the contents of exam source file 130, which are compiled into exam resource file 120 by XXL compiler 140 and plugins 150. FrontPage 2000 Web 200 is used, for example, to author the test. Exam source files 130 contain media files 210, visual files 220, and logic files 230. Media files 210 are multimedia files used to enhance the presentation of the test, including, for example, XML data files 212, sound files 214, image files 216, and binary files 218. XML data files 212 include the XXL test definition language and the XXL extensions from the plugins 150 that use XML. The test specification, presentation, scoring and other information is specified in the XML files. Sound files 214 include any sounds that are to be used during the test, such as .mp3 files, .au files, etc. Image files 216 include any images to be used during the test, such as .jpg files, .gif files, etc. Binary files 218 include any data needed by a plugin 150 that is not in XXL format. Visual files 220 are HTML files that specify the visual presentation of the test as presented to the examinee on the display device, including items files 222, presentation files 224, score report files 226, and custom look files 228. Items files 222 include HTML files that are used to specify the visual component of test questions, e.g., stems and distractors. Items files 222 are capable also of referencing external exhibits. An exhibit could be a chart, diagram or photograph. Formats of exhibits include, for example: .jpg, .png, etc. Presentation files 224 define what is seen by the examinee on the display device at a

particular instant during the test. Score report files 226 are typically HTML files with embedded script that include, for example, candidate demographics, appointment information, and candidate performance. The performance might include pass/fail, achievement in different content areas, etc. Custom look files 228 are typically HTML files with embedded script to lay out, for example, the title bar and information contained therein. Logic files 230 are XML files that specify the functional aspects of the test, including test specification files 232, plugin files 234, item bank files 236, and template files 238. Test specification files 232 specify the content and progression of the test as provided by the client. Plugin files 234 define plugins 150 and contain any data necessary to implement plugins 150. Item bank files 236 include the data content and properties of the items, or test questions, that are to be presented to the examinee during the test. Properties of an item include the correct answer for the item, the weight given to the item, etc. Template files 238 define visual layouts that are used with the display screen during the test.
Referring again to Figure 3, once a test has begun, test driver 110 accesses exam resource file 120 for the instructions and files needed to implement the test, using POLESS interfaces 193. Test driver 110 also accesses plugins 150 for additional data that expands the functionality of test driver 110 in the areas of items, navigation algorithms, information displays, scoring algorithms, timing algorithms, test unit selection algorithms, results persistence reporting, printed score reporting, and/or helm types. Test driver 110 communicates with plugins 150 using various COM interfaces 169. COM interfaces facilitate OLE linking. As stated previously, test driver 110 is an Active Document container application and plugins 150 are embedded objects. The COM interfaces function as communications paths between the container application and the objects.
There are, for example, ten COM interfaces utilized in computer-based test delivery system 100. IPlugin interface 167, which is also a COM interface, is supported by all of plugins 150. COM interfaces 169, therefore, include the IPlugin interface. The IPlugin interface contains generic operations, such as loading and unloading, required of all plugins 150. In addition to the global IPlugin interface, each plugin 150 also uses, for example, a second, individual COM interface 169 to communicate with test driver 110. Alternative structures of the IPlugin interface may also be used. Table 1 shows the relationship between each plugin 150 and the COM interface 169 used with that particular plugin 150.
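The generic-plus-specific interface pattern described here can be sketched as follows. This is a hypothetical model, not the driver's actual COM code: `dynamic_cast` stands in for COM's `QueryInterface`, and the `PercentScorePlugin` class and its percentage rule are invented for illustration.

```cpp
#include <cassert>
#include <string>

// Generic operations required of all plugins (the IPlugin role).
struct IPlugin {
    virtual ~IPlugin() = default;
    virtual void load(const std::string& data) = 0;   // load plugin data
    virtual void unload() = 0;                        // release resources
};

// A second, type-specific interface used by scoring plugins (IScore role).
struct IScore {
    virtual ~IScore() = default;
    virtual double score(int correct, int total) const = 0;
};

// A concrete plugin supports both the generic and the specific interface.
class PercentScorePlugin : public IPlugin, public IScore {
    bool loaded_ = false;
public:
    void load(const std::string&) override { loaded_ = true; }
    void unload() override { loaded_ = false; }
    bool isLoaded() const { return loaded_; }
    double score(int correct, int total) const override {
        return total ? 100.0 * correct / total : 0.0;
    }
};

// The driver holds plugins through the generic interface and queries for
// the type-specific one only when needed ("QueryInterface" stand-in).
inline double driverScore(IPlugin* p, int correct, int total) {
    auto* s = dynamic_cast<IScore*>(p);
    return s ? s->score(correct, total) : -1.0;
}
```

The design point being modeled: the driver can manage any plugin's lifetime uniformly, while each plugin type exposes its own operations through its second interface.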





Several administrative environments perform the administrative functions of computer-based test delivery system 100, for example: Test Center Manager ("TCM") Bridge 172; Educational Testing Service ("ETS") Bridge 174; and Unified Administration System ("UAS") 174. Administrative functions include, for example: checking-in an examinee, starting the test, aborting the test, pausing the test, resuming the test, and transmitting results.
There are preferably two ways to run test driver 110. The first is through a series of command line options and the second is using COM interfaces describing appointment information. The command line option exists for backwards compatibility in a standard ETS environment and a TCM environment. Table 2 shows a list of command line options test driver 110 supports. There are, for example, four programs which launch the test through the COM interface, for example: 1) LaunchTest.exe (for test production and client review); 2) UAS; 3) UTD2ETS.dll (an internal


IAppointment interface 176 is part of UAS 174 and allows access by test driver 110 to examinee information for the examinee taking the test, such as demographics. The examinee information is included in candidate exam results file 180, which is created by the test driver. ILaunch2 interface 177 functions as the primary control interface for UAS 174 and allows UAS 174 to control various components, such as test driver 110, screen resolution change, accommodations for disabled candidates, examinee check-in, etc., in a test center, which is the physical location where the examinee is taking the test. ITransfer interface 199 transfers candidate exam results file 180 and other files back to UAS 174. IPrint interface 198 sends information regarding any reports to printer 182.
II. XXL Compiler Interfaces and Classes
Figures 6A and 6B illustrate the main diagram for XXL compiler 140. XXL compiler 140 comprises the following classes, for example: cCompiler 2000; cData 2004; cArea 2006; cTemplate 2008; cCategory 2010; cItem 2012; cPresentation 2014; cGroup 2016; cSection 2018; cForm 2020; cFormGroup 2022; cExam 2024; cMsgBox 2026; cChecksum 2028; cEvent 2030; cResult 2032; cReport 2034; cPlugin 2036; and cXXL 2038.

The main interface to XXL compiler 140 is ICompile interface 2002. ICompile interface 2002 is implemented by cCompiler class 2000. All control and initiation of compilation of exam source files 130 into exam resource file 120 occurs by way of this single public interface. The core, non-plugin related elements of the XXL test definition language, as stored in XXL files 134, are compiled by classes in XXL compiler 140. For example, cSection class 2018 compiles the section element, and cGroup class 2016 compiles the group element.
ICompile interface 2002 supports the following operations, for example: createResource(); addSource(); addData(); closeResource(); about(); linkResource(); openResource(); and getCryptoObject(). CreateResource() creates a resource file, for example, an XXL based resource file such as exam resource file 120. AddSource() compiles an XXL file into the resource file. AddData() adds a file directly to a data branch of the resource file. CloseResource() closes the resource
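The compile flow driven through this single public interface can be sketched as below. The sketch is hypothetical: storage is an in-memory map rather than a POLESS file, validation and real compilation are elided, and the class and stream names are invented.

```cpp
#include <cassert>
#include <cstddef>
#include <map>
#include <stdexcept>
#include <string>

// Minimal model of the ICompile-style usage pattern: create a resource,
// add compiled sources and raw data, then close it.
class Compiler {
    bool open_ = false;
    std::map<std::string, std::string> resource_;   // stream name -> content
public:
    void createResource(const std::string& /*path*/) {
        open_ = true;
        resource_.clear();
    }
    // Compile an XXL source into the resource (compilation itself elided).
    void addSource(const std::string& name, const std::string& xxl) {
        if (!open_) throw std::runtime_error("no open resource");
        resource_["compiled/" + name] = xxl;
    }
    // Add a file directly to the "data" branch, unmodified.
    void addData(const std::string& name, const std::string& bytes) {
        if (!open_) throw std::runtime_error("no open resource");
        resource_["data/" + name] = bytes;
    }
    void closeResource() { open_ = false; }
    std::size_t streamCount() const { return resource_.size(); }
    bool has(const std::string& key) const { return resource_.count(key) != 0; }
};
```

Usage mirrors the operation list above: createResource(), then any mix of addSource()/addData(), then closeResource().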


All content and specification destined for a plugin 150 appears in the data element in XXL. For example, below is an item definition in XXL:



Test driver 110 defines various interfaces to allow test driver 110 to communicate with different parts of computer-based test delivery system 100. Test driver 110 includes, for example, ten COM interfaces 169 to communicate and transfer data with plugins 150. (See Table 1 above.) The COM interfaces 169 are denoted in Figure 7 as follows, for example: IDisplay interface 169a; IHelm interface 169b; IItem interface 169c; IUnitTimer interface 169d; ISelection interface 169e; INavigate interface 169f; IScore interface 169g; IResults interface 169h; IReport interface 169i; and IPlugin


examinee interaction. IContainerNotifyHelm interface 206 allows helm plugin 154 to request navigation from test driver 110 after receiving an input from the examinee to move to another section of the test. IMore interface 202 is used to convey whether the examinee has seen all content in a presentation. For example, a "more" button appears in place of the next button when the content exceeds the window length. When the examinee scrolls to the bottom, the "more" button disappears and is replaced with the "next" button. ICollection interface 204 is used by test driver 110 to hold any group entities, for example, categories and sections of the test.
The remaining interfaces are, for example, Microsoft defined Active Document interfaces, used to implement OLE linking functions of test driver 110 and the visible plugins, display plugin 152, helm


and to directly instruct its client site to activate it as a document object. A client site with this ability is called a "document site".
B. Core Classes
Figures 8A and 8B illustrate the main classes of test driver 110 and the interfaces between test driver 110 and plugins 150. Also shown are the classes that interface to UAS 174. ITransfer interface 199, IPrint interface 198, ILaunch2 interface 177, and IAppointment interface 176 represent the connections from test driver 110 to UAS 174, as described previously. Some of the lines depicted in Figure 8 are solid and some are dashed. The solid lines, for example, between IcResults interface 240


copy of IPlugin interface 169. Furthermore, all plugins 150 get (IExam) using the IPlugin interface 169, also.
The cExam class selects and delivers the form, using cFormGroup class 228 and IForm interface 238. The form delivers results using IcResults interface 240, reports using IcReport interface 242, and sections contained within the test using ISection interface 250. Classes that are in the test delivery chain preferably derive from cEvent class 252.
The cResults class (not shown) delivers a results plugin 166 that implements IResults interface 169h. The cReport class (not shown) delivers a report plugin 168 that implements IReport interface 169i. The cSection, cGroup, and cForm classes (not shown) use several invisible plugins 150 to control the delivery of the test. These plugins 150 are timer plugins 158, which implement IUnitTimer interface 169d, selection plugins 160, which implement ISelection interface 169e, scoring plugins 164, which implement IScore interface 169g, and navigation plugins 162, which implement INavigate interface 169f. The cPresentation class (not shown) supplies data to its template for the display of the presentation. The three visible plugins 150 are created and controlled through cTemplate class 236 and child objects cArea class 234. Item plugins 156 have an extension class in the cItem class (not shown) that wraps the item plugin 156 and provides generic extended services that all item plugins 156 implement. The cItem class in test driver 110 is a wrapper class. The cItem class provides two base services, for example: generic item functionality and access to item plugin 156, which is the wrapping function. Item generic functionality includes, for example: having an item name, having an item title, determining if the item is scored or un-scored, determining whether the item has been presented to the examinee, etc. These services are generic to all items and are provided by test driver 110. Item plugins 156 perform the actual scoring of the item, which is unique to each item type. Item plugins 156 present the content of the item and allow the examinee to interact with the item. These services are unique to each item type.
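The cItem wrapper idea above can be sketched as follows: the driver-side item supplies the generic services (name, scored/un-scored, presented), and delegates type-specific scoring to the wrapped item plugin. All names and the multiple-choice scoring rule are hypothetical illustrations.

```cpp
#include <cassert>
#include <memory>
#include <string>

// Type-specific behavior supplied by an item plugin (IItem role).
struct IItemPlugin {
    virtual ~IItemPlugin() = default;
    virtual bool scoreResponse(const std::string& response) = 0;  // true if correct
};

// One hypothetical item type: scoring is unique to the plugin.
class MultipleChoicePlugin : public IItemPlugin {
    std::string key_;
public:
    explicit MultipleChoicePlugin(std::string key) : key_(std::move(key)) {}
    bool scoreResponse(const std::string& response) override {
        return response == key_;
    }
};

// Driver-side wrapper (the "cItem" role): generic services plus delegation.
class Item {
    std::string name_;
    bool scored_;
    bool presented_ = false;
    std::unique_ptr<IItemPlugin> plugin_;
public:
    Item(std::string name, bool scored, std::unique_ptr<IItemPlugin> p)
        : name_(std::move(name)), scored_(scored), plugin_(std::move(p)) {}
    const std::string& name() const { return name_; }     // generic service
    bool isScored() const { return scored_; }             // generic service
    bool wasPresented() const { return presented_; }      // generic service
    void present() { presented_ = true; }
    bool score(const std::string& response) {             // delegates to plugin
        return plugin_->scoreResponse(response);
    }
};
```

The wrapper keeps item-generic bookkeeping in one driver class while each item type remains free to define its own scoring and presentation.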
In addition to the interfaces described previously, test driver 110 implements IRegistry interface 220, which allows VB code to access the Windows registry. Test driver 110 also implements ILegacyItem interface 258 and ILegacyScore interface 260, which are defined by test driver 110 and are implemented by certain item plugins 156 and scoring plugins 164. ILegacyItem interface 258 and ILegacyScore interface 260 allow old item types that existed in previous test drivers to report results like the previous test drivers. For some tests, test driver 110 must report results for old item types, which had very specific ways of reporting results. ILegacyItem interface 258 and ILegacyScore interface 260 allow the new item plugins 156 that represent old item types to report this legacy format of information to result plugins 166 trying to imitate previous test drivers.
A complete description of test driver 110 classes and interfaces is included in Appendix A.
IV. POLESS
All persistent storages, exam resource file 120 and exam instance file 170, preferably utilize POLESS. POLESS allows data to be embedded, linked, or referenced as external files from the

persistent storage to test driver 110 and Active Document container application 112 (Figure 3). POLESS supports a hierarchical tree structure with node or branch level additions, replacements, and deletions. POLESS also supports optional data encryption at the node level. The type of encryption employed depends on the destination of the information in the persistent storage. For example, different encryption keys may optionally be used for data being routed to test centers, data being routed to administrative data centers, and data being routed for client use (e.g., client review). Microsoft Crypto API is preferably used to perform encryption of data in the persistent storage. Finally, POLESS also supports optional compression at the node level, preferably using Lempel-Ziv compression.
POLESS is an extension of the OLE structured storage compound document implementation. A compound document is a single document that contains a combination of data structures, such as text, graphics, spreadsheets, sound and video clips. The document may embed the additional data types or reference external files by pointers of some kind. There are several benefits to structured storage. Structured storage provides file and data persistence by treating a single file as a structured collection of objects known as storage elements and streams. Another benefit is incremental access. If test driver 110 or plugins 150 need access to an object within a compound file, only that particular object need be loaded and saved, rather than the entire file. Additionally, structured storage supports transaction processing. Test driver 110 or plugins 150 can read or write to compound files in transacted mode, where changes made can subsequently be committed or reverted.
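The transacted mode described above can be sketched with a scratch copy that reaches the committed state only on commit. This models the behavior only; it is not the actual OLE implementation, and the class name is hypothetical.

```cpp
#include <cassert>
#include <map>
#include <string>

// Writes go to a scratch area; commit() folds them into the committed
// state, revert() discards them. Reads see pending changes first.
class TransactedStorage {
    std::map<std::string, std::string> committed_;
    std::map<std::string, std::string> scratch_;
public:
    void write(const std::string& stream, const std::string& data) {
        scratch_[stream] = data;                      // pending change
    }
    std::string read(const std::string& stream) const {
        auto it = scratch_.find(stream);              // pending change visible
        if (it != scratch_.end()) return it->second;
        auto c = committed_.find(stream);
        return c != committed_.end() ? c->second : "";
    }
    void commit() {                                   // reflect changes upward
        for (const auto& kv : scratch_) committed_[kv.first] = kv.second;
        scratch_.clear();
    }
    void revert() { scratch_.clear(); }               // discard since last commit
};
```

This captures the contract the text describes for transacted compound-file access: until commit, changes are private and fully reversible.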
A. POLESS Components
Figure 9 shows the major components that support POLESS and the interfaces that connect the components. POLESS 300 may be either exam resource file 120 or exam instance file 170. POLESS 300 utilizes PKware library component 330 for storage compression and decompression. POLESS 300 uses Crypto API component 332, a Microsoft application, for storage encryption and decryption. Crypto API component 332 relies on a crypto service provider ("CSP") 334 to perform the actual encryption algorithms. Access to the services of these components is facilitated by standard (API) interfaces exposed by these components.
OLE2SS component 310 contains all the interface definitions that make up structured storage. These interfaces can be realized by any structured storage implementation, such as compound document implementation OLE2 320 and POLESS 300. The interfaces include, for example: IStream interface 340; ISequentialStream interface 342; IStorage interface 344; and IRootStorage interface 346. POLESS 300 additionally implements IStreamVB interface 348 and IStorageVB interface 350.
IStreamVB interface 348 supports several functions, for example: ReadVB(); WriteVB(); Clear(); Reset(); get_sName(); get_oStream(); and CopyTo(). ReadVB() reads a specified number of bytes to a data array. WriteVB() writes the byte data to the stream. Clear() clears the stream of all data. Reset() sets the position to the beginning of the stream. get_sName() is a read-only function that returns the name of the stream. get_oStream() is a read-only function that returns IStream interface 340. CopyTo() copies a source stream to a destination stream.

IStorageVB interface 350 supports several functions, for example: Clear(); CommitVB(); RevertVB(); sElementName(); bStorage(); oElement(); CreateStream(); OpenStream(); CreateStorage(); OpenStorage(); get_sName(); get_oStorage(); get_nCount(); GetCompression(); GetEncryption(); GetCRC(); CreateStreamLinked(); CreatePropertyStg(); OpenPropertyStg(); SetClass(); RegisterAlias(); Destroy(); and get_ElementType(). Clear() clears the storage of all elements. CommitVB() causes transacted mode changes to be reflected in the parent. RevertVB() discards changes made since the last commit. sElementName() returns the name of the element. bStorage() returns TRUE if the element is sub-storage. oElement() returns IStreamVB interface 348 or IStorageVB interface 350 for the element. CreateStream() creates and opens a stream and returns IStreamVB interface 348.
OpenStream() opens a stream and returns IStreamVB interface 348. CreateStorage() creates and opens a nested storage and returns IStorageVB interface 350. OpenStorage() opens an existing storage and returns IStorageVB interface 350. get_sName() is a read-only function that returns the name of the storage. get_oStorage() is a read-only function that returns IStorageVB interface 350. get_nCount() is a read-only function that returns a count of the elements. GetCompression() returns the status of file compression. GetEncryption() returns the status of file encryption. GetCRC() returns the status of file CRC checking. CreateStreamLinked() creates and opens a linked stream and returns IStreamVB interface 348. CreatePropertyStg() creates and opens a property storage and returns IPropertyStorageVB interface 414. OpenPropertyStg() opens a property storage and returns IPropertyStorageVB interface 414. SetClass() sets the CLSID for the storage. RegisterAlias() registers a pluggable protocol. Destroy() destroys the specified elements. get_ElementType() is a read-only function that returns the type of the element.
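The storage/stream hierarchy these operations walk can be sketched as a tree in which a storage holds named streams (byte data) and named nested storages. This is a hypothetical model: compression, encryption, and CRC flags are omitted, and the names mirror the operations loosely rather than reproducing the real signatures.

```cpp
#include <cassert>
#include <cstddef>
#include <map>
#include <memory>
#include <string>

// A storage element contains streams and nested storages, as in the
// CreateStream()/CreateStorage()/OpenStream()/get_nCount() operations above.
class Storage {
    std::map<std::string, std::string> streams_;
    std::map<std::string, std::unique_ptr<Storage>> storages_;
public:
    void createStream(const std::string& name, const std::string& data) {
        streams_[name] = data;
    }
    std::string openStream(const std::string& name) const {
        auto it = streams_.find(name);
        return it != streams_.end() ? it->second : "";
    }
    // Creates the nested storage on first use, opens it thereafter.
    Storage& createStorage(const std::string& name) {
        auto& p = storages_[name];
        if (!p) p = std::make_unique<Storage>();
        return *p;
    }
    // Count of elements (streams plus sub-storages), like get_nCount().
    std::size_t count() const { return streams_.size() + storages_.size(); }
};
```

A root storage with a "data" branch, as described earlier for exam resource file 120, then falls out naturally from nesting.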
B. POLESS Classes
Figures 10A and 10B illustrate the main classes of POLESS 300, the interfaces used to implement the classes, and the flow of the creation of streams 424 and storages 426. cFileRoot class 400 is the first object instantiated and is used to create a new, or open an existing, POLESS file. cStorageRoot class 406 is returned, which is a slightly overloaded version of cStorage class 410. cStorageRoot class 406 creates or opens cStream class 408 and cStorage class 410, from which any streams or storages and sub-storages of those can be created or opened, respectively. For instance, cStorage class 410 creates cPropertyStorage class 412, which creates storage for property sets. The classes implement interfaces that perform operations and/or define attributes that further define the function or properties of the class. A complete description of POLESS 300 classes and interfaces is included in Appendix B.
1) cFileRoot Class
cFileRoot class 400 is the root POLESS class and controls the creation and opening of all POLESS files. cFileRoot class 400 is generally instantiated first, before any other POLESS objects can be created, although other sequences are possible. cFileRoot class 400 implements IFileRoot interface

401, which is co-located in Figure 10 with cFileRoot class 400. IFileRoot interface 401 is used to open one file at a time and is not released until all other storage object 426, stream object 424, and property storage interfaces are released and the file is ready to be closed. cFileRoot class 400 and IFileRoot interface 401 support the following operations, for example: StorageFileCreate(); StorageFileOpen(); CryptoGet(); bStorageFile(); StorageAmalgamatedGet(); DeltaFileCreate(); DeltaFileApply(); GetObjectFromPath(); CreateStreamFromFile(); CreateStreamFromBSTR(); MemoryStreamFromStream(); GetPicture(); and SavePicture().
StorageFileCreate() creates a new storage file, returns the root storage interface, and marks the new structured storage file as a POLESS file by storing the class ID ("CLSID") of this class in a stream in the root storage. StorageFileOpen() opens an existing storage file and returns the root storage interface. CryptoGet() gets a default configured crypto class and should be set and used upon the open or create of the storage file. bStorageFile() returns true if the file provided is an OLE structured storage file and not a POLESS storage file. StorageAmalgamatedGet() gets an empty cStorageAmalgamated class 404. DeltaFileCreate() creates a POLESS difference file by comparing the original POLESS file to the updated POLESS file. DeltaFileApply() applies a POLESS delta file to the original POLESS file to create an updated POLESS file. GetObjectFromPath() uses monikers to retrieve the object named by the path and returns a pointer to the object retrieved. CreateStreamFromFile() creates a structured storage stream and populates it with the contents of the file. CreateStreamFromBSTR() creates a structured storage stream and fills it with the specified string. MemoryStreamFromStream() is used to copy a stream to a newly created memory stream object. GetPicture() loads a picture from stream object 424. SavePicture() saves the picture into stream object 424.
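The DeltaFileCreate()/DeltaFileApply() pair can be sketched naively over string-keyed contents: the delta records only streams that changed or were added, and applying it to the original reproduces the update. This is an assumption-laden simplification (deletions are ignored to keep it short), not the POLESS algorithm itself.

```cpp
#include <cassert>
#include <map>
#include <string>

// Stand-in for a POLESS file: stream name -> stream content.
using File = std::map<std::string, std::string>;

// Compare original to updated; keep only changed or new streams.
inline File deltaCreate(const File& original, const File& updated) {
    File delta;
    for (const auto& kv : updated) {
        auto it = original.find(kv.first);
        if (it == original.end() || it->second != kv.second)
            delta[kv.first] = kv.second;
    }
    return delta;
}

// Apply the delta to the original to reconstruct the updated file.
inline File deltaApply(const File& original, const File& delta) {
    File result = original;
    for (const auto& kv : delta) result[kv.first] = kv.second;
    return result;
}
```

The round-trip property is the point: deltaApply(original, deltaCreate(original, updated)) yields the updated file, so only the difference needs to be shipped.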
2) cCrypto Class
cCrypto class 402 controls the configuration of the encryption/decryption of POLESS 300. cCrypto class 402 has the following attributes, for example: sProviderName; eProviderType; sContainerName; and sPassword. SProviderName represents the name of CSP 334 being used to perform the encryption/decryption services. EProviderType is the type of CSP 334. The field of cryptography is large and growing. There are many different standard data formats and protocols. These are generally organized into groups or families, each of which has its own set of data formats and way of doing things. Even if two families used the same algorithm, for example, the RC2 block cipher, they would often use different padding schemes, different key lengths, and different default modes. Crypto API is designed so that a CSP provider type represents a particular family. SContainerName is the key name and must be provided by cCrypto class 402. SPassword is an optional password on the public/private key pair and can only be entered by a human operator. The password can be used for review disks and their resource files.
cCrypto class 402 implements ICrypto interface 403, and together they support the following properties and methods, for example: ProviderName; Password; FileType; Algorithm; EnumProviders(); and

EnumAlgorithms(). Get_ProviderName() returns the name of the Crypto provider. Put_ProviderName() sets the name of the Crypto provider. Get_Password() and Put_Password() are only used for sponsor resource files. Get_FileType() gets the file type and put_FileType() sets the file type. Get_Algorithm() gets the encryption algorithm and put_Algorithm() sets the encryption algorithm. EnumProviders() returns an enumerator for the list of installed providers. EnumAlgorithms() enumerates a list of algorithms for the current provider.
3) cStorageAmalgamated Class
cStorageAmalgamated class 404 is an implementation of IStorage interface 344. cStorageAmalgamated class 404 holds references to an ordered collection of IStorage objects. When a stream is opened, cStorageAmalgamated class 404 searches the collection of storage objects in order to find the first storage object that has the requested stream and returns this stream. cStorageAmalgamated class 404 handles compound storage resolution and delegates all other work to cStorage class 410. cStorageAmalgamated class 404 is, for example, read-only. cStorageAmalgamated class 404 will not allow streams or storages to be created but is primarily for reading exam resource file 120. cStorageAmalgamated class 404 implements IStorageAmalgamated interface 405. cStorageAmalgamated class 404 and IStorageAmalgamated interface 405 support the following operations, for example: StorageAdd(); ClearStorage(); OpenStorageAmalgamated(); and OpenPropertyStgAmalgamated(). StorageAdd() adds a new storage to the collection of storages. ClearStorage() clears all the storage objects from the collection. OpenStorageAmalgamated() opens a sub-storage of the current amalgamated storages in an amalgamated fashion.
OpenPropertyStgAmalgamated() opens a property storage of the current amalgamated storages in an amalgamated fashion. Amalgamation is described in greater detail in the co-pending application filed on the same date, entitled "EXTENSIBLE EXAM LANGUAGE (XXL) PROTOCOL FOR COMPUTER BASED TESTING," incorporated herein by reference.
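The first-match resolution rule described above can be sketched directly: an ordered collection of storages is searched in order, and the first storage that has the requested stream wins. The storages here are simple name/data lists and the class names are hypothetical; the collection is read-only, as the text describes.

```cpp
#include <cassert>
#include <string>
#include <utility>
#include <vector>

// A trivial stand-in for one member storage: ordered (name, data) pairs.
struct SimpleStorage {
    std::vector<std::pair<std::string, std::string>> streams;
    const std::string* find(const std::string& name) const {
        for (const auto& s : streams)
            if (s.first == name) return &s.second;
        return nullptr;
    }
};

// Amalgamated view over an ordered collection of storages.
class AmalgamatedStorage {
    std::vector<const SimpleStorage*> order_;
public:
    void storageAdd(const SimpleStorage* s) { order_.push_back(s); }  // StorageAdd()
    void clearStorage() { order_.clear(); }                           // ClearStorage()
    // Return the stream from the FIRST storage that has it, else "".
    std::string openStream(const std::string& name) const {
        for (const SimpleStorage* s : order_)
            if (const std::string* d = s->find(name)) return *d;
        return "";
    }
};
```

The ordering matters: a stream present in an earlier storage shadows the same-named stream in a later one, which is what makes layered resource files possible.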
4) cStorageRoot Class
cStorageRoot class 406 is the POLESS implementation of IStorage interface 344 and IRootStorage interface 346. cStorageRoot class 406 handles any storage object 426 that is POLESS specific and then delegates work to cStorage class 410. IRootStorage interface 346 supports the SwitchToFile() operation, which copies the current file associated with the storage object to a new file, which is then used for the storage object and any uncommitted changes. cStorageRoot class 406 also implements IPersistFile interface 418, which provides methods that permit an object to be loaded from or saved to a disk file, rather than a storage object or stream. Because the information needed to open a file varies greatly from one application to another, the implementation of IPersistFile::Load on the object preferably also opens its disk file. IPersistFile interface 418 inherits its definition from IPersist, so all implementations must also include the GetClassID() method of IPersist.
5) cStream Class
cStream class 408 is the POLESS implementation of IStream interface 340. cStream class 408 handles any storage object 426 that is POLESS specific and then delegates work to compound document implementation OLE2 320. The specific work includes compression/decompression and encryption/decryption of stream object 424.
IStream interface 340 supports the following operations, for example: Seek(); SetSize(); CopyTo(); Commit(); Revert(); LockRegion(); UnlockRegion(); Stat(); and Clone(). Seek() changes the seek pointer to a new location relative to the beginning of stream object 424, the end of stream object 424, or the current seek pointer. SetSize() changes the size of stream object 424. CopyTo() copies a specified number of bytes from the current seek pointer in stream object 424 to the current seek pointer in another stream object 424. Commit() ensures that any changes made to a stream object 424 open in transacted mode are reflected in the parent storage object. Revert() discards all changes that have been made to a transacted stream since the last call to IStream::Commit. LockRegion() restricts access to a specified range of bytes in stream object 424. Supporting this functionality is optional, since some file systems do not provide this operation. UnlockRegion() removes the access restriction on a range of bytes previously restricted with IStream::LockRegion. Stat() retrieves the STATSTG structure for the stream object 424. Clone() creates a new stream object that references the same bytes as the original stream but provides a separate seek pointer to those bytes.
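The seek-pointer semantics described for Seek(), SetSize(), and CopyTo() can be sketched with a byte buffer and a position, where reads, writes, and copies all operate at the current seek pointer. A std::string stands in for the underlying bytes; the class is an illustrative model, not the COM IStream.

```cpp
#include <algorithm>
#include <cassert>
#include <cstddef>
#include <string>

class Stream {
    std::string bytes_;
    std::size_t pos_ = 0;    // the seek pointer
public:
    void write(const std::string& data) {             // write at seek pointer
        if (pos_ + data.size() > bytes_.size())
            bytes_.resize(pos_ + data.size());
        bytes_.replace(pos_, data.size(), data);
        pos_ += data.size();
    }
    std::string read(std::size_t n) {                 // read at seek pointer
        std::size_t take = std::min(n, bytes_.size() - pos_);
        std::string out = bytes_.substr(pos_, take);
        pos_ += take;
        return out;
    }
    void seek(std::size_t p) { pos_ = std::min(p, bytes_.size()); }      // Seek()
    void setSize(std::size_t n) {                                        // SetSize()
        bytes_.resize(n);
        pos_ = std::min(pos_, n);
    }
    // CopyTo(): n bytes from this stream's pointer to dest's pointer;
    // both pointers advance, matching the description above.
    void copyTo(Stream& dest, std::size_t n) { dest.write(read(n)); }
    std::size_t size() const { return bytes_.size(); }
};
```

Note that copyTo() advances both seek pointers, which is exactly the "current seek pointer to current seek pointer" contract the text describes.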
IStreamVB interface 348 is an automation friendly version of IStream interface 340. IStreamVB interface 348 supports the following operations, for example: Read(); Write(); Clear(); Reset(); get_sName(); get_oStream(); and CopyTo(). Read() reads data from stream object 424. Write() writes data, including the entire byte array, to stream object 424. Clear() clears stream object 424 of all data. Reset() resets the position in stream object 424 to the beginning of stream object 424. Get_sName() returns the name of the stream. Get_oStream() returns the IDispatch interface. CopyTo() copies the contents of a source stream to a destination stream.
6) cStorage Class
cStorage class 410 is the POLESS implementation of IStorage interface 344 and IcStorage interface 411. cStorage class 410 handles any storage object 426 that is POLESS specific and then delegates work to compound document implementation OLE2 320.
IStorage interface 344 supports the following operations, for example: CreateStream(); OpenStream(); CreateStorage(); OpenStorage(); CopyTo(); MoveElementTo(); Commit(); Revert(); EnumElements(); DestroyElement(); RenameElement(); SetElementTimes(); SetClass(); SetStateBits(); and Stat(). CreateStream() creates and opens a stream object 424 with the specified name contained in a storage object. OpenStream() opens an existing stream object 424 within a storage object using specified access permissions. CreateStorage() creates and opens a new storage object 426 within a storage object. OpenStorage() opens an existing storage object 426 with the specified name according to the specified access mode. CopyTo() copies the entire contents of an open storage object 426 into another storage object. The layout of the destination storage object may differ from the layout of the

source storage object. MoveElementTo() copies or moves a sub-storagc or stream object 424 from one storage object 426 to another storage object.
Commit() reflects change*; for a transacted storage object 426 to the parent level. Revert() discards all changes that have been made to the storage object 426 since the last lSlorage::Commit operation. EnumElements() returns an enumerator object that can be u.scd to enumerate storage objects 426 and stream objects 424 contained within a storage object. DestroyElement() removes the specified storage object 426 or stream object 424 from a storage object. RenamcElemenl() renames the specified storage object 426 stream object 424 in a storage object. SetElementTimes() sets the modification, access, and creation limes of the indicated storage element, if supported by the underlying file system. SetClassO assigns the specified CLSID to a storage object. SetStateBits() stores state information in a storage object, for example up to 32 bits. Stat() returns the STATSTG structure for an open storage


programming tools and other applications that support Automation. COM components implement the IDispatch interface to enable access by Automation clients, such as Visual Basic. Get_nCount() returns the count of elements in the storage. GetCompression() determines if streams may be compressed in the file; if enabled, streams may optionally be compressed when created. GetCRC() indicates whether a cyclic-redundancy-check ("CRC"), or a digital signature, check is to be performed on the file. CreateStreamLinked() creates a link to a stream in another POLESS file.
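For illustration only, the hierarchical storage operations described above (named streams and nested sub-storages, with copy, move, rename, destroy, and enumerate) may be sketched as follows. This is a simplified model under stated assumptions, not the compound-document implementation; only streams are moved and renamed here, and access modes and transacted commits are not modeled.

```python
# Sketch of IStorage-style operations over a nested in-memory tree.
# A storage holds named streams (bytes) and named sub-storages.
class Storage:
    def __init__(self):
        self.streams = {}   # name -> bytes
        self.storages = {}  # name -> Storage

    def create_stream(self, name, data=b""):   # CreateStream()
        self.streams[name] = data

    def create_storage(self, name):            # CreateStorage()
        self.storages[name] = Storage()
        return self.storages[name]

    def open_storage(self, name):              # OpenStorage()
        return self.storages[name]

    def copy_to(self, dest):                   # CopyTo(): deep copy of contents
        dest.streams.update(self.streams)
        for name, sub in self.storages.items():
            sub.copy_to(dest.create_storage(name))

    def move_element_to(self, name, dest):     # MoveElementTo() (streams only here)
        dest.streams[name] = self.streams.pop(name)

    def destroy_element(self, name):           # DestroyElement()
        self.streams.pop(name, None)
        self.storages.pop(name, None)

    def rename_element(self, old, new):        # RenameElement() (streams only here)
        if old in self.streams:
            self.streams[new] = self.streams.pop(old)

    def enum_elements(self):                   # EnumElements()
        return sorted(self.streams) + sorted(self.storages)
```

As the text notes for CopyTo(), the destination's layout need not match the source's: the deep copy merges into whatever elements the destination already holds.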


property. WriteMultiple() writes property values in a property set. ReadMultiple() reads property values in a property set.
8) cPropertyStorageAmalgamated Class
cPropertyStorageAmalgamated class 416 implements IPropertyStorageAmalgamated interface 417, which supports the following operations, for example: PropertyStorageAdd() and ClearStorage(). PropertyStorageAdd() adds a property set to the collection of property sets. ClearStorage() clears the collection of property sets.
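For illustration only, the property-set behavior described above (WriteMultiple()/ReadMultiple() acting on several values at once, and an amalgamated class holding a clearable collection of property sets) may be sketched as follows; the Python names are illustrative, not the COM class layout.

```python
# Sketch of a property set and an amalgamated collection of sets,
# mirroring WriteMultiple/ReadMultiple and PropertyStorageAdd/ClearStorage.
class PropertySet:
    def __init__(self):
        self._props = {}

    def write_multiple(self, pairs):   # WriteMultiple(): write several values
        self._props.update(pairs)

    def read_multiple(self, keys):     # ReadMultiple(): read several values
        return [self._props[k] for k in keys]

class PropertyStorageAmalgamated:
    def __init__(self):
        self._sets = []

    def property_storage_add(self, pset):  # PropertyStorageAdd()
        self._sets.append(pset)

    def clear_storage(self):               # ClearStorage()
        self._sets.clear()
```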
C. POLESS Exam Resource File
Figures 11 and 12-25 illustrate the POLESS layout of exam resource file 120 according to the present invention. Exam resource file 120 stores the various pieces of compiled information from exam source files 130, as shown in Figure 5. Exam resource file 120 contains all of the content required to deliver the test. However, where the test is media-intense, exam resource file 120 will contain the core elements for the test with "links" to the external content. XXL compiler 140 and plugins 150 store the compiled information to exam resource file 120 using one of IPersistResourceStream interface 192a, IPersistResourceSet interface 192b, or IPersistResourceStore interface 192c to store the compiled information as a stream of data, a set of data, or a storage element, respectively. In a preferred embodiment, the layout of exam resource file 120 is in a hierarchical POLESS format that directly implements the format of the XXL test definition language. The test developer uses the XXL test definition language to create the logic files 230 and data files 212 (Figure 5) of exam source file 130. By having a storage structure that follows the format of the XXL test definition language, the incremental access aspect of POLESS is easily implemented. XXL compiler 140 determines the storage location in exam resource file 120 that stores a particular piece of compiled information, even information stored into exam resource file 120 by one of plugins 150.
Figure 11 illustrates the main storage branches of exam resource file 120, which correspond to the top-level elements of the XXL test definition language, denoted by reference numeral 500. The main storage branches of exam resource file 120 are, for example: exams branch 550; forms branch 600; items branch 650; category branch 700; templates branch 750; sections branch 800; groups branch 850; plugins branch 900; data branch 950; formGroups branch 1000; attributes branch 1050; scripts branch 1100; and message box ("Msgbox") branch 1150. Other storage branches may alternatively be used.
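For illustration only, the top-level layout just listed may be sketched as a nested mapping, with each branch keyed by element name; the branch names come from the text, while the mapping structure and contents are illustrative placeholders, not the POLESS file format itself.

```python
# The thirteen main storage branches of the resource file, per Figure 11.
RESOURCE_FILE_BRANCHES = [
    "exams", "forms", "items", "category", "templates", "sections",
    "groups", "plugins", "data", "formGroups", "attributes",
    "scripts", "msgbox",
]

def make_resource_file():
    # each branch maps named test elements to their attribute storages
    return {branch: {} for branch in RESOURCE_FILE_BRANCHES}
```

Because the layout mirrors the XXL language's top-level elements, compiled content for, say, an item lands under the "items" branch keyed by the item's name, which is what makes incremental access straightforward.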
Exam branch 550, as seen in Figure 12, stores, for example, the primary attributes, properties, and data that govern the test. Exam branch 550 can store information for various tests, as is denoted by the three, vertical ellipses. A specific test is identified by the data stored in name attribute storage 552. Again, the various tests may each be identified by a different name, as denoted by the solid border around name attribute storage 552, or other identification scheme. Attributes storage 554 stores, for example, version information 555 and title information 556 of the test as a stream of data or other data storage format. Title information 556 is optional, as is denoted by the broken border. Any optional,

customized information regarding the test is stored in custom properties 558 as a property storage or other data storage format. Information relating to the forms of the test is optionally stored in forms property storage 560. A form is a fixed or substantially fixed order of testing events. Many different forms can be stored in forms storage 560, giving flexibility to test driver 110 in controlling progression of the test. FormGroups storage 562 optionally stores information relating to a collection of exam forms as a stream of data or other data storage format. Preferably, a single form from the formGroup is chosen to deliver to an examinee. The selection of the form from the group is performed by a selection plugin 160. Exam branch 550 preferably contains at least one forms storage 560, either independently or within formGroups storage 562. Other information relating to the test may be stored under exam branch 550. Other storage formats may optionally be used.
Forms branch 600, as seen in Figures 13A and 13B, stores, for example, the primary attributes, properties, and data that govern the progress of the test. Forms branch 600 can store information for various forms, as is denoted by the three, vertical ellipses. As described previously, a form is a fixed or substantially fixed order of testing events. A single form is identified by the data stored in name attribute storage 602. Other identification formats may optionally be used. Again, the various forms may each be identified, for example, by a different name, as denoted by the solid border around name attribute storage 602. Attribute storage 604 stores, for example, begin section information 605, end section information 606, and event information 607, and optionally stores version information 608, title information 609, skip allowed information 610, restartable information 611, width information 612, height information 613, and bit depth information 614. All information stored in attribute storage 604 is stored as a stream of data or other data storage format. Begin section information 605 and end section information 606 indicate, for example, which section of the test begins and ends the test, respectively.
Event information 607 indicates, for example, the order of events of the test for that form. Each event has a name and is prefixed with an event type and a colon. Other formats are optional. The event type includes "section", "report", and "results". Version information 608 and title information 609 indicate the version and title of the form, respectively. Skip allowed information 610 indicates, for example, whether or not by default skipping of sections is allowed. Restartable information 611 indicates, for example, whether the form can be restarted. Any optional, customized information regarding the form is stored in custom storage 616 as a property set or other data storage format. Timer storage 628 stores, for example, information relating to how the form is to be timed as a storage element. Attributes storage 630 stores, for example, the name of timer plugin 158 to be used with the form. Plugin data storage 632 and plugin data storage 633 store any data necessary for timer plugin 158 as a storage element and a stream of data, respectively. Plugin data storage 632 and plugin data storage 633 are optional. Scoring storage 634 stores, for example, information relating to the scoring of the form. Attributes storage 636 stores, for example, the name of scoring plugin 164 to be used with the

form. Plugin data 638 and plugin data 639 optionally store any data needed for scoring plugin 164 as a storage element and a stream of data, respectively.
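For illustration only, the event encoding described above (each event in a form's event list is a name prefixed with its type and a colon, the type being one of "section", "report", or "results") may be sketched as a small parser; the function name is illustrative, not part of the specification.

```python
# Sketch of parsing a form's typed event names, e.g. "section:math1".
EVENT_TYPES = {"section", "report", "results"}

def parse_event(event):
    """Split 'type:name' into (type, name), rejecting unknown types."""
    etype, _, name = event.partition(":")
    if etype not in EVENT_TYPES or not name:
        raise ValueError(f"malformed event: {event!r}")
    return etype, name
```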
Items branch 650, as seen in Figure 14, stores, for example, the primary attributes, properties, and data that govern the items, or test questions, to be delivered to the examinee during the test. Items branch 650 can store information for various items, as is denoted by the three, vertical ellipses. A single item is identified by the data stored in name attributes storage 652. Again, the various items may each be identified by a different name, as denoted by the solid border around name attributes storage 652. Attributes storage 654 stores, for example, weight information 654 and scored information 655, and optionally stores skip allowed information 656, title information 657, start information 658, finish information 659, and condition information 660. Weight information 654 indicates, for example, a value used for judging and scoring the item. By default, an item is given a weight of one in accordance with one embodiment, but other values may be utilized. Scored information 655 indicates, for example, whether or not the item is scored, as opposed to whether the item is being used as an example. The default of scored information 655 is true. Skip allowed information 656 indicates, for example, whether the examinee can skip the item without answering.
Start information 658 indicates, for example, script execution at the beginning of the item, and finish information 659 indicates, for example, script execution at the end of the item. Condition information 660 indicates, for example, whether or not there is a condition on the item being delivered to the examinee. The information stored in attributes storage 654 is stored as a stream of data or other data storage format. Data storage 662 and data stream 664 store any information regarding the properties of the item. For example, data storage 662 or data stream 664 can store the correct answer of a multiple choice item. Data storage 662 and data stream 664 store the information as a storage element and a stream of data, respectively.
Any optional, customized information regarding the item is stored in custom storage 666 as a stream of data or other data storage format. Category storage 668 stores, for example, information relating to each category to which the item belongs. The information stored in category storage 668 preferably, though optionally, is redundant, as category branch 700 stores, for example, all the items within the specific categories. The reason for the optional redundancy is so that test driver 110 can quickly look up the category of any item.
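For illustration only, the redundancy just described (each item records its categories under the items branch, while each category also lists its items under the category branch) can be sketched as a pair of indexes built from one source of truth; the function and parameter names are illustrative.

```python
# Sketch of the item<->category cross-indexing that lets the driver
# look up an item's categories, or a category's items, directly.
def build_indexes(item_categories):
    """item_categories: mapping of item name -> list of category names.

    Returns (item -> categories, category -> items): the forward map
    as given, plus the redundant reverse map kept for fast lookup.
    """
    category_items = {}
    for item, cats in item_categories.items():
        for cat in cats:
            category_items.setdefault(cat, []).append(item)
    return item_categories, category_items
```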
Category branch 700, as seen in Figure 15, stores, for example, the primary attributes, properties, and data that govern the test categories. A test category provides a grouping mechanism, which is independent of delivery of the test, allowing for exotic reporting and scoring if necessary. Category branch 700 is optional, as denoted by the broken border. Category branch 700 can store information for various categories, as is denoted by the three, vertical ellipses. A single category is identified by the data stored in name attributes storage 702. Again, the various categories may each be identified by a different name, as denoted by the solid border around name attributes storage 702. Attributes storage 704 stores, for example, complete information 705, duplicates information 706,

contents information 707, and optionally stores, for example, description information 708. Complete information 705 indicates, for example, whether or not every item in the category must appear within the category or within its subcategories. Duplicates information 706 indicates, for example, whether the item can appear more than once within the category or within the subcategories. Contents information 707 determines what can exist within a category.

Description information 708 is used within the category to contain a description of the category's contents. Category storage 710 stores, for example, information relating to any subcategories under the category identified in name attribute storage 702. Items storage 712 indicates, for example, any items that exist within the category. Sections storage 714 contains information indicating any sections that exist within the category. Scoring storage 716 contains information relating to the scoring of the items within the category. Attributes storage 718 stores, for example, the name of the scoring plugin to be used with the item. Data storage 720 and data stream 722 contain the information needed to initialize scoring plugin 164. Data storage 720 and data stream 722 store the information as a storage element and a stream of data, respectively.
Templates branch 750, as seen in Figure 16, stores, for example, the primary attributes, properties, and data that govern the templates used in the test. Templates branch 750 can store information for various main templates, as is denoted by the three, vertical ellipses. A single main template is identified by the data stored in name attributes storage 752. Again, the various templates may each be identified by a different name, as denoted by the solid border around name attributes storage 752. Attributes storage 754 stores, for example, split information 756 and order information 757, and optionally stores size information 759. Split information 756 defines how a specific area within the template is to be split or separated, for example, either by rows or columns or other shapes and/or sizes. Size information 759 indicates, for example, possible values for describing the size of the template, for example, pixels, percentages, or HTML syntax. Template storage 760 stores, for example, information relating to any sub-templates to be used under the template specified by the information in name attributes storage 752. Sub-templates are identified by the information in name attributes storage 762. Many sub-templates 760 can exist, as denoted by the three vertical ellipses.
Areas storage 764 indicates, for example, information relating to the areas used within the template denoted by the information in name attributes storage 752. Many areas may exist within a template, as denoted by the three vertical ellipses. Each area is identified by the information stored in name attribute storage 766. Attribute storage 768 stores, for example, visible plugin name information 769, size information 770, and allow more information 771. Plugin name information 769 indicates, for example, the name of the visible plugin to be used with the area. Size information 770 indicates, for example, the size of the area, as for example a pixel value, a percentage value, or HTML syntax. Plugin data 772 and plugin data 774 store information relating to the visible plugin to be used in the area. The data stored in either plugin data storage 772 or plugin data stream 774 is executed by the visible plugin when the template is loaded. Plugin data storage 772 and plugin data stream 774 store, for example, the information as either a storage element or a stream of data, respectively. Other information may optionally be stored.
Sections branch 800, as seen in Figure 17, stores, for example, the primary attributes, properties, and data that govern test sections. Test sections dictate the navigation and timing of groups of items as well as displays within the test. Sections branch 800 can store information for various sections, as is denoted by the three, vertical ellipses. A single section is identified by the data stored in name attribute storage 802. Again, the various sections may each be identified by a different name, as noted by the solid border around name attributes storage 802. Attributes storage 804 stores, for example, group information 805 and optionally stores title information 806, skip allowed information 807, start information 808, finish information 809, and condition information 810. Group information 805 indicates, for example, to which group of the test the section belongs. Skip allowed information 807 indicates, for example, whether or not the items within the section may be skipped. Start information 808 indicates, for example, script execution at the beginning of the section, and finish information 809 indicates, for example, script execution at the end of the section. Condition information 810 indicates, for example, any conditions that exist regarding the section. Any optional, customized information regarding this section is stored in custom property storage 812 as a stream of data or other data storage format. Custom attributes will be stored as a property set. The "key" for each attribute will be a string or other acceptable format.
Timer storage 814 stores information regarding, for example, the timing of the section. Attribute storage 816 stores, for example, information identifying timer plugin 158, which is to be used with a section. Plugin data storage 818 and plugin data storage 820 store, for example, data needed for timer plugin 158. Plugin data storage 818 and plugin data storage 820 store, for example, information as a storage element and a stream of data, or other acceptable format, respectively. Navigation storage 822 stores, for example, information relating to the delivery of presentations and groups within the section. Attributes storage 824 stores, for example, information indicating which navigation plugin 162 is to be used with this section. Plugin data storage 826 and plugin data stream 828 store information needed for navigation plugin 162. Plugin data storage 826 and plugin data stream 828 store the information as a storage element and a stream of data, respectively. Groups branch 850, as seen in Figure 18, stores, for example, the primary attributes, properties, and data that govern the groups within the test. A group determines the order of events within the test. Groups branch 850 can store information for various groups, as is denoted by the three, vertical ellipses. A single group is identified by the data stored in name attributes storage 852. The various groups may each be identified by a different name, as noted by the solid border around name attributes storage 852. Attributes storage 854 stores, for example, type information 855, event information 856, title information 857, and review name information 858. Type information 855 indicates, for example, whether the group is either a "group holder" (group of presentations) or a "section holder" (group of sub-sections). These are mutually exclusive.

Event information 856 indicates, for example, the order of events within the test. Review name information 858 indicates, for example, whether or not a presentation within the group is to be used as a review screen. Any optional, customized information regarding the group is stored in custom storage 860 as a stream of data or other data storage format. Events storage 862 stores event information, for example, as is described in further detail in Figure 19. Scoring storage 864 stores, for example, information relating to the scoring of items within the group. Attributes storage 866 stores, for example, information indicating which scoring plugin 164 is to be used with the group. Selection storage 872 stores, for example, information relating to the selection of items within the group. Attributes storage 874 indicates, for example, which selection plugin 160 is to be used with the group.
Figures 19A, 19B, 19C, and 19D illustrate the events sub-branch of groups branch 850 in greater detail, in accordance with one embodiment of the invention. In Figure 19A, events sub-branch 862 can store information for various events. For example, events sub-branch 862 is storing information in event name sub-branch 880, event name sub-branch 890, and event name sub-branch 897. Attributes storage 881, in Figure 19B, under event name storage 880 stores, for example, type information 882 and template information 883, and optionally stores title information 884, counted information 885, start information 886, finish information 887, and condition information 888. Type information 882 indicates, for example, whether the event is an item or a display. Template information 883 indicates, for example, which template is being used with the event. Counted information 885 indicates, for example, whether a presentation should be included in the totals of presentations presented to the examinee in a section. Generally, presentations with items, or questions, are counted and introductory presentations are not counted.
Start information 886, finish information 887, and condition information 888 indicate, for example, start, finish, and conditional scripts, respectively. Any optional, customized information regarding the event is stored in custom storage 889. The "key" for each custom attribute will be a string. Referring again to Figure 19A, event name storage 890 indicates, for example, a different event, which contains different attributes. Additionally, area information 891, in Figure 19B, indicates, for example, which area is rendering the presentation's content, and item information 892 indicates, for example, the name of the associated item if the event is of the item type. Additionally, data storage 893, data stream 894, data storage 895, and data storage 896 contain information used in a nested presentation. The data of a nested presentation are the contents of the item or the presentation. This data may be a stream, a storage, a link to a stream, a link to a storage, or other format. In Figure 19C, event name 897 indicates, for example, another event, which includes a sub-event 898, in Figure 19D.
Plugins branch 900, as seen in Figure 20, stores, for example, the primary attributes, properties, and data that govern any plugins 150 used for the test. Plugins branch 900 can store information for various plugins, as is denoted by the three, vertical ellipses. A single plugin is identified by the data stored in name attribute storage 902. A CLSID is stamped with the name of the plugin 150. Attributes storage 904 stores, for example, information identifying the plugin 150 by a program ID. Data storage 906 and data storage 908 store initial data for the plugin as either a storage element or a stream of data, respectively.
Data branch 950, as indicated in Figure 21, stores, for example, any global data needed for the test. Data stored optionally under data branch 950 may be stored as either a storage element or a stream of data, as indicated by data storage 952 and data storage 954. Data stored under data branch 950 may be directly used by a plugin 150, or the data may be resources (.gif, .jpeg, .wav, .mpeg, etc.) used internally by a plugin 150.
FormGroups branch 1000, as seen in Figure 22, stores, for example, the primary attributes, properties, and data that govern the formGroups of the test. FormGroups branch 1000 can store information for various formGroups, as is denoted by the three, vertical ellipses. A single formGroup is identified by the data stored in name attributes storage 1002. The various formGroups may each be identified by a different name, as denoted by the solid border around name attributes storage 1002. Attributes storage 1004 stores, for example, information indicating which forms are to be used within the formGroup. Selections storage 1006 stores, for example, information relating to the selection of items within the formGroup. Attributes storage 1008 indicates, for example, which selection plugin 160 is to be used with the formGroup.


Running branch 1202 stores, for example, the state information of all running objects in test driver 110 and plugins 150. Plugins 150 use one of IPersistInstanceStream interface 196a, IPersistInstanceSet interface 196b, or IPersistInstanceStore interface 196c to store information to exam instance file 170 as a stream of data, a set of data, or a store of data, respectively. Any of plugins 150, except display plugin 152, results plugin 166, report plugin 168, and help plugin 154, which do not contain examination state information, store examination state information to exam instance file 170. Test driver 110 determines the storage location in exam instance file 170 that stores a particular piece of examination state information.
Exam sub-branch 1204 contains examination state information relating to the exam. Contents storage 1206 stores, for example, exam status information 1207 and version information 1208. Exam status information 1207 indicates, for example, the status of the exam, for example, initializing or terminating. Template storage branch 1210 stores, for example, examination state information relating to templates running in the exam. Name attribute storage 1212 stores, for example, count information 1214 and observed ever information 1215. Observed ever information 1215 indicates, for example, whether or not the template's content has ever been fully seen by the examinee.
Form storage branch 1216 contains information relating to the forms used within the exam. Contents storage branch 1218 stores, for example, seconds information 1219, date start information 1220, date finish information 1221, current section information 1222, and version information 1223. Current section information 1222 indicates, for example, the current section being delivered to the examinee in the form. Version information 1223 indicates, for example, the identification of the form.
Sections chosen storage branch 1224, as illustrated in Figure 26B, stores, for example, information relating to sections in the form being delivered to the examinee. Contents storage 1226 stores, for example, the names of the sections that have been or will be delivered to the examinee. Name attribute storage 1228 indicates, for example, the name of a particular section. Contents storage 1230 stores, for example, current child information 1231, seconds information 1232, date start information 1233, and date finish information 1234. Navigation storage 1236 and navigation storage 1237 store the state information of navigation plugin 162. Navigation storage 1236 stores, for example, the examination state information from navigation plugin 162 if navigation plugin 162 implements IPersistInterfaceSet 196b or IPersistInterfaceStore 196c. Navigation storage 1237 stores, for example, the information from navigation plugin 162 if navigation plugin 162 implements IPersistInterfaceStream 196a. Timers storage 1238 and timers storage 1239 store information from timer plugin 158. Timers storage 1238 is used if timer plugin 158 implements IPersistInterfaceSet 196b or IPersistInterfaceStore 196c. Timers storage 1239 is used if timer plugin 158 uses IPersistInterfaceStream 196a.
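For illustration only, the recurring pattern in these paragraphs (each plugin's state is persisted to one of two sibling storages, chosen by which persistence interface the plugin implements) may be sketched as follows. The attribute test and slot names are illustrative assumptions, not the COM interface-negotiation mechanism of test driver 110.

```python
# Sketch of routing a plugin's saved state to the correct storage slot:
# stream-based persistence goes to one slot, set/store-based to the other.
def storage_for(plugin, stream_slot, set_or_store_slot):
    # a real driver would QueryInterface for IPersistInterfaceStream etc.;
    # here a boolean flag stands in for that check
    if getattr(plugin, "implements_stream", False):
        return stream_slot          # e.g. navigation storage 1237
    return set_or_store_slot        # e.g. navigation storage 1236
```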
Items chosen sub-branch storage 1240 stores, for example, information relating to items that have been or will be delivered to the examinee. Contents storage branch 1242 stores, for example, the names and order of all the items that have been or will be delivered to the examinee. Name attributes storage 1244 indicates, for example, the identification of a particular item. Contents storage branch 1246 stores, for example, presented information 1247, complete information 1248, skipped information 1249, seconds information 1250, dehydrated information 1251, and observed ever information 1252. Presented information 1247 indicates, for example, whether the item has ever been delivered to the examinee. Completed information 1248 indicates, for example, whether or not the item has been completed. Skipped information 1249 indicates, for example, whether the item has been skipped. Item plugin storage 1254 and item plugin storage 1255 store, for example, examination state information from item plugin 156. Item plugin storage 1254 is used if item plugin 156 uses IPersistInterfaceSet 196b or IPersistInterfaceStore 196c. Item plugin storage 1255 is used if item plugin 156 uses IPersistInterfaceStream 196a.
In Figure 26C, item light storage 1256 exists only if the item was dehydrated (to save memory or when a section ends). The dehydrated item stores the data, but actions on the data are no longer available until the item is re-hydrated. Item light storage 1256 stores, for example, score candidate information 1257, score minimum information 1258, score nominal information 1259, score maximum information 1260, complete information 1261, skipped information 1262, correct answer display 1263, response results 1264, and correct answer results 1266. Timers storage 1268 and timers storage 1269 store information from timer plugin 158. Timers storage 1268, as seen in Figure 26B, is used if timer plugin 158 implements IPersistInterfaceSet 196b or IPersistInterfaceStore 196c. Timers storage 1269 is used if timer plugin 158 uses IPersistInterfaceStream 196a. Score storage 1270 and score storage 1271 store information from score plugin 164. Score storage 1270 is used if score plugin 164 implements IPersistInterfaceSet 196b or IPersistInterfaceStore 196c. Score storage 1271 is used if score plugin 164 uses IPersistInterfaceStream 196a.
In Figure 26C, groups chosen sub-branch storage 1272 indicates, for example, which groups have been or will be delivered to the examinee. Contents storage 1274 stores, for example, the names of the groups. Name attribules storage 1276 indicates, for example, the name of a panicular group. Contents storage 1278 stores, for example, names of groups and the order of groups. Scoring storage 1280 and scoring storage 1281 store examination state information from score plugin 164. Scoring storage 1280 is used if score plugin 164 implements IPersistlntcrfaceSet 196b or IPersistlnterfaceStore 196c. Scoring storage information 1281 is used if score plugin 164 implements IPersistlnterfaceStream 196a. Selection storage 1282 and selection storage 1283 store information from selection plugin 160. Selection storage 1282 is u.sed if selection plugin 160 implements IPersistlnterfaceSel 196b or IPersistlnterfaceStore 196c. Selection storage 1283 is used if selection plugin 160 implements IPersistlnterfaceStream 196a. Delivered storage 1284, in Figure 26D, stores, for example, an ordered list of groups chosen for delivery. Delivered storage 1285 stores, for example, an ordered list of the sub-clas.ses of the form, for example: sections, reports and results.
Presentations chosen storage sub-branch 1286 indicates, for example, any presentations that have been or will be delivered to the examinee. Contents storage 1288 stores, for example, the names

of the presentations. Names storage sub-branch 1290 stores, for example, the name of the presentation. Names storage 1290 also stores, for example, comment information 1291, marked information 1292, count information 1293, name information 1294, observed ever information 1295, name information 1296, and observed ever information 1297. Name information 1294 and observed ever information 1295 relate to the name of the first presentation area stored under presentations chosen sub-branch 1286 and whether or not the presentation has ever been observed, and name information 1296 indicates, for example, the last presentation area that was delivered to the examinee and whether or not the presentation was ever observed. Contents storage 1298 stores, for example, information relating to events. Contents storage 1298 stores, for example, ready information 1299, ever checked information 1300, ever started information 1301, and ever finished information 1302. Ready information 1299 indicates, for example, whether the event is ready to be delivered to the examinee. Ever checked information 1300 indicates, for example, whether an event's conditional delivery script has ever been checked. Preferably, the conditional delivery script is only checked once. Ever started information 1301 indicates, for example, whether the event was ever started by the examinee. Ever finished information 1302 indicates, for example, whether the event was completed by the examinee.
Referring again to Figure 26A, contents branch 1310 stores, for example, a property set containing information to identify the examination instance and the examination start count 1312. The identifying information used is the examinee appointment identification 1311, the name 1313 of exam resource file 120, and the name 1314 of the specified form or group.
History branch 1320 is a single stream of chronological text messages that logs the history of the test. These text messages are used by staff at system headquarters to diagnose problems that occurred in the field. Each text message is prefixed with the date, time, and a level of severity, for example: information, warning, or error. Test driver 110 will filter the text messages to a level of diagnostics desired for test driver 110, such as determining errors in test driver 110 or detailed history tracking, including general information.
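The history branch described above can be pictured with a short sketch; the class and method names are illustrative, not the test driver's actual logging API.

```python
from datetime import datetime

# Illustrative severity levels, ordered least to most severe.
INFO, WARNING, ERROR = 0, 1, 2
LEVEL_NAMES = {INFO: "information", WARNING: "warning", ERROR: "error"}

class HistoryLog:
    """A single chronological stream of severity-prefixed text messages."""
    def __init__(self):
        self.messages = []

    def log(self, level, text):
        # Each message is prefixed with date, time, and a level of severity.
        stamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
        self.messages.append((level, f"{stamp} {LEVEL_NAMES[level]}: {text}"))

    def filtered(self, min_level):
        # The driver filters messages to the level of diagnostics desired.
        return [msg for level, msg in self.messages if level >= min_level]

log = HistoryLog()
log.log(INFO, "presentation delivered")
log.log(ERROR, "resource file missing")
```

Filtering at the error level would keep only the second message; filtering at the information level keeps both, matching the "detail history tracking" mode.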
V. Expansion of Test Driver Using Plugins
Figure 27 illustrates the process for customizing a test based on specific requirements from the client using plugins 150, denoted generally by reference numeral 1400. First, the client presents the new requirements, for example, a new item type, to the test developer, step 1402. The test developer then writes an XML schema to define the XXL test specification, step 1404. The schema is subsequently used to validate the XXL test specification. An example of the XXL schema is as follows:
<!-- [linear_navigate-schema.xml] -->








The validation of the test specification and content is illustrated in greater detail in Figure 29, by the method denoted generally by reference numeral 1512. When the test specification and content stored in exam source files 130 specifically references a plugin 150, that plugin 150 is instantiated, step 1514. Partial test specification and content relating to that plugin 150 are loaded into the plugin 150 from exam source files 130, step 1516. In an alternative embodiment, the partial test specification and content are loaded into a private memory in data communication with the plugin 150. The plugin 150 validates the partial test specification and content, step 1518. The validated test specification and content are then unloaded from the plugin 150 into a storage element within exam resource file 120.
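The compile-time cycle of steps 1514 through 1518 can be sketched as follows; the registry, class, and method names are hypothetical stand-ins for the XXL compiler's COM-based mechanism.

```python
class ItemPlugin:
    """Stand-in for an item plugin 150 that validates its slice of the spec."""
    def __init__(self):
        self.data = None

    def load(self, partial_spec):
        # Step 1516: partial test specification loaded into the plugin.
        self.data = partial_spec

    def validate(self):
        # Step 1518: the plugin validates its own partial specification.
        if "type" not in self.data:
            raise ValueError("invalid partial test specification")

    def unload_to(self, resource_file):
        # Validated content is unloaded into the exam resource file.
        resource_file[self.data["type"]] = self.data

# Hypothetical instantiation table; the real compiler uses CoCreateInstance().
PLUGIN_REGISTRY = {"multichoice": ItemPlugin}

def compile_source(exam_source, resource_file):
    for element in exam_source:
        plugin = PLUGIN_REGISTRY[element["type"]]()  # step 1514: instantiate
        plugin.load(element)
        plugin.validate()
        plugin.unload_to(resource_file)

resource = {}
compile_source([{"type": "multichoice", "choices": ["a", "b"]}], resource)
```

The point of the round trip is that the compiler never needs to understand the plugin's data; it only brokers the load, validate, and unload calls.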
Figure 30 illustrates the method of the test delivery cycle in greater detail. When the previously validated test specification and content stored in exam resource file 120 references a plugin 150, the plugin 150 is instantiated, step 1525. The storage element in exam resource file 120 containing the validated test specification and content is provided to the plugin 150, step 1527. The validated test specification and content are loaded into the plugin 150 from the storage element within exam resource file 120, step 1529. Finally, the examination state information, which includes, for example, the examinee's responses, is stored into exam instance file 170, step 1533.





The ten plugins defined in the previous example represent eight different types of plugins 150. Not all of the possible types of plugins 150 are required to build any one test. Also, more than one plugin 150 may be implemented for a specific type; in the above example, two navigation plugins 162 and two item plugins 156 are defined. XXL compiler 140 reads information from exam source files 130 using IStream interface 340, INode interface 1424, which is the Microsoft interface used to access a node of an XML document in the document object model ("DOM"), and IStreamVB interface 348. XXL compiler 140 instantiates the requested plugin 150 using, for example, the call CoCreateInstance(). CoCreateInstance() creates a single, uninitialized object of the class associated with a specified CLSID, using a ProgID that has been converted into the CLSID.
If the data referring to plugin 150 has been customized by the test developer, XXL compiler 140 may not recognize the new data. Therefore, XXL compiler 140 passes the data directly to plugin 150 and plugin 150 loads the data into a private memory (not shown). In one embodiment, the private memory is internal to plugin 150, and in another embodiment, the private memory is external to plugin 150. Plugin 150 can then validate the data using the XXL schema. If the data is invalid, plugin 150 reports the error. In an alternative embodiment, plugin 150 can validate the data using an XML document type definition ("DTD"). A DTD is a formal description in XML Declaration Syntax of a particular type of document. Similar to a schema, a DTD sets out what names are to be used for the different types of elements, where they may occur, and how they all fit together. However, the XXL schema is preferred for validation since schemas are easier to read than a DTD and are very flexible.
If plugin 150 declares that the data is valid, XXL compiler 140 prepares a POLESS storage object 300 in exam resource file 120 to which plugin 150 saves the data at a command from XXL compiler 140, in step II. As described previously, XXL compiler 140 determines where the data from plugin 150 is to be saved in exam resource file 120 and creates the appropriate storage location. The name, CLSID, and data associated with plugin 150 are stored in plugins branch 900 in exam resource file 120 (Figure 20). Plugin 150 implements IPersistResource interface 192 to store the data to exam resource file 120. Data storage 906 stores, for example, the data as either a stream, a set of data, or a storage element if plugin 150 implements IPersistResourceStream interface 192a, IPersistResourceSet interface 192b, or IPersistResourceStore interface 192c, respectively. Data storage 908 stores, for example, the data as a stream of data if plugin 150 implements IPersistResourceStream


incorporated herein by reference.
Referring again to Figure 32, during the test delivery cycle, test driver 110 reads the test specifications stored in exam resource file 120 through POLESS objects 300. Test driver 110 reads


Step V occurs if the test is interrupted, for example, because of a power failure, and the test needs to restart. When test driver 110 is required to return to a particular operational state, test driver 110 reads the examination state information from exam instance file 170. Plugin 150 is provided the storage object containing the state of plugin 150 as saved in step IV using IPersistInstance interface 196. Using the previous example, item plugin 156 retrieves its state information from item plugin storage 1254 or item plugin storage 1255. Plugin 150 is able to become operational from the retrieved state information, enabling a restart of the test from the point at which the test was interrupted.
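The save-and-restore path of steps IV and V can be sketched as follows, with plain dictionaries standing in for exam instance file 170 and the IPersistInstance storage objects; the function names are illustrative.

```python
def save_state(instance_file, plugin_name, state):
    # Step IV: the plugin's examination state is saved to the instance file.
    instance_file[plugin_name] = dict(state)

def restore_plugin(instance_file, plugin_name):
    # Step V: after an interruption, the plugin is handed back its saved
    # state and becomes operational from it, enabling a restart of the test
    # from the point at which it was interrupted.
    return dict(instance_file.get(plugin_name, {}))

instance = {}
save_state(instance, "item-plugin", {"response": "b", "visited": True})
# ...power failure and restart...
state = restore_plugin(instance, "item-plugin")
```

Because each plugin owns its own storage object, the driver can restore every plugin independently without understanding the contents of any of them.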
The delivery sequence of a plugin 150, as shown in steps II, IV, and V in Figure 32, is illustrated in greater detail in Figures 34A, 34B, 34C, and 34D. As seen in Figure 34A, delivery sequence 1520 particularly relates to visible plugins 150, e.g., display plugin 152, helm plugin 154, and


After plugin 150 is properly loaded, cTemplate class 236 in test driver 110 uses PresentationStarting() call 1532 (continued in Figure 34B) to inform visible plugin 150 that the presentation is starting, in step IIId. The PresentationStarting() call 1532 is made to any visible plugin 150 being used in the presentation on the appropriate interface, for example: IDisplay interface 169a, IItem interface 169c, or IHelm interface 169b. For example, an IItem::PresentationStarting() call is used for item plugin 156. cTemplate class 236 then instructs visible plugins 150 to display using IOleObject::DoVerb(Show,...) command 1534, step IIIe. IOleObject interface 1522 is the Active Document interface used to implement the Active Document presentation. IOleObject interface 1522 is the combination of the Active Document interfaces described in conjunction with Figure 7. After instructing visible plugins 150 to display, test driver 110 awaits notification from each visible plugin 150 that the specific visible plugin 150 has successfully shown. Visible plugins 150 call back to test driver 110 using IContainerNotify::Activated() call 1536, step IIIf (continued in Figure 34B). Now, the presentation is started and active such that the examinee can interact with the presentation.
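The handshake of steps IIId through IIIf can be sketched as follows; the classes are illustrative stand-ins for cTemplate and the visible plugins, not COM Active Document objects.

```python
class Template:
    """Stand-in for cTemplate class 236 driving the presentation."""
    def __init__(self, plugins):
        self.plugins = plugins
        self.activated = set()

    def start_presentation(self):
        for p in self.plugins:                 # step IIId: PresentationStarting()
            p.presentation_starting(self)
        for p in self.plugins:                 # step IIIe: DoVerb(Show,...)
            p.show()
        # The presentation is active only once every plugin has called back.
        return self.activated == set(self.plugins)

    def on_activated(self, plugin):
        # Step IIIf: IContainerNotify::Activated() callback from a plugin.
        self.activated.add(plugin)

class VisiblePlugin:
    """Stand-in for a visible plugin 150 (display, item, or helm)."""
    def presentation_starting(self, container):
        self.container = container

    def show(self):
        # A real plugin renders itself, then notifies its container.
        self.container.on_activated(self)

plugins = [VisiblePlugin(), VisiblePlugin()]
template = Template(plugins)
print(template.start_presentation())  # → True
```

Waiting for every Activated() callback before declaring the presentation active is what keeps the examinee from interacting with a partially shown screen.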


The many features and advantages of the invention are apparent from the detailed specification, and thus, it is intended by the appended claims to cover all such features and advantages of the invention which fall within the true spirit and scope of the invention. Further, since numerous modifications and variations will readily occur to those skilled in the art, it is not desired to limit the invention to the exact construction illustrated and described, and accordingly, all suitable modifications and equivalents may be resorted to, falling within the scope of the invention.

APPENDIX A-UTD CLASSES AND INTERFACES

































































PHYSICAL VIEW REPORT



























































Set by the consumer of the plugin (the driver or compiler).
Public Operations:

Validates the source XXL for the plugin. The plugin must raise an automation error that describes any problems with the source. The source is not required to be complete; only the portions provided should be verified.
If the contents of the stream is Unicode, it will be marked with the BOM (byte order mark) as defined by the Unicode standard (www.unicode.org). The BOM is normally FFFE.
If the stream contains ASCII or UTF-8, no BOM will be included.
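The byte-order-mark convention above can be illustrated with a short sketch; the function is hypothetical, but the FF FE marker test follows the Unicode standard cited in the text.

```python
def detect_encoding(raw: bytes) -> str:
    # UTF-16 streams begin with the byte order mark (FF FE little-endian,
    # FE FF big-endian); ASCII/UTF-8 streams carry no BOM here.
    if raw[:2] in (b"\xff\xfe", b"\xfe\xff"):
        return "utf-16"
    return "ascii/utf-8"

print(detect_encoding("hi".encode("utf-16")))  # → utf-16
print(detect_encoding(b"hi"))                  # → ascii/utf-8
```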
The oCompilerServices interface is provided to offer additional features and information to the plugins.
Unload () :
Unload data and references to UTD objects.

Load with references to UTD objects. Only called during exam delivery.
UTDCore.IContainerNotifyHelm (Interfaces)
This interface is consumed by plugins to inform the container to navigate. Public Operations:

Requests that the driver proceed in the direction specified. The driver next requests this movement from the navigation plugin.
The second, optional parameter specifies the presentation. This is only used for the JUMP.



The collection of all items chosen (that is, returned by a selection plugin) in the exam. This is regardless of their section.
colAllSections : UTDCore.ISections
The collection of all sections of the exam regardless of their level.
datStart : Date
The form start date and time.
datFinish : Date
The form finish date and time.
oTimer : UTDCore.HJoilTimer
The timer for the form.
oScoring : UTDCore.IScore
The scoring plugin for the form.
sVersion : String
The version of the form.
colDelivered : UTDCore.IEvents
The collection of all delivered top-level sections of the form. Read-only.
nCurIndex : Long = 0
Index of last delivered event in colDelivered. Read-only.
eStatus : UTDCore.eScoreStatus
Returns the value of oForm.oScore.eStatus.
colAllPresentations : UTDCore.IPresentations
The collection of all presentations of the exam regardless of their level.








The collection of presentations or sub-sections for the section.
oSelection : UTDCore.ISelection
The selection plugin for this section.
oTemplate : UTDCore.cTemplate
The template for the section.







The administration system interface for saving score reports. Also handles initial printing.
It is also emulated by UTD2ETS and Launchtest components that emulate the UAS.
UASIAppointment (Interfaces)
This interface is part of the Unified Administration System access to the candidate information for the candidate.
It is also emulated by UTD2ETS and Launchtest components that emulate the UAS.

TOTALS:
2 Components 51 Classes
COMPONENT PACKAGE STRUCTURE
Component View

APPENDIX B-POLESS CLASSES AND INTERFACES








IEnumProviders
Public Operations:
Next (ppProv : IProvider**) : HRESULT
Returns the next provider interface, or NULL if there are no more providers.
Skip (celt : ULONG) : HRESULT
Skips over the next specified number of providers.
Reset () : HRESULT
Resets the enumerator to the beginning.





LOGICAL VIEW REPORT
Applies a POLESS delta file. It applies the delta POLESS file to the original POLESS file to create an updated POLESS file.
The CRC in the delta file for the original POLESS file is compared to the original file's calculated CRC.
If they match, then the deltas are applied to create the updated POLESS file. The CRC of the update file is calculated and compared to the update file CRC in the delta file.
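The CRC-guarded patch flow described above can be sketched as follows; the delta layout and function names are illustrative, and zlib's CRC-32 stands in for whatever checksum the actual implementation uses.

```python
import zlib

def apply_delta(original: bytes, delta: dict) -> bytes:
    # The delta carries the CRC of the original file and of the expected
    # update; both are checked, as described in the text.
    if zlib.crc32(original) != delta["original_crc"]:
        raise ValueError("original file does not match delta's CRC")
    updated = delta["patch"](original)        # apply the stored deltas
    if zlib.crc32(updated) != delta["update_crc"]:
        raise ValueError("updated file failed CRC check")
    return updated

orig = b"version-1 exam resource"
new = b"version-2 exam resource"
delta = {
    "original_crc": zlib.crc32(orig),
    "update_crc": zlib.crc32(new),
    "patch": lambda _: new,                   # trivial stand-in patch step
}
assert apply_delta(orig, delta) == new
```

Checking the CRC on both sides means a patch applied to the wrong original, or a corrupted patch, fails loudly instead of producing a silently bad resource file.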
GetObjectFromPath (sFullPath : BSTR, eAccessMode : eACCESS_MODE, ppDisp : IDispatch**) : HRESULT
Uses monikers to retrieve the object named by the path. Returns an IDispatch pointer to the object retrieved.
CreateStreamFromFile (sName : BSTR, ppDisp : IDispatch**) : HRESULT
Creates a structured storage stream and populates it with the contents of the file.
CreateStreamFromBSTR (sIn : BSTR, ppDisp : IDispatch**) : HRESULT
Creates a structured storage stream and fills it with the specified BSTR.
MemoryStreamFromStream (pStreamIn : IStream*, ppDisp : IDispatch**) : HRESULT
Used to copy a stream to a newly created memory stream object. Seek pointers for both streams are reset to the beginning of the stream after the operation.
GetBindCtx (ppBindCtx : IBindCtx**) : HRESULT
Returns the static bind context that is used for creating monikers.
POLESS.IPropertyStorageAmalgamated
Public Operations:
PropertyStorageAdd (oPropertySet : IDispatch*, bEnd : VARIANT_BOOL = TRUE) : HRESULT
Adds a PropertySet to the collection of PropertySets.
ClearStorage () : HRESULT
Clears the collection of PropertySets.
POLESS.IPropertyStorageVB
Manages the persistent properties of a single property set.















TOTALS:
1 Logical Package 26 Classes
LOGICAL PACKAGE STRUCTURE
Logical View POLESS



What is claimed is:
1. A system for computer-based testing for at least one test, the at least one test having a presentation format and data content, comprising:
a test driver, having an executable code that controls functionality that enables the test driver to deliver the at least one test to an examinee using a display device, manage the at least one test, control progression of the at least one test, control scoring of the at least one test, and control results reporting of the at least one test;
a resource file, in operative data communication with the test driver, that stores information relating to the data content, the presentation format, progression, scoring, and results reporting of the at least one test, the information being accessible to the test driver to enable the functionality of the test driver; and
an expansion module, in operative data communication with the test driver and the resource file, that retrieves the information relating to at least one of the data content, the presentation format, the progression, the scoring, and the results reporting of the at least one test from the resource file and provides the information to the test driver during delivery of the at least one test, the expansion module expanding the functionality of the test driver without necessitating modification to the executable code of the test driver.
2. The system of claim 1, wherein the information stored in the resource file comprises extensible markup language format.
3. The system of claim 1, wherein the expansion module
4. The system of claim 3, wherein the expansion module is ' built.
5. The system of claim 1, wherein the resource file
data.
6. The system of claim 1, further comprising a function interface that enables communication between the test driver and the expansion module, wherein the function interface enables the test driver to load core object references into the expansion module at a start of a test delivery cycle and to unload the core object references from the expansion module at a completion of the test delivery cycle.
7. The system of claim 6, wherein the test driver accesses the information stored in the expansion module, the system further comprising a feature interface that enables communication between the test driver and the expansion module and enables the test driver to access the information stored in the expansion module to enhance the functionality of the test driver.
8. The system of claim 7, wherein the information stored in the expansion module comprises at least one of non-interactive display material, test navigation, test navigation controls, items, timing, selection, scoring, results, and reporting, and wherein the feature interface enables the test driver to access the information.


18. The system of claim 10, wherein the test packager comprises a compiler
19. The system of claim 10. wherein the information stored in the


a test driver, having an executable code that controls functionality that enables the test driver to deliver the at least one test to an examinee using a display device, manage


display material, test navigation, test navigation controls, items, timing, selection, scoring, results, and reporting.
33. The system of claim 32, wherein the test driver accesses the information stored in the expansion module, the information being of a type relating to at least one of non-interactive display material, test navigation, test navigation controls, items, timing, selection, scoring, results, and reporting, the system further comprising a feature interface that enables communication between the test driver and the expansion module and enables the test driver to access the information.
34. The system of claim 26, further comprising data files, visual format files, and multimedia files, wherein the data files, visual format files, and multimedia files comprise further information relating to the data content, presentation format, progression, scoring, and results reporting of the at least one test.
35. The system of claim 26, further comprising:
a resource persistence interface that enables communication between the expansion module and the resource file such that the expansion module is capable of storing the information from the source file into the resource file and retrieving the information from the resource file, wherein the expansion module retrieves the information from the resource file during the delivery of the at least one test, and wherein the information is able to be stored into the resource file as at least one of a stream of data, a set of data, and a directory; and
an instance persistence interface that enables communication between the expansion module and the instance file such that the expansion module is capable of storing the examination state information to the instance file and retrieving information from the instance file, wherein, if the at least one test is interrupted, the expansion module is capable of retrieving the examination state information that was provided by the examinee before the at least one test was interrupted and enabling the examinee to continue with the at least one test in situ.
36. The system of claim 26, wherein:
the test packager comprises a compiler;
the information in the source file comprises extensible markup language format; and validating the information entails the expansion module determining whether the information is correctly formatted.
37. The system of claim 26, wherein a correct format for the information is defined in a schema.
38. The system of claim 26, wherein a schema is employed as the information is being authored such that the information is able to be validated as the information is being authored and written to the source file.

39. A system for computer-based testing for at least one test, the at least one test having a
presentation format and data content, comprising:
test driver means for controlling the delivering the at least one test to an examinee using a display device, managing the at least one test, controlling progression of the at least one test, controlling scoring of the at least one test, and controlling results reporting of the at least one test, the test driver means having an executable code that controls the test driver means;
resource storage means, in operative data communication with the test driver means, for storing information relating to the data content, the presentation format, progression, scoring, and results reporting of the at least one test, the information being accessible to the test driver means to enable the functionality of the test driver; and
an expansion means, in operative data communication with the test driver means and the resource storage means, for retrieving the information relating to at least one of the data content, the presentation format, the progression, the scoring, and the results reporting from the resource storage means and providing the information to the test driver means during delivery of the at least one test, the expansion means expanding the ability of the test driver means to control the delivering the at least one test to an examinee using a display device, manage the at least one test, control progression of the at least one test, control scoring of the at least one test, and control results reporting of the at least one test without necessitating modification to the executable code of the test driver means.
40. The system of claim 39, further comprising function interface means for enabling communication between the test driver means and the expansion means, wherein the function interface means enables the test driver means to load core object references into the expansion means at a start of a test delivery cycle and to unload the core object references from the expansion means at a completion of the test delivery cycle, and wherein the function interface means enables the test driver means to notify the expansion means that the at least one test is being delivered and that the test driver means wishes to access the information stored in the expansion means.
41. The system of claim 40, wherein the test driver means accesses the information stored in the expansion means, the information being of a type relating to at least one of non-interactive display material, test navigation, test navigation controls, items, timing, selection, scoring, results, and reporting, the system further comprising a feature interface that enables communication between the test driver means and the expansion means, the expansion means expanding the ability of the test driver means to control the delivering the at least one test to an examinee using a display device, manage the at least one test, control progression of the at least one test, control scoring of the at least one test, and control results reporting of the at least one test without necessitating modification to the executable code of the test driver means.
42. The system of claim 39, further comprising:

source storage means, in operative data communication with the expansion means, for storing the information relating to the data content, the presentation format, the progression, the scoring, and the results reporting of the at least one test; and
test packager means, in operative data communication with the source storage means and the expansion means, for passing the information from the source storage means to the expansion means such that the expansion means is capable of validating the information from the source storage means.
43. The system of claim 42, further comprising a function interface means that enables communication between the test packager means and the expansion means, wherein the function interface means enables the test packager means to notify the expansion means that the source storage means is present, that the information in the source storage means is to be validated, and to pass the information from the source storage means to the expansion means such that the expansion means is capable of validating the information from the source storage means, the information further comprising at least one of non-interactive display material, test navigation, test navigation controls, items, timing, selection, scoring, results, and reporting.
44. The system of claim 42, further comprising data files, visual format files, and multimedia files, wherein the data files, visual format files, and multimedia files comprise further information relating to the data content, presentation format, progression, scoring, and results reporting of the at least one test.
45. The system of claim 42, further comprising resource persistence interface means for enabling communication between the expansion means and the resource storage means such that the expansion means is capable of storing the information from the source storage means into the resource storage means and retrieving the information from the resource storage means, wherein the expansion means retrieves the information from the resource storage means during the test delivery cycle.
46. The system of claim 45, wherein the resource persistence interface means enables storing the information from the source storage means into the resource storage means as at least one of a stream of data, a set of data, and a directory.
47. The system of claim 46, wherein the information in the source storage means comprises extensible markup language format, and wherein validating the information entails the expansion means determining whether the information is correctly formatted.
48. The system of claim 39, further comprising instance storage means for storing examination state information, which includes responses provided by the examinee to items presented to the examinee during the at least one test.
49. The system of claim 48, wherein the instance storage means further stores at least one of timing utilized and time remaining on units of the at least one test, a current unit of delivery, and an examinee score.

50. The system of claim 49, further comprising an instance persistence interface for enabling communication between the expansion means and the instance storage means such that the expansion means is capable of storing the examination state information to the instance storage means and retrieving the examination state information from the instance storage means, wherein, if the at least one test is interrupted, the expansion module is capable of retrieving the examination state information stored within the expansion means before the at least one test was interrupted and enabling the examinee to continue with the at least one test in situ.
51. A system for computer-based testing for at least one test, the at least one test having a presentation format and data content, comprising:
means for authoring information relating to at least one of the data content, the presentation format, progression, scoring, and results reporting of the at least one test;
first means for storing the information;
first means for retrieving the information from the first means for storing;
means for receiving the information stored in the first means for storing from the means for retrieving;
means for validating by the expansion module the information received from the means for retrieving;
second means for storing the information validated by the means for validating;
means for controlling functionality to deliver the at least one test to an examinee using a display device, to manage the at least one test, to control the progression of the at least one test, to control the scoring of the at least one test, and to control the results reporting of the at least one test;
second means for retrieving the information stored in the second means for storing;
means for providing the information retrieved by the second means for retrieving to the means for controlling during delivery of the at least one test, wherein the information enables the functionality of the test driver;
third means for storing examination state information comprising responses provided by the examinee to items presented to the examinee during the at least one test by an instance file, the examination state information enabling a restart of the at least one test if the at least one test is interrupted; and
third means for retrieving the examination state information by the second means and the means for controlling from the third means for storing.
52. A system for computer-based testing for at least one test, the at least one test having a presentation format and data content, comprising:
a test driver having an executable code that controls functionality that enables the test driver to deliver the at least one test to an examinee using a display device, manage the at least one test, control progression of the at least one test, control scoring of the at least one test, and control results reporting of the at least one test;

a resource file, in operative data communication with the test driver, that stores a plurality of information, including first information relating to non-interactive display material, second information relating to test navigation, third information relating to test navigation controls, fourth information relating to items, fifth information relating to timing, sixth information relating to selection, seventh information relating to scoring, eighth information relating to results, and ninth information relating to reporting for the at least one test, the plurality of information accessible to the test driver to enable the functionality of the test driver;
a first expansion module, in operative data communication with the resource file and the test driver, that retrieves the first information relating to non-interactive display material from the resource file and provides the first information to the test driver during delivery of the at least one test;
a second expansion module, in operative data communication with the resource file and the test driver, that retrieves the second information relating to test navigation from the resource file and provides the second information to the test driver during the delivery of the test;
a third expansion module, in operative data communication with the resource file and the test driver, that retrieves the third information relating to test navigation controls from the resource file and provides the third information to the test driver during the delivery of the test;
a fourth expansion module, in operative data communication with the resource file and the test driver, that retrieves the fourth information relating to items from the resource file and provides the fourth information to the test driver during the delivery of the test;
a fifth expansion module, in operative data communication with the resource file and the test driver, that retrieves the fifth information relating to timing from the resource file and provides the fifth information to the test driver during the delivery of the test;
a sixth expansion module, in operative data communication with the resource file and the test driver, that retrieves the sixth information relating to selection from the resource file and provides the sixth information to the test driver during the delivery of the test;
a seventh expansion module, in operative data communication with the resource file and the test driver, that retrieves the seventh information relating to scoring from the resource file and provides the seventh information to the test driver during the delivery of the test;
an eighth expansion module, in operative data communication with the resource file and the test driver, that retrieves the eighth information relating to results from the resource file and provides the eighth information to the test driver during the delivery of the test; and
a ninth expansion module, in operative data communication with the resource file and the test driver, that retrieves the ninth information relating to reporting from the resource file and provides the ninth information to the test driver during the delivery of the test,
wherein the expansion modules expand the functionality of the test driver without necessitating modification to the executable code of the test driver.

53. A method for computer-based testing for at least one test, the at least one test having a presentation format and data content, the at least one test being controlled by a test driver, the test driver having an executable code that controls functionality that enables the test driver to deliver the at least one test to an examinee using a display device, manage the at least one test, control progression of the at least one test, control scoring of the at least one test, and control results reporting of the at least one test, the method comprising the steps of:
instantiating an expansion module;
providing to the expansion module a resource storage element within a resource file;
loading information from the resource storage element into the expansion module during delivery of the at least one test, wherein the information from the resource storage element relates to at least one of the data content, the presentation format, progression, scoring, and results reporting of the at least one test; and
providing the information from the expansion module to the test driver during the delivery of the at least one test such that the expansion module expands the functionality of the test driver without necessitating programming changes to the executable code of the test driver.
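The delivery-time steps of claim 53 can be sketched informally. Everything below is an illustrative assumption (class names, a dict-like payload, the `attach`/`deliver` methods); the patent's actual implementation uses COM interfaces and structured storage, which are not modeled here.

```python
# Hypothetical sketch of claim 53's delivery-time flow; all names are
# illustrative assumptions, not the patented implementation.

class ResourceStorageElement:
    """A named chunk of persisted test data inside a resource file."""
    def __init__(self, name, payload):
        self.name = name
        self.payload = payload  # e.g. timing rules, item content, score model

class ExpansionModule:
    """A plugin that extends the test driver without changing its executable."""
    def __init__(self):
        self.info = None

    def load(self, element):
        # Step: load information from the resource storage element
        self.info = element.payload

    def provide(self):
        # Step: provide the loaded information to the test driver
        return self.info

class TestDriver:
    def __init__(self):
        self.features = {}

    def attach(self, name, module, element):
        # Steps: instantiate the module and hand it its storage element
        module.load(element)
        self.features[name] = module

    def deliver(self):
        # During delivery, the driver pulls information from each plugin;
        # the driver's own code never changes when plugins are added.
        return {name: m.provide() for name, m in self.features.items()}

driver = TestDriver()
element = ResourceStorageElement("timing", {"section_minutes": 60})
driver.attach("timing", ExpansionModule(), element)
print(driver.deliver())  # {'timing': {'section_minutes': 60}}
```

The point of the sketch is the indirection: the driver holds plugins behind a uniform interface, so new test behavior arrives as data plus a module, not as a driver rebuild.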
54. The method of claim 53, wherein the information loaded from the resource storage element into the expansion module further relates to at least one of non-interactive display material, test navigation, test navigation controls, items, timing, selection, scoring, results, and reporting.
55. The method of claim 53, wherein the resource file comprises structure storage persistent data, and wherein the resource storage element comprises a structure storage element.
56. The method of claim 53, wherein the loading of information from the resource storage element into the expansion module is facilitated by a resource persistence interface.
57. The method of claim 56, wherein the resource persistence interface enables loading the information from the resource storage element into the expansion module as at least one of a stream of data, a set of data, and a directory.
58. The method of claim 53, wherein instantiating the expansion module is facilitated by standard Microsoft object instantiation using a component object model server.
59. The method of claim 53, wherein the information stored in the resource storage element comprises extensible markup language.
60. The method of claim 59, instantiating the expansion module further comprising the step of calling the expansion module using a product identification comprising extensible markup language in the resource storage element.
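Claims 58 through 60 describe instantiating the plugin by product identification through a COM server. A loose, non-COM analogue, assuming an invented registry and a hypothetical ProgID-style string (the real mechanism would be `CoCreateInstance` against a registered CLSID, which is not reproduced here):

```python
# Loose analogue of claims 58-60: the driver instantiates a plugin by looking
# up its product identification (like a COM ProgID) in a registry and calling
# the registered factory. All identifiers here are hypothetical.

PLUGIN_REGISTRY = {}

def register(prog_id):
    """Class decorator standing in for COM class registration."""
    def wrap(cls):
        PLUGIN_REGISTRY[prog_id] = cls
        return cls
    return wrap

@register("Example.TimingPlugin")  # hypothetical product identification
class TimingPlugin:
    def describe(self):
        return "controls section and item timing"

def co_create_instance(prog_id):
    # Stand-in for COM's CoCreateInstance: resolve the id, construct the object
    return PLUGIN_REGISTRY[prog_id]()

plugin = co_create_instance("Example.TimingPlugin")
print(plugin.describe())  # controls section and item timing
```

Per claim 60, the product identification itself would be read out of the extensible markup language stored in the resource storage element, so the resource file decides which plugins get created.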
61. The method of claim 53, further comprising the step of loading core object references from the test driver into the expansion module at a start of a test delivery cycle.

62. The method of claim 61, wherein loading the core object references from the test driver into the expansion module is facilitated using a function interface.
63. The method of claim 53, wherein providing the information from the expansion module to the test driver is facilitated using a feature interface.
64. The method of claim 53, further comprising the steps of:
providing to the expansion module an instance storage element within an instance file;
unloading examination state information, which includes responses provided by the examinee to items presented to the examinee during the at least one test, from the expansion module into the instance storage element; and
loading the examination state information from the instance storage element into the expansion module, wherein, if the at least one test is interrupted, the expansion module is capable of retrieving the examination state information that was provided by the examinee before the at least one test was interrupted and enabling the examinee to continue with the at least one test in situ.
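Claims 64 through 69 describe saving and restoring examination state across an interruption. A minimal sketch, assuming a dict-backed instance file and invented class and method names (the real design persists through structured-storage elements and an instance persistence interface, not shown):

```python
# Illustrative sketch of the interrupt/restart behavior of claims 64-69.
# The instance file is modeled as a plain dict; the real structured storage,
# interface names, and element types are not reproduced here.

class InstanceFile:
    def __init__(self):
        self.elements = {}   # storage-element name -> examination state

class StatefulPlugin:
    """Expansion module that persists examinee responses across interruptions."""
    def __init__(self, name):
        self.name = name
        self.responses = {}

    def record(self, item_id, answer):
        self.responses[item_id] = answer

    def has_state(self):
        # The driver may query whether the plugin holds state (cf. claim 68)
        return bool(self.responses)

    def unload_state(self, instance):
        # Persist state into the plugin's instance storage element
        instance.elements[self.name] = dict(self.responses)

    def load_state(self, instance):
        # After a restart, reload state so the exam resumes in situ
        self.responses = dict(instance.elements.get(self.name, {}))

instance = InstanceFile()
plugin = StatefulPlugin("item-responses")
plugin.record("Q1", "B")
plugin.unload_state(instance)      # the test is interrupted here

restarted = StatefulPlugin("item-responses")
restarted.load_state(instance)     # the examinee continues where they left off
print(restarted.responses)         # {'Q1': 'B'}
```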
65. The method of claim 64, wherein the instance file comprises structure storage persistent data and the instance storage element comprises a structure storage element.
66. The method of claim 64, wherein loading examination state information and unloading examination state information is facilitated using an instance persistence interface.
67. The method of claim 66, wherein the instance persistence interface enables unloading the examination state information from the expansion module into the instance storage element and loading of the examination state information from the instance storage element into the expansion module as at least one of a stream of data, a set of data, and a directory, and wherein an instance storage element type is determined based on how the information is loaded into the instance storage element.
68. The method of claim 64, further comprising the step of sending a query from the test driver to the expansion module, wherein the query enables the test driver to determine whether the expansion module is storing examination state information, and wherein, if the expansion module is storing examination state information, the test driver provides the instance storage element to the expansion module.
69. The method of claim 64, wherein the at least one test is interrupted, further comprising the step of reloading the information relating to at least one of the data content, the presentation format, progression, scoring, and results reporting of the at least one test from the resource storage element into the expansion module, such that the expansion module is capable of retrieving the information from the resource storage element that the expansion module was storing before the at least one test was interrupted and enabling the examinee to continue with the at least one test in situ.
70. A method for computer-based testing for at least one test, the at least one test having a presentation format and data content, the at least one test being controlled by a test driver, the test driver having an executable code that controls functionality that enables the test driver to deliver the at least one test to an examinee using a display device, manage the at least one test, control progression of the at least one test, control scoring of the at least one test, and control results reporting of the at least one test, the method comprising the steps of:
instantiating an expansion module;
loading information into the expansion module from a source file, the information relating to at least one of the data content, presentation format, progression, scoring, and reporting of test results of the at least one test;
validating the information from the source file; and
unloading the information from the validation expansion module into a resource storage element within a resource file, wherein the expansion module expands the functionality of the test driver without necessitating programming changes to the executable code of the test driver.
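Claim 70 covers the production-time path: a packager feeds authored source content to a validation plugin, which checks the format and unloads the validated data into the resource store. A minimal sketch, where "validating" is reduced to checking that the source is well-formed XML (cf. claim 74) and the resource file is a plain dict; all names are assumptions.

```python
# Minimal sketch of claim 70's production-time path. The ValidationPlugin,
# the dict-backed resource file, and the element naming are all invented
# for illustration; real validation would enforce a test-definition schema.
import xml.etree.ElementTree as ET

class ValidationPlugin:
    def __init__(self):
        self.validated = None

    def load_and_validate(self, source_text):
        # "Validating" here means checking the XML is well formed;
        # ET.fromstring raises ParseError on malformed input.
        self.validated = ET.fromstring(source_text)

    def unload(self, resource_file, element_name):
        # Unload the validated content into a resource storage element
        resource_file[element_name] = ET.tostring(self.validated,
                                                  encoding="unicode")

resource_file = {}   # stand-in for structured-storage persistent data
plugin = ValidationPlugin()
plugin.load_and_validate("<timing><section minutes='60'/></timing>")
plugin.unload(resource_file, "timing")
print("timing" in resource_file)   # True
```

The same plugin class can thus serve both phases: validation at packaging time, and (per claims 80 and later) loading the already-validated element back out at delivery time.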
71. The method of claim 70, wherein the information from the source file further relates to at least one of non-interactive display material, test navigation, test navigation controls, items, timing, selection, scoring, results, and reporting.
72. The method of claim 70, wherein instantiating the expansion module is facilitated by standard Microsoft object instantiation using a component object model server.
73. The method of claim 70, wherein a test packager takes the information from the source file and loads the information into the validation expansion module.
74. The method of claim 73, wherein the test packager comprises a compiler, and wherein the information from the source file comprises extensible markup language format, validating the information further comprising determining whether the information is correctly formatted.
75. The method of claim 74, instantiating the expansion module further comprising calling the expansion module using a product identification comprising extensible markup language in the source file.
76. The method of claim 70, wherein validating the information from the source file is performed by the expansion module.
77. The method of claim 70, wherein the resource file comprises structure storage persistent data and the resource storage element comprises a structure storage element.
78. The method of claim 70, wherein unloading the information from the validation expansion module into the resource storage element is facilitated by a resource persistence interface.
79. The method of claim 78, wherein the resource persistence interface enables unloading the information from the validation expansion module into the resource storage element as at least one of a stream of data, a set of data, and a directory, and wherein a resource storage element type is determined based on how the information is loaded into the resource storage element.

80. A method for computer-based testing for at least one test, the at least one test having a presentation format and data content, the at least one test being controlled by a test driver, the test driver having an executable code that controls functionality that enables the test driver to deliver the at least one test to an examinee using a display device, manage the at least one test, control progression of the at least one test, control scoring of the at least one test, and control results reporting of the at least one test, the method comprising the steps of:
instantiating an expansion module during production of the at least one test;
loading information into the expansion module from a source file, the information relating to at least one of non-interactive display material, test navigation, test navigation controls, items, timing, selection, scoring, results, and reporting;
validating the information from the source file;
unloading the information from the validation expansion module into a resource storage element within a resource file;
instantiating the expansion module during delivery of the at least one test;
providing to the expansion module the resource storage element within the resource file;
loading information from the resource storage element into the expansion module during the delivery of the at least one test; and
providing the information from the expansion module to the test driver during the delivery of the at least one test such that the expansion module expands the functionality of the test driver without necessitating programming changes to the executable code of the test driver.
81. A method for computer-based testing for at least one test, the at least one test having a presentation format and data content, the method comprising the steps of:
authoring information relating to at least one of the data content, the presentation format, progression, scoring, and results reporting of the at least one test;
storing the information initially in a source file;
retrieving by a test packager the information from the source file;
receiving by an expansion module the information stored in the source file from the test packager;
validating by the expansion module the information received from the test packager;
storing into a resource file the information validated by the expansion module;
controlling functionality by a test driver to deliver the at least one test to an examinee using a display device, to manage the at least one test, to control the progression of the at least one test, to control the scoring of the at least one test, and to control the results reporting of the at least one test;
retrieving by the expansion module the information stored in the resource file;
providing the information retrieved by the expansion module to the test driver during delivery of the at least one test, wherein the information enables the functionality of the test driver;

storing examination state information comprising responses provided by the examinee to items presented to the examinee during the at least one test by an instance file, the examination state information enabling a restart of the at least one test if the at least one test is interrupted; and
retrieving the examination state information by the expansion module and the test driver from the instance file.
82. The method of claim 81, wherein the instance file further comprises at least one of timing utilized and time remaining on units of the at least one test, a current unit of delivery, and an examinee score.
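The end-to-end pipeline of claim 81 (author, package, validate, store, deliver, persist examinee state) can be condensed into one procedural sketch. Files are modeled as dicts and every function name is invented for illustration; nothing here reflects the actual file formats.

```python
# End-to-end sketch of claim 81's authoring-to-delivery pipeline.
# Source, resource, and instance files are all modeled as dicts for brevity.

def package(source_file, validate, resource_file):
    # The test packager pulls authored content from the source file and
    # routes it through the validation plugin into the resource file.
    for name, text in source_file.items():
        resource_file[name] = validate(text)

def deliver(resource_file, instance_file):
    # The driver pulls validated content from the resource file via its
    # plugins, and records examinee responses in the instance file so an
    # interrupted test can be restarted.
    instance_file["responses"] = {"Q1": "C"}
    return list(resource_file)

source = {"items": "<item id='Q1'>2+2?</item>"}
resource, instance = {}, {}
package(source, validate=lambda text: text.strip(), resource_file=resource)
delivered = deliver(resource, instance)
print(delivered, instance["responses"])  # ['items'] {'Q1': 'C'}
```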

AMENDED CLAIMS
[received by the International Bureau on 27 March 2003 (27.03.03);
new claims 83,84 added, remaining claims 1-82 unchanged (1 Page)]
83. An exam extensible markup language system for computer-based testing of at least
one test, the at least one test having a presentation format and data content, comprising:
a test driver, at least one of delivering the at least one test to an examinee using a display device, managing the at least one test, controlling progression of the at least one test, controlling scoring of the at least one test, and controlling results reporting of the at least one test;
a resource file, in operative data communication with the test driver, that stores information comprising extensible markup language and relating to at least one of the data content, the presentation format, progression, scoring, and results reporting of the at least one test, the information being accessible to the test driver; and
an expansion module, in operative data communication with the test driver and the resource file, that retrieves the information and the exam extensible markup language relating to at least one of the data content, the presentation format, the progression, the scoring, and the results reporting of the at least one test from the resource file and provides the information and the extensible markup language to the test driver during delivery of the at least one test, the expansion module expanding the functionality of the test driver without necessitating modification to the executable code of the test driver.
84. A method for computer-based testing using exam extensible markup language of at least one test having a presentation format and data content, wherein a test driver controls functionality that enables the test driver to at least one of deliver the at least one test to an examinee using a display device, manage the at least one test, control progression of the at least one test, control scoring of the at least one test, and control results reporting of the at least one test, the method comprising at least one of the sequential, non-sequential and sequence independent steps of:
instantiating an expansion module;
loading information comprising extensible markup language from a resource storage element into the expansion module during delivery of the at least one test, wherein the information from the resource storage element relates to at least one of the data content, the presentation format, progression, scoring, and results reporting of the at least one test; and
providing the information from the expansion module to the test driver during the delivery of the at least one test such that the expansion module expands the functionality of the test driver without necessitating programming changes to the executable code of the test driver.

84. A system for computer-based testing for at least one test, substantially as herein described with reference to the accompanying drawings.
85. A method for computer-based testing for at least one test, substantially as herein described with reference to the accompanying drawings.


Documents:
1048-chenp-2004-abstract.pdf
1048-chenp-2004-assignement.pdf
1048-chenp-2004-claims filed.pdf
1048-chenp-2004-claims granted.pdf
1048-chenp-2004-correspondnece-others.pdf
1048-chenp-2004-correspondnece-po.pdf
1048-chenp-2004-description(complete)filed.pdf
1048-chenp-2004-description(complete)granted.pdf
1048-chenp-2004-drawings.pdf
1048-chenp-2004-form 1.pdf
1048-chenp-2004-form 18.pdf
1048-chenp-2004-form 26.pdf
1048-chenp-2004-form 3.pdf
1048-chenp-2004-form 5.pdf
1048-chenp-2004-pct.pdf
abs-1048-chenp-2004.jpg


Patent Number 209791
Indian Patent Application Number 1048/CHENP/2004
PG Journal Number 50/2007
Publication Date 14-Dec-2007
Grant Date 06-Sep-2007
Date of Filing 13-May-2004
Name of Patentee M/S. PROMETRIC, A DIVISION OF THOMSON LEARNING, INC
Applicant Address 2711 Centerville Road, Wilmington, Delaware 19808
Inventors:
# Inventor's Name Inventor's Address
1 BOWERS, Clarke, D 1000 Lancaster Street, Baltimore, Maryland 21202
PCT International Classification Number G06F 11/36
PCT International Application Number PCT/US2002/036264
PCT International Filing date 2002-11-13
PCT Conventions:
# PCT Application Number Date of Convention Priority Country
1 60/331,228 2001-11-13 U.S.A.