Title of Invention

A COMPUTER-BASED SYSTEM AND METHOD FOR TESTING AND VALIDATING AN EMBEDDED OPERATING SYSTEM WITHIN A TARGET DEVICE

Abstract
A computer-based system for testing and validating an embedded operating system within a target device, comprising: a. a host computer; b. a target device provided with an operating system; and c. an operating system validator for testing and validating operating system, said validator being provided in said host computer, wherein said validator comprises a graphical user interface means for interfacing with a user, an engine means for communicating with said target device and responding to commands from said graphical user interface, a plurality of test suites comprising at least one test for testing and validating at least one component of said operating system, test source codes and executable programs for testing and validating operating system and a protocol acknowledgment means conducive to use with said target device, wherein said protocol acknowledgment means utilizes an operating system-generated event handle as a member field of a protocol for releasing an execution thread which is waiting for an acknowledgment message from said target device, and wherein said event handle is placed in a header portion of a message packet and is sent back in said acknowledgment message, and wherein a receiving thread unblocks any send threads of execution which are waiting for said event handle in said acknowledgment message; and d. a logging library means for manipulating and storing test related information as generated by said operating system testing and validating programs.
FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
COMPLETE SPECIFICATION
(See Section 10, Rule 13)


A COMPUTER-BASED SYSTEM AND METHOD FOR TESTING AND VALIDATING
AN EMBEDDED OPERATING SYSTEM WITHIN A TARGET DEVICE.




BSQUARE CORPORATION of 3150 139TH AVENUE S.E., SUITE 500, BELLEVUE, WA 98005-4081, U.S.A., an American company
The following specification particularly describes the nature of the invention and the manner in which it is to be performed:

CROSS-REFERENCE TO RELATED APPLICATIONS
This patent application claims the benefit of U.S. Provisional Patent Application Serial No. 60/137,629, entitled "PROTOCOL ACKNOWLEDGMENT BETWEEN HOMOGENEOUS SYSTEMS," filed June 4, 1999, and is a continuation-in-part of related U.S. Patent Application Serial No. 09/489,308, entitled "CE VALIDATOR TEST SUITE," filed January 21, 2000, which claims benefit of U.S. Provisional Patent Application Serial No. 60/116,824, entitled "CE VALIDATOR TEST SUITE," filed January 21, 1999.
TECHNICAL FIELD
This invention relates to product quality assurance and to test systems and methods for validating operating systems provided in computerized products. More particularly, the present invention relates to product quality assurance and to test systems and methods for validating operating systems during the development of computerized products. Even more particularly, the present invention relates to product quality assurance and to test systems and methods for validating operating systems, such as Windows CE, manufactured and sold by Microsoft, Incorporated of Redmond, WA, typically provided in computerized products.
BACKGROUND OF THE INVENTION
Increasingly, developers are embedding operating systems, such as Windows CE, into many different types of computerized products, including set-top boxes, gaming systems, bar-code scanners, and factory automation systems. As Windows CE has grown, so too has the need for "off-the-shelf" software development tools. Although many tools and "off-the-shelf" software kits have been on the market for saving device design time, leading to speedy device development, no fast testing system or method existed to verify compatibility of these new products, especially at the final stages of device development.

Traditionally, only two operating system device testing options have been available: (1) in-house writing of the test code, or (2) out-sourcing the custom code development to another firm. To complete the testing project in-house, Original Equipment Manufacturers (OEMs) must spend months training their staff, more months developing the test codes, and yet even more months of preparation before their product can be tested using such codes. Likewise, an out-source custom code development house would spend months writing the code. Thus, both options are time-consuming and, therefore, costly.
In related art quality assurance device testing systems, several network protocols are used which suspend program execution while waiting for an acknowledgment of a sent message. For example, Transmission Control Protocol (TCP), at a low level, blocks program execution until certain sent messages are acknowledged by the remote end of a "connection." Most protocols are implemented in an O/S-independent manner. As such, sent messages requiring an acknowledgment must be maintained in a list which cross-references the identification of a sent message with an execution thread, such thread waiting for such acknowledgment. Such related art systems follow a lengthy sequence: creating a message ID prior to sending the initial message; adding an element in a list associating the execution thread with the message ID; sending the message; blocking the sending thread of execution until the receiving code unblocks it; acknowledging, by the remote process, the sent message by sending back an ACK message containing the original message ID; receiving the ACK message at the sending machine; parsing for the message ID of the originally sent message; looking-up the message ID in the list; determining which execution thread to release from the list; and finally, releasing the original sending thread of execution to continue program execution.
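For clarity, this related art sequence may be sketched in C++ as follows. This is an illustrative sketch only, not code from any particular related art system; the names (g_pendingAcks, SendWithAck, OnAckReceived) are hypothetical, and initialization of the critical section is omitted. It shows the sender-side list maintenance and cross-referencing that such protocols require.

    #include <windows.h>
    #include <map>

    // Hypothetical sender-side bookkeeping: every outstanding message must be
    // cross-referenced against the thread waiting for its ACK.
    static std::map<DWORD, HANDLE> g_pendingAcks;   // message ID -> wake event
    static CRITICAL_SECTION g_listLock;             // assume initialized at startup

    void SendWithAck(DWORD messageId /*, payload ... */)
    {
        HANDLE wakeEvent = CreateEvent(NULL, FALSE, FALSE, NULL);
        EnterCriticalSection(&g_listLock);
        g_pendingAcks[messageId] = wakeEvent;       // add list element for this send
        LeaveCriticalSection(&g_listLock);
        // ... transmit the message, carrying messageId, here ...
        WaitForSingleObject(wakeEvent, INFINITE);   // block until the ACK releases us
        CloseHandle(wakeEvent);
    }

    void OnAckReceived(DWORD ackedMessageId)        // runs on the receiving thread
    {
        EnterCriticalSection(&g_listLock);
        std::map<DWORD, HANDLE>::iterator it = g_pendingAcks.find(ackedMessageId);
        if (it != g_pendingAcks.end()) {            // look up ID, pick thread to release
            SetEvent(it->second);
            g_pendingAcks.erase(it);
        }
        LeaveCriticalSection(&g_listLock);
    }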
These time consuming operating system device testing options have created a long-felt need for a consolidated testing system and method, utilizing a protocol acknowledgment between homogeneous systems, for improving product quality, imparting time and cost savings of many person-months, and streamlining of the product development process. In particular, a system and method for testing and validating devices having an embedded Windows CE operating system installed is needed to overcome the foregoing problem and thus provide a system and method which improves product quality, imparts time and cost savings of many person-months, and streamlines the product development process, resulting in a fully automated design verification package for device designers. In particular, a need exists for a system which uses O/S-provided events, an O/S-internally-maintained event list, and an O/S-internally-maintained blocking threads (i.e. sending threads) list. Thus, unlike related art protocol acknowledgment methods, several steps need to be eliminated (e.g., adding an element which associates the execution thread with the message ID in a list, cross-referencing the message ID in a list, and determining which execution thread to release from a list entry). Thus, a code which is simpler, shorter, and less error-prone via an alternative method of processing ACKs would be beneficial, where a plurality of actions could be executed upon reception of an acknowledgment message (e.g., elimination of multiple cross-referencing and attendant delays, code for the reception of ACK which is largely identical with a single-thread case, priority data which need not be maintained in a sent message list if multiple threads are prioritized, and an ID list which need not be entirely scanned to determine a thread-release priority as an O/S restarts all threads with an appropriate priority). A beneficial system would also be able to implement multiple pending threads of execution which are waiting for an ACK message, the threads needing merely to use the O/S-provided function of waiting for an event handle which was embedded by the initial "send" in the protocol header. Likewise advantageous, multiple threads of execution should be triggered, without additional cross-reference processing, by an ACK for any message of a plurality of sent messages.


BRIEF SUMMARY OF THE INVENTION
Accordingly, the present invention, to be commercially available under applicant's assignee's trademark of CEValidator™, is an operating system validator (herein also referred to as O/S Validator, and designated in the Figures with the numeral "1"), which solves the foregoing problems by providing a test system encompassing an automated test suite method for testing a port of an operating system, such as Windows CE, in a target device's hardware and/or software being newly developed. The O/S Validator comprises a comprehensive code base, specifically developed to purposefully stress the O/S, device driver, OEM Adaptation Layer (OAL), and hardware interaction through the use of a unique contemporaneous multithreaded execution methodology for acknowledging protocol between homogeneous systems. The provided test suites focus on identifying three primary defects: hardware design, hardware programming (drivers/OAL), and operating system interaction. Special diagnostic emphasis is placed on Windows CE subsystems which have historically shown the most problems. The test suites comprise nearly 1500 tests which include system stress-testing routines, as well as feature-and-function tests, providing a complete analysis of a Windows CE port. These tests are grouped by the O/S Validator. The O/S Validator includes both test source codes and executable programs for all tests.
According to one aspect of the present invention, there is provided a computer-based system for testing and validating an embedded operating system within a target device, comprising a host computer; a target device provided with an operating system; and an operating system validator for testing and validating said operating system, said validator being provided in said host computer, wherein said validator comprises a graphical user interface means for interfacing with a user, an engine means for communicating with said target device and responding to commands from said graphical user interface, a plurality of test suites comprising at least one test for testing and validating at least one component of said operating system, and a protocol acknowledgment means conducive to use with said target device, wherein said protocol acknowledgment means utilizes an operating system-generated event handle as a member field of a protocol for releasing an execution thread which is waiting for an acknowledgment message from said target device, and wherein said event handle is placed in a header portion of a message packet and is sent back in said acknowledgment message, and wherein a receiving thread unblocks any send threads of execution which are waiting for said event handle in said acknowledgment message; and a logging library means for manipulating and storing test related information as generated by said operating system testing and validating programs.

According to another aspect of the present invention, there is provided a computer-based method for testing and validating an embedded operating system within a target device, comprising the steps of providing a host computer; providing a target device having an operating system; providing an operating system validator, said validator being provided in said host computer, wherein said validator comprises a graphical user interface means for interfacing with a user, an engine means for communicating with said target device and responding to commands from said graphical user interface, a plurality of test suites comprising at least one test for testing and validating at least one component of said operating system, and a protocol acknowledgment means conducive to use with said target device, wherein said protocol acknowledgment means utilizes an operating system-generated event handle as a member field of a protocol for releasing an execution thread which is waiting for an acknowledgment message from said target device, and wherein said event handle is placed in a header portion of a message packet and is sent back in said acknowledgment message, and wherein a receiving thread unblocks any send threads of execution which are waiting for said event handle in said acknowledgment message; providing a logging library means for manipulating and storing test related information as generated by said operating system testing and validating programs; executing said operating system testing and validating programs on said target device and testing and validating said operating system, wherein said executing step comprises unblocking a send thread of execution by responding to said event handle in said acknowledgment message; and generating pass and fail test results.
To simplify execution of test suites and collection of logging results, an intuitive user interface for the O/S Validator host component, such as a standard Windows application leveraging the Microsoft Windows user interface, is utilized. The O/S Validator distributes test suites as a client/server application. A graphical user interface (GUI) interacts with a small application, CEHarness.exe, which is running on a target device. Because this communication may occur over Ethernet, at least one host may run suites against at least one target device.
The O/S Validator generates useful error information when a target device fails a test. While the suites are running, results are displayed in a plurality of dynamically created log windows as well as in a configuration's summary tab. The logging windows contain the full text of a given test's results. Failures are color-coded red to ease identification. Navigation buttons in the logging window allow the user to quickly move from one failure to another. The logging APIs in the test also cause a prolog and an epilog to be generated in each result file. Information such as concurrently running processes, battery power level, and execution date and time is automatically recorded in the results file and displayed in the log window. Useful summary information, such as loss of program memory, loss of storage memory, or total test execution time, is provided in a log window tab. The summary information for a given test result is also collected and displayed in a Summary tab of the configuration window. The summary tab reports the number of PASS and FAIL test cases in real time. Breakout PASS and FAIL numbers for individual suites are also displayed. The configuration window's Summary tab facilitates quick navigation to an individual failure among perhaps thousands of test results. The exact source file and line number corresponding to a logged failure are automatically reported by the O/S Validator's logging APIs. Since O/S Validator provides the source code for all of its executables, being able to go directly to the source code reporting an error is a powerful adjunct to the textual descriptions of the failure.
The present invention uses an O/S-generated event handle as a member field of a protocol, releasing an execution thread which is waiting for an acknowledgment (ACK) message. An O/S-generated event handle, such as a WIN32 event handle, used for blocking an original send thread of execution, is placed in a header and sent back in the ACK. A receiving thread does not require the looking-up of a transaction identification (ID) in a list. Instead, the receiving thread unblocks any threads waiting for the event. In other words, the present invention uses the O/S-provided events, the O/S-internally-maintained event list, and the O/S-internally-maintained list of blocking threads (i.e. sending threads). Thus, unlike related art protocol acknowledgment methods, several steps are eliminated: adding an element which associates the execution thread with the message ID in a list, looking-up the message ID in a list, and determining which execution thread to release from a list entry.
Consequently, the code is simpler, shorter, and less error-prone via the instant invention's alternative method of processing ACKs. Thus, the present invention offers many advantages in expediting a plurality of actions upon the reception of an acknowledgment message: multiple cross-referencing and attendant delays are eliminated, code for the reception of ACK is largely identical with the single-thread case, priority data need not be maintained in a sent message list if the multiple threads are prioritized, and the ID list need not be entirely scanned to determine the thread-release priority as the O/S restarts all threads with the appropriate priority. To implement multiple pending threads of execution which are waiting for an ACK message, the threads need merely use the O/S-provided function of waiting for an event handle which was embedded by the initial "send" in the protocol header. Similarly, multiple threads of execution may be triggered, without additional cross-reference processing, by an ACK for any of a plurality of sent messages.
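By example, the inventive protocol may be sketched in C++ as follows. This is an illustrative sketch only, assuming the WIN32 event primitives named above; the structure and function names are hypothetical, and transmission details are omitted. The point is that the event handle travels in the packet header and is echoed back, so the receive thread releases the blocked sender with a single SetEvent() call, with no message ID list to maintain or scan.

    #include <windows.h>

    // Hypothetical packet header: the O/S-generated event handle itself is a
    // member field of the protocol.
    struct PacketHeader {
        HANDLE ackEvent;     // WIN32 event handle blocking the original sender
        DWORD  payloadLen;
    };

    void SendWithAck(const void* payload, DWORD len)   // send thread
    {
        PacketHeader hdr;
        hdr.ackEvent   = CreateEvent(NULL, FALSE, FALSE, NULL);
        hdr.payloadLen = len;
        // ... transmit hdr followed by payload to the homogeneous remote system,
        // which echoes hdr back in its ACK message ...
        WaitForSingleObject(hdr.ackEvent, INFINITE);   // block until the ACK arrives
        CloseHandle(hdr.ackEvent);
    }

    void OnAckReceived(const PacketHeader* ackHdr)     // receive thread, same process
    {
        // The echoed handle is valid here because the receive thread runs in the
        // sender's process; no ID list or cross-reference lookup is needed.
        SetEvent(ackHdr->ackEvent);                    // unblocks the waiting sender
    }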
Other features of the present invention are disclosed or are apparent in the section entitled, "DETAILED DESCRIPTION OF THE INVENTION."
BRIEF DESCRIPTION OF DRAWINGS
For a better understanding of the present invention, reference is made to the below-referenced accompanying drawings.
Figure 1.0 is a schematic diagram representative of a computerized product presently being provided with an embedded operating system in a control unit.
Figure 2.0 is a manufacturing flow diagram illustrating quality assurance testing on a computerized product provided with an embedded operating system, in accordance with the present invention.
Figure 3.0 is a block diagram showing the primary components of the operating system validator of the present invention, including a graphical user interface, an engine, a plurality of test suites, and a logging library, in accordance with the present invention.
Figure 4.0 is a block diagram showing the execution of a plurality of test suite configurations from a host device for testing a plurality of target devices provided with an embedded operating system, in accordance with the present invention.
Figure 5.0 is a block diagram showing the present invention essentially as depicted in Figure 4, except showing communication by the target device with the O/S Validator 1 at the host via Ethernet means.
Figure 5a illustrates an arrangement where communications between a plurality of host and target devices may occur over Ethernet, in accordance with the present invention.
Figure 6 shows yet another arrangement for a test suite execution situation, in accordance with the present invention.
Figure 7.0 is a table listing of functional areas of specific APIs tested in automatic and manual test suite execution, in accordance with the present invention.
Figures 8A, 8B, and 8C, together, comprise a comprehensive table listing of functional areas and their respective APIs which may be tested in automatic or manual mode, in accordance with the present invention.

Figure 9.0 is a table listing of selected APIs for use in building automation scripts, in accordance with the present invention.
Figure 10.0 is a schematic diagram representative of the concept of the present invention providing source code for all executable programs, in accordance with the present invention.
Figure 11.0 is a block diagram representation of a window showing the test suites selection options as well as other related summary functions, in accordance with the present invention.
Figure 12.0 is a block diagram representation of a logging window showing tabs for test results, test failures, and related test summary options, in accordance with the present invention.
Figure 13 illustrates, in graph form, the test cycle time as a function of the number of test devices being concurrently tested, in accordance with the present invention.
Figure 14.0 is a block diagram representation of a configuration window showing tabs for executing a variety of configuration related functions, in accordance with the present invention.
Figures 15A, 15B, and 15C, together, comprise a table listing testing details of the operating system components, in accordance with the present invention.
Figure 16 is a flowchart illustrating the unique contemporaneous multithreading capability, with respect to acknowledgment protocol between homogeneous systems, in accordance with the present invention.
DETAILED DESCRIPTION OF THE INVENTION
Figure 1.0 shows a computerized product 1000, typical of computerized products such as computer workstations, set-top boxes, gaming systems, bar-code scanners, and factory automation systems presently being provided with an embedded operating system, depicted by the numeral 1001a. As illustrated, and as described herein, product 1000 may comprise a typical target device 9 provided with an operating system 1001a, such as an embedded Windows CE operating system (O/S). The computerized product 1000 may function as a standalone device, having an installation of the present invention, the O/S Validator 1, for testing and validating its own operating system 1001a. The standalone testing facilitates new test development and de-bugging of reported defects, as elaborate knowledge of O/S Validator infrastructure is not necessitated. However, in a more likely application, as shown in Figure 2.0, product 1000 may function, in a manufacturing quality assurance testing environment M, as a host computer 4 having an installation of the present invention, the O/S Validator 1, for testing a target device 9 provided with an operating system 1001a. Referring back to Figure 1.0, a computerized product 1000 may comprise, by example, several sub-components, including a control unit 1020, including a plurality of input/output ports 1021, a keyboard 1009, a printer 1010, a mouse 1011, and a monitor 1012. The sub-components 1009, 1010, 1011, and 1012, themselves, may be testable target devices. The typical control unit 1020, itself, comprises several sub-components, including a central processing unit 1001, storage devices such as a hard disk drive 1004, other memory components including a RAM 1002, a ROM 1003, a compact disc 1005, an audio component 1006, a network/server card 1007, and a modem 1008. Necessarily included in the control unit is an operating system 1001a, to make product 1000 functional as a useful device.
Figure 3.0 shows the primary components of the O/S Validator 1, including a graphical user interface (GUI) 2, an Engine 3, a plurality of Test Suites 11, and a Logging Library 12. The GUI 2 and the Engine 3 communicate internally, in both directions, through a component called HarnessLink.dll, designated by the numeral 7 within O/S Validator 1 and discussed below in more detail. Figure 4.0 illustrates a host computer 4 provided with O/S Validator 1. As illustrated, a plurality of target devices 9 are provided with an O/S 1001a for being tested in accordance with the present invention. The O/S Validator 1 has capabilities of generating testing configurations, such as a plurality of test configurations 21a, 21b, and 21c, for testing a particular function under control of O/S 1001a within target devices 9. A device-side component, termed CEHarness 8, communicates with Engine 3 in O/S Validator 1. As depicted in Figures 5 and 5a, CEHarness 8 may also communicate with Engine 3 in O/S Validator 1 via Ethernet means 4a. Figure 5a illustrates that, because communication may occur over Ethernet, a plurality of hosts 4 may run suites against a plurality of target devices 9. In yet another alternative, and as depicted in Figure 6 for a test suite execution situation, CEHarness 8 may also communicate with Engine 3 in O/S Validator 1 via suite execution connection 1021a, where host computer 4 may comprise an NT host computer, where the logging library 12 is also provided in a target device 9, and where the test results are provided to host computer 4 via socket connections 1021b.
In operation, the O/S Validator 1 tests and validates target devices 9 provided with an embedded operating system, by example a Windows CE Operating System. Broadly stated, the O/S Validator 1 functions by (1) validating the Windows CE port to a target device, (2) providing stress and performance testing, (3) logging and analyzing results, (4) testing a plurality of functional areas, including a plurality of applications program interfaces (APIs) (See Table 1.0 and Table 2.0 in Figures 7, 8A, 8B, and 8C, respectively), (5) executing a plurality of pass/fail tests in a plurality of test suites, (6) facilitating customization of tests in the automated suites, (7) providing a host-side graphical test harness, (8) stressing and evaluating memory performance, (9) providing means for building test automation, including a plurality of APIs (See Table 3.0 in Figure 9.0), and (10) providing a results analysis tool termed CEAnalyzer. As previously stated, and as depicted in Figure 10, O/S Validator 1 includes, for all tests, both test source codes SC and executable programs EP (also referred to as test executables). The sole implementation requirement for a test executable EP is that test case "passes" and "failures" be reported using two specific APIs: WRITETESTPASS() and WRITETESTFAIL(). These macros have signatures similar to the well-known printf() function, but their use generates a standard test case result format in the "results" file amenable to automated summarization and integrates the reporting with the host user interface. The O/S Validator 1 method further comprises mediating between the GUI 2 and the test cases encapsulated in the test executables EP and providing a means by which the GUI 2 distributes and coordinates the execution of tests. The test suites 11 comprise text files 5 composed of suite language commands (e.g. PUT, RUN, WAIT, and DELETE) which are the direct expressions of the tasks needed to distribute and execute the test; an illustrative suite file is set forth below. Other supported suite commands are GET for retrieving a file from a target device 9, RUNHOST for running an executable program EP on the host computer 4, which is useful for client/server style tests, WAITHOST for waiting for the termination of a process on the host computer 4, which is also useful for client/server style tests, PUTSYSTEM for putting a file in the device's system directory (/Windows), SLEEP for basic timing when all else fails, MSGBOX for displaying a message box on the host machine, SETREG for adding or changing a registry setting on the device, and DELREG for removing a registry setting from the device. Initial suite file comments, in addition to providing internal suite documentation, are presented as the suite descriptions in GUI 2.
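By example, the two reporting APIs may be used as sketched below. This is an illustrative sketch only: the specification states that WRITETESTPASS() and WRITETESTFAIL() have printf()-like signatures and that the logging APIs report the exact source file and line number automatically, but their definitions are not reproduced; the variadic-macro form and the LogResult() helper shown here are assumptions.

    #include <stdarg.h>
    #include <stdio.h>

    /* Hypothetical logging-library helper; the real library also routes the
     * result over TCP/IP or to a device-side log file. */
    static void LogResult(int passed, const char* file, int line,
                          const char* fmt, ...)
    {
        va_list args;
        printf("%s(%d): %s: ", file, line, passed ? "PASS" : "FAIL");
        va_start(args, fmt);
        vprintf(fmt, args);
        va_end(args);
        printf("\n");
    }

    /* Assumed forms: __FILE__ and __LINE__ supply the source file and line
     * number reported with each result. */
    #define WRITETESTPASS(...) LogResult(1, __FILE__, __LINE__, __VA_ARGS__)
    #define WRITETESTFAIL(...) LogResult(0, __FILE__, __LINE__, __VA_ARGS__)

    int main(void)
    {
        int bytesWritten = 512;  /* hypothetical outcome of an exercised API */
        if (bytesWritten == 512)
            WRITETESTPASS("serial write of %d bytes succeeded", bytesWritten);
        else
            WRITETESTFAIL("serial write returned %d, expected 512", bytesWritten);
        return 0;
    }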

The O/S Validator 1 method includes organizing the test suite files 5 by their hierarchical placement within a directory structure on the host computer 4. As shown in Figure 11, test suites 11 are divided at the top level as being either automatic test suites (Au), manual test suites (Ma), or stress test suites (SS). Automatic suites Au are tests which do not require any user intervention during their execution. In contrast, manual suites Ma do require user intervention (e.g. keyboard and touch panel suites). The O/S Validator 1 method includes stressing the system, through the stress suites SS, by pushing input/output (I/O) throughput to the limit, operating with little or no available program or object store memory, and possibly bringing down a custom file system with multi-threaded concurrent stressing. Below this top level of hierarchy, the O/S Validator 1 method includes arranging the test suites 11 by the functional areas (See generally Figure 7).
As noted above, Figure 3.0 shows the primary components of the O/S Validator 1, including a graphical user interface (GUI) 2, an Engine 3, a plurality of Test Suites 11, and a Logging Library 12. O/S Validator 1 utilizes GUI 2 as a Visual Basic component since it works well for any level of user. The GUI 2 design is based on the concept of interfacing the user input with underlying components. The Engine 3 holds the core of the functionality on the host 4. The Engine 3 reads a plurality of suite files 5, parses them, and executes the commands. The Engine 3 obtains the information from the GUI 2 and uses the information to set up a variety of execution options. The Engine 3 is written in the C/C++ language. The Engine 3 is linked to the GUI 2 via a component termed HarnessLink.dll 7, which is an ActiveX control. HarnessLink.dll 7 is instantiated and called from the GUI 2 with a variety of information, which is passed to the Engine 3 before it begins to execute. Dll link 7 also functions to communicate between the Engine 3 and the GUI 2 during the execution, to relay information, error messages, and some dynamic run-time commands. The target device 9 comprises a device-side (as opposed to a host-side) component called CEHarness 8. CEHarness 8 is a C/C++ program residing on the target device 9 and, as shown in Figure 4, communicates nearly exclusively with the Engine 3, unless broadcasting target information on the network, with the GUI 2 receiving such information and passing it to the Engine 3 (See Figures 5 and 5a). CEHarness 8 is an event-driven application, where the Engine 3 sends the events and CEHarness 8 responds. The two remaining components, test suites 11 and logging library 12, are intertwined since the test suites 11 are written using a plurality of application program interfaces (APIs) 13 that are part of the logging library 12 (See Figures 8A, 8B, and 8C). These APIs 13 have substantial functionality and are somewhat dependent on the information passed through the component chain. The logging library 12 has a simple functionality concept, communicating the test results 14 by creating log files 15 by either logging TCP/IP information 16 back to the GUI 2 (See Figure 5.0) or by writing results 14 and log files 15 directly to the device 9 (See Figure 6.0). As depicted in Figure 12, a logging window LW shows the test results 14 in test files 15, failures F, and a summary tab SumT which facilitates user-access to program memory, passes, failures, and timing information. The plurality of test suites 11 comprise the indispensable component of the O/S Validator 1. Figure 13 illustrates, in graph form, that the test cycle time CT decreases as the number of concurrently tested devices increases.
In further detail, the GUI 2 is a complex code due to the degree of functionality required for handling its layer of components. The GUI 2 provides a "wizard" whose primary function is walking new users through the various selectable settings and listing the default settings. As shown in Figure 14, GUI 2 also provides a configuration window CW as the means for executing a test run, which comprises a single pass through a set of selected suites 11 on a target device 9. As shown in Figure 4, a plurality of configurations 21a, 21b, and 21c may be run to simulate a variety of scenarios. The contents of a configuration window CW comprise a plurality of tabs for user control. By example, suite tab S provides a tree view of the suite files directory under the O/S Validator directory. This tree view is organized to provide meaningful distinctions between the types of tests 11 the user selects. Additionally, this tree view is created as the user opens the configuration window CW, allowing the user to extend the O/S Validator 1 by adding new user-input suites to the suite file directory created by the O/S Validator 1 installation program. Test suites 11 are scripts of commands that the Engine 3 reads and then performs actions corresponding to such scripts of commands. The suite files 5 are generally started with a series of comments. These comments appear in the suite file information section of the file. If the word "Manual" appears at the top of a suite file 5, the file is deemed to require manual execution and is, therefore, assigned a different icon. In the test suite section, as illustrated in Figure 11, the user may reorder the test suite files 5 in any way.

Still referring to Figure 14, the logging tab contains considerable valuable information. The user may select three methods of logging, namely LH for logging to the host 4, LTD for logging to the target device 9, or LHD for logging to both host 4 and target device 9. The log information is then stored in a configurable directory listed in an edit box. All this information is sent through the DLL 7 to the Engine 3 and then to CEHarness 8 in target device 9. Subsequently, the information is acquired by the logging library 12 upon running a test 11. Other tabs in the configuration window CW include a set stress condition tab SC, selecting high priority threads during suite execution by selecting thread tab T, reducing program and storage memory by selecting tabs PM and SM, selecting run time by selecting tab SRT, and stopping the run by selecting tab STOP. The user can utilize infinite loop tab ILoop for finding memory leaks in the system. Useful summary information, such as loss of program memory, loss of storage memory, or total test execution time, is provided in a summary tab SumT. The summary information for a given test result is also collected and displayed in a Summary tab SumT. The summary tab reports the number of PASS and FAIL test cases in real time. Breakout PASS and FAIL numbers for individual suites are also displayed. The configuration window's Summary tab facilitates quick navigation to an individual failure among perhaps thousands of test results. The exact source file and line number corresponding to a logged failure are automatically reported by the O/S Validator's logging APIs. Since O/S Validator provides the source code for all of its executables, being able to go directly to the source code reporting an error is a powerful adjunct to the textual descriptions of the failure.

The logging options vary dramatically in their implementation and effects. The first option presumes that whenever the user runs a test suite 11 resulting in a "pass", a set of log files 15, summarizing these test results 14, automatically issues. Dependent upon the chosen logging method, the summary files 15 are created by either CEHarness 8 or the Engine 3. Basically, the Engine 3 traverses all the log files 15 in the logging directory, so the user may receive a log files list not corresponding to the test 11 being run. In order to make the summary files 15 more indicative of the run tests 11, the user can delete the log directory before running, or the user can select a logging option such as "Return only the newest summary results" which causes the Engine 3 to traverse only one log file 15 for each test 11. This means that if the user ran a file system test thirty times on a given day, there would only be one entry in the summary log for that test, corresponding to the most recent execution. The summary log would have only one entry for each test 11 whose log file 15 still resides in the logged directory. The other two options are handled in the GUI 2. If the user logs onto the host 4 via a TCP/IP connection 4a, an entry goes into a log file 15 created on the user's host 4 while appearing in a log window within the GUI 2. This allows the user to examine the log files 15 within the context of the O/S Validator 1, a great advantage since the user can immediately monitor the passes, and more importantly, the failures. However, in some circumstances, there may be an excessive amount of log files 15 due to the size of the test 11 run; therefore, closing the log window LW, without opening it further, would conserve the memory of the host 4 and also impart a clear viewing area for the O/S Validator 1. Also advantageous is the closing of all log files 15 without failures. Failures indicate, to product design personnel, that the target device 9 will require further development. The user may want to keep open all the log windows F with failures by clicking an option in the logging window to keep the F window open.
When operating the present invention as shown in Figure 5a, a window, termed Available Targets, shows the active devices on the network that are broadcasting information to the GUI 2. The active devices send a large volume of information, some of which is displayed in the Available Targets window. The user may view the information by selecting a View/Available Targets menu. Another window must be accessed to obtain the complete set of broadcast information. This broadcast information is valuable, because it is used to initialize the connection from the Engine 3 to a particular CEHarness 8 in a test target device 9.
Referring to Figure 11, the stress Test Settings SS are now further described. The Stress Suites 11 are manifested in a variety of forms; however, their fundamental purpose is stressing the target device 9 by running a very long test, running multiple iterations of a short test, or electing a wide range of parameters in one test. These remain specific to their own test 11 area; for example, a Database Stress Suite only stresses the Database functionality of an O/S, such as Windows CE. The Stress Test Options are distinguishable from Stress Suites 11. The Stress Test Options are different because they target functionality to provide more broadband stress scenarios equivalent to real world usage models. These scenarios run in conjunction with any user-selected set of test suite files 5. The Stress Test Options can and should be run both conjointly and disjointly as, in so doing, a significant boost to the range of any test plan is provided. The first two Stress Test Options are related to the memory on the target device 9. The first Stress Test Option is the Low Virtual Memory Option, which greatly reduces the amount of virtual memory of the target device before running the selected tests 11. This simulates the realistic harsh circumstances that might occur when a user has opened fifteen applications, effecting a malfunction. The second Stress Test Option is the Low Storage Memory option. When selected, this second Stress Test Option fills the storage memory of the target device 9 to its maximum capacity in order to observe the target device 9 response to low storage memory conditions. In some cases, this second Stress Test Option is also good for testing application programs as contained within the target device 9, as they may depend on non-existent storage memory. The next three Stress Test Options are execution options. The first executable stress option is the infinite loop, which is ideal for long test cycles. A common problem in many device drivers is malfunction under long, intense, stressful situations. This infinite loop stress test provides a test for determining a possible breakdown. This infinite loop test runs the selected suites 11 until the user manually hits the Stop button. The next stress execution option is the configurable CPU cycle deprivation test, available as a text file called Data.txt in the O/S Validator\Tests\TestInputFiles directory. Two examples are provided in the file, which a user may copy, reuse, or modify. The text file, Data.txt, controls the number of threads, and their attributes, that the user may include in his test run. In other words, the user can run his tests while other processes are consuming the CPU time, illuminating many problems including timing. The last Stress Test Option is the Random Execution. When the user selects this option, the GUI 2 will reorder the list of test suites 11 at run time so that they run in a different order. This option is ideal because it facilitates the diagnosis of interaction problems with various components.
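By example, the cycle-deprivation scenario may be sketched in C++ as follows. This is an illustrative sketch only: the actual format of Data.txt and the thread-creation code are not reproduced in the specification, so the hypothetical function below simply shows the underlying technique of spawning a configured number of CPU-consuming threads at a chosen priority while the selected tests run.

    #include <windows.h>

    static DWORD WINAPI BusyThread(LPVOID)
    {
        for (;;) { /* consume CPU cycles while the selected tests 11 run */ }
        return 0;  // unreachable
    }

    // Hypothetical consumer of a Data.txt-style configuration: spawn the
    // configured number of CPU-consuming threads at the configured priority.
    void StartCpuDeprivation(int threadCount, int threadPriority)
    {
        for (int i = 0; i < threadCount; ++i) {
            HANDLE h = CreateThread(NULL, 0, BusyThread, NULL, 0, NULL);
            if (h != NULL) {
                SetThreadPriority(h, threadPriority);  // e.g. THREAD_PRIORITY_ABOVE_NORMAL
                CloseHandle(h);                        // the thread keeps running
            }
        }
    }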
The remaining Test Run options are generic activities that the user may control. The first option, "Use Selected Target Exclusively," is important because, when a target device 9 is connected over the Ethernet, other users in a subnet can access that target device 9 through the O/S Validator 1 available target devices window. This helps to create stress on the target device 9. In the event that the user wishes to isolate a problem, extra stress should not be applied. In such situations, the user should have exclusive access to the target device 9. The last Test Run Option is Set Target Time, which prompts the Engine 3 to send the time from the host computer 4 to the target device 9, thereby synchronizing the target device 9 system time to the host computer 4 time. Synchronization is advantageous as the log files return with a date and time stamp related to the date and time on the target device 9. In order to keep these accurate, the user should set the target device 9 date and time. The last tab before running a test is the Environment Settings tab, which contains valuable information for the various selectable environment variables. These environment settings are designed to extend and abstract the test suite files 5 by allowing the test suite files to contain environment variables instead of hard-coded information. For example, the Serial Tests take an available com-port as a parameter. If the com-port is not entered as an environment variable, the test fails, because it is unable to open a com-port. All the environment variables used in the suites are provided; however, any additional environment variables may be user-added to the user-input suites. After running a test, a Test Status is available for obtaining status information for the current test run. The information is dynamically updated.
A Suites Run section window lists the selected suites that have started. The user may open any log file from this window by selecting a desired test icon. The other icons in this control provide failure information. The failure icons are shown, by example, as a stylized beaker crossed out with a red "X." A Test Run Summary Information section keeps track of the number of test suite files run, suites selected, suites with a failure, and percentage of suites with a failure. On completion of a test run, the user may select a Configure Failed Suites tab, prompting the appearance of a new configuration window selecting all the failed suites in the current test run, facilitating regression testing.
The remaining two sections are called Test Details. One of these Test Details section monitors the individual test cases that pass, as well as fail, which section is valuable for gauging the value of a test run. The remaining Test Details section is the Failed Suites section where all selected suites with a fail are listed by suite name, showing the number of corresponding passing and failing test cases. All this information gives the user a very good idea of the limits of his target device 9 during a test run (i.e. what passes, and more importantly what fails).
The primary object of the present invention is to properly test a port of an O/S 1001a, such as Windows CE. To accomplish this task, hundreds of suite tests 11 are needed and are provided by the O/S Validator 1. By example, nearly 1500 test suites 11, as grouped by the verified O/S subsystem, are provided. As indicated in Figure 10.0, the O/S Validator 1 includes both source codes and executable codes for all tests 11, covering the major O/S 1001a subsystems and the common adaptation drivers, with special emphasis on subsystems historically exhibiting the most problems. O/S subsystem components tested by the O/S Validator 1 include: Ethernet/NDIS, PCMCIA, a memory, a file system, a serial port, a video system having a plurality of application program interfaces, an infrared system, an original equipment manufacturer adaptation layer, a touch panel, a mouse, a keyboard, and an audio/wave system. Figures 15A, 15B, and 15C, together, show a table listing testing details of these system components.
As discussed above, and shown in Figure 3, Engine 3 is linked to the GUI 2 via a component termed HarnessLink.dll 7, which is an ActiveX control. Specifically, HarnessLink.dll 7 provides a set of functions on the GUI 2 for calling (i.e. affecting) the Engine 3. A majority of HarnessLink.dll 7 function-calls set up some parameters for the Engine 3's command line. All the initial information is passed to the Engine 3 through this command line, ensuring readiness of the Engine 3 for specified execution. A plurality of GUI-related functions provide information on the command line. The information on the command line corresponds to the GUI 2 information. The other major function that HarnessLink.dll 7 serves is to communicate activity during the running of the Engine 3 by opening a named pipe. If an error or a need arises to transmit some memory information, the Engine 3 communicates through the named pipe. A named pipe ensures that the communication between the Engine 3 and a specific HarnessLink is direct, accurate, and without any duplication problems if a plurality of Engines 3 are running. When the HarnessLink.dll 7 receives a message from the pipe, it signals the appropriate VB event which, in turn, causes the GUI 2 to take the information and process it accordingly. As described with relation to Figure 5.0, Engine 3 communicates with one HarnessLink.dll 7, which, in turn, communicates with one CEHarness 8. The Engine 3 execution is simple: a command line is received and processed, establishing the execution socket connection to the target device, opening the pipe for communication with the GUI 2, reading the test suite files 5, and subsequently executing the tests in three phases: PreExecution, Execution, and PostExecution. The PreExecution stage establishes the error socket connection between the target device 9 and the host 4. Relevant data, such as logging paths and styles, various test run information, and stress scenarios, are sent during the PreExecution stage. The Execution stage involves a response to each sequential suite command. A suite command is generally sent by the host 4 and is processed by the CEHarness 8 which, in turn, responds with a socket message when execution of the command is completed. The PostExecution stage primarily involves reducing the log information and generating the summary logs. Upon completion of these summary logs, the Engine 3 exits.
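By example, the pipe-reading side may be sketched in C++ as follows. This is an illustrative sketch only: the pipe name, message format, and buffer sizes are assumptions, and the real HarnessLink.dll raises a VB event rather than printing. It shows the named-pipe pattern described above, in which the Engine's run-time messages arrive over a pipe dedicated to one Engine/HarnessLink pair.

    #include <windows.h>
    #include <stdio.h>

    // Hypothetical pipe name; a dedicated pipe per Engine instance avoids
    // duplication problems when a plurality of Engines 3 are running.
    void ListenToEngine(void)
    {
        HANDLE pipe = CreateNamedPipeA("\\\\.\\pipe\\OSValidatorEngine",
                                       PIPE_ACCESS_INBOUND,
                                       PIPE_TYPE_MESSAGE | PIPE_READMODE_MESSAGE | PIPE_WAIT,
                                       1, 4096, 4096, 0, NULL);
        if (pipe == INVALID_HANDLE_VALUE)
            return;
        if (ConnectNamedPipe(pipe, NULL)) {            // wait for the Engine to attach
            char  msg[4096];
            DWORD bytesRead;
            while (ReadFile(pipe, msg, sizeof(msg) - 1, &bytesRead, NULL)) {
                msg[bytesRead] = '\0';
                printf("engine: %s\n", msg);           // real code: raise the VB event
            }
        }
        CloseHandle(pipe);
    }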

CEHarness 8 in test target device 9 is a significantly more complicated component than the Engine 3. This complexity arises because each device 9 has one instance of CEHarness 8 at any given time; however, that device can handle a plurality of simultaneous connections, a vitally important feature of the testing methodology provided by the O/S Validator 1. When the user starts CEHarness 8, it creates two threads that endure through the entire execution time: a broadcast thread and an execution thread. The broadcast thread updates information such as the device IP, connection type, and available com-ports every ten seconds, sending a broadcast message at the same rate to the network. If device 9 is connected to the host computer 4 via Windows CE Services (i.e. on the NT side) and to either remnet or repllog (i.e. on the device side), the connection type will change to PPP_PEER. If this occurs, the broadcast message is sent only to the host 4 to which the target device 9 is directly connected. If the user changes the connection at some point during execution, the message is updated. Meanwhile, the execution thread waits for a connection attempt from an Engine 3. When the execution thread receives the connection, it spawns another thread, a main execution thread, that performs the various functions required. The main execution thread starts another socket for sending any error or memory information. Thus, the execution thread is event-driven, receiving a command and responding appropriately. Each connection attempt spawns its own execution thread; therefore, a single CEHarness 8 may have many active connections, extending the test functionality by running a plurality of configurations simultaneously on one target device 9, and thereby creating a more realistic stress situation.
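By example, this connection-handling pattern may be sketched in C++ as follows. This is an illustrative sketch only: the CEHarness source is not reproduced in the specification, Winsock startup and the listening socket's setup are omitted, and the function names are hypothetical. It shows how one execution thread can spawn a dedicated thread per Engine connection so that a single CEHarness services several simultaneous configurations.

    #include <winsock2.h>
    #include <windows.h>

    static DWORD WINAPI ConnectionThread(LPVOID param)  // one per Engine connection
    {
        SOCKET engine = (SOCKET)(ULONG_PTR)param;
        // ... event-driven loop: receive a suite command, execute it on the
        // device, reply with a socket message when the command completes ...
        closesocket(engine);
        return 0;
    }

    void ExecutionThread(SOCKET listener)  // listener already bound and listening;
    {                                      // WSAStartup() and socket setup omitted
        for (;;) {
            SOCKET engine = accept(listener, NULL, NULL);
            if (engine == INVALID_SOCKET)
                break;
            // Each connection attempt spawns its own execution thread, so a single
            // CEHarness can run a plurality of configurations simultaneously.
            HANDLE h = CreateThread(NULL, 0, ConnectionThread,
                                    (LPVOID)(ULONG_PTR)engine, 0, NULL);
            if (h != NULL)
                CloseHandle(h);
        }
    }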
Figure 16 is a flowchart illustrating the present invention protocol acknowledgment system 4000, comprising a message sender 2000 and a message receiver 3000. The message sender has a receive thread 2001 and a send thread 2009 which may parallel-process information. The send thread 2009 generates a message at block 2008, placing a header around the message to form a packet where a WIN32 event handle is in a field of the header, as shown in block 2007. Such packet is then sent, as per block 2006, either to block 2005 for blocking the send thread 2009 or to block 3001 for receiving the packet by the message receiver 3000, which would, in turn, either send the packet having the WIN32 event handle in a field of the header, as shown by block 3002, to the receive thread 2001 via receiving block 2002 or continue processing in accordance with message receiver 3000 functions. The receive thread 2001, having received the acknowledgment packet, then uses the WIN32 event handle in the header of the acknowledgment packet for unblocking the send thread at block 2003, and then either resumes the send thread 2009 at block 2004 or continues processing of receive thread 2001 functions.
Thus, the Logging library 12, shown in Figure 3.0, is a complex tool integrated into the O/S Validator 1 in various ways. Primarily, the Logging library 12 is integrated into the source files for the tests. The tests make calls to the library APIs, whereupon the library handles all the details regarding capture and recordation of test results. The Logging library 12 also supports a variety of communication options. The recommended option is through the TCP, allowing the user to see the readout of the log files as they migrate from the TCP connection. Another communication option is direct logging to files on a device. This can be advantageous, for example, if the user wants to log to a PCMCIA card but does not want extra information to be broadcast over the network. Although the Logging library 12 acts as the device-side component to the TCP communications, a host 4 component acts as the GUI 2-side component to the communications. This aspect of the logging library 12 provides a log window on the host 4 and color coding for the test results. By example, failure messages are displayed as a red line. The name and location of the source code file, as well as the line number where the message was generated, is included in the logging library 12 messages. Also, very importantly, a detailed error message is provided describing current events in the program. Each log window has a summary tab which facilitates user-access to program memory, passes, failures, and timing information. Another important feature of the log files is that they capture a large volume of information at the beginning and at the end of a test. This information provides a snapshot of the Memory, System, Power, and other valuable information.
Information as herein shown and described in detail is fully capable of attaining the above-described object of the invention, the presently preferred embodiment of the present invention, and is, thus, representative of the subject matter which is broadly contemplated by the present invention. The scope of the present invention fully encompasses other embodiments which may become obvious to those skilled in the art, and is to be limited, accordingly, by nothing other than the appended claims, wherein reference to an element in the singular is not intended to mean "one and only one" unless explicitly so stated, but rather "one or more." All structural and functional equivalents to the elements of the above-described preferred embodiment and additional embodiments that are known to those of ordinary skill in the art are expressly incorporated herein by reference and are intended to be encompassed by the present claims. Moreover, it is not necessary for a device or method to address each and every problem sought to be solved by the present invention, for it to be encompassed by the present claims. Furthermore, no element, component, or method step in the present disclosure is intended to be dedicated to the public regardless of whether the element, component, or method step is explicitly recited in the claims.

We Claim:
1. A computer-based system for testing and validating an embedded operating system within a target device, comprising:
a. a host computer;
b. a target device provided with an operating system; and
c. an operating system validator for testing and validating operating system, said validator
being provided in said host computer,
wherein said validator comprises a graphical user interface means for interfacing with a user, an engine means for communicating with said target device and responding to commands from said graphical user interface, a plurality of test suites comprising at least one test for testing and validating at least one component of said operating system, test source codes and executable programs for testing and validating operating system, and a protocol acknowledgment means conducive to use with said target device,
wherein said protocol acknowledgment means utilizes an operating system-generated event handle as a member field of a protocol for releasing an execution thread which is waiting for an acknowledgment message from said target device, and
wherein said event handle is placed in a header portion of a message packet and is sent back in said acknowledgment message, and wherein a receiving thread unblocks any send threads of execution which are waiting for said event handle in said acknowledgment message; and

d. a logging library means for manipulating and storing test related information as generated by said operating system testing and validating programs.
2. The computer-based system, as claimed in Claim 1, wherein said operating system validator further comprises a set of functions on the graphical user interface for calling the engine means.
3. The computer-based system, as claimed in Claim 1, wherein said target device further comprises an event-driven application for communicating with said engine means, and wherein said engine means sends the events and said event driven application responds.
4. The computer-based system, as claimed in Claim 1, wherein said operating system in said target device comprises a Windows CE operating system, and said event handle comprises a WIN32 event handle.
5. The computer-based system, as claimed in Claim 1, wherein said test suites comprise at least one system stress-testing routine; and at least one feature-and-function test.
6. The computer-based system, as claimed in Claim 5, wherein said system stress-testing routine comprises a code base for stress testing said at least one component of said operating system,
said at least one component of said operating system being selected from a group of operating system components comprising an Ethernet/NDIS, PCMCIA, a memory, a file system, a serial port, a video system having a plurality of application program interfaces, an infrared system, an original equipment manufacturer adaptation layer, a touch panel, a mouse, a keyboard, and an audio/wave system,
said test identifying at least three defects, namely, hardware design, hardware programming, and operating system interaction, and being executed in automatic or manual mode.

7. The computer-based system, as claimed in Claim 1, wherein said target device comprises a stand-alone unit provided with said operating system validator for conducting validation and stress testing independent of said host computer.
8. The computer-based system, as claimed in Claim 1, further comprising an Ethernet connection coupled to said host and to said target device for conducting testing and validation tasks.
9. The computer-based system, as claimed in Claim 1, wherein said logging library means comprises at least one pass test result file using a WRITETESTPASS application program interface.

10. The computer-based system, as claimed in Claim 9, wherein said at least one pass test file resides in said target device.
11. The computer-based system, as claimed in Claim 1, wherein said logging library means comprises at least one fail test result file using a WRITETESTFAIL application program interface.
12. The computer-based system, as claimed in Claim 11, wherein said at least one fail test file resides in said target device.
13. A computer-based method for testing and validating an embedded operating system within a target device, comprising the steps of:

(a) providing a host computer;
(b) providing a target device having an operating system;


(c) providing an operating system validator, said validator being provided in said host
computer,
wherein said validator comprises a graphical user interface means for interfacing with a user, an engine means for communicating with said target device and responding to commands from said graphical user interface, a plurality of test suites comprising at least one test for testing and validating at least one component of said operating system, test source codes and executable programs for testing and validating operating system, and a protocol acknowledgment means conducive to use with said target device,
wherein said protocol acknowledgment means utilizes an operating system-generated event handle as a member field of a protocol for releasing an execution thread which is waiting for an acknowledgment message from said target device, and
wherein said event handle is placed in a header portion of a message packet and is sent back in said acknowledgment message, and wherein a receiving thread unblocks any send threads of execution which are waiting for said event handle in said acknowledgment message; and
(d) providing a logging library means for manipulating and storing test related information as generated by said operating system testing and validating programs;
(e) executing said operating system testing and validating programs on said target device and testing and validating said operating system wherein said executing step comprises unblocking a send thread of execution by responding to said event handle in said acknowledgment message; and
(f) generating pass and fail test results.
14. A computer-based method for testing and validating an embedded operating system, as claimed in claim 13 comprising the step of providing said operating system testing and validating program with a set of functions on the graphical user interface for calling the engine means.
15. A computer-based method for testing and validating an embedded operating system, as claimed in claim 13 comprising the step of providing said target device with an event-driven application for communicating with said engine means, wherein said engine means sends the events and said event driven application responds.
16. A computer-based method for testing and validating an embedded operating system, as claimed in claim 13, wherein said operating system in said target device comprises a Windows CE operating system, and said event handle comprises a WIN32 event handle.
17. A computer-based method for testing and validating an embedded operating system, as claimed in claim 13 comprising the step of providing test suites in the form of system stress-testing routine comprising a code base for stress testing said at least one component of said operating system, said at least one component of said operating system being selected from a group of operating system components comprising an Ethernet/NDIS, PCMCIA, a memory, a file system, a serial port, a video system having a plurality of application program interfaces, an infrared system, an original equipment manufacturer adaptation layer, a touch panel, a mouse, a keyboard, and an audio/wave system, said test identifying at least three defects, namely, hardware design, hardware programming, and operating system interaction, and being executed in automatic or manual mode.
18. A computer-based method for testing and validating an embedded operating system, as claimed in claim 13 further comprising the step of providing an Ethernet connection coupled to said host and to said target device for conducting testing and validation tasks on said target device.

19. A computer-based method for testing and validating an embedded operating system, as claimed in claim 13 further comprising the step of: providing said logging library means with at least one pass test result file using a WRITETESTPASS application program interface.
20. A computer-based method for testing and validating an embedded operating system, as claimed in claim 13 further comprising the step of providing said logging library means with at least one fail test result file using a WRITETESTFAIL application program interface.
Dated this 26th day of November, 2001.

HIRAL CHANDRAKANT JOSHI AGENT FOR BSQUARE CORPORATION.

Patent Number 204066
Indian Patent Application Number IN/PCT/2001/01481/MUM
PG Journal Number 21/2007
Publication Date 25-May-2007
Grant Date 15-Dec-2006
Date of Filing 26-Nov-2001
Name of Patentee BSQUARE CORPORATION
Applicant Address 3150 139TH AVENUE S.E., SUITE 500, BELLEVUE, WA 98005-4081, U.S.A.
Inventors:
# Inventor's Name Inventor's Address
1 GREGORY, PETER R. 600 84TH AVENUE NE, MEDINA, WA 98039, U.S.A.
2 WALTERS, JAMES, FLOYD 1101 SENECA STREET #404, SEATTLE, WA 98101, U.S.A.
3 SAMPLE, IAN 5256 UNIVERSITY WAY NE #101, SEATTLE, WA 98105, U.S.A.
4 LUCAS, SHAWN, MICHAEL 7239 NE 171ST LANE, KENMORE, WA 98028, U.S.A.
5 DING, JIE, H. 27523 NE 31ST COURT, REDMOND, WA 98053, U.S.A.
6 BOYCE, DAVID, MATTHEW 25331 62ND AVENUE S. # T208, KENT, WA 98032, U.S.A.
PCT International Classification Number G06F11/263
PCT International Application Number PCT/US00/15342
PCT International Filing date 2000-05-31
PCT Conventions:
# PCT Application Number Date of Convention Priority Country
1 60/137,629 1999-06-04 U.S.A.