Title of Invention

METHOD AND SYSTEM FOR PROCESSING DATA FROM AN INPUT DEVICE

Abstract Disclosed is a system, method, and program for managing input events to a computer. Indication of a selection of a location on a display object displayed on a computer monitor and an input event are received. A determination is made of a segment of the display object including the selected location. The display object includes at least two segments such that each segment is capable of mapping to a different input action. A determination is made of one input action for the determined segment. An application program performs a command corresponding to the determined input action.
Full Text METHOD, SYSTEM, AND PROGRAM FOR PROCESSING DATA FROM AN INPUT DEVICE
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to a method and system for processing data from an input device.
2. Description of the Related Art
In prior art Windows-type operating systems, a user can use a mouse input device to select a displayed window. Current input devices that allow a user to manipulate a graphical pointer in the display area, such as mice and pen styluses, include one or more user selectable input buttons. A user selects one of the buttons while the pointer controlled by the input device is displayed over a displayed graphical window of an application program. Selection of one of the input device buttons causes a message to be sent to the application program displaying the active window indicating the user action. An application program may associate different input device commands, i.e., buttons on the input device, with different control options. For instance, in a word processing application program, selection of the left mouse button may position a displayed cursor at a pointed-to position and selection of the right mouse button may display a menu of word processing operations. In this way selection of different buttons on a mouse or pen stylus input device can cause different actions to occur in the application program. In such prior art systems, the same operation is performed regardless of the location of the pointer in the application window when the input event occurs.
Certain prior art input devices do not include input buttons that can cause different levels of control directly from the input device. For instance, a one-button mouse can only
trigger a single action from an application program. Further, a pen stylus without buttons or user manual selection on a touch screen display can only cause a single operation to occur with respect to the application program, i.e., selecting a word, field, or other position in the displayed window of the application program.
There is thus a need in the art for an improved method, system, and program for providing multiple levels of control using an input device that exceeds the available selectable options on the input device.
SUMMARY OF THE PREFERRED EMBODIMENTS
To overcome the limitations in the prior art described above, preferred embodiments disclose a system, method, and program for managing input events to a computer. Indication of a selection of a location on a display object displayed on a computer monitor and an input event are received. A determination is made of a segment of the display object including the selected location. The display object includes at least two segments such that each segment is capable of mapping to a different input action. A determination is made of one input action for the determined segment. An application program performs a command corresponding to the determined input action.
In further embodiments, the selected location is within a display region of an application window controlled by an executing application program.
Still further, determining the segment in the display object may comprise determining segments of the application window and determining the segment in the application window including the selected location.
Preferred embodiments provide a mechanism to allow an input device, such as a mouse pointer or touch sensitive display screen, to simulate different input actions from an input device including multiple buttons, such as a multi-button mouse. This is accomplished in preferred embodiments by segmenting the display region of a display
object and assigning an input action to each segment. Performing the same input event with respect to different segments of the display object results in different input actions, depending on the segment in which the input event occurred. In this way, different input actions are performed by selecting different segment regions of the display object.
Accordingly, the present invention provides a method for processing data from an input device, for managing input events to a computer, comprising:
receiving indication of a selection of a location on a display object displayed on a computer monitor and an input event;
determining a segment of the display object including the selected location, wherein the display object includes at least two segments, and wherein each segment is capable of mapping to a different input action; and
determining one input action for the determined segment, wherein an application program command corresponds to the determined input action.
The present invention provides for a system for processing data from an input device, for managing input events, comprising:
a computer;
an input device in communication with the computer;
a display monitor in communication with the computer;
means for receiving indication of a selection of a location on a display object displayed on the computer monitor and an input event generated by the input device;
means for determining a segment of the display object including the selected location, wherein the display object includes at least two segments, and wherein each segment is capable of mapping to a different input action; and
means for determining one input action for the determined segment, wherein an application program command corresponds to the determined input action.
The present invention also provides for an article of manufacture for processing data from an input device, for use in managing input events to a
computer, the article of manufacture comprising computer usable media accessible to a computer, wherein the computer usable media includes at least one computer program that is capable of causing the computer to perform:
receiving indication of a selection of a location on a display object displayed on a computer monitor and an input event;
determining a segment of the display object including the selected location, wherein the display object includes at least two segments, and wherein each segment is capable of mapping to a different input action; and
determining one input action for the determined segment, wherein an application program command corresponds to the determined input action.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
Referring now to the drawings in which like reference numbers represent corresponding parts throughout:
FIG. 1 illustrates a computing environment in which preferred embodiments in accordance with the present invention are implemented;
FIG. 2 illustrates an example of how application windows may be displayed on a display screen in a manner known in the art;
FIG. 3 illustrates the segmentation of an application window in accordance with preferred embodiments of the present invention;
FIG. 4 illustrates logic to determine an input action corresponding to an input in accordance with preferred embodiments of the present invention; and
FIG. 5 illustrates the segmentation of a display object in accordance with preferred embodiments of the present invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
In the following description, reference is made to the accompanying drawings which form a part hereof, and which illustrate several embodiments of the present invention. It is understood that other embodiments may be utilized and structural and operational changes may be made without departing from the scope of the present invention.
FIG. 1 illustrates a computing environment in which preferred embodiments are implemented. A computer 2 includes an input driver 4, windows manager 6, and executing application programs 8a, b, c. An input device 10, such as a mouse, pen stylus, touch sensitive display screen, etc., translates user movement into an input command that is translated by the input driver 4 into computer readable data indicating the coordinates of the movement with respect to the display area. The windows manager 6 receives this input information and determines the application 8a, b, c that is the target of the user input. A display monitor 12 displays information generated from the computer 2.
The computer 2 may be comprised of any computing device operating under microprocessor control known in the art, including a personal computer, workstation, client, server, laptop, hand-held device, etc. The windows manager 6 is the component of an operating system that manages, generates, and displays graphical user interface (GUI) windows for the different executing application programs 8a, b, c. The windows manager 6 may be implemented in operating systems known in the art, such as Microsoft Windows 95, 98, NT, and CE; OS/2; Red Hat Linux; MAC OS; etc.** The display monitor 12 may be any computer display device known in the art, such as an LCD screen, CRT, touch sensitive display screen, etc. The application programs 8a, b, c may comprise any application program known in the art capable of generating a separate GUI window in the display monitor 12.
FIG. 2 illustrates how the display 12 may display GUI windows for the applications 8a, b, c in a manner known in the art. The windows manager 6 would generate on the display monitor 12 three GUI windows 14a, b, c to provide a user interface to each application 8a, b, c, respectively. In the arrangement of the GUI windows 14a, b, c in FIG. 2, window 14a is the active window which the user is currently accessing. In such case, the windows 14b, c are inactive and would generally be displayed behind active window 14a. A user may manipulate on the display 12 a graphical pointer 16 using the
input device 10. In FIG. 2, the graphical pointer 16 is currently performing operations within window 14a to control application 8a.
In preferred embodiments, the windows manager 6 includes the capability to define each displayed GUI window 14a, b, c into multiple selection segments. FIG. 3 illustrates an application GUI window 20 having a control menu 22 including menu items and icons which the user may select to execute a specific command in the application. The window 20 further includes an application area 24 which the windows manager 6 defines into four conceptual segments A, B, C, D. The user may directly manipulate the application program in the application area 24 using the graphical pointer 16, i.e., change text in a document, change values in a spreadsheet or database, manipulate displayed elements indicating program operation, etc.
The windows manager 6 defines an area for each segment to include all Cartesian (X, Y) coordinates within a segment area. The action to perform in response to an input event caused by user manipulation of the input device 10 depends on which segment A, B, C, D the graphical pointer 16 is positioned when the user selects an input option, e.g., depressing a button on a mouse or pen stylus.
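The segment determination described above can be sketched as a simple point-in-region test. The following is an illustrative sketch only, not from the patent text; the window bounds, function name, and quadrant labels A through D are assumptions for illustration.

```python
# Hypothetical sketch: hit-testing a pointer coordinate against four
# equal quadrant segments A, B, C, D of a window's application area.

def segment_for_point(x, y, win_x, win_y, win_w, win_h):
    """Return the quadrant segment ('A'..'D') containing (x, y),
    or None if the point lies outside the window area."""
    if not (win_x <= x < win_x + win_w and win_y <= y < win_y + win_h):
        return None
    right = x >= win_x + win_w / 2   # point is in the right half
    lower = y >= win_y + win_h / 2   # point is in the lower half
    # A = upper-left, B = upper-right, C = lower-left, D = lower-right
    return {(False, False): "A", (True, False): "B",
            (False, True): "C", (True, True): "D"}[(right, lower)]
```

Here every Cartesian (X, Y) coordinate inside the window falls in exactly one segment, matching the description of segment areas above.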
For instance, if a user, by manipulating the input device 10, positions the graphical pointer 16 in segment A and makes a selection, by depressing a button on the input device 10 or applying a certain degree of pressure, a first operation may be performed. Positioning the graphical pointer 16 in the other segments B, C, and D and making the selection would likewise cause different operations to be performed. In this way, each segment A, B, C, D may be associated with a different operation when the user performs the same action in the segment. The different actions performed may comprise displaying different menu options in a pop-up menu, performing different application operations, or performing a window management operation (e.g., resizing the window, etc.). In this
way, the application area 24 may be divided into areas that produce different operational results for the same input event with respect to the input device.
In preferred embodiments, the windows manager 6 maintains a mapping of segments (e.g., A, B, C, D) to different control buttons on a multi-button input device, such as a multi-button mouse or pen stylus. For instance, if the input device 10 has only one button, then the mapping may map segments to multiple input device buttons. In such case, segment A may map to clicking a right mouse button, segment B to clicking a left mouse button, segment C to double-clicking the right mouse button, and segment D to double-clicking the left mouse button. In this way, clicking the button on the input device 10 in a segment simulates a particular action with an input device 10 that includes multiple buttons to select. A user control panel (not shown) may be provided to allow the user to specify the number of segments in a window and the input control operations to which the segments map.
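The segment-to-button mapping just described can be modeled as a lookup table. This is a minimal sketch under the example mapping given above; the action names and the default fallback are assumptions, not part of the patent.

```python
# Hypothetical mapping table: a single-button selection in each segment
# simulates a different multi-button mouse action (per the example above).

SEGMENT_ACTIONS = {
    "A": "right_click",
    "B": "left_click",
    "C": "right_double_click",
    "D": "left_double_click",
}

def simulate_action(segment):
    """Map a selected segment to the multi-button action it simulates.
    Falls back to a plain left click for an unmapped segment (assumption)."""
    return SEGMENT_ACTIONS.get(segment, "left_click")
```

A user control panel, as mentioned above, would amount to letting the user edit this table and the number of segments.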
FIG. 4 illustrates logic implemented in the windows manager 6 to process input information received from the input device 10 when the application area 24 is segmented into multiple display regions associated with different input actions. Control begins at block 50 with the windows manager 6 receiving from the input driver 4 the x-y coordinates of a displayed location of the graphical pointer 16 when the user selected the button, otherwise known as the "hot spot", in a manner known in the art. The windows manager 6 determines (at block 52) whether there is a window being displayed over a display region area that includes the x-y coordinate. If not, then the windows manager 6 handles (at block 54) the selection according to control options for selections of the computer desktop. If there is a window covering a display region including the x-y coordinate, then the windows manager 6 determines (at block 56) whether there are multiple windows displayed over the x-y coordinate. If there are multiple windows, the windows manager 6 determines (at block 58) the window having the highest z-order in a manner known in the
art. The z-order indicates how overlapping windows are displayed with respect to each other, i.e., how windows are displayed over each other.
After determining the window overlapping the x-y location, having the highest z-order if there are multiple windows, the windows manager 6 determines (at block 60) whether the window is active. If not, then the windows manager 6 makes (at block 62) that window the active window. Otherwise, if the window is active, the windows manager 6 determines (at block 64) the segment areas within the active window. If the segments comprise quadrants or some other ratio of a display region, then the windows manager 6 would apply such segment ratios to the entire active window to divide the active application window 14a, b, c into the segments according to the ratio and segment arrangement maintained by the windows manager 6. The windows manager 6 then determines (at block 66) the segment including the x-y location where the graphic pointer 16 was positioned when the button on the input device 10 was clicked. The windows manager 6 further determines (at block 68) the input action to which the determined segment is mapped. A message is then sent (at block 70) to the application 8a, b, c having the active window. The message includes the x-y location and the determined input action.
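The FIG. 4 flow above can be sketched end to end in code. This is a hedged illustration, assuming a simple window record with bounds, z-order, and an active flag; the helper names, data layout, and return format are assumptions for illustration only.

```python
# Sketch of the FIG. 4 dispatch logic (blocks 52-70): resolve a click at
# (x, y) to the frontmost window under the point and its mapped action.
from dataclasses import dataclass

@dataclass
class Window:
    x: int; y: int; w: int; h: int
    z_order: int          # higher value = displayed in front
    active: bool = False

def contains(win, x, y):
    return win.x <= x < win.x + win.w and win.y <= y < win.y + win.h

def dispatch(windows, x, y, segment_of, action_for):
    """Return (window, input_action) for a click, activating the window
    if needed, or None when the click falls on the desktop."""
    hits = [w for w in windows if contains(w, x, y)]
    if not hits:
        return None                                  # block 54: desktop handling
    target = max(hits, key=lambda w: w.z_order)      # block 58: highest z-order
    if not target.active:                            # blocks 60-62: activate
        for w in windows:
            w.active = False
        target.active = True
    segment = segment_of(target, x, y)               # blocks 64-66: find segment
    return target, action_for(segment)               # block 68: mapped action
```

The message sent at block 70 would then carry the x-y location together with the action returned by `action_for`.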
In response to the message, the application 8a, b, c would perform an operation corresponding to the input action indicated in the message. Each application 8a, b, c may include a unique mapping of input actions, e.g., right button select, left button double click, to application commands. In this way, the application 8a, b, c believes it is receiving input from an input device including multiple buttons and selection options when in fact the input device 10 may include only one button or, in the case of a touch sensitive screen, no buttons.
In touch screen embodiments, the user depressing the touch sensitive screen in a segment of the application window 24 would cause the execution of the operation
mapped to the selected segment. If the touch screen is sensitive to the degree of touch, then a light touch may select an element within the application window 24 and a relatively heavier touch would cause the execution of the operation associated with the segment where the touch was made. For touch screen embodiments, step 50 would comprise receiving x-y coordinates of the location on the touch screen that the user depressed and step 64 would comprise determining the segment of the screen that the user depressed.
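The degree-of-touch behavior above reduces to a threshold test. The sketch below is an assumption-laden illustration: the normalized pressure scale, the cutoff value, and the `"select"` result name are all hypothetical.

```python
# Hypothetical sketch: a light touch selects an element in the window,
# while a relatively heavier touch triggers the segment's mapped operation.

PRESSURE_THRESHOLD = 0.5   # assumed cutoff on a normalized 0.0-1.0 scale

def touch_action(pressure, segment_action):
    """Return 'select' for a light touch, or the segment's mapped
    operation for a heavier touch."""
    if pressure < PRESSURE_THRESHOLD:
        return "select"
    return segment_action
```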
Preferred embodiments allow a single input event generated by an input device to map to different input actions generated by a different type of input device based on a selected location when the input event occurred. In this way, the segments may be used to map actions from an input device with no buttons, such as a touch screen, to an action from an input device having numerous buttons, such as a multi-button mouse. Preferred embodiments allow the display area of a display object, such as a window, to be segmented so that a user may invoke different commands with only a single selection of the input device, e.g., depressing the touch sensitive screen or depressing a button on the input device.
Alternative Embodiments and Conclusions
This concludes the description of the preferred embodiments of the invention. The following describes some alternative embodiments for accomplishing the present invention.
The preferred embodiments may be implemented as a method, apparatus or article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof. The term "article of manufacture" (or alternatively, "computer program product") as used herein is intended to encompass one or more computer programs and/or data files accessible from one or more computer-readable devices, carriers, or media, such as magnetic storage media, "floppy disk," CD-ROM, optical disks, holographic units, volatile or non-volatile electronic memory, etc. Further, the article of manufacture may comprise the implementation of the preferred embodiments in a transmission media, such as a network transmission line, wireless transmission media, signals propagating through space, radio waves, infrared signals, etc. Of course, those skilled in the art will recognize many modifications may be made to this configuration without departing from the scope of the present invention.
The preferred embodiments may apply to any mapping of input events generated from an input device to input device actions. For instance, one or more buttons on the input device could be mapped to different input actions that are capable of being interpreted by the application.
In alternative application level embodiments, each application program may maintain the mapping of segments to input actions. In such embodiments, the windows manager 6 would send a message to the application program 8a, b, c indicating the x-y coordinates of the pointer and the button selected on the input device 10, which may have just one button or, in the case of a touch sensitive screen, no buttons. The application program 8a, b, c would then determine the segment including the x-y location, i.e., hot
spot, and map the actual selected input event to another input action, which may be an input action on an input device having an arrangement of input mechanisms different than the arrangement on the actual input device.
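The application-level remapping described above can be sketched as a small translation table kept by the application rather than by the windows manager. The event names and the fallback-to-raw-button behavior below are illustrative assumptions.

```python
# Hypothetical sketch of the application-level embodiment: the application
# maps (actual button, hot-spot segment) to a simulated input action on a
# richer, imaginary input device.

APP_EVENT_MAP = {
    ("button1", "A"): "right_click",
    ("button1", "B"): "left_double_click",
}

def remap_event(raw_button, segment):
    """Translate the actual input event plus segment into the simulated
    input action; pass the raw event through if no mapping exists."""
    return APP_EVENT_MAP.get((raw_button, segment), raw_button)
```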
Preferred embodiments were described with respect to segmenting a window into different sections corresponding to different input actions. In further embodiments, display objects other than windows may be segmented to produce different input actions when different segments of the display object are selected. Such display objects that may be segmented to different corresponding input actions include graphical icons, the windows desktop, HTML web pages, application GUIs, or any other displayed image, object or GUI known in the art.
FIG. 5 illustrates the display of a window 200 including a display object 202, such as a menu, icon, etc. The display object is segmented into two parts, 202a and 202b. Selection of one segment 202a will cause an action or the display of options that is different from the action or display of options displayed upon the selection of the segment 202b. In this way, any display object can be segmented to provide different control features depending on the segment of the display object that was selected. For instance, selection of different segments or areas of a displayed menu may trigger different menu operations.
In summary, the present invention provides a system, method, and program for managing input events to a computer. Indication of a selection of a location on a display object displayed on a computer monitor and an input event are received. A determination is made of a segment of the display object including the selected location. The display object includes at least two segments such that each segment is capable of mapping to a different input action. A determination is made of one input action for the determined segment. An application program performs a command corresponding to the determined input action.
The foregoing description of the preferred embodiments of the invention has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. It is intended that the scope of the invention be limited not by this detailed description, but rather by the claims appended hereto. The above specification, examples and data provide a complete description of the manufacture and use of the composition of the invention. Since many embodiments of the invention can be made without departing from the spirit and scope of the invention, the invention resides in the claims hereinafter appended.
**WINDOWS, MICROSOFT, and WINDOWS NT are registered trademarks of Microsoft Corporation; OS/2 is a registered trademark of International Business Machines Corporation; MAC is a registered trademark of Apple Computer, Inc.; Red Hat is a trademark of Red Hat, Inc.



We claim:
1. A method for processing data from an input device, for managing
input events to a computer, comprising:
receiving indication of a selection of a location on a display object displayed on a computer monitor and an input event;
determining a segment of the display object including the selected location, wherein the display object has at least two segments, and wherein each segment is capable of mapping to a different input action; and
determining one input action for the determined segment, wherein an application program command corresponds to the determined input action.
2. The method as claimed in claim 1, wherein the display object is a
member of the set of display objects comprising a window, icon, and
displayed desktop.
3. The method as claimed in claim 2, wherein determining the segment
of the display object comprises:
determining segments of the application window; and
determining the segment in the application window including the
selected location.
4. The method as claimed in claim 3, wherein each segment is defined
as a particular ratio of the application window, wherein determining the
segments of the application window comprises applying the ratio of each
segment to the application window to determine areas of the application
window including the segments.
5. The method as claimed in claim 3, comprising the step of sending a
message to the application program indicating the selected location and
the determined input action, wherein the application program maps the determined input action to a command to execute.
6. The method as claimed in claim 2, wherein the selected location is
within the display region of multiple application windows controlled by
different application programs, and wherein determining the segment on
the display object including the selected location has determining the
application window having a highest z-order of the application windows
having a display region comprising the selected location, wherein the
determined application performs the command corresponding to the
determined input action.
7. The method as claimed in claim 1, wherein the input event comprises
a user depressing a touch sensitive display screen and the selected location
is the location where the user depressed the touch sensitive display screen.
8. The method as claimed in claim 1, wherein the input action for each
segment corresponds to one input action performed by a multiple button
input device.
9. The method as claimed in claim 1, wherein the input event comprises
a user depressing a button on an input device, wherein the input device
controls the movement of a pointer that is used to determine the selected
location.
10. A system for implementing a method for processing data from an
input device as claimed in any preceding claim, for managing input
events, comprising:
a computer;
an input device in communication with the computer;
a display monitor in communication with the computer,
characterized by
means (6) for receiving indication of a selection of a location on a display object displayed on the computer monitor and an input event generated by the input device;
means (6) for determining a segment of the display object including the selected location, wherein the display object includes at least two segments, and wherein each segment is capable of mapping to a different input action; and
means (6) for determining one input action for the determined segment, wherein an application program command corresponds to the determined input action.
11. A method for processing data from an input device, for managing input events to a computer substantially as herein described with reference to the accompanying drawings.


Patent Number 216821
Indian Patent Application Number 939/DEL/2000
PG Journal Number 13/2008
Publication Date 31-Mar-2008
Grant Date 19-Mar-2008
Date of Filing 17-Oct-2000
Name of Patentee INTERNATIONAL BUSINESS MACHINES CORPORATION
Applicant Address ARMONK, NEW YORK 10504, U.S.A.
Inventors:
# Inventor's Name Inventor's Address
1 KRAUSE TERRY ROBERT 11316 JOLLYVILLE RD, APT. 233, AUSTIN, TEXAS 78759, U.S.A.
2 SNIDER ROBERT L. 800 PRIZE OAKS DRIVE, CEDAR PARK, TEXAS 78613, U.S.A.
3 SULLIVAN MICHAEL JOSEPH 10008 CHINA GARDEN COVE, AUSTIN, TEXAS 78730, U.S.A.
4 WAGNER JONATHAN MARK 2926 CEDAR CREST CIRCLE, ROUND ROCK, TEXAS 78664, USA
PCT International Classification Number G06F 3/023
PCT International Application Number N/A
PCT International Filing date
PCT Conventions:
# PCT Application Number Date of Convention Priority Country
1 09/439,051 1999-11-12 U.S.A.