The process for using cursors
To use cursors, do the following:
1. Create the cursor.
2. Define the cursor in a window.
3. Perform some work while the cursor is defined. Whenever the mouse pointer is over the window, the window's defined cursor is shown.
4. Undefine the cursor from the window.
5. Free the cursor.
Before using a cursor, you must create it by sending a cursor creation message to a CgDisplay, CgFont, or CgPixmap object, depending on the type of cursor being created (font, glyph, or pixmap, respectively). The method allocates the cursor and returns a CgCursor object. Although cursors are created in different ways, all cursors behave the same once created. See the subsections that follow for more on creating cursors.
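For example, a font cursor can be created directly from a display, as sketched below. This assumes the shell method answers a shell widget (as in the later examples) and that XCCrosshair is one of the standard X cursor font constants, like the XCWatch constant used in the example at the end of this section.

| display crosshair |
display := self shell window display.
"Create a standard crosshair cursor from the X cursor font.
 The answered object is a CgCursor."
crosshair := display createFontCursor: XCCrosshair.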
A cursor can be shown in a window by sending defineCursor: to the window, specifying the cursor to use.
Cursors can be defined only in the window of a widget, not in the root window of the screen. The widget can be any widget; it is not restricted to the shell widget of the application.
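For instance, a cursor might be defined for a single widget rather than for the whole application. In this hypothetical sketch, aButton is assumed to be any realized widget and busyCursor a previously created CgCursor:

"Show busyCursor only while the pointer is over this widget's window."
aButton window defineCursor: busyCursor.
"Later, restore the widget's default cursor."
aButton window undefineCursor.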
The cursor is reset to the default cursor by sending undefineCursor to the window. When the application is finished using the cursor, it should be released from memory by sending freeCursor to the cursor. This process is illustrated by the following code. (This example and the following cursor examples assume that the shell method answers a CwTopLevelShell widget.)
| window watch |
window := self shell window.
"Create the cursor."
watch := window display
createFontCursor: XCWatch.
"Display the cursor."
window defineCursor: watch.
(Delay forSeconds: 5) wait.
"Reset the window to its default cursor."
window undefineCursor.
"Free the cursor's resources."
watch freeCursor
Last modified date: 04/18/2020