PFiddlesoft Frameworks — Features

Cocoa Frameworks for Assistive Applications

The PFiddlesoft Frameworks include the PFAssistive Framework and the PFEventTaps Framework.

The PFAssistive Framework

The PFAssistive Framework was created in 2003 as the engine driving PFiddlesoft’s highly regarded UI Browser application for developers and for users of Apple’s AppleScript GUI Scripting technology. The framework supports and enhances Apple's Accessibility API.

The Accessibility API implements the concept of a UI element, an object that represents a user interface element on the screen in any running application, such as a menu, a window, or a button, or the application itself. The Accessibility API also implements the concept of an observer, an object that registers to receive the Accessibility notifications that a UI element issues when changes occur.
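
As a point of reference, here is a minimal Swift sketch of the raw UI element object that the Accessibility API exposes and that the framework's classes encapsulate. The process identifier is a placeholder, and the attribute read succeeds only when the user has granted the calling application Accessibility access.

    import ApplicationServices

    // Raw Accessibility API: a UI element representing a running application,
    // identified by its Unix process identifier (12345 is a placeholder).
    let targetPID: pid_t = 12345
    let appElement: AXUIElement = AXUIElementCreateApplication(targetPID)

    // Read one attribute of the element: its accessibility role.
    var role: CFTypeRef?
    if AXUIElementCopyAttributeValue(appElement, kAXRoleAttribute as CFString, &role) == .success {
        print("Role:", role as? String ?? "unknown")   // e.g. "AXApplication"
    }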

The PFAssistive Framework implements these same concepts in its PFUIElement, PFApplicationUIElement, and PFObserver classes, each of which instantiates and encapsulates an associated Accessibility API object and makes its capabilities available to an assistive application using standard Objective-C or Swift techniques and the Cocoa frameworks. For example, an assistive application using the PFAssistive Framework can implement optional delegate methods declared in the framework’s PFUIElementDelegate and PFObserverDelegate formal protocols to observe Accessibility notifications. The downloadable disk image includes our Assistive Application Programming Guide for macOS and a detailed PFAssistive Framework Reference, as well as the distribution license.
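
The observer side of the raw API works through a C callback and a run-loop source; that is the machinery the PFObserver class and the delegate protocols present in Cocoa terms. A hedged sketch of the raw flow, with a placeholder process identifier and an arbitrarily chosen notification:

    import ApplicationServices

    let pid: pid_t = 12345                      // placeholder target process
    let appElement = AXUIElementCreateApplication(pid)

    // C callback invoked whenever an observed element posts a registered notification.
    let callback: AXObserverCallback = { _, element, notification, _ in
        print("Received \(notification) from \(element)")
    }

    var observer: AXObserver?
    if AXObserverCreate(pid, callback, &observer) == .success, let observer = observer {
        // Watch for new windows appearing in the target application.
        AXObserverAddNotification(observer, appElement,
                                  kAXWindowCreatedNotification as CFString, nil)
        // Deliver the notifications on the current run loop.
        CFRunLoopAddSource(CFRunLoopGetCurrent(),
                           AXObserverGetRunLoopSource(observer), .defaultMode)
    }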

With the PFAssistive Framework, an assistive application discovers UI elements, reads UI element attributes, and controls UI elements and, through them, the target application. All of this enables a user with disabilities to use the computer to perform the same tasks that any user can perform with the graphical user interface. An assistive application typically does this by performing these tasks (a sketch of the underlying calls follows the list):

  • An assistive application discovers individual UI elements in the target application in one of three ways: by locating an element on the screen, for example, the element currently under the mouse or the element that is the target application’s frontmost window; by receiving a notification from an element that something about it has just changed; or by navigating the element hierarchy from a known starting point.
  • When an assistive application discovers a UI element of potential interest, it ascertains the element’s identity and nature by reading its attributes, such as its role, its title, and its position and size on the screen.
  • Once it identifies and understands a UI element, an assistive application manipulates or controls the element and, through it, the target application, at the user's direction. There are three ways to do this, depending on the nature of the element: by setting the element's value or other attribute; by performing an action on the element; or by sending a keystroke to the target application while the element has keyboard focus (or by sending a key combination that the application recognizes as a keyboard shortcut).

By performing these tasks—exploring, monitoring, and controlling a UI element—an assistive application enables a user with disabilities to do everything that a user without disabilities can do with the target application.
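
To make those three steps concrete, here is a hedged sketch of the same sequence expressed with the raw Accessibility API calls that the framework's classes encapsulate. The screen coordinates are placeholders, and a real assistive application would check errors and element capabilities before acting.

    import ApplicationServices

    // Discover: ask the system-wide element what lies under a screen point
    // (the coordinates are placeholders, e.g. the current mouse location).
    let systemWide = AXUIElementCreateSystemWide()
    var found: AXUIElement?
    guard AXUIElementCopyElementAtPosition(systemWide, 400, 300, &found) == .success,
          let element = found else { fatalError("no element at that point") }

    // Identify: read the element's role and title attributes.
    var role: CFTypeRef?
    var title: CFTypeRef?
    AXUIElementCopyAttributeValue(element, kAXRoleAttribute as CFString, &role)
    AXUIElementCopyAttributeValue(element, kAXTitleAttribute as CFString, &title)

    // Control: if it is a button, perform its press action. For other elements,
    // an assistive application might instead set a writable attribute or send
    // a keystroke to the target application while the element has focus.
    if (role as? String) == kAXButtonRole {
        AXUIElementPerformAction(element, kAXPressAction as CFString)
    }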

The PFEventTaps Framework

The PFEventTaps Framework was created in 2007 as the engine underlying PFiddlesoft’s free Event Taps Testbench utility for developers. The framework supports and enhances Apple's Quartz Event Taps API.

The Event Taps API implements three concepts: an event tap, which monitors and intercepts user input events; an event source, a hardware device such as a mouse, keyboard, scroll wheel, tablet, or tablet pointer, or else a virtual input device; and a user input event, which is generated by an event source.

The PFEventTaps Framework implements these same concepts in its PFEventTap, PFEventSource, and PFEvent classes, each of which instantiates and encapsulates an associated Event Taps API object and makes its capabilities available to an assistive application using standard Objective-C and Swift techniques and the Cocoa frameworks. For example, an assistive application using the PFEventTaps Framework can implement optional delegate methods declared in the framework’s PFEventTapsDelegate formal protocol to observe Quartz events as they are generated by user input devices and virtual devices, and to filter, modify, block, and respond to the events. The framework bundle includes our Assistive Application Programming Guide for macOS and a detailed PFEventTaps Framework Reference, as well as the distribution license.
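
For comparison, the raw Quartz call that installs such a tap is C-based. A minimal listen-only tap on key-down events looks roughly like the sketch below; the tap location, placement, and run-loop setup are illustrative choices, and the process must have the appropriate monitoring permission.

    import CoreGraphics

    // A minimal listen-only tap on key-down events, using the raw Quartz
    // Event Taps API that the PFEventTap class wraps.
    let mask = CGEventMask(1 << CGEventType.keyDown.rawValue)

    // C callback: inspect the event; the return value is ignored for a
    // listen-only tap but required by the callback signature.
    let callback: CGEventTapCallBack = { _, _, event, _ in
        let keyCode = event.getIntegerValueField(.keyboardEventKeycode)
        print("Key down, virtual key code \(keyCode)")
        return Unmanaged.passUnretained(event)
    }

    if let tap = CGEvent.tapCreate(tap: .cgSessionEventTap,
                                   place: .headInsertEventTap,
                                   options: .listenOnly,
                                   eventsOfInterest: mask,
                                   callback: callback,
                                   userInfo: nil) {
        let source = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0)
        CFRunLoopAddSource(CFRunLoopGetCurrent(), source, .commonModes)
        CGEvent.tapEnable(tap: tap, enable: true)
        CFRunLoopRun()   // keep the tap alive
    }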

With the PFEventTaps Framework, an assistive application monitors user input events generated by devices such as a mouse, a keyboard, or a tablet; it filters, modifies, blocks, and responds to those events; and it generates additional synthetic events. All of this supports assistive devices and applications that enable a user with disabilities to use the computer to perform the same tasks that any user can perform. An assistive application typically does this by performing these tasks (sketches of the underlying calls follow the list):

  • An assistive application creates and installs a PFEventTap object that intercepts user input events at one of several points in the system's low-level event handling machinery. The event tap sends delegate messages or callbacks to the assistive application so that it can filter, modify, block, and respond to the events.
  • An assistive application retrieves a PFEvent object that represents a user input event generated by an event source and reported to an assistive application by an installed event tap. The application uses the PFEvent object to filter, modify, block, and respond to the event. It can also respond to the event by sending additional synthetic events before or after the received event and by taking other actions.
  • An assistive application creates PFEvent objects that post independent synthetic user input events, for example, from a virtual onscreen keyboard.
  • An assistive application retrieves or creates a PFEventSource object that represents a user input device such as a mouse, keyboard, scroll wheel, tablet, or tablet pointer, or a virtual device. An event source reports current state information about the associated device outside of the event stream.

The two frameworks are independent of one another. Both together enable you to write a full-featured assistive application, but you may need only one or the other to accomplish more limited purposes.