Two of the things that have been on my TODO list for about two (maybe three) years are cross-app scripting and mouse gestures. StepTalk had some preliminary support for cross-app scripting, but I don't think it made it into a release. I never really liked its approach, since it seemed horribly over-engineered (for reference, the Smalltalk interpreter in StepTalk is about twice as much code as the Pragmatic Smalltalk compiler and support library).
Yesterday, I committed the first version of ScriptKit. This is a very lightweight cross-app scripting framework built on top of Distributed Objects. It simply exports a dictionary containing a set of named objects for scripting. By default,
NSApp (the application object) is exported. If you don't want to give remote scripts unrestricted access, you can export your own object under the 'Application' key and filter out some messages. You can also export other objects under their own names. In future, we will define a set of standard-but-optional names that Étoilé services should export (e.g. the current document, some CoreObject-related things, and so on).
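To make the export model concrete, here is a small Python sketch of the idea: a dictionary of named objects, with a proxy that only forwards whitelisted messages standing in for a filtered 'Application' object. The real ScriptKit does this over Distributed Objects in Objective-C; the names `ScriptRegistry` and `FilteringProxy` here are invented for illustration.

```python
class FilteringProxy:
    """Forwards only whitelisted messages to the wrapped object
    (the analogue of exporting a filtered 'Application' object)."""
    def __init__(self, target, allowed):
        self._target = target
        self._allowed = set(allowed)

    def __getattr__(self, name):
        # Any message not on the whitelist is rejected instead of
        # being forwarded to the real object.
        if name not in self._allowed:
            raise AttributeError("message '%s' is not exported" % name)
        return getattr(self._target, name)


class ScriptRegistry:
    """The dictionary of named objects exported for scripting."""
    def __init__(self):
        self.exported = {}

    def export(self, name, obj):
        self.exported[name] = obj

    def lookup(self, name):
        return self.exported[name]
```

A script would then look up 'Application' in the registry and only ever see the messages the application chose to expose.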
For the paranoid, I plan on adding a 'Paranoid Mode' which uses a pre-shared key to prevent unauthorised scripts from controlling the app.
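Since Paranoid Mode isn't written yet, here is only a sketch of one way a pre-shared key could be checked, using an HMAC over each request; this is an assumption about a possible design, not the actual implementation:

```python
import hmac
import hashlib

def sign_request(key: bytes, message: bytes) -> bytes:
    # The scripting client tags each request with an HMAC of its payload,
    # computed with the pre-shared key.
    return hmac.new(key, message, hashlib.sha256).digest()

def verify_request(key: bytes, message: bytes, tag: bytes) -> bool:
    # The application recomputes the tag and compares in constant time;
    # requests with a missing or invalid tag would simply be dropped.
    return hmac.compare_digest(sign_request(key, message), tag)
```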
The nice side-effect of using DO as the core is that it is also trivial to send scripting events from Objective-C. Anyone who has tried doing this with Cocoa has probably given up, generated a string containing AppleScript code, and passed it to the scripting engine. Since we are using a Smalltalk that is toll-free bridged with Objective-C, it makes sense to just expose scripting objects as Objective-C / Smalltalk objects (well, object proxies) and use them directly, without a confusing abstraction layer.
The other thing I added yesterday was a gesture recognition engine. Today I remembered that 'x is across' and fixed it so that it actually works. This is embedded in Corner.app, which currently handles hot corners for Étoilé (allowing scripts to be run when the mouse enters and leaves a screen corner). If you hold down control and shift, it enters a gesture quasi-mode and tracks mouse movements. Each movement is treated as an approximation of a movement in one of 8 directions, numbered 1 to 8 clockwise from the top (i.e. 1 is up, 3 is right, 5 is down, and so on). Complete gestures are therefore turned into strings ('gesture words'), so an 'h' shape would be '5135' (down-up-right-down). The distance moved in each direction is ignored, because when doing mouse gestures I am rubbish at getting distances right, while with this system I can consistently produce the gesture I intended.
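The quantisation step can be sketched in a few lines of Python (the real engine is Objective-C; this just shows the numbering scheme described above, with screen coordinates where y grows downwards):

```python
import math

def direction(dx, dy):
    """Quantise a mouse movement into one of 8 directions, numbered
    1..8 clockwise from the top (1 = up, 3 = right, 5 = down,
    7 = left). dy grows downwards, as in screen coordinates."""
    # atan2(dx, -dy) gives 0 degrees for 'up', increasing clockwise.
    angle = math.degrees(math.atan2(dx, -dy))
    if angle < 0:
        angle += 360
    # Each direction owns a 45-degree wedge centred on its axis.
    return int(((angle + 22.5) % 360) // 45) + 1

def gesture_word(moves):
    """Collapse a list of (dx, dy) movements into a gesture word.
    Consecutive movements in the same direction merge into one
    character, so distance is ignored."""
    word = ""
    for dx, dy in moves:
        d = str(direction(dx, dy))
        if not word or word[-1] != d:
            word += d
    return word
```

With this, a down-up-right-down trace comes out as the word '5135', however long each stroke was.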
Corner maintains a dictionary mapping gesture words to objects. These objects can be written in Smalltalk or Objective-C. They have to implement a
-gesturePerformed method, and this will be called whenever the gesture they are associated with is drawn. Now that cross-application scripting is working, this can be used to control any application, for example locking the screen and setting an away message in the Jabber client.
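The dispatch logic is just a dictionary lookup; here is a Python sketch of it, where `gesture_performed` stands in for the Objective-C `-gesturePerformed` method and `GestureDispatcher` is an invented name for illustration:

```python
class GestureDispatcher:
    """Maps gesture words to handler objects. Each handler must
    implement gesture_performed() -- the analogue of Corner's
    -gesturePerformed method."""
    def __init__(self):
        self.handlers = {}

    def bind(self, word, handler):
        self.handlers[word] = handler

    def gesture_drawn(self, word):
        # Called when a complete gesture word has been recognised.
        handler = self.handlers.get(word)
        if handler is None:
            return False  # unrecognised gestures are silently ignored
        handler.gesture_performed()
        return True
```

A handler bound to '5135' would then fire whenever an 'h' is drawn, and could script any application via ScriptKit.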
Currently, there is one default gesture: drawing an 'h' hides the active application (if it supports scripting; otherwise nothing happens). Others will probably be added in time for 0.4.