Little hack to help with testing

Want the ability to switch in different test drivers, mock objects, or other test-specific behaviour? Here’s a pattern I came up with (about a year ago) to do that in a GNUstep test tool, which can readily be used in Cocoa:

NSString *driverClassName = [[NSUserDefaults standardUserDefaults] stringForKey: @"Class"];
Class driverClass = NSClassFromString(driverClassName);
id myDriver = [[driverClass alloc] init];

With a healthy dose of “no, seriously, don’t do this in production code”, you now have the ability to specify your test driver on the command-line like this:

$ ./myTestingTool -Class GLTestDriver

This uses the oft-neglected behaviour of NSUserDefaults, in which it parses the executable’s command-line arguments to create a defaults domain, higher in priority than even the user’s preferences file. You can use that behaviour in a graphical app too, where it comes in handy when working in Xcode. It then uses a combination of the runtime’s duck typing and introspection capabilities to create an instance of the appropriate class.
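
Because the arguments domain sits above every other domain in the search list, you can also register a sensible everyday value in the registration domain (the lowest-priority one) and only override it from the command line while testing. A minimal sketch of that, assuming a made-up GLProductionDriver class as the everyday value:

// The registration domain is consulted last, so this value only applies
// when no -Class argument (and no other default) is present; the lookup
// shown above then proceeds exactly as before.
[[NSUserDefaults standardUserDefaults] registerDefaults:
    [NSDictionary dictionaryWithObject: @"GLProductionDriver"
                                forKey: @"Class"]];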

Posted in cocoa, gnustep, objc, openstep, test

Wistfully Wonderful Den of Coders

It’s the time of the year to acknowledge that yes, I am going to WWDC this year. I left it a bit last minute to get the flights and the hotel, but everything is in place now, so hopefully I’ll see some of you guys/gals there. This is the first year that there’s been developer content that isn’t Mac (it isn’t the first year there’s been content that isn’t Mac development, though; since the first WWDC I attended in 2005 there’s always been an IT track occupying around 20-25% of the sessions, if not much lab space). There have been mixed impressions of that – a representative sample:

About Time

New developers might screw up the experience

New developers might realise how cool Leopard is

I think this is going to be an exciting conference, especially for the new developers. I’ve never been as a newbie; in 2005 I’d already been doing GNUstep, WebObjects, Cocoa and NeXTSTEP development for varying numbers of years, though admittedly without particular expertise. From a perfeshunal perspective I’m not amazingly excited about iPhone development, though I might drop in to a few of the sessions just to see what the state of play is, what people are interested in, what apps they’re creating and so on. No, for me this is the first year that I’ve actually got a project in full swing over the conference week, so I’ll be most interested in heading down to the labs and getting mmalc to write my code, I mean, finding out what I could improve.

And, of course, the networking (by which I mean the going out for beers and food every night)…

Posted in whatevs

Managers: Don’t bend it that far, you’ll break it!

Go on then, what’s wrong with the words we already have? I think they’re perfectly cromulent, it’s very hard to get into a situation where the existing English vocabulary is insufficient to articulate one’s thoughts. I expect that linguists and lexicographers have some form of statistic measuring the coverage in a particular domain of a language’s expression; I also expect that most modern languages have four or five nines of coverage in the business domain.

So why bugger about with it? Why do managers (and by extension, everyone trying to brown-nose their way into the management) have to monetise that which can readily be sold[1]? Why productise that which can also be sold? Why incentivise me when you could just make me happy? Why do we need to touch base, when we could meet (or, on the other hand, we could not meet)? Do our prospectives really see the value-add proposition, or are there people who want to buy our shit?

Into the mire which is CorpSpeak treads the sceadugenga that is TechRepublic, Grahames yrre bær. The first word in their UML in a Nutshell review is "Takeaway". Right, well, I don’t think they’re about to give us a number 27 with egg-fried rice. (As a noun, that meaning appears only in the Draft Additions to the OED from March 2007.) Nor is there likely to be some connection with golf. All right, let’s read on.

UML lets you capture, document, and communicate information about an application and its design, so it’s an essential tool for modeling O-O systems. Find out what’s covered in O’Reilly’s UML in a Nutshell and see if it belongs in your library.

Ah, that would be a précis, unless I’m very much mistaken. Maybe even a synopsis. Where did you get the idea this was a takeaway? I can’t even work out what the newspeak meaning for takeaway might be. Had I not seen the linked review, I would have thought it meant the “if you take away one idea from this article, make it this” part of the article. In other words, if you’re so stupid that you can only remember one sentence from a whole page, we’ll even tell you which sentence you should concentrate on. This use[2] doesn’t fit with that retroactive definition though, because the conclusion which can be drawn from the above-quoted paragraph is that one might want to read the whole article. I would much rather believe that management types in a hurry would remember the subsequent sentence as their only recollection of the article.

UML in a Nutshell: A Desktop Quick Reference is not misnamed.

[1]You may argue that the word should be spelled “monetize”, as the word most probably came from American English, but it doesn’t matter because it doesn’t bloody exist. Interestingly, the verb sell originated in the Old English verb sellan, meaning to give, with no suggestion of barter or trade.

[2]Language usage is the only place I’ll admit the existence of the word usage.

Posted in Business, mythicalmanmonth, rant

My name in lights

I’ve been published.

Posted in whatevs

Social and political requirements gathering

I was originally going to talk about API: Design Matters and Cocoa, but, and I believe the title of this post may give this away, I’m not going to now. That’s made its way into OmniFocus though, so I’ll do it sooner or later. No, today I’m more likely to talk about The Cathedral and the Bazaar, even though that doesn’t seem to fit the context of requirements gathering.

So I’ve been reading a few papers on Requirements Engineering today, most notably Goguen’s The Dry and the Wet. One of the more interesting and subtle conclusions to draw from such a source (or at least, it’s subtle if you’re a physics graduate who drifted into Software Engineering without remembering to stop being a physicist) is the amount of political influence in requirements engineering. Given that it costs a couple of orders of magnitude more to mend a broken requirement in maintenance than in requirements-gathering (Boehm knew this back in 1976), you’d think that analysts would certainly leave their own convictions at the door, and would try to avoid the "write software that management would like to buy" trap too.

There are, roughly speaking, three approaches to requirements elicitation. Firstly, the dry, unitarian approach where you assume that like a sculpture in a block of marble, there is a single "ideal" system waiting to be discovered and documented. Then there’s the postmodern approach, in which any kind of interaction between actors and other actors, or actors and the system, is determined entirely by the instantaneous feelings of the actors and is neither static nor repeatable. The key benefit brought by this postmodern approach is that you get to throw out any idea that the requirements can be baselined, frozen, or in any other way rendered static to please the management.

[That’s where my oblique CatB reference comes in – the Unitary analysis model is similar to ESR’s cathedral, and is pretty much as much of a straw man in that ‘purely’ Unitary requirements are seldom seen in the real world; and the postmodern model is similar to ESR’s bazaar, and is similarly infrequent in its pure form. The only examples I can think of where postmodern requirements engineering would be at all applicable are in social collaboration tools such as Facebook or Git.]

Most real requirements engineering work takes place in the third, intermediate realm; that which acknowledges that there is a plurality among the stakeholders identified in the project (i.e. that the end-user has different goals from his manager, and she has different goals than the CEO), and models the interactions between them in defining the requirements. Now, in this realm software engineering goes all quantum; there aren’t any requirements until you look for them, and the value of the requirements is modified by the act of observation. A requirement is generated by the interaction between the stakeholders and the analyst, it isn’t an intrinsic property of the system under interaction.

And this is where the political stuff comes in. Depending on your interaction model, you’ll get different requirements for the same system. For instance, if you’re of the opinion that the manager-charge interaction takes on a Marxist or divisive role, you’ll get different requirements than if you use an anarchic model. That’s probably why Facebook and Lotus Notes are completely different applications, even though they really solve the same problem.

Well, in fact, Notes and Facebook solve different problems, which brings us back to a point I raised in the second paragraph. Facebook solves the "I want to keep in contact with a bunch of people" problem, while Notes solves the "we want to sell a CSCW solution to IT managers" problem. Which is itself a manifestation of the political problem described over the last few paragraphs, in that it represents a distortion of the interaction between actors in the target environment. Of course, even when that interaction is modelled correctly (or at least with sufficient accuracy and precision), it’s only valid as long as the social structure of the target environment doesn’t change – or some other customer with a similar social structure comes along ;-)

This is where I think that the Indie approach common in Mac application development has a big advantage. Many of the Indie Mac shops are writing software for themselves and perhaps a small band of friends, so the only distortion of the customer model which could occur would be if the developer had a false opinion of their own capabilities. There’s also the possibility to put too many “developer-user” features in, but as long as there’s competition pushing down the complexity of everybody’s apps, that will probably be mitigated.

Posted in Business, metadev, mythicalmanmonth, usability

The Dock should be destroyed, or at least changed a lot

I found an article about features Windows should have but doesn’t, which I originally got to from OSNews’ commentary on the feature list. To quote the original article:

The centerpiece of every Mac desktop is a little utility called the Dock. It’s like a launchpad for your most commonly used applications, and you can customize it to hold as many–or as few–programs as you like. Unlike Windows’ Start Menu and Taskbar, the Dock is a sleek, uncluttered space where you can quickly access your applications with a single click.

Which OSNews picked up on:

PCWorld thinks Windows should have a dock, just like Mac OS X. While they have a point in saying that Windows’ start menu and task bar are cumbersome, I wouldn’t call the dock a much better idea, as it has its own set of problems. These two paradigms are both not ideal, and I would love someone to come up with a better, more elegant solution.

The problem I have with the Dock (and had with the LaunchPad in OS/2, the switcher in classic Mac OS, and actually less so with the task bar, though that and the Start Menu do suffer this problem) is that their job basically involves allowing the internal structure of the computer to leak into the user’s experience. Do I really want to switch between NeoOffice Writer, Keynote and OmniOutliner, or do I want to switch between the document I’m writing, the presentation I’m giving about the paper and the outline of that paper? Actually the answer is the latter; the fact that these are all in different applications is just an implementation detail.

So why does the task bar get that right? Well, up until XP when MS realised how cluttered that interface (which does seem to have been lifted from the NeXT dock) was getting, each window had its own entry in the task bar. Apart from the (IMO, hideously broken) MDI paradigm, this is very close to the “switch between documents” that I actually want to perform. The Dock and the XP task bar have similar behaviour, where you can quickly switch between apps, or with a little work can choose a particular document window in each app. But as I said, I don’t work in applications, I work in documents. This post is a blog post, not a little bit of MarsEdit (in fact it will never be saved in MarsEdit because I intend to finish and publish it in one go), the web pages I referenced were web pages, not OmniWeb documents, and I found them from an RSS feed, not a little bit of NetNewsWire. These are all applications I’ve chosen to view or manipulate the documents, but they are a means, not an end.

The annoying thing is that the Dock so flagrantly breaks something which other parts of Mac OS X get correct. The Finder uses Launch Services to open documents in whatever app I chose, so that I can (for instance) double-click an Objective-C source file and have it open in Xcode instead of TextEdit. Even though both apps can open text files, Finder doesn’t try to launch either of them specifically, it respects the fact that what I intend to do is edit the document, and how I get there is my business. Similarly the Services menu lets me take text from anywhere and do something with it, such as creating an email, opening it as a URL and so on. Granted some app authors break this contract by putting their app name in the Service name, but by and large this is a do something with stuff paradigm, not a use this program to do something one.
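
That contract shows up in the API as well: when a program wants a document opened, the polite thing is to hand the intent over to Launch Services via NSWorkspace rather than to launch a named application itself. A minimal sketch, with a made-up file path:

// Ask Launch Services (via NSWorkspace) to open the document in whatever
// application the user has associated with files of this type.
NSString *path = @"/Users/me/Projects/Frobulator/main.m";
if (![[NSWorkspace sharedWorkspace] openFile: path])
{
    NSLog(@"Couldn't open %@", path);
}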

Quick Look and Spotlight are perhaps better examples. If I search for something with Spotlight, I get to see that I have a document about frobulating doowhackities, not that I have a Word file called “frobulating_doowhackities.doc”. In fact, I don’t even necessarily have to discover where that document is stored; merely that it exists. Then I hit space and get to read about frobulating doowhackities; I don’t have to know or care that the document is “owned” by Pages, just that it exists and I can read it. Which really is all I do care about.

Posted in aqua, metadev, rant, usability

Yeah, we’ve got one of those

Title linkey (which I discovered via slashdot) goes to an interview in DDJ with Paul Jansen, the creator of the TIOBE Programmer Community Index, which ranks programming languages according to their web presence (i.e. the size of the community interested in those languages). From the interview:

C and C++ are definitely losing ground. There is a simple explanation for this. Languages without automated garbage collection are getting out of fashion. The chance of running into all kinds of memory problems is gradually outweighing the performance penalty you have to pay for garbage collection.

So, to those people who balked at Objective-C 2.0’s garbage collection, on the basis that it "isn’t a 4GL", I say who cares? Seemingly, programmers don’t – or at least a useful subset of Objective-C programmers don’t. I frequently meet fellow developers who believe that if you don’t know which sorting algorithm to use for a particular operation, and how to implement it in C with the fewest temporary variables, you’re not a programmer. Bullshit. If you don’t know that, you’re not a programmer who should work on a foundation framework, but given the existence of a foundation framework the majority of programmers in the world can call list.sort() and have done with it.

Memory management code is in the same bucket as sorting algorithms – you don’t need for everybody to be good at it, you need for enough people to be good at it that everyone else can use their memory management code. Objective-C 2.0’s introduction of a garbage collector is acknowledgement of this fact – look at the number of retain/release-related problems on the cocoa-dev list today, to realise that adding a garbage collector is a much bigger enhancement to many developers’ lives than would be running in a VM, which would basically go unnoticed by many people and get in the way of the others trying to use Instruments.

Of course, Objective-C and Apple’s developer tools have a long history of moving from instrumental programming (this is what the computer must do) to declarative programming (this is what I am trying to achieve, the computer must do it). Consider Interface Builder. While Delphi programmers could add buttons to their views, they then had to override that button’s onClick() method to add some behaviour. IB and the target-action approach allow the programmer to say "when this button is clicked, that happens" without having to express this in code. This is all very well, but many controls on a view are used to both display and modify the value of some model-level property, so instead of writing lots of controller code, let’s just declare that this view binds to that model, and accesses it through this controller (which we won’t write either). In fact, rather than a bunch of boilerplate storage/accessors/memory management model-level code, why don’t we just say that this model has that property and let someone who’s good at writing property-managing code do the work for us? Actually, coding the model seems a bit silly; let’s just say that we’re modelling this domain entity and let someone who’s good at entity modelling do that work, too.
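
To make that progression concrete, here is roughly the hand-written code that a target-action connection and a declared property each save you from maintaining; the saveButton, documentController and GLDocument names are made up for illustration:

// A target-action connection drawn in IB amounts to this, written out by hand:
[saveButton setTarget: documentController];
[saveButton setAction: @selector(saveDocument:)];

// A declared property replaces the boilerplate storage and accessor pair:
@interface GLDocument : NSObject
@property (copy) NSString *title;
@end

@implementation GLDocument
@synthesize title;   // accessors and memory management generated, not hand-coded
@end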

In fact, with only a little more analysis of the mutation of Objective-C and the developer tools, we could probably build a description of the hypothetical Cen Kase, the developer most likely to benefit from developing in Cocoa. I would expect a couple of facts to hold; firstly that Cen is not one of the developers who believes that stuff about sorting algorithms, and secondly that the differences between my description of Cen and the description used by Apple in their domain modelling work would fit in one screen of FileMerge on my iBook.

Posted in cocoa, metadev, nextstep, objc, openstep

Tracking the invisible, moving, unpredictable target

An idea which has been creeping up on me from the side over the last couple of weeks hit me square in the face today. No matter what standards we Cocoa types use to create our user interfaces, the official Aqua HIG, the seemingly-defunct IndieHIG, or whatever, ultimately producing what is considered a usable (or humane, if you like) interface for Mac OS X is not only difficult, but certainly unrepeatable over time.

The “interface” part of a Cocoa user interface is already hard enough to define, being a mash-up of sorts, and to differing degrees, between the Platinum HIG which directs the default behaviour of some of the *Manager controls and the OpenStep HIG which describes the default behaviour of most, if not all, of the AppKit controls. If that isn’t enough, there is an inexact intersection – some controls work differently in (what are loosely called, and I’m not getting into the debate) Cocoa apps than in Carbon apps. There have also been innovative additions on top of the aforementioned guides, such as sheets, unified toolbars and (the already legacy) textured interfaces. There have been subtractions from both – miniwindows still exist but nobody uses ’em, and window shading went west with Rhapsody.

But all of that is related to the user interface, not to user interaction (I’m in the middle of reading Cooper’s The Inmates Are Running the Asylum; I’m going to borrow some terminology but studiously avoid discussing any of the conclusions he presents until I’m done reading it). It’s possible to make HIG-compliant inspectors, or HIG-compliant master-detail views, or HIG-compliant browser views and so on. It’s also possible to make non-compliant but entirely Mac-like views, coverflow views, sidebars and so on. But which is correct? Well, whichever people want to use. But how do you know which people want to use? Well, you could get them to use them, but as that’s typically left until the beta phase you could ask usability gurus instead. Or you could take the reference implementation approach – what would Apple (or Omni, or Red Sweater, or whoever) do?

Well, what Apple would do can, I think, be summed up thus: Apple will continue doing whatever Apple were previously doing, until the Master User takes an interest in the project, then they do whatever the Master User currently thinks is the pinnacle of interaction design. The Master User acts a little like an eXtreme Programming user proxy, only with less frequent synchronisation, and without actually consulting with any of the other 26M users. The Master User is like a reference for userkind, if it all works for the Master User then at least it all works for one user, so everyone else will find it consistent, and if they don’t find it painful they should enjoy that. The official job title of the Master User role is Steve.

All of this means that even inside Apple, the “ideal” usability experience is only sporadically visited, changes every time you ask and doesn’t follow any obvious trend such as would be gained by normalisation over the 26M users. Maybe one day, the Master User likes inspectors. Then another day he likes multi-paned, MDI-esque interaction. On a third day he likes master-detail control, in fact so much so that he doesn’t want to leave the application even when it’s time to do unrelated work. Of course you don’t rewrite every application on each day, so only the ones that he actually sees get the modernisation treatment.

So now we come back to the obvious, and also dangerous, usability tactic which is so prevalent on the Windows platform, and one which I consciously abhor but subconsciously employ all the time: "I’m the developer, so I’ll do it my way". Luckily there are usability, QA and other rational people around to point out that I’m talking shite most of the time, but the reasoning goes like this. I’m a Mac user, and have been for a long time. In fact, I might know more about how this platform works than anyone within a couple of miles of here, therefore(?) I know what makes a good application. One problem which affects my personal decisions when trying to control the usability is that I’m only tangentially a Mac person; I’m really a very young NeXTStep person who just keeps current with software and hardware updates. That means I have a tendency to inspector my way out of any problem, and to eschew custom views and Core Animation in favour of “HIG is king” standard controls, even when other applications don’t. And the great thing is that due to the Moving Target reference implementation, I can find an application which does something “my” way, if that will lend credence to my irrational interface.

The trick is simply to observe that taking pride in your work and expressing humility at your capabilities are not mutually exclusive. If tens of other Mac users are telling me they don’t like the way it works, and I’m saying it’s right, apply Occam’s razor.

And if there isn’t enough fun for you in one usability experience, a bunch of us are presumably going to be providing the iPhone HIG-compliant V on top of our Ms and Cs before long.

Posted in aqua, carbon, cocoa, openstep, usability

Come back, purple button, all is forgiven!

As a great philosopher once wrote: don’t it always seem to go, that you don’t know what you got ’til it’s gone? Previews of Mac OS X had a user interface feature, known by all who saw it as the Purple Button. Look at this screenshot from System Preferences:

The boiled sweet on the top-right of the window would go purple, hence the name. Clicking on it activated a single-window mode. All documents except the one that you were working on would be minimised into the Dock, and switching between them would minimise the earlier one before restoring the newly-focused document. Of course, the problem with this in the developer previews/public beta, which rendered it unusable, was performance-related. The “lickable” eye-candy in Aqua was ambitious even on the top-end G4 systems available at the time, and so time spent in the Genie or Scale effects was really noticeable. Add to that the effect of applications being slow enough not to update their views in time – the System Preferences application you can see above is a Cocoa-Java app, and back then the JVM wasn’t amazing for performance – and you have a really sucky single-window experience.

On the other hand, it’s really bloody useful. Look at apps like WriteRoom or GLTerminal, which go out of their way to get rid of all that other clutter. Or Spaces (or CDE virtual desktops, WindowMaker virtual desktops… you get the idea), also designed to let you forget all those other apps are there. Well, Spaces is quite nice (and a little more flexible than the purple button was), but playing Spaces ping-pong tends to make me a bit seasick. Not to mention that the time it wastes is about as great as the unperformant purple button switching… so please, purple button, come back!

Some environments provided the same user experience out of a lack of choice – for instance, OZ couldn’t show more than one application if it wanted to, and certainly running more than one at once was out of the question (it would simulate multi-tasking by suspending background tasks).

Posted in aqua, cocoa, FTFF, usability

How exciting

Today I was pleasantly surprised by Interface Builder. Not shiny, new, “where the hell have they put that button” streamlined IB3, but boring old IB2 which even Slowlaris users could work out how to use. I dragged a header defining a category with an IBAction onto IB, and lo, nay even behold, it did the right thing.

That may seem unexciting and even expected, but it’s one of those nice cases where it’s pleasing that everything just works. I thought category headers might be edge-case enough to confuse the thing; many people would put their IBAction definitions in the “regular” @interface header so that the IBOutlets are in the same place.
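
For reference, the sort of header I mean looks something like this (the class and action names are made up for illustration):

// GLAppController+Reporting.h - an action declared in a category, kept out
// of the class's "regular" @interface header.
#import <Cocoa/Cocoa.h>
#import "GLAppController.h"

@interface GLAppController (Reporting)
- (IBAction) exportReport: (id)sender;
@end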

Posted in whatevs