
If you've not yet seen BlenderGuru's talk on Improving Blender's UI or Brecht's UI analysis and challenges from this year's Blender Conference (#bcon or #bcon13), do, as they're both worth a watch.
A couple of notable points. First and foremost is Andrew's realisation of the 'mistake' of his initial approach to the problem: it was unfortunately based on not having a more complete grasp of Blender's general environment and program ecosystem, which really isn't like other applications (both points he acknowledges in the talk). His latest modified ideas make much more sense now, not necessarily because the original ideas were wrong, but because they are now more appropriately contextualised with respect to the way Blender works. This is so very important, and something initially grossly underestimated (and still is in some quarters) in the clamor for change.
Brecht's talk is much more pragmatic, reasoning that Blender is more akin to a series of editors held in place by a broader program container. This makes it difficult to implement certain types of change, because Blender uses the mouse position to determine where, and what, is affected by a particular action. As an example, open Blender and position the mouse over the 3D View, then press "A" and everything is selected; do this over the Properties panel and it collapses whatever property sub-section the mouse happens to be hovering over; do this over the Outliner and all entries listed are highlighted (but not selected).
So, fundamentally, this means Blender doesn't have the kind of hierarchical structure that might otherwise lend itself to the type of 'global' changes being discussed in the broader conversation about the UI, at least not without tweaking what's going on under the hood. What then are the repercussions of this? Would it mean the Outliner becoming a 'master' or 'parent' editor to which everything else is subservient, a child? Would this then allow for the data selection alluded to in Brecht's talk, i.e. being able to double-click a data point in the Outliner and having it open in the respective editor? Allied to this is the question of whether the Outliner needs to be visible all the time (or at least easily accessible); in that instance, how are other panels/editors organised - would it be possible to 'dock' or overlay them given Blender's mechanics?
But... all this is sophistry at present and ignores a more pressing and important issue: how much of this UI issue is really to do with Blender truly being "a difficult application to use", versus how much of it is simply down to users not knowing where things are or what they do? The two are not synonymous. Neither is the problem particular to Blender (it would be expressly naive to think otherwise); anyone new to 3D, irrespective of the application used, faces the same fundamental obstacle of not knowing what all the buttons do. How then is this problem solved by other software providers? How do they address what is ostensibly an informational issue?
The interesting take-away from the two talks linked above is the creation of a UI team, which Brecht will head (no solid info on other team members at the time of writing). It is certainly important that the team has executive authority, so to speak - open-source democratisation works to a point before it's just spinning its wheels.
Further Reading