As I said in the last post, I appreciate the power of GUI prototyping tools. And I will definitely use them when appropriate.
However, I still stand by my philosophy that in many cases a good choice is to make a prototype or mockup with the actual visual development environment that the developers use. For instance, the UX person or GUI designer can mock things up with Qt Creator without knowing anything about programming. Likewise with Microsoft tools and pretty much any visual GUI designer. Then, the developers can immediately use the saved files for the actual development, and no prototyping or GUI design effort is wasted or duplicated. It’s also a method of communication back and forth between designers and developers.
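One reason this handoff works is that designer tools typically save forms as plain, structured files. Qt Designer/Creator, for example, saves a `.ui` file, which is just XML, so the same artifact the designer produced can be inspected or consumed by developer tooling. Here is a minimal sketch, using only Python’s standard library, that lists the widgets in a hypothetical `.ui` document (a real build would feed the same file to `uic` or `QUiLoader`; the form and widget names below are made up for illustration):

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical Qt Designer .ui file. Real .ui files have more
# nesting (layouts, properties), but the format is plain XML throughout.
UI_XML = """<ui version="4.0">
 <class>MainForm</class>
 <widget class="QWidget" name="MainForm">
  <widget class="QPushButton" name="playButton"/>
  <widget class="QLabel" name="statusLabel"/>
 </widget>
</ui>"""

def list_widgets(ui_source):
    """Return (class, objectName) pairs for every widget in a .ui document."""
    root = ET.fromstring(ui_source)
    return [(w.get("class"), w.get("name")) for w in root.iter("widget")]

if __name__ == "__main__":
    for cls, name in list_widgets(UI_XML):
        print(cls, name)
```

The point is not the parsing itself, but that the designer’s saved file is the developer’s input file; nothing has to be redrawn or re-specified.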
The downside is that you might not achieve the same dynamic interactions that a dedicated prototyping tool offers. Of course, you could use both—mock it up with a developer’s visual IDE, and then use screenshots of that to achieve the dynamic flow in a prototyping tool.
I discussed this with a few of the UX people at the seminar and they did not seem to have thought of using developer tools. It’s as if UX people assume that programmers’ tools can only be used by programmers. Fortunately, I’m a Renaissance Man so I can (ab)use any tool I want to.
But What about Non-Standard GUIs?
Although I sometimes deal with desktop/WIMP style GUIs, a lot of my GUI design has used an approach similar to many video games, which is to reject any common GUI elements that aren’t appropriate and make custom ones. I layer and overlap whatever is best for the design, as opposed to being limited to the default widgets a particular GUI framework or editor provides.
For instance, on multiple occasions in the past I have made applications where video is the user’s main focus, but I wanted various types of widgets rendered over the video. So, being in the position of both a GUI designer and a programmer, on these various programs I made my own OpenGL view which renders a frame of video and then renders whatever sprites, etc. are needed for the overlaid widgets/graphics.
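The essential idea is just draw order: the video frame is drawn first, then each overlay widget is composited on top of it. The toy sketch below illustrates that layering with a plain 2D grid standing in for the frame; the actual apps used OpenGL textures and sprites, and every name here is a hypothetical stand-in, not code from those programs:

```python
# Toy software compositor illustrating the layering order:
# render the video frame first, then each overlay "sprite" on top.

def make_frame(width, height, pixel="V"):
    """A stand-in 'video frame': a 2D grid filled with video pixels."""
    return [[pixel] * width for _ in range(height)]

def draw_sprite(frame, x, y, w, h, pixel="W"):
    """Composite an opaque widget sprite over the frame, clipped to bounds."""
    for row in range(y, min(y + h, len(frame))):
        for col in range(x, min(x + w, len(frame[0]))):
            frame[row][col] = pixel
    return frame

def render(width, height, sprites):
    """One render pass: frame first, then overlays in back-to-front order."""
    frame = make_frame(width, height)
    for (x, y, w, h, pixel) in sprites:
        draw_sprite(frame, x, y, w, h, pixel)
    return frame

if __name__ == "__main__":
    # An 8x4 "video frame" with one 3x2 widget overlaid at (1, 1).
    frame = render(8, 4, [(1, 1, 3, 2, "W")])
    print("\n".join("".join(row) for row in frame))
```

In a real OpenGL view the same ordering applies per frame: bind and draw the video texture as a full-view quad, then draw each widget sprite with blending enabled.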
I did sometimes make Photoshop mockups for these apps. You could call them “prototypes,” but they weren’t very dynamic, since I could only present a small series of mocked-up screenshots. However, now that there are more prototyping tools (and I’m more aware of their existence), I would consider using them in combination with Photoshop.
In other words, create the custom widgets and/or graphics in Photoshop, and then use the prototyping tool to put it all together and to set up the “script”—the dynamic narrative that you will demonstrate with the prototype.