Very interesting. I whole-heartedly agree with the first half of the video, where they discuss the limitations of the traditional approach. But I don't care for their solution all that much; it gives too much emphasis to one application, IMHO. Just picture working on a lab report where you're typing in a word processor, referring to numbers in an e-mail, and using a scientific calculator to evaluate formulas. In a traditional windowed approach all three can easily be visible at the same time, but in their approach a lot of vertical space is wasted and each of those windows would be quite narrow.

Personally, I think the computing world would do well to break two conventions.

The first is the constant pointer position. If I'm watching a movie, I don't want the pointer to be anywhere. If I'm working with widgets on opposite sides of the screen, I have no desire to traverse the space between them. I have no idea what a great solution would look like, but one idea I had is to use absolute positioning for the initial touchpad tap after a few seconds of inactivity, so that tapping the left side of the pad teleports the cursor to that rough position (then use traditional relative movement after that initial tap for finer control).

The second is that there can be only one cursor and one active application. In the physical world I have two hands that I can use independently. If I bring up two applications at once, side by side, then it's arbitrary which of them is currently focused (i.e., which one receives keyboard input). Likewise, I'd like the ability to use two cursors simultaneously. One activity that would benefit is 2D window resizing and moving; another is selecting drawing tools in an image editor. Also, how often do you commit to dragging something, only to discover you need to move a window out of the way first?

Maybe in R2 some of these paradigms can be addressed. Multi-touch is a reality now, and touchscreens don't lend themselves to cursors.
One API issue that might need resolving with a touchscreen is that you can interact with an inner BView without MouseMoved() ever being called on the surrounding BViews; and would MouseMoved() even need to be called if it's just a tap? I can't imagine that being much of a problem, but I suppose at some point the developer documentation should mention whether or not such things are safe assumptions to make.

On Thu, Oct 15, 2009 at 8:24 AM, Bruno Albuquerque <bga@xxxxxxxxxxxxx> wrote:

> I was impressed by this. Maybe one day this will be the default in Haiku. ;)
> Watch the video.
>
> http://flowingdata.com/2009/10/14/is-10gui-the-future-replacement-of-the-mouse-and-keyboard/
>
> -Bruno