Dell and the Future of the Desktop

I likely should have titled this “Oh Crap, Dell Has R&D!” because while labs from companies like HP, IBM, and Intel are famous, you don’t hear that much out of Dell.  Dell, for the most part, doesn’t talk about products until they are ready for market, and in many cases this is a good thing.  I’m still haunted by the vision of the amazing smartwatch that HP had working in its lab years ago, a product that could have cornered the market but was killed by then-CEO Mark Hurd.  This year at Dell World, Dell brought out its Smart Desk concept, which is mostly targeted at folks who buy workstations. 

Ironically, last week at Jon Peddie’s Virtualization Conference I had a chat with one of DreamWorks’ executives about what they wanted for their animators in the future (DreamWorks typically uses a Dell competitor), and I was struck by how well this concept would fit into their workflow.  In fact, thinking about it more, I figured it would do some rather wonderful things for my own workflow if I could justify the cost.  I’d better start saving my pennies. 

Smart Desk

Sometimes I think we may be overusing the word “Smart,” especially given that many of the products we call “Smart” aren’t really that intelligent.  Be that as it may, this was a cool concept.  The idea was to turn the desktop itself into a second, sensing display.  Ironically, or sadly depending on how you think about it, the result was very much like taking Microsoft’s old Surface table and adding a monitor to it.  

The reason I say sadly is that I was a huge fan of the Surface table and took every opportunity I had to play on it.   I even tried to buy one several times, but folks refused to sell me one because they were convinced I’d hate it, given it was mostly designed for bars and hotels.   But the idea of a sensing display table was incredibly compelling.  Your work surface is active, which means you can change the virtual tools you are using dynamically without messing with the monitor where your finished work is. 

For a photographer, engineer, or animator, that means you are drawing and controlling the image under your hands on your desk while seeing the result on your screen.  Touchscreen monitors have been somewhat problematic for professional artists and engineers because they tend to be too far away to comfortably touch, and we are more used to drawing down on our desks than up on a monitor (strangely, you’d think the monitor touch approach would work better for artists used to easels, but I have yet to see one used in this fashion).  

If you need a physical control like a dial, you just pick up something that looks like a physical dial and put it on the screen; the screen recognizes this new element and incorporates it into the interface.  This was similar to the Surface table, where you could set a variety of physical tools, like air hockey controls, on the table and it would recognize their function and deal with them appropriately, allowing the table to morph between virtual and physical control elements. 
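To make that interaction pattern concrete, here is a minimal sketch of how an application might consume events from a sensing desk that recognizes tangible controls. None of these names are real Dell or Microsoft APIs; the event shape and functions are hypothetical placeholders for illustration only.

```typescript
// Hypothetical event emitted when the sensing desk recognizes a physical
// object placed on its surface (names and fields are assumptions).
type TangibleKind = "dial" | "slider" | "stylus";

interface TangibleEvent {
  id: string;          // tag identifying the physical object
  kind: TangibleKind;  // what the surface recognized it as
  x: number;           // position on the desk display
  y: number;
  value: number;       // e.g. dial rotation, normalized 0..1
}

// Example parameter the dial could drive in a drawing application.
function setBrushSize(px: number): void {
  console.log(`brush size set to ${px}px`);
}

// The application registers a handler; when a physical dial is set on
// the desk and turned, the surface would emit events like this one.
function onTangible(event: TangibleEvent): void {
  if (event.kind === "dial") {
    // Map the dial's rotation (0..1) onto a 1-100 pixel brush size.
    setBrushSize(Math.round(1 + event.value * 99));
  }
}

// Simulated event, as if a dial were placed on the desk and rotated.
onTangible({ id: "dial-01", kind: "dial", x: 512, y: 300, value: 0.42 });
```

The point of the sketch is simply that the desk, not the application, does the object recognition; the application just reacts to a stream of recognized-control events the same way it would react to mouse or touch input.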

Uses and Interface:

The demonstration ran through how you’d use the workstation for editing pictures in Photoshop, building animations in Maya, or mixing music, but it could be adapted to CAD/CAM, architecture, or any other creative endeavor where you need a lot of control elements to produce a digital still or moving image.  I could even see it being amazing for certain kinds of games, but given this will undoubtedly be an expensive offering targeted at professionals, they didn’t talk about that.  

Some interesting aspects of this are how it would affect mouse and keyboard use.  They had traditional physical mice and keyboards in the demo, but showed that you could use a screen keyboard (like you probably have on your phone) and either a virtual touchpad or a mouse control that the screen would see and treat like a real mouse.  But I imagine users will quickly turn away from physical constructs, much like smartphone users weaned themselves from keyboard phones, and more aggressively move to alternatives like speech interfaces or on-screen controls that emulate physical mice and keyboards.  Eventually, given they’ll have much more flexibility, I expect the market will come up with even more creative ways to replace these traditional constructs and better use the nearly unlimited options that a fully rendered desktop could enable. 

Wrapping Up: Timing and Futures

I’m fascinated by the Smart Desk, and I’m pretty sure I’m going to want one when it comes out.  Thanks to Microsoft’s work with the Surface table, the technology to create the specialized screen this will require already exists, and, given the demonstrations, the software isn’t that far off either.  Without pushing too hard, I figure Dell could have this in market in 24 months, and they might even be able to make it in 12.   But I think this showcases a real move to change how we configure a desk and a real move away from traditional keyboard and mouse interfaces, a shift we clearly started with tablets and smartphones years ago.  Where this is going to end up I can’t be sure, but I can hardly wait to get there.