Today I was at the 2011 Embedded Systems Conference / DesignCon exposition. I typically attend technology expos in Boston, keeping an eye out for devices and software that I might be able to use in my job. But of course, I’m also interested in what embedded systems technology will enable in the near future.
There wasn’t anything mind-blowingly cool, but I will mention a few things that may be of interest to my readers.
First, IBM had an instantiation of Watson there, which was housed in a large black monolith that would be menacing if not for the colorful touch screen. Yes, Watson can run on a computer that IBM actually sells, which is the IBM Power 750 server.
I started playing Jeopardy against this Watson, but lost interest when I found that there wasn’t any voice recognition: after winning the buzz, the software shows you the correct response, and you honorably press a button to indicate whether you actually got it right.
I also experienced NLT’s new 3D display (samples became available in June 2011). This is an LCD module that doesn’t require glasses to see the 3D, and although I stared at it for less than a minute, it did work, and I did not have to be in a very specific location relative to the screen. I’d like to try an actual application that made use of mixed 3D/2D. Part of what’s supposedly unique about this 3D LCD is that it can mix 2D and 3D content at the same resolution, thanks to their HDDP (horizontally double-density pixel) tech. NLT also claims their LCD reduces cross talk (when your brain’s visual system mixes right-eye and left-eye information).
Speaking of display tech, I also played with Uneo’s Force Imaging Array System and 3D-Touch Module. The force array was not combined with a screen, and I’m not sure exactly what the killer app(s) would be—they claim it could be used for some unspecified medical, automotive, industrial apps. But I tried it and it works, and they told me that they would have one with even higher resolution soon (the current one has 2500 elements).
The 3D-Touch module was embedded in a tablet, and that also worked pretty well. The example app was of course a paint program, where you can see how your finger’s pressure affects the brush width as you paint. This doesn’t use the array—instead it uses sensors at the corners of the screen. That means you should be able to add it to any existing screen—it doesn’t have to be layered into the display stack. I certainly could imagine this being useful, at least occasionally, in various apps on my phone. Uneo has demoed it with Android devices so far but plans on getting support from the other mobile OSes.
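As a rough illustration of the paint-demo idea (my own sketch, not Uneo’s code; the function name and width range are made up), mapping a normalized pressure reading to a brush width might look like:

```python
def brush_width(pressure, min_width=2.0, max_width=40.0):
    """Map a normalized pressure reading (0.0 to 1.0) to a brush width in pixels.

    A linear map is the simplest choice; a real paint app would likely apply
    a response curve so that light touches are easier to control.
    """
    pressure = max(0.0, min(1.0, pressure))  # clamp noisy sensor values
    return min_width + pressure * (max_width - min_width)
```

With this sketch, a feather-light touch (`pressure=0.0`) paints a 2-pixel line and a hard press (`pressure=1.0`) paints a 40-pixel one, which matches the effect I saw on the tablet.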
Microsoft was there. Nothing amazingly new…they had the Xbox 360 Wireless Speed Wheel, which ships in October as far as I know. It seems like such an obvious controller that I was surprised that it didn’t come out until 2011.
They had a Kinect there, of course, and that’s always fun to play with—I spent about 10 minutes chopping flying fruit with my sword-hands. For those who are excited by this prospect, Fruit Ninja is available as of last month. For those living under rocks, Kinect is a massively best-selling controller for the Xbox 360 that tracks the movement of your body as input for games. When it came out, people immediately started hacking it and using the sensor for robot applications. Microsoft didn’t like that at first, but now they’ve given in and offer a legit SDK (Software Development Kit) for it.
I was pleased to see that one attendee teleconned in with a VGo telepresence robot. Note this photo is of the back of the robot.