Rambling: Should All Software Be Real-Time?

Posted in programming on February 2nd, 2013 by Samuel Kenyon

As a design guideline, most disembodied “smart” computer systems should be real-time, at least of the soft variety. But they aren’t. We’ve gotten used to the cheap non-real-time properties of mainstream software.

Evolution may have resulted in that design guideline for minds. But our computer programs and networks aren’t minds—at least not yet. However, successful narrow AI techniques continually get added to the toolbox of software engineering. Although most software systems would not be considered “minds” under any partitioning, they do perform narrow tasks. Some tasks are human-level, such as identifying faces in a photo on Facebook. Some are not human-level at all, such as a MapReduce system searching exabytes of data.

All of these should respond quickly enough to all inputs so as not to slow down or foul up the system. The humans involved will not wait very long on their end. Yet obviously a lot of computer programs are not real-time (not even soft real-time).

It’s more about usability—programs with slow or inconsistent response times are less useful and more annoying. They survive in the software / Internet ecosystem only because the nature of that ecosystem is different from the nature of biological evolution.
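The soft real-time idea above can be sketched in a few lines. This is a minimal illustration (the 50 ms budget and all names are hypothetical, not from the post): a soft real-time task still runs to completion when it misses its deadline, but the miss is tracked, since it degrades usefulness rather than being an outright failure as it would be in a hard real-time system.

```python
import time

SOFT_DEADLINE_S = 0.05  # hypothetical 50 ms responsiveness budget


def handle_request(work):
    """Run `work` to completion, reporting whether the soft deadline was met."""
    start = time.monotonic()  # monotonic clock: correct for measuring intervals
    result = work()
    elapsed = time.monotonic() - start
    # Soft real-time: a miss is recorded, not fatal; the result is still used.
    return result, elapsed <= SOFT_DEADLINE_S


# A fast task meets the deadline; a slow one still completes, but misses it.
fast_result, fast_ok = handle_request(lambda: sum(range(100)))
slow_result, slow_ok = handle_request(lambda: time.sleep(0.2) or "done")
print(fast_ok, slow_ok)  # typically: True False
```

A real system would act on a miss (shed load, return a degraded answer, log it); here the flag just makes the deadline visible.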


The Disillusionment of Math

Posted in philosophy, robotics on July 16th, 2012 by Samuel Kenyon

The Early Years


When I was in elementary school, a consultant who offered optional advanced studies taught a small group of us some basic algebra. This was amazing to me at the time—solving for the mysterious x!

The next amazing mathematical concept I learned of was imaginary numbers. Just like pornography, I learned about them long before I was supposed to.



Embedded Systems Expo 2011: A Few Notes

Posted in artificial intelligence, interfaces, robotics on September 28th, 2011 by Samuel Kenyon

Today I was at the 2011 Embedded Systems Conference / DesignCon exposition. I typically attend technology expos in Boston, keeping an eye out for devices and software that I might be able to use in my job. But of course, I’m also interested in what embedded systems technology will enable in the near future.
There wasn’t anything mind-blowingly cool, but I will mention a few things that may be of interest to my readers.

First, IBM had an instantiation of Watson there, housed in a large black monolith that would be menacing were it not for the colorful touch screen. Yes, Watson can run on a computer that IBM actually sells: the IBM Power 750 server.

IBM Watson

I started playing Jeopardy against this Watson, but lost interest when I found that there wasn’t any voice recognition (to get a question right after winning the buzz, the software would tell you the answer, at which point you would honorably press a button to confirm whether or not you got it).

I also experienced NLT’s new 3D display (samples became available in June 2011). This is an LCD module that does not require glasses to see the 3D, and although I stared at it for less than a minute, it did work, and I did not have to be in a very specific location relative to the screen. I’d like to try an actual application that makes use of mixed 3D/2D. That ability is part of what’s supposedly unique about this 3D LCD: it can mix 2D and 3D content at the same resolution, thanks to their HDDP (horizontally double-density pixel) tech. NLT also claims their LCD reduces crosstalk (when your brain’s visual system mixes right-eye and left-eye information).


Speaking of display tech, I also played with Uneo’s Force Imaging Array System and 3D-Touch Module. The force array was not combined with a screen, and I’m not sure exactly what the killer app(s) would be—they claim it could be used for some unspecified medical, automotive, and industrial applications. But I tried it and it works, and they told me they would soon have one with even higher resolution (the current one has 2500 elements).

The 3D-Touch module was embedded in a tablet, and that also worked pretty well. The example app was of course a paint program, where you can see how your finger’s pressure affects the brush width as you paint. This doesn’t use the array—instead it uses sensors at the corners of the screen. That means you should be able to add it to any existing screen—it doesn’t have to be layered into the display stack. I certainly could imagine this being useful, at least occasionally, in various apps on my phone. Uneo has demoed it with Android devices so far but plans on getting support from the other mobile OSes.

Uneo 3D Touch example (photo from Uneo)

Microsoft was there. Nothing amazingly new…they had the Xbox 360 Wireless Speed Wheel, which ships in October as far as I know. It seems like such an obvious controller that I was surprised it didn’t come out until 2011.

Xbox 360 Wireless Speed Wheel (stock photo)

They had a Kinect there, of course, and that’s always fun to play with—I spent about 10 minutes chopping flying fruit with my sword-hands. For those who are excited by this prospect, Fruit Ninja Kinect has been available since last month. For those living under rocks: Kinect is a massively best-selling controller for the Xbox 360 that tracks the movement of your body as input for games. When it came out, people immediately started hacking it and using the sensor for robot applications. Microsoft didn’t like that at first, but has since given in and offers a legit SDK (Software Development Kit) for it.

Fruit Ninja Kinect (stock screenshot)

I was pleased to see that one attendee teleconferenced in with a VGo telepresence robot. Note that this photo is of the back of the robot.

VGo robot in use at ESC 2011


Under the Dome: MIT Open House

Posted in culture, robotics on June 11th, 2011 by Samuel Kenyon

This is a belated post from April. I live near MIT, so when they held an open house on April 30 I felt it was my duty to attend.

However, the most surprising thing was not the technology on display so much as the vast swarms of yuppie larvae—aka virus vectors, aka children. After a while (about 5 minutes) my perception of their presence incremented from “cute” to “horrific.” Even worse were the parents of said children, whose method of navigating crowds consisted of crashing into other people like a bunch of semi-autonomous pinballs. So I departed, but not without taking a few photos first.

El Cheapo Multi-Touch Table

Innards of the Student-Built Multi-Touch Table

Cars That Won't Crash

Supervisory Control of Cyberphysical Systems (poster)

A Wearable Vital Signs Monitor at the Ear (poster)

And now photos of human children engaged with robots:

Children with Robots

And just for shits and giggles, here are some ancient computing artifacts that were on display in the Stata Center. The first is an Atari 2600 “video computer system” (nowadays, a “console”) with a Space Invaders cartridge, right underneath a sign about Moore’s Law.

Atari 2600 console (released in 1977) with Space Invaders cartridge

And one of the first cell phones, being fondled by me:

Motorola DynaTAC 8000X (circa 1983)

I conclude with a video I took of a good ol’ floating electromagnet:
