Softer, Better, Faster, Stronger

Posted in interfaces, transhumanism on September 22nd, 2010 by Samuel Kenyon

Now published on h+ magazine: my article “Softer, Better, Faster, Stronger: The Coming of Soft Cybernetics.” Check it out!

I have a titanium screw in my head.  It is a dental implant (root-form endosseous) covered with a crown.

xray of a dental implant

Note: This is a representative photo from Wikipedia, not my personal implant

Osseointegration (fusing implants with bone) is used for many things these days, such as bone-anchored hearing aids and bone-anchored leg prostheses.

photo of a bone-anchored leg prosthetic

This is cool, but there’s a major interface problem when a metal rod pokes out through your skin–the site is basically a permanently open wound.  Researchers have, however, found a solution based on deer antlers, called ITAP (Intraosseous Transcutaneous Amputation Prosthesis), in which they can get the skin to actually grow into the titanium.

photo of deer head with antlers

Deer antlers go through the skin, like bone-anchored prosthetics

They do this by carefully shaping the titanium and putting lots of tiny holes in it.  ITAPs are what the momentarily famous “bionic” cat Oscar received last June.

In these examples, biology is doing most of the work.  Sure, the chemical properties of titanium make it compatible, but when will the artificial technology pull its weight?  Where are the implants that integrate seamlessly with your body and with other implants?  Where are the computer interfaces that automatically and robustly integrate with any person’s nervous system?

Sure, there’s a lot of great medical technology which does successfully interface with human biology.  Let’s not forget the AbioCor artificial implantable replacement heart as featured in the illustrious film Crank: High Voltage.

photo of abiocor artificial heart


image of jason statham with battery charger attached to nipple and tongue (from the movie Crank 2)

Crank: High Voltage

But there are a lot of things that don’t work well yet, such as direct neural interfaces–although there are glimmers of hope such as optical interfaces to the nervous system.  And besides medical technology, what about all machines–why are they so inflexible and high-maintenance?  And it’s not just hardware–the software realm seems to be particularly behind with “soft” and flexible interfaces.

In a recent article called “Building Critical Systems as a Cyborg”, software architect Greg Ball compares the von Neumann algorithmic approach of most conventional software to the cybernetics approach.  He says:

Don’t assume those early cyberneticists would be impressed by our modern high-availability computer systems. They might even view our conventional approach to software as fatally arrogant, requiring a programmer to anticipate everything.

What if, instead of fighting changes and new interactions, our software embraced them?  A cybernetic approach to software would be oriented more around self-regulation, including regulation of parts added into the system from outside.

You might argue that regulation with feedback loops has been part of engineered systems for a long time.  But we still have a lot of brittleness in the interfaces.  It’s not easy to make systems out of components unless the interfaces match up perfectly.  In the software realm, things are pretty much the same.  Most of our technology behaves very differently from biology in terms of interfacing, adaptation, learning and growth.  Eventually we can do better than biology, but first we need to be as soft as biology.  This will help us not only in making machines that operate in the dynamic real world of humans, but also in making devices that attach directly to humans.
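
That long history of feedback regulation is easy to sketch in code.  Below is a minimal, hypothetical example (the names and numbers are mine, not from any cited system): a proportional feedback loop holding a value near a setpoint despite continual outside disturbances.

```python
import random

def regulate(setpoint, steps=200, gain=0.5):
    """Hold a noisy process variable near a setpoint using
    simple proportional feedback."""
    value = 0.0
    for _ in range(steps):
        disturbance = random.uniform(-0.1, 0.1)  # the outside world interfering
        error = setpoint - value                 # measure the deviation
        value += gain * error + disturbance      # correct a fraction of it
    return value

random.seed(0)
print(regulate(10.0))  # settles near 10.0 despite the noise
```

The point of the cybernetic framing is that the loop never has to anticipate the disturbances; it only keeps measuring and correcting.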

Do We Need Fuzzy Substrates?

photo of fuzzy thing

Computers are embedded in almost all of our devices, and most of them are digital.  Information at the low levels is stored as binary.  Biology, in contrast, often makes use of analog systems.  But does that matter?  Take fuzzy logic for example.  Fuzzy logic techniques typically involve the concept of intermediate values between true and false.  It’s a way of dealing with vagueness.  But you don’t need a special computer for fuzzy logic–it’s just a program running on a digital computer like any other program.
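
For instance, here is a toy fuzzy-logic sketch (the membership curves and names are invented for illustration): truth becomes a degree between 0 and 1, and the logical connectives become min and max, all in ordinary digital code.

```python
def warm(temp_c):
    """Membership function: how 'warm' a temperature is, from 0 to 1."""
    return max(0.0, min(1.0, (temp_c - 15) / 10))  # ramps up from 15°C to 25°C

def humid(rh):
    """Membership function for relative humidity."""
    return max(0.0, min(1.0, (rh - 40) / 30))      # ramps up from 40% to 70%

def fuzzy_and(a, b):
    return min(a, b)  # conjunction as minimum

def fuzzy_or(a, b):
    return max(a, b)  # disjunction as maximum

# "warm AND humid" is neither true nor false, just a degree:
print(fuzzy_and(warm(21), humid(55)))  # prints 0.5
```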

Fuzzy logic, probability and other soft-computing approaches could go a long way to cover the role of adaptive interfaces in the computer code of a cyborg.  But are adaptive layers running on digital substrates enough?

UCSD has been doing research with electronic neurons, which are built from analog circuitry.  So unlike most computers, the substrate does not represent information with discrete values.

Joseph Ayers and his lab members at Northeastern University were at one point attempting to use these electronic neurons in biomimetic lobster robots.  The electronic nervous system (ENS) would generate the behaviors of the robot, such as the pattern of signals that causes useful motion of the legs.  The legs are powered by nitinol (a nickel-titanium alloy) wires, which contract when heated and relax when cooled, thus causing movement.
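
That leg-driving pattern of signals can be caricatured as a central pattern generator.  Here is a deliberately simple sketch (not Ayers’ actual ENS, which uses analog chaotic neurons): two antagonist outputs driven a half-cycle apart, so the “muscles” alternate.

```python
import math

def cpg(steps=100, dt=0.1):
    """Generate an alternating activation pattern for a pair of
    antagonist actuators, like a crude central pattern generator."""
    phase = 0.0
    pattern = []
    for _ in range(steps):
        phase += dt
        flexor = max(0.0, math.sin(phase))               # drive one wire
        extensor = max(0.0, math.sin(phase + math.pi))   # antagonist, half cycle later
        pattern.append((flexor, extensor))
    return pattern

gait = cpg()
# The antagonists never fire at the same time:
assert all(f == 0.0 or e == 0.0 for f, e in gait)
```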

photo of biomimetic lobster robot from Northeastern University

biomimetic lobster robot from Northeastern University

The robots already had a digital control system, so the main point of moving to the ENS was for chaotic dynamics.  As Ayers described the situation:

The present controller is inherently deterministic, i.e., the robot does what we program it to do. A biological nervous system however is self-organizing in a stimulus-dependent manner and can use neurons with chaotic dynamics to make the behavior both robust and adaptive. It is in fact this capability that differentiates robotic from biological movements and the goal of ENS-based controllers.

Besides the dynamic chaos in nervous systems, the aforementioned UCSD group also researches synchronized chaos.  It sounds paradoxical, but it actually happens.  It could potentially be used for certain kinds of adaptable interfaces.  For instance, synchronized chaos can achieve “asymptotic stability,” which means that two systems can recover synchronization quickly after an external force messes up their sync.
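
As a toy illustration of that locking behavior (my own sketch, not the UCSD group’s actual setup), couple two chaotic logistic maps one-way; with strong enough coupling the receiver tracks the sender no matter how differently the two start out.

```python
def logistic(x, r=3.9):
    return r * x * (1 - x)  # chaotic for r near 3.9

def synchronize(steps=60, coupling=0.8):
    """One-way couple a receiver to a chaotic sender; return the
    final mismatch between their states."""
    sender, receiver = 0.4, 0.9   # very different starting states
    for _ in range(steps):
        sender = logistic(sender)
        # the receiver is nudged toward the sender's state each step
        receiver = (1 - coupling) * logistic(receiver) + coupling * sender
    return abs(sender - receiver)

print(synchronize())  # prints a tiny mismatch, far below the initial 0.5 gap
```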

I have given you a mere taste of soft cybernetics.  Its usage may have to increase, although it is not clear yet whether we need new information substrates such as analog computers.

Image Credits:

  1. DRosenbach at en.wikipedia
  2. Elizabeth Banuelos-Totman, University of Utah
  3. Marieke IJsendoorn-Kuijpers
  5. Crank: High Voltage (2009), Lionsgate
  6. Mostaque Chowdhury
  7. Jan Witting

Multitask! The Bruce Campbell Way

Posted in culture, interaction design, posthuman factors, transhumanism on September 7th, 2010 by Samuel Kenyon

I have a new essay up on the h+ magazine website:

photo of Bruce Campbell talking on a cell phone

Some have pointed out the supposed increase in multitasking during recent decades.  An overlapping issue is the increase in raw information that humans have access to.  It is certainly a fascinating sociocultural change.  However, humans are not capable of true multitasking.  First I will describe what humans do have presently, and then I will discuss what future humans might be capable of.

Read more…


The Great Drama of Interfaces

Posted in culture, interfaces on August 30th, 2010 by Samuel Kenyon

The great drama of the next few decades will unfold under the crossed stars of the analog and the digital.

—Steven Johnson, Interface Culture

Credit: Brian Despain

Credit: E. Benyaminso via A Journey Round My Skull, CC by- 2.0

Credit: J (, CC by- 2.0

Credit: Roberto Rizzato, CC by-nc 2.0



What Bruce Campbell Taught Me About Robotics

Posted in artificial intelligence, robotics on March 16th, 2010 by Samuel Kenyon

One of the films which inspired me as a kid was Moontrap, the plot of which has something to do with Bruce Campbell and his comrade Walter Koenig bringing an alien seed back to earth.


Nothing ever happens on the moon

This alien (re)builds itself out of various biological and electromechanical parts.

The Moontrap robot

The Moontrap robot

At one point the robot had a skillsaw end effector, not unlike the robot in this exquisite depiction of saw-hand prowess:

Cyborg Justice

Cyborg Justice (Sega Genesis, 1993)

In that game—which I also played as a child—you could mix-and-match legs, torsos, and arms to create robots.

The later movie Virus had a similar creature to the one in Moontrap, and if I remember correctly, the alien robots in the movie *batteries not included could modify and reproduce themselves from random household junk.

The ability for a creature to compose and extend itself is quite fascinating. Not only can it figure out what to do with the objects it happens to encounter, but it can adjust its mental models in order to control these new extensions.

I think that building yourself out of parts is only a difference in degree from tool use.


During the long watches of the night the solitary sailor begins to feel that the boat is an extension of himself, moving to the same rhythms toward a common goal.  The violinist, wrapped in the stream of sound she helps to create, feels as if she is part of the “harmony of the spheres.”  The climber, focusing all her attention on the small irregularities of the rock wall that will have to support her weight safely, speaks of the sense of kinship that develops between fingers and rock, between the frail body and the context of stone, sky, and wind. —Csikszentmihalyi [1]

Human tool use

Humans are perhaps the most adaptable of animals on earth (leave a comment if you know of a more adaptable organism).

Our action-perception system may have morphology-specific programming. But it’s not so specific that we cannot add or subtract from it. For instance, anything you hold in your hand becomes essentially an extension of your arm. Likewise, you can adapt to a modification in which you completely replace your hand with a different type of end effector.

Alternate human end effector

You might argue that holding something does not really extend your arm. After all, you aren’t hooking it directly to your nervous system. But the brain-environment system does treat external objects as part of the body.

We have always been coupled with technology. We have always been prosthetic bodies.

Something unique about hands is that they may have evolved due to tool use.  Bipedalism allowed this to happen.  About 5 million years after bipedalism, tool use and brain expansion appeared [2].  It’s possible that the Homo sapiens brain was the result of co-evolution with tools.

Oldowan Handaxe

Oldowan Handaxe (credit: University of Missouri)

The body itself is part of the environment, albeit a special one as far as the brain is concerned. The brain has no choice but to have this willy-nilly freedom of body size changes—or else how would you be able to grow from a tiny baby to the full size lad/gal/transgender you are today?

An example of body-environment overlap is the cutaneous rabbit hopping out of the body experiment [3].

rabbit tattoo

The white cutaneous rabbit

The original cutaneous (“of the skin”) rabbit experiment demonstrated a somatosensory illusion: your body map (in the primary somatosensory cortex) will cause you to report tapping (the “rabbit” hopping) on your skin in between the places where the stimulus was actually applied.  The out-of-the-body version extends this illusion onto an external object held by your body (click on figure below for more info).

Hopping out of the body

Hopping out of the body (credit: Miyazaki, et al)

Some other relevant body map illusions are the extending nose illusion, the rubber hand illusion, and the face illusion.

Get Your Embody Beat

Metzinger’s self-model theory of subjectivity [4] defines three levels of embodiment:

First-order: Purely reflexive with no self-representation. Most uses of subsumption architecture would be categorized as such.

Second-order: Uses self-representation, which affects its behavior.

Third-order: In addition to self-representation, “you consciously experience yourself as embodied, that you possess [a] phenomenal self-model (PSM)”. Humans, when awake, fall into this category.



Metzinger refers to the famous starfish robot as an example of a “second-order embodiment” self-model implementation. The starfish robot develops its walk with a dynamic internal self model, and can also adapt to body subtractions (e.g. via damage).
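
A cartoon of that second-order idea (entirely hypothetical names, nothing from the actual starfish robot): behavior is re-derived from an internal self-representation whenever that representation changes, so damage handling falls out for free.

```python
def plan_gait(self_model):
    """Derive a stepping order from whichever limbs the
    self-representation currently says are working."""
    return sorted(leg for leg, works in self_model.items() if works)

# The robot's internal model of its own body:
body = {"leg_1": True, "leg_2": True, "leg_3": True, "leg_4": True}
print(plan_gait(body))   # all four legs in the stepping cycle

body["leg_3"] = False    # damage updates the self-model...
print(plan_gait(body))   # ...and the behavior adapts without reprogramming
```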

I don’t see why we can’t develop robots that learn how to use tools and even adapt them into their bodies.  The natural way may not be the only way, but it’s at least a place to start when making artificial intelligence.  AI has an advantage, though, even when using naturally inspired methods: researchers can speed up phylogenetic development.

What I mean by that is that I could adapt a robot to a range of environments through evolution in simulations running much faster than real time.  Then I could deploy that robot in real life, where it continues its learning, having already learned via evolution the important and general stuff that keeps it alive.
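
That evolve-in-simulation idea can be sketched as a toy genetic algorithm (everything below is hypothetical; the “simulation” is just a stand-in fitness function rewarding genomes near some ideal gait parameters).

```python
import random

def simulated_fitness(genome):
    """Toy stand-in for a fast physics simulation: reward genomes
    close to some ideal gait parameters."""
    ideal = [0.5, -0.2, 0.8]
    return -sum((g - i) ** 2 for g, i in zip(genome, ideal))

def evolve(generations=100, pop_size=30, mutation=0.1):
    """Evolve genomes in simulation before real-world deployment."""
    pop = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=simulated_fitness, reverse=True)
        survivors = pop[: pop_size // 2]   # truncation selection, best kept
        pop = survivors + [
            [g + random.gauss(0, mutation) for g in random.choice(survivors)]
            for _ in range(pop_size - len(survivors))
        ]
    return max(pop, key=simulated_fitness)

random.seed(0)
best = evolve()
print(best)  # ends up close to the ideal gait parameters
```

The real robot would then start from `best` and keep learning online, rather than from scratch.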

Body Mods

The ancient art of cyborg hands

This natural adaptability that you have as part of your interaction with the world could also help you modify yourself with far stranger extensions than chainsaws and cyborg hands.

Well-designed cyborg parts will exploit this natural adaptability to modify your morphology, if you so desire. Perhaps the same scheme could work even with a complete body replacement, or a mind-in-computer scenario in which you may have multiple physical bodies to choose from.



[1] M. Csikszentmihalyi, Flow: The Psychology of Optimal Experience. New York: Harper Perennial, 1990.

[2] R. Leakey, The Origin of Humankind. New York: BasicBooks, 1994.

[3] M. Miyazaki, M. Hirashima, D. Nozaki, “The ‘Cutaneous Rabbit’ Hopping out of the Body.” The Journal of Neuroscience, February 3, 2010, 30(5):1856-1860; doi:10.1523/JNEUROSCI.3887-09.2010.

[4] T. Metzinger, “Self models.” Scholarpedia, 2007, 2(10):4174.
