Games controllers can end up in the strangest places. Just this week,
the US Navy announced it had approved a laser weapon to be deployed on
an amphibious vessel serving in the Persian Gulf. The weapon is
essentially the kind of death ray that science fiction has been
promising for decades. And, as the demonstration showed, this space-age
weapon is guided by something every self-respecting 14-year-old is
familiar with: a controller just like those used to play video games.
They
used to be such simple devices. A single control stick and a few
buttons were all a gamer needed to blast aliens or score a winning goal
on the primitive, pioneering games consoles.
But in the last few
decades, they have grown up. They’ve become more intelligent, more
pleasing to hold and use, and able to adapt to the increasing complexity
of the gameplay they are meant to be controlling.
And their
effect is starting to be felt a long way from the first-person shooters and football simulations that have shaped their design over the years.
Video game controllers can now be found in an astonishing range of
places, from pilots controlling drones to medical students training
through virtual surgery.
Drone operators, like this one in Iraq, have been using flight sim joysticks (Science Photo Library)
But how did we get here? Today’s Xbox or
PlayStation controllers took a long time to achieve their current,
ergonomic form. In 1958 the American physicist William Higinbotham
created the interactive game Tennis for Two. The game paired an early
analogue computer with an oscilloscope (the kind of instrument you’d see
in a mad scientist’s lab in a 1950s B movie) serving as a basic
monitor. Players hit the ball using an aluminium controller.
The first commercially available game controller was
released over a decade later in 1972, with the Magnavox Odyssey’s pair
of gaming paddles. The Odyssey’s 2D games were crude simulations of
sports such as tennis or basketball.
Each of the paddles was
fitted with two dials on either side. One dial controlled horizontal
movement on screen whilst the other would control the vertical. Each
dial was connected to a device which regulated the flow of electricity,
within the controller. Twisting these dials would influence the
electrical flow which would in turn translate into the appropriate
movement on screen. Analogue direction
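The principle is simple enough to sketch in a few lines of code. The example below is purely illustrative (the Odyssey was an analogue machine with no software at all); it assumes two dial readings normalised between 0 and 1 and maps them onto a position on the screen.

```python
# Illustrative sketch only: the Odyssey had no software, but the idea of
# mapping two dial (potentiometer) readings onto a screen position looks
# roughly like this. Dial values are assumed to be normalised to 0.0-1.0.

SCREEN_WIDTH = 320
SCREEN_HEIGHT = 240

def dials_to_position(horizontal_dial: float, vertical_dial: float) -> tuple[int, int]:
    """Convert two dial readings (0.0-1.0) into an on-screen (x, y) position."""
    x = int(horizontal_dial * (SCREEN_WIDTH - 1))
    y = int(vertical_dial * (SCREEN_HEIGHT - 1))
    return x, y

# Twisting the horizontal dial halfway and the vertical dial a quarter turn
# places the player's spot towards the upper middle of the screen.
print(dials_to_position(0.5, 0.25))   # -> (159, 59)
```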
Analogue direction
It was only with the release of the Nintendo Entertainment System in the 1980s that controllers radically evolved into the gamepad design we are now familiar with. This style of controller replaced the once-popular joystick with a directional pad and a set of action buttons. It became the de facto standard until the Nintendo 64 controller arrived, complete with an analogue stick.
Also known as the control stick or thumbstick, the analogue stick is a variation on the joystick. Unlike a directional pad, which registers each direction as a simple on/off press, an analogue stick gives a graded response based on how far the player pushes it. Suddenly characters could be steered with fine precision in any direction, rather than snapping between a handful of fixed ones. “The first analogue-joystick
controller (for the N64) was especially designed to enable the kind of
3D games that [Shigeru] Miyamoto wanted to make, such as Super Mario
64,” says Steven Poole, author of Trigger Happy 2.0: The Art and
Politics of Videogames.
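The difference is easy to illustrate with a rough sketch, which is not any console's actual firmware: a directional pad reports each direction as simply pressed or not, while an analogue stick reports how far it has been pushed, with a small "dead zone" ignored so the character does not drift.

```python
# Rough illustration of digital vs analogue input handling; the names and
# dead-zone value are assumptions, not any console's real firmware.

DEAD_ZONE = 0.15   # small deflections are ignored so the character doesn't drift
MAX_SPEED = 5.0    # maximum movement speed in arbitrary units per frame

def dpad_velocity(left: bool, right: bool) -> float:
    """A directional pad is on/off: you move at full speed or not at all."""
    direction = (1 if right else 0) - (1 if left else 0)
    return direction * MAX_SPEED

def stick_velocity(deflection: float) -> float:
    """An analogue stick is graded: speed scales with how far it is pushed.

    `deflection` runs from -1.0 (fully left) to +1.0 (fully right).
    """
    if abs(deflection) < DEAD_ZONE:
        return 0.0
    return deflection * MAX_SPEED

print(dpad_velocity(left=False, right=True))  # -> 5.0 (all or nothing)
print(stick_velocity(0.3))                    # -> 1.5 (a gentle walk)
print(stick_velocity(0.9))                    # -> 4.5 (nearly a sprint)
```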
In recent years, a lot of effort has gone into enhancing comfort too. Zulfi Alam, Microsoft’s personal devices
general manager, states that comfort and immersive gaming were the key
qualities when designing the controller for the recent Xbox One. “The
main thing was that you need to make the controller fit into the palm of
your hand,” explains Alam. “The variation in palm sizes is so
significant that getting the comfort right was paramount.”
The BodyViz simulation uses Xbox controllers, which are thought to be more intuitive than a mouse (BodyViz)
In fact, game controllers have now become so
ergonomic and efficient at navigating us through virtual worlds that
they are finding uses beyond just video games.
The 3D MRI and CAT
scan visualisation software BodyViz uses Xbox controllers to manipulate
the view of the display. The previous mouse-and-keyboard method proved cumbersome. However, Curt Carlson, the president and CEO of
BodyViz, found the Xbox controller to be a much simpler solution. The
design of the controller makes it easier for surgeons to intuitively
“rotate, pan, zoom or fly-through a patient's virtual anatomy” in order
to properly prepare for invasive surgery.
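BodyViz has not published its control scheme, but the general idea of mapping a gamepad's sticks and triggers onto camera movements might look something like the sketch below, in which the axis names and sensitivities are illustrative assumptions.

```python
# Hypothetical sketch of mapping gamepad axes to a 3D camera; the axis
# names and sensitivities are assumptions, not BodyViz's actual scheme.

from dataclasses import dataclass

@dataclass
class Camera:
    yaw: float = 0.0     # rotation around the vertical axis, in degrees
    pitch: float = 0.0   # rotation up/down, in degrees
    pan_x: float = 0.0   # sideways translation
    pan_y: float = 0.0   # vertical translation
    zoom: float = 1.0    # scale factor

def apply_gamepad(cam: Camera, left_x: float, left_y: float,
                  right_x: float, right_y: float, triggers: float) -> Camera:
    """Update the camera from one frame of controller input (all axes -1..1)."""
    cam.yaw += right_x * 2.0            # right stick rotates the view
    cam.pitch += right_y * 2.0
    cam.pan_x += left_x * 0.05          # left stick pans across the anatomy
    cam.pan_y += left_y * 0.05
    cam.zoom *= 1.0 + triggers * 0.02   # triggers zoom in and out
    return cam

cam = Camera()
apply_gamepad(cam, left_x=0.0, left_y=0.0, right_x=0.5, right_y=0.0, triggers=1.0)
print(cam.yaw, round(cam.zoom, 2))   # -> 1.0 1.02
```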
Game controllers are
also finding roles in the armed services. Tim Trainer, a vice president at iRobot's Defence & Security business unit, has been taking controllers out of the living room and into military service. The original PackBot bomb disposal robot came with a 20kg Portable Command Console (PCC); that console was later replaced by a toughened laptop with a PlayStation controller plugged into it. This new control method was far lighter than
the previous PCC. Trainer says the “younger military operator has
hundreds of thousands of hours [experience] on game-style controllers,
so the training and take-up time for becoming proficient is minimal.”
Virtual interface
iRobot
has also developed “full-spec” qualified controllers, designed for heavy-duty use in extreme conditions. However, these militarised variants cost thousands of dollars to produce and are three times as heavy as a normal controller. Operators have found that the original PlayStation controller was already fairly robust and, should it break, costs only £20 to replace.
The US Navy this week released footage of the laser test
iRobot has also developed a tablet app that lets the user control the robot using a pair of virtual control stalks on the screen.
With touch-screen interfaces becoming commonplace,
developers are also making the most of our familiarity with video game
controllers. By adapting the technique, rather than the technology, they
have developed a virtual interface that apes games controllers. By
moving their thumbs over the virtual control stalks displayed on the
screen, users can control and navigate through the environment using the
touch-screen interface of a mobile device, be it a tablet or a smartphone.
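At heart, such a virtual stalk is just a little geometry: the thumb's offset from the stalk's centre is clamped to the stalk's radius and then treated like an ordinary analogue stick reading. The sketch below is a generic, assumed implementation rather than any particular app's code.

```python
# Generic sketch of a virtual on-screen thumbstick; the coordinates and
# radius below are illustrative assumptions, not any particular app's code.

import math

def virtual_stick(touch_x: float, touch_y: float,
                  centre_x: float, centre_y: float,
                  radius: float) -> tuple[float, float]:
    """Turn a touch position into an analogue-style (x, y) reading in -1..1."""
    dx = touch_x - centre_x
    dy = touch_y - centre_y
    distance = math.hypot(dx, dy)
    if distance == 0:
        return 0.0, 0.0
    # Limit the thumb to the edge of the stalk's circle, keep its direction,
    # then scale the result so a full deflection reads as 1.0.
    clamped = min(distance, radius)
    return (dx / distance) * (clamped / radius), (dy / distance) * (clamped / radius)

# A thumb 60 pixels right of an 80-pixel stalk reads as three-quarters deflection;
# anything beyond the stalk's edge counts as a full push.
print(virtual_stick(160, 400, centre_x=100, centre_y=400, radius=80))  # (0.75, 0.0)
print(virtual_stick(300, 400, centre_x=100, centre_y=400, radius=80))  # (1.0, 0.0)
```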
Nasa is trying a similar approach. The Jet Propulsion Laboratory at the California Institute of Technology is currently developing a “natural” control system for a robot arm, using the Oculus Rift virtual reality headset and the Xbox Kinect motion-sensing input device. “My lab often finds that we can apply technologies from
seemingly unlikely sources, such as the video game industry, to
challenges in space exploration,” says Jeff Norris, founder of JPL’s Ops
Lab, which designs ways for humans to interact with robots.
iRobot's EOD bot can be manipulated with a PlayStation controller (iRobot)
Oculus Rift could allow an operator to get a better sense of how the robot relates to its environment; a head-mounted display gives the operator a view of the robot arm much like the view they have of their own.
“I’ve found that at times it makes the robot arm feel a bit like an
extension of your own arm,” Norris says.
Most robots are
controlled through joysticks, but Norris discovered that “people are
very inefficient when controlling a robot arm with a joystick because
the mapping of the different buttons and axes of the joystick are often
unintuitive.”
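The point about unintuitive mappings can be made concrete with a simple, purely illustrative contrast (not JPL's actual system): a joystick typically nudges individual joints one axis at a time, leaving the operator to work out how those changes add up, whereas a tracked hand can simply tell the arm's gripper where to go.

```python
# Illustrative contrast only; the names and numbers are assumptions, not JPL's code.

# Joystick-style control: each axis nudges one joint angle, and the operator
# must mentally work out how those joint changes combine into gripper motion.
def joystick_step(joint_angles: list[float], axis_values: list[float],
                  rate: float = 1.0) -> list[float]:
    return [angle + axis * rate for angle, axis in zip(joint_angles, axis_values)]

# Tracked-hand control: the operator's hand position simply becomes the target
# for the gripper, and an inverse-kinematics solver (not shown) works out the
# joint angles needed to reach it.
def hand_tracking_step(hand_position: tuple[float, float, float]) -> tuple[float, float, float]:
    gripper_target = hand_position
    return gripper_target

print(joystick_step([10.0, 45.0, -30.0], [0.0, 0.5, -0.2]))  # nudges two joints
print(hand_tracking_step((0.4, 0.1, 0.6)))                    # move the gripper here
```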
Although this particular robot will not leave the
Jet Propulsion Laboratory, the techniques developed there will be used in
the future. The robot arms mounted on the exterior of the International
Space Station, for instance, may one day be manipulated with a similar
device.
All of which suggests that games controllers aren’t just for fun. They can do serious work too.