Monday, August 25, 2008

More progress...

Well, two days and four drill bits later, the casters are finally mounted. They had to be modified from the original design: instead of bending the caster forks, I fabricated custom brackets from spare steel and fastened them to the gimbal frame. It turns out this works just as well after extending the base by about 2".

The PVC Frame as it is now:


A short clip of the rolling motion in action:

Wednesday, August 13, 2008

The control panel is done. Wiring and labels to be completed after all PVC work is done. This will be mounted inside the PVC frame.


Below are two photos of the current status of the PVC frame. All pieces are complete; however, I've run into a bit of a snag mounting the casters. I've come up with an alternate design which should work the same but will involve mounting L-brackets to the casters and extending the base dimensions by a bit. I'd estimate total project completion at about 80% so far.


Wednesday, July 23, 2008

PVC cuts are complete. Need to bend forks on casters to allow first semblance of motion.

Control panel layout is complete. Wiring begins soon.

Tuesday, April 22, 2008

PVC is in place and awaiting cuts. Rough mockup of control panel complete.

Tuesday, February 5, 2008

The "Joyrider" model will be used to provide a low-cost means of reproducing sensations of movement. It's model can later be modified to further enhance movement as specified here.
Immersion is the state where you cease to be aware of your physical self. It is frequently accompanied by intense focus, distorted sense of time and effortless action.[1]

^ Varney, Allen (August 8, 2006). Immersion Unexplained (HTML). The Escapist. Retrieved on 2007-04-06
  • Visual: Modern graphic cards, expansive screen
  • Tactile: Force feedback from unit, switches & panels mimic flight dash
  • Aural: Force feedback from audio signal, headphones coupled with surround sound
  • Motion: Motion of unit coupled with head tracking technology
Project Background:

Flight simulations have long provided a benchmark for the progress of both raw computing power and interactivity between user and machine. They are often cited as one of the most strenuous tests a user can run on a machine: the computer is asked to render miles of landscape while maintaining an object moving through and interacting with that landscape, all while abiding by natural laws of physics.

The implied objective of any simulation is to allow the virtual world to mimic reality to the highest degree possible: that is, the computer simulates reality. Yet simulations often lack a physical counterpart aside from the consumer's choice of joystick or mouse-and-keyboard combination. In everything from museum installations to engineering projects, the keyboard is the accepted input method, mostly out of sheer habit. Imagine instead a flight exhibit that incorporates dedicated levers and pushbuttons. In this case the user's feedback is both tactile and direct. They do not have to remember which key on the keyboard represents acceleration; they simply and naturally push the throttle lever forward. The user is able to experience the simulation without the distraction of an input device that is at best non-immersive.

One device has introduced practicality to the construction of such interfaces. Simple push buttons and switches can be wired in elaborate chains to create virtually any interface imaginable when combined with a small integrated circuit called a keyboard encoder. The encoder board takes a button press and converts it to a keystroke using a matrix methodology. For example, from the computer's perspective the user is hitting the "G" key on the keyboard, when in fact the user has actually lowered a large arm marked "landing gear". In this way, literally hundreds of switches and levers can be arranged to perform complex functions by means of a single keyboard encoder.
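
To make the matrix idea concrete, here is a minimal sketch of how a switch matrix can be scanned and translated into keystrokes. It assumes an Arduino Leonardo-class board standing in for the dedicated encoder board used in this project; the pin numbers and the key map (for example, a landing-gear lever sending "g") are illustrative, not the actual wiring.

    // Minimal matrix-scan sketch standing in for a keyboard encoder.
    // Assumes an Arduino Leonardo-class board; pins and key map are
    // illustrative only, not the project's actual wiring.
    #include <Keyboard.h>

    const byte ROWS = 2;
    const byte COLS = 2;

    // Each switch sits at a row/column intersection of the matrix.
    const byte rowPins[ROWS] = {2, 3};
    const byte colPins[COLS] = {4, 5};

    // Keystroke sent to the simulator for each switch, e.g. the
    // "landing gear" lever at row 0, column 0 sends 'g'.
    const char keyMap[ROWS][COLS] = {
      {'g', 'f'},   // landing gear, flaps
      {'b', 'p'}    // brakes, pause
    };

    bool lastState[ROWS][COLS] = {};

    void setup() {
      for (byte r = 0; r < ROWS; r++) {
        pinMode(rowPins[r], OUTPUT);
        digitalWrite(rowPins[r], HIGH);   // rows idle high
      }
      for (byte c = 0; c < COLS; c++) {
        pinMode(colPins[c], INPUT_PULLUP);
      }
      Keyboard.begin();
    }

    void loop() {
      // Scan the matrix: drive one row low at a time and read the columns.
      for (byte r = 0; r < ROWS; r++) {
        digitalWrite(rowPins[r], LOW);
        for (byte c = 0; c < COLS; c++) {
          bool pressed = (digitalRead(colPins[c]) == LOW);
          if (pressed && !lastState[r][c]) Keyboard.press(keyMap[r][c]);
          if (!pressed && lastState[r][c]) Keyboard.release(keyMap[r][c]);
          lastState[r][c] = pressed;
        }
        digitalWrite(rowPins[r], HIGH);
      }
      delay(10);  // crude debounce
    }

A commercial encoder board does the same job in firmware; the point is simply that every physical control reduces to a row/column position that maps to a single keystroke the simulator already understands.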

I first employed a keyboard encoder in the construction of an arcade cabinet from scratch. (See http://arinsmame.blogspot.com for the construction blog.) I became fascinated with the process of combining woodwork with technology and wanted to eventually build a device which would immerse the user beyond the abilities of traditional input methods.


Objective:

The objective of this project is to develop a better knowledge of human interface devices and immersive environments by exploring options for interacting with software platforms, using a flight cockpit as a test model. This model will focus on dedicated inputs, each correlating with a single resulting function in the flight simulator. The final project will consider the practicality of designing specialized input devices for a specific software platform, as well as the feasibility of widespread implementation across multiple studies.

Methodology:

This project has four phases: planning, construction, testing and review.

Phase I will consider the layout and means of construction of a flight instrument panel.

Phase II will involve gathering the switches, wire, wood and construction materials required to assemble a working flight cockpit.

Phase III will use flight simulation software to test and tweak immersion of the user with the simulation package.

Phase IV involves compiling the results of the project and gauging whether the human interface device (in this case, the cockpit) has allowed the user to experience the simulation more fully, and how it has affected the resulting simulation experience. All findings and user experiences will be compiled on a web build log. Video footage of test subjects will be hosted as well.

Evaluation:

A functioning flight cockpit and demonstration to library staff will mark the project complete. Subsequent comments and user experiences will be welcomed on the project’s build log maintained on the web.

Introduction

This blog will serve as a progress log for the project: "Interactivity & The Flight Model". This project will seek to explore the notion of immersion as applied to an installation and its effects upon the user's experience.