assign constants to wave patterns and transmit to pc — Parallax Forums

assign constants to wave patterns and transmit to pc

DataFreedom Posts: 6
edited 2008-03-06 13:27 in BASIC Stamp
No Laughing!


I got hold of a digital EEG on eBay. I have been concentrating on stupid things for long periods of time, like concentrating on cookies. I have been up now for 9 days (need a new coffee pot), and with the unit connected directly to my PC via serial port I have been able to collect similar wave functions and store them. By thinking "cookie" (with a bit of VB) I can click the Start button. OK. The frequency range is 0.5 - 400 Hz at 12 bits.


Can a BASIC Stamp sample the EEG, convert it from digital to analog, assign a value/constant to the wave data, convert it back to digital, and transmit it back to the PC via Bluetooth?

I want to make it mobile so I can always have it on. If nothing else I'll wear a laptop in a backpack, but the EEG itself is like 30 lbs :S
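One way to picture the "assign a value/constant to the wave data" step on the PC side: keep a library of recorded template windows, give each template a name (constant), and tag each incoming sample window with the best-matching template. This is only an illustrative Python sketch, not anything from the thread; the template names, window length, and 0.8 threshold are all made-up assumptions.

```python
def normalized(samples):
    """Return a zero-mean, unit-norm copy of a sample window."""
    n = len(samples)
    mean = sum(samples) / n
    centered = [s - mean for s in samples]
    norm = sum(c * c for c in centered) ** 0.5 or 1.0
    return [c / norm for c in centered]

def classify(window, templates, threshold=0.8):
    """Return the constant (name) of the best-matching template,
    or None if nothing matches well enough. Templates is a dict
    mapping a name to a recorded sample window of the same length."""
    w = normalized(window)
    best_name, best_score = None, threshold
    for name, template in templates.items():
        t = normalized(template)
        score = sum(a * b for a, b in zip(w, t))  # cosine similarity
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```

The VB side could then map each returned constant to a UI action (e.g. the Start button click described above).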

Comments

  • DataFreedom Posts: 6
    edited 2008-03-05 12:20
    I have also noticed that the longer I am awake, the less coherent the wave patterns are. I think with some time I can write an algorithm to calculate my uptime (: o


    And after having had some sleep, I think I can use that algorithm (after some development) to act as a correctional bias for the command-input VB program.
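    A rough idea of what such a coherence metric could look like: score how strongly consecutive sample windows correlate with each other, so a steady, repetitive waveform scores near 1.0 and an incoherent one near 0.0. This Python sketch is purely hypothetical (the window size and the correlation-based scoring are assumptions); mapping the score to hours awake would still need calibration against recorded data.

```python
import math

def window_correlation(a, b):
    """Pearson correlation between two equal-length sample windows."""
    n = len(a)
    ma = sum(a) / n
    mb = sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb) if sa and sb else 0.0

def coherence_score(samples, window=64):
    """Mean correlation between consecutive windows: near 1.0 for a
    steady, repetitive waveform, near 0.0 for incoherent data."""
    windows = [samples[i:i + window]
               for i in range(0, len(samples) - window + 1, window)]
    if len(windows) < 2:
        return 0.0
    pairs = zip(windows, windows[1:])
    return sum(window_correlation(a, b) for a, b in pairs) / (len(windows) - 1)
```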

    Post Edited (DataFreedom) : 3/6/2008 1:27:08 PM GMT
  • DataFreedom Posts: 6
    edited 2008-03-05 12:38
    Firmware info for other microcontrollers that are actually home-brew EEGs.

    Check out what's going on over at SourceForge:
    sourceforge.net/projects/openeeg




    ModularEEG Firmware Readme
    ==========================

    License

    All source code is licensed under the GNU General Public License (GPL) v2.
    You can obtain your copy of the license from the Free Software Foundation
    at www.fsf.org


    Compiler information

    If you want to modify the sources and recompile the .hex files you need to
    use avr-gcc, a version of GCC compiled for AVR microcontroller support.

    Documentation, and links to downloads for Windows and Linux are
    available from http://www.avrfreaks.net


    Firmware selection

    There are several flavors to choose from. Obviously you must pick one that
    matches your microcontroller (ATmega8 or AT90S4433).

    * Data transmission formats

    There are two versions currently in use. The older version 2 is being
    phased out, but is still around because ElectricGuru does not (yet) support
    version 3, which is the recommended format.

    This means that for testing and tuning with ElectricGuru, you need to use
    v2 before switching to v3 when you are done.
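    For reference, a PC-side decoder for a v2 packet could look roughly like the sketch below. The layout assumed here (17 bytes: two sync bytes 0xA5 0x5A, a version byte, a packet counter, six 2-byte channel values high byte first, and a switch byte) is recalled from the OpenEEG project's documentation; verify it against the official P2 format description before relying on it.

```python
def parse_p2_packet(packet):
    """Decode one 17-byte ModularEEG P2 packet into a dict.
    Assumed layout (check against the OpenEEG P2 spec):
      [0]=0xA5 sync, [1]=0x5A sync, [2]=version, [3]=counter,
      [4..15]=six channels, high byte then low byte, [16]=switches."""
    if len(packet) != 17 or packet[0] != 0xA5 or packet[1] != 0x5A:
        raise ValueError("not a valid P2 packet")
    channels = []
    for ch in range(6):
        hi = packet[4 + 2 * ch]
        lo = packet[5 + 2 * ch]
        channels.append((hi << 8) | lo)
    return {
        "version": packet[2],
        "counter": packet[3],
        "channels": channels,   # 10-bit ADC values, 0..1023
        "switches": packet[16],
    }
```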

    * Basic firmware

    The basic firmware is made as short and simple as possible without any
    special features. It supports both protocols, and both microcontrollers.

    * Bidirectional firmware

    Feature-rich. It can receive commands from the PC, and has IrDA support
    (which needs extra hardware patched in) in development.

    Currently it only supports the AT90S4433 microcontroller and packet format
    version 2.


    General programming notes

    First, a WARNING: Always disconnect the programming cable before connecting
    the EEG to a test subject! This is important because the cable crosses the
    opto-isolator and connects PC ground to EEG ground.

    The following sections describe how to program the microcontroller on a
    ModularEEG with the home-made programming cable and sp12 programmer software
    for either Linux or Windows.

    However, if you plan on using some other programming software, and intend
    to use an ATmega8, note that you must program the low fuses to be all ones
    (0xff).


    Sp12 Software setup (Linux)

    Download and install the SP12 programmer software.

    Link: http://www.xs4all.nl/~sbolt/Packages/sp12v1_0_3-Linux.tgz

    Installation and usage instructions:
    http://www.xs4all.nl/~sbolt/Packages/sp12.txt

    Make sure your system is set up properly, and that you (or sp12) have root
    privileges.

    The sp12 software assumes that it can get access to the hardware ports.
    However, if you have a parallel port driver compiled into the kernel, then it
    won't let sp12 have access. There is in fact a /dev/parport provided to allow
    access from user programs, but sp12 doesn't use it, preferring to hit the
    ports directly.

    So, the only way to get sp12 to work reliably is to run a kernel with parallel
    port access compiled as a module. Then if you try sp12 and it doesn't work,
    you can do an 'lsmod' and look for the modules that are holding onto the ports.
    Doing a 'rmmod' on all of the likely candidates (for example, "lp", "parport"
    and "parport_pc") should do the trick.

    Then after running sp12 to burn your firmware, you can reenable your printer
    using 'modprobe lp'. So long as the parport clashes are dealt with like this,
    sp12 works fine, and the command lines from the 'readme.txt' that comes in the
    ModEEG firmware directory work without problem.


    Sp12 Software setup (Windows)

    First download and install the SP12 programmer software.

    Link: http://www.xs4all.nl/~sbolt/Packages/sp12v103.exe

    Installation and usage instructions:
    http://www.xs4all.nl/~sbolt/Packages/sp12.txt

    Windows NT4, 2000 or XP users should then install Userport. It is an
    application that gives usermode applications (i.e. regular programs) direct
    access to selected hardware ports without a special driver.

    Windows 98 users cannot use it and do not need it.

    Link:
    http://www.embeddedtronics.com/public/Electronics/minidaq/userport/UserPort.zip
    If it is down or has changed, try Google:
    http://www.google.com/search?q=userport+windows.

    Installation instructions can be found in its zip-file.


    Programming

    After the SP12 programmer is installed, simply open a command prompt or shell
    and "cd" to the directory where you keep the sp12 software, if it is not in
    your path.

    The first time you run sp12 after installation, you must write

    sp12 -i

    This allows the sp12 software to initialize itself and calibrate its main
    programming loop for your computer. The calibration value will be stored in a
    file in the current directory.

    Unless you have already done so, connect the ModularEEG to the PC's parallel
    port using the programming cable. Make sure you orient the ribbon cable
    correctly to the PCB before turning on the power.

    If you want to program an ATmega8, begin by programming the fuses:

    (Important: AT90S4433 users should NOT execute this step)

    sp12 -wF11111111

    This programs the low set of fuses so that the ATmega8 microcontroller works
    with the external 7 MHz crystal. The factory default is to use an internal
    1 MHz oscillator, and that obviously does not work here.

    Finally, program the ModularEEG (regardless of microcontroller type) as
    follows:

    sp12 -wpf firmware.hex

    Replace firmware.hex with the path and name of the firmware you wish to use.
  • DataFreedom Posts: 6
    edited 2008-03-06 13:27
    I ran across this a bit ago. I did get some rest, and I still need a new coffee pot! I am looking at the wave patterns I recorded while I slept.
    Somebody said...

    Bionics - mechanics integrated with human beings - has forever seemed the stuff of Isaac Asimov novels and cheesy '70s TV shows (remember Steve Austin as the Six Million Dollar Man?). It was a utopian dream that seemed to live in some distant future.
    But the reality is that we're a lot closer than you think. Today, bionics give limited hearing to the deaf and sight to the blind, allow a quadriplegic to move a computer cursor with his mind and a robotic arm to be controlled by thought.
    Tomorrow, we can expect bionics to restore sight and hearing and to cure paraplegia, to enhance and, just maybe, replace functions of the human brain.
    Where did bionics begin? The bionic revolution can be traced to the 1990s with the advent of neural interfaces - that is, direct communication with the brain.
    Perhaps the most advanced example of this is the Australian-developed cochlear ear implant, more commonly referred to as the bionic ear. For the past 35 years, the implant's lead designer, Professor Graeme Clark, has worked on developing and enhancing a device that transmits audio directly to the brain.
    The bionic ear - implanted into some 2000 Australians - converts audio into electrical impulses that the brain can understand.
    According to Professor Clark, "recipients of the implant can hear, on average, about 80 per cent of what a normal person can hear with speech".
    The bionic eye is another area of research that has witnessed breakthroughs in recent years. At the Dobelle Institute in the US, an artificial eye that consists of a camera attached to a pair of glasses sends information to a computer worn at the belt, which processes the information, transmitting signals directly to the brain via 68 electrodes plugged into the visual cortex.
    What the blind person "sees" is an array of points of light designed to give the recipient's environment an impression of depth. The electrodes deliver information at one frame per second.
    At this stage, vision remains poor - about 20/400 according to the Dobelle Institute - but is enough to allow a blind person to move independently about on foot and possibly by car. Other researchers are looking at connecting chips to the retinal nerve endings. At the Bonn and Stuttgart Universities in Germany, a group of researchers has developed chips that transmit information through the regular optical pathways, bypassing non-functioning retinas, in a manner similar to the cochlear ear.
    An external computer attached to a camera codes visual information, which is then transmitted via infra-red signals to the chip in the eye, which in turn channels information along the visual pathways. The hardest part, according to researchers at the Retinal Implant Association, is building a chip that lasts longer than 18 months in the salty conditions of the eye.
    Closer to home, a joint research team from the Universities of Newcastle and NSW are making headway with a bionic eye, announcing last month that they had developed a chip, implanted in the body, that replaces the eye's retina.
    Advances in neural interfaces are also making it possible to read information from the brain. US company Neural Signals specialises in the development of neural interfaces that enable totally paralysed people to communicate. Its most publicised case was stroke victim Johnny Ray, who was trained to move a computer cursor with his thoughts. Johnny Ray - described by Dr Phil Kennedy, head of Neural Signals, as "the world's first cyborg", - was totally incapable of any movement. By using a virtual keyboard on the computer screen, Johnny Ray, who died recently, could use the mouse to type messages.
    The chip implanted in the recipient's head hooks up to neurites in the motor cortex. The recipient "thinks" about moving and the cursor will move in response to those thoughts.
    A transmitter and amplifier send signals to an external receiver, which then passes that information to the computer. The chip is a little larger than a 10-cent piece, and the electrodes connecting the brain are about the size of the tip of a ball-point pen. The rest of the external apparatus is huge, although Kennedy expects both the chip and external computing devices to shrink in the future.
    Training to use the device is not difficult but it is exhausting. "It takes hours of training because the patients can only work for half to one hour per day," Kennedy says.
    Similarly, experiments at Brown University in the US enabled a monkey to play a simple pinball game through a neural interface without any prior training. "We showed we could build a signal that works right away, in real-time," says senior researcher John Donoghue.
    The interface read signals sent to the motor cortex during movements of the monkey's hand. The monkey began playing the pinball game with the regular mouse and, during the game, the researchers switched the interface to the brain electrodes. The monkey kept on playing.
    Like Neural Signals, the researchers see this as a means by which totally paralysed people can communicate. "Our results demonstrate that a simple mathematical approach, coupled with a biological system, can provide effective decoding for brain-machine interfacing, which may eventually help restore function to neurologically impaired humans," Donoghue says.
    The holy grail of such work is some form of spinal cord bypass, says Clark, that can read the brain and transmit the information to a living or prosthetic robotic limb. Paraplegics, for instance, could "reconnect" their legs. Lost limbs could be replaced with robotic ones, controlled just like living limbs.
    Kennedy is already looking at interfacing directly with computers by thought, effectively harnessing them to enhance our memory or ability to process information, play games, control robots remotely or drive cars by thought alone.
    It may not be as futuristic as it sounds. Experiments at controlling planes by thought have been under way at Armstrong Laboratories in the US since the mid-'90s. Using EEG (electro-encephalogram) readings from sensors attached to the head, a pilot could will an aircraft to pitch left or right, similar to the 1982 film Firefox, starring Clint Eastwood.
    Like the Brown University work, Massachusetts Institute of Technology researchers implanted a chip in the motor cortex in a monkey's brain to read the movement signals it sent to its hand. These signals were then sent over the Internet to a robotic arm located in a different part of the country. In effect, the robotic arm mimicked the monkey's as it reached for food.
    The MIT Touch Lab, which developed the interface, is looking at using it for virtual reality environments. A paper by James Biggs and Mandayam Srinivasan, who work at the lab, points to "video games and simulators that enable the user to feel and manipulate virtual solids, fluids tools and avatars".
    The researchers are also working on developing independent control for the arm - attempting to teach the monkey how to use the robotic arm as an extra appendage. According to Srinivasan, "this might result in the remote arm being incorporated into the body's representation in the brain".
    In a laboratory at the University of Southern California, Professor Theodore Berger is working on ways in which computer chips can be used to enhance or replace brain functions.
    His research, in particular, focuses on chips that perform the same function as the brain's hippocampus, which is responsible for learning and memory functions. This, he believes, could attack diseases such as Alzheimer's, while augmenting the memory and brain functions of those unaffected by disease.
    "Theoretically, it should ultimately be possible to replace the dentate gyrus of a brain with an advanced chip," Berger told a publication of his university.
    Researchers like Dr Richard Andersen, from the California Institute of Technology, imagine a day when chips are implanted directly into the brain so that patients don't need to be tethered to a machine. Others at the Jet Propulsion Laboratory, in Pasadena, are working to miniaturise the chip technology.
    But how far can bionics and the potential to meld man and machine go? How long before your daily news digest and the latest Hollywood blockbuster are directly downloaded into the brain? Apart from Berger, few researchers at this stage appear to be looking seriously at these longer-term human enhancements.
    Clark doesn't go along with trans-cranial implants "for ethical reasons". "I don't see it happening in the near future."
    Of course, the future never runs to a single script. A recent report commissioned by the US National Science Foundation and Commerce Department outlines the prospect that, within a single generation, we will have "converging technology products for improving human physical and mental performance".
    In draft form, Converging Technologies for Improving Human Performance advocates that, within 20 years, we may see major "social and business reform" as a result of the advancements in human enhancement technology - specifically bionics, nanotechnology and genetics. It talks about "telepathy" through wireless systems, direct computer-brain interaction, improved human mental functions and physical capabilities, as well as amelioration of the effects of ageing.
    In contrast to the usually dystopian view of human-machine integration, the paper proposes a "national R&D priority area on converging technologies focused on enhancing human performance" that are "crucial to the future of humanity".
    Both public education and some caution are recommended. Kennedy warns that there must be clear aims with direct human benefit. "But there is a fine line between (beneficial and detrimental) aims," he says.
