Some Surelink questions
Hi All,
I am thinking of using the Surelink modules for the hydrogen powered robot,
and I have a few questions. Firstly, are these goals reasonable:
Real time robot control (i.e. RC mode)
Streaming operational and experimental data
Still images
The idea would be to have a Surelink module on the robot, and build a base
station with another connected to a laptop via another Stamp. We are using a
2p40 for the robot, and can use a 2p24 for the base station. This would
allow the students to operate the robot out of sight, and do some cool room
mapping and exploration projects, among other things. Being able to send
data about fuel cell operation would be nifty too.
In the case of still images, how would/could that be accomplished?
While there are demo boards for the Surelink, I assume you can just use a
Stamp at either end. Is this correct? I looked around on the Parallax site
for a N&V column or application type information, and came up empty-handed.
Did I miss it?
Thanks,
Jonathan
www.madlabs.info
Comments
>> I am thinking of using the Surelink modules for the hydrogen
>> powered robot, and I have a few questions.
I have not used the Surelink modules specifically, but I recently
built a similar robotics project and I think that I can give you
some ideas.
>> are these goals reasonable: Real time robot control (i.e.
>> RC mode), Streaming operational and experimental data,
>> Still images
Yes, yes, and yes. For real time robot control you will want a
wireless protocol with as low a latency as possible. This is
measured from the time that you write to the serial port on the PC
to the time that the Stamp sees the data. You want this as low as
possible, but definitely no more than 100ms. A value down in the
range of 50ms or less will make the device feel very responsive,
like a remote control toy.
For streaming of operational and experimental data you simply need
enough throughput to handle the stream. This is easy to calculate,
just figure out what you want to send, how big it is, and how often
you want to send it. Make sure that you calculate this for data in
both directions.
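If it helps, that throughput math takes only a few lines of Python. The sensor names, sizes, and rates below are made-up examples, not figures from the actual robot:

```python
# Back-of-the-envelope throughput estimate for a telemetry stream.
# The channels, sizes, and rates below are hypothetical examples.

# (name, bytes per sample, samples per second)
uplink = [
    ("fuel_cell_voltage", 2, 10),
    ("fuel_cell_current", 2, 10),
    ("sonar_range", 2, 20),
]
downlink = [
    ("drive_command", 3, 20),  # e.g. direction byte + two speed bytes
]

def bits_per_second(channels, overhead=1.2):
    """Payload rate times a fudge factor for framing/protocol overhead."""
    payload = sum(size * rate for _, size, rate in channels)
    # 10 bits per byte on an async serial link (start + 8 data + stop)
    return payload * 10 * overhead

up = bits_per_second(uplink)
down = bits_per_second(downlink)
print(f"uplink:   {up:.0f} bps")
print(f"downlink: {down:.0f} bps")
print(f"fits in 9600 bps? {up + down < 9600}")
```

Run the numbers for your actual sensor list in both directions and compare the total against the radio's sustained data rate, with some headroom to spare.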
Capturing still images is a bit more of a challenge, but can still
be accomplished pretty well. I am not sure what your camera source
is going to be, but the CMUCam would be a good choice. It's nice
because it can dump an image as a serial stream of data. This stream
can then be redirected to the radio and transmitted to the PC. The
only caveat here is that images tend to be large so you probably
want to use one of the faster stamps so that it can handle higher
data rates.
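To get a feel for why the data rate matters, here is the transfer-time arithmetic in Python. The 9K image size and the link rates are just illustrative assumptions:

```python
# Estimate how long a still image takes to send over a serial radio link.
# The image size and link rates below are assumptions for illustration.

def transfer_seconds(image_bytes, link_bps):
    # 10 bits per byte on an async serial link (start + 8 data + stop)
    return image_bytes * 10 / link_bps

for bps in (9600, 38400, 76800):
    t = transfer_seconds(9 * 1024, bps)  # ~9K low-res color frame
    print(f"{bps:6d} bps -> {t:5.1f} s per image")
```

So a roughly 9K frame that takes close to ten seconds at 9600 bps comes down to about a second at the higher rates, which is why a faster Stamp and link pay off here.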
>> This would allow the students to operate the robot out of sight,
>> and do some cool room mapping and exploration projects, among
>> other things.
Wireless robot control is a great model and I think that you will be
very happy with it once you get everything up and running. An
additional benefit to this model is that your students can program
the robot controller in any language that they choose since it will
be running on the PC. You will need a small control application on
the stamp itself, but all of the control decisions will be made by
software on the PC.
>> In the case of still images, how would/could that be
>> accomplished?
I recommend using the CMUCam because it is by far the easiest way to
get this done. I wrote applications for both the PC and PocketPC to
render CMUCam images from my robot, so if you go this route let me
know and I will send you the code. The PocketPC application is
written to use the built in Bluetooth radio on an iPaq, but it could
easily be modified to use whatever RF solution you select. I like
the iPaq with built in Bluetooth because it is such a capable little
device that I can have a powerful wireless robot controller without
having to drag a laptop around.
If you are going to be using this wireless solution in a classroom
environment you also want to consider interference problems with
multiple units. I believe the Surelink modules can address up to
sixteen different units, but you have to worry about how much
throughput you lose when you have a number of units in the same
space. The manufacturer can probably give you more specific details
about this.
Sincerely,
Bryan Hall
A7 Engineering
http://www.a7eng.com
Thanks for the reply. I guess it is a new product, as you are the only one
to answer this thread. First off, I was talking to tech support about the
Embedded Blue module. It seems a lot easier to use than the Surelink, but
the SL is much faster. Any thoughts on which would be best for our project?
> Yes, yes, and yes. For real time robot control you will want a
> wireless protocol with as low a latency as possible. This is
> measured from the time that you write to the serial port on the PC
> to the time that the Stamp sees the data. You want this as low as
> possible, but definitely no more than 100ms. A value down in the
> range of 50ms or less will make the device feel very responsive,
> like a remote control toy.
I will check into the latency. I don't see it on the spec sheet. Our current
controller (a universal TV remote) has latency of 100 ms or so, and is, as
you say, just acceptable. We would like to improve this if possible, but can
live with it if needed.
> For streaming of operational and experimental data you simply need
> enough throughput to handle the stream. This is easy to calculate,
> just figure out what you want to send, how big it is, and how often
> you want to send it. Make sure that you calculate this for data in
> both directions.
The data, aside from images, will be pretty small. How large are low-res
images from the CMUCam?
> Capturing still images is a bit more of a challenge, but can still
> be accomplished pretty well. I am not sure what your camera source
> is going to be, but the CMUCam would be a good choice. It's nice
> because it can dump an image as a serial stream of data. This stream
> can then be redirected to the radio and transmitted to the PC. The
> only caveat here is that images tend to be large so you probably
> want to use one of the faster stamps so that it can handle higher
> data rates.
A delay while getting a picture is fine. We may implement a time delay anyway,
to simulate the transmission distances in space. We are using a 2p40 on the
robot and have a 2p24 that we will use for the base station.
> Wireless robot control is a great model and I think that you will be
> very happy with it once you get everything up and running. An
> additional benefit to this model is that your students can program
> the robot controller in any language that they choose since it will
> be running on the PC. You will need a small control application on
> the stamp itself, but all of the control decisions will be made by
> software on the PC.
Hmm, I had thought we would be receiving the data on another Stamp, then
sending it to the PC. None of us are very familiar with PC programming. Our
thought was to handle giving commands to the robot via the Stamp-run base
station.
> I recommend using the CMUCam because it is by far the easiest way to
> get this done. I wrote applications for both the PC and PocketPC to
> render CMUCam images from my robot, so if you go this route let me
> know and I will send you the code. The PocketPC application is
> written to use the built in Bluetooth radio on an iPaq, but it could
> easily be modified to use whatever RF solution you select. I like
> the iPaq with built in Bluetooth because it is such a capable little
> device that I can have a powerful wireless robot controller without
> having to drag a laptop around.
I just looked at the CMUCAM, and it seems like it would be ideal for our
purposes. And the iPaq sounds like it might be neat, which is another reason
why the EmbeddedBlue module might be best for us. Any thoughts on this? We
would most definitely like your code, and if we use it of course you will be
credited for it on our web page, which you can see here:
http://madlabs.info/H2_FCRobot_Chron.shtml
> If you are going to be using this wireless solution in a classroom
> environment you also want to consider interference problems with
> multiple units. I believe the Surelink modules can address up to
> sixteen different units, but you have to worry about how much
> throughput you lose when you have a number of units in the same
> space. The manufacturer can probably give you more specific details
> about this.
For the foreseeable future, this will not be a problem, as there is only one
hydrogen powered robot, and given the cost of building it, that will probably
remain so. Although you can never tell for sure...
Thanks a lot for the help!
Jonathan
www.madlabs.info
>> I was talking to tech support about the
>> Embedded Blue module. It seems a lot easier
>> to use than the Surelink, but the SL is
>> much faster. Any thoughts on which would
>> be best for our project?
This is not correct. The eb500 can communicate at sustained speeds
of up to 230.4kbps. The Surelink module can communicate at speeds up
to 115.2kbps, but the data transfer rate maximum is 76.8kbps. This
means that the eb500's maximum data transfer rate is three times
that of the Surelink module.
The confusion may have come from the use of 9.6kbps as the default
data rate for the eb500 module. This rate was chosen so that it may
be used out of the box with all basic stamp variants. To change the
data rate simply use the "set baud" command described on page 74 of
the user manual.
>> (a universal TV remote) has latency of 100
>> ms or so, and is, as you say, just acceptable.
>> We would like to improve this if possible,
>> but can live with it if needed.
There are a number of factors that can affect this, but if you try
for low latency you can get down in the neighborhood of 15ms with
the eb500 module. If the module is allowed to go into a low power
mode, the latency will jump up to about 50ms for the first byte
transmitted and then drop back down below 15ms.
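For what it's worth, the round-trip timing itself is simple to measure. In this sketch a socketpair stands in for the radio link (with real hardware you would open the serial port to the module instead), so the number it prints only demonstrates the method, not eb500 latency:

```python
import socket
import threading
import time

# Round-trip latency check: send one byte, wait for the echo, time it.
# A socketpair stands in for the radio link here; with real hardware
# you would open the serial port to the module instead.
pc_end, robot_end = socket.socketpair()

def robot_echo():
    # The "robot" side simply echoes the byte it receives.
    robot_end.sendall(robot_end.recv(1))

echo_thread = threading.Thread(target=robot_echo)
echo_thread.start()

start = time.monotonic()
pc_end.sendall(b"?")
pc_end.recv(1)
rtt_ms = (time.monotonic() - start) * 1000
echo_thread.join()

# One-way latency is roughly half the round trip.
print(f"round trip: {rtt_ms:.3f} ms")
```

Echoing a byte from the Stamp and timing it this way from the PC will tell you quickly whether the link is in the 15ms or the 100ms neighborhood.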
>> The data, aside from images, will be
>> pretty small. How large are low-res
>> images from the CMUCam?
The CMUCam II gives you better control than the CMUCam I and can send
low-res color frames with as little as 9K of data. You can probably
even reduce this further by sending only one of the color frames to
get a single low-res grayscale image that is about 3K.
>> Hmm, I had thought we would be receiving
>> the data on another Stamp, then sending
>> it to the PC. None of us are very
>> familiar with PC programming. Our thought
>> was to handle giving commands to the
>> robot via the Stamp-run base station.
Using Bluetooth on the PC can be as easy as using the serial port.
In fact, without any programming at all, you can connect to your
robot over Bluetooth and control it using HyperTerminal. Rather than
press buttons on a Basic Stamp, you can type simple commands
like "turn left", "stop", etc., or even simpler single-character
commands like "1", "2", "3", etc.
Page 48 of the eb500 manual describes using HyperTerminal on a
Bluetooth enabled PC to communicate with a Board of Education. Page
37 of the manual describes another sample called "Monkey See, Monkey
Do". It is a simple robot control application that responds to
single-character commands. If you load the Monkey Do sample into a
BOE-Bot and connect to it via HyperTerminal, you can control the
robot by simply typing numbers in the HyperTerminal window. This is
a simple way to start and does not require writing any PC software.
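The single-character scheme is really just a lookup table on the receiving end. A quick sketch in Python (the command mapping here is hypothetical, not the actual codes from the Monkey Do sample):

```python
# Minimal single-character command dispatcher, in the spirit of the
# "Monkey See, Monkey Do" sample. The mapping below is hypothetical.
COMMANDS = {
    "1": "forward",
    "2": "turn left",
    "3": "turn right",
    "4": "stop",
}

def dispatch(char):
    """Return the action for a one-character command, or None if unknown."""
    return COMMANDS.get(char)

# Typing "1", "3", "4" in a terminal session would drive the robot:
for ch in "134":
    print(ch, "->", dispatch(ch))
```

The same table works whether the characters arrive from HyperTerminal, a PC program, or a PocketPC; only the transport changes.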
>> I just looked at the CMUCAM, and it seems
>> like it would be ideal for our purposes.
>> And the iPAQ sounds like it might be neat,
>> and another reason why the embedded blue
>> module might be best for us. Any thoughts
>> on this?
A7 Engineering developed the EmbeddedBlue module, so I am a bit
prejudiced on the subject. I do believe however that using a
standard protocol like Bluetooth provides a tremendous amount of
benefit over more traditional RF solutions like the Surelink module.
The specifications sheet describes the benefits pretty well and I
recommend taking a quick look at it if you have not already.
Talk to you soon,
Bryan Hall
A7 Engineering
http://www.a7eng.com