
Using the Say-It Module 30080 with the Propeller??? Deal of the Day

WBA Consulting Posts: 2,933
edited 2013-03-19 16:02 in Propeller 1
After seeing the Say It Module up today as the Deal of the Day, I am considering getting one for a project idea I have for my 4-year-old daughter. However, I would like to use it with the Propeller, but I do not see any documentation or threads to support that. After looking at the datasheet, it appears that the Say It Module is geared specifically for the BS2.

Will I be starting from scratch if I try to use the Say It Module with the Propeller?

▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
Andrew Williams
WBA Consulting
WBA-TH1M Sensirion SHT11 Module
Special Olympics Polar Bear Plunge, Mar 20, 2010

Comments

  • Mike Green Posts: 23,101
    edited 2010-03-02 17:56
    From a hardware standpoint, the Say It Module is also designed to work with the Propeller. It can use 3.3V for its power source and, using 3.3V, the I/O is directly Propeller compatible.

    From a software standpoint, you're mostly starting from scratch, particularly since there's no document describing the command / response codes for the Say It Module. The communication between the Module and a Propeller is straightforward serial I/O and any of the existing serial I/O objects could be used. You could pretty easily translate the existing demo program from PBasic to Spin. Unfortunately, you couldn't use the GUI with a Propeller since the "bridge" program source is not available and there's no equivalent program for the Propeller. You could train the Say It Module using the GUI and a Stamp, then use it with a Propeller. You might be able to do some experimenting to figure out most of the training codes from the PBasic demo program which seems to use some of them.

    The documentation provided does have a table of commands that can be used to control the Module using any of the existing serial I/O objects. You can't use the provided GUI with the Propeller, but you could take the generated PBasic code and use that for a template for your Propeller application.
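
    For example, a minimal Spin sketch along these lines (the pin numbers, the 9600 baud rate, and the "x" command byte are only placeholders; check the command table for the real values) shows that it's ordinary serial I/O:
    CON
      _clkmode = xtal1 + pll16x     ' 80MHz
      _xinfreq = 5_000_000

      SAYIT_RX = 26                 ' Propeller pin wired to the module's TX (placeholder)
      SAYIT_TX = 27                 ' Propeller pin wired to the module's RX (placeholder)

    OBJ
      sayIt : "FullDuplexSerial"

    PUB CommandDemo | response
      sayIt.start(SAYIT_RX, SAYIT_TX, 0, 9600)    ' the module speaks plain 9600 baud serial
      waitcnt(clkfreq / 10 + cnt)                 ' give the serial cog a moment to start
      sayIt.tx("x")                               ' send a one-byte command from the table
      response := sayIt.rxtime(1_000)             ' wait up to a second for the reply byte
      ' response is -1 on a timeout, otherwise the status character the module returned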

    Post Edited (Mike Green) : 3/2/2010 7:10:22 PM GMT
  • Duane Degn Posts: 10,588
    edited 2010-03-02 18:56
    Andrew - I'm also wondering about purchasing a Say It Module. I think it shouldn't be too hard to get it to work with a Prop. Page 4 of the pdf lists the built-in commands. The communication protocol is described on page 14, followed by a list of commands.

    Mike - Aren’t the commands listed on page 15 of the pdf enough to use the Say It Module?

    Off to order a Say It Module.

    Duane
  • Mike Green Posts: 23,101
    edited 2010-03-02 19:05
    Duane - Thanks for the pointer. You're absolutely correct. I don't know how I missed that table before. Sorry folks.
  • Damo2 Posts: 10
    edited 2010-03-03 20:53
    I ordered the Say It module, thinking that there would be a Prop object available, but it seems there is not.

    If anybody does write something, be a pal and share it with us. I'm kinda new at the Prop, and re-writing the BS2 program into Spin would probably see me well into my sixties.

    Cheers.
  • jeffrey.rick Posts: 3
    edited 2010-12-30 20:40
    Hi all! I'm trying to get the Say It module to work with the Propeller as well and have been working on this for some time now. I'm still a beginner at Spin programming, so if anyone has gotten it to work and has a sample program PLEASE let me know! Likewise, I'll be sharing any results I have on this forum as well as on www.savagecircuits.com and on the OBEX.
    thanks!
    Jeff
  • Humanoido Posts: 5,770
    edited 2010-12-30 21:22
    If you want to run the module on the Propeller, use the Propeller OBEX function library with BASIC Stamp PBASIC commands. It works well with Stamp devices and Propellers. If I had the Say It module, I would write the code and post it.
  • Duane Degn Posts: 10,588
    edited 2011-06-04 10:11
    Has anyone used a Say It Module with the Propeller yet?

    I purchased a couple of these modules when they were the Deal of the Day and I'm just getting around to using them.

    I thought I'd add the Say It Module to the wireless controller of my most recent robot project.

    If someone else has some Propeller code and is willing to share it, I'd prefer to have a head start.

    I'll be sure and post any (working) code I come up with.

    Are there still others wanting to use a Say It with the Prop?
  • Martin_H Posts: 4,051
    edited 2011-06-04 11:28
    Using the Say-It module with the Propeller would run into one snag. I've only seen versions of the Say-It module GUI for the Basic Stamp and the Arduino. You use the GUI to train the Say-It via a Stamp or Arduino acting as a programming bridge.

    Now you can still use the pre-programmed speaker independent commands with the Propeller chip, or use a Basic Stamp to train the speaker dependent commands and then use the Say-It with a Propeller. The Say-It serial protocol is fairly simple to use, and translating a GUI-generated program from PBasic to Spin shouldn't be hard.

    The GUI can also generate the PBasic source for the bridge program, so it might be possible to translate that program into Spin and use the Propeller to program the Say-It. But that's for advanced users.
  • Duane Degn Posts: 10,588
    edited 2011-06-04 13:03
    The manual has all the commands in it. I think the GUI uses the same commands. I thought I'd make a simple menu for the PST so one could enter commands from the terminal. I think I might want to be able to change some commands on the fly also. I plan to have a TV with the controller I'm using the Say It with, so I might have some sort of GUI on the TV using a PlayStation 2 controller for input.

    I was just wondering if anyone has something I could start from.

    Duane
  • Martin_H Posts: 4,051
    edited 2011-06-04 18:31
    Sorry I don't have any Spin examples as I've only used it with my BS2e. If you go to the original vendor's home page http://www.veear.eu/Support/Downloads.aspx they have some Arduino examples. This might serve as the basis for a Catalina program.

    The GUI lets you train the Say-It with additional commands, but only when the Say-It is connected to a supported uController in a supported configuration. For example I added commands like North, East, South, and West to allow me to align the robot to a compass. But even without additional commands the Say-It is pretty neat. It occurred to me that the Say-It coupled with the Propeller chip is more powerful than using it with a Basic Stamp or Arduino. This is because one cog could always tend the Say-It and allow for asynchronous verbal commands.
  • Duane Degn Posts: 10,588
    edited 2011-06-04 20:42
    I'm pretty sure you can program additional commands without the GUI.

    The manual lists all the commands and parameters needed to program speaker dependent commands.

    I believe the GUI just makes it easier to add the additional commands. I don't think there is anything the GUI can do that can't be done by having the Propeller give the appropriate command.

    I do have several Basic Stamps. I just want to see if I can come up with a Propeller-only solution.
  • Martin_H Posts: 4,051
    edited 2011-06-05 04:58
    You are correct, that would work. Probably some sort of serial terminal text based program would be the way to go.
  • Duane Degn Posts: 10,588
    edited 2011-06-05 06:43
    Martin_H wrote: »
    Probably some sort of serial terminal text based program would be the way to go.

    Yeah, that's what I'm leaning toward. So far I've written code to wake up the Say It and to have it use English. I still have a ways to go.
  • Duane Degn Posts: 10,588
    edited 2011-06-08 19:17
    @Martin, Thanks for the link to the veear website. I learned a lot there.

    It turns out making a bridge program for the Propeller is relatively easy. (You can still call me an advanced user if you want.)

    I've attached a simple bridge program. I've had to disconnect and reconnect the Say It from the Prop after the program has started in order to get the GUI to find the Say It. I also have to try connecting several times from the GUI before it is successful.
    CON
      _Clkmode = xtal1 + pll16x     ' 80MHz  Change these two settings to match
      _Xinfreq = 5_000_000          ' your setup.
      ' Pin Assignments
      _SayItLed = 15                  ' Connected to Led on Say It
     
      _SayItTx = 27                 ' Connected to Rx on Say It
      _SayItRx = 26                 ' Connected to Tx on Say It 
      _DebugTxPin = 30
      _DebugRxPin = 31
     
      _SayItBaud = 9600
     
    OBJ
      Com[2] : "FullDuplexSerial"                 ' uses two cogs
     
    PUB Main | localIndex, localAttmpts
     
      Com[0].start(_DebugRxPin, _DebugTxPin, 0, _SayItBaud)
      Com[1].start(_SayItRx, _SayItTx, 0, _SayItBaud)
      waitcnt(clkfreq / 4 + cnt)
      repeat
        result := Com[0].rxcheck
        if result <> -1 and result <> 0
          Com[1].tx(result)
        result := Com[1].rxcheck
        if result <> -1 and result <> 0
          Com[0].tx(result)
    

    I'm also writing a more advanced/useful demo for using the Say It. By using a second serial line to the computer, I can have both the GUI up and the PST connected at the same time. My program listens in on the conversation between the computer and Say It and displays the conversation to the terminal window. This has been very useful in seeing the protocol in action.
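
    The monitor part of that demo is basically the bridge above with a third serial port added. Here's a stripped-down sketch of the idea (it reuses the CON values from the bridge; the Prop Plug pins and the 115200 terminal baud are just what I happen to use, and the real program groups the bytes per message instead of labeling every single one):
    OBJ
      Com[3] : "FullDuplexSerial"                 ' GUI link, Say It link, and terminal link

    PUB MonitorBridge | c
      Com[0].start(_DebugRxPin, _DebugTxPin, 0, _SayItBaud)   ' serial line the GUI uses
      Com[1].start(_SayItRx, _SayItTx, 0, _SayItBaud)         ' serial line to the Say It
      Com[2].start(19, 18, 0, 115_200)                        ' Prop Plug for the terminal window
      repeat
        c := Com[0].rxcheck                       ' pass GUI -> Say It traffic and echo it
        if c <> -1
          Com[1].tx(c)
          Com[2].str(string("PC: "))
          Com[2].tx(c)
        c := Com[1].rxcheck                       ' pass Say It -> GUI traffic and echo it
        if c <> -1
          Com[0].tx(c)
          Com[2].str(string("Say It: "))
          Com[2].tx(c)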

    BTW, The Say It works fine with 3.3V.

    One thing that wasn't clear from just reading the manual was that the different wordsets don't have to be used in any particular order.

    You can kind of think of the wordsets as button pads. Only one button pad (wordset) can be active at a time but your program determines what happens when one of the words in the set is said. Just like you can program any behavior from a press of a button. Your program can either switch to a different wordset or take some other action.

    I had thought the trigger word was to wake up the Say It. This is not the case. You could use any of the wordsets as the first set of words the Say It listens for. This makes for many, many possible combinations of words.

    I programmed the Say It with Calculator, plus, minus, times, divided_by, equals and digits. I figure I can use the Say It to make a verbal calculator. I'll use the word "digits" to have the Propeller switch the Say It to wordset3 (the numbers zero through ten). I'll say the digits into the Say It and use the word "ten" to indicate I'm through entering digits. I figure I don't need "ten" to be a number since I can use "one" and "zero". I think this will be fun.

    I'll post my other code when I have it working better. If anyone else is using the Say It with the Propeller and would like to see the other code as it is now, let me know. I'm always willing to share my code even if it's a work in progress.

    Duane

    Edit(3/11/15): Warning, the code attached is an old version. There are better options available.
    I plan to upload this program or an improved version to my GitHub account
    If there isn't a replacement on GitHub send me a message and I'll make sure to upload the replacement code.
  • Duane Degn Posts: 10,588
    edited 2011-06-09 12:47
    I need to put this project on hold for a while so I'm posting what I have so far.

    The attached code is very buggy but it does have some useful stuff.

    If you use a Prop Plug as a second serial connection to the Prop, you can use the second connection as a bridge to the GUI and listen in on the conversation with a terminal program.

    There is also a feature to record the GUI/Say It conversation to EEPROM. I don't think this record feature is needed for now. I included it so one could record the information if only using one serial line. I haven't been able to get the bridge to work using the same line as the debug line.

    My suggestion to anyone trying this program is to type "w" at the menu to wake the Say It up. Then type "o" and enter "0" to change the timeout to infinity. Then select "t" from the menu to have the Say It listen for the trigger word.

    Once the Say It hears the trigger, the terminal will prompt you with a list of words of the current wordset it is listening for. The program will then lead you through the various branches of options depending on which word is spoken. Many of the options end with needing to say three digits. The idea is that if you want your robot to turn right, you would then give the angle to indicate how much to turn.

    If you try it out, I hope you let me know.

    There are all sorts of unused variables. The code is horribly inefficient. I normally would want to rewrite this but I just don't have time right now.

    I don't make use of any speaker dependent word groups yet.

    Hopefully this will be useful to someone.

    The Say It module is a lot of fun to play with.

    Duane

    P.S. The program writes the value of one variable to the EEPROM the first time the program is run. There are several menu items that will write to the EEPROM. These menu items are marked with an "*". If you do use the EEPROM menu options, the program assumes you have a 64KB EEPROM on your board.

    Edit(3/11/15): Warning, the code attached is an old version. There are better options available.
    I plan to upload this program or an improved version to my GitHub account
    If there isn't a replacement on GitHub send me a message and I'll make sure to upload the replacement code.
  • Martin_H Posts: 4,051
    edited 2011-06-09 17:59
    Duane, thanks for the bridge program and Spin program. I'll take a look at it this weekend. When I write SayIt programs I tend to use a grammar like this: attention->command->argument (e.g. robot->go->north)
  • Duane Degn Posts: 10,588
    edited 2013-03-02 08:41
    I had someone ask about using a SayIt or VRBot with the Propeller. I thought I'd give additional information here since this thread has the bridge program in post #15.

    The GUI and BS2 sample program may be found here.

    I was just reviewing the program attached to post #16 of this thread. It looks like it might be a useful program if it didn't require a second com line to the PC (via Prop Plug) to work. I'd like to fix this 20-month-old code sometime today if I have time, though my day job has decided to get in the way of my bot building.

    I think the code in post #16 gives an example of the way the SayIt works. You need to specify to the SayIt which word group to listen for, and then the program needs to decide if the word group should be changed or not based on the word spoken. It looks like the main decision making takes place in the "FindBranch" method and the "TakeAction" method.

    I haven't used the SayIt much yet. I keep intending to incorporate it into a robot or a robot remote. I still intend to do so.

    While I don't have a lot of experience using the SayIt, I think I understand generally how it works, so if anyone has questions about using it with a Propeller, I'll try to answer them.
  • Duane Degn Posts: 10,588
    edited 2013-03-02 17:50
    Apparently I've learned a few things since June 2011. It was never possible to use the debug line to monitor the exchange between the Propeller and the GUI program.

    The SimpleBridge program attached to post #15 works as a bridge between the SayIt and the GUI program.

    I removed the bridge section of the program I attached to post #16. I also simplified the program a bit and hopefully made it a bit easier to understand. I've attached the newer version to this post.

    Here's the output to the terminal window of my recent use of this program. I'm going to break the output into small parts to make it easier to explain what's going on. The program waits for a keypress before starting.
    Com driver started
    Press any key to start
    Press any key to start
    Press any key to start
    Press any key to start
    Press any key to start
    Press any key to start
    

    It then displays a menu of choices. The SayIt needs to be awakened before use, so one of the first options to choose should be "w".
    Enter "d" to display this menu.
    Enter "w" to wake up Say It module.
    Enter "i" to initialize Say It module.
    Enter "x" to request firmware ID of module.
    Enter "o" to set timeout of module.
    Enter "m" to request non-empty groups of module.
    Enter "c" to request count of speaker dependent commands.
    Enter "p" to request speaker dependent commands of module.
    Enter "t" to have Say It listen for trigger.
    
    
    
    
    
    
    Wake attempt #1
    Say It returned =w
    Wake attempt #1 failed.
    Wake attempt #2
    Say It returned =o
    The Say It Module is now awake.
    

    The SayIt woke on the second attempt. The menu was then displayed again which I'll omit here.

    I then chose "t" from the menu to have the SayIt listen for the trigger word.
    Say It is now listening for trigger.
    Listening for wordset =0
    Say one of the following words:
    0) robot
    You many now speak commands to the Say It Module
    The current active wordset is the Tigger or Wordset 0:
    0   robot
    

    After the above was displayed the menu was also displayed. The menu is displayed frequently while the program is running. I'll delete future main menu output without comment from now on.

    At this point I said the word "robot". If I were to wait longer than the timeout period, I'd receive an error message and I'd again need to select "t" from the menu before continuing. The timeout amount can be changed by selecting "o" from the main menu.

    Since I said "robot" within the timeout period, the following was displayed.
    Say It returned =s
    Say It returned =A
    Say It heard "robot"
    Listening for wordset =1
    Say one of the following words:
    0) action
    1) move
    2) turn
    3) run
    4) look
    5) attack
    6) stop
    7) hello
    

    I then said the word "attack".
    Say It returned =s
    Say It returned =F
    Attack command received.
    Waiting for direction to attack.
    Say It heard "attack"
    Listening for wordset =2
    Say one of the following words:
    0) left
    1) right
    2) up
    3) down
    4) forward
    5) backward
    

    I then said "up".
    Say It returned =s
    Say It returned =C
    Up command received.
    Waiting for amount to attack up.
    Say It heard "up"
    wordset3DigitRemaining = 3
    Listening for wordset =3
    Say one of the following words:
    0) zero
    1) one
    2) two
    3) three
    4) four
    5) five
    6) six
    7) seven
    8) eight
    9) nine
    10) ten
    

    The program accepts three-digit numbers. I might switch to using the word "ten" as an end-of-number indicator to allow the number of digits to be variable. For the program as it is now, I need to say three digits. Each digit needs to be said separately from the others. I said the numbers "7", "6" and "3".

    Here's the output while saying these numbers (I've cut some of what was displayed).
    Say It returned =s
    Say It returned =H
    command index = 7
    wordset3Total = 7
    wordset3DigitRemaining = 2
    Say It heard ""
    wordset3DigitRemaining = 2
    Listening for wordset =3
    Say one of the following words:
    0) zero
    1) one
    2) two
    3) three
    4) four
    5) five
    6) six
    7) seven
    8) eight
    9) nine
    10) ten
    Say It returned =s
    Say It returned =G
    command index = 6
    wordset3Total = 76
    wordset3DigitRemaining = 1
    Say It heard ""
    wordset3DigitRemaining = 1
    Listening for wordset =3
    Say one of the following words:
    0) zero
    <snip>
    10) ten
    Say It returned =s
    Say It returned =D
    command index = 3
    wordset3Total = 763
    wordset3DigitRemaining = 0
    Say It heard ""
    

    Once the SayIt had heard all three numbers, it could then act on the command. In this case the action is just to display what the command was, but if I were to use this with a robot, I could have the robot take some preprogrammed action. I just need to decide what the robot should do when it hears "Attack Up 763".
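
    As the wordset3Total lines in the output show, each new digit just gets folded into a running total. The attached code isn't quite this tidy, but the gist of the digit handling is:
      ' commandIndex is the word index the Say It reports (0 = "zero" ... 9 = "nine")
      wordset3Total := wordset3Total * 10 + commandIndex   ' fold the new digit into the total
      wordset3DigitRemaining--                             ' count down until all three digits arrive

    When wordset3DigitRemaining reaches zero, the program acts on the completed command.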

    Here's the remainder of the output.
    Received command to attack up 763 units
    Add method to attack appropriate direction.
    Listening for wordset =0
    Say one of the following words:
    0) robot
    

    The robot went back to waiting for the trigger word.

    The microcontroller (uC) needs to keep track of which word group the SayIt is listening for and the uC also needs to decide if the word group should be changed or not once the SayIt has heard a word.

    You can kind of think of the SayIt as a touch screen device with changing menu options. The welcome screen would have just the word "Robot" on it. Once the "Robot" is selected another menu appears with the options:
    "action"
    "move"
    "turn"
    "run"
    "look"
    "attack"
    "stop"
    "hello"
    The program needs to decide which menu (word set) to present (or listen for). When one of the above words is spoken, you could either have the program command the SayIt to listen to word set 2 or word set 3 or go back to listening for "robot".

    The SayIt doesn't make decisions. It just tells the uC which word of the word set has been spoken. It's up to you as the programmer to decide which actions should happen based on which word was spoken. This includes needing to program the uC to send a command to SayIt to start listening for whichever word set you decide should be next.
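
    In Spin, that decision making boils down to a loop and a case statement. This is only a sketch; ListenForWordset is a hypothetical stand-in for the serial exchange, and the real program does more bookkeeping:
    PUB CommandLoop | wordIndex
      repeat
        ListenForWordset(0)                       ' word set 0: wait for the trigger word "robot"
        wordIndex := ListenForWordset(1)          ' word set 1: action, move, turn, run, look...
        case wordIndex
          5:                                      ' "attack" -> ask for a direction (word set 2)
            HandleAttack
          6:                                      ' "stop" -> no follow-up words needed
            StopRobot
          other:                                  ' anything else: go back to waiting for "robot"
            next

    PRI HandleAttack | direction
      direction := ListenForWordset(2)            ' left, right, up, down, forward, backward
      ' ...then collect three digits from word set 3 and carry out the command

    PRI StopRobot
      ' placeholder: stop the motors, etc.
      return

    PRI ListenForWordset(setNumber)
      ' Hypothetical stub: tell the Say It to listen for word set "setNumber" over
      ' serial and return the index of the word it reports (-1 on timeout).
      return -1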

    I listed word set #1 above. Word set #0 is just the trigger word "robot". There are also word sets #2 and #3. Here are the word sets as I've listed them in my program.
    iWordset2     byte "left", 0[5]
                  byte "right", 0[4]
                  byte "up", 0[7]
                  byte "down", 0[5]
                  byte "forward", 0[2]
                  byte "backward", 0
            
    iWordset3     byte "zero", 0[5]
                  byte "one", 0[6]
                  byte "two", 0[6]
                  byte "three", 0[4]
                  byte "four", 0[5]
                  byte "five", 0[5]
                  byte "six", 0[6]
                  byte "seven", 0[4]
                  byte "eight", 0[4]
                  byte "nine", 0[5]
                  byte "ten", 0[6]
    

    I included trailing zeros so each word takes up nine bytes of memory. This makes it easier to display the word based on its assigned number.
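
    Since every entry is padded to nine bytes, displaying the word for a recognized index is just pointer arithmetic. A small helper (Com here stands for whatever serial/terminal object the program uses):
    PRI ShowWord(wordsetAddress, index)
      ' Each table entry is nine bytes (the text plus padding zeros), so the
      ' zero-terminated string for word "index" starts 9 * index bytes into the table.
      Com.str(wordsetAddress + 9 * index)

    Calling ShowWord(@iWordset3, 7), for example, prints "seven".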

    By using speaker dependent words the number of possible branches gets very large. Instead of "word sets" the speaker dependent words are grouped together in "groups". It's possible to define speaker dependent words using a uC but it would probably be easier to implement speaker dependent words and groups by entering them in through the GUI and then listing these groups and words within your program.

    I haven't used speaker dependent words from within my program yet. I think the commands to send to the SayIt are a bit different but the concept is the same as with using word sets.

    As I previously mentioned, I haven't used the SayIt much, but I think I've got a basic understanding of how it works, and I'd be glad to help out with questions if I can.

    There's a bit of an issue with the labels on the SayIt. The TX and RX labels on the SayIt correspond to the TX and RX of the uC, not the SayIt module. The diagram in the manual adds to the confusion. Here's my annotated diagram.

    [attachment: annotated diagram of the SayIt TX/RX connections]

    Edit(3/11/15): Warning, the code attached is an old version. There are better options available.
    I plan to upload this program or an improved version to my GitHub account
    If there isn't a replacement on GitHub send me a message and I'll make sure to upload the replacement code.
  • NWCCTV Posts: 3,629
    edited 2013-03-02 19:47
    Did they sell out already? Link comes up blank and a search reveals nothing.
  • Duane Degn Posts: 10,588
    edited 2013-03-02 20:48
    NWCCTV wrote: »
    Did they sell out already? Link comes up blank and a search reveals nothing.

    You're just three years too late. To the day apparently (kind of weird).

    Parallax hasn't sold SayIt modules for a while. I received a PM asking about my bridge program so I thought I'd update the program I had been working on. Deal of the Day, those were the days. I purchased a lot of stuff through DoD.

    While the SayIt isn't around anymore, the module the SayIt used is still for sale. Here it is at the Robot Shop as EasyVR. I actually like the EasyVR form factor better than the SayIt's form factor.
  • NWCCTV Posts: 3,629
    edited 2013-03-02 21:00
    DOH!!!! I looked at the date and saw 3/2 and did not even look at the year!!!!!
  • Duane Degn Posts: 10,588
    edited 2013-03-05 20:29
    I've updated my bridge plus monitor program. The attached program acts as a bridge between a SayIt module and the PC GUI program similar to the SayItSimpleBridge object attached to post #15. I've added a feature to the attached bridge program that allows one to monitor the exchange of data between the PC and SayIt module.

    Since the GUI program uses the com port to the Propeller an additional serial connection to the Propeller is required to monitor the exchange between PC and SayIt. I used a Prop Plug attached to P18 and P19.

    I found the PC GUI program uses commands not listed in the SayIt documentation. I don't think it's important to understand these extra commands, but I think it's interesting they exist.

    I monitored the exchange between PC and SayIt as I experimented with speaker dependent commands. For example this is the exchange to rename the command in group #1 index #1 to "KITCHEN":
    PC: nBBHKITCHEN
    Say It: o
    

    While the command to change a name is given in the SayIt documentation, I found it very helpful to see a few actual examples of the code in use.

    Here's the command to add the command "HELP" to group #2 index #0 (the group was previously empty):
    PC: gCA
    Say It: o
    PC: nCAEHELP
    Say It: o
    

    You can see that, in the case where there wasn't already a command for that group, the command had to be created. The documentation states that the command "g" is used to insert a new speaker dependent command; the following two characters identify the group index and command position.
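
    From the two exchanges above it looks like numeric arguments travel as capital letters counting up from "A" (so group #2 is "C" and index #0 is "A"), and the name is preceded by its length encoded the same way. Under that assumption, here's a sketch of building these commands from the Propeller side (Com is the serial object talking to the SayIt):
    PRI InsertAndNameCommand(group, index, namePtr)
      ' Assumes, based on the captured exchanges, that numbers are sent as "A" + value.
      Com.tx("g")                                 ' insert a new speaker dependent command...
      Com.tx("A" + group)                         ' ...into this group...
      Com.tx("A" + index)                         ' ...at this position
      WaitForOk

      Com.tx("n")                                 ' now label the new command
      Com.tx("A" + group)
      Com.tx("A" + index)
      Com.tx("A" + strsize(namePtr))              ' label length, encoded the same way
      Com.str(namePtr)
      WaitForOk

    PRI WaitForOk
      ' The module answers "o" when it accepts a command.
      return Com.rxtime(1_000) == "o"

    Calling InsertAndNameCommand(2, 0, string("HELP")) should reproduce the gCA / nCAEHELP exchange shown above, though I haven't wrapped my own code up this neatly yet.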

    After creating and naming the new command, I clicked "train" and followed the GUI prompts. The training exchange was:
    PC: eCA
    Say It: o
    PC: tCA
    Say It: o
    PC: tCA
    Say It: o
    

    When I attempted to test group 2 and didn't speak soon enough, I received an error message from the GUI. Here's the exchange:
    PC: dC
    Say It: e
    PC: <$20>
    Say It: B
    PC: <$20>
    Say It: B
    

    In the above example, the characters "<$20>" represent the space character. I wanted to make sure each character was displayed in the terminal window. Any characters that were ASCII values 32 or less, or 127 and above, were displayed as their hexadecimal values.

    Here's another attempt at using group #2 in which the SayIt successfully identified the command with index #7:
    PC: dC
    Say It: r
    PC: <$20>
    Say It: H
    

    I hope it's clear where the displayed data originated.

    The program adds the heading "PC: " to data received from the PC GUI program and the program adds "Say It: " to data from the SayIt module.

    I think the attached program is useful in understanding the commands expected by the SayIt and also makes it clear what data to expect to receive back from the SayIt when commands are issued.

    I plan to add some speaker dependent commands to the previously posted SayItAlpha program and share the updated version here.

    If anyone else is using the SayIt with the Propeller, I hope they share their experience with the rest of us.

    Edit(3/6/13): Some of the output didn't copy correctly. I've fixed it.

    Edit(3/11/15): Warning, the code attached is an old version. There are likely better options available.
    I plan to upload this program or an improved version to my GitHub account
    If there isn't code similar to what is attached here on my GitHub, send me a message and I'll check for any improved versions of the code.
  • shemuwel Posts: 3
    edited 2013-03-13 10:17
    Hey Guys,

    I've been looking through the forums and have found a lot of things about the SayIt. I've tried to use this code with my EasyVR, which I had purchased and used with my Arduino, but I recently switched over to the Propeller and have been trying to use it with that, with no luck. I have used the code provided on the Parallax site for the SayIt but connected the EasyVR. So far it only works with the built-in commands that go along with the Robot trigger, but when I try to use the words that I create it does not work at all. Does anyone know where I may have gone wrong or can provide any kind of assistance?

    Thanks
  • Duane Degn Posts: 10,588
    edited 2013-03-13 12:37
    shemuwel wrote: »
    Hey Guys,

    I've been looking through the forums and have found a lot of things about the SayIt. I've tried to use this code with my EasyVR, which I had purchased and used with my Arduino, but I recently switched over to the Propeller and have been trying to use it with that, with no luck.

    "no luck"? It sounds like you've had some? You found this thread right? That's lucky.
    shemuwel wrote: »
    I have used the code provided on the Parallax site for the SayIt but connected the EasyVR.

    What code? Do you mean code posted in this thread? If so, which post?
    shemuwel wrote: »
    So far it only works with the built-in commands that go along with the Robot trigger, but when I try to use the words that I create it does not work at all.

    I assume you're using the code from this thread. I don't recall if the code I posted included a menu item for speaker dependent (SD) words. If it did, it was more of a placeholder for a future feature than an included feature.

    I'm just barely figuring out how to use SD words myself, so I'm pretty sure there aren't any SD features in the code I've posted so far.
    shemuwel wrote: »
    Does anyone know where I may have gone wrong or can provide any kind of assistance?

    Thanks

    So it sounds like your EasyVR is working as well as a SayIt with the code posted in this thread. Wow, what luck!

    The SD words present additional challenges for a programmer since not only do you need to figure out what combination of words will do what but you also need to decide how many SD groups to use. There are lots of possible SD groups (something like 16) but you only get to define 32 (IIRC) SD words.

    How many SD words should be in each group? How many groups should be used? I'm still trying to figure this out.

    It seems like one group should be used to describe a general area of tasks. I was thinking of using words like "kitchen" and "yard" to indicate to the program which type of task I want the robot to perform. Once I select kitchen, do I want the SayIt (I'll use SayIt to mean either a SayIt or EasyVR) to listen to yet another SD group, or should it listen for one of the preprogrammed sets?

    These are some of the things I'm trying to figure out. As I mentioned previously in this thread, the SayIt doesn't make decisions, it just tells the microcontroller (uC) which word of a group or set was heard. It's up to the uC to tell the SayIt which group or set should be active and what should happen when a specific word is said. Should a different group or set be selected next, or should it still listen in the same group/set?

    The commands for using SD words are included in the SayIt documentation. The above "conversation" between SayIt and PC also gives an indication of which characters the uC should be expecting and which characters the uC will need to send to the SayIt in order to select a specific group.

    If you have a FTDI device to use as a second serial connection between Propeller and PC, you could use the monitor program yourself to see which characters are passed between SayIt and PC.

    I don't think I have any example code of using SD with the Prop yet. Once I have some sample code, I'll post it here.
  • mikea Posts: 283
    edited 2013-03-13 15:38
    Thanks for posting this, Duane. I'm having a problem getting the Prop to work with this VR shield, which sounds like it's basically the same as the Say It. I downloaded the SayItSimpleBridge to the Prop, but the EasyVR Commander won't recognize it. I think all else is correct; I first tried it with my Basic Stamp 2. Are you using something like the Commander to program recognition of new words, or did you create all the code for that? Should this bridge work with the Commander?
  • shemuwel Posts: 3
    edited 2013-03-13 17:18
    Thanks, Duane, for the reply. Yes, I copied the bridge program from this thread and used it to connect and program the words. I also downloaded the Say-it Driver from the parallax object exchange that was published in 2011 at http://obex.parallax.com/objects/search/?q=say-it&csrfmiddlewaretoken=1d8173c511ac042fee41db0dd91b9c04. It works with the built-in trigger and action words but not the words that I add. I will wait for you or someone who is more experienced with this to post something. Thanks again.
  • Duane Degn Posts: 10,588
    edited 2013-03-13 18:42
    I assume the "commander" program is a GUI for entering new words, training and testing them? I've attached a screenshot of the Propeller GUI. Is it similar to the commander program?

    If the commander program operates at the same baud as the SayIt, it should work. Otherwise, you'd just need to change the baud settings in the bridge program.

    I had trouble with the tx and rx lines being switched on the SayIt. I don't know if the VR shield has the same issue or not.

    It's possible to define and train words with the Propeller as the input, but I think it's easier to just use the PC program to program the SD words and groups and then program the Propeller to watch for these words.

    As I mentioned above, I think the trick is to figure out how many groups to use and what words should be assigned to which groups.
  • Duane Degn Posts: 10,588
    edited 2013-03-13 18:57
    shemuwel wrote: »
    I also downloaded the Say-it Driver from the parallax object exchange that was published in 2011 at http://obex.parallax.com/objects/search/?q=say-it&csrfmiddlewaretoken=1d8173c511ac042fee41db0dd91b9c04.

    Wow, I hadn't ever seen that object.

    Based on the description, it sounds like it does what the PBasic program does.

    I'm betting I'll get to programming some code to use SD words within a week or so. I want to use the SayIt to control an automated pipette gizmo I've made. I used the SI words "turn" and "forward" to indicate I want the stepper motor to deliver some acid, but I think I'd rather use SD words like "fill" and "drops" to make the interface more intuitive.

    My code "SayItAlpha" posted above has an option of listening for the trigger followed by listening for the other word sets. I think you need to use the initialize option before you can have the module listen for the trigger word.

    The default timeout isn't very long. I'd suggest setting it to zero so it doesn't timeout at all. There's a menu option in the "SayItAlpha" program for setting the timeout.

    I think one of the weaknesses of my "SayItAlpha" program is that it's too complicated. I'll try to write some simplified example code to make it easier to see what is needed to interface with the SayIt. The simplified code will have fewer possible outcomes, since keeping track of which "branch" of the menu tree is active is what makes the code so complicated.
  • mikea Posts: 283
    edited 2013-03-16 09:59
    @Duane: The EasyVR Commander looks identical to the picture you posted. On the shield, D12 = TX and D13 = RX when the jumper is set to SW mode. I finally got the Prop working with the sayitbridgeandmonitor130305 code, but I don't understand what is displayed in the debug terminal. It will show things like "%FF". I've been switching between the Prop and BS2 to make progress and understand how it works. With the BS2 it shows in the debug terminal that it recognizes the trigger word in group zero. After the trigger is recognized, shouldn't it listen for a nonzero group on its own? I have 3 words in group one with associated code below each, but it doesn't seem to recognize or hear them. I tested each word in the Commander, BTW, so I know it will recognize them. Is there something I'm missing?
  • Duane Degn Posts: 10,588
    edited 2013-03-16 10:34
    mikea wrote: »
    I don't understand what is displayed in the debug terminal. It will show things like "%FF".

    The PC program outputs some data that's not in the documentation. Any character that's not a normal printable character is displayed as a hexadecimal value; I add the brackets <$xx> to indicate it's a character outside the normal printable range. Space characters get displayed as hex since it would be hard to tell where they were otherwise. The "FF" value could either be one of these undocumented characters or noise on the line. I think the serial driver I used doesn't catch framing errors.
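
    For reference, the display rule is roughly this (Com stands for the serial object driving the terminal):
    PRI ShowByte(b)
      ' Printable ASCII (33..126) is shown as-is; everything else, including the
      ' space character, is shown as <$xx> so every byte on the line is visible.
      if b > 32 and b < 127
        Com.tx(b)
      else
        Com.str(string("<$"))
        Com.hex(b, 2)
        Com.tx(">")
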
    mikea wrote: »
    After the trigger is recognized, shouldn't it listen for a nonzero group on its own?

    No, the SayIt or VRbot doesn't make any decisions on its own. It has to be instructed to listen for another set or group.

    To switch to group 1, you have to send "dB" to the module. It will then output "r" if it recognizes the word spoken. The Prop then needs to send a space character to tell the module to continue. The module will then output a character identifying the index number of the word recognized. If the module output a "B" it would mean it heard the word with index #1.

    The program then needs to decide what group or set the module should listen for next. If the module isn't instructed to change to a different group or set it will continue to monitor the same one.
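
    In Spin, that exchange comes out to something like the method below. The numbers use the same encoding as elsewhere in the protocol, a capital letter counting up from "A", so group 1 goes out as "B" and a returned "B" means index #1. (Com is the serial object connected to the module; the ten-second wait is just an example timeout.)
    PRI RecognizeSD(group) | c
      ' Ask the module to listen for speaker dependent group "group" and return the
      ' index of the word it hears, or -1 if nothing was recognized.
      Com.tx("d")                                 ' "d" starts recognition of an SD group
      Com.tx("A" + group)                         ' group number as a letter: group 1 -> "B"
      c := Com.rxtime(10_000)                     ' wait for the status byte
      if c == "r"                                 ' "r" means a word in the group was recognized
        Com.tx(" ")                               ' a space asks the module to send the result
        c := Com.rxtime(1_000)
        if c <> -1
          return c - "A"                          ' turn the result letter back into an index: "B" -> 1
      return -1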