Hobby electronics, where do you think it is going?
TC
Posts: 1,019
Hello all,
Just a random thought (I have a lot of them since my meds got changed).
When I started working with microprocessors (the BS2p), there was not a lot of info or help readily available. The internet was all dial-up, websites were not that informative, and there was no such thing as a "forum".
Back then the beginning hobbyist did not have many choices. Sure, there were evaluation kits that chip manufacturers made, but they were expensive, and seeing Puff the Blue Dragon (the magic smoke) was the last thing you wanted to see. And the only way you knew hardware existed was from magazines, unless you lived in or near an area that dealt with technology. Not me; I lived in an area that dealt with steel. You also had Radio Shack, which had parts; the salesmen knew nothing about the parts, but they had parts.
But now things are different. You have unlimited information, examples, and help at your disposal. Hardware has changed too. Now we have a 32-bit, 8-processor chip for under $10 (the Propeller); an overused (IMO) programming platform (Arduino); a mini computer that can run an OS (Raspberry Pi); and I have even seen FPGA kits that will not break the bank. Intel has the Galileo, which IMO looks promising, and they are coming out with the Edison, a computer (I know it's not, but close) the size of an SD card.
It is amazing to see all the cool and wonderful things people are creating. Even children are coming up with some great ideas, and there are courses that teach kids about programming (wish I had that).
My question to everyone:
Where do you think hobby electronics will go?
What do you think it will be like in 10 years?
Thanks
TC
Comments
why?
Because people will be expecting more performance. That means faster chips, usually SMT of some sort. It means lower voltages, and more finicky PCBs and layout rules.
Manufacturers are moving away from DIP and other through-hole formats. Small-volume chips will become more expensive, and lead times on small and medium orders will increase as resellers 'streamline' inventories.
Add the Chinese knock-offs... I got a couple of Arduino Unos in the mail yesterday, and it transpires that they're Chinese versions. I had to download and install an alternate USB serial driver to connect to them because, to drive down the price, they had used a different chip.
Also, a lot of the Pis, the 'USB stick' computers, and similar are a fad. How many of those just end up forgotten in a desk drawer?
I am pretty sure that some areas are NOT going to get developed, just because the governments see them as problematic. So forget about radar and ballistic missile navigation.
So what is left?
Personal automation. Whatever you desire to automate, and creating a better user interface, is pretty much wide open if you choose to go there. A lot of computer geeks have ignored a very old but obvious concept, ergonomics, and instead gone for tinier, flashier, more digitized user interfaces.
Personal automation can include all your personal habitats: home, office, school, and on the run. 3D printing has suddenly become in vogue with corporate America. That may make it a useful accessory to such automation, but it is not at the core of independent innovation. The core is having things the way you like them and not being driven by what 'the marketplace' claims is the next wave.
I am now looking at how a barcode reader might improve my lifestyle. I don't want a touchpad. I like keyboards and mice.
++++++++++++
Backtracking into ignored needs is another interesting area. "The Economist" this week has an interesting article on the developer of new technology for Braille readers. The current retail price of a Braille reader is about $5000 USD, and that makes it too expensive for the blind people who really want one. They have to rely on text-to-speech, magnified screen displays, or tape-recorded readings of material.
This all seems a bit negligent when OCR can easily scan text into plain alphabetical text and feed a good Braille device. So what is lacking is the $500 USD Braille reader, or even better, a $100 Braille reader.
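To illustrate how little software stands in the way, here is a minimal sketch in C. Dots 1-6 of a Braille cell map to bits 0-5 of the Unicode Braille block starting at U+2800; only letters a-e are mapped here, and a real table would implement full Grade 1/2 Braille rules, so treat this as a toy, not a spec.

#include <stdio.h>
#include <stdint.h>

/* Minimal text-to-Braille lookup. Dots 1-6 of a cell map to bits 0-5
   in the Unicode Braille block (U+2800). Letters a-e only; a real
   table would cover full Grade 1/2 Braille rules. */
static const uint8_t braille_dots[] = {
    0x01,  /* a: dot 1      */
    0x03,  /* b: dots 1,2   */
    0x09,  /* c: dots 1,4   */
    0x19,  /* d: dots 1,4,5 */
    0x11,  /* e: dots 1,5   */
};

int main(void)
{
    const char *word = "cab";
    for (const char *p = word; *p; p++) {
        unsigned cell = 0x2800 + braille_dots[*p - 'a'];
        printf("%c -> U+%04X\n", *p, cell);  /* cell code to send to a display */
    }
    return 0;
}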
+++++++++++++
And for the future..
If I had a child that was uncertain about a career path, it would be either plastics or computer animation. The days of steel, stone, and wood are disappearing. Casting can produce products so much faster than milling and turning on a lathe. And with ever more demands on people's attention, short animation is likely to be a very secure career path for the right creative person.
We have a clue that there is demand for such things: the Arduino was already a hit many years ago.
Chips and components will get smaller. DIP and through hole may disappear altogether. But as long as engineers need development kits and hobbyists need toys there will be someone willing to get boards and modules made and sell them. Market forces and all that.
In the absence of capacitors and resistors that we can actually see without a microscope, we might need to start making our own!
The Commodore 64 computer had just come out; it cost $700, plus another $600 for a floppy drive. You get the idea. Who would ever dream that you could get a computer for a dollar a processor (the Prop: 8 for $8)? The early personal computers had 256 bytes, maybe 1024 bytes, of memory. That's bytes, not KB or GB or TB.
So where is it headed? Well, the Prop is a good choice, the Prop2 even better. Other chip sets also have their place. The only limiting factor is our poor-azz US educational system. It was bad when I was growing up, and nothing has changed. We are getting beaten up at every turn. Just look around at other countries: better educated, better technology, better business environment; the list goes on. As a business owner, it just pains me to buy anything from overseas when we could be making the same product here in the US. AND it does not cost more, especially when you see the unemployed and homeless wandering the streets. Help. Me. Here.
And so, I never got a BeagleBone, a C3, or a Raspberry Pi.
I am very serious about the importance of good ergonomics at this point. I don't want something that is just a nest of wires sitting on my desk if I am actually using it as an appliance.
I got a Cubieboard because it seemed that all the right ergonomic features were present. Sure, it was a bit larger, but it had double the RAM, a better USB interface, and realistic power distribution. (Why would someone want an Android or Linux computer that powers from the USB port of another computer? And yes, I do understand that USB wall warts are now available.)
I guess what I am saying is that in prosperous times, we are more than willing to throw a bit of cash at anything we are curious about. But in leaner times, a desire for value kicks in.
It is interesting that the Chinese Arduino clones require a different USB driver. I bought one locally for about $10 USD or less, but have completely ignored it. I am having too much fun with Propellers.
WWW.mikronauts.com seems to be on the right track about modularity and ergonomics for the Propeller. They have through-hole designs down pat. It would be wonderful if Parallax had them help out on boards with SMD devices installed. It is just as much an art and a skill.
The Raspberry Pis are made in the United Kingdom. That's after starting off manufacturing in China.
The new Parallella computers from Adapteva are being made in the USA.
These may not be huge-volume things compared to phone manufacturing or such, but they show that it is quite possible to get products made at home competitively.
Loopy, how so?
What if you would quite like that kind of computing power for your robot projects or suchlike? The smaller the better. Instead of buying a microcontroller in a big old DIP, think of little boards and/or modules instead.
Ergonomics has nothing to do with it if it's buried inside whatever it is you are building.
This is neither good nor bad, but points up the change in how people want to experiment. Look at all the Chinese boards Erco points out in his posts. Why build something when you can get a DC-DC converter for 99 cents with shipping?
Long term, I see the hobbyist market dying. DIY is slowly going away as DIP and discrete components all go to SMT. Hobbyists will either buy pre-made boards for their projects or simply not do them at all.
Radio Shack's demise is just the first sign of this. I think there is a cultural shift playing into this as well. People are spoiled rotten by the convenience of consumer electronics, so plopping a Z-80 or Prop down in front of a teenager isn't going to wow them, at least not in numbers that make it cost-effective to sell products to this segment.
Boards like the Raspberry, Cubie, or Beaglebone are oddities and solutions looking for problems. A netbook could do the same but doesn't have the pizazz of a bare-bones ARM board.
Ever since discrete components and other devices were produced on silicon, the through-hole method was just a dead-end detour: the whole purpose of the lead frame on an IC package was just so mechanical/electrical connections could be made to the tiny devices on silicon. With surface mount and wave or reflow soldering, the connections can be made skipping much of the whole lead-frame/packaging process, saving time and money.
I've just produced my first boards with surface mount devices. There is a learning curve, but well worth it. You apply solder with a stencil, place your components, reflow and it is done. As far as I am concerned, it is a better method for single or multiple boards. The only drawback is that it is not really practicable to make PCBs at home, which adds some time to construction.
I expect that in the near future we will have DIY printable PCBs for prototyping or small runs, utilizing some sort of 3D printer to lay down traces, and a small laser to do vias (which would then be filled with more conductive material). The printer could lay down solder paste or maybe even a conductive glue (eliminating heated reflow). Even a DIY pick-and-place is possible, probably a simple system where the operator would, upon prompting by the layout software, place the next-needed device in a 'pick-up' zone, to then be placed by mechanical means.
The future for DIY is bright.
Then, things developed. During the 80s there was a transition period where, for example, LCD displays were available, but you had to use these massive 64 pin chips to control them. It was too daunting. During this time, I kind of took a hiatus from electronics and got far more into hiking and primitive tech (fire drill, water condensation collection, survival skills, etc).
Then I got interested again. I found that electronics had gone beyond its awkward intermediary phase and a new paradigm was emerging: The evolution of the "component".
Now, a component was a fully integrated serial LCD.
A fully integrated text-to-speech module.
And most importantly, a microcontroller that meant a circuit no longer needed to be a dedicated thing. What you first build to be a digital thermometer... same circuit, but change the program and now it's a remote rain gauge. Change the program again (and the attached sensors) and it's an RFID door opener.
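A loose sketch of that idea in C (the function names and readings are invented placeholders here, not any particular board's API): the generic loop never changes, only the read and report functions you plug into it.

#include <stdio.h>

/* One generic circuit: read something, report it. Swap the two plugged-in
   functions (and the attached sensor) and the same board becomes a
   thermometer, a rain gauge, or a door opener. Placeholder values only. */
typedef int  (*read_fn)(void);
typedef void (*report_fn)(int);

static int  read_thermistor(void) { return 21; }  /* stand-in reading, degrees C */
static void report_serial(int v)  { printf("reading: %d\n", v); }

static void run_device(read_fn read, report_fn report)
{
    report(read());  /* real firmware would loop forever */
}

int main(void)
{
    run_device(read_thermistor, report_serial);  /* today: a thermometer */
    return 0;
}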
Hobby electronics has undergone a massive shift, due to the burn-inducing speed of evolution in the electronics field. When I started, it was all about biasing transistors (ok, tubes.....) properly. Now it's all about getting your communications protocol settings correct.
There will ALWAYS be a need in hobby electronics to know the basics (Ohm's law, for example), but the micro-details are being absorbed into ubiquitous parts. What we used to have to design at the discrete resistor/capacitor/transistor level has been recognized by the industry at large as a nearly universally needed item. LCD displays, for example.
As electronics progresses, we will more and more be working with modules that perform a given function.
The upside of this is that we will be able to very rapidly and easily duplicate functions we are familiar with.
The downside of this is that over time, innovation will be directed more by the known, than the possible.
And there... in the dark fringes occupied by the uber-geeks who still recognize the value of knowing how to bias a transistor at just the right point of its linear region, who know the resistor color code so deeply they can almost feel the different bands, and who can create something that hasn't already been done... from their fertile minds the future will spring.
Long live the uber-geek! The future of our species depends on them.
What I would like to see is a general standardization on how I/O pins are handled within given languages.
This is just a pipe dream, because so many people shoot me down with "what's the point?"
For example, in C you have a pointer to a memory location, and that memory location just happens to correspond to an external area (and not a RAM cell).
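To make that concrete, here is a rough sketch of how it looks today. The register addresses are made up for illustration; every vendor picks different ones, which is exactly the portability problem.

#include <stdint.h>

/* Hypothetical memory-mapped GPIO registers; addresses invented for
   illustration, real ones come from a vendor datasheet. */
#define GPIO_DIR (*(volatile uint32_t *)0x40020000u)  /* direction register */
#define GPIO_OUT (*(volatile uint32_t *)0x40020004u)  /* output register    */

void light_on(void)
{
    GPIO_DIR |= (1u << 1);  /* make pin 1 an output */
    GPIO_OUT |= (1u << 1);  /* drive pin 1 high     */
}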
I would like to see some sort of separation of the concept, so that people can freely share code between disparate systems.
In industrial programming languages this happens already:
Variable declaration using the AT keyword:
light_1 AT %Q2.0 : BOOL;
eight_lights AT %Q3.0 : BYTE;
then in the program it's just the simple:
light_1 := TRUE; // or you can also say light_1 := 1; as well
or eight_lights := 16#01; //turn only the first light on.
At a lower level, the output image (%Q) would be translated into the actual areas that have the I/O pins.
This makes code super portable between systems, and gets rid of all these terribly different libraries that tend to get abandoned very quickly.
It's really frustrating to have a standard language between systems, but then having the physical interface to input and output pins be so proprietary.
Even if both systems technically have "output pin 2" or whatnot.
RAM is RAM. Files are files. Digital pins are pins. It's time to have a standard for what it means to say "make pin 1 an output and turn pin 1 on".
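For illustration, a sketch of what such a standard could look like in C. The names (pin_mode, pin_write) and the register addresses are invented here, not an existing standard; the point is that application code sees only the two calls, never a vendor register.

#include <stdbool.h>
#include <stdint.h>

/* Hypothetical portable pin API: invented names, not an existing standard. */
typedef enum { PIN_INPUT, PIN_OUTPUT } pin_dir_t;

/* One possible backend, mapped to made-up register addresses; each board
   would ship its own few lines like these. */
#define GPIO_DIR (*(volatile uint32_t *)0x40020000u)
#define GPIO_OUT (*(volatile uint32_t *)0x40020004u)

static void pin_mode(uint8_t pin, pin_dir_t dir)
{
    if (dir == PIN_OUTPUT) GPIO_DIR |=  (1u << pin);
    else                   GPIO_DIR &= ~(1u << pin);
}

static void pin_write(uint8_t pin, bool level)
{
    if (level) GPIO_OUT |=  (1u << pin);
    else       GPIO_OUT &= ~(1u << pin);
}

/* Application code: identical on every system that supplies the backend. */
int main(void)
{
    pin_mode(1, PIN_OUTPUT);  /* make pin 1 an output */
    pin_write(1, true);       /* turn pin 1 on        */
    return 0;
}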
That brings me to the point that our education system is just headed in the wrong direction, totally missing the point of preparing students for future jobs and advances in our culture (I purposely left out advances in science). As I stated before, we are getting beat up at every turn. Other countries are far ahead in education, technology, and business practices. Adding a few jobs in the technology sector just does not cut it. We need hundreds of thousands of jobs: not just a few high-tech companies, but basic commodities (construction, food, clothes, manual trades). So stretching the frontier and going from DIY hobby stuff to real-world problem solving takes both imagination and technical know-how. Some of the stuff I am working on: synthetic muscle (EAPs), electrophotonics (pigment color change), imaging pain, and ion mobility cells (developing better odor and taste sensors). Just the tip of the iceberg. Think of something; let's get to work.
Heathkit and several of the "radio/parts" stores were a favorite of mine for the next thing to build / design.
Today's basic board designs don't need to calculate bus-loading issues of capacitive "pF" or even "TTL" driver loads; we just purchase another Propeller and stick it in the circuit.
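For anyone who missed that era, the arithmetic was simple but constant; a rough sketch (the impedance and per-load capacitance figures below are illustrative, not from any datasheet):

#include <stdio.h>

/* Rough bus-loading check: the 10%-90% rise time of an RC-limited line
   is about 2.2 * R * C. Figures are illustrative only. */
int main(void)
{
    double r_ohms     = 100.0;   /* assumed driver output impedance */
    double c_per_load = 15e-12;  /* assumed 15 pF input per device  */
    int    n_loads    = 8;       /* devices hanging on the bus      */

    double c_total = n_loads * c_per_load;
    double t_rise  = 2.2 * r_ohms * c_total;  /* seconds */

    printf("total load: %.0f pF, rise time: %.1f ns\n",
           c_total * 1e12, t_rise * 1e9);
    return 0;
}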
Tomorrow's design issues will be a challenge (if I see the future correctly), with true PARALLEL processors integrated with BIO-mechanical systems.
Pulling power from a "body-cell" interface and interfacing with real nerves / muscles is the future.
No longer will you get a rabbit for a pet; just design one (with the modules/parts available from ..... online source) and assemble it.
No soldering; instead you will need a bit cleaner work area, and it can happen.
Far-fetched? That's what my dad said when I asked if you could compress the features/functions of my VTVM into a watch (or today's cell phone).
No real change for a long time from what I can see.
When Intel was shipping 8-bit 8085 microprocessors, the tools you needed (editor, assembler, linker, perhaps the PL/M high-level language) would cost you thousands of dollars, and the development system from Intel that you needed to run them on was about ten thousand dollars.
Following that, tools were typically closed source, binary-executable only, and available only for MS-DOS or Windows. And still very expensive.
Today we have a lot of open-source compilers and tools available, and they are cross-platform. This is a huge change. I count it as a recent change because it's only recently that Parallax, for example, has gotten on board with this idea and supported open-source tool development.
Technically things have moved along a lot. We have a huge range of programming languages available. We have very sophisticated IDEs. We can run an entire Unix like OS on an embedded system.
When I programmed my first-ever 8-bit microprocessor, I did it in hexadecimal. We did not even have an assembler. Today I can type JavaScript into a device that is a lot smaller!
On the actual electronics side we have easily available tools for schematic capture and PCB design. In the 1980s I worked on a team building a PCB design package for the PC. It would cost you about 8000 pounds a copy if you wanted to play.
We have Spice simulators and other software for circuit design.
Or you can be designing huge, complex logic in VHDL/Verilog for your now very cheap FPGAs.
How much more change would you like to see?
Much, much more.
What you describe is the reduction of cost...LONG overdue.
Everything else you described I worked with 10-15 years ago. Programming really hasn't changed; just the cost to play with the same oldy-moldy tools has.
Hardware has seen significant changes...not so with software.
What changes would I like to see?
Better man/machine interfaces, AI layering, self-mending software, seamless software tool interfaces... heck, something as simple as voice input as standard with NO keyboard needed. Basic programming should be so simple that children could do it; make it a requirement that every kindergartner knows how to program along with their ABCs. We are not even close.
It just kills me when I see a program crash and one gets a memory dump or cryptic error message... so '50s. Did Elvis ever leave the software building? It would appear not.
Only when the software tools improve the way hardware has will we see the complexity of future software improve.
The future of software... just take a stroll down the Internet and check out the trash that passes for websites these days, and the tools that made it. I suspect that is what the future holds for the software tools that will filter down into the hobby electronics we love to work with.
We could do better...and should demand better.
<fast forward>
Now, the threshold for Bomber, or someone like him, to make real self-made money is much lower again: idea -> modules -> programming -> kit -> global Internet market = fuel for dreams, inspiration, and incentives for learning and doing.
Also, I firmly believe humans (most of them) have a fundamental need for making stuff - be it cakes, sheds, or technology gadgets. DIY is the future, because it will be so much more available and feasible to everyone. With a Propeller and a 3D printer you can make anything. As long as you have an idea. People who have good ideas and the urge to make, have a bright future.
Erlend
I do agree, to a large extent programming has not changed. But a lot has:
Man machine interfaces have changed a lot in my time. From punch cards to GUI builders. From entering HEX op codes to typing JavaScript directly into a 2 dollar MCU. From keyboards to touch interfaces.
I'm not sure what you mean by "AI layering, self-mending software". Perhaps if we could actually define these things in a meaningful way, we would stand a chance of realizing them.
Voice input may have its place, but I can't imagine it being useful for coding. Try dictating Spin/PASM code to a friend and have him type it in, then tell him how you would like to change it here and there. I suspect it's not so easy.
As for programming being so simple that children could do it: actually, it is. See raspberrypi.org and their Python and Scratch efforts. Years ago we had kids programming in Logo and SmallTalk.
As for memory dumps and cryptic error messages: I don't get that any more. What I get even shows me the source code at the error position.
I'm going to look at this a bit sideways and propose that actually nothing much has changed in hardware.
Obviously computers have become vanishingly small, consume a lot less power, are much cheaper and a lot faster.
But conceptually nothing has changed since the ideas of Babbage and Turing.
We still have addressable memory, RAM and ROM, for program and data. We have external storage on tapes and disks. We still have processors with registers, ALUs, and so on. We still have the "fetch, decode, execute" cycle.
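That cycle fits in a dozen lines; a toy sketch in C, with opcodes invented for this sketch:

#include <stdint.h>
#include <stdio.h>

/* A toy machine running the same fetch-decode-execute cycle Babbage and
   Turing described. The opcodes are made up for this example. */
enum { OP_LOAD, OP_ADD, OP_PRINT, OP_HALT };

int main(void)
{
    uint8_t program[] = { OP_LOAD, 40, OP_ADD, 2, OP_PRINT, OP_HALT };
    int acc = 0, pc = 0;

    for (;;) {
        uint8_t op = program[pc++];                  /* fetch   */
        switch (op) {                                /* decode  */
        case OP_LOAD:  acc  = program[pc++]; break;  /* execute */
        case OP_ADD:   acc += program[pc++]; break;
        case OP_PRINT: printf("%d\n", acc);  break;
        case OP_HALT:  return 0;
        }
    }
}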
We see more features added to microprocessors like caches, pipelines and so on but they were all thought about and implemented back in the mainframe days. Now we have multi-core processors and GPU but even that parallelism is hardly a new idea.
Ultimately it's the same old zeros and ones and NAND gates. Just now we have a lot more of them.
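That last claim is easy to demonstrate: NAND alone is functionally complete, so NOT, AND, and OR all fall out of it. A toy proof in plain C:

#include <stdio.h>

/* NAND is functionally complete: every other gate can be built from it. */
static int nand(int a, int b) { return !(a && b); }
static int not_(int a)        { return nand(a, a); }
static int and_(int a, int b) { return not_(nand(a, b)); }
static int or_(int a, int b)  { return nand(not_(a), not_(b)); }

int main(void)
{
    printf("NOT 1   = %d\n", not_(1));     /* 0 */
    printf("1 AND 0 = %d\n", and_(1, 0));  /* 0 */
    printf("1 OR 0  = %d\n", or_(1, 0));   /* 1 */
    return 0;
}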
We can say the same about software. Nothing much has changed. It's still about variables, expressions, conditionals, and iteration. We just have a lot more of it now.