Rechargeable batteries and nominal voltage vs actual voltage
rwgast_logicdesign
Posts: 1,464
I was looking at the Parallax project board today and the specs say it can handle 5 to 16V VIN. This got me wondering about something, because I use high-capacity NiMH cells a lot, charging them either with homemade chargers sized for the cells or with a standard wall charger that takes AAs.
I constructed my first pack with 16 cells in order to get 19.2V, and was pretty surprised when it charged up to 22V. This got me thinking: how in the heck do you size a battery pack? Do you calculate your needs based on a cell's nominal voltage or its peak voltage? Let's say I wanted to run a Prop Project Board from the biggest NiMH pack possible. That would be 13 cells, which is 15.6V nominal, but fully charged the pack would sit around 18V since the cells peak at about 1.4V each.
I also have a NiCad pack rated at 9.6V. It charges up to eleven-something volts, and it doesn't drain down to 9.6V all that quickly; by the time it hits its nominal voltage it's about halfway spent.
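To put numbers on what I mean, here is a quick Python sketch comparing nominal and fully charged pack voltages against the board's 5-16V VIN window. The 1.2V nominal and roughly 1.4V hot-off-the-charger per-cell figures are just the ballpark numbers from above, not a datasheet spec:

    # Rough NiMH pack sizing sketch: nominal vs. fully charged voltage.
    # Per-cell figures are approximate ballpark values, not a spec.
    NOMINAL_V = 1.2      # volts per cell, nominal
    PEAK_V    = 1.4      # volts per cell, fresh off the charger (approx.)

    def pack_voltages(cells):
        """Return (nominal, peak) pack voltage for a given cell count."""
        return cells * NOMINAL_V, cells * PEAK_V

    def fits_vin(cells, vin_min=5.0, vin_max=16.0):
        """True if the pack stays inside the board's VIN range even when fully charged."""
        nominal, peak = pack_voltages(cells)
        return nominal >= vin_min and peak <= vin_max

    for n in (11, 12, 13, 16):
        nom, peak = pack_voltages(n)
        print(f"{n} cells: {nom:.1f} V nominal, {peak:.1f} V charged, fits 5-16 V VIN: {fits_vin(n)}")

By that arithmetic a 13-cell pack looks fine on paper at 15.6V nominal but lands around 18V off the charger, over the 16V limit, while something like 11 cells stays inside the window even fully charged.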
Comments
Motors can handle the overvoltage situation.
It would appear that controllers like the HB-25 controllers that Parallax sells are really meant to control motors designed for automotive applications, and those motors can tolerate the voltages seen under automotive conditions.
I would like to use the HB-25 controllers for my windshield wiper motor project, but I think an 18V DeWalt drill battery could possibly damage the HB-25 controller.
I guess the moral of this story is that battery chemistry and peak voltages are important: if you have very voltage-sensitive components to control, a voltage regulator of some type is required.
Marine also. I have four pump motors that came from yacht bilge pumps. The HB-25s work flawlessly with them.
So the real answer is to run a few tests. One is to measure the actual fully charged voltage of your battery packs; it is going to exceed the nominal advertised voltage. Another is to measure the voltage at which the device drops out, so that you have a good idea of its acceptable operating range.
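If you can read the pack voltage in software, logging the discharge under a real load gives you both numbers at once. A minimal sketch, where read_pack_voltage() is a hypothetical stand-in for whatever ADC or bench meter you actually use:

    import time

    def read_pack_voltage():
        """Hypothetical stand-in for an ADC channel or bench meter reading (volts)."""
        raise NotImplementedError("wire this to your actual measurement hardware")

    def log_discharge(interval_s=60, cutoff_v=4.5, logfile="discharge.csv"):
        """Log pack voltage under load until it falls below a chosen cutoff."""
        with open(logfile, "w") as f:
            f.write("seconds,volts\n")
            start = time.time()
            while True:
                v = read_pack_voltage()
                f.write(f"{time.time() - start:.0f},{v:.3f}\n")
                if v < cutoff_v:
                    break          # stop once the pack is below the voltage you care about
                time.sleep(interval_s)

Plot the log and the two numbers you care about, the fully charged peak and the dropout point, are obvious.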
One just cannot depend on advertised ratings; they are advertising, and consumer electronics really isn't about informing the consumer. You have to be the engineer, get real values, and ask real questions in order to get good results.
Of course, there is another question of how to make the jump from logic circuits to power circuits. I generally use bipolar transistors as they are very forgiving, but they are more wasteful than MOSFETs. A good designer really needs to know how to use power MOSFETs controlled by logic circuits down to as low as 1.8 volts, but I find it nearly impossible to build DIY projects with those kinds of parts; they are either too small or just not generally available. So I stick with my 2N3906, 2N2222, and TIP120 designs and waste more energy on internal voltage drops (a.k.a. internal resistance).
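To put rough numbers on that waste, here is a quick comparison. The device figures below are my own ballpark assumptions (a Darlington dropping on the order of a volt or so when saturated, a logic-level MOSFET with a few tens of milliohms on-resistance), not values from any datasheet in this thread:

    # Rough switch-dissipation comparison at a given load current.
    # The device numbers below are ballpark assumptions, not datasheet values.
    I_LOAD  = 2.0     # amps through the switch
    VCE_SAT = 1.4     # volts, typical-ish Darlington (TIP120-class) saturation drop
    RDS_ON  = 0.05    # ohms, a modest logic-level MOSFET

    p_darlington = VCE_SAT * I_LOAD          # P = Vce(sat) * I
    p_mosfet     = I_LOAD ** 2 * RDS_ON      # P = I^2 * Rds(on)

    print(f"Darlington: {p_darlington:.2f} W wasted in the switch")
    print(f"MOSFET:     {p_mosfet:.2f} W wasted in the switch")

With those assumptions, at 2A that is roughly 2.8W burned in the Darlington versus 0.2W in the MOSFET, which is the trade being described.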
I'll take the Pico VIA 910 thread as an example. Let's say the computer was OK with a "12V" car battery running it, so roughly 13.8V down to 11.5V is the approximate range.
The computer can handle the low and the high... but as soon as you apply 14.3-14.8V to charge the battery, the thing smokes.
Think of it this way: the battery is in parallel with the load and the charger. On paper the battery would soak up the extra voltage and not let it pass along the shared rail out to the computer. But as the battery charges, its resistance goes up, so it "shunts" the charger less and less, and the voltage on the V+ rail can climb. It's fascinating to watch on a set of meters.
So be careful about relying on the input tolerances of a component or device that's part of a project.
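A crude way to see that on paper: model the charger and the battery each as a voltage source with a series resistance, the computer as a load resistor, and solve for the shared V+ rail by nodal analysis. Every component value below is made up purely for illustration; the point is only that as the battery's effective resistance rises, the rail creeps toward the charger's voltage:

    # Toy model: charger, battery, and load share one V+ rail.
    # All values are invented for illustration only.
    V_CHG, R_CHG = 14.6, 0.5    # charger open-circuit voltage and source resistance (ohms)
    V_BAT        = 12.6         # battery EMF, roughly a charged "12 V" lead-acid
    R_LOAD       = 10.0         # the computer, modeled as a simple resistor

    def rail_voltage(r_bat):
        """Nodal analysis at the V+ rail: currents into the node sum to zero."""
        return (V_CHG / R_CHG + V_BAT / r_bat) / (1 / R_CHG + 1 / r_bat + 1 / R_LOAD)

    for r_bat in (0.05, 0.2, 1.0, 5.0):     # battery "shunting" less and less as it charges
        print(f"battery resistance {r_bat:5.2f} ohm -> rail at {rail_voltage(r_bat):.2f} V")

With these made-up numbers the rail sits near the battery voltage while the pack is hungry and creeps toward the charger's 14.6V as the pack fills, which is exactly the climb described above.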
If you want an in-line charger active while you are using the cells, the charging voltages creep even higher. Automotive circuits are where this is most common: the so-called 12 volt system can actually be running in the 14-15 volt range with the alternator trying to charge the battery. And for the sake of switching noise spikes, almost anything automotive (capacitors, relays, etc.) should be rated 60 or 70 volts minimum. There is a lot of sudden loading and unloading: the horn, electric windows, high/low beams, brake lights, and so on.
The basic rule here is: never engineer to the nominal voltage of a battery. Consider the actual context and get good data.
In fact the NiMH chemistry is completely different from NiCd's and involves the diffusion of hydrogen ions through rare-earth metal alloys.
NiCd cells were pretty much a constant 1.2V throughout the charge cycle (with almost no warning of full discharge). NiMH voltage varies from about 1.1V fully discharged to 1.35V fully charged, averaging about 1.3V.
The precise formulation varies (the "metal" can be different alloys, typically lanthanum-based). Most of the weight of a NiMH battery is the lanthanum or equivalent, not the nickel, by the way.
A 4-cell NiMH pack is a handy 5.3V or so, convenient for 5V logic, since it stays pretty much within the 5.5V to 4.5V range over the full discharge curve. (But as I said, the formulation varies.)
This lets you detect the charge state of the battery with minimally complex circuitry. That was by design.
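As a sketch of what "minimally complex" can look like in software: one voltage reading and a small lookup table. The voltage/charge pairs below are only illustrative interpolation points inside the 1.1-1.35V band above, not a calibrated curve:

    # Rough NiMH state-of-charge estimate from resting cell voltage.
    # The table points are illustrative only; real curves depend on load,
    # temperature, and the particular cells.
    TABLE = [          # (volts per cell, approx. fraction of charge remaining)
        (1.10, 0.0),
        (1.20, 0.25),
        (1.25, 0.50),
        (1.30, 0.75),
        (1.35, 1.00),
    ]

    def estimate_soc(cell_volts):
        """Linear interpolation between the table points, clamped at the ends."""
        if cell_volts <= TABLE[0][0]:
            return 0.0
        if cell_volts >= TABLE[-1][0]:
            return 1.0
        for (v0, s0), (v1, s1) in zip(TABLE, TABLE[1:]):
            if v0 <= cell_volts <= v1:
                return s0 + (s1 - s0) * (cell_volts - v0) / (v1 - v0)

    print(f"1.28 V/cell -> roughly {estimate_soc(1.28) * 100:.0f}% remaining")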
Trickle charging was another design feature for consumer batteries (the AA, C, and D sizes). The C/10 charging rate can come from a dumb wall wart putting out a small current; when the cells are charged, the increased cell voltage lowers the current, and the cells can dissipate the remaining energy as heat. In theory, anyway, because cheap transformer-based wall warts also have the problem that their output voltage rises as the current decreases, sometimes to double the rated value. That rising voltage could potentially hurt the electronics the battery was connected to, which is why you had to take the batteries out of the device and put them in a separate trickle charger.
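The arithmetic behind C/10, as a sketch (the 2000 mAh capacity and the ~140% charge-return rule of thumb are assumptions, not anything specific to your cells):

    # C/10 trickle-charge arithmetic (illustrative values).
    capacity_mah   = 2000                       # e.g. a 2000 mAh AA cell
    charge_current = capacity_mah / 10          # C/10 -> 200 mA
    charge_factor  = 1.4                        # ~140% of capacity returned, common rule of thumb
    hours          = capacity_mah * charge_factor / charge_current

    print(f"C/10 current: {charge_current:.0f} mA")
    print(f"Approximate full charge time: {hours:.0f} hours")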
Fast charging NiMH does get complicated because of the cell chemistry. The voltage rise definitely lags, so temperature sensing is a must. If there is a sudden increase in delta temperature (an increase in the rate of temperature rise), charging of that cell needs to be stopped. Decently designed NiMH fast chargers still fall back to a trickle-charge mode to maintain peak charge.
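In code form, that termination rule boils down to watching the rate of temperature rise rather than the temperature itself. A minimal sketch, assuming a hypothetical read_cell_temperature() you would wire to your own sensor, and with the rate limit left as a parameter for you to tune:

    import time

    def read_cell_temperature():
        """Hypothetical stand-in for a thermistor/ADC reading, in deg C."""
        raise NotImplementedError("wire this to your charger hardware")

    def fast_charge_should_stop(rate_limit_c_per_min=1.0, sample_s=30):
        """Return True when the temperature rise rate jumps past the limit."""
        t_prev = read_cell_temperature()
        while True:
            time.sleep(sample_s)
            t_now = read_cell_temperature()
            rate = (t_now - t_prev) * 60.0 / sample_s   # deg C per minute
            if rate > rate_limit_c_per_min:
                return True       # end-of-charge signature: temperature climbing fast
            t_prev = t_now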
Compare this to the lithium family of batteries, where you have to do coulomb counting. Overcharge and the cell vents; over-discharge and the cell shorts itself. Statically measuring the voltage, current, temperature, or weight won't tell you much. I'm not joking about weight: E = mc^2 says the mass of a charged cell has increased... but the change is laughably small.
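Coulomb counting itself is conceptually just integrating current over time; the hard parts are sense accuracy and drift. A bare-bones sketch, with a hypothetical read_current_amps() standing in for your sense resistor or gauge chip:

    import time

    def read_current_amps():
        """Hypothetical stand-in for a current-sense measurement.
        Positive = charging into the cell, negative = discharging."""
        raise NotImplementedError("wire this to your current-sense hardware")

    def coulomb_counter(capacity_ah, soc=0.5, sample_s=1.0):
        """Track state of charge by integrating current over time (no drift correction)."""
        charge_ah = soc * capacity_ah
        while True:
            i = read_current_amps()
            charge_ah += i * sample_s / 3600.0       # amps * seconds -> amp-hours
            charge_ah = max(0.0, min(capacity_ah, charge_ah))
            print(f"estimated SOC: {100.0 * charge_ah / capacity_ah:.1f}%")
            time.sleep(sample_s)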