
Batteries for medical devices

Their characteristics and how to choose the right one.

The United States is the largest medical device market in the world. Even so, design engineers can find it difficult to choose the right battery for their portable medical devices. Here, Neil Oliver, technical marketing manager of battery specialist Accutronics, explains the battery characteristics that design engineers should consider and how they will help mission-critical devices continue to operate in the most demanding medical environments.


The US market for medical devices generates $180 billion in revenue, making it the largest in the world, according to the 2017 Global Market for Medical Devices report by Kalorama Information.

However, in my experience, there are still issues that original equipment manufacturers must address as we increasingly rely on portable, battery-powered devices. Variables such as battery chemistry, discharge profiles, charge rate and the long-term effects of ageing continue to perplex many manufacturers.


Battery chemistry

Devices such as ventilators, infusion pumps, dialysis systems and anaesthesia machines have historically been powered by the mains AC supply, relying infrequently on their onboard battery as a backup only. This means that the battery often only receives a shallow discharge before being charged again.

While today’s rechargeable batteries do not suffer from the memory effect that was common with older technologies such as nickel-cadmium batteries, where the battery would lose capacity if it wasn’t fully discharged before being charged again, modern batteries such as sealed lead acid and valve-regulated lead acid still have a failure mechanism: a gradual rise in their internal resistance.

As the internal resistance increases, any sudden demand for power, particularly from motor-driven devices such as dialysis machines and ventilators, causes a pronounced drop in voltage, which is far from ideal for a medical battery.

The most common mistake that I’ve seen equipment manufacturers make when matching a battery to a device is not selecting a battery with an internal resistance appropriate to the load. If the load current is high, or it has high pulses, and the battery has a relatively high internal resistance, then the voltage drop under load can be severe.

This will cause two problems. Firstly, the battery will heat up and waste energy; secondly, it will reach the device’s cut-off voltage earlier than desired.


Internal resistance

A battery’s internal resistance is made up of two factors: electronic and ionic resistance. Electronic resistance encompasses the resistivity of the actual materials that make up the individual cells such as the cell cover, can, and current collectors, as well as the welded interconnection links between cells and the battery-level elements such as wiring, FETs, fuses and sense resistors.

Ionic resistance is the resistance to current flow within the cells due to electrochemical factors such as electrolyte conductivity, ion mobility and electrode surface area. The combination of these factors makes the total effective resistance, which results in a voltage drop once the battery is placed under load.

Effective internal resistance is usually measured by placing a fully charged battery under a low-current load, typically 0.2C, for 10 seconds. Once this has elapsed, the current is immediately increased to a higher level, typically 1.0C, and held for one second. Ohm’s law is then used to calculate the resistance: the difference between the two on-load voltages divided by the difference between the two currents.
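A minimal sketch of that calculation in Python follows; the pack capacity, voltages and currents below are hypothetical readings for illustration, not values from the article.

def effective_resistance(v_low, i_low, v_high, i_high):
    # Ohm's law on the two-step load test: the difference between the two
    # on-load voltages divided by the difference between the two currents.
    return (v_low - v_high) / (i_high - i_low)

# Hypothetical readings for a 14.4V, 3.4Ah pack, so 0.2C = 0.68A and 1.0C = 3.4A:
r = effective_resistance(v_low=15.8, i_low=0.68, v_high=15.1, i_high=3.4)
print(f"Effective resistance: {r * 1000:.0f} milliohms")  # ~257 milliohms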

Figures one and two each show the discharge voltage of a 14.4V lithium-ion battery, from two different suppliers, consisting of eight 18650 cells in a 4-series, 2-parallel array. The load is switched between 110W x 200ms, 90W x 300ms and 40W x 1000ms, resulting in a voltage drop and a “thick” voltage trace.

Supplier A has a low effective resistance, calculated at 69mΩ, which results in a minimal voltage drop and a temperature rise of only 16.0 degrees Celsius. Supplier B has a far higher effective resistance, calculated at 209mΩ, which results in a severe voltage drop and 37.6 degrees Celsius temperature rise.
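Those two resistance figures translate directly into voltage sag and wasted heat. As a rough illustration in Python, assuming the 110W pulse is drawn at the pack’s 14.4V nominal voltage (so roughly 7.6A):

# Rough, illustrative estimate; assumes pulse current ~= P / V_nominal.
v_nom = 14.4                 # pack nominal voltage, V
i_pulse = 110.0 / v_nom      # ~7.6A during the 110W pulse

for name, r_ohms in (("Supplier A", 0.069), ("Supplier B", 0.209)):
    sag = i_pulse * r_ohms        # V = I x R, the on-load voltage drop
    heat = i_pulse ** 2 * r_ohms  # I^2 x R, power dissipated as heat
    print(f"{name}: ~{sag:.2f}V sag, ~{heat:.1f}W lost as heat")
# prints ~0.53V and ~4.0W for Supplier A; ~1.60V and ~12.2W for Supplier B

The threefold difference in resistance produces a threefold difference in both sag and self-heating, broadly consistent with the temperature rises measured above.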


Temperature

Batteries usually work best at room temperature. A higher ambient temperature may provide better short-term operation by making the charge and discharge reactions more efficient and reducing internal resistance as a result.

However, sustained operation at high temperatures also results in unwanted parasitic reactions between the electrodes and the electrolyte and a breakdown of the cell structure, leading to a long-term reduction in performance.

The international standard IEC 60601-1, which covers the general requirements for basic safety and essential performance, requires that any part that can be touched by a person should not exceed 43 degrees Celsius. So, choosing a battery that heats up excessively could lead to non-compliance.


Fast charging

Temperature is also the reason why fast charging, despite the benefits it offers, is not conducive to the long-term health of the battery. In May 2017, it was reported that the electric vehicle giant Tesla was limiting the charging rate on vehicles being charged with its 120kW Superchargers to only 90kW if the vehicle had already accumulated too many DC fast-charge events. The company released a statement explaining the limitation.

“The peak charging rate possible in a li-ion cell will slightly decline after a very large number of high-rate charging sessions. This is due to physical and chemical changes inside of the cells.

“To maintain safety and retain maximum range, we need to slow down the charge rate when the cells are too cold, when the state of charge is nearly full, and also when the conditions of the cell change gradually with age and usage.”

It is best to charge a battery at the lowest current that a user can realistically tolerate, and then only recharge it when the remaining capacity has dropped to a level where there is insufficient capacity left to deliver useful power.

In reality, this will usually result in a compromise between the longevity of the battery and the demands of the user. In medical environments, where a portable dialysis machine needs to go with the patient as they move around a hospital, rapid recharging takes priority over battery longevity; a slow charge would reduce the quality of care delivered to the patient.


Battery management

A typical lithium-ion battery charges in two stages. In stage one, the battery is charged at a constant current and the voltage is allowed to rise. Once the battery reaches its maximum voltage limit (typically 4.2V per cell) the voltage is held constant and the current is allowed to taper down towards zero. Charge is terminated once the current has tapered to a pre-set level.
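A minimal Python sketch of that constant-current/constant-voltage (CC/CV) logic follows; the names and setpoints are illustrative assumptions, not values from any particular charger.

# Minimal sketch of two-stage (CC/CV) charge-termination logic.
# All names and setpoints here are illustrative assumptions.

V_MAX_PER_CELL = 4.2    # constant-voltage limit per cell, V
CELLS_IN_SERIES = 4     # e.g. the 4-series pack described earlier
I_CHARGE = 1.7          # stage-one constant-current setpoint, A
I_TERMINATE = 0.1       # taper current at which charging stops, A

def charge_step(pack_voltage, charge_current):
    # Decide the charger's action for one control-loop tick.
    v_limit = V_MAX_PER_CELL * CELLS_IN_SERIES
    if pack_voltage < v_limit:
        return ("CC", I_CHARGE)   # stage one: constant current, voltage rises
    if charge_current > I_TERMINATE:
        return ("CV", v_limit)    # stage two: hold voltage, current tapers
    return ("DONE", 0.0)          # taper threshold reached: terminate charge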

Here, it is important that the charging voltage is never exceeded. Because lithium-ion batteries consist of many cells in series, it is advisable to use cell balancing circuitry that shuttles the charge current away from the cells that reach their charging voltage first, allowing the others to catch up.


Cell balancing

The most common method of cell balancing is to monitor the individual cell voltages and bleed energy away from the cells with the highest voltages. The energy is bled away through resistors and dissipated as heat. Other methods redirect the cell energy to charge other cells, which wastes less energy but is more complex to accomplish.

While cell balancing will not turn a bad cell pack into a good one, it will keep good cells balanced and this will ensure the battery can be cycled many hundreds, sometimes thousands, of times. Reducing the charge voltage will also promote longer cycle life, albeit with a reduction in capacity per cycle.

Figure three shows a cell balancing diagram. Semiconductor specialist Texas Instruments explains that “Some of the current which would charge the cell is diverted through a parallel path.

“Cell balancing operates on a 50-ms nominal period. Voltage monitoring occurs during approximately 10 ms of this interval; a balancing field-effect transistor internal to the IC is switched on during the remaining 40 ms of the period to provide a bypass path. The FET is switched off again at the end of the balancing interval to measure the cell voltage.”
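A simple Python sketch of the decision step in such a passive scheme follows; the threshold and function names are assumptions for illustration, not taken from the Texas Instruments device.

# Illustrative sketch of the decision step in passive (bleed-resistor)
# balancing: per 50-ms period, read cell voltages (~10 ms), then switch
# on the bleed FETs of the highest cells (~40 ms). The threshold below
# is an assumption, not a value from the Texas Instruments device.

BALANCE_THRESHOLD_V = 0.01   # bleed cells more than 10mV above the minimum

def select_bleed_cells(cell_voltages):
    # Flag every cell sitting above the lowest cell by more than the threshold.
    v_min = min(cell_voltages)
    return [v > v_min + BALANCE_THRESHOLD_V for v in cell_voltages]

# Example: the third cell of a 4-series pack runs ahead of the others.
print(select_bleed_cells([4.15, 4.16, 4.20, 4.15]))
# [False, False, True, False] -> only the high cell is bled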

To keep pace with the size and demands of the global medical device market, it is vital that design engineers think carefully about their choice of battery, ensuring that their mission-critical devices continue to operate in the most demanding of environments.
