Trends in power supplies

The impact of international standards in enhancing the performance of programmable DC power supplies

Feb. 1, 2000

By Mark Edmunds

Up until the early 1990s, many North American manufacturers of programmable power supplies limited their design efforts to domestic markets. They tended to ignore the many foreign regulatory requirements that exist around the world.

The varied national approaches to safety and electromagnetic compatibility (EMC) regulation in Europe posed a particularly serious hurdle. Until the early 1990s, U.S. products often carried approval to only one national standard, typically a North American certification such as UL or CSA alone.

A decade ago, the U.S. and Canadian UL and CSA safety certifications ensured product safety and testing to the national standards of those two countries. Yet they did not necessarily take into account the requirements of other regions, and in particular the more stringent safety requirements that apply outside North America, where AC line voltages are generally higher.

Since the largest markets for programmable power supplies were in North America at that time, there was little incentive to accommodate all the needs of the international market. However, as the electronics marketplace has globalized and expanded, upgrading power supplies to international standards has begun to deliver powerful performance benefits.

Switchmode power supplies

The emergence of high-frequency switchmode power supplies in the 1980s drew the attention of regulatory bodies around the world. In particular, it became apparent the electromagnetic interference conducted and radiated by this type of supply could wreak havoc with radio communications, and impede the operation of sensitive electronic circuits.

To limit the magnitude of this interference, the U.S. Federal Communications Commission (FCC) imposed the widely known Part 15, Subpart J, Class A or B limits on radiated and conducted noise. In Europe, the more stringent CISPR 11 limits were put into place; they covered an even wider spectrum than the U.S. requirements. Most manufacturers designed and tested their products to the dominant American standard, and tended to leave accommodation of the stricter European standards to the product's end user. This often required adding extra filtering and retesting the product once it was installed in a complete system, which caused project delays and forced users to allocate additional equipment-rack space for unplanned-for filters.

The unified European CE mark

The creation of the European Union drove the need to develop a unified set of product regulatory standards to simplify and promote intra-European trade. The outcome of this is the well-known CE mark, which encompasses the following standards applicable to power supplies:

- EMC Directive standards EN50081-1 and EN50081-2, which limit emissions;

- EN50082-1 and EN50082-2, which define the required levels of immunity to external effects; and

- Low Voltage Directive standard EN61010-1, which defines safety and construction requirements.

These standards gained momentum in the mid-1990s as other countries, such as the U.S. and Canada, adopted slightly modified versions of the same regulations for new electronic equipment sold domestically. Due to the widespread application of the CE requirements, even companies in countries that did not formally adopt European standards tend to apply them as part of their internal specification requirements. For the first time, one set of more stringent EMC and safety standards existed — and could be used to guide the development and test process of new programmable power supplies for use all over the world.

EMC requirements

The EMC directive became mandatory in 1996, and placed legal requirements on companies selling electronic products in Europe. It required that the electromagnetic energy radiated by a unit and conducted onto the AC power lines stay below the limits defined in EN50081-1 and EN50081-2.

The directive also required units to withstand external electromagnetic effects, such as high-voltage discharges and operation in RF fields, as defined in EN50082-1 and EN50082-2. Since these standards were more stringent than the earlier FCC Part 15 requirements, many North American power supply companies suddenly found themselves with older switchmode products that were not in compliance. Consequently, they were unable to sell these products into the European market.

Overcoming this hurdle required a considerable engineering re-design effort to lower the electromagnetic interference that the units generate. In some cases, this change was as simple as adding some new filtering stages to the AC input and DC output where space allowed. Often, though, the best plan was to introduce new product designs that eliminated the majority of the noise at its source.

The most common technique employed to do this has been a form of resonant switching in the main power transistors to ensure the actual energy being switched by the active device is reduced to nearly zero. This greatly decreases the unwanted high-frequency voltage and current transients — the culprits that supply much of the RF noise radiated and conducted out of the power supply.

This change to "zero voltage (or current) switching" — otherwise known as "soft switching" — not only reduces the overall electromagnetic noise in a power supply, but also prevents interference with other electronic equipment and radios. Still, there are other significant benefits.

Soft switching techniques essentially eliminate the power loss that normally occurs as the main power transistors change from a conducting to a non-conducting state. This reduction in wasted power will often improve the efficiency of a unit by approximately 2 percent. While this does not sound significant, it can account for a saving of more than 20 watts in a 1000-watt power supply.
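To see where savings of this size come from, consider the standard first-order approximation that each hard-switched transition dissipates roughly one-half of the product of the switched voltage, the switched current, and the transition time. The short sketch below works this out for a set of illustrative, assumed component values (bus voltage, switched current, transition times, and switching frequency); it is not data from any particular supply.

```python
# Rough estimate of the switching loss recovered by soft (zero-voltage)
# switching.  All values below are illustrative assumptions, not
# measurements from any particular power supply.

V_BUS = 400.0     # volts across the MOSFET during a hard-switched transition (assumed)
I_SW = 6.5        # amps being switched (assumed)
T_RISE = 60e-9    # turn-on transition time, seconds (assumed)
T_FALL = 60e-9    # turn-off transition time, seconds (assumed)
F_SW = 100e3      # switching frequency, Hz (assumed)

# First-order approximation: energy lost per hard-switched cycle is about
# 1/2 * V * I * (t_rise + t_fall); multiplying by frequency gives average power.
energy_per_cycle = 0.5 * V_BUS * I_SW * (T_RISE + T_FALL)
hard_switching_loss = energy_per_cycle * F_SW

# With zero-voltage switching the transistor changes state while the voltage
# across it is near zero, so this loss term nearly vanishes.
print(f"Hard-switching loss: {hard_switching_loss:.1f} W")
# For these assumed values the result is roughly 15 W, i.e. on the order of
# the 20-watt, 2-percent improvement described above for a 1000-watt supply.
```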

This is all power that the main power switches (often MOSFETs) would have dissipated. These are the most critical components in any switchmode power supply. Reducing the power here lowers their junction temperature, giving increased operating margins and, hence, a longer life for the whole power supply.

So not only does a soft switching CE-compliant power supply generate significantly less electrical noise than its predecessors, but the unit is also more efficient. It has a longer mean time between failures, and better resists the effects of other equipment operating nearby.

Improved safety, reduced risk

The Low Voltage Directive portion of the CE requirements came into full force in 1997. For power supplies, this means compliance with the EN61010-1 safety standard. While products designed to meet older North American standards were normally safe, the requirements of this new standard place the power supply user at even less risk than before.

Many of the regulations influence the internal design of a power supply, and while they do increase safety through the use of thicker insulators and greater clearances in high-voltage circuits, the changes are not obvious to the user. But, as with the EMC requirements, there are safety changes that have enhanced user convenience.

The most useful new feature in this category is the style and implementation of the AC input and output power connectors. In the past, these were often exposed-screw terminal strips or bus bars, which required the user to be diligent about adding insulating covers or installing the whole unit in an extra enclosure.

New designs, by contrast, use improved connector systems with plastic-shrouded connectors that prevent accidental contact, and equip units with easily installed connector covers featuring cable-strain-relief clamps. The CE-compliant unit, then, poses little risk to the operator, while also greatly reducing the overall power supply installation and setup time.

New requirements

One of the last CE requirements coming into effect (likely in the first years of the next century) is IEC 1000-3-2, which will specify limits on the harmonic currents that line-connected industrial equipment may draw. These limits will help ensure that clean power is available to all equipment connected on the line, and reduce the overall power losses in the AC distribution system.

The most common way for power supplies to comply with this standard is to employ active power factor correction (PFC) circuitry in the AC input section of the unit. This circuitry, which is essentially another power converter in series with the main power supply, forces the unit to draw current that closely tracks the sinusoidal shape of the line voltage. Doing this gives an input power factor very close to one, like an ideal resistive load (e.g., a light bulb), so that nearly all of the current is drawn at the fundamental line frequency and is in phase with the line voltage.
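The effect of forcing the current to track the line voltage can be illustrated numerically. The sketch below, using assumed waveform shapes and amplitudes, compares the power factor (real power divided by the product of RMS voltage and RMS current) of a sinusoidal, PFC-style input current with the short, peaky current pulses drawn by a simple rectifier-capacitor front end.

```python
import numpy as np

# Compare the power factor of a sinusoidal (PFC-style) input current with
# the narrow current pulses typical of a simple rectifier-capacitor input.
# The waveform shapes and amplitudes are illustrative assumptions only.

f_line = 60.0                                    # line frequency, Hz
t = np.linspace(0.0, 1.0 / f_line, 10_000, endpoint=False)
v = 170.0 * np.sin(2 * np.pi * f_line * t)       # ~120 V RMS line voltage

def power_factor(i):
    """Real power divided by apparent power (Vrms * Irms)."""
    real_power = np.mean(v * i)
    apparent_power = np.sqrt(np.mean(v ** 2)) * np.sqrt(np.mean(i ** 2))
    return real_power / apparent_power

# PFC-equipped supply: the current tracks the voltage waveshape.
i_pfc = 10.0 * np.sin(2 * np.pi * f_line * t)

# Non-PFC supply: current flows only in short pulses near the voltage peaks.
i_peaky = np.where(np.abs(v) > 0.9 * 170.0, 40.0 * np.sign(v), 0.0)

print(f"PFC input power factor:     {power_factor(i_pfc):.2f}")    # 1.00
print(f"Non-PFC input power factor: {power_factor(i_peaky):.2f}")  # roughly 0.7
```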

A typical non-compliant, single-phase-input switchmode supply may have an input power factor in the range of 0.65 to 0.7. Such a supply draws current in very high, narrow peaks, causing heavy distortion of the line voltage and requiring an AC line rated to supply in excess of 30 percent more current than a PFC-equipped unit.

While designers are implementing this standard largely to aid the power utilities, there are also benefits for the user. With a near-unity power factor input, the input current requirements are greatly reduced because nearly all the current flowing is doing work within the power supply. For example, a non-power-factor-corrected 1000-watt supply running off a 120-volt AC line will draw as much as 16 amps, clearly in excess of a standard 15-amp distribution line rating. The same unit with PFC will draw only about 11 amps, well within the capability of the same 15-amp line.
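The current figures above follow from a simple calculation. In the sketch below, the 80 percent conversion efficiency is an assumed value chosen for illustration (the article does not state one); the power factors are the typical values cited in the text.

```python
# Reproduce the input-current comparison above.  The conversion efficiency
# is an assumed figure; the power factors are typical values from the text.

P_OUT = 1000.0       # watts delivered to the load
EFFICIENCY = 0.80    # assumed overall conversion efficiency
V_LINE = 120.0       # RMS line voltage, volts

P_IN = P_OUT / EFFICIENCY   # real power drawn from the line, 1250 W

def input_current(power_factor):
    """RMS line current needed to draw P_IN at a given power factor."""
    return P_IN / (V_LINE * power_factor)

print(f"Without PFC (power factor ~0.65): {input_current(0.65):.1f} A")  # ~16 A
print(f"With PFC    (power factor ~0.95): {input_current(0.95):.1f} A")  # ~11 A
```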

Another feature often available as a side effect of a typical power factor correction circuitry implementation is the ability of the unit to operate off a wide range of AC input line voltage, typically 85 to 264 volts. This means one unit can operate virtually anywhere in the world with no complications from voltage range selection. It also dispenses with the need to order a special configuration of a product.

A power-factor-corrected unit will comply with the proposed regulations and minimize its interaction with other electronic equipment on the line. It will also often shorten setup time by allowing the use of standard power outlets, and eliminate concerns over configuring the unit for a particular line voltage.

Mark Edmunds, P. Eng., is vice president of engineering at Xantrex Technology Inc., a manufacturer of programmable DC power supplies in Burnaby, British Columbia. His phone number is 604-415-4600.
