DC-DC voltage converters are often used to provide a regulated voltage supply from an unregulated voltage source. An unregulated source can be, for example, a rectified line voltage whose magnitude fluctuates with line and load conditions. Regulated voltage supplies provide an average DC output voltage at a desired level (3.3 V, 2.5 V, etc.), despite fluctuating input voltage sources and variable output loads. Factors to consider when deciding on a regulated voltage supply solution include:
- Available source input voltages
- Desired supply output voltage magnitudes
- Ability to step-down or step-up output voltages, or both
- DC-DC converter efficiency (POUT / PIN)
- Output voltage ripple
- Output load transient response
- Solution complexity (single-IC solution, number of passive components, controller plus external FETs)
- Switching frequency (for switch-mode regulators)
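As a quick illustration of the efficiency figure of merit listed above, the following sketch computes POUT / PIN from measured voltages and currents. The specific values (3.3 V at 2 A from a 5 V source drawing 1.65 A) are hypothetical example numbers, not taken from the text:

```python
def efficiency(p_out_w: float, p_in_w: float) -> float:
    """DC-DC converter efficiency: P_OUT / P_IN."""
    return p_out_w / p_in_w

# Hypothetical example: 3.3 V output at 2 A, from a 5 V input drawing 1.65 A
p_out = 3.3 * 2.0    # 6.6 W delivered to the load
p_in = 5.0 * 1.65    # 8.25 W drawn from the source
print(f"Efficiency: {efficiency(p_out, p_in):.0%}")  # → 80%
```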
The following sections describe several different voltage regulators.
Linear voltage regulators are commonly used in step-down applications (the output supply voltage is lower than the input source voltage). They are available with either a fixed output voltage or an adjustable output voltage set by external biasing resistors.
The advantages of linear regulators are simple implementation, a minimal parts count (just the IC in the fixed-output case), and low output ripple. The major disadvantage is low efficiency: significant power is dissipated within the regulator IC, because the converter is constantly on and conducting the full load current. Linear regulators are best used when the difference between the input source voltage and the output supply voltage is small and converter efficiency is not a concern.
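The efficiency penalty described above follows directly from the pass element carrying the full load current across the input-output voltage difference. A minimal sketch of this relationship, ignoring the regulator's quiescent current (the 5 V to 3.3 V, 1 A figures are hypothetical example values):

```python
def linear_reg_dissipation(v_in: float, v_out: float, i_load: float) -> float:
    """Power dissipated in a linear regulator's pass element
    (ignores quiescent current): P = (Vin - Vout) * Iload."""
    return (v_in - v_out) * i_load

def linear_reg_efficiency(v_in: float, v_out: float) -> float:
    """Best-case linear regulator efficiency: Vout / Vin."""
    return v_out / v_in

# Hypothetical example: 3.3 V from a 5 V source at 1 A load
print(f"{linear_reg_dissipation(5.0, 3.3, 1.0):.2f} W dissipated")  # 1.70 W
print(f"{linear_reg_efficiency(5.0, 3.3):.0%} efficiency")          # 66%
```

Note that the larger the input-output voltage difference, the more power burns in the regulator, which is why the text recommends linear regulators only for small step-down ratios.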
Switching voltage regulators are commonly used for both step-up and step-down applications, and differ from linear regulators in that they regulate the output by pulse-width modulation (PWM). A switching regulator controls the output voltage with a current switch (internal or external to the regulator IC) operated at a constant frequency with a variable duty cycle; switching frequencies generally range from a few kHz to a few hundred kHz. The duty-cycle ratio determines how much and how quickly the output supply voltage increases or decreases, depending on the load state and the input source voltage. Some switching regulators vary both switching frequency and duty cycle, but these are not commonly used for FPGA/CPLD applications.
The clear advantage of switching regulators is efficiency: minimal power is dissipated in the power path (the FET switches) when the output supply voltage is sufficient for the load state. Essentially, the converter "shuts off" when power is not needed, by reducing the switch duty cycle to a minimum. The disadvantage of switching regulators is complexity, as several external passive components are required on the board. For high-current applications, external FET ICs are also required, with the converter IC acting only as control logic for the external FET switches. Output voltage ripple is another disadvantage; it is generally handled with bypass capacitance near the supply and at the load.
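The constant-frequency, variable-duty-cycle PWM scheme described above can be made concrete by converting a switching frequency and duty cycle into switch on- and off-times. The 300 kHz and 40% figures below are hypothetical example values:

```python
def switch_times(f_sw_hz: float, duty: float) -> tuple:
    """On-time and off-time of a constant-frequency PWM switch:
    t_on = D / f_sw, t_off = (1 - D) / f_sw."""
    period = 1.0 / f_sw_hz
    return duty * period, (1.0 - duty) * period

# Hypothetical example: 300 kHz switching frequency at 40% duty cycle
t_on, t_off = switch_times(300e3, 0.40)
print(f"t_on = {t_on * 1e6:.2f} us, t_off = {t_off * 1e6:.2f} us")
```

The controller regulates the output by adjusting `duty` each cycle while the period stays fixed, which is the constant-frequency PWM behavior the text describes.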
Buck, or step-down, voltage converters produce an average output voltage lower than the input source voltage. Figure 1 shows a basic buck topology using ideal components. The inductor serves as a current source to the output load impedance. When the FET switch is on, the inductor current increases, inducing a positive voltage drop across the inductor and an output supply voltage lower than the input source voltage. When the FET switch is off, the inductor current discharges, inducing a negative voltage drop across the inductor. Because one terminal of the inductor is then tied to ground (through the diode), the other terminal sits at a higher voltage, which is the target output supply voltage. The output capacitance acts as a low-pass filter, reducing the output voltage ripple caused by the fluctuating inductor current. The diode provides a current path for the inductor when the FET switch is off.
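For the ideal buck topology described above, operating in continuous conduction, the average output voltage is the duty cycle times the input voltage, and the inductor charge/discharge behavior sets the peak-to-peak ripple current. A minimal sketch of these standard relations (the 5 V input, 3.3 V output, 10 uH, and 300 kHz values are hypothetical examples):

```python
def buck_duty(v_in: float, v_out: float) -> float:
    """Ideal buck duty cycle in continuous conduction: D = Vout / Vin."""
    return v_out / v_in

def inductor_ripple(v_in: float, v_out: float, l_h: float, f_sw_hz: float) -> float:
    """Peak-to-peak inductor ripple current for an ideal buck:
    during t_on = D / f_sw the inductor sees (Vin - Vout), so
    dI = (Vin - Vout) * D / (L * f_sw)."""
    d = buck_duty(v_in, v_out)
    return (v_in - v_out) * d / (l_h * f_sw_hz)

# Hypothetical example: 3.3 V from 5 V, 10 uH inductor, 300 kHz switching
print(f"D = {buck_duty(5.0, 3.3):.2f}")                              # D = 0.66
print(f"ripple = {inductor_ripple(5.0, 3.3, 10e-6, 300e3):.3f} A")
```

The ripple figure is what the output capacitance must filter: a larger inductor or a higher switching frequency reduces it, trading off against the passive-component complexity noted earlier.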