Back to Basics: Analog Signals

Industrial sensors used for continuous position or process measurement commonly provide output signals in the form of either an analog voltage or an analog current. Both are relatively simple interfaces, but there are things to consider when choosing between the two.

Analog Current

Industrial sensors with current output are typically available with output ranges of 0 to 20 mA, which can be converted to 0-10 VDC by placing a 500 Ω resistor in parallel at the controller input, or 4 to 20 mA, which can be converted to 1-5 VDC with a 250 Ω resistor in the same arrangement. Although it requires a shielded cable, current output allows longer cable runs without signal loss and offers greater immunity to electrical noise. It is also easily converted to voltage using a simple resistor. Most, but not all, industrial controllers can accept current signals.
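The resistor conversion described above is just Ohm's law (V = I × R). A minimal sketch of the arithmetic, using a hypothetical helper name:

```python
def current_to_voltage(current_ma: float, shunt_ohms: float) -> float:
    """Voltage developed across a shunt resistor by a current-loop signal (mA)."""
    return (current_ma / 1000.0) * shunt_ohms

# 0-20 mA across a 500 ohm shunt spans 0-10 V
print(current_to_voltage(20.0, 500.0))  # 10.0

# 4-20 mA across a 250 ohm shunt spans 1-5 V
print(current_to_voltage(4.0, 250.0))   # 1.0
print(current_to_voltage(20.0, 250.0))  # 5.0
```

Note that the 4-20 mA range maps to 1-5 V rather than 0-5 V, which is why the 1 to 5 VDC voltage range also appears in practice.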

Analog Voltage

Industrial sensors with voltage output are typically available with output ranges of:

  • 0 to 10 VDC (most common)
  • -10 to +10 VDC
  • -5 to +5 VDC
  • 0 to 5 VDC
  • 1 to 5 VDC

One of the main advantages of voltage output is that it is simple to troubleshoot. The interface is very common and compatible with most industrial controllers. Additionally, voltage output is sometimes less expensive than current output. That said, voltage signals are more susceptible to interference from electrical noise than current signals, cable length must be limited to avoid signal loss, and voltage output requires a high-impedance input and shielded cable.
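Whichever voltage range the sensor provides, the controller scales the reading linearly to a position. A minimal sketch, assuming a hypothetical 200 mm measuring stroke and helper name:

```python
def voltage_to_position(v: float, v_min: float, v_max: float, span_mm: float) -> float:
    """Linearly scale a sensor voltage within [v_min, v_max] to a position in mm."""
    return (v - v_min) / (v_max - v_min) * span_mm

# 0-10 VDC sensor over a 200 mm stroke: 5 V reads mid-stroke
print(voltage_to_position(5.0, 0.0, 10.0, 200.0))  # 100.0

# 1-5 VDC sensor over the same stroke: 3 V also reads mid-stroke
print(voltage_to_position(3.0, 1.0, 5.0, 200.0))   # 100.0
```

The same formula works for any of the ranges listed above; only the endpoints change.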

To learn more about this topic visit our website at www.balluff.us.

Analog Signals: 0 to 10V vs. 4 to 20 mA

In the world of linear position sensors, analog reigns supreme. Sure there are all kinds of other sensor interface types available: digital start/stop, synchronous serial interface, various flavors of fieldbus, and so on. But linear position sensors with analog outputs still account for probably two-thirds of all linear position sensors sold.

When choosing an analog-output position sensor, your choice generally comes down to analog voltage (e.g., 0…10 V), or analog current (e.g., 4…20 mA). So which should you choose?

0…10V versus 4…20 mA

When it comes to sensor interface signals, 0…10V is like vanilla ice cream. It’s nothing fancy, but it gets the job done. It’s common, it’s straightforward, it’s easy to troubleshoot, and nearly every industrial controller on the planet will accept a 0…10V sensor signal. However, there are some downsides. All analog signals are susceptible to electrical interference, and a 0…10V signal is no exception. Devices such as motors, relays, and “noisy” power supplies can induce voltages onto signal lines that degrade the 0…10V sensor signal. A 0…10V signal is also susceptible to voltage drops caused by wire resistance, especially over long cable runs.

A 4…20 mA signal, on the other hand, offers increased immunity to both electrical interference and signal loss over long cable runs, and most newer industrial controllers will accept current signals. As an added bonus, a 4…20 mA signal provides inherent error-condition detection, since the signal, even at its lowest value, is still active. Even at the extreme low end, or “zero” position, the sensor still provides a 4 mA signal; if the value ever drops to 0 mA, something is wrong. The same cannot be said for a 0…10V sensor: zero volts could mean zero position, or it could mean the sensor has ceased to function.
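That "live zero" check is easy to put in controller logic. A minimal sketch, assuming a hypothetical 200 mm stroke and a 3.8 mA fault threshold (the exact threshold is an assumption; real controllers vary):

```python
FAULT_THRESHOLD_MA = 3.8  # assumption: readings well below 4 mA indicate a broken loop

def read_position_ma(current_ma: float, span_mm: float) -> float:
    """Scale a 4-20 mA signal to a position in mm, flagging a dead loop."""
    if current_ma < FAULT_THRESHOLD_MA:
        # 0 mA can never mean "zero position" on a 4-20 mA loop
        raise RuntimeError(f"Loop fault: {current_ma:.2f} mA (wire break or failed sensor?)")
    return (current_ma - 4.0) / 16.0 * span_mm

print(read_position_ma(4.0, 200.0))   # 0.0  -> "zero" position is still a live 4 mA
print(read_position_ma(20.0, 200.0))  # 200.0
# read_position_ma(0.0, 200.0) would raise, because 0 mA means something is wrong
```

With a 0…10V sensor there is no equivalent check: a 0 V reading is indistinguishable from a legitimate zero-position reading.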

In some cases, 4…20 mA sensors can be slightly more costly than 0…10V sensors, but the cost difference is shrinking as more sensor types incorporate current-output capability.

For more information on linear position sensors, click here.