
If a power meter is said to be able to measure a certain range of RF power, why does it need sensors? What exactly do sensors do for power meters?

2007-08-29 01:05:37 · 2 answers · asked by Bucfan 2 in Science & Mathematics Engineering

2 answers

An RF power meter generally uses the heating effect of the RF energy to determine the power. The actual measurement is made by a sensor, such as a thermistor, whose electrical properties change in the presence of that energy. The sensing element is often placed in a bridge circuit with two sensors (one exposed to the RF energy and an identical one shielded from it), so that the measured difference is independent of ambient conditions and sensor drift.
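The two-sensor compensation idea can be sketched as follows. This is a minimal illustration with made-up readings, not the behavior of any particular meter: both sensors pick up the same ambient-temperature offset, but only the exposed one sees the RF heating, so subtracting cancels the drift.

```python
def compensated_reading(exposed_mv, shielded_mv):
    """Subtract the shielded (reference) sensor's output from the exposed
    sensor's output, cancelling ambient drift common to both sensors."""
    return exposed_mv - shielded_mv

# Assumed numbers: ambient warming adds 0.8 mV to both sensors, while the
# RF heating contributes 5.0 mV only to the exposed one.
ambient_drift_mv = 0.8
rf_signal_mv = 5.0

exposed = rf_signal_mv + ambient_drift_mv   # sees RF + ambient
shielded = ambient_drift_mv                 # sees ambient only

print(compensated_reading(exposed, shielded))  # → 5.0
```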


An AC or DC power meter measures wattage and volt-amps. It uses a voltage sensor and a current sensor simultaneously to get the information needed to combine those two readings arithmetically. A very modern, very smart meter (like Modicon's PowerLogic) can also measure power factor, peak values, totals on all phases, and total power consumption. Each of those is derived from the two sensor readings and their relative timing.
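The arithmetic such a meter performs on its two sensor streams can be sketched like this. The waveforms and sample counts are illustrative assumptions; real meters do the same computation in hardware or firmware over many cycles:

```python
import math

def power_quantities(volts, amps):
    """Given simultaneous voltage and current samples over whole cycles,
    return (real power in W, apparent power in VA, power factor)."""
    n = len(volts)
    real = sum(v * i for v, i in zip(volts, amps)) / n   # mean of instantaneous p(t)
    vrms = math.sqrt(sum(v * v for v in volts) / n)
    irms = math.sqrt(sum(i * i for i in amps) / n)
    apparent = vrms * irms
    return real, apparent, real / apparent

# One cycle of a 170 V-peak sine wave, with the current lagging by 60 degrees.
N = 1000
volts = [170 * math.sin(2 * math.pi * k / N) for k in range(N)]
amps = [10 * math.sin(2 * math.pi * k / N - math.pi / 3) for k in range(N)]

w, va, pf = power_quantities(volts, amps)
print(round(pf, 2))  # → 0.5, i.e. cos(60°)
```

Real power differs from apparent power exactly because of that timing (phase) relationship between the two sensor readings, which is why the answer stresses "their timing."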

2007-08-29 03:33:05 · answer #1 · answered by Rich Z 7 · 1 0

It depends on frequency and input power. Expensive RF meters such as those from Boonton or HP use a calibrated thermistor in the sensor head, but use different sensor heads for different frequency ranges. Also, a more sensitive head gives a more usable metering range than attenuator pads in certain applications, such as tuning a low-power transmitter.

2007-08-29 01:15:02 · answer #2 · answered by Anonymous · 1 0
