Understanding Anisotropic Magnetoresistance from the Latest AMR-Based Current Sensors
Anisotropic magnetoresistance (AMR) is a phenomenon in which a material's electrical resistance rises when current flows parallel to the direction of magnetization and falls when current flows at a 90-degree angle to it. New components such as wide-bandgap semiconductors are now being integrated into power systems because of the high speeds at which they can switch large currents.
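The angular dependence described above is commonly modeled as R(θ) = R⊥ + ΔR·cos²θ, where θ is the angle between the current and the magnetization. A minimal sketch of that relationship (the resistance values below are illustrative, not from any datasheet):

```python
import math

def amr_resistance(theta_deg, r_perp=100.0, delta_r=3.0):
    """Resistance of an AMR element vs. the angle between current and magnetization.

    r_perp:  resistance (ohms) when magnetization is perpendicular to the current
    delta_r: maximum resistance change (ohms), typically a few percent of r_perp
    Both values here are hypothetical examples.
    """
    theta = math.radians(theta_deg)
    return r_perp + delta_r * math.cos(theta) ** 2

# Resistance peaks when current and magnetization are parallel (0 degrees)
# and reaches its minimum when they are perpendicular (90 degrees).
```

This cos² behavior is what the sensor's bridge circuitry ultimately converts into a current reading.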
Inside AMR Current Sensors
A magnetic field is generated in proportion to the current passing through a wire, as described by the Biot-Savart law. Sensing that current involves three steps. First, the wire carrying the current is routed into or below the IC, generating a magnetic field. Next, the magnetic field is converted to a voltage by an analog circuit. Finally, that voltage is converted into a usable format.
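For a long straight conductor, the Biot-Savart law reduces to B = μ₀I/(2πr), which shows the proportionality between current and field that the first step relies on. A quick sketch:

```python
import math

MU_0 = 4 * math.pi * 1e-7  # vacuum permeability, in T*m/A

def field_from_wire(current_a, distance_m):
    """Magnetic flux density (tesla) at distance_m from a long straight wire
    carrying current_a amps, using the infinite-wire form of Biot-Savart."""
    return MU_0 * current_a / (2 * math.pi * distance_m)

# Example: 10 A sensed 1 mm from the conductor gives a 2 mT field.
print(field_from_wire(10, 1e-3))  # 0.002
```

The field doubles when the current doubles, so the AMR element's response can be mapped directly back to amps.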
Because the current is measured through its magnetic field, the sensing chip is electrically isolated from the circuit it monitors, behaving much like a transformer. This type of technology is used to deliver IoT information and is gaining favor with various manufacturers.
Two versions of these AMR components are now available, differentiated by supply voltage (3.3 V and 5.0 V). Their AC or DC current-carrying capacities extend from 5 to 50 A. The main current path has a resistance of 0.9 mΩ, the parts provide 4.8 kV of isolation, and they draw 4.5 mA of supply current. The operating temperature range is -40 to 105 degrees C.
The devices are calibrated at the factory to ensure a low offset error while providing an accurate analog output voltage proportional to the current. The sensors also reject stray magnetic fields, and their low output noise makes additional filtering unnecessary.
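A factory-calibrated analog output of this kind is typically a linear transfer function: a zero-current reference voltage plus a fixed sensitivity in volts per amp, clipped at the supply rails. The reference voltage and sensitivity below are hypothetical placeholders, not values from a specific datasheet:

```python
def sensor_output(current_a, vref=1.65, sensitivity=0.033, vmin=0.0, vmax=3.3):
    """Map a measured current to an analog output voltage.

    Assumes a 3.3 V part whose zero-current output sits at mid-supply (vref)
    and a hypothetical sensitivity of 33 mV/A; both are illustrative values.
    """
    v = vref + sensitivity * current_a
    return max(vmin, min(vmax, v))  # the output clips at the supply rails

# At 0 A the output sits at mid-supply; positive and negative currents
# swing it symmetrically above and below that point.
```

A downstream ADC reading this voltage would invert the same function, (V - vref) / sensitivity, to recover amps.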
Power systems can gain greater efficiency by using small, fast-switching semiconductors based on GaN and silicon carbide. AMR devices are helpful for measuring the switched current in real time and delivering this information to the rest of the system.