How many milliohms would it take to trip a 100 milliamp circuit?

1 answer

Answer

1262371

2026-04-19 08:30


Using Ohm's Law (V = I × R), you can find the resistance at which a circuit carries 100 mA. Assuming a typical supply voltage of 120 V, rearrange to R = V / I = 120 V / 0.1 A = 1200 ohms, which is 1,200,000 milliohms. Note the direction: current rises as resistance falls, so any resistance *below* 1200 ohms would push the current above 100 mA and trip a device rated at that level. Whether a particular breaker actually trips also depends on its design, including its trip curve and how long the overcurrent persists.
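As a quick sketch of the arithmetic above (the 120 V supply is an assumption, since the question doesn't state a voltage):

```python
def threshold_resistance_ohms(voltage_v: float, trip_current_a: float) -> float:
    """Resistance at which the circuit draws exactly the trip current (Ohm's law: R = V / I)."""
    return voltage_v / trip_current_a

# Assumed values: 120 V supply, 100 mA (0.1 A) trip current.
r_ohms = threshold_resistance_ohms(120.0, 0.1)
print(r_ohms)         # 1200.0 ohms
print(r_ohms * 1000)  # 1200000.0 milliohms
```

Any resistance below this threshold draws more than 100 mA; real breakers add time-dependent behavior on top of this simple DC calculation.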


Copyright © 2026 eLLeNow.com All Rights Reserved.