
I want to operate a 1-inch length of 26-gauge nichrome 80 wire at 200-600 degrees Fahrenheit. This type of wire has 2.657 Ohms of resistance per foot, so a 1″ length works out to about 0.22 Ohms. According to my amperage chart, I will need roughly 1 to 2 amps to stay in my desired temperature range.
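Here is the quick back-of-envelope arithmetic behind that resistance figure (the 2.657 Ohms/foot value is straight from my wire chart; the rest is just unit conversion):

```python
# Resistance of a 1-inch piece of 26 AWG nichrome 80 wire,
# working from the chart value of 2.657 ohms per foot.
ohms_per_foot = 2.657          # from the nichrome 80 wire chart (26 AWG)
length_inches = 1.0

ohms = ohms_per_foot * (length_inches / 12.0)
print(f"{ohms:.3f} ohms")      # ~0.221 ohms for a 1-inch length
```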

The problem is that, according to my calculations, I would have to run at about 0.44 Volts for 2 amps and about 0.22 Volts for 1 amp. That seems like a very small voltage to me.
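Those numbers are just Ohm's law with the 0.22 Ohm figure above; here is the rough sketch I used, with the power printed as well since that is what actually heats the wire (this assumes the cold resistance and ignores any change in resistance with temperature):

```python
# Voltage (and power) needed to push 1 A and 2 A through ~0.22 ohms,
# assuming plain Ohm's law with the cold-resistance value.
resistance = 0.22               # ohms, 1 inch of 26 AWG nichrome 80

for current in (1.0, 2.0):      # amps, from my temperature/amperage chart
    voltage = current * resistance          # V = I * R
    power = current ** 2 * resistance       # P = I^2 * R
    print(f"{current:.0f} A -> {voltage:.2f} V, {power:.2f} W")
# 1 A -> 0.22 V, 0.22 W
# 2 A -> 0.44 V, 0.88 W
```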

For example, my power supply, a standard benchtop unit, goes from 0-3 amps and 0-50 volts. The voltage meter has graduations every 2 volts; in other words, between 0 and 10 there are 5 tick marks. So to run at 0.22 Volts, for example, I would have to set the needle at about a tenth of a tick mark, a tiny amount on the gauge. It seems like if I nudged the voltage knob just a tad too far I could blow out the wire.
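To put rough numbers on that worry (still assuming the ~0.22 Ohm cold resistance and ideal Ohm's law; the half-tick overshoot is just an example I picked):

```python
# Rough look at how sensitive the bare wire is to a small knob overshoot,
# still assuming ~0.22 ohms and ideal Ohm's law (cold resistance).
resistance = 0.22               # ohms
volts_per_tick = 2.0            # my supply's meter is marked every 2 V

for volts in (0.22, 0.44, 1.0): # my two setpoints, plus a half-tick overshoot
    amps = volts / resistance               # I = V / R
    print(f"{volts:.2f} V ({volts / volts_per_tick:.3f} ticks) -> {amps:.1f} A")
# 0.22 V (0.110 ticks) -> 1.0 A
# 0.44 V (0.220 ticks) -> 2.0 A
# 1.00 V (0.500 ticks) -> 4.5 A   # half a tick already well past my 2 A target
```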

Am I doing something wrong here? Do I need some kind of special, ultra-low-voltage power supply, or are my calculations off in some way?

What if I put a resistor in series with the wire? That would raise the required voltage, but I would still need to operate at a very exact voltage, right? For example, with a 5-Ohm resistor added, running at 1 amp would need about 5.2 Volts (5 Ohms plus the wire's 0.22 Ohms), a more normal voltage, but am I still risking a burnout if I nudge the knob slightly too far?
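Here is the same kind of sketch for that series-resistor case, using the hypothetical 5-Ohm value from above and a made-up 0.25 V knob overshoot (again just ideal Ohm's law):

```python
# What a 5-ohm series resistor does to the numbers (ideal Ohm's law sketch;
# the 5-ohm value is just the example from my question).
r_wire = 0.22                   # ohms, 1 inch of 26 AWG nichrome 80
r_series = 5.0                  # ohms, hypothetical series resistor
r_total = r_wire + r_series     # 5.22 ohms

target_v = 1.0 * r_total        # voltage needed for 1 A through the pair
print(f"setpoint: {target_v:.2f} V for 1.0 A")          # ~5.22 V

nudge = 0.25                    # volts of accidental overshoot on the knob
print(f"overshoot to {target_v + nudge:.2f} V -> "
      f"{(target_v + nudge) / r_total:.2f} A")          # ~1.05 A
# versus the bare wire, where the same 0.25 V nudge roughly doubles the current
```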