
I'm using a small CRT from a TV. I was just wondering how hot the cathode has to get before it starts to emit an electron cloud. Any info on the topic would be very helpful.

2007-03-16 15:48:50 · 4 answers · asked by ExpErImEntEr 1 in Science & Mathematics Physics

4 answers

Those of us who can still remember vacuum tubes recall that the filaments had to glow a dim orange. That worked, but separate cathodes heated by the filament were introduced later for improved emission, and they glowed too. I would suggest about 1,000 degrees, roughly what it takes to cremate someone.


2007-03-16 16:02:49 · answer #1 · answered by Anonymous · 0 0

In a sense, rocky is correct. The equation for thermionic emission discovered by Richardson and Dushman is
J = A·T^2·e^(−W/(kT))
where J is the emitted current density in A/m^2,
A = 1.20173 × 10^6 A/(m^2·K^2) (Richardson's constant),
W is the thermionic work function,
k is Boltzmann's constant, and
T is the absolute temperature in K.
W, the thermionic work function, is determined empirically for different materials and can be found tabulated in a handbook of physics.

Since the emitter loses its structural strength at temperatures close to its melting point, the cathode heater must be designed to produce the highest temperature possible below the point at which the material becomes plastic or melts. This gives the maximum possible current density from the emitter, and is a temperature generally in the "red-hot" range.
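To make that temperature sensitivity concrete, here is a minimal Python sketch of the Richardson-Dushman formula above. The work function of roughly 4.5 eV is an assumed, illustrative value for bare tungsten (oxide-coated CRT cathodes have a much lower W and run cooler), so treat the numbers as order-of-magnitude only:

# Rough evaluation of J = A T^2 exp(-W/(kT)) using the constants quoted above.
# The 4.5 eV work function is an assumed handbook-style value for bare tungsten.
import math

A = 1.20173e6       # Richardson's constant, A/(m^2 K^2)
k = 8.617333e-5     # Boltzmann's constant, eV/K
W = 4.5             # assumed thermionic work function, eV

def current_density(T_kelvin):
    """Emitted current density in A/m^2 from the Richardson-Dushman equation."""
    return A * T_kelvin**2 * math.exp(-W / (k * T_kelvin))

# Compare "merely red hot" with a temperature near tungsten's practical limit.
for T in (1000, 2600):
    print(f"T = {T} K  ->  J ~ {current_density(T):.2e} A/m^2")

With those assumed numbers, emission at 1000 K is essentially negligible (around 10^-11 A/m^2), while at 2600 K it is on the order of 10^4 A/m^2, which is exactly why the emitter is run as close to its temperature limit as practical.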

2007-03-16 16:51:54 · answer #2 · answered by Helmut 7 · 1 0

It needs to get red hot, which is about 700 °C.

2007-03-16 15:58:33 · answer #3 · answered by Anonymous · 0 0

Anything above absolute zero should do.

2007-03-16 16:15:38 · answer #4 · answered by jimmymae2000 7 · 0 1
