One of my goals in building this amp was to have more than just brute-force power; it had to be clean. That's why I've spent a lot of time trying to find a way to test linearity that is simple, accurate, and repeatable. CW is not much of a problem: tune for the desired output with a given drive, and make sure that harmonics, efficiency, and grid current are acceptable. SSB linearity testing is a bit more troublesome, though. All of my SSB testing to date has used a continuous two-tone signal, either from a Yaesu Mark V FT-1000MP SSB exciter with a two-frequency audio input, or from a high-level two-tone RF generator. I interpret the results in the conventional way: by viewing the IMD on a spectrum analyzer and measuring how far below PEP output it is. For routine testing I prefer the Yaesu, because the setup is simple, drive power is easily controlled, and it is reasonably clean, especially in the Mark V's class A mode. For more accurate tests you need a drive source that is much cleaner (at least 10 dB less IMD) than the amp under test, and for that I use the high-level two-tone generator shown in the diagram below:
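To make the "dB below PEP" convention concrete, here is a minimal numerical sketch of the measurement. The cubic soft-compression model and the tone frequencies are assumptions for illustration, not measurements from my amp:

```python
import numpy as np

# Sketch of a two-tone IMD measurement, with a simple cubic compression
# model standing in for the amplifier under test (assumed, not measured).
fs = 100_000                        # sample rate, Hz
t = np.arange(fs) / fs              # 1 s of samples -> 1 Hz FFT bins
f1, f2 = 10_000, 11_000             # two CW tones spaced 1 kHz

x = 0.5 * (np.cos(2*np.pi*f1*t) + np.cos(2*np.pi*f2*t))
y = x - 0.05 * x**3                 # mild third-order nonlinearity

spec = np.abs(np.fft.rfft(y)) / len(y)
tone = spec[f1]                               # one tone of the pair
imd3 = max(spec[2*f1 - f2], spec[2*f2 - f1])  # products at 9 and 12 kHz
pep = 2 * tone                                # envelope peak = sum of both tones

print(f"IMD3: {20*np.log10(imd3/pep):.1f} dB below PEP")
```

Note that referencing IMD to PEP rather than to one tone shifts the number by 6 dB; the PEP reference is the convention I use throughout.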
This system uses two CW signal generators, two separate amplifier chains, and a combiner to sum the RF tones together. When possible, I also use a pad at the output of the combiner to improve the amp input termination and increase isolation between the amplifier chains. The problem with continuous two-tone testing with either of these methods is that the peak-to-average ratio of the test signal (2:1) is much lower than that of typical voice (maybe 5 or 8 to 1), so the average power of the test is far higher than voice would produce. The result is a test that is easily interpreted in terms of quantifying the level of distortion generated, but is far too severe to be representative of normal voice operation. The load on the power supply drags the anode voltage down enough that the adjustments obtained won't be optimal for voice.
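The 2:1 figure is easy to verify numerically from the complex envelope of two equal tones (a quick sketch, not tied to any particular hardware):

```python
import numpy as np

# PEP/average ratio of a continuous two-tone signal, computed from its
# complex envelope: peak envelope power over mean power.
fs = 100_000
t = np.arange(fs) / fs
f1, f2 = 1_000, 2_000
env = np.abs(np.exp(2j*np.pi*f1*t) + np.exp(2j*np.pi*f2*t))  # envelope, 0..2

pep_to_avg = env.max()**2 / np.mean(env**2)
print(f"two-tone PEP/average: {pep_to_avg:.2f} : 1")   # -> 2.00 : 1
```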
I'm currently working on a generator that will produce a lower (adjustable) duty cycle signal but still allow me to measure IMD on a spectrum analyzer and get the familiar "dB below PEP" type results. A signal like a two-tone burst properly gated at the zero power crossing would probably be a good place to start.
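To sketch the idea (a toy model with assumed numbers, 1 kHz spacing and a 1-in-4 duty cycle, not a finished design): the envelope of a two-tone signal passes through zero once per beat cycle, so gating at those nulls switches whole beat cycles on and off without key-click splatter, and the duty cycle sets the average power:

```python
import numpy as np

# Gated two-tone sketch: switch at the envelope nulls, which occur every
# 1/(f2-f1) seconds offset by half a beat, keeping 1 beat cycle in 4.
fs = 200_000
t = np.arange(fs) / fs
f1, f2 = 10_000, 11_000                      # 1 kHz spacing -> 1 ms beat
beat = 1 / (f2 - f1)

gate = (np.floor(t/beat + 0.5) % 4 == 0)     # gate edges land on envelope nulls
env = np.abs(2*np.cos(np.pi*(f2 - f1)*t)) * gate   # gated envelope

pep_to_avg = env.max()**2 / np.mean(env**2)
print(f"gated two-tone PEP/average: {pep_to_avg:.1f} : 1")   # ~8 : 1, voice-like
```

With 1-in-4 gating the PEP/average ratio lands right in the 5-to-8 range quoted above for voice, while the on-segments are still an ordinary two-tone signal that a spectrum analyzer can resolve.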
Another tool I use for distortion testing in the time domain is the system pictured below:
This circuit allows me to immediately see the effects of adjustments without having to wait for the 5-second spectrum analyzer sweep (100 Hz RBW, 100 Hz VBW) required when doing a close-spaced (1 kHz) two-tone test. You may think I'm impatient, but 5 seconds is a long time when you're trying to make interactive adjustments with a hot, sweaty, noisy amp blowing several kW of heat on you! In theory, the signals at the input and output of the amp should be identical except for the gain scaling. This circuit takes samples of the RF input and output of the amp under test, adjusts them for the amp gain, detects them with log amps, and displays the difference on a scope. The fixed and variable attenuators, along with the coupling factor of the directional couplers, are adjusted to set an equal and appropriate level at the log amp inputs. Any difference in the log amp outputs represents a nonlinearity in the amp under test. I used log amps because they are well matched and I had them handy, but I wouldn't recommend them if you want to build something like this: they have far more dynamic range than you need and tend to overemphasize the relatively unimportant low-level signals. A pair of matched envelope detectors would be fine. With this setup it's easy to see when distortion is present and identify what part of the RF envelope it's associated with, giving you a clue as to how to correct it (increase bias, increase loading, etc.). Once the distortion is minimized, it can be quantified by measuring the IMD on a spectrum analyzer. Below are scope traces showing typical waveforms with a two-tone test signal. Displayed are the log difference output, the YC156 amp input, and the YC156 amp output (scaled so they don't overlap). The dips in the error waveform are partly due to the high dynamic range of the log amps; 80 dB of range isn't needed in this application, and it only makes the system more susceptible to noise and the plots harder to interpret.
I won't go into excruciating detail explaining the photos (is anybody really interested?); the main thing you want to see is that the output of the amp under test is tracking the input (a log amp difference of zero = a flat trace on the scope). For reference, the log amp scale factor is ~35 mV/dB, and the scope is set to 20 mV/div.
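For anyone curious how those numbers combine, here is a toy simulation of the log-difference display. The 3% cubic compression model and the gain figure are assumptions for illustration; only the 35 mV/dB and 20 mV/div values come from my setup:

```python
import numpy as np

# Toy model of the log-difference display: input and output envelopes are
# converted to dB, the gain is scaled out, and the residual is what the
# scope shows. A linear amp gives a flat zero trace; compression shows up
# as a dip at the envelope crests.
t = np.linspace(0, 1e-3, 2000)                # one beat cycle, 1 kHz tone spacing
vin = np.abs(np.cos(np.pi*1e3*t))             # two-tone input envelope, 0..1
gain = 20.0                                   # assumed amp gain
vout = gain * vin * (1 - 0.03*vin**2)         # assumed 3% cubic compression

mv_per_db = 35.0                              # log-amp scale factor
eps = 1e-6                                    # keep log() away from the nulls
err_mv = mv_per_db * 20*np.log10((vout/gain + eps) / (vin + eps))

# On a 20 mV/div scope setting, one division is 20/35 ~= 0.57 dB of error.
print(f"error at the crest: {err_mv.min():.1f} mV "
      f"({err_mv.min()/mv_per_db:.2f} dB)")
```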
Here are some more example waveforms and their spectral equivalents:
Here is a table summarizing some of my test results:
Note the large drop in anode voltage from the nominal no-load value of 7000 V (in part due to the wimpy wire in my AC mains... I'm working to correct this). For details on the AC input characteristics (current, power factor, etc.), see the AC data. With voice modulation, the cap bank is able to hold the voltage much higher and fairly constant. These are early results, and the operating conditions will need to be optimized further. In particular, bias should probably be reduced and lighter loading used, since IMD performance looks like it's better than it needs to be. Hopefully the data presented here will be useful and encouraging to those of you who are considering using this tube. I expect to have test data reflecting actual voice performance available soon.
Reduced bias from 280 mA to 200 mA. The result is more nonlinearity in the low-power part of the waveform. The error trace now fills the screen, but IMD is still OK at -45 dBc.