
How to perform the 2sc2999 and Schottky diode swap

I guess I was one of the few to watch it due to this thread. It was confirmation of something I already knew and discovered a few years back when I was working on broad-banding my Cobra 2000 (the boards being almost identical to the 148GTL's).

I just went to the band edges with my SG and took notes and came up with the same results. I didn't like the peak at centre and flattened things out. I could see the same graph in my "mind's eye".


Yes, I think you tuned it the best way :)

I think Uniden 'tried' to include something about how to do this 'flatter tuning' in the service manuals for some Uniden export CBs.

However, I believe the instructions got corrupted in the Uniden document either due to typing errors or translation etc.

For example, on some of the 120ch Uniden boarded export radios that were very popular here in the UK the alignment instructions sometimes tell the operator to screw one transformer core to the bottom whilst peaking others. Then peak up the 'bottomed' transformer core.

This type of instruction is given for both Tx and Rx circuits. I think they do this to try and prevent over peaking of the Tx and Rx strips and to try and achieve a flatter tuning result.

However, I believe they typed up the wrong core numbers in the service instructions in some cases for the receiver alignment and this messed up the whole point of the exercise LOL.

I saw lots of badly tuned-up mk2 Cobra 148GTL-DX radios where the radio got peaked up too much, giving very low gain on the outer channels. This became even more noticeable when the radio was expanded to 160-200 channels, as the 148 then gained a reputation for being very deaf on the outer blocks of 40 channels.

The official Uniden alignment instructions for this radio have several typing errors some more obvious than others :)

I'd be interested in seeing you do the entire receive alignment procedure. After all, L8 & L9 are also VERY critical adjustments too. The theory behind it would also be very helpful. I was going to start a thread on the subject of radio receive alignments; but I'd rather that you did it. Could prove to be very useful and informative . . .

Sadly I don't have the time to do this, plus I'm not a very confident presenter/speaker; I don't really enjoy making presentations and I don't even like watching my own videos... :(
However, at some point I may do a few videos about general RF theory to help people. I think a lot of the RF tutorial videos on YouTube tend to dive into maths and charts and lots of 'terms', and this isn't the way to cement this stuff into the minds of the viewer.
 
Do it at your own pace. Make installments to that thread that you start. It can become a journal that can be viewed and shared. Viddys and diagrams could be used. More than that, it would move from circuit theory to practical involvement, which has seldom been done on any forum. This kind and caliber of info empowers the readers.

Just a thought game - if nothing else . . .
 
Well it's interesting that people still are interested in doing the 2SC2999 swap... :)

Why not measure S/N ratio at the AF output of the radio using a calibrated signal generator at the aerial port and a true-RMS meter at the audio jack? That's going to have less measurement uncertainty by a long way :)

...

Wouldn't try to measure audio S/N with an RMS AC voltmeter. The LF 8556 section of the HP will do that though, with a 1 kHz test signal modulated on the carrier from the signal generator. That's a good test to add, along with the carrier-to-noise ratio tests. The diode switching noise in AM demodulation will make the net audio S/N worse than FM, but it's a good test to add to the plan.

I plan to do that test before and after the diode swap, to get some numbers on the reduction of the audio noise floor.

I have already modeled the C/N improvement I predict to see with the RF amp swap (2SC2999E). If I raise the sky temperature high enough (4000K or so), then the realized C/N improvement will be in the less-than-1dB category. That's what the test is about, to put real numbers behind the mod, in a real environment. The stated sensitivity of the unit is 0.5uV for 10 dB audio S/N. There are 10 and 12 meter SSB ham rigs that have twice and three times that sensitivity, and CB radios half as sensitive.

As far as the actual system noise temp (including the antenna-ground-sky), I plan to measure that noise power (relative change) with my current 2SC1730. That is, connected to a 50 ohm load, and then connected to the vertical 1/2 wave dipole, on a couple of different frequencies.

I am having trouble locating a real 2SC2999E currently, but will find one soon.

As far as results, they are relative, since the equipment and methods won't change, just a few components. So calibration and ageing are second-order error factors. I'm interested in first differences here.
 
Very interesting information here, I would tend to agree. Although my testing methods are not as in depth as yours, I have found on a simple hFE test that the numbers of the 2SC2999 are at or below the results of the part being replaced.

I have a few of these left here, although I am not sure of the "e" denotation; it has been a while since I have seen them. They were manufactured by Sanyo.
 
Using the stated sensitivity of the radio of .5uV for 10 dB S/N-

Equates to ~ -112dBm carrier power at the input to the radio.
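For reference, converting the 0.5uV sensitivity spec into power can be sketched as below. The 50 ohm input impedance is my assumption (typical, but not stated in the post); the result lands near the ~-112dBm figure quoted:

```python
import math

def microvolts_to_dbm(uv_rms, r_ohms=50.0):
    """Convert an RMS voltage in microvolts across r_ohms into power in dBm."""
    watts = (uv_rms * 1e-6) ** 2 / r_ohms
    return 10 * math.log10(watts / 1e-3)

print(round(microvolts_to_dbm(0.5), 1))  # -113.0, in the ballpark of ~-112 dBm
```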

With a 1 dB gain antenna and 1 dB cabling loss to the radio, the flux at the antenna aperture is -152 dBW/m2 at 27 MHz, with a working BW of 3 kHz.

System Noise temp calculated with two different antenna noise temperatures-

2000K and 6000K

LNA NF varied from 1 to 8 dB.

Resultant C/N dB values at 7.8 MHz I/F-



LNA NF (dB)   C/N (dB), Tant = 2000K   C/N (dB), Tant = 6000K
     1                 15.78                   11.45
     2                 15.54                   11.36
     3                 15.25                   11.25
     4                 14.91                   11.11
     5                 14.51                   10.94
     6                 14.06                   10.74
     7                 13.56                   10.50
     8                 13.00                   10.46


If, on the other hand, the antenna temp was 290 K (reference temperature) and there was no ohmic line loss to the radio, the C/N would vary in direct 1-to-1 proportion with the NF of the first stage (~).

I didn't expect the latter, but do expect to see a small but measurable (relative) improvement, using the HP 141T with the 8553B RF plug-in and 8552 IF plug-in.
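A rough sketch of the kind of model behind the table above. The carrier power and bandwidth defaults here are my assumptions (the full link budget isn't shown), so the absolute C/N values won't match the table exactly, but the trend (LNA NF matters less as antenna temperature rises) does come out:

```python
import math

K_BOLTZ = 1.380649e-23  # Boltzmann constant, J/K
T0 = 290.0              # reference temperature, K

def cn_db(nf_db, t_ant_k, carrier_dbm=-112.0, bw_hz=3000.0):
    """C/N in dB for a given receiver NF and antenna noise temperature.
    carrier_dbm and bw_hz are assumed here, not taken from the post."""
    t_rx = T0 * (10 ** (nf_db / 10.0) - 1.0)  # receiver noise temperature
    t_sys = t_ant_k + t_rx                    # total system noise temperature
    noise_dbm = 10 * math.log10(K_BOLTZ * t_sys * bw_hz / 1e-3)
    return carrier_dbm - noise_dbm

for nf in (1, 4, 8):
    print(nf, round(cn_db(nf, 2000), 2), round(cn_db(nf, 6000), 2))
```

The spread between NF = 1 dB and NF = 8 dB shrinks as the antenna temperature goes up, which is the point being made about sky noise masking the benefit of the transistor swap.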
 
Wouldn't try to measure audio S/N with an RMS AC voltmeter. The LF 8556 section of the HP will do that though, with a 1 KHz test signal modulated on the carrier from the signal generator.

To be fair the true rms meter suggestion was against your previous (flawed) test method where you wanted to couple the analyser direct to the collector of the 2SC1730.

This flawed method will load the 2SC1730 circuit in an unnatural way. Also your 8553 on its own won't be sensitive enough to make it easy to measure the absolute change in noise level here anyway. I haven't used a 141/8553 for a great many years but I didn't think its sensitivity was much different to many other HP analysers.

Even allowing for the gain and NF of the (unnaturally loaded) 2SC1730 amplifier the 8553 will still be too deaf to show the real difference if you were to reduce the CB amplifier NF by 2dB. You would get better results if you inserted a preamp at the analyser and took into account its gain and noise figure as this would improve the deafness of your analyser.

But even if the 8553 'was' sensitive enough (it isn't) then you STILL don't know the real benefit of the mod because you aren't measuring the noise figure of the whole CB receiver path. This is because the system NF will not be dictated by the first RF amplifier alone. You need to include the noise contribution from the later stages as well or you will get a result that makes the mod look better than it really is.

You can still use the 8553 to make a reasonably valid measurement of the NF of the CB but you need to use it in a better (more scientifically valid) way :)

By the way I'm not familiar with the CB model you are modding but is it the same inside as a President Adams? I haven't used an Adams either but I do have a schematic here for one.

I'd expect the radio to have a noise figure that is better than 10dB but it's hard to tell from just looking at a schematic. The schematic I have shows the 2SC1730 as a common emitter amplifier and the first mixer is a dual gate mosfet.
 
See post #145 of this thread. It measures. I also use this rig (has both a 50 ohm and a 600 ohm input impedance) to align FM tuner RF/ IF sections, constantly.

Our mainstay analyzer (Hughes Space and Communications) in the 80s was the HP 8566 (2-22 GHz model). It was $60K then.

This radio is a TRC 458.

For NF of stages in cascade:

http://www.minicircuits.com/applications/mcl_nf_calc.html

The other thing that can desensitize this receiver is oscillator phase noise, which would show up downstream of TR5 in the mixer. Other than that, the NF of TR5 dominates the NF of the receive chain.
 
OK I looked up the 8553 manual and it suggests your analyser has a 24dB noise figure. So this will make it very difficult to see a small improvement in NF of the first amplifier stage in your CB radio unless you add a preamp in front of the 8553.

Why not punch some numbers into a gain/noise spreadsheet to see what I mean?

i.e. model the analyser as a high gain stage (with 24dB noise figure) and place it after the RF amplifier.

You will see it will not be able to display a 2dB reduction in NF of the stage before it because it is so deaf. i.e. the noise floor won't drop 2dB if you drop the NF of the RF amplifier by 2dB.
However, if you place a preamp in front of the analyser then you will find it will be MUCH easier to measure (and quantify) any NF improvement in the CB amplifier stage.

Our mainstay analyzer (Hughes Space and Communications) in the 80s was the HP 8566 (2-22 gHz model). It was 60K then.
You would have the same problem (possibly slightly worse) if you used the HP8566B as it has similar sensitivity to the 8553?
 
Please read through:

http://www.teknetelectronics.com/DataSheet/HP_AGILENT/HP_8443a.pdf

With a 10 Hz resolution BW (slow scan in storage mode), it has enough sensitivity to measure signals above a -140 dBm noise floor. That's way below the quiescent floor of the radio.

I'll post the data when I get it.

Also, TP 13 is not the same as the TR5 collector, which is on the other side of the transformer, I think you can see.

This small doc has a good review of resolution BW and its effect on the noise floor. Have to use the storage mode and slow the scan way down.

http://www.gwinstek.com/en/knowledge/kb/981028 An Introduction to Spectrum Analyzer.pdf

I think the two most quantifiable (useful) data I can obtain easily about the performance improvements from the transistor-diode swap are to

1. Measure the C/N of the last stage of the IF chain just before detection, prior to and after the 2SC2999E swap. A ratio of these two C/Ns is equivalent to the delta NF at this point in the receiver, since Noise Factor (or noise figure if in dB) is equal to [S/N (in) / S/N (out)]. The measurements are relative, and although they won't indicate the actual NF of the receiver, they will indicate the improvement, if there is any. I'll use the RF gain control to keep the carrier peaks at the same level for pre-swap and post-swap measurements, since the changes will be in the noise floor levels. RF gain control changes (if required) will also be measured.

2. Measure the S/N at the speaker terminals while driving the RF AM signal generator with a 1 kHz test tone at mid-level S units, before the transistor swap, after the transistor swap, and then after the diode swap. Again, the noise floor below the 1 kHz signal will be measured, so that an audio improvement can be quantified. Peak level of the signal with the analyzer will be normalized between the two tests with the volume knob.

Also plan to measure how much additional noise is present in the I/F chain connected to the outside antenna versus terminated, for various RF gain levels. This gives a clue as to how much noise there is at 27 MHz in the environment.
 
With a 10 Hz resolution BW (slow scan in storage mode), it has enough sensitivity to measure signals above a -140 dBm noise floor. That's way below the quiescent floor of the radio.

Assuming your 8553 is unmodified, then we can agree on the fact below between us:

The -140dBm sensitivity in a 10Hz BW equates to a receiver (analyser) noise figure of 24dB. i.e. it has a very deaf receiver compared to a CB radio.

If we can agree your analyser has a NF of 24dB then if you place your CB RF amplifier ahead of it with maybe 18 to 20dB gain and maybe a 7dB noise figure then the 'system' noise figure will not be 7dB any more because the deafness of the analyser will degrade the system noise figure a couple of dB.

This will make it difficult for you to quantify subtle changes in the 7dB noise figure of the amplifier. You can overcome this issue by fitting a low noise preamp in front of the analyser.
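The cascade effect described above can be sketched with the Friis formula. The 18dB gain / 7dB NF amplifier and the 20dB / 1dB preamp below are the assumed figures from this discussion, not measured values:

```python
import math

def undb(x):
    return 10 ** (x / 10.0)

def cascade_nf_db(stages):
    """Friis cascade: stages = [(gain_db, nf_db), ...] in signal order."""
    g_db0, nf_db0 = stages[0]
    f_total = undb(nf_db0)        # noise factor of the first stage
    g_cum = undb(g_db0)           # cumulative gain ahead of later stages
    for g_db, nf_db in stages[1:]:
        f_total += (undb(nf_db) - 1.0) / g_cum
        g_cum *= undb(g_db)
    return 10 * math.log10(f_total)

# CB RF amp (18 dB gain, 7 dB NF) feeding the bare analyser (24 dB NF):
bare = cascade_nf_db([(18.0, 7.0), (0.0, 24.0)])
print(round(bare, 2))  # about 9.53 dB: the analyser degrades the 7 dB NF by ~2.5 dB

# Same amp, but with a hypothetical low-noise preamp (20 dB gain, 1 dB NF)
# inserted in front of the analyser:
with_preamp = cascade_nf_db([(18.0, 7.0), (20.0, 1.0), (0.0, 24.0)])
print(round(with_preamp, 2))  # about 7.04 dB: close to the amp's own NF
```

The preamp's gain swamps the analyser's own noise contribution, which is exactly why it makes the measurement easier.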

I'm just trying to make sense of your test setup and help you make a better measurement :)
 

I see you just added the above text as I posted my last reply...

The above methods are a LOT better and I think you will get much more meaningful results :)

The reason I think your analyser has a 24dB noise figure is because:

kTB noise is -174dBm/Hz
10Hz is 10dBHz
noise floor quoted as -140dBm in a 10Hz BW

So the NF of your analyser is -140 -10 - (-174) = 24dB
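The arithmetic above, as a one-line sketch:

```python
import math

def analyser_nf_db(noise_floor_dbm, rbw_hz):
    """Noise figure implied by a displayed noise floor at a given RBW.
    Room-temperature kTB noise density is -174 dBm/Hz."""
    return noise_floor_dbm - 10 * math.log10(rbw_hz) - (-174.0)

print(analyser_nf_db(-140.0, 10.0))  # 24.0
```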
 

Agree with your last, I use this nomograph to get a quick visual of noise floor at various NFs and BWs. I extended the right side BW line down at the same decade spacing to 10 Hz. Even a 24 dB NF will be very sensitive, with a narrow enough BW.

http://www.vtiinstruments.com/File/VTINotes/Noise_Figure_Nomograph.pdf
 
Even a 24 dB NF will be very sensitive, with a narrow enough BW.

Yes, if you select 10Hz BW this is true for discrete and narrow signals that are less than 10Hz wide but you are in the business of measuring noise levels so going down to 10Hz RBW doesn't make the analyser more capable in terms of measuring low noise levels.

eg if I inject a 24dB ENR (Excess Noise Ratio) noise source (eg made by NoiseCOM) into an analyser that has a 24dB noise figure then it doesn't really matter what analyser RBW I try and measure/confirm the ENR of the noise source.

The analyser won't be able to measure it accurately because it is too deaf and will contribute its own noise regardless of the RBW setting. So it will measure the ENR incorrectly by about 3dB regardless of RBW setting. eg selecting 10Hz RBW instead of 100Hz RBW will be futile. The analyser is simply too deaf to measure the noise accurately. All that will happen with 10Hz BW is the noise level shown on the display will drop 10dB but the noise arriving at the detector from the noise source will also drop 10dB because the operator narrowed the analyser detection window to 10Hz bandwidth. So you are no better off by selecting 10Hz RBW?

You are actually worse off from a human point of view because the analyser will take a lot longer to sweep a 10kHz span with a very narrow RBW.

If the ENR of the noise source was reduced by 2dB then the analyser would not display a 2dB change in noise level. Because it is too deaf.

The same thing will happen with a CB amplifier in front of the analyser because the effective ENR of a CB amplifier with 18dB gain and 6dB noise figure will be 24dB. i.e. the amplifier will look like a weak noise source to the analyser. But the noise is too close to the analyser's own noise figure for it to measure properly.

If the noise figure of the CB amplifier were reduced by 2dB (so the ENR was 22dB) then the analyser could not measure this 2dB change accurately.

But if you added a low noise preamp to the analyser then this would change :)
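To put numbers on the point above, here is a sketch of what the display would show, using the assumed 18dB gain / 6dB NF amplifier and 24dB NF analyser, when the amplifier's NF improves by 2dB:

```python
import math

def power_sum_dbm(*levels_dbm):
    """Power sum of uncorrelated noise contributions given in dBm."""
    return 10 * math.log10(sum(10 ** (l / 10.0) for l in levels_dbm))

analyser_floor = -150.0  # dBm/Hz: -174 (kTB) + 24 dB analyser NF
amp_before     = -150.0  # dBm/Hz: -174 + 18 dB gain + 6 dB NF
amp_after      = -152.0  # dBm/Hz: same amp after a 2 dB NF improvement

displayed_before = power_sum_dbm(analyser_floor, amp_before)
displayed_after  = power_sum_dbm(analyser_floor, amp_after)
shown_change = displayed_before - displayed_after
print(round(shown_change, 2))  # ~0.89 dB shown for a real 2 dB change
```

The deaf analyser compresses a real 2dB improvement into well under 1dB on screen, which is the measurement problem being described.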
 
Yes, if you select 10Hz BW this is true for discrete and narrow signals that are less than 10Hz wide but you are in the business of measuring noise levels so going down to 10Hz RBW doesn't make the analyser more capable in terms of measuring low noise levels.

Wrong on that.

Think of the 10 Hz BW as a sliding window through the wider BW signal. That's why the storage mode is used, so that the information measured (slow scan rate) is retained. It takes seconds to pass through the signal of interest this way. Noise is proportional to kTB. Bandwidth is set low (10 Hz) to minimize the product of these parameters. The analyzer is looking way below the floor of the CB radio, which has a large simultaneous bandwidth albeit a better NF, while the analyzer is looking at only a fraction at a time. kTB sets the stage for what can be done. The analyzer tackles the problem by using very narrow BW filters, and a history (memory) of the signal, so that over time, the whole BW of interest is sampled.
 
But you can't dodge the 24dB noise figure limitation of the analyser when you are trying to measure NOISE levels.

We can agree that kTB noise in a 1Hz bandwidth at room temperature is -174dBm/Hz.

So.... if an analyser has a 24dB noise figure at 27MHz then it can be modelled as a receiver with a wall of noise at its input set at -174dBm/Hz + 24dB = -150dBm/Hz. That is -150dBm of noise power inside a 1Hz bandwidth.

If you put a CB amplifier ahead of this with a 6dB noise figure and 18dB gain the amplifier will also produce a wall of noise at -150dBm/Hz.
This is because -174 + 18 + 6 = -150dBm/Hz.

Once the two -150dBm/Hz noise levels meet at the analyser input then the noise level will rise 3dB (i.e. double the noise power) to -147dBm/Hz and this is what the analyser will display if it had a correctly calibrated 1Hz noise marker function. So the 'deaf' analyser gets the measurement of the CB amplifier noise level WRONG by about 3dB.

How are you going to be able to 'fix' this problem by selecting 10Hz bandwidth?

It's already a problem at 1Hz bandwidth.

If you select 10Hz BW then the noise from the CB amplifier will be -140dBm/10Hz and the noise contributed by the analyser will be -140dBm/10Hz.

So the analyser will read -137dBm/10Hz for the combined noise. It's STILL a measurement error of 3dB.

You haven't gained ANYTHING by swapping from 1Hz to 10Hz or vice versa. The analyser will still measure the level of the noise from the CB amplifier with about a 3dB error.

If you go to 100Hz RBW the analyser will display -127dBm/100Hz because (in a 100Hz RBW) the wall of noise at the analyser input is -130dBm/100Hz (because it has a 24dB noise figure) and the noise produced from the CB amplifier will also be -130dBm/100Hz. That pesky 3dB error is still there... :)
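A quick check of the RBW argument above: with the analyser floor and the amplifier noise both sitting 24dB above kTB, the displayed level is high by the same 3dB at any RBW:

```python
import math

KTB_DBM_HZ = -174.0  # thermal noise density at room temperature
NF_DB = 24.0         # both the analyser and the amp-as-noise-source sit here

def combined_error_db(rbw_hz):
    """Error in the displayed amplifier noise when the analyser's own floor
    sits at the same level, for a given resolution bandwidth."""
    floor = KTB_DBM_HZ + NF_DB + 10 * math.log10(rbw_hz)   # each contributor
    displayed = 10 * math.log10(2 * 10 ** (floor / 10.0))  # the powers add
    return displayed - floor

for rbw in (1, 10, 100):
    print(rbw, round(combined_error_db(rbw), 2))  # 3.01 dB at every RBW
```

The error is just 10*log10(2), so narrowing the RBW moves both noise levels down together and buys nothing.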


The other way to look at it is if you inject the same 24dB ENR noise source into a typical SSB CB. Where the deaf analyser can only display a change of 3dB in noise level once the noise source is turned on (even at 10Hz RBW) , the CB will show a change in noise level at the speaker of about 15dB despite the fact it has a much wider bandwidth.

i.e. the deaf old analyser thinks the ENR of the noise source is 27dB even when measured in a 10Hz RBW, but the (10dB NF?) CB shows it is closer to 25dB. Both are wrong (the correct answer should be 24dB) but at least the CB did a better job of measuring the noise level from the noise source because it has a noise figure that is about 14dB lower.
 
