
AVG versus PEP?

RMS stands for "Root Mean Square", which is a fancy mathematical formula for finding the average amount of power an amplifier can continuously produce. There is no legal standard for calculating RMS watts for an amplifier. Most amplifier makers get a "UL" rating from Underwriters Laboratories to obtain the most 'true' RMS rating.
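For the curious, here is a minimal sketch of what that "root mean square" arithmetic actually does, using a made-up 10 V peak sine wave rather than any manufacturer's official test signal: square every instantaneous value, average the squares, then take the square root.

```python
import numpy as np

# One cycle of a 10 V peak sine wave, sampled at 1000 points (illustrative values only)
t = np.linspace(0, 1, 1000, endpoint=False)
v = 10.0 * np.sin(2 * np.pi * t)

v_rms = np.sqrt(np.mean(v ** 2))  # the "root of the mean of the squares"
print(v_rms)                      # ~7.07 V, i.e. 0.707 x the 10 V peak
```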

Since most amps sound their best when they are 'cranked', some musicians look at the peak wattage more than the RMS rating. Also, tube amplifiers are usually not rated by RMS.

I'm not answering my own question here, am I?

Underwriters Laboratories (UL) doesn't rate equipment for anything but safety. The RMS ratings for amplifiers, and the testing procedures specified for deriving these ratings, come from the Federal Trade Commission.

True story: in the early 1970s, when the stereo craze was at full tilt but before the FTC had stepped in to mandate a level playing field, amplifiers were rated at "peak" power - usually into a 1 or 2 ohm load. Seeing an audio amplifier not much larger than one of today's HTs rated at 1500 watts "peak" (meaning that the reading was only microseconds' duration) was just ludicrous. I was visiting a local Radio Shack and noticed lots of stereo equipment with new signs advertising "RMS". I decided to play dumb, and asked one of the clerks about it. "Ah yes!" he explained. "That stands for 'Real Music Sound'".
 
Hmm, guess it just depends on how you do that measuring, doesn't it. Using a plain old sine wave...
Starting at '0' the wave climbs to a positive peak, descends to '0' again, then goes to a negative peak, and back to '0'. If you want Peak Envelope Power, measure the magnitude of one peak, positive or negative, drop the 'sign', and you've got it. Hmm again. That would make the 'average' of both positive and negative peaks equal to zero. How 'bout that! So, I think I'd only use the max value of one peak or the other, not both.
What's the average power in a positive peak of a sine wave? Or, how much 'work' can be done? Ain't 100%, is it? Nope, only about 0.707 times the total or peak value. Another 'Hmm'. Wonder where I've seen those numbers, 0.707 and 1.414? Bet they're related some way. Naw, can't be, it'd mean something I can't remember again, and I ain't looking in no stinking book!
- 'Doc

(all puns and sarcasm intended)
 
"What's the average power in a positive peak of a sine wave? Or, how much 'work' can be done? Ain't 100%, is it? Nope, only about 0.707 time the total or peak value...."

Average power IS NOT 0.707 of the peak power. The RMS or effective value of CURRENT is the value that produces the SAME HEATING EFFECT in a pure resistance as the corresponding value of direct current. For a sine wave, that RMS or effective value is equal to 0.707 times the peak value OF CURRENT, NOT of power.
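To put numbers on that distinction, here's a quick sketch with illustrative values (a 2 A peak sine-wave current into an assumed 50 ohm purely resistive load): the RMS current comes out to 0.707 times the peak current, but the average power comes out to half the peak power, not 0.707 of it.

```python
import numpy as np

R = 50.0                                 # assumed purely resistive load, ohms
t = np.linspace(0, 1, 100_000, endpoint=False)
i = 2.0 * np.sin(2 * np.pi * t)          # 2 A peak sine-wave current

i_rms = np.sqrt(np.mean(i ** 2))         # RMS (effective) CURRENT
p_avg = np.mean(i ** 2 * R)              # average power over the cycle
p_pk = (2.0 ** 2) * R                    # instantaneous power at the crest

print(i_rms / 2.0)   # ~0.707 -- RMS current vs. peak current
print(p_avg / p_pk)  # ~0.5   -- average power is HALF the peak power, not 0.707 of it
```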

"sorry, you are wrong, no matter what the maximum output specs, these
radios are being modified.."

those "radios" and the amplifiers being used with them are being trashed. if it's a class c amplifier then you're starting out with trash. then you increase the trash by exceeding the drive input specs and operating voltages thereby radically altering and destroying any bias that's being used and then you destroy the transistors. then you replace them and start over again. in the meantime this "mentality" continues to be a major contributing factor that drives the cost of these transistors up.
 
Here is what you do...

  1. Get your RF Meter mfg on the horn.
  2. Ask to speak with the metrology dept or calibration lab.
  3. Ask them if the meter reads true RMS, average, or PEP.
  4. Next, ask for NIST and calibration paperwork with real measured RF Power values for your meter.
  5. When they ask, ship your meter, pay the fee, wait for meter to arrive.
  6. Once received, make sure to return the meter to the factory for calibration once a year, or twice a year if used for agency (CSA) work.
  7. Calculate insertion loss of coax, connectors, and inline equipment
  8. Now calculate ERP (a rough sketch of steps 7 and 8 follows below).
  9. Sip on a beverage of your choice (enjoy life), everything else is insignificant.
(y)
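Steps 7 and 8 boil down to simple dB arithmetic. Here's a rough sketch with made-up numbers; your feedline loss, connector loss, and antenna gain will be different, so plug in your own:

```python
tpo_watts = 100.0        # transmitter power output at the radio (assumed)
coax_loss_db = 1.2       # feedline insertion loss for your run and frequency (assumed)
connector_loss_db = 0.2  # connectors plus inline meter/switch (assumed)
antenna_gain_dbd = 3.0   # antenna gain referenced to a dipole (assumed)

# Work in dB: subtract the insertion losses, add the antenna gain, convert back to watts.
net_db = antenna_gain_dbd - coax_loss_db - connector_loss_db
erp_watts = tpo_watts * 10 ** (net_db / 10)

print(round(erp_watts, 1))  # ~144.5 W ERP for these example numbers
```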

Truth be known, places such as Vectronics, Palstar, PDC, Dosy (most likely), and many others use the Bird for all their RF calibration.
 
I concur that most RF meters are calibrated at full scale (FS), or as close to it as possible, at least for analog meters (not sure about digital). Yes, for the moment these are rated at FS deflection. The problem, however, is the linearity of the scale for some of the economical meter movements.

I would prefer to see the actual measured RF power at certain percentages of each scale, then record those values and compare them to a known calibrated source. Problem is, the $$$ of the meter goes up for such a service, and it is most likely overkill for the average end-user not in a certified ISO/IEC/IEE etc. lab environment.

Using a scope? Guess it depends on the traceability of its accuracy as well.

:drool:
 
Underwriters Laboratories (UL) doesn't rate equipment for anything but safety. The RMS ratings for amplifiers, and the testing procedures specified for deriving these ratings, come from the Federal Trade Commission.

UL is an interesting entity, I deal with them quite often. UL has requested our electrical test lab to be ISO/IEC 17025:2005 "accredited" (ISO does not like the term "certified") next year. Since we perform UL testing in-house, all equipment used to measure AC V/A/W, etc., will need to have traceability with actual numerical values compared to a metrology standard. Meters must have a certain accuracy and number of decimal places and so on, and have written test procedures, etc. (don't want to bore with details).

Currently we are NVLAP accredited, but UL has added a few clauses that we must be in compliance with. What it boils down to is that they are cracking down and most likely want all calibration to be performed from a certified source, not in-house, due to a certain biasing factor.
 
Most "reputable" test equipment manufacturers provide a certificate with all new meters. It states something like "This meter, Fluke Model 87, Serial Number xxxxxx, has been calibrated against standards traceable to NIST on (date). This certificate is good for one year from date of sale."

This documentation accompanying the meter is just as good as an actual calibration sticker for that 12 month period. At about the 10th month, however, it's prudent to send it off for recalibration IF the job requires it.

To get the FULL calibration report, with a listing of all the standards used on your instrument (and THEIR pedigrees), will cost a bit more and it's generally not that necessary. The lab is required to keep these records, just in case their facility is looked at by somebody investigating fraud. Yes, it happens.

As for scale linearity with an RF meter -- the scale can be individually marked, if necessary. The big problem with this type of meter is the matching of the diodes. A little "off" usually translates to a relatively large scale discrepancy. Change diodes and the individually-engraved meter face may be worthless. However, with even the hallowed "Bird" having a stated max error of +/- 5% FSV, absolute accuracy in the field isn't likely to happen this side of a few thousand bucks.
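That "+/- 5% of full scale" figure matters more the farther down the scale you read, which is worth keeping in mind before arguing over a few watts. A quick illustration, assuming a hypothetical 100 W full-scale element:

```python
full_scale_watts = 100.0                   # assumed 100 W full-scale element/slug
abs_error_watts = 0.05 * full_scale_watts  # +/- 5% of FULL SCALE is +/- 5 W anywhere on the scale

for reading in (100.0, 50.0, 25.0, 10.0):
    pct_of_reading = abs_error_watts / reading * 100
    print(f"{reading:5.0f} W indicated -> +/- {pct_of_reading:.0f}% of the reading")
# 100 W -> +/- 5%,  50 W -> +/- 10%,  25 W -> +/- 20%,  10 W -> +/- 50%
```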

"Using a scope?" They require periodic calibration just like other instruments, involving traceability to NIST.

I used to work in a Navy calibration lab at Puget Sound Naval Shipyard - a lab that would blow facilities of today right out the door. When OUR standards needed to be calibrated, we sent them down the street to the Secondary Standards Lab. They had to send their stuff to Boulder, Colorado - the Primary Standards Lab. I visited there twice. Almost scary - and NORAD wasn't far away!
 
Back to my rants...

I guess it depends on which accreditation standard is being utilized. ISO/IEC 17025:2005 is a strict standard of NIST conformity; you need the actual calibration results for each scale under audit, along with the maximum uncertainties calculated for each device in the circuit. When the UL auditor (big cheese) comes in for the yearly audit, we have to show the full monty on each piece of test equipment. Thing is, some UL auditors are more anal than others. Typically we send out our Rotek to be certified, then calibrate all test gear with the Rotek. CE requires calibration every 6 months, while UL requires it once every 12 months. The uncertainties are only accurate if you keep the calibration within 12 months. If the UL inspector sees a lapse of even one day, you lose all UL approval for that year, and then you have to retest all products again under UL witness (yeah, it's that bad).

To keep the UL testing in-house, we need to comply with their rules, which often differ from (e.g.) CSA, IEE, and ISO. Lobbying to have UL sign off on compliance with something like C390 isn't easy; we have been working on it for nearly 10 years. It ain't UL approved until they sign off, and then we can use their sticker; CSA/CE is much more stringent in my 10 years' experience. Of course, each technician having to be UL certified, and the monster amount of paperwork (more like a novel for each test), is truly a pain.

FYI, the NVLAP NIST program is a yearly audit and costs well over $20,000 just to have an auditor sign off on all the data. Overkill for the standard end-user, but handy when dealing with the federal government on energy efficiency standards for electrical motors.

:glare:

Some of the current transducers we use are Navy components. Count yourself lucky if you never have to fool with UL audits. The 5% of FS spec seems rather vague in nature, but it works well enough for the end user; it wouldn't fly under the ISO/IEC standards in some labs.

So in essence, your basic RF wattmeter is just that. Is it accurate? Fairly. You get what you pay for in this case.
 
This is it, guys, no more PEP!

" So 100 average watts is about 140 Pep watts"

The above statement is wrong, for the second time. Furthermore, meter accuracy is not specified in power but as a percentage of error for the total power measured, referenced to center scale.

Remember this?
PEP = AVG × 2.66

100 W AVG = 266 W PEP
Now, if you're still able to follow along, we're talking about a difference of 126 W.
Reread my previous post until you get it.
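For anyone wondering where the 2.66 comes from: it's the special case of a single-tone, 100% modulated AM signal. The envelope voltage peaks at twice the carrier voltage, so PEP is four times the carrier power, while the average power is the carrier plus both sidebands, 1.5 times the carrier; 4 / 1.5 ≈ 2.67. A quick sketch with an illustrative carrier level:

```python
carrier_watts = 66.7  # carrier level chosen so the average comes out near 100 W (illustrative)

m = 1.0  # single-tone AM at 100% modulation (modulation index of 1)
avg_watts = carrier_watts * (1 + m**2 / 2)  # carrier + two sidebands = 1.5 x carrier
pep_watts = carrier_watts * (1 + m)**2      # envelope voltage doubles, so power is 4 x carrier

print(round(avg_watts, 1))              # ~100 W average
print(round(pep_watts, 1))              # ~267 W PEP
print(round(pep_watts / avg_watts, 2))  # ~2.67 -- the "x 2.66" rule of thumb
```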
That is the best post I have read on this site! Thank you. Too bad it's so hard to convince people that average power is correct and accurate!
 
You missed the part where he agreed that PEP is the only true measurement, for AM, inclusive of SSB...

and that formula only works under a special case.
 
BTW, he really should not call it a modulation index of 100%. The index value would be 1.
 
And what I don't understand is why mix the two types of measurements if there's no particular reason. Oops, forgot! It's a 'big numbers' thingy, right?
- 'Doc
 
And what I don't understand is why mix the two types of measurements if there's no particular reason. Oops, forgot! It's a 'big numbers' thingy, right?
- 'Doc
It's a real good way to sell radios: 10 real watts in, voila, 30+ PEP out, SOLD!
 
