A typical SWR meter will show an SWR change when different output power levels are used. Usually that's not because the SWR is actually changing, but because the typical SWR meter can't handle a wide range of input power without some compensation (re-calibrating, changing the slug, etc.).

The most common case where a change in power level makes a difference in the actual SWR between the amplifier and the antenna is when some portion of that 'equipment string' changes impedance. (The output of the amplifier not really being 50 ohms as it should be is probably the most common, for whatever reason.) If any SWR meter (adjusted correctly) sees 50 ohms coming in one side of it, and sees 50 ohms on the other side of it going out, the SWR is 1:1.

Changing the power level without correcting for it means the meter is just being overloaded, so it's operating outside its "normal" range. Then there's really no telling what you might see. (Why they usually have different power range switches, right?)

If there's a real change in SWR, it just means there's been a change in the impedances between one side of that meter and the other. Find-n-fix the change where the change occurred... or make sure you've set up the meter correctly.
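For the math-minded, here's a quick sketch of my own (not from any manual) showing why power level never enters into it. SWR is purely a function of the impedance mismatch; the load value of 75 ohms is just a made-up example:

```python
def swr(z_load, z0=50.0):
    """Voltage SWR for a (resistive) load z_load against a z0-ohm system."""
    # Reflection coefficient magnitude: how much of the wave bounces back.
    gamma = abs((z_load - z0) / (z_load + z0))
    return (1 + gamma) / (1 - gamma)

# Sweep the power level -- it never appears in the formula, so the
# true SWR of a 75-ohm load stays 1.5:1 at 5 W or 1500 W.
for watts in (5, 100, 1500):
    print(watts, "W ->", round(swr(75.0), 2), ": 1")
```

If your meter reads differently at different power levels into the same antenna, that's the meter (or its calibration), not the line.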
- 'Doc
Been a long 'day-off' but I had to work anyway. Top that off with trying to fix a Vista and program problem. At this point nothing is making sense, so to @#$$ with it, I'm going to bed. Best advice is to speak firmly with your meter. Tell it what you enjoy doing with large hammers and naughty meters... Doesn't help, but it's satisfying as @#$$ !