What is the typical signal propagation loss to be expected from a 2 meter/70cm J-Pole antenna mounted in an attic where the roofing material is the common 5/8" plywood and asphalt shingles, compared to the same antenna mounted outside at the same height?
How much loss can be expected? Is it of the magnitude of reaching only half the distance, or is the loss less significant?
A related question: since signal loss grows with cable length, especially at VHF/UHF, even with decent cable such as Belden 9913 air-dielectric coax or its equivalent in other brands (Flexi-4, LMR-400), might I be better off with a 6' cable run to the J-Pole in a corner of my ham shack, rather than mounting it 9 feet higher in my attic, which requires a 25' run?
This is a cable length versus antenna height tradeoff question.
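To put rough numbers on the tradeoff, the matched-line loss of the two runs can be estimated from published attenuation specs. This is only a sketch: the dB-per-100-ft figures below are approximate catalog values for LMR-400 (about 1.5 dB at 150 MHz and 2.7 dB at 450 MHz), and real-world loss also depends on connectors and SWR.

```python
# Approximate matched-line loss for the 6 ft vs. 25 ft runs.
# Assumed LMR-400 attenuation (dB per 100 ft, approximate catalog values):
LOSS_DB_PER_100FT = {"2m (146 MHz)": 1.5, "70cm (446 MHz)": 2.7}

def cable_loss_db(length_ft: float, db_per_100ft: float) -> float:
    """Matched-line loss for a run of the given length."""
    return length_ft * db_per_100ft / 100.0

for band, db100 in LOSS_DB_PER_100FT.items():
    short_run = cable_loss_db(6, db100)
    long_run = cable_loss_db(25, db100)
    print(f"{band}: 6 ft = {short_run:.2f} dB, 25 ft = {long_run:.2f} dB, "
          f"extra loss = {long_run - short_run:.2f} dB")
```

On these assumed figures the extra 19 feet of cable costs roughly 0.3 dB on 2m and 0.5 dB on 70cm, which is small compared with the gain typically obtained from the added antenna height.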