OK.... I put a new Shakespeare Phase III antenna on today 8)
So I looked this up to see how far out I should get in a perfect situation.
With a total boat-plus-antenna height of about 18 feet, the formula gives me about 6 miles to my own radio horizon; getting out 25 miles or so also depends on the other station's antenna being well elevated.
Just thought I'd pass the formula along :roll:
I am not a scientist, I just play one on Classic Parker :shock:
The space wave signal path is the so-called "line-of-sight" path between the transmit and receive antennas. The curvature of the Earth is the primary limiting factor for the maximum distance a space wave propagated signal can travel.
If you are into math, the approximate distance (in miles) to the radio horizon can be calculated by multiplying the square root of the antenna height (in feet) by 1.415. For example, the theoretical distance to the radio horizon for an antenna 1,000 feet above the ground is just under 45 miles.
The distance, D1, to the radio horizon for the transmitter is 1.415 times the square root of h1 (feet). The theoretical maximum line-of-sight distance between two elevated points, presumably the transmitter (h1) and the receiver (h2), is the sum of the two distances to the radio horizon (D1 + D2).
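If anyone wants to plug in their own numbers, here's a quick Python sketch of that formula. The 180-foot shore-station height below is just an example I picked to show how an 18-foot boat antenna could theoretically reach out about 25 miles:

```python
import math

def radio_horizon_miles(height_ft):
    """Approximate distance (miles) to the radio horizon
    for an antenna height_ft feet above a smooth Earth."""
    return 1.415 * math.sqrt(height_ft)

def max_line_of_sight_miles(h1_ft, h2_ft):
    """Theoretical maximum line-of-sight range: D1 + D2."""
    return radio_horizon_miles(h1_ft) + radio_horizon_miles(h2_ft)

# My 18-foot boat antenna alone: about 6 miles to the horizon.
print(round(radio_horizon_miles(18), 1))            # ~6.0

# Talking to a tall shore station (180 ft, hypothetical example):
print(round(max_line_of_sight_miles(18, 180), 1))   # ~25.0
```

Again, that's the ideal smooth-Earth number; real-world conditions will move it around.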
All this, of course, assumes that the Earth is a perfectly smooth sphere, and that no signal disturbances or enhancements occur along the path between the transmit and receive points. As we know, the Earth is not a perfect sphere, and the space through which the radio signal travels is not perfect either.