A New York University (NYU) student research team pushed the envelope for millimeter wave network range in a recently conducted field test in rural southwest Virginia.

Setting up a millimeter wave transmitter on the porch of the country home of their professor, Ted Rappaport, the students found that they could receive signals at distances of more than 10 km (6.2 mi.), even when the line of sight was obstructed by a hill or a stand of trees.

Overall, the team picked up millimeter wave signals emanating from Prof. Rappaport’s front porch at 14 spots as far as 10.8 km away within direct line of sight, and at 17 spots as far as 10.6 km away where the transmission path was obstructed.

Equally significant, the millimeter wave transmitter, operating at 73 GHz, needed less than 1 watt of power, according to an IEEE Spectrum news report about the millimeter wave network range testing.

“I was surprised we exceeded 10 kilometers with a few tens of milliwatts,” Rappaport was quoted as saying. “I expected we’d be able to go a few kilometers in non-line-of-sight, but we were able to go beyond ten.”

The FCC in June opened 11 GHz of new spectrum in the millimeter wave band to network operators for the development of 5G technology, IEEE Spectrum notes.

Millimeter Wave Network Range
Millimeter waves are extremely high-frequency radio waves (30–300 GHz) that lie between the ITU’s super-high-frequency band and the far infrared. Using them to carry digital information and communications through the air is one of a variety of emerging wireless technologies that telecom, computing and network industry participants are exploring as they forge ahead with development of next-generation 5G wireless broadband networks.
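The band’s name follows directly from its frequency range: wavelength is the speed of light divided by frequency, so 30–300 GHz corresponds to wavelengths of roughly 10 mm down to 1 mm. A quick illustrative calculation (not from the article):

```python
C = 299_792_458.0  # speed of light in meters per second

def wavelength_mm(freq_hz):
    """Wavelength in millimeters for a given frequency in hertz."""
    return C / freq_hz * 1000

# The 30-300 GHz band spans wavelengths of about 10 mm down to 1 mm,
# hence the name "millimeter wave".
print(round(wavelength_mm(30e9), 2))   # ~9.99 mm
print(round(wavelength_mm(300e9), 2))  # ~1.0 mm
print(round(wavelength_mm(73e9), 2))   # NYU test band, ~4.11 mm
```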

One problem with millimeter wave signals is that they are absorbed by oxygen and other molecules in the atmosphere and lose strength when it rains. That has limited their use to fixed broadband connections between two stationary points over short distances of about 1 km (0.6 mi.).

The results of the NYU student research team’s field trial bode well for prospective carrier-grade applications, particularly regarding using existing cellular telecom infrastructure to provide 5G broadband services in underserved rural areas, according to Prof. Rappaport.

The results could broaden industry participants’ and regulators’ perspectives regarding the use of millimeter wave technology, University of Texas at Austin wireless networking expert Robert Heath agreed. “I think it’s valuable in the sense that a lot of people in 5G are not thinking about the extended ranges in rural areas, they’re thinking that range is, incorrectly, limited at high carrier frequencies,” he said.

Prof. Rappaport certainly sees strong potential. “The community has always been mistaken, thinking that millimeter waves don’t go as far in clear weather and free space—they travel just as far as today’s lower frequencies if antennas have the same physical size. I think it’s definitely viable for mobile.”
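Rappaport’s point about antenna size can be made concrete with the Friis transmission equation. For a fixed physical aperture A, antenna gain grows with frequency as G = 4πA/λ², which offsets the λ² term in free-space path loss; with fixed-aperture antennas at both ends, the higher frequency actually delivers more power at the same distance. The sketch below is illustrative only; the apertures, transmit power, and distance are hypothetical, not figures from the NYU field test.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def received_power_dbm(pt_dbm, aperture_m2, freq_hz, dist_m):
    """Friis free-space link budget with identical fixed-aperture antennas
    at both ends. Gain for a fixed physical aperture A is G = 4*pi*A/lambda^2,
    so it rises with frequency squared, offsetting the smaller wavelength
    in the path-loss term."""
    lam = C / freq_hz
    gain = 4 * math.pi * aperture_m2 / lam**2            # linear antenna gain
    friis = gain * gain * (lam / (4 * math.pi * dist_m))**2
    return pt_dbm + 10 * math.log10(friis)

# Same hypothetical 0.01 m^2 apertures and 10 km path, two carrier frequencies:
low = received_power_dbm(30, 0.01, 2e9, 10_000)    # 2 GHz link
high = received_power_dbm(30, 0.01, 73e9, 10_000)  # 73 GHz link
# With equal physical antenna size, the 73 GHz link receives roughly
# 20*log10(73/2) ~ 31 dB MORE power than the 2 GHz link, not less.
print(f"2 GHz: {low:.1f} dBm, 73 GHz: {high:.1f} dBm")
```

The design point is the one Rappaport makes: path loss at high frequency looks worse only if antenna gain is held fixed; holding physical size fixed instead flips the comparison.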

Others disagree. University of California, San Diego professor of electrical and computer engineering Gabriel Rebeiz pointed out that the field trial was conducted over two days when the skies were clear. Previous research has found that rain can reduce millimeter wave signal strength at a rate of 20 decibels per kilometer, a 100-fold loss in power per kilometer.
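The arithmetic behind that figure: a power ratio expressed in decibels converts to a linear factor of 10^(dB/10), so 20 dB per kilometer is exactly a factor of 100 in signal power per kilometer. A minimal check:

```python
def db_to_linear(db):
    """Convert a power ratio in decibels to a linear factor."""
    return 10 ** (db / 10)

# 20 dB of rain fade per kilometer is a 100-fold power loss per kilometer.
print(db_to_linear(20))  # 100.0

# Compounded over a 10 km path, that is 200 dB of extra attenuation,
# which is why rain fade dominates long millimeter wave links.
print(db_to_linear(20 * 10))  # 1e+20
```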

While he sees potential for the technology in urban areas, Rebeiz believes that proposing to use millimeter wave technology in rural areas is a non-starter.
