Will your next-generation cellphone be unable to see the forest for the trees? The National Institute of Standards and Technology (NIST) and industry partners are working to measure trees' effect on millimeter waves, an effort that could help improve next-generation devices' ability to receive signals from 5G antennas.
Millimeter waves, the new class of signals 5G cell networks will use, can carry more information than conventional transmissions and occupy a portion of the broadcast spectrum that communication technologies seldom use. However, millimeter waves also have drawbacks, including their limited ability to penetrate obstacles. These obstacles include buildings, but also the trees that dot the landscape.
Until recently, little was known about how trees affect millimeter-wave propagation. Without that fundamental detail, 5G network designers could not plan their networks around it.
The 5G era will feature wireless communication not only between people but also between devices connected to the Internet of Things. The wireless industry is pursuing speedier, more effective communication to improve the performance of existing devices and services and help realize new ones. Autonomous vehicles, for example, will depend on such quick network response to function.
“We will be able to do new things if our machines can exchange and process information quickly and effectively,” said Nada Golmie, head of NIST’s Wireless Networks Division in the Communications Technology Laboratory. “But you need a good communication infrastructure. The idea is to connect, process data in one place and do things with it elsewhere.”
Millimeter waves, which are new turf for the wireless industry, could be part of the solution. Their wave crests are just a few millimeters apart — a very short distance compared with radio waves that can be several meters long. And their frequencies are very high, somewhere between 30 and 300 gigahertz, or billion wave crests per second. Compared with conventional radio transmissions, which are in the kilohertz (for AM) and megahertz (for FM) ranges, new 5G signals will be very high frequency indeed — something like a bird tweeting at the upper range of human hearing compared with radio’s deep, low bass.
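The frequencies and wavelengths above are linked by a single formula, wavelength = c / frequency. A minimal sketch checking the band edges cited in the paragraph (30 and 300 GHz), plus an FM station for comparison:

```python
# Wavelength is the speed of light divided by the frequency.
C = 299_792_458  # speed of light, meters per second

def wavelength_mm(freq_hz: float) -> float:
    """Return the wavelength in millimeters for a frequency in hertz."""
    return C / freq_hz * 1000  # meters -> millimeters

# The millimeter-wave band edges cited above:
print(f"30 GHz:  {wavelength_mm(30e9):.1f} mm")    # about 10 mm
print(f"300 GHz: {wavelength_mm(300e9):.1f} mm")   # about 1 mm

# An FM radio station near 100 MHz, for comparison: about 3 meters.
print(f"100 MHz: {wavelength_mm(100e6) / 1000:.1f} m")
```

Crest-to-crest distances of 1 to 10 millimeters are exactly why the band is called "millimeter wave," while FM's roughly 3-meter waves bend around everyday obstacles far more easily.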
It is millimeter waves’ high frequency that makes them both tantalizing as data carriers and also hard to harness. On the one hand, more wave crests per second means the waves can carry more information, facilitating faster downloads and network responses. On the other, high-frequency waves have trouble traveling through obstructions. Anyone who has passed near a house or car whose occupants are playing loud dance music knows that the throbbing bass frequencies are most of what reaches the outdoors, not the treble of a lilting soprano.
For 5G networks, the obstructing wall can be as thin as an oak leaf. For that reason, NIST scientists embarked on a somewhat unusual task in September 2019: They set up measurement equipment near trees and shrubs of different sizes around the agency's Gaithersburg, Maryland, campus. The study continued for months, in part because the researchers needed to capture seasonal changes.
“The tree study is one of the few out there that looks at the same tree’s effect on a particular signal frequency through different seasons,” Golmie said. “We couldn’t only do the survey in the winter, because things would have changed by summer. It turns out that even the shape of leaves affects whether a signal will reflect or get through.”
The team worked with the wireless community to develop the mobile equipment that was needed to take the measurements. The researchers focused it on single trees and aimed millimeter-wave signals at them from a range of angles and positions, to simulate waves coming from different directions. They measured the loss, or attenuation, in decibels. (Each 10 dB of loss is a reduction by a power of 10; a 30 dB attenuation would mean the signal is reduced by a factor of 1,000.)
For one type of leafy tree, the European nettle tree, the average attenuation in summer was 27.1 dB, but it dropped to 22.2 dB in winter when the tree was bare. Evergreens blocked more of the signal: Their average attenuation was 35.3 dB, a number that did not change with the season.
(For comparison, the team also measured various building materials. Wooden doors, plasterboard walls and interior glass showed losses of up to 40.5 dB, 31.6 dB and 18.1 dB, respectively, while exterior building materials exhibited even larger losses, up to 66.5 dB.)
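Putting these decibel figures on a linear scale makes the differences concrete. A short sketch, using the conversion described earlier (each 10 dB is another factor of 10) and the values reported above:

```python
# A loss of d decibels reduces signal power by a factor of 10 ** (d / 10).
def db_to_factor(loss_db: float) -> float:
    """Return the linear factor by which power is reduced for a dB loss."""
    return 10 ** (loss_db / 10)

# Attenuation values reported in the study:
measurements = [
    ("European nettle tree, summer", 27.1),
    ("European nettle tree, winter", 22.2),
    ("Evergreen, year-round", 35.3),
    ("Exterior building material (worst case)", 66.5),
]
for label, db in measurements:
    print(f"{label}: {db} dB -> power reduced ~{db_to_factor(db):,.0f}x")
```

The evergreen's 35.3 dB, for instance, corresponds to the signal arriving with only about one three-thousandth of its original power, which is why a single tree can matter to a millimeter-wave link.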
While NIST’s contributions to the 5G network development effort could end up as ubiquitous as trees themselves, for most of us they will be considerably less visible. The measurements the team made are intended mainly for companies that model how different objects affect millimeter waves. Part of the effort was a collaboration with Ansys Inc., an engineering simulation company in Boulder, Colorado. The company used NIST's measurement data to tune its tree simulation models, which cell companies use to plan their networks of antennas in detail.
“Most models don’t include measurement-based information about trees,” said NIST’s David Lai, one of the scientists who conducted the study. “They might simply say that for a given tree-like shape, we should expect a certain amount of signal loss. We want to improve their models by providing accurate measurement-based propagation data.”
NIST’s collaboration with Ansys contributed to guidance issued by the International Telecommunication Union (ITU), the organization that creates guidelines for telecom standards. The results now appear as a new section on trees in ITU’s Recommendation ITU-R P.833-10, a publication that serves as a reference for those who develop signal propagation models.
“Our goal is to get these measurements in front of the entire wireless community,” Golmie said. “We hope this effort will help the entire marketplace.”