United Nations Educational, Scientific and Cultural Organization & International Atomic Energy Agency

The Abdus Salam International Centre for Theoretical Physics, Strada Costiera 11, 34014 Trieste-Miramare, Italy, tel. +39 40 2240111, fax +39 40 224163, www.ictp.trieste.it

School on Data and Multimedia Communications Using Terrestrial and Satellite Radio Links,
12 February - 2 March 2001

smr1301@ictp.trieste.it | www.ictp.trieste.it/~radionet/2001_school/Timetable.html

Rational Use of Radio

(Lecture notes)

Prof. Ryszard Struzak


For internal use only. Not for reproduction. All rights reserved.

Contents

Communication, Digital Revolution and Growth of Radio

Natural communication

Writing

Digital Revolution

Measure of Information

Multimedia and Virtual Reality

Convergence

Growth of Radio

Radio Communication Channel

Information Theory

Communication Channel

Spectrum Management

EM Interactions

National Spectrum Management

International Spectrum Management

Concluding remarks

References

Figure 1. Graph representing a network of 7 transmitters

Annex. Examples of Unintended Effects of Radio Interference

 

 

Rational Use of Radio

Prof. Ryszard Struzak

<ryszard.struzak@ties.itu.int>

 

 

These are rough notes for my lectures at the winter School on Data and Multimedia Communications Using Terrestrial and Satellite Radio Links, organized at the International Centre for Theoretical Physics (ICTP), Trieste, February 2001 (beware of misprints!). We shall review various issues related to the use of radio waves. Since covering any of the many topics in detail would require much more time, we shall concentrate on basics and only touch on a number of more advanced issues.

 

Note. These materials may be used for study, research, and education in not-for-profit applications. If you link to or cite these materials, please credit the author, Ryszard Struzak. These materials may not be published, copied to or issued from another Web server without the author's express permission. Copyright © 2001 Ryszard Struzak. All commercial rights are reserved. If you have comments or suggestions, please contact the author at ryszard.struzak@ties.itu.int.

 

Motto: Transport of the mails, transport of the human voice, transport of flickering pictures - in this century as in others, our highest accomplishments - still have the single aim of bringing men together. Antoine de Saint-Exupéry, French aviator and writer (1900 - 1944).

 

 

Communication, Digital Revolution and Growth of Radio

Communication is fundamental to any social activity. According to The American Heritage® Dictionary of the English Language, communication is "the exchange of thoughts, messages, or information, as by speech, signals, writing, or behavior". During the process of evolution, humans have developed various means of communication. They have also developed a curiosity that pushes them to explore the unknown and to gather knowledge. All this helped them to survive, to benefit from the natural resources available, and to improve the quality of life. Exploration can be considered a form of one-way communication with nature, in which natural signals are received, processed, and interpreted. Rushing to explore natural resources, researchers invented a variety of instruments and tools that extended the abilities of our natural senses and allowed exact observations and data collection. An example: although from the very beginning humans were immersed in the natural magnetic field of the Earth, they could not sense or visualize it until the compass was invented in China, in the 6th century or so. That invention revolutionized the art of navigation and was an important step towards the globalisation that has progressed until now. In the process of evolution, humans also developed a predisposition to share experience and knowledge within social groups, as we do now at this school. Indeed, acquiring and sharing information is at the very roots of human civilisation and development. This process also involves the storage, handling, transmission (communication), and presentation of data in various forms. The spiritual and material wealth of humanity emanates from that.

 

Natural communication

Natural communication involves all the senses: eyesight, hearing, touch, smell, gesture, etc. A mother's direct contact with her newborn baby is a good example: without using any spoken language, the mother "feels" what her baby needs. Such communication is not limited to parent-child relations. Members of deaf communities have developed the non-spoken language of gestures to perfection. Our natural "body language" discloses information about the true nature of our feelings. Such undue openness may create problems in interpersonal relations, and many social groups consider it unwanted. These groups require spontaneous feelings to be hidden under the mask of the etiquette accepted by the group.

 

Language

Spoken language is the principal means of human communication, although it can be transferred to other media. A prominent characteristic of language is the arbitrary relation between a linguistic sign and its meaning: There is no reason other than convention among speakers of English that a cat should be called cat. Indeed, the meaning results from interpretation of data by people in specific circumstances. Language can be used to discuss a wide range of topics, a characteristic that distinguishes it from animal communication, which we believe has only specific uses. There are about 6000 languages spoken in the world, and different languages rarely use the same phonemes to convey the same meaning. Communication between groups using different languages involves translation from one language to another - which is a specific example of information processing. A spoken language can be translated not only into another spoken language, but also into a language of gestures, a process we often use in personal contacts when no interpreter is available. A spoken message can also be represented in other media, such as acoustic "tam-tam" drum rhythms or as graphical symbols such as icons or writing.

 

Writing

Natural voice communication, with its transient and volatile nature, is limited in time and in space. Voice, once uttered, like any sound once produced, does not last but immediately disappears forever, leaving no trace (except perhaps in the memory of listeners, if any). From the very beginning, people wanted to overcome these limitations. Only after Thomas Edison (1847-1931) invented a sound recording and reproduction instrument did it become possible to convert sound vibrations into a permanent record to be played back later, perhaps even in a different place. However, long before Edison, our ancestors invented the art of writing to surmount the time and space limitations of spoken messages. A number of tribes and nations have disappeared in the past, and their spoken languages have died with them forever. However, the written messages they left still exist, witnessing their achievements and problems. Wedge-shaped Sumerian inscriptions, made about the 30th century BC on clay tablets, seals, stone obelisks, statues, and the walls of Mesopotamian palaces, can still be admired, in spite of the time passed, climate exposure, and the destruction of wars.

 

Writing means representing information using a limited set of symbols. For instance, English uses only the 26 letters of the Roman alphabet. Although unable to convey with precision all the nuances of spoken language (e.g. emotional elements, intonation, accent, etc.), writing was one of the greatest achievements of humanity as a method of information transmission by means of systems of visual marks. Its origins, often attributed to divine sources, go back to China, the Indus Valley, and the Tigris-Euphrates river valley. The earliest writings used hieroglyphs representing an entire spoken word or group of words. Such an approach is still in use in traditional Chinese and Japanese. It can be found also in modern mathematics and chemistry, and in computers, where "icons" are used around the world as shortcut commands. The main advantage of such an approach is that the message can be conveyed with no relation to its pronunciation. For example, "5" reads "five" in English, "cinq" in French and "pięć" in Polish, keeping the same meaning.

 

The first writings were impractical. They were costly and heavy or voluminous, like the clay tablets of ancient Mesopotamia, or fragile, like the papyrus scrolls of ancient Egypt, Greece, and Rome. About the 4th century AD, clay and papyrus were replaced in Europe by parchment made of animal skins. The texts, however, were still handwritten, as previously, by professional scribes in a very limited number of copies. Hand labor was slow and expensive, and thus books were few and costly. They were commissioned exclusively by the rich, literate minority of the population, mainly temples and rulers. Two innovations in Europe simplified book production and made it economically feasible: paper and movable metal type, in the 15th century. Both were based on earlier Chinese achievements, as printing from carved wood blocks was known in China in the 6th century AD, and printing from movable type since the 11th century. The highly mechanized technology of paper manufacturing and book production offered printed material at low cost, which increased public literacy.

 

Writing was one of the greatest inventions in the history of human civilization, as it allowed overcoming the time barrier. With writing, we can go hundreds and thousands of years back in time and follow the thoughts and ideas of our ancestors as if they were still with us. Writing and the arts have common roots. Indeed, calligraphy was in most countries a highly respected art form for many centuries. Art and science, in their broad meaning, are also means of transmitting information related to aesthetic, emotional, or intellectual qualities from the people who produce it to the community that observes it in a musical, literary, performance, or educational context. It includes literature, music, dance, painting, sculpture, architecture, etc.

 

Digital Revolution

A little over a hundred years ago, Ludwik Zamenhof (1859-1917), a Polish philologist of Jewish origin, created an artificial language named Esperanto, in addition to the multitude of natural languages. It was designed specially for international use. Its vocabulary is based on word roots common to many languages, and its grammar is simplified. It is spoken by a relatively small group of enthusiasts. The last century also witnessed the creation of another artificial language, designed specially for use in writing. It has never been spoken, as it was invented to facilitate communications between humans and machines, and between machines such as computers. Derived from mathematical considerations and technical convenience, it operates with two symbols only, known as bits, i.e. the binary-encoded (digital) information format represented by "0" or "1", or by their physical equivalents. In spite of that apparent paucity, it is capable of conveying all the nuances of any natural language. In addition, it can convey all other products of human intellect, such as music or pictures, that are difficult to present in spoken or written language. It is an inherent part of digital technology, making communication less expensive and more efficient. Digital technology converts information - text, graphics, sound, or pictures - to strings of ones and zeros that can be physically represented as digital signals and transmitted or registered on a storage medium. Magnetic disks, memory sticks, and optical compact disks complement paper. Writing has been transformed into "recording", where sounds, symbols, writings, numbers, graphics, pictures, etc. are registered using the same universal digital technology - fast and inexpensive. For instance, this page contains about 4000 characters, ordered in 50 or so rows of about 80 characters. A single 600 MB CD-ROM can store over 150,000 such pages, which would make a heap 165 m high.
As the disk costs below 2 dollars, the cost of such an "electronic print" is thus about 2 cents per 1000 pages. In addition to the substantial economies in cost and in volume, there is a further benefit: electronic records can be copied at a speed of 100 pages or so per second and can be transmitted over distances of thousands of kilometers at the speed of light. None of this was possible with any other technology known until recently.

 

Measure of Information

About eighty years ago, H. Nyquist and R. Hartley showed that it is convenient to use a logarithmic measure of information. Suppose that multi-level symbols are transmitted through the channel. If the number of levels in each symbol is n, then for two symbols the number of possible combinations is n^2, and for r symbols it is n^r. If the symbols are transmitted at a rate of r symbols a second, and the message lasts for T seconds, the total number of combinations that can be transmitted is n^(rT). If we have a message that lasts twice as long, we expect to be able to transmit twice as much information. To maintain such an additive property, a logarithmic information measure has to be used:

 

Information ~ rT log n                                                          (1)

 

Here, a logarithm of base 2 has been universally used, which led to the associated binary digit. The binary digit, or bit, is the smallest unit of information. One bit expresses a 1 or a 0 in a binary numeral, or a true or a false logical condition. It can be represented physically by an element such as a high or low voltage at one point in a circuit, or a small spot on a disk magnetized one way or the other. Now, log₂n, where n is the number of levels in the symbol, is just the number of bits required to express the symbol as a binary number. E.g., if n = 16, then the number of bits is four. Similarly, if a device has n possible positions it can, by definition, store log₂n bits. Usually, we use groups of bits, as a single bit conveys little information a human would consider meaningful. A group of eight bits makes up a byte, which can be used to represent many types of information, such as a letter of the alphabet, a decimal digit, or another character.
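This counting argument is easy to check numerically. The short Python sketch below evaluates Eq. (1) with base-2 logarithms; the symbol rate and duration in the second call are illustrative values, not taken from the text:

```python
import math

def bits_per_symbol(n_levels):
    # number of bits needed to label one of n_levels symbol values: log2(n)
    return math.log2(n_levels)

def message_information(n_levels, rate, duration):
    # total information of a message, Eq. (1): r * T * log2(n)
    return rate * duration * math.log2(n_levels)

print(bits_per_symbol(16))                # 4.0 bits, as in the text
print(message_information(16, 8000, 10))  # a 10-second, 8000-symbol/s message
```

Doubling the duration doubles the result, which is exactly the additive property the logarithm was chosen to provide.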

 

Multimedia and Virtual Reality

Natural communication between humans involves all the sense organs. Five of them are considered basic: hearing, sight, smell, taste, and touch. Some scientists believe there are more senses, for instance the senses of weight, equilibrium, body position, balance, hunger, thirst, or fatigue. Only hearing and sight have been widely used in telecommunication, and several laboratories around the world are working on involving the remaining senses in multimedia communications. The involvement of the other sense organs would make practical such applications as tele-education, telemedicine, and telepresence, a natural extension of today's teleconferencing or flight simulations. Until recently, communication technology sought to imitate the real environment around us and to transport it to another place in space and in time. Michio Kaku, professor of Theoretical Physics at the City College of New York, believes that future communications will integrate with virtual reality, which will become an integral part of the world of 2020. Ubiquitous computers will seek to re-create imaginary worlds that do not exist. Even today, virtual reality equipment generates signals in the computer memory that, via 3-dimensional TV goggles and joysticks, produce feelings of moving through space and time. Although virtual reality is crude today, its technical flaws will be eliminated with time. The primitive joysticks will be replaced by body-suits equipped with various sensors that will sense the location, touch, vibrations, temperature, humidity, electric field, etc., of every part of our body and will send the data via radio and the Internet to recreate them where needed, via special terminals built into the suits. Even today, virtual reality is a powerful training tool, a source of entertainment, and a scientific tool. It gives us the ability to simulate and visualise complex physical systems and processes, like weather or earthquakes.
For centuries, science has advanced in two ways: some scientists observed the world, conducted experiments, and collected the data, whereas others created theories to generalize and explain the collected data. Virtual reality is creating a third possibility, based on computer simulations. In many areas it is the only practical way of making progress.

 

Convergence

Digital technology makes the recording, copying, processing, and transmitting of information less expensive and faster than any other technique known until now. It has been applied not only to communications and to computers, but also to music, photography, printing, filmmaking, and entertainment - what we now observe as convergence. This enabled an extraordinary growth in the volume of information exchanged and led to what is known as the information explosion or information revolution - the current period in human history, in which the possession and dissemination of information has supplanted mechanization or industrialization as a driving force in society. Although we all participate in this development, we do not fully understand its long-term impact on society. Many consider its significance comparable with the invention of writing thousands of years ago, which may justify the term "digital revolution", even if it is only a continuation of the never-ending progress of science and technology that impacts the development of society in general. Another popular term associated with digital technology is the Information Superhighway, superseded later by the concepts of the National Information Infrastructure, the Global Information Infrastructure, and the Information Society. These terms refer to an image of a future society producing, processing, and consuming abundant information, and using extended information services and a variety of advanced facilities, including computers and computer networks, at low cost. The term was coined by United States Vice President Albert Gore to emphasize the importance of such an infrastructure for the development of society. In practice, the Internet and future interactive television best exemplify these concepts.

 

Growth of Radio

Natural voice communication is limited in space. For instance, those sitting here in the last row can hardly hear my voice, which would dissolve into noise at distances exceeding a few dozen meters. Sounds produced by African tam-tams or Turkish military drums might reach greater distances, say up to a few kilometers. To convey messages over greater distances, the received message was re-transmitted further, in the same way as we do today using microwave links. The acoustic signals were complemented by the smoke signals of the ancient Israelis and Native Americans, and the sunlight reflected by mirrors of ancient Chinese soldiers, which led eventually to a visual telegraph system developed by the French engineer Claude Chappe. Named the semaphore, the system transmitted its first messages between Paris and Lille in 1793, and then throughout the country. The visual telegraph's dependence on weather conditions was removed by the electric telegraph, which Samuel Morse, a New York University art professor, began developing in 1832. With the electric telegraph, and later with the telephone, the potential communication range was extended over hundreds and thousands of kilometers, but humanity had to wait until Guglielmo Marconi (1874-1937) demonstrated a practical radio communication technology able to transmit messages between any two points in space. With radio, the distance limitation has been overcome.

The phenomenal growth of radiocommunication services and other radio applications observed in recent years leads to increasing congestion of the available radio frequency spectrum. All radio applications have grown, but mobile telephony is the best example. Before you finish reading this page (which may take two minutes or so), thirty new wireless telephones will be put into operation - but the amount of spectrum allocated to the mobile services does not increase. New multimedia applications such as videoconferencing, distance learning, or telemedicine - generally, "telepresence" - are being developed intensively. Each of them requires frequency bandwidth to transmit the data, texts, sounds, and pictures, contributing further to the spectrum congestion. Broadband wireless access to the Internet via stratospheric stations and LEO satellite systems, such as Teledesic, is under development. How will all these radio systems operate in such a congested environment? How will it impact the operation of the existing systems? How could we limit the negative effects of the congestion? These notes review the basic concepts and issues involved.

 

Radio Communication Channel

Information Theory

Although several engineers and scientists contributed to information theory, it is generally accepted that Claude E. Shannon, an American electrical engineer, is its "father". At Bell Telephone Laboratories, about fifty years ago, he studied the general principles of information transmission through telecommunication channels. His work laid the basis for what was later named information theory and communication theory. Information science in general is concerned with the gathering, manipulation, classification, storage, and retrieval of recorded knowledge. It emerged when digital computers were developed during the early 1950s. Automated searching of files, coordinate indexing, and controlled vocabularies were introduced in response to the urgent need to create easy access to the contents of patents and scientific and technical journals. In the 1960s, massive collections of documents were transferred to databases, enabling various searches to be done by computer. By the mid-1980s, information science had become a thoroughly interdisciplinary field, bringing together ideas from the social sciences, computer science, cybernetics, linguistics, management, neuroscience, systems theory and artificial intelligence. Artificial intelligence refers to a machine's capacity to mimic intelligent human behavior.

 

Information theory is a mathematical discipline that deals with the characteristics and the transmission of information. Note that information theory does not deal with the contents or the interpretation of information. The need for a theoretical basis for communication technology arose from the increasing complexity and crowding of communication systems. Information theory was originally applied to communications engineering but proved relevant to other fields, too. It deals with the measurement of information, the representation of information (such as encoding), and the capacity of communication systems to transmit, receive, and process information. Encoding can refer to the transformation of speech, images, etc., into electromagnetic signals, or to the encoding of messages to ensure privacy.

 

Band-Limited and Time-Limited Signals

Body language, spoken language, writing, etc., can be considered instances of signals. We differentiate two classes of signals: analogue, i.e. continuously varying waveforms, and digital signals. Practical signals have a limited bandwidth and last for a limited time period. Let the signal bandwidth be B Hz (starting at zero frequency) and its duration T seconds. There may be many functions of time whose spectra lie entirely within the band B and whose time functions lie within the interval T. Strictly speaking, it is physically not possible to fulfil both of these limits exactly; it is, however, possible to fulfil them approximately, with only a small portion, say one percent, of the signal power spread outside the band B and the interval T.

 

Continuous Signals vs. Digital Signals

Any analogue band-limited signal function can be represented completely by, and reconstructed perfectly from, a set of measurements or samples of its amplitude spaced 1/(2B) seconds apart. For a voice signal including frequencies from 0 to 4 kHz (B = 4 kHz), we must use 8000 samples per second. For a television signal including frequencies from 0 to 4 MHz (B = 4 MHz), we must use 8 million samples per second. The intuitive justification is that if the function contains no frequencies higher than B, it cannot change to a substantially new value in a time less than one-half cycle of the highest frequency, that is, 1/(2B). The original function can be reconstructed from the samples by using a pulse of the type

 

(sin x) / x, where x = 2πBt.                                                  (2)

 

This function is unity at t = 0 and zero at t = n/(2B), i.e. at all the other regularly distributed sampling points. At each sample point, a pulse of this type is placed and its amplitude is adjusted to equal that of the sample. The sum of these pulses is the required function. The result also holds if the band B does not start at zero frequency but at some higher value. Thus, we have obtained a set of pulses that represents exactly the original analogue signal.
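The reconstruction can be verified numerically. The Python sketch below (the bandwidth and the two test tones are illustrative choices, not taken from the text) samples a band-limited signal every 1/(2B) seconds and rebuilds a value between sampling points from a truncated sum of (sin x)/x pulses:

```python
import math

B = 4.0            # bandwidth in Hz (illustrative toy value)
T_s = 1 / (2 * B)  # sample spacing of 1/(2B) seconds

def signal(t):
    # a band-limited test signal: two tones, both below B = 4 Hz
    return math.sin(2 * math.pi * 1.0 * t) + 0.5 * math.cos(2 * math.pi * 3.0 * t)

def sinc(x):
    # the (sin x)/x pulse of Eq. (2), normalized to the sample spacing
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

def reconstruct(t, samples, n0):
    # place one pulse at every sample instant, scaled by the sample value,
    # and sum them - the interpolation described in the text
    return sum(s * sinc((t - (n0 + k) * T_s) / T_s) for k, s in enumerate(samples))

# sample over a wide window so the truncation error stays small in the middle
n0 = -400
samples = [signal(n * T_s) for n in range(n0, 401)]

t = 0.3  # an instant that falls between two sampling points
err = abs(reconstruct(t, samples, n0) - signal(t))
print(err)  # small: the samples determine the signal between sample points
```

In an exact (infinite) sum the error would vanish; here it is merely small because only 801 pulses are kept.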

 

The sampling theorem enables us to represent a smoothly varying signal by a sequence of samples. The samples may take different values within a continuous range of amplitudes. Quantization is a process that converts that continuum into a set of discrete values. The sampling and quantization processes together translate analogue signals into equivalent discrete signals. Analog, or continuously varying, electrical waveforms, sampled at a fixed rate, are applied to an analog-to-digital converter. Sample values are then approximated as digital numbers, using a binary numbering system of 0's and 1's, with an error that can be kept as small as needed by controlling the approximation precision. The resulting digital codes can then be used in computers, digital audio and video recorders, and various types of communications systems.
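A quantizer of this kind is easy to sketch. In the Python example below, the 8-bit resolution and the ±1 range are illustrative assumptions; the quantizer rounds a sample to the nearest of 256 levels and reports the binary code and the resulting error:

```python
def quantize(x, n_bits, x_min=-1.0, x_max=1.0):
    # uniform quantizer: map x onto one of 2**n_bits discrete levels
    levels = 2 ** n_bits
    step = (x_max - x_min) / levels
    x = min(max(x, x_min), x_max - 1e-12)     # clamp into the working range
    index = int((x - x_min) / step)           # the binary code transmitted
    midpoint = x_min + (index + 0.5) * step   # the value the receiver recovers
    return midpoint, index

value, code = quantize(0.3, 8)
print(code)               # 166: the 8-bit code for this sample
print(abs(value - 0.3))   # quantization error, at most step/2 = 1/256
```

Adding one bit of resolution halves the maximum error, which is how the approximation precision mentioned in the text is controlled.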

Geometrical Representation

If a signal function is limited to the time interval T and the samples are spaced 1/(2B) seconds apart, a total of 2TB samples are needed to represent it correctly. Thus, giving 2TB numbers can specify the function. A set of three numbers, x, y, z, regardless of their interpretation, can always be thought of as the co-ordinates of a point in three-dimensional space. Similarly, the 2TB samples of a signal can be thought of as the co-ordinates of a point in a space of 2TB dimensions. Each particular point in this space corresponds to a signal in the band B and with duration T. The number of dimensions 2TB will, in general, be very high. A (0 to 5) kHz voice signal lasting for an hour would be represented by a point in a space with 2 x 5x10^3 x 60 x 60 = 3.6x10^7 dimensions. A (0 to 5) MHz television signal lasting for the same hour would be represented by a point in a space with 10^3 times more dimensions: 2 x 5x10^6 x 60 x 60 = 3.6x10^10. Shannon showed that the square of the distance from the origin to a point is 2B times the energy (into a unit resistance) of the corresponding signal. Similarly, the distance between two points is √(2BT) times the rms discrepancy between the two corresponding signals. If we consider only signals whose average power is less than P, these will correspond to points within a sphere of radius

 

r = √(2BTP)                                                                         (3)

 

This means that the noise added to the signal in transmission moves the point corresponding to the signal a certain distance in the space; the square of that distance is proportional to the noise power. Thus, noise produces a region of uncertainty about each point in the signal space. If this uncertainty region becomes too large (according to some criterion), then differentiation between different signals may become impossible.
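The dimension counts quoted above, and the sphere radius of Eq. (3), can be reproduced directly. A Python sketch, using the one-hour voice and television signals of the text (the 1-watt power in the last line is an illustrative value):

```python
import math

def dimensions(bandwidth_hz, duration_s):
    # number of samples, hence co-ordinates, needed to specify the signal: 2TB
    return 2 * bandwidth_hz * duration_s

def sphere_radius(bandwidth_hz, duration_s, power_w):
    # radius of the sphere containing all signals of average power below P, Eq. (3)
    return math.sqrt(2 * bandwidth_hz * duration_s * power_w)

hour = 3600
print(dimensions(5e3, hour))            # voice for an hour: 3.6e7 dimensions
print(dimensions(5e6, hour))            # TV for an hour:    3.6e10 dimensions
print(sphere_radius(5e3, hour, 1.0))    # radius for 1 W average power
```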

 

Communication Channel

The general class of communications systems considered by Shannon consists of an information source, an information destination (or information sink), a transmitter, a receiver, and the channel or transmission medium.

 

Information source

The information source selects one message from a set of possible messages to be transmitted. It may be a sequence of letters or numbers as in facsimile transmission, or a continuous function of time as in telephony, or a series of pulses as in multimedia digital communication. The message can be interpreted as a point in an abstract message space. The destination is the person or apparatus for which the message is intended.

 

Transmitter

The transmitter performs a mapping of the message space into the signal space. In telephony, this operation consists merely of changing sound pressure into a proportional electrical current. In other cases, it may involve more complex operations, such as sampling, quantization, compression, coding, spreading, or modulation. According to Shannon, the input to the transmitter is a message, that is, a point in the message space, and its output is a signal, i.e. a point in the abstract signal space. Whatever form of encoding or modulation is performed, the transmitter establishes a correspondence between the points in these two spaces. Every point in the message space must correspond to a point in the signal space, and no two messages can correspond to the same signal. If they did, there would be no way to determine at the receiver which of the two messages was intended. The geometrical name for such a correspondence is a mapping.

 

Transmission Medium

The transmission medium is used to transport the signal from the transmitting to the receiving point. With electromagnetic signals, it may be a pair of wires, a coaxial cable, a fibre-optic cable, etc. During transmission, the signal may be perturbed by noise or distortion, and the message recovered at the receiving terminal differs from the original message by an error. The aim of the system is to keep that error within limits acceptable according to some criterion. Distortion is a fixed operation applied to the signal, while noise involves statistical and unpredictable perturbations. Distortion can, in principle, be corrected by applying the inverse operation, while a perturbation due to noise cannot always be removed, since the signal does not always undergo the same change during transmission. Noise is thus the fundamental limiting factor in information transmission. Two specific noise classes are usually differentiated: multiplicative noise and additive noise, although in practice they are usually mixed. Fading is an example of multiplicative noise, and thermal noise is an example of additive noise. Although additive noise can be produced at any point of the communication channel, it is usually convenient to represent it by one equivalent noise source located at the input of the receiver.

 

Receiver

The receiver recovers the original message from the received signal. For that purpose it performs the inverse operation to that of the transmitter, that is, it maps the received signal space into the received message space. It may, however, involve additional operations in order to combat noise.

 

Noise

The noise considered by Shannon is called "white" because its spectrum contains a mixture of all frequencies, and its spectral density is constant over a very wide frequency range. One type of white noise is omnipresent: "thermal noise", or "Johnson noise", after John Bertrand Johnson (1887-1970), the Swedish-born American physicist who studied electrical fluctuations caused by heat. H. Nyquist found that a hot resistor, independently of its resistance (ohms), is a potential source of noise power of

 

N = kTB                                                                              (4)

 

where k is Boltzmann's constant, k = 1.38 x 10^-23 joule per kelvin, T is the temperature of the resistor in kelvins (the number of Celsius, or centigrade, degrees above absolute zero; absolute zero is -273 °C), and B is the bandwidth. Thermal noise constitutes a minimum noise which always exists and which we must accept. Its fundamental nature has led to its being used as a standard in measurements of the performance of radio receivers. Note that the thermal noise power is proportional to the bandwidth. Hence, we would expect less noise in a radio receiver that amplifies signals having a bandwidth of several thousand Hz than in a television receiver which amplifies signals having a bandwidth of several million Hz. Table 1 gives examples of the effective noise temperature of a few radio receivers [Pierce'80].

 

Table 1 Effective noise temperature and noise figure [Pierce'80]

Type of Receiver                               Equivalent Noise Temperature, kelvin (K)
Good FM radio or TV receiver                   ~1500
Parametric amplifier receiver                  ~50
Maser receiving station for space missions     ~20

 
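The kTB relation is easy to exercise numerically. The sketch below (plain Python, using the modern value of Boltzmann's constant, k = 1.38 x 10^-23 J/K) compares the thermal noise collected by a wide-band and a narrow-band receiver; the bandwidth values are illustrative, not taken from the text.

```python
import math

K_BOLTZMANN = 1.38e-23  # Boltzmann's constant, joule per kelvin

def thermal_noise_power(temp_kelvin, bandwidth_hz):
    """Available thermal noise power N = kTB, in watts (eq. 4)."""
    return K_BOLTZMANN * temp_kelvin * bandwidth_hz

def noise_power_dbm(temp_kelvin, bandwidth_hz):
    """The same noise power expressed in dBm (dB relative to 1 mW)."""
    return 10 * math.log10(thermal_noise_power(temp_kelvin, bandwidth_hz) / 1e-3)

# A TV channel (~6 MHz) gathers a thousand times more thermal noise
# than a narrow-band voice channel (~6 kHz) at the same noise temperature:
n_tv = thermal_noise_power(1500, 6e6)   # a good TV receiver, per Table 1
n_fm = thermal_noise_power(1500, 6e3)
```

The factor of 1000 in bandwidth translates directly into a 30 dB difference in noise floor, which is why narrow-band receivers can work with much weaker signals.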

 

Transmission process

When a message is transmitted through a channel, or medium, such as a wire or the atmosphere, it becomes susceptible to interference from many sources, which distorts and degrades the signals. Two of the major concerns of information theory are the reduction of noise-induced errors in communication systems and the efficient use of total channel capacity. Efficient transmission and storage of information require the reduction of the number of bits used for encoding. This is possible, for instance, when processing English texts because letters are far from being completely random. The probability is extremely high, for example, that the letter following the sequence of letters informatio is an n. This redundancy enables a person to understand messages in which vowels are missing, for example, or to decipher unclear handwriting. In modern communications systems, redundancy is controlled. On the one hand, it is removed to increase the transmission speed. On the other hand, artificial redundancy is added to the encoding of messages in order to reduce errors in message transmission.

 

Channel Capacity

If it is possible to distinguish reliably (according to some criterion) M different signal functions of duration T on a channel, we say that the channel can transmit log2M bits in time T. The rate of transmission is then (log2M) / T. Shannon defined the channel capacity as the limit

 

C = lim [(log2M) / T] as T → ∞.                                     (5)

 

Assume a linear communication channel without memory, limited by white additive noise. Shannon found that the maximum amount of information that can be transmitted with negligible error by that channel in unit time depends on two parameters only: the channel bandwidth (B) and the power ratio (q) of the wanted signal (S) to the noise (N) at the receiver input:

 

C* = B log2(1 + S / N) = B log2(1 + q)                                    (6)

 

q = S / N                                                                             (7)

 

Shannon did not indicate what encoding system must be applied to reach that theoretical limit. Numerous research works have since been done in source coding and channel coding - generally in signal processing - to reach Shannon's limit. The first telecommunication systems could hardly reach a few percent of the theoretical limit. Recent systems, however, approach within 0.1 dB of that limit, according to Rimoldi.

 

The Shannon equation indicates, for instance, that a channel of 1 MHz bandwidth that attains a signal-to-noise ratio of 1 has a theoretical capacity of 1'000'000 bits per second (bps). The same capacity, however, can be attained with other combinations of bandwidth and signal-to-noise ratio. Early radiocommunication workers were intrigued by the idea of cutting down the bandwidth required by increasing the signal power. In many cases, however, it is more practical to increase the bandwidth and work with weaker signals. Although practical systems can only approach Shannon's theoretical limit, the equation has been universally accepted as a reference.
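As a quick numerical illustration of this trade-off, the sketch below evaluates eq. (6) for the 1 MHz example above and for an alternative bandwidth/SNR combination (the second combination is an illustrative choice, not from the text):

```python
import math

def shannon_capacity(bandwidth_hz, snr):
    """Eq. (6): maximum error-free rate in bit/s; snr is a power ratio, not in dB."""
    return bandwidth_hz * math.log2(1 + snr)

# The example from the text: 1 MHz bandwidth, S/N = 1 -> 1'000'000 bit/s.
c1 = shannon_capacity(1e6, 1)

# The same capacity with half the bandwidth requires S/N = 3:
# more signal power squeezed into a narrower channel.
c2 = shannon_capacity(0.5e6, 3)
```

Halving the bandwidth forced the linear signal-to-noise ratio from 1 up to 3, a 4.8 dB increase in signal power for the same capacity.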

 

Broad Band Channel

At first glance, we might expect that if we make the bandwidth larger and larger while keeping the signal power constant, the capacity of the channel would grow without limit. That would indeed be possible if we could increase the bandwidth of the channel while keeping the noise power constant. However, with increasing bandwidth the noise power increases too, and the channel capacity cannot increase to infinity. It is easy to find the limiting value by substituting (4) into (6):

 

C* = B log2[1 + S / N] = B log2[1 + S / (kTB)]                          (8)

 

Keeping the signal power S constant and increasing the bandwidth means decreasing the spectral power density, or signal power per Hz. With constant signal power, if we make B very small, C* becomes very small. When we increase the bandwidth, we arrive at a point where S / (kTB) becomes very small compared with unity. Then loge(1+x) ≈ x, or log2(1+x) ≈ 1.44x, and the above equation reduces to

 

C* = 1.44 S / (kT)                                                                (9)

 

It can be rewritten as

 

S = 0.693 kTC*                                                                   (10)

 

This relation implies that, even when we use a very wide bandwidth, we need at least a power of 0.693 kT watt (joule per second) at the receiver input to transmit one bit per second, so that on average we must spend an energy of 0.693 kT joule for each bit of information we transmit. We should remember, however, that this relation holds only for an ideal channel and ideal encoding. In telecommunication systems based on the Shannon model it is possible only to approach this limit, and practical systems require more energy per bit than the formula suggests.
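The saturation of eq. (8) at the limit (9) can be checked numerically. In this sketch the signal power and temperature are arbitrary illustrative values:

```python
import math

k = 1.38e-23  # Boltzmann's constant, J/K

def capacity(signal_w, temp_k, bandwidth_hz):
    """Eq. (8): C* = B log2(1 + S/(kTB)), in bit/s."""
    return bandwidth_hz * math.log2(1 + signal_w / (k * temp_k * bandwidth_hz))

S, T = 1e-15, 290                   # 1 fW at the receiver input, room temperature
limit = S / (math.log(2) * k * T)   # eq. (9): C* -> 1.44 S/(kT) as B grows

# Capacity grows with bandwidth but saturates at the limit:
caps = [capacity(S, T, b) for b in (1e3, 1e6, 1e9, 1e12)]
```

Widening the band from 1 GHz to 1 THz buys almost nothing: the capacity is already within a fraction of a percent of the 1.44 S/(kT) ceiling.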

 

Radio Channel Specifics

A communication channel using radio waves differs from channels that use wires or cables. Radio waves are electromagnetic waves of frequencies lower than 3000 GHz propagating in space without artificial guide, according to the definition agreed within the International Telecommunication Union [RR'98]. 3000 GHz, or 3 x 10^12 Hz, corresponds to a wavelength of 0.1 mm. The communication range depends on the power, wavelength, and environment. At one extreme, it may reach interplanetary distances, as in the case of the solar-system exploration satellites, for which radio waves are the only way to transmit information. At the other extreme, it may be only a few metres or so, as in "Bluetooth" systems. These short-range systems are intended to replace interface cables between computers, phones, Personal Digital Assistants, "intelligent" household devices, etc. Bluetooth hardware is a single chip, about the size of a postage stamp and a few grams in weight. It radiates about 1 mW of power in the unlicensed ISM band at about 2.4 GHz.

 

Electromagnetic waves are produced by any element of the communication channel that carries alternating electrical current, and they propagate in space without limits. If they are produced unintentionally, the radiation efficiency is usually low, the radiated waves are of low intensity, and they propagate in an uncontrolled manner. For these reasons unintended waves are not suitable for regular communication purposes. They can, however, be intercepted, which is called "communication intelligence". Communication intelligence by individuals is illegal in most countries, although it is exploited on behalf of the state by special military (spy) units and, in the pursuit of criminal offences, by the police when so authorized by a court. Government-sponsored communication intelligence involving covert interception of foreign communications has become a large-scale industrial activity, providing intelligence on military, diplomatic, economic, and scientific developments. Communications are intercepted from satellites, from ground communications using satellites, and from undersea cables using submarines. In excess of 120 spy-satellite systems are currently in operation collecting intelligence, according to Wik.

 

To achieve high effectiveness, radio waves are radiated and received by specially designed antennas, in contrast to unintended radiation, which is produced by incidental antennas. An antenna is a structure that transforms electrical signals delivered to its input in the form of a time series into time-varying, space-dependent, controlled electromagnetic fields (radio waves), or vice versa. On transmission, an antenna accepts energy from a transmission line and radiates it into space. On reception, an antenna gathers energy from an incident wave and sends it down a transmission line.

 

Unintended Radio Channels

The main task of a transmitting station is to produce radio waves of the required intensity within defined time periods and frequency bands, and to radiate them in a defined direction. In practice, however, it is physically impossible to emit a signal strictly within these assigned intervals without any spill-over. In addition to the intended bands, times, polarizations, and directions, actual transmitting stations also radiate unintended (unwanted, undesired, spurious) waves that lie outside their designed frequency, time, and space domains. Unintended waves are not necessary to transmit the regular signals and should be eliminated, as they can disturb other applications of radio. Since the elimination of unintended radiations involves extra cost and effort, ideal equipment completely free of any unintended radiation does not exist in practice. Every electrical appliance generates some residual radio waves. In radiocommunications, one distinguishes between the energy emitted within the necessary bandwidth, out-of-band emissions, and spurious emissions. The necessary bandwidth is the frequency band just sufficient to ensure the transmission of information at the rate and with the quality required under the specific application conditions. Spurious emissions are emissions on frequencies outside the necessary bandwidth whose level may be reduced without affecting the corresponding transmission of information; they include harmonic emissions, parasitic emissions, intermodulation products, and frequency-conversion products, but exclude out-of-band emissions. Out-of-band emissions are unwanted emissions on frequencies immediately outside the necessary bandwidth, excluding spurious emissions. Usually they result from the modulation process and insufficient filtering. [RR'98]

 

Similarly, the receiving station's main task is to receive signals that are contained within precisely defined portions of frequency band and time, arriving from a specific direction and with a specific polarization. Unfortunately, physically realizable receiving stations also respond to signals from outside these intended intervals of time, frequency, etc. Noise and unintended signals constitute the fundamental limiting factor in radiocommunications. Unintended signals at the receiver's input may even completely disrupt communications and the operation of electronic systems; some real-life examples are given in the Annex.

 

Spectrum Management

EM Interactions

Electromagnetic Interactions and Electromagnetic Compatibility

In the fifties of the last century, the concepts of electromagnetic environment and electromagnetic compatibility were developed to deal with communication problems in real-life conditions. Later the concept was extended to embrace all electrical and electronic systems. Electromagnetic compatibility (EMC) is the ability of systems to operate in their electromagnetic environment without suffering unacceptable degradation from it and without causing unacceptable interference to it.

 

 

These two aspects define conditions for controlling the ability of equipment to produce unintended radiations and for controlling the vulnerability (or immunity) of equipment to unintended signals from the environment. Note that any system can be disturbed if subjected to emissions exceeding its immunity. It happens

 

 

Usually, disturbances take place when a system operates in an environment that differs from the one assumed at the system's design and implementation. For the purpose of this lecture, we will deal separately with weak, or soft, noise-like interference and strong, or hard, interference, also called harmful interference.

 

Weak Interactions

Shannon developed his theory for an isolated communication channel. We know, however, that no radio link can be isolated from other links operating in its environment. Owing to basic laws of physics, radio waves in open space cannot be confined to any specific volume. Consequently, when two or more radio links operate at the same time and frequency, a part of the power transported by radio waves penetrates from the transmitting end of one link to the receiving end of another (victim) link. Such "environmental" power is unwanted, as it conveys no useful information and adds to the link's noise power. In correctly designed and operated networks, such environmental effects are weak, like noise. The Shannon formula disregards that additional noise. When applied to a radio communication link in a congested environment, it gives too "optimistic" results and needs to be modified. [Struzak'99]

 

Coupling with Environment

To discuss radio-frequency spectrum congestion issues in an unambiguous way, we need an objective, quantitative measure. For that purpose, we introduce the "isolation index" (a) of a radio link. The index shows the relation between the link noise component (N), the environmental noise component (I), and the total noise (N + I) at the receiver end of the victim radio link under consideration:

 

a = N / (N + I)                                                                      (11)

 

Its numerical value is confined between zero and 1. When the environmental component (I) is much smaller than the link component (N), the isolation index approaches 1. When the two components equal each other, it is 1/2. [Struzak'99]

 

Modified Shannon's Limit

In radiocommunication, where we focus on efficient utilization of the frequency spectrum, it is often more practical to deal with the channel capacity per Hz (C) than with the total channel capacity (C*). With this in mind, the Shannon equation can be rewritten as

 

C0 = log2(1 + S / N) = log2(1 + q)                                          (12)

 

Here, C0 is the maximum number of binary digits (bits) per second per hertz that can be transmitted by an isolated communication channel, and S and N are the signal and noise powers, respectively. The equation says, for instance, that a channel in which the signal equals the noise (q = 1) is theoretically capable of transmitting a maximum of 1 bit of data per second per hertz.

 

As mentioned, the Shannon formula has to be modified to better represent radio communication systems in a congested environment. For that purpose, it is sufficient to replace the link noise (N) by the total noise (N + I), as the link capacity is limited by the total noise:

 

C = log2[1 + S / (N + I)] = log2(1 + aq)                                  (13) 

 

Here "a" is the isolation index of the radio link under consideration and "q" is the signal-to-noise power ratio of the link when operating in isolation (q = S / N). This modified formula indicates, for instance, that a channel with q = 1 (as in the previous example) and with a = 0.1 is theoretically capable of transmitting only about 0.14 bit/s/Hz. This contrasts sharply with the potential capacity of the isolated link, 1 bit/s/Hz, about seven times more.

 

Capacity Loss

Equation (13) conveys a straightforward message. For a hypothetical isolated link (a = 1), the link capacity C approaches the Shannon limit C0. As the isolation index of a link in a congested environment is always less than one, the link capacity is always smaller than indicated by the Shannon formula, C < C0. The relative link capacity (C/C0) and the capacity loss due to environmental noise are

 

C/C0 = [log2(1 + aq)] / [log2(1 + q)]                                       (14)

 

Relative Loss = [(C0 - C) / C0] = 1 - C/C0                                         (15)

 

Absolute Loss = (C0 - C) = log2[(1 + q) / (1 + aq)]                  (16)

 

Note that the capacity loss depends on two parameters only: the isolation index of the link (a) and its signal-to-noise ratio (q) when operating in isolation.
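The modified limit and the loss formulas are straightforward to compute. The sketch below reproduces the q = 1, a = 0.1 example given earlier:

```python
import math

def link_capacity(a, q):
    """Eq. (13): capacity in bit/s/Hz for isolation index a and isolated SNR q."""
    return math.log2(1 + a * q)

def relative_loss(a, q):
    """Eq. (15): fraction of the isolated capacity lost to environmental noise."""
    c0 = math.log2(1 + q)      # eq. (12), isolated link
    return (c0 - link_capacity(a, q)) / c0

# q = 1, a = 0.1: about 0.14 bit/s/Hz instead of the isolated 1 bit/s/Hz,
# i.e. roughly 86% of the potential capacity is lost to the environment.
c = link_capacity(0.1, 1.0)
loss = relative_loss(0.1, 1.0)
```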

 

Simple Environment

To analyse the potential impact of the environment in more detail, we consider the simplest case of a radio link whose environment consists of only one other radio link. For further simplicity, we assume that both links operate at the same time, use the same frequency, and that their spatial deployment does not change with time. We also assume omnidirectional antennas and a (1/dn)-type propagation model, and we disregard shadowing and other radio-propagation effects. The power of the wanted signal (S) at the receiver input is

 

S = PW / (DWR)n.                                                                  (17)

 

Here, "PW" is the power radiated by the wanted transmitter of the link, "DWR" is the span of the link, and "n" is the propagation index. Similarly, the environmental-noise power is

 

I = PU / (DUR)n.                                                                     (18)

 

Here, "PU" is the power radiated by the unwanted radiator, and "DUR" is the distance from the victim receiver to the unwanted radiator. Moreover, as the signal-to-noise ratio of the isolated link is q, its noise power is

 

N = S / q.                                                                            (19)

 

The isolation index of the link is therefore

 

a = 1 / (1 + I/N)  = 1 / [1 + q (PU / PW) (DWR / DUR)n],              (20)

 

Note the complementary role of the power ratio (PU / PW) and the distance ratio (DWR / DUR): one can compensate for the other. This relation indicates the means that can be used in the design process to limit the negative effects of spectrum congestion. We already noticed that the link capacity approaches its potential maximum (C0) when the isolation index tends to its maximum value. That, in turn, takes place when the following variables are kept as small as possible:

 

The distance ratio (DWR / DUR)                                               (21)

The power ratio (PU / PW)                                                     (22)

 

There are also other possibilities that go beyond the simplified assumptions made in this section.
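The compensation between the power ratio and the distance ratio in eq. (20) can be illustrated numerically; the parameter values below are arbitrary:

```python
def isolation_index(q, p_ratio, d_ratio, n):
    """Eq. (20): q = isolated SNR, p_ratio = PU/PW, d_ratio = DWR/DUR, n = propagation index."""
    return 1.0 / (1.0 + q * p_ratio * d_ratio ** n)

# With n = 2, quadrupling the unwanted power (PU/PW: 1 -> 4) is exactly
# offset by halving the distance ratio (DWR/DUR: 0.2 -> 0.1):
a1 = isolation_index(q=10, p_ratio=1.0, d_ratio=0.2, n=2)
a2 = isolation_index(q=10, p_ratio=4.0, d_ratio=0.1, n=2)
```

Both cases yield the same isolation index, confirming that the two ratios enter eq. (20) only through their product.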

 

Directive antennas

When the two links discussed above have directive antennas, the signal and interference are modified by the antenna gain:

 

S = PWGRWGWR / (DWR)n.                                                                    (23)

 

I = PUGRUGUR / (DUR)n.                                                          (24)

 

Here, "GRW" and "GRU" are the receiving antenna gains in the directions of the wanted (W) and unwanted (U) transmitters, and "GWR" and "GUR" are the transmitting antenna gains of the wanted and unwanted transmitters in the direction of the victim receiver (R). The other variables are as previously. The isolation index of the link is therefore

 

a = 1 / (1 + I/N)  = 1 / [1 + q (PU / PW) (DWR / DUR)n Rat],        (25)

 

where "Rat" is the ratio of the directive antenna gains:

 

Rat = (GUR / GWR) (GRU / GRW).                                             (26)

 

Note the complementary role of the transmitting and receiving antennas: one can compensate for the other. We now see additional means that can be used in the design process to limit the congestion effects. The link capacity approaches its potential maximum (C0) when the following variables, related to the antennas' directivities, are kept as small as possible:

 

The radiating antennas gain ratio (GUR / GWR)                        (27)

The receiving antenna gain ratio (GRU / GRW).                         (28)

 

In addition, there are other possibilities (e.g. frequency division, time division, or code division among the radio links).

 

Complex Environment

In practice, there may be a number of mutually interacting radio links (not necessarily belonging to any common network). Each link may contribute to the degradation of the performance of its neighbours by increasing their environmental noise components. If Iij denotes the noise component of the i-th radio link due to radiation from the j-th radio link, and K denotes the total number of links, then the result of the individual components adding together at the i-th link is

 

Ii = Σj Iij, where i, j = 1, 2, …, K, and i ≠ j.                                          (29)

 

The isolation index and capacity of the i-th link are

 

ai =1 / [1 + Ii / Ni]                                                                 (30)

 

Ci = log2(1 + aiqi)                                                                 (31)

 

Identical contributions

Let us assume a hypothetical case in which each of the K-1 noise contributions to the i-th link is identical, equal to I. Then the resultant environmental noise is Ii = I (K-1), and the isolation index and capacity of link i amount to

 

ai = 1/ [1 + (K-1)I / Ni]                                                          (32)

 

Ci = log2{1 + qi / [1 + (K-1)I / Ni] }                                         (33)
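Equation (33) makes it easy to see how a link's capacity shrinks as identical interferers are added. A small sketch with illustrative numbers:

```python
import math

def capacity_identical(q, i_over_n, K):
    """Eq. (33): capacity of one link among K, each of the other K-1 links
    injecting the same relative interference power i_over_n = I/Ni."""
    return math.log2(1 + q / (1 + (K - 1) * i_over_n))

# q = 15, I/N = 1: alone (K = 1) the link reaches log2(16) = 4 bit/s/Hz;
# every additional link cuts the capacity further.
caps = [capacity_identical(15, 1.0, K) for K in (1, 2, 5, 10)]
```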

 

General Case

When each individual environmental noise component has a different value, the total capacity of all the interacting links together amounts to

 

CΣ = C1 + C2 + … + CK = log2{Πi(1 + aiqi)}, i = 1, 2, …, K       (34)

 

Without mutual interactions among the links, the total capacity is

 

C0Σ = log2{Πi(1 + qi)}, i = 1, 2, …, K.                                     (35)

 

The mutual interactions among the links decrease the transmission capacity of all links:

 

Relative Loss = 1 - CΣ / C0Σ                                                  (36)

 

Note that the capacity loss depends on a number of variables, such as the operating frequency, spatial deployment, antenna directive patterns, signal-processing gain, radio-wave propagation effects, etc. In addition, out-of-band radiations, spurious emissions, and out-of-band and spurious receiver responses have to be taken into account, and a statistical approach may be required.
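Equations (29)-(35) can be exercised with a small interference matrix. In this sketch the noise and interference powers are illustrative values:

```python
import math

def total_capacity(N, q, I):
    """Eqs. (29)-(34). N[i]: own noise of link i; q[i]: isolated SNR of link i;
    I[i][j]: interference power injected by link j into link i."""
    total = 0.0
    for i in range(len(N)):
        Ii = sum(I[i][j] for j in range(len(N)) if j != i)  # eq. (29)
        ai = 1.0 / (1.0 + Ii / N[i])                        # eq. (30)
        total += math.log2(1 + ai * q[i])                   # eqs. (31), (34)
    return total

# Two symmetric links: own noise 1, isolated SNR 15, mutual interference 1.
N, q = [1.0, 1.0], [15.0, 15.0]
I = [[0.0, 1.0], [1.0, 0.0]]
c_sum = total_capacity(N, q, I)            # each link: a = 1/2, so log2(8.5)
c_iso = sum(math.log2(1 + x) for x in q)   # eq. (35): capacity without interactions
```

Mutual interference halves each link's isolation index and drags the total from 8 bit/s/Hz down to roughly 6.2, consistent with eq. (36).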

 

Strong Interactions

Strong Unintended Interference

As mentioned earlier, strong signals are capable of disturbing the operation of communication and other systems, and their effects can be disastrous. For instance, irradiation of an aircraft's automatic landing system by radio waves may lead to a crash landing. A number of real-life examples of unintended disturbances are given in the Annex.

Strong Intended Interference, EM War and EM Terrorism

The vulnerability of communication systems, computers, etc. may be exploited during war for military purposes, and by terrorists and criminal organizations. This is known as information warfare, and it includes attacks on the computer operations on which an enemy country's economic life or safety depends. Possible examples of information warfare include crashing air-traffic control systems, massively corrupting stock-exchange records, or blocking public telecommunication and power-control systems. High-altitude nuclear explosions, for instance, can disturb the ionosphere so as to disable short-wave ionospheric communications, and can induce destructive surges in electrical wiring, destroying wired communications and electrical power systems. Cases have also been noted in which criminals used high-power microwaves to block security and alarm systems in banks.

 

National Spectrum Management

On a national scale, it is the government's responsibility to prevent criminal acts and conflicts, including electromagnetic ones, and to resolve them when they arise, minimising the social costs involved. The government's obligations include ensuring uninterrupted communication among legal users of radio and smooth interworking with the public telecommunication networks, and promoting the development of radio services vital for society and those improving the standard of life. To coordinate the national use of radio waves, almost every country has a system known as the national spectrum management system. Three objectives shape any spectrum management system:

 

The system involves law, regulations, rules, practices, and standards. The law and regulations define the basic criteria, obligations, and responsibilities, in accordance with the country's legal practices and with international treaties. Although each sovereign state has the right to regulate independently the use of radio over its territory, national radiocommunication laws and regulations may not violate any binding international agreements. Commercial market considerations and globalisation trends also have to be taken into account here. A specialised governmental agency, often called the National Spectrum Manager, exists in most countries to implement and enforce the national radio regulations. Usually it is responsible not only for the national coordination of the use of radio frequencies but also for representing the government at the international level.[1] For instance, the acceptable interference level (probability) for a radiocommunication service is fixed there, after considering the arguments of all interested parties and taking into account international treaties and recommendations.

Use of Radio Frequencies

The use of technological developments depends not only on technical factors but also involves economic, social, political, and other considerations. This applies to radio as well. In all countries radiocommunications have been placed under strong government control. There are special reasons for government involvement:

 

 

Radio interference must be prevented, as much as practicable, before the radio device at hand is put into operation - for example, before the airplane's crash. To verify that requirement, electromagnetic compatibility (EMC) examinations are needed. Their aim is to determine whether the proposed radio system could cause unacceptable interference to, or could suffer such interference from, other systems that already exist or have already been planned. If any potential incompatibility is discovered, the system's technical or operational conditions must be changed to eliminate it.

 

The radio-frequency spectrum is finite, and the total number of radio devices that can operate without interfering with each other at the same time, within a given frequency band and a given geographical region, is limited. There is a continuing conflict between parties wanting access to the scarce resources of the frequency spectrum (and satellite orbits). Some regions and frequency bands are already crowded, and there is no space for newcomers. It is thus natural to require these resources to be exploited in a rational manner, for the maximum benefit of all those interested. However, radio waves do not stop at frontiers, and thus the use of radio inherently involves regional and international aspects. The local and global aspects of the use of radio are inseparable.

 

Specific frequency bands are allocated to specific services/applications in National Frequency Allocation Tables, National Frequency Plans, and/or National Frequency Assignment Rules. These have been produced to facilitate frequency management in the country and to avoid the need for detailed EMC examinations each time a new system or application is proposed. The national frequency allocation tables and plans may not violate the international ones or other relevant international treaties (discussed below); usually, they are subsets of the international allocations and plans. Any radio station must observe these national restrictions.

 

No frequency may be used without prior authorization by the National Spectrum Manager. Before an authorization (licence) can be issued, the National Spectrum Manager is obliged to verify that the proposed use complies with national regulations and international treaties, including electromagnetic compatibility (EMC) checking. These examinations are made following the binding laws and regulations. If national regulations are absent or incomplete, the relevant international regulations and agreements in force apply. If any potential incompatibility is discovered, the system's technical and/or operational parameters must be changed to eliminate all incompatibilities. This may also require additional coordination with the parties involved and with the spectrum manager. If the potential interference involves system(s) from outside the country's territory, bilateral or multilateral international coordination takes place.

 

An authorization assigns a specific frequency band for a specific use and defines the technical and operational conditions to be observed. A frequency band may be assigned to an individual station or to a network of several stations, usually for a limited period of time. The authorization may be limited to a specific place or defined area, or may be valid for the whole territory of the country. A monitoring and enforcement unit of the national spectrum manager verifies whether or not all the conditions imposed in the licence are actually observed. If not, the licence may be withdrawn and a fine may be imposed.

 

Usually the demand for frequencies cannot be fully satisfied, and the “first come, first served” rule may be applied by the Spectrum Manager to choose between competing applications. In other cases the Manager makes assignments on the basis of a “beauty contest”. That contest, or selection on merit, involves an evaluation and ranking of the qualities of each applicant; some qualification standards are required for that purpose. In France, for example, TV licences were attributed according to a criterion of the “culturally highest bidder”. In the USA, they were given to the “best user” after a “comparative hearing” procedure, according to a criterion of “social benefit”. Another approach, used in the USA for the attribution of cellular radio licences, was based on a lottery procedure.

 

A fee may be collected for an authorization to use a frequency band. For many years the licences were, as a rule, offered free. This was because, at the beginning, radio was used mainly by military units, and the whole radiocommunication sector was under state monopoly. Only later did the development of civilian applications show that radio also means a lucrative business, attracting the private sector. Regulatory delays in licensing mobile services caused economic losses estimated at $50bn for the US alone, according to a study reported by Gruber [Gruber'01]. With the involvement of the private sector, authorisation fees have been collected, and there have been long discussions on how to determine their amount fairly. An auction mechanism was proposed to avoid the fairness problem and has been applied successfully in a few countries. The valuation of frequency-assignment licences has reached unprecedented levels: up-front licence fees can account for up to 50% of the initially estimated total investment costs. For instance, the UK government recently (2000) managed to raise a total of US$ 35bn in the spectrum auction for the next-generation mobile system known as UMTS, even though a number of technical questions remain open and the market is not certain. In fact, the mobile-phone industry has experienced bankruptcies in the US because excessively high licence fees were paid during the Personal Communication System auctions in 1995 [Gruber'01].

 

Frequency Coordination and Taboos

As radio waves propagate in free space without limits, many factors of a physical, technical, economic, social and even political nature must be given due consideration when planning a radio link in order to avoid mutual interference. Frequencies, powers, antenna radiation patterns, and the locations of the transmitting and receiving stations are among the major factors. Sometimes spectrum managers prevent interference between radio stations by using frequency-distance separation (F-D) or "taboo" rules during frequency assignment. The term "taboo" here refers to specific technical constraints that should not be violated. The rules specify the minimum geographic distance between two radio stations as a function of their frequency separation, and are derived from two essential requirements:

 

·     The wanted-signal power should be greater than a given threshold (to overcome background noise);

·     The protection ratio (wanted-signal to interfering-signal ratio) should be greater than a given threshold (to overcome interfering signals).
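As a minimal illustration, the two requirements above can be expressed as a simple feasibility check. All function and parameter names and the dB thresholds below are assumptions chosen for illustration, not values from these notes:

```python
# Sketch of the two planning requirements (illustrative thresholds only).

def link_acceptable(wanted_dbm, interference_dbm, noise_dbm,
                    min_signal_dbm=-100.0, protection_ratio_db=20.0):
    """Return True if both planning requirements are met.

    wanted_dbm        -- wanted-signal power at the receiver
    interference_dbm  -- strongest interfering-signal power
    noise_dbm         -- background noise floor
    """
    # Requirement 1: the wanted signal must exceed the noise threshold.
    above_noise = wanted_dbm >= min_signal_dbm and wanted_dbm > noise_dbm
    # Requirement 2: the wanted-to-interfering ratio must exceed the
    # protection ratio.
    protected = (wanted_dbm - interference_dbm) >= protection_ratio_db
    return above_noise and protected

print(link_acceptable(-80.0, -105.0, -120.0))  # True: both conditions met
print(link_acceptable(-80.0, -85.0, -120.0))   # False: only 5 dB margin
```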

 

In a channelized service, these rules have the form: co-channel transmitters must be separated by at least D(0) km; adjacent-channel transmitters by at least D(1) km; transmitters separated in frequency by two channels by at least D(2) km, and so forth. That is, D(k) is the minimum geographic distance by which two stations must be separated if their assigned frequencies are separated by k channels [Berry'83]. Each transmitter is to be considered twice: first as the wanted transmitter and then as the interfering one. In both cases the distance between the two must be no less than the sum of the coverage range of the wanted transmitter and the interference range of the other transmitter treated as interfering; otherwise the coverage area of one transmitter overlaps the interference area of the other. The minimum distance between them is the greater of the two values so determined.
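A frequency assignment tool can apply D(k) rules as a straightforward table lookup. The sketch below uses the UHF-TV values of Table 2; the function itself and the flat-earth distance computation are illustrative assumptions:

```python
import math

# Taboo distances D(k) in km for UHF-TV, taken from Table 2; channel
# separations absent from the table carry no distance requirement here.
TABOO_KM = {0: 250, 1: 89, 2: 32, 3: 32, 4: 32,
            5: 32, 7: 97, 8: 32, 14: 97, 15: 121}

def violates_taboo(ch_a, ch_b, pos_a, pos_b):
    """True if two stations are closer than D(k) allows for their
    channel separation k = |ch_a - ch_b|."""
    k = abs(ch_a - ch_b)
    required = TABOO_KM.get(k, 0.0)
    actual = math.dist(pos_a, pos_b)  # flat-earth distance, coords in km
    return actual < required

# Two cochannel stations 200 km apart violate the 250 km cochannel taboo:
print(violates_taboo(21, 21, (0.0, 0.0), (200.0, 0.0)))  # True
# Adjacent-channel stations 100 km apart satisfy D(1) = 89 km:
print(violates_taboo(21, 22, (0.0, 0.0), (100.0, 0.0)))  # False
```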

 

The derivation of F-D rules involves consideration of the transmitter powers, the heights and gain patterns of the antennas, the transmission loss for the paths of interest, and the protection ratio of the desired service, which depends in turn on the transmitter emission and on the receiver selectivity/immunity characteristics. The protection ratio is the signal-to-interference ratio required for normal operation of the service. A designer has some control over these characteristics. When ideal isotropic equipment operates in an ideal isotropic environment, symmetry considerations imply that, to cover a plane, transmitters should be located at the nodes of a regular triangular grid. Table 2 is an example of taboos for UHF-TV, which is a channelized service [Hale'81], and Table 3 is an example of taboos for an inter-service case, where the channel concept does not apply.
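To show the principle behind such a derivation, the following sketch assumes idealized free-space (inverse-square) propagation and identical equipment. Real F-D rules rest on measured propagation curves, so the figures here are only indicative:

```python
# Minimal sketch: cochannel separation implied by a protection ratio under
# free-space propagation with identical transmitters and antennas.

def cochannel_separation_km(coverage_km, protection_ratio_db):
    """Separation at which an identical interferer still leaves the
    protection ratio satisfied at the worst-case coverage edge."""
    # Free space: power falls as 1/d**2, so the wanted-to-interfering
    # ratio in dB is 20*log10(d_interferer / r_coverage).
    interference_range = coverage_km * 10 ** (protection_ratio_db / 20.0)
    # Worst case: the victim receiver sits at the coverage edge, on the
    # line towards the interferer, hence the two ranges add.
    return coverage_km + interference_range

# 30 km coverage radius and a 26 dB protection ratio (assumed figures):
print(round(cochannel_separation_km(30.0, 26.0)))  # 629
```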

 

The cochannel separation distance (Table 2) results from the type and quality of service desired and from uncontrollable factors such as propagation loss and its variability. The remaining taboos describe actual receivers' responses to signals arriving outside the channel to which the receiver is tuned. Each taboo is intended to prevent a particular interference mechanism from occurring and reflects certain economic trade-offs. Note that for an ideal receiver only the cochannel distance would need to be observed, as the other interference mechanisms do not exist in such a theoretical receiver. Although we could approximate such an ideal very closely, its cost would be prohibitive.

 

Table 2. UHF-TV Taboos

Channel Separation    Required Distance Separation, km
0 (cochannel)         250
1                      89
2                      32
3                      32
4                      32
5                      32
7                      97
8                      32
14                     97
15                    121

 

The minimum distance separation between a radionavigation station and a sound broadcasting transmitting station (Table 3) is the distance beyond which the aeronautical service is unlikely to be significantly affected. The distances are shown as a function of two parameters: (1) the frequency of the broadcasting station and (2) its effective radiated power.

 

Table 3. Minimum distance separation between a radionavigation station and a sound broadcasting transmitting station [Handb'94]

 

Effective Radiated Power        Frequency of broadcasting station (MHz)
of Broadcasting Station (kW)   100    102    104    105    106    107   107.9

                                  Minimum distance separation (km)

300                            125    210    400    500    500    500    500
100                             75    120    230    340    500    500    500
30                              40     65    125    190    310    500    500
10                              25     40     70    105    180    380    500
3                               20     20     40     60     95    210    500
1                               20     20     25     35     55    120    370
0.30                            20     20     20     20     30     65    200
0.10                            20     20     20     20     20     40    115
0.03                            20     20     20     20     20     20     65

 

The radionavigation station is tacitly assumed to operate on the lowest frequency of the band allocated to the radionavigation service. The probability of interference to the radionavigation system decreases with the distance between the two stations and increases with the power of the FM station. When the distance is greater than indicated in the table, the probability of interference is very low. Similar rules and tables can be created for other services and service combinations.
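For programmatic use, Table 3 can be recast as a conservative lookup that rounds both the frequency and the power up to the next tabulated value, so an off-grid query never returns a smaller separation than the table would. The data are those of Table 3; the function and its rounding policy are illustrative assumptions:

```python
import bisect

# Table 3 re-expressed for lookup. Rows: broadcasting-station ERP in kW;
# columns: broadcasting frequency in MHz; cells: minimum separation in km.
FREQS = [100, 102, 104, 105, 106, 107, 107.9]
ERPS = [0.03, 0.10, 0.30, 1, 3, 10, 30, 100, 300]
KM = {
    300:  [125, 210, 400, 500, 500, 500, 500],
    100:  [75, 120, 230, 340, 500, 500, 500],
    30:   [40, 65, 125, 190, 310, 500, 500],
    10:   [25, 40, 70, 105, 180, 380, 500],
    3:    [20, 20, 40, 60, 95, 210, 500],
    1:    [20, 20, 25, 35, 55, 120, 370],
    0.30: [20, 20, 20, 20, 30, 65, 200],
    0.10: [20, 20, 20, 20, 20, 40, 115],
    0.03: [20, 20, 20, 20, 20, 20, 65],
}

def min_separation_km(freq_mhz, erp_kw):
    """Conservative lookup: round frequency and ERP up to the next
    tabulated value before reading the cell."""
    f = FREQS[min(bisect.bisect_left(FREQS, freq_mhz), len(FREQS) - 1)]
    e = ERPS[min(bisect.bisect_left(ERPS, erp_kw), len(ERPS) - 1)]
    return KM[e][FREQS.index(f)]

# A 10 kW station on 105 MHz needs at least 105 km of separation:
print(min_separation_km(105, 10))  # 105
# An off-grid query (103 MHz, 5 kW) is treated as (104 MHz, 10 kW):
print(min_separation_km(103, 5))   # 70
```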

 

F-D rules were first used in the USA in the 1950s as a simple but efficient way to coordinate the use of available frequency channels, and they are still in use. For example, the US Federal Communications Commission applies such rules in the FM broadcasting service, and the Federal Aviation Administration uses them to prevent interference to aeronautical navigation systems. The F-D concept remains in use in some countries, especially in North and South America. It also served as a basis for planning during the Regional Administrative MF Broadcasting Conference (Region 2), Rio de Janeiro, 1981. The F-D rules assume ideal propagation conditions; they do not work in urban and mountainous environments, where local terrain-dependent propagation effects introduce strong asymmetry. At some frequencies (e.g. VHF/UHF) and in some applications, terrain shadowing is essential and more realistic propagation models (e.g. involving the propagation path profile) must be applied. As the actual terrain relief is irregular, the required distance separations are direction-dependent and cannot be represented by a simple matrix as in Table 3.

 

Regular Lattices

A systematic application of F-D rules to a radiocommunication network over a large and uniform territory led to a frequency planning method known as the "Regular Lattice Method", introduced in Europe some 40 years ago (Eden, 1986; EBU, 1988). The approach is based on the following assumptions:

 

·     all transmitters are identical, having the same power, omnidirectional antennas at the same height and use the same polarization,

·     all transmitters are situated on an infinitely extended plane area exactly at nodes of a boundless regular lattice, with all nodes occupied and with no other locations allowed (natural and administrative boundaries ignored),

·     radio wave propagation is isotropic, uniform throughout the whole area and does not depend on terrain or ionospheric reflections

·     one set of frequency channels is regularly reused throughout the whole lattice following a linear distribution scheme

·     other frequency assignments are ignored.
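The regular-lattice assumptions above lead to the classical reuse geometry of cellular planning (see, e.g., [Rappaport'96]): with N channel sets reused over a triangular lattice of identical transmitters (hexagonal cells of radius R), the cochannel reuse distance is D = R * sqrt(3N), and only cluster sizes of the form N = i^2 + i*j + j^2 (i, j integers) fit the lattice. A minimal sketch:

```python
import math

# Reuse geometry for a regular triangular lattice of identical transmitters.

def reuse_distance(cell_radius_km, n_channel_sets):
    """Cochannel reuse distance D = R * sqrt(3 * N)."""
    return cell_radius_km * math.sqrt(3 * n_channel_sets)

def valid_cluster_sizes(limit=21):
    """Cluster sizes N = i^2 + i*j + j^2 realizable on the lattice."""
    sizes = {i * i + i * j + j * j for i in range(5) for j in range(5)}
    return sorted(s for s in sizes if 0 < s <= limit)

print(valid_cluster_sizes())              # [1, 3, 4, 7, 9, 12, 13, 16, 19, 21]
print(round(reuse_distance(10.0, 7), 1))  # 45.8 km for R = 10 km, N = 7
```

The linear distribution scheme mentioned above fixes which of the N channel sets each lattice node receives, so that cochannel nodes always end up at least D apart.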

 

Such frequency planning based on geometrically regular lattices is in use, especially in Europe and Africa. The lattice planning method was applied, among others, at the European VHF/UHF Broadcasting Conference, Stockholm 1961 (for UHF television) and the African VHF/UHF Broadcasting Conference, Geneva, 1963 (for television and FM sound).

 

Simplifications inherent in the regular lattice method make it difficult to take into account terrain shadowing, actual station locations and antenna radiation patterns, non-continuous coverage areas, effects of other services sharing the same or adjacent frequency bands, etc. To deal with such difficulties, more elaborate methods and models, e.g. digital terrain models, are required. Many of these methods are based on a graph-theoretical approach that facilitates identification of "critical" elements. Figure 1 gives an example representing seven transmitters. For simplicity, the environment is disregarded and only the co-channel interference mechanism is assumed. The small circles represent the transmitters; the lines represent potential incompatibilities (interference) between them. The number of lines associated with a transmitter (the degree of the vertex) is the number of adjacent transmitters with which frequency coordination is required. As no adjacent transmitters may use the same frequency, it is also the number of frequencies denied to them. The degree of the vertex can therefore be used as a simple indicator of the difficulty of the frequency assignment process. The "critical" transmitter is the one associated with the greatest number of lines. In Figure 1, transmitter #6 is "critical": it requires coordination with 6 neighbours. Next come transmitter #3 with 5 neighbours, #1 with 4, #7 and #2 with 3 each, #4 with 2, and #5 with a single adjacent transmitter. Several planning algorithms are based on this approach.
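The degree-based reasoning above can be turned into a simple greedy assignment algorithm: treat the most-constrained ("critical") transmitters first and give each the lowest channel not already used by an assigned neighbour. The edge set below is an assumption chosen only to be consistent with the degrees quoted for Figure 1; the actual figure is not reproduced here:

```python
# Assumed incompatibility graph: #6 touches all others, #3 has 5
# neighbours, #1 has 4, #7 and #2 have 3, #4 has 2, #5 has 1.
EDGES = [(6, 1), (6, 2), (6, 3), (6, 4), (6, 5), (6, 7),
         (3, 1), (3, 2), (3, 4), (3, 7), (1, 2), (1, 7)]

def neighbours(edges):
    """Build the adjacency map of the incompatibility graph."""
    adj = {}
    for a, b in edges:
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)
    return adj

def greedy_assign(edges):
    """Assign each transmitter the lowest channel not used by any
    already-assigned neighbour, most-constrained transmitters first."""
    adj = neighbours(edges)
    order = sorted(adj, key=lambda v: len(adj[v]), reverse=True)
    channel = {}
    for v in order:
        used = {channel[n] for n in adj[v] if n in channel}
        channel[v] = min(c for c in range(len(adj) + 1) if c not in used)
    return channel

adj = neighbours(EDGES)
print({v: len(adj[v]) for v in sorted(adj)})
# {1: 4, 2: 3, 3: 5, 4: 2, 5: 1, 6: 6, 7: 3}
print(max(greedy_assign(EDGES).values()) + 1)  # 4 channels suffice
```

Transmitters #1, #2, #3 and #6 are mutually incompatible in this edge set, so no assignment can use fewer than four channels; the greedy pass achieves that bound here.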

 

Use of Radio Devices

Not only the use of radio frequencies but also the use of radio devices is regulated in most countries: no radio equipment may be used without prior authorization by the national spectrum manager. The reason is that each radio device can operate correctly only under specified environmental conditions, outside which it may create, or suffer, unacceptable interference. These requirements are defined in radio regulations and standards aimed at ensuring the required safety and quality of services, protecting the environment, etc. Often, however, their true purpose is to protect the local market. The authorization process may involve anything from a simple identification of the equipment type, to submission of the manufacturer's declarations or test results, to full-scale tests and measurements made by an independent laboratory. Usually, a fee is collected separately for testing and for issuing the authorization certificate. The certificate may be issued for a single apparatus for use under specific conditions, for a set of equipment to be operated over the whole territory of the country, etc.

 

International Spectrum Management

ITU

Almost from the beginning of radio applications, a mechanism has been needed to ensure that radio would be used for the maximum benefit of all nations. That mechanism has been the International Telecommunication Union, created over a century ago (in 1865), just after the development of the electromagnetic telegraph. The Union has been heavily involved in all kinds of telecommunications, including radio. This involvement has consisted primarily of establishing the binding rules and regulations necessary for the functioning of the international telecommunication system, setting standards to permit the integration of new technologies into the system, disseminating information necessary for the smooth functioning of the system, and providing assistance to developing countries in creating workable domestic telecommunication systems. All these functions meet a genuine need of the international community.

 

The decision-making functions of ITU are performed by Member States during conferences, assemblies, meetings of study groups, and sessions of the Council. The machinery is fairly transparent and simple: delegates from member countries meet to formulate plans of common activities, and to set new rules, regulations, standards and recommendations, or to revise the existing ones as the need has arisen. Members of voluntary study groups composed of experts from various countries carry out the necessary studies between the meetings. The secretariat provides the necessary infrastructure. During its existence, ITU has undergone many reorganisations, in response to the changing needs of its members. Now, ITU is a Specialized Agency of the United Nations within which governments and the private sector coordinate global telecommunication networks and services.

 

Plenipotentiary Conference

The supreme authority of the Union is the Plenipotentiary Conference, a meeting of official delegations from the Union's Member States. It is held every four years to adopt the underlying policies of the organization and to agree upon its structure and activities. Plenipotentiary conferences determine the direction of the Union and its activities, and make decisions relating to the structure of the organization via a treaty called the Constitution and Convention of the International Telecommunication Union. The conference, among other things, elects the Member States which serve on the Council, and elects the Secretary-General, the Deputy Secretary-General, the Directors of the three Bureaux of the ITU Sectors, and the twelve members of the part-time Radio Regulations Board that replaced the previous International Frequency Registration Board. These seventeen persons are the only elected individuals in ITU. The most recent (15th) Plenipotentiary Conference was held in Minneapolis, United States, from 12 October to 6 November 1998. The next conference will be held in Marrakesh, Morocco, in 2002.

 

ITU-R

The Radiocommunication Sector (ITU-R) is the part of ITU that deals with radio. One of its principal tasks is facilitating the complex inter-governmental negotiations needed to develop legally binding agreements between sovereign states. These agreements are embodied in the Radio Regulations and in regional plans adopted for broadcasting and mobile services. The Radiocommunication Sector embraces Radiocommunication Conferences, Radiocommunication Assemblies and Study Groups, the Advisory Group, the Radio Regulations Board, and the Radiocommunication Bureau.

 

Radiocommunication Conference

World radiocommunication conferences (WRC) are international treaty-making conferences held under the auspices of ITU's Radiocommunication Sector. They are held every two to three years to review, and, if necessary, revise the Radio Regulations, the international treaty governing the use of the radiofrequency spectrum and the geostationary-satellite and non-geostationary-satellite orbits. Revisions are made on the basis of an agenda determined by the ITU Council, which takes into account recommendations made by previous world radiocommunication conferences. The general scope of the agenda of world radiocommunication conferences is established four to six years in advance, with the final agenda set by the ITU Council two years before the conference, with the concurrence of a majority of Member States. A WRC agenda may be altered at the express request of at least one quarter of the Union's Member States, subject to the approval of the Council. It may also be changed at the request of the Council itself. In all cases, any change to an agenda must be accepted by a majority of Member States.

 

A world radiocommunication conference can:

·       revise the Radio Regulations and any associated frequency assignment and allotment plans

·       address any radiocommunication matter of worldwide character

·       give instructions to the Radio Regulations Board and the Radiocommunication Bureau, and review their activities

·       determine Questions for study by the Radiocommunication Assembly and the Sector's study groups in preparation for future radiocommunication conferences

·       consider any question deemed necessary by a plenipotentiary conference.

 

In addition to world radiocommunication conferences, an ITU region or a group of countries may hold a regional radiocommunication conference with a mandate to develop agreements concerning a particular radiocommunication service or frequency band. Such conferences cannot, however, modify the Radio Regulations unless approved by a WRC, and the Final Acts of the conference are binding only on those countries that are party to the agreement. The most recent World Radiocommunication Conference was held in Istanbul (Turkey) in 2000.

 

Radiocommunication Assembly and Study Groups

Radiocommunication assemblies (RA) are responsible for approving the programme of work for the ITU-R study groups, organising preparatory studies for radiocommunication conferences, and for the approval of recommendations covering various technical and operational aspects of radiocommunications. The assemblies are normally held in association, in terms of timing and location, with the radiocommunication conferences.

 

More than 1500 specialists from telecommunication organizations and administrations around the world participate in the work of the Radiocommunication Sector's eight study groups.  These groups:

 

·       develop ITU-R Recommendations on the technical characteristics of, and operational procedures for, radiocommunication services and systems

·       draft the technical bases for radiocommunication conferences

·       compile handbooks on spectrum management and emerging radiocommunication services and systems.

 

Although the ITU-R Recommendations are legally non-binding (unless referred to in the Radio Regulations), they are used as de-facto standards on which national spectrum managers rely.

 

Regulatory and procedural matters are dealt with by a special committee, and studies of mutual interest to the Radiocommunication and Telecommunication Standardization Sectors are coordinated by inter-sector coordination groups (ICG). Conference Preparatory Meetings (CPM) prepare a consolidated report on the technical, operational and regulatory/procedural bases for a WRC. The CPM consolidates the output from the study groups and the special committee, together with any new material submitted to it by the ITU members.

 

Radio Regulations

To manage the radio frequency spectrum, the Sector has produced the Radio Regulations, Rules of Procedure, and Recommendations, which are reviewed from time to time. ITU-R prepares the technical groundwork, which enables radiocommunication conferences to make sound decisions, developing regulatory procedures and examining technical issues, planning parameters and sharing criteria with other services in order to limit the risk of harmful interference to a value acceptable by all those interested. The portion of the radiofrequency spectrum suitable for communications is divided into 'blocks', or 'frequency bands' the size of which varies according to individual services. These blocks are allocated to services on an exclusive or shared basis. The full list of services and frequency bands allocated in different regions forms the Table of Frequency Allocations, which is a substantial part of the Radio Regulations. It is illegal to violate the international Table of Frequency Allocations. Changes to the Table, and also to other parts of the Radio Regulations, can only be made by a world radiocommunication conference. Alterations are made on the basis of negotiations between national delegations, which work to reconcile demands for greater capacity with the need to protect existing services. If a country or group of countries wishes a frequency band to be used for a purpose other than the one listed in the Table of Frequency Allocations, changes may be made provided a consensus is obtained from other Member States. In such a case, the change may be indicated by a footnote, or authorized by the application of a Radio Regulations procedure under which the parties concerned must formally seek the agreement of any other nations affected by the change before any new use of the band can begin.

 

In addition to managing the Table of Frequency Allocations, world radiocommunication conferences may also adopt assignment plans or allotment plans for services where transmission and reception are not necessarily restricted to a particular country or territory. In the case of assignment plans, frequencies are assigned on the basis of requirements expressed by each country for each station within a given service, while in the case of allotment plans, each country is allotted frequencies to be used by a given service, which the national authorities then assign to the relevant stations within that service.

 

Concluding remarks

We have reviewed some basic issues relevant to the use of radio for communication purposes. Radio is a remarkable achievement in the never-ending process of human development, extending our natural communication capabilities, and digital technology is an important step in that process. Spectrum congestion degrades the transmission capacity of radiocommunication systems, and improper use of radio might cause irreparable harm: for instance, in scientific observations of natural phenomena that cannot be repeated (such as radio astronomical observations), substantial information contained in the received natural signal can be irrevocably lost. The examples of how strong interference might seriously disturb the operation of various systems demonstrate the need for a rational approach to the use of radio waves. We reviewed possible abuses of radio, too. We described the spectrum management system and discussed some ways and means to prevent radio interference. Above all these technicalities, however, we should always keep in mind that the principal idea behind the use of radio is to bring people closer together.

 

References

[Berry'83] Berry, Leslie: The Spectrum Cost of Frequency-Distance Separation Rules; IEEE'83 EMC Symposium Record, p. 75-78

[Codding'84] Codding, George: International Constraints on the Use of Telecommunications: The Role of the International Telecommunication Union; in [Lewin'84]

[Delogne'99] Delogne, P., Baan, W.: Spectrum Congestion; in Modern Radio Science 1999, ed. by M. A. Stuchly, Oxford University Press 1999, p. 309-327

[Handb'94] ITU Handbook on National Spectrum Management, p. 63

[Hale'81] Hale, William: New Spectrum Management Tools; IEEE'81 EMC Symposium Record, p. 47-53

[ITU'98] World Telecommunication Development Report, ITU 1998

[Johnson'61] Antenna Engineering Handbook; McGraw Hill 1961

[Kaku '98] Michio Kaku: Visions; Oxford University Press 1998

[Lewin'84] Lewin L: Telecommunications: An Interdisciplinary Text; Artech House, 1984, ISBN 0-89006-140-8

[NASA '95] Electronic Systems Failures and Anomalies Attributed to Electromagnetic Interference; NASA Reference Publication 1374, July 1995

[Pierce'80] Pierce J R: An Introduction to Information Theory; Dover Publ. 1998, ISBN 0-486-24061-4

[Rappaport'96] Rappaport T S: Wireless Communications; Prentice Hall, 1996, ISBN 0-13-375536-3

[Rimoldi'99] Rimoldi B: The Mobile Radio Interface; Presentation at Journee de la Recherche EPFL 1999

[Rotkiewicz'82] Rotkiewicz W: Electromagnetic Compatibility in Radio Engineering, Elsevier Scientific Publishing Co. 1982, ISBN 0-444-99722-9

[RR'98] Radio Regulations; International Telecommunication Union, 1998

[Shannon'49] Shannon C: Communication in the Presence of Noise; Proceedings of the IRE, January 1949, p. 10-21

[Struzak'99] Struzak R: Spectrum Congestion - a Voice in Discussion; The Radio Science Bulletin No. 291, December 1999, p. 6-7

[Wik'00] Wik M: Revolution in Information Affairs; Global Communications 2000

www.radio.gov.uk: Mapping the Future…; Convergence and Spectrum Management

-------------------------------------------

 

 

 

 

Figure 1. Graph representing a network of 7 transmitters

 

 


Annex. Examples of Unintended Effects of Radio Interference

B-52 Stability Case

When electronic flight-control systems were first added to the B-52 bomber autopilot system, use of the HF radio on board resulted in the uncommanded activation of all rear flight control surfaces. The cause was found to be spurious HF radio signals induced in the wiring system. (Source: NASA Reference Publication 1374, July 1995)

 

Unintended Missile Launch

During a B-52 missile interface unit test, an uncommanded missile launch signal took place. One of the contributing factors was crosstalk in the systems wiring.  The outcome was a yearlong redesign and test effort. (Source: NASA Reference Publication 1374, July 1995)

 

Pioneer Crash

Pioneer is the name of a remotely piloted vehicle (RPV) using a portable remote control box. During its flight tests performed by the US Navy in January 1987 aboard the U.S.S. Iowa, the pilot experienced a series of uncommanded manoeuvres that caused loss of control and a crash landing. Subsequent investigation found that the remote control boxes received false signals from HF communication transmitting antennas located aboard the Iowa due to inadequate shielding and cable termination. (Source: NASA Reference Publication 1374, July 1995)

 

Sheffield Catastrophe

The British ship Sheffield carried the most sophisticated antimissile defence system available. Despite that, during the Falklands War (1982) it was hit by an Exocet missile and sank with heavy casualties. This was possible because the antimissile system created electromagnetic interference to the radio communication system of the Harrier jet contingent assigned to the ship. While the Harriers took off and landed, the missile defence was disengaged to allow communications with the jets. This provided a window of opportunity for the Exocet missile. (Source: NASA Reference Publication 1374, July 1995)

 

Blackhawk Crashes

An Army Sikorsky UH-60 Blackhawk helicopter, while flying past a radio broadcast tower in West Germany in 1987, experienced an uncommanded stabilator movement. Spurious warning light indications and false cockpit warnings were also reported. Subsequent investigation and testing showed that the stabilator system was affected by high-intensity radiated fields (HIRF). The Blackhawk has a conventional mechanically linked flight control system with hydraulic assist. The stabilator system, however, uses transmitted digital signals (fly-by-wire) to automatically adjust its position relative to control and flight parameters. These digital signals are highly susceptible to HIRF. When the Blackhawk was initially designed, the Army did not routinely fly near large RF emitters. The Navy version of the Blackhawk, the SH-60 Seahawk, has not experienced similar EMI problems because it is hardened against the severe EMI environment aboard modern ships. Despite the Army identifying several hundred worldwide emitters that could cause problems and instructing its pilots to observe proper clearance distances, between 1981 and 1987 five Blackhawk helicopters crashed, killing or injuring all on board. In each crash, the helicopter had flown too near radio transmitters. The long-term solution was to increase the shielding of sensitive electronics and to provide some automatic control resets as a backup. (Source: NASA Reference Publication 1374, July 1995)

 

Severomorsk Disaster

In mid-May 1984, a Soviet ammunition depot exploded.  The cause of the accident, according to the Soviets, was an over-the-horizon radar that had illuminated the depot. (Source: NASA Reference Publication 1374 July 1995)

 

F-16 Flight Controls

An F-16 fighter jet crashed in the vicinity of a Voice of America (VOA) radio transmitter because its fly-by-wire flight control system was susceptible to the transmitted radio waves. Since the F-16 is inherently unstable, the pilot must rely on the flight computer to fly the aircraft. Subsequently, many F-16s were modified to prevent this type of EMI, which was caused by inadequate military specifications on that particular electronics system. This F-16 case history was one of the drivers for the Federal Aviation Administration's (FAA) institution of certification programs. (Source: NASA Reference Publication 1374, July 1995)

 

Tornado Fighter Case

Another case occurred in 1984 near Munich, Germany.  A West German Tornado fighter crashed after flying too close to a powerful Voice of America transmitter. (Source: NASA Reference Publication 1374 July 1995)

 

ABS Failure

Early antilock braking systems (ABS) on both aircraft and automobiles were susceptible to electromagnetic interference. Accidents occurred when the brakes functioned improperly because EMI disrupted the ABS control system. During the early years of ABS, Mercedes-Benz automobiles equipped with ABS had severe braking problems along a certain stretch of the German autobahn: the brakes were affected by a nearby radio transmitter as drivers applied them on a curved section of the highway. The near-term solution was to erect a mesh screen along the roadway to attenuate the EMI, which enabled the brakes to function properly. (Source: NASA Reference Publication 1374, July 1995)

 

Aircraft Passenger Carry-On Devices Cases

Passenger carry-on devices provide another group of case histories. They show the increased susceptibility of modern automated electronic systems aboard aircraft to external EMI from seemingly innocuous electronic devices, including portable computers, AM-FM "walkman" cassette players, dictaphones, radios, heart monitors, and cellular phones. NASA maintains a database, known as the FAA Aviation Safety Reporting System (ASRS), which is a compilation of voluntary reports detailing safety problems submitted by pilots or crew members of commercial and private aircraft. These reports are, for the most part, anonymous, with non-specific aircraft models and unidentified operating companies. During the period 1986-1995, the database registered on average 5200 reports a year, among them reports related to interfering radio waves. (Source: NASA Reference Publication 1374, July 1995)

 

"Black hole" in railway system

Automatic electronic registration of rail wagons passing a number of checkpoints was introduced in the USA. After some time of successful operation it was noticed that wagons entered a certain region but never left it, as if a black hole were there. Investigation revealed that a new radar station had been constructed near the rails and its strong radio waves had burned out the electronic components installed on the wagons. (Source: private communication from an NTIA source)

 

Crosstalk between mobile and fixed phones

The antennas of a base station of a mobile system were installed on top of a high building in the city centre. After the mobile system was put into operation, all conversations of the mobile users could be heard on every telephone connected to the old fixed telephone installation. Investigations showed that the cause was coupling between the antennas and the wired telephone installation.

Medical Equipment Cases

Modern medical equipment has experienced EMI problems. From 1979 to 1993, the FDA received over 90 reports concerning EMI problems in the field. It was pointed out that users experiencing medical equipment performance degradation might not suspect EMI as a possible cause; thus, EMI problems are likely to be under-reported to the FDA. (Source: NASA Reference Publication 1374, July 1995)

 

Ambulance Heart Monitor/Defibrillator

Susceptibility of medical equipment to conducted or radiated emission is a concern. A case was reported when a 93-year-old heart attack victim was being taken to the hospital and the medical technician had attached a monitor/defibrillator to the patient. Because the machine shut down every time the technicians turned on the radio transmitter to request medical advice, the patient died. An investigation showed that the monitor/defibrillator was exposed to exceptionally high radiated emissions because the ambulance roof had been changed from metal to fibreglass and fitted with a long-range radio antenna. Reduced shielding combined with the strong radiated radio signal resulted in EMI to the vital machine. (Source: NASA Reference Publication 1374 July 1995)

Runaway Wheelchairs

Wheelchairs came under the scrutiny of the FDA because of reported erratic, unintentional powered-wheelchair movements. These movements included sudden starts that caused wheelchairs to drive off curbs or piers when police, fire, or CB transmitters were activated near the chairs. Although no fatal injuries have been reported, FDA has ordered manufacturers of motorized wheelchairs to shield them from EMI and to educate users on the potential EMI hazards. (Source: NASA Reference Publication 1374 July 1995)

 

Electronic Implant Activation

An English-language journal with over one million readers reported the following story. A pastor in San Francisco, wanting to cure his impotence, was trying an experimental electronic penis implant that produced an erection at the touch of a button. In addition to the intended erections, however, the man also experienced random ones. Investigations revealed the cause: insufficient immunity to signals from a radio-controlled garage-door opener. Every time his next-door neighbour used the opener, the pastor had an erection. He commented: "It's bad enough when I'm pottering around the back yard, but it gets really mortifying when I'm delivering my Sunday sermon." (Source: Europa Times No.22, March 1995)

 

Hearing Aids

Millions of pensioners using hearing-aid devices were forced to endure 24-hour rap and rock music from the 41 powerful transmitters of the ailing Radio 1 network in the UK. The cause was a flaw that made the device act as an unintended radio receiver, in addition to performing its intended function. (Source: Europa Times No. 22, March 1995)

 

Burning Crane

A construction worker was burned while working with a crane installed on a building site. Investigations discovered that the crane and its metal chain formed a conducting loop when the worker touched the load; the burns were caused by radio-frequency current induced in that loop by a nearby broadcasting transmitter.
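To see why such burns are plausible, one can make a rough order-of-magnitude estimate of the voltage induced on a large metal structure acting as an unintended antenna. The sketch below is not from the source; the transmitter power, distance, and effective length are illustrative assumptions only.

```python
import math

def field_strength(power_w: float, distance_m: float) -> float:
    """Free-space RMS field strength (V/m) at a given distance from an
    isotropic radiator: E = sqrt(30 * P) / d."""
    return math.sqrt(30.0 * power_w) / distance_m

def induced_voltage(e_field_v_per_m: float, effective_length_m: float) -> float:
    """Open-circuit voltage of a conductor with the given effective length,
    treated as a simple receiving antenna: V = E * l_eff."""
    return e_field_v_per_m * effective_length_m

# Illustrative numbers (assumptions): a 100 kW broadcast transmitter 300 m
# away, and a crane jib with an assumed effective length of 10 m.
e = field_strength(100e3, 300.0)   # a few volts per metre
v = induced_voltage(e, 10.0)       # tens of volts open-circuit
```

Even with these modest assumptions, tens of RF volts appear across the loop; touching the load then closes the circuit through the worker's body, which is enough to cause RF burns.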

Accident at Foundry Plant

A foundry worker died when a radio-controlled overhead transporter dumped a few tons of liquid metal on him. Investigations discovered that a spurious signal from a radio transmitter had been interpreted by the automatic control system as a legitimate command to release the load.



[1] National spectrum management is a serious responsibility. In Germany, for instance, about 2,500 civilian personnel were involved in this activity (1992). In the USA, the Federal Communications Commission employs over 1,000 staff and spends some $100 million yearly (1990).


 [RS1] Note that we disregard here the meaning of the transmitted information.

 [RS2] Volume of information: 1 page of text vs. a picture

 [RS3] The worldwide population of mobile phones reached roughly 285 million in 1998. The number of new mobile-telephone subscribers increased from 4 million in 1990 to 75 million in 1998, according to data published by the International Telecommunication Union, and continues to grow.

 [RS4][Shannon'49]

 [RS5][Shannon'49]

 [RS6] is shown schematically in Figure 2. It

 [RS7][Shannon'49]

 [RS8][Pierce'80]

 [RS9][Shannon'49]

 [RS10][Rimoldi'99]

 [RS11]  as shown in Figure 1

 [RS12][Pierce'80]

 [RS13][Pierce'80]

 [RS14][Wik'00]

 [RS15] Background

 [RS16] Unwanted Reflections

 [RS17] Unwanted Receptions

 [RS18] Figure 2 illustrates the case. [Struzak'99]

 [RS19] Figure 3 illustrates the relation.

 [RS20] Figure 4 illustrates this relation.