Technical interference in CCTV networks


Classification of interference in communication lines

Interference is the set of extraneous electromagnetic disturbances n(t) that are superimposed on the transmitted signals S(t) and impede their reception.

By form, interference is divided into:

1) sinusoidal – interference from the 50 Hz industrial mains, from medical installations and various other devices;

2) pulse – individual pulses or groups of pulses, for example interference from the ignition systems of internal-combustion engines;

3) chaotic – for example, thermal noise (Brownian motion).

By the nature of the interfering influence, interference is divided into:

1) additive – the interference n(t) in the communication channel adds to the useful signal S(t), i.e. Z(t) = S(t) + n(t);

2) multiplicative – the effect of the interference n(t) is equivalent to a change in the transmission coefficient of the communication channel: Z(t) = S(t)·n(t).

Additive interference has the following main types:

1) interference from neighboring radio channels, arising, for example, when the spectra of adjacent channels overlap around the carrier frequency.

Figure 5.8 – Overlap of adjacent communication channels with carrier frequencies f01 and f02

Countermeasure: separating the carrier frequencies of adjacent channels by at least two half-widths of the signal spectra;

2) industrial interference – electromagnetic radiation caused by the damped oscillations that arise during sparking in various electrical devices; it manifests itself as random crackling and clicking in headphones.

Control measures: preventing or reducing sparking, using filters that short-circuit HF oscillations inside devices, and shielding radio equipment;

3) atmospherics – electromagnetic radiation from lightning discharges, which manifests itself at long and medium waves as strong irregular crackling in headphones.

Control measure: moving to the ultrashort-wave range, which is free from this type of interference;

4) fluctuation noise – internal noise, i.e. random fluctuations of the currents and voltages in the elements of radio equipment. Such interference is a sequence of short pulses occurring at random moments.

Interference in communication lines

Interference in the form of electrical noise can be defined as unwanted energy that accompanies a signal in an electronic system. In any system there is always noise in addition to the signal. One example of noise is crosstalk, when two different telephone lines couple during a call, so that other people's conversation can be heard in the receiver. Another example is co-channel interference, which sometimes occurs in television systems under certain atmospheric conditions: the television signal propagates over distances exceeding the normal ones, which leads to mutual interference with local stations broadcasting on the same frequencies.

There are two types of interference in communication systems: industrial (artificial) and natural. Artificial interference arises when the system is exposed to various sources of electromagnetic radiation, for example industrial equipment, some types of incandescent lamps, etc. Natural interference results from natural phenomena, for example the characteristic crackling in a radio receiver caused by lightning discharges in the atmosphere; this is an example of atmospheric interference. Another source is cosmic radiation, called cosmic noise, which is caused by the radiation of stars as a result of the energy-conversion processes occurring in them. In addition, the electronic components of a system generate noise of their own. The main kinds are:

1) thermal noise

– noise that arises from the thermal excitation of the atoms of a conductor or resistor, which creates free electrons. These electrons move chaotically in different directions at different speeds, and their motion produces a random potential difference at the ends of the conductor or resistor;

2) shot noise

occurs wherever direct or alternating current flows through any active device and random fluctuations in the magnitude of this current occur, which are superimposed on the signal and distort it. The name shot noise comes from the specific crackling sound that can be heard in headphones if the signal is amplified using a low-frequency amplifier;

3) flicker noise

occurs in semiconductor, vacuum and other devices due to defects in the crystal structure of the material, which lead to fluctuations in conductivity. The origin of this noise is still not fully understood. Flicker noise cannot be modeled, because it varies from device to device. In most practical applications it can be neglected at frequencies above 10 kHz; it is conventionally considered to occupy the band 0.1…10³ Hz.
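Thermal noise in particular admits a simple quantitative estimate via the standard Johnson–Nyquist formula, which is not given in the text; the resistance, temperature and bandwidth below are assumed example values.

```python
import math

# Johnson-Nyquist estimate of the open-circuit thermal-noise voltage of a
# resistor: V_rms = sqrt(4 * k * T * R * B). Parameter values are assumed.
k = 1.380649e-23      # Boltzmann constant, J/K
T = 290.0             # temperature, K (room temperature)
R = 50.0              # resistance, ohm
B = 1e6               # measurement bandwidth, Hz
v_rms = math.sqrt(4 * k * T * R * B)   # about 0.9 microvolt
```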

To assess the quality of the system, the signal-to-noise ratio is used – the ratio of the maximum value of the signal voltage to the effective value of the noise voltage, in accordance with (3.24):

S/N = U_s.max / U_n.eff. (5.46)

The signal-to-noise ratio is often measured in decibels:

(S/N)_dB = 20 lg (U_s.max / U_n.eff), dB. (5.47)

Sometimes the signal-to-noise ratio is taken as the ratio of the signal power P_s to the average interference power P_n, also expressed in decibels:

(S/N)_dB = 10 lg (P_s / P_n), dB. (5.48)

Typical values of an acceptable signal-to-noise ratio are about 50–60 dB for high-quality radio broadcasting of music programs, 16 dB for low-quality speech transmission, up to 30 dB for commercial telephone systems, and 60 dB for good-quality television broadcasting. The overall signal-to-noise ratio of all the circuits in a system is determined by the product of the individual ratios or, when expressed in decibels, by their summation.
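As a quick sketch of (5.48), the power form of the ratio can be computed directly; the 2 mW signal and 5 μW noise figures are assumed example values.

```python
import math

# Signal-to-noise ratio from powers, per (5.48): SNR_dB = 10 lg(P_s / P_n).
P_s = 2e-3                       # signal power, W (assumed)
P_n = 5e-6                       # noise power, W (assumed)
snr = P_s / P_n                  # dimensionless power ratio
snr_db = 10 * math.log10(snr)    # the same ratio in decibels
```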

Problems for section 5

Example 1.

A discrete, noise-free channel uses an alphabet with four different symbols to transmit messages. The duration of all symbols is the same and equal to 1 ms. Determine the capacity of the information transmission channel.

Solution.

Let us write the expression for the capacity of a discrete channel without interference:

C = log2 m / τ,

where m is the number of symbols in the alphabet and τ is the symbol duration. With m = 4 and τ = 1 ms:

C = log2 4 / 0.001 = 2000 bit/s.
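The computation can be checked in a few lines; the variable names are illustrative.

```python
import math

# Capacity of a noiseless discrete channel: C = log2(m) / tau,
# with m equiprobable symbols of duration tau (Example 1: m = 4, tau = 1 ms).
m = 4           # alphabet size
tau = 1e-3      # symbol duration, s
C = math.log2(m) / tau   # channel capacity, bit/s
```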

Example 2.

The source produces symbols with probabilities p1 = 0.2, p2 = 0.7, p3 = 0.1. Information is transmitted in binary code, all of whose symbols have a duration of 1 ms. Determine the speed of information transmission over a channel without interference when using a uniform code.

Solution.

Let us write the expression for the information transmission speed:

R = H(X) / (k·τ),

where k = 2 is the length of the uniform binary code (two binary digits are enough for three symbols) and τ = 1 ms is the duration of one code symbol. The source entropy is

H(X) = –(0.2 log2 0.2 + 0.7 log2 0.7 + 0.1 log2 0.1) ≈ 1.157 bit,

so R ≈ 1.157 / (2·0.001) ≈ 578 bit/s.
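A sketch of the same computation; the code length k follows from encoding three symbols with a uniform binary code.

```python
import math

# Information rate over a noiseless channel with a uniform binary code
# (Example 2): R = H(X) / (k * tau).
p = [0.2, 0.7, 0.1]                              # symbol probabilities
tau = 1e-3                                       # duration of one code symbol, s
k = math.ceil(math.log2(len(p)))                 # uniform code length = 2
H = -sum(pi * math.log2(pi) for pi in p)         # source entropy, bit/symbol
R = H / (k * tau)                                # information rate, bit/s
```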

Example 3.

A source producing four symbols with a priori probabilities p1 = 0.4, p2 = 0.3, p3 = 0.2, p4 = 0.1 is connected to an information transmission channel with a capacity of C = 1000 bit/s. Information is transmitted using a uniform binary code. Determine the speed of information transfer.

Solution.

The expression for the information transfer rate has the form

R = V·H(X),

where V is the symbol rate of the source. Let us express V from the expression for the channel capacity: a uniform binary code spends log2 4 = 2 binary digits per symbol, so

V = C / 2 = 1000 / 2 = 500 symbols/s.

The source entropy is

H(X) = –(0.4 log2 0.4 + 0.3 log2 0.3 + 0.2 log2 0.2 + 0.1 log2 0.1) ≈ 1.846 bit,

so R = 500 · 1.846 ≈ 923 bit/s.
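The same numbers in code form; the symbol rate V is derived from the capacity as in the solution above.

```python
import math

# Information transfer rate (Example 3): the channel carries C binary
# digits per second, a uniform code spends 2 digits per source symbol,
# so V = C / 2 and R = V * H(X).
p = [0.4, 0.3, 0.2, 0.1]                         # a priori probabilities
C = 1000                                         # channel capacity, bit/s
k = math.ceil(math.log2(len(p)))                 # digits per symbol = 2
H = -sum(pi * math.log2(pi) for pi in p)         # entropy, bit/symbol
V = C / k                                        # symbol rate, symbols/s
R = V * H                                        # information rate, bit/s
```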

Example 4.

On average, how many letters of Russian text can be transmitted in 1 second via an information transmission channel with a capacity of C = 1000 bit/s, provided that the average entropy of the Russian language per letter is 2 bits. Determine the amount of information.

Solution.

The number of letters transmitted per second follows from the expression for the channel capacity, C = V·log2 n. Since n = 32 (the number of letters in the Russian alphabet), each letter of a uniform code costs log2 32 = 5 bits, so

V = C / log2 n = 1000 / 5 = 200 letters/s.

With H(X) = 2 bits per letter, the amount of information transmitted per second is

I = V·H(X) = 200 · 2 = 400 bit/s.
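A compact check of the arithmetic above:

```python
import math

# Example 4: a uniform code costs log2(32) = 5 bits per Russian letter,
# so a 1000 bit/s channel carries 200 letters/s; with an entropy of
# 2 bits per letter they carry 400 bits of information per second.
C = 1000                             # channel capacity, bit/s
n = 32                               # letters in the alphabet
H = 2                                # entropy per letter, bit
letters_per_s = C / math.log2(n)     # letters transmitted per second
info_per_s = letters_per_s * H       # information per second, bit/s
```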

Example 5.

Determine the capacity of a binary symmetric channel with noise at the probabilities of elementary symbol distortion q=0.001 and q=0.01.

Solution.

Let us use the expression for the capacity of a binary symmetric channel per symbol, C = 1 + p log2 p + q log2 q.

1) q = 0.001, p = 1 – q = 1 – 0.001 = 0.999:

C = 1 + 0.999 log2 0.999 + 0.001 log2 0.001 ≈ 0.989 bit per symbol;

2) q = 0.01, p = 1 – q = 1 – 0.01 = 0.99:

C = 1 + 0.99 log2 0.99 + 0.01 log2 0.01 ≈ 0.919 bit per symbol.
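The same capacity formula as a small helper function:

```python
import math

# Capacity per symbol of a binary symmetric channel (Example 5):
# C = 1 + p*log2(p) + q*log2(q), where q is the crossover probability.
def bsc_capacity(q):
    p = 1 - q
    return 1 + p * math.log2(p) + q * math.log2(q)

C1 = bsc_capacity(0.001)   # ~0.989 bit per symbol
C2 = bsc_capacity(0.01)    # ~0.919 bit per symbol
```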

Example 6.

Determine the entropy of the system, which is described by a discrete random variable x with the following distribution series: p(x1) = p(x2) = p(x3) = p(x4) = 0.01, p(x5) = 0.96.

Solution.

Let us write the expression for the entropy of a discrete random variable:

H(X) = –Σ p(xi) log2 p(xi) = –(4 · 0.01 · log2 0.01 + 0.96 · log2 0.96) ≈ 0.322 bit.
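The entropy formula is easy to check numerically; the helper below also covers Example 7, which follows.

```python
import math

# Entropy of a discrete random variable: H = -sum(p_i * log2 p_i).
def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

H6 = entropy([0.01, 0.01, 0.01, 0.01, 0.96])   # Example 6: ~0.322 bit
H7 = entropy([0.2] * 5)                        # Example 7: log2(5), ~2.32 bit
```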

Example 7.

Determine the entropy of the system, which is described by a discrete random variable xi with the following distribution series: p(x1) = p(x2) = p(x3) = p(x4) = p(x5) = 0.2.

Solution.

H(X) = –5 · 0.2 · log2 0.2 = log2 5 ≈ 2.32 bit.

Example 8.

Determine the S/N ratio at the output of the system shown in Figure 5.9 with gains G1, G2 and G3, write it down numerically and express it in decibels. The input signal power is 2 mW, the noise level is 5 μW. It is assumed that the circuit links do not introduce noise of their own.

Figure 5.9

Solution.

Let us determine the S/N at the system input: (S/N)in = 2·10^–3 / 5·10^–6 = 400.

Since the links introduce no noise of their own, the signal and the noise are amplified equally, so the S/N at the system output is the same: (S/N)out = 400.

Let us express the S/N ratio in decibels at the system input: 10 lg 400 ≈ 26.02 dB.

Let us express the gain factors of each circuit link in decibels and sum them; the total gain of the system is

G1 + G2 + G3 = 16.02 + 10 + 9.03 = 35.05 dB.

The S/N ratio in decibels at the system output is likewise ≈ 26.02 dB.
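A check of the decibel arithmetic; the linear gains 40, 10 and 8 are inferred from the decibel values 16.02, 10 and 9.03 dB quoted in the solution.

```python
import math

# Example 8: S/N and gains in decibels. Noiseless links amplify the
# signal and the noise equally, so S/N at the output equals S/N at the input.
P_s = 2e-3                                   # input signal power, W
P_n = 5e-6                                   # input noise power, W
G = [40, 10, 8]                              # link power gains (inferred)

snr = P_s / P_n                              # = 400
snr_db = 10 * math.log10(snr)                # ~26.02 dB at input and output
G_db = [10 * math.log10(g) for g in G]       # 16.02, 10, 9.03 dB
G_total_db = sum(G_db)                       # ~35.05 dB total gain
```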

5.9 Tasks for independent solution

Problem 5.1.

Determine the capacity of a binary symmetric channel with noise at the probabilities of elementary symbol distortion q=0.1 and q=0.0001.

Problem 5.2.

Determine the entropy of the system, which is described by a discrete random variable x with the following distribution series: p(x1) = 0.1, p(x2) = 0.2, p(x3) = 0.5, p(x4) = 0.1, p(x5) = 0.1.

Problem 5.3.

Determine the entropy of the system, which is described by a discrete random variable xi with the following distribution series: p(x1) = p(x2) = 0.1, p(x3) = p(x4) = p(x5) = 0.3.

Problem 5.4.

On average, how many letters of Russian text can be transmitted in 1 second via an information transmission channel with a capacity of C = 500 bit/s, provided that the average entropy of the Russian language per letter is 2 bits. Determine the amount of information.

Problem 5.5.

A source producing four symbols with a priori probabilities p1 = 0.01, p2 = 0.2, p3 = 0.5, p4 = 0.29 is connected to an information transmission channel with a capacity of C = 1000 bit/s. Information is transmitted using a uniform binary code. Determine the speed of information transfer.

Problem 5.6.

Determine the signal-to-noise ratio (S/N) at the output of the system shown in Figure 5.10 with gains G1, G2 and G3, write it down numerically and express it in decibels. The input signal power is 2 mW, the noise level is 5 μW. It is assumed that the circuit links do not introduce noise of their own.

Figure 5.10

Problem 5.7.

The source produces symbols with probabilities p1 = 0.25, p2 = 0.7, p3 = 0.01, p4 = 0.01, p5 = 0.01, p6 = 0.01, p7 = 0.01. Information is transmitted in binary code, all of whose symbols have a duration of 2 ms. Determine the speed of information transmission over a channel without interference when using a uniform code.

Problem 5.8.

A discrete, interference-free channel uses an alphabet with six different symbols to transmit messages. The duration of all symbols is the same and equal to 2.5 ms. Determine the capacity of the information transmission channel.

Problem 5.9.

Determine the signal-to-noise ratio (S/N) at the output of the system shown in Figure 5.11 with gains G1, G2 and G3, write it down numerically and express it in decibels. The input signal power is 2 mW, the noise level is 5 μW. It is assumed that the circuit links do not introduce noise of their own.

Figure 5.11

Problem 5.10.

On average, how many letters of Russian text can be transmitted in 2 seconds via an information transmission channel with a capacity of C = 5000 bit/s, provided that the average entropy of the Russian language per letter is 3 bits. Determine the amount of information.

Meaning of the word interference

Interference, in wired communications – external electromagnetic influences on wired (overhead, cable) lines, as well as electrical processes in them, that cause distortion of the transmitted information. Depending on the type of information, interference manifests itself differently: as errors in the transmission of telegrams and data; as rustling, crackling, poor intelligibility of subscribers' speech and audibility of conversations on adjacent channels in telephone communications; as insufficient clarity of strokes and the appearance of spurious strokes when transmitting phototelegrams and newspaper pages; as distortion of commands in telemechanics and telesignaling systems, etc. The action of interference depends on many causes and is, as a rule, random; therefore, the problem of noise immunity in wired communications is solved using the methods of probability theory and mathematical statistics. Interference can be divided into two groups: additive and non-additive. Additive interference adds linearly to the signal. It contains three components that differ in their statistical properties: fluctuation, harmonic and impulse interference. The distortion introduced by each component is determined by many factors, for example the ratio of the powers or amplitudes of the signal and the interference, the method of transmission and reception, and the composition of the frequency spectra of the signal and the interference. The most characteristic is fluctuation interference, caused by the thermal noise (see Electrical fluctuations) of electron tubes and semiconductor devices, the influence of adjacent communication channels (in multichannel equipment), etc. Harmonic interference is relatively rare in systems using communication cables; its appearance indicates damage to the cable. In communication channels using overhead lines it appears quite often, mainly as the radiation of long-wave radio broadcasting stations.
Impulse interference does not significantly reduce the quality of telephone communications, but it is the main cause of errors in the transmission of digital and other types of discrete information. Its sources include poor-quality electrical contacts, switching in wired communication equipment, lightning discharges, nearby radio stations, electrified railways, power lines, etc. Non-additive interference causes parasitic modulation of the signal. It arises from the nonlinear dependence of the characteristics of the communication channel on the signal parameters and on time, and significantly affects signal transmission mainly in long-distance wired communication channels.

Lit.: Fundamentals of data transmission over wired communication channels, Moscow, 1964; Data transmission channels, ed. V. O. Shvartsman, Moscow, 1970; Long-distance communication, ed. A. M. Zingerenko, Moscow, 1970.

A. I. Koblenz.

Great Soviet Encyclopedia, TSB
