LTE Physical Layer
– Channel Structure
– Single Carrier Frequency Division Multiple Access (SC-FDMA)
– LTE Data Rates
– LTE TDD
New Bands for LTE
LTE Network Architecture
Description of LTE
As described, UMTS was further developed to higher bit rates via HSDPA. But it was clear that limits would be reached. You couldn't fit more users into a CDMA band, and you couldn't simply increase the bandwidth either: a higher bandwidth would have meant an even higher chip rate (shorter chips), and intersymbol interference would no longer have been manageable. It was necessary to think about a new system, and it soon became clear that it would be OFDM based. OFDM had the great advantage that the bandwidth could be changed very flexibly and offered practically the highest possible efficiency. WLAN had already proven this, and now IEEE 802.16 was also introducing OFDM for large networks.
However, mobile operators did not want to experience another transition like from 2G to 3G when they had to install an entirely new network. So a strict requirement from the mobile communications industry was that a new system should work as optimally as possible with the old (3G) system. It should be an evolution, not a revolution.
One manufacturer of network components was the Canadian manufacturer Nortel. A team there had been working on a cellular system based on OFDM since 1998. Between 2002 and 2004 they developed a design called HSOPA (High Speed OFDMA Packet Access). They proposed this to 3GPP. This triggered a discussion between the various members. In a meeting in Toronto in November 2004, it was officially agreed to begin a study for an OFDM based system. However, this project was not called HSOPA but rather the “Long Term Evolution”, abbreviated LTE. The requirements for this project can be summarized as follows:
- Reduced transmission costs per bit
- More services at lower costs
- Flexible use of existing and new frequency bands
- Simple network interfaces
- Low power consumption for end devices
It was clear that they couldn't take much time. The GSM and UMTS specifications had each taken almost 10 years, and WiMAX had already been specified in 2004. LTE therefore could not wait until 2014. They also didn't want to wait until the ITU had completed the 4G successor to IMT-2000. Rather, a 3.9G system was created as an intermediate step.
Just one year later, it was decided that the new system should be OFDM based. In June 2006 the studies were completed and detailed specification began. The exact boundary conditions were:
- Data rates 100 Mbit/s in the downlink (comparable to WiMAX)
- 2-4 x efficiency of HSDPA
- IP based and optimized
- Bandwidths 1.25 MHz, 5 MHz, 10 MHz, 20 MHz.
- TDD and FDD based
- Compatible with 3G network
The first LTE demonstrators were shown as early as 2006. In 2007, Ericsson demonstrated the first 144 Mbit/s transmission.
On the semiconductor manufacturer side, Infineon was the first provider of an RF chip for LTE. In February 2008, a number of semiconductor manufacturers presented a series of demonstrators at the Mobile World Congress in Barcelona; Ericsson Mobile Platforms, Freescale and NXP demonstrated LTE transmissions there.
In December 2008, LTE was released in 3GPP Release 8. In December 2009, the first LTE network went into operation at TeliaSonera in Sweden.
LTE Physical Layer
LTE is an OFDM based system. It should support different bandwidths:
1.25 MHz, 2.5 MHz, 5 MHz, 10 MHz, 15 MHz and 20 MHz.
The subcarrier spacing for LTE is set to 15 kHz. The different bandwidths therefore result from different lengths of the FFT, from 128 to 2048. Not all subcarriers are used: there is a relatively high number of guard subcarriers that do not carry any information. The DC (null) subcarrier is likewise left unused, for the same reason discussed for WLAN.
Channel Bandwidth (MHz) | 1.25 | 2.5 | 5 | 10 | 15 | 20
Frame Length (ms) | 10 (all bandwidths)
Subframe Length (ms) | 1 (all bandwidths)
Subcarrier Spacing (kHz) | 15 (all bandwidths)
Sampling Rate (MHz) | 1.92 | 3.84 | 7.68 | 15.36 | 23.04 | 30.72
FFT Size | 128 | 256 | 512 | 1024 | 1536 | 2048
Used Sub-Carriers | 76 | 151 | 301 | 601 | 901 | 1201
Guard Sub-Carriers | 52 | 105 | 211 | 423 | 635 | 847
Number of Resource Blocks | 6 | 12 | 25 | 50 | 75 | 100
Occupied Bandwidth (MHz) | 1.14 | 2.265 | 4.515 | 9.015 | 13.515 | 18.015
Bandwidth Efficiency (%) | 77.1 | 90 | 90 | 90 | 90 | 90
OFDM Symbols/Slot | 7 or 6 (short or long CP)
CP Length (short, µs) | 5.2 for the first symbol, then 4.69
CP Length (long, µs) | 16.67
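The sampling rates in the table follow directly from the numerology: the sampling rate is always the 15 kHz subcarrier spacing times the FFT size. A small sanity check (illustrative only):

```python
# Sanity-check the LTE numerology from the table above:
# sampling rate = subcarrier spacing (15 kHz) x FFT size.
SUBCARRIER_SPACING_HZ = 15_000

fft_sizes = [128, 256, 512, 1024, 1536, 2048]
for n_fft in fft_sizes:
    fs_mhz = SUBCARRIER_SPACING_HZ * n_fft / 1e6
    print(f"FFT {n_fft:4d} -> sampling rate {fs_mhz:5.2f} MHz")
```

This reproduces the sampling-rate row of the table, from 1.92 MHz up to 30.72 MHz.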
Unlike WLAN, a base station (called an eNodeB in LTE) continuously sends frames of 10 ms in length. The frames are divided into 10 subframes of 1 ms length and these again into two slots of 0.5 ms length. These slots consist of a sequence of 7 LTE symbols. Before each actual symbol, however, a so-called Cyclic Prefix (CP) is sent, which separates consecutive symbols from each other. As with WiMAX, the aim is to avoid intersymbol interference. The CP is 160 samples long at the beginning of the slot and 144 samples long thereafter. After the CP come the 2048 sample values of the symbol, which can then be fed directly into an FFT. The sampling rate is 30.72 MHz at 20 MHz bandwidth. A symbol including its CP therefore has a length of about 71.4 µs (71.9 µs for the first symbol of a slot).
In addition to the CP defined here, there is also a long CP. It is roughly three times as long (16.67 µs instead of 4.69 µs). For this reason, a long CP slot only has 6 symbols instead of 7.
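The sample counts given above add up exactly to one 0.5 ms slot, which a short calculation confirms (a sketch using the 20 MHz / short CP numbers from the text):

```python
# Verify the slot timing from the sample counts above (short CP, 20 MHz):
# one 0.5 ms slot = 1 symbol with a 160-sample CP plus 6 symbols with a
# 144-sample CP, each followed by 2048 symbol samples, at 30.72 MHz.
FS_HZ = 30_720_000          # sampling rate at 20 MHz bandwidth
N_FFT = 2048                # samples per OFDM symbol

samples_per_slot = (160 + N_FFT) + 6 * (144 + N_FFT)
slot_ms = samples_per_slot / FS_HZ * 1e3
print(samples_per_slot, slot_ms)        # 15360 samples = exactly 0.5 ms

cp_first_us = 160 / FS_HZ * 1e6         # ~5.21 us (first symbol of a slot)
cp_rest_us = 144 / FS_HZ * 1e6          # ~4.69 us (remaining symbols)
print(round(cp_first_us, 2), round(cp_rest_us, 2))
```

The 160- and 144-sample CPs correspond to the 5.2 µs and 4.69 µs values in the table above.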
A specific (temporal) symbol on a specific sub-carrier is called a resource element. Theoretically, with the intended OFDMA scheme, every single resource element could be assigned to a user. However, this effort would be far too high. That's why 12 adjacent sub-carriers and all 7 symbols of a slot are combined and referred to as a resource block (RB). An RB therefore has 12 x 7 = 84 resource elements. RBs can then be assigned to different users.
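The resource-grid bookkeeping above is simple arithmetic (a minimal sketch for the 20 MHz / short CP case):

```python
# Resource-grid bookkeeping as described above (short CP, 7 symbols/slot).
SUBCARRIERS_PER_RB = 12
SYMBOLS_PER_SLOT = 7
SLOTS_PER_SUBFRAME = 2

re_per_rb = SUBCARRIERS_PER_RB * SYMBOLS_PER_SLOT   # 84 resource elements
# At 20 MHz there are 100 RBs; per 1 ms subframe that gives:
re_per_subframe_20mhz = 100 * re_per_rb * SLOTS_PER_SUBFRAME
print(re_per_rb, re_per_subframe_20mhz)  # 84 16800
```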
As with UMTS, LTE user equipment (UE) must synchronize precisely with the base station (eNodeB). Two synchronization signals are defined for this, the Primary Synchronization Signal (PSS) and the Secondary Synchronization Signal (SSS). The PSS is located in the last symbol of the first slot of sub-frame 0 and again of sub-frame 5. 62 elements of a so-called Zadoff-Chu sequence are used. They lie on the innermost 62 sub-carriers and are therefore always the same, regardless of whether the bandwidth is 1.25 or 20 MHz. Synchronization therefore works identically for all bandwidths.
Like all good synchronization sequences, the Zadoff-Chu sequence has the property that the correlation between the incoming signal and the locally known sequence is maximal exactly when the two are aligned. This allows the exact position of the last symbol, and thus the beginning of the following sub-frame, to be determined. Not one but a total of 3 different Zadoff-Chu sequences are used; which one is used depends on the eNodeB identity.
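The sharp correlation peak of a Zadoff-Chu sequence can be demonstrated in a few lines. This is an illustrative sketch with a length-63 root sequence (root 25, one of the three roots 25, 29, 34 used for the PSS); the fact that LTE punctures the middle element is ignored here for simplicity:

```python
import cmath

# Length-63 Zadoff-Chu root sequence, root index u coprime to N.
N = 63
u = 25
zc = [cmath.exp(-1j * cmath.pi * u * n * (n + 1) / N) for n in range(N)]

def cyclic_autocorr(seq, shift):
    """|sum_n seq[n] * conj(seq[(n + shift) mod N])| -- cyclic autocorrelation."""
    n = len(seq)
    return abs(sum(seq[i] * seq[(i + shift) % n].conjugate() for i in range(n)))

# Perfect peak at zero shift, (ideally) zero for every other shift:
print(cyclic_autocorr(zc, 0))                            # 63.0
print(max(cyclic_autocorr(zc, s) for s in range(1, N)))  # ~0 (numerical noise)
```

This ideal autocorrelation is exactly what lets the UE pin down the symbol timing so precisely.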
Now the UE looks for the SSS, which lies one symbol before the PSS. 168 possible block codes are transmitted in the SSS. The Zadoff-Chu index and the block number of the SSS together yield the cell ID of the eNodeB, which identifies it, similar to what the spreading code did in UMTS. The encoding of the SSS also reveals whether a short or long CP is used.
To estimate the transmission channel, LTE embeds reference symbols in the resource blocks. As shown in the following figure, 4 reference symbols (red) are distributed over a resource block. They can be used to measure the channel at these positions. The channel on the remaining sub-carriers can then be estimated by interpolation.

Furthermore, the reference symbols are also used for channel estimation in MIMO operation. In that case, the reference symbols shown in red are emitted by antenna 1 and those shown in black by antenna 2. From this, the corresponding transmission paths can be determined and MIMO operation can be initiated.
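The interpolation idea described above can be sketched as follows. This is a toy example: the pilot positions and the measured channel values are made up for illustration, and simple linear interpolation is used, not the estimator an actual receiver would implement:

```python
# Channel estimation by interpolation between reference-symbol (pilot)
# positions, as described above. Values here are illustrative only.
def interpolate_channel(pilot_pos, pilot_est, n_subcarriers):
    """Linear interpolation between pilots; edges take the nearest pilot."""
    est = []
    for k in range(n_subcarriers):
        if k <= pilot_pos[0]:
            est.append(pilot_est[0])
        elif k >= pilot_pos[-1]:
            est.append(pilot_est[-1])
        else:
            # find the two pilots surrounding sub-carrier k
            for (p0, h0), (p1, h1) in zip(zip(pilot_pos, pilot_est),
                                          zip(pilot_pos[1:], pilot_est[1:])):
                if p0 <= k <= p1:
                    t = (k - p0) / (p1 - p0)
                    est.append(h0 + t * (h1 - h0))
                    break
    return est

pilots = [0, 6]            # pilot sub-carrier indices within one RB
measured = [1.0, 0.4]      # channel magnitudes measured at the pilots
h = interpolate_channel(pilots, measured, 12)
print([round(x, 2) for x in h])
```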
Channel Structure
LTE is based purely on packet-oriented transmission. Consequently, there are no longer any dedicated channels assigned to a user to guarantee a certain capacity. Instead, there is only a shared channel, the Physical Downlink Shared Channel (PDSCH), for transmitting the data. It is assigned to the users per sub-frame, i.e. every millisecond. This assignment is managed directly by the eNodeB and is a complex process, because it must be ensured that under heavy traffic all participants are served, regardless of the quality of their respective transmission channels.
The shared channel is assigned via the Physical Downlink Control Channel (PDCCH). This channel occupies the first symbols of each sub-frame. The number of symbols used for the PDCCH is flexible and can be between 1 and 4 (2 in the figure shown below). So that the UE (User Equipment) knows how many symbols are used, a few resource elements in the first symbol form the Physical Control Format Indicator Channel (PCFICH). A UE therefore first reads the PCFICH and then the PDCCH to find out which RBs are assigned to it in this sub-frame.
In parallel to the PCFICH, the so-called Physical HARQ Indicator Channel (PHICH) is also transmitted, which carries fast HARQ feedback (ACK/NACK) from the eNodeB for the most recently transmitted uplink data packets.
The following figure shows the various channels discussed for 6RB and 5 sub-frames.

Single Carrier Frequency Division Multiple Access (SC-FDMA)
So far, the main advantages of OFDM modulation have been discussed. However, there is also a disadvantage that primarily affects cellular operation. It lies in the fact that a large number of subchannels are transmitted at the same time. All of these channels are added together and then transmitted. In the extreme case, 1200 subchannels are transmitted at a bandwidth of 20 MHz. If each channel transmits with an amplitude of 1, it is theoretically possible (although extremely rare) for all the amplitudes to add up and briefly produce an amplitude of 1200. In any case, the range of possible amplitudes is very wide. This is characterized by a quantity known as the Peak to Average Power Ratio (PAPR), the ratio of the peak power to the average power of the signal.
Why is a high PAPR a problem? First, the sampling of the signal must be precise enough to represent all channels. But this is a minor problem. The bigger problem is that the signal must not be distorted when it is transmitted.
Distortion is generated by the transmitter's power amplifier. An amplifier is linear over a wide range (i.e. there is no distortion here), which means that the output is proportional to the input signal. But at high input levels, saturation occurs, which leads to nonlinearity and distortion. The technical problem is that an amplifier is very inefficient when operated deep in the linear range: for one watt of output power you may have to put in around 10 W.
This is more manageable for eNodeBs because they have enough energy. However, it is very critical for mobile devices, because high power consumption shortens battery life. Earlier mobile communications standards therefore used modulations that generate a constant amplitude, such as GMSK in GSM. This means there is no distortion and the amplifier can be operated very efficiently. The problem is also not so big with WLAN, because it only transmits at low power.
In order to reduce this problem in LTE, a different transmission scheme was chosen for the uplink: Single Carrier Frequency Division Multiple Access (SC-FDMA). This scheme is very similar to the OFDMA scheme in the downlink. The difference is that the UE's uplink data is first spread with a discrete Fourier transform (DFT) and only then converted into a time signal using an IFFT, as before. The result does not behave like many individual sub-carriers, but rather like a single wide carrier. This has a significantly smaller PAPR than a classic OFDMA signal.
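The PAPR advantage of DFT spreading can be illustrated with a small simulation. This is a toy sketch with stdlib-only DFTs and made-up dimensions (64-point IFFT, 16 occupied sub-carriers, QPSK), not actual LTE parameters:

```python
import cmath
import math
import random

def dft(x, inverse=False):
    """Naive O(N^2) DFT/IDFT; slow but dependency-free, fine for a demo."""
    n = len(x)
    sign = 1j if inverse else -1j
    out = [sum(x[m] * cmath.exp(sign * 2 * cmath.pi * k * m / n)
               for m in range(n)) for k in range(n)]
    return [v / n for v in out] if inverse else out

def papr_db(x):
    """Peak-to-average power ratio of a complex time signal, in dB."""
    powers = [abs(v) ** 2 for v in x]
    return 10 * math.log10(max(powers) / (sum(powers) / len(powers)))

QPSK = [1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]
random.seed(1)
N, M, TRIALS = 64, 16, 50   # IFFT size, occupied sub-carriers, symbols averaged

ofdma, scfdma = [], []
for _ in range(TRIALS):
    s = [random.choice(QPSK) for _ in range(M)]
    # OFDMA (downlink style): QPSK symbols go straight onto M sub-carriers
    ofdma.append(papr_db(dft(s + [0] * (N - M), inverse=True)))
    # SC-FDMA (uplink style): DFT-spread first, then onto the same sub-carriers
    scfdma.append(papr_db(dft(dft(s) + [0] * (N - M), inverse=True)))

print(f"mean PAPR OFDMA:   {sum(ofdma) / TRIALS:.1f} dB")
print(f"mean PAPR SC-FDMA: {sum(scfdma) / TRIALS:.1f} dB")
```

Averaged over many symbols, the DFT-spread signal shows a clearly lower PAPR than the plain OFDMA signal, which is exactly the effect the uplink design exploits.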

LTE Data Rates
The data rate for LTE is calculated quite simply. Assume optimal conditions: 64-QAM (6 bit per symbol), no channel coding and a bandwidth of 20 MHz (1200 sub-carriers). With a symbol length of 71.4 µs (short CP) the peak rate is
6 x 1200 / 71.4 µs = 100.8 Mbit/s
Under ideal MIMO conditions, the data rate doubles. However, there is no transmission without channel coding. The highest coding rate is 11/12. This means that the highest transfer rate is around 90 Mbit/s without MIMO.
Additionally, this does not take into account the overhead of the reference and control channels. Maximum data rates are therefore closer to 75 Mbit/s. The data rates achieved in practice are significantly lower and depend on the general load (how many users are served) and the transmission conditions.
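The back-of-the-envelope calculation above, as a short script (same numbers as in the text):

```python
# Peak-rate calculation from the text: 64-QAM (6 bit per symbol),
# 1200 sub-carriers (20 MHz), 71.4 us symbol length (short CP, incl. CP).
bits_per_symbol = 6          # 64-QAM
subcarriers = 1200           # 20 MHz bandwidth
symbol_time_s = 71.4e-6      # short CP

raw_mbps = bits_per_symbol * subcarriers / symbol_time_s / 1e6
coded_mbps = raw_mbps * 11 / 12     # highest coding rate 11/12
print(f"uncoded: {raw_mbps:.1f} Mbit/s, coded: {coded_mbps:.1f} Mbit/s")
```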
Uplink data rates are typically only about half the downlink data rates, especially because the uplink is modulated with at most 16-QAM.
The advantage of LTE over HSDPA becomes clear when comparing the data rates at 5 MHz bandwidth (5 MHz is the W-CDMA bandwidth). For HSDPA this is a maximum of 14.4 Mbit/s while for LTE it is 25.2 Mbit/s. The spectral efficiency is therefore almost twice as high.
LTE TDD
Like GSM and UMTS, LTE has an uplink and a downlink band; this is called FDD mode. A TDD mode is also defined for LTE, in which the same frequency band is shared between uplink and downlink. A disadvantage is that a gap in time must be left between downlink and uplink, the so-called guard period.
LTE TDD uses the same physical layer and the same frame structure of 10 ms frames with 1 ms sub-frames. There are 7 possible arrangements of uplink and downlink sub-frames, which repeat every 10 sub-frames, i.e. every 10 ms. This is shown in the following table.
Configuration | Peak Data Rate DL/UL (Mbit/s) | 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9
0 | 41/32 | D | S | U | U | U | D | S | U | U | U
1 | 62/22 | D | S | U | U | D | D | S | U | U | D
2 | 82/11 | D | S | U | D | D | D | S | U | D | D
3 | 64/15 | D | S | U | U | U | D | D | D | D | D
4 | 82/11 | D | S | U | U | D | D | D | D | D | D
5 | 94/5 | D | S | U | D | D | D | D | D | D | D
6 | 57/27 | D | S | U | U | U | D | U | U | U | D
(Columns 0–9 are the sub-frame numbers; D = downlink, U = uplink, S = special sub-frame containing the guard period.)
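The configurations above are easy to work with programmatically; counting the D and U sub-frames per configuration shows how the downlink share grows from configuration 0 to 5 (a small sketch over the table's patterns):

```python
# The 7 LTE TDD configurations from the table, as D(ownlink)/S(pecial)/
# U(plink) patterns over the 10 sub-frames of one frame.
TDD_CONFIGS = {
    0: "DSUUUDSUUU",
    1: "DSUUDDSUUD",
    2: "DSUDDDSUDD",
    3: "DSUUUDDDDD",
    4: "DSUUDDDDDD",
    5: "DSUDDDDDDD",
    6: "DSUUUDUUUD",
}
for cfg, pattern in TDD_CONFIGS.items():
    print(cfg, pattern, f"DL={pattern.count('D')} UL={pattern.count('U')}")
```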
New Bands for LTE
For the new LTE standard, new frequency bands had to be found, or existing GSM bands reused for LTE.
This was the case for the DCS band around 1800 MHz, band 3:
- Lower Band: 1710 MHz – 1785 MHz
- Upper Band: 1805 MHz – 1880 MHz
For America, the PCS band was used for LTE, band 2:
- Lower Band: 1850 MHz – 1910 MHz
- Upper Band: 1930 MHz – 1990 MHz
A whole new band emerged from the fact that analog terrestrial television was gradually switched off and replaced by DVB-T. With digital transmission, the necessary bandwidth per television channel is reduced, freeing up frequency bands. This was called the digital dividend. The result was band 20:
- Lower Band: 791 MHz – 821 MHz
- Upper Band: 832 MHz – 862 MHz
The low frequencies of band 20 are primarily intended to let mobile operators reach rural regions, since low frequencies have very good propagation properties.
The new band 7 is intended more for urban regions:
- Lower Band: 2500 MHz – 2570 MHz
- Upper Band: 2620 MHz – 2690 MHz
In North America, the so-called AWS band (Advanced Wireless Services) was made available for LTE, band 4:
- Lower Band: 1710 MHz – 1755 MHz
- Upper Band: 2110 MHz – 2155 MHz
This band is characterized by its extremely large duplex spacing of 400 MHz.
For TDD, the 50 MHz gap between the two halves of band 7 (2570 MHz – 2620 MHz) is used. This is referred to as band 38.
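The band geometry discussed above can be summarized in a few lines of code (band edges taken from the lists above; "gap" is the space between the two band halves, "duplex spacing" the offset between corresponding lower- and upper-band frequencies):

```python
# FDD bands mentioned above: (lower start, lower end, upper start, upper end)
# in MHz, as listed in the text.
BANDS = {
    2:  (1850, 1910, 1930, 1990),   # PCS (Americas)
    3:  (1710, 1785, 1805, 1880),   # DCS 1800
    4:  (1710, 1755, 2110, 2155),   # AWS (North America)
    7:  (2500, 2570, 2620, 2690),   # 2.6 GHz (urban)
    20: (791, 821, 832, 862),       # digital dividend
}
for band, (lo0, lo1, up0, up1) in BANDS.items():
    gap = up0 - lo1        # band 38 reuses band 7's 50 MHz gap
    duplex = up0 - lo0     # 400 MHz for band 4
    print(f"band {band:2d}: gap {gap} MHz, duplex spacing {duplex} MHz")
```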
The frequency bands in Germany were auctioned in 2010. In the 800 MHz band there were blocks of 5 MHz each, and a maximum of 10 MHz could be acquired per mobile operator. In the 2500 MHz range, however, bandwidths of 20 MHz were possible.
LTE Network Architecture
LTE is a purely IP based network. Unlike GSM and UMTS, there is no direct access to the telephone network; there is only access to the Internet. So it's pretty simple.

There is no network control unit in this architecture like the RNC in UMTS. The eNodeB controls the radio interface largely autonomously. This is also due to the fact that the eNodeB has to react very quickly and cannot wait for an external unit. This makes the eNodeB the most complex unit in the entire network.
An eNodeB is often divided into two units, the digital part and the analog radio part. The analog part is located in a remote radio head close to the antenna and is connected to the digital part via a fiber optic link (the CPRI interface). The most important task of the eNodeB is the complex and time-critical scheduling, i.e. the allocation of resources to the many users. It must also prioritize when special services are offered (see QoS and VoLTE, which are described later). The eNodeB is also responsible for handovers. For this purpose, the eNodeBs are connected to each other via a dedicated X2 interface.
The central control unit of the LTE network is the Mobility Management Entity (MME). All network signaling runs via the MME. The MME is connected to the Home Subscriber Server (HSS), which corresponds to the Home Location Register of the GSM/UMTS networks.
As with GPRS and UMTS, LTE has two units that manage access to the Internet. The Serving Gateway (S-GW) is a router which connects to the eNodeBs on one side and to the Packet Data Network Gateway (P-GW) on the other. The S-GW switches between eNodeBs when the cell changes. If necessary, an S-GW can also switch to a UMTS or GSM connection if an LTE connection is not possible. The P-GW creates the connection to the Internet; when a connection is established, it assigns a temporary IP address to the end device.