Evolving needs in T&M space

Technology watch, September 16, 2010

Testing is a complex and demanding process requiring overall systems expertise and effective integration of test results. A successful test solution can be defined as an up-to-date and highly flexible (scalable and compact) solution that is adept at addressing changing market requirements, such as rapidly evolving technologies and intense competition. Moreover, it should support both emerging and legacy wireless data standards with the same hardware while remaining a low-cost, reconfigurable test system that meets the operator's wireless testing needs.

Test and measurement (T&M) requirements differ for handset hardware and software as well as network hardware and software. Compatibility and consistent hardware and software performance depend on functional testing and interoperability testing. To ensure interoperability, equipment must be tested on all the network interfaces with various other network elements in a live network environment.

In general, software is harder to test than hardware. Hardware testing is relatively deterministic: a component either works or does not under a predetermined set of operational conditions such as temperature and humidity. Individual hardware components – passive and active components, memory and microcontrollers – can be tested against precise, pre-agreed performance specifications. Software, on the other hand, is less deterministic, and it is difficult to simulate or predict software behaviour under all operational conditions. The way in which code has been written and compiled by individual vendors may determine its behaviour. In the case of 2G handsets, for instance, the majority of product recalls and in-service problems tend to be related to software implementation.

The success of different communications standards that have varied testing requirements is heavily dependent upon the available T&M equipment. T&M vendors need to constantly innovate and develop flexible test solutions that incorporate integrated functionality to address a variety of 2G and 3G wireless data standards.

In the case of 3G, since higher frequencies are used and more complex designs become the norm, the challenge to test equipment has increased. Signal generators, spectrum analysers, protocol analysers, field-strength meters and other specialised equipment of varying complexity are required to address multiple frequencies, modes and protocols. Protocol analysers are particularly important for testing high speed packet data functions like GPRS and EDGE. Test equipment leaders such as Tektronix and Agilent Technologies now have full lines of generic radio frequency (RF) test equipment and many new 3G test products.

With 3G rollouts on the anvil in India, operators will have to deal with the T&M implications of transitioning from 2G networks to 3G networks. For a seamless transition to higher speed 3G networks, it is necessary to ensure that handset hardware works with the corresponding handset software and network hardware works with the corresponding network software. In addition, it is important that handset and network hardware and software are compatible. This is likely to be hard to achieve consistently in 2G and 2.5G networks, and will be harder to achieve in 3G networks due to the increase in the network software component count and complexity. Since the 2G to 3G transition involves an order of magnitude increase in handset code footprint (from 100,000 to 1 million lines of code), an order of magnitude increase in software-related compatibility and performance issues is likely.

Measurement requirements 

Most companies have, for the large part, been able to incrementally evolve their test architectures to absorb the new RF measurement requirements resulting from the wireless revolution. However, two developments have created new measurement requirements: the emergence of multiple-input, multiple-output (MIMO) wireless technology (offering significant increases in data throughput and link range without additional bandwidth or transmit power) in Wi-Max and long term evolution (LTE), and the convergence of multiple wireless radios such as GPS and WLAN into a single system on a chip (SOC).

While the addition of each new wireless standard delivers benefits to consumers, it also creates a new set of challenges for test vendors. Added complexity results in longer test times and cost overruns, forcing test engineers to evaluate alternative approaches.

This trend has created a new demand in the marketplace for multichannel RF test configurations. A multichannel RF test architecture enables parallel testing, that is, testing multiple wireless-enabled devices in parallel, and/or testing multiple communication standards, like Bluetooth and 3G, on the same device in parallel.

MIMO uses multiple antennas at both the transmitter and receiver. A multichannel RF instrument architecture is required when fully characterising a MIMO device during validation/verification or when implementing MIMO technology for non-production applications such as radar and beamforming. With technologies evolving rapidly, scalability has become a key requirement for next-generation RF test systems.
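The reason a multichannel architecture is unavoidable here can be seen from the basic MIMO signal model, y = H·x: each receive antenna observes a weighted sum of all transmitted streams, so the receive channels cannot be characterised one at a time. The sketch below illustrates this with made-up channel gains (noise is omitted for simplicity).

```python
# Minimal 2x2 MIMO signal model, y = H.x (noise omitted): each receive antenna
# sees a weighted sum of both transmitted streams, which is why fully
# characterising a MIMO device requires multichannel instrumentation.
H = [[0.9, 0.2],
     [0.1, 0.8]]   # made-up per-antenna-pair channel gains
x = [1.0, -1.0]    # one symbol per transmit antenna

# Each receive antenna's observation mixes both transmit symbols.
y = [sum(h * s for h, s in zip(row, x)) for row in H]
```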

With advances in multi-radio SOCs, additional wireless technologies such as MIMO can be fitted into already multifaceted devices such as next-generation smartphones, leading to one more radio that requires testing.

To implement a parallel test architecture for multi-radio devices, RF instrumentation that can economically scale but is flexible enough to test multiple frequencies is required; this has led to the need for a new class of application-specific RF instrumentation with a parallel hardware and software architecture that includes advanced synchronisation capabilities.

In addition, the new RF instruments must be MIMO ready, offering a new level of synchronisation that goes beyond sharing signals such as the reference clock (usually 10 MHz) and the occasional start trigger. The software component of the architecture is even more important because processing a multi-standard configuration is computationally intensive. Modern software architecture enables parallel data streams where one or more processing units are dedicated to each RF channel. Common parallel processing architectures found in the marketplace today include multiprocessor, hyper-threading, multicore and FPGA.
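The per-channel parallel data streams described above can be sketched as follows. This is a hedged illustration only: the channel names, IQ records and power computation are made-up stand-ins, where a real system would stream samples from digitizer hardware and dedicate a processing unit to each RF channel.

```python
# Hedged sketch of per-channel parallel processing: one worker is dedicated
# to each RF channel's data stream. The IQ records are simulated values.
import math
from concurrent.futures import ThreadPoolExecutor

def channel_power_dbm(iq_samples):
    """Average power of one channel's IQ record, in dBm (volts into 50 ohms assumed)."""
    mean_square = sum(abs(s) ** 2 for s in iq_samples) / len(iq_samples)
    watts = mean_square / 50.0
    return 10 * math.log10(watts / 1e-3)

# One simulated IQ record per radio under test (e.g. Bluetooth and 3G in parallel).
channels = {
    "bluetooth": [complex(0.1, 0.0)] * 1000,
    "wcdma": [complex(0.2, 0.0)] * 1000,
}

# Dedicate one worker per RF channel, mirroring the parallel-stream architecture.
with ThreadPoolExecutor(max_workers=len(channels)) as pool:
    results = dict(zip(channels, pool.map(channel_power_dbm, channels.values())))
```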


Besides RF instruments, since it is still early days for LTE, significant work needs to be done on developing, optimising and verifying user equipment (UE) and network equipment ahead of the launch of commercial services. As the deployment of these networks accelerates, the T&M industry will have its work cut out with network and handset testing. With data rates of up to 326 Mbps, mobility expectations of 162 km per hour, scalable channel widths, 2x to 5x spectral efficiency, all-IP support, MIMO capability and low latency, it is imperative to understand the LTE physical layer for accurate and reliable UE testing. A total LTE testing solution must include the verification of core network elements and interworking with more established network technologies, in addition to several other key performance measurements. While some of these, such as maximum output power, power control and receiver sensitivity, are similar to those used in existing standards, the transmission schemes used (OFDMA in the downlink, SC-FDMA in the uplink) mean that new measurement equipment will be needed to support these tests.

Other measurements are specific to LTE. With its OFDMA transmission scheme, for example, the error vector magnitude (EVM) per sub-carrier becomes an essential test of modulator performance. With the availability of the 700 MHz spectrum freed up by analogue TV, LTE will be deployed at lower frequencies than GSM or WCDMA, while supporting much broader channel bandwidths than WCDMA. This can pose a challenge for some modulator architectures, as it results in a higher EVM at the band edges and therefore requires special attention at the design stage.
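The EVM figure of merit mentioned above is the RMS distance between the received and ideal constellation points, normalised to the reference constellation power. A minimal sketch, with made-up symbol values (a real analyser would extract them from demodulated OFDM frames, one set per sub-carrier):

```python
# Illustrative per-sub-carrier EVM computation on hypothetical symbol data.
import math

def evm_percent(received, reference):
    """RMS error vector magnitude, normalised to the reference constellation power."""
    error_power = sum(abs(r - t) ** 2 for r, t in zip(received, reference))
    reference_power = sum(abs(t) ** 2 for t in reference)
    return 100 * math.sqrt(error_power / reference_power)

# Ideal QPSK symbols on one sub-carrier, plus the same symbols with a small offset.
reference = [complex(1, 1), complex(-1, 1), complex(-1, -1), complex(1, -1)]
received = [s + complex(0.05, 0.0) for s in reference]

evm = evm_percent(received, reference)  # a few per cent of EVM
```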

Due to the dynamic nature of some of the tests, such as power control, the measurement conditions need to be established using the signalling protocol. This makes it essential for the test equipment to include the protocol stack, simulating the evolved Node B base station.

Protocol testing 

Protocol testing can often involve expending as much effort on generating test cases as on creating the protocol stack itself, so access to comprehensive and efficient test facilities is vital. To break the testing down, it is important to be able to test each sub-layer in both the user plane and the control plane. Protocol test diagnostic features are essential when tracing faults; typically, these would include time-stamped message logging and decoding. It is important that this is available for each sub-layer, providing the ability to trace through signalling message flows in detail, from MAC PDUs up to RRC messages, thus ensuring that timing requirements are met. The ability to create test scenarios for each layer requires detailed control of the test equipment, but the equipment should be kept as easy to use as possible to avoid a steep learning curve.
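The per-sub-layer, time-stamped tracing described above can be sketched in a few lines. The sub-layer names follow the LTE stack (MAC up to RRC); the `Message` and `TraceLog` classes themselves are hypothetical illustrations, not any real tool's API.

```python
# Sketch of time-stamped, per-sub-layer message logging for protocol tracing.
import time
from dataclasses import dataclass, field

@dataclass
class Message:
    layer: str       # e.g. "MAC", "RLC", "PDCP", "RRC"
    direction: str   # "UL" or "DL"
    decoded: str     # human-readable decode of the PDU or message
    timestamp: float = field(default_factory=time.monotonic)

class TraceLog:
    def __init__(self):
        self._entries = []

    def log(self, message):
        self._entries.append(message)

    def trace(self, layer=None):
        """Return a time-ordered trace, optionally filtered to one sub-layer."""
        matches = [m for m in self._entries if layer is None or m.layer == layer]
        return sorted(matches, key=lambda m: m.timestamp)

log = TraceLog()
log.log(Message("MAC", "UL", "MAC PDU: BSR"))
log.log(Message("RRC", "DL", "RRCConnectionSetup"))
```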


Meanwhile, fixed and mobile Wi-Max also have very complex requirements in terms of characterisation. In fact, even performing basic RF transmitter measurements on Wi-Max requires high performance test equipment with advanced measurement functionality that was not commercially available until recently. Fortunately, with the availability of sophisticated signal generation and analysis equipment originally developed for evaluating digitally modulated wireless networks, tools already exist for performing the complex measurements needed to characterise a Wi-Max RF transmitter.

Basic RF communication measurements of any kind require an instrument to substitute for the transmitter, such as a signal generator, and an instrument to serve as a receiver, such as a spectrum analyser or signal analyser. To evaluate the operation of a Wi-Max transmitter in a BS or handheld device, the analyser must at least meet the frequency and channel (modulation bandwidth) requirements of the Wi-Max standard for that particular equipment. Current Wi-Max frequency allocations extend to about 5.8 GHz with channel bandwidths ranging from 1.25 MHz through 28.0 MHz, depending on whether it is fixed or mobile Wi-Max equipment and whether OFDM or OFDMA is being used. In addition, because of the number of modulation schemes used in Wi-Max, a signal analyser must have flexible demodulation analysis capabilities.
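The minimum-capability check implied above can be captured as a simple pre-test gate, using the Wi-Max limits just quoted (frequencies up to about 5.8 GHz, channel bandwidths of 1.25 MHz to 28.0 MHz). The function and its parameters are illustrative, not any real instrument's API.

```python
# Sketch of a pre-test capability check against the Wi-Max limits quoted
# above; constants and function are illustrative, not a real instrument API.
WIMAX_MAX_FREQUENCY_HZ = 5.8e9    # top of current Wi-Max allocations
WIMAX_MAX_CHANNEL_BW_HZ = 28.0e6  # widest Wi-Max channel bandwidth

def analyser_covers_wimax(max_frequency_hz, max_analysis_bw_hz):
    """True if the analyser meets Wi-Max frequency and modulation-bandwidth needs."""
    return (max_frequency_hz >= WIMAX_MAX_FREQUENCY_HZ
            and max_analysis_bw_hz >= WIMAX_MAX_CHANNEL_BW_HZ)
```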

For evaluating Wi-Max receivers and components, known test signals from a signal generator take the place of signals from a Wi-Max transmitter. Since the Wi-Max signal is burst in nature, with differences in amplitude level from the start (preamble) of the burst through the burst data, a signal generator for Wi-Max receiver testing should also provide programmable power control to mimic the dynamic power characteristics of Wi-Max signals.
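The burst power profile a receiver-test generator must reproduce can be pictured as an amplitude envelope with distinct preamble and payload levels. The sample counts and levels below are made-up values for the sketch:

```python
# Illustrative burst amplitude envelope with a distinct preamble level,
# mimicking the dynamic power profile of a Wi-Max receiver-test signal.
# Sample counts and levels are made-up values for the sketch.
def burst_envelope(preamble_len, data_len, preamble_level=1.0, data_level=0.5):
    """Amplitude envelope: preamble at one level, payload data at another."""
    return [preamble_level] * preamble_len + [data_level] * data_len

envelope = burst_envelope(preamble_len=4, data_len=8)
```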

Once Wi-Max test signals are generated, they must be analysed. Testing a BS unit or portable device for mobile Wi-Max essentially requires a signal analyser that can emulate the operation of an IEEE 802.16e base station since the analyser must be able to detect and record the full range of frequencies and modulation formats used by a Wi-Max system.

Given the complexity of Wi-Max and LTE testing, combined with application-specific customer demands, test equipment vendors have struggled to keep pace with the steady demand for test solutions. Nonetheless, as the success of these standards depends on new technologies like MIMO, whose design and verification are seen as significant growth opportunities for test equipment vendors, the telecom sector clearly represents a lucrative business opportunity for this industry.


Copyright © 2010, All Rights Reserved