Modeling a fixed, wired network is a tricky endeavor. There are so many interactions at the network protocol level and in the application data plane that the behavior is far more stochastic than deterministic. Now make the modeled network wireless and you've elevated the degree of difficulty: modeling free-space propagation loss, signal, noise and interference, as well as antenna gain patterns and pointing directions, makes wireless networks a far more challenging task to model. Finally, make the modeled network airborne. Routers and switches are no longer bolted into fixed racks with neatly dressed cables; they are literally flying around in the air, hoping to build some type of connectivity on which a network topology can be created. Welcome to my work.
For several years now, I've been researching and developing applications for airborne networks. Testing software in this environment means super expensive and rare live flight exercises, or more reasonably priced and available modeling and simulation (M&S) packages. But when it comes to testing custom developed software, not all M&S packages are the same.
Simulation vs. Emulation
I like to describe simulation and emulation with a simple analogy. If I want to test car racing, I have a few options:
- Go to the local track with lots of cars and have a race (live flight exercise)
- Use a race car video game on my computer / gaming console (simulation)
- Use a bunch of remote control (RC) cars on a scaled down version of the track (emulation)
It's not a perfect analogy, but it conveys the point: emulation is our bridge between the more theoretical simulation and the more expensive and rare real-system test. With emulation, we can scale efficiently (a benefit of simulation, a challenge with real-system tests), yet we can also interface with real systems and software (a benefit of real-system testing, a challenge with simulation). We essentially get the best of both real-system tests and simulation. But emulation is not without drawbacks of its own.
Model Accuracy
Since an emulated system is not the real system itself but a model of it, there is a question of accuracy. How accurate is the model? How much of the real system has to be modeled? Can some parts of the real system be abstracted away in the model with little impact? Using our RC car example above, how accurate is the acceleration curve of the RC car versus the real car it is emulating? Does it matter that the real car uses gas and the RC car uses a battery, if the run time between fill-ups / charges is the same? In my airborne networking research, having an accurate behavioral model of the radio and its waveform characteristics is key to building a reliable scenario to test the large network effects we are researching. The model should not create links beyond the distance capabilities and environmental conditions of a real radio. We should expect the same type of packet loss and interference when propagation issues are injected during testing.
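As one concrete illustration of that "no links beyond real capability" check, here is a minimal sketch built on the standard free-space path loss (Friis) formula. The transmit power, antenna gains, and receiver sensitivity are illustrative placeholders, not parameters of any real radio or of the emulation package we use.

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB for distance in km and frequency in MHz:
    FSPL = 20*log10(d_km) + 20*log10(f_MHz) + 32.45."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.45

def link_feasible(distance_km: float, freq_mhz: float,
                  tx_power_dbm: float = 30.0,      # hypothetical transmit power
                  tx_gain_dbi: float = 3.0,        # hypothetical antenna gains
                  rx_gain_dbi: float = 3.0,
                  rx_sensitivity_dbm: float = -95.0) -> bool:
    """A link should only form if received power clears the radio's
    sensitivity floor; an accurate model enforces exactly this."""
    rx_power = (tx_power_dbm + tx_gain_dbi + rx_gain_dbi
                - fspl_db(distance_km, freq_mhz))
    return rx_power >= rx_sensitivity_dbm
```

A check like this can be run against the emulator's link decisions: if the emulation forms a link at a range where the budget says the received power is below sensitivity, the model is being too generous.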
Framework and Orchestration
The framework is an important factor in picking an emulation package. How is the emulation done - is specific hardware required, or can it be done entirely in software? How modular is the software - can radio and waveform characteristics be changed without changing the physics constants for free-space signal propagation? How comprehensive is the framework - is it just signal-in-space, or can other factors like location (e.g., GPS) and communication effects (e.g., dropped, delayed and mangled packets) be introduced? Is the emulation package open source with a standard way to interface, or is it closed with a proprietary interface? The openness of the package can also be tied to support - you may not get the "round-the-clock" customer support of a paid vendor; however, with a talented group of on-staff engineers, open source offers more opportunities to customize and fit the model to your specific needs.
Another important factor to consider in an emulation package is "orchestration". Like a conductor leading an orchestra, we need a convenient, easy-to-use, central way to start the emulation. This is especially important when you consider we are testing custom developed software over the emulation. In a simulation package, the simulator starts and controls everything about the test, like the race car video game. With emulation - the RC car example - we need to control all the components: ready the track, start the RC cars and steer them all around the track during the emulated race event. In my airborne networking research, we need to start the emulated radios and associated network and compute components. We need to launch our custom software on each node in the emulation. We need to start any analyzers that will collect data during the emulation. With a scenario of five emulated aircraft, there could be 15 to 20 or more nodes (i.e., the radios and computers that would be on the real aircraft) to start, with 10 to 20 software packages running on each. This is not something you can do manually or by pointing and clicking your way through a nice interface. Lots of code and configuration needs to be written, and the more of it that can be automated for orchestration, the easier and more repeatable the test cases become.
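To make that concrete, here is a minimal sketch of what such orchestration code might look like: a scenario definition expanded into per-node launch commands that can be fired off from one place. The node names, service commands, and `ssh`-based remote execution are hypothetical placeholders, not our actual tooling.

```python
# Hypothetical scenario: each emulated node maps to the list of
# services that must be running on it. Names are illustrative only.
SCENARIO = {
    "aircraft-1-radio":   ["emulated-radio --profile uhf",
                           "gps-feed --track t1"],
    "aircraft-1-compute": ["routing-daemon",
                           "app-under-test",
                           "pcap-collector"],
}

def expand_commands(scenario: dict) -> list:
    """Expand a scenario into concrete launch commands. In a real
    testbed the wrapper might be ssh, a container exec, or an
    orchestration API call; ssh is used here as a stand-in."""
    return [f"ssh {node} '{cmd}'"
            for node, services in scenario.items()
            for cmd in services]

for command in expand_commands(SCENARIO):
    print(command)   # in practice: subprocess.run(...) or similar
```

Even this toy version shows why automation pays off: the scenario is data, so adding a sixth aircraft or a new collector is a configuration change, and every run starts the same way.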
Data Collection and Analysis
I alluded to data collectors in the previous section, but they definitely deserve their own. Data collection is the entire reason we do emulation - or any kind of modeling and simulation activity. We want to gain an understanding of how a real system will operate in potential scenarios that we do not want to (or cannot) test the real system against. Again, with simulation, the simulation software package will collect all the data you need; but with our distributed emulation framework and orchestration, we need to instrument data collectors on every deployed node where we want data collected. Orchestration can help with that by starting data collection software packages on nodes while it starts our software to be tested. In some cases, however, custom data collection software needs to be written, and since it is not part of a larger all-in-one simulation package, the output can be somewhat "difficult".
"Difficult" is a nice way to say non-standardized, free-text, unformatted, and a host of other inconsistencies between the collected data. This is where correlation and analysis become important after the emulated scenario is finished. Coordinating log files with different timestamps and formatting them to present a clear picture of events is crucial to understanding the performance of the thing you are testing. In my airborne networking research, we've standardized the logging format for our software, but other data collectors provide all sorts of different formats that we need to convert and combine to get a detailed analysis. Good practices here are critical to answering the questions we are testing - does our software improve the performance of the airborne network, and if so, by how much?
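As a small example of that correlation step, here is a sketch that normalizes two hypothetical collector formats - one logging ISO-8601 timestamps, another logging epoch seconds - into a single UTC-ordered timeline. The formats and message contents are invented for illustration; real collectors tend to be messier still.

```python
from datetime import datetime, timezone

def parse_iso(line: str):
    """Parse 'ISO-8601-timestamp message', e.g. our software's logs."""
    ts, msg = line.split(" ", 1)
    return datetime.fromisoformat(ts).astimezone(timezone.utc), msg

def parse_epoch(line: str):
    """Parse 'epoch-seconds message', e.g. another collector's output."""
    ts, msg = line.split(" ", 1)
    return datetime.fromtimestamp(float(ts), tz=timezone.utc), msg

def merge_timeline(iso_lines, epoch_lines):
    """Normalize both formats to aware UTC datetimes, then sort into
    one chronological event stream across collectors."""
    events = ([parse_iso(l) for l in iso_lines]
              + [parse_epoch(l) for l in epoch_lines])
    return sorted(events)
```

Once everything is on one clock and in one stream, questions like "did the packet loss start before or after the link-state change?" become simple lookups instead of manual cross-referencing.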
Summary
So considering all those challenges, why did we choose emulation? Simply put, simulation could not deliver the real-world integration for our custom developed software that we were seeking to evaluate. We needed an emulation package that provided an interface to real computer systems and software so we could write and test code in the same fashion we would for a real airborne network system. We looked at various options and prioritized:
- Integration of real software / computer systems into the emulation framework,
- Availability of models for radios and waveforms of interest,
- Industry and peer / partner adoption (i.e., who else is using the emulation package), and
- Availability of open-source tooling for orchestration and analysis.
The initial adoption was difficult and we had a lot of false starts and reworks. We spent the better part of a year making the model more accurate. But now, over 5 years in, we have a robust, software-based emulation testbed that is modular, portable and adopted by other programs and partners looking to do similar wireless network modeling.