Abstract
Because of their rarity, estimating the statistics of buffer overflows in queueing systems by direct simulation is often very expensive in computer time. Past work on fast simulation using importance sampling has concentrated on systems with Poisson arrival processes and exponentially distributed service times. The authors demonstrate how, using large deviations theory and deterministic optimal control, an asymptotically optimal simulation system (in the sense of variance) can be generated for queues with a variety of arrival and service processes. In particular, it is shown how to generate an optimal simulation system for a number of queues with deterministic service times. Such systems are of great practical interest because of their application to the modeling of ATM (asynchronous transfer mode) switches.
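To make the idea concrete, the following is a minimal illustrative sketch (not the authors' construction, which handles general arrival and service processes): importance sampling for the probability that an M/M/1 queue, starting from one customer, reaches buffer level B before emptying. For this classic special case, large-deviations analysis shows the asymptotically optimal change of measure simply swaps the arrival and service rates, so overflow becomes the typical event under simulation; each replication is reweighted by its likelihood ratio. The rates and buffer level below are arbitrary example values.

```python
import math
import random

LAM, MU, B = 0.3, 0.7, 15          # arrival rate, service rate, buffer level
p  = LAM / (LAM + MU)              # up-step probability of the embedded chain
p2 = MU / (LAM + MU)               # twisted up-step probability (rates swapped)

def is_sample(rng):
    """One importance-sampling replication: simulate the embedded random walk
    under the twisted measure and return the likelihood-ratio-weighted
    indicator of overflow (level hits B before 0)."""
    level, log_lr = 1, 0.0
    while 0 < level < B:
        if rng.random() < p2:       # up step under the twisted measure
            level += 1
            log_lr += math.log(p / p2)
        else:                       # down step
            level -= 1
            log_lr += math.log((1 - p) / (1 - p2))
    return math.exp(log_lr) if level == B else 0.0

rng = random.Random(12345)
N = 20000
is_estimate = sum(is_sample(rng) for _ in range(N)) / N

# Exact overflow probability from the gambler's-ruin formula, for comparison.
r = (1 - p) / p
exact = (1 - r) / (1 - r ** B)
print(f"IS estimate {is_estimate:.3e}  vs exact {exact:.3e}")
```

Note that under the swapped measure every path that reaches B takes exactly B-1 more up steps than down steps, so its likelihood ratio is the constant (LAM/MU)**(B-1); the only residual variance comes from paths that empty first. This is the variance-reduction property that the asymptotic optimality criterion generalizes to other arrival and service processes.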
| Field | Value |
|---|---|
| Original language | English |
| Pages (from-to) | 881-882 |
| Number of pages | 2 |
| Journal | Proceedings of the IEEE Conference on Decision and Control |
| Volume | 2 |
| Publication status | Published - 1990 |
| Externally published | Yes |
| Event | Proceedings of the 29th IEEE Conference on Decision and Control, Part 6 (of 6), Honolulu, HI, USA, 5-7 Dec 1990 |