Wednesday 16 April 2025
Designing efficient queueing systems is a longstanding challenge in operations research, with practical applications in industries such as healthcare and manufacturing. Researchers have traditionally relied on approximations and simulations to analyze these complex systems, but recent advances in machine learning have opened up new possibilities.
A team of scientists has developed a novel approach that uses neural networks to predict the stationary distribution of the number of customers in multi-server queues. The model, which takes as input the first four moments of the inter-arrival and service time distributions, achieves state-of-the-art accuracy compared with existing methods.
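To make the moment-based input concrete, here is a minimal sketch of how such features could be extracted from samples of the two distributions. The function names and the exact feature layout are assumptions for illustration; the article does not specify the authors' featurization beyond "the first four moments".

```python
import numpy as np

def first_four_moments(samples):
    """Return the first four raw moments E[X], E[X^2], E[X^3], E[X^4]
    estimated from a sample of inter-arrival or service times."""
    x = np.asarray(samples, dtype=float)
    return np.array([np.mean(x ** k) for k in range(1, 5)])

def queue_features(interarrival_samples, service_samples):
    """Hypothetical 8-dimensional network input for one queue instance:
    four moments per distribution, concatenated."""
    return np.concatenate([first_four_moments(interarrival_samples),
                           first_four_moments(service_samples)])
```

A feature vector like this would then be fed to a network whose output layer represents the steady-state probabilities over customer counts.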
The researchers began by analyzing two distinct queueing systems: the GI/GI/c system, where customers arrive according to a general inter-arrival distribution and are served by c homogeneous servers, and the GI/GI_i/2 system, which features two heterogeneous servers. By training neural networks on these systems, they were able to accurately predict the steady-state probabilities of the number of customers in each queue.
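For intuition about what "steady-state probabilities of the number of customers" means, the classical M/M/c queue (exponential arrivals and services, a special case of GI/GI/c) has a closed-form stationary distribution via the standard birth-death recurrence. The sketch below is a sanity-check baseline of that textbook formula, not the authors' method, which targets general distributions where no such closed form exists.

```python
import numpy as np

def mmc_stationary(lam, mu, c, n_max=200):
    """Stationary distribution of the number of customers in an M/M/c
    queue with arrival rate lam and per-server service rate mu,
    truncated at n_max customers."""
    rho = lam / (c * mu)
    assert rho < 1, "queue must be stable"
    # Unnormalized birth-death terms: pi_n = pi_{n-1} * lam / (min(n, c) * mu),
    # since the total service rate with n customers is min(n, c) * mu.
    terms = np.empty(n_max + 1)
    terms[0] = 1.0
    for n in range(1, n_max + 1):
        terms[n] = terms[n - 1] * lam / (min(n, c) * mu)
    return terms / terms.sum()
```

A learned model's output for the exponential special case can be checked against this distribution; for c=1 it reduces to the familiar geometric distribution of the M/M/1 queue.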
The model’s accuracy was tested across a range of scenarios, spanning low to high service rates and varying numbers of servers. In most cases the errors were below 5%, demonstrating the method’s reliability. Furthermore, the neural network-based approach enabled rapid evaluation of thousands of instances in parallel, making it an attractive solution for real-world applications.
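The parallel-evaluation claim rests on a standard batching pattern: a vectorized model evaluates many queue instances in one call instead of looping over them. The sketch below illustrates the pattern with a simple closed-form quantity (the M/M/1 mean number in system, L = rho / (1 - rho)) standing in for the authors' network; the instance count and parameter ranges are illustrative assumptions.

```python
import numpy as np

# Thousands of hypothetical queue instances, each described by a
# utilization rho = arrival rate / service rate (values illustrative).
rng = np.random.default_rng(0)
rho = rng.uniform(0.1, 0.9, size=10_000)

# One vectorized call evaluates every instance at once -- the same
# batching pattern that makes neural-network inference fast at scale.
L = rho / (1.0 - rho)  # M/M/1 mean number in system per instance
```

A trained network would be applied the same way: stack the per-instance feature vectors into a matrix and run a single forward pass over the batch.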
One of the key benefits of this approach is its ability to handle complex systems with non-renewal arrivals and blocking, which are common in many industries. Traditional methods often struggle to accurately model these scenarios, but the neural network-based approach can adapt to these complexities.
The researchers also explored the limitations of their method, noting that extrapolation beyond the training domain can lead to increased error. However, they suggest that this limitation can be mitigated by retraining the model on a broader range of scenarios.
As the quest for efficient queueing systems continues, this novel approach offers a promising solution for operations researchers and practitioners alike. By leveraging machine learning techniques, we may finally have the tools to tackle the complexities of multi-server queues and unlock new insights into the behavior of these critical systems.
Cite this article: “Machine Learning Revolutionizes Queueing Theory: Accurate Predictions of Complex Systems”, The Science Archive, 2025.
Operations Research, Queueing Systems, Machine Learning, Neural Networks, Multi-Server Queues, Inter-Arrival Time, Service Time, Stationary Distribution, Customer Numbers, Queueing Theory