Deep Q-Learning based resource allocation and load balancing in a mobile edge system serving different types of user requests
Önem Yildiz – Radosveta Ivanova Sokullu
With the expansion of the communication and sensing capabilities of mobile devices in recent years, the number of complex, computation-intensive applications has also increased, rendering traditional methods of traffic management and resource allocation insufficient. Mobile edge computing (MEC) has recently emerged as a viable solution to these problems: it provides additional computing capacity at the edge of the network, alleviating the resource limits of mobile devices while improving performance for critical applications, especially in terms of latency. In this work, we address the problem of reducing service delay by selecting the optimal path in a MEC network consisting of multiple MEC servers with different capabilities, applying network load balancing where multiple requests must be handled simultaneously, and performing routing selection with a deep Q-network (DQN) algorithm. We propose a novel traffic control and resource allocation method based on deep Q-learning (DQL) that reduces end-to-end delay in both the cellular network and the mobile edge network. Real-life traffic scenarios with various types of user requests are considered, and a novel DQL resource allocation scheme that adaptively assigns computing and network resources is proposed. The algorithm optimizes traffic distribution between servers, reducing the total service time and balancing the use of available resources under varying environmental conditions.
Keywords: cellular network, deep Q-learning, mobile edge network, resource allocation
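The core idea summarized above — an agent that routes incoming requests to MEC servers of different capability so as to minimize service delay — can be illustrated with a toy sketch. This is not the paper's algorithm: it replaces the deep Q-network with tabular Q-learning over a tiny hand-made environment, and all server rates, state encodings, and hyperparameters below are illustrative assumptions.

```python
import random
from collections import defaultdict

# Hypothetical toy setup: 3 MEC servers with different service rates
# (requests cleared per time step). Values are illustrative only.
RATES = [4.0, 2.0, 1.0]
N_SERVERS = len(RATES)

def step(queues, server):
    """Route one request to `server`; return (next state, reward)."""
    # Delay seen by this request: (queued work + itself) / service rate.
    d = (queues[server] + 1) / RATES[server]
    q = list(queues)
    q[server] += 1                                   # enqueue the request
    q = [max(0.0, x - r) for x, r in zip(q, RATES)]  # servers drain queues
    # Reward is negative delay, so the agent learns to minimize delay.
    return tuple(int(round(x)) for x in q), -d

Q = defaultdict(float)          # tabular stand-in for the deep Q-network
alpha, gamma, eps = 0.1, 0.9, 0.1

for episode in range(2000):
    state = (0,) * N_SERVERS
    for _ in range(50):         # 50 request arrivals per episode
        if random.random() < eps:                    # epsilon-greedy
            a = random.randrange(N_SERVERS)
        else:
            a = max(range(N_SERVERS), key=lambda s: Q[(state, s)])
        nxt, r = step(state, a)
        best = max(Q[(nxt, s)] for s in range(N_SERVERS))
        Q[(state, a)] += alpha * (r + gamma * best - Q[(state, a)])
        state = nxt

# The greedy policy should prefer the fastest / least loaded server.
policy = max(range(N_SERVERS), key=lambda s: Q[((0, 0, 0), s)])
print("preferred server from empty state:", policy)
```

In the full method a neural network replaces the Q-table, so the agent can generalize over continuous load levels and heterogeneous request types instead of enumerating discrete queue states.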