Service Migration Algorithm Based on Markov Decision Process with Multiple Service Types and Multiple System Factors
Abstract
This paper proposes a Markov decision process (MDP) based service migration algorithm that satisfies quality of service (QoS) requirements when a terminal moves away from its original server. Services are divided into real-time and non-real-time types, each with different transmission-bandwidth and latency requirements, and these requirements are captured in the revenue function. The weight coefficients of the QoS parameters in the revenue and cost functions are assigned different values for each service type so as to distinguish between the two. Migration decisions are based on the overall revenue rather than on a fixed threshold or the instantaneous revenue, and the MDP is solved to maximize the overall revenue of the system. Simulation results show that the proposed algorithm obtains more revenue than existing approaches.
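To make the decision model concrete, the sketch below illustrates the kind of stay-or-migrate policy the abstract describes: a finite-horizon MDP whose state is the terminal's distance (in hops) from the serving node, whose actions are stay and migrate, and whose per-step revenue weights bandwidth and delay differently for real-time versus non-real-time services. This is not the paper's implementation; all models, weights, and constants (`P_MOVE`, `MIGRATION_COST`, the bandwidth and delay formulas) are illustrative assumptions.

```python
from functools import lru_cache

HOPS = 5          # assumed maximum distance from the serving node
HORIZON = 10      # assumed number of decision epochs
P_MOVE = 0.6      # assumed probability the terminal drifts one hop farther

# (bandwidth weight, delay weight) per service type -- assumed values.
# Real-time services weight delay more heavily; non-real-time services
# weight bandwidth more heavily, mirroring the distinction in the paper.
WEIGHTS = {"real_time": (0.3, 0.7), "non_real_time": (0.7, 0.3)}
MIGRATION_COST = 1.5   # assumed one-off cost charged when migrating


def revenue(hops, service):
    """Per-step revenue: bandwidth shrinks and delay grows with distance."""
    w_bw, w_delay = WEIGHTS[service]
    bandwidth = 10.0 / (1 + hops)     # assumed bandwidth model
    delay = 1.0 * hops                # assumed delay model
    return w_bw * bandwidth - w_delay * delay


@lru_cache(maxsize=None)
def best_value(hops, t, service):
    """Finite-horizon value iteration over the stay/migrate actions.

    Returns (expected overall revenue, best action), so the decision is
    driven by the cumulative revenue-to-go, not the instantaneous revenue.
    """
    if t == 0:
        return 0.0, "stay"

    def expected_future(h):
        # The terminal drifts one hop farther with probability P_MOVE.
        return (P_MOVE * best_value(min(h + 1, HOPS), t - 1, service)[0]
                + (1 - P_MOVE) * best_value(h, t - 1, service)[0])

    # Stay: collect revenue at the current distance.
    stay = revenue(hops, service) + expected_future(hops)
    # Migrate: pay the migration cost, reset the distance to zero.
    migrate = revenue(0, service) - MIGRATION_COST + expected_future(0)
    return (stay, "stay") if stay >= migrate else (migrate, "migrate")
```

With these assumed numbers, a terminal already at its server (`hops=0`) stays put, while a distant real-time terminal migrates because the accumulated delay penalty outweighs the one-off migration cost.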