Descriptors:
Internet of Things, fog computing, resource allocation, reinforcement learning, genetic algorithm
English Abstract:
Abstract
In recent years, the Internet of Things (IoT) has become one of the most popular technologies, facilitating interactions between objects and humans to enhance the quality of life. The continuous growth of IoT applications, which require large-scale computing and long-term storage, has led to an over-reliance on cloud computing. The resulting congestion in the cloud, along with the distance of cloud data centers from objects, causes end-to-end response delays and unreliability. Fog computing offers a distributed architecture in which cloud services, including data processing and storage, are extended to the network edge, as close as possible to end devices; it is therefore introduced as a complement to cloud computing that provides low-latency services. On the other hand, resource limitations, resource heterogeneity, and the dynamic and unpredictable nature of the fog environment make resource management one of the challenging problems in fog computing, requiring online and intelligent decision-making. In this research, the goal is to develop a centralized method for prioritizing, scheduling, and placing processing tasks on heterogeneous fog servers that have a multi-queue structure. First, the problem is formulated with respect to two criteria: minimizing delay and minimizing the number of missed task deadlines. Then, since the search space of this problem grows exponentially, two methods with lower time complexity are proposed: a Genetic Algorithm (GA), which belongs to the family of evolutionary algorithms, and a Reinforcement Learning (RL) algorithm, one of the machine learning approaches. For comparison, a random algorithm (RA) and two classical scheduling algorithms, Shortest Job First (SJF) and Earliest Deadline First (EDF), are also used. The results show that the Genetic Algorithm performs better on both the average delay and the deadline miss rate criteria.
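The GA-based placement described above can be illustrated with a minimal sketch. The task lengths, deadlines, server speeds, FIFO per-server queueing, and the weighted fitness combining total delay and deadline misses are all illustrative assumptions for demonstration, not the thesis's actual formulation:

```python
import random

# Hypothetical model (illustrative assumptions): each task is (length, deadline),
# each fog server has a processing speed, and a chromosome assigns every task
# to a server index. Tasks on the same server are served FIFO in list order.
TASKS = [(4, 6), (2, 3), (6, 10), (3, 5), (5, 9), (1, 2)]
SERVERS = [1.0, 2.0]  # assumed server speeds

def evaluate(assignment):
    """Return (total delay, deadline misses) for a task-to-server assignment."""
    finish = [0.0] * len(SERVERS)
    total_delay, misses = 0.0, 0
    for (length, deadline), srv in zip(TASKS, assignment):
        finish[srv] += length / SERVERS[srv]   # completion time on that server
        total_delay += finish[srv]
        if finish[srv] > deadline:
            misses += 1
    return total_delay, misses

def fitness(assignment, alpha=1.0, beta=10.0):
    # Weighted sum of the two criteria; lower is better. Weights are assumptions.
    delay, misses = evaluate(assignment)
    return alpha * delay + beta * misses

def genetic_search(pop_size=30, generations=50, mutation_rate=0.2, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randrange(len(SERVERS)) for _ in TASKS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(TASKS))    # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < mutation_rate:      # random-reset mutation
                child[rng.randrange(len(TASKS))] = rng.randrange(len(SERVERS))
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = genetic_search()
print(best, evaluate(best))
```

Because the survivors carry the best chromosomes into every generation, the best fitness seen never worsens, which is the elitism property the search relies on.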
Specifically, in the average delay criterion, it performs 10% and 13% better than the SJF and EDF methods, respectively, 82% better than the RA method, and 23% better than the RL method. In terms of the average number of tasks that miss their deadlines, the Genetic Algorithm shows improvements of 38%, 36%, and 77% over the SJF, EDF, and RA methods, respectively, and performs 48% better than the RL algorithm. The simulation results also indicate that the multi-queue structure of fog servers improves system performance when the number of fog servers is low; as the number of fog servers increases, however, the single-queue structure surpasses the multi-queue structure.
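The SJF and EDF baselines used in the comparison order a batch of tasks by length and by deadline, respectively. A minimal single-server sketch, with task values chosen purely for illustration:

```python
# Illustrative baselines: serve a batch of (length, deadline) tasks on one
# server, ordered either by shortest length (SJF) or earliest deadline (EDF).
# Task values are assumptions for demonstration only.
TASKS = [(4, 5), (2, 9), (6, 7), (1, 3)]

def run_in_order(tasks):
    """Serve tasks sequentially; return (average delay, deadline misses)."""
    clock, total_delay, misses = 0.0, 0.0, 0
    for length, deadline in tasks:
        clock += length
        total_delay += clock
        if clock > deadline:
            misses += 1
    return total_delay / len(tasks), misses

sjf = run_in_order(sorted(TASKS, key=lambda t: t[0]))  # Shortest Job First
edf = run_in_order(sorted(TASKS, key=lambda t: t[1]))  # Earliest Deadline First
print("SJF:", sjf)  # → (6.0, 2)
print("EDF:", edf)  # → (7.5, 2)
```

On this toy batch SJF achieves the lower average delay, which matches the general behavior of the two policies: SJF minimizes mean completion time, while EDF targets lateness.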
Keywords: Internet of Things, fog computing, resource allocation, reinforcement learning, genetic algorithm