IEEE Internet of Things Journal | 2019

Optimal Cloud Resource Allocation With Cost Performance Tradeoff Based on Internet of Things

 
 
 

Abstract


Internet of Things (IoT) still faces many challenges, one of which is how to efficiently and effectively allocate cloud resources so that IoT users and communication domains obtain the desired quality of service (QoS). To address this challenge, this paper first identifies a cost-performance tradeoff, which stems from the various limited resources in IoT systems and the competing service requirements. Second, we propose a nonlinear optimization model for the resource allocation problem. Subject to the resource and service-demand constraints, this model seeks to maximize the proposed IoT cost-performance ratio. We then show that the problem can be treated as a quasiconcave maximization problem, reduce it to a pseudoconcave maximization problem, and thereby obtain a local solution that is proved to be exactly the desired global one. Accordingly, we propose a feasible direction method and design the corresponding algorithm to compute this solution. Finally, we provide a numerical example to demonstrate the theoretical findings on resource allocation and the proposed algorithm for cloud computing in IoT systems. The mathematical results and method proposed in this paper can serve as design guidelines for resource allocation and management, computation scheduling, and networking protocol design in the cloud infrastructure of IoT.
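
The solution pipeline sketched in the abstract (pseudoconcave ratio objective over resource constraints, solved by a feasible direction method whose stationary point is globally optimal) can be illustrated with a small Frank-Wolfe-style sketch. This is not the paper's exact model: the concave performance measure, affine cost, weights, and constraint data below are assumptions made purely for illustration.

# Illustrative sketch (assumed data, not the paper's model): maximize a
# cost-performance ratio f(x) = performance(x) / cost(x) over linear
# resource constraints A x <= b, x >= 0 with a feasible direction method.
import numpy as np
from scipy.optimize import linprog, minimize_scalar

# Assumed problem data: 3 service classes sharing 2 cloud resources.
A = np.array([[2.0, 1.0, 3.0],     # CPU consumed per unit of each service
              [1.0, 2.0, 1.0]])    # bandwidth consumed per unit of each service
b = np.array([10.0, 8.0])          # available CPU and bandwidth
w = np.array([4.0, 3.0, 5.0])      # performance weight of each service
c = np.array([1.0, 1.5, 2.0])      # unit cost of each service
c0 = 2.0                           # fixed infrastructure cost (keeps cost > 0)

def perf(x):                       # concave performance (diminishing returns)
    return np.sum(w * np.log1p(x))

def cost(x):                       # positive affine cost
    return c0 + c @ x

def ratio(x):                      # cost-performance ratio (pseudoconcave here)
    return perf(x) / cost(x)

def grad_ratio(x):                 # gradient of the ratio objective
    g_p = w / (1.0 + x)
    return (g_p * cost(x) - perf(x) * c) / cost(x) ** 2

x = np.zeros(3)                    # feasible starting allocation
for _ in range(100):
    g = grad_ratio(x)
    # Direction-finding LP: maximize g^T y over the feasible polytope.
    lp = linprog(-g, A_ub=A, b_ub=b, bounds=[(0, None)] * 3, method="highs")
    d = lp.x - x                   # feasible ascent direction
    if g @ d < 1e-8:               # first-order stationarity: for a pseudoconcave
        break                      # objective this point is globally optimal
    # Line search on the step size along the feasible direction.
    step = minimize_scalar(lambda t: -ratio(x + t * d),
                           bounds=(0, 1), method="bounded").x
    x = x + step * d

print("allocation:", np.round(x, 3), "cost-performance ratio:", round(ratio(x), 4))

The stopping test mirrors the abstract's key point: because the ratio of a nonnegative concave performance function to a positive affine cost is pseudoconcave, a first-order stationary point of the feasible direction iteration is already the global maximizer, so no further global search is needed.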

Volume 6
Pages 6876-6886
DOI 10.1109/JIOT.2019.2911978
Language English
Journal IEEE Internet of Things Journal

Full Text