
dc.contributor.author: Chen, Z
dc.contributor.author: Hu, J
dc.contributor.author: Min, G
dc.contributor.author: Zomaya, AY
dc.contributor.author: El-Ghazawi, T
dc.date.accessioned: 2019-11-20T15:11:21Z
dc.date.issued: 2019-11-15
dc.description.abstract: Resource provisioning for cloud computing necessitates adaptive and accurate prediction of cloud workloads. However, existing methods cannot effectively predict high-dimensional and highly-variable cloud workloads, which results in wasted resources and an inability to satisfy service level agreements (SLAs). Since the recurrent neural network (RNN) is naturally suited to sequential data analysis, it has recently been used to tackle the problem of workload prediction. However, RNNs often perform poorly at learning long-term memory dependencies and thus cannot predict workloads accurately. To address these important challenges, we propose a deep Learning-based Prediction Algorithm for cloud Workloads (L-PAW). First, a top-sparse auto-encoder (TSA) is designed to effectively extract the essential representations of workloads from the original high-dimensional workload data. Next, we integrate the TSA and a gated recurrent unit (GRU) block into an RNN to achieve adaptive and accurate prediction of highly-variable workloads. Using real-world workload traces from the Google and Alibaba cloud data centers and the DUX-based cluster, extensive experiments are conducted to demonstrate the effectiveness and adaptability of L-PAW for different types of workloads with various prediction lengths. Moreover, the performance results show that L-PAW achieves superior prediction accuracy compared to classic RNN-based and other workload prediction methods for high-dimensional and highly-variable real-world cloud workloads. [en_GB]
dc.identifier.citation: Published online 15 November 2019 [en_GB]
dc.identifier.doi: 10.1109/TPDS.2019.2953745
dc.identifier.uri: http://hdl.handle.net/10871/39622
dc.language.iso: en [en_GB]
dc.publisher: Institute of Electrical and Electronics Engineers [en_GB]
dc.rights: © 2019 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. [en_GB]
dc.subject: Cloud computing [en_GB]
dc.subject: workload prediction [en_GB]
dc.subject: resource provisioning [en_GB]
dc.subject: sequential data analysis [en_GB]
dc.subject: deep learning [en_GB]
dc.title: Towards accurate prediction for high-dimensional and highly-variable cloud workloads with deep learning [en_GB]
dc.type: Article [en_GB]
dc.date.available: 2019-11-20T15:11:21Z
dc.identifier.issn: 1045-9219
dc.description: This is the author accepted manuscript. The final version is available from IEEE via the DOI in this record. [en_GB]
dc.identifier.journal: IEEE Transactions on Parallel and Distributed Systems [en_GB]
dc.rights.uri: http://www.rioxx.net/licenses/all-rights-reserved [en_GB]
dcterms.dateAccepted: 2019-11-09
rioxxterms.version: AM [en_GB]
rioxxterms.licenseref.startdate: 2019-11-09
rioxxterms.type: Journal Article/Review [en_GB]
refterms.dateFCD: 2019-11-12T22:49:53Z
refterms.versionFCD: AM
refterms.dateFOA: 2019-11-29T12:02:23Z
refterms.panel: B [en_GB]
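
For illustration only, the following is a minimal sketch of the kind of architecture the abstract describes: a sparse auto-encoder that compresses high-dimensional workload vectors, feeding a GRU that models their temporal dependencies. It assumes PyTorch and is not the authors' L-PAW implementation; the layer sizes, the L1 sparsity penalty (used here in place of the paper's top-sparse selection), and all names such as SparseAutoEncoder and GRUWorkloadPredictor are illustrative assumptions.

# Hypothetical sketch (PyTorch assumed): a sparse auto-encoder compresses each
# high-dimensional workload vector and a GRU models the temporal dependencies
# of the compressed sequence. Sparsity is imposed with a simple L1 penalty,
# not the paper's top-sparse (TSA) selection; sizes and names are illustrative.
import torch
import torch.nn as nn


class SparseAutoEncoder(nn.Module):
    """Encode workload vectors into a low-dimensional code and reconstruct them."""

    def __init__(self, input_dim: int, code_dim: int):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(input_dim, code_dim), nn.ReLU())
        self.decoder = nn.Linear(code_dim, input_dim)

    def forward(self, x):
        code = self.encoder(x)          # compressed representation
        recon = self.decoder(code)      # reconstruction of the input
        return code, recon


class GRUWorkloadPredictor(nn.Module):
    """Predict the next workload vector from a sequence of compressed codes."""

    def __init__(self, input_dim: int, code_dim: int = 32, hidden_dim: int = 64):
        super().__init__()
        self.autoencoder = SparseAutoEncoder(input_dim, code_dim)
        self.gru = nn.GRU(code_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, input_dim)

    def forward(self, seq):                   # seq: (batch, time, input_dim)
        codes, recon = self.autoencoder(seq)  # encode every time step
        out, _ = self.gru(codes)              # model temporal dependencies
        pred = self.head(out[:, -1, :])       # predict the next workload vector
        return pred, recon, codes


if __name__ == "__main__":
    # Toy usage: random tensors stand in for a real workload trace.
    batch, steps, dim = 8, 20, 100
    model = GRUWorkloadPredictor(input_dim=dim)
    history = torch.randn(batch, steps, dim)
    target = torch.randn(batch, dim)

    pred, recon, codes = model(history)
    loss = (
        nn.functional.mse_loss(pred, target)      # prediction error
        + nn.functional.mse_loss(recon, history)  # reconstruction error
        + 1e-3 * codes.abs().mean()               # L1 sparsity penalty on the codes
    )
    loss.backward()
    print(f"toy loss: {loss.item():.4f}")

In practice one would train on sliding windows of a real workload trace and could pre-train the auto-encoder before joint training; the record itself does not specify those details.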

