




Convolutional Neural Network for Short-term Wind Power Forecasting

Margarida Solas
Powergrid
Portugal
margarida.solas@powergrid.pt

Nuno Cepeda
Powergrid
Portugal
nuno.cepeda@powergrid.pt

Joaquim L. Viegas
IDMEC, Instituto Superior Técnico
Universidade de Lisboa
Lisboa, Portugal
joaquim.viegas@tecnico.ulisboa.pt
Abstract—Wind power generation is becoming increasingly relevant to the power supply system as it is clean and renewable. This paper proposes a novel methodology for short-term wind power forecasting, based on a convolutional neural network (CNN). In this work, we evaluate the CNN's ability to predict wind power generation by comparing it to two benchmarking methods – ARIMA and the gradient boosting machine (GBM). We show that the CNN is well suited for this purpose, outperforming the other tested techniques, especially when the prediction horizon is greater than 1 hour. In addition, this paper shows that additional features such as meteorological forecasts provide fruitful information, boosting the CNN's performance.

Index Terms—wind power forecasting, convolutional neural network, benchmarking methods
I. INTRODUCTION

Nowadays, renewable energy sources such as wind power and solar power are widely used as they do not rely on exhaustible and polluting raw resources [1]. However, their uncertainty and instability pose a huge challenge to the power supply system, demanding accurate forecasting models [2]–[4].
In the last decades, the scientific community has been stimulated to address this issue, generating a vast collection of approaches that include both statistical and data-driven methods [5]–[7]. More recently, deep learning algorithms have also been implemented in this context. Deep learning is a machine learning sub-field concerned with complex architectures that mimic the structure and function of the human brain, named deep artificial neural networks (ANN). Deep ANN architectures model the nonlinearities in the input and extract complex features from data by performing operations across the network. Deep learning techniques have been gaining momentum due to their successful application in several fields, such as computer vision, speech, audio and natural language processing [8]. Consequently, they have become a trend for time series forecasting as well [9].
In this work, we evaluate the ability of a convolutional neural network (CNN) to predict the wind power series at lead times from 1 to 24 hours ahead by comparing it to two benchmarking methods – ARIMA and the gradient boosting machine (GBM). We aim to evaluate the CNN's capacity to
automatically learn temporal dependencies and structures that are typical of time series, such as trends and seasonality, against the two mentioned methods.
ARIMA is one of the most common time series models because it is quite flexible and it only assumes that the data become stationary after differencing [10].
GBM gathers several strengths which justify its choice. It automatically detects non-linear feature interactions and it typically shows a strong predictive capacity, as it is fairly robust to overfitting [11].
CNNs were selected over the remaining deep architectures because they are particularly suitable for modeling data that has a grid-based structure [8]. As a time series represents a 1-dimensional grid of samples equally spaced in time, CNNs should be able to model this kind of data as well. Nevertheless, to our knowledge, only [12] applies CNNs to wind power forecasting in the literature.
In this work, we propose a novel CNN-based methodology where meteorological forecasts are included as predictors for the first time. Our main goals are as follows: (1) reviewing the latest published work regarding deep architectures applied to time series forecasting in general, and particularly to wind power forecasting; (2) unravelling the potential of a CNN for this purpose; (3) comparing the performance of a CNN to two benchmarking methods – ARIMA and GBM; (4) unveiling how additional features like meteorological forecasts impact the CNN's performance.
II. RELATED WORK

In recent years, several studies about time series forecasting were carried out, unravelling the potential of deep learning techniques for this purpose. Some prominent algorithms, such as the deep belief network (DBN), the deep Boltzmann machine (DBM) and stacked auto-encoders (SAE), were applied in this context.
The approach suggested in [13] was one of the first to show the promise of deep learning methods for time series forecasting. That work focused on time series in general, but a similar approach was applied to wind power forecasting in [2].
In [14], a DBM was implemented and in [15] a SAE with a regression layer was applied, both to solve the wind speed forecasting task.
In [16], the first deep ensemble method for time series forecasting was proposed. This complex architecture is given by a set of concurrent DBNs placed in parallel, whose output feeds a support vector regression (SVR) that works as the output layer. Soon after, different combinations of deep learning techniques applied to time series forecasting were published. The work in [17] joined an autoencoder (AE) to a long short-term memory (LSTM) network to predict solar power and, moreover, compared the ensemble method's performance to that of a DBN and of the mentioned methods working separately.
In [12], a cooperative CNN-based ensemble was presented for the first time. In that work, raw wind data is decomposed into different frequency components using the Wavelet Transform (WT) and each component is provided to a distinct CNN. The output is obtained by aggregating the predictions of all CNNs. It was the only work, to our knowledge, that applied a CNN to predict the wind power series.
In [18], an identical approach was presented to predict the load demand series, but a DBN-based ensemble is used instead. Besides, the load demand series is decomposed into several intrinsic mode functions (IMFs) by applying the Empirical Mode Decomposition (EMD) algorithm instead of the WT.
We have noticed that the deep architecture that appears most often in the literature for time series forecasting is the DBN [14], [16], [18]. Today, however, DBNs have mostly fallen out of favor and have been replaced by recurrent neural networks (RNNs) and CNNs [8]. As the CNN seemed to us more suitable than the RNN for time series forecasting, for the reasons stated in the previous section, we decided to carry out this study to evaluate its potential.
III. FORECASTING METHODS

In this work, two CNN architectures are tested against ARIMA and GBM. Below, we briefly describe the benchmarking methods, followed by the CNN.
A. Autoregressive Integrated Moving Average

ARIMA is given by the combination of three classes of models – autoregressive (AR), integrated (I) and moving average (MA) – which were all designed to deal with time series. The AR part includes predictors that are lagged versions of the series; the MA term is a linear combination of lagged forecasting errors; and the I component makes the time series stationary by performing a differencing process that may be carried out more than once [19].
ARIMA is commonly denoted ARIMA(p, d, q), where parameter p represents the order of the autoregressive part, parameter d is the differencing degree and parameter q is the size of the moving average window. The ARIMA(p, d, q) model has the equation:
\left(1 - \sum_{i=1}^{p} \alpha_i B^i\right) (1 - B)^d X_t = \left(1 + \sum_{i=1}^{q} \theta_i L^i\right) \varepsilon_t    (1)
where X_t is the time series observation at instant t, α_i represent the parameters of the autoregressive part, θ_i represent the parameters of the moving average part, ε_t are the error terms, L^i is the lag operator and B^i is the backward shift operator, which has the effect of shifting an observation backwards by i periods.
The choice of the ARIMA parameters (p, d, q) requires some expertise and can be quite exhausting if the search is done manually. We tried several combinations of parameters to find the set that best fitted the wind series, obtaining ARIMA(2, 1, 0).
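For illustration only (this is not the authors' implementation), a minimal Python sketch of fitting an ARIMA(2, 1, 0) model to a univariate wind power series with the statsmodels library and producing a one-step-ahead forecast could look as follows; the series below is a placeholder for the normalized hourly wind power measurements:

    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    # placeholder for the normalized hourly wind power series
    wind_power = np.random.rand(24 * 30)

    model = ARIMA(wind_power, order=(2, 1, 0))   # (p, d, q) = (2, 1, 0)
    fitted = model.fit()
    next_hour = fitted.forecast(steps=1)          # 1-hour-ahead forecast
    print(next_hour)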
B. Gradient Boosting Machine

GBM is an ensemble of trees that work as weak predictors. This ensemble is built in a stage-wise fashion, so each tree is created to correct the errors of the preexisting ones [20].
The set of GBM parameters can be divided into 2 categories: (i) tree-specific parameters, which are used to define each individual tree, and (ii) boosting parameters, which are used to create the tree ensemble. Since GBM has a considerable set of free parameters, we used an exhaustive search strategy. Table I depicts the best set of parameters acquired through the grid-search procedure.
TABLE I
GBM PARAMETERS

  Category        Parameter                                   Value
  Tree-specific   maximum depth                               3
                  number of samples to make a split           100
                  minimum number of samples at a leaf node    50
                  split criterion                             MSE
  Boosting        number of trees                             500
                  optimization function                       Huber loss
                  fraction of samples to train each tree      50%
In Table I, the first three parameters are knobs to prevent overfitting by keeping trees from growing sparsely populated paths and leaf nodes that represent only a few samples. The criterion used to evaluate the split quality is the mean squared error (MSE), but the Huber loss is used as the optimization function [21] instead. The Huber loss is less sensitive to outliers than the squared error because it applies a softer penalty to large residuals. As time series often contain noise, the application of the Huber loss as cost function is particularly suitable. The Huber loss is given by
L(x_i, \hat{x}_i) =
  \begin{cases}
    \frac{1}{2}\,(x_i - \hat{x}_i)^2, & |x_i - \hat{x}_i| \le \delta \\
    \delta\,|x_i - \hat{x}_i| - \frac{1}{2}\,\delta^2, & \text{otherwise},
  \end{cases}    (2)
where x_i represents the actual value of the i-th sample, x?_i is its estimated value and δ refers to the transition point, i.e., the value that defines which residuals are outliers.
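As a concrete reading of Eq. (2), a short NumPy sketch (an illustration, not taken from the paper) evaluates the Huber loss element-wise:

    import numpy as np

    def huber_loss(x, x_hat, delta=1.0):
        """Element-wise Huber loss of Eq. (2); delta marks the outlier threshold."""
        residual = np.abs(x - x_hat)
        quadratic = 0.5 * residual ** 2                 # small residuals: squared error
        linear = delta * residual - 0.5 * delta ** 2    # large residuals: linear penalty
        return np.where(residual <= delta, quadratic, linear)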
Each tree is trained with a subsample chosen at random without replacement from the training set. By using such subsets of data to fit each weak predictor, one improves the overall model robustness rather than the prediction capacity of each tree [22].
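For reference, a gradient boosting regressor with the Table I settings could be configured as in the sketch below; this is a hedged illustration using scikit-learn, not the authors' code, and the variable names are illustrative:

    from sklearn.ensemble import GradientBoostingRegressor

    gbm = GradientBoostingRegressor(
        loss="huber",               # optimization function: Huber loss
        n_estimators=500,           # number of trees
        max_depth=3,                # maximum depth of each tree
        min_samples_split=100,      # number of samples required to make a split
        min_samples_leaf=50,        # minimum number of samples at a leaf node
        criterion="squared_error",  # split criterion: MSE
        subsample=0.5,              # fraction of samples used to train each tree
    )
    # gbm.fit(X_train, y_train); y_hat = gbm.predict(X_test)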
C. Convolutional Neural Network

A CNN is comprised of two components, which are responsible for extracting features and for returning either the classification or the regression output, respectively. The CNN owes its name to the mathematical operation performed in the first component, which is made of convolution and pooling layers. Convolution layers apply a set of filters to the input, generating a set of feature maps, and pooling layers are inserted between successive convolution layers to downsample the input volume, helping to avoid overfitting.
Unlike in conventional ANNs, these layers are not fully-connected, i.e., their neurons connect only to regions of neurons of the preceding layers. This property allows CNNs to have fewer free parameters and, consequently, to be computationally less demanding [23].
The CNN is very flexible and has a good predictive capacity provided its parameters are carefully tuned. CNN parameters like weights and biases are obtained in a data-driven fashion through the backpropagation algorithm, but CNN hyperparameters are set before training. We used random search for the hyperparameter tuning, which is more time-efficient than grid-search [24]. Table II sums up the set of hyperparameters used to define the network structure and how it is trained.
TABLE II
CNN PARAMETERS

  Category   Parameter                    Value
  Network    convolution layers (c)       3
             pooling layers (p)           3
             fully connected layers (f)   2
             layers sequence              c.p.c.p.c.p.f.f
             technique of subsampling     max pooling
             dropout                      50%
             activation function          ReLU
  Training   weights initialization       He et al.
             loss function                MAE
             optimization algorithm       Adam
             batch size                   20
             epochs                       500

Mean absolute error (MAE) is adopted as loss function and Adam is used as optimization algorithm because it shows huge performance gains in terms of training speed [25].
Thanks to the use of activation functions, ANNs are able to detect nonlinearities in the data. In this work, we use the Rectified Linear Unit (ReLU) activation function as it is the most widely used [26]. ReLU is given by

ReLU(y) = \max(0, y) = y^{+},    (3)

where y is the weighted sum of the neuron's inputs plus the bias term. As we use the ReLU activation function, the weight initialization is done using the method derived by He et al. in [27]. In this method, samples are drawn from a truncated normal distribution with zero mean and standard deviation given by

\sigma = \sqrt{\frac{2}{n_i}},    (4)

where n_i represents the number of input units in that layer.
Besides the parameters shown in Table II, the CNN has an additional set of parameters, which includes the kernel size, stride and padding in both convolution and pooling layers, the optimizer parameters and the number of units in each fully-connected layer.
In this work, we design two CNN architectures applying either 1-D convolution or 2-D convolution, further referred to as CNN-1D and CNN-2D, respectively. Each CNN is fitted to a specific dataset. Both sets include rolling partitions of the wind power series, but the dataset used to create the CNN-2D also includes rolling partitions of a series with wind speed forecasts.
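To make the Table II configuration concrete, the sketch below builds a CNN-1D of the stated form (three convolution/pooling pairs followed by two fully-connected layers) with ReLU activations, He initialization, 50% dropout, MAE loss and the Adam optimizer. It is a hedged illustration using tf.keras, not the authors' implementation; filter counts, kernel sizes and the dropout placement are assumptions, since they are not fixed in Table II:

    import tensorflow as tf

    def build_cnn_1d(window_length=72):
        """CNN-1D sketch following the layer sequence c.p.c.p.c.p.f.f of Table II."""
        model = tf.keras.Sequential([
            tf.keras.Input(shape=(window_length, 1)),      # 72-hour rolling window
            tf.keras.layers.Conv1D(32, kernel_size=3, activation="relu",
                                   kernel_initializer="he_normal"),
            tf.keras.layers.MaxPooling1D(pool_size=2),
            tf.keras.layers.Conv1D(64, kernel_size=3, activation="relu",
                                   kernel_initializer="he_normal"),
            tf.keras.layers.MaxPooling1D(pool_size=2),
            tf.keras.layers.Conv1D(64, kernel_size=3, activation="relu",
                                   kernel_initializer="he_normal"),
            tf.keras.layers.MaxPooling1D(pool_size=2),
            tf.keras.layers.Flatten(),
            tf.keras.layers.Dense(64, activation="relu",
                                  kernel_initializer="he_normal"),  # first fully-connected layer
            tf.keras.layers.Dropout(0.5),                  # 50% dropout (placement assumed)
            tf.keras.layers.Dense(1),                      # regression output: next-hour power
        ])
        model.compile(optimizer="adam", loss="mae")        # Adam optimizer, MAE loss
        return model

    # model = build_cnn_1d()
    # model.fit(X_train, y_train, batch_size=20, epochs=500)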
IV. METHODOLOGY

In this work, we use data provided on Kaggle for the Global Energy Forecasting Competition (GEFCom2012)¹. This annual contest used to reward whoever was able to forecast most accurately the hourly wind power up to 48 hours ahead at 7 locations.

¹ /c/GEF2012-wind-forecasting

The data gather normalized hourly wind power measurements for seven wind farms and wind forecasts at each location for 18 months, covering the period from Jul. 2009 to Jun. 2012. The wind forecasts include speed, direction and the meridional and zonal wind components.
Nevertheless, we only evaluate the wind power forecasting performance of ARIMA, GBM and the CNN at lead times from 1 to 24 hours ahead.
A. Data
The CNN and GBM are trained with a dataset created in an iterative fashion, while ARIMA is fitted directly to a portion of the wind power series, as follows:
• Both CNN and GBM are trained with a dataset consisting of time series partitions, i.e., the successive positions of a rectangular window that slides over the wind power series with unit stride and a length of 72 units.
• On the other hand, ARIMA is fitted to a portion of the wind power series containing 13 months of wind power generation to predict the following instance. At each iteration, ARIMA is recreated.
For forecasts at lead times greater than 1 hour, a rolling forecasting procedure is employed: each predicted hourly power value is used as a predictor of the next point in the time series until the prediction horizon has been fully covered.
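The two data-handling steps just described can be sketched as follows; this is an assumed illustration (not the paper's code) that builds 72-hour rolling windows with unit stride and performs the recursive multi-step forecast for a model that takes flat feature vectors, such as the GBM configured earlier:

    import numpy as np

    def make_windows(series, length=72):
        """Slide a window of `length` hours over the series with unit stride."""
        X = np.stack([series[i:i + length] for i in range(len(series) - length)])
        y = series[length:]                      # target: the hour after each window
        return X, y

    def rolling_forecast(model, last_window, horizon=24):
        """Recursive multi-step forecast for lead times greater than 1 hour."""
        window = list(last_window)
        predictions = []
        for _ in range(horizon):
            y_hat = float(model.predict(np.array(window)[None, :])[0])
            predictions.append(y_hat)
            window = window[1:] + [y_hat]        # feed the prediction back in, drop the oldest hour
        return predictions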
B. Model Evaluation
The performance is generically evaluated on an unseen portion of data, named the test set. Both CNN and GBM are fitted to a training set drawn at random from the whole dataset (75%) and tested against the remaining data. As ARIMA is rebuilt at each iteration, we use all predictions to measure the overall model performance.
In this work, we use three metrics to evaluate the predictive capacity of the methods under analysis: mean absolute error (MAE), root mean squared error (RMSE) and explained variance regression score (EVS). The mathematical formulas are presented in Table III.
TABLE III
PERFORMANCE METRICS

  Evaluation metric   Expression
  MAE                 \frac{1}{N}\sum_{i=1}^{N} |x_i - \hat{x}_i|
  RMSE                \sqrt{\frac{1}{N}\sum_{i=1}^{N} (x_i - \hat{x}_i)^2}
  EVS                 1 - \mathrm{Var}(x - \hat{x}) / \mathrm{Var}(x)
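The three metrics of Table III can be computed directly, for example with the NumPy sketch below (an illustration, assuming arrays of actual values x and predictions x_hat):

    import numpy as np

    def mae(x, x_hat):
        return np.mean(np.abs(x - x_hat))

    def rmse(x, x_hat):
        return np.sqrt(np.mean((x - x_hat) ** 2))

    def evs(x, x_hat):
        # explained variance regression score
        return 1.0 - np.var(x - x_hat) / np.var(x)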
V. RESULTS AND DISCUSSION

The three metrics stated before were used to assess the performance of the convolutional architecture against the benchmarking methods. Table IV depicts not only the 1-hour lead forecasting performance of the four methods under evaluation but also the results returned by a persistence model, i.e., a model that presumes that the wind power at t+1 is equal to the wind power at the preceding moment t, regardless of the atmospheric factors. Table IV shows that all methods (ARIMA, GBM, CNN-1D, CNN-2D) outperform the persistence model and that the proposed CNN-2D performed the best overall. Besides, we can see that ARIMA, GBM and CNN-1D perform similarly.
TABLE IV
MODEL PERFORMANCE FOR 1-HOUR LEAD FORECASTING

  Method     MAE      RMSE     EVS      Time [s]
  PERSIST.   0.0803   0.0968   0.8454   -
  ARIMA      0.0640   0.0886   0.8843   425.8349
  GBM        0.0631   0.0869   0.8899   0.0020
  CNN-1D     0.0667   0.0889   0.8898   0.0803
  CNN-2D     0.0489   0.0777   0.9382   0.0115
Table V presents the hourly day-ahead prediction performance of all methods. By comparing Table IV and Table V, one can see that the overall performance decays regardless of the method, because the rolling forecasting procedure used to predict the wind power instances up to 24 hours ahead propagates errors along the prediction horizon. For that reason, the wind power forecasts are generically less reliable as the prediction horizon increases.
Although no method is completely robust or immune to the propagation of errors, CNN-2D still has the best predictive capacity, followed by ARIMA. ARIMA is the second best method because there are portions of the time series that are nearly constant. When the wind power curve presents greater variability, both ARIMA and GBM seem to fail (Figure 2).
TABLE V
MODEL PERFORMANCE AT LEAD TIMES OF 24 HOURS

  Method     MAE      RMSE     EVS      Time [s]
  PERSIST.   0.2665   0.4089   0.0987   -
  ARIMA      0.2089   0.2614   0.2024   7503.3096
  GBM        0.2159   0.2731   0.1497   0.9300
  CNN-1D     0.2573   0.3288   0.1291   3.2320
  CNN-2D     0.1948   0.2052   0.4882   3.4783
Figure 2 displays a day in which the wind power series exhibits high variability. ARIMA and GBM act as a persistence model, so they do not seem to capture time dependencies and temporal structures. Once again, it is clear that CNN-2D is the method which approximates the time series most closely.

Fig. 2. 24-hour forecasting of hourly wind power.
Figure 1 compares the 1-hour lead forecasts provided by all methods over 5 days with the desired curve (target). We can see that all methods yield a quite accurate approximation of the wind power series.

Fig. 1. 1-hour forecasting of hourly wind power (5 days).
Furthermore, the difference between the short-term forecasting performance of CNN-1D and CNN-2D in both Tables IV and V shows that meteorological forecasts such as wind speed provide useful information, improving the model's forecasting ability in general, but especially for a prediction horizon greater than 1 hour.
VI. CONCLUSIONS
This paper studies the potential of a CNN-based methodology for wind power forecasting up to 24 hours ahead by comparing it to ARIMA and GBM.
With this work, we draw three main conclusions. Firstly, we show that both CNN architectures are able to predict the hourly wind power generation as well as the benchmarking methods. Secondly, we show that CNN-2D outperforms the remaining methods in general, but especially when the prediction horizon is greater than 1 hour. Finally, we show that ARIMA and GBM fail to predict the wind power generation when the daily power curve exhibits high variability.
As the present work is encouraging, we want to further study how additional meteorological forecasts, like wind direction, impact the model performance by including such features in the dataset. In the future, we also want to study the potential of a CNN-based ensemble made of two CNNs, where one is fitted to the wind power series and the other is fitted to the meteorological forecasts for the whole prediction horizon.
ACKNOWLEDGMENT

This work was supported by FCT, through IDMEC, under LAETA, project UID/EMS/50022/2019. The work of Margarida Solas and Nuno Cepeda was supported by Powergrid Lda, through Programa Operacional Regional do Centro, Projeto 11229.
REFERENCES

[1] M. R. Patel, Wind and Solar Power Systems: Design, Analysis, and Operation. CRC Press, 2005.
[2] Y. Tao, H. Chen, and C. Qiu, "Wind power prediction and pattern feature based on deep learning method," Power and Energy Engineering Conference (APPEEC), 2014 IEEE PES Asia-Pacific, pp. 1–4, 2014.
[3] N. Augustine, S. Suresh, P. Moghe, and K. Sheikh, "Economic dispatch for a microgrid considering renewable energy cost functions," in Innovative Smart Grid Technologies (ISGT), 2012 IEEE PES. IEEE, 2012, pp. 1–7.
[4] J. C. Smith, M. R. Milligan, E. A. DeMeo, and B. Parsons, "Utility wind integration and operating impact state of the art," IEEE Transactions on Power Systems, vol. 22, no. 3, pp. 900–908, 2007.
[5] G. Giebel, R. Brownsword, G. Kariniotakis, M. Denhard, and C. Draxl, "The state-of-the-art in short-term prediction of wind power: A literature overview," ANEMOS.plus, 2011.
[6] C. Monteiro, R. Bessa, V. Miranda, A. Botterud, J. Wang, G. Conzelmann et al., "Wind power forecasting: State-of-the-art 2009," Argonne National Lab. (ANL), Argonne, IL (United States), Tech. Rep., 2009.
[7] S. Pelland, J. Remund, J. Kleissl, T. Oozeki, and K. De Brabandere, "Photovoltaic and solar forecasting: state of the art," IEA PVPS, Task, vol. 14, pp. 1–36, 2013.
[8] I. Goodfellow, Y. Bengio, and A. Courville, Deep Learning. MIT Press, Cambridge, 2016, vol. 1.
[9] M. L?ngkvist, L. Karlsson, and A. Loutfi, "A review of unsupervised feature learning and deep learning for time-series modeling," Pattern Recognition Letters, vol. 42, pp. 11–24, 2014.
[10] P. J. Brockwell, R. A. Davis, and M. V. Calder, Introduction to Time Series and Forecasting. Springer, 2002, vol. 2.
[11] J. Friedman, T. Hastie, and R. Tibshirani, The Elements of Statistical Learning. Springer Series in Statistics, New York, NY, USA, 2001, vol. 1, no. 10.
[12] H. Z. Wang, G. Q. Li, G. B. Wang, J. C. Peng, H. Jiang, and Y. T. Liu, "Deep learning based ensemble approach for probabilistic wind power forecasting," Applied Energy, vol. 188, pp. 56–70, 2017. doi: 10.1016/j.apenergy.2016.11.111.
[13] T. Kuremoto, S. Kimura, K. Kobayashi, and M. Obayashi, "Time series forecasting using a deep belief network with restricted Boltzmann machines," Neurocomputing, vol. 137, pp. 47–56, 2014. doi: 10.1016/j.neucom.2013.03.047.
[14] C. Y. Zhang, C. L. Chen, M. Gan, and L. Chen, "Predictive Deep Boltzmann Machine for Multiperiod Wind Speed Forecasting," IEEE Transactions on Sustainable Energy,