A MATLAB Modeling Experiment on Fine-Scale Forecasting of the Masson Pine Caterpillar (Dendrolimus punctatus) Using Neural Networks
Zhang Guoqing
(Qianshan County Forestry Bureau, Anhui Province)
1. Data Sources
The occurrence-quantity and occurrence-period data for the Masson pine caterpillar come from the monitoring records of Qianshan County; the meteorological data come from the National Climate Center.
2. Data Preprocessing
To preserve the temporal integrity of caterpillar development, the overwintering-generation data were merged with the previous year's second-generation data during processing. This keeps each Masson pine caterpillar generation intact in time, which makes modeling and prediction more convenient.
(1) Meteorological data
Based on academic references such as Integrated Management of Pine Caterpillars and Pine Caterpillars in China, together with recent papers on monitoring and forecasting the Masson pine caterpillar, 16 meteorological factors with some correlation to occurrence quantity and occurrence period were preliminarily selected: egg-stage minimum temperature, egg-stage mean temperature, egg-stage accumulated temperature (degree-days), egg-stage rainfall; instar 1-2 minimum temperature, instar 1-2 mean temperature, instar 1-2 accumulated temperature (degree-days), instar 1-2 rainfall; larval-stage minimum temperature, larval-stage mean temperature, larval-stage accumulated temperature (degree-days), larval-stage rainfall; whole-generation minimum temperature, whole-generation mean temperature, whole-generation accumulated temperature (degree-days), and whole-generation rainfall. The raw meteorological data from the National Climate Center were converted, by year and generation, into these 16 variable series.
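The per-stage variables above can all be derived from daily weather records. The sketch below shows one way to do so; the record layout and the 10 °C development threshold are illustrative assumptions, not values stated in this paper.

```python
from datetime import date

# Hypothetical daily records for one developmental stage:
# (date, daily mean temperature in °C, rainfall in mm).
daily = [
    (date(2014, 4, 1), 12.5, 0.0),
    (date(2014, 4, 2), 14.0, 3.2),
    (date(2014, 4, 3), 9.5, 10.1),
]

BASE_TEMP = 10.0  # assumed lower development threshold for degree-days

mean_temp = sum(t for _, t, _ in daily) / len(daily)             # stage mean temperature
min_temp = min(t for _, t, _ in daily)                           # stage minimum temperature
degree_days = sum(max(t - BASE_TEMP, 0.0) for _, t, _ in daily)  # accumulated temperature
rainfall = sum(r for _, _, r in daily)                           # stage rainfall total
```

Each of the four stages (egg, instar 1-2, whole larval stage, whole generation) simply applies these four reductions over a different date window, giving the 16 variables.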
(2) Occurrence-quantity data
To allow occurrence intensity to be analyzed during modeling, the raw 1983-2014 monitoring data of Qianshan County were classified into three intensity grades ("light", "moderate", "severe") and summarized by generation and year.
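The three-grade split amounts to a thresholding function. The needle-loss cut-offs below are illustrative assumptions; the paper does not state the grading standard it applied.

```python
# Hypothetical severity grading by needle-loss percentage.
def grade(needle_loss_pct: float) -> str:
    if needle_loss_pct < 25:
        return "light"
    if needle_loss_pct < 50:
        return "moderate"
    return "severe"

yearly_records = [10, 30, 80]  # hypothetical per-stand needle-loss %
grades = [grade(v) for v in yearly_records]
```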
(3) Occurrence-period data
The raw 1983-2014 occurrence-period monitoring data of Qianshan County were first summarized by generation and year; the dates were then converted to day-of-year values so that they could be quantified for modeling.
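The date-to-day-of-year conversion can be sketched as follows. The example dates are hypothetical, but they land in the same range as the t1y and t2y series listed in Section 8.

```python
from datetime import date

def to_day_of_year(d: date) -> int:
    """Map a calendar date to its ordinal day within the year (1-366)."""
    return d.timetuple().tm_yday

# In a non-leap year such as 1983, 11 June is day 162 and 10 September
# is day 253 - the same order of magnitude as the t1y/t2y values below.
first_peak = to_day_of_year(date(1983, 6, 11))
second_peak = to_day_of_year(date(1983, 9, 10))
```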
3. Predictor Selection
Based on correlation analysis and comparative modeling trials, the predictors selected for first-generation occurrence quantity were instar 1-2 minimum temperature, egg-stage minimum temperature, previous-generation control efficacy, and previous-generation control area; for second-generation occurrence quantity, instar 1-2 minimum temperature, egg-stage minimum temperature, previous-generation control efficacy, previous-generation control area, instar 1-2 rainfall, and egg-stage rainfall; for the first-generation larval peak, instar 1-2 mean temperature, instar 1-2 accumulated temperature (degree-days), instar 1-2 minimum temperature, and egg-stage minimum temperature; for the second-generation larval peak, the first-appearance date of adults, egg-stage mean temperature, egg-stage accumulated temperature (degree-days), and instar 1-2 minimum temperature.
The first-generation occurrence-quantity target variable is named s1y and its predictor matrix s1x; the second-generation occurrence-quantity target is s2y with predictors s2x; the first-generation larval-peak target is t1y with predictors t1x; and the second-generation larval-peak target is t2y with predictors t2x.
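The correlation screening behind this selection can be sketched by ranking candidate factors by the magnitude of their Pearson correlation with the target. The numbers below are made-up illustration values, not the Qianshan series, and the factor labels are shortened.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (sx * sy)

occurrence = [4639, 8096, 12294, 7667, 14733]  # target: occurrence quantity
factors = {
    "instar-1-2 min temp": [18, 20, 18, 20, 13],
    "egg-stage rainfall": [0, 1054, 1651, 667, 2980],
}
# Rank factors by |r| with the target, strongest first.
ranked = sorted(factors, key=lambda k: -abs(pearson_r(factors[k], occurrence)))
```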
4. First-Generation Occurrence-Quantity Modeling Experiment
4.1 Program code
The Simple Script is:
% Solve an Input-Output Fitting problem with a Neural Network
% Script generated by Neural Fitting app
% Created Wed Oct 28 19:28:48 CST 2015
%
% This script assumes these variables are defined:
%
%   s1x - input data.
%   s1y - target data.
x = s1x';
t = s1y';
% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. NFTOOL falls back to this in low memory situations.
trainFcn = 'trainlm';  % Levenberg-Marquardt
% Create a Fitting Network
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize, trainFcn);
% Setup Division of Data for Training, Validation, Testing
net.divideParam.trainRatio = 90/100;
net.divideParam.valRatio = 5/100;
net.divideParam.testRatio = 5/100;
% Train the Network
[net, tr] = train(net, x, t);
% Test the Network
y = net(x);
e = gsubtract(t, y);
performance = perform(net, t, y)
% View the Network
view(net)
% Plots
% Uncomment these lines to enable various plots.
% figure, plotperform(tr)
% figure, plottrainstate(tr)
% figure, plotfit(net, x, t)
% figure, plotregression(t, y)
% figure, ploterrhist(e)
The Advanced Script is:
% Solve an Input-Output Fitting problem with a Neural Network
% Script generated by Neural Fitting app
% Created Wed Oct 28 19:29:03 CST 2015
%
% This script assumes these variables are defined:
%
%   s1x - input data.
%   s1y - target data.
x = s1x';
t = s1y';
% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. NFTOOL falls back to this in low memory situations.
trainFcn = 'trainlm';  % Levenberg-Marquardt
% Create a Fitting Network
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize, trainFcn);
% Choose Input and Output Pre/Post-Processing Functions
% For a list of all processing functions type: help nnprocess
net.input.processFcns = {'removeconstantrows', 'mapminmax'};
net.output.processFcns = {'removeconstantrows', 'mapminmax'};
% Setup Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivide
net.divideFcn = 'dividerand';  % Divide data randomly
net.divideMode = 'sample';  % Divide up every sample
net.divideParam.trainRatio = 90/100;
net.divideParam.valRatio = 5/100;
net.divideParam.testRatio = 5/100;
% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse';  % Mean squared error
% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform', 'plottrainstate', 'ploterrhist', ...
    'plotregression', 'plotfit'};
% Train the Network
[net, tr] = train(net, x, t);
% Test the Network
y = net(x);
e = gsubtract(t, y);
performance = perform(net, t, y)
% Recalculate Training, Validation and Test Performance
trainTargets = t .* tr.trainMask{1};
valTargets = t .* tr.valMask{1};
testTargets = t .* tr.testMask{1};
trainPerformance = perform(net, trainTargets, y)
valPerformance = perform(net, valTargets, y)
testPerformance = perform(net, testTargets, y)
% View the Network
view(net)
% Plots
% Uncomment these lines to enable various plots.
% figure, plotperform(tr)
% figure, plottrainstate(tr)
% figure, plotfit(net, x, t)
% figure, plotregression(t, y)
% figure, ploterrhist(e)
% Deployment
% Change the (false) values to (true) to enable the following code blocks.
if (false)
    % Generate MATLAB function for neural network for application deployment
    % in MATLAB scripts or with MATLAB Compiler and Builder tools, or simply
    % to examine the calculations your trained neural network performs.
    genFunction(net, 'myNeuralNetworkFunction');
    y = myNeuralNetworkFunction(x);
end
if (false)
    % Generate a matrix-only MATLAB function for neural network code
    % generation with MATLAB Coder tools.
    genFunction(net, 'myNeuralNetworkFunction', 'MatrixOnly', 'yes');
    y = myNeuralNetworkFunction(x);
end
if (false)
    % Generate a Simulink diagram for simulation or deployment with
    % Simulink Coder tools.
    gensim(net);
end
4.2 Network training process
The network training process was:
Figure 1. Training process of the first-generation occurrence-quantity network
4.3 Training results
The training results were:
Figure 2. Training results of the first-generation occurrence-quantity network
The R values of the training, validation, and test samples were 0.875337, -1, and 1, respectively.
The error histogram is:
Figure 3. Error histogram for the first-generation occurrence-quantity network
The regression plots for the training, validation, test, and complete data sets are:
Figure 4. Regression plots for the first-generation occurrence-quantity network
The validation and test samples both reached |R| = 1.
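The near-perfect validation and test R values are easier to interpret once the split sizes are spelled out: with 32 generation-year records and a 90/5/5 random split, the validation and test sets hold only one or two samples each, and the Pearson R of any two distinct points is exactly ±1. A pure-Python sketch of a 'dividerand'-style split (the function name is ours; this does not reproduce MATLAB's actual random number generator):

```python
import random

def divide_rand(n_samples, train=0.90, val=0.05, seed=0):
    """Shuffle sample indices and cut them into train/val/test blocks."""
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)
    n_train = round(train * n_samples)
    n_val = round(val * n_samples)
    return idx[:n_train], idx[n_train:n_train + n_val], idx[n_train + n_val:]

# 32 generation-year records, as in the data of Section 8.
tr_idx, val_idx, test_idx = divide_rand(32)
```

With one or two held-out points, R values of ±1 are the expected outcome and say little about generalization; a larger holdout or cross-validation would give a more informative estimate.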
5. Second-Generation Occurrence-Quantity Modeling Experiment
5.1 Program code
The Simple Script is:
% Solve an Input-Output Fitting problem with a Neural Network
% Script generated by Neural Fitting app
% Created Wed Oct 28 20:04:18 CST 2015
%
% This script assumes these variables are defined:
%
%   s2x - input data.
%   s2y - target data.
x = s2x';
t = s2y';
% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. NFTOOL falls back to this in low memory situations.
trainFcn = 'trainlm';  % Levenberg-Marquardt
% Create a Fitting Network
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize, trainFcn);
% Setup Division of Data for Training, Validation, Testing
net.divideParam.trainRatio = 90/100;
net.divideParam.valRatio = 5/100;
net.divideParam.testRatio = 5/100;
% Train the Network
[net, tr] = train(net, x, t);
% Test the Network
y = net(x);
e = gsubtract(t, y);
performance = perform(net, t, y)
% View the Network
view(net)
% Plots
% Uncomment these lines to enable various plots.
% figure, plotperform(tr)
% figure, plottrainstate(tr)
% figure, plotfit(net, x, t)
% figure, plotregression(t, y)
% figure, ploterrhist(e)
The Advanced Script is:
% Solve an Input-Output Fitting problem with a Neural Network
% Script generated by Neural Fitting app
% Created Wed Oct 28 20:04:31 CST 2015
%
% This script assumes these variables are defined:
%
%   s2x - input data.
%   s2y - target data.
x = s2x';
t = s2y';
% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. NFTOOL falls back to this in low memory situations.
trainFcn = 'trainlm';  % Levenberg-Marquardt
% Create a Fitting Network
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize, trainFcn);
% Choose Input and Output Pre/Post-Processing Functions
% For a list of all processing functions type: help nnprocess
net.input.processFcns = {'removeconstantrows', 'mapminmax'};
net.output.processFcns = {'removeconstantrows', 'mapminmax'};
% Setup Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivide
net.divideFcn = 'dividerand';  % Divide data randomly
net.divideMode = 'sample';  % Divide up every sample
net.divideParam.trainRatio = 90/100;
net.divideParam.valRatio = 5/100;
net.divideParam.testRatio = 5/100;
% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse';  % Mean squared error
% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform', 'plottrainstate', 'ploterrhist', ...
    'plotregression', 'plotfit'};
% Train the Network
[net, tr] = train(net, x, t);
% Test the Network
y = net(x);
e = gsubtract(t, y);
performance = perform(net, t, y)
% Recalculate Training, Validation and Test Performance
trainTargets = t .* tr.trainMask{1};
valTargets = t .* tr.valMask{1};
testTargets = t .* tr.testMask{1};
trainPerformance = perform(net, trainTargets, y)
valPerformance = perform(net, valTargets, y)
testPerformance = perform(net, testTargets, y)
% View the Network
view(net)
% Plots
% Uncomment these lines to enable various plots.
% figure, plotperform(tr)
% figure, plottrainstate(tr)
% figure, plotfit(net, x, t)
% figure, plotregression(t, y)
% figure, ploterrhist(e)
% Deployment
% Change the (false) values to (true) to enable the following code blocks.
if (false)
    % Generate MATLAB function for neural network for application deployment
    % in MATLAB scripts or with MATLAB Compiler and Builder tools, or simply
    % to examine the calculations your trained neural network performs.
    genFunction(net, 'myNeuralNetworkFunction');
    y = myNeuralNetworkFunction(x);
end
if (false)
    % Generate a matrix-only MATLAB function for neural network code
    % generation with MATLAB Coder tools.
    genFunction(net, 'myNeuralNetworkFunction', 'MatrixOnly', 'yes');
    y = myNeuralNetworkFunction(x);
end
if (false)
    % Generate a Simulink diagram for simulation or deployment with
    % Simulink Coder tools.
    gensim(net);
end
5.2 Network training process
The network training process was:
Figure 5. Training process of the second-generation occurrence-quantity network
5.3 Training results
The training results were:
Figure 6. Training results of the second-generation occurrence-quantity network
The R values of the training, validation, and test samples were 0.942388, 0.999999, and 1, respectively.
The error histogram is:
Figure 7. Error histogram for the second-generation occurrence-quantity network
The regression plots for the training, validation, test, and complete data sets are:
Figure 8. Regression plots for the second-generation occurrence-quantity network
The validation and test R values were both essentially 1; the training-sample R was 0.94239 and the all-data R was 0.89479.
6. First-Generation Larval-Peak Modeling Experiment
6.1 Program code
The Simple Script is:
% Solve an Input-Output Fitting problem with a Neural Network
% Script generated by Neural Fitting app
% Created Wed Oct 28 20:16:32 CST 2015
%
% This script assumes these variables are defined:
%
%   t1x - input data.
%   t1y - target data.
x = t1x';
t = t1y';
% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. NFTOOL falls back to this in low memory situations.
trainFcn = 'trainlm';  % Levenberg-Marquardt
% Create a Fitting Network
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize, trainFcn);
% Setup Division of Data for Training, Validation, Testing
net.divideParam.trainRatio = 90/100;
net.divideParam.valRatio = 5/100;
net.divideParam.testRatio = 5/100;
% Train the Network
[net, tr] = train(net, x, t);
% Test the Network
y = net(x);
e = gsubtract(t, y);
performance = perform(net, t, y)
% View the Network
view(net)
% Plots
% Uncomment these lines to enable various plots.
% figure, plotperform(tr)
% figure, plottrainstate(tr)
% figure, plotfit(net, x, t)
% figure, plotregression(t, y)
% figure, ploterrhist(e)
The Advanced Script is:
% Solve an Input-Output Fitting problem with a Neural Network
% Script generated by Neural Fitting app
% Created Wed Oct 28 20:17:08 CST 2015
%
% This script assumes these variables are defined:
%
%   t1x - input data.
%   t1y - target data.
x = t1x';
t = t1y';
% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. NFTOOL falls back to this in low memory situations.
trainFcn = 'trainlm';  % Levenberg-Marquardt
% Create a Fitting Network
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize, trainFcn);
% Choose Input and Output Pre/Post-Processing Functions
% For a list of all processing functions type: help nnprocess
net.input.processFcns = {'removeconstantrows', 'mapminmax'};
net.output.processFcns = {'removeconstantrows', 'mapminmax'};
% Setup Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivide
net.divideFcn = 'dividerand';  % Divide data randomly
net.divideMode = 'sample';  % Divide up every sample
net.divideParam.trainRatio = 90/100;
net.divideParam.valRatio = 5/100;
net.divideParam.testRatio = 5/100;
% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse';  % Mean squared error
% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform', 'plottrainstate', 'ploterrhist', ...
    'plotregression', 'plotfit'};
% Train the Network
[net, tr] = train(net, x, t);
% Test the Network
y = net(x);
e = gsubtract(t, y);
performance = perform(net, t, y)
% Recalculate Training, Validation and Test Performance
trainTargets = t .* tr.trainMask{1};
valTargets = t .* tr.valMask{1};
testTargets = t .* tr.testMask{1};
trainPerformance = perform(net, trainTargets, y)
valPerformance = perform(net, valTargets, y)
testPerformance = perform(net, testTargets, y)
% View the Network
view(net)
% Plots
% Uncomment these lines to enable various plots.
% figure, plotperform(tr)
% figure, plottrainstate(tr)
% figure, plotfit(net, x, t)
% figure, plotregression(t, y)
% figure, ploterrhist(e)
% Deployment
% Change the (false) values to (true) to enable the following code blocks.
if (false)
    % Generate MATLAB function for neural network for application deployment
    % in MATLAB scripts or with MATLAB Compiler and Builder tools, or simply
    % to examine the calculations your trained neural network performs.
    genFunction(net, 'myNeuralNetworkFunction');
    y = myNeuralNetworkFunction(x);
end
if (false)
    % Generate a matrix-only MATLAB function for neural network code
    % generation with MATLAB Coder tools.
    genFunction(net, 'myNeuralNetworkFunction', 'MatrixOnly', 'yes');
    y = myNeuralNetworkFunction(x);
end
if (false)
    % Generate a Simulink diagram for simulation or deployment with
    % Simulink Coder tools.
    gensim(net);
end
6.2 Network training process
The network training process was:
Figure 9. Training process of the first-generation larval-peak network
6.3 Training results
The training results were:
Figure 10. Training results of the first-generation larval-peak network
The R values of the training, validation, and test samples were 0.875337, -1, and 1, respectively.
The error histogram is:
Figure 11. Error histogram for the first-generation larval-peak network
The regression plots for the training, validation, test, and complete data sets are:
Figure 12. Regression plots for the first-generation larval-peak network
The validation and test samples both reached |R| = 1.
7. Second-Generation Larval-Peak Modeling Experiment
7.1 Program code
The Simple Script is:
% Solve an Input-Output Fitting problem with a Neural Network
% Script generated by Neural Fitting app
% Created Wed Oct 28 20:22:04 CST 2015
%
% This script assumes these variables are defined:
%
%   t2x - input data.
%   t2y - target data.
x = t2x';
t = t2y';
% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. NFTOOL falls back to this in low memory situations.
trainFcn = 'trainlm';  % Levenberg-Marquardt
% Create a Fitting Network
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize, trainFcn);
% Setup Division of Data for Training, Validation, Testing
net.divideParam.trainRatio = 90/100;
net.divideParam.valRatio = 5/100;
net.divideParam.testRatio = 5/100;
% Train the Network
[net, tr] = train(net, x, t);
% Test the Network
y = net(x);
e = gsubtract(t, y);
performance = perform(net, t, y)
% View the Network
view(net)
% Plots
% Uncomment these lines to enable various plots.
% figure, plotperform(tr)
% figure, plottrainstate(tr)
% figure, plotfit(net, x, t)
% figure, plotregression(t, y)
% figure, ploterrhist(e)
The Advanced Script is:
% Solve an Input-Output Fitting problem with a Neural Network
% Script generated by Neural Fitting app
% Created Wed Oct 28 20:22:29 CST 2015
%
% This script assumes these variables are defined:
%
%   t2x - input data.
%   t2y - target data.
x = t2x';
t = t2y';
% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. NFTOOL falls back to this in low memory situations.
trainFcn = 'trainlm';  % Levenberg-Marquardt
% Create a Fitting Network
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize, trainFcn);
% Choose Input and Output Pre/Post-Processing Functions
% For a list of all processing functions type: help nnprocess
net.input.processFcns = {'removeconstantrows', 'mapminmax'};
net.output.processFcns = {'removeconstantrows', 'mapminmax'};
% Setup Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivide
net.divideFcn = 'dividerand';  % Divide data randomly
net.divideMode = 'sample';  % Divide up every sample
net.divideParam.trainRatio = 90/100;
net.divideParam.valRatio = 5/100;
net.divideParam.testRatio = 5/100;
% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse';  % Mean squared error
% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform', 'plottrainstate', 'ploterrhist', ...
    'plotregression', 'plotfit'};
% Train the Network
[net, tr] = train(net, x, t);
% Test the Network
y = net(x);
e = gsubtract(t, y);
performance = perform(net, t, y)
% Recalculate Training, Validation and Test Performance
trainTargets = t .* tr.trainMask{1};
valTargets = t .* tr.valMask{1};
testTargets = t .* tr.testMask{1};
trainPerformance = perform(net, trainTargets, y)
valPerformance = perform(net, valTargets, y)
testPerformance = perform(net, testTargets, y)
% View the Network
view(net)
% Plots
% Uncomment these lines to enable various plots.
% figure, plotperform(tr)
% figure, plottrainstate(tr)
% figure, plotfit(net, x, t)
% figure, plotregression(t, y)
% figure, ploterrhist(e)
% Deployment
% Change the (false) values to (true) to enable the following code blocks.
if (false)
    % Generate MATLAB function for neural network for application deployment
    % in MATLAB scripts or with MATLAB Compiler and Builder tools, or simply
    % to examine the calculations your trained neural network performs.
    genFunction(net, 'myNeuralNetworkFunction');
    y = myNeuralNetworkFunction(x);
end
if (false)
    % Generate a matrix-only MATLAB function for neural network code
    % generation with MATLAB Coder tools.
    genFunction(net, 'myNeuralNetworkFunction', 'MatrixOnly', 'yes');
    y = myNeuralNetworkFunction(x);
end
if (false)
    % Generate a Simulink diagram for simulation or deployment with
    % Simulink Coder tools.
    gensim(net);
end
7.2 Network training process
The network training process was:
Figure 13. Training process of the second-generation larval-peak network
7.3 Training results
The training results were:
Figure 14. Training results of the second-generation larval-peak network
The R values of the training, validation, and test samples were 0.402150, 1, and 1, respectively.
The error histogram is:
Figure 15. Error histogram for the second-generation larval-peak network
The regression plots for the training, validation, test, and complete data sets are:
Figure 16. Regression plots for the second-generation larval-peak network
The validation and test R values were both 1.
8. Raw Data
The first-generation occurrence-quantity target is s1y with predictor matrix s1x; the second-generation occurrence-quantity target is s2y with predictors s2x; the first-generation larval-peak target is t1y with predictors t1x; and the second-generation larval-peak target is t2y with predictors t2x. The raw data are:
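When reading the matrices below, note MATLAB's common-scale-factor convention: a header such as `1.0e+04 *` means every displayed entry must be multiplied by that factor to recover the actual value. A small sketch, using the first printed s1y entries:

```python
# MATLAB prints scaled matrices as "1.0e+04 *" followed by the mantissas;
# the true values are mantissa * scale.
scale = 1.0e4
s1y_display = [0.4639, 0.8096, 1.2294]       # first three rows as printed
s1y_actual = [scale * v for v in s1y_display]  # e.g. 4639, 8096, 12294
```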
s1x=
1.0e+04*
0.0018    0.0018    0         0
0.0020    0.0020    0.0001    0.1054
0.0018    0.0018    0.0001    0.1651
0.0020    0.0020    0.0001    0.0667
0.0013    0.0013    0.0001    0.2980
0.0019    0.0021    0.0001    1.0873
0.0018    0.0019    0         0
0.0021    0.0021    0.0001    0.6967
0.0020    0.0019    0.0001    0.3207
0.0019    0.0019    0         0
0.0019    0.0017    0.0001    0.0100
0.0019    0.0020    0.0001    0.0337
0.0019    0.0019    0.0001    0.6313
0.0019    0.0017    0.0001    0.0380
0.0017    0.0017    0.0001    3.3381
0.0019    0.0019    0         0
0.0019    0.0019    0         0
0.0018    0.0018    0         0
0.0020    0.0019    0         0
0.0024    0.0024    0         0
0.0019    0.0019    0.0001    0.5500
0.0018    0.0018    0.0001    1.7041
0.0021    0.0021    0.0001    0.1785
0.0019    0.0019    0.0001    0.0607
0.0020    0.0020    0.0001    0.5295
0.0019    0.0020    0.0001    0.5240
0.0019    0.0019    0         0.0228
0.0018    0.0018    0         0.0333
0.0020    0.0020    0.0000    1.2040
0.0022    0.0020    0         0
0.0016    0.0016    0         0
0.0020    0.0017    0         0
s1y=
1.0e+04*
0.4639
0.8096
1.2294
0.7667
1.4733
0.2387
0.2042
0.1312
0.0100
0.0031
0.0030
0.2434
0.0764
0.5497
0.0362
0
0.0327
0.1533
0.0017
0.2313
0.4684
0.1733
0.1282
0.0183
0.0283
0.0367
0.0319
0.0720
0.3400
0.0687
0.2200
0.2200
s2x=
1.0e+03*
0.0212    0.3667    0.0197    0.0007    0.0766    0.0798
0.0179    1.3187    0.0201    0.0008    0.1695    0.1504
0.0219    1.8167    0.0219    0.0008    0.0145    0.0112
0.0216    2.0400    0.0218    0.0008    0.0180    0.0096
0.0199    8.2667    0.0238    0.0008    0.0608    0.0623
0.0209    0.0867    0.0194    0.0009    0.2156    0.1275
0.0186    1.2280    0.0209    0.0008    0.1199    0.1103
0.0209    1.1129    0.0209    0.0007    0.0134    0.0044
0.0192    0         0.0227    0         0.0717    0.0007
0.0173    0.0313    0.0218    0.0009    0.0397    0.0137
0.0176    0         0.0176    0         0.0178    0.0178
0.0210    0.8000    0.0238    0.0008    0.0867    0.0442
0.0199    0.3067    0.0220    0.0009    0.0011    0.0518
0.0209    0         0.0209    0         0.0426    0.0587
0.0179    0         0.0205    0         0.0478    0.0475
0.0213    0         0.0222    0         0.0118    0.0039
0.0210    0         0.0205    0         0.1534    0.3282
0.0178    0         0.0222    0         0.0507    0.0461
0.0209    0         0.0209    0         0.0033    0.0299
0.0213    2.3133    0.0251    0.0008    0.0154    0.0133
0.0197    2.8867    0.0197    0.0008    0.0116    0.0049
0.0195    1.7333    0.0217    0.0008    0.0578    0.0757
0.0200    1.2820    0.0211    0.0009    0.1316    0.1654
0.0176    0         0.0245    0         0.0154    0.0564
0.0190    0.5667    0.0190    0.0008    0.1027    0.1208
0.0200    1.7667    0.0210    0         0.0528    0.0476
0.0178    0.4000    0.0178    0         0.0243    0.0243
0.0210    0.2000    0.0210    0         0.1343    0.1161
0.0202    0         0.0191    0         0.0216    0.0806
0.0203    0         0.0207    0         0.1032    0.0089
0.0175    0         0.0203    0         0.0994    0.0799
0.0189    0         0.0213    0         0.0282    0.0344
s2y=
1.0e+04*
0.9162
0.8164
1.0393
0.6327
1.7020
0.0989
0.6046
0.3207
0.0617
0.0102
0.0400
0.6903
0.2784
3.3381
0.0200
0
0.0327
0.1333
0.2517
1.0533
1.8514
0.1785
0.0607
0.0367
0.0387
0.0215
0.0333
3.4874
0.2610
0.1933
0.2333
0.2000
t1x=
23.9000    358.6000    18.3000    17.9000
25.3000    378.8000    19.7000    19.7000
25.0000    375.3000    17.5000    17.5000
24.4000    366.7000    19.8000    19.8000
22.5000    336.8000    13.2000    13.2000
26.3000    395.2000    18.7000    20.9000
25.3000    379.5000    18.1000    19.4000
26.6000    399.5000    21.0000    20.6000
25.0000    374.4000    19.9000    18.5000
25.5000    381.8000    18.6000    18.6000
26.9000    404.0000    19.3000    16.7000
23.9000    357.8000    19.1000    19.7000
23.9000    359.0000    19.0000    19.0000
26.5000    398.1000    18.7000    16.9000
25.1000    375.9000    17.3000    17.3000
24.7000    370.3000    18.9000    18.9000
24.5000    368.0000    18.7000    18.7000
25.4000    381.4000    18.3000    18.3000
25.5000    383.0000    19.9000    18.5000
28.6000    429.5000    23.7000    23.7000
26.2000    393.5000    19.5000    19.5000
24.4000    365.4000    18.3000    18.3000
27.2000    407.3000    21.1000    21.1000
27.8000    416.5000    18.5000    18.5000
25.7000    386.1000    19.7000    19.7000
24.6000    368.3000    19.2000    19.6000
27.1000    406.9000    18.8000    18.8000
24.5000    367.4000    18.1000    18.1000
24.3000    364.1000    20.1000    20.1000
27.0000    405.3000    22.0000    20.1000
25.5000    381.9000    16.3000    16.3000
26.4000    396.0000    20.4000    17.0000
t1y=
162
163
162
162
162
163
160
160
160
160
159
158
158
159
158
158
158
157
156
156
156
157
156
156
155
156
155
155
155
156
155
155
t2x=
137.0000    26.9000    349.2000    21.2000
137.0000    27.3000    355.1000    17.9000
136.0000    28.1000    364.9000    21.9000
133.0000    26.4000    343.3000    21.6000
134.0000    28.7000    373.7000    19.9000
133.0000    24.0000    312.3000    20.9000
131.0000    26.0000    338.2000    18.6000
130.0000    28.2000    366.5000    20.9000
130.0000    28.9000    375.2000    19.2000
130.0000    26.5000    344.9000    17.3000
128.0000    25.3000    329.1000    17.6000
128.0000    29.0000    376.4000    21.0000
126.0000    29.3000    380.4000    19.9000
126.0000    26.1000    339.0000    20.9000
123.0000    27.4000    356.5000    17.9000
122.0000    27.8000    361.9000    21.3000
120.0000    24.4000    316.9000    21.0000
120.0000    27.5000    357.5000    17.8000
118.0000    25.6000    332.9000    20.9000
117.0000    29.6000    385.2000    21.3000
117.0000    29.3000    380.9000    19.7000
118.0000    26.4000    343.2000    19.5000
117.0000    24.8000    322.5000    20.0000
117.0000    30.3000    393.4000    17.6000
117.0000    27.6000    359.1000    19.0000
118.0000    26.4000    342.6000    20.0000
117.0000    27.5000    357.4000    17.8000
117.0000    26.6000    346.1000    21.0000
117.0000    25.2000    327.9000    20.2000
118.0000    26.6000    345.7000    20.3000
117.0000    27.3000    354.5000    17.5000
117.0000    25.0000    324.8000    18.9000
t2y=
253
254
253
253
253
254
252
252
252
253
250
249
249
250
249
249
249
249
248
248
248
249
248
248
248
249
248
248
248
249
248
248