      Implementing a Three-Layer BP Network with MATLAB's Neural Network Toolbox

       Ethan's Blog, 2011-08-04



      % Read the training and test data
      Input = [];
      Output = [];
      str = {'Test','Check'};
      Data = textread([str{1},'.txt']);
      % Read the training data
      Input = Data(:,1:end-1);
      % All columns except the last are the input features
      Output = Data(:,end);
      % The last column is the target output
      Data = textread([str{2},'.txt']);
      % Read the test data
      CheckIn = Data(:,1:end-1);
      % All columns except the last are the input features
      CheckOut = Data(:,end);
      % The last column is the target output
      Input = Input';
      Output = Output';
      CheckIn = CheckIn';
      CheckOut = CheckOut';
      % Transpose so that each column is one sample
      [Input,minp,maxp,Output,mint,maxt] = premnmx(Input,Output);
      % Normalize the training data to [-1,1]
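      Note that premnmx rescales only the training data; the Check set read above is still in raw units. In the same pre-R2010 toolbox API, tramnmx applies the stored training-set ranges to new data (CheckInN is an illustrative variable name, not from the original listing):

      ```matlab
      % Scale the test inputs with the training-set ranges (minp, maxp),
      % so the network sees test data on the same [-1,1] scale it was trained on.
      CheckInN = tramnmx(CheckIn, minp, maxp);
      ```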

      %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
      % Neural-network parameter settings
      %==== values you may want to tune
      Para.Goal = 0.0001;
      % Target training error
      Para.Epochs = 800;
      % Maximum number of training epochs
      Para.LearnRate = 0.1;
      % Learning rate
      %====
      Para.Show = 5;
      % Interval (in epochs) between training-progress displays
      Para.InRange = repmat([-1 1],size(Input,1),1);
      % Range of each network input variable
      Para.Neurons = [size(Input,1)*2+1 1];
      % Neuron counts for the hidden and output layers
      Para.TransferFcn= {'logsig' 'purelin'};
      % Transfer function of each layer
      Para.TrainFcn = 'trainlm';
      % Training function
      % trainlm  : Levenberg-Marquardt backpropagation
      % traingd  : gradient-descent backpropagation
      % traingda : gradient descent with adaptive learning rate
      % traingdm : gradient descent with momentum
      % traingdx : gradient descent with momentum and adaptive learning rate
      Para.LearnFcn = 'learngdm';
      % Weight/bias learning function
      Para.PerformFcn = 'sse';
      % Performance (error) function
      Para.InNum = size(Input,1);
      % Number of input variables
      Para.IWNum = Para.InNum*Para.Neurons(1);
      % Number of input weights
      Para.LWNum = prod(Para.Neurons);
      % Number of layer weights
      Para.BiasNum = sum(Para.Neurons);
      % Number of biases
      %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
      Net = newff(Para.InRange,Para.Neurons,Para.TransferFcn,...
      Para.TrainFcn,Para.LearnFcn,Para.PerformFcn);
      % Create the network
      %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
      Net.trainParam.show = Para.Show;
      % Display interval
      Net.trainParam.goal = Para.Goal;
      % Target training error
      Net.trainParam.lr = Para.LearnRate;
      % Learning rate
      Net.trainParam.epochs = Para.Epochs;
      % Maximum number of epochs

      Net.performFcn = Para.PerformFcn;
      % Performance function
      %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
      % Sanity check before training
      Out1 = sim(Net,Input);
      % Simulate the newly created (untrained) network
      Sse1 = sse(Output-Out1);
      % SSE of the untrained network

      %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
      [Net,TR] = train(Net,Input,Output);
      % Train the network; TR is the training record
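      The training record TR returned by train can be used to check convergence; a minimal sketch, assuming the TR.epoch and TR.perf fields of the old toolbox's training record:

      ```matlab
      % Plot the SSE against training epoch on a log scale.
      semilogy(TR.epoch, TR.perf);
      xlabel('Epoch'); ylabel('SSE');
      title('Training error curve');
      ```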
      %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
      Out3 = sim(Net,Input);
      % Simulate the trained network on the training data
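      The listing stops after re-simulating the training set; to actually use the Check data read at the top, a common follow-up (a sketch assuming the same premnmx-era API; CheckInN, CheckPred, and TestSse are illustrative names) is:

      ```matlab
      % Evaluate the trained network on the held-out Check set.
      CheckInN  = tramnmx(CheckIn, minp, maxp);  % scale test inputs like the training inputs
      OutN      = sim(Net, CheckInN);            % predictions in normalized units
      CheckPred = postmnmx(OutN, mint, maxt);    % map predictions back to original units
      TestSse   = sse(CheckOut - CheckPred);     % test-set sum-squared error
      ```

      Keeping the evaluation in original units (via postmnmx) makes TestSse directly comparable to the raw targets in Check.txt.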
