Classification Prediction | MATLAB Implementation of WOA-CNN-BiGRU: Whale Optimization Algorithm-Optimized Convolutional Bidirectional Gated Recurrent Unit for Data Classification Prediction
Classification Results
Basic Description
1. MATLAB implementation of WOA-CNN-BiGRU multi-feature classification prediction with a multi-feature-input model; requires MATLAB R2020b or later.
2. The Whale Optimization Algorithm (WOA) is used to optimize the convolutional neural network-bidirectional gated recurrent unit (CNN-BiGRU) classifier; the optimized hyperparameters are the learning rate, the number of hidden-layer nodes, and the regularization coefficient (an illustrative set-up sketch is shown after this list).
3. Multi-feature-input, single-output model for both binary and multi-class classification. The program is thoroughly commented and can be reused by simply replacing the data. It is written in MATLAB and produces a classification-result plot, an optimization-iteration (convergence) curve, and a confusion matrix.
4. data is the data set, with 12 input features and four classes. Only the main program needs to be run; the remaining files are function files. The data and program can be obtained from the download area.
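As a rough illustration of how these three hyperparameters could be handed to the optimizer, the sketch below sets up the search-space bounds and calls the WOA routine listed under Program Design. The population size, bounds, and the fitness-function name fical are assumptions for illustration only; in practice the fitness function would train the CNN-BiGRU once with the candidate hyperparameters and return the validation classification error.

% Hypothetical set-up (population size, bounds, and the fitness function
% name "fical" are assumptions, not taken from the downloadable program).
pop      = 8;              % number of whales (search agents)
Max_iter = 10;             % number of optimization iterations
dim      = 3;              % three hyperparameters to optimize
lb = [1e-4,  10, 1e-5];    % lower bounds: learning rate, hidden nodes, L2 regularization
ub = [1e-2, 100, 1e-2];    % upper bounds
fobj = @(x) fical(x);      % fical (assumed name) trains CNN-BiGRU with x and returns the validation error
[Best_Cost, Best_pos, curve] = WOA(pop, Max_iter, lb, ub, dim, fobj);

best_lr     = Best_pos(1);          % optimized learning rate
best_hidden = round(Best_pos(2));   % optimized number of hidden nodes (rounded to an integer)
best_l2     = Best_pos(3);          % optimized regularization coefficient
figure; plot(curve); xlabel('Iteration'); ylabel('Best fitness'); title('WOA convergence');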
Model Description
A CNN is a feed-forward neural network widely used in deep learning. It is composed mainly of convolutional layers, pooling layers, and fully connected layers; its input can be a multi-dimensional feature array, and it relies on local receptive fields and weight sharing. The convolutional layers extract features from the raw data and mine the internal relationships hidden in it, the pooling layers reduce network complexity and the number of training parameters, and the fully connected layers merge the processed features to produce the classification or regression result.
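As a concrete but purely illustrative example, the CNN portion of such a model could be sketched in MATLAB's Deep Learning Toolbox roughly as follows. The layer sizes and filter counts are assumptions, and in the full CNN-BiGRU model the convolutional features would be passed to the recurrent part before the fully connected classifier rather than directly to it.

% Minimal sketch (assumed sizes) of a CNN feature-extraction stack for 12 input
% features treated as a 12-by-1 "image"; not the exact network from the program.
numFeatures = 12;
numClasses  = 4;
layers = [
    imageInputLayer([numFeatures 1 1])               % one sample = 12 features
    convolution2dLayer([3 1], 16, 'Padding', 'same') % convolution extracts local feature patterns
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer([2 1], 'Stride', [2 1])        % pooling reduces complexity and parameters
    convolution2dLayer([3 1], 32, 'Padding', 'same')
    reluLayer
    fullyConnectedLayer(numClasses)                  % fully connected layer merges the features
    softmaxLayer
    classificationLayer];
analyzeNetwork(layers)   % inspect the resulting layer stack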
The GRU on which BiGRU is built is a streamlined variant of the LSTM: it merges the forget gate and input gate into a single update gate and combines the cell state with the hidden state, which effectively alleviates the vanishing-gradient problem of recurrent neural networks while reducing the number of training parameters without sacrificing accuracy. The bidirectional structure (BiGRU) processes the sequence in both the forward and backward directions, so each output can draw on both past and future context.
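For reference, the standard GRU update behind this description can be written as follows, where $\sigma$ is the logistic sigmoid and $\odot$ the element-wise product (the sign convention on the update gate varies between references):

$$
\begin{aligned}
z_t &= \sigma(W_z x_t + U_z h_{t-1} + b_z) && \text{(update gate)}\\
r_t &= \sigma(W_r x_t + U_r h_{t-1} + b_r) && \text{(reset gate)}\\
\tilde{h}_t &= \tanh\!\left(W_h x_t + U_h (r_t \odot h_{t-1}) + b_h\right) && \text{(candidate state)}\\
h_t &= (1 - z_t) \odot h_{t-1} + z_t \odot \tilde{h}_t && \text{(new hidden state)}
\end{aligned}
$$

A BiGRU runs one GRU forward and one backward over the sequence and concatenates the two hidden states, $h_t = [\overrightarrow{h}_t;\ \overleftarrow{h}_t]$.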
Program Design
- To obtain the complete program and data, send the blogger a private message with the reply "MATLAB實現(xiàn)WOA-CNN-BiGRU鯨魚算法優(yōu)化卷積雙向門控循環(huán)單元數(shù)據(jù)分類預測".
% The Whale Optimization Algorithm
function [Best_Cost, Best_pos, curve] = WOA(pop, Max_iter, lb, ub, dim, fobj)

% Initialize the position vector and score of the leader
Best_pos  = zeros(1, dim);
Best_Cost = inf;   % change this to -inf for maximization problems

% Initialize the positions of the search agents
Positions = initialization(pop, dim, ub, lb);

curve = zeros(1, Max_iter);
t = 0;   % loop counter

% Main loop
while t < Max_iter
    for i = 1:size(Positions, 1)
        % Return search agents that go beyond the boundaries of the search space
        Flag4ub = Positions(i,:) > ub;
        Flag4lb = Positions(i,:) < lb;
        Positions(i,:) = (Positions(i,:).*(~(Flag4ub + Flag4lb))) + ub.*Flag4ub + lb.*Flag4lb;
        % Calculate the objective function for each search agent
        fitness = fobj(Positions(i,:));
        % Update the leader
        if fitness < Best_Cost   % change this to > for maximization problems
            Best_Cost = fitness;         % update the leader score
            Best_pos  = Positions(i,:);  % update the leader position
        end
    end

    a = 2 - t*(2/Max_iter);        % a decreases linearly from 2 to 0, Eq. (2.3)
    a2 = -1 + t*((-1)/Max_iter);   % a2 decreases linearly from -1 to -2, used to compute l in Eq. (3.12)

    % Update the position of each search agent
    for i = 1:size(Positions, 1)
        r1 = rand();   % r1 is a random number in [0,1]
        r2 = rand();   % r2 is a random number in [0,1]
        A = 2*a*r1 - a;          % Eq. (2.3) in the paper
        C = 2*r2;                % Eq. (2.4) in the paper
        b = 1;                   % parameter in Eq. (2.5)
        l = (a2 - 1)*rand + 1;   % parameter in Eq. (2.5)
        p = rand();              % p in Eq. (2.6)
        for j = 1:size(Positions, 2)
            if p < 0.5
                if abs(A) >= 1
                    % Exploration: move toward a randomly chosen search agent
                    rand_leader_index = floor(pop*rand() + 1);
                    X_rand = Positions(rand_leader_index, :);
                    D_X_rand = abs(C*X_rand(j) - Positions(i,j));   % Eq. (2.7)
                    Positions(i,j) = X_rand(j) - A*D_X_rand;        % Eq. (2.8)
                else   % abs(A) < 1
                    % Exploitation: encircle the current best (leader) position
                    D_Leader = abs(C*Best_pos(j) - Positions(i,j)); % Eq. (2.1)
                    Positions(i,j) = Best_pos(j) - A*D_Leader;      % Eq. (2.2)
                end
            else   % p >= 0.5
                % Spiral update around the leader, Eq. (2.5)
                distance2Leader = abs(Best_pos(j) - Positions(i,j));
                Positions(i,j) = distance2Leader*exp(b*l)*cos(2*pi*l) + Best_pos(j);
            end
        end
    end

    t = t + 1;
    curve(t) = Best_Cost;
    disp([t Best_Cost]);   % display the current iteration and best cost
end
end
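The routine above relies on a helper function initialization that is not listed in this post. A typical version, shown below as a sketch that matches the call signature (it may differ from the file shipped with the program), simply draws uniform random positions between the bounds and supports both scalar and per-dimension bounds.

% Sketch of the initialization helper called by WOA above (assumed implementation).
function Positions = initialization(pop, dim, ub, lb)
if isscalar(ub)                                        % one common bound for all dimensions
    Positions = rand(pop, dim) .* (ub - lb) + lb;
else                                                   % a separate bound for each dimension
    Positions = zeros(pop, dim);
    for i = 1:dim
        Positions(:, i) = rand(pop, 1) .* (ub(i) - lb(i)) + lb(i);
    end
end
end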