
Recurrent Neural Networks (RNN, LSTM, GRU): Structure and Code Implementation

Published on 2021-10-05


RNN

[Figure: structure of a basic RNN cell]
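
The update implemented by the code below can be stated compactly; this is reconstructed from the code itself, not the original figure. Here $*$ denotes convolution (this is a convolutional variant), and bias terms are folded into the convolutions:

$$h_t = \tanh(W_x * x_t + W_h * h_{t-1})$$

(Following the reference code, a LeakyReLU is additionally applied to $h_t$ before it is returned.)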

Code implementation:

    # Vanilla RNN cell (convolutional variant)
    # Shared imports for all three cells in this post
    import torch
    import torch.nn as nn

    class ConvRNN(nn.Module):
        def __init__(self, inp_dim, oup_dim, kernel, dilation):
            super().__init__()
            # "same" padding for the (dilated) input convolution
            pad_x = dilation * (kernel - 1) // 2
            self.conv_x = nn.Conv2d(inp_dim, oup_dim, kernel, padding=pad_x, dilation=dilation)
            # "same" padding for the hidden-state convolution
            pad_h = (kernel - 1) // 2
            self.conv_h = nn.Conv2d(oup_dim, oup_dim, kernel, padding=pad_h)
            self.relu = nn.LeakyReLU(0.2)

        def forward(self, x, h=None):
            if h is None:
                # First time step: no previous hidden state to mix in
                h = torch.tanh(self.conv_x(x))
            else:
                h = torch.tanh(self.conv_x(x) + self.conv_h(h))
            h = self.relu(h)
            return h, h  # (output, hidden state for the next step)
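
As a quick sanity check, the cell can be unrolled over a sequence by feeding the returned hidden state back in. This is a minimal sketch; the batch size, channel counts, and spatial size are arbitrary assumptions:

    rnn = ConvRNN(inp_dim=3, oup_dim=16, kernel=3, dilation=1)
    h = None
    for _ in range(5):                    # five hypothetical frames
        x = torch.randn(1, 3, 32, 32)     # (batch, channels, H, W)
        out, h = rnn(x, h)
    print(out.shape)  # torch.Size([1, 16, 32, 32]); "same" padding keeps H and W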

 

LSTM  

Reference: 图解LSTM 结构逻辑 (BruceJust's blog on CSDN)

 

Forward-propagation equations:
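
The equation image is missing from this copy, so the following is reconstructed from the code below ($*$ denotes convolution, $\odot$ elementwise multiplication, and bias terms are folded into the convolutions):

$$\begin{aligned}
f_t &= \sigma(W_{xf} * x_t + W_{hf} * h_{t-1}) \\
i_t &= \sigma(W_{xi} * x_t + W_{hi} * h_{t-1}) \\
o_t &= \sigma(W_{xo} * x_t + W_{ho} * h_{t-1}) \\
j_t &= \tanh(W_{xj} * x_t + W_{hj} * h_{t-1}) \\
c_t &= f_t \odot c_{t-1} + i_t \odot j_t \\
h_t &= o_t \odot \tanh(c_t)
\end{aligned}$$

Here $f_t$, $i_t$, $o_t$ are the forget, input, and output gates, and $j_t$ is the candidate cell state (often written $\tilde{c}_t$).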

 

Code implementation:

 

    # Convolutional LSTM; structure follows the diagram at:
    # https://blog.csdn.net/weixin_42175217/article/details/106183682
    class ConvLSTM(nn.Module):
        def __init__(self, inp_dim, oup_dim, kernel, dilation):
            super().__init__()
            pad_x = dilation * (kernel - 1) // 2
            # Input-to-gate convolutions: forget (f), input (i), output (o), candidate (j)
            self.conv_xf = nn.Conv2d(inp_dim, oup_dim, kernel, padding=pad_x, dilation=dilation)
            self.conv_xi = nn.Conv2d(inp_dim, oup_dim, kernel, padding=pad_x, dilation=dilation)
            self.conv_xo = nn.Conv2d(inp_dim, oup_dim, kernel, padding=pad_x, dilation=dilation)
            self.conv_xj = nn.Conv2d(inp_dim, oup_dim, kernel, padding=pad_x, dilation=dilation)
            pad_h = (kernel - 1) // 2
            # Hidden-to-gate convolutions
            self.conv_hf = nn.Conv2d(oup_dim, oup_dim, kernel, padding=pad_h)
            self.conv_hi = nn.Conv2d(oup_dim, oup_dim, kernel, padding=pad_h)
            self.conv_ho = nn.Conv2d(oup_dim, oup_dim, kernel, padding=pad_h)
            self.conv_hj = nn.Conv2d(oup_dim, oup_dim, kernel, padding=pad_h)
            self.relu = nn.LeakyReLU(0.2)

        def forward(self, x, pair=None):
            if pair is None:
                # First time step: no previous (h, c), so the forget gate is skipped
                i = torch.sigmoid(self.conv_xi(x))
                o = torch.sigmoid(self.conv_xo(x))
                j = torch.tanh(self.conv_xj(x))
                c = i * j
                h = o * c  # note: the reference code applies o to c directly here, without tanh
            else:
                h, c = pair
                f = torch.sigmoid(self.conv_xf(x) + self.conv_hf(h))
                i = torch.sigmoid(self.conv_xi(x) + self.conv_hi(h))
                o = torch.sigmoid(self.conv_xo(x) + self.conv_ho(h))
                j = torch.tanh(self.conv_xj(x) + self.conv_hj(h))
                c = f * c + i * j
                h = o * torch.tanh(c)
            h = self.relu(h)
            return h, [h, c]  # (output, [hidden, cell] state for the next step)
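
Unlike ConvRNN, the recurrent state here is a hidden/cell pair, so the caller threads a two-element state through the loop. A minimal sketch with the same assumed sizes as above:

    lstm = ConvLSTM(inp_dim=3, oup_dim=16, kernel=3, dilation=1)
    state = None                          # becomes [h, c] after the first step
    for _ in range(5):
        out, state = lstm(torch.randn(1, 3, 32, 32), state)
    h, c = state                          # both of shape (1, 16, 32, 32)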

GRU

 

Forward-propagation equations:
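
As with the LSTM, the equations below are reconstructed from the code rather than taken from the original figure ($*$ convolution, $\odot$ elementwise product, biases inside the convolutions):

$$\begin{aligned}
z_t &= \sigma(W_{xz} * x_t + W_{hz} * h_{t-1}) \\
r_t &= \sigma(W_{xr} * x_t + W_{hr} * h_{t-1}) \\
n_t &= \tanh(W_{xn} * x_t + W_{hn} * (r_t \odot h_{t-1})) \\
h_t &= (1 - z_t) \odot h_{t-1} + z_t \odot n_t
\end{aligned}$$

Here $z_t$ is the update gate, $r_t$ the reset gate, and $n_t$ the candidate hidden state.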

Code implementation:

    # Convolutional GRU cell
    class ConvGRU(nn.Module):
        def __init__(self, inp_dim, oup_dim, kernel, dilation):
            super().__init__()
            pad_x = dilation * (kernel - 1) // 2
            # Input-to-gate convolutions: update (z), reset (r), candidate (n)
            self.conv_xz = nn.Conv2d(inp_dim, oup_dim, kernel, padding=pad_x, dilation=dilation)
            self.conv_xr = nn.Conv2d(inp_dim, oup_dim, kernel, padding=pad_x, dilation=dilation)
            self.conv_xn = nn.Conv2d(inp_dim, oup_dim, kernel, padding=pad_x, dilation=dilation)
            pad_h = (kernel - 1) // 2
            # Hidden-to-gate convolutions
            self.conv_hz = nn.Conv2d(oup_dim, oup_dim, kernel, padding=pad_h)
            self.conv_hr = nn.Conv2d(oup_dim, oup_dim, kernel, padding=pad_h)
            self.conv_hn = nn.Conv2d(oup_dim, oup_dim, kernel, padding=pad_h)
            self.relu = nn.LeakyReLU(0.2)

        def forward(self, x, h=None):
            if h is None:
                # First time step: no previous hidden state, so the reset gate is skipped
                z = torch.sigmoid(self.conv_xz(x))
                f = torch.tanh(self.conv_xn(x))
                h = z * f
            else:
                z = torch.sigmoid(self.conv_xz(x) + self.conv_hz(h))
                r = torch.sigmoid(self.conv_xr(x) + self.conv_hr(h))
                n = torch.tanh(self.conv_xn(x) + self.conv_hn(r * h))
                h = (1 - z) * h + z * n
            h = self.relu(h)
            return h, h  # (output, hidden state for the next step)
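
The three cells differ mainly in how many input/hidden convolution pairs they carry: one for the plain RNN, three for the GRU (update, reset, candidate), and four for the LSTM (forget, input, output, candidate). A small illustrative check makes this visible in the parameter counts (the sizes are arbitrary):

    def n_params(m):
        return sum(p.numel() for p in m.parameters())

    for cell in (ConvRNN(3, 16, 3, 1), ConvGRU(3, 16, 3, 1), ConvLSTM(3, 16, 3, 1)):
        print(type(cell).__name__, n_params(cell))
    # Counts scale 1 : 3 : 4 with the number of conv pairs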

 

Code reference: DCSFN/model.py at master · Ohraincu/DCSFN · GitHub


