The code is built on [RCAN(pytorch)](https://github.com/yulunzhang/RCAN).
### 1. Introduction
Recently, deep convolutional neural networks (CNNs) have been widely explored in single image super-resolution (SISR) and obtained remarkable performance. However, most of the existing CNN-based SISR methods mainly focus on wider or deeper architecture design, neglecting to explore the feature correlations of intermediate layers, hence hindering the representational power of CNNs. To address this issue, in this paper, we propose a second-order attention network (SAN) for more powerful feature expression and feature correlation learning. Specifically, a novel trainable second-order channel attention (SOCA) module is developed to adaptively rescale the channel-wise features by using second-order feature statistics for more discriminative representations. Furthermore, we present a non-locally enhanced residual group (NLRG) structure, which not only incorporates non-local operations to capture long-distance spatial contextual information, but also contains repeated local-source residual attention groups (LSRAG) to learn increasingly abstract feature representations. Experimental results demonstrate the superiority of our SAN network over state-of-the-art SISR methods in terms of both quantitative metrics and visual quality.
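The core SOCA idea, rescaling channels by second-order (covariance) statistics rather than the usual first-order pooled mean, can be illustrated with a minimal NumPy sketch. This is a simplification for intuition only, not the paper's exact module (which, among other things, normalizes the covariance with a matrix square root); the sigmoid gating and shapes here are illustrative assumptions:

```python
import numpy as np

def soca_rescale(feat):
    """Sketch of second-order channel attention.

    feat: (C, H, W) feature map. Each channel is rescaled by a weight
    derived from the channel covariance matrix (second-order statistics),
    instead of the per-channel mean used by first-order attention.
    """
    c, h, w = feat.shape
    x = feat.reshape(c, h * w)
    x = x - x.mean(axis=1, keepdims=True)   # center each channel
    cov = x @ x.T / (h * w)                 # (C, C) channel covariance
    stats = cov.mean(axis=1)                # one second-order stat per channel
    weights = 1.0 / (1.0 + np.exp(-stats))  # sigmoid gate in (0, 1)
    return feat * weights[:, None, None]    # channel-wise rescaling

feat = np.random.rand(8, 16, 16).astype(np.float32)
out = soca_rescale(feat)
```

Because the weights come from the full covariance matrix, each channel's gate depends on its correlations with every other channel, which is what the abstract means by exploiting feature correlations of intermediate layers.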
### 2. Train code
#### Prepare training datasets
#### Train the model
You can retrain the model:

1. `cd` to `TrainCode/code`;
2. Run the following scripts to train the models: