
Question about the dilated residual layer #33

Open · JustinYuu opened this issue May 17, 2021 · 0 comments


JustinYuu commented May 17, 2021

Hello,
The dilated residual layer in Fig. 2 includes a 1×1 convolution after the ReLU activation. However, I cannot find an explanation of the role of this 1×1 convolution. Is this 1×1 conv used to introduce more parameters and improve the expressiveness of the TCN?
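
For concreteness, here is a minimal PyTorch sketch of the structure I am asking about, as I read it from Fig. 2. The channel sizes, the kernel size of 3, and the dropout placement are assumptions on my part, not taken from the paper:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DilatedResidualLayer(nn.Module):
    """Sketch of the layer in Fig. 2: dilated conv -> ReLU -> 1x1 conv -> residual."""

    def __init__(self, dilation, in_channels, out_channels):
        super().__init__()
        # Dilated temporal conv; padding=dilation keeps the sequence length fixed
        # for a kernel size of 3.
        self.conv_dilated = nn.Conv1d(in_channels, out_channels,
                                      kernel_size=3,
                                      padding=dilation, dilation=dilation)
        # The 1x1 convolution in question, applied after the ReLU.
        self.conv_1x1 = nn.Conv1d(out_channels, out_channels, kernel_size=1)
        self.dropout = nn.Dropout()

    def forward(self, x):
        out = F.relu(self.conv_dilated(x))
        out = self.conv_1x1(out)
        out = self.dropout(out)
        # Residual connection (assumes in_channels == out_channels).
        return x + out
```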
