@nicholas-leonard Hi there. Thanks for the nice work. The problem mentioned in #134 still remains unsolved for me while I'm trying to implement a deep recurrent attention model: I pass a table of rewards to the new criterion and then call the reinforce method. If I pass a module that is not decorated by nn.AbstractSequencer, passing a table of rewards seems to work. Something like this: https://github.com/vyouman/DRAM/blob/master/testVRCaptionReward.lua#L28-L33
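For example, something along these lines (a made-up minimal sketch of the pattern, not the actual code from the link; the module and reward values are only for illustration):

require 'dpnn'
-- a plain module (no AbstractSequencer decorator) containing a stochastic dpnn unit
local locator = nn.Sequential()
   :add(nn.Linear(10, 2))
   :add(nn.ReinforceNormal(0.1))
-- one reward tensor per time step, collected into a table
local rewards = {torch.Tensor{1}, torch.Tensor{0}, torch.Tensor{1}}
-- passing the table straight to reinforce goes through without error here
locator:reinforce(rewards)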
But when I pass the RecurrentAttention module, which is decorated by nn.AbstractSequencer, I get the same error that @pefi9 and @ssampang encountered in issue #134:
/home/vyouman/torch/install/bin/luajit: ...an/torch/install/share/lua/5.1/rnn/AbstractSequencer.lua:4: DEPRECATED 27 Oct 2015. Wrap your internal modules into a Recursor instead
stack traceback:
[C]: in function 'error'
...an/torch/install/share/lua/5.1/rnn/AbstractSequencer.lua:4: in function 'getStepModule'
...an/torch/install/share/lua/5.1/rnn/AbstractRecurrent.lua:177: in function 'reinforce'
/home/vyouman/torch/install/share/lua/5.1/dpnn/Module.lua:586: in function 'reinforce'
...-linux.gtk.x86_64/workspace/DRAM/src/VRCaptionReward.lua:53: in function 'backward'
...product-linux.gtk.x86_64/workspace/DRAM/src/testRAEx.lua:171: in main chunk
[C]: at 0x00406670
Yeah, I've modified the RecurrentAttention module a little bit, but I think the problem is caused by the deprecated getStepModule of the AbstractSequencer module. I know the following code in AbstractRecurrent deals with a table of rewards: https://github.com/Element-Research/rnn/blob/master/AbstractRecurrent.lua#L179-L184
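Roughly, those lines dispatch each entry of the reward table to the corresponding step module, something like this (paraphrased, see the link for the exact code):

function AbstractRecurrent:reinforce(reward)
   if torch.type(reward) == 'table' then
      -- one reward per time step, handed to that step's clone
      for step, stepReward in ipairs(reward) do
         self:getStepModule(step):reinforce(stepReward)
      end
   else
      -- the non-table case falls back to the usual Container behaviour (elided here)
   end
end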
But the reinforce method that ends up being called still uses getStepModule, and getStepModule is deprecated in AbstractSequencer; I think that's the cause of the bug.
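As far as I can tell, getStepModule in AbstractSequencer is now nothing but a stub that raises the deprecation error seen in the stack trace, roughly:

-- AbstractSequencer.lua (paraphrased): getStepModule only raises the deprecation error
function AbstractSequencer:getStepModule(step)
   error"DEPRECATED 27 Oct 2015. Wrap your internal modules into a Recursor instead"
end

So any reinforce call that reaches it blows up.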
How should I fix it? It doesn't work if I change the parent class of the RecurrentAttention module to nn.Container, nn.AbstractRecurrent, or nn.Recursor either.
@vyouman I just fixed it in 4a0643e. It was bad code that should have been removed when the deprecation message was added. Thanks for pushing me on this.
Also, when your code is ready, please let me know (either via an issue or a PR) so it can be added to the external resources section of the README.
I've made my code open source.
This is my modified RA module: https://github.com/vyouman/DRAM/blob/master/RecurrentAttentionEx.lua
and the unit test script for the modified RA module: https://github.com/vyouman/DRAM/blob/master/testRAEx.lua
You'll also need LanguageModelCriterion.lua and VRCaptionReward.lua from the repository to run the unit test script.
Now the forward pass is OK; the error occurs in the backward pass.
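The failing call pattern in the test is basically the usual dpnn reward-criterion setup, where reinforce is only triggered inside the criterion's backward, which is why forward succeeds and backward dies. Schematically (module and variable names here are just placeholders, the real code is in the scripts above):

local output = ram:forward(input)                    -- RecurrentAttention forward: OK
local err = reward:forward(output, target)           -- reward criterion forward: OK
local gradOutput = reward:backward(output, target)   -- backward computes the rewards and
                                                     -- calls ram:reinforce(...), which hits
                                                     -- the deprecated getStepModule error
ram:backward(input, gradOutput)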