I am trying to implement a different version of this simple recurrent network, one that accepts three images of 200x200 and predicts a fourth. I modified the example as follows:
require 'rnn'
require 'image'
-- hyper-parameters
batchSize = 1
rho = 3 -- sequence length
hiddenSize = 7
nIndex = 10
lr = 0.1
-- build simple recurrent neural network
local r = nn.Recurrent(
   hiddenSize, nn.LookupTable(nIndex, hiddenSize),
   nn.Linear(hiddenSize, hiddenSize), nn.Sigmoid(),
   rho
)
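-- (nn.Recurrent's arguments are start, input, feedback, transfer, rho:
--  the LookupTable is the input module, mapping an integer index to its
--  hidden-size embedding)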
local rnn = nn.Sequential()
   :add(r)
   :add(nn.Linear(hiddenSize, nIndex))
   :add(nn.LogSoftMax())
-- wrap the non-recurrent module (Sequential) in Recursor.
-- This makes it a recurrent module
-- i.e. Recursor is an AbstractRecurrent instance
rnn = nn.Recursor(rnn, rho)
print(rnn)
-- build criterion
criterion = nn.ClassNLLCriterion()
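-- (note: ClassNLLCriterion expects an integer class index as target)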
offsets = {}
for i=1,batchSize do
   table.insert(offsets, math.ceil(math.random()*sequence:size(1)))
end
--print(offsets)
offsets = torch.LongTensor(offsets)
--print(offsets)
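-- NOTE: 'sequence' is not defined anywhere in this script, and 'offsets'
-- is never used below; both are leftovers from the original example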
-- training
local iteration = 1
count=1
datasetname = '/home/srilatha/Desktop/Research_intern/Data_sets/Cropped_data_set'
j=1
dir_name=datasetname.."/"..j.."/"
image_size=200
while count>0 do
   -- 1. create a sequence of rho time-steps
   local inputs, targets = {}, {}
   for step=1,rho do
      filename=dir_name..step..".JPG"
      filename1=dir_name..(step+1)..".JPG"
      img1=image.load(filename)
      img1=image.scale(img1,image_size,image_size)
      img1=torch.reshape(img1,1,image_size*image_size)
      inputs[step]=img1
      img2=image.load(filename1)
      img2=image.scale(img2,image_size,image_size)
      img2=torch.reshape(img2,1,image_size*image_size)
      targets[step] = img2
   end
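   -- (each inputs[step] / targets[step] is now a 1x(200*200) tensor of
   --  real pixel values; image.load returns values in [0,1] by default)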
   print(inputs[2][1][1])
   print(targets[1][1][1])

   -- 2. forward sequence through rnn
   rnn:zeroGradParameters()
   rnn:forget() -- forget all past time-steps
   --print(rnn:forward(inputs[1][1]))
   local outputs, err = {}, 0
   for step=1,rho do
      outputs[step] = rnn:forward(inputs[step])
      print(outputs[step])
      err = err + criterion:forward(outputs[step], targets[step])
   end
   print(string.format("Iteration %d ; NLL err = %f ", iteration, err))

   -- 3. backward sequence through rnn (i.e. backprop through time)
   local gradOutputs, gradInputs = {}, {}
   for step=rho,1,-1 do -- reverse order of forward calls
      gradOutputs[step] = criterion:backward(outputs[step], targets[step])
      gradInputs[step] = rnn:backward(inputs[step], gradOutputs[step])
   end

   -- 4. update
   rnn:updateParameters(lr)

   iteration = iteration + 1
   count=count-1
end
But I'm getting an index out of range error:
/home/srilatha/torch/install/share/lua/5.1/nn/Container.lua:67:
In 1 module of nn.Sequential:
In 1 module of nn.Sequential:
.../srilatha/torch/install/share/lua/5.1/nn/LookupTable.lua:73: index out of range at /tmp/luarocks_torch-scm-1-3482/torch7/lib/TH/generic/THTensorMath.c:156
stack traceback:
[C]: in function 'index'
.../srilatha/torch/install/share/lua/5.1/nn/LookupTable.lua:73: in function <.../srilatha/torch/install/share/lua/5.1/nn/LookupTable.lua:68>
[C]: in function 'xpcall'
/home/srilatha/torch/install/share/lua/5.1/nn/Container.lua:63: in function 'rethrowErrors'
...e/srilatha/torch/install/share/lua/5.1/nn/Sequential.lua:44: in function 'updateOutput'
...e/srilatha/torch/install/share/lua/5.1/rnn/Recurrent.lua:68: in function <...e/srilatha/torch/install/share/lua/5.1/rnn/Recurrent.lua:64>
[C]: in function 'xpcall'
/home/srilatha/torch/install/share/lua/5.1/nn/Container.lua:63: in function 'rethrowErrors'
...e/srilatha/torch/install/share/lua/5.1/nn/Sequential.lua:44: in function 'updateOutput'
/home/srilatha/torch/install/share/lua/5.1/rnn/Recursor.lua:25: in function 'forward'
/home/srilatha/Desktop/Research_intern/rnn_simple.lua:75: in main chunk
[C]: in function 'dofile'
[string "_RESULT={dofile('/home/srilatha/Desktop/Resea..."]:1: in main chunk
[C]: in function 'xpcall'
/home/srilatha/torch/install/share/lua/5.1/trepl/init.lua:651: in function </home/srilatha/torch/install/share/lua/5.1/trepl/init.lua:560>
[string "require('trepl')()"]:1: in main chunk
What is going wrong and how can I fix this?
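If that diagnosis is right, my guess (untested) is that the LookupTable front-end has to be replaced by something that accepts real-valued vectors, e.g. nn.Linear, and that the LogSoftMax/ClassNLLCriterion pair has to give way to a regression output and criterion such as nn.MSECriterion, since the target is an image rather than a class index. Roughly:

require 'rnn'

local rho = 3
local image_size = 200*200 -- flattened 200x200 grayscale image
local hiddenSize = 7

-- Linear input layer instead of LookupTable, so the recurrent module
-- accepts real-valued pixel vectors
local r = nn.Recurrent(
   hiddenSize, nn.Linear(image_size, hiddenSize),
   nn.Linear(hiddenSize, hiddenSize), nn.Sigmoid(),
   rho
)

local rnn = nn.Recursor(nn.Sequential()
   :add(r)
   :add(nn.Linear(hiddenSize, image_size)) -- predict the next image
   :add(nn.Sigmoid()), rho)                -- keep outputs in [0,1] like the pixels

-- regression criterion instead of ClassNLLCriterion
local criterion = nn.MSECriterion()

Is that the right direction, or is there a recommended way to do this with this library?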