TensorFlow LSTM Iterator for test queue is incremented during training
I have an LSTM in TensorFlow that uses a queue to switch between training and test data.
The structure is as follows:
# Iterator for training data
iter_train = tf.data.Dataset.range(epochNum_train).repeat().make_one_shot_iterator().get_next()
input_train_queue = input_train[:, iter_train * num_steps : (iter_train + 1) * num_steps, :]
input_train_queue.set_shape([batch_size, num_steps, input_size])
output_train_queue = output_train[:, iter_train * num_steps : (iter_train + 1) * num_steps, :]
output_train_queue.set_shape([batch_size, num_steps, input_size])

# Iterator for test data
iter_test = tf.data.Dataset.range(epochNum_test).repeat().make_one_shot_iterator().get_next()
input_test_queue = input_test[:, iter_test * num_steps : (iter_test + 1) * num_steps, :]
input_test_queue.set_shape([batch_size, num_steps, input_size])
output_test_queue = output_test[:, iter_test * num_steps : (iter_test + 1) * num_steps, :]
output_test_queue.set_shape([batch_size, num_steps, input_size])

# tf.cond selects between training and test data
rnn_outputs, _ = tf.nn.dynamic_rnn(
    cell,
    tf.cond(useTestData, lambda: input_test_queue, lambda: input_train_queue),
    dtype=tf.float32, initial_state=init_state)
error = tf.reduce_mean(tf.squared_difference(
    rnn_outputs,
    tf.cond(useTestData, lambda: output_test_queue, lambda: output_train_queue)))
train_fn = tf.train.AdamOptimizer(learning_rate=0.01).minimize(error)
My problem is that iter_test is also incremented when training data is fed to the LSTM:
t1 = sess.run(iter_test)                            # t1 has the value 0
sess.run(train_fn, feed_dict={useTestData: False})
t2 = sess.run(iter_test)                            # t2 has the value 2
t3 = sess.run(iter_test)                            # t3 has the value 3
Why is iter_test incremented during training, and is there a way to keep iter_test from advancing while training?
python tensorflow queue lstm recurrent-neural-network
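What seems to be going on (a minimal sketch, not from the original post, assuming TF 1.x graph semantics and run here through tf.compat.v1): tf.cond only skips ops that are created *inside* true_fn/false_fn. Ops created outside the lambdas, like the iterators' get_next() in the snippet above, are merely referenced by a branch and therefore execute on every session step, whichever branch the predicate selects. The stand-in counter below plays the role of iter_test:

```python
# Sketch: an op created OUTSIDE the tf.cond lambdas runs unconditionally.
# The assign_add below stands in for the test iterator's get_next().
import tensorflow.compat.v1 as tf

tf.disable_v2_behavior()

counter = tf.Variable(0, dtype=tf.int64)      # plays the role of iter_test
bump = tf.assign_add(counter, 1)              # created outside the lambdas
use_test = tf.placeholder(tf.bool, shape=[])

# Only ops built inside the lambdas are gated by the predicate; `bump` is not.
chosen = tf.cond(use_test, lambda: bump, lambda: tf.constant(0, tf.int64))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    sess.run(chosen, feed_dict={use_test: False})  # take the "training" branch
    final = sess.run(counter)
    print(final)  # the counter advanced even though its branch was not taken
```

This matches the observation above: each sess.run(train_fn, ...) still pulls once from iter_test, so reading it afterwards shows it has skipped ahead.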
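One common way around this (a hedged sketch; the two toy range datasets are illustrative stand-ins, not the poster's real pipelines): replace the tf.cond over two always-running iterators with a feedable tf.data.Iterator.from_string_handle, so that only the iterator whose handle you feed on a given step is advanced:

```python
# Sketch of a feedable-iterator workaround: only the iterator whose
# string handle is fed into `handle` advances on a given sess.run call.
import tensorflow.compat.v1 as tf

tf.disable_v2_behavior()

train_ds = tf.data.Dataset.range(10).repeat()        # stand-in training data
test_ds = tf.data.Dataset.range(100, 110).repeat()   # stand-in test data

handle = tf.placeholder(tf.string, shape=[])
iterator = tf.data.Iterator.from_string_handle(
    handle,
    tf.data.get_output_types(train_ds),
    tf.data.get_output_shapes(train_ds))
step = iterator.get_next()  # would replace iter_train / iter_test above

train_iter = train_ds.make_one_shot_iterator()
test_iter = test_ds.make_one_shot_iterator()

with tf.Session() as sess:
    train_handle = sess.run(train_iter.string_handle())
    test_handle = sess.run(test_iter.string_handle())
    a = sess.run(step, feed_dict={handle: train_handle})  # train advances
    b = sess.run(step, feed_dict={handle: test_handle})   # test advances
    c = sess.run(step, feed_dict={handle: train_handle})  # train resumes
```

With this layout a training step feeds train_handle and never touches the test iterator, which is what the question asks for; the input/output slicing would then hang off the single step tensor.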
asked 16 hours ago by Anne Bierhoff, edited 16 hours ago