Can't Initialise Two Different Tokenizers with Keras
Question: For a spelling correction task, I built a seq2seq model with an LSTM and an attention mechanism. I do char-level tokenisation with Keras. I initialised two different tokenizers, one for the typo sentences and the other for the corrected sentences. After testing, I see that the model produces an empty string, and I believe …
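For reference, a minimal sketch of initialising two independent char-level tokenizers in Keras, one fitted on the typo side and one on the corrected side (the sample sentences here are made-up placeholders, not from the question):

```python
from tensorflow.keras.preprocessing.text import Tokenizer

# Placeholder data: (typo, corrected) sentence pairs
typo_sentences = ["helo wrld", "good morninng"]
correct_sentences = ["hello world", "good morning"]

# Two separate tokenizers so each side gets its own character vocabulary.
# char_level=True tokenises per character; filters="" keeps spaces/punctuation.
typo_tok = Tokenizer(char_level=True, filters="", lower=True)
corr_tok = Tokenizer(char_level=True, filters="", lower=True)

typo_tok.fit_on_texts(typo_sentences)
corr_tok.fit_on_texts(correct_sentences)

# Each sentence becomes a list of integer IDs, one per character.
typo_seqs = typo_tok.texts_to_sequences(typo_sentences)
corr_seqs = corr_tok.texts_to_sequences(correct_sentences)
```

Because the two tokenizers are fitted on different text, the same character can map to different integer IDs in each vocabulary, so encoder inputs and decoder targets must each be encoded with their own tokenizer.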