Is it possible to freeze only certain embedding weights in the embedding layer in pytorch?
Question: When using GloVe embeddings in NLP tasks, some words from the dataset might not exist in GloVe. Therefore, we instantiate random weights for these unknown words. Would it be possible to freeze the weights taken from GloVe, and train only the embeddings of the unknown words?
Total answers: 1
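One common approach (a sketch, not the accepted answer, with made-up vocabulary sizes): PyTorch's `nn.Embedding` has a single `weight` tensor, so `requires_grad` cannot be set per row. Instead, you can register a gradient hook on the weight that zeroes the gradients of the pretrained rows, so only the randomly initialized rows are updated by the optimizer.

```python
import torch
import torch.nn as nn

# Hypothetical setup: a 10-word vocabulary where the first 7 rows come
# from pretrained (e.g. GloVe) vectors and the last 3 are random OOV rows.
vocab_size, dim = 10, 5
pretrained = torch.randn(7, dim)  # stand-in for the GloVe vectors

embedding = nn.Embedding(vocab_size, dim)
with torch.no_grad():
    embedding.weight[:7] = pretrained

# Mask: 0 for frozen (pretrained) rows, 1 for trainable (unknown) rows.
grad_mask = torch.zeros(vocab_size, 1)
grad_mask[7:] = 1.0

# Zero out the gradients of the frozen rows on every backward pass.
embedding.weight.register_hook(lambda grad: grad * grad_mask)

# One step: index 1 is pretrained (frozen), index 8 is unknown (trainable).
out = embedding(torch.tensor([1, 8]))
out.sum().backward()
# embedding.weight.grad is now zero for rows 0..6 and nonzero for row 8.
```

An alternative is to keep two separate `nn.Embedding` layers (one with `freeze=True` via `nn.Embedding.from_pretrained`, one trainable) and remap indices to the right layer, but the hook approach keeps a single lookup table.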