tokenizer.save_pretrained TypeError: Object of type property is not JSON serializable
Question: I am trying to save the GPT2 tokenizer as follows:

```python
import pandas as pd
from transformers import GPT2Tokenizer, GPT2LMHeadModel

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
tokenizer.pad_token = GPT2Tokenizer.eos_token

dataset_file = "x.csv"
df = pd.read_csv(dataset_file, sep=",")
input_ids = tokenizer.batch_encode_plus(
    list(df["x"]), max_length=1024, padding='max_length', truncation=True
)["input_ids"]

# saving the tokenizer
tokenizer.save_pretrained("tokenfile")
```

I am getting the following error: TypeError: …
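The error message points at a `property` object ending up where a string was expected, which is consistent with the line `tokenizer.pad_token = GPT2Tokenizer.eos_token`: in `transformers`, `eos_token` is defined as a property, so accessing it through the *class* yields the property descriptor itself rather than the token string, and that descriptor is later rejected by the JSON serializer during `save_pretrained`. The likely fix is to read the token from the *instance* (`tokenizer.pad_token = tokenizer.eos_token`). The following is a minimal sketch of the pitfall that needs no `transformers` install; the `Tok` class and its `<|endoftext|>` value are stand-ins for illustration only:

```python
import json

class Tok:
    """Stand-in for a tokenizer class that exposes eos_token as a property."""
    def __init__(self):
        self._eos = "<|endoftext|>"

    @property
    def eos_token(self):
        return self._eos

tok = Tok()

# Class access returns the property descriptor, not the token string.
print(type(Tok.eos_token))   # <class 'property'>
print(tok.eos_token)         # <|endoftext|>

# Serializing the descriptor fails the same way save_pretrained does.
try:
    json.dumps({"pad_token": Tok.eos_token})
except TypeError as e:
    print(e)                 # Object of type property is not JSON serializable

# Instance access serializes fine.
print(json.dumps({"pad_token": tok.eos_token}))
```

Under this assumption, changing the assignment to `tokenizer.pad_token = tokenizer.eos_token` should make `save_pretrained("tokenfile")` succeed, since the pad token is then a plain string.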