Could not load Llama model from path: ./Models/llama-7b.ggmlv3.q2_K.bin. Received error Llama.__init__() got an unexpected keyword argument 'input'

Question:

from langchain.llms import LlamaCpp
from langchain import PromptTemplate, LLMChain
from langchain.callbacks.manager import CallbackManager
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler

template = """Question: {question}

Answer: Let's work this out in a step by step way to be sure we have the right answer."""

prompt = PromptTemplate(template=template, input_variables=["question"])

callback_manager = CallbackManager([StreamingStdOutCallbackHandler()])

llm = LlamaCpp(
                model_path="./Models/llama-7b.ggmlv3.q2_K.bin",
                input={"temperature": 0.75,
                       "max_length": 2000,
                       "top_p": 1},
                callback_manager=callback_manager,
                verbose=True,
                )

llm_chain = LLMChain(prompt=prompt, llm=llm)

Current folder structure:

(llm) C:\llm>python app1.py
C:\llm\lib\site-packages\langchain\utils\utils.py:155: UserWarning: WARNING! input is not default parameter.
                input was transferred to model_kwargs.
                Please confirm that input is what you intended.
  warnings.warn(
Exception ignored in: <function Llama.__del__ at 0x000001923B3AE680>
Traceback (most recent call last):
  File "C:\llm\lib\site-packages\llama_cpp\llama.py", line 1507, in __del__
    if self.model is not None:
AttributeError: 'Llama' object has no attribute 'model'
Traceback (most recent call last):
  File "C:\llm\app1.py", line 14, in <module>
    llm = LlamaCpp(
  File "C:\llm\lib\site-packages\langchain\load\serializable.py", line 74, in __init__
    super().__init__(**kwargs)
  File "pydantic\main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for LlamaCpp
__root__
  Could not load Llama model from path: ./Models/llama-7b.ggmlv3.q2_K.bin. Received error Llama.__init__() got an unexpected keyword argument 'input' (type=value_error)
Asked By: rahularyansharma


Answers:

You could try installing llama-cpp-python older version:

pip install llama-cpp-python==0.1.65 --force-reinstall --upgrade --no-cache-dir

This worked for me.

Answered By: Abinaya Shankar

The newly updated llama.cpp uses the GGUF file format for its model bindings.
Try:
1. building the latest llama-cpp-python with --force-reinstall --upgrade and using a model reformatted as GGUF, for example one of those published on Hugging Face by the user "TheBloke", or

2. building an older version of llama-cpp-python, to my knowledge <= 0.1.48.
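For context on why the error happens at all (independent of which version you install): the warning in the question's output says `input` "was transferred to model_kwargs", and LangChain then forwards `model_kwargs` on to `Llama.__init__`, which rejects keyword names it does not recognize. A minimal sketch of that mechanism in plain Python, with `llama_init` and `llamacpp_wrapper` as hypothetical stand-ins (not the real library functions):

```python
def llama_init(model_path, temperature=0.8, max_tokens=256, top_p=0.95):
    # Stand-in for Llama.__init__: accepts only a fixed set of keywords.
    return {"model_path": model_path, "temperature": temperature,
            "max_tokens": max_tokens, "top_p": top_p}

KNOWN_FIELDS = {"temperature", "max_tokens", "top_p"}

def llamacpp_wrapper(model_path, **kwargs):
    # Stand-in for LangChain's LlamaCpp wrapper: keywords it does not
    # recognize (like 'input') are collected into model_kwargs and then
    # forwarded verbatim to the underlying constructor.
    direct = {k: v for k, v in kwargs.items() if k in KNOWN_FIELDS}
    model_kwargs = {k: v for k, v in kwargs.items() if k not in KNOWN_FIELDS}
    return llama_init(model_path, **direct, **model_kwargs)

# Passing input={...} forwards an 'input' keyword the constructor rejects:
try:
    llamacpp_wrapper("model.bin", input={"temperature": 0.75})
except TypeError as e:
    print(e)  # ... got an unexpected keyword argument 'input'

# Passing the sampling parameters directly works:
cfg = llamacpp_wrapper("model.bin", temperature=0.75, top_p=1)
```

This is why, in the question's code, the parameters inside `input={...}` are the kind of thing LlamaCpp expects as direct constructor arguments rather than wrapped in a dict; check the documentation of your installed langchain version for the exact accepted field names.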

Answered By: Eren Kalinsazlioglu