How to use adapter transformers with a Hugging Face pipeline

Question:

I tried to run the model "AdapterHub/bert-base-uncased-pf-conll2003" for token classification in NLP (model description: https://huggingface.co/AdapterHub/bert-base-uncased-pf-conll2003).

First I tried to install the adapter transformers

pip install -U adapter-transformers 

The output of the above command was

Collecting adapter-transformers

[... output truncated ...]

Installing collected packages: tokenizers, huggingface-hub, adapter-transformers
  Attempting uninstall: tokenizers
    Found existing installation: tokenizers 0.15.0
    Uninstalling tokenizers-0.15.0:
      Successfully uninstalled tokenizers-0.15.0
  Attempting uninstall: huggingface-hub
    Found existing installation: huggingface-hub 0.19.4
    Uninstalling huggingface-hub-0.19.4:
      Successfully uninstalled huggingface-hub-0.19.4
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
transformers 4.35.2 requires huggingface-hub<1.0,>=0.16.4, but you have huggingface-hub 0.13.4 which is incompatible.
transformers 4.35.2 requires tokenizers<0.19,>=0.14, but you have tokenizers 0.13.3 which is incompatible.
Successfully installed adapter-transformers-3.2.1.post0 huggingface-hub-0.13.4 tokenizers-0.13.3

I then tried to load the model into the pipeline like this:

from transformers import AutoModelWithHeads
from transformers import pipeline
token_classification = pipeline("token-classification", model = "AdapterHub/bert-base-uncased-pf-conll2003")
res = token_classification("Take out the trash bag from the bin and replace it.")
print(res)

I received the following errors:

EntryNotFoundError: 404 Client Error. (Request ID: Root=1-657e793c-0ce0c1936aff5e5741676650)

Entry Not Found for url: https://huggingface.co/AdapterHub/bert-base-uncased-pf-conll2003/resolve/main/config.json.

During handling of the above exception, another exception occurred:


OSError                                   Traceback (most recent call last)
<ipython-input-3-030dfe0e128d> in <cell line: 3>()
      1 from transformers import AutoModelWithHeads
      2 from transformers import pipeline
----> 3 token_classification = pipeline("token-classification", model = "AdapterHub/bert-base-uncased-pf-conll2003")
      4 res = token_classification("Take out the trash bag from the bin and replace it.")
      5 print(res)

/usr/local/lib/python3.10/dist-packages/transformers/pipelines/__init__.py in pipeline(task, model, config, tokenizer, feature_extractor, framework, revision, use_fast, use_auth_token, device, device_map, torch_dtype, trust_remote_code, model_kwargs, pipeline_class, **kwargs)
    673         hub_kwargs["_commit_hash"] = config._commit_hash
    674     elif config is None and isinstance(model, str):
--> 675         config = AutoConfig.from_pretrained(model, _from_pipeline=task, **hub_kwargs, **model_kwargs)
    676         hub_kwargs["_commit_hash"] = config._commit_hash
    677 

[... traceback truncated ...]

/usr/local/lib/python3.10/dist-packages/transformers/configuration_utils.py in _get_config_dict(cls, pretrained_model_name_or_path, **kwargs)
    624             try:
    625                 # Load from local folder or from cache or download from model Hub and cache
--> 626                 resolved_config_file = cached_file(
    627                     pretrained_model_name_or_path,
    628                     configuration_file,

/usr/local/lib/python3.10/dist-packages/transformers/utils/hub.py in cached_file(path_or_repo_id, filename, cache_dir, force_download, resume_download, proxies, use_auth_token, revision, local_files_only, subfolder, user_agent, _raise_exceptions_for_missing_entries, _raise_exceptions_for_connection_errors, _commit_hash)
    452         if revision is None:
    453             revision = "main"
--> 454         raise EnvironmentError(
    455             f"{path_or_repo_id} does not appear to have a file named {full_filename}. Checkout "
    456             f"'https://huggingface.co/{path_or_repo_id}/{revision}' for available files."

OSError: AdapterHub/bert-base-uncased-pf-conll2003 does not appear to have a file named config.json.
Checkout 'https://huggingface.co/AdapterHub/bert-base-uncased-pf-conll2003/main' for available files.

How do I correctly load this adapter model?

Asked By: Encipher


Answers:

# make sure you have the new dependency installed
$ pip install -U adapters

The old, legacy package is adapter-transformers (pip install -U adapter-transformers). It is deprecated and was built as a fork of transformers, which is why installing it downgraded huggingface-hub and tokenizers and produced the dependency conflicts shown in the question. The new adapters package instead works on top of a current transformers release.
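
Why the original call failed: the Hub repository AdapterHub/bert-base-uncased-pf-conll2003 hosts only the adapter weights and head configuration, not a full standalone model, so there is no config.json for the pipeline's AutoConfig lookup to resolve. That is exactly the 404 / OSError in the traceback. The fix is to build the base model first and load the adapter into it.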


Create the model outside of the pipeline, using the imports from the new adapters package:

# With the new adapters package the model class is AutoAdapterModel;
# the legacy AutoModelWithHeads import only exists in the deprecated
# adapter-transformers fork.
from adapters import AutoAdapterModel
from transformers import AutoTokenizer
from transformers import pipeline

# The tokenizer and base model come from the plain bert-base-uncased checkpoint
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoAdapterModel.from_pretrained("bert-base-uncased")

# Load the token-classification adapter (including its prediction head)
# from the Hugging Face Hub and activate it
adapter_name = model.load_adapter("AdapterHub/bert-base-uncased-pf-conll2003", source="hf")
model.active_adapters = adapter_name

# Pass the assembled model object (not the Hub id) to the pipeline
token_classification = pipeline("token-classification", model=model, tokenizer=tokenizer)
res = token_classification("Take out the trash bag from the bin and replace it.")
print(res)
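
As a small variant, load_adapter also accepts a set_active flag, so loading and activating the adapter can be collapsed into one call (a sketch, assuming the adapters API):

# One-step variant: load and activate the adapter in a single call
adapter_name = model.load_adapter(
    "AdapterHub/bert-base-uncased-pf-conll2003",
    source="hf",
    set_active=True,  # replaces the separate model.active_adapters assignment
)

The rest of the pipeline code stays the same.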
Answered By: Daraan