Safetensors

Can't install

#1
by JLouisBiz - opened
    File "/home/data1/protected/tmp/pip-build-env-xl1_ycyx/overlay/lib/python3.11/site-packages/setuptools/_distutils/command/build_ext.py", line 507, in _build_extensions_serial
      self.build_extension(ext)
    File "/home/data1/protected/tmp/pip-build-env-xl1_ycyx/overlay/lib/python3.11/site-packages/setuptools/command/build_ext.py", line 264, in build_extension
      _build_ext.build_extension(self, ext)
    File "/home/data1/protected/tmp/pip-build-env-xl1_ycyx/overlay/lib/python3.11/site-packages/setuptools/_distutils/command/build_ext.py", line 562, in build_extension
      objects = self.compiler.compile(
                ^^^^^^^^^^^^^^^^^^^^^^
    File "/home/data1/protected/tmp/pip-build-env-xl1_ycyx/overlay/lib/python3.11/site-packages/torch/utils/cpp_extension.py", line 713, in unix_wrap_ninja_compile
      _write_ninja_file_and_compile_objects(
    File "/home/data1/protected/tmp/pip-build-env-xl1_ycyx/overlay/lib/python3.11/site-packages/torch/utils/cpp_extension.py", line 1869, in _write_ninja_file_and_compile_objects
      _run_ninja_build(
    File "/home/data1/protected/tmp/pip-build-env-xl1_ycyx/overlay/lib/python3.11/site-packages/torch/utils/cpp_extension.py", line 2225, in _run_ninja_build
      raise RuntimeError(message) from e
  RuntimeError: Error compiling objects for extension
  [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building editable for medit-one
Failed to build medit-one
ERROR: Could not build wheels for medit-one, which is required to install pyproject.toml-based projects

MedIT Solutions org

Hi! I've already addressed your question on GitHub. Could you please pull the latest updates?

Thanks. I was able to install the requirements from GitHub. Then I tried to run this:

from transformers import AutoTokenizer, AutoModelForCausalLM

# Load model and tokenizer
tokenizer = AutoTokenizer.from_pretrained("meditsolutions/one-small")
model = AutoModelForCausalLM.from_pretrained("meditsolutions/one-small")

# Generate text
input_text = "The single-token architecture provides"
input_ids = tokenizer(input_text, return_tensors="pt").input_ids
outputs = model.generate(input_ids, max_length=50, do_sample=True, temperature=0.7)  # do_sample=True so temperature actually takes effect
print(tokenizer.decode(outputs[0]))

but there is no `one-small` on the Hub, and when I point it at the local directory where the model is, it doesn't work either. What should I do?

    raise EnvironmentError(
OSError: meditsolutions/one-small is not a local folder and is not a valid model identifier listed on 'https://huggingface.co./models'
If this is a private repository, make sure to pass a token having permission to this repo either by logging in with `huggingface-cli login` or by passing `token=<your_token>`
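The OSError above means `from_pretrained` found neither a Hub repo named `meditsolutions/one-small` nor a local folder at that path. When passing a local directory instead, the folder must contain the files `from_pretrained` expects (at minimum `config.json` plus a weights file). Below is a minimal pre-flight sketch; the helper name `looks_like_model_dir` is mine, not part of transformers, and the example path is a placeholder:

```python
import os

# Files transformers typically needs in a local model directory:
# a config plus at least one weights file.
WEIGHT_FILES = ("model.safetensors", "pytorch_model.bin")

def looks_like_model_dir(path):
    """Return True if `path` is a directory containing config.json
    and a recognized weights file (a common reason a local path
    'doesn't work' is that one of these is missing)."""
    if not os.path.isdir(path):
        return False
    files = set(os.listdir(path))
    return "config.json" in files and any(w in files for w in WEIGHT_FILES)

# Once the folder checks out, pass the directory path itself
# (not a Hub-style "org/name" id) to from_pretrained, e.g.:
#
#   tokenizer = AutoTokenizer.from_pretrained("/path/to/one-small")
#   model = AutoModelForCausalLM.from_pretrained("/path/to/one-small")
```

If the model uses a custom architecture shipped with the repo, `from_pretrained` may also need `trust_remote_code=True`; check the model card to be sure.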
