Hello

Thanks for your interest in the DemoMachine-LLMs packages. They are free for your use. The packages are built with llamafile and run on macOS, Windows, and Linux. Smaller models will even run on a Raspberry Pi! You should be familiar with running scripts on your computer. If you're not, please get help from someone who is.

If you would like paid assistance or need help with AI projects, please email me or start a discussion in the community. I do not accept pull requests.

-Brad
Brad Hutchings
[email protected]


A note about the EU AI Act

I'm not here on this Earth, and my creations and adaptations are not here, to comply with the EU AI Act. We settled this initially in 1776 and again in 1812. For good measure, we came and helped y'all out in WWI, WWII, the Cold War, and now Ukraine. So we are going to sort out our own way of "regulating" AI or not, as we see fit. We're not your colony and not subject to your laws. Apologies for any lingering misunderstanding.

If this bothers you, there are other models and projects on Hugging Face. If this bothers Hugging Face, I'm sure there will be other repo spaces that can host large downloads.

-Brad


Windows Instructions

Video: https://youtu.be/HRqaBoNajCM

  1. If you have not used PowerShell on your computer previously, you must enable scripting.

    1. Run PowerShell as Administrator.
    2. Paste this command and press the enter key:
      Set-ExecutionPolicy Unrestricted
      
      This policy will prompt for your permission each time you run a downloaded script. When you feel comfortable running scripts and, most importantly, recognizing scripts you shouldn't run because they might compromise your computer, you can suppress the prompts with this command:
      Set-ExecutionPolicy Bypass
      
    3. Close the PowerShell window.
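Before changing anything, it can help to see what policy is already in effect. A minimal sketch (PowerShell, in the same Administrator window); the `-Scope CurrentUser` variant is an optional assumption on my part, not part of the steps above:

```shell
# Show the effective execution policy, then the policy set at each scope.
Get-ExecutionPolicy
Get-ExecutionPolicy -List

# Optional: scope the change to your user account instead of the whole machine.
Set-ExecutionPolicy Unrestricted -Scope CurrentUser
```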
  2. Go to the Files and versions tab of this web page.

  3. Download the file DemoMachine LLMs (Windows).zip.

  4. Once downloaded, move the file to your Desktop and extract there.

  5. Open that new folder. Right-click Start-Apple-OpenELM-1.1B-Instruct.ps1 and choose Run with PowerShell.

    • The script will download a .gguf model for Apple OpenELM 1.1B Instruct that I have prepared to the models sub-folder. This may take a few minutes.
    • The script will launch my DemoMachine-llamafile.exe executable to run the model.
    • A web browser window will open. Follow the instructions there.
  6. When you're finished with the model, close the web browser tab and close the PowerShell or Command window.
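If the Run with PowerShell context-menu item doesn't appear, the script can also be started from a PowerShell prompt. A sketch, assuming the zip was extracted to a Desktop folder named after the zip file (adjust the path if your folder differs):

```shell
# Change into the extracted folder on the Desktop (assumed name).
cd "$env:USERPROFILE\Desktop\DemoMachine LLMs (Windows)"

# Run the start script, bypassing the execution policy for this one invocation.
powershell -ExecutionPolicy Bypass -File ".\Start-Apple-OpenELM-1.1B-Instruct.ps1"
```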


macOS, Linux, Raspberry Pi, etc. Instructions

  1. Go to the Files and versions tab of this web page.
  2. Download the file DemoMachine LLMs (macOS-Linux).zip.
  3. Once downloaded, move the file to your Desktop and extract there.
  4. Open a Terminal. Type commands:
    cd ~/Desktop/DemoMachine\ LLMs\ \(macOS-Linux\)
    chmod a+x *
    ls -al
    
    (Picture of files with x bits set goes here.)
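To confirm the chmod step worked, the executable bit can be checked with `test -x`. A small self-contained sketch using a throwaway file (the DemoMachine script names are not assumed here):

```shell
# Create a throwaway script, mark it executable, and confirm the x bit is set.
mkdir -p /tmp/demomachine_demo
printf '#!/bin/sh\necho hello\n' > /tmp/demomachine_demo/Start-Example.sh
chmod a+x /tmp/demomachine_demo/Start-Example.sh

# test -x succeeds only if the file is executable by the current user.
if test -x /tmp/demomachine_demo/Start-Example.sh; then
  echo "executable bit is set"
fi
```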
  5. Open that new folder. Right-click Start-Apple-OpenELM-1.1B-Instruct.sh. There should be a menu item to "run" the script. Choose that.
    • The script will download a .gguf model for Apple OpenELM 1.1B Instruct that I have prepared to the models sub-folder. This may take a few minutes.
    • The script will launch my DemoMachine-llamafile.exe executable to run the model.
    • A web browser window will open. Follow the instructions there.
  6. When you're finished with the model, close the web browser tab and close the Terminal window.

macOS Users

  • Note: When you right-click the .sh script file, choose Open from the context menu that appears. You may get a warning about the script. You can allow the script in the privacy settings:
    https://support.apple.com/guide/mac-help/open-a-mac-app-from-an-unknown-developer-mh40616/mac
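If the Open trick doesn't clear the warning, macOS may have tagged the extracted files with a quarantine attribute. One option, assuming the zip was extracted to the Desktop folder named in the steps above, is to remove that attribute in Terminal (macOS only):

```shell
# Remove the com.apple.quarantine attribute that Gatekeeper checks (macOS only).
# -r recurses through the folder; it may print warnings for files that
# never had the attribute, which is harmless.
xattr -r -d com.apple.quarantine ~/Desktop/DemoMachine\ LLMs\ \(macOS-Linux\)
```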

  • Note: You may be asked to enable Developer Tools or some such when you launch the Terminal app for the first time. If this isn't a familiar place for you, please get some basic help from someone who knows their way around the Terminal.

Model details

GGUF format, 1.08B parameters, openelm architecture. 8-bit and 16-bit quantizations are available.