LLM Inference
A Fast Chat LLM Inference Module for Lilypad
Overview
This LLM Inference Module is a community-contributed module developed at AugmentHack.xyz. The repo for this module can be found here.
See the original AugmentHack entry below:
[CLI] Usage
Usage:
Inputs:
Where "paramsStr" is a question, in CID form, for the LLM. For example, https://ipfs.io/ipfs/QmcPjQwVcJiFge3yNjVL2NoZsTQ3GBpXAZe21S2Ncg16Gt is a bare file CID containing the question text.
To use it you would run:
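As a rough sketch, an invocation might look like the following. The module name `llm_inference` and the exact argument syntax are assumptions here; check the module repo for the precise command.

```shell
# The question for the LLM, uploaded to IPFS as a bare file (CID from above).
PARAMS_CID="QmcPjQwVcJiFge3yNjVL2NoZsTQ3GBpXAZe21S2Ncg16Gt"

# Build the command rather than executing it, since actually running a job
# requires a configured Lilypad client. Module name is a placeholder.
CMD="lilypad run llm_inference \"$PARAMS_CID\""
echo "$CMD"
```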
Outputs:
The output will be an IPFS CID; for example, running the above input would produce the following link:
https://ipfs.io/ipfs/QmVNXCAfJgER6U7Z5XT8QaAVFPdwmtSFE6c9sUaAx7ttZs
Under link/output/result.json you will see the model's response.
LLM Module Code