(core) Document configuring AI assistance

Summary:
Added OPENAI_API_KEY and GRIST_FORMULA_ASSISTANT to the README. GRIST_FORMULA_ASSISTANT may be removed in the long term, but for now the goal is just to get something into the grist-core README quickly for self-hosters.

Removed `documentation/llm.md` because it's outdated and not really providing value.

Test Plan: none

Reviewers: paulfitz

Reviewed By: paulfitz

Differential Revision: https://phab.getgrist.com/D3963
Alex Hall 2023-07-20 19:43:17 +02:00
parent 788a6d01ce
commit 0469a98c08
2 changed files with 1 addition and 35 deletions

README.md

@@ -296,6 +296,7 @@ PORT | port number to listen on for Grist server
REDIS_URL | optional redis server for browser sessions and db query caching
GRIST_SNAPSHOT_TIME_CAP | optional. Define the caps for tracking buckets. Usage: {"hour": 25, "day": 32, "isoWeek": 12, "month": 96, "year": 1000}
GRIST_SNAPSHOT_KEEP | optional. Number of recent snapshots to retain unconditionally for a document, regardless of when they were made
OPENAI_API_KEY | optional. Used for the AI formula assistant. Sign up for an account on OpenAI and then generate a secret key [here](https://platform.openai.com/account/api-keys). You also need to set `GRIST_FORMULA_ASSISTANT=1`.
Sandbox related variables:
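
For self-hosters, enabling the assistant amounts to setting the two variables from the new README row in the environment Grist runs in. A minimal sketch, assuming the standard `gristlabs/grist` Docker image with its default port, and a placeholder key value (generate your own at https://platform.openai.com/account/api-keys):

```sh
# Run grist-core with the AI formula assistant enabled.
# The OPENAI_API_KEY value below is a placeholder, not a real key.
docker run -p 8484:8484 \
  -e GRIST_FORMULA_ASSISTANT=1 \
  -e OPENAI_API_KEY=sk-your-secret-key \
  -v $PWD/persist:/persist \
  gristlabs/grist
```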

documentation/llm.md

@@ -1,35 +0,0 @@
# Using Large Language Models with Grist
In this experimental Grist feature, originally developed by Alex Hall,
you can hook up OpenAI's ChatGPT to write formulas for
you. Here's how.
First, you need an API key. Visit https://openai.com/api/ and prepare a key, then
store it in an environment variable `OPENAI_API_KEY`.
That's all the configuration needed!
Currently it is only a backend feature; we are still working on the UI for it.
## Hugging Face and other OpenAI models (deactivated)
_Not currently available, needs some work to revive. These notes are only preserved as a reminder to ourselves of how this worked._
~~To use a different OpenAI model such as `code-davinci-002` or `text-davinci-003`,
set the environment variable `COMPLETION_MODEL` to the name of the model.~~
~~Alternatively, there are many non-proprietary models hosted on Hugging Face.
At the time of writing, none can compare with OpenAI for use with Grist.
Things can change quickly in the world of AI though. So instead of OpenAI,
you can visit https://huggingface.co/ and prepare a key, then
store it in an environment variable `HUGGINGFACE_API_KEY`.~~
~~The model used will default to `NovelAI/genji-python-6B` for
Hugging Face. There's no particularly great model for this application,
but you can try other models by setting an environment variable
`COMPLETION_MODEL` to `codeparrot/codeparrot` or
`NinedayWang/PolyCoder-2.7B` or similar.~~
~~If you are hosting a model yourself, host it as Hugging Face does,
and use `COMPLETION_URL` rather than `COMPLETION_MODEL` to
point to the model on your own server rather than Hugging Face.~~