How to configure AI for self-hosting #6996
11 comments · 15 replies
- I have the same question.
- Turn on the copilot plugin at …
- For everybody interested: this works with the self-hosted version. You have to use a (paid) OpenAI account with a User API key (Legacy), not the new project API keys. In the affine.js config, set:
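A minimal sketch of what such an affine.js entry could look like — the `AFFiNE.use('copilot', …)` hook, the config path, and the `openaiConfig.apiKey` field name are assumptions based on the self-hosted configuration pattern and may differ between versions:

```js
// ~/.affine/config/affine.js (path and field names are assumptions; check your deployment)
AFFiNE.use('copilot', {
  openaiConfig: {
    // Legacy "User API key", not a project-scoped key
    apiKey: 'sk-your-legacy-user-api-key',
  },
});
```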
- Looks like OpenAI doesn't allow new accounts to generate legacy API keys anymore. Is there any way to get this to work with the project API keys?
- I made it work with the latest 0.20 version: #10549. My OpenAI API key starts with …
- I found another workaround: I use LiteLLM as an LLM proxy, which exposes over 200 LLMs through OpenAI-compatible API calls. In the admin UI you have the option to give the LLM a fake name, such as gpt-4o, so it will be recognized by AFFiNE.
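As a rough illustration (the backend model, port, and file name below are placeholders, not a confirmed setup), a LiteLLM proxy config that aliases a local model as gpt-4o might look like this:

```yaml
# litellm config.yaml — alias a backend model under the name AFFiNE expects
model_list:
  - model_name: gpt-4o                 # the name AFFiNE will request
    litellm_params:
      model: ollama/llama3             # any backend LiteLLM supports
      api_base: http://localhost:11434
```

Run it with `litellm --config config.yaml`, then point AFFiNE's OpenAI endpoint at the proxy (for example http://localhost:4000) and use the proxy's key as the API key — assuming your AFFiNE version lets you override the OpenAI base URL.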
- Does that approach still work with 0.21? I have set … and set the API key in … Any idea?
- This should be a feature.
- Anyone giving it a shot on 0.23?
- Any update on this?
- What is FAL?


- Can we use AI in the self-hosted instance?