[Example] ggml: add llava-base64-stream example #107
Conversation
dm4 commented on Feb 29, 2024
- Merge after [WASI-NN] ggml: add inline base64 prompt support for llava (WasmEdge/WasmEdge#3248); a rough prompt-construction sketch is shown below.
Signed-off-by: dm4 <dm4@secondstate.io>
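
To illustrate what an inline-base64 llava prompt might look like, here is a minimal Rust sketch. It is not taken from this PR: the `<image>...</image>` wrapping, the chat template, the file name, and the `build_prompt` helper are assumptions for illustration only; the actual prompt syntax accepted by the ggml plugin is defined in the linked WasmEdge/WasmEdge#3248.

```rust
// Minimal sketch: build a llava prompt that embeds a base64-encoded image inline.
// Assumptions (not from this PR): the `base64` crate is available, and the plugin
// accepts the payload wrapped in <image>...</image> tags inside the prompt.
// Check WasmEdge/WasmEdge#3248 for the real, supported syntax.
use base64::{engine::general_purpose::STANDARD, Engine as _};
use std::fs;

fn build_prompt(image_path: &str, question: &str) -> std::io::Result<String> {
    // Read the raw image bytes and encode them as base64.
    let bytes = fs::read(image_path)?;
    let encoded = STANDARD.encode(bytes);

    // Hypothetical prompt layout: image payload first, then the user question.
    Ok(format!(
        "USER: <image>{}</image>\n{}\nASSISTANT:",
        encoded, question
    ))
}

fn main() -> std::io::Result<()> {
    let prompt = build_prompt("monalisa.jpg", "Describe this picture.")?;
    // The resulting prompt string would then be passed to the WASI-NN ggml
    // backend the same way a plain-text prompt is.
    println!("prompt length: {}", prompt.len());
    Ok(())
}
```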
Hello, I am a code review bot on flows.network. Here are my reviews of code commits in this PR.

Overall Summary: The pull request introduces a new example.

Commit 258acdcca66e78c6323e4f53376a4cb6481fd32f: key changes and potential problems noted.
Commit 1d15aa3fd951984e7922e27417b08ae6da648524: key changes and potential problems noted.

Overall, the changes seem to aim at streamlining the existing workflows and consolidating job configurations, but some aspects need further refinement for completeness and clarity.
dm4 force-pushed from 5d02c66 to 814c426
Signed-off-by: dm4 <dm4@secondstate.io>
dm4 force-pushed from 814c426 to 1d15aa3
* [Example] ggml: add llava-base64-stream example
  Signed-off-by: dm4 <dm4@secondstate.io>
* [CI] llama: merge m1 job into matrix, build llava-base64-stream
  Signed-off-by: dm4 <dm4@secondstate.io>
---------
Signed-off-by: dm4 <dm4@secondstate.io>