Task #916
Open
bjornbytes wants to merge 17 commits into dev from task
Conversation
Contributor
for my use cases i tend to want a parallel for (often 2d/3d, too), fwiw
Owner, Author
There will be an (async) function to run Lua code on a worker thread, and likely a parallel-for helper on top of this. I haven't figured out the specific API for the parallel-for function yet, though.
Was seeing situations where only 1 worker thread would wake up.
- Pool RunContexts
- Cache bytecode + functions
- Jobs are fire and forget now. You call job_start, which just returns whether the job was successfully queued; there are no job handles.
- If you want to wait on a job, you should issue some side effect (decrement an atomic counter) to track completion.
- Instead of job_wait, there's job_spin, which runs a random job. This is helpful if your job isn't ready yet and you need to do something productive.
- Fixes a major issue where the job system could fall back to single-threaded for a long time if you aren't waiting on jobs fast enough.
- The job system is much simpler internally and can use a fixed-size queue instead of a linked list.
Avoids starvation in more cases...
This is an experiment to add a coroutine-based task system to LÖVR, with the main goal of making it easier for projects to scale to multiple CPU cores and run expensive work in the background.
Design
The basic idea is to create lots of `Task` objects, which are thin wrappers around Lua coroutines. Each Task represents an independent function doing work, and lots of tasks can all run more or less at the same time, cooperatively.

Normally, Lua coroutines are single threaded and don't allow for parallelism. However, Tasks in LÖVR can achieve parallelism by running asynchronous functions.
When a Task calls an asynchronous function, it will start work on a thread pool, yield itself, and wake up automatically once the work is complete. While the Task is yielded, other Tasks get a chance to run.
In this way, many Tasks (coroutines) can be running "at the same time", with all of them doing work on a thread pool, making use of all of the CPU cores.
API
Everything is in a new module, `lovr.task`.

Notes:

- `Task = lovr.task.newTask(function)` creates a new `Task`. You give it a function you want the task to run. Internally, it gets turned into a coroutine, similar to `coroutine.create`.
- A `Channel` can be used to communicate with the task.
- `for task in lovr.task.poll() do end` iterates over any tasks that are ready to run. The default `lovr.run` calls this for you, and just passes the result to a new `lovr.taskready` callback.
- `success, ...results = lovr.task.wait(...tasks)` blocks until all the tasks are complete, returning their results or the first error that was encountered. `lovr.task.wait(a, b, c)` returns `1, 1, 1, 2, 3`.
- A task that has been waited on won't show up in `lovr.taskready` anymore, because it's already finished. You can choose to "fire and forget" a task and let `lovr.taskready` take care of running it on a future frame, or explicitly wait on the task if you need its results sooner.
- `success, ...results = task:resume(...args)` resumes a task, if possible; otherwise it returns false and an error.
- If the task yielded with `coroutine.yield`, then `args` will get passed to the task. If the task yielded due to an asynchronous function, `args` are ignored and the results from the async operation are given to the task instead.
- On success, you get `true` along with any yield/return values.
- A task resumed this way won't show up in `lovr.taskready`, since you already ran it.
- `ready = task:isReady()` returns true if the task can be resumed, or false if it's finished or still waiting on the result of an async function.
- `complete = task:isComplete()` returns true if the task is finished (its function returned or errored).
- `...results = task:getResults()` returns the values the task returned with, or nil if the task hasn't returned yet.
- `error = task:getError()` returns the task's error.
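To make the notes above concrete, here is a minimal sketch of the basic flow, assuming the calls behave as described (the task body and the resume-to-start convention are assumptions, not code from the branch):

```lua
-- Illustrative sketch of the lovr.task API described above.
local task = lovr.task.newTask(function(a, b)
  return a + b -- hypothetical task body
end)

-- Resume it manually with arguments; on success the first value is true,
-- followed by any yield/return values.
local ok, sum = task:resume(1, 2)
print(ok, sum, task:isComplete()) -- expected: true, 3, true

-- lovr.task.wait blocks until every given task is complete and returns
-- their results (or the first error).  Here it would return immediately,
-- since the task above already finished.
local success, result = lovr.task.wait(task)

-- The default lovr.run does roughly this each frame, handing each ready
-- task to the new lovr.taskready callback:
for t in lovr.task.poll() do
  if lovr.taskready then lovr.taskready(t) end
end
```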
Asynchronous Functions

Asynchronous functions are where the magic happens. They behave differently depending on whether they are called inside of a Task or not:
- Called outside of a task, they behave exactly like they do today.
- Called inside a task, the work starts on the thread pool and the task yields; the task will show up in `lovr.taskready` on the next frame (unless it gets run sooner than that).

Any LÖVR function can become asynchronous. Let's take `lovr.data.newImage`, which I've been using as a test async function on this branch:
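A minimal sketch of the plain call, with a placeholder filename:

```lua
-- Outside of a task: blocks and returns the Image, exactly like today.
local image = lovr.data.newImage('texture.png') -- placeholder path
print(image:getWidth(), image:getHeight())
```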
Even though this is an async function, it behaves exactly the same as it does today, and you don't need to change any code. But if you run it on a task, it yields the task:
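A sketch of the same call inside a task, assuming the task is kicked off with `task:resume()`:

```lua
local task = lovr.task.newTask(function()
  -- Inside a task, newImage starts decoding on the thread pool and yields the task.
  local image = lovr.data.newImage('texture.png') -- placeholder path
  return image
end)

task:resume()         -- runs until the async call yields
print(task:isReady()) -- false: still waiting on the thread pool
```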
By itself this isn't interesting. But there are interesting things you can do with asynchronicity and parallelism.
Here's an example that loads an image in the background:
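A sketch of what that could look like, assuming `lovr.taskready` receives the finished task (the callback signature, filenames, and texture usage are assumptions):

```lua
local texture

local task = lovr.task.newTask(function()
  return lovr.data.newImage('background.png') -- placeholder path; decodes on the thread pool
end)

task:resume() -- kick the task off; it yields at newImage

function lovr.taskready(t)
  if t == task then
    -- The image finished decoding in the background; upload it to a texture.
    texture = lovr.graphics.newTexture(t:getResults())
  end
end

function lovr.draw(pass)
  if texture then
    pass:draw(texture, 0, 1.7, -3) -- draw it once it has finished loading
  end
end
```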
Here's an example that loads 100 images on multiple threads:
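A sketch of the fan-out version, assuming `lovr.task.wait` accepts multiple tasks as described above (filenames are placeholders):

```lua
local tasks = {}

for i = 1, 100 do
  tasks[i] = lovr.task.newTask(function()
    -- Each task yields at newImage, so the decodes run in parallel on the thread pool.
    return lovr.data.newImage(('image%03d.png'):format(i)) -- placeholder paths
  end)
  tasks[i]:resume()
end

-- Block until all 100 images are decoded, collecting their results
-- (or the first error that was encountered).
local results = { lovr.task.wait(unpack(tasks)) }
local success = table.remove(results, 1)
print(success, #results) -- expected: true, 100
```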
Discussion
- Is `taskready` just a regular event?
- `Readback` object? What if `Texture:getPixels` was async?

TODO
- `lovr.task.sleep` (async, wakes up the task later; could cause problems if you want the wait time to be based on `dt`?). Maybe just `lovr.timer.sleep` is async.
- Use the `debug` library with the task to inspect its state.