Topic Area: Question
I have created a workflow that runs a Python script to check whether an official government dataset has been updated; if it has, the script downloads the new data, converts it, and updates my database. The problem is that the file is 3 GB and my GitHub Actions runner downloads at roughly 1.28 MB/s, so the download alone takes about 30 minutes.
Is there a way to speed it up, or maybe a specific tool I should use? I really have no idea how to speed the process up.
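A minimal sketch of two complementary ideas, assuming the server sets an ETag or Last-Modified header, reports Content-Length, and honors HTTP Range requests (none of which is confirmed in the question): skip the download entirely when the remote file hasn't changed, and split the transfer across parallel range requests when it has. The URL, output file name, and chunk count below are placeholders, not the asker's actual setup.

```python
import os
import shutil
import concurrent.futures

import requests

# Placeholder URL and names; substitute the real dataset location.
URL = "https://example.gov/data/dataset.bin"
OUT = "dataset.bin"
CHUNKS = 8  # number of parallel range requests


def remote_changed(url: str, marker_file: str = ".etag") -> bool:
    """Compare the server's ETag/Last-Modified against the value stored
    on the previous run; download only when it differs."""
    head = requests.head(url, allow_redirects=True, timeout=30)
    head.raise_for_status()
    marker = head.headers.get("ETag") or head.headers.get("Last-Modified", "")
    previous = ""
    if os.path.exists(marker_file):
        with open(marker_file) as f:
            previous = f.read()
    if marker and marker == previous:
        return False
    with open(marker_file, "w") as f:
        f.write(marker)
    return True


def fetch_range(url: str, start: int, end: int, index: int) -> str:
    """Stream one byte range of the file into a temporary part file."""
    part = f"{OUT}.part{index}"
    headers = {"Range": f"bytes={start}-{end}"}
    with requests.get(url, headers=headers, stream=True, timeout=300) as r:
        r.raise_for_status()  # a Range-aware server answers 206 Partial Content
        with open(part, "wb") as f:
            for chunk in r.iter_content(chunk_size=1 << 20):
                f.write(chunk)
    return part


def parallel_download(url: str) -> None:
    """Split the file into CHUNKS byte ranges, fetch them concurrently,
    then stitch the parts back together in order."""
    head = requests.head(url, allow_redirects=True, timeout=30)
    head.raise_for_status()
    size = int(head.headers["Content-Length"])
    step = size // CHUNKS
    ranges = [
        (i * step, size - 1 if i == CHUNKS - 1 else (i + 1) * step - 1, i)
        for i in range(CHUNKS)
    ]
    with concurrent.futures.ThreadPoolExecutor(max_workers=CHUNKS) as pool:
        parts = list(pool.map(lambda args: fetch_range(url, *args), ranges))
    with open(OUT, "wb") as out:
        for part in parts:
            with open(part, "rb") as f:
                shutil.copyfileobj(f, out)
            os.remove(part)


if __name__ == "__main__":
    if remote_changed(URL):
        parallel_download(URL)
    else:
        print("Remote file unchanged; skipping download.")
```

Parallel ranges only help if the bottleneck is per-connection throttling on the server side; if the runner's total egress is capped, they won't. In that case, alternatives worth trying are `aria2c -x` (the same split-download idea as a CLI tool), caching the file between runs with actions/cache keyed on the ETag so unchanged runs skip the transfer entirely, or a self-hosted runner with more bandwidth.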
Replies: 3 comments

-
🕒 Discussion Activity Reminder 🕒

This Discussion has been labeled as dormant by an automated system for having no activity in the last 60 days. Please consider one of the following actions:

1️⃣ Close as Out of Date: If the topic is no longer relevant, close the Discussion as out of date.
2️⃣ Provide More Information: Share additional details or context, or let the community know if you've found a solution on your own.
3️⃣ Mark a Reply as Answer: If your question has been answered by a reply, mark the most helpful reply as the solution.

Thank you for helping bring this Discussion to a resolution! 💬

-
This is also slowing down some of my workflows! I'm very interested in ways to accelerate the process / circumvent it.

-
Same here.