Josh Cowles
07/21/2022, 8:17 PM

Ha Pham
07/22/2022, 3:08 AM
config.toml file, or put them in an .env file. In 2.0 it looks like this is replaced by the Block concept. So:
• Looking at the Block docs, I think currently the only way for me to register a certain secret is to write it as a Block, turn it into a script, then run it. Is this correct?
• It is said that a Block's content is saved in the db (SQLite by default). Does this mean that if I want to see the content outside of Prefect's development workflow, I have to query the db directly?
• How do I modify and delete saved secrets?
• What's the best way to manage environment variables?

Rohit
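On the second bullet: the default backend is a local SQLite file, so it can be inspected with nothing but the standard library. A minimal sketch, assuming the 2.0 default path `~/.prefect/orion.db` and that block contents live in a table named something like `block_document` (both are assumptions to verify against your version):

```python
import sqlite3
from pathlib import Path

def list_tables(db_path):
    """List table names in a SQLite file such as Prefect's orion.db."""
    with sqlite3.connect(str(db_path)) as conn:
        return [row[0] for row in conn.execute(
            "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name"
        )]

# Assumed 2.0 default location; verify for your install:
db_path = Path.home() / ".prefect" / "orion.db"
# list_tables(db_path)  # look for a table named like 'block_document'
```

From there a plain SELECT on the block table shows the stored documents, though the Prefect UI or client is the supported way to read them.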
07/23/2022, 1:42 PM
import subprocess
from prefect import task

@task
def deploy_best_model():
    subprocess.run(["python", "deploy.py"], check=True)
Surawut Jirasaktavee
07/23/2022, 1:46 PM
prefect storage ls or prefect storage create. How can I solve this?

Owais Farooqui
07/24/2022, 3:43 PM
sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) table flow already exists
when I try to run prefect storage ls or prefect storage create.
I am using prefect==2.0b6.

Kun Yin
07/25/2022, 6:24 AM

Kun Yin
07/26/2022, 9:13 AM
prefect==2.0b12
Problem: when I run the code a, b = task.submit(c, d), the error occurs: `cannot unpack non-iterable PrefectFuture object`
In Prefect 1.0+ I can use task(nout=2) when my task has multiple return values. What should I do in Prefect 2.0+ in the same situation? Maybe a, b = task.submit(c, d).result() could work, but I don't think that is a good way.

Sana Shaikh
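The unpacking error is standard future behavior rather than anything Prefect-specific: submit returns one future wrapping the whole return value, so you resolve it first and unpack afterwards. A stdlib analogy of the same situation (plain concurrent.futures, not Prefect code):

```python
from concurrent.futures import ThreadPoolExecutor

def add_one_each(c, d):
    # A "task" with two return values actually returns a single tuple.
    return c + 1, d + 1

with ThreadPoolExecutor() as ex:
    fut = ex.submit(add_one_each, 1, 2)
    # a, b = fut            # TypeError: cannot unpack a future object
    a, b = fut.result()     # resolve first, then unpack
```

So `a, b = task.submit(c, d).result()` is the same pattern, not a workaround; the cost is that `.result()` blocks until the task finishes.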
07/26/2022, 8:53 PM

Sana Shaikh
07/26/2022, 8:58 PM

Sana Shaikh
07/26/2022, 8:58 PM

John Kang
07/26/2022, 11:32 PM

Darren
07/27/2022, 6:08 PM

Anna Geller

Matt from DataHouse
07/29/2022, 2:36 PM
File "c:\ds\tools\python3.8\latest\lib\site-packages\httpx\_transports\default.py", line 77, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.ConnectTimeout
Here's a link to a GitHub issue with the full stack trace.
I tried various Prefect versions, most recently 2.0.0, but I get the same error. Is it an issue with asyncio? I think John Kang's message from yesterday suggests a similar problem. In this case, though, I can't even start the server.

Viet Nguyen
08/01/2022, 2:57 AM
Failed to create the Prefect home directory at /home/sbx_user1051/.prefect
The full error below:
I didn't execute any Prefect flow though, just tried to import the prefect package with a dummy Lambda handler function.
We aim to do things with serverless infrastructure using AWS services. Our actual pipeline may involve processing a very large number of NetCDF files, but on many occasions it will be just a few newly uploaded files.
So my questions: how do I overcome the above error? And is the second option doable? We would use a Fargate cluster for the Dask client, but the environment where the Dask client is created needs to have sufficient memory, ~10 GB.
Many thanks.

eddy davies
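On the home-directory error: a Lambda function's filesystem is read-only outside /tmp, so one common workaround is to point Prefect's home at /tmp before the first import. A sketch of the handler module (that PREFECT_HOME is honored this way is an assumption to verify for your Prefect version):

```python
import os

# Must run before the first `import prefect`:
# Lambda only allows writes under /tmp.
os.environ["PREFECT_HOME"] = "/tmp/.prefect"

def handler(event, context):
    import prefect  # imported lazily, after PREFECT_HOME points somewhere writable
    return {"statusCode": 200}
```

Setting the variable in the Lambda configuration (environment variables section) achieves the same thing without touching code.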
08/01/2022, 2:24 PM

John Kang
08/01/2022, 7:14 PM

John Kang
08/01/2022, 9:40 PM

Maximilian Schnieder
08/02/2022, 5:55 AM

John Kang
08/03/2022, 8:38 PM

Pipat (Benz) Methavanitpong
08/04/2022, 3:58 AM
Deployment class. It can provide the deployment configuration above the DO NOT EDIT line in a generated deployment file.
https://orion-docs.prefect.io/concepts/deployments/

Jonas Dahlbæk
08/04/2022, 10:59 AM

Darren
08/04/2022, 3:45 PM

Karl Bühler
08/06/2022, 10:30 PM

Андрей Насонов
08/08/2022, 12:46 PM
from prefect.filesystems import RemoteFileSystem

gitlab_block = RemoteFileSystem(
    basepath="git://path/to/repo",
    settings={
        "key": "GITLAB_USER",
        "secret": "GITLAB_TOKEN",
    },
)
gitlab_block.save("flows_repo")
This leaves me with
prefect.exceptions.PrefectHTTPStatusError: Client error '422 Unprocessable Entity' for url 'http://ephemeral-orion/api/block_documents/'
I might be misusing RemoteFileSystem something fierce, could you please guide me in the right direction? Big thanks!

Chris L.
08/08/2022, 3:07 PM
I have a scheduler flow that takes a subflow_key. This subflow_key is passed into a curried function that dynamically creates a new Prefect flow. There are about 30 (and growing) different subflows that can be created dynamically.
My problem: I would like a single-flow, many-deployments setup (each deployment is associated with a different type of subflow).
Constraints: Because subflows are generated dynamically by the curried function, I can't separate them into parent flows and run prefect deployment build for each flow.
What I've tried: 30 different deployment YAML files for one scheduler flow, with 30 different combinations of subflow_key and schedule.
My questions: Is there a DRYer way to achieve the same setup, given that each deployment file is identical except for 2 lines (parameters and schedule)? What is Prefect engineering's current take on this single-flow, many-deployments paradigm? Will this be achievable via a single Prefect CLI command in the future (maybe with arrays of parameter/schedule flags passed to prefect deployment build)?

Андрей Насонов
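One DRYer option, until the CLI grows array flags, is to generate the 30 near-identical specs in a loop instead of hand-maintaining 30 YAML files. A stdlib-only sketch of the idea (the subflow keys and cron strings below are hypothetical placeholders, and the spec shape is illustrative, not Prefect's exact schema):

```python
# Hypothetical registry: subflow_key -> cron schedule (placeholder values).
SUBFLOWS = {
    "ingest_sales": "0 * * * *",
    "ingest_ads": "30 * * * *",
}

def deployment_specs(subflows):
    """One deployment spec per subflow; only parameters and schedule vary."""
    return [
        {
            "name": f"scheduler-{key}",
            "parameters": {"subflow_key": key},
            "schedule": {"cron": cron},
        }
        for key, cron in subflows.items()
    ]

specs = deployment_specs(SUBFLOWS)
```

Each spec can then be dumped to YAML/JSON for the deployment tooling, so the two varying lines live in one table instead of 30 files.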
08/08/2022, 3:57 PM

Neil Natarajan
08/08/2022, 5:26 PM

Zheyuan
08/09/2022, 1:50 PM

Zheyuan
08/11/2022, 5:33 AM