Riccardo Tesselli
07/20/2022, 3:05 PM
development and production, so as to have a dedicated workspace for each environment. Then, after exploring Prefect 2.0 features, I've started to question this, because one could use a single workspace and set everything up to distinguish between development and production pipelines. So I wonder: what do you suggest for managing environments? Go with different workspaces, or have one workspace? And what would be the ideal use case for a workspace?

Christian Vogel
07/20/2022, 3:51 PM
File "/opt/conda/lib/python3.8/site-packages/distributed/worker.py", line 2742, in loads_function
    result = pickle.loads(bytes_object)
File "/opt/conda/lib/python3.8/site-packages/distributed/protocol/pickle.py", line 73, in loads
    return pickle.loads(x)
ModuleNotFoundError: No module named 'prefect'
When entering the container, prefect seems to be available though. Do you have any idea what could be the reason?

Jason
07/20/2022, 4:24 PM

Sam Maradwi
07/20/2022, 4:29 PMbotocore.exceptions.ClientError: An error occurred (AccessDeniedException) when calling the GetSecretValue operation: User: arn:aws:sts::xxx:assumed-role/code_deployments-role/iddoc- is not authorized to perform: secretsmanager:GetSecretValue on resource: 4i-adl-config because no identity-based policy allows the secretsmanager:GetSecretValue action
Matt Delacour
07/20/2022, 6:15 PM

Chris Reuter
07/20/2022, 7:00 PM

Tim Enders
07/20/2022, 8:30 PM
When is the map operator coming? That is a big stumbling block for us adopting 2.0. Thanks!

kiran
07/20/2022, 9:46 PM
AxiosError: Request failed with status code 401
So I thought maybe I could sign in with my 1.0 credentials and got Error: Invalid username or password. Am I unable to sign up for 2.0 if I also have 1.0?

Mansour Zayer
07/20/2022, 10:48 PM
I use subprocess to run my dbt project locally (Prefect 1.2.2, Windows). I create my command (dbt run --vars '{data_processing_start_date: 2022-07-20, data_processing_end_date: 2022-07-20}' --profiles-dir ./) like this:
command = (
    f"dbt run --vars '{{"
    f"data_processing_start_date: {data_processing_start_date}, "
    f"data_processing_end_date: {data_processing_end_date}}}' --profiles-dir ./ "
)
The command is created correctly, but dbt gives me this error: dbt: error: unrecognized arguments: 2022-07-20, data_processing_end_date: 2022-07-20}'
It seems that dbt interprets 2022-07-20 as an argument instead of as the value of the data_processing_start_date variable. Keep in mind that when I run the same command in my CLI, dbt works fine, but when it's provided to dbt through subprocess this occurs. This is my subprocess call:
subprocess.run(
    command,
    check=True,
    stderr=True,
    stdout=True,
    shell=True,
    cwd="dbt",
)
Any idea what might cause this, and how to solve this? Thank you

R
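Regarding the dbt question above, a likely cause (not confirmed in the thread): with shell=True on Windows the command goes through cmd.exe, which does not treat single quotes as quoting characters, so the --vars string is split on spaces. Passing the arguments as a list with no shell avoids shell quoting entirely. A sketch; the demonstration at the end echoes the argument back through a small Python child process instead of actually invoking dbt:

```python
import subprocess
import sys

data_processing_start_date = "2022-07-20"  # example values
data_processing_end_date = "2022-07-20"

# Build --vars as a single argument; with no shell involved it is passed
# to the child process intact, and quoting rules never apply.
vars_arg = (
    f"{{data_processing_start_date: {data_processing_start_date}, "
    f"data_processing_end_date: {data_processing_end_date}}}"
)
args = ["dbt", "run", "--vars", vars_arg, "--profiles-dir", "./"]
# The real call would be: subprocess.run(args, check=True, cwd="dbt")

# Demonstration that vars_arg survives as one argv element:
out = subprocess.run(
    [sys.executable, "-c", "import sys; print(sys.argv[1])", vars_arg],
    capture_output=True, text=True, check=True,
)
print(out.stdout.strip() == vars_arg)  # True
```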
07/20/2022, 11:16 PM

Priyank
07/21/2022, 7:19 AM

Stefan
07/21/2022, 8:39 AM

Denys Volokh
07/21/2022, 9:35 AM

Vadym Dytyniak
07/21/2022, 10:07 AM
{
    'apiVersion': 'batch/v1',
    'kind': 'Job',
    'spec': {
        'template': {
            'spec': {
                'nodeSelector': {
                    'topology.kubernetes.io/zone': 'us-east-1a',
                    'dask.corp.com/subnet-type': 'private',
                    'dask.corp.com/storage': 'true',
                },
                'containers': [
                    {'name': 'flow'},
                ],
            }
        }
    }
}
Rajvir Jhawar
07/21/2022, 10:48 AM

Mahesh
07/21/2022, 12:24 PM

Prass
07/21/2022, 12:27 PM
1. Run task1, but do task2 only if task1 is completed, and task3 only if task1 and task2 are complete?
2. Does prefect 2.0 parallelize flows over async for loops?
3. I write (append) to a file in one of my tasks. Is that prefect safe?

Justin Trautmann
07/21/2022, 12:34 PM
The admin/version api route doesn't work for me. Thanks a lot.

Chu
07/21/2022, 12:59 PM

Toby Rahloff
07/21/2022, 2:19 PM
KeyError: "No class found for dispatch key 'S3 Storage:sha256:68ed [...]' in registry for type 'Block'."
The full error trace and code can be found in the first comment to this post.

rectalogic
07/21/2022, 3:03 PM
__file__ is not defined when prefect cloud runs the flow. Any idea what the issue is?

Ellie Redding
07/21/2022, 3:19 PM
snowflake_query and snowflake_multiquery are their own tasks, so I can't use them as part of a different task? Which means that my flow looks like this:
do_some_stuff()
for table_name in tables:
    queries = build_queries(table_name)
    snowflake_multiquery(queries)
These tasks are all running sequentially, but there are a lot of tables, so I'd like the snowflake_multiquery tasks for each table to run concurrently. How can I make that happen?

Ellie Redding
07/21/2022, 3:23 PM
snowflake_multiquery was throwing Object of type SecretStr is not JSON serializable errors from this line of code in the connector, where it dumps the request body, including connection auth info, into the request. I got around this for now by changing the password here from type SecretStr to str, but that's obviously not a great solution 😅 Has anyone else run into this issue?

Kha Nguyen
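For context on the error above (a sketch of the underlying behavior, not from the thread): pydantic's SecretStr masks its value and is rejected by default JSON serialization by design; the usual fix is to unwrap with .get_secret_value() at the point the plain string is needed, rather than weakening the field to str:

```python
import json

from pydantic import SecretStr

password = SecretStr("hunter2")

# The secret is masked in str()/repr()...
assert str(password) == "**********"

# ...and json.dumps refuses it outright, as in the error above.
try:
    json.dumps({"password": password})
except TypeError as exc:
    print(exc)  # Object of type SecretStr is not JSON serializable

# Unwrap explicitly only where the raw value is actually required.
body = json.dumps({"password": password.get_secret_value()})
```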
07/21/2022, 3:25 PM

Jai P
07/21/2022, 3:48 PM
2.0b10 was released yesterday, but I don't see associated release notes. Was this just a minor bug fix?

Yupei Chen
07/21/2022, 4:01 PM

Tim Enders
07/21/2022, 5:32 PM

Jai P
07/21/2022, 6:08 PM

Billy McMonagle
07/21/2022, 6:15 PM

Christian Nuss
07/21/2022, 6:22 PM
Is there a way to make a @task that, across all concurrent flows, is a singleton (i.e. only one instance of that task can run at a time)?