Fabrice Toussaint
05/25/2021, 12:19 PM

Marc Lipoff
05/25/2021, 2:41 PM
Is anyone running prefect build/register commands in CircleCI, specifically in parallel? I have dozens of flows to register via CircleCI, and I'd like to register them in parallel to minimize the time.
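One way to parallelize this from a single CI job is to register from Python with a thread pool instead of shelling out once per flow. A minimal sketch only, assuming flows/ is an importable package where each module exposes a flow object, the project name is illustrative, and Prefect auth is already configured in the CircleCI environment:

import importlib
import pathlib
from concurrent.futures import ThreadPoolExecutor

def register_module(path):
    # Import flows/<name>.py and register the `flow` object it exposes
    module = importlib.import_module(f"flows.{path.stem}")
    return module.flow.register(project_name="my-project")  # illustrative project name

# Skip __init__.py; every other module is assumed to define `flow`
flow_files = [p for p in pathlib.Path("flows").glob("*.py") if p.stem != "__init__"]

with ThreadPoolExecutor(max_workers=8) as pool:
    for flow_id in pool.map(register_module, flow_files):
        print("registered flow id:", flow_id)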
Pedro Henrique
05/25/2021, 3:19 PM

Pedro Henrique
05/25/2021, 3:20 PM

Chris L.
05/25/2021, 5:10 PM
Has anyone tried using big_future = client.scatter(big_dask_dataframe) and passing future = client.submit(func, big_future) as an output from one task to be used as an input in another task? I found this UserWarning at the bottom of the "prefect-etl" article in the dask docs (https://examples.dask.org/applications/prefect-etl.html) as well.
Was wondering if anybody else has encountered this issue, and whether there's a solution to it. Thank you in advance!
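For reference, this is the plain dask.distributed pattern that the UserWarning recommends (outside of Prefect; whether such a future survives being passed between Prefect tasks is exactly the open question above). Names and data here are illustrative:

import pandas as pd
from dask.distributed import Client

client = Client(processes=False)  # small local cluster just for illustration

big_dask_dataframe = pd.DataFrame({"x": range(1_000_000)})  # stand-in for a large object

def func(df):
    return df["x"].sum()

# Submitting the large object directly embeds it in the task graph and triggers the warning:
#     future = client.submit(func, big_dask_dataframe)
# Scattering it first moves the data to the workers and passes only a lightweight future:
big_future = client.scatter(big_dask_dataframe)
future = client.submit(func, big_future)
print(future.result())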
Joseph Hughes
05/25/2021, 5:41 PM

Chris DeNardo
05/25/2021, 6:03 PM

Hemanand Ramasamy
05/25/2021, 6:09 PM

Tim Enders
05/25/2021, 7:34 PM

Saksham Dixit
05/25/2021, 7:35 PM
Failed to load and execute Flow's environment: BadCredentialsException(401
Any advice on how to proceed to solve this error?
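BadCredentialsException comes from the GitHub client, so if the flow uses GitHub storage the usual first check is that the storage points at a valid token secret the agent can resolve. A minimal sketch, assuming GitHub storage is in play and a Prefect version whose GitHub storage accepts access_token_secret; the repo, path, and secret name are illustrative:

from prefect import Flow
from prefect.storage import GitHub

flow = Flow("example-flow")
flow.storage = GitHub(
    repo="my-org/my-repo",                      # illustrative: org/repo holding the flow file
    path="flows/example_flow.py",               # illustrative: path to the flow source in that repo
    access_token_secret="GITHUB_ACCESS_TOKEN",  # secret the agent resolves to authenticate
)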
Sam Cox
05/25/2021, 8:40 PM
We're seeing Failed to load and execute Flow's environment followed by a GitLab MaxRetryError, probably because we are reading from GitLab too many times in a given time period.
Using GitLab as storage is important for some of our current projects, but is there a way to retry a flow when the Flow's environment fails to load?
Pedro Henrique
05/25/2021, 8:59 PM

Alex Furrier
05/25/2021, 9:20 PM
from prefect.tasks.snowflake.snowflake import SnowflakeQuery
sf_task = SnowflakeQuery(
    query=SNOWFLAKE_QUERY,
    user=SNOWFLAKE_USER,
    password=SNOWFLAKE_PASSWORD,
    account=SNOWFLAKE_ACCOUNT,
    warehouse=SNOWFLAKE_ACCOUNT,
    database=SNOWFLAKE_DATABASE,
    role=SNOWFLAKE_ROLE,
    schema=SNOWFLAKE_SCHEMA,
)
ret = sf_task.run()
Going off the source code, I would expect that to return the query data in an iterable. This is based on the code within SnowflakeQuery that creates a cursor, connects, executes the query, and then returns the result of .fetchall() on the executed Snowflake cursor (basing my understanding of that on this).
What is actually returned by sf_task.run() is the Snowflake cursor, which appears to have already executed the query (it's of type snowflake.connector.cursor.SnowflakeCursor with a closed connection state).
However, there is metadata in ret.description, which is a list of tuples. I've tried this with a few different queries that should return data and it's the same result.
Any idea what's going on? I may be doing something obviously wrong but not seeing it.
I'm able to get the data by using the Snowflake connector and executing the query myself:
import snowflake.connector
import pandas as pd

# create connection
conn = snowflake.connector.connect(
    user=SNOWFLAKE_USER,
    password=SNOWFLAKE_PASSWORD,
    account=SNOWFLAKE_ACCOUNT,
    warehouse=SNOWFLAKE_ACCOUNT,
    database=SNOWFLAKE_DATABASE,
    role=SNOWFLAKE_ROLE,
    schema=SNOWFLAKE_SCHEMA,
)

# create cursor
curs = conn.cursor()

# execute SQL statement
cur = curs.execute(SNOWFLAKE_QUERY)

# load it into a DataFrame
df = pd.DataFrame.from_records(iter(cur), columns=[x[0] for x in cur.description])
df
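A possible workaround sketch (not the SnowflakeQuery task itself): wrap that same connector code in a Prefect task so the flow gets a DataFrame back. The task name and the way credentials are passed here are illustrative only:

import pandas as pd
import snowflake.connector
from prefect import task

@task
def snowflake_query_to_df(query, **conn_kwargs):
    # Open a connection, run the query, and return the rows as a DataFrame.
    conn = snowflake.connector.connect(**conn_kwargs)
    try:
        cur = conn.cursor()
        cur.execute(query)
        columns = [col[0] for col in cur.description]
        return pd.DataFrame.from_records(iter(cur), columns=columns)
    finally:
        conn.close()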
kevin
05/25/2021, 10:18 PM

Felipe Saldana
05/26/2021, 1:09 AM

Noah Holm
05/26/2021, 7:55 AM

Alex Souvannakhot
05/26/2021, 8:33 AM

Bruno Murino
05/26/2021, 11:59 AM

Dotan Asselmann
05/26/2021, 12:20 PM
AttributeError: 'Flow' object has no attribute 'terminal_state_handler'
Any idea what could be the reason? I'm running on a self-hosted Prefect server.
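One thing worth ruling out (not confirmed as the cause here): terminal_state_handler was added to Flow in a later Prefect release, so a mismatch between the Prefect version that registered the flow and the version on the server/agent deserializing it can surface as exactly this kind of missing-attribute error. A quick check in each environment:

import prefect

# Run this in the environment that registers the flow and in the
# server/agent environment; the versions should match.
print(prefect.__version__)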
Bruno Murino
05/26/2021, 1:48 PM

Bruno Murino
05/26/2021, 2:26 PM

Mark McDonald
05/26/2021, 2:27 PM

Kevin Kho

Robert Bastian
05/26/2021, 2:41 PM

Lukáš Polák
05/26/2021, 2:54 PM

Bruno Murino
05/26/2021, 4:02 PM

Kevin
05/26/2021, 4:07 PM

Pedro Henrique
05/26/2021, 7:43 PM

Claire Herdeman
05/26/2021, 9:02 PM

Bruno Murino
05/26/2021, 10:48 PM