Wieger Opmeer
11/22/2021, 10:47 PM
kwmiebach
11/22/2021, 11:22 PM
prefect orion start - some error messages appear, including sqlite no such table 'flow_run' - is there some db initialisation procedure that I missed?
kwmiebach
11/22/2021, 11:48 PM
Constantino Schillebeeckx
11/23/2021, 12:30 AM
1. I set PREFECT__LOGGING__LEVEL="DEBUG" within the run_config (ECSRunner) and I've confirmed that this env is set as such in the running container. However, within my running flow, when I check os.environ it's set to INFO.
2. I set up a logger in my shared code that I later import like a library: logger = logging.getLogger("dwh.utils.main"). I've set PREFECT__CLOUD__SEND_FLOW_RUN_LOGS='true' and PREFECT__LOGGING__EXTRA_LOGGERS="['dwh']". When I execute a flow that uses that shared code, I can't see it emit logs in CloudWatch or in Prefect Cloud; however, when I run the flow locally I do see the logging statement.
What am I missing?
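(A minimal sketch of the extra-loggers pattern, assuming the shared code lives under a top-level "dwh" package; the module path and env values below simply mirror the question and are not a confirmed fix.)

# dwh/utils/main.py -- shared library code
import logging

# Child loggers under the "dwh" namespace are picked up once "dwh" is listed
# in PREFECT__LOGGING__EXTRA_LOGGERS for the flow-run environment.
logger = logging.getLogger("dwh.utils.main")

def do_work():
    logger.info("log line from shared library code")

# Environment for the flow run (e.g. in the ECS task definition):
#   PREFECT__LOGGING__LEVEL="DEBUG"
#   PREFECT__CLOUD__SEND_FLOW_RUN_LOGS="true"
#   PREFECT__LOGGING__EXTRA_LOGGERS="['dwh']"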
Chris L.
11/23/2021, 4:10 AM
Matt Clapp
11/23/2021, 4:45 AM
supervisord monitoring their health."
What I'd like, and what my question is: is there documentation on just setting up a Docker image and having that be a self-contained container running a flow on AWS, without needing to kick it off from a local computer? I'd be happy to use Prefect Cloud for the UI. Is this a valid use case? Is there some reason there's not much documentation for it? (Or do I just not understand something basic?)
Thanks so much for any help.
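(A sketch under assumptions: one common pattern for this is registering the flow with Docker storage plus an ECS run_config, so after a one-time registration nothing runs on a local machine; the registry and project names below are placeholders.)

from prefect import Flow, task
from prefect.storage import Docker
from prefect.run_configs import ECSRun

@task
def hello():
    print("hello from ECS")

with Flow("self-contained-example") as flow:
    hello()

# The image is built and pushed once at registration time; afterwards an ECS
# agent picks up scheduled runs with no local machine involved.
flow.storage = Docker(
    registry_url="123456789.dkr.ecr.us-east-1.amazonaws.com",  # placeholder
    image_name="self-contained-example",
)
flow.run_config = ECSRun()
flow.register(project_name="examples")  # placeholder project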
Gaylord Cherencey
11/23/2021, 6:48 AM
I assume I need to pass authenticator to the connect method (which doesn't seem to be possible at the moment in the task). Is my assumption correct, or is there a magic env variable I can use? If not, is it a change I can request or implement myself in the repository?
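(A sketch, assuming this refers to Snowflake: until the built-in task exposes it, a custom task can pass authenticator to snowflake.connector.connect directly; account, user, and warehouse values are placeholders.)

import snowflake.connector
from prefect import task

@task
def query_snowflake(sql: str):
    # Workaround sketch: open the connection ourselves so any connect() keyword,
    # including authenticator, can be supplied.
    conn = snowflake.connector.connect(
        account="my_account",                 # placeholder
        user="my_user",                       # placeholder
        authenticator="externalbrowser",      # or e.g. an Okta URL
        warehouse="my_wh",                    # placeholder
    )
    cur = conn.cursor()
    try:
        cur.execute(sql)
        return cur.fetchall()
    finally:
        cur.close()
        conn.close()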
dammy arinde
11/23/2021, 2:14 PM
Jason Motley
11/23/2021, 4:32 PM
Maurits de Ruiter
11/23/2021, 4:45 PM
if status in ['SUCCEEDED', 'FAILED']:
fails with the following error:
TypeError: 'sequence' not supported between instances of 'str' and 'tuple'
If we log the type of the string and the array, their types are reported correctly.
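(A sketch based on an assumption: if the membership check runs while the Flow is being built, status may still be an unresolved Task or tuple rather than the string; moving the comparison into a task makes it run against the resolved value.)

from prefect import task

@task
def is_finished(status: str) -> bool:
    # Executed at flow-run time, so `status` is the resolved string here,
    # not a Task object created during Flow construction.
    return status in ("SUCCEEDED", "FAILED")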
Côme Arvis
11/23/2021, 5:50 PM
I have a task B which depends on an undefined-size list of other tasks [A1, A2, A3, ...] (Prefect therefore creates an implicit List task under the hood).
The thing is, some tasks in the list [A1, A2, A3, ...] can be skipped at runtime, but I still want B to be executed.
I currently can't achieve this, even if skip_on_upstream_skip=False is specified for B, since the implicit List task is skipped without being able to do anything (I receive None, and not a list of optional elements).
Is there a way to do it? Thanks!
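(A possible workaround, sketched under assumptions rather than taken from this thread: gather the upstream results with an explicit task that uses the all_finished trigger, so skipped upstreams don't skip the collector.)

from prefect import task
from prefect.triggers import all_finished

@task(trigger=all_finished, skip_on_upstream_skip=False)
def gather(*results):
    # Skipped upstream tasks typically surface here as None; keep the real
    # values so B still receives a (possibly shorter) list.
    return [r for r in results if r is not None]

# In the Flow block (sketch):
#   collected = gather(a1, a2, a3)
#   b = B(collected)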
Wesam Manassra
11/23/2021, 6:29 PM
When I pass a dockerfile to the prefect.environments.storage.Docker class, I get an error that looks like this:
shutil.Error "[Errno 63] File name too long: ['<Endless recursive path>']
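(A sketch of the dockerfile parameter under assumptions; one guess is that the build context ends up copying into itself, so the values below are placeholders, not a confirmed fix.)

from prefect.storage import Docker  # prefect.environments.storage.Docker is the older import path

storage = Docker(
    registry_url="my-registry.example.com",   # placeholder
    image_name="my-flow",                     # placeholder
    dockerfile="docker/Dockerfile",           # path to the custom Dockerfile (placeholder)
)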
Marwan Sarieddine
11/23/2021, 6:29 PM
Kevin
11/23/2021, 8:28 PM
Dotan Asselmann
11/24/2021, 10:29 AM
haf
11/24/2021, 10:49 AM
@task(
    nout=2,
    max_retries=10,
    retry_delay=timedelta(seconds=1),
)
def fetch_model_settings(
    dsn_params: DSNParams,
    app_id: UUID,
    default_model_group_id: UUID,
) -> ModelSettings:
    logger = prefect.context.get("logger")
    logger.info(
        f"Fetching model settings for app_id={app_id}, default_model_group_id={default_model_group_id}"
    )
    raise ValueError("WTF where are the logs")

# before flow:
prefect.config.logging.level = "DEBUG"
In the sample above I never get to see the WTF in the output when running the flow in the console/locally (PREFECT__LOGGING__LEVEL=DEBUG python flows/flow.py).
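(A sketch under an assumption: the task body, and therefore the logger.info call, only executes during a flow run, so if flows/flow.py only builds the flow, it needs to end in something like flow.run(); the parameter values below are placeholders.)

from prefect import Flow, Parameter

with Flow("fetch-model-settings") as flow:
    dsn_params = Parameter("dsn_params")
    app_id = Parameter("app_id")
    default_model_group_id = Parameter("default_model_group_id")
    fetch_model_settings(dsn_params, app_id, default_model_group_id)

if __name__ == "__main__":
    # Building the Flow alone never runs the task; flow.run() does, and that is
    # when the log line (and the ValueError) should appear.
    flow.run(parameters={
        "dsn_params": {},                                           # placeholder
        "app_id": "00000000-0000-0000-0000-000000000000",           # placeholder
        "default_model_group_id": "00000000-0000-0000-0000-000000000000",
    })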
Manuel Gomes
11/24/2021, 11:49 AM
upload_file() call, because big & binary). Since it's synchronous, I can trust the file will be where desired when the task succeeds.
The next task in this flow is to transcode this video. I do so by creating a mediaconvert boto3 client and sending a mess of JSON to its create_job(**args) method. This returns me... the job.
Now from what I've read, I should be able to use the Prefect built-in prefect.tasks.aws.client_waiter.AWSClientWait to wait for said job to finish (which is fine, at this point the workflow is serial/synchronous). Problem is... even when the job reports success (in the console, even!), it takes a while (minutes?!) for the transcoded movie to be present in the target bucket.
I would then need another wait task until I can find the file in the bucket's list of objects, possibly through prefect.tasks.aws.s3.S3List, before I can proceed to do further things to this transcoded video.
This conjunction sounds all too common not to have an integrated solution, unless I'm being dense (hah! no news there!) and not spotting an obvious one. Any guidance?
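(A sketch under assumptions, not an integrated Prefect solution: once the MediaConvert job reports done, a small task can block on boto3's built-in object_exists S3 waiter until the output actually lands; bucket and key names are placeholders.)

import boto3
from prefect import task

@task
def wait_for_s3_object(bucket: str, key: str) -> str:
    # Blocks until the object is visible in S3 (polls via the built-in waiter),
    # so the next task can safely read the transcoded file.
    s3 = boto3.client("s3")
    s3.get_waiter("object_exists").wait(Bucket=bucket, Key=key)
    return f"s3://{bucket}/{key}"

# In the Flow block (sketch):
#   transcoded = wait_for_s3_object("my-output-bucket", "videos/output.mp4")  # placeholders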
Marko Herkaliuk
11/24/2021, 3:00 PM
Emma Rizzi
11/24/2021, 3:32 PM
I'd use create_flow_run to launch part 2, synchronized with steps 1 and 3.
Considering I can configure some script to deploy the Agent on VM start, is it possible? I'm concerned about the main flow run starting before the VM's agent exists.
If you have suggestions on better ways to implement this with Prefect, I'm interested! So far I've only used the basic ECS agent 🙂
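(A sketch of the orchestration pattern, under assumptions about names and labels: the parent flow creates the part-2 run and blocks on it with wait_for_flow_run, so part 3 only starts once part 2 has finished; this doesn't by itself solve the "agent not up yet" race.)

from prefect import Flow
from prefect.tasks.prefect import create_flow_run, wait_for_flow_run

with Flow("parent") as flow:
    # ... part 1 tasks ...
    part2_run_id = create_flow_run(flow_name="part-2", project_name="my-project")  # placeholders
    part2_done = wait_for_flow_run(part2_run_id, raise_final_state=True)
    # ... part 3 tasks, made downstream of part2_done, e.g. via upstream_tasks=[part2_done]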
chicago-joe
11/24/2021, 3:41 PM
Michael Warnock
11/24/2021, 4:23 PM
André Petersen
11/24/2021, 4:28 PM
Ryan Brennan
11/24/2021, 4:38 PM
Is there a way to run dbt compile before executing dbt run?
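(A sketch with the DbtShellTask, assuming a Prefect 1.x flow; the profiles_dir and names are placeholders. The compile step is simply made an upstream dependency of the run step.)

from prefect import Flow
from prefect.tasks.dbt import DbtShellTask

dbt = DbtShellTask(profiles_dir=".")  # placeholder profiles location

with Flow("dbt-compile-then-run") as flow:
    compiled = dbt(command="dbt compile", task_args={"name": "dbt compile"})
    dbt(command="dbt run", task_args={"name": "dbt run"}, upstream_tasks=[compiled])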
André Petersen
11/24/2021, 4:51 PM
Slackbot
11/24/2021, 5:22 PM
Prashob Nair
11/24/2021, 6:23 PM
I'm using create_flow_run to start a new flow 2 mins after the previous task. I have set the parameter below as follows, but the flow run starts immediately instead. Please let me know where I'm going wrong. Thanks!
scheduled_start_time=pendulum.now().add(minutes=2),
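(A guess, sketched rather than confirmed: if pendulum.now() is evaluated while the Flow is being built or registered, the scheduled time may already be in the past by the time the run executes, so computing it inside a task at run time could be what's needed; flow and project names are placeholders.)

import pendulum
from prefect import Flow, task
from prefect.tasks.prefect import create_flow_run

@task
def two_minutes_from_now():
    # Evaluated at flow-run time, not at registration time.
    return pendulum.now("UTC").add(minutes=2)

with Flow("parent") as flow:
    start_at = two_minutes_from_now()
    create_flow_run(flow_name="child-flow", project_name="my-project",  # placeholders
                    scheduled_start_time=start_at)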
Pedro Machado
11/24/2021, 8:33 PM
I have a class that uses a requests session to make the API requests. This class also implements rate limiting.
I'd like to confirm that if I use the `LocalDaskExecutor` with threads, I can pass a single instance of the class to a mapped task and it will effectively rate limit across all mapped tasks. Also, is there a benefit to using a resource manager task to instantiate the class that queries the API?
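(A sketch of why a single shared instance can work with the threads scheduler, assuming the rate limiting is guarded by a lock: LocalDaskExecutor(scheduler="threads") keeps everything in one process, so the same object, and the same lock, is visible to every mapped task.)

import threading
import time
import requests

class RateLimitedClient:
    """One shared instance; the lock makes the rate limit hold across threads."""

    def __init__(self, min_interval: float = 1.0):
        self._session = requests.Session()
        self._lock = threading.Lock()
        self._min_interval = min_interval
        self._last_call = 0.0

    def get(self, url: str, **kwargs):
        with self._lock:
            wait = self._min_interval - (time.monotonic() - self._last_call)
            if wait > 0:
                time.sleep(wait)
            self._last_call = time.monotonic()
        return self._session.get(url, **kwargs)

With a process-based scheduler each worker would get its own pickled copy of the instance, and the limit would no longer be global, which is one argument for the threads scheduler here.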
haf
11/24/2021, 11:34 PM
aaron
11/25/2021, 1:15 AM
Aaron Ash
11/25/2021, 1:30 AM
Is there a way to get the project name of the currently executing flow from the context or somewhere else?
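(A sketch, assuming Prefect 1.x with Cloud/Server: the project doesn't appear to be in the run context itself, but it can be looked up from the flow_id via the GraphQL API; the query shape below is an illustration, not verified against this thread.)

import prefect
from prefect import Client, task

@task
def current_project_name() -> str:
    flow_id = prefect.context.get("flow_id")
    # Look the project up from the flow id via GraphQL.
    result = Client().graphql(
        """
        query($flow_id: uuid!) {
          flow(where: {id: {_eq: $flow_id}}) {
            project { name }
          }
        }
        """,
        variables={"flow_id": flow_id},
    )
    return result["data"]["flow"][0]["project"]["name"]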