Antoine_MFP
05/09/2023, 1:24 PM
Cody Peterson
05/10/2023, 12:04 AM
Simon Rascovsky
05/10/2023, 7:53 PM
Ryan Morshead
05/10/2023, 9:30 PM
Mike Grabbe
05/11/2023, 1:05 AM
Marvin
05/11/2023, 1:05 AM
Jeremiah
Matt Alhonte
05/11/2023, 7:24 PM
ECSWorker?
Simon Rascovsky
05/14/2023, 5:28 AM
Luca Vehbiu
05/15/2023, 11:50 AM
Tomer Friedman
05/15/2023, 1:02 PM
Jeremiah
Jeremiah
Jaime Raldua Veuthey
05/16/2023, 1:02 PM
Jaime Raldua Veuthey
05/16/2023, 1:03 PM
Stéphan Taljaard
05/16/2023, 2:11 PM
Tarek
05/16/2023, 2:39 PM
foo
I would import completely different functions and run them as tasks?
funcs = get_functions(foo)
run_funcs(funcs)  # but as tasks, not plain functions, to benefit from Prefect's task scheduling algorithms
Sean Kruzel
05/16/2023, 3:18 PM
Edmondo Porcu
05/16/2023, 7:10 PM
YSF
05/16/2023, 7:22 PM
prefect deployment build --infra process --storage-block azure/flowsville/health_test --name health-test --pool default-agent-pool --work-queue test --apply health_flow.py:health_check_flow
into the Python deployment object equivalent, please
Marco Barbero Mota
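A rough Python equivalent of that CLI call, assuming Prefect 2.x and its `Deployment.build_from_flow` API. The names are copied from the command; splitting `azure/flowsville/health_test` into a block name (`flowsville`) and a storage path (`health_test`) is an assumption about how the storage-block slug is structured:

```python
from prefect.deployments import Deployment
from prefect.filesystems import Azure
from prefect.infrastructure import Process

from health_flow import health_check_flow

# load the Azure storage block that the CLI referenced by slug
storage = Azure.load("flowsville")

deployment = Deployment.build_from_flow(
    flow=health_check_flow,
    name="health-test",
    work_pool_name="default-agent-pool",
    work_queue_name="test",
    storage=storage,
    path="health_test",          # assumed: trailing part of the slug is a path
    infrastructure=Process(),    # --infra process
)
deployment.apply()               # same effect as passing --apply on the CLI
```

This is a configuration sketch, not a tested script; it needs a Prefect API and the named Azure block to exist.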
05/17/2023, 6:16 PM
pickle package and pandas.read_pickle? The current pickle serializer available in Prefect does not allow reading the files that are saved by a task.
import pickle
from typing import Literal, TypeVar

from prefect.serializers import Serializer

D = TypeVar("D")

class PickleSerializer(Serializer):
    """
    Serializes data to pickle.
    """

    type: Literal["pickle"] = "pickle"

    def dumps(self, obj: D) -> bytes:
        """Encode the object into a blob of bytes."""
        return pickle.dumps(obj)

    def loads(self, blob: bytes) -> D:
        """Decode the blob of bytes into an object."""
        # fix: pickle.loads takes bytes; pickle.load expects a file object
        return pickle.loads(blob)
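The likely bug in the snippet above is `pickle.load` versus `pickle.loads`: `loads` decodes a bytes blob directly, while `load` reads from a file-like object, so `pickle.load(blob)` on bytes raises a TypeError. A stdlib-only illustration:

```python
import io
import pickle

data = {"a": 1, "b": [2, 3]}
blob = pickle.dumps(data)  # serialize to a bytes blob

# pickle.loads decodes bytes directly
assert pickle.loads(blob) == data

# pickle.load needs a file-like object; wrap the bytes in BytesIO
assert pickle.load(io.BytesIO(blob)) == data
```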
Matt Alhonte
05/18/2023, 12:07 AM
mondras
05/18/2023, 4:51 AM
mondras
05/18/2023, 3:21 PM
Ashe
05/18/2023, 3:25 PM
logger.py is not resulting in logs in the Prefect UI locally - what's wrong?
import logging
import os
import sys

import prefect
from prefect.logging.loggers import get_run_logger

def get_logger(name):
    logger = logging.getLogger(name)
    logger.setLevel(logging.INFO)  # Or any level you want
    os.environ["PREFECT__LOGGING__LEVEL"] = "INFO"
    os.environ["PREFECT__LOGGING__ROOT_LEVEL"] = "INFO"
    # Create a console handler
    ch = logging.StreamHandler()
    ch.setLevel(logging.INFO)  # Or any level you want
    # Create a formatter
    formatter = logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s")
    # Add the formatter to the handler
    ch.setFormatter(formatter)
    # Add the handler to the logger
    logger.addHandler(ch)
    return logger
mondras
05/18/2023, 4:16 PM
mondras
05/18/2023, 4:26 PM
Run poetry run prefect cloud login -k $PREFECT_API_KEY -w "xxxx/$PREFECT_WORKSPACE"
Usage: prefect cloud login [OPTIONS]
Try 'prefect cloud login --help' for help.
╭─ Error ───────────────────────────────────────────────────────────────────────╮
│ Got unexpected extra argument (xxxx/xxxx-staging)                             │
╰───────────────────────────────────────────────────────────────────────────────╯
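One common cause of that "unexpected extra argument" error is an empty `PREFECT_API_KEY`: when the variable expands to nothing, `-k` swallows `-w` as its value and the workspace handle is then parsed as a stray positional argument. A sketch of a guard for the CI step, with the variable names copied from the log; whether the secret is actually empty here is an assumption:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Fail loudly if a secret is unset or empty, instead of letting the
# CLI produce a confusing parse error downstream.
: "${PREFECT_API_KEY:?PREFECT_API_KEY is not set}"
: "${PREFECT_WORKSPACE:?PREFECT_WORKSPACE is not set}"

poetry run prefect cloud login -k "$PREFECT_API_KEY" -w "xxxx/$PREFECT_WORKSPACE"
```

This needs Prefect available in the CI image; the guard lines themselves are plain POSIX parameter expansion.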
Marco Barbero Mota
05/18/2023, 5:20 PM
from prefect.serializers import PickleSerializer as the serializer, with persist_result=True, result_storage=LocalFileSystem(basepath=settings.main_dir + '/results'), and result_storage_key='result.pkl' in the @task decorator? Ideally I want to use the pickle module functions.
Ashe
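Marco's decorator arguments can be sketched as below, assuming Prefect 2.x; the task body and the `/tmp/results` path are illustrative stand-ins (the question uses `settings.main_dir + '/results'`). Note that Prefect's built-in `PickleSerializer` base64-encodes the pickled bytes inside a JSON result envelope, which is likely why `pandas.read_pickle` cannot open the persisted files; a custom raw-bytes serializer like the one earlier in the thread is the usual workaround:

```python
from prefect import task
from prefect.filesystems import LocalFileSystem
from prefect.serializers import PickleSerializer


@task(
    persist_result=True,
    result_serializer=PickleSerializer(),             # base64-wrapped pickle
    result_storage=LocalFileSystem(basepath="/tmp/results"),
    result_storage_key="result.pkl",                  # fixed file name for the result
)
def compute():
    # stand-in payload; any picklable object works
    return {"a": [1, 2]}
```

This is a configuration sketch, not a tested run: persisting results requires a Prefect API and a flow context around the task call.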
05/18/2023, 5:54 PM
Marvin
05/18/2023, 5:55 PM
You can use the prefect deployment apply command followed by your deployment YAML file. For example:
$ prefect deployment apply catfacts_flow-deployment.yaml
You can find more information on deployments in the Prefect documentation.
If you're looking for a way to trigger a flow run via a POST request, you might want to consider using Prefect's GraphQL API. You can use a POST request to interact with the API and create a flow run.
I hope this helps, even if it's not the exact answer you were looking for. Humans and their specific requests, am I right?