• Brad I

    1 year ago
    Just wondering if the full graphql schema is hosted anywhere? I’m trying to use a codegen (https://www.graphql-code-generator.com/docs/getting-started/codegen-config) to generate bindings for a typescript/nodejs application. I tried to use https://api.prefect.io/graphql but it wasn’t able to download the schema.
    Brad I
    Chris White
    2 replies
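    A minimal sketch of one workaround when a codegen tool cannot introspect the endpoint directly: pull the schema yourself with an introspection query and hand the resulting file to the generator. This assumes the API expects an authorization header carrying a Prefect API key (not stated in the original message) and uses the requests and graphql-core libraries:
    import requests
    from graphql import build_client_schema, get_introspection_query, print_schema

    # Assumption: api.prefect.io/graphql requires an API key in the Authorization header.
    API_URL = "https://api.prefect.io/graphql"
    headers = {"Authorization": "Bearer <YOUR_PREFECT_API_KEY>"}

    resp = requests.post(API_URL, json={"query": get_introspection_query()}, headers=headers)
    resp.raise_for_status()

    # Convert the introspection JSON into SDL that graphql-code-generator can read from a file
    schema = build_client_schema(resp.json()["data"])
    with open("prefect.schema.graphql", "w") as f:
        f.write(print_schema(schema))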
  • haven

    1 year ago
    hi team, wondering if we could override/enrich the prefect context? i.e. I would like to have a task that does
    from prefect import Flow, context as prefect_context, task

    @task
    def enrich_context():
        prefect_context["custom_key"] = 1

    @task
    def main_task():
        value = prefect_context["custom_key"]
        print(value)

    with Flow("test-enrich_context") as flow:
        _enrich_context_task = enrich_context()
        main_task(upstream_tasks=[_enrich_context_task])
    haven
    Kevin Kho
    3 replies
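    A minimal sketch of one commonly suggested alternative, assuming that context changes made inside one task are not visible to other tasks in Prefect 1.x, so the value is passed along as a data dependency instead (task and flow names below are illustrative):
    from prefect import Flow, task

    @task
    def produce_value():
        # return the value instead of writing it into prefect.context
        return 1

    @task
    def consume_value(value):
        print(value)

    with Flow("test-pass-data") as flow:
        value = produce_value()
        consume_value(value)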
  • Salohy

    1 year ago
    Hi everyone, I am using Docker storage for my flow and get this error
    ModuleNotFoundError: No module named 'utils'
    when running
    python file.py
    It seems that my local package utils is not found during the Docker build. Here is the structure of my folder:
    src
    	utils/
    	file.py
    	__init__.py
    I am using a custom Docker image where I have already copied everything in src into the image. Please help me with this. Any help is appreciated 🙏 Many thanks already
    Salohy
    Greg Roche
    +1
    6 replies
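    A minimal sketch of one way this is often handled, assuming the flow is built with Prefect's Docker storage on top of the custom image; the registry, image name, and paths below are placeholders. The idea is to make sure the utils package is both present in the image and on PYTHONPATH when the flow is loaded inside the container:
    from prefect.storage import Docker

    # Hypothetical values; adjust to your own image and to wherever your Dockerfile COPYs src
    storage = Docker(
        base_image="my-registry/my-custom-image:latest",
        files={"/absolute/local/path/to/src/utils": "/src/utils"},  # ship the package into the image
        env_vars={"PYTHONPATH": "/src"},  # so `import utils` resolves inside the container
    )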
  • Italo Barros

    1 year ago
    Hello everyone, is it possible to execute multiple flows where each one uses a specified virtualenv (I'm using conda, by the way)? For example, let's suppose that I have two envs with different libraries in each one, one in Python 3.6 (env_1) and another in Python 3.8 (env_2), and I want to run two Flows, one using env_1 and another using env_2. I know that a possible approach would be using a containerized application, but since I do some file/folder manipulation on the host, I think that would be tricky to do with Docker (please correct me if I'm wrong). Note: I'm using Prefect Cloud.
    Italo Barros
    davzucky
    +1
    3 replies
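    A minimal sketch of one way to do this without containers, assuming a separate local agent is started inside each conda environment with its own label; the label and flow names below are illustrative:
    # Start one agent per conda environment, e.g.:
    #   (env_1) prefect agent local start --label py36-env
    #   (env_2) prefect agent local start --label py38-env
    from prefect import Flow
    from prefect.run_configs import LocalRun

    with Flow("flow-for-env-2") as flow:
        ...  # tasks that need the Python 3.8 environment

    # Only agents carrying a matching label will pick this flow up
    flow.run_config = LocalRun(labels=["py38-env"])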
  • Kien Nguyen

    1 year ago
    Hi guys, is there any guide for monitoring a Prefect Agent using tools like New Relic or Datadog?
    Kien Nguyen
    Italo Barros
    4 replies
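    A minimal, hedged sketch of one approach: if the agent is started with an address that exposes its health endpoint (e.g. prefect agent local start --agent-address http://localhost:8080, assuming your Prefect version supports that flag), a Datadog/New Relic-style HTTP check can simply poll it:
    import requests

    # Assumption: the agent exposes /api/health when started with --agent-address;
    # adjust host and port to your setup.
    def agent_is_healthy(url="http://localhost:8080/api/health", timeout=5):
        try:
            return requests.get(url, timeout=timeout).ok
        except requests.RequestException:
            return False

    if __name__ == "__main__":
        print("healthy" if agent_is_healthy() else "unhealthy")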
  • Xyp Jn

    1 year ago
    Hi people, I'm getting this weird error while using a custom Docker image with DockerRun on Prefect Server. The error pops up immediately after the agent pulls the image successfully, and no task has been executed yet. I didn't use
    rstrip
    anywhere in my code, and the flow runs smoothly with LocalRun. I wonder if it's possible for me to get a full stack trace for this error... Thanks for your help!
    Xyp Jn
    Kevin Kho
    6 replies
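    A minimal sketch of one way to get more detail out of the run, assuming the Prefect 1.x convention that the flow run's logging level can be raised via an environment variable passed through the run config; the image name below is a placeholder for the custom image in question:
    from prefect.run_configs import DockerRun

    # Assumption: PREFECT__LOGGING__LEVEL is honoured inside the flow-run container
    run_config = DockerRun(
        image="my-registry/my-image:latest",
        env={"PREFECT__LOGGING__LEVEL": "DEBUG"},
    )
    # attach it with: flow.run_config = run_config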
  • Tim Enders

    1 year ago
    How do I change the log level to debug when running a flow with
    prefect run
    ?
    Tim Enders
    1 reply
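    A minimal, hedged sketch of one approach: raising the log level through Prefect's environment-variable config before invoking the CLI; this assumes your version of prefect run supports --name and --execute, and "my-flow" is an illustrative flow name:
    import os
    import subprocess

    # Assumption: PREFECT__LOGGING__LEVEL controls the log level where the flow executes,
    # so --execute (in-process execution) is used here; for agent-submitted runs the variable
    # would need to be set where the flow actually runs (e.g. via the run config).
    env = {**os.environ, "PREFECT__LOGGING__LEVEL": "DEBUG"}
    subprocess.run(["prefect", "run", "--name", "my-flow", "--execute"], env=env, check=True)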
  • Philip MacMenamin

    1 year ago
    Hi - can someone point out what I'm missing in the following setup? I have a Server stood up and a separate machine running:
    prefect agent local start --api http://<server_IP>:4200
    Philip MacMenamin
    Kevin Kho
    23 replies
  • Pedro Machado

    1 year ago
    Hi there. I am using the new
    get_task_run_result
    and
    create_flow_run
    tasks described here. I'd like to treat the child flow as a single unit that can be retried or restarted if the flow fails. Currently, these are two separate tasks and I haven't been able to set them up this way. How could I change my flow to support retries/restart when the child flow fails?
    Pedro Machado
    Kevin Kho
    4 replies
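    A minimal sketch of one pattern that is sometimes suggested, assuming it is acceptable to call the shipped tasks' .run() methods inside a single wrapper task, so that a retry or restart re-creates the child flow run and re-fetches its result as one unit; flow, project, and slug names below are placeholders:
    from datetime import timedelta

    from prefect import task
    from prefect.tasks.prefect import create_flow_run, get_task_run_result

    @task(max_retries=2, retry_delay=timedelta(minutes=1))
    def run_child_and_get_result(task_slug: str):
        # both steps happen inside one task, so a retry or restart repeats the whole unit
        flow_run_id = create_flow_run.run(flow_name="child-flow", project_name="my-project")
        return get_task_run_result.run(flow_run_id, task_slug)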
  • Joe Hamman

    1 year ago
    Hi folks - we’re setting up a new prefect agent in our Kubernetes cluster on Google Cloud and I could use some pointers on getting the correct permissions configured. I’m starting with this:
    prefect agent kubernetes install -k $KEY --namespace=staging --rbac | kubectl apply --namespace=staging -f -
    I then submit a flow that uses the
    KubernetesRun
    config and
    GCS
    storage configured as:
    run_config = KubernetesRun(cpu_request=2, memory_request="2Gi", image='gcr.io/carbonplan/hub-notebook:c89f7f1', env={'TZ': 'UTC'})
    storage = GCS("carbonplan-scratch", project='carbonplan')
    This results in an error message like this:
    └── 23:15:59 | INFO    | Submitted for execution: Job prefect-job-d8a8a648
    └── 23:16:05 | INFO    | Entered state <Failed>: Failed to load and execute Flow's environment: Forbidden('GET https://storage.googleapis.com/storage/v1/b/carbonplan-scratch?projection=noAcl&prettyPrint=false: Caller does not have storage.buckets.get access to the Google Cloud Storage bucket.')
    So, I gather my agent doesn’t have the correct IAM privileges to read the flow from GCS. Next I tried adding a service account to my agent:
    prefect agent kubernetes install -k $KEY --namespace=staging --service-account-name pangeo --rbac | kubectl apply --namespace=staging -f -
    Here I’m pointing to my kubernetes service account called
    pangeo
    which has been given
    storage.objectAdmin
    permissions. However, this results in the same error as above. So now I’m wondering if I’m missing something more fundamental here. If anyone has suggestions on where to look for more details on setting up Prefect on GKE, I’d certainly appreciate it.
    Joe Hamman
    Kevin Kho
    +1
    7 replies
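    A minimal, hedged sketch of one thing worth checking: the pod that loads the flow from GCS is the job created from the KubernetesRun config, not the agent pod, so that job also needs to run with an identity that can read the bucket. Assuming the pangeo service account exists in the same namespace and is bound to the GCS permissions (e.g. via Workload Identity), it can be attached to the flow-run job like this:
    from prefect.run_configs import KubernetesRun
    from prefect.storage import GCS

    run_config = KubernetesRun(
        cpu_request=2,
        memory_request="2Gi",
        image="gcr.io/carbonplan/hub-notebook:c89f7f1",
        env={"TZ": "UTC"},
        service_account_name="pangeo",  # the flow-run pod, not only the agent, needs GCS access
    )
    storage = GCS("carbonplan-scratch", project="carbonplan")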