  • Yusuf Khan

    Yusuf Khan

    8 months ago
    Looking at this: https://github.com/PrefectHQ/prefect/issues/3375 I'm trying to get ShellTask to run PowerShell. I have the PowerShell executable on the PATH on the machine, as "pwsh".
    run_in_pwsh = ShellTask(name="Powershell run", shell="pwsh")
    this is what I was trying to run. Then within the flow I had:
    run_in_pwsh(command='ls')
    The documentation for the shell argument says 'shell to run the command with; defaults to "bash"'. I assumed it would accept any shell name, as long as that name launches correctly in a terminal? What I'm actually trying to do is run an Azure command-line utility called 'azcopy' (which is not part of the generic az CLI). I need to do it on both a Windows machine and a Linux machine. Having separate scripts is fine. Any thoughts on how I could/should do this on Windows?
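    For anyone searching later: a shell-invoking task generally hands the command string to the shell executable, and pwsh accepts a command via -c just like bash does. A library-free sketch of the cross-platform switch, using plain subprocess to stand in for ShellTask; the azcopy command is an assumption about what you'd run:

```python
import platform
import subprocess

def run_in_shell(command: str) -> str:
    # pwsh takes a command via -c just like bash, so a single switch on
    # the OS is enough to share one flow across Windows and Linux.
    shell = "pwsh" if platform.system() == "Windows" else "bash"
    result = subprocess.run(
        [shell, "-c", command], capture_output=True, text=True, check=True
    )
    return result.stdout

# e.g. run_in_shell("azcopy --version"), provided azcopy is on PATH
```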
    Yusuf Khan
    Kevin Kho
    34 replies
  • Jason Motley

    Jason Motley

    8 months ago
    Has anyone ever run into an issue where they get the error 'NoneType' object has no attribute 'to_sql', but only in production? I've spot-checked and my load statement is constructed identically, registration is fine, etc. Local runs go fine as well.
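    For reference, that error means the object .to_sql is called on is None, typically because some environment-specific branch of the extract/transform step returns nothing. A minimal sketch of the failure mode plus a defensive check; the function names and branch are illustrative, not from the original flow:

```python
import pandas as pd

def extract(prod: bool):
    df = pd.DataFrame({"x": [1, 2]})
    if not prod:
        return df
    # A branch that forgets to return implicitly yields None, which later
    # surfaces as: 'NoneType' object has no attribute 'to_sql'

def load(df, engine):
    if df is None:
        # Fail early with a clear message instead of the opaque AttributeError
        raise ValueError("upstream step returned None; check the prod-only branch")
    df.to_sql("my_table", engine, if_exists="append")
```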
    Jason Motley
    Kevin Kho
    3 replies
  • Andrey Tatarinov

    Andrey Tatarinov

    8 months ago
    Hi! I'm trying to set up third-party (sqlalchemy) logs so they appear in the flow logs. I'm using Docker storage and a KubernetesAgent. I do something similar to https://docs.prefect.io/core/concepts/logging.html#extra-loggers in my register_flow.py script, and I see some logs while Prefect is capturing the flow object for serialization, so I know it's working there. But during the run, no logs from sqlalchemy are written. It seems the logging configuration is not serialized along with the flow. What is the preferred way to set up third-party logging in flows?
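    In Prefect 1.x, config options can also be set through environment variables, so one approach (assuming the standard PREFECT__ env-var mapping for logging.extra_loggers) is to set the extra loggers in the run environment itself, since logging setup done in register_flow.py is not serialized with the flow:

```shell
# Set in the runtime environment (e.g. the KubernetesRun env or the
# agent's env), not only in the registration script:
export PREFECT__LOGGING__EXTRA_LOGGERS='["sqlalchemy"]'
```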
    Andrey Tatarinov
    Kevin Kho
    8 replies
  • Philip MacMenamin

    Philip MacMenamin

    8 months ago
    I've stood up a Prefect Server instance within a network. I can reach ip:8080 and ip:4200, but the UI at 8080 tells me it cannot connect:
    Connecting to Prefect Server at ip/graphql:4200
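    For anyone hitting the same thing: the UI served on :8080 runs in your browser and calls the Apollo GraphQL endpoint directly, so it defaults to localhost:4200/graphql. It can be pointed at the server's real address via config.toml; the exact key below assumes the Prefect 1.x server config layout:

```toml
# ~/.prefect/config.toml on the machine running the server
[server.ui]
apollo_url = "http://<server-ip>:4200/graphql"
```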
    Philip MacMenamin
    2 replies
  • Jon Ruhnke

    Jon Ruhnke

    8 months ago
    I'm having a problem creating a Prefect flow to process a large number of XML files (40k) without running out of RAM. Is this the right place to ask for help?
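    This is a good place to ask. Independent of Prefect, the usual trick is to stream each file and never hold all 40k parse results in memory at once; a minimal stdlib sketch of that pattern (the directory layout and the per-file work are assumptions):

```python
import xml.etree.ElementTree as ET
from pathlib import Path

def count_elements(path: Path) -> int:
    # iterparse streams the document instead of building the whole tree
    n = 0
    for _, elem in ET.iterparse(path, events=("end",)):
        n += 1
        elem.clear()  # release each element as soon as it's handled
    return n

def process_all(root: Path):
    # Generator: only one file's data is alive at any moment
    for path in sorted(root.glob("*.xml")):
        yield path.name, count_elements(path)
```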
    Jon Ruhnke
    Kevin Kho
    18 replies
  • Greg Adams

    Greg Adams

    8 months ago
    Hi again! Is there a storage/deployment pattern that lets me include custom modules with my flow at registration time, rather than rebuilding the Docker image whenever I want to update them? I thought Git storage might include the extra Python files, but it's not liking it (maybe I'm doing it all wrong?)
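    One pattern that avoids rebuilding the image, assuming the stock prefecthq/prefect images are in use (they pip-install anything listed in EXTRA_PIP_PACKAGES when the container starts), is to package the custom modules and install them at run time; the repo URL below is a placeholder:

```shell
# Set on the flow's run config / container environment; the official
# Prefect images pip-install this list at container start.
export EXTRA_PIP_PACKAGES="git+https://github.com/<your-org>/<your-module>.git"
```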
    Greg Adams
    Kevin Kho
    4 replies
  • Josh

    Josh

    8 months ago
    I’m running into a mypy issue with Prefect Tasks. Mypy produces the error
    error: <nothing> not callable
    for the task run when I try to test it out:
    class MyTask(Task):
        def run(self): 
            # do something
            return True
    
    if __name__ == "__main__":
        my_task = MyTask()
        with Flow("My Flow") as flow:
            my_task()
        flow.run()
    Josh
    3 replies
  • Amber Papillon

    Amber Papillon

    8 months ago
    Hey guys, quick question. Has this been implemented yet? https://github.com/PrefectHQ/prefect/issues/2254
    Amber Papillon
    Kevin Kho
    4 replies
  • Philipp Eisen

    Philipp Eisen

    8 months ago
    EDIT: This was because my Dask runners were not accessing the same Orion database. Hey! I was testing running Orion with a Dask cluster deployed in Kubernetes; I’m starting the flow locally and pointing at the Dask cluster on localhost (using port-forwarding for the scheduler). With a local Dask cluster it works fine, but with the Kubernetes one I’m always getting this error:
    distributed.worker - WARNING - Compute Failed
    Function:  orchestrate_task_run
    args:      ()
    kwargs:    {'task': <prefect.tasks.Task object at 0x7f02f0f56430>, 'task_run': TaskRun(id=UUID('2acf899f-67f5-4717-9665-c91f730f3719'), created=datetime.datetime(2022, 1, 16, 11, 45, 38, 995585, tzinfo=datetime.timezone.utc), updated=datetime.datetime(2022, 1, 16, 11, 45, 39, 19000, tzinfo=datetime.timezone.utc), name='get-product-b7ee3036-0', flow_run_id=UUID('a3e75090-2e7d-42f6-8dda-b00600f70b12'), task_key='b7ee3036fbe1354fe2fbf30215a316c4', dynamic_key='0', cache_key=None, cache_expiration=None, task_version=None, empirical_policy=TaskRunPolicy(max_retries=10, retry_delay_seconds=0.0), tags=[], state_id=UUID('36aa692f-175d-4bff-81ed-e57f2228cdfa'), task_inputs={}, state_type=StateType.PENDING, run_count=0, expected_start_time=datetime.datetime(2022, 1, 16, 11, 45, 38, 988955, tzinfo=datetime.timezone.utc), next_scheduled_start_time=None, start_time=None, end_time=None, total_run_time=datetime.timedelta(0), estimated_run_time=datetime.timedelta(0), estimated_start_time_delta=datetime.timedelta
    Exception: "ValueError('Invalid task run: 2acf899f-67f5-4717-9665-c91f730f3719')"
    Is there something obvious I’m missing?
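    The EDIT above is the key: with the default local SQLite database, each Dask worker sees its own empty Orion database, so the task-run IDs created by the flow process look invalid to the workers. Pointing every process at one shared database avoids it; the variable name is from the Orion beta, and the connection URL is a placeholder:

```shell
# Same connection URL for the local flow process and all Dask workers:
export PREFECT_ORION_DATABASE_CONNECTION_URL="postgresql+asyncpg://user:pass@db-host/orion"
```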
    Philipp Eisen
    Anna Geller
    3 replies
  • Tao Bian

    Tao Bian

    8 months ago
    Hi, I have a flow scheduled to run daily, and I tried to get the timestamp inside the flow. Why do I get the exact same timestamp written to the database every day?
    @task
    def write_timestamp_into_database():
        ...
    
    with Flow("sample-flow", daily_schedule) as flow:
        timestamp = str(datetime.datetime.now())
        write_timestamp_into_database(timestamp)
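    What happens here: str(datetime.datetime.now()) executes once, when the flow is built/registered, so a constant string is baked into the flow; only code inside tasks runs again on each scheduled run. The fix is to move the now() call inside a task. A library-free sketch of the same build-time vs run-time distinction (function names are illustrative):

```python
import datetime

def build_flow_eagerly():
    # Evaluated once at "registration" time: the value is frozen here,
    # like calling now() directly in the `with Flow(...)` block.
    ts = str(datetime.datetime.now())
    return lambda: ts

def build_flow_lazily():
    # Deferred, like calling now() inside a @task: recomputed every run.
    return lambda: str(datetime.datetime.now())
```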
    Tao Bian
    Anna Geller
    2 replies