Powered by Linen
prefect-community
  • s

    Slackbot

    10/28/2022, 7:34 PM
    This message was deleted.
    3 replies · 3 participants
  • l

    Luca Schneider

    10/28/2022, 8:06 PM
    Hi all, I’m not sure how I’m supposed to leverage a secret string from a customised block. I’m trying to run a shell command where I pass a password.
    5 replies · 2 participants
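A pattern that often helps here: read the secret out of the block with `.get_secret_value()` and hand it to the child process via its environment rather than on the command line (argv is visible in `ps`). A minimal stdlib-only sketch, with a plain string standing in for the block's `SecretStr` field:

```python
import os
import subprocess

# Stand-in for `my_block.password.get_secret_value()` on a custom block
# whose field is a pydantic SecretStr.
password = "s3cret"

# Pass the secret through the environment, not the command line.
env = dict(os.environ, TOOL_PASSWORD=password)
result = subprocess.run(
    ["sh", "-c", 'printf "%s" "$TOOL_PASSWORD"'],
    env=env,
    capture_output=True,
    text=True,
)
print(result.stdout)
```

The block and field names above are assumptions; the point is only the environment-variable hand-off.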
  • k

    Kirill Popov

    10/28/2022, 8:22 PM
    hey folks, looking for some help with setting up Prefect 2 on K8s. When running an agent in Kubernetes, the agent does not receive logs from workers... among other things, this means I cannot see whether a flow run succeeded or failed from the agent logs. Here is my setup: • all flows are stored in S3 • all flows run on the same infra block, which is a KubernetesJob (a new pod is spun up for each flow) • using Prefect Cloud to orchestrate. Everything was great when I was running the prefect agent locally: the agent produces logs (CLI output) similar to:
    19:41:45.583 | INFO    | prefect.agent - Submitting flow run 'd03564b9-ad4e-4310-8662-184616d6406f'
    19:41:47.416 | INFO    | prefect.agent - Completed submission of flow run 'd03564b9-ad4e-4310-8662-184616d6406f'
    19:41:48.471 | INFO    | prefect.infrastructure.kubernetes-job - Job 'expert-lyrebird-xjq42': Pod has status 'Pending'.
    19:41:49.155 | INFO    | prefect.infrastructure.kubernetes-job - Job 'expert-lyrebird-xjq42': Pod has status 'Running'.
    19:41:52.098 | INFO    | Flow run 'expert-lyrebird' - Created task run 'My Example Task-c06c9343-0' for task 'My Example Task'
    19:41:52.098 | INFO    | Flow run 'expert-lyrebird' - Executing 'My Example Task-c06c9343-0' immediately...
    19:41:52.345 | INFO    | Task run 'My Example Task-c06c9343-0' - Finished in state Completed()
    19:41:52.443 | INFO    | Flow run 'expert-lyrebird' - Finished in state Completed()
    When running the agent locally but in a Docker container, the logs look similar. Now I want the agent to run in a Kubernetes deployment to automatically manage its uptime. Only 1 replica of the agent pod is used. However, I only receive part of the log in the agent CLI output; it looks like no info is coming back from the worker pods to the agent pod...
    19:59:58.687 | INFO    | prefect.agent - Submitting flow run 'a510b13d-b0b5-4067-8610-518e6a2b45a8'
    19:59:59.268 | INFO    | prefect.agent - Completed submission of flow run 'a510b13d-b0b5-4067-8610-518e6a2b45a8'
    19:59:59.305 | INFO    | prefect.infrastructure.kubernetes-job - Job 'secret-iguana-zlm4f': Pod has status 'Pending'.
    20:00:01.349 | INFO    | prefect.infrastructure.kubernetes-job - Job 'secret-iguana-zlm4f': Pod has status 'Running'.
    What is going on? How can I make the agent running on K8s produce a complete log? Let me know if you have any ideas. I am fairly new to k8s, so it might be something basic, but it looks unexpected to me.
    👀 1
    ⁉️ 1
    19 replies · 2 participants
  • y

    YD

    10/28/2022, 11:31 PM
    Hi. When using Prefect 2, I assume we still need the agent to run in the background all the time. Do we still need to run something like
    nohup prefect agent start -q …  > ~/tmp/prefect_agent.log &
    or is there a different way to keep the agent running? Is there a best practice around this in the docs? I have not found it so far. Thanks
    ✅ 1
    5 replies · 4 participants
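Beyond `nohup`, a common way to keep a long-running agent alive on a Linux host is a process supervisor such as systemd. A sketch of a unit file; the paths, user, and queue name are placeholders:

```ini
# /etc/systemd/system/prefect-agent.service (illustrative)
[Unit]
Description=Prefect agent
After=network-online.target

[Service]
User=prefect
ExecStart=/usr/local/bin/prefect agent start -q main
Restart=always
RestartSec=5

[Install]
WantedBy=multi-user.target
```

Enable it with `systemctl enable --now prefect-agent`; journald then captures the output that the `> ~/tmp/prefect_agent.log` redirection was collecting.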
  • a

    Ahmed Ezzat

    10/29/2022, 12:23 AM
    I saw a new feature to restart flows from a failed/crashed state getting released a while ago, but I can't find it in the UI. Is it still not ready?
    ✅ 1
    5 replies · 3 participants
  • s

    Sander

    10/29/2022, 8:22 PM
    Hi, I’m trying to create a deployment with a MinIO object store as the storage layer. However, it seems that the flow code is not registered. I’m trying to debug this, but I can’t seem to adjust the log level of Orion. I’ve tried setting PREFECT_LOGGING_SERVER_LEVEL to DEBUG, but I still only see INFO logging from my server. I’m running my own server, so no Cloud here.
    ✅ 1
    15 replies · 2 participants
  • t

    Tyson Chavarie

    10/29/2022, 10:08 PM
    I just started getting API errors on all my scheduled flows on Prefect 1
    ✅ 1
    15 replies · 3 participants
  • t

    Tyson Chavarie

    10/29/2022, 10:23 PM
    All my flows are failing..
    ✅ 1
  • t

    Tyson Chavarie

    10/29/2022, 10:27 PM
    Anyone monitoring chat tonight? my error is
    Failed to load and execute flow run: ClientError([{'path': ['secret_value'], 'message': 'Unable to complete operation', 'extensions': {'code': 'API_ERROR'}}])
    we are hard down...
    ✅ 1
  • p

    pk13055

    10/30/2022, 10:29 AM
    Hi, I am trying to replicate a local deployment of prefect (incl. orion + agent) on a remote server. However, I get an error message in the orion logs after changing PREFECT_ORION_API_HOST from 0.0.0.0 to SERVER_IP. The particular error is:
    ERROR:    [Errno 99] error while attempting to bind on address ('SERVER_IP', 4200): cannot assign requested address
    This does not make sense, since I can specifically access the same SERVER_IP through the browser, and the particular port has been allowed through the firewall as well. (PS: orion appears to connect for a brief second, then immediately throws this error and attempts to restart.) Additionally, attached below is the relevant section of my `docker-compose.yml`:
    version: "3.9"
    services:
      db:
        image: timescale/timescaledb:latest-pg14
        volumes:
          - $PWD/data/db:/var/lib/postgresql/data
          - $PWD/config/db:/docker-entrypoint-initdb.d/
        healthcheck:
          test: [ "CMD-SHELL", "pg_isready" ]
          interval: 10s
          timeout: 5s
          retries: 5
        environment:
          - POSTGRES_DB=$POSTGRES_DB
          - POSTGRES_USER=$POSTGRES_USER
          - POSTGRES_PASSWORD=$POSTGRES_PASSWORD
        networks:
          - db_network
    
      orion:
        image: prefecthq/prefect:2.6.3-python3.10
        restart: always
        volumes:
          - $PWD/prefect:/root/.prefect
        entrypoint: [ "prefect", "orion", "start" ]
        environment:
          - PREFECT_ORION_API_HOST=$SERVER_IP
          - PREFECT_ORION_DATABASE_CONNECTION_URL=$PREFECT_DB_URL
        ports:
          - 4200:4200
        depends_on:
          - db
        networks:
          - prefect_network
          - db_network
    
      agent:
        build: ./prefect
        restart: always
        entrypoint: [ "prefect", "agent", "start", "-q", "main_queue" ]
        environment:
          - PREFECT_API_URL=http://orion:4200/api
        networks:
          - prefect_network
          - db_network
    
      cli:
        build: ./prefect
        entrypoint: "bash"
        working_dir: "/root/flows"
        volumes:
          - "$PWD/prefect/flows:/root/flows"
        environment:
          - PREFECT_API_URL=http://orion:4200/api
        networks:
          - prefect_network
          - db_network
    
    networks:
      db_network:
      prefect_network:
    ✅ 1
    2 replies · 2 participants
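For what it's worth, the bind error is expected here: inside the container, the process can only bind addresses that exist in the container's network namespace, and the host's SERVER_IP does not. The usual fix is to keep the server listening on 0.0.0.0 and let the published port expose it on the host IP; a sketch of the relevant compose fragment:

```yaml
# Inside the container, bind all interfaces; reach the API from outside
# via SERVER_IP:4200 thanks to the published port.
orion:
  environment:
    - PREFECT_ORION_API_HOST=0.0.0.0
  ports:
    - "4200:4200"
```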
  • r

    Rabea Yousof

    10/30/2022, 10:44 AM
    I'm a new user. I want to save the flows to my PostgreSQL database. I added the connection and the prefect tables were added to my database, and I have a record in the configuration table, but the flows and the logs are not inserted into my PostgreSQL database. I don't know what I'm missing; can someone help me please?
    ✅ 1
    2 replies · 2 participants
  • d

    Dzmitry Aliashkevich

    10/30/2022, 2:32 PM
    Hi, folks, I’m struggling to find a solution for one particular use case in Prefect 1.0 regarding upstream/downstream dependencies; details are in the thread. Would appreciate any help, thanks.
    8 replies · 2 participants
  • s

    Stephen Herron

    10/30/2022, 4:51 PM
    Hey, trying to figure out CI/CD for Prefect 2. Everything I’ve read so far is about creating deployments from flows. That’s fine, but what about customised deployments? I figure the approach now is one flow, many deployments per flow, so custom deploys would need to be written (i.e. passing in the parameters, the name of the deployment, etc.). Options considered: • devs create the .yml and GitHub Actions or the like applies those on deployment (e.g. refresh_some_dbt_subdag, model_selector=something) • create deployments from Python files. Is there any place I can look for how custom deployments can be managed?
    ✅ 1
    2 replies · 2 participants
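For the GitHub Actions route, one hedged sketch of a deploy step; the flow path, queue, parameter, and file names are all placeholders:

```yaml
# Illustrative CI step: build a deployment YAML from a flow, then apply it.
- name: Deploy flows
  run: |
    prefect deployment build flows/etl.py:etl \
      --name prod --work-queue main \
      --params '{"model_selector": "something"}' \
      --output etl-deployment.yaml
    prefect deployment apply etl-deployment.yaml
```

Custom per-deployment settings (name, parameters, schedule) can then live in the per-flow YAML files that CI applies.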
  • a

    Adam

    10/30/2022, 9:55 PM
    Hello friends, I’m trying to use the Prefect Cloud REST API, but I can’t find any information about authentication in the docs. I assume there must be some header I can put my API key in. If someone can point me in the right direction, that would be great.
    ✅ 1
    3 replies · 2 participants
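Prefect Cloud's REST API takes the API key as a bearer token in the `Authorization` header. A stdlib sketch that only builds the request (the key and the account/workspace IDs in the URL are placeholders):

```python
from urllib.request import Request

API_KEY = "pnu_xxxxxxxx"  # placeholder Prefect Cloud API key

# Cloud endpoints are rooted under an account and workspace (IDs are placeholders).
url = (
    "https://api.prefect.cloud/api"
    "/accounts/11111111-1111-1111-1111-111111111111"
    "/workspaces/22222222-2222-2222-2222-222222222222"
    "/flows/"
)
req = Request(url, headers={"Authorization": f"Bearer {API_KEY}"})
print(req.get_header("Authorization"))
```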
  • How can I leverage tags for better organization (and search from the UI) of metadata related to my flow runs?

    merlin

    10/30/2022, 9:59 PM
    Hello, I'm struggling with the entrypoint in deployments, I think because of the way I've called the flow when running it as a script. I'm calling the flow and assigning a dynamic tag at runtime, after instantiating an object to pass as a parameter to the flow:
    # i've left out all the imports and task definitions
    
    # trino_flows.py
    @flow(name="extract write")
    def extract_write(config):
        logger = get_run_logger()
        logger.info(f"extract file: {config.filepath}")
    
        sql = load_sqlfile(config.filepath)
        trino_cnxn = trino_connect()
        data = send_query(trino_cnxn, sql)
        write_output(data, config.outfile_path)
    
    
    # file: extract_write.py
    filepath = Path(sys.argv[1])
    extract_config = ExtractConfig(filepath=filepath)
    
    with tags(extract_config.dataset_name, "extract"):
        extract_write(extract_config)
    In development I'm calling the script with:
    python src/extract_write.py src/extracts/weekly_date.sql
    So the ExtractConfig object creates a dataset_name, rundate, and filepath field used by the flow code. How do I build/apply a deployment when I'm passing an object to the flow function in my script?
    ✅ 1
    17 replies · 2 participants
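One way around this, sketched without the prefect imports so it runs standalone: deployment parameters need to be serializable, so give the flow a plain-string parameter and construct the ExtractConfig inside the flow; a deployment can then pass `{"filepath": "..."}`. The ExtractConfig stub here is an assumption about the real class:

```python
from dataclasses import dataclass
from pathlib import Path


@dataclass
class ExtractConfig:  # stub standing in for the real ExtractConfig
    filepath: Path


# In real code this function would carry @flow(name="extract write"); the
# point is that the parameter is a plain string a deployment can supply.
def extract_write(filepath: str) -> ExtractConfig:
    config = ExtractConfig(filepath=Path(filepath))
    # ... load_sqlfile(config.filepath), run the query, write output ...
    return config


cfg = extract_write("src/extracts/weekly_date.sql")
print(cfg.filepath.name)
```

The `with tags(...)` wrapper can likewise move inside the flow body so the deployment entrypoint points straight at the flow function.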
  • d

    Deepanshu Aggarwal

    10/31/2022, 6:05 AM
    hi! I had one small doubt regarding the request body for running deployments via the OpenAPI: which fields are required when calling api/deployments/{id}/create_flow_run?
    {
      "state": {
        "type": "SCHEDULED",
        "name": "string",
        "message": "Run started",
        "data": "string",
        "state_details": {
          "flow_run_id": "3fa85f64-5717-4562-b3fc-2c963f66afa6",
          "task_run_id": "3fa85f64-5717-4562-b3fc-2c963f66afa6",
          "child_flow_run_id": "3fa85f64-5717-4562-b3fc-2c963f66afa6",
          "scheduled_time": "2022-10-31T06:00:35.234Z",
          "cache_key": "string",
          "cache_expiration": "2022-10-31T06:00:35.234Z",
          "untrackable_result": false
        },
        "timestamp": "2022-10-31T06:00:35.234Z",
        "id": "3fa85f64-5717-4562-b3fc-2c963f66afa6"
      },
      "name": "my-flow-run",
      "parameters": {},
      "context": {
        "my_var": "my_val"
      },
      "infrastructure_document_id": "3fa85f64-5717-4562-b3fc-2c963f66afa6",
      "empirical_policy": {
        "retries": 0,
        "retry_delay": 0
      },
      "tags": [
        "tag-1",
        "tag-2"
      ],
      "idempotency_key": "string",
      "parent_task_run_id": "3fa85f64-5717-4562-b3fc-2c963f66afa6"
    }
    ✅ 1
    1 reply · 2 participants
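For reference, the fields in that schema are optional and server defaults fill in the rest, so a body as small as this is typically enough (the parameter name is illustrative):

```json
{
  "name": "my-flow-run",
  "parameters": {"my_param": 1},
  "state": {"type": "SCHEDULED"}
}
```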
  • i

    iKeepo w

    10/31/2022, 6:24 AM
    hi, would it be possible to orchestrate flows together where some of them need a Windows environment and others need a WSL2 environment?
    ✅ 1
    1 reply · 2 participants
  • s

    Stephen Lloyd

    10/31/2022, 8:27 AM
    Trying to get set up in Prefect 2.0. The Create Organization function is not working for me. We’re on a paid plan.
    ✅ 1
    3 replies · 2 participants
  • a

    Alejandro

    10/31/2022, 10:52 AM
    Hello everyone! I have the following example code:
    import random
    from collections import namedtuple
    from datetime import date, datetime, time
    
    import pandas as pd
    from prefect import flow, task
    
    WeatherConditions = namedtuple(
        "WeatherConditions", ["wind_speed", "temperature", "rel_humidity"]
    )
    
    
    @task
    def register_current_weather() -> WeatherConditions:
        return WeatherConditions(
            wind_speed=random.weibullvariate(3, 1.5),
            temperature=random.uniform(-5, 25),
            rel_humidity=random.uniform(0, 100),
        )
    
    
    @task
    def upload_to_database(station_data: pd.DataFrame) -> None:
        print("Updating weather database with the following data:")
        print(station_data)
        print("Observations were successfully recorded")
    
    
    @flow
    def surface_station_daily_weather(station: str, freq: str = "H") -> pd.DataFrame:
        print(f"Daily weather observations for station {station.title()!r}")
        timestamps = pd.date_range(
            start=date.today(), end=datetime.combine(datetime.now(), time.max), freq=freq
        )
        observations = [register_current_weather() for _ in range(len(timestamps))]
        return pd.DataFrame(data=observations, index=timestamps)
    
    
    @flow
    def weather_app(station_names: list[str]) -> None:
        print("Welcome to the world's fastest weather data collection application!")
        for station in station_names:
            station_weather = surface_station_daily_weather(station=station, freq="3H")
            upload_to_database(station_data=station_weather)
        print(
            "Daily observations have been updated for all operational stations. See you soon!"
        )
    
    
    if __name__ == "__main__":
        STATIONS = [
            "bilbao_station",
            "oviedo_station",
            "salamanca_station",
            "badajoz_station",
        ]
        weather_app(station_names=STATIONS)
    I was wondering what the recommended way is to run subflows in parallel (not concurrently). In this case, the subflow surface_station_daily_weather is executed sequentially (as far as I know, there is no way to use the submit mechanism with a flow). Is it advisable to use the multiprocessing library for this purpose? Or is there any built-in functionality for it?
    15 replies · 6 participants
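One built-in route, sketched here without the prefect decorators so it runs standalone: make the subflow async and launch the calls with asyncio.gather. Note this gives concurrency in one process; for true parallelism across CPUs, separate processes are still needed (for example, one deployment per station triggered remotely):

```python
import asyncio


# In real code this coroutine would be an async @flow.
async def surface_station_daily_weather(station: str) -> str:
    await asyncio.sleep(0.01)  # placeholder for the real work
    return f"observations:{station}"


async def weather_app(stations: list[str]) -> list[str]:
    # Launch all subflow calls at once and wait for them together.
    return await asyncio.gather(
        *(surface_station_daily_weather(s) for s in stations)
    )


results = asyncio.run(weather_app(["bilbao_station", "oviedo_station"]))
print(results)
```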
  • s

    Stephen Herron

    10/31/2022, 12:33 PM
    hi do I need to be wary of this? Is there a way to update blocks that were created when the prefect version was updated?
    ✅ 1
    3 replies · 3 participants
  • a

    Adam

    10/31/2022, 12:50 PM
    Hi friends, what is the Prefect 2.0 equivalent of state handlers? I’m looking to build some integration into our flows to report metrics per flow and task to Datadog, and I need some way to hook into the flow and task lifecycle.
    2 replies · 2 participants
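One generic pattern that works regardless of prefect version (this is not a prefect API): wrap the flow function in a decorator that reports outcome and duration to a metrics client. `report_metric` is a stand-in for a real Datadog/statsd call:

```python
import time
from functools import wraps


def report_metric(name: str, value: float) -> None:
    # Stand-in for e.g. a datadog/statsd client call.
    print(f"{name}={value}")


def with_metrics(fn):
    @wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.monotonic()
        try:
            result = fn(*args, **kwargs)
            report_metric("flow.success", 1)
            return result
        except Exception:
            report_metric("flow.failure", 1)
            raise
        finally:
            report_metric("flow.duration_s", time.monotonic() - start)
    return wrapper


@with_metrics
def my_flow() -> str:  # would also carry @flow in real code
    return "ok"


print(my_flow())
```

The same wrapper can sit under a `@task` decorator for per-task metrics.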
  • v

    Vadym Dytyniak

    10/31/2022, 12:51 PM
    Hi. I see unexpected behaviour for my use case. If I deploy a flow with S3 storage, for example, and then deploy the flow again with no storage, the deployment still has the old storage. Is that expected?
    ✅ 1
    1 reply · 1 participant
  • s

    Stéphan Taljaard

    10/31/2022, 1:20 PM
    Hi. I'm porting my Prefect 1 tasks to 2. Some of my original tasks used multiple values from prefect.context, notably flow_name, flow_run_name, and flow_run_id. Is there a way to pass the entire flow run context to a task? Or is there a way to access the flow run context from within a task? If I uncomment the middle line in my flow function (see the thread), the flow run seems to become unresponsive (I guess because it's waiting for "this flow run"'s state, but the flow is still running...)
    ✅ 1
    7 replies · 2 participants
  • c

    Carlo

    10/31/2022, 2:22 PM
    We are using run_deployment with the SequentialTaskRunner. However, when the first in a chain failed, it didn't block the remaining run_deployment calls. In fact, they ran and the parent completed. How do I ensure the dependencies are honored? Flow definition in thread.
    ✅ 1
    5 replies · 2 participants
  • o

    Oscar Björhn

    10/31/2022, 3:11 PM
    I'm trying out the (relatively) new azure container instance blocks and I'm getting the following error when attempting to run a flow on my VM: KeyError: "No class found for dispatch key 'azure-container-instance-job' in registry for type 'Block'." Any ideas as to what the cause might be? I have prefect-azure installed on the VM. It seems to run fine on my local machine and I can't figure out what the difference is.
    14 replies · 2 participants
  • r

    Rahul Kadam

    10/31/2022, 3:19 PM
    Hi team, how do I identify the images corresponding to Prefect V2 on Docker Hub under prefecthq/server? The tags don't seem to be helpful in identifying them. Is there any documentation around using Docker Hub images correctly?
    ✅ 1
    17 replies · 3 participants
  • x

    Xavier Babu

    10/31/2022, 3:37 PM
    Prefect community, I need your help to better understand how the agent works in Prefect Orion. I am planning to run two agents on two different Linux servers, but I would like to start Prefect Orion on only one server, by deploying it in a Conda VM (virtual environment) where the 1st agent is running. Will that work? Or do I need to deploy Prefect Orion in a Conda VM on every server where I run an agent, and use that VM to run the jobs queued in the agent? Please shed some light. My understanding so far is that I have to install the Prefect Orion Python libs in a Conda VM and start the agent on every Linux server (one-to-one mapping). On the same server I can run multiple agents, but if I need to run an agent on another Linux server, I need to deploy and activate the Python VM. Thanks, Xavier Babu
    6 replies · 2 participants
  • t

    Tim Enders

    10/31/2022, 3:45 PM
    Getting a 404 on my Prefect 2.0 workspace? Did I miss something changing with workspaces?
    ✅ 1
    2 replies · 2 participants
  • t

    Tim Enders

    10/31/2022, 3:59 PM
    Another load/crash issue. I am getting these exceptions when running a local agent against a deployed flow in Prefect 2.0:
    Traceback (most recent call last):
      File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/prefect/engine.py", line 1334, in report_task_run_crashes
        yield
      File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/prefect/engine.py", line 1070, in begin_task_run
        connect_error = await client.api_healthcheck()
      File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/prefect/client/orion.py", line 204, in api_healthcheck
        await self._client.get("/health")
      File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/httpx/_client.py", line 1751, in get
        return await self.request(
      File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/httpx/_client.py", line 1527, in request
        return await self.send(request, auth=auth, follow_redirects=follow_redirects)
      File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/prefect/client/base.py", line 159, in send
        await super().send(*args, **kwargs)
      File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/httpx/_client.py", line 1614, in send
        response = await self._send_handling_auth(
      File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/httpx/_client.py", line 1642, in _send_handling_auth
        response = await self._send_handling_redirects(
      File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/httpx/_client.py", line 1679, in _send_handling_redirects
        response = await self._send_single_request(request)
      File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/httpx/_client.py", line 1716, in _send_single_request
        response = await transport.handle_async_request(request)
      File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/httpx/_transports/default.py", line 353, in handle_async_request
        resp = await self._pool.handle_async_request(req)
      File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/httpcore/_async/connection_pool.py", line 252, in handle_async_request
        await self.response_closed(status)
    asyncio.exceptions.CancelledError
    10:56:53.688 | ERROR   | Task run 'Get-Items-d8ed86f1-2473' - Crash detected! Execution was cancelled by the runtime environment.
    10:56:53.688 | DEBUG   | Task run 'Get-Items-d8ed86f1-2473' - Crash details:
    Traceback (most recent call last):
      File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/anyio/_core/_synchronization.py", line 314, in acquire
        self.acquire_nowait()
      File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/anyio/_core/_synchronization.py", line 342, in acquire_nowait
        raise WouldBlock
    anyio.WouldBlock
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/httpcore/_async/connection_pool.py", line 237, in handle_async_request
        response = await connection.handle_async_request(request)
      File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/httpcore/_async/connection.py", line 90, in handle_async_request
        return await self._connection.handle_async_request(request)
      File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/httpcore/_async/http2.py", line 96, in handle_async_request
        await self._max_streams_semaphore.acquire()
      File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/httpcore/_synchronization.py", line 46, in acquire
        await self._semaphore.acquire()
      File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/anyio/_core/_synchronization.py", line 319, in acquire
        await event.wait()
      File "/usr/lib/python3.10/asyncio/locks.py", line 214, in wait
        await fut
    asyncio.exceptions.CancelledError
    4 replies · 3 participants
  • n

    Nic

    10/31/2022, 4:28 PM
    when using prefect deployment build in the cli
    --params='{"question": "ultimate", "answer": 42}'
    Returns
    +- Error ---------------------------------------------------------------------+
    | Got unexpected extra arguments (ultimate, answer: 42}')                     |
    +-----------------------------------------------------------------------------+
    The --param flag is working, but I've tried many different combinations of --params without success. Could you provide a working example, or check whether anything has changed, since it's not working?
    ✅ 1
    9 replies · 2 participants

Jeff Hale

10/31/2022, 4:57 PM
That looks correct. Was it working earlier, Nic? What does prefect version show?

Nic

10/31/2022, 5:02 PM
I haven't tried it before, but currently, it's not working
Version:             2.6.4
API version:         0.8.2
Python version:      3.10.6
Git commit:          51e92dda
Built:               Thu, Oct 20, 2022 3:11 PM
OS/Arch:             win32/AMD64
Profile:             default
Server type:         cloud
:thank-you: 1

Jeff Hale

10/31/2022, 5:28 PM
I just tested with 2.6.5 and it works okay for me. Maybe it’s an issue with how the windows command line program deals with quotes: https://superuser.com/questions/324278/how-to-make-windows-command-prompt-treat-single-quote-as-though-it-is-a-double-q

Nic

10/31/2022, 5:51 PM
I've tried a few different quoting 'situations' and am always left with errors. I'll continue testing tomorrow, jot down everything that doesn't work, and report back.
Can't get it working with the standard Windows CMD or PowerShell. Are you running Windows too, or is somebody on the team running Windows and has it working?

Jeff Hale

10/31/2022, 8:47 PM
I don’t, but I think it’s a matter of escaping the quotes; here’s a guide for PowerShell. You mentioned earlier that the --param flag works. Does that work for you to input your params? Alternatively, the Python deployment method might be an option.

Nic

10/31/2022, 9:01 PM
I've set up a CI/CD pipeline that's integrated with a fair few systems at this point, so changing to the Python deploy would be an option, but one I'd like to avoid. We could use the --param flag, but I was hoping to have --params working so that inputting multiple parameters would be easier for the data science team that is going to use the pipeline. It might come to that, or to storing the deployments in .py files and running those instead of the CLI script through Azure DevOps.

Jeff Hale

10/31/2022, 9:17 PM
I found this section of the power-shell docs which goes more in depth with single and double quotes: https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_quoting_rules?view=powershell-7
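For reference, the usual fix on Windows shells is to double-quote the whole JSON value and escape the inner double quotes (backslash-escaped in cmd.exe; backtick-escaped or doubled in PowerShell). A sketch with echo standing in for the prefect CLI:

```shell
# echo stands in for: prefect deployment build ... --params=...
echo "{\"question\": \"ultimate\", \"answer\": 42}"
```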
👍 1