Darshan
over 3 years ago
Hello, in Orion - how do I handle logging from Python files which are not part of the flow code? I usually use loguru for my logging but am not able to make it work with Prefect logging seamlessly. I would like both Prefect logging and app logging to sink to the same target (file/console) with a consistent format. Are there any good examples available for this?
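One common pattern (a hedged sketch, not an official recipe) is to bridge loguru into the standard logging module and then let Prefect attach its own handlers to those loggers, e.g. via the PREFECT_LOGGING_EXTRA_LOGGERS setting; the module name used below is a placeholder.
```
# Sketch: forward loguru records into stdlib logging so Prefect's handlers
# (console/file/API) see the same records with Prefect's formatting.
import logging
from loguru import logger


class PropagateHandler(logging.Handler):
    """Re-emit loguru records through the standard logging module."""

    def emit(self, record: logging.LogRecord) -> None:
        logging.getLogger(record.name).handle(record)


# Replace loguru's default sink so messages are not printed twice.
logger.remove()
logger.add(PropagateHandler(), format="{message}")

# With e.g. PREFECT_LOGGING_EXTRA_LOGGERS="my_app" set in the environment
# (my_app is a placeholder for your package name), Prefect attaches its
# handlers to that stdlib logger, so loguru calls made from my_app end up
# in the same sinks and format as Prefect's own logs.
```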

Nikhil Joseph
over 1 year ago
@Marvin my self-hosted server is running out of space; how can I remove everything that is older than a week?
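There is no single CLI flag for this that I know of, but a hedged sketch of one approach is to page through old flow runs with the client and delete them; the filter and method names below match the Prefect 2.x client as I understand it, so verify against your installed version (and back up the database) first.
```
# Sketch: delete flow runs that started more than a week ago.
import asyncio
from datetime import datetime, timedelta, timezone

from prefect import get_client
from prefect.client.schemas.filters import FlowRunFilter, FlowRunFilterStartTime


async def prune_old_flow_runs(days: int = 7) -> None:
    cutoff = datetime.now(timezone.utc) - timedelta(days=days)
    async with get_client() as client:
        while True:
            old_runs = await client.read_flow_runs(
                flow_run_filter=FlowRunFilter(
                    start_time=FlowRunFilterStartTime(before_=cutoff)
                ),
                limit=200,  # work through the backlog in batches
            )
            if not old_runs:
                break
            for run in old_runs:
                await client.delete_flow_run(run.id)


if __name__ == "__main__":
    asyncio.run(prune_old_flow_runs())
```
Note that with the default SQLite backend, deleting rows does not shrink the database file until it is vacuumed.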

Kun Yin
almost 3 years ago
Hi! When I run the example code after successfully logging in to my Prefect Cloud (prefect version == 2.0b9), I get the error: prefect.exceptions.PrefectHTTPStatusError: Client error '422 Unprocessable Entity' for url 'https://api-beta.prefect.io/api/accounts/f68f7059-5eec-4e71-acc8-d7f351373e43/workspaces/ada6556d-d542-4378-9b90-b5647a4dd404/flow_runs/'. How can I solve this?

Petko
over 1 year ago
@Marvin Can I run Docker with GPUs on Prefect?
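As a hedged sketch of what GPU access requires at the container level (independent of Prefect): the container needs a GPU device request, the equivalent of docker run --gpus all, plus the NVIDIA container toolkit on the host. Whether your Prefect version's Docker infrastructure or worker exposes this directly varies, so customizing the work pool's base job template, or running the worker itself inside a GPU-enabled container, is the usual route. The image name below is hypothetical.
```
# Sketch: GPU device request via the Docker SDK (mirrors `docker run --gpus all`).
import docker

client = docker.from_env()
output = client.containers.run(
    "my-gpu-flow-image:latest",  # hypothetical image containing your flow code
    command="nvidia-smi",        # quick check that the GPU is visible
    device_requests=[
        docker.types.DeviceRequest(count=-1, capabilities=[["gpu"]])
    ],
    remove=True,
)
print(output.decode())
```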

Gemma
about 2 years ago
@Marvin I used a Terraform module to start my ECS worker and I'm getting these worker logs:
6/22/2023, 2:21:26 PM GMT+1	raise PrefectHTTPStatusError.from_httpx_error(exc) from exc.__cause__	dd7a2b4d0c864a29a45c7fb3be8e12e0	prefect-worker-dev
6/22/2023, 2:21:26 PM GMT+1	prefect.exceptions.PrefectHTTPStatusError: Client error '401 Unauthorized' for url '<https://api.prefect.cloud/api/accounts/df07584b-fc84-43cb-a012-4a733392bcb7/workspaces/aebc2199-ddcd-430f-b7f1-de9c172d3892/work_pools/my-ecs-pool>'	dd7a2b4d0c864a29a45c7fb3be8e12e0	prefect-worker-dev
6/22/2023, 2:21:26 PM GMT+1	Response: {'detail': 'Invalid authentication credentials'}	dd7a2b4d0c864a29a45c7fb3be8e12e0	prefect-worker-dev
6/22/2023, 2:21:26 PM GMT+1	For more information check: <https://httpstatuses.com/401>	dd7a2b4d0c864a29a45c7fb3be8e12e0	prefect-worker-dev
6/22/2023, 2:21:26 PM GMT+1	Worker 'ECSWorker b98f1928-6596-4a06-83f1-84188da6d966' started!	dd7a2b4d0c864a29a45c7fb3be8e12e0	prefect-worker-dev

Lana Dann
over 3 years ago
Is it possible to set a context variable at the flow level? For context, I want to set a flow owner on the flow and then use that value in my Slack state handler. But even when I set
with prefect.context(dict(flow_owner="test")):
before defining the flow, I still get an error:
Traceback (most recent call last):
  File "/Users/lanadann/.pyenv/versions/data-prefect-3.9.7/lib/python3.9/site-packages/prefect/engine/runner.py", line 161, in handle_state_change
    new_state = self.call_runner_target_handlers(old_state, new_state)
  File "/Users/lanadann/.pyenv/versions/data-prefect-3.9.7/lib/python3.9/site-packages/prefect/engine/task_runner.py", line 113, in call_runner_target_handlers
    new_state = handler(self.task, old_state, new_state) or new_state
  File "/Users/lanadann/prefect/data_prefect/lib/notifiers.py", line 13, in post_to_slack_on_failure
    f"@{prefect.context.flow_owner} "
AttributeError: 'Context' object has no attribute 'flow_owner'
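This is Prefect 1.x context behavior; a hedged sketch of one workaround is to read the key defensively with Context.get() and to enter the context manager around the run itself, since context set while defining the flow is not persisted into the run. The handler and task below are illustrative stand-ins.
```
# Sketch for Prefect 1.x: set the key around flow.run() and read it with
# .get() so a missing key does not raise AttributeError.
import prefect
from prefect import Flow, task


def post_to_slack_on_failure(task, old_state, new_state):
    owner = prefect.context.get("flow_owner", "unknown")  # safe lookup
    if new_state.is_failed():
        print(f"@{owner} task {task.name} failed")  # post to Slack here instead
    return new_state


@task(state_handlers=[post_to_slack_on_failure])
def boom():
    raise ValueError("fail on purpose")


with Flow("owned-flow") as flow:
    boom()

# The context key is only set while this block is running the flow.
with prefect.context(flow_owner="test"):
    flow.run()
```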

Gabriel Lespérance
11 months ago
@Marvin what's the difference between a task and a flow — can I just use flows?
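A minimal sketch of the distinction: a flow is the unit you deploy, schedule, and observe, while tasks are smaller units inside a flow that get their own retries, caching, and concurrent submission via .submit(); you can nest flows (subflows), but using flows alone gives up that task-level granularity.
```
from prefect import flow, task


@task(retries=2)
def add(x: int, y: int) -> int:
    # tasks get per-call features: retries, caching, .submit() for concurrency
    return x + y


@flow
def double(x: int) -> int:
    # a flow can be called from another flow (a subflow) or deployed on its own
    return add(x, x)


@flow
def parent() -> int:
    a = add(1, 2)   # task call
    b = double(a)   # subflow call
    return b


if __name__ == "__main__":
    print(parent())  # -> 6
```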

Young Ho Shin
almost 3 years ago
Hello all. I am running into various sqlalchemy errors when running a test flow with many tasks (>10000) locally. Here's the code I'm running: https://gist.github.com/yhshin11/1832bc945446a62c5c6152abb9c1a0a5 It seems like the problem is that too many tasks are trying to write to the Orion database at the same time. I tried switching to a Postgres database as described in the [docs](https://docs.prefect.io/concepts/database/) and also adding a concurrency limit of 10. Neither seems to fix the issue. Any ideas about how to fix this? Here's an example of the kind of errors I'm getting:
sqlalchemy.exc.TimeoutError: QueuePool limit of size 5 overflow 10 reached, connection timed out, timeout 30.00 (Background on this error at: <https://sqlalche.me/e/14/3o7r>)
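One hedged mitigation (not an official fix) is to bound how many task runs are in flight, and therefore writing to the database, at once by submitting in batches; if you stay on Postgres, the connection-pool settings (e.g. PREFECT_SQLALCHEMY_POOL_SIZE and PREFECT_SQLALCHEMY_MAX_OVERFLOW, availability depends on your Prefect version) are another lever.
```
# Sketch: submit tasks in bounded batches and resolve each batch before
# starting the next, so only `batch_size` task runs hit the DB at a time.
from prefect import flow, task


@task
def work(i: int) -> int:
    return i * 2


@flow
def batched_flow(n: int = 10_000, batch_size: int = 200) -> int:
    total = 0
    for start in range(0, n, batch_size):
        futures = [work.submit(i) for i in range(start, min(start + batch_size, n))]
        total += sum(f.result() for f in futures)
    return total
```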

Christoph Deil
over 3 years ago
How do I create and run flows on a schedule in Prefect Orion? Let’s say I have this:
schedule = IntervalSchedule(interval=datetime.timedelta(seconds=10))
deployment_spec = DeploymentSpec(name="hola", flow=greetings_flow, schedule=schedule)
Do I now use OrionClient and some methods to deploy? We currently use Prefect core in a pod and simply do flow.run() with a schedule attached, and I’m looking for a working example to do the equivalent in Orion (even if I gather behind the scenes it will do something else via a server and DB). Basically I’m looking for this: https://orion-docs.prefect.io/concepts/deployments/#running-deployments-with-the-api 🙂
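The DeploymentSpec API from those early Orion builds was later replaced; as a hedged sketch, in more recent Prefect 2.x/3.x releases the closest single-process equivalent of "flow.run() with a schedule attached" is flow.serve(), which creates the deployment and keeps executing its scheduled runs in the current process. The flow body below is a stand-in for greetings_flow.
```
# Sketch using the later .serve() API (Prefect 2.10+); not available in the
# Orion beta the question refers to.
from prefect import flow


@flow
def greetings_flow():
    print("hola")


if __name__ == "__main__":
    # Creates a deployment named "hola" and runs it every 10 seconds,
    # blocking in this process much like flow.run() with a schedule did.
    greetings_flow.serve(name="hola", interval=10)
```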

Ponraj
11 months ago
@Marvin How do I clean the Prefect SQLite DB data?
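Assuming this refers to the default SQLite database of a self-hosted Prefect 2.x server: a hedged sketch is to stop the server and remove (or reset) the database file at ~/.prefect/prefect.db, the default PREFECT_API_DATABASE_CONNECTION_URL location; recent versions also ship a prefect server database reset command. This deletes all run history.
```
# Sketch: remove the default SQLite database file; Prefect recreates an
# empty one the next time the server starts. Assumes the default location.
from pathlib import Path

db_path = Path.home() / ".prefect" / "prefect.db"
if db_path.exists():
    db_path.unlink()
    print(f"Deleted {db_path}")
else:
    print(f"No database found at {db_path}")
```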

Prefect Community

Bring your towel and join one of the fastest growing data communities. Welcome to our second-generation open source orchestration platform, a completely rethought approach to dataflow automation.
