https://prefect.io

Alex Papanicolaou

over 1 year ago
I'm having trouble with prefect logging with a loguru logger I've already set up.

Young Ho Shin

about 3 years ago
Hello all. I am running into various `sqlalchemy` errors when running a test flow with many tasks (>10000) locally. Here's the code I'm running: https://gist.github.com/yhshin11/1832bc945446a62c5c6152abb9c1a0a5 It seems the problem is that too many tasks are trying to write to the Orion database at the same time. I tried switching to a Postgres database as described in the [docs](https://docs.prefect.io/concepts/database/), and also adding a concurrency limit of 10. Neither fixes the issue. Any ideas about how to fix this? Here's an example of the kind of errors I'm getting:
sqlalchemy.exc.TimeoutError: QueuePool limit of size 5 overflow 10 reached, connection timed out, timeout 30.00 (Background on this error at: https://sqlalche.me/e/14/3o7r)
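Not a Prefect-specific fix, but the failure mode above (thousands of tasks each grabbing a pooled connection) can often be avoided by bounding how many writers run at once; a minimal stdlib sketch of that pattern, with `write_record` standing in for a real task:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def write_record(i):
    # Placeholder for a task that writes one row to the database.
    return i * 2

# Cap simultaneous writers at 10 so a pool of size 5 + overflow 10
# (the limit in the error above) is never exhausted.
results = []
with ThreadPoolExecutor(max_workers=10) as pool:
    futures = [pool.submit(write_record, i) for i in range(100)]
    for fut in as_completed(futures):
        results.append(fut.result())

print(len(results))  # 100
```

The same idea applies whether the bound comes from an executor, a Prefect tag-based concurrency limit, or batching the mapped tasks into chunks.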

kwmiebach

about 3 years ago
Hello 🙂 I believe I ran into a threading problem yesterday. I am converting some data extraction pipelines to prefect 2 flows, which works fine for most of them. I just add the decorators and some logging. But one of the data pipelines uses sqlite for intermediate storage, and this is the error message I receive:
sqlite3.ProgrammingError: SQLite objects created in a thread can only be used in that same thread. The object was created in thread id 140193261803264 and this is thread id 140192735962880.
I can also paste the part of the code where sqlite is called. I am also trying to guess the reason behind the error. There is a sqlite object created in a function. Within that function I define another function which uses this object. Neither python function is a prefect flow or task; they live within a bigger prefect flow. So here comes my first question: does prefect 2 create a different thread for each nested function inside a flow or a task? Otherwise I cannot explain why the two parts of the code would run in different threads.
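On the sqlite side at least, the usual workaround for the error above is to open a fresh connection inside whatever thread does the work, rather than sharing one connection object across threads; a minimal sketch (the file path and table are illustrative, not from the pipeline in question):

```python
import os
import sqlite3
import tempfile
import threading

db_path = os.path.join(tempfile.mkdtemp(), "pipeline.db")

# Create the table from the main thread, then let the connection go:
# no sqlite3 object should cross a thread boundary.
with sqlite3.connect(db_path) as conn:
    conn.execute("CREATE TABLE items (value INTEGER)")

def worker(value):
    # Each thread opens its own connection, sidestepping the
    # "created in a thread can only be used in that same thread" error.
    with sqlite3.connect(db_path) as conn:
        conn.execute("INSERT INTO items VALUES (?)", (value,))

threads = [threading.Thread(target=worker, args=(v,)) for v in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()

with sqlite3.connect(db_path) as conn:
    count = conn.execute("SELECT COUNT(*) FROM items").fetchone()[0]
print(count)  # 5
```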

Joe

about 1 year ago
Hey folks, I quite like the Prometheus Exporter, but I want to be able to emit fine-grained details from some flow & task internals (at customer request). Does anyone have any ideas on what a "good place" to put a prometheus-client would be in an agent so flows could increment counters, etc.? Or is writing that out to a db & querying similar to how the Exporter functions likely to be the cleanest solution?
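Not an answer on agent placement, but for the counter side, prometheus_client's per-run registry plus the Pushgateway pattern tends to fit short-lived flow runs better than a long-lived scrape endpoint; a hedged sketch (the metric name is made up, and the `push_to_gateway` call is left as a comment so this runs offline):

```python
from prometheus_client import CollectorRegistry, Counter, generate_latest

# A per-run registry keeps flow metrics separate from any long-lived
# process metrics; push_to_gateway(gateway, job, registry=registry)
# could ship it to a Pushgateway when the flow finishes.
registry = CollectorRegistry()
rows_processed = Counter(
    "flow_rows_processed",
    "Rows processed by the flow",
    registry=registry,
)

rows_processed.inc(42)  # incremented from inside a task

exposition = generate_latest(registry).decode()
print(exposition)
```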

Dave D

about 1 year ago
@Marvin why am I getting this error?
from prefect.futures import wait
ImportError: cannot import name 'wait' from 'prefect.futures' (/code/.venv/lib/python3.10/site-packages/prefect/futures.py)
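A hedged guess at the cause: `prefect.futures.wait` only ships with newer Prefect releases, so an older install raises exactly this ImportError. One way to bridge it is a minimal fallback (the stand-in below is illustrative, not Prefect's implementation):

```python
try:
    from prefect.futures import wait  # present in newer Prefect releases
except ImportError:
    # Older Prefect: no `wait` helper. A minimal stand-in that just
    # blocks on each future's result and hands the futures back.
    def wait(futures, timeout=None):
        for f in futures:
            f.result()
        return futures
```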

Kol

about 2 years ago
Hi all, I'm trying to serve the prefect ui behind a reverse proxy (traefik) running in k8s, at a subpath: dev.get-clarity.io/prefect/. However I don't seem to be able to configure the ui-settings; inspecting the webpage, I can see that requests to grab ui-settings are not being forwarded to the correct prefect service. They go to
dev.get-clarity.io/ui-settings
rather than
dev.get-clarity.io/prefect/ui-settings
Can anyone offer advice on configuring prefect server to enable access to the UI at a subpath?
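For what it's worth, Prefect 2's server exposes settings aimed at exactly this; a hedged sketch (setting names are from Prefect 2.x, the values are assumptions for the Traefik setup above):

```shell
# Serve the UI under /prefect/ and point it at the proxied API.
# Values below are assumptions for the dev.get-clarity.io setup.
export PREFECT_UI_SERVE_BASE="/prefect/"
export PREFECT_UI_API_URL="https://dev.get-clarity.io/prefect/api"
prefect server start
```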

Lana Dann

over 3 years ago
Is it possible to set a context variable on the flow level? For context, I want to set a flow owner on the flow and then use that value in my slack state handler. But even when I set
with prefect.context(dict(flow_owner="test")):
before defining the flow, I still get an error:
Traceback (most recent call last):
  File "/Users/lanadann/.pyenv/versions/data-prefect-3.9.7/lib/python3.9/site-packages/prefect/engine/runner.py", line 161, in handle_state_change
    new_state = self.call_runner_target_handlers(old_state, new_state)
  File "/Users/lanadann/.pyenv/versions/data-prefect-3.9.7/lib/python3.9/site-packages/prefect/engine/task_runner.py", line 113, in call_runner_target_handlers
    new_state = handler(self.task, old_state, new_state) or new_state
  File "/Users/lanadann/prefect/data_prefect/lib/notifiers.py", line 13, in post_to_slack_on_failure
    f"@{prefect.context.flow_owner} "
AttributeError: 'Context' object has no attribute 'flow_owner'
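Prefect 1's `prefect.context` aside, the general pattern here (set a value at flow scope, read it inside a handler without threading it through arguments) can be sketched with the stdlib's contextvars; the names mirror the traceback above but this is not Prefect API:

```python
import contextvars

# Module-level context variable with a safe default.
flow_owner = contextvars.ContextVar("flow_owner", default="unknown")

def post_to_slack_on_failure():
    # The handler reads the variable without it being passed in.
    return f"@{flow_owner.get()} flow failed"

token = flow_owner.set("test")
try:
    message = post_to_slack_on_failure()
finally:
    flow_owner.reset(token)

print(message)  # @test flow failed
```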

Daniel Burkhardt

over 3 years ago
Hi all, what's the best practice around using RDS postgres database authentication for Prefect Agents on AWS? Do folks use IAM database authentication or Postgres Native Authentication?
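For the IAM route, the relevant call is the RDS client's generate_db_auth_token, which mints a short-lived token used in place of a static password; a sketch with the client injected so the call is visible (the host/user/region values are placeholders, not a recommendation between the two auth modes):

```python
def build_db_password(rds_client, host, port, user, region):
    # IAM database authentication: a short-lived (~15 min) token
    # replaces the static Postgres password for the agent's connection.
    return rds_client.generate_db_auth_token(
        DBHostname=host, Port=port, DBUsername=user, Region=region
    )

# In real use:
#   import boto3
#   token = build_db_password(boto3.client("rds"),
#                             "mydb.abc123.us-east-1.rds.amazonaws.com",
#                             5432, "prefect_agent", "us-east-1")
```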

Martin Treacy-Schwartz

11 months ago
@Marvin describe the prefect architecture

Marco Ruta

over 1 year ago
@Marvin how can I add an install-requirements step in the flow.deploy() function? I don't want to use yaml config files!
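One yaml-free route, assuming a work pool backed by the official prefecthq/prefect images (which install anything listed in the EXTRA_PIP_PACKAGES env var at container start), is to pass the packages through job_variables; the flow, pool name, and pins below are hypothetical:

```python
# EXTRA_PIP_PACKAGES is honored by the prefecthq/prefect Docker images;
# everything else here (flow, pool name, version pins) is illustrative.
job_variables = {"env": {"EXTRA_PIP_PACKAGES": "pandas==2.2.2 requests"}}

# my_flow.deploy(
#     name="etl",
#     work_pool_name="docker-pool",
#     job_variables=job_variables,
# )
print(job_variables["env"]["EXTRA_PIP_PACKAGES"])
```

Installing at container start adds latency to every run, so baking a custom image is usually preferred once the dependency list stabilizes.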

Prefect Community

Bring your towel and join one of the fastest growing data communities. Welcome to our second-generation open source orchestration platform, a completely rethought approach to dataflow automation.
