Kind of new to Prefect, but from my experience working with orchestration tools (like Prefect, Airflow), they should be used more to orchestrate and less to transform data (especially when data starts to get big). You could leverage Prefect + Spark, for example, as an alternative.
For your use case specifically, what you can also try to do is persist the data in an intermediate storage layer (like S3, GCS) using parquet, and instead of passing the whole dataframe between tasks, pass the file path to the next task.