I’m evaluating Prefect along with Metaflow and Kubeflow Pipelines for 1) general ETL / analytics workflows and 2) machine learning training + deployment workflows. So far, I’m most excited about Prefect.

What I like about Prefect: its native support for multiple execution environments (local, dask-local, dask-distributed, dask-kubernetes, …) makes it accessible to anyone doing analytics at any scale in my company, and its checkpointing feature makes debugging easier.

What I like about Metaflow: you can specify dependencies and resources at the task level (i.e., "run this task on AWS Batch with 2 GPUs"), and you can seamlessly inspect run artifacts in a Jupyter notebook. (A rough sketch of what I mean is below.)

What I like about Kubeflow Pipelines (which we currently run in production): tasks can be authored, shared, and composed into new pipelines, and its UI supports arbitrary, rich output from each task.

Does the Prefect Core / Cloud roadmap envision supporting any of these features?
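For concreteness, here is roughly the Metaflow pattern I mean by "task-level resources"; this is a sketch from memory, and the exact decorator names and keyword arguments (`@batch`, `gpu`, `memory`) may differ slightly between Metaflow versions:

```python
from metaflow import FlowSpec, step, batch


class TrainFlow(FlowSpec):
    """Toy flow: only the train step is shipped to AWS Batch with GPUs."""

    @step
    def start(self):
        self.data = list(range(10))
        self.next(self.train)

    # Task-level resource spec: this single step runs on AWS Batch with 2 GPUs,
    # while the rest of the flow runs wherever it was launched from.
    @batch(gpu=2, memory=16000)
    @step
    def train(self):
        self.model = sum(self.data)  # stand-in for real training work
        self.next(self.end)

    @step
    def end(self):
        print("model:", self.model)


if __name__ == "__main__":
    TrainFlow()
```

And the "inspect artifacts in a notebook" part is just the client API, e.g.:

```python
from metaflow import Flow

# Pull the artifacts from the most recent successful run and poke at them.
run = Flow("TrainFlow").latest_successful_run
print(run.data.model)
```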