# ask-community
m
Hi, I am getting the following error on Prefect Cloud: "No heartbeat detected from the remote task; marking the run as failed." I tried adding [cloud] heartbeat_mode = "thread" in my backend.toml file, and this line, flow.run_config = UniversalRun(env={"PREFECT__CLOUD__HEARTBEAT_MODE": "thread"}), in my code, but I'm still getting the same error. Any help with what I am missing, please?
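For reference, a minimal sketch of both ways to set the heartbeat mode, assuming Prefect 1.x (the flow name below is hypothetical). Note that Prefect 1.x reads configuration settings from ~/.prefect/config.toml, not backend.toml, which may be why the file-based setting was not picked up:
# Minimal sketch (Prefect 1.x assumed): switch flow heartbeats from a
# subprocess to a thread.
#
# Option 1: in ~/.prefect/config.toml (not backend.toml):
#   [cloud]
#   heartbeat_mode = "thread"
#
# Option 2: per flow, via the run config's environment:
from prefect import Flow
from prefect.run_configs import UniversalRun

with Flow("my-flow") as flow:  # "my-flow" is a hypothetical name
    ...  # tasks go here

flow.run_config = UniversalRun(
    env={"PREFECT__CLOUD__HEARTBEAT_MODE": "thread"}
)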
a
You would need to provide more information. This error happens with long-running jobs and when you run out of memory.
1. Did you check the logs to see whether one of those reasons may be true?
2. What is your agent?
3. Do you run it on Cloud or Server?
4. What is your task doing? Is this a Kubernetes job?
m
I am running it on Cloud and I'm using a local agent. Checking the logs, I can see this error: "Process PID 406553 returned non-zero exit code 2!" So maybe it's not related to running out of memory as I thought.
a
I see. This could be related to the non-zero exit code: it looks like your flow run couldn't finish successfully for some reason, and since the process died, the flow's heartbeat was lost. Can you investigate why the process died? That way you can tackle the root cause of the problem.
m
The process is failing while running the following task:
import argparse

from prefect import task

# s3_download is a project helper defined elsewhere; the elided S3 URIs and
# AWS profile are kept as in the original.

@task(nout=3)
def load_data_paths():
    parser = argparse.ArgumentParser()
    # Flag names should not contain "=", e.g. "--data_dir", not "--data_dir=".
    parser.add_argument(
        "-d", "--data_dir", type=str, dest="data", help="raw data directory"
    )
    parser.add_argument(
        "-l", "--labels_dir", type=str, dest="labels", help="labels directory"
    )
    parser.add_argument(
        "-att",
        "--attributes_dir",
        dest="attributes",
        type=str,
        help="attributes directory",
    )

    args = parser.parse_args()
    print(args)
    data_dir = None

    if data_dir is None:
        data_dir = s3_download(
            s3_uri="s3://.../",
            cache_path="../data/images/",
            aws_profile="...",
        )
        labels_dir = s3_download(
            s3_uri="s3://.../",
            cache_path="../data/labels/",
            aws_profile="...",
        )
        attributes_dir = s3_download(
            s3_uri="s3://.../",
            cache_path="../data/attributes/",
            aws_profile="...",
        )

        return data_dir, attributes_dir, labels_dir
So I think it's because of the command-line arguments in the function; it's not able to recognize them, I guess. How do I proceed with this in Prefect?
I had the same problem/error while running the script in my local Jupyter notebook, but it worked after saving the script as a .py file and running it with !python file.py in the notebook; the flow was then able to run.
a
Did someone else from your team write it? You can get rid of the ArgumentParser, since you don't need it when running your task. If you're not familiar with boto3, you can use e.g. this task. I know for sure that another user had argparse in their flow and this was the issue for them.
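As a minimal sketch of the argparse-free approach, assuming Prefect 1.x: pass the locations in as Prefect Parameters instead of parsing sys.argv inside the task. The flow and parameter names here are hypothetical, s3_download is the helper from the task above, and the elided URIs and profile are kept as-is:
from prefect import Flow, Parameter, task

@task(nout=3)
def load_data_paths(data_uri, labels_uri, attributes_uri):
    # s3_download is the project helper used in the original task.
    data_dir = s3_download(s3_uri=data_uri, cache_path="../data/images/", aws_profile="...")
    labels_dir = s3_download(s3_uri=labels_uri, cache_path="../data/labels/", aws_profile="...")
    attributes_dir = s3_download(s3_uri=attributes_uri, cache_path="../data/attributes/", aws_profile="...")
    return data_dir, attributes_dir, labels_dir

with Flow("load-data") as flow:  # hypothetical flow name
    data_uri = Parameter("data_uri", default="s3://.../")
    labels_uri = Parameter("labels_uri", default="s3://.../")
    attributes_uri = Parameter("attributes_uri", default="s3://.../")
    data_dir, attributes_dir, labels_dir = load_data_paths(data_uri, labels_uri, attributes_uri)
Parameter values can then be overridden per flow run from the Cloud UI or the CLI, which covers what the command-line flags were doing.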
m
Thanks, once I removed the ArgumentParser the flow ran perfectly. I was just wondering, is there an option to keep the ArgumentParser and still avoid the error of the process failing with Prefect Cloud? If I may ask, why did the ArgumentParser not work while running my task? Was there some type of conflict?
a
It's just that ArgumentParser is for CLI programs, i.e. when you want to run your script from a terminal; it's not intended to be used inside a task like that. When the agent launches your flow run, sys.argv holds the arguments of the process running the flow, not your flags, so parse_args() most likely sees arguments it doesn't recognize, prints a usage error, and calls sys.exit(2). That matches the "non-zero exit code 2" in your logs, and once the process dies, the heartbeat is lost.
👍 1
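If you did want to keep the parser, a minimal sketch of the two standard argparse workarounds: parse an explicit argument list, or use parse_known_args so unrecognized arguments are ignored instead of triggering sys.exit(2):
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("-d", "--data_dir", type=str, dest="data", help="raw data directory")

# Option 1: parse an explicit list instead of sys.argv. An empty list
# means "use the defaults", so the usage error can never fire.
args = parser.parse_args([])

# Option 2: keep whatever is recognized and ignore the rest instead of
# exiting with status 2 on unknown arguments.
args, unknown = parser.parse_known_args()
That said, Prefect Parameters (as sketched above) are the more natural replacement inside a flow.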