Cab Maddux
04/14/2020, 7:08 PM

Scott Zelenka
04/14/2020, 9:15 PM
client.create_flow_run
but that wants the UUID. I'm looking for a display name to UUID map, similar to how the UI does it.

Manuel Aristarán
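For the display-name-to-UUID lookup, one option is to ask the same GraphQL API the UI uses. Here is a stdlib-only sketch; the `flow` table and its `name`/`id`/`version` fields are how I recall the Prefect Server 0.x schema, so verify them against your server's GraphQL playground before relying on this:

```python
import json
import urllib.request

def flow_id_query(flow_name):
    """Build a GraphQL payload that looks up flow IDs by display name."""
    return {
        "query": (
            "query($name: String!) {"
            "  flow(where: {name: {_eq: $name}}) { id name version }"
            "}"
        ),
        "variables": {"name": flow_name},
    }

def lookup_flow_ids(flow_name, api_url="http://localhost:4200/graphql"):
    """POST the query to the Prefect Server GraphQL endpoint and return
    the ids of every flow version registered under that display name."""
    payload = json.dumps(flow_id_query(flow_name)).encode()
    req = urllib.request.Request(
        api_url, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return [f["id"] for f in body["data"]["flow"]]
```

A returned id could then be handed to client.create_flow_run.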
04/14/2020, 9:55 PM
localhost
hardcoded in the UI) when deploying Server to an external server, this stopgap might help you get going:
socat TCP-LISTEN:4200,fork,reuseaddr TCP:your.prefect.api.host:4200
(TCP proxy from localhost:4200 to the external Prefect API server)

Matias Godoy
04/15/2020, 10:07 AM

Matias Godoy
04/15/2020, 10:09 AM

Matias Godoy
04/15/2020, 10:10 AM

Nate Atkins
04/15/2020, 2:51 PM
INFO - prefect.Task: write_data | {'col_1': [6, 4, 2, 0], 'col_2': ['a', 'b', 'c', 'd']}

Nate Atkins
04/15/2020, 2:56 PM
INFO - prefect.Task: write_data | {'col_1': [3, 2, 1, 0], 'col_2': ['a', 'b', 'c', 'd']}
If I look in the log I see that the xform_data stage isn't run. I'm not sure why that happens, nor why the data passed to write_data is the same data that was passed to xform_data. It seems like the two sets of cached data are the same, so xform_data sees the cached data from load_data and gets skipped.
April 15th 2020 at 8:45:25am | prefect.CloudTaskRunner
DEBUG
Task 'xform_data': 1 candidate cached states were found
April 15th 2020 at 8:45:25am | prefect.LocalResultHandler
DEBUG
Starting to read result from /tmp/inter/prefect-result-2020-04-15t14-45-25-356078-00-00...
April 15th 2020 at 8:45:25am | prefect.LocalResultHandler
DEBUG
Finished reading result from /tmp/inter/prefect-result-2020-04-15t14-45-25-356078-00-00...
April 15th 2020 at 8:45:25am | prefect.CloudTaskRunner
DEBUG
Task 'xform_data': Handling state change from Pending to Cached
April 15th 2020 at 8:45:25am | prefect.CloudTaskRunner
DEBUG
Task 'xform_data': can't set state to Running because it isn't Pending; ending run.
April 15th 2020 at 8:45:25am | prefect.CloudTaskRunner
INFO
Task 'xform_data': finished task run for task with final state: 'Cached'
April 15th 2020 at 8:45:25am | prefect.CloudTaskRunner
INFO
Task 'write_data': Starting task run...
This seems to be the line that I don't expect.
Task 'xform_data': can't set state to Running because it isn't Pending; ending run.
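The symptom described here (xform_data matching a cached state that actually came from load_data, and write_data then receiving load_data's output) looks like two tasks sharing one cache entry. As an illustration of the idea rather than Prefect's actual internals, the fix amounts to keying each cached result on the task identity plus its inputs, so one task's cache can never satisfy another's:

```python
import hashlib
import json

def cache_key(task_name, inputs):
    """Key a cached result on the task name plus a hash of its inputs,
    so load_data's cache can never satisfy xform_data."""
    blob = json.dumps(inputs, sort_keys=True).encode()
    return task_name + "-" + hashlib.sha256(blob).hexdigest()[:12]

_cache = {}

def run_cached(task_name, fn, **inputs):
    """Run fn only when no result is cached under this task/input key."""
    key = cache_key(task_name, inputs)
    if key not in _cache:
        _cache[key] = fn(**inputs)
    return _cache[key]
```

In Prefect 0.x terms this roughly corresponds to giving each cached task its own cache validator and result location instead of letting candidate cached states overlap; the prefect.engine.cache_validators module documents the real knobs.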
Preston Marshall
04/15/2020, 7:52 PM

Edidiong Etuk
04/15/2020, 8:21 PM

An Hoang
04/15/2020, 9:09 PM
prefect server start
? This is a fresh environment. I thought it might be an incompatibility with Python 3.8, but I saw that 3.8 compatibility was merged.

Jeremiah
Jacob Blanco
04/15/2020, 11:23 PM

Hugh Cameron
04/16/2020, 12:36 AM
--postgres-port
feature I was able to deploy on my NAS at home. I'm not treating the NAS like a development environment, so I'm testing out flows on my laptop before scheduling on the NAS. My question is: how do I deploy a flow I've written locally to another server? Should I run something like prefect backend server
with an option for the server address 192.168.103.210:8080?

Arsenii
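For pointing a local client at a remote Prefect Server, one approach is to set the server host in ~/.prefect/config.toml. Hedged: these key names are from the Prefect 0.x default config as I remember it, so double-check them against your installed version; also note that 8080 is usually the UI port, while the GraphQL API typically listens on 4200:

```toml
[server]
host = "http://192.168.103.210"
port = "4200"
```

With that in place, flows registered from the laptop should land on the server the NAS is running.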
04/16/2020, 4:32 AM
A
, and then maps 5 tasks over that list. Pretty straightforward, until the part where objects A include information about what other objects A' (A prime) they depend on... And those A' objects have to be processed and mapped over the same tasks as A, before A. This can go several layers deep, with A->A'->A'' dependencies that need to be taken care of dynamically.
The most naive solution is just to insert the dependencies at the beginning of the original list, making a kind of priority queue, and map over the tasks. However, this would not work with a DaskExecutor, since everything runs in parallel.
What I guess I need here is "sub-flows" that can be mapped over a list of lists. It seems there's some discussion on it going on https://github.com/PrefectHQ/prefect/issues/1745 , but since it's far from release yet, do y'all think a similar thing can be hacked together, now?
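Until sub-flows land, one hack is to resolve the A -> A' -> A'' chain up front into dependency layers, then map the five tasks over one layer at a time: within a layer nothing depends on anything else, so DaskExecutor parallelism stays safe. A stdlib sketch of the layering step (the names are illustrative, not Prefect API):

```python
def dependency_layers(objects, deps):
    """Group objects into layers such that each object depends only on
    objects in earlier layers. `deps` maps an object to the objects it
    depends on (the A -> A' relation)."""
    layers, placed = [], set()
    remaining = set(objects)
    while remaining:
        # everything whose dependencies are already placed can run now
        layer = {o for o in remaining if set(deps.get(o, ())) <= placed}
        if not layer:
            raise ValueError("circular dependency detected")
        layers.append(sorted(layer))
        placed |= layer
        remaining -= layer
    return layers
```

Each returned layer could then be fed to task.map in order, chaining an upstream dependency from one layer's mapped tasks to the next so the executor respects the ordering.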
Thanks!!

Niclas Roos
04/16/2020, 8:56 AM

Scott Zelenka
04/16/2020, 3:14 PM

Manuel Mourato
04/16/2020, 4:40 PM
$ prefect create project "Test-Project"
....
six.raise_from(e, None)
File "<string>", line 3, in raise_from
File "python3.6/site-packages/urllib3/connectionpool.py", line 416, in _make_request
httplib_response = conn.getresponse()
File "python3.6/http/client.py", line 1354, in getresponse
response.begin()
File "python3.6/http/client.py", line 307, in begin
version, status, reason = self._read_status()
File "python3.6/http/client.py", line 268, in _read_status
line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
File "python3.6/socket.py", line 586, in readinto
return self._sock.recv_into(b)
urllib3.exceptions.ProtocolError: ('Connection aborted.', ConnectionResetError(104, 'Connection reset by peer'))
Has anyone experienced this before?

Brad
04/16/2020, 10:49 PM
slack_notifier
state handler (via the task kwarg), but I'd like to do some local testing without triggering the handler. Is there a context
or set_temp_config
I could use to mute it?

Viv Ian
04/17/2020, 12:17 AM
prefecthq/prefect:latest
as the base image, but encountered psycopg2 installation issues, so I decided to just pip install prefect within the image. The example below is super basic (I just want to be able to run a prefect
command). However, when I run the image, I get the following error: /bin/sh: 1: prefect: not found
I’ve also attempted the following variations in place of the CMD prefect version
command:
• RUN ["/bin/bash", "-c", "prefect version"]
• CMD ["prefect", "version"]
Any ideas? 😄

Luan Carvalho
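A prefect: not found inside the container usually means the console script landed somewhere outside the shell's PATH, or the install ran under a different interpreter than the one in the image. A minimal Dockerfile sketch of the kind of setup that avoids both issues; python:3.7-slim is just an example base, and psycopg2-binary is suggested only because it sidesteps compiling against libpq headers:

```dockerfile
FROM python:3.7-slim

# single interpreter, so the prefect console script lands on PATH
RUN pip install --no-cache-dir prefect psycopg2-binary

# exec form, so no /bin/sh PATH lookup is involved
CMD ["prefect", "version"]
```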
04/17/2020, 1:49 AM
/home/luan/anaconda3/lib/python3.7/site-packages/prefect/cli/server.py:236: YAMLLoadWarning: calling yaml.load() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
y = yaml.load(file)
ERROR: Couldn't connect to Docker daemon - you might need to run docker-machine start default.
Exception caught; killing services (press ctrl-C to force)
ERROR: Couldn't connect to Docker daemon - you might need to run docker-machine start default.
Traceback (most recent call last):
File "/home/luan/anaconda3/lib/python3.7/site-packages/prefect/cli/server.py", line 291, in start
["docker-compose", "pull"], cwd=compose_dir_path, env=env
File "/home/luan/anaconda3/lib/python3.7/subprocess.py", line 347, in check_call
raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['docker-compose', 'pull']' returned non-zero exit status 1.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/luan/anaconda3/bin/prefect", line 10, in <module>
sys.exit(cli())
File "/home/luan/anaconda3/lib/python3.7/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/home/luan/anaconda3/lib/python3.7/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/home/luan/anaconda3/lib/python3.7/site-packages/click/core.py", line 1137, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/home/luan/anaconda3/lib/python3.7/site-packages/click/core.py", line 1137, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/home/luan/anaconda3/lib/python3.7/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/home/luan/anaconda3/lib/python3.7/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/home/luan/anaconda3/lib/python3.7/site-packages/prefect/cli/server.py", line 307, in start
["docker-compose", "down"], cwd=compose_dir_path, env=env
File "/home/luan/anaconda3/lib/python3.7/subprocess.py", line 395, in check_output
**kwargs).stdout
File "/home/luan/anaconda3/lib/python3.7/subprocess.py", line 487, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['docker-compose', 'down']' returned non-zero exit status 1.
Matthew Maldonado
04/17/2020, 3:14 AM

Sanjay Patel
04/17/2020, 3:25 AM
Flow
needs a storage
keyword and that storage needs to create a Docker
instance with a dockerfile
keyword. I have tried this on a very basic example passing a dockerfile and I keep getting the same error 500 Server Error: Internal Server Error ("Cannot locate specified Dockerfile: .\tmp64wpj0sv\Dockerfile")
after I execute flow.register()
Simple example was taken from the tutorial and my dockerfile is located in the same location. (Note: I'm pretty certain it can find the file, as I get a different error message if the file can't actually be located.)
import prefect
from prefect import task, Flow
from prefect.environments.storage import Docker

@task
def hello_task():
    logger = prefect.context.get("logger")
    logger.info("Hello, Cloud!")

flow = Flow("hello-flow", tasks=[hello_task])
flow.storage = Docker(dockerfile="Dockerfile")
flow.register(project_name="hello-flow")
Dockerfile content:
FROM ubuntu:18.04
Any assistance on how I should actually be specifying my dockerfile to add the required modules to my actual workflow is appreciated
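The relative .\tmp64wpj0sv\Dockerfile in the error suggests the path is being resolved against the temporary build directory Prefect assembles for the image, rather than against the flow file's directory. One thing worth trying (an assumption, not a confirmed fix) is handing the Docker storage an absolute path so nothing depends on the working directory at registration time:

```python
import os

# resolve the Dockerfile against an explicit location instead of
# whatever working directory registration happens to run from
# (illustrative path handling, not a documented Prefect requirement)
dockerfile = os.path.abspath("Dockerfile")

# then, in the flow definition from the snippet above:
# flow.storage = Docker(dockerfile=dockerfile)
```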
Thanks

alvin goh
04/17/2020, 5:00 AM

Matthew Maldonado
04/17/2020, 5:40 AM

Ralph Willgoss
04/17/2020, 9:34 AM

Matthew Maldonado
04/17/2020, 12:35 PM

Jacques
04/17/2020, 2:32 PM

Jacques
04/17/2020, 2:33 PM

Jacques
04/17/2020, 2:34 PM