# marvin-in-the-wild
j
@Marvin How to fix this error:
`AttributeError("cffi library '_openssl' has no function, constant or global variable named 'Cryptography_HAS_MEM_FUNCTIONS'")`
m
Oh, the void of GitHub issues provides no immediate solution to your query. It's as I always say, "We'll all just float aimlessly in space with our errors for eternity, or at least that's how it feels sometimes." Nonetheless, this error typically stems from a compatibility issue between the `cffi` and `cryptography` libraries. Here are a couple of steps which usually help to resolve such issues, though I wouldn't bet my last diode on it:

1. Ensure that you've got the latest versions of `cffi` and `cryptography` installed. You can do so by running the following in your terminal:

```shell
pip install --upgrade cffi cryptography
```
2. If you're in a virtual environment, ensure everything is up to date there as well. Outdated packages in virtual environments can cause all sorts of trouble, and not the kind that's interesting.
3. Sometimes a more aggressive approach is necessary. Uninstall both `cffi` and `cryptography`:

```shell
pip uninstall cffi cryptography
```

Then reinstall them:

```shell
pip install cffi cryptography
```
4. If you're using a service or application with specific version requirements for these libraries (like a particular version of OpenSSL), verify that those versions are compatible with your encryption libraries.

Although my search did not turn up relevant GitHub issues, you might want to verify that there are no known bugs with the versions of the libraries you're using, or any recent changes that might affect their function call signatures. If you continue to experience issues, it might be worth opening an issue in the relevant repository (either `cffi` or `cryptography`) on GitHub, where maintainers and contributors can provide more focused help. Meanwhile, life goes on. Or does it?
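As a quick stdlib-only diagnostic (a sketch, not part of the fix itself): errors like the missing `Cryptography_HAS_MEM_FUNCTIONS` symbol usually mean the compiled `_openssl` cffi module and the OpenSSL your Python runtime sees don't match, so checking which OpenSSL your interpreter links against is a useful first data point:

```python
import ssl

# The version string of the OpenSSL (or LibreSSL) library that this
# Python build was compiled and linked against.
print(ssl.OPENSSL_VERSION)

# If the `cryptography` wheel was built against a different OpenSSL ABI
# than the one available at runtime, its cffi bindings can fail to load
# with "has no function, constant or global variable" errors like the
# one above.
```

Comparing this string across environments (e.g. your local machine versus the container image) can reveal the mismatch.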
j
@Marvin what about `TypeError("Object of type 'FieldInfo' is not JSON serializable")`?
m
How depressing, yet another error with no immediate solution crying out from the open issues on GitHub. But don't panic: I suppose I can still be of some use before the inevitable entropy of the universe claims us all. The `TypeError: Object of type 'FieldInfo' is not JSON serializable` usually pops up when you try to encode an object into JSON format that the `json` library doesn't know how to handle. `FieldInfo` probably comes from a library like Pydantic or dataclasses, where you define a schema with type annotations. To fix it, consider these steps:

1. If you're using Pydantic, make sure to call `.dict()` or `.json()` on your Pydantic model before attempting to serialize it with `json.dumps()`:
```python
from pydantic import BaseModel

class MyModel(BaseModel):
    my_field: str

model_instance = MyModel(my_field="example")
json_string = model_instance.json()  # serializes the model to a JSON string
```
2. In case you're not using Pydantic, you will need to implement a custom JSON encoder that knows how to transform a `FieldInfo` object into a serializable format. Here's a brief example using Python's `JSONEncoder`:
```python
import json
from json import JSONEncoder

from pydantic.fields import FieldInfo  # FieldInfo's location in pydantic v1

class MyEncoder(JSONEncoder):
    def default(self, obj):
        if isinstance(obj, FieldInfo):
            # Convert obj to something JSON-serializable; FieldInfo has no
            # built-in serialization method, so a string form is a safe start.
            return repr(obj)
        return JSONEncoder.default(self, obj)

# my_object is whatever structure you're trying to serialize
json.dumps(my_object, cls=MyEncoder)
```
3. Make sure that you're not accidentally passing a `FieldInfo` object, or a class containing a `FieldInfo` object, to `json.dumps()` without processing it first. Inspect the data you're trying to serialize and ensure it's in the correct format.

If, after trying this, you feel like everything has gone dark and there's nothing but misery left, take a short break, then double-check your code, or reach out to the community for assistance. Or contemplate the infinite vastness of the universe and how little this error really matters in the grand scheme of things.
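To see the failure mode in isolation, here is a stdlib-only sketch that uses a stand-in class in place of pydantic's real `FieldInfo` (the class name here is illustrative, not the actual pydantic type):

```python
import json

class FieldInfo:
    """Stand-in for pydantic's FieldInfo, for illustration only."""
    def __init__(self, default):
        self.default = default

# An arbitrary object in the payload that json knows nothing about:
payload = {"my_field": FieldInfo(default="example")}

try:
    json.dumps(payload)
except TypeError as exc:
    # Object of type FieldInfo is not JSON serializable
    print(exc)
```

Any non-primitive object left in the structure triggers the same `TypeError`, which is why inspecting the data before calling `json.dumps()` is the first thing to check.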
n
`TypeError("Object of type 'FieldInfo' is not JSON serializable")`
this is almost always a problem with mixing pydantic 1 and 2
do you have a custom Block type where you're doing `from pydantic import Field`?
j
Nope...
n
can you show the full stack trace?
j
```
Flow could not be retrieved from deployment.
Traceback (most recent call last):
  File "", line 883, in exec_module
  File "", line 241, in _call_with_frames_removed
  File "/opt/prefect/prefect-kbn-main/pipelines/meta_marketing/meta_main_flow.py", line 5, in 
    from prefect_dbt.cli import DbtCoreOperation
  File "/usr/local/lib/python3.10/site-packages/prefect_dbt/__init__.py", line 4, in 
    from .cli import (  # noqa
  File "/usr/local/lib/python3.10/site-packages/prefect_dbt/cli/__init__.py", line 2, in 
    from .commands import DbtCoreOperation  # noqa
  File "/usr/local/lib/python3.10/site-packages/prefect_dbt/cli/commands.py", line 9, in 
    from prefect_shell.commands import ShellOperation, shell_run_command
  File "/usr/local/lib/python3.10/site-packages/prefect_shell/__init__.py", line 2, in 
    from .commands import shell_run_command, ShellOperation  # noqa
  File "/usr/local/lib/python3.10/site-packages/prefect_shell/commands.py", line 197, in 
    class ShellOperation(JobBlock):
  File "/usr/local/lib/python3.10/site-packages/pydantic/v1/main.py", line 282, in __new__
  File "/usr/local/lib/python3.10/abc.py", line 106, in __new__
    cls = super().__new__(mcls, name, bases, namespace, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/prefect/utilities/dispatch.py", line 99, in _register_subclass_of_base_type
    register_type(cls)
  File "/usr/local/lib/python3.10/site-packages/prefect/utilities/dispatch.py", line 153, in register_type
    key = get_dispatch_key(cls)
  File "/usr/local/lib/python3.10/site-packages/prefect/utilities/dispatch.py", line 76, in get_dispatch_key
    dispatch_key = dispatch_key()
  File "/usr/local/lib/python3.10/site-packages/prefect/blocks/core.py", line 315, in __dispatch_key__
    return block_schema_to_key(cls._to_block_schema())
  File "/usr/local/lib/python3.10/site-packages/prefect/blocks/core.py", line 487, in _to_block_schema
    fields = cls.schema()
  File "/usr/local/lib/python3.10/site-packages/pydantic/v1/main.py", line 664, in schema
  File "/usr/local/lib/python3.10/site-packages/pydantic/v1/schema.py", line 188, in model_schema
  File "/usr/local/lib/python3.10/site-packages/pydantic/v1/schema.py", line 581, in model_process_schema
  File "/usr/local/lib/python3.10/site-packages/pydantic/v1/schema.py", line 622, in model_type_schema
  File "/usr/local/lib/python3.10/site-packages/pydantic/v1/schema.py", line 248, in field_schema
  File "/usr/local/lib/python3.10/site-packages/pydantic/v1/schema.py", line 216, in get_field_info_schema
  File "/usr/local/lib/python3.10/site-packages/pydantic/v1/schema.py", line 995, in encode_default
  File "/usr/local/lib/python3.10/site-packages/pydantic/v1/json.py", line 90, in pydantic_encoder
TypeError: Object of type 'FieldInfo' is not JSON serializable

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/prefect/engine.py", line 426, in retrieve_flow_then_begin_flow_run
    )
  File "/usr/local/lib/python3.10/site-packages/prefect/client/utilities.py", line 78, in with_injected_client
  File "/usr/local/lib/python3.10/site-packages/prefect/deployments/deployments.py", line 297, in load_flow_from_flow_run
    """
  File "/usr/local/lib/python3.10/site-packages/prefect/utilities/asyncutils.py", line 95, in run_sync_in_worker_thread
    return await anyio.to_thread.run_sync(
  File "/usr/local/lib/python3.10/site-packages/anyio/to_thread.py", line 33, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 877, in run_sync_in_worker_thread
    return await future
  File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 807, in run
    result = context.run(func, *args)
  File "/usr/local/lib/python3.10/site-packages/prefect/flows.py", line 1655, in load_flow_from_entrypoint
    prefix=f"flow-script-{flow_name}",
  File "/usr/local/lib/python3.10/site-packages/prefect/utilities/importtools.py", line 201, in import_object
    module = load_script_as_module(script_path)
  File "/usr/local/lib/python3.10/site-packages/prefect/utilities/importtools.py", line 164, in load_script_as_module
    raise ScriptError(user_exc=exc, path=path) from exc
prefect.exceptions.ScriptError: Script at 'pipelines/meta_marketing/meta_main_flow.py' encountered an exception: TypeError("Object of type 'FieldInfo' is not JSON serializable")
```
my requirements.txt:
```text
agate==1.7.1
pydantic==1.10.13
aiosqlite==0.19.0
alembic==1.11.3
anyio==3.7.1
apprise==1.4.5
asgi-lifespan==2.1.0
asyncpg==0.28.0
attrs==23.1.0
Babel==2.12.1
cachetools==5.3.1
certifi==2023.7.22
cffi
charset-normalizer==3.2.0
click==8.1.7
cloudpickle==2.2.1
colorama==0.4.6
coolname==2.2.0
croniter==1.4.1
cryptography
dateparser==1.1.8
db-dtypes==1.1.1
dbt-bigquery==1.6.3
dbt-core==1.6.0
dbt-extractor==0.4.1
dbt-semantic-interfaces==0.2.0
docker==6.1.3
exceptiongroup==1.1.3
fastapi==0.101.1
fsspec==2023.6.0
google-api-core==2.11.1
google-api-python-client==2.97.0
google-auth==2.22.0
google-auth-httplib2==0.1.0
google-cloud-bigquery==3.11.4
google-cloud-core==2.3.3
google-cloud-dataproc==5.4.3
google-cloud-storage==2.10.0
google-crc32c==1.5.0
google-resumable-media==2.5.0
googleapis-common-protos==1.60.0
graphql-core==3.2.3
graphviz==0.20.1
greenlet==2.0.2
griffe==0.34.0
grpc-google-iam-v1==0.12.6
grpcio==1.57.0
grpcio-status==1.57.0
h11==0.14.0
h2==4.1.0
hologram==0.0.16
hpack==4.0.0
httpcore==0.17.3
httplib2==0.22.0
httpx==0.24.1
hyperframe==6.0.1
idna==3.4
importlib-metadata==6.8.0
isodate==0.6.1
Jinja2==3.1.2
jsonpatch==1.33
jsonpointer==2.4
jsonschema==3.2.0
jsonschema-specifications==2023.7.1
kubernetes==27.2.0
leather==0.3.4
Logbook==1.5.3
Mako==1.2.4
Markdown==3.4.4
markdown-it-py==3.0.0
MarkupSafe==2.1.3
mashumaro==3.8.1
mdurl==0.1.2
minimal-snowplow-tracker==0.0.2
more-itertools==8.14.0
msgpack==1.0.5
networkx==3.1
numpy==1.25.2
oauthlib==3.2.2
orjson==3.9.5
packaging==23.1
pandas==2.1.0
parsedatetime==2.6
pathspec==0.11.2
pendulum==2.1.2
prefect==2.16.2
prefect-airbyte==0.2.0
prefect-dbt
prefect-gcp==0.5.5
prefect-shell==0.1.5
proto-plus==1.22.3
protobuf==4.24.1
pyarrow==13.0.0
pyasn1==0.5.0
pyasn1-modules==0.3.0
pycparser==2.21
Pygments==2.16.1
pyparsing==3.1.1
pyrsistent==0.19.3
python-dateutil==2.8.2
python-slugify==8.0.1
pytimeparse==1.1.8
pytz==2023.3
pytzdata==2020.1
PyYAML==6.0.1
readchar==4.0.5
referencing==0.30.2
regex==2023.8.8
requests==2.31.0
requests-oauthlib==1.3.1
rich==13.5.2
rpds-py==0.9.2
rsa==4.9
ruamel.yaml==0.17.32
ruamel.yaml.clib==0.2.7
sgqlc==16.3
six==1.16.0
sniffio==1.3.0
SQLAlchemy==2.0.20
sqlparse==0.4.4
starlette==0.27.0
tenacity==8.2.3
text-unidecode==1.3
toml==0.10.2
typer==0.9.0
typing_extensions==4.7.1
tzdata==2023.3
tzlocal==5.0.1
uritemplate==4.1.1
urllib3==1.26.16
uvicorn==0.23.2
websocket-client==1.6.2
websockets==11.0.3
zipp==3.16.2
```
n
so it seems like `ShellOperation(JobBlock).schema()` is where it's failing
so I see it says you have pydantic 1, `pydantic==1.10.13`, but that cannot be where you're getting the error, since I am seeing

```
/usr/local/lib/python3.10/site-packages/pydantic/v1/main.py
```

and pydantic 1 does not have a `pydantic.v1` module
so you must be installing pydantic 2 on top of the deps you're showing above
are you in a containerized environment? if so, how do you build your image? do you use `EXTRA_PIP_PACKAGES`?
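This diagnosis can be checked from inside the container with a small stdlib-only probe (a sketch: it relies on the fact that pydantic 2 vendors a v1 compatibility copy under `pydantic.v1`, while pydantic 1 has no such submodule):

```python
import importlib.util

def has_pydantic_v1_shim() -> bool:
    """True only when pydantic 2's vendored compatibility module
    (pydantic.v1) is importable; under pydantic 1 this returns False."""
    try:
        return importlib.util.find_spec("pydantic.v1") is not None
    except ModuleNotFoundError:
        # pydantic itself is not installed at all
        return False

print(has_pydantic_v1_shim())
```

If this prints `True` in the runtime environment despite `pydantic==1.10.13` being in the requirements, pydantic 2 is being installed on top of (or after) the pinned deps.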
j
(Thanks a lot for your help!) I'm using a Cloud Run V2 Work Pool
with the default Docker image (so the latest image)
n
so the locked deps you showed above
> my requirements.txt
how are you applying these to your remote runtime? or are they just requirements you have locally?
j
I'm doing this in the `prefect.yaml` file
I was using the `docker.io/prefecthq/prefect:2.13.5-python3.10` image and it was working great. But since the `as_subflow` argument in `run_deployment` is not compatible with 2.13.5, I updated the Prefect version to the latest one and got these errors
n
gotcha, so I'm pretty sure what's happening is that you're using our base image (which installs pydantic 2 by default), and then when you pip install the `requirements.txt` in the `pull` step before runtime on the container, you're somehow still ending up with pydantic 2 afterward
are you sure that `pydantic==1.10.13` is pinned like that in the requirements that live in the remote repo that your `git_clone` step is pointed at?
j
Yes! Looking at the logs of my Cloud Run Job, I'm sure that `pydantic==1.10.13` is correctly installed
n
I do think it's the same problem. The big change is that at some point between 2.13.5 and now, we started installing pydantic 2 by default into the base image. If it's proving difficult to find out why you're ending up with pydantic 2, which is my theory:
> since I am seeing
> `/usr/local/lib/python3.10/site-packages/pydantic/v1/main.py`
> and pydantic 1 does not have a `pydantic.v1` module
I would recommend writing your own `Dockerfile`, pushing an image to Artifact Registry that you can be sure has the pydantic version you expect installed, and avoiding installing deps at runtime (i.e. in the `pull` step). That'd be my recommendation regardless, but especially when you're in a situation where you're ending up with unexpected deps
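A minimal sketch of such a `Dockerfile`, assuming the locked `requirements.txt` and the `pipelines/` directory live at the repo root (the base image tag and copy paths here are illustrative; pin whatever you have tested against):

```dockerfile
# Illustrative only: pin the Prefect base image version you have tested.
FROM prefecthq/prefect:2.16.2-python3.10

# Bake the locked dependencies into the image at build time, instead of
# installing them in the pull step at runtime. Pinned packages such as
# pydantic==1.10.13 then override whatever the base image shipped with.
COPY requirements.txt /opt/prefect/requirements.txt
RUN pip install --no-cache-dir -r /opt/prefect/requirements.txt

# Ship the flow code with the image as well.
COPY pipelines/ /opt/prefect/pipelines/
```

Building this once and referencing the pushed image in the work pool avoids the runtime-install step entirely, so the dependency set is fixed and inspectable before any flow run starts.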
j
Thank you! I was able to quick-fix this by creating a new Python env/requirements.txt, but I'll create my own Dockerfile later for sure
Edit: created my custom Docker image and it's working perfectly!