core.custom.custom_json
=======================

.. py:module:: core.custom.custom_json

.. autoapi-nested-parse::

   Sigh. Here we go again, *another* json implementation with support for:

   - date
   - datetime
   - time

   Because nobody else does all of these. And if they do (like standardjson),
   they don't support decoding...


Attributes
----------

.. autoapisummary::

   core.custom.custom_json.AnySerializer
   core.custom.custom_json._T
   core.custom.custom_json._ST
   core.custom.custom_json.default_serializers
   core.custom.custom_json.DEFAULT_DUMPS_OPTIONS


Classes
-------

.. autoapisummary::

   core.custom.custom_json.Serializer
   core.custom.custom_json.PrefixSerializer
   core.custom.custom_json.DictionarySerializer
   core.custom.custom_json.Serializers
   core.custom.custom_json.Serializable


Functions
---------

.. autoapisummary::

   core.custom.custom_json.dumps
   core.custom.custom_json.dumps_bytes
   core.custom.custom_json.loads
   core.custom.custom_json.dump
   core.custom.custom_json.dump_bytes
   core.custom.custom_json.load


Module Contents
---------------

.. py:type:: AnySerializer
   :canonical: 'PrefixSerializer[_T] | DictionarySerializer[_T]'

.. py:data:: _T

.. py:data:: _ST

.. py:class:: Serializer(target: type[_T])

   Bases: :py:obj:`Generic`\ [\ :py:obj:`_T`\ , :py:obj:`_ST`\ ]

   Provides a way to encode all objects of a given class or its subclasses
   to and from json.

   .. py:attribute:: target

   .. py:method:: encode(obj: _T) -> _ST
      :abstractmethod:

   .. py:method:: decode(value: _ST) -> _T
      :abstractmethod:

.. py:class:: PrefixSerializer(target: type[_T], prefix: str, encode: collections.abc.Callable[[_T], str], decode: collections.abc.Callable[[str], _T])

   Bases: :py:obj:`Serializer`\ [\ :py:obj:`_T`\ , :py:obj:`str`\ ]

   Serializes objects to a string with a prefix.

   Resulting json values take the form of ``__prefix__@<value>``, where
   ``<value>`` is the encoded value and ``__prefix__@`` is the prefix that
   is used to differentiate between normal strings and encoded strings.
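   The prefix mechanism can be sketched with plain standard-library code.
   Note that the names ``PREFIX``, ``encode`` and ``decode`` below are
   illustrative only and not the module's actual API::

       from datetime import datetime

       # Hypothetical prefix for datetime values, following the
       # '__{}__@{}' pattern used by prefix_format.
       PREFIX = '__datetime__@'

       def encode(dt):
           # Prefix the ISO representation so it can be told apart
           # from ordinary strings in the resulting json.
           return PREFIX + dt.isoformat()

       def decode(string):
           # Parse, never eval: the payload after the prefix is
           # user-supplied and therefore untrusted.
           return datetime.fromisoformat(string[len(PREFIX):])

       original = datetime(2024, 5, 1, 12, 30)
       assert decode(encode(original)) == original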
   Note that the part after the prefix is user-supplied and possibly
   unsafe. So something like an 'eval' should be out of the question!

   .. py:attribute:: prefix_format
      :value: '__{}__@{}'

   .. py:attribute:: prefix_expression

   .. py:attribute:: prefix_characters

   .. py:attribute:: prefix

   .. py:attribute:: prefix_length
      :value: 0

   .. py:attribute:: _encode

   .. py:attribute:: _decode

   .. py:method:: encode(obj: _T) -> str

   .. py:method:: decode(string: str) -> _T

.. py:class:: DictionarySerializer(target: type[_T], keys: collections.abc.Iterable[str])

   Bases: :py:obj:`Serializer`\ [\ :py:obj:`_T`\ , :py:obj:`onegov.core.types.JSONObject_ro`\ ]

   Serialises objects that can be built with keyword arguments.

   For example::

       class Point:
           def __init__(self, x, y):
               self.x = x
               self.y = y

   Can be serialised using::

       DictionarySerializer(Point, ('x', 'y'))

   Which results in something like this in JSON::

       {'x': 1, 'y': 2}

   As the internal ``__dict__`` representation is of no concern,
   ``__slots__`` may be used::

       class Point:
           __slots__ = ('x', 'y')

           def __init__(self, x, y):
               self.x = x
               self.y = y

   .. py:attribute:: keys

   .. py:method:: encode(obj: _T) -> onegov.core.types.JSONObject_ro

   .. py:method:: decode(dictionary: onegov.core.types.JSONObject_ro) -> _T

.. py:class:: Serializers

   Organises the different serializer implementations under a unifying
   interface. This allows the actual encoder/decoder to call a single class
   without having to worry how the various serializers need to be looked up
   and called.

   .. py:attribute:: by_prefix
      :type: dict[str, PrefixSerializer[Any]]

   .. py:attribute:: by_keys
      :type: dict[frozenset[str], DictionarySerializer[Any]]

   .. py:attribute:: known_key_lengths
      :type: set[int]

   .. py:property:: registered
      :type: collections.abc.Iterator[AnySerializer[Any]]

   .. py:method:: register(serializer: PrefixSerializer[Any] | DictionarySerializer[Any]) -> None

   .. py:method:: serializer_for(value: object) -> AnySerializer[Any] | None

   .. py:method:: serializer_for_string(string: str) -> PrefixSerializer[Any] | None

   .. py:method:: serializer_for_dict(dictionary: dict[str, Any]) -> DictionarySerializer[Any] | None

   .. py:method:: serializer_for_class(cls: type[_T]) -> AnySerializer[_T] | None

   .. py:method:: encode(value: object) -> onegov.core.types.JSON_ro

   .. py:method:: decode(value: Any) -> Any

.. py:data:: default_serializers

.. py:class:: Serializable

   Classes inheriting from this base are serialised using the
   :class:`DictionarySerializer` class. The keys that should be used need
   to be specified as follows::

       class Point(Serializable, keys=('x', 'y')):
           def __init__(self, x, y):
               self.x = x
               self.y = y

   .. py:attribute:: serialized_keys
      :type: ClassVar[collections.abc.Collection[str]]

   .. py:method:: serializers() -> Serializers
      :classmethod:

   .. py:method:: __init_subclass__(keys: collections.abc.Collection[str], **kwargs: Any)
      :classmethod:

.. py:data:: DEFAULT_DUMPS_OPTIONS

.. py:function:: dumps(obj: None, *, ensure_ascii: bool = False, sort_keys: bool = False, indent: Literal[2] | None = None) -> None
                 dumps(obj: Any, *, ensure_ascii: bool = False, sort_keys: bool = False, indent: Literal[2] | None = None) -> str

.. py:function:: dumps_bytes(obj: Any, *, sort_keys: bool = False, indent: Literal[2] | None = None) -> bytes

.. py:function:: loads(txt: str | bytes | bytearray | memoryview | None) -> Any

.. py:function:: dump(data: Any, fp: _typeshed.SupportsWrite[str], *, sort_keys: bool = False, indent: Literal[2] | None = None) -> None

.. py:function:: dump_bytes(data: Any, fp: _typeshed.SupportsWrite[bytes], *, sort_keys: bool = False, indent: Literal[2] | None = None) -> None

.. py:function:: load(fp: _typeshed.SupportsRead[str | bytes]) -> Any
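
The round trip these functions provide can be roughly illustrated with only
the standard library. This is a sketch of the prefix idea layered on the
stdlib ``json`` module; the ``__date__@`` prefix and the helper names are
assumptions, not the module's actual behaviour or options::

    import json
    from datetime import date

    def default(obj):
        # Called by json.dumps for objects it cannot serialise itself.
        if isinstance(obj, date):
            return '__date__@' + obj.isoformat()
        raise TypeError(f'{type(obj).__name__} is not JSON serializable')

    def revive(value):
        # Walk the decoded structure and restore prefixed strings.
        if isinstance(value, str) and value.startswith('__date__@'):
            return date.fromisoformat(value[len('__date__@'):])
        if isinstance(value, list):
            return [revive(v) for v in value]
        if isinstance(value, dict):
            return {k: revive(v) for k, v in value.items()}
        return value

    def dumps(obj):
        return json.dumps(obj, default=default)

    def loads(txt):
        return revive(json.loads(txt))

    data = {'deadline': date(2024, 3, 1), 'note': 'hello'}
    assert loads(dumps(data)) == data

The recursive walk is needed because the stdlib's ``object_hook`` only fires
for objects, not for strings nested in lists or at the top level.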