This repository was archived by the owner on Mar 24, 2025. It is now read-only.
Hey,
I have problems binding to the functions schema_of_xml, schema_of_xml_array, and from_xml_string in https://github.com/databricks/spark-xml/blob/ef3af6aa5b29763dbfe72cb23c7755d2bfe4d5a7/src/main/scala/com/databricks/spark/xml/package.scala from pyspark (yes, Python). The py4j gateway complains because these functions are defined directly on the package object.
I do not have this problem with the function from_xml inside the object https://github.com/databricks/spark-xml/blob/ef3af6aa5b29763dbfe72cb23c7755d2bfe4d5a7/src/main/scala/com/databricks/spark/xml/functions.scala. It is understood by py4j without a problem and can be used by just providing the correct py4j wrapper code around it.
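For comparison, a wrapper along the following lines works for from_xml, since it lives on a plain Scala object that py4j can resolve by its dotted path. This is a minimal sketch, assuming the spark-xml JAR is on the classpath; the name `ext_from_xml` is my own, and the Scala signature assumed is `from_xml(e: Column, schema: StructType, options: Map[String, String])`:

```python
def ext_from_xml(spark, xml_column, schema, options=None):
    """Sketch of a py4j wrapper around
    com.databricks.spark.xml.functions.from_xml.

    Assumptions: spark-xml is on the classpath; `xml_column` is a pyspark
    Column of XML strings; `schema` is a pyspark StructType.
    """
    # Imports are deferred so this sketch can be defined without pyspark installed.
    from pyspark.sql.column import Column, _to_java_column

    jvm = spark._jvm
    java_column = _to_java_column(xml_column)
    # Convert the Python StructType to its JVM counterpart via its JSON form.
    java_schema = spark._jsparkSession.parseDataType(schema.json())
    scala_options = jvm.org.apache.spark.api.python.PythonUtils.toScalaMap(options or {})
    jc = jvm.com.databricks.spark.xml.functions.from_xml(
        java_column, java_schema, scala_options)
    return Column(jc)
```

The package-level functions would presumably be callable the same way if they were forwarded through the functions object, which is the ask below.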
I would like to ask for wrappers around the functions from the first file to be added to the functions object (the second link), so that they can be used without too much hassle in pyspark.
Thx
G