
Bridging Complexities: Migrating Hive UDFs to BigQuery Remote Functions - Part 1

Published on
February 20, 2024
Author
Csaba Kassai
Head of Data

In the realm of data warehousing and analytics, the transition from traditional platforms like Apache Hive to modern cloud solutions such as Google BigQuery often presents unique challenges. In my previous posts, I showed through practical examples how Hive UDFs, UDAFs, and UDTFs can be migrated to BigQuery using SQL. In some cases, however, SQL falls short and we need to look for alternatives. One such case is migrating complex User-Defined Functions (UDFs) that cannot be directly translated into BigQuery SQL. This post delves into the nuances of migrating a Hive UDF to a BigQuery Remote Function, providing a solution when native BigQuery options or SQL UDFs cannot deliver what is required.

BigQuery Remote Functions

BigQuery Remote Functions represent an innovative leap in data processing, enabling the integration of sophisticated external computations into SQL queries. This feature is especially pivotal for scenarios where the complexity of data operations surpasses the intrinsic functions available in BigQuery. By facilitating HTTP calls to Cloud Functions or Cloud Run from within SQL queries, Remote Functions offer a versatile solution for embedding intricate logic, such as custom analytics, machine learning model predictions, or dynamic data transformations, directly into data analysis workflows.

The utility of Remote Functions extends significantly to the realm of migrating Hive UDFs to BigQuery, particularly for those UDFs encapsulating complex logic that cannot be directly translated into BigQuery's SQL dialect. This bridge between BigQuery and external computational logic ensures that even the most complex Hive UDFs can find a new home within BigQuery's ecosystem, preserving the integrity and functionality of existing data pipelines during migration.

However, the advantages come with trade-offs. Utilizing external services can introduce latency, affecting query performance. Additionally, the overhead of managing these external endpoints and ensuring their security can complicate infrastructure management. Also, the costs associated with external service calls and potential increases in query processing times necessitate careful planning and optimization. Despite these considerations, when deployed thoughtfully, BigQuery Remote Functions can significantly smooth the transition from Hive to BigQuery, offering a robust pathway for migrating complex UDFs without sacrificing their functionality.

Example: XML Processing

For demonstration, let’s use a Hive UDF designed to parse XML documents, extract specific pieces of information, and transform this data into a structured format. Handling XML documents is not part of BigQuery’s native functionality, and it is quite cumbersome and sometimes impossible to implement XML processing in SQL.

Here is a simple Hive UDF example that is capable of extracting some information from an XML document:

This UDF, XmlProductInfoExtractorUDF, uses Java's XML parsing libraries to navigate and extract data from a nested XML structure based on XPath expressions.
You can find the full Java class here. (Disclaimer: I did not test this Hive function; it is for demonstration purposes only.)
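Stripped of the Hive wrapper, the core extraction routine of such a UDF might look like the sketch below, using only the JDK’s built-in XPath APIs. (The XML element names and the output format are illustrative assumptions, not the actual contents of the linked class.)

```java
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;

public class ProductInfoExtractor {

    // Extracts the product name and price from an XML document via XPath
    // and returns them as a single "name|price" string.
    public static String extract(String xml) {
        try {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
            XPath xpath = XPathFactory.newInstance().newXPath();
            String name  = xpath.evaluate("/product/name", doc);
            String price = xpath.evaluate("/product/price", doc);
            return name + "|" + price;
        } catch (Exception e) {
            return null; // malformed XML: return null, mirroring typical Hive UDF behavior
        }
    }

    public static void main(String[] args) {
        String xml = "<product><name>Widget</name><price>9.99</price></product>";
        System.out.println(extract(xml)); // Widget|9.99
    }
}
```

In the real Hive UDF, this logic would sit inside the ‘evaluate’ method of a class extending Hive’s UDF base class, with Hive-specific types (such as Text) wrapping the inputs and outputs.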

Moving to the BigQuery side

Directly migrating this type of complex XML processing logic to BigQuery's native SQL is not feasible due to the lack of XML parsing functions and the procedural nature of the processing involved. BigQuery does not inherently support XPath queries or similar XML navigation and extraction mechanisms within its SQL dialect.

The main advantage of using BigQuery Remote Functions is that we can reuse most of the Java code. Of course, we need to get rid of the Hive-specific boilerplate and replace it with the Cloud Functions framework, but the core processing logic can be lifted and shifted. This reduces both the effort and the risk of the migration.

Here is a possible implementation of the Cloud Function that provides the same functionality as the Hive UDF:

You can find the full source code here.

The class has two methods: the ‘service’ method contains the boilerplate needed by Cloud Functions and the BigQuery Remote Functions framework. It is important to mention that BigQuery invokes this HTTP endpoint with multiple sets of parameters: BigQuery batches the execution to increase efficiency. The ‘service’ method takes care of this aspect by invoking the ‘process’ method for every set of parameters.
You will notice that the ‘process’ method is very similar to the ‘evaluate’ method of the Hive UDF; this is where the original processing code was moved. The only difference is that the types of the inputs and outputs have been changed from the Hive-specific types to generic JSON datatypes.
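Ignoring the HTTP and JSON plumbing, the batching contract between BigQuery and the function can be sketched like this. (Class and method names here are illustrative, not the actual source; in the real function, ‘calls’ arrives as the "calls" array of the request JSON and ‘replies’ is serialized back as the "replies" array.)

```java
import java.util.ArrayList;
import java.util.List;

public class RemoteFunctionBatching {

    // BigQuery sends one HTTP request carrying many rows: each element of
    // 'calls' is the argument list for a single row. The response must
    // contain exactly one reply per call, in the same order.
    static List<String> service(List<List<String>> calls) {
        List<String> replies = new ArrayList<>();
        for (List<String> args : calls) {
            replies.add(process(args.get(0))); // one reply per row, order preserved
        }
        return replies;
    }

    // Counterpart of the Hive UDF's 'evaluate' method: the per-row logic.
    static String process(String xml) {
        return "parsed:" + xml.length(); // placeholder for the XPath extraction
    }

    public static void main(String[] args) {
        List<List<String>> batch = List.of(List.of("<a/>"), List.of("<b></b>"));
        System.out.println(service(batch)); // [parsed:4, parsed:8]
    }
}
```

Keeping the per-row logic in its own method, as the real class does, is what makes the lift-and-shift from Hive straightforward: only the batching wrapper is new.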

Next Up: Part 2 - Testing and deployment of the migrated BQ Remote Function

Concluding our exploration into the transformative journey of migrating Hive UDFs to BigQuery Remote Functions, we've delved into the intricate process of rehousing complex XML parsing logic within the cloud-native architecture of BigQuery. Through the example of the XmlProductInfoExtractorUDF, we've illustrated how the essence of sophisticated Hive UDFs can be preserved and adapted to BigQuery's ecosystem, leveraging Google Cloud Functions to encapsulate the original Java code with minimal adjustments. This strategic approach not only mitigates migration risks but also significantly streamlines the transition process, ensuring that the advanced functionalities of Hive UDFs continue to thrive in BigQuery's scalable and efficient environment.

As we look ahead, the next post in this series will guide you through the practical aspects of testing and deploying your newly created BigQuery Remote Function. We'll cover the essentials of setting up a robust testing framework to validate the function's performance and accuracy, followed by step-by-step instructions on deploying the function to ensure seamless integration within your BigQuery workflows. Stay tuned for these invaluable insights that will equip you with the knowledge to confidently elevate your data processing capabilities to new heights in BigQuery.
