#### Description
When trying to use a custom endpoint (a regional endpoint or a Private Service Connect endpoint) for BigQuery, the BigQuery.writer method fails because this specific endpoint is hardcoded.
The setup is similar to this sample: https://github.com/googleapis/java-bigquery/blob/main/samples/snippets/src/main/java/com/example/bigquery/CreateDatasetWithRegionalEndpoint.java
but instead of creating a dataset, it uses the writer method.
#### Environment details
- API: BigQuery API, jobs: https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs/insert
- OS type and version: Linux
- Java version: openjdk 11.0.15 2023-04-21
- version(s): 2.25.0
#### Steps to reproduce
- Use the following sample to load a local file: https://github.com/googleapis/java-bigquery/blob/main/samples/snippets/src/main/java/com/example/bigquery/LoadLocalFile.java
- Add setHost() with a custom endpoint to the BigQueryOption builder
- The custom host is not used; instead the request defaults to "https://www.googleapis.com/upload/bigquery/v2/projects/"
#### Code example
```java
public static void loadLocalFile(
    String datasetName, String tableName, Path csvPath, FormatOptions formatOptions)
    throws IOException, InterruptedException {
  try {
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests.
    BigQuery bigquery =
        BigQueryOptions.newBuilder()
            .setHost("https://us-east1-bigquery.googleapis.com/")
            .setProjectId("my-project")
            .build()
            .getService();
    TableId tableId = TableId.of(datasetName, tableName);
    ....
    // Imports a local file into a table.
    try (TableDataWriteChannel writer = bigquery.writer(jobId, writeChannelConfiguration);
        OutputStream stream = Channels.newOutputStream(writer)) {
      Files.copy(csvPath, stream);
    }
    // Get the Job created by the TableDataWriteChannel and wait for it to complete.
    Job job = bigquery.getJob(jobId);
    Job completedJob = job.waitFor();
    if (completedJob == null) {
      System.out.println("Job not executed since it no longer exists.");
      return;
    } else if (completedJob.getStatus().getError() != null) {
      System.out.println(
          "BigQuery was unable to load local file to the table due to an error: \n"
              + job.getStatus().getError());
      return;
    }
    ....
```
#### Stack trace
It won't necessarily fail; that depends on the custom environment, but the specified host is not used. For example, this is the message when not authenticating properly:
```
Local file not loaded.
com.google.cloud.bigquery.BigQueryException: 401 Unauthorized
POST https://googleapis.com/upload/bigquery/v2/projects/my-project/jobs?uploadType=resumable
```
#### External references such as API reference guides
- This is the method used to insert load jobs with metadata and streamed data, as explained here: https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs/insert
#### Any additional information below
The resumable URL is hardcoded as seen here:
https://github.com/googleapis/java-bigquery/blob/main/google-cloud-bigquery/src/main/java/com/google/cloud/bigquery/spi/v2/HttpBigQueryRpc.java#L728
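One possible direction for a fix, as a sketch only (the class and method names below are hypothetical and not part of the library): derive the resumable-upload URI from the host configured on BigQueryOptions (via getHost()) instead of the hardcoded base constant, e.g.:

```java
public class ResumableUriExample {
  // Hypothetical helper: build the resumable-upload URI from the configured
  // host instead of the hardcoded
  // "https://www.googleapis.com/upload/bigquery/v2/projects/" base.
  static String resumableUploadUri(String host, String projectId) {
    // Trim a trailing slash so the path segments join cleanly.
    String base = host.endsWith("/") ? host.substring(0, host.length() - 1) : host;
    return base + "/upload/bigquery/v2/projects/" + projectId + "/jobs?uploadType=resumable";
  }

  public static void main(String[] args) {
    // With the custom regional endpoint from the report above:
    System.out.println(
        resumableUploadUri("https://us-east1-bigquery.googleapis.com/", "my-project"));
    // -> https://us-east1-bigquery.googleapis.com/upload/bigquery/v2/projects/my-project/jobs?uploadType=resumable
  }
}
```

This would make the upload path follow whatever host was passed to setHost(), matching the behavior the non-upload code paths already have.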