
Commit 2ca615c

Update Google Cloud branding (#10642)
1 parent 934115b commit 2ca615c

92 files changed: +515 additions, -523 deletions


BREEZE.rst

Lines changed: 2 additions & 2 deletions
@@ -433,15 +433,15 @@ Those are currently installed CLIs (they are available as aliases to the docker
 +-----------------------+----------+-------------------------------------------------+-------------------+
 | Microsoft Azure       | az       | mcr.microsoft.com/azure-cli:latest              | .azure            |
 +-----------------------+----------+-------------------------------------------------+-------------------+
-| Google Cloud Platform | bq       | gcr.io/google.com/cloudsdktool/cloud-sdk:latest | .config/gcloud    |
+| Google Cloud          | bq       | gcr.io/google.com/cloudsdktool/cloud-sdk:latest | .config/gcloud    |
 |                       +----------+-------------------------------------------------+-------------------+
 |                       | gcloud   | gcr.io/google.com/cloudsdktool/cloud-sdk:latest | .config/gcloud    |
 |                       +----------+-------------------------------------------------+-------------------+
 |                       | gsutil   | gcr.io/google.com/cloudsdktool/cloud-sdk:latest | .config/gcloud    |
 +-----------------------+----------+-------------------------------------------------+-------------------+

 For each of the CLIs we have also an accompanying ``*-update`` alias (for example ``aws-update``) which
-will pull the latest image for the tool. Note that all Google Cloud Platform tools are served by one
+will pull the latest image for the tool. Note that all Google Cloud tools are served by one
 image and they are updated together.

 Also - in case you run several different Breeze containers in parallel (from different directories,

CONTRIBUTING.rst

Lines changed: 1 addition & 1 deletion
@@ -817,7 +817,7 @@ We support the following types of tests:
   additional services running, such as Postgres, Mysql, Kerberos, etc.

 * **System tests** are automatic tests that use external systems like
-  Google Cloud Platform. These tests are intended for an end-to-end DAG execution.
+  Google Cloud. These tests are intended for an end-to-end DAG execution.

 For details on running different types of Airflow tests, see `TESTING.rst <TESTING.rst>`_.

TESTING.rst

Lines changed: 3 additions & 3 deletions
@@ -30,7 +30,7 @@ Airflow Test Infrastructure
   marked as integration tests but soon they will be separated by ``pytest`` annotations.

 * **System tests** are automatic tests that use external systems like
-  Google Cloud Platform. These tests are intended for an end-to-end DAG execution.
+  Google Cloud. These tests are intended for an end-to-end DAG execution.
   The tests can be executed on both the current version of Apache Airflow and any of the older
   versions from 1.10.* series.

@@ -612,7 +612,7 @@ visible to anything that you have installed inside the Docker container.
 Currently forwarded credentials are:
   * credentials stored in ``${HOME}/.aws`` for the aws Amazon Web Services client
   * credentials stored in ``${HOME}/.azure`` for the az Microsoft Azure client
-  * credentials stored in ``${HOME}/.config`` for gcloud Google Cloud Platform client (among others)
+  * credentials stored in ``${HOME}/.config`` for gcloud Google Cloud client (among others)
   * credentials stored in ``${HOME}/.docker`` for docker client

 Adding a New System Test

@@ -878,7 +878,7 @@ your local sources to the ``/opt/airflow`` location of the sources within the co
 Setup VM on GCP with SSH forwarding
 -----------------------------------

-Below are the steps you need to take to set up your virtual machine in the Google Cloud Platform.
+Below are the steps you need to take to set up your virtual machine in the Google Cloud.

 1. The next steps will assume that you have configured environment variables with the name of the network and
    a virtual machine, project ID and the zone where the virtual machine will be created

UPDATING.md

Lines changed: 12 additions & 12 deletions
@@ -521,7 +521,7 @@ The following configurations have been moved from `[core]` to the new `[logging]
 #### Remove gcp_service_account_keys option in airflow.cfg file

 This option has been removed because it is no longer supported by the Google Kubernetes Engine. The new
-recommended service account keys for the Google Cloud Platform management method is
+recommended service account keys for the Google Cloud management method is
 [Workload Identity](https://cloud.google.com/kubernetes-engine/docs/how-to/workload-identity).

 #### Fernet is enabled by default
@@ -1037,17 +1037,17 @@ have been made to the core (including core operators) as they can affect the int
 of this provider.

 This section describes the changes that have been made, and what you need to do to update your if
-you use operators or hooks which integrate with Google services (including Google Cloud Platform - GCP).
+you use operators or hooks which integrate with Google services (including Google Cloud - GCP).

 #### Direct impersonation added to operators communicating with Google services
 [Directly impersonating a service account](https://cloud.google.com/iam/docs/understanding-service-accounts#directly_impersonating_a_service_account)
 has been made possible for operators communicating with Google services via new argument called `impersonation_chain`
 (`google_impersonation_chain` in case of operators that also communicate with services of other cloud providers).
 As a result, GCSToS3Operator no longer derivatives from GCSListObjectsOperator.

-#### Normalize gcp_conn_id for Google Cloud Platform
+#### Normalize gcp_conn_id for Google Cloud

-Previously not all hooks and operators related to Google Cloud Platform use
+Previously not all hooks and operators related to Google Cloud use
 `gcp_conn_id` as parameter for GCP connection. There is currently one parameter
 which apply to most services. Parameters like ``datastore_conn_id``, ``bigquery_conn_id``,
 ``google_cloud_storage_conn_id`` and similar have been deprecated. Operators that require two connections are not changed.
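
For illustration only (not part of the commit's diff): a minimal sketch of how the new `impersonation_chain` argument and the normalized `gcp_conn_id` parameter described in this hunk might be used together. The operator choice, bucket name and service account below are assumptions.

```python
# Illustrative sketch only -- arguments and values below are assumptions.
from airflow import DAG
from airflow.providers.google.cloud.operators.gcs import GCSListObjectsOperator
from airflow.utils.dates import days_ago

with DAG("example_google_impersonation", start_date=days_ago(1), schedule_interval=None) as dag:
    list_objects = GCSListObjectsOperator(
        task_id="list_objects",
        bucket="my-example-bucket",          # hypothetical bucket
        gcp_conn_id="google_cloud_default",  # normalized connection parameter
        # Directly impersonate a service account instead of relying only on the
        # credentials stored in the connection (hypothetical account).
        impersonation_chain="transfer-sa@my-example-project.iam.gserviceaccount.com",
    )
```
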
@@ -1082,7 +1082,7 @@ Following components were affected by normalization:
 #### Changes to import paths and names of GCP operators and hooks

 According to [AIP-21](https://cwiki.apache.org/confluence/display/AIRFLOW/AIP-21%3A+Changes+in+import+paths)
-operators related to Google Cloud Platform has been moved from contrib to core.
+operators related to Google Cloud has been moved from contrib to core.
 The following table shows changes in import paths.

 | Old path | New path |
@@ -1265,9 +1265,9 @@ The following table shows changes in import paths.
 |airflow.contrib.sensors.gcs_sensor.GoogleCloudStorageUploadSessionCompleteSensor |airflow.providers.google.cloud.sensors.gcs.GCSUploadSessionCompleteSensor |
 |airflow.contrib.sensors.pubsub_sensor.PubSubPullSensor |airflow.providers.google.cloud.sensors.pubsub.PubSubPullSensor |

-#### Unify default conn_id for Google Cloud Platform
+#### Unify default conn_id for Google Cloud

-Previously not all hooks and operators related to Google Cloud Platform use
+Previously not all hooks and operators related to Google Cloud use
 ``google_cloud_default`` as a default conn_id. There is currently one default
 variant. Values like ``google_cloud_storage_default``, ``bigquery_default``,
 ``google_cloud_datastore_default`` have been deprecated. The configuration of
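
To make the contrib-to-providers move concrete, the two table rows shown in this hunk translate into the following import change (a sketch only; the old imports are shown commented out):

```python
# Before (Airflow 1.10 contrib import paths):
# from airflow.contrib.sensors.gcs_sensor import GoogleCloudStorageUploadSessionCompleteSensor
# from airflow.contrib.sensors.pubsub_sensor import PubSubPullSensor

# After (provider package import paths, as listed in the table above):
from airflow.providers.google.cloud.sensors.gcs import GCSUploadSessionCompleteSensor
from airflow.providers.google.cloud.sensors.pubsub import PubSubPullSensor
```
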
@@ -1408,7 +1408,7 @@ Now this parameter requires a value. To restore the previous behavior, configure
 specifying the service account.

 Detailed information about connection management is available:
-[Google Cloud Platform Connection](https://airflow.apache.org/howto/connection/gcp.html).
+[Google Cloud Connection](https://airflow.apache.org/howto/connection/gcp.html).


 #### `airflow.providers.google.cloud.hooks.gcs.GCSHook`
@@ -2053,7 +2053,7 @@ If the `AIRFLOW_CONFIG` environment variable was not set and the
 will discover its config file using the `$AIRFLOW_CONFIG` and `$AIRFLOW_HOME`
 environment variables rather than checking for the presence of a file.

-### Changes in Google Cloud Platform related operators
+### Changes in Google Cloud related operators

 Most GCP-related operators have now optional `PROJECT_ID` parameter. In case you do not specify it,
 the project id configured in
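
A minimal sketch of what the optional project id means in practice; the operator, bucket name and connection id here are assumptions rather than part of this commit:

```python
# Illustrative sketch: ``project_id`` can be omitted on most GCP operators.
from airflow.providers.google.cloud.operators.gcs import GCSCreateBucketOperator

# No project_id passed -- the project id configured in the Google Cloud
# connection (here the default ``google_cloud_default``) is used instead.
create_bucket = GCSCreateBucketOperator(
    task_id="create_bucket",
    bucket_name="my-example-bucket",     # hypothetical bucket
    gcp_conn_id="google_cloud_default",
)
```
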
@@ -2080,7 +2080,7 @@ Operators involved:

 Other GCP operators are unaffected.

-### Changes in Google Cloud Platform related hooks
+### Changes in Google Cloud related hooks

 The change in GCP operators implies that GCP Hooks for those operators require now keyword parameters rather
 than positional ones in all methods where `project_id` is used. The methods throw an explanatory exception
@@ -2148,7 +2148,7 @@ gct_hook.create_transfer_job(body)
 ```
 The change results from the unification of all hooks and adjust to
 [the official recommendations](https://lists.apache.org/thread.html/e8534d82be611ae7bcb21ba371546a4278aad117d5e50361fd8f14fe@%3Cdev.airflow.apache.org%3E)
-for the Google Cloud Platform.
+for the Google Cloud.

 The signature of `wait_for_transfer_job` method in `GCPTransferServiceHook` has changed.
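
A hedged sketch of the keyword-only `project_id` convention described above, using `GCSHook` (mentioned earlier in this file) purely as an example; the bucket and project values are assumptions:

```python
# Illustrative sketch of the keyword-only ``project_id`` convention for GCP hooks.
from airflow.providers.google.cloud.hooks.gcs import GCSHook

hook = GCSHook(gcp_conn_id="google_cloud_default")

# ``project_id`` must be passed as a keyword argument; per the note above,
# passing it positionally raises an explanatory exception.
hook.create_bucket(bucket_name="my-example-bucket", project_id="my-example-project")
```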

@@ -2765,7 +2765,7 @@ of user-editable configuration properties. See
 All Google Cloud Operators and Hooks are aligned and use the same client library. Now you have a single connection
 type for all kinds of Google Cloud Operators.

-If you experience problems connecting with your operator make sure you set the connection type "Google Cloud Platform".
+If you experience problems connecting with your operator make sure you set the connection type "Google Cloud".

 Also the old P12 key file type is not supported anymore and only the new JSON key files are supported as a service
 account.

airflow/providers/amazon/aws/transfers/gcs_to_s3.py

Lines changed: 3 additions & 3 deletions
@@ -40,10 +40,10 @@ class GCSToS3Operator(BaseOperator):
         For e.g to lists the CSV files from in a directory in GCS you would use
         delimiter='.csv'.
     :type delimiter: str
-    :param gcp_conn_id: (Optional) The connection ID used to connect to Google Cloud Platform.
+    :param gcp_conn_id: (Optional) The connection ID used to connect to Google Cloud.
     :type gcp_conn_id: str
-    :param google_cloud_storage_conn_id: (Deprecated) The connection ID used to connect to Google Cloud
-        Platform. This parameter has been deprecated. You should pass the gcp_conn_id parameter instead.
+    :param google_cloud_storage_conn_id: (Deprecated) The connection ID used to connect to Google Cloud.
+        This parameter has been deprecated. You should pass the gcp_conn_id parameter instead.
     :type google_cloud_storage_conn_id: str
     :param delegate_to: Google account to impersonate using domain-wide delegation of authority,
         if any. For this to work, the service account making the request must have
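
For context (not part of the diff): a minimal sketch of passing the preferred `gcp_conn_id` parameter to this operator; the destination arguments and values are assumptions:

```python
# Illustrative sketch only; bucket names, keys and connection ids are assumptions.
from airflow.providers.amazon.aws.transfers.gcs_to_s3 import GCSToS3Operator

gcs_to_s3 = GCSToS3Operator(
    task_id="gcs_to_s3",
    bucket="my-gcs-bucket",                 # source GCS bucket
    delimiter=".csv",                       # copy only CSV objects, as in the docstring above
    gcp_conn_id="google_cloud_default",     # preferred over the deprecated google_cloud_storage_conn_id
    dest_aws_conn_id="aws_default",         # assumed AWS connection id
    dest_s3_key="s3://my-s3-bucket/data/",  # assumed destination prefix
)
```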

airflow/providers/google/ads/operators/ads.py

Lines changed: 1 addition & 1 deletion
@@ -49,7 +49,7 @@ class GoogleAdsListAccountsOperator(BaseOperator):
     :type bucket: str
     :param object_name: GCS path to save the csv file. Must be the full file path (ex. `path/to/file.csv`)
     :type object_name: str
-    :param gcp_conn_id: Airflow Google Cloud Platform connection ID
+    :param gcp_conn_id: Airflow Google Cloud connection ID
     :type gcp_conn_id: str
     :param google_ads_conn_id: Airflow Google Ads connection ID
     :type google_ads_conn_id: str
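
A small sketch (not from this commit) that exercises only the parameters documented above; the bucket, object path and connection ids are placeholders:

```python
# Illustrative sketch using only the documented parameters; values are assumptions.
from airflow.providers.google.ads.operators.ads import GoogleAdsListAccountsOperator

list_accounts = GoogleAdsListAccountsOperator(
    task_id="list_accounts",
    bucket="my-ads-bucket",                   # GCS bucket to receive the csv file
    object_name="ads/accounts.csv",           # full GCS path for the csv file
    gcp_conn_id="google_cloud_default",       # Airflow Google Cloud connection ID
    google_ads_conn_id="google_ads_default",  # Airflow Google Ads connection ID
)
```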

airflow/providers/google/ads/transfers/ads_to_gcs.py

Lines changed: 1 addition & 1 deletion
@@ -50,7 +50,7 @@ class GoogleAdsToGcsOperator(BaseOperator):
     :type bucket: str
     :param obj: GCS path to save the object. Must be the full file path (ex. `path/to/file.txt`)
     :type obj: str
-    :param gcp_conn_id: Airflow Google Cloud Platform connection ID
+    :param gcp_conn_id: Airflow Google Cloud connection ID
     :type gcp_conn_id: str
     :param google_ads_conn_id: Airflow Google Ads connection ID
     :type google_ads_conn_id: str

airflow/providers/google/cloud/example_dags/example_bigtable.py

Lines changed: 1 addition & 1 deletion
@@ -26,7 +26,7 @@

 This DAG relies on the following environment variables:

-* GCP_PROJECT_ID - Google Cloud Platform project
+* GCP_PROJECT_ID - Google Cloud project
 * CBT_INSTANCE_ID - desired ID of a Cloud Bigtable instance
 * CBT_INSTANCE_DISPLAY_NAME - desired human-readable display name of the Instance
 * CBT_INSTANCE_TYPE - type of the Instance, e.g. 1 for DEVELOPMENT

airflow/providers/google/cloud/example_dags/example_cloud_sql.py

Lines changed: 2 additions & 2 deletions
@@ -18,11 +18,11 @@

 """
 Example Airflow DAG that creates, patches and deletes a Cloud SQL instance, and also
-creates, patches and deletes a database inside the instance, in Google Cloud Platform.
+creates, patches and deletes a database inside the instance, in Google Cloud.

 This DAG relies on the following OS environment variables
 https://airflow.apache.org/concepts.html#variables
-* GCP_PROJECT_ID - Google Cloud Platform project for the Cloud SQL instance.
+* GCP_PROJECT_ID - Google Cloud project for the Cloud SQL instance.
 * INSTANCE_NAME - Name of the Cloud SQL instance.
 * DB_NAME - Name of the database inside a Cloud SQL instance.
 """

airflow/providers/google/cloud/example_dags/example_cloud_sql_query.py

Lines changed: 1 addition & 1 deletion
@@ -21,7 +21,7 @@

 This DAG relies on the following OS environment variables

-* GCP_PROJECT_ID - Google Cloud Platform project for the Cloud SQL instance
+* GCP_PROJECT_ID - Google Cloud project for the Cloud SQL instance
 * GCP_REGION - Google Cloud region where the database is created
 *
 * GCSQL_POSTGRES_INSTANCE_NAME - Name of the postgres Cloud SQL instance
