dizcology
Repos: 95 · Followers: 38 · Following: 6

A crash course in six episodes for software developers who want to become machine learning practitioners.

2581 stars · 854 forks

Official Repo for Google Cloud AI Platform. Find samples for Vertex AI, Google Cloud's new unified ML platform at: https://github.com/GoogleCloudPlatform/vertex-ai-samples

A Python SDK for Vertex AI, a fully managed, end-to-end platform for data science and machine learning.

278 stars · 178 forks

Source code (Python, Node.js, and Java) for a demo we built that has been shown at a number of conferences, including IoT Solutions World Congress in Barcelona, Google Cloud Next 2019, and Google I/O 2019. Using the Coral Dev Board, we show incredibly fast machine learning on the edge with minimal power consumption.

13 stars · 0 forks

Sample code for the AI in Motion demo

10 stars · 10 forks

Events

Created 5 days ago

feat: Add support for order_by in Metadata SDK list methods for Artifact, Execution and Context.

PiperOrigin-RevId: 488531299
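
The new argument presumably follows the usual "field [desc]" ordering convention. A hedged usage sketch (the project, location, and ordering field below are placeholders, not taken from the commit):

    from google.cloud import aiplatform

    aiplatform.init(project="my-project", location="us-central1")

    # List Metadata artifacts, newest first; order_by is the argument this commit adds.
    artifacts = aiplatform.Artifact.list(order_by="create_time desc")
    for artifact in artifacts:
        print(artifact.display_name)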

Copybara import of the project:

-- c76a319f3e0abb6ecd2d273b4fa3ec5e50c1fc03 by Anthonios Partheniou partheniou@google.com:

chore: regenerate code with gapic-generator-python 1.4.4

-- 9c55be388397a620fe19fdb1ca0c3baa594f59af by Yu-Han Liu yuhanliu@google.com:

chore: regenerate with gapic-generator-python 1.4.4 COPYBARA_INTEGRATE_REVIEW=https://github.com/googleapis/python-aiplatform/pull/1779 from googleapis:regenerate-code-with-gapic-1-4-4 9c55be388397a620fe19fdb1ca0c3baa594f59af PiperOrigin-RevId: 488641639

feat: add Feature Store: Streaming Ingestion (write_feature_values()) and introduce Preview namespace to Vertex SDK

PiperOrigin-RevId: 489080135
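
A hedged sketch of the new streaming-ingestion entry point under the Preview namespace; the featurestore ID, entity type, and payload below are hypothetical, and the exact instances format may differ from what ships:

    from google.cloud import aiplatform

    aiplatform.init(project="my-project", location="us-central1")

    # Hypothetical, pre-existing featurestore and entity type.
    entity_type = aiplatform.featurestore.EntityType(
        entity_type_name="users",
        featurestore_id="my_featurestore",
    )

    # write_feature_values() writes feature values directly, without a batch
    # ingestion job; it is exposed under the preview namespace in this release.
    entity_type.preview.write_feature_values(
        instances={"user_123": {"age": 30, "country": "US"}}
    )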

Copybara import of the project:

-- 2fb2b63d3965f7535921bb3c306793ef7cdd7f6d by release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>:

chore(main): release 1.19.0

COPYBARA_INTEGRATE_REVIEW=https://github.com/googleapis/python-aiplatform/pull/1768 from googleapis:release-please--branches--main 2fb2b63d3965f7535921bb3c306793ef7cdd7f6d PiperOrigin-RevId: 489217395

chore: update 1.19.0 release notes

PiperOrigin-RevId: 489253566

chore: fix list method in _VertexAiPipelineBasedService class

PiperOrigin-RevId: 489464748

Created 1 week ago
tests.system.aiplatform.test_e2e_tabular.TestEndToEndTabular: test_end_to_end_tabular failed

Note: #1741 was also for this test, but it was closed more than 10 days ago. So, I didn't mark it flaky.


commit: fe6bf2bc04ec7895081e8a6eafc6f21f06833e95
buildURL: Build Status, Sponge
status: failed

@pytest.fixture(scope="class")
def tear_down_resources(self, shared_state: Dict[str, Any]):
    """Delete every Vertex AI resource created during test"""

    yield

    # TODO(b/218310362): Add resource deletion system tests
    # Bring all Endpoints to the front of the list
    # Ensures Models are undeployed first before we attempt deletion
    shared_state["resources"].sort(
        key=lambda r: 1
        if isinstance(r, aiplatform.Endpoint)
        or isinstance(r, aiplatform.MatchingEngineIndexEndpoint)
        else 2
    )

    for resource in shared_state["resources"]:
        try:
            if isinstance(
                resource,
                (
                    aiplatform.Endpoint,
                    aiplatform.Featurestore,
                    aiplatform.MatchingEngineIndexEndpoint,
                ),
            ):
                # For endpoint, undeploy model then delete endpoint
                # For featurestore, force delete its entity_types and features with the featurestore
                resource.delete(force=True)

tests/system/aiplatform/e2e_base.py:197:


google/cloud/aiplatform/models.py:1750: in delete
    self.undeploy_all(sync=sync)
google/cloud/aiplatform/models.py:1721: in undeploy_all
    self._sync_gca_resource()
google/cloud/aiplatform/base.py:645: in _sync_gca_resource
    self._gca_resource = self._get_gca_resource(resource_name=self.resource_name)
google/cloud/aiplatform/base.py:675: in resource_name
    self._assert_gca_resource_is_available()
google/cloud/aiplatform/models.py:231: in _assert_gca_resource_is_available
    super()._assert_gca_resource_is_available()


self = <google.cloud.aiplatform.models.Endpoint object at 0x7f69431498b0> failed with Training failed with: code: 13 message: "INTERNAL"

def _assert_gca_resource_is_available(self) -> None:
    """Helper method to raise when accessing properties that do not exist.

    Overrides VertexAiResourceNoun to provide a more informative exception if
    resource creation has failed asynchronously.

    Raises:
        RuntimeError: When resource has not been created.
    """
    if not getattr(self._gca_resource, "name", None):
        raise RuntimeError(
            f"{self.__class__.__name__} resource has not been created."
            + (
                f" Resource failed with: {self._exception}"
                if self._exception
                else ""
            )
        )

E   RuntimeError: Endpoint resource has not been created. Resource failed with: Training failed with:
E   code: 13
E   message: "INTERNAL"

google/cloud/aiplatform/base.py:1318: RuntimeError

Created 1 week ago
tests.system.aiplatform.test_model_upload.TestModelUploadAndUpdate: test_upload_and_deploy_xgboost_model failed

Note: #1697 was also for this test, but it was closed more than 10 days ago. So, I didn't mark it flaky.


commit: 414e39b46acb9c49cb7b650b4e168c78bc6c49d2
buildURL: Build Status, Sponge
status: failed

@pytest.fixture(scope="class")
def delete_staging_bucket(self, shared_state: Dict[str, Any]):
    """Delete the staging bucket and all it's contents"""

    yield

    # Get the staging bucket used for testing and wipe it
    bucket = shared_state["bucket"]
    bucket.delete(force=True)

tests/system/aiplatform/e2e_base.py:116:


.nox/system-3-8/lib/python3.8/site-packages/google/cloud/storage/bucket.py:1482: in delete
    blobs = list(
.nox/system-3-8/lib/python3.8/site-packages/google/api_core/page_iterator.py:214: in _items_iter
    for page in self._page_iter(increment=False):
.nox/system-3-8/lib/python3.8/site-packages/google/api_core/page_iterator.py:253: in _page_iter
    page = self._next_page()
.nox/system-3-8/lib/python3.8/site-packages/google/api_core/page_iterator.py:382: in _next_page
    response = self._get_next_page_response()
.nox/system-3-8/lib/python3.8/site-packages/google/api_core/page_iterator.py:441: in _get_next_page_response
    return self.api_request(
.nox/system-3-8/lib/python3.8/site-packages/google/cloud/storage/_http.py:80: in api_request
    return call()
.nox/system-3-8/lib/python3.8/site-packages/google/api_core/retry.py:286: in retry_wrapped_func
    return retry_target(
.nox/system-3-8/lib/python3.8/site-packages/google/api_core/retry.py:189: in retry_target
    return target()


self = <google.cloud.storage._http.Connection object at 0x7f732dfb1610>
method = 'GET', path = '/b/ucaip-sample-tests-vertex-staging-us-central1/o'
query_params = {'maxResults': 257, 'projection': 'noAcl'}, data = None
content_type = None, headers = None, api_base_url = None, api_version = None
expect_json = True, _target_object = None, timeout = 60, extra_api_info = None

def api_request(
    self,
    method,
    path,
    query_params=None,
    data=None,
    content_type=None,
    headers=None,
    api_base_url=None,
    api_version=None,
    expect_json=True,
    _target_object=None,
    timeout=_DEFAULT_TIMEOUT,
    extra_api_info=None,
):
    """Make a request over the HTTP transport to the API.

    You shouldn't need to use this method, but if you plan to
    interact with the API using these primitives, this is the
    correct one to use.

    :type method: str
    :param method: The HTTP method name (ie, ``GET``, ``POST``, etc).
                   Required.

    :type path: str
    :param path: The path to the resource (ie, ``'/b/bucket-name'``).
                 Required.

    :type query_params: dict or list
    :param query_params: A dictionary of keys and values (or list of
                         key-value pairs) to insert into the query
                         string of the URL.

    :type data: str
    :param data: The data to send as the body of the request. Default is
                 the empty string.

    :type content_type: str
    :param content_type: The proper MIME type of the data provided. Default
                         is None.

    :type headers: dict
    :param headers: extra HTTP headers to be sent with the request.

    :type api_base_url: str
    :param api_base_url: The base URL for the API endpoint.
                         Typically you won't have to provide this.
                         Default is the standard API base URL.

    :type api_version: str
    :param api_version: The version of the API to call.  Typically
                        you shouldn't provide this and instead use
                        the default for the library.  Default is the
                        latest API version supported by
                        google-cloud-python.

    :type expect_json: bool
    :param expect_json: If True, this method will try to parse the
                        response as JSON and raise an exception if
                        that cannot be done.  Default is True.

    :type _target_object: :class:`object`
    :param _target_object:
        (Optional) Protected argument to be used by library callers. This
        can allow custom behavior, for example, to defer an HTTP request
        and complete initialization of the object at a later time.

    :type timeout: float or tuple
    :param timeout: (optional) The amount of time, in seconds, to wait
        for the server response.

        Can also be passed as a tuple (connect_timeout, read_timeout).
        See :meth:`requests.Session.request` documentation for details.

    :type extra_api_info: string
    :param extra_api_info: (optional) Extra api info to be appended to
        the X-Goog-API-Client header

    :raises ~google.cloud.exceptions.GoogleCloudError: if the response code
        is not 200 OK.
    :raises ValueError: if the response content type is not JSON.
    :rtype: dict or str
    :returns: The API response payload, either as a raw string or
              a dictionary if the response is valid JSON.
    """
    url = self.build_api_url(
        path=path,
        query_params=query_params,
        api_base_url=api_base_url,
        api_version=api_version,
    )

    # Making the executive decision that any dictionary
    # data will be sent properly as JSON.
    if data and isinstance(data, dict):
        data = json.dumps(data)
        content_type = "application/json"

    response = self._make_request(
        method=method,
        url=url,
        data=data,
        content_type=content_type,
        headers=headers,
        target_object=_target_object,
        timeout=timeout,
        extra_api_info=extra_api_info,
    )

    if not 200 <= response.status_code < 300:
        raise exceptions.from_http_response(response)

E google.api_core.exceptions.NotFound: 404 GET https://storage.googleapis.com/storage/v1/b/ucaip-sample-tests-vertex-staging-us-central1/o?maxResults=257&projection=noAcl&prettyPrint=false: The specified bucket does not exist.

.nox/system-3-8/lib/python3.8/site-packages/google/cloud/_http/__init__.py:494: NotFound

Created 1 week ago

chore(deps): update dependency googleapis-common-protos to v1.57.0 (#1493)

chore(main): release 1.6.2 (#1494)

Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>

fix: snippetgen should call await on the operation coroutine before calling result

Created 1 week ago
pull request opened
fix: snippetgen should call await on the operation coroutine before c…

Currently, the generated snippet contains the following code for an async client that returns a long-running operation, such as the Vision API's import_product_sets:

    ...
    operation = client.import_product_sets(request=request)
    ...
    response = await operation.result()
    ...

This is incorrect, since operation is a coroutine that must be awaited to obtain the actual operation object (which has the result attribute).

The proposed fix is to change the generated code to:

     response = (await operation).result()
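
For illustration, a minimal self-contained sketch of why the parentheses matter; the classes below are toy stand-ins, not the real Vision API client:

    import asyncio

    class FakeOperation:
        def result(self):
            return "done"

    class FakeAsyncClient:
        async def import_product_sets(self, request=None):
            # On an async surface, calling the method produces a coroutine;
            # awaiting it yields the operation-like object.
            return FakeOperation()

    async def main():
        client = FakeAsyncClient()
        operation = client.import_product_sets(request={})
        # Incorrect: 'operation' is still an un-awaited coroutine here, so it
        # has no .result() attribute:
        #     response = await operation.result()   # AttributeError
        # Correct: await the coroutine first, then call .result():
        response = (await operation).result()
        print(response)

    asyncio.run(main())
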
Created 1 week ago
dizcology created branch snippetgen-fix-async-lro
Created 1 week ago

feat: add service_account to batch_prediction_job in aiplatform v1 batch_prediction_job.proto

PiperOrigin-RevId: 488416174

feat: added TrainProcessorVersion, EvaluateProcessorVersion, GetEvaluation, and ListEvaluations v1beta3 APIs feat: added evaluation.proto feat: added document_schema field in ProcessorVersion processor.proto feat: added image_quality_scores field in Document.Page in document.proto feat: added font_family field in Document.Style in document.proto

PiperOrigin-RevId: 488417413

chore: regenerate API index

feat: add Pipeline.secret_environment, Action.secret_environment, VirtualMachine.reservation

PiperOrigin-RevId: 488460572

chore: regenerate API index

feat: added CreateSshPublicKey RPC

PiperOrigin-RevId: 488460648

chore: regenerate API index

feat: added google.api.Service.publishing and client libraries settings feat: added google.api.JwtLocation.cookie feat: new fields in enum google.api.ErrorReason fix: deprecate google.api.Endpoint.aliases fix: deprecate google.api.BackendRule.min_deadline docs: minor updates to comments

PiperOrigin-RevId: 488484261

feat: add compact placement feature for node pools

Use a compact placement policy to specify that nodes within the node pool should be placed in closer physical proximity to each other within a zone. Having nodes closer to each other can reduce network latency between nodes, which can be useful for tightly-coupled batch workloads.

PiperOrigin-RevId: 488490422

feat: added Sku.geo_taxonomy fix: more oauth scopes

PiperOrigin-RevId: 488493014

feat: add support for additional HMAC algorithms

PiperOrigin-RevId: 488651504

feat: Add OsConfig patch_job_log proto for Cloud Platform documentation

PiperOrigin-RevId: 488663043

feat: added field_mask field in DocumentOutputConfig.GcsOutputConfig in document_io.proto

PiperOrigin-RevId: 488680436

feat: add missing_value_interpretations to AppendRowsRequest

PiperOrigin-RevId: 488693558

Created 1 week ago

fix: snippetgen handling of repeated enum field (#1443)

  • fix: snippetgen handling of repeated enum field

feat: Add typing to proto.Message based class attributes (#1474)

  • feat: Add typing to proto.Message based class attributes

  • fix: Apply actual mutable flavor for Sequence/Mapping

  • fix: Update iterators and all tests

  • fix: Update client and goldens

  • fix: Remove goldens to resolve conflicts with updated main branch

  • fix: Conform with PEP 484 and mypy new default (no_implicit_optional=True)

  • fix: Conform with PEP 484 and mypy new default (no_implicit_optional=True)

Co-authored-by: Anthonios Partheniou partheniou@google.com

chore(main): release 1.6.0 (#1476)

Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>

chore(deps): update dependency pytest-asyncio to v0.20.2 (#1489)

chore(deps): update dependency setuptools to v65.5.1 (#1485)

Co-authored-by: Anthonios Partheniou partheniou@google.com

fix: allow google-cloud-documentai < 3 (#1487)

chore: fix docs build (#1488)

  • chore: fix docs build

This addresses the error "duplicate object description" seen in the docs build of downstream clients.

  • regenerate golden files

chore: fix url in templated setup.py (#1486)

  • chore: fix url in templated setup.py

  • regenerate golden files

fix: fix typo in testing/constraints-3.7.txt (#1483)

  • fix: fix typo in testing/constraints-3.7.txt

  • generate goldens

  • Remove erroneous comment

  • generate golden files

chore(main): release 1.6.1 (#1490)

Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>

Created 1 week ago
chore: regenerate code with gapic-generator-python 1.4.4

Merged at https://github.com/googleapis/python-aiplatform/commit/43e28052d798c380de6e102edbe257a0100738cd

Created 1 week ago
pull request closed
chore: regenerate code with gapic-generator-python 1.4.4

I ran the following steps to produce this PR:

  1. Create a branch in this repo
  2. Fix a replacement in owlbot.py that wasn't working properly
  3. Clone googleapis/googleapis and change the version of _gapic_generator_python_version to 1.4.4 in WORKSPACE.
  4. Run bazel build //google/cloud/aiplatform/v1:aiplatform-v1-py and bazel build //google/cloud/aiplatform/v1beta1:aiplatform-v1beta1-py
  5. In a branch for this repo, run
docker run --rm --user $(id -u):$(id -g) \
  -v $(pwd):/repo \
  -v $HOME/<PATH_TO_YOUR_>/googleapis/bazel-bin:/bazel-bin \
  gcr.io/cloud-devrel-public-resources/owlbot-cli:latest copy-bazel-bin \
  --source-dir /bazel-bin --dest /repo
  6. In a branch for this repo, run
docker run --user $(id -u):$(id -g) --rm -v $(pwd):/repo -w /repo gcr.io/cloud-devrel-public-resources/owlbot-python:latest
Created 1 week ago

chore: regenerate with gapic-generator-python 1.4.4

Created 1 week ago

docs: update README with new link for AI Platform API

PiperOrigin-RevId: 487676461

chore: update PR template

PiperOrigin-RevId: 488355190

Merge branch 'main' into regenerate-code-with-gapic-1-4-4

Created 1 week ago

feat(googleads): Protos and build files for Google Ads API v12

Committer: @raibaz PiperOrigin-RevId: 484011462

chore: regenerate API index

chore(bazel): update Go generator to v0.33.3

PiperOrigin-RevId: 484304984

feat: Add allow_failure, exit_code, and allow_exit_code to BuildStep message

Committer: @arvinddayal PiperOrigin-RevId: 484308212

chore(ruby): Update Ruby generator to 0.17.1

PiperOrigin-RevId: 484308308

feat: add PHP, Ruby, C# library rules for the Cloud EKG API

PiperOrigin-RevId: 484374856

chore: regenerate API index

chore: update to gapic-generator-python 1.5.0

feat: add support for google.cloud.<api>.__version__ PiperOrigin-RevId: 484665853

chore: regenerate API index

feat: Integration of Cloud Build with Artifact Registry

Committer: @amcooper PiperOrigin-RevId: 484745059

feat: Initial launch of SA360 reporting API

PiperOrigin-RevId: 484920548

chore: regenerate API index

docs: Clarify interactive logging TTL behavior

PiperOrigin-RevId: 485069403

feat: add Eco Routes feature to ComputeRoutes feat: add Route Token feature to ComputeRoutes feat: add Fuel Consumption feature to ComputeRoutes

PiperOrigin-RevId: 485094668

chore: regenerate API index

docs: Update documentation to reflect new Cloud Log messages for Fleet Engine

PiperOrigin-RevId: 485111970

chore: Update autogenerated Ruby samples to include snippet methods

PiperOrigin-RevId: 485148946

feat: add policy based routing

PiperOrigin-RevId: 485359269

chore: regenerate API index

docs: updated comment for Route.route_token

PiperOrigin-RevId: 485396820

Created 1 week ago

chore: add test for gapic import

PiperOrigin-RevId: 487566342

fix: correct data file gcs path for import_data_text_sentiment_analysis_sample test

PiperOrigin-RevId: 487588905

docs: update README with new link for AI Platform API

PiperOrigin-RevId: 487676461

chore: update PR template

PiperOrigin-RevId: 488355190

Created 1 week ago

chore: add test for gapic import

PiperOrigin-RevId: 487566342

fix: correct data file gcs path for import_data_text_sentiment_analysis_sample test

PiperOrigin-RevId: 487588905

Merge branch 'main' into regenerate-code-with-gapic-1-4-4

Created 2 weeks ago

chore: fix todo comments formatting in model monitoring for batch prediction

PiperOrigin-RevId: 487395425

Merge branch 'main' into regenerate-code-with-gapic-1-4-4

Created 2 weeks ago

feat: add support for HTTPS URI pipeline templates (#1683)

First-party pipelines are not yet available in Artifact Registry (AR), which means that, other than using a local file, the only way to access a first-party pipeline is through its GitHub URI. Since support was already added for AR URIs, it is not much more effort to support general HTTPS URIs (a usage sketch follows this checklist).

  • [x] Make sure to open an issue as a bug/issue before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
  • [x] Ensure the tests and linter pass
  • [x] Code coverage does not decrease (if any source code was changed)
  • [x] Appropriate docs were updated (if necessary)

Fixes b/247878583 🦕
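
For context, a hedged sketch of consuming a template over plain HTTPS once this lands; the project, display name, and URL below are placeholders:

    from google.cloud import aiplatform

    aiplatform.init(project="my-project", location="us-central1")

    job = aiplatform.PipelineJob(
        display_name="hello-world-pipeline",
        # Any HTTPS URL pointing at a compiled pipeline spec (placeholder URL).
        template_path="https://raw.githubusercontent.com/my-org/my-repo/main/pipeline.json",
    )
    job.submit()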

chore: Add headers for CPR model server errors. (#1701)

  • chore: Add headers for CPR model server errors.

  • chore: Fix comments.

feat: add model_source_info to Model in aiplatform v1beta1 model.proto (#1691)

  • feat: release SensitiveAction Cloud Logging payload to v1

PiperOrigin-RevId: 476083958

Source-Link: https://github.com/googleapis/googleapis/commit/fafd03f8da224663da0d83ee7c6ed1d84f3cb2e6

Source-Link: https://github.com/googleapis/googleapis-gen/commit/79c1b9ce243374735651e72cdebcbed99f5e4e65 Copy-Tag: eyJwIjoiLmdpdGh1Yi8uT3dsQm90LnlhbWwiLCJoIjoiNzljMWI5Y2UyNDMzNzQ3MzU2NTFlNzJjZGViY2JlZDk5ZjVlNGU2NSJ9

  • 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

  • feat: add model_source_info to Model in aiplatform v1 model.proto

PiperOrigin-RevId: 476193748

Source-Link: https://github.com/googleapis/googleapis/commit/a7f38907a05fa1610847e42a0efcbd1be20b064a

Source-Link: https://github.com/googleapis/googleapis-gen/commit/5589b9310a6ed26b5681461c476d57372acd1264 Copy-Tag: eyJwIjoiLmdpdGh1Yi8uT3dsQm90LnlhbWwiLCJoIjoiNTU4OWI5MzEwYTZlZDI2YjU2ODE0NjFjNDc2ZDU3MzcyYWNkMTI2NCJ9

  • 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

  • feat: add model_source_info to Model in aiplatform v1beta1 model.proto

PiperOrigin-RevId: 476411826

Source-Link: https://github.com/googleapis/googleapis/commit/72f0faae32860689ca6e47833fd9fc4210c2ae50

Source-Link: https://github.com/googleapis/googleapis-gen/commit/7909f5b1d51349dcefbe370f6a488981b80c1bfd Copy-Tag: eyJwIjoiLmdpdGh1Yi8uT3dsQm90LnlhbWwiLCJoIjoiNzkwOWY1YjFkNTEzNDlkY2VmYmUzNzBmNmE0ODg5ODFiODBjMWJmZCJ9

  • 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

  • feat: Add an enum value for raw locations from Apple platforms

PiperOrigin-RevId: 476961484

Source-Link: https://github.com/googleapis/googleapis/commit/695134be07e7f85a59eef40840fe693be51468e6

Source-Link: https://github.com/googleapis/googleapis-gen/commit/49457f9e7aae89cfe0491af4dbd9961a90125d32 Copy-Tag: eyJwIjoiLmdpdGh1Yi8uT3dsQm90LnlhbWwiLCJoIjoiNDk0NTdmOWU3YWFlODljZmUwNDkxYWY0ZGJkOTk2MWE5MDEyNWQzMiJ9

  • 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com> Co-authored-by: Anthonios Partheniou partheniou@google.com Co-authored-by: Yu-Han Liu yuhanliu@google.com Co-authored-by: gcf-merge-on-green[bot] <60162190+gcf-merge-on-green[bot]@users.noreply.github.com>

fix: fix endpoint parsing in ModelDeploymentMonitoringJob.update (#1671)

  • fix: fix endpoint parsing in ModelDeploymentMonitoringJob.update() function

  • 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

  • addressed PR feedback

  • 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

  • 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

  • addressed PR comments

  • 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

  • addressed more PR feedback

  • 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

  • addressed more PR comments

  • 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

  • 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

  • removed unused code

  • 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

  • addressed more PR feedback

  • 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

  • fixing linter issues

  • addressed more PR comments

  • fixing pylint errors

  • 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

  • silencing unused import warning

  • fixed unused import error

Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>

docs: fix typos (#1709)

feat: Support complex metrics in Vertex Experiments (#1698)

  • Experiments complex metrics (#8)

  • feat: new class and API for metrics

  • update system test

  • update high level log method

  • fix system test

  • update example

  • change from system schema to google schema

  • fix: import error

  • Update log_classification_metrics_sample.py

  • Update samples/model-builder/experiment_tracking/log_classification_metrics_sample.py

Co-authored-by: Dan Lee 71398022+dandhlee@users.noreply.github.com

  • Update log_classification_metrics_sample_test.py

  • Update samples/model-builder/conftest.py

Co-authored-by: Dan Lee 71398022+dandhlee@users.noreply.github.com

  • fix: unit test

  • fix comments

  • fix comments and update google.ClassificationMetrics

  • fix comments and update ClassificationMetrics class

  • fix: ClassificationMetrics doesn't catch params with value=0

  • add sample for get_classification_metrics

  • fix linting

  • add todos

Co-authored-by: Dan Lee 71398022+dandhlee@users.noreply.github.com

fix: project/location parsing for nested resources (#1700)

  • testing parsing

  • adding util function

  • removing print statements

  • adding changes

  • using regex and dict

  • lint check

  • adding fs test for passing in location and project

  • comment fix

  • adding docstring changes

  • fixing featurestore unit tests

  • lint

fix: show inherited SDK methods in pydoc (#1707)

Co-authored-by: nayaknishant nishantnayak@google.com

chore: surface CPR docs, and reformat docstrings (#1708)

  • chore: add prediction module to official SDK docs

  • chore: Make prediction module docs discoverable

  • chore: rename doc title

  • chore: fix formatting of docstrings

  • chore: fix docstring spacing issues

  • chore: another attempt to fix code block

  • chore: yet another attempt to fix code block

  • chore: change code blocks to use code-block

  • chore: fix spacing

  • chore: more docstring formatting changes

  • fix: more docstring format changes

  • chore: more formatting changes

  • chore: fix lint

  • chore: more formatting changes

  • chore: update comments

  • Update google/cloud/aiplatform/prediction/local_model.py

Co-authored-by: Dan Lee 71398022+dandhlee@users.noreply.github.com

  • Update google/cloud/aiplatform/prediction/local_model.py

Co-authored-by: Dan Lee 71398022+dandhlee@users.noreply.github.com

  • chore: fix typo

Co-authored-by: Dan Lee 71398022+dandhlee@users.noreply.github.com Co-authored-by: Rosie Zou rosiezou@users.noreply.github.com

fix(deps): require protobuf >= 3.20.2 (#1699)

  • chore: exclude requirements.txt file from renovate-bot

Source-Link: https://github.com/googleapis/synthtool/commit/f58d3135a2fab20e225d98741dbc06d57459b816 Post-Processor: gcr.io/cloud-devrel-public-resources/owlbot-python:latest@sha256:7a40313731a7cb1454eef6b33d3446ebb121836738dc3ab3d2d3ded5268c35b6

  • update constraints files

  • fix(deps): require protobuf 3.20.2

Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com> Co-authored-by: Anthonios Partheniou partheniou@google.com Co-authored-by: Sara Robinson sararob@users.noreply.github.com

chore(main): release 1.18.0 (#1676)

Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>

docs(samples): improve docstring of Vertex AI Python SDK Model Registry samples (#1705)

  • improve doc string

  • nox tests passed

  • rename location

Co-authored-by: Andrew Ferlitsch aferlitsch@google.com

chore: adding SDK team to CODEOWNERS (#1713)

  • chore: adding SDK team to CODEOWNERS

  • chore: adding SDK team to CODEOWNERS

  • chore: adding SDK team to CODEOWNERS

  • chore: adding SDK team to CODEOWNERS

docs: fix create experiment sample (#1716)

chore: expose model registry class (#1719)

fix: PipelineJob should only pass bearer tokens for AR URIs (#1717)

When downloading compiled KFP pipelines over HTTPS, we only need to pass a bearer token when authenticating to services such as Artifact Registry. Passing the token on every HTTPS request, which is the current behavior, may lead to unexpected results (a sketch of the intended check follows this checklist).

Thank you for opening a Pull Request! Before submitting your PR, there are a few things you can do to make sure it goes smoothly:

  • [x] Make sure to open an issue as a bug/issue before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
  • [x] Ensure the tests and linter pass
  • [x] Code coverage does not decrease (if any source code was changed)
  • [x] Appropriate docs were updated (if necessary)

Fixes b/251143831 🦕
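
A minimal illustration of the intended check, using only the standard library; the helper name and the pkg.dev host test are assumptions made here for illustration, not the SDK's actual implementation:

    from urllib.parse import urlparse

    def should_send_bearer_token(template_uri: str) -> bool:
        """Return True only for Artifact Registry-style hosts (illustrative check)."""
        host = urlparse(template_uri).hostname or ""
        return host.endswith("pkg.dev")

    # AR-hosted template: token attached.
    print(should_send_bearer_token("https://us-central1-kfp.pkg.dev/my-proj/my-repo/my-pipeline/v1"))
    # Any other HTTPS host: no token.
    print(should_send_bearer_token("https://raw.githubusercontent.com/my-org/my-repo/main/pipeline.json"))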

fix(deps): allow protobuf 3.19.5 (#1720)

  • fix(deps): allow protobuf 3.19.5

  • explicitly exclude protobuf 4.21.0

chore: merge v1.18.1 into main (#1725)

  • fix(deps): allow protobuf 3.19.5 (#1720)

  • fix(deps): allow protobuf 3.19.5

  • explicitly exclude protobuf 4.21.0

  • update changelog/version

Co-authored-by: Victor Chudnovsky vchudnov@google.com

Internal change

PiperOrigin-RevId: 480655720

docs: resurface googleapis.dev and prediction docs (#1724)

I've mistakenly broken some of the links to types docs on googleapis.dev. This fixes the docs structure so the docs show up again. If you run nox -s docs and take a look around the docs, they'll be back up.

I've verified that prediction pages are back up on c.g.c.

  • [x] Make sure to open an issue as a bug/issue before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
  • [x] Ensure the tests and linter pass
  • [x] Code coverage does not decrease (if any source code was changed)
  • [x] Appropriate docs were updated (if necessary)

Fixes #1722 🦕

Created 2 weeks ago
chore: regenerate code with gapic-generator-python 1.4.4

This PR should contain the code in https://github.com/googleapis/python-aiplatform/pull/1760 that was recently rolled back.

Created 2 weeks ago

chore: Roll back #1760

PiperOrigin-RevId: 487000978

chore: add _VertexAiPipelineBasedService base class

PiperOrigin-RevId: 487064823

fix: Print error for schema classes

PiperOrigin-RevId: 487277666

Merge branch 'main' into regenerate-code-with-gapic-1-4-4

Created 2 weeks ago
pull request closed
feat: add annotation_labels to ImportDataConfig in aiplatform v1 dataset.proto
  • [ ] Regenerate this pull request now.

feat: add start_time to BatchReadFeatureValuesRequest in aiplatform v1 featurestore_service.proto feat: add metadata_artifact to Model in aiplatform v1 model.proto feat: add failed_main_jobs and failed_pre_caching_check_jobs to ContainerDetail in aiplatform v1 pipeline_job.proto feat: add persist_ml_use_assignment to InputDataConfig in aiplatform v1 training_pipeline.proto

PiperOrigin-RevId: 485963171

Source-Link: https://github.com/googleapis/googleapis/commit/9691f513420651ce2d303f3e056f88054702b2c3

Source-Link: https://github.com/googleapis/googleapis-gen/commit/85710316f329dfc73f244610eb019924e4580a56 Copy-Tag: eyJwIjoiLmdpdGh1Yi8uT3dsQm90LnlhbWwiLCJoIjoiODU3MTAzMTZmMzI5ZGZjNzNmMjQ0NjEwZWIwMTk5MjRlNDU4MGE1NiJ9

BEGIN_NESTED_COMMIT feat: add NVIDIA_A100_80GB to AcceleratorType in aiplatform v1beta1 accelerator_type.proto feat: add annotation_labels to ImportDataConfig in aiplatform v1beta1 dataset.proto feat: add total_deployed_model_count and total_endpoint_count to QueryDeployedModelsResponse in aiplatform v1beta1 deployment_resource_pool_service.proto feat: add start_time to BatchReadFeatureValuesRequest in aiplatform v1beta1 featurestore_service.proto feat: add metadata_artifact to Model in aiplatform v1beta1 model.proto feat: add failed_main_jobs and failed_pre_caching_check_jobs to ContainerDetail in aiplatform v1beta1 pipeline_job.proto feat: add persist_ml_use_assignment to InputDataConfig in aiplatform v1beta1 training_pipeline.proto

PiperOrigin-RevId: 485963130

Source-Link: https://github.com/googleapis/googleapis/commit/af14709a8a400efb90758b6a7751034a7b4cadf5

Source-Link: https://github.com/googleapis/googleapis-gen/commit/3d9d484a0104e0ccc4367769b79305c2cb6fc3d8 Copy-Tag: eyJwIjoiLmdpdGh1Yi8uT3dsQm90LnlhbWwiLCJoIjoiM2Q5ZDQ4NGEwMTA0ZTBjY2M0MzY3NzY5Yjc5MzA1YzJjYjZmYzNkOCJ9 END_NESTED_COMMIT BEGIN_NESTED_COMMIT chore: update to gapic-generator-python 1.5.0 PiperOrigin-RevId: 485148946

Source-Link: https://github.com/googleapis/googleapis/commit/a94d5d453a09f4bc874c07b511eac77b10ea9949

Source-Link: https://github.com/googleapis/googleapis-gen/commit/48917f4f84ce5625dbd3e3021cb8c97298c20ab8 Copy-Tag: eyJwIjoiLmdpdGh1Yi8uT3dsQm90LnlhbWwiLCJoIjoiNDg5MTdmNGY4NGNlNTYyNWRiZDNlMzAyMWNiOGM5NzI5OGMyMGFiOCJ9 END_NESTED_COMMIT BEGIN_NESTED_COMMIT chore: update to gapic-generator-python 1.5.0 feat: add support for google.cloud.<api>.__version__ PiperOrigin-RevId: 484665853

Source-Link: https://github.com/googleapis/googleapis/commit/8eb249a19db926c2fbc4ecf1dc09c0e521a88b22

Source-Link: https://github.com/googleapis/googleapis-gen/commit/c8aa327b5f478865fc3fd91e3c2768e54e26ad44 Copy-Tag: eyJwIjoiLmdpdGh1Yi8uT3dsQm90LnlhbWwiLCJoIjoiYzhhYTMyN2I1ZjQ3ODg2NWZjM2ZkOTFlM2MyNzY4ZTU0ZTI2YWQ0NCJ9 END_NESTED_COMMIT

Created 2 weeks ago
dizcology deleted branch parthea-patch-2
Created 2 weeks ago

chore: configure release-please to use manifest (#296)

Created 2 weeks ago
pull request closed
chore: configure release-please to use manifest

This PR fixes an issue where gapic_version.py is not being updated by release-please. In PR https://github.com/googleapis/python-analytics-admin/pull/293, the version in .release-please-manifest.json is updated but the version in google/analytics/admin/gapic_version.py should also be updated.

Created 2 weeks ago