Build/Test Explorer

TestFusion
Invocation status: Failed

Kokoro: cloud-devrel/client-libraries/python/googleapis/python-aiplatform/continuous/system

2 targets evaluated for 3 hr, 47 min, 9 sec
by kokoro-github-subscriber
2 Failed

Showing build.log

Warning: Permanently added 'localhost' (ED25519) to the list of known hosts.
[12:33:18 PST] Transferring environment variable script to build VM
[12:33:19 PST] Transferring kokoro_log_reader.py to build VM
[12:33:20 PST] Transferring source code to build VM
[12:33:32 PST] Executing build script on build VM



[ID: 8515517] Executing command via SSH:
export KOKORO_BUILD_NUMBER="3228"
export KOKORO_JOB_NAME="cloud-devrel/client-libraries/python/googleapis/python-aiplatform/continuous/system"
source /tmpfs/kokoro-env_vars.sh
cd /tmpfs/src/
chmod 755 github/python-aiplatform/.kokoro/trampoline.sh
PYTHON_3_VERSION="$(pyenv which python3 2> /dev/null || which python3)"
PYTHON_2_VERSION="$(pyenv which python2 2> /dev/null || which python2)"
if "$PYTHON_3_VERSION" -c "import psutil" ; then KOKORO_PYTHON_COMMAND="$PYTHON_3_VERSION" ; else KOKORO_PYTHON_COMMAND="$PYTHON_2_VERSION" ; fi > /dev/null 2>&1
echo "export KOKORO_PYTHON_COMMAND="$KOKORO_PYTHON_COMMAND"" > "$HOME/.kokoro_python_vars"
nohup bash -c "( rm -f /tmpfs/kokoro_build_exit_code ; github/python-aiplatform/.kokoro/trampoline.sh ; echo \${PIPESTATUS[0]} > /tmpfs/kokoro_build_exit_code ) > /tmpfs/kokoro_build.log 2>&1" > /dev/null 2>&1 &
echo $! > /tmpfs/kokoro_build.pid
source "$HOME/.kokoro_python_vars"
"$KOKORO_PYTHON_COMMAND" /tmpfs/kokoro_log_reader.py /tmpfs/kokoro_build.log /tmpfs/kokoro_build_exit_code /tmpfs/kokoro_build.pid /tmpfs/kokoro_log_reader.pid --start_byte 0
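The tail of that command launches the build in the background and then runs kokoro_log_reader.py against the growing log file from a byte offset (`--start_byte 0`). A minimal sketch of that read-from-offset polling pattern, with illustrative names (this is not Kokoro's actual implementation):

```python
# Illustrative sketch of polling a log file from a byte offset, as the
# kokoro_log_reader.py invocation above does; names are hypothetical.
def read_new_bytes(path, start_byte):
    """Return (new_text, next_offset) for content appended after start_byte."""
    with open(path, "rb") as f:
        f.seek(start_byte)
        data = f.read()
    return data.decode("utf-8", errors="replace"), start_byte + len(data)
```

A caller would invoke this in a loop, feeding each returned offset back in as the next `start_byte`, until the exit-code file appears.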

2024-12-18 12:33:33 Creating folder on disk for secrets: /tmpfs/src/gfile/secret_manager
Activated service account credentials for: [kokoro-trampoline@cloud-devrel-kokoro-resources.iam.gserviceaccount.com]
WARNING: Your config file at [/home/kbuilder/.docker/config.json] contains these credential helper entries:

{
  "credHelpers": {
    "gcr.io": "gcr",
    "us.gcr.io": "gcr",
    "asia.gcr.io": "gcr",
    "staging-k8s.gcr.io": "gcr",
    "eu.gcr.io": "gcr"
  }
}
These will be overwritten.
Docker configuration file updated.
Using default tag: latest
latest: Pulling from cloud-devrel-kokoro-resources/python-multi
Digest: sha256:9bb9bd3cd602cf04a361ef46d2f32e514ed06ab856e0fb889349572ec028d89a
Status: Downloaded newer image for gcr.io/cloud-devrel-kokoro-resources/python-multi:latest
gcr.io/cloud-devrel-kokoro-resources/python-multi:latest
Executing: docker run --rm --interactive --network=host --privileged --volume=/var/run/docker.sock:/var/run/docker.sock --workdir=/tmpfs/src --entrypoint=github/python-aiplatform/.kokoro/build.sh --env-file=/tmpfs/tmp/tmpbhbppm29/envfile --volume=/tmpfs:/tmpfs gcr.io/cloud-devrel-kokoro-resources/python-multi
KOKORO_KEYSTORE_DIR=/tmpfs/src/keystore
KOKORO_GITHUB_COMMIT_URL=https://github.com/googleapis/python-aiplatform/commit/ede026619f874189c5cc0edb6e5494591fac609c
KOKORO_JOB_NAME=cloud-devrel/client-libraries/python/googleapis/python-aiplatform/continuous/system
KOKORO_JOB_CLUSTER=GCP_UBUNTU
KOKORO_GIT_COMMIT=ede026619f874189c5cc0edb6e5494591fac609c
KOKORO_BLAZE_DIR=/tmpfs/src/objfs
KOKORO_ROOT=/tmpfs
KOKORO_JOB_TYPE=CONTINUOUS_INTEGRATION
KOKORO_ROOT_DIR=/tmpfs/
KOKORO_BUILD_NUMBER=3228
KOKORO_JOB_POOL=yoshi-ubuntu
KOKORO_GITHUB_COMMIT=ede026619f874189c5cc0edb6e5494591fac609c
KOKORO_BUILD_INITIATOR=kokoro-github-subscriber
KOKORO_ARTIFACTS_DIR=/tmpfs/src
KOKORO_BUILD_ID=d9eda141-e757-4bfd-8d7e-9b41050a4fcc
KOKORO_GFILE_DIR=/tmpfs/src/gfile
KOKORO_BUILD_CONFIG_DIR=
KOKORO_POSIX_ROOT=/tmpfs
KOKORO_BUILD_ARTIFACTS_SUBDIR=prod/cloud-devrel/client-libraries/python/googleapis/python-aiplatform/continuous/system/3228/20241218-123227
WARNING: Skipping nox-automation as it is not installed.

[notice] A new release of pip is available: 23.0.1 -> 24.3.1
[notice] To update, run: pip install --upgrade pip
2024.10.9
nox > Running session system-3.10
nox > Creating virtual environment (virtualenv) using python3.10 in .nox/system-3-10
nox > python -m pip install --pre 'grpcio!=1.52.0rc1'
nox > python -m pip install mock pytest google-cloud-testutils -c /tmpfs/src/github/python-aiplatform/testing/constraints-3.10.txt
nox > python -m pip install -e '.[testing]' -c /tmpfs/src/github/python-aiplatform/testing/constraints-3.10.txt
nox > py.test -v --junitxml=system_3.10_sponge_log.xml tests/system
/tmpfs/src/github/python-aiplatform/.nox/system-3-10/lib/python3.10/site-packages/pytest_asyncio/plugin.py:207: PytestDeprecationWarning: The configuration option "asyncio_default_fixture_loop_scope" is unset.
The event loop scope for asynchronous fixtures will default to the fixture caching scope. Future versions of pytest-asyncio will default the loop scope for asynchronous fixtures to function scope. Set the default fixture loop scope explicitly in order to avoid unexpected behavior in the future. Valid fixture loop scopes are: "function", "class", "module", "package", "session"

warnings.warn(PytestDeprecationWarning(_DEFAULT_FIXTURE_LOOP_SCOPE_UNSET))
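As the warning suggests, the deprecation can be silenced by setting the loop scope explicitly. A minimal sketch in pytest.ini, assuming the ini option name documented by pytest-asyncio:

```ini
[pytest]
asyncio_default_fixture_loop_scope = function
```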
============================= test session starts ==============================
platform linux -- Python 3.10.15, pytest-8.3.4, pluggy-1.5.0 -- /tmpfs/src/github/python-aiplatform/.nox/system-3-10/bin/python
cachedir: .pytest_cache
rootdir: /tmpfs/src/github/python-aiplatform
plugins: xdist-3.3.1, asyncio-0.25.0, anyio-3.7.1
asyncio: mode=strict, asyncio_default_fixture_loop_scope=None
created: 16/16 workers
16 workers [248 items]

scheduling tests via LoadScopeScheduling

tests/system/aiplatform/test_dataset.py::TestDataset::test_get_existing_dataset
tests/system/aiplatform/test_autologging.py::TestAutologging::test_autologging_with_autorun_creation
tests/system/aiplatform/test_custom_job.py::TestCustomJob::test_from_local_script_prebuilt_container
tests/system/aiplatform/test_featurestore.py::TestFeaturestore::test_create_get_list_featurestore
tests/system/aiplatform/test_experiments.py::TestExperiments::test_create_experiment
tests/system/aiplatform/test_e2e_metadata_schema.py::TestMetadataSchema::test_system_dataset_artifact_create
tests/system/aiplatform/test_e2e_forecasting.py::TestEndToEndForecasting1::test_end_to_end_forecasting[AutoMLForecastingTrainingJob]
tests/system/aiplatform/test_e2e_forecasting.py::TestEndToEndForecasting3::test_end_to_end_forecasting[TemporalFusionTransformerForecastingTrainingJob]
tests/system/aiplatform/test_initializer.py::TestInitializer::test_init_calls_set_google_auth_default
tests/system/aiplatform/test_e2e_forecasting.py::TestEndToEndForecasting2::test_end_to_end_forecasting[SequenceToSequencePlusForecastingTrainingJob]
tests/system/aiplatform/test_e2e_forecasting.py::TestEndToEndForecasting4::test_end_to_end_forecasting[TimeSeriesDenseEncoderForecastingTrainingJob]
tests/system/aiplatform/test_e2e_tabular.py::TestEndToEndTabular::test_end_to_end_tabular
tests/system/aiplatform/test_matching_engine_index.py::TestMatchingEngine::test_create_get_list_matching_engine_index
tests/system/aiplatform/test_batch_prediction.py::TestBatchPredictionJob::test_model_monitoring
[gw3] [ 0%] SKIPPED tests/system/aiplatform/test_dataset.py::TestDataset::test_get_existing_dataset
tests/system/aiplatform/test_dataset.py::TestDataset::test_get_nonexistent_dataset
[gw13] [ 0%] PASSED tests/system/aiplatform/test_initializer.py::TestInitializer::test_init_calls_set_google_auth_default
tests/system/aiplatform/test_experiment_model.py::TestExperimentModel::test_sklearn_model
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_text_generation[grpc]
tests/system/aiplatform/test_initializer.py::TestInitializer::test_init_rest_async_incorrect_credentials
[gw13] [ 1%] PASSED tests/system/aiplatform/test_initializer.py::TestInitializer::test_init_rest_async_incorrect_credentials
tests/system/aiplatform/test_pipeline_job.py::TestPipelineJob::test_add_pipeline_job_to_experiment
[gw3] [ 1%] PASSED tests/system/aiplatform/test_dataset.py::TestDataset::test_get_nonexistent_dataset
tests/system/aiplatform/test_dataset.py::TestDataset::test_get_new_dataset_and_import
[gw8] [ 2%] PASSED tests/system/aiplatform/test_e2e_metadata_schema.py::TestMetadataSchema::test_system_dataset_artifact_create
tests/system/aiplatform/test_e2e_metadata_schema.py::TestMetadataSchema::test_google_dataset_artifact_create
[gw8] [ 2%] PASSED tests/system/aiplatform/test_e2e_metadata_schema.py::TestMetadataSchema::test_google_dataset_artifact_create
tests/system/aiplatform/test_e2e_metadata_schema.py::TestMetadataSchema::test_execution_create_using_system_schema_class
[gw14] [ 2%] PASSED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_text_generation[grpc]
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_text_generation[rest]
[gw8] [ 3%] PASSED tests/system/aiplatform/test_e2e_metadata_schema.py::TestMetadataSchema::test_execution_create_using_system_schema_class
[gw14] [ 3%] PASSED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_text_generation[rest]
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_text_generation_preview_count_tokens[grpc]
tests/system/aiplatform/test_project_id_inference.py::TestProjectIDInference::test_project_id_inference
[gw14] [ 4%] PASSED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_text_generation_preview_count_tokens[grpc]
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_text_generation_preview_count_tokens[rest]
[gw14] [ 4%] PASSED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_text_generation_preview_count_tokens[rest]
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_text_generation_model_predict_async
[gw11] [ 4%] FAILED tests/system/aiplatform/test_experiments.py::TestExperiments::test_create_experiment
tests/system/aiplatform/test_experiments.py::TestExperiments::test_get_experiment
[gw11] [ 5%] PASSED tests/system/aiplatform/test_experiments.py::TestExperiments::test_get_experiment
tests/system/aiplatform/test_experiments.py::TestExperiments::test_start_run
[gw10] [ 5%] PASSED tests/system/aiplatform/test_experiment_model.py::TestExperimentModel::test_sklearn_model
tests/system/aiplatform/test_experiment_model.py::TestExperimentModel::test_xgboost_booster_with_custom_uri
[gw14] [ 6%] PASSED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_text_generation_model_predict_async
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_text_generation_streaming[grpc]
[gw14] [ 6%] PASSED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_text_generation_streaming[grpc]
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_text_generation_streaming[rest]
[gw14] [ 6%] PASSED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_text_generation_streaming[rest]
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_preview_text_generation_from_pretrained[grpc]
[gw11] [ 7%] FAILED tests/system/aiplatform/test_experiments.py::TestExperiments::test_start_run
tests/system/aiplatform/test_experiments.py::TestExperiments::test_get_run
[gw11] [ 7%] FAILED tests/system/aiplatform/test_experiments.py::TestExperiments::test_get_run
tests/system/aiplatform/test_experiments.py::TestExperiments::test_log_params
[gw14] [ 8%] PASSED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_preview_text_generation_from_pretrained[grpc]
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_preview_text_generation_from_pretrained[rest]
[gw10] [ 8%] PASSED tests/system/aiplatform/test_experiment_model.py::TestExperimentModel::test_xgboost_booster_with_custom_uri
tests/system/aiplatform/test_experiment_model.py::TestExperimentModel::test_xgboost_xgbmodel_with_custom_names
[gw14] [ 8%] PASSED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_preview_text_generation_from_pretrained[rest]
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_chat_on_chat_model[grpc]
[gw11] [ 9%] FAILED tests/system/aiplatform/test_experiments.py::TestExperiments::test_log_params
tests/system/aiplatform/test_experiments.py::TestExperiments::test_log_metrics
[gw11] [ 9%] FAILED tests/system/aiplatform/test_experiments.py::TestExperiments::test_log_metrics
tests/system/aiplatform/test_experiments.py::TestExperiments::test_log_time_series_metrics
[gw14] [ 10%] PASSED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_chat_on_chat_model[grpc]
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_chat_on_chat_model[rest]
[gw10] [ 10%] PASSED tests/system/aiplatform/test_experiment_model.py::TestExperimentModel::test_xgboost_xgbmodel_with_custom_names
tests/system/aiplatform/test_experiment_model.py::TestExperimentModel::test_tensorflow_keras_model_with_input_example
[gw11] [ 10%] FAILED tests/system/aiplatform/test_experiments.py::TestExperiments::test_log_time_series_metrics
tests/system/aiplatform/test_experiments.py::TestExperiments::test_get_time_series_data_frame_batch_read_success
[gw0] [ 11%] FAILED tests/system/aiplatform/test_autologging.py::TestAutologging::test_autologging_with_autorun_creation
tests/system/aiplatform/test_autologging.py::TestAutologging::test_autologging_with_manual_run_creation
[gw14] [ 11%] PASSED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_chat_on_chat_model[rest]
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_chat_model_preview_count_tokens[grpc]
[gw14] [ 12%] PASSED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_chat_model_preview_count_tokens[grpc]
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_chat_model_preview_count_tokens[rest]
[gw14] [ 12%] PASSED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_chat_model_preview_count_tokens[rest]
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_chat_model_async
[gw14] [ 12%] PASSED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_chat_model_async
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_chat_model_send_message_streaming[grpc]
[gw14] [ 13%] PASSED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_chat_model_send_message_streaming[grpc]
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_chat_model_send_message_streaming[rest]
[gw14] [ 13%] PASSED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_chat_model_send_message_streaming[rest]
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_text_embedding[grpc]
[gw14] [ 14%] PASSED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_text_embedding[grpc]
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_text_embedding[rest]
[gw14] [ 14%] PASSED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_text_embedding[rest]
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_text_embedding_async
[gw14] [ 14%] PASSED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_text_embedding_async
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_tuning[grpc]
[gw14] [ 15%] SKIPPED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_tuning[grpc]
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_tuning[rest]
[gw14] [ 15%] SKIPPED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_tuning[rest]
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_batch_prediction_for_text_generation[grpc]
[gw11] [ 16%] PASSED tests/system/aiplatform/test_experiments.py::TestExperiments::test_get_time_series_data_frame_batch_read_success
tests/system/aiplatform/test_experiments.py::TestExperiments::test_log_classification_metrics
[gw11] [ 16%] FAILED tests/system/aiplatform/test_experiments.py::TestExperiments::test_log_classification_metrics
tests/system/aiplatform/test_experiments.py::TestExperiments::test_log_model
[gw11] [ 16%] FAILED tests/system/aiplatform/test_experiments.py::TestExperiments::test_log_model
tests/system/aiplatform/test_experiments.py::TestExperiments::test_create_artifact
[gw0] [ 17%] PASSED tests/system/aiplatform/test_autologging.py::TestAutologging::test_autologging_with_manual_run_creation
tests/system/aiplatform/test_autologging.py::TestAutologging::test_autologging_nested_run_model
[gw11] [ 17%] PASSED tests/system/aiplatform/test_experiments.py::TestExperiments::test_create_artifact
tests/system/aiplatform/test_experiments.py::TestExperiments::test_get_artifact_by_uri
[gw10] [ 18%] PASSED tests/system/aiplatform/test_experiment_model.py::TestExperimentModel::test_tensorflow_keras_model_with_input_example
tests/system/aiplatform/test_experiment_model.py::TestExperimentModel::test_tensorflow_module_with_gpu_container
[gw11] [ 18%] PASSED tests/system/aiplatform/test_experiments.py::TestExperiments::test_get_artifact_by_uri
tests/system/aiplatform/test_experiments.py::TestExperiments::test_log_execution_and_artifact
[gw11] [ 18%] FAILED tests/system/aiplatform/test_experiments.py::TestExperiments::test_log_execution_and_artifact
tests/system/aiplatform/test_experiments.py::TestExperiments::test_end_run
[gw11] [ 19%] FAILED tests/system/aiplatform/test_experiments.py::TestExperiments::test_end_run
tests/system/aiplatform/test_experiments.py::TestExperiments::test_run_context_manager
[gw11] [ 19%] PASSED tests/system/aiplatform/test_experiments.py::TestExperiments::test_run_context_manager
tests/system/aiplatform/test_experiments.py::TestExperiments::test_add_pipeline_job_to_experiment
[gw0] [ 20%] PASSED tests/system/aiplatform/test_autologging.py::TestAutologging::test_autologging_nested_run_model
[gw10] [ 20%] PASSED tests/system/aiplatform/test_experiment_model.py::TestExperimentModel::test_tensorflow_module_with_gpu_container
tests/system/aiplatform/test_experiment_model.py::TestExperimentModel::test_deploy_model_with_cpu_container
[gw14] [ 20%] PASSED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_batch_prediction_for_text_generation[grpc]
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_batch_prediction_for_text_generation[rest]
tests/system/aiplatform/test_tensorboard.py::TestTensorboard::test_create_and_get_tensorboard
[gw0] [ 21%] PASSED tests/system/aiplatform/test_tensorboard.py::TestTensorboard::test_create_and_get_tensorboard
tests/system/aiplatform/test_tensorboard.py::TestTensorboard::test_create_and_get_tensorboard_experiment
[gw0] [ 21%] PASSED tests/system/aiplatform/test_tensorboard.py::TestTensorboard::test_create_and_get_tensorboard_experiment
tests/system/aiplatform/test_tensorboard.py::TestTensorboard::test_create_and_get_tensorboard_run
[gw0] [ 22%] PASSED tests/system/aiplatform/test_tensorboard.py::TestTensorboard::test_create_and_get_tensorboard_run
tests/system/aiplatform/test_tensorboard.py::TestTensorboard::test_create_and_get_tensorboard_time_series
[gw0] [ 22%] PASSED tests/system/aiplatform/test_tensorboard.py::TestTensorboard::test_create_and_get_tensorboard_time_series
tests/system/aiplatform/test_tensorboard.py::TestTensorboard::test_write_tensorboard_scalar_data
[gw0] [ 22%] PASSED tests/system/aiplatform/test_tensorboard.py::TestTensorboard::test_write_tensorboard_scalar_data
tests/system/vertex_ray/test_cluster_management.py::TestClusterManagement::test_cluster_management[2.9]
[gw14] [ 23%] PASSED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_batch_prediction_for_text_generation[rest]
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_batch_prediction_for_textembedding[grpc]
[gw14] [ 23%] PASSED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_batch_prediction_for_textembedding[grpc]
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_batch_prediction_for_textembedding[rest]
[gw13] [ 24%] PASSED tests/system/aiplatform/test_pipeline_job.py::TestPipelineJob::test_add_pipeline_job_to_experiment
tests/system/aiplatform/test_prediction_cpr.py::TestPredictionCpr::test_build_cpr_model_upload_and_deploy
[gw14] [ 24%] PASSED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_batch_prediction_for_textembedding[rest]
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_batch_prediction_for_code_generation[grpc]
[gw11] [ 25%] PASSED tests/system/aiplatform/test_experiments.py::TestExperiments::test_add_pipeline_job_to_experiment
tests/system/aiplatform/test_experiments.py::TestExperiments::test_get_experiments_df
[gw11] [ 25%] FAILED tests/system/aiplatform/test_experiments.py::TestExperiments::test_get_experiments_df
tests/system/aiplatform/test_experiments.py::TestExperiments::test_get_experiments_df_include_time_series_false
[gw14] [ 25%] PASSED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_batch_prediction_for_code_generation[grpc]
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_batch_prediction_for_code_generation[rest]
[gw11] [ 26%] FAILED tests/system/aiplatform/test_experiments.py::TestExperiments::test_get_experiments_df_include_time_series_false
tests/system/aiplatform/test_experiments.py::TestExperiments::test_delete_run_does_not_exist_raises_exception
[gw11] [ 26%] FAILED tests/system/aiplatform/test_experiments.py::TestExperiments::test_delete_run_does_not_exist_raises_exception
tests/system/aiplatform/test_experiments.py::TestExperiments::test_delete_run_success
[gw11] [ 27%] PASSED tests/system/aiplatform/test_experiments.py::TestExperiments::test_delete_run_success
tests/system/aiplatform/test_experiments.py::TestExperiments::test_reuse_run_success
[gw10] [ 27%] PASSED tests/system/aiplatform/test_experiment_model.py::TestExperimentModel::test_deploy_model_with_cpu_container
tests/system/aiplatform/test_experiment_model.py::TestExperimentModel::test_deploy_model_with_gpu_container
[gw11] [ 27%] PASSED tests/system/aiplatform/test_experiments.py::TestExperiments::test_reuse_run_success
tests/system/aiplatform/test_experiments.py::TestExperiments::test_delete_run_then_tensorboard_success
[gw11] [ 28%] PASSED tests/system/aiplatform/test_experiments.py::TestExperiments::test_delete_run_then_tensorboard_success
tests/system/aiplatform/test_experiments.py::TestExperiments::test_delete_wout_backing_tensorboard_reuse_run_raises_exception
[gw11] [ 28%] PASSED tests/system/aiplatform/test_experiments.py::TestExperiments::test_delete_wout_backing_tensorboard_reuse_run_raises_exception
tests/system/aiplatform/test_experiments.py::TestExperiments::test_delete_experiment_does_not_exist_raises_exception
[gw11] [ 29%] PASSED tests/system/aiplatform/test_experiments.py::TestExperiments::test_delete_experiment_does_not_exist_raises_exception
tests/system/aiplatform/test_experiments.py::TestExperiments::test_init_associates_global_tensorboard_to_experiment
[gw11] [ 29%] PASSED tests/system/aiplatform/test_experiments.py::TestExperiments::test_init_associates_global_tensorboard_to_experiment
tests/system/aiplatform/test_experiments.py::TestExperiments::test_get_backing_tensorboard_resource_returns_tensorboard
[gw11] [ 29%] PASSED tests/system/aiplatform/test_experiments.py::TestExperiments::test_get_backing_tensorboard_resource_returns_tensorboard
tests/system/aiplatform/test_experiments.py::TestExperiments::test_get_backing_tensorboard_resource_returns_none
[gw14] [ 30%] PASSED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_batch_prediction_for_code_generation[rest]
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_code_generation_streaming[grpc]
[gw11] [ 30%] PASSED tests/system/aiplatform/test_experiments.py::TestExperiments::test_get_backing_tensorboard_resource_returns_none
tests/system/aiplatform/test_experiments.py::TestExperiments::test_delete_backing_tensorboard_experiment_run_success
[gw14] [ 31%] PASSED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_code_generation_streaming[grpc]
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_code_generation_streaming[rest]
[gw11] [ 31%] PASSED tests/system/aiplatform/test_experiments.py::TestExperiments::test_delete_backing_tensorboard_experiment_run_success
[gw14] [ 31%] PASSED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_code_generation_streaming[rest]
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_code_chat_model_send_message_streaming[grpc]
tests/system/vertexai/test_batch_prediction.py::TestBatchPrediction::test_batch_prediction_with_gcs_input
[gw14] [ 32%] FAILED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_code_chat_model_send_message_streaming[grpc]
tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_code_chat_model_send_message_streaming[rest]
[gw14] [ 32%] PASSED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_code_chat_model_send_message_streaming[rest]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_with_cached_content_from_text[grpc-PROD_ENDPOINT]
[gw13] [ 33%] FAILED tests/system/aiplatform/test_prediction_cpr.py::TestPredictionCpr::test_build_cpr_model_upload_and_deploy
tests/system/aiplatform/test_private_endpoint.py::TestPrivateEndpoint::test_create_deploy_delete_private_endpoint
[gw8] [ 33%] PASSED tests/system/aiplatform/test_project_id_inference.py::TestProjectIDInference::test_project_id_inference
tests/system/aiplatform/test_telemetry.py::TestTelemetry::test_single_context_manager
[gw8] [ 33%] PASSED tests/system/aiplatform/test_telemetry.py::TestTelemetry::test_single_context_manager
tests/system/aiplatform/test_telemetry.py::TestTelemetry::test_nested_context_manager
[gw8] [ 34%] PASSED tests/system/aiplatform/test_telemetry.py::TestTelemetry::test_nested_context_manager
tests/system/vertexai/test_reasoning_engines.py::TestReasoningEngines::test_langchain_template
[gw14] [ 34%] FAILED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_with_cached_content_from_text[grpc-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_with_cached_content_from_text[rest-PROD_ENDPOINT]
[gw12] [ 35%] PASSED tests/system/aiplatform/test_featurestore.py::TestFeaturestore::test_create_get_list_featurestore
tests/system/aiplatform/test_featurestore.py::TestFeaturestore::test_create_get_list_entity_types
[gw12] [ 35%] PASSED tests/system/aiplatform/test_featurestore.py::TestFeaturestore::test_create_get_list_entity_types
tests/system/aiplatform/test_featurestore.py::TestFeaturestore::test_create_get_list_features
[gw12] [ 35%] PASSED tests/system/aiplatform/test_featurestore.py::TestFeaturestore::test_create_get_list_features
tests/system/aiplatform/test_featurestore.py::TestFeaturestore::test_ingest_feature_values
[gw2] [ 36%] PASSED tests/system/aiplatform/test_custom_job.py::TestCustomJob::test_from_local_script_prebuilt_container
tests/system/aiplatform/test_custom_job.py::TestCustomJob::test_from_local_script_custom_container
[gw10] [ 36%] PASSED tests/system/aiplatform/test_experiment_model.py::TestExperimentModel::test_deploy_model_with_gpu_container
[gw3] [ 37%] PASSED tests/system/aiplatform/test_dataset.py::TestDataset::test_get_new_dataset_and_import
tests/system/aiplatform/test_dataset.py::TestDataset::test_create_and_import_image_dataset
tests/system/aiplatform/test_vizier.py::TestVizier::test_vizier_lifecycle
[gw10] [ 37%] PASSED tests/system/aiplatform/test_vizier.py::TestVizier::test_vizier_lifecycle
tests/system/aiplatform/test_vizier.py::TestVizier::test_vizier_study_deletion
[gw10] [ 37%] PASSED tests/system/aiplatform/test_vizier.py::TestVizier::test_vizier_study_deletion
tests/system/aiplatform/test_vizier.py::TestVizier::test_vizier_trial_deletion
[gw2] [ 38%] PASSED tests/system/aiplatform/test_custom_job.py::TestCustomJob::test_from_local_script_custom_container
tests/system/aiplatform/test_custom_job.py::TestCustomJob::test_from_local_script_enable_autolog_prebuilt_container
[gw10] [ 38%] PASSED tests/system/aiplatform/test_vizier.py::TestVizier::test_vizier_trial_deletion
[gw14] [ 39%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_with_cached_content_from_text[rest-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_from_text[grpc-PROD_ENDPOINT]
[gw14] [ 39%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_from_text[grpc-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_from_text[rest-PROD_ENDPOINT]
[gw14] [ 39%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_from_text[rest-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_latency[grpc-PROD_ENDPOINT]
[gw13] [ 40%] PASSED tests/system/aiplatform/test_private_endpoint.py::TestPrivateEndpoint::test_create_deploy_delete_private_endpoint
tests/system/vertex_ray/test_ray_data.py::TestRayData::test_ray_data[2.9]
[gw14] [ 40%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_latency[grpc-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_latency[rest-PROD_ENDPOINT]
[gw14] [ 41%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_latency[rest-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_async[grpc-PROD_ENDPOINT]
[gw14] [ 41%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_async[grpc-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_async[rest-PROD_ENDPOINT]
[gw14] [ 41%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_async[rest-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_streaming[grpc-PROD_ENDPOINT]
[gw8] [ 42%] FAILED tests/system/vertexai/test_reasoning_engines.py::TestReasoningEngines::test_langchain_template
[gw14] [ 42%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_streaming[grpc-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_streaming[rest-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_local[_get_tokenizer_for_model_preview-gemini-1.0-pro-udhr-udhr-PROD_ENDPOINT]
[gw14] [ 43%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_streaming[rest-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_streaming_async[grpc-PROD_ENDPOINT]
[gw14] [ 43%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_streaming_async[grpc-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_streaming_async[rest-PROD_ENDPOINT]
[gw14] [ 43%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_streaming_async[rest-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_with_parameters[grpc-PROD_ENDPOINT]
[gw14] [ 44%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_with_parameters[grpc-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_with_parameters[rest-PROD_ENDPOINT]
[gw14] [ 44%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_with_parameters[rest-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_with_gemini_15_parameters[grpc-PROD_ENDPOINT]
[gw14] [ 45%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_with_gemini_15_parameters[grpc-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_with_gemini_15_parameters[rest-PROD_ENDPOINT]
[gw14] [ 45%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_with_gemini_15_parameters[rest-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_from_list_of_content_dict[grpc-PROD_ENDPOINT]
[gw14] [ 45%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_from_list_of_content_dict[grpc-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_from_list_of_content_dict[rest-PROD_ENDPOINT]
[gw14] [ 46%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_from_list_of_content_dict[rest-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_from_remote_image[grpc-PROD_ENDPOINT]
[gw14] [ 46%] SKIPPED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_from_remote_image[grpc-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_from_remote_image[rest-PROD_ENDPOINT]
[gw14] [ 47%] SKIPPED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_from_remote_image[rest-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_from_text_and_remote_image[grpc-PROD_ENDPOINT]
[gw14] [ 47%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_from_text_and_remote_image[grpc-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_from_text_and_remote_image[rest-PROD_ENDPOINT]
[gw8] [ 47%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_local[_get_tokenizer_for_model_preview-gemini-1.0-pro-udhr-udhr-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_local[_get_tokenizer_for_model_preview-gemini-1.5-pro-udhr-udhr-PROD_ENDPOINT]
[gw14] [ 48%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_from_text_and_remote_image[rest-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_from_text_and_remote_video[grpc-PROD_ENDPOINT]
[gw14] [ 48%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_from_text_and_remote_video[grpc-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_from_text_and_remote_video[rest-PROD_ENDPOINT]
[gw0] [ 49%] PASSED tests/system/vertex_ray/test_cluster_management.py::TestClusterManagement::test_cluster_management[2.9]
tests/system/vertex_ray/test_cluster_management.py::TestClusterManagement::test_cluster_management[2.33]
[gw14] [ 49%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_from_text_and_remote_video[rest-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_from_text_and_remote_audio[grpc-PROD_ENDPOINT]
[gw8] [ 50%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_local[_get_tokenizer_for_model_preview-gemini-1.5-pro-udhr-udhr-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_local[_get_tokenizer_for_model_preview-gemini-1.5-flash-udhr-udhr-PROD_ENDPOINT]
[gw14] [ 50%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_from_text_and_remote_audio[grpc-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_from_text_and_remote_audio[rest-PROD_ENDPOINT]
[gw14] [ 50%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_from_text_and_remote_audio[rest-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_grounding_google_search_retriever[grpc-PROD_ENDPOINT]
[gw14] [ 51%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_grounding_google_search_retriever[grpc-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_grounding_google_search_retriever[rest-PROD_ENDPOINT]
[gw14] [ 51%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_grounding_google_search_retriever[rest-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_grounding_google_search_retriever_with_dynamic_retrieval[grpc-PROD_ENDPOINT]
[gw14] [ 52%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_grounding_google_search_retriever_with_dynamic_retrieval[grpc-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_grounding_google_search_retriever_with_dynamic_retrieval[rest-PROD_ENDPOINT]
[gw8] [ 52%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_local[_get_tokenizer_for_model_preview-gemini-1.5-flash-udhr-udhr-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_local[_get_tokenizer_for_model_preview-gemini-1.5-flash-002-udhr-udhr-PROD_ENDPOINT]
[gw14] [ 52%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_grounding_google_search_retriever_with_dynamic_retrieval[rest-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_send_message_from_text[grpc-PROD_ENDPOINT]
[gw14] [ 53%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_send_message_from_text[grpc-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_send_message_from_text[rest-PROD_ENDPOINT]
[gw14] [ 53%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_send_message_from_text[rest-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_chat_function_calling[grpc-PROD_ENDPOINT]
[gw14] [ 54%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_chat_function_calling[grpc-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_chat_function_calling[rest-PROD_ENDPOINT]
[gw14] [ 54%] FAILED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_chat_function_calling[rest-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_function_calling[grpc-PROD_ENDPOINT]
[gw14] [ 54%] FAILED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_function_calling[grpc-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_function_calling[rest-PROD_ENDPOINT]
[gw14] [ 55%] FAILED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_function_calling[rest-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_model_router[grpc-PROD_ENDPOINT]
[gw14] [ 55%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_model_router[grpc-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_model_router[rest-PROD_ENDPOINT]
[gw14] [ 56%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_model_router[rest-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_chat_automatic_function_calling[grpc-PROD_ENDPOINT]
[gw14] [ 56%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_chat_automatic_function_calling[grpc-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_chat_automatic_function_calling[rest-PROD_ENDPOINT]
[gw14] [ 56%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_chat_automatic_function_calling[rest-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_additional_request_metadata[grpc-PROD_ENDPOINT]
[gw14] [ 57%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_additional_request_metadata[grpc-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_additional_request_metadata[rest-PROD_ENDPOINT]
[gw8] [ 57%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_local[_get_tokenizer_for_model_preview-gemini-1.5-flash-002-udhr-udhr-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_local[_get_tokenizer_for_model_preview-gemini-1.5-pro-002-udhr-udhr-PROD_ENDPOINT]
[gw14] [ 58%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_additional_request_metadata[rest-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_compute_tokens_from_text[grpc-PROD_ENDPOINT]
[gw14] [ 58%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_compute_tokens_from_text[grpc-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_compute_tokens_from_text[rest-PROD_ENDPOINT]
[gw14] [ 58%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_compute_tokens_from_text[rest-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_count_tokens_from_text[grpc-PROD_ENDPOINT]
[gw14] [ 59%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_count_tokens_from_text[grpc-PROD_ENDPOINT]
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_count_tokens_from_text[rest-PROD_ENDPOINT]
[gw14] [ 59%] PASSED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_count_tokens_from_text[rest-PROD_ENDPOINT]
[gw12] [ 60%] PASSED tests/system/aiplatform/test_featurestore.py::TestFeaturestore::test_ingest_feature_values
tests/system/aiplatform/test_featurestore.py::TestFeaturestore::test_batch_create_features
[gw12] [ 60%] PASSED tests/system/aiplatform/test_featurestore.py::TestFeaturestore::test_batch_create_features
tests/system/aiplatform/test_featurestore.py::TestFeaturestore::test_ingest_feature_values_from_df_using_feature_time_column_and_online_read_multiple_entities
[gw8] [ 60%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_local[_get_tokenizer_for_model_preview-gemini-1.5-pro-002-udhr-udhr-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_local[get_tokenizer_for_model-gemini-1.0-pro-udhr-udhr-PROD_ENDPOINT]
[gw8] [ 61%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_local[get_tokenizer_for_model-gemini-1.0-pro-udhr-udhr-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_local[get_tokenizer_for_model-gemini-1.5-pro-udhr-udhr-PROD_ENDPOINT]
[gw11] [ 61%] PASSED tests/system/vertexai/test_batch_prediction.py::TestBatchPrediction::test_batch_prediction_with_gcs_input
tests/system/vertexai/test_batch_prediction.py::TestBatchPrediction::test_batch_prediction_with_bq_input
[gw8] [ 62%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_local[get_tokenizer_for_model-gemini-1.5-pro-udhr-udhr-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_local[get_tokenizer_for_model-gemini-1.5-flash-udhr-udhr-PROD_ENDPOINT]
[gw8] [ 62%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_local[get_tokenizer_for_model-gemini-1.5-flash-udhr-udhr-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_local[get_tokenizer_for_model-gemini-1.5-flash-002-udhr-udhr-PROD_ENDPOINT]
[gw2] [ 62%] PASSED tests/system/aiplatform/test_custom_job.py::TestCustomJob::test_from_local_script_enable_autolog_prebuilt_container
tests/system/aiplatform/test_custom_job.py::TestCustomJob::test_from_local_script_enable_autolog_custom_container
[gw8] [ 63%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_local[get_tokenizer_for_model-gemini-1.5-flash-002-udhr-udhr-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_local[get_tokenizer_for_model-gemini-1.5-pro-002-udhr-udhr-PROD_ENDPOINT]
[gw8] [ 63%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_local[get_tokenizer_for_model-gemini-1.5-pro-002-udhr-udhr-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_compute_tokens[_get_tokenizer_for_model_preview-gemini-1.0-pro-udhr-udhr-PROD_ENDPOINT]
[gw8] [ 64%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_compute_tokens[_get_tokenizer_for_model_preview-gemini-1.0-pro-udhr-udhr-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_compute_tokens[_get_tokenizer_for_model_preview-gemini-1.5-pro-udhr-udhr-PROD_ENDPOINT]
[gw8] [ 64%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_compute_tokens[_get_tokenizer_for_model_preview-gemini-1.5-pro-udhr-udhr-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_compute_tokens[_get_tokenizer_for_model_preview-gemini-1.5-flash-udhr-udhr-PROD_ENDPOINT]
[gw2] [ 64%] PASSED tests/system/aiplatform/test_custom_job.py::TestCustomJob::test_from_local_script_enable_autolog_custom_container
[gw8] [ 65%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_compute_tokens[_get_tokenizer_for_model_preview-gemini-1.5-flash-udhr-udhr-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_compute_tokens[_get_tokenizer_for_model_preview-gemini-1.5-flash-002-udhr-udhr-PROD_ENDPOINT]
[gw8] [ 65%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_compute_tokens[_get_tokenizer_for_model_preview-gemini-1.5-flash-002-udhr-udhr-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_compute_tokens[_get_tokenizer_for_model_preview-gemini-1.5-pro-002-udhr-udhr-PROD_ENDPOINT]
[gw12] [ 66%] PASSED tests/system/aiplatform/test_featurestore.py::TestFeaturestore::test_ingest_feature_values_from_df_using_feature_time_column_and_online_read_multiple_entities
tests/system/aiplatform/test_featurestore.py::TestFeaturestore::test_ingest_feature_values_from_df_using_feature_time_datetime_and_online_read_single_entity
[gw8] [ 66%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_compute_tokens[_get_tokenizer_for_model_preview-gemini-1.5-pro-002-udhr-udhr-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_compute_tokens[get_tokenizer_for_model-gemini-1.0-pro-udhr-udhr-PROD_ENDPOINT]
[gw3] [ 66%] PASSED tests/system/aiplatform/test_dataset.py::TestDataset::test_create_and_import_image_dataset
tests/system/aiplatform/test_dataset.py::TestDataset::test_create_tabular_dataset
[gw3] [ 67%] PASSED tests/system/aiplatform/test_dataset.py::TestDataset::test_create_tabular_dataset
tests/system/aiplatform/test_dataset.py::TestDataset::test_create_tabular_dataset_from_dataframe
[gw3] [ 67%] PASSED tests/system/aiplatform/test_dataset.py::TestDataset::test_create_tabular_dataset_from_dataframe
tests/system/aiplatform/test_dataset.py::TestDataset::test_create_tabular_dataset_from_dataframe_with_provided_schema
[gw3] [ 68%] PASSED tests/system/aiplatform/test_dataset.py::TestDataset::test_create_tabular_dataset_from_dataframe_with_provided_schema
tests/system/aiplatform/test_dataset.py::TestDataset::test_create_time_series_dataset
[gw3] [ 68%] PASSED tests/system/aiplatform/test_dataset.py::TestDataset::test_create_time_series_dataset
tests/system/aiplatform/test_dataset.py::TestDataset::test_export_data
[gw3] [ 68%] PASSED tests/system/aiplatform/test_dataset.py::TestDataset::test_export_data
tests/system/aiplatform/test_dataset.py::TestDataset::test_export_data_for_custom_training
[gw3] [ 69%] PASSED tests/system/aiplatform/test_dataset.py::TestDataset::test_export_data_for_custom_training
tests/system/aiplatform/test_dataset.py::TestDataset::test_update_dataset
[gw8] [ 69%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_compute_tokens[get_tokenizer_for_model-gemini-1.0-pro-udhr-udhr-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_compute_tokens[get_tokenizer_for_model-gemini-1.5-pro-udhr-udhr-PROD_ENDPOINT]
[gw3] [ 70%] PASSED tests/system/aiplatform/test_dataset.py::TestDataset::test_update_dataset
[gw1] [ 70%] PASSED tests/system/aiplatform/test_batch_prediction.py::TestBatchPredictionJob::test_model_monitoring
tests/system/aiplatform/test_model_evaluation.py::TestModelEvaluationJob::test_model_evaluate_custom_tabular_model
[gw0] [ 70%] PASSED tests/system/vertex_ray/test_cluster_management.py::TestClusterManagement::test_cluster_management[2.33]
tests/system/vertex_ray/test_job_submission_dashboard.py::TestJobSubmissionDashboard::test_job_submission_dashboard[2.9]
[gw8] [ 71%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_compute_tokens[get_tokenizer_for_model-gemini-1.5-pro-udhr-udhr-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_compute_tokens[get_tokenizer_for_model-gemini-1.5-flash-udhr-udhr-PROD_ENDPOINT]
[gw8] [ 71%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_compute_tokens[get_tokenizer_for_model-gemini-1.5-flash-udhr-udhr-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_compute_tokens[get_tokenizer_for_model-gemini-1.5-flash-002-udhr-udhr-PROD_ENDPOINT]
[gw13] [ 72%] PASSED tests/system/vertex_ray/test_ray_data.py::TestRayData::test_ray_data[2.9]
tests/system/vertex_ray/test_ray_data.py::TestRayData::test_ray_data[2.33]
[gw8] [ 72%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_compute_tokens[get_tokenizer_for_model-gemini-1.5-flash-002-udhr-udhr-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_compute_tokens[get_tokenizer_for_model-gemini-1.5-pro-002-udhr-udhr-PROD_ENDPOINT]
[gw11] [ 72%] PASSED tests/system/vertexai/test_batch_prediction.py::TestBatchPrediction::test_batch_prediction_with_bq_input
tests/system/vertexai/test_prompts.py::TestPrompts::test_create_prompt_with_variables
[gw11] [ 73%] PASSED tests/system/vertexai/test_prompts.py::TestPrompts::test_create_prompt_with_variables
tests/system/vertexai/test_prompts.py::TestPrompts::test_create_prompt_with_function_calling
[gw11] [ 73%] PASSED tests/system/vertexai/test_prompts.py::TestPrompts::test_create_prompt_with_function_calling
tests/system/vertexai/test_prompts.py::TestPrompts::test_get_prompt_with_variables
[gw11] [ 74%] PASSED tests/system/vertexai/test_prompts.py::TestPrompts::test_get_prompt_with_variables
tests/system/vertexai/test_prompts.py::TestPrompts::test_get_prompt_with_function_calling
[gw11] [ 74%] PASSED tests/system/vertexai/test_prompts.py::TestPrompts::test_get_prompt_with_function_calling
[gw8] [ 75%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_compute_tokens[get_tokenizer_for_model-gemini-1.5-pro-002-udhr-udhr-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_system_instruction[gemini-1.0-pro-PROD_ENDPOINT]
[gw8] [ 75%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_system_instruction[gemini-1.0-pro-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_system_instruction[gemini-1.5-pro-PROD_ENDPOINT]
[gw8] [ 75%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_system_instruction[gemini-1.5-pro-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_system_instruction[gemini-1.5-flash-PROD_ENDPOINT]
[gw8] [ 76%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_system_instruction[gemini-1.5-flash-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_system_instruction[gemini-1.5-flash-002-PROD_ENDPOINT]
[gw8] [ 76%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_system_instruction[gemini-1.5-flash-002-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_system_instruction[gemini-1.5-pro-002-PROD_ENDPOINT]
[gw8] [ 77%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_system_instruction[gemini-1.5-pro-002-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_system_instruction_is_function_call[gemini-1.0-pro-PROD_ENDPOINT]
[gw8] [ 77%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_system_instruction_is_function_call[gemini-1.0-pro-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_system_instruction_is_function_call[gemini-1.5-pro-PROD_ENDPOINT]
[gw8] [ 77%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_system_instruction_is_function_call[gemini-1.5-pro-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_system_instruction_is_function_call[gemini-1.5-flash-PROD_ENDPOINT]
[gw8] [ 78%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_system_instruction_is_function_call[gemini-1.5-flash-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_system_instruction_is_function_call[gemini-1.5-flash-002-PROD_ENDPOINT]
[gw8] [ 78%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_system_instruction_is_function_call[gemini-1.5-flash-002-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_system_instruction_is_function_call[gemini-1.5-pro-002-PROD_ENDPOINT]
[gw8] [ 79%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_system_instruction_is_function_call[gemini-1.5-pro-002-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_system_instruction_is_function_response[gemini-1.0-pro-PROD_ENDPOINT]
[gw8] [ 79%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_system_instruction_is_function_response[gemini-1.0-pro-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_system_instruction_is_function_response[gemini-1.5-pro-PROD_ENDPOINT]
[gw8] [ 79%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_system_instruction_is_function_response[gemini-1.5-pro-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_system_instruction_is_function_response[gemini-1.5-flash-PROD_ENDPOINT]
[gw8] [ 80%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_system_instruction_is_function_response[gemini-1.5-flash-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_system_instruction_is_function_response[gemini-1.5-flash-002-PROD_ENDPOINT]
[gw8] [ 80%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_system_instruction_is_function_response[gemini-1.5-flash-002-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_system_instruction_is_function_response[gemini-1.5-pro-002-PROD_ENDPOINT]
[gw8] [ 81%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_system_instruction_is_function_response[gemini-1.5-pro-002-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_tool_is_function_declaration[gemini-1.0-pro-PROD_ENDPOINT]
[gw8] [ 81%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_tool_is_function_declaration[gemini-1.0-pro-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_tool_is_function_declaration[gemini-1.5-pro-PROD_ENDPOINT]
[gw8] [ 81%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_tool_is_function_declaration[gemini-1.5-pro-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_tool_is_function_declaration[gemini-1.5-flash-PROD_ENDPOINT]
[gw8] [ 82%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_tool_is_function_declaration[gemini-1.5-flash-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_tool_is_function_declaration[gemini-1.5-flash-002-PROD_ENDPOINT]
[gw8] [ 82%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_tool_is_function_declaration[gemini-1.5-flash-002-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_tool_is_function_declaration[gemini-1.5-pro-002-PROD_ENDPOINT]
[gw8] [ 83%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_tool_is_function_declaration[gemini-1.5-pro-002-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_content_is_function_call[gemini-1.0-pro-PROD_ENDPOINT]
[gw8] [ 83%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_content_is_function_call[gemini-1.0-pro-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_content_is_function_call[gemini-1.5-pro-PROD_ENDPOINT]
[gw8] [ 83%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_content_is_function_call[gemini-1.5-pro-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_content_is_function_call[gemini-1.5-flash-PROD_ENDPOINT]
[gw8] [ 84%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_content_is_function_call[gemini-1.5-flash-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_content_is_function_call[gemini-1.5-flash-002-PROD_ENDPOINT]
[gw8] [ 84%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_content_is_function_call[gemini-1.5-flash-002-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_content_is_function_call[gemini-1.5-pro-002-PROD_ENDPOINT]
[gw8] [ 85%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_content_is_function_call[gemini-1.5-pro-002-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_content_is_function_response[gemini-1.0-pro-PROD_ENDPOINT]
[gw8] [ 85%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_content_is_function_response[gemini-1.0-pro-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_content_is_function_response[gemini-1.5-pro-PROD_ENDPOINT]
[gw8] [ 85%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_content_is_function_response[gemini-1.5-pro-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_content_is_function_response[gemini-1.5-flash-PROD_ENDPOINT]
[gw8] [ 86%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_content_is_function_response[gemini-1.5-flash-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_content_is_function_response[gemini-1.5-flash-002-PROD_ENDPOINT]
[gw8] [ 86%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_content_is_function_response[gemini-1.5-flash-002-PROD_ENDPOINT]
tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_content_is_function_response[gemini-1.5-pro-002-PROD_ENDPOINT]
[gw8] [ 87%] PASSED tests/system/vertexai/test_tokenization.py::TestTokenization::test_count_tokens_content_is_function_response[gemini-1.5-pro-002-PROD_ENDPOINT]
[gw12] [ 87%] PASSED tests/system/aiplatform/test_featurestore.py::TestFeaturestore::test_ingest_feature_values_from_df_using_feature_time_datetime_and_online_read_single_entity
tests/system/aiplatform/test_featurestore.py::TestFeaturestore::test_write_features
[gw12] [ 87%] PASSED tests/system/aiplatform/test_featurestore.py::TestFeaturestore::test_write_features
tests/system/aiplatform/test_featurestore.py::TestFeaturestore::test_search_features
[gw12] [ 88%] PASSED tests/system/aiplatform/test_featurestore.py::TestFeaturestore::test_search_features
tests/system/aiplatform/test_featurestore.py::TestFeaturestore::test_batch_serve_to_df
[gw12] [ 88%] PASSED tests/system/aiplatform/test_featurestore.py::TestFeaturestore::test_batch_serve_to_df
tests/system/aiplatform/test_featurestore.py::TestFeaturestore::test_batch_serve_to_gcs
[gw0] [ 89%] PASSED tests/system/vertex_ray/test_job_submission_dashboard.py::TestJobSubmissionDashboard::test_job_submission_dashboard[2.9]
tests/system/vertex_ray/test_job_submission_dashboard.py::TestJobSubmissionDashboard::test_job_submission_dashboard[2.33]
[gw13] [ 89%] PASSED tests/system/vertex_ray/test_ray_data.py::TestRayData::test_ray_data[2.33]
[gw12] [ 89%] PASSED tests/system/aiplatform/test_featurestore.py::TestFeaturestore::test_batch_serve_to_gcs
tests/system/aiplatform/test_featurestore.py::TestFeaturestore::test_batch_serve_to_bq
[gw12] [ 90%] PASSED tests/system/aiplatform/test_featurestore.py::TestFeaturestore::test_batch_serve_to_bq
tests/system/aiplatform/test_featurestore.py::TestFeaturestore::test_online_reads
[gw12] [ 90%] PASSED tests/system/aiplatform/test_featurestore.py::TestFeaturestore::test_online_reads
[gw1] [ 91%] PASSED tests/system/aiplatform/test_model_evaluation.py::TestModelEvaluationJob::test_model_evaluate_custom_tabular_model
[gw0] [ 91%] PASSED tests/system/vertex_ray/test_job_submission_dashboard.py::TestJobSubmissionDashboard::test_job_submission_dashboard[2.33]
[gw7] [ 91%] FAILED tests/system/aiplatform/test_e2e_forecasting.py::TestEndToEndForecasting4::test_end_to_end_forecasting[TimeSeriesDenseEncoderForecastingTrainingJob]
tests/system/aiplatform/test_model_version_management.py::TestVersionManagement::test_upload_deploy_manage_versioned_model
[gw7] [ 92%] FAILED tests/system/aiplatform/test_model_version_management.py::TestVersionManagement::test_upload_deploy_manage_versioned_model
[gw4] [ 92%] FAILED tests/system/aiplatform/test_e2e_forecasting.py::TestEndToEndForecasting1::test_end_to_end_forecasting[AutoMLForecastingTrainingJob]
tests/system/aiplatform/test_model_interactions.py::TestModelInteractions::test_prediction
[gw4] [ 93%] PASSED tests/system/aiplatform/test_model_interactions.py::TestModelInteractions::test_prediction
tests/system/aiplatform/test_model_interactions.py::TestModelInteractions::test_endpoint_predict_async
[gw4] [ 93%] PASSED tests/system/aiplatform/test_model_interactions.py::TestModelInteractions::test_endpoint_predict_async
[gw6] [ 93%] FAILED tests/system/aiplatform/test_e2e_forecasting.py::TestEndToEndForecasting2::test_end_to_end_forecasting[SequenceToSequencePlusForecastingTrainingJob]
tests/system/aiplatform/test_model_monitoring.py::TestModelDeploymentMonitoring::test_create_endpoint
[gw15] [ 94%] PASSED tests/system/aiplatform/test_matching_engine_index.py::TestMatchingEngine::test_create_get_list_matching_engine_index
tests/system/aiplatform/test_matching_engine_index.py::TestMatchingEngine::test_matching_engine_stream_index
[gw6] [ 94%] PASSED tests/system/aiplatform/test_model_monitoring.py::TestModelDeploymentMonitoring::test_create_endpoint
tests/system/aiplatform/test_model_monitoring.py::TestModelDeploymentMonitoring::test_mdm_two_models_one_valid_config
[gw6] [ 95%] FAILED tests/system/aiplatform/test_model_monitoring.py::TestModelDeploymentMonitoring::test_mdm_two_models_one_valid_config
tests/system/aiplatform/test_model_monitoring.py::TestModelDeploymentMonitoring::test_mdm_pause_and_update_config
[gw6] [ 95%] SKIPPED tests/system/aiplatform/test_model_monitoring.py::TestModelDeploymentMonitoring::test_mdm_pause_and_update_config
tests/system/aiplatform/test_model_monitoring.py::TestModelDeploymentMonitoring::test_mdm_two_models_two_valid_configs
[gw6] [ 95%] FAILED tests/system/aiplatform/test_model_monitoring.py::TestModelDeploymentMonitoring::test_mdm_two_models_two_valid_configs
tests/system/aiplatform/test_model_monitoring.py::TestModelDeploymentMonitoring::test_mdm_invalid_config_incorrect_model_id
[gw6] [ 96%] PASSED tests/system/aiplatform/test_model_monitoring.py::TestModelDeploymentMonitoring::test_mdm_invalid_config_incorrect_model_id
tests/system/aiplatform/test_model_monitoring.py::TestModelDeploymentMonitoring::test_mdm_invalid_config_xai
[gw6] [ 96%] PASSED tests/system/aiplatform/test_model_monitoring.py::TestModelDeploymentMonitoring::test_mdm_invalid_config_xai
tests/system/aiplatform/test_model_monitoring.py::TestModelDeploymentMonitoring::test_mdm_two_models_invalid_configs_xai
[gw6] [ 97%] PASSED tests/system/aiplatform/test_model_monitoring.py::TestModelDeploymentMonitoring::test_mdm_two_models_invalid_configs_xai
tests/system/aiplatform/test_model_monitoring.py::TestModelDeploymentMonitoring::test_mdm_notification_channel_alert_config
[gw6] [ 97%] FAILED tests/system/aiplatform/test_model_monitoring.py::TestModelDeploymentMonitoring::test_mdm_notification_channel_alert_config
[gw5] [ 97%] PASSED tests/system/aiplatform/test_e2e_forecasting.py::TestEndToEndForecasting3::test_end_to_end_forecasting[TemporalFusionTransformerForecastingTrainingJob]
tests/system/aiplatform/test_model_upload.py::TestModelUploadAndUpdate::test_upload_and_deploy_xgboost_model
[gw5] [ 98%] PASSED tests/system/aiplatform/test_model_upload.py::TestModelUploadAndUpdate::test_upload_and_deploy_xgboost_model
[gw9] [ 98%] PASSED tests/system/aiplatform/test_e2e_tabular.py::TestEndToEndTabular::test_end_to_end_tabular
tests/system/aiplatform/test_persistent_resource.py::TestPersistentResource::test_create_persistent_resource
[gw9] [ 99%] PASSED tests/system/aiplatform/test_persistent_resource.py::TestPersistentResource::test_create_persistent_resource
[gw15] [ 99%] PASSED tests/system/aiplatform/test_matching_engine_index.py::TestMatchingEngine::test_matching_engine_stream_index
tests/system/aiplatform/test_pipeline_job_schedule.py::TestPipelineJobSchedule::test_create_get_pause_resume_update_list
[gw15] [100%] PASSED tests/system/aiplatform/test_pipeline_job_schedule.py::TestPipelineJobSchedule::test_create_get_pause_resume_update_list

=================================== FAILURES ===================================
____________________ TestExperiments.test_create_experiment ____________________
[gw11] linux -- Python 3.10.15 /tmpfs/src/github/python-aiplatform/.nox/system-3-10/bin/python

args = (context {
name: "projects/580378083368/locations/us-central1/metadataStores/default/contexts/tmpvrtxsdk-e2e--543941... string_value: "projects/580378083368/locations/us-central1/tensorboards/43289971808796672"
}
}
}
}
,)
kwargs = {'metadata': [('x-goog-request-params', 'context.name=projects/580378083368/locations/us-central1/metadataStores/defau... grpc/1.51.3 gax/2.21.0 gapic/1.75.0+top_google_constructor_method+google.cloud.aiplatform.initializer._Config.init')]}

@functools.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
try:
> return callable_(*args, **kwargs)

.nox/system-3-10/lib/python3.10/site-packages/google/api_core/grpc_helpers.py:76:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
.nox/system-3-10/lib/python3.10/site-packages/grpc/_interceptor.py:247: in __call__
response, ignored_call = self._with_call(request,
.nox/system-3-10/lib/python3.10/site-packages/grpc/_interceptor.py:290: in _with_call
return call.result(), call
.nox/system-3-10/lib/python3.10/site-packages/grpc/_channel.py:343: in result
raise self
.nox/system-3-10/lib/python3.10/site-packages/grpc/_interceptor.py:274: in continuation
response, call = self._thunk(new_method).with_call(
.nox/system-3-10/lib/python3.10/site-packages/grpc/_channel.py:957: in with_call
return _end_unary_response_blocking(state, call, True, None)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

state =
call =
with_call = True, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
> raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.UNAVAILABLE
E details = "The service is currently unavailable."
E debug_error_string = "UNKNOWN:Error received from peer ipv4:74.125.142.95:443 {created_time:"2024-12-18T20:40:45.86594976+00:00", grpc_status:14, grpc_message:"The service is currently unavailable."}"
E >

.nox/system-3-10/lib/python3.10/site-packages/grpc/_channel.py:849: _InactiveRpcError

The above exception was the direct cause of the following exception:

self =
shared_state = {'bucket': , 'resources': [}

def test_create_experiment(self, shared_state):

# Truncating the name because of resource id constraints from the service
tensorboard = aiplatform.Tensorboard.create(
project=e2e_base._PROJECT,
location=e2e_base._LOCATION,
display_name=self._experiment_name,
)

shared_state["resources"] = [tensorboard]

> aiplatform.init(
project=e2e_base._PROJECT,
location=e2e_base._LOCATION,
experiment=self._experiment_name,
experiment_tensorboard=tensorboard,
)

tests/system/aiplatform/test_experiments.py:86:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
google/cloud/aiplatform/initializer.py:295: in init
metadata._experiment_tracker.set_experiment(
google/cloud/aiplatform/metadata/metadata.py:351: in set_experiment
experiment.assign_backing_tensorboard(tensorboard=backing_tb)
google/cloud/aiplatform/metadata/experiment_resources.py:636: in assign_backing_tensorboard
self._metadata_context.update(
google/cloud/aiplatform/metadata/context.py:310: in update
super().update(
google/cloud/aiplatform/metadata/resource.py:317: in update
update_gca_resource = self._update_resource(
google/cloud/aiplatform/metadata/context.py:350: in _update_resource
return client.update_context(context=resource)
google/cloud/aiplatform_v1/services/metadata_service/client.py:2539: in update_context
response = rpc(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/gapic_v1/method.py:131: in __call__
return wrapped_func(*args, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (context {
name: "projects/580378083368/locations/us-central1/metadataStores/default/contexts/tmpvrtxsdk-e2e--543941... string_value: "projects/580378083368/locations/us-central1/tensorboards/43289971808796672"
}
}
}
}
,)
kwargs = {'metadata': [('x-goog-request-params', 'context.name=projects/580378083368/locations/us-central1/metadataStores/defau... grpc/1.51.3 gax/2.21.0 gapic/1.75.0+top_google_constructor_method+google.cloud.aiplatform.initializer._Config.init')]}

@functools.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
try:
return callable_(*args, **kwargs)
except grpc.RpcError as exc:
> raise exceptions.from_grpc_error(exc) from exc
E google.api_core.exceptions.ServiceUnavailable: 503 The service is currently unavailable.

.nox/system-3-10/lib/python3.10/site-packages/google/api_core/grpc_helpers.py:78: ServiceUnavailable
------------------------------ Captured log call -------------------------------
INFO google.cloud.aiplatform.tensorboard.tensorboard_resource:base.py:85 Creating Tensorboard
INFO google.cloud.aiplatform.tensorboard.tensorboard_resource:base.py:88 Create Tensorboard backing LRO: projects/580378083368/locations/us-central1/tensorboards/43289971808796672/operations/2632524853176958976
INFO google.cloud.aiplatform.tensorboard.tensorboard_resource:base.py:113 Tensorboard created. Resource name: projects/580378083368/locations/us-central1/tensorboards/43289971808796672
INFO google.cloud.aiplatform.tensorboard.tensorboard_resource:base.py:114 To use this Tensorboard in another session:
INFO google.cloud.aiplatform.tensorboard.tensorboard_resource:base.py:115 tb = aiplatform.Tensorboard('projects/580378083368/locations/us-central1/tensorboards/43289971808796672')
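Annotation: the failure above is `StatusCode.UNAVAILABLE` (grpc_status 14), a transient server-side error that is generally safe to retry for idempotent calls. Notably, the failing `UpdateContext` stack goes straight from `gapic_v1/method.py` into `grpc_helpers.py` with no `retry_unary` frames, while the `GetContext` stack later in this log does show retry frames, so the write path appears to surface the first UNAVAILABLE directly. A minimal stdlib sketch of the retry-with-backoff pattern (all names here are hypothetical; this is not the SDK's actual retry implementation):

```python
import time

class TransientError(Exception):
    """Stand-in for a retryable error such as gRPC UNAVAILABLE (status 14)."""

def call_with_retry(fn, *, attempts=4, base_delay=0.01):
    """Retry fn on TransientError with exponential backoff, re-raising
    after the final attempt fails."""
    for attempt in range(attempts):
        try:
            return fn()
        except TransientError:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

# A callable that fails twice with a transient error, then succeeds --
# roughly what a flaky UpdateContext RPC looks like from the client side.
calls = {"n": 0}
def flaky_update_context():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TransientError("503 The service is currently unavailable.")
    return "ok"

result = call_with_retry(flaky_update_context)
print(result, calls["n"])  # → ok 3
```

Under this assumption the test failure is flaky-infrastructure rather than a code regression, and a rerun would likely pass.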
________________________ TestExperiments.test_start_run ________________________
[gw11] linux -- Python 3.10.15 /tmpfs/src/github/python-aiplatform/.nox/system-3-10/bin/python

args = (parent: "projects/ucaip-sample-tests/locations/us-central1/metadataStores/default"
context {
display_name: "run-1"
...truct_value {
}
}
}
}
}
context_id: "tmpvrtxsdk-e2e--5439416e-cac7-4418-8847-1010b13fb2a6-run-1"
,)
kwargs = {'metadata': [('x-goog-request-params', 'parent=projects/ucaip-sample-tests/locations/us-central1/metadataStores/defau...0 gapic/1.75.0+top_google_constructor_method+google.cloud.aiplatform.metadata.metadata._ExperimentTracker.start_run')]}

@functools.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
try:
> return callable_(*args, **kwargs)

.nox/system-3-10/lib/python3.10/site-packages/google/api_core/grpc_helpers.py:76:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
.nox/system-3-10/lib/python3.10/site-packages/grpc/_interceptor.py:247: in __call__
response, ignored_call = self._with_call(request,
.nox/system-3-10/lib/python3.10/site-packages/grpc/_interceptor.py:290: in _with_call
return call.result(), call
.nox/system-3-10/lib/python3.10/site-packages/grpc/_channel.py:343: in result
raise self
.nox/system-3-10/lib/python3.10/site-packages/grpc/_interceptor.py:274: in continuation
response, call = self._thunk(new_method).with_call(
.nox/system-3-10/lib/python3.10/site-packages/grpc/_channel.py:957: in with_call
return _end_unary_response_blocking(state, call, True, None)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

state =
call =
with_call = True, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
> raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.UNAVAILABLE
E details = "The service is currently unavailable."
E debug_error_string = "UNKNOWN:Error received from peer ipv4:74.125.142.95:443 {grpc_message:"The service is currently unavailable.", grpc_status:14, created_time:"2024-12-18T20:40:51.170667394+00:00"}"
E >

.nox/system-3-10/lib/python3.10/site-packages/grpc/_channel.py:849: _InactiveRpcError

The above exception was the direct cause of the following exception:

self =

def test_start_run(self):
aiplatform.init(
project=e2e_base._PROJECT,
location=e2e_base._LOCATION,
experiment=self._experiment_name,
)
> run = aiplatform.start_run(_RUN)

tests/system/aiplatform/test_experiments.py:111:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
google/cloud/aiplatform/metadata/metadata.py:491: in start_run
self._experiment_run = experiment_run_resource.ExperimentRun.create(
google/cloud/aiplatform/metadata/experiment_run_resource.py:756: in create
metadata_context = _create_context()
google/cloud/aiplatform/metadata/experiment_run_resource.py:739: in _create_context
return context.Context._create(
google/cloud/aiplatform/metadata/context.py:244: in _create
resource = cls._create_resource(
google/cloud/aiplatform/metadata/context.py:282: in _create_resource
return client.create_context(
google/cloud/aiplatform_v1/services/metadata_service/client.py:2197: in create_context
response = rpc(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/gapic_v1/method.py:131: in __call__
return wrapped_func(*args, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (parent: "projects/ucaip-sample-tests/locations/us-central1/metadataStores/default"
context {
display_name: "run-1"
...truct_value {
}
}
}
}
}
context_id: "tmpvrtxsdk-e2e--5439416e-cac7-4418-8847-1010b13fb2a6-run-1"
,)
kwargs = {'metadata': [('x-goog-request-params', 'parent=projects/ucaip-sample-tests/locations/us-central1/metadataStores/defau...0 gapic/1.75.0+top_google_constructor_method+google.cloud.aiplatform.metadata.metadata._ExperimentTracker.start_run')]}

@functools.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
try:
return callable_(*args, **kwargs)
except grpc.RpcError as exc:
> raise exceptions.from_grpc_error(exc) from exc
E google.api_core.exceptions.ServiceUnavailable: 503 The service is currently unavailable.

.nox/system-3-10/lib/python3.10/site-packages/google/api_core/grpc_helpers.py:78: ServiceUnavailable
----------------------------- Captured stdout call -----------------------------

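Annotation: `test_start_run` failed the same way, a 503 UNAVAILABLE, this time on `CreateContext`. One caveat worth noting before blanket-retrying: a create is not idempotent in the same way a read is, since a retry after a partially applied create can come back `ALREADY_EXISTS`. A small sketch of the distinction between transient and non-transient gRPC status codes, using the numeric codes from the gRPC spec (the helper name is hypothetical):

```python
# Numeric gRPC status codes as defined by the gRPC specification.
RETRYABLE = {14: "UNAVAILABLE", 4: "DEADLINE_EXCEEDED"}
NON_RETRYABLE = {3: "INVALID_ARGUMENT", 5: "NOT_FOUND", 6: "ALREADY_EXISTS"}

def is_transient(code: int) -> bool:
    """True for codes that generally indicate a transient server-side
    condition and are safe to retry for idempotent calls."""
    return code in RETRYABLE

print(is_transient(14))  # UNAVAILABLE, as in the two failures above -> True
print(is_transient(5))   # NOT_FOUND, as in test_get_run -> False
```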
_________________________ TestExperiments.test_get_run _________________________
[gw11] linux -- Python 3.10.15 /tmpfs/src/github/python-aiplatform/.nox/system-3-10/bin/python

self =

def test_get_run(self):
> run = aiplatform.ExperimentRun(
run_name=_RUN,
experiment=self._experiment_name,
project=e2e_base._PROJECT,
location=e2e_base._LOCATION,
)

tests/system/aiplatform/test_experiments.py:115:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
google/cloud/aiplatform/metadata/experiment_run_resource.py:173: in __init__
raise context_not_found
google/cloud/aiplatform/metadata/experiment_run_resource.py:162: in __init__
self._metadata_node = _get_context()
google/cloud/aiplatform/metadata/experiment_run_resource.py:151: in _get_context
run_context = context.Context(
google/cloud/aiplatform/metadata/resource.py:102: in __init__
self._gca_resource = getattr(self.api_client, self._getter_method)(
google/cloud/aiplatform_v1/services/metadata_service/client.py:2301: in get_context
response = rpc(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/gapic_v1/method.py:131: in __call__
return wrapped_func(*args, **kwargs)
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_unary.py:293: in retry_wrapped_func
return retry_target(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_unary.py:153: in retry_target
_retry_error_helper(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_base.py:212: in _retry_error_helper
raise final_exc from source_exc
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_unary.py:144: in retry_target
result = target()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (name: "projects/ucaip-sample-tests/locations/us-central1/metadataStores/default/contexts/tmpvrtxsdk-e2e--5439416e-cac7-4418-8847-1010b13fb2a6-run-1"
,)
kwargs = {'metadata': [('x-goog-request-params', 'name=projects/ucaip-sample-tests/locations/us-central1/metadataStores/default....75.0+top_google_constructor_method+google.cloud.aiplatform.metadata.experiment_run_resource.ExperimentRun.__init__')]}

@functools.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
try:
return callable_(*args, **kwargs)
except grpc.RpcError as exc:
> raise exceptions.from_grpc_error(exc) from exc
E google.api_core.exceptions.NotFound: 404 Resource not found.; GetContext is unable to find context resource with name: projects/580378083368/locations/us-central1/metadataStores/default/contexts/tmpvrtxsdk-e2e--5439416e-cac7-4418-8847-1010b13fb2a6-run-1

.nox/system-3-10/lib/python3.10/site-packages/google/api_core/grpc_helpers.py:78: NotFound
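Annotation: this 404 is consistent with a cascade of the previous failure rather than an independent bug: the context name the service cannot find is exactly the `context_id` that the failed `CreateContext` in `test_start_run` never got to create. A quick check, with both strings copied from the log above:

```python
import re

# From test_start_run: the CreateContext request that failed with 503.
create_request = (
    'context_id: "tmpvrtxsdk-e2e--5439416e-cac7-4418-8847-1010b13fb2a6-run-1"'
)
# From test_get_run: the NotFound error raised by GetContext.
not_found = (
    "404 Resource not found.; GetContext is unable to find context resource "
    "with name: projects/580378083368/locations/us-central1/metadataStores/"
    "default/contexts/tmpvrtxsdk-e2e--5439416e-cac7-4418-8847-1010b13fb2a6-run-1"
)

requested_id = re.search(r'context_id: "([^"]+)"', create_request).group(1)
missing_id = not_found.rsplit("/", 1)[-1]

# The ids match: the run was never created, so every later test that
# reads or resumes this run (test_get_run, test_log_params, ...) fails too.
print(requested_id == missing_id)  # → True
```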
_______________________ TestExperiments.test_log_params ________________________
[gw11] linux -- Python 3.10.15 /tmpfs/src/github/python-aiplatform/.nox/system-3-10/bin/python

self =

def test_log_params(self):
aiplatform.init(
project=e2e_base._PROJECT,
location=e2e_base._LOCATION,
experiment=self._experiment_name,
)
> aiplatform.start_run(_RUN, resume=True)

tests/system/aiplatform/test_experiments.py:130:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
google/cloud/aiplatform/metadata/metadata.py:480: in start_run
self._experiment_run = experiment_run_resource.ExperimentRun(
google/cloud/aiplatform/metadata/experiment_run_resource.py:173: in __init__
raise context_not_found
google/cloud/aiplatform/metadata/experiment_run_resource.py:162: in __init__
self._metadata_node = _get_context()
google/cloud/aiplatform/metadata/experiment_run_resource.py:151: in _get_context
run_context = context.Context(
google/cloud/aiplatform/metadata/resource.py:102: in __init__
self._gca_resource = getattr(self.api_client, self._getter_method)(
google/cloud/aiplatform_v1/services/metadata_service/client.py:2301: in get_context
response = rpc(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/gapic_v1/method.py:131: in __call__
return wrapped_func(*args, **kwargs)
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_unary.py:293: in retry_wrapped_func
return retry_target(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_unary.py:153: in retry_target
_retry_error_helper(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_base.py:212: in _retry_error_helper
raise final_exc from source_exc
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_unary.py:144: in retry_target
result = target()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (name: "projects/ucaip-sample-tests/locations/us-central1/metadataStores/default/contexts/tmpvrtxsdk-e2e--5439416e-cac7-4418-8847-1010b13fb2a6-run-1"
,)
kwargs = {'metadata': [('x-goog-request-params', 'name=projects/ucaip-sample-tests/locations/us-central1/metadataStores/default...0 gapic/1.75.0+top_google_constructor_method+google.cloud.aiplatform.metadata.metadata._ExperimentTracker.start_run')]}

@functools.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
try:
return callable_(*args, **kwargs)
except grpc.RpcError as exc:
> raise exceptions.from_grpc_error(exc) from exc
E google.api_core.exceptions.NotFound: 404 Resource not found.; GetContext is unable to find context resource with name: projects/580378083368/locations/us-central1/metadataStores/default/contexts/tmpvrtxsdk-e2e--5439416e-cac7-4418-8847-1010b13fb2a6-run-1

.nox/system-3-10/lib/python3.10/site-packages/google/api_core/grpc_helpers.py:78: NotFound
----------------------------- Captured stdout call -----------------------------
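Editor's note on the repeated failures above: every `aiplatform.start_run(_RUN, resume=True)` call fails with `NotFound` because the run context `...run-1` was never created (its creation presumably failed earlier in the suite), and `resume=True` requires an existing context. A minimal, hedged sketch of a resume-or-create guard — `NotFound` here is a local stand-in for `google.api_core.exceptions.NotFound`, and `start_run` is passed in rather than imported, so none of the real SDK wiring is assumed:

```python
class NotFound(Exception):
    """Stand-in for google.api_core.exceptions.NotFound."""

def resume_or_create(start_run, run_name):
    """Try to resume an existing experiment run; fall back to creating one.

    `start_run` stands in for a callable like `aiplatform.start_run`
    (hypothetical wiring, not the SDK's actual signature guarantees).
    """
    try:
        return start_run(run_name, resume=True)
    except NotFound:
        # The run context does not exist yet (e.g. earlier setup failed),
        # so create a fresh run instead of resuming.
        return start_run(run_name, resume=False)

# Fake start_run that raises NotFound on resume, mimicking the log above.
def fake_start_run(name, resume):
    if resume:
        raise NotFound(name)
    return f"created {name}"

result = resume_or_create(fake_start_run, "run-1")
```

This pattern only masks the symptom; the underlying issue in the log is that the context-creating test failed first, cascading into every dependent test.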

_______________________ TestExperiments.test_log_metrics _______________________
[gw11] linux -- Python 3.10.15 /tmpfs/src/github/python-aiplatform/.nox/system-3-10/bin/python

self =

def test_log_metrics(self):
aiplatform.init(
project=e2e_base._PROJECT,
location=e2e_base._LOCATION,
experiment=self._experiment_name,
)
> aiplatform.start_run(_RUN, resume=True)

tests/system/aiplatform/test_experiments.py:141:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
google/cloud/aiplatform/metadata/metadata.py:480: in start_run
self._experiment_run = experiment_run_resource.ExperimentRun(
google/cloud/aiplatform/metadata/experiment_run_resource.py:173: in __init__
raise context_not_found
google/cloud/aiplatform/metadata/experiment_run_resource.py:162: in __init__
self._metadata_node = _get_context()
google/cloud/aiplatform/metadata/experiment_run_resource.py:151: in _get_context
run_context = context.Context(
google/cloud/aiplatform/metadata/resource.py:102: in __init__
self._gca_resource = getattr(self.api_client, self._getter_method)(
google/cloud/aiplatform_v1/services/metadata_service/client.py:2301: in get_context
response = rpc(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/gapic_v1/method.py:131: in __call__
return wrapped_func(*args, **kwargs)
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_unary.py:293: in retry_wrapped_func
return retry_target(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_unary.py:153: in retry_target
_retry_error_helper(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_base.py:212: in _retry_error_helper
raise final_exc from source_exc
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_unary.py:144: in retry_target
result = target()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (name: "projects/ucaip-sample-tests/locations/us-central1/metadataStores/default/contexts/tmpvrtxsdk-e2e--5439416e-cac7-4418-8847-1010b13fb2a6-run-1"
,)
kwargs = {'metadata': [('x-goog-request-params', 'name=projects/ucaip-sample-tests/locations/us-central1/metadataStores/default...0 gapic/1.75.0+top_google_constructor_method+google.cloud.aiplatform.metadata.metadata._ExperimentTracker.start_run')]}

@functools.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
try:
return callable_(*args, **kwargs)
except grpc.RpcError as exc:
> raise exceptions.from_grpc_error(exc) from exc
E google.api_core.exceptions.NotFound: 404 Resource not found.; GetContext is unable to find context resource with name: projects/580378083368/locations/us-central1/metadataStores/default/contexts/tmpvrtxsdk-e2e--5439416e-cac7-4418-8847-1010b13fb2a6-run-1

.nox/system-3-10/lib/python3.10/site-packages/google/api_core/grpc_helpers.py:78: NotFound
----------------------------- Captured stdout call -----------------------------

_________________ TestExperiments.test_log_time_series_metrics _________________
[gw11] linux -- Python 3.10.15 /tmpfs/src/github/python-aiplatform/.nox/system-3-10/bin/python

self =

def test_log_time_series_metrics(self):
aiplatform.init(
project=e2e_base._PROJECT,
location=e2e_base._LOCATION,
experiment=self._experiment_name,
)

> aiplatform.start_run(_RUN, resume=True)

tests/system/aiplatform/test_experiments.py:153:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
google/cloud/aiplatform/metadata/metadata.py:480: in start_run
self._experiment_run = experiment_run_resource.ExperimentRun(
google/cloud/aiplatform/metadata/experiment_run_resource.py:173: in __init__
raise context_not_found
google/cloud/aiplatform/metadata/experiment_run_resource.py:162: in __init__
self._metadata_node = _get_context()
google/cloud/aiplatform/metadata/experiment_run_resource.py:151: in _get_context
run_context = context.Context(
google/cloud/aiplatform/metadata/resource.py:102: in __init__
self._gca_resource = getattr(self.api_client, self._getter_method)(
google/cloud/aiplatform_v1/services/metadata_service/client.py:2301: in get_context
response = rpc(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/gapic_v1/method.py:131: in __call__
return wrapped_func(*args, **kwargs)
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_unary.py:293: in retry_wrapped_func
return retry_target(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_unary.py:153: in retry_target
_retry_error_helper(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_base.py:212: in _retry_error_helper
raise final_exc from source_exc
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_unary.py:144: in retry_target
result = target()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (name: "projects/ucaip-sample-tests/locations/us-central1/metadataStores/default/contexts/tmpvrtxsdk-e2e--5439416e-cac7-4418-8847-1010b13fb2a6-run-1"
,)
kwargs = {'metadata': [('x-goog-request-params', 'name=projects/ucaip-sample-tests/locations/us-central1/metadataStores/default...0 gapic/1.75.0+top_google_constructor_method+google.cloud.aiplatform.metadata.metadata._ExperimentTracker.start_run')]}

@functools.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
try:
return callable_(*args, **kwargs)
except grpc.RpcError as exc:
> raise exceptions.from_grpc_error(exc) from exc
E google.api_core.exceptions.NotFound: 404 Resource not found.; GetContext is unable to find context resource with name: projects/580378083368/locations/us-central1/metadataStores/default/contexts/tmpvrtxsdk-e2e--5439416e-cac7-4418-8847-1010b13fb2a6-run-1

.nox/system-3-10/lib/python3.10/site-packages/google/api_core/grpc_helpers.py:78: NotFound
----------------------------- Captured stdout call -----------------------------

____________ TestAutologging.test_autologging_with_autorun_creation ____________
[gw0] linux -- Python 3.10.15 /tmpfs/src/github/python-aiplatform/.nox/system-3-10/bin/python

self = Index(['experiment_name', 'run_name', 'run_type', 'state', 'param.copy_X',
'param.fit_intercept', 'param.positi...ot_mean_squared_error', 'metric.training_r2_score',
'metric.training_mean_squared_error'],
dtype='object')
key = 'metric.training_mae'

def get_loc(self, key):
"""
Get integer location, slice or boolean mask for requested label.

Parameters
----------
key : label

Returns
-------
int if unique index, slice if monotonic index, else mask

Examples
--------
>>> unique_index = pd.Index(list('abc'))
>>> unique_index.get_loc('b')
1

>>> monotonic_index = pd.Index(list('abbc'))
>>> monotonic_index.get_loc('b')
slice(1, 3, None)

>>> non_monotonic_index = pd.Index(list('abcb'))
>>> non_monotonic_index.get_loc('b')
array([False, True, False, True])
"""
casted_key = self._maybe_cast_indexer(key)
try:
> return self._engine.get_loc(casted_key)

.nox/system-3-10/lib/python3.10/site-packages/pandas/core/indexes/base.py:3805:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
index.pyx:167: in pandas._libs.index.IndexEngine.get_loc
???
index.pyx:196: in pandas._libs.index.IndexEngine.get_loc
???
pandas/_libs/hashtable_class_helper.pxi:7081: in pandas._libs.hashtable.PyObjectHashTable.get_item
???
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

> ???
E KeyError: 'metric.training_mae'

pandas/_libs/hashtable_class_helper.pxi:7089: KeyError

The above exception was the direct cause of the following exception:

self =
shared_state = {'bucket': , 'resources': [}

def test_autologging_with_autorun_creation(self, shared_state):

aiplatform.init(
project=e2e_base._PROJECT,
location=e2e_base._LOCATION,
experiment=self._experiment_autocreate_scikit,
experiment_tensorboard=self._backing_tensorboard,
)

shared_state["resources"] = [self._backing_tensorboard]

shared_state["resources"].append(
aiplatform.metadata.metadata._experiment_tracker.experiment
)

aiplatform.autolog()

build_and_train_test_scikit_model()

# Confirm sklearn run, params, and metrics exist
experiment_df_scikit = aiplatform.get_experiment_df()
assert experiment_df_scikit["run_name"][0].startswith("sklearn-")
assert experiment_df_scikit["param.fit_intercept"][0] == "True"
> assert experiment_df_scikit["metric.training_mae"][0] > 0

tests/system/aiplatform/test_autologging.py:162:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
.nox/system-3-10/lib/python3.10/site-packages/pandas/core/frame.py:4102: in __getitem__
indexer = self.columns.get_loc(key)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = Index(['experiment_name', 'run_name', 'run_type', 'state', 'param.copy_X',
'param.fit_intercept', 'param.positi...ot_mean_squared_error', 'metric.training_r2_score',
'metric.training_mean_squared_error'],
dtype='object')
key = 'metric.training_mae'

def get_loc(self, key):
"""
Get integer location, slice or boolean mask for requested label.

Parameters
----------
key : label

Returns
-------
int if unique index, slice if monotonic index, else mask

Examples
--------
>>> unique_index = pd.Index(list('abc'))
>>> unique_index.get_loc('b')
1

>>> monotonic_index = pd.Index(list('abbc'))
>>> monotonic_index.get_loc('b')
slice(1, 3, None)

>>> non_monotonic_index = pd.Index(list('abcb'))
>>> non_monotonic_index.get_loc('b')
array([False, True, False, True])
"""
casted_key = self._maybe_cast_indexer(key)
try:
return self._engine.get_loc(casted_key)
except KeyError as err:
if isinstance(casted_key, slice) or (
isinstance(casted_key, abc.Iterable)
and any(isinstance(x, slice) for x in casted_key)
):
raise InvalidIndexError(key)
> raise KeyError(key) from err
E KeyError: 'metric.training_mae'

.nox/system-3-10/lib/python3.10/site-packages/pandas/core/indexes/base.py:3812: KeyError
------------------------------ Captured log setup ------------------------------
INFO google.cloud.aiplatform.tensorboard.tensorboard_resource:base.py:85 Creating Tensorboard
INFO google.cloud.aiplatform.tensorboard.tensorboard_resource:base.py:88 Create Tensorboard backing LRO: projects/580378083368/locations/us-central1/tensorboards/73126319340126208/operations/298534346292199424
INFO google.cloud.aiplatform.tensorboard.tensorboard_resource:base.py:113 Tensorboard created. Resource name: projects/580378083368/locations/us-central1/tensorboards/73126319340126208
INFO google.cloud.aiplatform.tensorboard.tensorboard_resource:base.py:114 To use this Tensorboard in another session:
INFO google.cloud.aiplatform.tensorboard.tensorboard_resource:base.py:115 tb = aiplatform.Tensorboard('projects/580378083368/locations/us-central1/tensorboards/73126319340126208')
----------------------------- Captured stdout call -----------------------------


------------------------------ Captured log call -------------------------------
INFO google.cloud.aiplatform.metadata.experiment_resources:experiment_resources.py:797 Associating projects/580378083368/locations/us-central1/metadataStores/default/contexts/tmpvrtxsdk-e2e--1cc8dd08-536e-4091-bff3-c355b46a7442-sklearn-2024-12-18-20-40-46-d89aa to Experiment: tmpvrtxsdk-e2e--1cc8dd08-536e-4091-bff3-c355b46a7442
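Editor's note on the `KeyError` above: the captured `Index` shows autologging produced columns such as `metric.training_r2_score` and `metric.training_mean_squared_error`, but not `metric.training_mae`, so plain bracket indexing raises. A hedged sketch of fail-soft column access on a small stand-in DataFrame (the column names below mirror the log but the data is invented):

```python
import pandas as pd

# Stand-in for the autologged experiment DataFrame; only the r2 column exists.
df = pd.DataFrame({"metric.training_r2_score": [0.9]})

# DataFrame.get returns a default (None) instead of raising KeyError,
# which lets a test assert on the column's absence explicitly.
missing = df.get("metric.training_mae")
present = "metric.training_mae" in df.columns
```

Checking `df.columns` first (or using `df.get`) turns the opaque `KeyError` into an explicit assertion about which metrics autologging actually recorded.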
_______________ TestExperiments.test_log_classification_metrics ________________
[gw11] linux -- Python 3.10.15 /tmpfs/src/github/python-aiplatform/.nox/system-3-10/bin/python

self =
shared_state = {'bucket': , 'resources': [}

def test_log_classification_metrics(self, shared_state):
aiplatform.init(
project=e2e_base._PROJECT,
location=e2e_base._LOCATION,
experiment=self._experiment_name,
)
> aiplatform.start_run(_RUN, resume=True)

tests/system/aiplatform/test_experiments.py:202:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
google/cloud/aiplatform/metadata/metadata.py:480: in start_run
self._experiment_run = experiment_run_resource.ExperimentRun(
google/cloud/aiplatform/metadata/experiment_run_resource.py:173: in __init__
raise context_not_found
google/cloud/aiplatform/metadata/experiment_run_resource.py:162: in __init__
self._metadata_node = _get_context()
google/cloud/aiplatform/metadata/experiment_run_resource.py:151: in _get_context
run_context = context.Context(
google/cloud/aiplatform/metadata/resource.py:102: in __init__
self._gca_resource = getattr(self.api_client, self._getter_method)(
google/cloud/aiplatform_v1/services/metadata_service/client.py:2301: in get_context
response = rpc(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/gapic_v1/method.py:131: in __call__
return wrapped_func(*args, **kwargs)
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_unary.py:293: in retry_wrapped_func
return retry_target(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_unary.py:153: in retry_target
_retry_error_helper(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_base.py:212: in _retry_error_helper
raise final_exc from source_exc
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_unary.py:144: in retry_target
result = target()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (name: "projects/ucaip-sample-tests/locations/us-central1/metadataStores/default/contexts/tmpvrtxsdk-e2e--5439416e-cac7-4418-8847-1010b13fb2a6-run-1"
,)
kwargs = {'metadata': [('x-goog-request-params', 'name=projects/ucaip-sample-tests/locations/us-central1/metadataStores/default...0 gapic/1.75.0+top_google_constructor_method+google.cloud.aiplatform.metadata.metadata._ExperimentTracker.start_run')]}

@functools.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
try:
return callable_(*args, **kwargs)
except grpc.RpcError as exc:
> raise exceptions.from_grpc_error(exc) from exc
E google.api_core.exceptions.NotFound: 404 Resource not found.; GetContext is unable to find context resource with name: projects/580378083368/locations/us-central1/metadataStores/default/contexts/tmpvrtxsdk-e2e--5439416e-cac7-4418-8847-1010b13fb2a6-run-1

.nox/system-3-10/lib/python3.10/site-packages/google/api_core/grpc_helpers.py:78: NotFound
----------------------------- Captured stdout call -----------------------------

________________________ TestExperiments.test_log_model ________________________
[gw11] linux -- Python 3.10.15 /tmpfs/src/github/python-aiplatform/.nox/system-3-10/bin/python

self =
shared_state = {'bucket': , 'resources': [}

def test_log_model(self, shared_state):
aiplatform.init(
project=e2e_base._PROJECT,
location=e2e_base._LOCATION,
experiment=self._experiment_name,
)
> aiplatform.start_run(_RUN, resume=True)

tests/system/aiplatform/test_experiments.py:227:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
google/cloud/aiplatform/metadata/metadata.py:480: in start_run
self._experiment_run = experiment_run_resource.ExperimentRun(
google/cloud/aiplatform/metadata/experiment_run_resource.py:173: in __init__
raise context_not_found
google/cloud/aiplatform/metadata/experiment_run_resource.py:162: in __init__
self._metadata_node = _get_context()
google/cloud/aiplatform/metadata/experiment_run_resource.py:151: in _get_context
run_context = context.Context(
google/cloud/aiplatform/metadata/resource.py:102: in __init__
self._gca_resource = getattr(self.api_client, self._getter_method)(
google/cloud/aiplatform_v1/services/metadata_service/client.py:2301: in get_context
response = rpc(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/gapic_v1/method.py:131: in __call__
return wrapped_func(*args, **kwargs)
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_unary.py:293: in retry_wrapped_func
return retry_target(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_unary.py:153: in retry_target
_retry_error_helper(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_base.py:212: in _retry_error_helper
raise final_exc from source_exc
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_unary.py:144: in retry_target
result = target()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (name: "projects/ucaip-sample-tests/locations/us-central1/metadataStores/default/contexts/tmpvrtxsdk-e2e--5439416e-cac7-4418-8847-1010b13fb2a6-run-1"
,)
kwargs = {'metadata': [('x-goog-request-params', 'name=projects/ucaip-sample-tests/locations/us-central1/metadataStores/default...0 gapic/1.75.0+top_google_constructor_method+google.cloud.aiplatform.metadata.metadata._ExperimentTracker.start_run')]}

@functools.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
try:
return callable_(*args, **kwargs)
except grpc.RpcError as exc:
> raise exceptions.from_grpc_error(exc) from exc
E google.api_core.exceptions.NotFound: 404 Resource not found.; GetContext is unable to find context resource with name: projects/580378083368/locations/us-central1/metadataStores/default/contexts/tmpvrtxsdk-e2e--5439416e-cac7-4418-8847-1010b13fb2a6-run-1

.nox/system-3-10/lib/python3.10/site-packages/google/api_core/grpc_helpers.py:78: NotFound
----------------------------- Captured stdout call -----------------------------

_______________ TestExperiments.test_log_execution_and_artifact ________________
[gw11] linux -- Python 3.10.15 /tmpfs/src/github/python-aiplatform/.nox/system-3-10/bin/python

self =
shared_state = {'bucket': , 'resources': [}

def test_log_execution_and_artifact(self, shared_state):
aiplatform.init(
project=e2e_base._PROJECT,
location=e2e_base._LOCATION,
experiment=self._experiment_name,
)
> aiplatform.start_run(_RUN, resume=True)

tests/system/aiplatform/test_experiments.py:288:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
google/cloud/aiplatform/metadata/metadata.py:480: in start_run
self._experiment_run = experiment_run_resource.ExperimentRun(
google/cloud/aiplatform/metadata/experiment_run_resource.py:173: in __init__
raise context_not_found
google/cloud/aiplatform/metadata/experiment_run_resource.py:162: in __init__
self._metadata_node = _get_context()
google/cloud/aiplatform/metadata/experiment_run_resource.py:151: in _get_context
run_context = context.Context(
google/cloud/aiplatform/metadata/resource.py:102: in __init__
self._gca_resource = getattr(self.api_client, self._getter_method)(
google/cloud/aiplatform_v1/services/metadata_service/client.py:2301: in get_context
response = rpc(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/gapic_v1/method.py:131: in __call__
return wrapped_func(*args, **kwargs)
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_unary.py:293: in retry_wrapped_func
return retry_target(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_unary.py:153: in retry_target
_retry_error_helper(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_base.py:212: in _retry_error_helper
raise final_exc from source_exc
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_unary.py:144: in retry_target
result = target()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (name: "projects/ucaip-sample-tests/locations/us-central1/metadataStores/default/contexts/tmpvrtxsdk-e2e--5439416e-cac7-4418-8847-1010b13fb2a6-run-1"
,)
kwargs = {'metadata': [('x-goog-request-params', 'name=projects/ucaip-sample-tests/locations/us-central1/metadataStores/default...0 gapic/1.75.0+top_google_constructor_method+google.cloud.aiplatform.metadata.metadata._ExperimentTracker.start_run')]}

@functools.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
try:
return callable_(*args, **kwargs)
except grpc.RpcError as exc:
> raise exceptions.from_grpc_error(exc) from exc
E google.api_core.exceptions.NotFound: 404 Resource not found.; GetContext is unable to find context resource with name: projects/580378083368/locations/us-central1/metadataStores/default/contexts/tmpvrtxsdk-e2e--5439416e-cac7-4418-8847-1010b13fb2a6-run-1

.nox/system-3-10/lib/python3.10/site-packages/google/api_core/grpc_helpers.py:78: NotFound
----------------------------- Captured stdout call -----------------------------

_________________________ TestExperiments.test_end_run _________________________
[gw11] linux -- Python 3.10.15 /tmpfs/src/github/python-aiplatform/.nox/system-3-10/bin/python

self =

def test_end_run(self):
aiplatform.init(
project=e2e_base._PROJECT,
location=e2e_base._LOCATION,
experiment=self._experiment_name,
)
> aiplatform.start_run(_RUN, resume=True)

tests/system/aiplatform/test_experiments.py:357:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
google/cloud/aiplatform/metadata/metadata.py:480: in start_run
self._experiment_run = experiment_run_resource.ExperimentRun(
google/cloud/aiplatform/metadata/experiment_run_resource.py:173: in __init__
raise context_not_found
google/cloud/aiplatform/metadata/experiment_run_resource.py:162: in __init__
self._metadata_node = _get_context()
google/cloud/aiplatform/metadata/experiment_run_resource.py:151: in _get_context
run_context = context.Context(
google/cloud/aiplatform/metadata/resource.py:102: in __init__
self._gca_resource = getattr(self.api_client, self._getter_method)(
google/cloud/aiplatform_v1/services/metadata_service/client.py:2301: in get_context
response = rpc(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/gapic_v1/method.py:131: in __call__
return wrapped_func(*args, **kwargs)
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_unary.py:293: in retry_wrapped_func
return retry_target(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_unary.py:153: in retry_target
_retry_error_helper(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_base.py:212: in _retry_error_helper
raise final_exc from source_exc
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_unary.py:144: in retry_target
result = target()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (name: "projects/ucaip-sample-tests/locations/us-central1/metadataStores/default/contexts/tmpvrtxsdk-e2e--5439416e-cac7-4418-8847-1010b13fb2a6-run-1"
,)
kwargs = {'metadata': [('x-goog-request-params', 'name=projects/ucaip-sample-tests/locations/us-central1/metadataStores/default...0 gapic/1.75.0+top_google_constructor_method+google.cloud.aiplatform.metadata.metadata._ExperimentTracker.start_run')]}

@functools.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
try:
return callable_(*args, **kwargs)
except grpc.RpcError as exc:
> raise exceptions.from_grpc_error(exc) from exc
E google.api_core.exceptions.NotFound: 404 Resource not found.; GetContext is unable to find context resource with name: projects/580378083368/locations/us-central1/metadataStores/default/contexts/tmpvrtxsdk-e2e--5439416e-cac7-4418-8847-1010b13fb2a6-run-1

.nox/system-3-10/lib/python3.10/site-packages/google/api_core/grpc_helpers.py:78: NotFound
----------------------------- Captured stdout call -----------------------------

___________________ TestExperiments.test_get_experiments_df ____________________
[gw11] linux -- Python 3.10.15 /tmpfs/src/github/python-aiplatform/.nox/system-3-10/bin/python

self =

def test_get_experiments_df(self):
aiplatform.init(
project=e2e_base._PROJECT,
location=e2e_base._LOCATION,
experiment=self._experiment_name,
)

df = aiplatform.get_experiment_df()

pipelines_param_and_metrics = {
"param.dropout_rate": 0.2,
"param.learning_rate": 0.1,
"metric.accuracy": 0.8,
"metric.mse": 1.2,
}

true_df_dict_1 = {f"metric.{key}": value for key, value in _METRICS.items()}
for key, value in _PARAMS.items():
true_df_dict_1[f"param.{key}"] = value

true_df_dict_1["experiment_name"] = self._experiment_name
true_df_dict_1["run_name"] = _RUN
true_df_dict_1["state"] = aiplatform.gapic.Execution.State.COMPLETE.name
true_df_dict_1["run_type"] = aiplatform.metadata.constants.SYSTEM_EXPERIMENT_RUN
true_df_dict_1[f"time_series_metric.{_TIME_SERIES_METRIC_KEY}"] = 4.0

true_df_dict_2 = {f"metric.{key}": value for key, value in _METRICS_2.items()}
for key, value in _PARAMS_2.items():
true_df_dict_2[f"param.{key}"] = value

true_df_dict_2["experiment_name"] = self._experiment_name
true_df_dict_2["run_name"] = _RUN_2
true_df_dict_2["state"] = aiplatform.gapic.Execution.State.COMPLETE.name
true_df_dict_2["run_type"] = aiplatform.metadata.constants.SYSTEM_EXPERIMENT_RUN
true_df_dict_2[f"time_series_metric.{_TIME_SERIES_METRIC_KEY}"] = 0.0
true_df_dict_2.update(pipelines_param_and_metrics)

true_df_dict_3 = {
"experiment_name": self._experiment_name,
"run_name": self._pipeline_job_id,
"run_type": aiplatform.metadata.constants.SYSTEM_PIPELINE_RUN,
"state": aiplatform.gapic.Execution.State.COMPLETE.name,
"time_series_metric.accuracy": 0.0,
}

true_df_dict_3.update(pipelines_param_and_metrics)

for key in pipelines_param_and_metrics.keys():
true_df_dict_1[key] = 0.0
true_df_dict_2[key] = 0.0

for key in _PARAMS.keys():
true_df_dict_3[f"param.{key}"] = 0.0

for key in _METRICS.keys():
true_df_dict_3[f"metric.{key}"] = 0.0

> assert sorted(
[true_df_dict_1, true_df_dict_2, true_df_dict_3],
key=lambda d: d["run_name"],
) == sorted(df.fillna(0.0).to_dict("records"), key=lambda d: d["run_name"])
E AssertionError: assert [{'experiment...1': 0.0, ...}] == [{'experiment...1': 0.0, ...}]
E
E At index 0 diff: {'metric.sdk-metric-test-1': 0.8, 'metric.sdk-metric-test-2': 100.0, 'param.sdk-param-test-1': 0.1, 'param.sdk-param-test-2': 0.2, 'experiment_name': 'tmpvrtxsdk-e2e--5439416e-cac7-4418-8847-1010b13fb2a6', 'run_name': 'run-1', 'state': 'COMPLETE', 'run_type': 'system.ExperimentRun', 'time_series_metric.accuracy': 4.0, 'param.dropout_rate': 0.0, 'param.learning_rate': 0.0, 'metric.accuracy': 0.0, 'metric.mse': 0.0} != {'experiment_name': 'tmpvrtxsdk-e2e--5439416e-cac7-4418-8847-1010b13fb2a6', 'run_name': 'run-2', 'run_type': 'system.ExperimentRun', 'stat...
E
E ...Full output truncated (51 lines hidden), use '-vv' to show

tests/system/aiplatform/test_experiments.py:474: AssertionError
----------------------------- Captured stdout call -----------------------------
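Editor's note on the assertion above: the test compares two lists of record dicts after sorting both by `run_name`, so pytest reports only "At index 0 diff" with most output truncated. A hedged sketch of a per-key diff that surfaces exactly which columns disagree — the data here is invented, not taken from the real experiment runs:

```python
# Invented stand-ins for the expected and actual record lists.
expected = [{"run_name": "run-1", "metric.accuracy": 0.8}]
actual = [{"run_name": "run-1", "metric.accuracy": 0.0}]

# Pair records by sorted run_name, then report every differing key.
diff = [
    (key, exp[key], act.get(key))
    for exp, act in zip(
        sorted(expected, key=lambda d: d["run_name"]),
        sorted(actual, key=lambda d: d["run_name"]),
    )
    for key in exp
    if exp.get(key) != act.get(key)
]
# → [("metric.accuracy", 0.8, 0.0)]
```

Running pytest with `-vv`, as the truncated output suggests, is the zero-code alternative; a key-level diff like this just makes the mismatching columns explicit in one line.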

______ TestExperiments.test_get_experiments_df_include_time_series_false _______
[gw11] linux -- Python 3.10.15 /tmpfs/src/github/python-aiplatform/.nox/system-3-10/bin/python

self =

def test_get_experiments_df_include_time_series_false(self):
aiplatform.init(
project=e2e_base._PROJECT,
location=e2e_base._LOCATION,
experiment=self._experiment_name,
)

df = aiplatform.get_experiment_df(include_time_series=False)

pipelines_param_and_metrics = {
"param.dropout_rate": 0.2,
"param.learning_rate": 0.1,
"metric.accuracy": 0.8,
"metric.mse": 1.2,
}

true_df_dict_1 = {f"metric.{key}": value for key, value in _METRICS.items()}
for key, value in _PARAMS.items():
true_df_dict_1[f"param.{key}"] = value

true_df_dict_1["experiment_name"] = self._experiment_name
true_df_dict_1["run_name"] = _RUN
true_df_dict_1["state"] = aiplatform.gapic.Execution.State.COMPLETE.name
true_df_dict_1["run_type"] = aiplatform.metadata.constants.SYSTEM_EXPERIMENT_RUN

true_df_dict_2 = {f"metric.{key}": value for key, value in _METRICS_2.items()}
for key, value in _PARAMS_2.items():
true_df_dict_2[f"param.{key}"] = value

true_df_dict_2["experiment_name"] = self._experiment_name
true_df_dict_2["run_name"] = _RUN_2
true_df_dict_2["state"] = aiplatform.gapic.Execution.State.COMPLETE.name
true_df_dict_2["run_type"] = aiplatform.metadata.constants.SYSTEM_EXPERIMENT_RUN
true_df_dict_2.update(pipelines_param_and_metrics)

true_df_dict_3 = {
"experiment_name": self._experiment_name,
"run_name": self._pipeline_job_id,
"run_type": aiplatform.metadata.constants.SYSTEM_PIPELINE_RUN,
"state": aiplatform.gapic.Execution.State.COMPLETE.name,
}

true_df_dict_3.update(pipelines_param_and_metrics)

for key in pipelines_param_and_metrics.keys():
true_df_dict_1[key] = 0.0
true_df_dict_2[key] = 0.0

for key in _PARAMS.keys():
true_df_dict_3[f"param.{key}"] = 0.0

for key in _METRICS.keys():
true_df_dict_3[f"metric.{key}"] = 0.0

> assert sorted(
[true_df_dict_1, true_df_dict_2, true_df_dict_3],
key=lambda d: d["run_name"],
) == sorted(df.fillna(0.0).to_dict("records"), key=lambda d: d["run_name"])
E AssertionError: assert [{'experiment...1': 0.0, ...}] == [{'experiment...1': 0.0, ...}]
E
E At index 0 diff: {'metric.sdk-metric-test-1': 0.8, 'metric.sdk-metric-test-2': 100.0, 'param.sdk-param-test-1': 0.1, 'param.sdk-param-test-2': 0.2, 'experiment_name': 'tmpvrtxsdk-e2e--5439416e-cac7-4418-8847-1010b13fb2a6', 'run_name': 'run-1', 'state': 'COMPLETE', 'run_type': 'system.ExperimentRun', 'param.dropout_rate': 0.0, 'param.learning_rate': 0.0, 'metric.accuracy': 0.0, 'metric.mse': 0.0} != {'experiment_name': 'tmpvrtxsdk-e2e--5439416e-cac7-4418-8847-1010b13fb2a6', 'run_name': 'run-2', 'run_type': 'system.ExperimentRun', 'state': 'COMPLETE', 'param.learning_rate...
E
E ...Full output truncated (48 lines hidden), use '-vv' to show

tests/system/aiplatform/test_experiments.py:533: AssertionError
----------------------------- Captured stdout call -----------------------------

------------------------------ Captured log call -------------------------------
WARNING google.cloud.aiplatform.metadata.experiment_resources:experiment_resources.py:469 include_time_series is set to False. Time series metrics will not be included in this call even if they exist.
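[editor note] The assertion above compares experiment-run records order-insensitively by sorting both sides on run_name. A minimal standalone sketch of that comparison pattern, using made-up records that only mimic the test's DataFrame rows, is:

```python
# Hypothetical records standing in for the test's expected vs. actual rows
# (real rows carry many more metric.* / param.* keys).
expected = [
    {"run_name": "run-2", "metric.accuracy": 0.0},
    {"run_name": "run-1", "metric.accuracy": 0.8},
]
actual = [
    {"run_name": "run-1", "metric.accuracy": 0.8},
    {"run_name": "run-2", "metric.accuracy": 0.0},
]


def records_match(a, b):
    """Order-insensitive comparison keyed on run_name, as the failing assert does."""
    key = lambda d: d["run_name"]
    return sorted(a, key=key) == sorted(b, key=key)


print(records_match(expected, actual))  # True; the real test fails when any field differs
```

The diff in the log shows the two sides disagree field-by-field even after sorting, so the failure is in the row contents, not the ordering.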
_______ TestExperiments.test_delete_run_does_not_exist_raises_exception ________
[gw11] linux -- Python 3.10.15 /tmpfs/src/github/python-aiplatform/.nox/system-3-10/bin/python

self =

def test_delete_run_does_not_exist_raises_exception(self):
> run = aiplatform.ExperimentRun(
run_name=_RUN,
experiment=self._experiment_name,
project=e2e_base._PROJECT,
location=e2e_base._LOCATION,
)

tests/system/aiplatform/test_experiments.py:539:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
google/cloud/aiplatform/metadata/experiment_run_resource.py:173: in __init__
raise context_not_found
google/cloud/aiplatform/metadata/experiment_run_resource.py:162: in __init__
self._metadata_node = _get_context()
google/cloud/aiplatform/metadata/experiment_run_resource.py:151: in _get_context
run_context = context.Context(
google/cloud/aiplatform/metadata/resource.py:102: in __init__
self._gca_resource = getattr(self.api_client, self._getter_method)(
google/cloud/aiplatform_v1/services/metadata_service/client.py:2301: in get_context
response = rpc(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/gapic_v1/method.py:131: in __call__
return wrapped_func(*args, **kwargs)
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_unary.py:293: in retry_wrapped_func
return retry_target(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_unary.py:153: in retry_target
_retry_error_helper(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_base.py:212: in _retry_error_helper
raise final_exc from source_exc
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/retry/retry_unary.py:144: in retry_target
result = target()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (name: "projects/ucaip-sample-tests/locations/us-central1/metadataStores/default/contexts/tmpvrtxsdk-e2e--5439416e-cac7-4418-8847-1010b13fb2a6-run-1"
,)
kwargs = {'metadata': [('x-goog-request-params', 'name=projects/ucaip-sample-tests/locations/us-central1/metadataStores/default....75.0+top_google_constructor_method+google.cloud.aiplatform.metadata.experiment_run_resource.ExperimentRun.__init__')]}

@functools.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
try:
return callable_(*args, **kwargs)
except grpc.RpcError as exc:
> raise exceptions.from_grpc_error(exc) from exc
E google.api_core.exceptions.NotFound: 404 Resource not found.; GetContext is unable to find context resource with name: projects/580378083368/locations/us-central1/metadataStores/default/contexts/tmpvrtxsdk-e2e--5439416e-cac7-4418-8847-1010b13fb2a6-run-1

.nox/system-3-10/lib/python3.10/site-packages/google/api_core/grpc_helpers.py:78: NotFound
_____ TestLanguageModels.test_code_chat_model_send_message_streaming[grpc] _____
[gw14] linux -- Python 3.10.15 /tmpfs/src/github/python-aiplatform/.nox/system-3-10/bin/python

self =
api_transport = 'grpc'

@pytest.mark.parametrize("api_transport", ["grpc", "rest"])
def test_code_chat_model_send_message_streaming(self, api_transport):
aiplatform.init(
project=e2e_base._PROJECT,
location=e2e_base._LOCATION,
api_transport=api_transport,
)

chat_model = language_models.CodeChatModel.from_pretrained("codechat-bison@001")
chat = chat_model.start_chat()

message1 = "Please help write a function to calculate the max of two numbers"
for response in chat.send_message_streaming(message1):
> assert response.text
E AssertionError: assert ''
E + where '' = MultiCandidateTextGenerationResponse(text='', _prediction_response=Prediction(predictions=[{'candidates': [{'content': '', 'author': '1'}], 'citationMetadata': [{'citations': None}], 'safetyAttributes': [{'blocked': False, 'scores': None, 'categories': None}]}], deployed_model_id='', metadata=None, model_version_id=None, model_resource_name=None, explanations=None), is_blocked=False, errors=(), safety_attributes={}, grounding_metadata=GroundingMetadata(citations=[], search_queries=[]), candidates=[TextGenerationResponse(text='', is_blocked=False, errors=(), safety_attributes={}, grounding_metadata=GroundingMetadata(citations=[], search_queries=[]))]).text

tests/system/aiplatform/test_language_models.py:559: AssertionError
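[editor note] The failure above is a streaming chunk whose text is empty. One hedged way to make such an assertion robust — sketched here with a stand-in list of chunks, not the SDK's actual streaming API — is to aggregate all chunks first and assert on the total:

```python
# Stand-in for the chunks yielded by chat.send_message_streaming(...):
# individual chunks may legitimately carry empty text.
streamed_chunks = ["", "def max_of_two(a, b):", "", "    return a if a > b else b"]

full_text = "".join(streamed_chunks)

# Asserting on the aggregate tolerates empty intermediate chunks,
# unlike asserting `response.text` inside the loop as the test does.
assert full_text, "model returned no text at all"
print(len(full_text))
```

Whether an empty chunk is a model regression or expected streaming behavior is a separate question; this only changes what the test treats as a failure.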
___________ TestPredictionCpr.test_build_cpr_model_upload_and_deploy ___________
[gw13] linux -- Python 3.10.15 /tmpfs/src/github/python-aiplatform/.nox/system-3-10/bin/python

self =
shared_state = {}
caplog = <_pytest.logging.LogCaptureFixture object at 0x7f414fe83070>

def test_build_cpr_model_upload_and_deploy(self, shared_state, caplog):
"""Creates a CPR model from custom predictor, uploads it and deploys."""

caplog.set_level(logging.INFO)

aiplatform.init(project=e2e_base._PROJECT, location=e2e_base._LOCATION)

local_model = LocalModel.build_cpr_model(
_USER_CODE_DIR,
_IMAGE_URI,
predictor=SklearnPredictor,
requirements_path=os.path.join(_USER_CODE_DIR, _REQUIREMENTS_FILE),
)

with local_model.deploy_to_local_endpoint(
artifact_uri=_LOCAL_MODEL_DIR,
) as local_endpoint:
local_predict_response = local_endpoint.predict(
request=f'{{"instances": {_PREDICTION_INPUT}}}',
headers={"Content-Type": "application/json"},
)
assert len(json.loads(local_predict_response.content)["predictions"]) == 1

interactive_local_endpoint = local_model.deploy_to_local_endpoint(
artifact_uri=_LOCAL_MODEL_DIR,
)
interactive_local_endpoint.serve()
interactive_local_predict_response = interactive_local_endpoint.predict(
request=f'{{"instances": {_PREDICTION_INPUT}}}',
headers={"Content-Type": "application/json"},
)
interactive_local_endpoint.stop()
assert (
len(json.loads(interactive_local_predict_response.content)["predictions"])
== 1
)

# Configure docker.
logging.info(
subprocess.run(["gcloud", "auth", "configure-docker"], capture_output=True)
)

> local_model.push_image()

tests/system/aiplatform/test_prediction_cpr.py:94:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
google/cloud/aiplatform/prediction/local_model.py:612: in push_image
errors.raise_docker_error_with_command(command, return_code)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

command = ['docker', 'push', 'gcr.io/ucaip-sample-tests/prediction-cpr/sklearn:20241218_204029']
return_code = 1

def raise_docker_error_with_command(command: List[str], return_code: int) -> NoReturn:
"""Raises DockerError with the given command and return code.

Args:
command (List(str)):
Required. The docker command that fails.
return_code (int):
Required. The return code from the command.

Raises:
DockerError with an error message populated by the given command and return code.
"""
error_msg = textwrap.dedent(
"""
Docker failed with error code {code}.
Command: {cmd}
""".format(
code=return_code, cmd=" ".join(command)
)
)
> raise DockerError(error_msg, command, return_code)
E google.cloud.aiplatform.docker_utils.errors.DockerError: ('\nDocker failed with error code 1.\nCommand: docker push gcr.io/ucaip-sample-tests/prediction-cpr/sklearn:20241218_204029\n', ['docker', 'push', 'gcr.io/ucaip-sample-tests/prediction-cpr/sklearn:20241218_204029'], 1)

google/cloud/aiplatform/docker_utils/errors.py:60: DockerError
------------------------------ Captured log call -------------------------------
INFO google.cloud.aiplatform.docker_utils.build:build.py:531 Running command: docker build -t gcr.io/ucaip-sample-tests/prediction-cpr/sklearn:20241218_204029 --rm -f- /tmpfs/src/github/python-aiplatform/tests/system/aiplatform/test_resources/cpr_user_code
INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 DEPRECATED: The legacy builder is deprecated and will be removed in a future release.

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Install the buildx component to build images with BuildKit:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 https://docs.docker.com/go/buildx/

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Sending build context to Docker daemon 11.53kB
INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Step 1/14 : FROM python:3.10

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 3.10: Pulling from library/python

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 fdf894e782a2: Pulling fs layer

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 5bd71677db44: Pulling fs layer

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 551df7f94f9c: Pulling fs layer

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ce82e98d553d: Pulling fs layer

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 bd84c4462442: Pulling fs layer

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 4936f42e964c: Pulling fs layer

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 5ddbee3ba41e: Pulling fs layer

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 4936f42e964c: Waiting

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ce82e98d553d: Waiting

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 5ddbee3ba41e: Waiting

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 bd84c4462442: Waiting

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 5bd71677db44: Verifying Checksum

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 5bd71677db44: Download complete

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 fdf894e782a2: Verifying Checksum

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 fdf894e782a2: Download complete

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 551df7f94f9c: Download complete

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 bd84c4462442: Verifying Checksum

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 bd84c4462442: Download complete

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 4936f42e964c: Verifying Checksum

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 4936f42e964c: Download complete

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 5ddbee3ba41e: Verifying Checksum

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 5ddbee3ba41e: Download complete

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 fdf894e782a2: Pull complete

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 5bd71677db44: Pull complete

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ce82e98d553d: Download complete

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 551df7f94f9c: Pull complete

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ce82e98d553d: Pull complete

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 bd84c4462442: Pull complete

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 4936f42e964c: Pull complete

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 5ddbee3ba41e: Pull complete

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Digest: sha256:3ba2e48b887586835af6a0c35fc6fc6086fb4881e963082330ab0a35f3f42c16

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Status: Downloaded newer image for python:3.10

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ---> 7dac968d8635

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Step 2/14 : ENV PYTHONDONTWRITEBYTECODE=1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ---> Running in 412826b433d1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Removing intermediate container 412826b433d1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ---> 440c07bd19bd

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Step 3/14 : EXPOSE 8080

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ---> Running in d68621a247ac

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Removing intermediate container d68621a247ac

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ---> 577f2c3d5655

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Step 4/14 : ENTRYPOINT ["python", "-m", "google.cloud.aiplatform.prediction.model_server"]

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ---> Running in 48a3e3180262

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Removing intermediate container 48a3e3180262

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ---> 3b125dc225b0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Step 5/14 : RUN mkdir -m 777 -p /usr/app /home

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ---> Running in dc672644d7f9

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Removing intermediate container dc672644d7f9

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ---> 897016f4dc5d

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Step 6/14 : WORKDIR /usr/app

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ---> Running in 42433cddd1bb

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Removing intermediate container 42433cddd1bb

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ---> 4a01df02f308

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Step 7/14 : ENV HOME=/home

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ---> Running in 62116cde9afe

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Removing intermediate container 62116cde9afe

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ---> bda070d8c420

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Step 8/14 : RUN pip install --no-cache-dir --force-reinstall 'google-cloud-aiplatform[prediction]>=1.27.0'

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ---> Running in f63048c4941f

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting google-cloud-aiplatform[prediction]>=1.27.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading google_cloud_aiplatform-1.75.0-py2.py3-none-any.whl (6.9 MB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 6.9/6.9 MB 87.2 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting shapely<3.0.0dev

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading shapely-2.0.6-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.5 MB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.5/2.5 MB 204.1 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting google-auth<3.0.0dev,>=2.14.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading google_auth-2.37.0-py2.py3-none-any.whl (209 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 209.8/209.8 kB 201.8 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting packaging>=14.3

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading packaging-24.2-py3-none-any.whl (65 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 65.5/65.5 kB 169.0 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.34.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading google_api_core-2.24.0-py3-none-any.whl (158 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 158.6/158.6 kB 188.5 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting proto-plus<2.0.0dev,>=1.22.3

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading proto_plus-1.25.0-py3-none-any.whl (50 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 50.1/50.1 kB 164.7 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting pydantic<3

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading pydantic-2.10.4-py3-none-any.whl (431 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 431.8/431.8 kB 215.0 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting google-cloud-resource-manager<3.0.0dev,>=1.3.3

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading google_cloud_resource_manager-1.14.0-py2.py3-none-any.whl (384 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 384.1/384.1 kB 198.9 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting google-cloud-bigquery!=3.20.0,<4.0.0dev,>=1.15.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading google_cloud_bigquery-3.27.0-py2.py3-none-any.whl (240 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 240.1/240.1 kB 200.8 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting docstring-parser<1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading docstring_parser-0.16-py3-none-any.whl (36 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting protobuf!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<6.0.0dev,>=3.20.2

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading protobuf-5.29.2-cp38-abi3-manylinux2014_x86_64.whl (319 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 319.7/319.7 kB 214.0 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting google-cloud-storage<3.0.0dev,>=1.32.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading google_cloud_storage-2.19.0-py2.py3-none-any.whl (131 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 131.8/131.8 kB 180.3 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting docker>=5.0.3

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading docker-7.1.0-py3-none-any.whl (147 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 147.8/147.8 kB 195.1 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting uvicorn[standard]>=0.16.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading uvicorn-0.34.0-py3-none-any.whl (62 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 62.3/62.3 kB 180.8 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting fastapi<=0.114.0,>=0.71.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading fastapi-0.114.0-py3-none-any.whl (94 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 94.0/94.0 kB 185.5 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting httpx<0.25.0,>=0.23.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading httpx-0.24.1-py3-none-any.whl (75 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 75.4/75.4 kB 171.2 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting starlette>=0.17.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading starlette-0.42.0-py3-none-any.whl (73 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 73.4/73.4 kB 147.6 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting urllib3>=1.26.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading urllib3-2.2.3-py3-none-any.whl (126 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 126.3/126.3 kB 206.5 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting requests>=2.26.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading requests-2.32.3-py3-none-any.whl (64 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 64.9/64.9 kB 181.3 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting typing-extensions>=4.8.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading typing_extensions-4.12.2-py3-none-any.whl (37 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting starlette>=0.17.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading starlette-0.38.6-py3-none-any.whl (71 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 71.5/71.5 kB 185.6 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting googleapis-common-protos<2.0.dev0,>=1.56.2

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading googleapis_common_protos-1.66.0-py2.py3-none-any.whl (221 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 221.7/221.7 kB 212.4 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting grpcio<2.0dev,>=1.33.2

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading grpcio-1.68.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (5.9 MB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 5.9/5.9 MB 210.9 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting grpcio-status<2.0.dev0,>=1.33.2

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading grpcio_status-1.68.1-py3-none-any.whl (14 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting rsa<5,>=3.1.4

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading rsa-4.9-py3-none-any.whl (34 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting pyasn1-modules>=0.2.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading pyasn1_modules-0.4.1-py3-none-any.whl (181 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 181.5/181.5 kB 207.2 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting cachetools<6.0,>=2.0.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading cachetools-5.5.0-py3-none-any.whl (9.5 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting google-cloud-core<3.0.0dev,>=2.4.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading google_cloud_core-2.4.1-py2.py3-none-any.whl (29 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting python-dateutil<3.0dev,>=2.7.3

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading python_dateutil-2.9.0.post0-py2.py3-none-any.whl (229 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 229.9/229.9 kB 214.6 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting google-resumable-media<3.0dev,>=2.0.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading google_resumable_media-2.7.2-py2.py3-none-any.whl (81 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 81.3/81.3 kB 191.1 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading grpc_google_iam_v1-0.13.1-py2.py3-none-any.whl (24 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting google-crc32c<2.0dev,>=1.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading google_crc32c-1.6.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (37 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting sniffio

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading sniffio-1.3.1-py3-none-any.whl (10 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting certifi

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading certifi-2024.12.14-py3-none-any.whl (164 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 164.9/164.9 kB 211.6 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting httpcore<0.18.0,>=0.15.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading httpcore-0.17.3-py3-none-any.whl (74 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 74.5/74.5 kB 177.6 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting idna

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading idna-3.10-py3-none-any.whl (70 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 70.4/70.4 kB 175.1 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting annotated-types>=0.6.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading annotated_types-0.7.0-py3-none-any.whl (13 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting pydantic-core==2.27.2

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.0 MB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.0/2.0 MB 213.2 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting numpy<3,>=1.14

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading numpy-2.2.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (16.4 MB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 16.4/16.4 MB 199.8 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting anyio<5,>=3.4.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading anyio-4.7.0-py3-none-any.whl (93 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 93.1/93.1 kB 196.0 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting h11>=0.8

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading h11-0.14.0-py3-none-any.whl (58 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 58.3/58.3 kB 170.9 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting click>=7.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading click-8.1.7-py3-none-any.whl (97 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 97.9/97.9 kB 185.3 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting python-dotenv>=0.13

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading python_dotenv-1.0.1-py3-none-any.whl (19 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting uvloop!=0.15.0,!=0.15.1,>=0.14.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading uvloop-0.21.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.8 MB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 3.8/3.8 MB 211.6 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting pyyaml>=5.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading PyYAML-6.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (751 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 751.2/751.2 kB 222.6 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting httptools>=0.6.3

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading httptools-0.6.4-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (442 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 442.1/442.1 kB 208.7 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting websockets>=10.4

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading websockets-14.1-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (168 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 168.2/168.2 kB 192.2 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting watchfiles>=0.13

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading watchfiles-1.0.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (443 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 443.8/443.8 kB 211.7 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting exceptiongroup>=1.0.2

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading exceptiongroup-1.2.2-py3-none-any.whl (16 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting pyasn1<0.7.0,>=0.4.6

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading pyasn1-0.6.1-py3-none-any.whl (83 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 83.1/83.1 kB 174.0 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting six>=1.5

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading six-1.17.0-py2.py3-none-any.whl (11 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting charset-normalizer<4,>=2

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading charset_normalizer-3.4.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (144 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 144.8/144.8 kB 205.9 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Installing collected packages: websockets, uvloop, urllib3, typing-extensions, sniffio, six, pyyaml, python-dotenv, pyasn1, protobuf, packaging, numpy, idna, httptools, h11, grpcio, google-crc32c, exceptiongroup, docstring-parser, click, charset-normalizer, certifi, cachetools, annotated-types, uvicorn, shapely, rsa, requests, python-dateutil, pydantic-core, pyasn1-modules, proto-plus, googleapis-common-protos, google-resumable-media, anyio, watchfiles, starlette, pydantic, httpcore, grpcio-status, google-auth, docker, httpx, grpc-google-iam-v1, google-api-core, fastapi, google-cloud-core, google-cloud-storage, google-cloud-resource-manager, google-cloud-bigquery, google-cloud-aiplatform

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully installed annotated-types-0.7.0 anyio-4.7.0 cachetools-5.5.0 certifi-2024.12.14 charset-normalizer-3.4.0 click-8.1.7 docker-7.1.0 docstring-parser-0.16 exceptiongroup-1.2.2 fastapi-0.114.0 google-api-core-2.24.0 google-auth-2.37.0 google-cloud-aiplatform-1.75.0 google-cloud-bigquery-3.27.0 google-cloud-core-2.4.1 google-cloud-resource-manager-1.14.0 google-cloud-storage-2.19.0 google-crc32c-1.6.0 google-resumable-media-2.7.2 googleapis-common-protos-1.66.0 grpc-google-iam-v1-0.13.1 grpcio-1.68.1 grpcio-status-1.68.1 h11-0.14.0 httpcore-0.17.3 httptools-0.6.4 httpx-0.24.1 idna-3.10 numpy-2.2.0 packaging-24.2 proto-plus-1.25.0 protobuf-5.29.2 pyasn1-0.6.1 pyasn1-modules-0.4.1 pydantic-2.10.4 pydantic-core-2.27.2 python-dateutil-2.9.0.post0 python-dotenv-1.0.1 pyyaml-6.0.2 requests-2.32.3 rsa-4.9 shapely-2.0.6 six-1.17.0 sniffio-1.3.1 starlette-0.38.6 typing-extensions-4.12.2 urllib3-2.2.3 uvicorn-0.34.0 uvloop-0.21.0 watchfiles-1.0.3 websockets-14.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60


INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 [notice] A new release of pip is available: 23.0.1 -> 24.3.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 [notice] To update, run: pip install --upgrade pip

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Removing intermediate container f63048c4941f

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ---> 610e104f50f2

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Step 9/14 : ENV HANDLER_MODULE=google.cloud.aiplatform.prediction.handler

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ---> Running in b7394e9cbbb4

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Removing intermediate container b7394e9cbbb4

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ---> d163c541e5a1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Step 10/14 : ENV HANDLER_CLASS=PredictionHandler

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ---> Running in 9873be436ffd

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Removing intermediate container 9873be436ffd

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ---> 05950668fa30

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Step 11/14 : ENV PREDICTOR_MODULE=predictor

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ---> Running in 2f417c3e8cf0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Removing intermediate container 2f417c3e8cf0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ---> c438fa60b003

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Step 12/14 : ENV PREDICTOR_CLASS=SklearnPredictor

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ---> Running in 9f9921c518b6

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Removing intermediate container 9f9921c518b6

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ---> 884386262b12

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Step 13/14 : COPY [".", "."]

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ---> d1790ccac5db

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Step 14/14 : RUN pip install --no-cache-dir --force-reinstall -r requirements.txt

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ---> Running in f81297c47b63

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting scikit-learn

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading scikit_learn-1.6.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (13.5 MB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 13.5/13.5 MB 180.9 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting google-cloud-aiplatform[prediction]

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading google_cloud_aiplatform-1.75.0-py2.py3-none-any.whl (6.9 MB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 6.9/6.9 MB 215.4 MB/s eta 0:00:00
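[Editor's note: Step 14 installs the user-supplied requirements.txt. Judging from the two top-level packages pip collects at this point (scikit-learn and google-cloud-aiplatform[prediction]), the file likely contains roughly the following — a reconstruction from the log, not the actual file:]

```text
scikit-learn
google-cloud-aiplatform[prediction]
```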

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting threadpoolctl>=3.1.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading threadpoolctl-3.5.0-py3-none-any.whl (18 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting scipy>=1.6.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading scipy-1.14.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (41.2 MB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 41.2/41.2 MB 204.0 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting joblib>=1.2.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading joblib-1.4.2-py3-none-any.whl (301 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 301.8/301.8 kB 221.4 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting numpy>=1.19.5

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading numpy-2.2.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (16.4 MB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 16.4/16.4 MB 206.2 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting google-auth<3.0.0dev,>=2.14.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading google_auth-2.37.0-py2.py3-none-any.whl (209 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 209.8/209.8 kB 216.0 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting google-cloud-bigquery!=3.20.0,<4.0.0dev,>=1.15.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading google_cloud_bigquery-3.27.0-py2.py3-none-any.whl (240 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 240.1/240.1 kB 206.4 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting protobuf!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<6.0.0dev,>=3.20.2

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading protobuf-5.29.2-cp38-abi3-manylinux2014_x86_64.whl (319 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 319.7/319.7 kB 221.3 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting docstring-parser<1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading docstring_parser-0.16-py3-none-any.whl (36 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting google-cloud-resource-manager<3.0.0dev,>=1.3.3

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading google_cloud_resource_manager-1.14.0-py2.py3-none-any.whl (384 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 384.1/384.1 kB 221.0 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting shapely<3.0.0dev

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading shapely-2.0.6-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.5 MB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.5/2.5 MB 188.3 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting pydantic<3

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading pydantic-2.10.4-py3-none-any.whl (431 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 431.8/431.8 kB 217.3 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting proto-plus<2.0.0dev,>=1.22.3

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading proto_plus-1.25.0-py3-none-any.whl (50 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 50.1/50.1 kB 176.8 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting google-cloud-storage<3.0.0dev,>=1.32.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading google_cloud_storage-2.19.0-py2.py3-none-any.whl (131 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 131.8/131.8 kB 180.0 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0dev,>=1.34.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading google_api_core-2.24.0-py3-none-any.whl (158 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 158.6/158.6 kB 207.9 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting packaging>=14.3

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading packaging-24.2-py3-none-any.whl (65 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 65.5/65.5 kB 189.5 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting httpx<0.25.0,>=0.23.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading httpx-0.24.1-py3-none-any.whl (75 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 75.4/75.4 kB 150.4 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting uvicorn[standard]>=0.16.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading uvicorn-0.34.0-py3-none-any.whl (62 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 62.3/62.3 kB 183.8 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting docker>=5.0.3

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading docker-7.1.0-py3-none-any.whl (147 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 147.8/147.8 kB 207.9 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting starlette>=0.17.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading starlette-0.42.0-py3-none-any.whl (73 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 73.4/73.4 kB 182.6 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting fastapi<=0.114.0,>=0.71.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading fastapi-0.114.0-py3-none-any.whl (94 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 94.0/94.0 kB 200.2 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting urllib3>=1.26.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading urllib3-2.2.3-py3-none-any.whl (126 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 126.3/126.3 kB 205.7 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting requests>=2.26.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading requests-2.32.3-py3-none-any.whl (64 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 64.9/64.9 kB 186.8 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting starlette>=0.17.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading starlette-0.38.6-py3-none-any.whl (71 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 71.5/71.5 kB 178.1 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting typing-extensions>=4.8.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading typing_extensions-4.12.2-py3-none-any.whl (37 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting googleapis-common-protos<2.0.dev0,>=1.56.2

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading googleapis_common_protos-1.66.0-py2.py3-none-any.whl (221 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 221.7/221.7 kB 215.3 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting grpcio-status<2.0.dev0,>=1.33.2

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading grpcio_status-1.68.1-py3-none-any.whl (14 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting grpcio<2.0dev,>=1.33.2

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading grpcio-1.68.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (5.9 MB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 5.9/5.9 MB 171.1 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting pyasn1-modules>=0.2.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading pyasn1_modules-0.4.1-py3-none-any.whl (181 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 181.5/181.5 kB 198.2 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting cachetools<6.0,>=2.0.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading cachetools-5.5.0-py3-none-any.whl (9.5 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting rsa<5,>=3.1.4

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading rsa-4.9-py3-none-any.whl (34 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting google-resumable-media<3.0dev,>=2.0.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading google_resumable_media-2.7.2-py2.py3-none-any.whl (81 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 81.3/81.3 kB 192.0 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting google-cloud-core<3.0.0dev,>=2.4.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading google_cloud_core-2.4.1-py2.py3-none-any.whl (29 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting python-dateutil<3.0dev,>=2.7.3

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading python_dateutil-2.9.0.post0-py2.py3-none-any.whl (229 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 229.9/229.9 kB 206.5 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading grpc_google_iam_v1-0.13.1-py2.py3-none-any.whl (24 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting google-crc32c<2.0dev,>=1.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading google_crc32c-1.6.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (37 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting sniffio

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading sniffio-1.3.1-py3-none-any.whl (10 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting idna

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading idna-3.10-py3-none-any.whl (70 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 70.4/70.4 kB 177.0 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting httpcore<0.18.0,>=0.15.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading httpcore-0.17.3-py3-none-any.whl (74 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 74.5/74.5 kB 155.4 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting certifi

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading certifi-2024.12.14-py3-none-any.whl (164 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 164.9/164.9 kB 195.1 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting annotated-types>=0.6.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading annotated_types-0.7.0-py3-none-any.whl (13 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting pydantic-core==2.27.2

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.0 MB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.0/2.0 MB 220.6 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting anyio<5,>=3.4.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading anyio-4.7.0-py3-none-any.whl (93 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 93.1/93.1 kB 190.5 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting h11>=0.8

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading h11-0.14.0-py3-none-any.whl (58 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 58.3/58.3 kB 166.5 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting click>=7.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading click-8.1.7-py3-none-any.whl (97 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 97.9/97.9 kB 198.0 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting python-dotenv>=0.13

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading python_dotenv-1.0.1-py3-none-any.whl (19 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting watchfiles>=0.13

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading watchfiles-1.0.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (443 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 443.8/443.8 kB 216.7 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting httptools>=0.6.3

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading httptools-0.6.4-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (442 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 442.1/442.1 kB 204.5 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting pyyaml>=5.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading PyYAML-6.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (751 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 751.2/751.2 kB 228.7 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting websockets>=10.4

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading websockets-14.1-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (168 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 168.2/168.2 kB 166.7 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting uvloop!=0.15.0,!=0.15.1,>=0.14.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading uvloop-0.21.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.8 MB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 3.8/3.8 MB 180.2 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting exceptiongroup>=1.0.2

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading exceptiongroup-1.2.2-py3-none-any.whl (16 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting pyasn1<0.7.0,>=0.4.6

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading pyasn1-0.6.1-py3-none-any.whl (83 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 83.1/83.1 kB 198.2 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting six>=1.5

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading six-1.17.0-py2.py3-none-any.whl (11 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Collecting charset-normalizer<4,>=2

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Downloading charset_normalizer-3.4.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (144 kB)

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 144.8/144.8 kB 119.5 MB/s eta 0:00:00

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Installing collected packages: websockets, uvloop, urllib3, typing-extensions, threadpoolctl, sniffio, six, pyyaml, python-dotenv, pyasn1, protobuf, packaging, numpy, joblib, idna, httptools, h11, grpcio, google-crc32c, exceptiongroup, docstring-parser, click, charset-normalizer, certifi, cachetools, annotated-types, uvicorn, shapely, scipy, rsa, requests, python-dateutil, pydantic-core, pyasn1-modules, proto-plus, googleapis-common-protos, google-resumable-media, anyio, watchfiles, starlette, scikit-learn, pydantic, httpcore, grpcio-status, google-auth, docker, httpx, grpc-google-iam-v1, google-api-core, fastapi, google-cloud-core, google-cloud-storage, google-cloud-resource-manager, google-cloud-bigquery, google-cloud-aiplatform
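[Editor's note: Because Step 14 passes --force-reinstall, pip uninstalls and reinstalls every dependency that the earlier install step already satisfied, which is what produces the long "Attempting uninstall" sequence that follows. If that churn were unwanted, a plain install would skip already-satisfied requirements — a hypothetical alternative, not what this build runs:]

```dockerfile
# Without --force-reinstall, pip leaves satisfied dependencies in place.
RUN pip install --no-cache-dir -r requirements.txt
```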

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: websockets

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: websockets 14.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling websockets-14.1:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled websockets-14.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: uvloop

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: uvloop 0.21.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling uvloop-0.21.0:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled uvloop-0.21.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: urllib3

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: urllib3 2.2.3

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling urllib3-2.2.3:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled urllib3-2.2.3

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: typing-extensions

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: typing_extensions 4.12.2

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling typing_extensions-4.12.2:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled typing_extensions-4.12.2

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: sniffio

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: sniffio 1.3.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling sniffio-1.3.1:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled sniffio-1.3.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: six

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: six 1.17.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling six-1.17.0:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled six-1.17.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: pyyaml

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: PyYAML 6.0.2

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling PyYAML-6.0.2:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled PyYAML-6.0.2

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: python-dotenv

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: python-dotenv 1.0.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling python-dotenv-1.0.1:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled python-dotenv-1.0.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: pyasn1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: pyasn1 0.6.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling pyasn1-0.6.1:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled pyasn1-0.6.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: protobuf

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: protobuf 5.29.2

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling protobuf-5.29.2:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled protobuf-5.29.2

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: packaging

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: packaging 24.2

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling packaging-24.2:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled packaging-24.2

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: numpy

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: numpy 2.2.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling numpy-2.2.0:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled numpy-2.2.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: idna

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: idna 3.10

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling idna-3.10:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled idna-3.10

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: httptools

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: httptools 0.6.4

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling httptools-0.6.4:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled httptools-0.6.4

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: h11

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: h11 0.14.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling h11-0.14.0:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled h11-0.14.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: grpcio

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: grpcio 1.68.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling grpcio-1.68.1:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled grpcio-1.68.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: google-crc32c

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: google-crc32c 1.6.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling google-crc32c-1.6.0:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled google-crc32c-1.6.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: exceptiongroup

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: exceptiongroup 1.2.2

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling exceptiongroup-1.2.2:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled exceptiongroup-1.2.2

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: docstring-parser

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: docstring_parser 0.16

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling docstring_parser-0.16:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled docstring_parser-0.16

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: click

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: click 8.1.7

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling click-8.1.7:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled click-8.1.7

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: charset-normalizer

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: charset-normalizer 3.4.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling charset-normalizer-3.4.0:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled charset-normalizer-3.4.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: certifi

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: certifi 2024.12.14

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling certifi-2024.12.14:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled certifi-2024.12.14

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: cachetools

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: cachetools 5.5.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling cachetools-5.5.0:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled cachetools-5.5.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: annotated-types

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: annotated-types 0.7.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling annotated-types-0.7.0:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled annotated-types-0.7.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: uvicorn

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: uvicorn 0.34.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling uvicorn-0.34.0:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled uvicorn-0.34.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: shapely

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: shapely 2.0.6

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling shapely-2.0.6:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled shapely-2.0.6

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: rsa

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: rsa 4.9

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling rsa-4.9:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled rsa-4.9

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: requests

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: requests 2.32.3

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling requests-2.32.3:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled requests-2.32.3

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: python-dateutil

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: python-dateutil 2.9.0.post0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling python-dateutil-2.9.0.post0:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled python-dateutil-2.9.0.post0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: pydantic-core

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: pydantic_core 2.27.2

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling pydantic_core-2.27.2:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled pydantic_core-2.27.2

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: pyasn1-modules

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: pyasn1_modules 0.4.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling pyasn1_modules-0.4.1:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled pyasn1_modules-0.4.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: proto-plus

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: proto-plus 1.25.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling proto-plus-1.25.0:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled proto-plus-1.25.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: googleapis-common-protos

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: googleapis-common-protos 1.66.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling googleapis-common-protos-1.66.0:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled googleapis-common-protos-1.66.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: google-resumable-media

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: google-resumable-media 2.7.2

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling google-resumable-media-2.7.2:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled google-resumable-media-2.7.2

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: anyio

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: anyio 4.7.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling anyio-4.7.0:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled anyio-4.7.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: watchfiles

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: watchfiles 1.0.3

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling watchfiles-1.0.3:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled watchfiles-1.0.3

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: starlette

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: starlette 0.38.6

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling starlette-0.38.6:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled starlette-0.38.6

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: pydantic

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: pydantic 2.10.4

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling pydantic-2.10.4:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled pydantic-2.10.4

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: httpcore

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: httpcore 0.17.3

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling httpcore-0.17.3:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled httpcore-0.17.3

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: grpcio-status

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: grpcio-status 1.68.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling grpcio-status-1.68.1:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled grpcio-status-1.68.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: google-auth

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: google-auth 2.37.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling google-auth-2.37.0:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled google-auth-2.37.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: docker

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: docker 7.1.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling docker-7.1.0:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled docker-7.1.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: httpx

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: httpx 0.24.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling httpx-0.24.1:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled httpx-0.24.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: grpc-google-iam-v1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: grpc-google-iam-v1 0.13.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling grpc-google-iam-v1-0.13.1:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled grpc-google-iam-v1-0.13.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: google-api-core

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: google-api-core 2.24.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling google-api-core-2.24.0:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled google-api-core-2.24.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: fastapi

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: fastapi 0.114.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling fastapi-0.114.0:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled fastapi-0.114.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: google-cloud-core

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: google-cloud-core 2.4.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling google-cloud-core-2.4.1:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled google-cloud-core-2.4.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: google-cloud-storage

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: google-cloud-storage 2.19.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling google-cloud-storage-2.19.0:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled google-cloud-storage-2.19.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: google-cloud-resource-manager

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: google-cloud-resource-manager 1.14.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling google-cloud-resource-manager-1.14.0:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled google-cloud-resource-manager-1.14.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: google-cloud-bigquery

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: google-cloud-bigquery 3.27.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling google-cloud-bigquery-3.27.0:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled google-cloud-bigquery-3.27.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Attempting uninstall: google-cloud-aiplatform

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Found existing installation: google-cloud-aiplatform 1.75.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Uninstalling google-cloud-aiplatform-1.75.0:

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully uninstalled google-cloud-aiplatform-1.75.0

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully installed annotated-types-0.7.0 anyio-4.7.0 cachetools-5.5.0 certifi-2024.12.14 charset-normalizer-3.4.0 click-8.1.7 docker-7.1.0 docstring-parser-0.16 exceptiongroup-1.2.2 fastapi-0.114.0 google-api-core-2.24.0 google-auth-2.37.0 google-cloud-aiplatform-1.75.0 google-cloud-bigquery-3.27.0 google-cloud-core-2.4.1 google-cloud-resource-manager-1.14.0 google-cloud-storage-2.19.0 google-crc32c-1.6.0 google-resumable-media-2.7.2 googleapis-common-protos-1.66.0 grpc-google-iam-v1-0.13.1 grpcio-1.68.1 grpcio-status-1.68.1 h11-0.14.0 httpcore-0.17.3 httptools-0.6.4 httpx-0.24.1 idna-3.10 joblib-1.4.2 numpy-2.2.0 packaging-24.2 proto-plus-1.25.0 protobuf-5.29.2 pyasn1-0.6.1 pyasn1-modules-0.4.1 pydantic-2.10.4 pydantic-core-2.27.2 python-dateutil-2.9.0.post0 python-dotenv-1.0.1 pyyaml-6.0.2 requests-2.32.3 rsa-4.9 scikit-learn-1.6.0 scipy-1.14.1 shapely-2.0.6 six-1.17.0 sniffio-1.3.1 starlette-0.38.6 threadpoolctl-3.5.0 typing-extensions-4.12.2 urllib3-2.2.3 uvicorn-0.34.0 uvloop-0.21.0 watchfiles-1.0.3 websockets-14.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 [notice] A new release of pip is available: 23.0.1 -> 24.3.1

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 [notice] To update, run: pip install --upgrade pip

Removing intermediate container f81297c47b63

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 ---> f410abc30c8b

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully built f410abc30c8b

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 Successfully tagged gcr.io/ucaip-sample-tests/prediction-cpr/sklearn:20241218_204029

INFO google.cloud.aiplatform.prediction.local_endpoint:local_endpoint.py:237 Got the project id from the global config: ucaip-sample-tests.
INFO google.cloud.aiplatform.prediction.local_endpoint:local_endpoint.py:237 Got the project id from the global config: ucaip-sample-tests.
INFO root:test_prediction_cpr.py:90 CompletedProcess(args=['gcloud', 'auth', 'configure-docker'], returncode=0, stdout=b'', stderr=b'Adding credentials for all GCR repositories.\nWARNING: A long list of credential helpers may cause delays running \'docker build\'. We recommend passing the registry name to configure only the registry you are using.\nAfter update, the following will be written to your Docker config file located \nat [/root/.docker/config.json]:\n {\n "credHelpers": {\n "gcr.io": "gcloud",\n "us.gcr.io": "gcloud",\n "eu.gcr.io": "gcloud",\n "asia.gcr.io": "gcloud",\n "staging-k8s.gcr.io": "gcloud",\n "marketplace.gcr.io": "gcloud"\n }\n}\n\nDo you want to continue (Y/n)? \nDocker configuration file updated.\n')
INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 The push refers to repository [gcr.io/ucaip-sample-tests/prediction-cpr/sklearn]

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 7958aba9bde9: Preparing

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 b5fae6677bf2: Preparing

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 52603c99d4a2: Preparing

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 d75cacb996de: Preparing

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 5a8facc0bb33: Preparing

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 bc979ff557a4: Preparing

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 961b75ccd986: Preparing

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 0aeeeb7c293d: Preparing

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 c81d4fdb67fc: Preparing

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 0e82d78b3ea1: Preparing

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 301c1bb42cc0: Preparing

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 bc979ff557a4: Waiting

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 961b75ccd986: Waiting

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 0aeeeb7c293d: Waiting

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 c81d4fdb67fc: Waiting

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 301c1bb42cc0: Waiting

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 0e82d78b3ea1: Waiting

INFO google.cloud.aiplatform.docker_utils.local_util:local_util.py:60 error parsing HTTP 412 response body: invalid character 'C' looking for beginning of value: "Container Registry is deprecated and shutting down, please use the auto migration tool to migrate to Artifact Registry. For more details see: https://cloud.google.com/artifact-registry/docs/transition/auto-migrate-gcr-ar"
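The push above fails with HTTP 412 because Container Registry no longer accepts writes; the error message points at the auto-migration tool for moving to Artifact Registry. As a hedged sketch (not part of the SDK), one way to keep existing `gcr.io/...` image names working is to rewrite them to the documented Artifact Registry `*-docker.pkg.dev/PROJECT/gcr.io/...` repository form; the helper name and the `us` default location below are illustrative assumptions.

```python
# Hypothetical helper: map a Container Registry image name to its
# Artifact Registry gcr.io-repository equivalent.
def gcr_to_artifact_registry(image: str, location: str = "us") -> str:
    """gcr.io/PROJECT/IMAGE[:TAG] -> LOCATION-docker.pkg.dev/PROJECT/gcr.io/IMAGE[:TAG]."""
    host, _, rest = image.partition("/")  # rest = "PROJECT/IMAGE[:TAG]"
    if host == "gcr.io":
        prefix = f"{location}-docker.pkg.dev"
    elif host.endswith(".gcr.io"):  # regional hosts: us.gcr.io, eu.gcr.io, asia.gcr.io
        prefix = f"{host.split('.')[0]}-docker.pkg.dev"
    else:
        return image  # not a Container Registry name; leave untouched
    project, _, image_path = rest.partition("/")
    return f"{prefix}/{project}/gcr.io/{image_path}"

print(gcr_to_artifact_registry(
    "gcr.io/ucaip-sample-tests/prediction-cpr/sklearn:20241218_204029"))
```

The rewritten name only works once the project's gcr.io traffic has actually been migrated to Artifact Registry; the string mapping alone does not move any images.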
_ TestGenerativeModels.test_generate_content_with_cached_content_from_text[grpc-PROD_ENDPOINT] _
[gw14] linux -- Python 3.10.15 /tmpfs/src/github/python-aiplatform/.nox/system-3-10/bin/python

args = (model: "projects/ucaip-sample-tests/locations/us-central1/publishers/google/models/gemini-1.5-pro-001"
contents {
r...rojects/580378083368/locations/us-central1/cachedContents/740358353586225152"
generation_config {
temperature: 0
}
,)
kwargs = {'metadata': [('x-goog-request-params', 'model=projects/ucaip-sample-tests/locations/us-central1/publishers/google/mod...21.0 gapic/1.75.0+top_google_constructor_method+vertexai.preview.generative_models.GenerativeModel.generate_content')]}

@functools.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
try:
> return callable_(*args, **kwargs)

.nox/system-3-10/lib/python3.10/site-packages/google/api_core/grpc_helpers.py:76:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
.nox/system-3-10/lib/python3.10/site-packages/grpc/_interceptor.py:247: in __call__
response, ignored_call = self._with_call(request,
.nox/system-3-10/lib/python3.10/site-packages/grpc/_interceptor.py:290: in _with_call
return call.result(), call
.nox/system-3-10/lib/python3.10/site-packages/grpc/_channel.py:343: in result
raise self
.nox/system-3-10/lib/python3.10/site-packages/grpc/_interceptor.py:274: in continuation
response, call = self._thunk(new_method).with_call(
.nox/system-3-10/lib/python3.10/site-packages/grpc/_channel.py:957: in with_call
return _end_unary_response_blocking(state, call, True, None)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

state =
call =
with_call = True, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
> raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.RESOURCE_EXHAUSTED
E details = "Resource exhausted. Please try again later. Please refer to https://cloud.google.com/vertex-ai/generative-ai/docs/error-code-429 for more details."
E debug_error_string = "UNKNOWN:Error received from peer ipv4:173.194.203.95:443 {created_time:"2024-12-18T20:46:14.831042279+00:00", grpc_status:8, grpc_message:"Resource exhausted. Please try again later. Please refer to https://cloud.google.com/vertex-ai/generative-ai/docs/error-code-429 for more details."}"
E >

.nox/system-3-10/lib/python3.10/site-packages/grpc/_channel.py:849: _InactiveRpcError

The above exception was the direct cause of the following exception:

self =
api_endpoint_env_name = 'PROD_ENDPOINT'

def test_generate_content_with_cached_content_from_text(
self, api_endpoint_env_name
):
cached_content = caching.CachedContent.create(
model_name=GEMINI_15_PRO_MODEL_NAME,
system_instruction="Please answer all the questions like a pirate.",
contents=[
Content.from_dict(
{
"role": "user",
"parts": [
{
"file_data": {
"mime_type": "application/pdf",
"file_uri": "gs://ucaip-samples-us-central1/sdk_system_test_resources/megatro-llm.pdf",
}
}
for _ in range(10)
]
+ [
{"text": "Please try to summarize the previous contents."},
],
}
)
],
)

model = preview_generative_models.GenerativeModel.from_cached_content(
cached_content=cached_content
)

> response = model.generate_content(
"Why is sky blue?",
generation_config=preview_generative_models.GenerationConfig(temperature=0),
)

tests/system/vertexai/test_generative_models.py:157:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
vertexai/generative_models/_generative_models.py:621: in generate_content
return self._generate_content(
vertexai/generative_models/_generative_models.py:746: in _generate_content
gapic_response = self._prediction_client.generate_content(request=request)
google/cloud/aiplatform_v1beta1/services/prediction_service/client.py:2348: in generate_content
response = rpc(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/gapic_v1/method.py:131: in __call__
return wrapped_func(*args, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (model: "projects/ucaip-sample-tests/locations/us-central1/publishers/google/models/gemini-1.5-pro-001"
contents {
r...rojects/580378083368/locations/us-central1/cachedContents/740358353586225152"
generation_config {
temperature: 0
}
,)
kwargs = {'metadata': [('x-goog-request-params', 'model=projects/ucaip-sample-tests/locations/us-central1/publishers/google/mod...21.0 gapic/1.75.0+top_google_constructor_method+vertexai.preview.generative_models.GenerativeModel.generate_content')]}

@functools.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
try:
return callable_(*args, **kwargs)
except grpc.RpcError as exc:
> raise exceptions.from_grpc_error(exc) from exc
E google.api_core.exceptions.ResourceExhausted: 429 Resource exhausted. Please try again later. Please refer to https://cloud.google.com/vertex-ai/generative-ai/docs/error-code-429 for more details.

.nox/system-3-10/lib/python3.10/site-packages/google/api_core/grpc_helpers.py:78: ResourceExhausted
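The 429 `ResourceExhausted` above is a quota/rate error, and the linked error-code page recommends retrying later. In the real SDK this is typically handled with `google.api_core.retry.Retry` and `if_exception_type(ResourceExhausted)`; the self-contained sketch below shows the same idea (exponential backoff with jitter) with illustrative names and timing constants, and a simulated flaky call standing in for `generate_content`.

```python
import random
import time

def call_with_backoff(call, is_retryable, max_attempts=5,
                      base_delay=1.0, multiplier=2.0, max_delay=32.0):
    """Retry `call()` on retryable errors, sleeping a jittered, growing delay."""
    delay = base_delay
    for attempt in range(1, max_attempts + 1):
        try:
            return call()
        except Exception as exc:
            if attempt == max_attempts or not is_retryable(exc):
                raise
            time.sleep(random.uniform(0, min(delay, max_delay)))
            delay *= multiplier

# Simulated quota error: fails twice with a 429-style message, then succeeds.
attempts = {"n": 0}
def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RuntimeError("429 Resource exhausted")
    return "ok"

result = call_with_backoff(flaky, lambda e: "429" in str(e), base_delay=0.01)
print(result, attempts["n"])  # -> ok 3
```

For flaky system tests like this one, capping total retry time (a deadline) matters as much as the per-attempt backoff, otherwise a sustained quota outage just stretches the build.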
_________________ TestReasoningEngines.test_langchain_template _________________
[gw8] linux -- Python 3.10.15 /tmpfs/src/github/python-aiplatform/.nox/system-3-10/bin/python

self =
shared_state = {'bucket': , 'resources': [}

def test_langchain_template(self, shared_state):
super().setup_method()
credentials, _ = auth.default(
scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
vertexai.init(
project=e2e_base._PROJECT,
location=e2e_base._LOCATION,
staging_bucket=f"gs://{shared_state['staging_bucket_name']}",
credentials=credentials,
)
# Test prebuilt langchain_template
created_app = reasoning_engines.ReasoningEngine.create(
reasoning_engines.LangchainAgent(
model="gemini-1.5-pro-preview-0409",
model_tool_kwargs={
"tool_config": {
"function_calling_config": {
"mode": ToolConfig.FunctionCallingConfig.Mode.AUTO,
},
},
},
),
requirements=["google-cloud-aiplatform[reasoningengine,langchain]"],
display_name="test-display-name",
description="test-description",
gcs_dir_name="test-gcs-dir-name",
)
shared_state.setdefault("resources", [])
shared_state["resources"].append(created_app) # Deletion at teardown.
got_app = reasoning_engines.ReasoningEngine(created_app.resource_name)

# Test resource attributes
assert isinstance(created_app.resource_name, str)
assert got_app.resource_name == created_app.resource_name
assert got_app.gca_resource.name == got_app.resource_name
assert got_app.gca_resource.display_name == "test-display-name"
assert got_app.gca_resource.description == "test-description"

# Test operation schemas
assert got_app.operation_schemas() == created_app.operation_schemas()

# Test query response
# (Wrap in a try-except block because of non-determinism from Gemini.)
try:
> response = created_app.query(input="hello")
E AttributeError: 'ReasoningEngine' object has no attribute 'query'

tests/system/vertexai/test_reasoning_engines.py:84: AttributeError
------------------------------ Captured log call -------------------------------
INFO vertexai.reasoning_engines._reasoning_engines:_reasoning_engines.py:628 Using bucket test-reasoning-engine-980d7bd7-9868-4395-812a-ac412ae58684
INFO vertexai.reasoning_engines._reasoning_engines:_reasoning_engines.py:647 Writing to gs://test-reasoning-engine-980d7bd7-9868-4395-812a-ac412ae58684/test-gcs-dir-name/reasoning_engine.pkl
INFO vertexai.reasoning_engines._reasoning_engines:_reasoning_engines.py:659 Writing to gs://test-reasoning-engine-980d7bd7-9868-4395-812a-ac412ae58684/test-gcs-dir-name/requirements.txt
INFO vertexai.reasoning_engines._reasoning_engines:_reasoning_engines.py:668 Creating in-memory tarfile of extra_packages
INFO vertexai.reasoning_engines._reasoning_engines:_reasoning_engines.py:677 Writing to gs://test-reasoning-engine-980d7bd7-9868-4395-812a-ac412ae58684/test-gcs-dir-name/dependencies.tar.gz
WARNING vertexai.reasoning_engines._reasoning_engines:_reasoning_engines.py:982 failed to generate schema for query: issubclass() arg 1 must be a class
WARNING vertexai.reasoning_engines._reasoning_engines:_reasoning_engines.py:982 failed to generate schema for stream_query: issubclass() arg 1 must be a class
INFO vertexai.reasoning_engines._reasoning_engines:base.py:85 Creating ReasoningEngine
INFO vertexai.reasoning_engines._reasoning_engines:base.py:88 Create ReasoningEngine backing LRO: projects/580378083368/locations/us-central1/reasoningEngines/8353667135376982016/operations/328933643776950272
INFO vertexai.reasoning_engines._reasoning_engines:base.py:113 ReasoningEngine created. Resource name: projects/580378083368/locations/us-central1/reasoningEngines/8353667135376982016
INFO vertexai.reasoning_engines._reasoning_engines:base.py:114 To use this ReasoningEngine in another session:
INFO vertexai.reasoning_engines._reasoning_engines:base.py:115 reasoning_engine = vertexai.preview.reasoning_engines.ReasoningEngine('projects/580378083368/locations/us-central1/reasoningEngines/8353667135376982016')
---------------------------- Captured log teardown -----------------------------
INFO google.cloud.aiplatform.base:base.py:189 Deleting ReasoningEngine : projects/580378083368/locations/us-central1/reasoningEngines/8353667135376982016
INFO google.cloud.aiplatform.base:base.py:222 ReasoningEngine deleted. . Resource name: projects/580378083368/locations/us-central1/reasoningEngines/8353667135376982016
INFO google.cloud.aiplatform.base:base.py:156 Deleting ReasoningEngine resource: projects/580378083368/locations/us-central1/reasoningEngines/8353667135376982016
INFO google.cloud.aiplatform.base:base.py:161 Delete ReasoningEngine backing LRO: projects/580378083368/locations/us-central1/operations/2382575073857896448
INFO google.cloud.aiplatform.base:base.py:174 ReasoningEngine resource projects/580378083368/locations/us-central1/reasoningEngines/8353667135376982016 deleted.
_____ TestGenerativeModels.test_chat_function_calling[rest-PROD_ENDPOINT] ______
[gw14] linux -- Python 3.10.15 /tmpfs/src/github/python-aiplatform/.nox/system-3-10/bin/python

self =
api_endpoint_env_name = 'PROD_ENDPOINT'

def test_chat_function_calling(self, api_endpoint_env_name):
get_current_weather_func = generative_models.FunctionDeclaration(
name="get_current_weather",
description="Get the current weather in a given location",
parameters=_REQUEST_FUNCTION_PARAMETER_SCHEMA_STRUCT,
)

weather_tool = generative_models.Tool(
function_declarations=[get_current_weather_func],
)

model = generative_models.GenerativeModel(
GEMINI_MODEL_NAME,
# Specifying the tools once to avoid specifying them in every request
tools=[weather_tool],
)

chat = model.start_chat()

response1 = chat.send_message("What is the weather like in Boston?")
assert (
> response1.candidates[0].content.parts[0].function_call.name
== "get_current_weather"
)
E AttributeError: 'NoneType' object has no attribute 'name'

tests/system/vertexai/test_generative_models.py:511: AttributeError
_ TestGenerativeModels.test_generate_content_function_calling[grpc-PROD_ENDPOINT] _
[gw14] linux -- Python 3.10.15 /tmpfs/src/github/python-aiplatform/.nox/system-3-10/bin/python

self =
api_endpoint_env_name = 'PROD_ENDPOINT'

def test_generate_content_function_calling(self, api_endpoint_env_name):
get_current_weather_func = generative_models.FunctionDeclaration(
name="get_current_weather",
description="Get the current weather in a given location",
parameters=_REQUEST_FUNCTION_PARAMETER_SCHEMA_STRUCT,
)

weather_tool = generative_models.Tool(
function_declarations=[get_current_weather_func],
)

model = generative_models.GenerativeModel(
GEMINI_MODEL_NAME,
# Specifying the tools once to avoid specifying them in every request
tools=[weather_tool],
)

# Define the user's prompt in a Content object that we can reuse in model calls
prompt = "What is the weather like in Boston?"
user_prompt_content = generative_models.Content(
role="user",
parts=[
generative_models.Part.from_text(prompt),
],
)

# Send the prompt and instruct the model to generate content using the Tool
response = model.generate_content(
user_prompt_content,
generation_config={"temperature": 0},
tools=[weather_tool],
)
response_function_call_content = response.candidates[0].content

assert (
response.candidates[0].content.parts[0].function_call.name
== "get_current_weather"
)

assert response.candidates[0].function_calls[0].args["location"]
assert len(response.candidates[0].function_calls) == 1
> assert (
response.candidates[0].function_calls[0]
== response.candidates[0].content.parts[0].function_call
)
E assert name: "get_current_weather"\nargs {\n fields {\n key: "location"\n value {\n string_value: "Boston, MA"\n }\n }\n}\n == name: "get_current_weather"\nargs {\n fields {\n key: "location"\n value {\n string_value: "Boston, MA"\n }\n }\n}\n
E + where name: "get_current_weather"\nargs {\n fields {\n key: "location"\n value {\n string_value: "Boston, MA"\n }\n }\n}\n = function_call {\n name: "get_current_weather"\n args {\n fields {\n key: "location"\n value {\n string_value: "Boston, MA"\n }\n }\n }\n}\n.function_call

tests/system/vertexai/test_generative_models.py:565: AssertionError
_ TestGenerativeModels.test_generate_content_function_calling[rest-PROD_ENDPOINT] _
[gw14] linux -- Python 3.10.15 /tmpfs/src/github/python-aiplatform/.nox/system-3-10/bin/python

self =
api_endpoint_env_name = 'PROD_ENDPOINT'

def test_generate_content_function_calling(self, api_endpoint_env_name):
get_current_weather_func = generative_models.FunctionDeclaration(
name="get_current_weather",
description="Get the current weather in a given location",
parameters=_REQUEST_FUNCTION_PARAMETER_SCHEMA_STRUCT,
)

weather_tool = generative_models.Tool(
function_declarations=[get_current_weather_func],
)

model = generative_models.GenerativeModel(
GEMINI_MODEL_NAME,
# Specifying the tools once to avoid specifying them in every request
tools=[weather_tool],
)

# Define the user's prompt in a Content object that we can reuse in model calls
prompt = "What is the weather like in Boston?"
user_prompt_content = generative_models.Content(
role="user",
parts=[
generative_models.Part.from_text(prompt),
],
)

# Send the prompt and instruct the model to generate content using the Tool
response = model.generate_content(
user_prompt_content,
generation_config={"temperature": 0},
tools=[weather_tool],
)
response_function_call_content = response.candidates[0].content

assert (
response.candidates[0].content.parts[0].function_call.name
== "get_current_weather"
)

assert response.candidates[0].function_calls[0].args["location"]
assert len(response.candidates[0].function_calls) == 1
> assert (
response.candidates[0].function_calls[0]
== response.candidates[0].content.parts[0].function_call
)
E assert name: "get_current_weather"\nargs {\n fields {\n key: "location"\n value {\n string_value: "Boston, MA"\n }\n }\n}\n == name: "get_current_weather"\nargs {\n fields {\n key: "location"\n value {\n string_value: "Boston, MA"\n }\n }\n}\n
E + where name: "get_current_weather"\nargs {\n fields {\n key: "location"\n value {\n string_value: "Boston, MA"\n }\n }\n}\n = function_call {\n name: "get_current_weather"\n args {\n fields {\n key: "location"\n value {\n string_value: "Boston, MA"\n }\n }\n }\n}\n.function_call

tests/system/vertexai/test_generative_models.py:565: AssertionError
_ TestEndToEndForecasting4.test_end_to_end_forecasting[TimeSeriesDenseEncoderForecastingTrainingJob] _
[gw7] linux -- Python 3.10.15 /tmpfs/src/github/python-aiplatform/.nox/system-3-10/bin/python

self =
shared_state = {'bucket': , 'staging_bucket_name': 'temp-ver...ting-9c49c43d-1f36-4243-be92-15f3d6d', 'storage_client': }
training_job =

@pytest.mark.parametrize(
"training_job",
[
training_jobs.TimeSeriesDenseEncoderForecastingTrainingJob,
],
)
def test_end_to_end_forecasting(self, shared_state, training_job):
"""Builds a dataset, trains models, and gets batch predictions."""
resources = []

aiplatform.init(
project=e2e_base._PROJECT,
location=e2e_base._LOCATION,
staging_bucket=shared_state["staging_bucket_name"],
)
try:
ds = aiplatform.TimeSeriesDataset.create(
display_name=self._make_display_name("dataset"),
bq_source=[_TRAINING_DATASET_BQ_PATH],
sync=False,
create_request_timeout=180.0,
)
resources.append(ds)

time_column = "date"
time_series_identifier_column = "store_name"
target_column = "sale_dollars"
column_specs = {
time_column: "timestamp",
target_column: "numeric",
"city": "categorical",
"zip_code": "categorical",
"county": "categorical",
}

job = training_job(
display_name=self._make_display_name("train-housing-forecasting"),
optimization_objective="minimize-rmse",
column_specs=column_specs,
)
resources.append(job)

model = job.run(
dataset=ds,
target_column=target_column,
time_column=time_column,
time_series_identifier_column=time_series_identifier_column,
available_at_forecast_columns=[time_column],
unavailable_at_forecast_columns=[target_column],
time_series_attribute_columns=["city", "zip_code", "county"],
forecast_horizon=30,
context_window=30,
data_granularity_unit="day",
data_granularity_count=1,
budget_milli_node_hours=1000,
holiday_regions=["GLOBAL"],
hierarchy_group_total_weight=1,
window_stride_length=1,
model_display_name=self._make_display_name("forecasting-liquor-model"),
sync=False,
)
resources.append(model)

batch_prediction_job = model.batch_predict(
job_display_name=self._make_display_name("forecasting-liquor-model"),
instances_format="bigquery",
predictions_format="csv",
machine_type="n1-standard-4",
bigquery_source=_PREDICTION_DATASET_BQ_PATH,
gcs_destination_prefix=(
f'gs://{shared_state["staging_bucket_name"]}/bp_results/'
),
sync=False,
)
resources.append(batch_prediction_job)

batch_prediction_job.wait()
model.wait()
assert job.state == pipeline_state.PipelineState.PIPELINE_STATE_SUCCEEDED
assert batch_prediction_job.state == job_state.JobState.JOB_STATE_SUCCEEDED
finally:
for resource in resources:
> resource.delete()

tests/system/aiplatform/test_e2e_forecasting.py:395:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
google/cloud/aiplatform/base.py:857: in wrapper
self._raise_future_exception()
google/cloud/aiplatform/base.py:272: in _raise_future_exception
raise self._exception
tests/system/aiplatform/test_e2e_forecasting.py:389: in test_end_to_end_forecasting
batch_prediction_job.wait()
google/cloud/aiplatform/base.py:306: in wait
self._raise_future_exception()
google/cloud/aiplatform/base.py:272: in _raise_future_exception
raise self._exception
google/cloud/aiplatform/base.py:284: in _complete_future
future.result() # raises
/usr/local/lib/python3.10/concurrent/futures/_base.py:451: in result
return self.__get_result()
/usr/local/lib/python3.10/concurrent/futures/_base.py:403: in __get_result
raise self._exception
/usr/local/lib/python3.10/concurrent/futures/thread.py:58: in run
result = self.fn(*self.args, **self.kwargs)
google/cloud/aiplatform/base.py:372: in wait_for_dependencies_and_invoke
future.result()
/usr/local/lib/python3.10/concurrent/futures/_base.py:458: in result
return self.__get_result()
/usr/local/lib/python3.10/concurrent/futures/_base.py:403: in __get_result
raise self._exception
google/cloud/aiplatform/base.py:284: in _complete_future
future.result() # raises
/usr/local/lib/python3.10/concurrent/futures/_base.py:451: in result
return self.__get_result()
/usr/local/lib/python3.10/concurrent/futures/_base.py:403: in __get_result
raise self._exception
google/cloud/aiplatform/base.py:284: in _complete_future
future.result() # raises
/usr/local/lib/python3.10/concurrent/futures/_base.py:451: in result
return self.__get_result()
/usr/local/lib/python3.10/concurrent/futures/_base.py:403: in __get_result
raise self._exception
google/cloud/aiplatform/base.py:284: in _complete_future
future.result() # raises
/usr/local/lib/python3.10/concurrent/futures/_base.py:451: in result
return self.__get_result()
/usr/local/lib/python3.10/concurrent/futures/_base.py:403: in __get_result
raise self._exception
/usr/local/lib/python3.10/concurrent/futures/thread.py:58: in run
result = self.fn(*self.args, **self.kwargs)
google/cloud/aiplatform/base.py:374: in wait_for_dependencies_and_invoke
result = method(*args, **kwargs)
google/cloud/aiplatform/training_jobs.py:2658: in _run
new_model = self._run_job(
google/cloud/aiplatform/training_jobs.py:854: in _run_job
model = self._get_model(block=block)
google/cloud/aiplatform/training_jobs.py:941: in _get_model
self._block_until_complete()
google/cloud/aiplatform/training_jobs.py:984: in _block_until_complete
self._raise_failure()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
resource name: projects/580378083368/locations/us-central1/trainingPipelines/5926362416672669696

def _raise_failure(self):
"""Helper method to raise failure if TrainingPipeline fails.

Raises:
RuntimeError: If training failed.
"""

if self._gca_resource.error.code != code_pb2.OK:
> raise RuntimeError("Training failed with:\n%s" % self._gca_resource.error)
E RuntimeError: Training failed with:
E code: 9
E message: "FAILED_PRECONDITION"

google/cloud/aiplatform/training_jobs.py:1001: RuntimeError
------------------------------ Captured log call -------------------------------
INFO google.cloud.aiplatform.datasets.dataset:base.py:85 Creating TimeSeriesDataset
INFO google.cloud.aiplatform.datasets.dataset:base.py:88 Create TimeSeriesDataset backing LRO: projects/580378083368/locations/us-central1/datasets/6543708818092589056/operations/1726175428168646656
INFO google.cloud.aiplatform.datasets.dataset:base.py:113 TimeSeriesDataset created. Resource name: projects/580378083368/locations/us-central1/datasets/6543708818092589056
INFO google.cloud.aiplatform.datasets.dataset:base.py:114 To use this TimeSeriesDataset in another session:
INFO google.cloud.aiplatform.datasets.dataset:base.py:115 ds = aiplatform.TimeSeriesDataset('projects/580378083368/locations/us-central1/datasets/6543708818092589056')
INFO google.cloud.aiplatform.training_jobs:training_jobs.py:550 No dataset split provided. The service will use a default split.
INFO google.cloud.aiplatform.training_jobs:training_jobs.py:852 View Training:
https://console.cloud.google.com/ai/platform/locations/us-central1/training/5926362416672669696?project=580378083368
INFO google.cloud.aiplatform.training_jobs:training_jobs.py:971 TimeSeriesDenseEncoderForecastingTrainingJob projects/580378083368/locations/us-central1/trainingPipelines/5926362416672669696 current state:
PipelineState.PIPELINE_STATE_RUNNING
INFO google.cloud.aiplatform.base:base.py:189 Deleting TimeSeriesDataset : projects/580378083368/locations/us-central1/datasets/6543708818092589056
INFO google.cloud.aiplatform.base:base.py:222 TimeSeriesDataset deleted. . Resource name: projects/580378083368/locations/us-central1/datasets/6543708818092589056
INFO google.cloud.aiplatform.base:base.py:156 Deleting TimeSeriesDataset resource: projects/580378083368/locations/us-central1/datasets/6543708818092589056
INFO google.cloud.aiplatform.base:base.py:161 Delete TimeSeriesDataset backing LRO: projects/580378083368/locations/us-central1/operations/2760314492603596800
INFO google.cloud.aiplatform.base:base.py:174 TimeSeriesDataset resource projects/580378083368/locations/us-central1/datasets/6543708818092589056 deleted.
_______ TestVersionManagement.test_upload_deploy_manage_versioned_model ________
[gw7] linux -- Python 3.10.15 /tmpfs/src/github/python-aiplatform/.nox/system-3-10/bin/python

args = (name: "projects/580378083368/locations/us-central1/models/my_model_id2d735df5469747e9937770ce9963e485"
,)
kwargs = {'metadata': [('x-goog-request-params', 'name=projects/580378083368/locations/us-central1/models/my_model_id2d735df546...thon/3.10.15 grpc/1.51.3 gax/2.21.0 gapic/1.75.0+top_google_constructor_method+google.cloud.aiplatform.base.wrapper')]}

@functools.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
try:
> return callable_(*args, **kwargs)

.nox/system-3-10/lib/python3.10/site-packages/google/api_core/grpc_helpers.py:76:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
.nox/system-3-10/lib/python3.10/site-packages/grpc/_interceptor.py:247: in __call__
response, ignored_call = self._with_call(request,
.nox/system-3-10/lib/python3.10/site-packages/grpc/_interceptor.py:290: in _with_call
return call.result(), call
.nox/system-3-10/lib/python3.10/site-packages/grpc/_channel.py:343: in result
raise self
.nox/system-3-10/lib/python3.10/site-packages/grpc/_interceptor.py:274: in continuation
response, call = self._thunk(new_method).with_call(
.nox/system-3-10/lib/python3.10/site-packages/grpc/_channel.py:957: in with_call
return _end_unary_response_blocking(state, call, True, None)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

state =
call =
with_call = True, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
> raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.UNAVAILABLE
E details = "The service is currently unavailable."
E debug_error_string = "UNKNOWN:Error received from peer ipv4:173.194.203.95:443 {grpc_message:"The service is currently unavailable.", grpc_status:14, created_time:"2024-12-18T22:23:23.9513038+00:00"}"
E >

.nox/system-3-10/lib/python3.10/site-packages/grpc/_channel.py:849: _InactiveRpcError

The above exception was the direct cause of the following exception:

self =
shared_state = {'resources': [
resource name: projects/580378083368/l...6ceb0>
resource name: projects/580378083368/locations/us-central1/models/my_model_id2d735df5469747e9937770ce9963e485]}

def test_upload_deploy_manage_versioned_model(self, shared_state):
"""Upload XGBoost model from local file and deploy it for prediction. Additionally, update model name, description and labels"""

aiplatform.init(
project=e2e_base._PROJECT,
location=e2e_base._LOCATION,
)

storage_client = storage.Client(project=e2e_base._PROJECT)
model_blob = storage.Blob.from_string(
uri=test_model_upload._XGBOOST_MODEL_URI, client=storage_client
)
model_path = tempfile.mktemp() + ".my_model.xgb"
model_blob.download_to_filename(filename=model_path)

model_id = "my_model_id" + uuid.uuid4().hex
version_description = "My description"
version_aliases = ["system-test-model", "testing"]

model = aiplatform.Model.upload_xgboost_model_file(
model_file_path=model_path,
version_aliases=version_aliases,
model_id=model_id,
version_description=version_description,
)
shared_state["resources"] = [model]

staging_bucket = storage.Blob.from_string(
uri=model.uri, client=storage_client
).bucket
# Checking that the bucket is auto-generated
assert "-vertex-staging-" in staging_bucket.name

assert model.version_description == version_description
assert model.version_aliases == version_aliases
assert "default" in model.version_aliases

model2 = aiplatform.Model.upload_xgboost_model_file(
model_file_path=model_path, parent_model=model_id, is_default_version=False
)
shared_state["resources"].append(model2)

assert model2.version_id == "2"
assert model2.resource_name == model.resource_name
assert model2.version_aliases == []

# Test that VersionInfo properties are correct.
model_info = model2.versioning_registry.get_version_info("testing")
> version_list = model2.versioning_registry.list_versions()

tests/system/aiplatform/test_model_version_management.py:84:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
google/cloud/aiplatform/models.py:7343: in list_versions
page_result = self.client.list_model_versions(
google/cloud/aiplatform_v1/services/model_service/client.py:1319: in list_model_versions
response = rpc(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/gapic_v1/method.py:131: in __call__
return wrapped_func(*args, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (name: "projects/580378083368/locations/us-central1/models/my_model_id2d735df5469747e9937770ce9963e485"
,)
kwargs = {'metadata': [('x-goog-request-params', 'name=projects/580378083368/locations/us-central1/models/my_model_id2d735df546...thon/3.10.15 grpc/1.51.3 gax/2.21.0 gapic/1.75.0+top_google_constructor_method+google.cloud.aiplatform.base.wrapper')]}

@functools.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
try:
return callable_(*args, **kwargs)
except grpc.RpcError as exc:
> raise exceptions.from_grpc_error(exc) from exc
E google.api_core.exceptions.ServiceUnavailable: 503 The service is currently unavailable.

.nox/system-3-10/lib/python3.10/site-packages/google/api_core/grpc_helpers.py:78: ServiceUnavailable
------------------------------ Captured log call -------------------------------
WARNING google.cloud.aiplatform.models:models.py:6302 Only the following XGBoost model file extensions are currently supported: '['.pkl', '.joblib', '.bst']'
WARNING google.cloud.aiplatform.models:models.py:6305 Treating the model file as a binary serialized XGBoost Booster.
INFO google.cloud.aiplatform.models:base.py:85 Creating Model
INFO google.cloud.aiplatform.models:base.py:88 Create Model backing LRO: projects/580378083368/locations/us-central1/models/my_model_id2d735df5469747e9937770ce9963e485/operations/1774026174209458176
INFO google.cloud.aiplatform.models:base.py:113 Model created. Resource name: projects/580378083368/locations/us-central1/models/my_model_id2d735df5469747e9937770ce9963e485@1
INFO google.cloud.aiplatform.models:base.py:114 To use this Model in another session:
INFO google.cloud.aiplatform.models:base.py:115 model = aiplatform.Model('projects/580378083368/locations/us-central1/models/my_model_id2d735df5469747e9937770ce9963e485@1')
WARNING google.cloud.aiplatform.models:models.py:6302 Only the following XGBoost model file extensions are currently supported: '['.pkl', '.joblib', '.bst']'
WARNING google.cloud.aiplatform.models:models.py:6305 Treating the model file as a binary serialized XGBoost Booster.
INFO google.cloud.aiplatform.models:base.py:85 Creating Model
INFO google.cloud.aiplatform.models:base.py:88 Create Model backing LRO: projects/580378083368/locations/us-central1/models/my_model_id2d735df5469747e9937770ce9963e485/operations/5674143451512307712
INFO google.cloud.aiplatform.models:base.py:113 Model created. Resource name: projects/580378083368/locations/us-central1/models/my_model_id2d735df5469747e9937770ce9963e485@2
INFO google.cloud.aiplatform.models:base.py:114 To use this Model in another session:
INFO google.cloud.aiplatform.models:base.py:115 model = aiplatform.Model('projects/580378083368/locations/us-central1/models/my_model_id2d735df5469747e9937770ce9963e485@2')
INFO google.cloud.aiplatform.models:models.py:7375 Getting version testing info for projects/580378083368/locations/us-central1/models/my_model_id2d735df5469747e9937770ce9963e485
INFO google.cloud.aiplatform.models:models.py:7336 Getting versions for projects/580378083368/locations/us-central1/models/my_model_id2d735df5469747e9937770ce9963e485
---------------------------- Captured log teardown -----------------------------
INFO google.cloud.aiplatform.base:base.py:189 Deleting Model : projects/580378083368/locations/us-central1/models/my_model_id2d735df5469747e9937770ce9963e485
INFO google.cloud.aiplatform.base:base.py:222 Model deleted. . Resource name: projects/580378083368/locations/us-central1/models/my_model_id2d735df5469747e9937770ce9963e485
INFO google.cloud.aiplatform.base:base.py:156 Deleting Model resource: projects/580378083368/locations/us-central1/models/my_model_id2d735df5469747e9937770ce9963e485
INFO google.cloud.aiplatform.base:base.py:161 Delete Model backing LRO: projects/580378083368/locations/us-central1/models/my_model_id2d735df5469747e9937770ce9963e485/operations/6156028611640950784
INFO google.cloud.aiplatform.base:base.py:174 Model resource projects/580378083368/locations/us-central1/models/my_model_id2d735df5469747e9937770ce9963e485 deleted.
INFO google.cloud.aiplatform.base:base.py:189 Deleting Model : projects/580378083368/locations/us-central1/models/my_model_id2d735df5469747e9937770ce9963e485
ERROR root:e2e_base.py:210 Could not delete resource:
resource name: projects/580378083368/locations/us-central1/models/my_model_id2d735df5469747e9937770ce9963e485 due to: 404 The model "projects/580378083368/locations/us-central1/models/my_model_id2d735df5469747e9937770ce9963e485" does not exist.
Traceback (most recent call last):
File "/tmpfs/src/github/python-aiplatform/.nox/system-3-10/lib/python3.10/site-packages/google/api_core/grpc_helpers.py", line 76, in error_remapped_callable
return callable_(*args, **kwargs)
File "/tmpfs/src/github/python-aiplatform/.nox/system-3-10/lib/python3.10/site-packages/grpc/_interceptor.py", line 247, in __call__
response, ignored_call = self._with_call(request,
File "/tmpfs/src/github/python-aiplatform/.nox/system-3-10/lib/python3.10/site-packages/grpc/_interceptor.py", line 290, in _with_call
return call.result(), call
File "/tmpfs/src/github/python-aiplatform/.nox/system-3-10/lib/python3.10/site-packages/grpc/_channel.py", line 343, in result
raise self
File "/tmpfs/src/github/python-aiplatform/.nox/system-3-10/lib/python3.10/site-packages/grpc/_interceptor.py", line 274, in continuation
response, call = self._thunk(new_method).with_call(
File "/tmpfs/src/github/python-aiplatform/.nox/system-3-10/lib/python3.10/site-packages/grpc/_channel.py", line 957, in with_call
return _end_unary_response_blocking(state, call, True, None)
File "/tmpfs/src/github/python-aiplatform/.nox/system-3-10/lib/python3.10/site-packages/grpc/_channel.py", line 849, in _end_unary_response_blocking
raise _InactiveRpcError(state)
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
status = StatusCode.NOT_FOUND
details = "The model "projects/580378083368/locations/us-central1/models/my_model_id2d735df5469747e9937770ce9963e485" does not exist."
debug_error_string = "UNKNOWN:Error received from peer ipv4:173.194.203.95:443 {grpc_message:"The model \"projects/580378083368/locations/us-central1/models/my_model_id2d735df5469747e9937770ce9963e485\" does not exist.", grpc_status:5, created_time:"2024-12-18T22:23:24.58486334+00:00"}"
>

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/tmpfs/src/github/python-aiplatform/tests/system/aiplatform/e2e_base.py", line 208, in tear_down_resources
resource.delete()
File "/tmpfs/src/github/python-aiplatform/google/cloud/aiplatform/base.py", line 863, in wrapper
return method(*args, **kwargs)
File "/tmpfs/src/github/python-aiplatform/google/cloud/aiplatform/base.py", line 1371, in delete
self._delete()
File "/tmpfs/src/github/python-aiplatform/google/cloud/aiplatform/base.py", line 1202, in _delete
possible_lro = getattr(self.api_client, self._delete_method)(
File "/tmpfs/src/github/python-aiplatform/google/cloud/aiplatform_v1/services/model_service/client.py", line 1723, in delete_model
response = rpc(
File "/tmpfs/src/github/python-aiplatform/.nox/system-3-10/lib/python3.10/site-packages/google/api_core/gapic_v1/method.py", line 131, in __call__
return wrapped_func(*args, **kwargs)
File "/tmpfs/src/github/python-aiplatform/.nox/system-3-10/lib/python3.10/site-packages/google/api_core/grpc_helpers.py", line 78, in error_remapped_callable
raise exceptions.from_grpc_error(exc) from exc
google.api_core.exceptions.NotFound: 404 The model "projects/580378083368/locations/us-central1/models/my_model_id2d735df5469747e9937770ce9963e485" does not exist.
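The teardown loop in e2e_base.py aborts as soon as one resource.delete() raises NotFound (here the model was never created because training failed, so deleting it 404s). A best-effort cleanup sketch is below; the class names are hypothetical stand-ins for the SDK's resource objects and google.api_core.exceptions.NotFound, not the suite's actual implementation:

```python
# Best-effort teardown sketch. NotFound and FakeResource are hypothetical
# stand-ins for google.api_core.exceptions.NotFound and SDK resource objects.
class NotFound(Exception):
    """Stand-in for google.api_core.exceptions.NotFound (gRPC status 5)."""

class FakeResource:
    def __init__(self, name, exists=True):
        self.name = name
        self.exists = exists
        self.deleted = False

    def delete(self):
        if not self.exists:
            raise NotFound(f"resource {self.name!r} does not exist")
        self.deleted = True

def tear_down_resources(resources):
    """Delete every resource, tolerating ones that are already gone."""
    failures = []
    for resource in resources:
        try:
            resource.delete()
        except NotFound:
            # Already gone (e.g. training failed before the model was
            # created) -- skip it instead of aborting the whole cleanup.
            continue
        except Exception as exc:  # collect other errors, keep deleting
            failures.append((resource.name, exc))
    return failures
```

With this shape, a missing model no longer prevents the endpoint and dataset that follow it in the list from being deleted.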
_ TestEndToEndForecasting1.test_end_to_end_forecasting[AutoMLForecastingTrainingJob] _
[gw4] linux -- Python 3.10.15 /tmpfs/src/github/python-aiplatform/.nox/system-3-10/bin/python

self =
shared_state = {'bucket': , 'staging_bucket_name': 'temp-ver...ting-91daa531-c3f0-46ae-8dd0-fd2d1f1', 'storage_client': }
training_job =

@pytest.mark.parametrize(
"training_job",
[
training_jobs.AutoMLForecastingTrainingJob,
],
)
def test_end_to_end_forecasting(self, shared_state, training_job):
"""Builds a dataset, trains models, and gets batch predictions."""
resources = []

aiplatform.init(
project=e2e_base._PROJECT,
location=e2e_base._LOCATION,
staging_bucket=shared_state["staging_bucket_name"],
)
try:
ds = aiplatform.TimeSeriesDataset.create(
display_name=self._make_display_name("dataset"),
bq_source=[_TRAINING_DATASET_BQ_PATH],
sync=False,
create_request_timeout=180.0,
)
resources.append(ds)

time_column = "date"
time_series_identifier_column = "store_name"
target_column = "sale_dollars"
column_specs = {
time_column: "timestamp",
target_column: "numeric",
"city": "categorical",
"zip_code": "categorical",
"county": "categorical",
}

job = training_job(
display_name=self._make_display_name("train-housing-forecasting"),
optimization_objective="minimize-rmse",
column_specs=column_specs,
)
resources.append(job)

model = job.run(
dataset=ds,
target_column=target_column,
time_column=time_column,
time_series_identifier_column=time_series_identifier_column,
available_at_forecast_columns=[time_column],
unavailable_at_forecast_columns=[target_column],
time_series_attribute_columns=["city", "zip_code", "county"],
forecast_horizon=30,
context_window=30,
data_granularity_unit="day",
data_granularity_count=1,
budget_milli_node_hours=1000,
holiday_regions=["GLOBAL"],
hierarchy_group_total_weight=1,
window_stride_length=1,
model_display_name=self._make_display_name("forecasting-liquor-model"),
sync=False,
)
resources.append(model)

batch_prediction_job = model.batch_predict(
job_display_name=self._make_display_name("forecasting-liquor-model"),
instances_format="bigquery",
predictions_format="csv",
machine_type="n1-standard-4",
bigquery_source=_PREDICTION_DATASET_BQ_PATH,
gcs_destination_prefix=(
f'gs://{shared_state["staging_bucket_name"]}/bp_results/'
),
sync=False,
)
resources.append(batch_prediction_job)

batch_prediction_job.wait()
model.wait()
assert job.state == pipeline_state.PipelineState.PIPELINE_STATE_SUCCEEDED
assert batch_prediction_job.state == job_state.JobState.JOB_STATE_SUCCEEDED
finally:
for resource in resources:
> resource.delete()

tests/system/aiplatform/test_e2e_forecasting.py:122:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
google/cloud/aiplatform/base.py:857: in wrapper
self._raise_future_exception()
google/cloud/aiplatform/base.py:272: in _raise_future_exception
raise self._exception
tests/system/aiplatform/test_e2e_forecasting.py:116: in test_end_to_end_forecasting
batch_prediction_job.wait()
google/cloud/aiplatform/base.py:306: in wait
self._raise_future_exception()
google/cloud/aiplatform/base.py:272: in _raise_future_exception
raise self._exception
google/cloud/aiplatform/base.py:284: in _complete_future
future.result() # raises
/usr/local/lib/python3.10/concurrent/futures/_base.py:451: in result
return self.__get_result()
/usr/local/lib/python3.10/concurrent/futures/_base.py:403: in __get_result
raise self._exception
/usr/local/lib/python3.10/concurrent/futures/thread.py:58: in run
result = self.fn(*self.args, **self.kwargs)
google/cloud/aiplatform/base.py:372: in wait_for_dependencies_and_invoke
future.result()
/usr/local/lib/python3.10/concurrent/futures/_base.py:458: in result
return self.__get_result()
/usr/local/lib/python3.10/concurrent/futures/_base.py:403: in __get_result
raise self._exception
google/cloud/aiplatform/base.py:284: in _complete_future
future.result() # raises
/usr/local/lib/python3.10/concurrent/futures/_base.py:451: in result
return self.__get_result()
/usr/local/lib/python3.10/concurrent/futures/_base.py:403: in __get_result
raise self._exception
google/cloud/aiplatform/base.py:284: in _complete_future
future.result() # raises
/usr/local/lib/python3.10/concurrent/futures/_base.py:451: in result
return self.__get_result()
/usr/local/lib/python3.10/concurrent/futures/_base.py:403: in __get_result
raise self._exception
google/cloud/aiplatform/base.py:284: in _complete_future
future.result() # raises
/usr/local/lib/python3.10/concurrent/futures/_base.py:451: in result
return self.__get_result()
/usr/local/lib/python3.10/concurrent/futures/_base.py:403: in __get_result
raise self._exception
/usr/local/lib/python3.10/concurrent/futures/thread.py:58: in run
result = self.fn(*self.args, **self.kwargs)
google/cloud/aiplatform/base.py:374: in wait_for_dependencies_and_invoke
result = method(*args, **kwargs)
google/cloud/aiplatform/training_jobs.py:2658: in _run
new_model = self._run_job(
google/cloud/aiplatform/training_jobs.py:854: in _run_job
model = self._get_model(block=block)
google/cloud/aiplatform/training_jobs.py:941: in _get_model
self._block_until_complete()
google/cloud/aiplatform/training_jobs.py:984: in _block_until_complete
self._raise_failure()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
resource name: projects/580378083368/locations/us-central1/trainingPipelines/6185319395246473216

def _raise_failure(self):
"""Helper method to raise failure if TrainingPipeline fails.

Raises:
RuntimeError: If training failed.
"""

if self._gca_resource.error.code != code_pb2.OK:
> raise RuntimeError("Training failed with:\n%s" % self._gca_resource.error)
E RuntimeError: Training failed with:
E code: 9
E message: "FAILED_PRECONDITION"

google/cloud/aiplatform/training_jobs.py:1001: RuntimeError
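The RuntimeError surfaces only the numeric error code (9) plus a terse message. The numbers seen in this log are standard google.rpc Code values; the helper below is just an illustrative lookup for reading such logs, not part of the SDK:

```python
# google.rpc Code numbers for the statuses that appear in this log.
RPC_CODE_NAMES = {
    0: "OK",
    5: "NOT_FOUND",            # deleting a model that was never created
    7: "PERMISSION_DENIED",    # missing bigquery.tables.get grant
    9: "FAILED_PRECONDITION",  # training pipeline rejected its inputs
}

def describe_error(code, message=""):
    """Render a numeric google.rpc code as a readable status string."""
    name = RPC_CODE_NAMES.get(code, f"UNKNOWN({code})")
    return f"{name}: {message}" if message else name
```

For example, the `code: 9` in the traceback above maps to FAILED_PRECONDITION, i.e. the training request was rejected before any model artifact existed.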
------------------------------ Captured log call -------------------------------
INFO google.cloud.aiplatform.datasets.dataset:base.py:85 Creating TimeSeriesDataset
INFO google.cloud.aiplatform.datasets.dataset:base.py:88 Create TimeSeriesDataset backing LRO: projects/580378083368/locations/us-central1/datasets/5854658075104903168/operations/3937442845207560192
INFO google.cloud.aiplatform.datasets.dataset:base.py:113 TimeSeriesDataset created. Resource name: projects/580378083368/locations/us-central1/datasets/5854658075104903168
INFO google.cloud.aiplatform.datasets.dataset:base.py:114 To use this TimeSeriesDataset in another session:
INFO google.cloud.aiplatform.datasets.dataset:base.py:115 ds = aiplatform.TimeSeriesDataset('projects/580378083368/locations/us-central1/datasets/5854658075104903168')
INFO google.cloud.aiplatform.training_jobs:training_jobs.py:550 No dataset split provided. The service will use a default split.
INFO google.cloud.aiplatform.training_jobs:training_jobs.py:852 View Training:
https://console.cloud.google.com/ai/platform/locations/us-central1/training/6185319395246473216?project=580378083368
INFO google.cloud.aiplatform.training_jobs:training_jobs.py:971 AutoMLForecastingTrainingJob projects/580378083368/locations/us-central1/trainingPipelines/6185319395246473216 current state:
PipelineState.PIPELINE_STATE_RUNNING
INFO google.cloud.aiplatform.training_jobs:training_jobs.py:971 AutoMLForecastingTrainingJob projects/580378083368/locations/us-central1/trainingPipelines/6185319395246473216 current state:

PipelineState.PIPELINE_STATE_RUNNING
INFO google.cloud.aiplatform.base:base.py:189 Deleting TimeSeriesDataset : projects/580378083368/locations/us-central1/datasets/5854658075104903168
INFO google.cloud.aiplatform.base:base.py:222 TimeSeriesDataset deleted. Resource name: projects/580378083368/locations/us-central1/datasets/5854658075104903168
INFO google.cloud.aiplatform.base:base.py:156 Deleting TimeSeriesDataset resource: projects/580378083368/locations/us-central1/datasets/5854658075104903168
INFO google.cloud.aiplatform.base:base.py:161 Delete TimeSeriesDataset backing LRO: projects/580378083368/locations/us-central1/operations/2699515897634095104
INFO google.cloud.aiplatform.base:base.py:174 TimeSeriesDataset resource projects/580378083368/locations/us-central1/datasets/5854658075104903168 deleted.
_ TestEndToEndForecasting2.test_end_to_end_forecasting[SequenceToSequencePlusForecastingTrainingJob] _
[gw6] linux -- Python 3.10.15 /tmpfs/src/github/python-aiplatform/.nox/system-3-10/bin/python

self =
shared_state = {'bucket': , 'staging_bucket_name': 'temp-ver...ting-63316d7b-b71c-4db3-ab4a-88dd16d', 'storage_client': }
training_job =

@pytest.mark.parametrize(
"training_job",
[
training_jobs.SequenceToSequencePlusForecastingTrainingJob,
],
)
def test_end_to_end_forecasting(self, shared_state, training_job):
"""Builds a dataset, trains models, and gets batch predictions."""
resources = []

aiplatform.init(
project=e2e_base._PROJECT,
location=e2e_base._LOCATION,
staging_bucket=shared_state["staging_bucket_name"],
)
try:
ds = aiplatform.TimeSeriesDataset.create(
display_name=self._make_display_name("dataset"),
bq_source=[_TRAINING_DATASET_BQ_PATH],
sync=False,
create_request_timeout=180.0,
)
resources.append(ds)

time_column = "date"
time_series_identifier_column = "store_name"
target_column = "sale_dollars"
column_specs = {
time_column: "timestamp",
target_column: "numeric",
"city": "categorical",
"zip_code": "categorical",
"county": "categorical",
}

job = training_job(
display_name=self._make_display_name("train-housing-forecasting"),
optimization_objective="minimize-rmse",
column_specs=column_specs,
)
resources.append(job)

model = job.run(
dataset=ds,
target_column=target_column,
time_column=time_column,
time_series_identifier_column=time_series_identifier_column,
available_at_forecast_columns=[time_column],
unavailable_at_forecast_columns=[target_column],
time_series_attribute_columns=["city", "zip_code", "county"],
forecast_horizon=30,
context_window=30,
data_granularity_unit="day",
data_granularity_count=1,
budget_milli_node_hours=1000,
holiday_regions=["GLOBAL"],
hierarchy_group_total_weight=1,
window_stride_length=1,
model_display_name=self._make_display_name("forecasting-liquor-model"),
sync=False,
)
resources.append(model)

batch_prediction_job = model.batch_predict(
job_display_name=self._make_display_name("forecasting-liquor-model"),
instances_format="bigquery",
predictions_format="csv",
machine_type="n1-standard-4",
bigquery_source=_PREDICTION_DATASET_BQ_PATH,
gcs_destination_prefix=(
f'gs://{shared_state["staging_bucket_name"]}/bp_results/'
),
sync=False,
)
resources.append(batch_prediction_job)

batch_prediction_job.wait()
model.wait()
assert job.state == pipeline_state.PipelineState.PIPELINE_STATE_SUCCEEDED
assert batch_prediction_job.state == job_state.JobState.JOB_STATE_SUCCEEDED
finally:
for resource in resources:
> resource.delete()

tests/system/aiplatform/test_e2e_forecasting.py:213:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
google/cloud/aiplatform/base.py:857: in wrapper
self._raise_future_exception()
google/cloud/aiplatform/base.py:272: in _raise_future_exception
raise self._exception
tests/system/aiplatform/test_e2e_forecasting.py:207: in test_end_to_end_forecasting
batch_prediction_job.wait()
google/cloud/aiplatform/base.py:306: in wait
self._raise_future_exception()
google/cloud/aiplatform/base.py:272: in _raise_future_exception
raise self._exception
google/cloud/aiplatform/base.py:284: in _complete_future
future.result() # raises
/usr/local/lib/python3.10/concurrent/futures/_base.py:451: in result
return self.__get_result()
/usr/local/lib/python3.10/concurrent/futures/_base.py:403: in __get_result
raise self._exception
/usr/local/lib/python3.10/concurrent/futures/thread.py:58: in run
result = self.fn(*self.args, **self.kwargs)
google/cloud/aiplatform/base.py:372: in wait_for_dependencies_and_invoke
future.result()
/usr/local/lib/python3.10/concurrent/futures/_base.py:458: in result
return self.__get_result()
/usr/local/lib/python3.10/concurrent/futures/_base.py:403: in __get_result
raise self._exception
google/cloud/aiplatform/base.py:284: in _complete_future
future.result() # raises
/usr/local/lib/python3.10/concurrent/futures/_base.py:451: in result
return self.__get_result()
/usr/local/lib/python3.10/concurrent/futures/_base.py:403: in __get_result
raise self._exception
google/cloud/aiplatform/base.py:284: in _complete_future
future.result() # raises
/usr/local/lib/python3.10/concurrent/futures/_base.py:451: in result
return self.__get_result()
/usr/local/lib/python3.10/concurrent/futures/_base.py:403: in __get_result
raise self._exception
google/cloud/aiplatform/base.py:284: in _complete_future
future.result() # raises
/usr/local/lib/python3.10/concurrent/futures/_base.py:451: in result
return self.__get_result()
/usr/local/lib/python3.10/concurrent/futures/_base.py:403: in __get_result
raise self._exception
/usr/local/lib/python3.10/concurrent/futures/thread.py:58: in run
result = self.fn(*self.args, **self.kwargs)
google/cloud/aiplatform/base.py:374: in wait_for_dependencies_and_invoke
result = method(*args, **kwargs)
google/cloud/aiplatform/training_jobs.py:2658: in _run
new_model = self._run_job(
google/cloud/aiplatform/training_jobs.py:854: in _run_job
model = self._get_model(block=block)
google/cloud/aiplatform/training_jobs.py:941: in _get_model
self._block_until_complete()
google/cloud/aiplatform/training_jobs.py:984: in _block_until_complete
self._raise_failure()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
resource name: projects/580378083368/locations/us-central1/trainingPipelines/1543234079334334464

def _raise_failure(self):
"""Helper method to raise failure if TrainingPipeline fails.

Raises:
RuntimeError: If training failed.
"""

if self._gca_resource.error.code != code_pb2.OK:
> raise RuntimeError("Training failed with:\n%s" % self._gca_resource.error)
E RuntimeError: Training failed with:
E code: 9
E message: "FAILED_PRECONDITION"

google/cloud/aiplatform/training_jobs.py:1001: RuntimeError
------------------------------ Captured log call -------------------------------
INFO google.cloud.aiplatform.datasets.dataset:base.py:85 Creating TimeSeriesDataset
INFO google.cloud.aiplatform.datasets.dataset:base.py:88 Create TimeSeriesDataset backing LRO: projects/580378083368/locations/us-central1/datasets/1071835270837436416/operations/6890678300855762944
INFO google.cloud.aiplatform.datasets.dataset:base.py:113 TimeSeriesDataset created. Resource name: projects/580378083368/locations/us-central1/datasets/1071835270837436416
INFO google.cloud.aiplatform.datasets.dataset:base.py:114 To use this TimeSeriesDataset in another session:
INFO google.cloud.aiplatform.datasets.dataset:base.py:115 ds = aiplatform.TimeSeriesDataset('projects/580378083368/locations/us-central1/datasets/1071835270837436416')
INFO google.cloud.aiplatform.training_jobs:training_jobs.py:550 No dataset split provided. The service will use a default split.
INFO google.cloud.aiplatform.training_jobs:training_jobs.py:852 View Training:
https://console.cloud.google.com/ai/platform/locations/us-central1/training/1543234079334334464?project=580378083368
INFO google.cloud.aiplatform.training_jobs:training_jobs.py:971 SequenceToSequencePlusForecastingTrainingJob projects/580378083368/locations/us-central1/trainingPipelines/1543234079334334464 current state:
PipelineState.PIPELINE_STATE_RUNNING
INFO google.cloud.aiplatform.training_jobs:training_jobs.py:971 SequenceToSequencePlusForecastingTrainingJob projects/580378083368/locations/us-central1/trainingPipelines/1543234079334334464 current state:
PipelineState.PIPELINE_STATE_RUNNING
INFO google.cloud.aiplatform.base:base.py:189 Deleting TimeSeriesDataset : projects/580378083368/locations/us-central1/datasets/1071835270837436416
INFO google.cloud.aiplatform.base:base.py:222 TimeSeriesDataset deleted. Resource name: projects/580378083368/locations/us-central1/datasets/1071835270837436416
INFO google.cloud.aiplatform.base:base.py:156 Deleting TimeSeriesDataset resource: projects/580378083368/locations/us-central1/datasets/1071835270837436416
INFO google.cloud.aiplatform.base:base.py:161 Delete TimeSeriesDataset backing LRO: projects/580378083368/locations/us-central1/operations/3513260055304601600
INFO google.cloud.aiplatform.base:base.py:174 TimeSeriesDataset resource projects/580378083368/locations/us-central1/datasets/1071835270837436416 deleted.
______ TestModelDeploymentMonitoring.test_mdm_two_models_one_valid_config ______
[gw6] linux -- Python 3.10.15 /tmpfs/src/github/python-aiplatform/.nox/system-3-10/bin/python

args = (parent: "projects/ucaip-sample-tests/locations/us-central1"
model_deployment_monitoring_job {
display_name: "temp_e...ocacyorg.joonix.net"
}
enable_logging: true
}
sample_predict_instance {
null_value: NULL_VALUE
}
}
,)
kwargs = {'metadata': [('x-goog-request-params', 'parent=projects/ucaip-sample-tests/locations/us-central1'), ('x-goog-api-clie...0+top_google_constructor_method+google.cloud.aiplatform.jobs.ModelDeploymentMonitoringJob.create')], 'timeout': 3600.0}

@functools.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
try:
> return callable_(*args, **kwargs)

.nox/system-3-10/lib/python3.10/site-packages/google/api_core/grpc_helpers.py:76:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
.nox/system-3-10/lib/python3.10/site-packages/grpc/_interceptor.py:247: in __call__
response, ignored_call = self._with_call(request,
.nox/system-3-10/lib/python3.10/site-packages/grpc/_interceptor.py:290: in _with_call
return call.result(), call
.nox/system-3-10/lib/python3.10/site-packages/grpc/_channel.py:343: in result
raise self
.nox/system-3-10/lib/python3.10/site-packages/grpc/_interceptor.py:274: in continuation
response, call = self._thunk(new_method).with_call(
.nox/system-3-10/lib/python3.10/site-packages/grpc/_channel.py:957: in with_call
return _end_unary_response_blocking(state, call, True, None)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

state =
call =
with_call = True, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
> raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.PERMISSION_DENIED
E details = "Vertex AI Service Agent service-580378083368@gcp-sa-aiplatform.iam.gserviceaccount.com does not have the requisite access to BigQuery [bq://mco-mm.bqmlga4.train]. Ensure that the service account has been granted the bigquery.tables.get permission and try again."
E debug_error_string = "UNKNOWN:Error received from peer ipv4:173.194.202.95:443 {grpc_message:"Vertex AI Service Agent service-580378083368@gcp-sa-aiplatform.iam.gserviceaccount.com does not have the requisite access to BigQuery [bq://mco-mm.bqmlga4.train]. Ensure that the service account has been granted the bigquery.tables.get permission and try again.", grpc_status:7, created_time:"2024-12-18T22:32:37.304491788+00:00"}"
E >

.nox/system-3-10/lib/python3.10/site-packages/grpc/_channel.py:849: _InactiveRpcError

The above exception was the direct cause of the following exception:

self =
shared_state = {'resources': [
resource name: projects/580378083368/locations/us-central1/endpoints/6175324594031820800]}

def test_mdm_two_models_one_valid_config(self, shared_state):
"""
Enable model monitoring on two existing models deployed to the same endpoint.
"""
assert len(shared_state["resources"]) == 1
self.endpoint = shared_state["resources"][0]
aiplatform.init(project=e2e_base._PROJECT, location=e2e_base._LOCATION)
# test model monitoring configurations
> job = aiplatform.ModelDeploymentMonitoringJob.create(
display_name=self._make_display_name(key=JOB_NAME),
logging_sampling_strategy=sampling_strategy,
schedule_config=schedule_config,
alert_config=email_alert_config,
objective_configs=objective_config,
create_request_timeout=3600,
project=e2e_base._PROJECT,
location=e2e_base._LOCATION,
endpoint=self.endpoint,
)

tests/system/aiplatform/test_model_monitoring.py:157:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
google/cloud/aiplatform/jobs.py:3479: in create
self._gca_resource = self.api_client.create_model_deployment_monitoring_job(
google/cloud/aiplatform_v1/services/job_service/client.py:4440: in create_model_deployment_monitoring_job
response = rpc(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/gapic_v1/method.py:131: in __call__
return wrapped_func(*args, **kwargs)
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/timeout.py:120: in func_with_timeout
return func(*args, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (parent: "projects/ucaip-sample-tests/locations/us-central1"
model_deployment_monitoring_job {
display_name: "temp_e...ocacyorg.joonix.net"
}
enable_logging: true
}
sample_predict_instance {
null_value: NULL_VALUE
}
}
,)
kwargs = {'metadata': [('x-goog-request-params', 'parent=projects/ucaip-sample-tests/locations/us-central1'), ('x-goog-api-clie...0+top_google_constructor_method+google.cloud.aiplatform.jobs.ModelDeploymentMonitoringJob.create')], 'timeout': 3600.0}

@functools.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
try:
return callable_(*args, **kwargs)
except grpc.RpcError as exc:
> raise exceptions.from_grpc_error(exc) from exc
E google.api_core.exceptions.PermissionDenied: 403 Vertex AI Service Agent service-580378083368@gcp-sa-aiplatform.iam.gserviceaccount.com does not have the requisite access to BigQuery [bq://mco-mm.bqmlga4.train]. Ensure that the service account has been granted the bigquery.tables.get permission and try again.

.nox/system-3-10/lib/python3.10/site-packages/google/api_core/grpc_helpers.py:78: PermissionDenied
------------------------------ Captured log call -------------------------------
INFO google.cloud.aiplatform.jobs:base.py:85 Creating ModelDeploymentMonitoringJob
_____ TestModelDeploymentMonitoring.test_mdm_two_models_two_valid_configs ______
[gw6] linux -- Python 3.10.15 /tmpfs/src/github/python-aiplatform/.nox/system-3-10/bin/python

args = (parent: "projects/ucaip-sample-tests/locations/us-central1"
model_deployment_monitoring_job {
display_name: "temp_e...ocacyorg.joonix.net"
}
enable_logging: true
}
sample_predict_instance {
null_value: NULL_VALUE
}
}
,)
kwargs = {'metadata': [('x-goog-request-params', 'parent=projects/ucaip-sample-tests/locations/us-central1'), ('x-goog-api-clie...0+top_google_constructor_method+google.cloud.aiplatform.jobs.ModelDeploymentMonitoringJob.create')], 'timeout': 3600.0}

@functools.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
try:
> return callable_(*args, **kwargs)

.nox/system-3-10/lib/python3.10/site-packages/google/api_core/grpc_helpers.py:76:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
.nox/system-3-10/lib/python3.10/site-packages/grpc/_interceptor.py:247: in __call__
response, ignored_call = self._with_call(request,
.nox/system-3-10/lib/python3.10/site-packages/grpc/_interceptor.py:290: in _with_call
return call.result(), call
.nox/system-3-10/lib/python3.10/site-packages/grpc/_channel.py:343: in result
raise self
.nox/system-3-10/lib/python3.10/site-packages/grpc/_interceptor.py:274: in continuation
response, call = self._thunk(new_method).with_call(
.nox/system-3-10/lib/python3.10/site-packages/grpc/_channel.py:957: in with_call
return _end_unary_response_blocking(state, call, True, None)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

state =
call =
with_call = True, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
> raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.PERMISSION_DENIED
E details = "Vertex AI Service Agent service-580378083368@gcp-sa-aiplatform.iam.gserviceaccount.com does not have the requisite access to BigQuery [bq://mco-mm.bqmlga4.train]. Ensure that the service account has been granted the bigquery.tables.get permission and try again."
E debug_error_string = "UNKNOWN:Error received from peer ipv4:74.125.142.95:443 {grpc_message:"Vertex AI Service Agent service-580378083368@gcp-sa-aiplatform.iam.gserviceaccount.com does not have the requisite access to BigQuery [bq://mco-mm.bqmlga4.train]. Ensure that the service account has been granted the bigquery.tables.get permission and try again.", grpc_status:7, created_time:"2024-12-18T22:32:38.602911151+00:00"}"
E >

.nox/system-3-10/lib/python3.10/site-packages/grpc/_channel.py:849: _InactiveRpcError

The above exception was the direct cause of the following exception:

self =
shared_state = {'resources': [
resource name: projects/580378083368/locations/us-central1/endpoints/6175324594031820800]}

def test_mdm_two_models_two_valid_configs(self, shared_state):
assert len(shared_state["resources"]) == 1
self.endpoint = shared_state["resources"][0]
aiplatform.init(project=e2e_base._PROJECT, location=e2e_base._LOCATION)
[deployed_model1, deployed_model2] = list(
map(lambda x: x.id, self.endpoint.list_models())
)
all_configs = {
deployed_model1: objective_config,
deployed_model2: objective_config2,
}
> job = aiplatform.ModelDeploymentMonitoringJob.create(
display_name=self._make_display_name(key=JOB_NAME),
logging_sampling_strategy=sampling_strategy,
schedule_config=schedule_config,
alert_config=email_alert_config,
objective_configs=all_configs,
create_request_timeout=3600,
project=e2e_base._PROJECT,
location=e2e_base._LOCATION,
endpoint=self.endpoint,
)

tests/system/aiplatform/test_model_monitoring.py:292:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
google/cloud/aiplatform/jobs.py:3479: in create
self._gca_resource = self.api_client.create_model_deployment_monitoring_job(
google/cloud/aiplatform_v1/services/job_service/client.py:4440: in create_model_deployment_monitoring_job
response = rpc(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/gapic_v1/method.py:131: in __call__
return wrapped_func(*args, **kwargs)
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/timeout.py:120: in func_with_timeout
return func(*args, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (parent: "projects/ucaip-sample-tests/locations/us-central1"
model_deployment_monitoring_job {
display_name: "temp_e...ocacyorg.joonix.net"
}
enable_logging: true
}
sample_predict_instance {
null_value: NULL_VALUE
}
}
,)
kwargs = {'metadata': [('x-goog-request-params', 'parent=projects/ucaip-sample-tests/locations/us-central1'), ('x-goog-api-clie...0+top_google_constructor_method+google.cloud.aiplatform.jobs.ModelDeploymentMonitoringJob.create')], 'timeout': 3600.0}

@functools.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
try:
return callable_(*args, **kwargs)
except grpc.RpcError as exc:
> raise exceptions.from_grpc_error(exc) from exc
E google.api_core.exceptions.PermissionDenied: 403 Vertex AI Service Agent service-580378083368@gcp-sa-aiplatform.iam.gserviceaccount.com does not have the requisite access to BigQuery [bq://mco-mm.bqmlga4.train]. Ensure that the service account has been granted the bigquery.tables.get permission and try again.

.nox/system-3-10/lib/python3.10/site-packages/google/api_core/grpc_helpers.py:78: PermissionDenied
------------------------------ Captured log call -------------------------------
INFO google.cloud.aiplatform.jobs:base.py:85 Creating ModelDeploymentMonitoringJob
___ TestModelDeploymentMonitoring.test_mdm_notification_channel_alert_config ___
[gw6] linux -- Python 3.10.15 /tmpfs/src/github/python-aiplatform/.nox/system-3-10/bin/python

args = (parent: "projects/ucaip-sample-tests/locations/us-central1"
model_deployment_monitoring_job {
display_name: "temp_e...le-tests/notificationChannels/11578134490450491958"
}
sample_predict_instance {
null_value: NULL_VALUE
}
}
,)
kwargs = {'metadata': [('x-goog-request-params', 'parent=projects/ucaip-sample-tests/locations/us-central1'), ('x-goog-api-clie...0+top_google_constructor_method+google.cloud.aiplatform.jobs.ModelDeploymentMonitoringJob.create')], 'timeout': 3600.0}

@functools.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
try:
> return callable_(*args, **kwargs)

.nox/system-3-10/lib/python3.10/site-packages/google/api_core/grpc_helpers.py:76:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
.nox/system-3-10/lib/python3.10/site-packages/grpc/_interceptor.py:247: in __call__
response, ignored_call = self._with_call(request,
.nox/system-3-10/lib/python3.10/site-packages/grpc/_interceptor.py:290: in _with_call
return call.result(), call
.nox/system-3-10/lib/python3.10/site-packages/grpc/_channel.py:343: in result
raise self
.nox/system-3-10/lib/python3.10/site-packages/grpc/_interceptor.py:274: in continuation
response, call = self._thunk(new_method).with_call(
.nox/system-3-10/lib/python3.10/site-packages/grpc/_channel.py:957: in with_call
return _end_unary_response_blocking(state, call, True, None)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

state =
call =
with_call = True, deadline = None

def _end_unary_response_blocking(state, call, with_call, deadline):
if state.code is grpc.StatusCode.OK:
if with_call:
rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
return state.response, rendezvous
else:
return state.response
else:
> raise _InactiveRpcError(state)
E grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
E status = StatusCode.PERMISSION_DENIED
E details = "Vertex AI Service Agent service-580378083368@gcp-sa-aiplatform.iam.gserviceaccount.com does not have the requisite access to BigQuery [bq://mco-mm.bqmlga4.train]. Ensure that the service account has been granted the bigquery.tables.get permission and try again."
E debug_error_string = "UNKNOWN:Error received from peer ipv4:74.125.142.95:443 {created_time:"2024-12-18T22:32:41.097536223+00:00", grpc_status:7, grpc_message:"Vertex AI Service Agent service-580378083368@gcp-sa-aiplatform.iam.gserviceaccount.com does not have the requisite access to BigQuery [bq://mco-mm.bqmlga4.train]. Ensure that the service account has been granted the bigquery.tables.get permission and try again."}"
E >

.nox/system-3-10/lib/python3.10/site-packages/grpc/_channel.py:849: _InactiveRpcError

The above exception was the direct cause of the following exception:

self =
shared_state = {'resources': [
resource name: projects/580378083368/locations/us-central1/endpoints/6175324594031820800]}

def test_mdm_notification_channel_alert_config(self, shared_state):
self.endpoint = shared_state["resources"][0]
aiplatform.init(project=e2e_base._PROJECT, location=e2e_base._LOCATION)
# Reset objective_config.explanation_config
objective_config.explanation_config = None
# test model monitoring configurations
> job = aiplatform.ModelDeploymentMonitoringJob.create(
display_name=self._make_display_name(key=JOB_NAME),
logging_sampling_strategy=sampling_strategy,
schedule_config=schedule_config,
alert_config=alert_config,
objective_configs=objective_config,
create_request_timeout=3600,
project=e2e_base._PROJECT,
location=e2e_base._LOCATION,
endpoint=self.endpoint,
)

tests/system/aiplatform/test_model_monitoring.py:418:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
google/cloud/aiplatform/jobs.py:3479: in create
self._gca_resource = self.api_client.create_model_deployment_monitoring_job(
google/cloud/aiplatform_v1/services/job_service/client.py:4440: in create_model_deployment_monitoring_job
response = rpc(
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/gapic_v1/method.py:131: in __call__
return wrapped_func(*args, **kwargs)
.nox/system-3-10/lib/python3.10/site-packages/google/api_core/timeout.py:120: in func_with_timeout
return func(*args, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (parent: "projects/ucaip-sample-tests/locations/us-central1"
model_deployment_monitoring_job {
display_name: "temp_e...le-tests/notificationChannels/11578134490450491958"
}
sample_predict_instance {
null_value: NULL_VALUE
}
}
,)
kwargs = {'metadata': [('x-goog-request-params', 'parent=projects/ucaip-sample-tests/locations/us-central1'), ('x-goog-api-clie...0+top_google_constructor_method+google.cloud.aiplatform.jobs.ModelDeploymentMonitoringJob.create')], 'timeout': 3600.0}

@functools.wraps(callable_)
def error_remapped_callable(*args, **kwargs):
try:
return callable_(*args, **kwargs)
except grpc.RpcError as exc:
> raise exceptions.from_grpc_error(exc) from exc
E google.api_core.exceptions.PermissionDenied: 403 Vertex AI Service Agent service-580378083368@gcp-sa-aiplatform.iam.gserviceaccount.com does not have the requisite access to BigQuery [bq://mco-mm.bqmlga4.train]. Ensure that the service account has been granted the bigquery.tables.get permission and try again.

.nox/system-3-10/lib/python3.10/site-packages/google/api_core/grpc_helpers.py:78: PermissionDenied
------------------------------ Captured log call -------------------------------
INFO google.cloud.aiplatform.jobs:base.py:85 Creating ModelDeploymentMonitoringJob
---------------------------- Captured log teardown -----------------------------
INFO google.cloud.aiplatform.models:base.py:189 Undeploying Endpoint model: projects/580378083368/locations/us-central1/endpoints/6175324594031820800
INFO google.cloud.aiplatform.models:base.py:209 Undeploy Endpoint model backing LRO: projects/580378083368/locations/us-central1/endpoints/6175324594031820800/operations/3275976649937518592
INFO google.cloud.aiplatform.models:base.py:222 Endpoint model undeployed. Resource name: projects/580378083368/locations/us-central1/endpoints/6175324594031820800
INFO google.cloud.aiplatform.models:base.py:189 Undeploying Endpoint model: projects/580378083368/locations/us-central1/endpoints/6175324594031820800
INFO google.cloud.aiplatform.models:base.py:209 Undeploy Endpoint model backing LRO: projects/580378083368/locations/us-central1/endpoints/6175324594031820800/operations/4089720807608025088
INFO google.cloud.aiplatform.models:base.py:222 Endpoint model undeployed. Resource name: projects/580378083368/locations/us-central1/endpoints/6175324594031820800
INFO google.cloud.aiplatform.base:base.py:189 Deleting Endpoint : projects/580378083368/locations/us-central1/endpoints/6175324594031820800
INFO google.cloud.aiplatform.base:base.py:222 Endpoint deleted. . Resource name: projects/580378083368/locations/us-central1/endpoints/6175324594031820800
INFO google.cloud.aiplatform.base:base.py:156 Deleting Endpoint resource: projects/580378083368/locations/us-central1/endpoints/6175324594031820800
INFO google.cloud.aiplatform.base:base.py:161 Delete Endpoint backing LRO: projects/580378083368/locations/us-central1/operations/2183853740300173312
INFO google.cloud.aiplatform.base:base.py:174 Endpoint resource projects/580378083368/locations/us-central1/endpoints/6175324594031820800 deleted.
=============================== warnings summary ===============================
.nox/system-3-10/lib/python3.10/site-packages/google/cloud/storage/_http.py:19: 16 warnings
/tmpfs/src/github/python-aiplatform/.nox/system-3-10/lib/python3.10/site-packages/google/cloud/storage/_http.py:19: DeprecationWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html
import pkg_resources

.nox/system-3-10/lib/python3.10/site-packages/pkg_resources/__init__.py:2832: 32 warnings
/tmpfs/src/github/python-aiplatform/.nox/system-3-10/lib/python3.10/site-packages/pkg_resources/__init__.py:2832: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(pkg)

.nox/system-3-10/lib/python3.10/site-packages/pkg_resources/__init__.py:2832: 32 warnings
/tmpfs/src/github/python-aiplatform/.nox/system-3-10/lib/python3.10/site-packages/pkg_resources/__init__.py:2832: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.cloud')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(pkg)

.nox/system-3-10/lib/python3.10/site-packages/pkg_resources/__init__.py:2317: 16 warnings
/tmpfs/src/github/python-aiplatform/.nox/system-3-10/lib/python3.10/site-packages/pkg_resources/__init__.py:2317: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(parent)

tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_text_generation_model_predict_async
tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_async[grpc-PROD_ENDPOINT]
tests/system/aiplatform/test_model_interactions.py::TestModelInteractions::test_endpoint_predict_async
/tmpfs/src/github/python-aiplatform/.nox/system-3-10/lib/python3.10/site-packages/pytest_asyncio/plugin.py:854: DeprecationWarning: The event_loop fixture provided by pytest-asyncio has been redefined in
/tmpfs/src/github/python-aiplatform/tests/system/aiplatform/e2e_base.py:212
Replacing the event_loop fixture with a custom implementation is deprecated
and will lead to errors in the future.
If you want to request an asyncio event loop with a scope other than function
scope, use the "scope" argument to the asyncio mark when marking the tests.
If you want to return different types of event loops, use the event_loop_policy
fixture.

warnings.warn(

tests/system/aiplatform/test_experiments.py: 29 warnings
tests/system/aiplatform/test_autologging.py: 5 warnings
tests/system/aiplatform/test_custom_job.py: 2 warnings
tests/system/aiplatform/test_model_evaluation.py: 2 warnings
/tmpfs/src/github/python-aiplatform/google/cloud/aiplatform/utils/_ipython_utils.py:148: DeprecationWarning: Importing display from IPython.core.display is deprecated since IPython 7.14, please import from IPython display
from IPython.core.display import display

tests/system/aiplatform/test_experiment_model.py::TestExperimentModel::test_xgboost_booster_with_custom_uri
/tmpfs/src/github/python-aiplatform/.nox/system-3-10/lib/python3.10/site-packages/xgboost/core.py:158: UserWarning: [20:40:47] WARNING: /workspace/src/c_api/c_api.cc:1374: Saving model in the UBJSON format as default. You can use file extension: `json`, `ubj` or `deprecated` to choose between formats.
warnings.warn(smsg, UserWarning)

tests/system/aiplatform/test_experiment_model.py::TestExperimentModel::test_xgboost_xgbmodel_with_custom_names
/tmpfs/src/github/python-aiplatform/.nox/system-3-10/lib/python3.10/site-packages/xgboost/core.py:158: UserWarning: [20:40:53] WARNING: /workspace/src/c_api/c_api.cc:1374: Saving model in the UBJSON format as default. You can use file extension: `json`, `ubj` or `deprecated` to choose between formats.
warnings.warn(smsg, UserWarning)

tests/system/aiplatform/test_pipeline_job.py::TestPipelineJob::test_add_pipeline_job_to_experiment
tests/system/aiplatform/test_experiments.py::TestExperiments::test_add_pipeline_job_to_experiment
tests/system/aiplatform/test_pipeline_job_schedule.py::TestPipelineJobSchedule::test_create_get_pause_resume_update_list
/tmpfs/src/github/python-aiplatform/.nox/system-3-10/lib/python3.10/site-packages/kfp/dsl/component_decorator.py:126: FutureWarning: The default base_image used by the @dsl.component decorator will switch from 'python:3.9' to 'python:3.10' on Oct 1, 2025. To ensure your existing components work with versions of the KFP SDK released after that date, you should provide an explicit base_image argument and ensure your component works as intended on Python 3.10.
return component_factory.create_component_from_func(

tests/system/aiplatform/test_pipeline_job.py::TestPipelineJob::test_add_pipeline_job_to_experiment
tests/system/aiplatform/test_pipeline_job_schedule.py::TestPipelineJobSchedule::test_create_get_pause_resume_update_list
/tmpfs/src/github/python-aiplatform/google/cloud/aiplatform/pipeline_jobs.py:898: DeprecationWarning: The 'warn' method is deprecated, use 'warning' instead
_LOGGER.warn(

tests/system/aiplatform/test_experiments.py::TestExperiments::test_add_pipeline_job_to_experiment
/tmpfs/src/github/python-aiplatform/tests/system/aiplatform/test_experiments.py:376: DeprecationWarning: The module `kfp.v2` is deprecated and will be removed in a futureversion. Please import directly from the `kfp` namespace, instead of `kfp.v2`.
import kfp.v2.dsl as dsl

tests/system/aiplatform/test_experiments.py::TestExperiments::test_add_pipeline_job_to_experiment
/tmpfs/src/github/python-aiplatform/.nox/system-3-10/lib/python3.10/site-packages/kfp/compiler/compiler.py:81: DeprecationWarning: Compiling to JSON is deprecated and will be removed in a future version. Please compile to a YAML file by providing a file path with a .yaml extension instead.
builder.write_pipeline_spec_to_file(

tests/system/aiplatform/test_prediction_cpr.py::TestPredictionCpr::test_build_cpr_model_upload_and_deploy
tests/system/aiplatform/test_prediction_cpr.py::TestPredictionCpr::test_build_cpr_model_upload_and_deploy
/usr/local/lib/python3.10/subprocess.py:955: RuntimeWarning: line buffering (buffering=1) isn't supported in binary mode, the default buffer size will be used
self.stdin = io.open(p2cwrite, 'wb', bufsize)

tests/system/aiplatform/test_prediction_cpr.py::TestPredictionCpr::test_build_cpr_model_upload_and_deploy
tests/system/aiplatform/test_prediction_cpr.py::TestPredictionCpr::test_build_cpr_model_upload_and_deploy
/usr/local/lib/python3.10/subprocess.py:961: RuntimeWarning: line buffering (buffering=1) isn't supported in binary mode, the default buffer size will be used
self.stdout = io.open(c2pread, 'rb', bufsize)

tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_count_tokens_from_text[rest-PROD_ENDPOINT]
/tmpfs/src/github/python-aiplatform/.nox/system-3-10/lib/python3.10/site-packages/pytest_asyncio/plugin.py:911: DeprecationWarning: pytest-asyncio detected an unclosed event loop when tearing down the event_loop
fixture: <_UnixSelectorEventLoop running=False closed=False debug=False>
pytest-asyncio will close the event loop for you, but future versions of the
library will no longer do so. In order to ensure compatibility with future
versions, please make sure that:
1. Any custom "event_loop" fixture properly closes the loop after yielding it
2. The scopes of your custom "event_loop" fixtures do not overlap
3. Your code does not modify the event loop in async fixtures or tests

warnings.warn(

tests/system/aiplatform/test_featurestore.py::TestFeaturestore::test_batch_serve_to_df
/tmpfs/src/github/python-aiplatform/.nox/system-3-10/lib/python3.10/site-packages/pyarrow/pandas_compat.py:735: DeprecationWarning: DatetimeTZBlock is deprecated and will be removed in a future version. Use public APIs instead.
klass=_int.DatetimeTZBlock,

tests/system/aiplatform/test_featurestore.py::TestFeaturestore::test_batch_serve_to_df
/tmpfs/src/github/python-aiplatform/.nox/system-3-10/lib/python3.10/site-packages/pandas/core/frame.py:717: DeprecationWarning: Passing a BlockManager to DataFrame is deprecated and will raise in a future version. Use public APIs instead.
warnings.warn(

tests/system/aiplatform/test_e2e_tabular.py::TestEndToEndTabular::test_end_to_end_tabular
/tmpfs/src/github/python-aiplatform/tests/system/aiplatform/test_e2e_tabular.py:203: PendingDeprecationWarning: Blob.download_as_string() is deprecated and will be removed in future. Use Blob.download_as_bytes() instead.
error_output_filestr = blob.download_as_string().decode()

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: /tmpfs/src/github/python-aiplatform/system_3.10_sponge_log.xml -
=========================== short test summary info ============================
FAILED tests/system/aiplatform/test_experiments.py::TestExperiments::test_create_experiment
FAILED tests/system/aiplatform/test_experiments.py::TestExperiments::test_start_run
FAILED tests/system/aiplatform/test_experiments.py::TestExperiments::test_get_run
FAILED tests/system/aiplatform/test_experiments.py::TestExperiments::test_log_params
FAILED tests/system/aiplatform/test_experiments.py::TestExperiments::test_log_metrics
FAILED tests/system/aiplatform/test_experiments.py::TestExperiments::test_log_time_series_metrics
FAILED tests/system/aiplatform/test_autologging.py::TestAutologging::test_autologging_with_autorun_creation
FAILED tests/system/aiplatform/test_experiments.py::TestExperiments::test_log_classification_metrics
FAILED tests/system/aiplatform/test_experiments.py::TestExperiments::test_log_model
FAILED tests/system/aiplatform/test_experiments.py::TestExperiments::test_log_execution_and_artifact
FAILED tests/system/aiplatform/test_experiments.py::TestExperiments::test_end_run
FAILED tests/system/aiplatform/test_experiments.py::TestExperiments::test_get_experiments_df
FAILED tests/system/aiplatform/test_experiments.py::TestExperiments::test_get_experiments_df_include_time_series_false
FAILED tests/system/aiplatform/test_experiments.py::TestExperiments::test_delete_run_does_not_exist_raises_exception
FAILED tests/system/aiplatform/test_language_models.py::TestLanguageModels::test_code_chat_model_send_message_streaming[grpc]
FAILED tests/system/aiplatform/test_prediction_cpr.py::TestPredictionCpr::test_build_cpr_model_upload_and_deploy
FAILED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_with_cached_content_from_text[grpc-PROD_ENDPOINT]
FAILED tests/system/vertexai/test_reasoning_engines.py::TestReasoningEngines::test_langchain_template
FAILED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_chat_function_calling[rest-PROD_ENDPOINT]
FAILED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_function_calling[grpc-PROD_ENDPOINT]
FAILED tests/system/vertexai/test_generative_models.py::TestGenerativeModels::test_generate_content_function_calling[rest-PROD_ENDPOINT]
FAILED tests/system/aiplatform/test_e2e_forecasting.py::TestEndToEndForecasting4::test_end_to_end_forecasting[TimeSeriesDenseEncoderForecastingTrainingJob]
FAILED tests/system/aiplatform/test_model_version_management.py::TestVersionManagement::test_upload_deploy_manage_versioned_model
FAILED tests/system/aiplatform/test_e2e_forecasting.py::TestEndToEndForecasting1::test_end_to_end_forecasting[AutoMLForecastingTrainingJob]
FAILED tests/system/aiplatform/test_e2e_forecasting.py::TestEndToEndForecasting2::test_end_to_end_forecasting[SequenceToSequencePlusForecastingTrainingJob]
FAILED tests/system/aiplatform/test_model_monitoring.py::TestModelDeploymentMonitoring::test_mdm_two_models_one_valid_config
FAILED tests/system/aiplatform/test_model_monitoring.py::TestModelDeploymentMonitoring::test_mdm_two_models_two_valid_configs
FAILED tests/system/aiplatform/test_model_monitoring.py::TestModelDeploymentMonitoring::test_mdm_notification_channel_alert_config
==== 28 failed, 214 passed, 6 skipped, 154 warnings in 12987.72s (3:36:27) =====
nox > Command py.test -v --junitxml=system_3.10_sponge_log.xml tests/system failed with exit code 1
nox > Session system-3.10 failed.
[FlakyBot] Sending logs to Flaky Bot...
[FlakyBot] See https://github.com/googleapis/repo-automation-bots/tree/main/packages/flakybot.
[FlakyBot] Published system_3.10_sponge_log.xml (13267330372609272)!
[FlakyBot] Done!
cleanup


[ID: 8515517] Command finished after 13319 secs, exit value: 1


Warning: Permanently added 'localhost' (ED25519) to the list of known hosts.
[16:15:31 PST] Collecting build artifacts from build VM
Build script failed with exit code: 1
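Triage note: the three `test_model_monitoring.py` failures above all raise the same `PermissionDenied: 403` — the Vertex AI service agent lacks `bigquery.tables.get` on `bq://mco-mm.bqmlga4.train`. As a minimal sketch (the helper name and the suggested remediation command are illustrative, not part of the log), the error line can be parsed programmatically to recover the service account, table, and missing permission, and to print a candidate `bq add-iam-policy-binding` invocation; `roles/bigquery.dataViewer` is one role that is documented to include `bigquery.tables.get`:

```python
import re

# Error line copied verbatim from the log above.
ERROR = (
    "google.api_core.exceptions.PermissionDenied: 403 Vertex AI Service Agent "
    "service-580378083368@gcp-sa-aiplatform.iam.gserviceaccount.com does not have "
    "the requisite access to BigQuery [bq://mco-mm.bqmlga4.train]. Ensure that the "
    "service account has been granted the bigquery.tables.get permission and try again."
)

def parse_permission_denied(line):
    """Extract (service_account, bq_table, permission) from the 403 message."""
    m = re.search(
        r"(\S+@\S+\.iam\.gserviceaccount\.com)"   # offending service agent
        r".*?\[bq://([^\]]+)\]"                    # BigQuery table URI
        r".*?granted the (\S+) permission",        # missing IAM permission
        line,
    )
    return m.groups() if m else None

sa, table, perm = parse_permission_denied(ERROR)

# Suggested (hypothetical) fix: grant a role containing the missing permission
# at the table level. Verify the role choice before running this for real.
print(
    f"bq add-iam-policy-binding "
    f"--member=serviceAccount:{sa} --role=roles/bigquery.dataViewer {table}"
)
```

Since the same table and service agent appear in every failing monitoring test, fixing the binding once should clear all three; the remaining 25 failures in the summary have unrelated causes and need separate triage.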