
Conversation


@bali0019 bali0019 commented Sep 2, 2025

  • Fix existing unit test failures across all test files
  • Add comprehensive Azure DevOps pipeline test coverage
  • Fix markdown link checker tests by adding ignore patterns for broken links
  • Add integration test suite for end-to-end MLOps stack testing
  • Add GitHub Actions workflow for automated integration testing with AWS/Azure support
  • Update main test workflow with Node.js v22 for markdown link compatibility
  • Configure workflows to use Databricks-hosted runners for secure workspace access
  • Fix Black formatting issues across all test files

This enables automated nightly integration testing and improves overall test coverage

Remove config file from test log artifacts to streamline uploaded content

jobs:
  integration-tests:
    runs-on:
Author

FYI: this is temporary; we will need to work on dedicated infra for this now that we have integration tests.

- Use existing current_user fixture instead of slow CLI calls
- Remove unused deployed_project_path dependency
Collaborator

@arpitjasa-db left a comment

It's a bit difficult to review a 4000-line PR. Can we break this up into multiple PRs (maybe one per test suite) and I'll review them one at a time? I'll go in order, so comments on one PR can be propagated to future PRs, preventing huge rewrites on your end from one systemic issue.

on:
  schedule:
    # Run nightly at 2 AM UTC
    - cron: '0 2 * * *'
Collaborator

Hmm instead of nightly, we can just run (with approval from a maintainer) on PRs
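The approval-gated PR run suggested above could be sketched with a GitHub Actions protected environment (an illustration, not a tested config; the `integration` environment name is an assumption and would need required reviewers configured under the repo's Settings → Environments):

```yaml
# Sketch: run integration tests on PRs, gated behind maintainer approval.
# Assumes an "integration" environment with required reviewers exists
# (name is illustrative).
on:
  pull_request:
    branches: [main]

jobs:
  integration-tests:
    # The job pauses until a required reviewer for the
    # "integration" environment approves the run.
    environment: integration
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
```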


notify-failure:
  runs-on:
    group: databricks-field-eng-protected-runner-group
Collaborator
Not sure if we can access these groups from public repos (haven't been able to in the past)

username: ${{ secrets.NOTIFICATION_EMAIL_USERNAME }}
password: ${{ secrets.NOTIFICATION_EMAIL_PASSWORD }}
subject: "FAILED: MLOps Stacks Integration Tests Failed - ${{ github.sha }}"
to: ${{ secrets.NOTIFICATION_EMAIL_TO }}
Collaborator
Which email is this to?

  needs: integration-tests
  if: failure() && github.event_name == 'schedule'
  steps:
    - name: Send Email Notification
Collaborator
I don't think we need email notifications for failures if we end up just running these on PRs; could we just leave a comment on the PR instead?
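Commenting on the PR from the workflow could look roughly like this with `actions/github-script` (a sketch replacing the email step; the step name and message body are illustrative):

```yaml
# Sketch: post a PR comment on failure instead of sending an email.
- name: Comment on PR on failure
  if: failure() && github.event_name == 'pull_request'
  uses: actions/github-script@v7
  with:
    script: |
      await github.rest.issues.createComment({
        owner: context.repo.owner,
        repo: context.repo.repo,
        issue_number: context.issue.number,
        body: `Integration tests failed for ${context.sha}. See the run logs for details.`,
      });
```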

@@ -0,0 +1,2 @@
# Integration tests for MLOps Stacks
# These tests require a real Databricks workspace and are configured via CLI profiles
Collaborator
A comment explaining why this file is empty would probably help.



def _cleanup_unity_catalog_model(databricks_cli, workspace_config, project_name):
"""Clean up Unity Catalog models by finding and deleting all matching models."""
Collaborator
Why don't we just use the Databricks SDK instead of running all these subprocesses?
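The SDK-based version of that cleanup helper could be sketched roughly like this (an illustration only, assuming `databricks-sdk` is installed and authentication resolves from environment variables or a CLI profile; the function name and parameters loosely mirror the helper above but are otherwise assumptions):

```python
# Sketch of the Unity Catalog model cleanup using the Databricks SDK
# instead of shelling out to the CLI in subprocesses. Illustrative only;
# assumes `databricks-sdk` is installed and that catalog/schema come from
# the test's workspace config.
def cleanup_unity_catalog_models(catalog: str, schema: str, project_name: str) -> None:
    """Delete every registered model in catalog.schema whose name contains project_name."""
    # Deferred import so this module still loads where the SDK is absent.
    from databricks.sdk import WorkspaceClient

    w = WorkspaceClient()  # resolves auth from env vars or a Databricks CLI profile
    for model in w.registered_models.list(catalog_name=catalog, schema_name=schema):
        if project_name in model.name:
            w.registered_models.delete(full_name=model.full_name)
```

This avoids spawning a subprocess per CLI call and gives typed results instead of parsed stdout.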
