
Block volume shrink on Xen #11004


Open · wants to merge 1 commit into base: 4.20

Conversation

JoaoJandre (Contributor)

Description

On version 4.16.1.0, PR #5829 blocked volume shrinking via the UI when using XenServer. However, this change was not sufficient, as users may still call the APIs directly to try to shrink their volumes. Furthermore, via the UI it is still possible to select the shrink option when changing disk offerings. This PR removes that option from the UI; the resizeVolume and changeOfferingForVolume APIs were also changed to block this type of operation when using XenServer, similar to what is done for QCOW2 volumes.
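The API-side check described above can be sketched as a simple hypervisor guard. This is an illustrative sketch only: the class and method names below are hypothetical, not the actual CloudStack code (in the real patch the check lives in `VolumeApiServiceImpl` and `CitrixResizeVolumeCommandWrapper`).

```java
// Illustrative sketch of the validation this PR adds to the volume APIs.
// Names here are hypothetical; they do not mirror CloudStack's real classes.
public class ShrinkValidationSketch {

    /** A resize is a shrink when the requested size is below the current size. */
    static boolean isShrink(long currentSizeGb, long newSizeGb) {
        return newSizeGb < currentSizeGb;
    }

    /**
     * Rejects shrink attempts on XenServer regardless of the shrinkOk flag,
     * while growing (or keeping) the volume passes validation.
     */
    static void validateResize(String hypervisor, long currentSizeGb, long newSizeGb) {
        if ("XenServer".equalsIgnoreCase(hypervisor) && isShrink(currentSizeGb, newSizeGb)) {
            throw new UnsupportedOperationException(
                    "Shrink volume is not supported for the XenServer hypervisor.");
        }
    }

    public static void main(String[] args) {
        // Growing a XenServer volume (20 GB -> 100 GB) passes validation.
        validateResize("XenServer", 20, 100);
        System.out.println("grow ok");

        // Shrinking it (20 GB -> 10 GB) is rejected.
        try {
            validateResize("XenServer", 20, 10);
        } catch (UnsupportedOperationException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

This matches the behavior exercised in the tests below: the error is raised only when the target size is actually smaller, which is why a grow request with `shrinkok=true` still succeeds.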

Types of changes

  • Breaking change (fix or feature that would cause existing functionality to change)
  • New feature (non-breaking change which adds functionality)
  • Bug fix (non-breaking change which fixes an issue)
  • Enhancement (improves an existing feature and functionality)
  • Cleanup (Code refactoring and cleanup, that may add test cases)
  • build/CI
  • test (unit or integration test code)

Feature/Enhancement Scale or Bug Severity

Feature/Enhancement Scale

  • Major
  • Minor

Bug Severity

  • BLOCKER
  • Critical
  • Major
  • Minor
  • Trivial

Screenshots (if appropriate):

How Has This Been Tested?


Test Description

Via UI, I validated that the shrink option no longer appears for XenServer volumes:

  • Before: (screenshot)

  • After: (screenshot)

Via CMK, I validated that both APIs reject shrink attempts for XenServer:

(labinterno) 🐱 > change offeringforvolume id=53abae9c-ec2f-4ce3-964d-11981348d2e6 diskofferingid=9aac832e-cdc0-429a-87b5-7b6209111a5d size=10 shrinkok=true
{
  "jobid": "a674a9ae-4ec9-45eb-b2fb-6ee438c60645"
}
(labinterno) 🐱 > query asyncjobresult jobid=a674a9ae-4ec9-45eb-b2fb-6ee438c60645
...
  "errorcode": 431,
  "errortext": "Shrink volume is not supported for the XenServer hypervisor."
...
(labinterno) 🐱 > resize volume id=53abae9c-ec2f-4ce3-964d-11981348d2e6 size=28 shrinkok=true
{
  "jobid": "954da1c8-e107-41d4-a64c-5e15bfeec676"
}
(labinterno) 🐱 > query asyncjobresult jobid=954da1c8-e107-41d4-a64c-5e15bfeec676
...
  "errorcode": 432,
  "errortext": "Shrink volume is not supported for the XenServer hypervisor."
...

I also verified that both APIs still work correctly when the volume is not actually shrunk:

(labinterno) 🐱 > change offeringforvolume id=53abae9c-ec2f-4ce3-964d-11981348d2e6 diskofferingid=9aac832e-cdc0-429a-87b5-7b6209111a5d size=100 shrinkok=true
{
  "jobid": "59934dc9-a6e8-42cc-802f-43004d705e2f"
}
(labinterno) 🐱 > query asyncjobresult jobid=59934dc9-a6e8-42cc-802f-43004d705e2f
...
  "jobresultcode": 0,
  "jobstatus": 1,
...
(labinterno) 🐱 > resize volume id=53abae9c-ec2f-4ce3-964d-11981348d2e6 size=30
{
  "jobid": "5848d956-75b3-43bb-acbc-5c3caf3cc9c4"
}
(labinterno) 🐱 > query asyncjobresult jobid=5848d956-75b3-43bb-acbc-5c3caf3cc9c4
...
  "jobresultcode": 0,
  "jobstatus": 1,
...

@JoaoJandre JoaoJandre added this to the 4.20.2 milestone Jun 11, 2025

codecov bot commented Jun 11, 2025

Codecov Report

Attention: Patch coverage is 0% with 7 lines in your changes missing coverage. Please review.

Project coverage is 16.14%. Comparing base (41de0b9) to head (c7e4af3).

| Files with missing lines | Patch % | Lines |
|---|---|---|
| ...pper/xenbase/CitrixResizeVolumeCommandWrapper.java | 0.00% | 3 Missing and 2 partials ⚠️ |
| ...n/java/com/cloud/storage/VolumeApiServiceImpl.java | 0.00% | 2 Missing ⚠️ |
Additional details and impacted files
@@             Coverage Diff              @@
##               4.20   #11004      +/-   ##
============================================
- Coverage     16.14%   16.14%   -0.01%     
  Complexity    13255    13255              
============================================
  Files          5657     5657              
  Lines        497903   497908       +5     
  Branches      60375    60377       +2     
============================================
- Hits          80395    80389       -6     
- Misses       408549   408558       +9     
- Partials       8959     8961       +2     
| Flag | Coverage Δ |
|---|---|
| uitests | 4.00% <ø> (ø) |
| unittests | 16.99% <0.00%> (-0.01%) ⬇️ |



@JoaoJandre (Contributor, Author)

@blueorangutan package

@blueorangutan

@JoaoJandre a [SL] Jenkins job has been kicked to build packages. It will be bundled with KVM, XenServer and VMware SystemVM templates. I'll keep you posted as I make progress.

@blueorangutan

Packaging result [SF]: ✔️ el8 ✔️ el9 ✖️ debian ✔️ suse15. SL-JID 13716

@DaanHoogland (Contributor) left a comment:


CLGTM

@blueorangutan

Packaging result [SF]: ✔️ el8 ✔️ el9 ✔️ debian ✔️ suse15. SL-JID 13736

@harikrishna-patnala (Contributor) left a comment:


Agreed, LGTM.

@DaanHoogland (Contributor)

@blueorangutan test

@blueorangutan

@DaanHoogland a [SL] Trillian-Jenkins test job (ol8 mgmt + kvm-ol8) has been kicked to run smoke tests

@blueorangutan

[SF] Trillian test result (tid-13520)
Environment: kvm-ol8 (x2), Advanced Networking with Mgmt server ol8
Total time taken: 54397 seconds
Marvin logs: https://github.com/blueorangutan/acs-prs/releases/download/trillian/pr11004-t13520-kvm-ol8.zip
Smoke tests completed. 140 look OK, 1 have errors, 0 did not run
Only failed and skipped tests results shown below:

| Test | Result | Time (s) | Test File |
|---|---|---|---|
| test_10_reboot_cpvm_forced | Error | 9.46 | test_ssvm.py |

@DaanHoogland (Contributor)

@blueorangutan test ol8 xcpng82

@blueorangutan

@DaanHoogland a [SL] Trillian-Jenkins test job (ol8 mgmt + xcpng82) has been kicked to run smoke tests

@blueorangutan

[SF] Trillian Build Failed (tid-13526)

@blueorangutan

[SF] Trillian test result (tid-13530)
Environment: xcpng82 (x2), Advanced Networking with Mgmt server ol8
Total time taken: 79379 seconds
Marvin logs: https://github.com/blueorangutan/acs-prs/releases/download/trillian/pr11004-t13530-xcpng82.zip
Smoke tests completed. 140 look OK, 1 have errors, 0 did not run
Only failed and skipped tests results shown below:

| Test | Result | Time (s) | Test File |
|---|---|---|---|
| test_05_ping_in_cpvm_success | Failure | 15.45 | test_diagnostics.py |

@DaanHoogland (Contributor)

Looks good; I'm not sure it needs further testing, but let's run the tests again to satisfy my paranoia:
@blueorangutan test ol8 xcpng82

@blueorangutan

@DaanHoogland a [SL] Trillian-Jenkins test job (ol8 mgmt + xcpng82) has been kicked to run smoke tests

@blueorangutan

[SF] Trillian Build Failed (tid-13542)

@blueorangutan

[SF] Trillian test result (tid-13547)
Environment: xcpng82 (x2), Advanced Networking with Mgmt server ol8
Total time taken: 80961 seconds
Marvin logs: https://github.com/blueorangutan/acs-prs/releases/download/trillian/pr11004-t13547-xcpng82.zip
Smoke tests completed. 141 look OK, 0 have errors, 0 did not run
No failed or skipped tests this run.
4 participants