Commits
18 commits
53c522c
Docs: Update README.md for validation
KarolinaPomian Sep 5, 2025
15f616b
Fix: update paths-ignore in smoke-tests.yml and clarify README parame…
KarolinaPomian Sep 5, 2025
fa78369
Merge branch 'main' into readme-update-validation
KarolinaPomian Sep 5, 2025
6f3e4fb
Merge branch 'main' into readme-update-validation
KarolinaPomian Sep 5, 2025
f64812c
Docs: Revise README.md for clarity and structure in validation framew…
KarolinaPomian Sep 5, 2025
b5b61ba
Merge branch 'main' into readme-update-validation
KarolinaPomian Sep 9, 2025
a81776b
Merge branch 'main' into readme-update-validation
KarolinaPomian Sep 23, 2025
849a37d
Docs: Move validation framework documentation to doc/ directory
KarolinaPomian Sep 24, 2025
e781a79
Docs: Fix critical validation framework setup issues
KarolinaPomian Sep 24, 2025
f9b22e3
Docs: Improve validation config setup instructions
KarolinaPomian Sep 24, 2025
a267ab4
docs: Comprehensive MTL validation framework documentation improvements
KarolinaPomian Sep 24, 2025
49f8bb1
Merge branch 'main' into readme-update-validation
KarolinaPomian Sep 24, 2025
facca62
docs: address validation documentation clarity issues
KarolinaPomian Sep 24, 2025
59aa3b0
docs: add interactive paths to config files in configs README
KarolinaPomian Sep 24, 2025
a1f759f
docs: fix markdown linting errors
KarolinaPomian Sep 24, 2025
3974112
Update tests/validation/configs/README.md
KarolinaPomian Sep 24, 2025
298c449
docs: update validation framework and quick start guide with RxTxApp …
KarolinaPomian Sep 24, 2025
9c96ada
Merge branch 'main' into readme-update-validation
KarolinaPomian Sep 29, 2025
6 changes: 6 additions & 0 deletions .github/workflows/smoke-tests.yml
```diff
@@ -5,10 +5,16 @@ on:
     branches:
       - main
       - 'maint-**'
+    paths-ignore:
+      - '**.md'
+      - 'doc/**'
   pull_request:
     branches:
       - main
       - 'maint-**'
+    paths-ignore:
+      - '**.md'
+      - 'doc/**'
 env:
   BUILD_TYPE: 'Release'
   DPDK_VERSION: '25.03'
```
226 changes: 226 additions & 0 deletions tests/validation/README.md
@@ -0,0 +1,226 @@
# Media Transport Library Validation Test Suite

This directory contains the automated validation test suite for the Media Transport Library. The tests are designed to verify the functionality and performance of the Media Transport Library and its compliance with SMPTE ST 2110 standards.

## Overview

The validation framework uses pytest to organize and execute tests across various components of the Media Transport Library. It supports testing of single and dual flow scenarios, various transport protocols, and integration with media processing tools like FFmpeg and GStreamer.

## Test Framework Structure

```plaintext
tests/validation/
├── common/ # Shared utilities for tests
│ ├── ffmpeg_handler/ # FFmpeg integration utilities
│ ├── integrity/ # Data integrity verification tools
│ └── nicctl.py # Network interface control
├── configs/ # Test configuration files
│ ├── test_config.yaml # Test environment settings
│ └── topology_config.yaml # Network topology configuration
├── create_pcap_file/ # Tools for packet capture file creation
├── mtl_engine/ # Core test framework components
│ ├── execute.py # Test execution management
│ ├── RxTxApp.py # RX/TX application interface
│ ├── GstreamerApp.py # GStreamer integration
│ ├── ffmpeg_app.py # FFmpeg integration
│ ├── csv_report.py # Test result reporting
│ └── ramdisk.py # RAM disk management
├── tests/ # Test modules
│ ├── single/ # Single-flow test scenarios
│ │ ├── dma/ # DMA tests
│ │ ├── ffmpeg/ # FFmpeg integration tests
│ │ ├── gstreamer/ # GStreamer integration tests
│ │ ├── kernel_socket/ # Kernel socket tests
│ │ ├── performance/ # Performance benchmarking
│ │ ├── ptp/ # Precision Time Protocol tests
│ │ ├── st20p/ # ST2110-20 video tests
│ │ ├── st22p/ # ST2110-22 compressed video tests
│ │ ├── st30p/ # ST2110-30 audio tests
│ │ └── st41/ # ST2110-41 fast metadata tests
│ ├── dual/ # Dual-flow test scenarios
│ └── invalid/ # Error handling and negative test cases
├── conftest.py # pytest configuration and fixtures
├── pytest.ini # pytest settings
└── requirements.txt # Python dependencies
```

## Setup and Installation

### Prerequisites

- Python 3.9 or higher
- Media Transport Library built and installed
- Network interfaces configured for testing
- Sufficient permissions for network management
Collaborator:
  • It is also necessary to have the files used as input data. We currently have them on NFS.
  • What does it mean to have "network interfaces configured for testing"? I guess you need to have everything from MTL's run.md done, but VFs are created automatically.
  • By "sufficient permissions for network management", do you mean you have to be the root user? Is there any other option?

Collaborator:
  • You need to install the FFmpeg and GStreamer plugins to run some of the tests.


### Environment Setup

1. Create and activate a Python virtual environment:

```bash
python -m venv venv
source venv/bin/activate
```

2. Install required dependencies:

```bash
pip install -r requirements.txt
```

3. Configure test parameters:

Edit `configs/test_config.yaml` with the appropriate paths:
- Set `build` and `mtl_path` to the path of your Media Transport Library build
- Configure `media_path` to point to your test media files
- Adjust RAM disk settings if needed
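An illustrative `test_config.yaml` fragment is shown below; the key names are those referenced above, but the exact structure and all paths are assumptions to adapt against the shipped file:

```yaml
# Illustrative values only; verify against the shipped configs/test_config.yaml
build: /home/user/Media-Transport-Library/build
mtl_path: /home/user/Media-Transport-Library
media_path: /mnt/media
ramdisk:
  media:
    mountpoint: /mnt/ramdisk/media
    size_gib: 32
```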

Edit `configs/topology_config.yaml` to match your network configuration:
- Set the correct `ip_address`, `SSH_PORT`, `USERNAME`, and `KEY_PATH`
- Configure the appropriate `pci_device` for your network interfaces
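Similarly, an illustrative `topology_config.yaml` fragment (only the field names come from the list above; the nesting and all values are assumptions):

```yaml
# Illustrative values only; verify against the shipped configs/topology_config.yaml
hosts:
  sut:
    ip_address: 192.168.1.10
    SSH_PORT: 22
    USERNAME: labuser
    KEY_PATH: /home/labuser/.ssh/id_rsa
    network_interfaces:
      - pci_device: "0000:4b:00.0"
```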

4. Start the MtlManager service:

```bash
sudo MtlManager &
```

5. (Optional) Create VFs for NIC testing:

```bash
sudo ./script/nicctl.sh create_vf "${TEST_PF_PORT_P}"
sudo ./script/nicctl.sh create_vf "${TEST_PF_PORT_R}"
```

Replace `${TEST_PF_PORT_P}` and `${TEST_PF_PORT_R}` with your physical port identifiers.

## Running Tests

### Basic Test Execution

Run all tests with configuration files:

```bash
python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml
```

Run specific test modules:

```bash
python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml tests/single/st20p/test_st20p_rx.py
```

### Test Categories

The tests are categorized with markers that can be used to run specific test groups:

- **Smoke Tests**: Quick verification tests

  ```bash
  python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m smoke
  ```

- **Nightly Tests**: Comprehensive tests suitable for nightly runs

  ```bash
  python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m nightly
  ```

### Generating HTML Reports

You can generate HTML reports for test results:

```bash
python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m smoke --template=html/index.html --report=report.html
```

### Test Output and Reports

- Logs are written to `pytest.log`
- CSV reports are generated for compliance results
- The framework stores test results in a structured format for later analysis

## Test Configuration

### RAM Disk Configuration

Tests utilize RAM disks for high-performance media handling. Configure in `test_config.yaml`:

```yaml
ramdisk:
  media:
    mountpoint: /mnt/ramdisk/media
    size_gib: 32
  pcap:
    mountpoint: /mnt/ramdisk/pcap
    size_gib: 768
```
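As a minimal sketch of what these settings imply, each entry maps naturally onto a tmpfs mount. The framework's `ramdisk.py` may do this differently; the function and config dict below are purely illustrative:

```python
# Sketch: translate ramdisk settings into tmpfs mount commands.
# Illustrative only; the framework's ramdisk.py may work differently.

def tmpfs_mount_cmd(mountpoint: str, size_gib: int) -> list:
    """Build a mount command for a tmpfs-backed RAM disk of the given size."""
    return [
        "sudo", "mount", "-t", "tmpfs",
        "-o", f"size={size_gib}G",
        "tmpfs", mountpoint,
    ]

# Mirrors the YAML fragment above as a plain dict.
ramdisk_cfg = {
    "media": {"mountpoint": "/mnt/ramdisk/media", "size_gib": 32},
    "pcap": {"mountpoint": "/mnt/ramdisk/pcap", "size_gib": 768},
}

for name, cfg in ramdisk_cfg.items():
    print(" ".join(tmpfs_mount_cmd(cfg["mountpoint"], cfg["size_gib"])))
```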

### Network Capture

Configure network packet capture settings in `test_config.yaml`:

```yaml
capture_cfg:
  enable: true
  test_name: test_name
  pcap_dir: /mnt/ramdisk/pcap
  capture_time: 5
  interface: enp1s0f0
```
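To make the fields concrete, here is a sketch of how such a config could drive a time-limited capture. The use of `tcpdump` here is an assumption for illustration; the framework's real capture code may use different tooling:

```python
# Sketch: build a packet-capture command from a capture_cfg-style dict.
# tcpdump is an assumption; the framework may capture differently.

def capture_cmd(cfg: dict) -> list:
    """Build a command that captures on cfg['interface'] for cfg['capture_time'] seconds."""
    pcap_path = f"{cfg['pcap_dir']}/{cfg['test_name']}.pcap"
    return [
        "sudo", "timeout", str(cfg["capture_time"]),
        "tcpdump", "-i", cfg["interface"], "-w", pcap_path,
    ]

cfg = {
    "enable": True,
    "test_name": "test_name",
    "pcap_dir": "/mnt/ramdisk/pcap",
    "capture_time": 5,
    "interface": "enp1s0f0",
}
if cfg["enable"]:
    print(" ".join(capture_cmd(cfg)))
```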

## Test Types

### Media Flow Tests

- **ST20p**: Tests for ST2110-20 (uncompressed video)
- **ST22p**: Tests for ST2110-22 (compressed video)
- **ST30p**: Tests for ST2110-30 (audio)
- **ST41**: Tests for ST2110-41 (fast metadata)

### Backend Tests

- **DMA**: Direct Memory Access tests
- **Kernel Socket**: Tests for kernel socket backend
- **XDP**: Tests for Express Data Path backend

### Integration Tests

- **FFmpeg**: Tests for FFmpeg integration
- **GStreamer**: Tests for GStreamer integration

### Performance Tests

- Tests to measure throughput, latency, and other performance metrics

## Extending the Test Suite

### Adding New Tests

1. Create a new test file in the appropriate directory under `tests/`
2. Follow the pytest format for test functions
3. Use existing fixtures from `conftest.py` or create new ones as needed
4. Add appropriate markers for test categorization
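The steps above can be sketched as a minimal test module. The fixture usage and all values are hypothetical; real tests would use the fixtures provided by `conftest.py`:

```python
# Hypothetical skeleton of a new test file under tests/single/st20p/.
import pytest


@pytest.mark.smoke
def test_st20p_example_flow():
    # Arrange: session parameters (values are illustrative only)
    width, height, fps = 1920, 1080, 50

    # Act: a real test would launch the TX/RX applications via conftest.py
    # fixtures; here the counters stand in for that step.
    frames_sent = frames_received = 100

    # Assert: the flow completed without frame loss
    assert frames_received == frames_sent
```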

### Adding New Test Categories

1. Define the new marker in `pytest.ini`
2. Create a new directory under `tests/` if necessary
3. Add test files with the new marker
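If `pytest.ini` follows the standard `markers` convention, registering a new category could look like this (the `soak` marker and the one-line descriptions are hypothetical; `smoke` and `nightly` are the categories already mentioned above):

```ini
[pytest]
markers =
    smoke: quick verification tests
    nightly: comprehensive tests suitable for nightly runs
    soak: long-running stability tests (hypothetical new category)
```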

## Troubleshooting

### Common Issues

- **Network Interface Not Found**: Verify the interface configuration in `topology_config.yaml`
- **Test Media Not Found**: Check the `media_path` setting in `test_config.yaml`
- **Permission Issues**: Ensure the user has sufficient permissions for network operations

### Logs and Debugging

- Check `pytest.log` for detailed test execution logs
- Use the `--verbose` flag for more detailed output
- For network issues, use the packet capture feature to analyze traffic

## License

BSD-3-Clause License
Copyright (c) 2024-2025 Intel Corporation
103 changes: 103 additions & 0 deletions tests/validation/common/README.md
@@ -0,0 +1,103 @@
# Common Test Utilities

This directory contains shared utilities used across the Media Transport Library validation test suite. These utilities provide common functionality for network interface management, media integrity verification, and FFmpeg handling.

## Components

### nicctl.py

The `nicctl.py` module provides a `Nicctl` class for network interface control:

- Interface configuration and management
- PCI device binding and unbinding
- Link status monitoring
- MTU and other interface parameter configuration

Example usage:

```python
from common.nicctl import Nicctl

# Create a network interface controller
nic = Nicctl()

# Configure interface
nic.configure_interface("enp1s0f0", "192.168.1.10", "255.255.255.0")

# Check link status
status = nic.get_link_status("enp1s0f0")
```

### integrity/

This directory contains tools for verifying data integrity in media transport tests:

- Pixel comparison utilities for video integrity checks
- Audio sample verification
- Ancillary data integrity checks
- Error statistics calculation

Key modules:

- `video_integrity.py`: Functions for comparing video frames before and after transport
- `audio_integrity.py`: Functions for comparing audio samples
- `ancillary_integrity.py`: Functions for comparing ancillary data

### ffmpeg_handler/

This directory contains utilities for FFmpeg integration:

- FFmpeg command generation
- Output parsing and analysis
- Media format detection and conversion
- Encoder and decoder integration

Key modules:

- `ffmpeg_cmd.py`: Functions for generating FFmpeg command lines
- `ffmpeg_output.py`: Functions for parsing and analyzing FFmpeg output
- `ffmpeg_formats.py`: Media format definitions and utilities

### gen_frames.sh

A shell script for generating test frames for video testing:

- Creates test patterns in various formats
- Supports different resolutions and frame rates
- Configurable color patterns and test signals
Collaborator:
Maybe adding usage here would be good?
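The script's exact options are not documented here; as an illustrative alternative, similar raw test frames can be produced with stock FFmpeg. The command builder below is a sketch, and all sizes, rates, pixel formats, and paths are assumptions:

```python
# Sketch: build a stock-FFmpeg command that generates raw YUV test frames,
# similar in spirit to what gen_frames.sh automates. Illustrative only.

def gen_frames_cmd(width, height, fps, frames, pix_fmt, out_path):
    """Build an ffmpeg command producing `frames` raw test-pattern frames."""
    return [
        "ffmpeg", "-y",
        "-f", "lavfi", "-i", f"testsrc2=size={width}x{height}:rate={fps}",
        "-pix_fmt", pix_fmt,
        "-frames:v", str(frames),
        "-f", "rawvideo", out_path,
    ]

print(" ".join(gen_frames_cmd(1920, 1080, 50, 10, "yuv422p10le", "/tmp/frames.yuv")))
```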


## Using Common Utilities in Tests

These utilities are imported and used by test modules to set up test environments, execute tests, and validate results.

Example:

```python
from common.nicctl import Nicctl
from common.integrity.video_integrity import compare_frames

def test_st20_transport():
    # Configure network interfaces
    nic = Nicctl()
    nic.configure_interface("enp1s0f0", "192.168.1.10", "255.255.255.0")

    # Run transport test
    # ...

    # Verify frame integrity
    result = compare_frames("reference_frame.yuv", "received_frame.yuv")
    assert result.match_percentage > 99.9, "Frame integrity check failed"
```

## Extending Common Utilities

To add new common utilities:

1. Create new Python modules in the appropriate subdirectory
2. Document the module's purpose and API
3. Import the new utilities in test modules as needed

## License

BSD-3-Clause License
Copyright (c) 2024-2025 Intel Corporation