English | 简体中文
A toolkit for processing and standardizing file sizes, primarily designed for preparing files before uploading them to Telegram for cloud backup.
`directory_detection.py`:
- Analyzes the size of every subdirectory in a specified directory
- Generates human-readable size formats
- Saves analysis results to a JSON file
- Records and compares directory size changes over time
`compress_files.py`:
- Compresses each subdirectory into a separate ZIP file
- Supports progress display
- Customizable output directory
- Displays compressed file sizes
`merge_small_file.py`:
- Evenly distributes files from a source folder across a specified number of target folders
- Optimizes the distribution based on file sizes
- Generates a detailed allocation report (JSON format)
- Displays the total size of each target folder
`split_large_file.py`:
- Supports splitting and merging large files
- Default chunk size of 1.5GB
- Supports batch processing of large files in a directory
- Generates a split record file
- Smart chunk file management and merging
```bash
python directory_detection.py
```
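The analysis step boils down to walking each first-level subdirectory and summing its file sizes. A minimal sketch of that idea, assuming a one-level scan and a `{name: {"bytes", "human"}}` result layout; `human_readable`, `directory_size`, and `analyze_directory` are illustrative names, not the script's actual API:

```python
# Sketch only: helper names and the JSON layout are assumptions, not the
# script's real interface.
import json
import os


def human_readable(num_bytes: float) -> str:
    """Format a byte count using binary units."""
    for unit in ("B", "KB", "MB", "GB", "TB"):
        if num_bytes < 1024 or unit == "TB":
            return f"{num_bytes:.2f} {unit}"
        num_bytes /= 1024


def directory_size(path: str) -> int:
    """Sum the sizes of all files under path, recursively."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            total += os.path.getsize(os.path.join(root, name))
    return total


def analyze_directory(base: str) -> dict:
    """Map each immediate subdirectory of base to its total size."""
    results = {}
    for entry in os.scandir(base):
        if entry.is_dir():
            size = directory_size(entry.path)
            results[entry.name] = {"bytes": size, "human": human_readable(size)}
    return results


if __name__ == "__main__":
    analysis = analyze_directory(".")
    with open("directory_analysis_results.json", "w", encoding="utf-8") as f:
        json.dump(analysis, f, ensure_ascii=False, indent=2)
```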
```bash
python compress_files.py <source_directory> [-o output_directory]
```
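Conceptually, this step writes one ZIP archive per folder into the output directory and reports each archive's size. A minimal sketch using the standard `zipfile` module; `zip_folder` and `compress_subdirectories` are illustrative names, not the script's API:

```python
# Sketch only: function names are assumptions for illustration.
import os
import zipfile


def zip_folder(folder: str, zip_path: str) -> None:
    """Write every file under folder into a single ZIP archive."""
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(folder):
            for name in files:
                full = os.path.join(root, name)
                # Store paths relative to the folder so the archive stays portable.
                zf.write(full, os.path.relpath(full, folder))


def compress_subdirectories(source: str, output: str = "compressed_files") -> None:
    """Compress each subdirectory of source into its own ZIP file."""
    os.makedirs(output, exist_ok=True)
    for entry in os.scandir(source):
        if entry.is_dir():
            target = os.path.join(output, f"{entry.name}.zip")
            zip_folder(entry.path, target)
            size_mb = os.path.getsize(target) / (1024 * 1024)
            print(f"{target}: {size_mb:.1f} MB")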
```bash
python merge_small_file.py <source_directory> <number_of_target_folders>
```
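Size-balanced distribution is typically done with a largest-first greedy heuristic: sort files by size descending and always place the next file into the currently smallest folder. A minimal sketch of that idea (the actual script's algorithm, folder names, and report format may differ):

```python
# Sketch only: the greedy heuristic and the folder_N naming are assumptions.
import heapq
import json
import os
import shutil


def distribute_files(source: str, num_folders: int, dest: str = "merged_files") -> None:
    files = [
        (os.path.getsize(os.path.join(source, name)), name)
        for name in os.listdir(source)
        if os.path.isfile(os.path.join(source, name))
    ]
    # Min-heap of (current_total_bytes, folder_index): each file goes into
    # whichever folder is currently smallest.
    heap = [(0, i) for i in range(num_folders)]
    heapq.heapify(heap)
    mapping = {i: [] for i in range(num_folders)}
    for size, name in sorted(files, reverse=True):  # largest files first
        total, idx = heapq.heappop(heap)
        mapping[idx].append(name)
        heapq.heappush(heap, (total + size, idx))
    for idx, names in mapping.items():
        folder = os.path.join(dest, f"folder_{idx}")
        os.makedirs(folder, exist_ok=True)
        for name in names:
            shutil.copy2(os.path.join(source, name), folder)
    # Write the allocation report; the real report layout may differ.
    with open("folder_files_map.json", "w", encoding="utf-8") as f:
        json.dump({f"folder_{i}": names for i, names in mapping.items()}, f, indent=2)
```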
```bash
# Split files
python split_large_file.py split <file_path> [-s chunk_size(GB)]

# Merge files
python split_large_file.py merge <chunk_file_path_or_directory>
```
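Splitting amounts to streaming fixed-size chunks into numbered part files; merging concatenates them back in order. A minimal sketch, assuming a hypothetical `.partNNN` naming scheme (the script's real chunk naming and `split_record.json` format may differ):

```python
# Sketch only: the .partNNN scheme is an assumption for illustration.
import glob
import os
import shutil

CHUNK_SIZE = int(1.5 * 1024 ** 3)  # 1.5 GB default chunk size


def split_file(path: str, chunk_size: int = CHUNK_SIZE) -> list:
    """Split path into numbered .partNNN files; returns the chunk paths."""
    buffer_size = 64 * 1024 * 1024  # stream in 64 MB pieces to limit memory use
    parts = []
    index = 0
    with open(path, "rb") as src:
        while True:
            part = f"{path}.part{index:03d}"
            written = 0
            with open(part, "wb") as dst:
                while written < chunk_size:
                    data = src.read(min(buffer_size, chunk_size - written))
                    if not data:
                        break
                    dst.write(data)
                    written += len(data)
            if written == 0:
                os.remove(part)  # nothing left to write; drop the empty chunk
                break
            parts.append(part)
            index += 1
    return parts


def merge_chunks(first_chunk: str, output: str) -> None:
    """Concatenate sibling .partNNN files back into a single file."""
    base = first_chunk.rsplit(".part", 1)[0]
    parts = sorted(
        glob.glob(glob.escape(base) + ".part*"),
        key=lambda p: int(p.rsplit(".part", 1)[1]),  # numeric order, not lexical
    )
    with open(output, "wb") as dst:
        for part in parts:
            with open(part, "rb") as src:
                shutil.copyfileobj(src, dst)
```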
- `directory_analysis_results.json`: Directory analysis results
- `directory_analysis_diff.json`: Directory change records
- `folder_files_map.json`: File allocation mapping
- `split_record.json`: File splitting records
- Use `directory_detection.py` to analyze directory sizes, generating `directory_analysis_results.json`
- When run again after `directory_analysis_results.json` has been generated, it automatically reads that file, analyzes directory changes, and generates `directory_analysis_diff.json` (see the sketch after this list)
- Use `compress_files.py` to compress the second-level folders in the specified path, generating a `compressed_files` folder containing the compressed files
- Use `merge_small_file.py` to merge the small files in the compressed directory into multiple merged folders, each not exceeding 1.5GB, stored in the generated `merged_files` folder
- Use `compress_files.py` again to compress the standardized merged folders, generating a `compressed_files` folder of compressed files ready for upload
- Use `split_large_file.py` to batch split all large files (>2GB) in the specified directory into chunks (1.5GB by default), stored in the current folder for upload
- Later, use `split_large_file.py` to merge the chunk files in the specified directory
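The change-tracking step reduces to comparing two analysis snapshots. A minimal sketch, assuming both files use the `{name: {"bytes": ...}}` layout from the analysis sketch above; the diff structure and the `new_results.json` file name here are illustrative:

```python
# Sketch only: the snapshot layout and diff structure are assumptions.
import json


def diff_analyses(old_path: str, new_path: str) -> dict:
    """Report added, removed, and size-changed directories between snapshots."""
    with open(old_path, encoding="utf-8") as f:
        old = json.load(f)
    with open(new_path, encoding="utf-8") as f:
        new = json.load(f)
    return {
        "added": sorted(set(new) - set(old)),
        "removed": sorted(set(old) - set(new)),
        "changed": {
            name: new[name]["bytes"] - old[name]["bytes"]
            for name in set(old) & set(new)
            if new[name]["bytes"] != old[name]["bytes"]
        },
    }


if __name__ == "__main__":
    result = diff_analyses("directory_analysis_results.json", "new_results.json")
    with open("directory_analysis_diff.json", "w", encoding="utf-8") as f:
        json.dump(result, f, ensure_ascii=False, indent=2)
```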
- Analyzing directories first is recommended before processing large numbers of files
- Compression and splitting operations may require significant free disk space
- Regularly backing up important data is recommended