
Commit 8be63c9

Authored by harupy
Merge large-requirements.txt and small-requirements.txt (mlflow#5942)
* remove large-requirements.txt
* rename
* Remove large-requirements.txt
* fix env vars
* remove scipy
* remove pyspark in test-requirements
* install pyspark in pyfunc job
* remove redundant env var
* fix skinny job
* fix typo
* fix comment

All commits signed off by: harupy <[email protected]>
1 parent 51919d5 · commit 8be63c9

13 files changed: +21 -46 lines

.dockerignore (+1 -2)

@@ -8,8 +8,7 @@ examples
 
 dev
 # required for Docker build
-!requirements/small-requirements.txt
-!requirements/large-requirements.txt
+!requirements/test-requirements.txt
 !requirements/lint-requirements.txt
 !requirements/extra-ml-requirements.txt
 

.github/workflows/cross-version-tests.yml (+1 -1)

@@ -152,7 +152,7 @@ jobs:
           python --version
           pip install --upgrade pip wheel
           pip install -e .
-          pip install -r requirements/small-requirements.txt
+          pip install -r requirements/test-requirements.txt
       - name: Install ${{ matrix.package }} ${{ matrix.version }}
         env:
           CACHE_DIR: /home/runner/.cache/wheels

.github/workflows/examples.yml (+1 -2)

@@ -69,8 +69,7 @@ jobs:
       - name: Install dependencies
         if: ${{ github.event_name == 'schedule' || steps.check-diff.outputs.examples_changed == 'true' }}
         env:
-          INSTALL_SMALL_PYTHON_DEPS: true
-          INSTALL_LARGE_PYTHON_DEPS: true
+          INSTALL_ML_DEPENDENCIES: true
         run: |
           source ./dev/install-common-deps.sh
 

.github/workflows/master.yml (+7 -16)

@@ -55,8 +55,7 @@ jobs:
           echo "::add-matcher::.github/workflows/matchers/pylint.json"
       - name: Install dependencies
         env:
-          INSTALL_LARGE_PYTHON_DEPS: true
-          INSTALL_SMALL_PYTHON_DEPS: true
+          INSTALL_ML_DEPENDENCIES: true
         run: |
           source ./dev/install-common-deps.sh
           pip install -r requirements/lint-requirements.txt
@@ -110,8 +109,7 @@ jobs:
       - uses: ./.github/actions/cache-pip
       - name: Install dependencies
         env:
-          INSTALL_LARGE_PYTHON_DEPS: true
-          INSTALL_SMALL_PYTHON_DEPS: true
+          INSTALL_ML_DEPENDENCIES: true
         run: |
           source ./dev/install-common-deps.sh
       - name: Run tests
@@ -221,8 +219,7 @@ jobs:
       - uses: ./.github/actions/cache-pip
       - name: Install dependencies
         env:
-          INSTALL_LARGE_PYTHON_DEPS: true
-          INSTALL_SMALL_PYTHON_DEPS: true
+          INSTALL_ML_DEPENDENCIES: true
         run: |
           source ./dev/install-common-deps.sh
       - name: Run tests
@@ -249,9 +246,6 @@ jobs:
           java-version: 11
           distribution: 'adopt'
       - name: Install dependencies
-        env:
-          INSTALL_SMALL_PYTHON_DEPS: true
-          INSTALL_LARGE_PYTHON_DEPS: false
         run: |
           source ./dev/install-common-deps.sh
           pip install pyspark
@@ -276,12 +270,9 @@ jobs:
           java-version: 11
           distribution: 'adopt'
       - name: Install dependencies
-        env:
-          INSTALL_SMALL_PYTHON_DEPS: true
-          INSTALL_LARGE_PYTHON_DEPS: false
         run: |
           source ./dev/install-common-deps.sh
-          pip install keras tensorflow
+          pip install keras tensorflow pyspark
       - name: Run tests
         env:
           SPARK_LOCAL_IP: 127.0.0.1
@@ -303,8 +294,7 @@ jobs:
           distribution: 'adopt'
       - name: Install dependencies
         env:
-          INSTALL_LARGE_PYTHON_DEPS: true
-          INSTALL_SMALL_PYTHON_DEPS: true
+          INSTALL_ML_DEPENDENCIES: true
         run: |
           source ./dev/install-common-deps.sh
       - name: Run tests
@@ -328,9 +318,10 @@ jobs:
           distribution: 'adopt'
       - name: Install python dependencies
         run: |
-          pip install -r requirements/small-requirements.txt
+          pip install -r requirements/test-requirements.txt
           pip install --no-dependencies tests/resources/mlflow-test-plugin
           pip install -e .[extras]
+          pip install pyspark
       - name: Run python tests
         run: |
           pytest --ignore-flavors --ignore=tests/projects --ignore=tests/examples tests

CONTRIBUTING.rst (+1 -1)

@@ -435,7 +435,7 @@ If you are adding new framework flavor support, you'll need to modify ``pytest``
    a. Add your tests to the ignore list, where the other frameworks are ignored
    b. Add a pytest command for your tests along with the other framework tests (as a separate command to avoid OOM issues)
 
-2. ``requirements/large-requirements.txt``: add your framework and version to the list of requirements
+2. ``requirements/test-requirements.txt``: add your framework and version to the list of requirements
 
 You can see an example of a `flavor PR <https://github.com/mlflow/mlflow/pull/2136/files>`_.
 
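As a hedged illustration of step 2 above (the flavor name "mylib" and its version bound are hypothetical placeholders, not part of this commit), a new flavor PR would now append something like the following, reusing the "# Required by mlflow.<module>" comment style the requirements files already follow:

# Sketch only: append a hypothetical flavor's test dependency.
# "mylib" is a placeholder, not a real MLflow flavor.
cat >> requirements/test-requirements.txt <<'EOF'
# Required by mlflow.mylib
mylib>=1.0
EOF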

EXTRA_DEPENDENCIES.rst (+1 -2)

@@ -15,5 +15,4 @@ The full set of extra dependencies are documented, along with the modules that depend on them,
 in the following files:
 
 * extra-ml-requirements.txt: ML libraries needed to use model persistence and inference APIs
-* small-requirements.txt, large-requirements.txt: Libraries required to use non-default
-  artifact-logging and tracking server configurations
+* test-requirements.txt: Libraries required to use non-default artifact-logging and tracking server configurations

dev/dev-env-setup.sh (+3 -3)

@@ -176,13 +176,13 @@ fi
 echo "The top-level dependencies that will be installed are: "
 
 if [[ -n "$full" ]]; then
-  files=("$rd/small-requirements.txt" "$rd/lint-requirements.txt" "$rd/large-requirements.txt" "$rd/doc-requirements.txt" "$rd/extra-ml-requirements.txt")
+  files=("$rd/test-requirements.txt" "$rd/lint-requirements.txt" "$rd/doc-requirements.txt" "$rd/extra-ml-requirements.txt")
   echo "Files:"
   echo "MLflow test plugin: $MLFLOW_HOME/tests/resources/mlflow-test-plugin"
   echo "The local development branch of MLflow installed in editable mode with 'extras' requirements"
   echo "The following packages: "
 else
-  files=("$rd/small-requirements.txt" "$rd/lint-requirements.txt" "$rd/large-requirements.txt" "$rd/doc-requirements.txt")
+  files=("$rd/test-requirements.txt" "$rd/lint-requirements.txt" "$rd/doc-requirements.txt")
 fi
 tail -n +1 "${files[@]}" | grep "^[^#= ]" | sort | cat
 
@@ -235,7 +235,7 @@ if [[ -n "$full" ]]; then
   pip install $(quiet_command) -e "$MLFLOW_HOME/tests/resources//mlflow-test-plugin"
   echo "Finished installing pip dependencies."
 else
-  files=("$rd/small-requirements.txt" "$rd/lint-requirements.txt" "$rd/large-requirements.txt" "$rd/doc-requirements.txt")
+  files=("$rd/test-requirements.txt" "$rd/lint-requirements.txt" "$rd/doc-requirements.txt")
   for r in "${files[@]}";
   do
     pip install $(quiet_command) -r "$r"
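A small sketch of what the updated non-full branch now previews (assuming "$rd" points at the requirements/ directory, as in the script); the tail/grep/sort pipeline lists top-level dependencies across the merged file set:

rd=requirements
files=("$rd/test-requirements.txt" "$rd/lint-requirements.txt" "$rd/doc-requirements.txt")
# tail -n +1 prints each file with a "==> file <==" banner; the grep pattern
# "^[^#= ]" drops those banners ('='), comments ('#'), indented continuation
# lines, and blanks, leaving only top-level requirement names to sort.
tail -n +1 "${files[@]}" | grep "^[^#= ]" | sort | cat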

dev/install-common-deps.sh (+3 -9)

@@ -29,18 +29,12 @@ export MLFLOW_HOME=$(pwd)
 
 req_files=""
 # Install Python test dependencies only if we're running Python tests
-if [[ "$INSTALL_SMALL_PYTHON_DEPS" == "true" ]]; then
-  # When downloading large packages from PyPI, the connection is sometimes aborted by the
-  # remote host. See https://github.com/pypa/pip/issues/8510.
-  # As a workaround, we retry installation of large packages.
-  req_files+=" -r requirements/small-requirements.txt"
-fi
 if [[ "$INSTALL_SKINNY_PYTHON_DEPS" == "true" ]]; then
   req_files+=" -r requirements/skinny-requirements.txt"
+else
+  req_files+=" -r requirements/test-requirements.txt"
 fi
-if [[ "$INSTALL_LARGE_PYTHON_DEPS" == "true" ]]; then
-  req_files+=" -r requirements/large-requirements.txt"
-
+if [[ "$INSTALL_ML_DEPENDENCIES" == "true" ]]; then
   # Install prophet's dependencies beforehand, otherwise pip would fail to build a wheel for prophet
   if [[ -z "$(pip cache list prophet --format abspath)" ]]; then
     tmp_dir=$(mktemp -d)
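A minimal usage sketch of the new environment-variable contract (sourced from the repo root, as the workflow hunks above do); the behavior follows directly from the rewritten conditionals:

# Default: requirements/test-requirements.txt is now installed whenever the
# skinny flag is unset, with no opt-in variable needed.
source ./dev/install-common-deps.sh

# Skinny jobs: skinny-requirements.txt is used instead (the new else branch
# means test requirements are skipped entirely).
INSTALL_SKINNY_PYTHON_DEPS=true source ./dev/install-common-deps.sh

# ML jobs: opt in to the heavy ML stack (prophet et al.) on top of the
# test requirements, replacing INSTALL_SMALL/LARGE_PYTHON_DEPS.
INSTALL_ML_DEPENDENCIES=true source ./dev/install-common-deps.sh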

dev/setup-spark-datasource-autologging.sh (-1)

@@ -1,7 +1,6 @@
 #!/usr/bin/env bash
 # Test Spark autologging against the Spark 3.0 preview. This script is temporary and should be
 # removed once Spark 3.0 is released in favor of simply updating all tests to run against Spark 3.0
-# (i.e. updating the pyspark dependency version in requirements/large-requirements.txt)
 set -ex
 
 # Build Java package

requirements/dev-requirements.txt (+1 -2)

@@ -1,5 +1,4 @@
--r small-requirements.txt
--r large-requirements.txt
+-r test-requirements.txt
 -r extra-ml-requirements.txt
 -r lint-requirements.txt
 -r doc-requirements.txt
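Since dev-requirements.txt is just a list of -r includes, a full development setup still reduces to a single command; a sketch, run from the repo root:

# Installs test-, extra-ml-, lint-, and doc-requirements in one pass
pip install -r requirements/dev-requirements.txt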

requirements/extra-ml-requirements.txt (+1)

@@ -33,6 +33,7 @@ h2o
 # Required by mlflow.onnx
 onnx>=1.11.0
 onnxruntime
+tf2onnx
 # Required by mlflow.spark
 pyspark
 # Required by mlflow.shap
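With tf2onnx added alongside the other mlflow.onnx entries, installing just the ONNX slice of these extras looks roughly like this (version bound as in the hunk above; the quotes keep the shell from treating >= as a redirection):

pip install "onnx>=1.11.0" onnxruntime tf2onnx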

requirements/large-requirements.txt (-4)

This file was deleted.

requirements/small-requirements.txt → requirements/test-requirements.txt (+1 -3)

@@ -1,7 +1,6 @@
-## Small test reqs
+## Dependencies required to run tests
 # Required for testing utilities for parsing pip requirements
 pip>=20.1
-scipy
 # NB: We're specifying a test-only minimum version bound for sqlalchemy in order to reliably
 # execute schema consistency checks, the semantics of which were changed in sqlalchemy 1.3.21
 sqlalchemy>=1.3.21
@@ -17,4 +16,3 @@ pillow
 matplotlib
 shap>=0.40
 plotly
-pyspark
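Because scipy and pyspark have left the shared test requirements, jobs that exercise Spark now install it explicitly, as the master.yml pyfunc hunk above does; roughly:

pip install -r requirements/test-requirements.txt
# pyspark is no longer pulled in via the shared test requirements;
# Spark-dependent jobs add it themselves.
pip install pyspark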
