@@ -5,67 +5,30 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/)
and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.html).

- ## [0.3.0a5]
-
- ### Changed
-
- * `requirements_txt_file` no longer optional for model uploads.
- * Remove `id` from POST params to API server.
- * NLP dataset character limit to 1000 characters.
-
- ### Fixed
-
- * Fix issue with duplicate feature names for NLP datasets.
- * Added protobuf==3.2.0 to requirements to fix bug with model deployment.
-
- ## [0.3.0a2]
-
- ### Fixed
-
- * Fixed link to project page when loading / creating a project.
- * Presigned url endpoint when using AWS / GCP / Azure.
-
- ### Changed
-
- * Removed links when uploading dataset and models. Just the project link is appropriate.
-
- ## [0.3.0a1]
-
- ### Changed
-
- * Default Unbox server URL (<https://api-staging.unbox.ai/>).
-
- ## [0.3.0a0]
+ ## [0.3.0]

### Added

* A `Project` helper class.
- * A convenience method `create_or_load_project` which loads in a project in if it is already created.
+ * A convenience method `create_or_load_project` which loads in a project if it is already created.
+ * Accepts AZURE as a `DeploymentType`.

### Changed

+ * Compatibility with Unbox API OpenAPI refactor.
* Models and datasets must be added to projects.
* Deprecates `categorical_features_map` in favor of `categorical_feature_names` for model and dataset uploads.
* Moved `TaskType` attribute from the `Model` level to the `Project` level. Creating a `Project` now requires specifying the `TaskType`.
* Removed `name` from `add_dataset`.
* Changed `description` to `commit_message` from `add_dataset`, `add_dataframe` and `add_model`.
-
- ## [0.2.0a1]
+ * `requirements_txt_file` no longer optional for model uploads.
+ * NLP dataset character limit is now 1000 characters.

### Fixed

- * Fail early if `custom_model_code`, `dependent_dir` or `requirements_txt_file` are `None` when model type is `ModelType.custom`.
- * Fail early if `model` is not `None` when model type is `ModelType.custom`.
-
- ## [0.2.0a0]
-
- ### Added
-
- * Accepts AZURE as a `DeploymentType`.
-
- ### Changed
-
- * Compatibility with Unbox backend storage and data refactor.
+ * More comprehensive model and dataset upload validation.
+ * Bug with duplicate feature names for NLP datasets if uploading same dataset twice.
+ * Added `protobuf==3.2.0` to requirements to fix bug with model deployment.

## [0.1.2] - 2022-05-22