diff --git a/README.md b/README.md index 56421b787..3218120ba 100644 --- a/README.md +++ b/README.md @@ -39,6 +39,8 @@ reservoir computing models. More specifically the software offers - 5+ reservoir states modification algorithms - Sparse matrix computation through [SparseArrays.jl](https://docs.julialang.org/en/v1/stdlib/SparseArrays/) +- Multiple training algorithms via [LIBSVM.jl](https://github.com/JuliaML/LIBSVM.jl) + and [MLJLinearModels.jl](https://github.com/JuliaAI/MLJLinearModels.jl) ## Installation diff --git a/docs/src/index.md b/docs/src/index.md index 11cc3bacf..3f1cfe5aa 100644 --- a/docs/src/index.md +++ b/docs/src/index.md @@ -32,15 +32,32 @@ This approach can be thought as a kernel method with an explicit kernel trick. For faster computations on the CPU it is suggested to add `using MKL` to the script. For clarity's sake this library will not be indicated under every example in the documentation. + +!!! warning "Lux's states vs reservoir computing states" + +    ReservoirComputing.jl builds on Lux.jl. As such, it inherits Lux's handling and +    naming of the model states `st`. This contrasts with the internal expansions +    of the input data, which in the reservoir computing literature are also known as states. +    Since we cannot avoid using the same name for two different things, we tried to make +    it as explicit as possible in the documentation when we refer to one or the other. +    If you feel that this is causing confusion in some places, please open an issue! + ## Installation To install ReservoirComputing.jl, ensure you have Julia version 1.10 or higher. Follow these steps: - 1. Open the Julia command line. - 2. Enter the Pkg REPL mode by pressing ]. - 3. Type `add ReservoirComputing` and press Enter. 
+ +Alternatively, do: + +```julia +using Pkg +Pkg.add("ReservoirComputing") +``` For a more customized installation or to contribute to the package, consider cloning the repository: @@ -61,7 +78,9 @@ or `dev` the package. - 5+ reservoir states modification algorithms - Sparse matrix computation through [SparseArrays.jl](https://docs.julialang.org/en/v1/stdlib/SparseArrays/) - +- Multiple training algorithms via [LIBSVM.jl](https://github.com/JuliaML/LIBSVM.jl) + and [MLJLinearModels.jl](https://github.com/JuliaAI/MLJLinearModels.jl) + ## Contributing Contributions to ReservoirComputing.jl are highly encouraged and appreciated. diff --git a/docs/src/tutorials/scratch.md b/docs/src/tutorials/scratch.md index 6cacd53ce..b19e66078 100644 --- a/docs/src/tutorials/scratch.md +++ b/docs/src/tutorials/scratch.md @@ -4,7 +4,7 @@ ReservoirComputing.jl provides utilities to build reservoir reservoir computing models from scratch. In this tutorial we are going to build an echo state network ([`ESN`](@ref)) and showcase how this custom implementation is equivalent to the provided model (minus some comfort -utilities) +utilities). ## Using provided layers: ReservoirChain, ESNCell, and LinearReadout @@ -14,7 +14,7 @@ to the chain will concatenate them, and will allow the flow of the input data through the model. To build an ESN we also need a [`ESNCell`](@ref) to provide the ESN -forward pass. However, the cell is stateless, so to keep the memoruy of +forward pass. However, the cell is stateless, so to keep the memory of the input we need to wrap it in a [`StatefulLayer`](@ref), which saves the internal state in the model states `st` and feeds it to the cell in the next step. diff --git a/src/layers/basic.jl b/src/layers/basic.jl index 67ba82ab0..1b270b245 100644 --- a/src/layers/basic.jl +++ b/src/layers/basic.jl @@ -149,13 +149,13 @@ vectors are concatenated with `vcat` in order of appearance. 
`Collect` only when you want to control where/what is collected (or to stack multiple features). - ```julia - rc = ReservoirChain( - StatefulLayer(ESNCell(3 => 300)), - NLAT2(), - Collect(), # <-- collect the 300-dim reservoir after NLAT2 - LinearReadout(300 => 3; include_collect=false) # <-- toggle off the default Collect() - ) +```julia +rc = ReservoirChain( + StatefulLayer(ESNCell(3 => 300)), + NLAT2(), + Collect(), # <-- collect the 300-dim reservoir after NLAT2 + LinearReadout(300 => 3; include_collect=false) # <-- toggle off the default Collect() +) ``` """ struct Collect <: AbstractReservoirCollectionLayer end
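For reviewers who want to try the new training backends advertised in the README bullet, here is a minimal sketch. Note that `LinearModel`, `RidgeRegression`, and the `train` signature below are assumptions modeled on the package's pre-Lux API, not necessarily the interface introduced in this PR; exact names and keywords may differ.

```julia
# Hypothetical sketch -- the wrapper and `train` signature are assumptions,
# not confirmed API from this PR.
using ReservoirComputing, MLJLinearModels

# build an echo state network on some input series (assumed constructor)
esn = ESN(input_data, 3, 300)

# wrap an MLJLinearModels solver as the readout training method
ridge = LinearModel(; regression = RidgeRegression)

# fit the linear readout on the target series
output_layer = train(esn, target_data, ridge)
```

If the PR exposes a different entry point for MLJLinearModels.jl solvers, the docs added here would be a good place to show the canonical call.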