Commit ed65992

Remove examples/activation
This has become a standalone kernel in kernels-community. It also predates the relu example, which is much more minimal and easier to understand.
1 parent: db8855f

24 files changed: +30, -904 lines

.github/workflows/build_kernel.yaml

Lines changed: 4 additions & 9 deletions
@@ -23,21 +23,16 @@ jobs:
         #authToken: "${{ secrets.CACHIX_AUTH_TOKEN }}"
         env:
           USER: github_runner
-      - name: Build activation kernel
-        run: ( cd examples/activation && nix build .\#redistributable.torch27-cxx11-cu126-x86_64-linux )
-      - name: Copy activation kernel
-        run: cp -rL examples/activation/result activation-kernel
+      - name: Build relu kernel
+        run: ( cd examples/relu && nix build .\#redistributable.torch27-cxx11-cu126-x86_64-linux )
+      - name: Copy relu kernel
+        run: cp -rL examples/relu/result relu-kernel

       - name: Build cutlass GEMM kernel
         run: ( cd examples/cutlass-gemm && nix build .\#redistributable.torch27-cxx11-cu126-x86_64-linux )
       - name: Copy cutlass GEMM kernel
         run: cp -rL examples/cutlass-gemm/result cutlass-gemm-kernel

-      - name: Build relu kernel
-        run: ( cd examples/relu && nix build .\#redistributable.torch27-cxx11-cu126-x86_64-linux )
-      - name: Copy relu kernel
-        run: cp -rL examples/relu/result relu-kernel
-
       - name: Build relu-backprop-compile kernel
         run: ( cd examples/relu-backprop-compile && nix build .\#redistributable.torch27-cxx11-cu126-x86_64-linux )
       - name: Copy relu-backprop-compile kernel

README.md

Lines changed: 2 additions & 2 deletions
@@ -37,7 +37,7 @@ nix run nixpkgs#cachix -- use huggingface
 Then quick start a build with:

 ```bash
-cd examples/activation
+cd examples/relu
 nix run .#build-and-copy \
   --override-input kernel-builder github:huggingface/kernel-builder \
   --max-jobs 8 \
@@ -50,7 +50,7 @@ We also provide Docker containers for CI builds. For a quick build:

 ```bash
 # Using the prebuilt container
-cd examples/activation
+cd examples/relu
 docker run --rm \
   --mount type=bind,source=$(pwd),target=/kernelcode \
   -w /kernelcode ghcr.io/huggingface/kernel-builder:main build

docs/docker.md

Lines changed: 9 additions & 9 deletions
@@ -28,8 +28,8 @@ installed.
 We provide a Docker image with which you can build a kernel:

 ```bash
-# navigate to the activation directory
-cd examples/activation
+# navigate to the relu directory
+cd examples/relu

 # then run the following command to build the kernel
 docker run --rm \
@@ -39,7 +39,7 @@ docker run --rm \
 ```

 This will build the kernel and save the output in the `build` directory in
-the activation folder.
+the relu folder.

 ## CLI Interface

@@ -55,10 +55,10 @@ The kernel builder includes a command-line interface for easier interaction. The
 ### Examples

 ```bash
-# Build the example activation kernel from the root of the repository
+# Build the example relu kernel from the root of the repository
 docker run --rm \
   -v $(pwd):/kernel-builder \
-  -w /kernel-builder/examples/activation \
+  -w /kernel-builder/examples/relu \
   ghcr.io/huggingface/kernel-builder:main \
   build

@@ -185,20 +185,20 @@ The whole goal of building these kernels is to allow researchers, developers, an
 To load a kernel locally, you should add the kernel build that is compatible with the Torch and CUDA versions in your environment to `PYTHONPATH`. For example:

 ```bash
-# PyTorch 2.6 and CUDA 12.6
-export PYTHONPATH="result/torch26-cxx11-cu126-x86_64-linux"
+# PyTorch 2.8 and CUDA 12.6
+export PYTHONPATH="result/torch28-cxx11-cu126-x86_64-linux"
 ```

 The kernel can then be imported as a Python module:

 ```python
 import torch

-import activation
+import relu

 x = torch.randn(10, 10)
 out = torch.empty_like(x)
-activation.silu_and_mul(out, x)
+relu.relu(x, out)

 print(out)
 ```
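
The updated `docs/docker.md` snippet above can be sanity-checked once a relu build is on `PYTHONPATH`. The following is a minimal sketch, not part of this commit: it assumes the `relu.relu(x, out)` signature shown in the diff and that the example kernel targets CUDA (per the `backend = "cuda"` entry in the `docs/writing-kernels.md` hunk below), and compares the result against PyTorch's reference ReLU.

```python
import torch

import relu  # example kernel module; assumes a compatible build directory on PYTHONPATH

# The example kernel is built for the CUDA backend, so tensors are placed on the GPU.
x = torch.randn(10, 10, device="cuda")
out = torch.empty_like(x)
relu.relu(x, out)  # writes the element-wise ReLU of x into out

# Sanity check against PyTorch's built-in implementation.
torch.testing.assert_close(out, torch.relu(x))
print(out)
```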

docs/nix.md

Lines changed: 2 additions & 13 deletions
@@ -48,7 +48,7 @@ A kernel that has a `flake.nix` file can be built with the `build-and-copy`
 command. For example:

 ```bash
-cd examples/activation
+cd examples/relu
 nix run .#build-and-copy -L
 ```

@@ -94,7 +94,7 @@ with the kernel in Python's search path. This makes it more convenient to run
 tests:

 ```bash
-cd examples/activation
+cd examples/relu
 nix develop -L .#test
 python -m pytest tests
 ```
@@ -142,14 +142,3 @@ this check enabled, as it is one of the checks that validates that a kernel
 is compliant. This option is primarily intended for kernels with
 `triton.autotune` decorators, which can fail because there is no GPU available
 in the build sandbox.
-
-## Building a kernel without `flake.nix`
-
-If a kernels source directory does not have a `flake.nix` file, you can build the
-kernel using the `buildTorchExtensionBundle` function from the kernel builder
-itself:
-
-```bash
-cd examples/activation
-nix build --impure --expr 'with import ../..; lib.x86_64-linux.buildTorchExtensionBundle ./.' -L
-```
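
For the `nix develop -L .#test` / `python -m pytest tests` flow shown in the `docs/nix.md` hunk above, a test for the relu example might look like the hypothetical sketch below. It is not a file from this commit; it assumes the same `relu.relu(x, out)` interface as the docs snippet and skips when no GPU is available.

```python
import pytest
import torch

import relu  # made importable by the `nix develop .#test` shell


@pytest.mark.skipif(not torch.cuda.is_available(), reason="the relu example kernel targets CUDA")
def test_relu_matches_torch():
    x = torch.randn(64, 64, device="cuda")
    out = torch.empty_like(x)
    relu.relu(x, out)
    torch.testing.assert_close(out, torch.relu(x))
```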

docs/writing-kernels.md

Lines changed: 1 addition & 1 deletion
@@ -77,7 +77,7 @@ src = [
   "torch-ext/torch_binding.h"
 ]

-[kernel.activation]
+[kernel.relu]
 backend = "cuda"
 src = [
   "relu_kernel/relu.cu",

examples/activation/LICENSE

Lines changed: 0 additions & 201 deletions
This file was deleted.

examples/activation/README.md

Lines changed: 0 additions & 5 deletions
This file was deleted.
