# JTorch: A High-Performance Dynamically-Compiled Deep Learning Framework Fully Compatible with the PyTorch Interface

JTorch is a deep learning framework fully compatible with the PyTorch interface. Powered by Jittor's meta-operators and unified computation graph, it performs high-performance dynamic compilation, so existing PyTorch code runs faster without any modification. In summary, JTorch has the following advantages:

1. Zero cost: fully compatible with the native PyTorch interface; user code needs no changes at all.
2. Fast: through unified computation graph execution, JTorch dynamically compiles and accelerates code, achieving better performance than stock PyTorch.
3. Broad hardware support: through its meta-operator abstraction, JTorch can be quickly adapted to a variety of AI chips.
4. Ecosystem compatibility: compatible with the existing PyTorch ecosystem, such as third-party PyTorch model libraries.
5. Jittor compatibility: JTorch is fully compatible with Jittor; Jittor interfaces can be mixed in, with high performance.
6. Fully independent: JTorch is independently developed with its own intellectual property; users can use it directly without installing PyTorch at all.
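As a sketch of the zero-cost claim above (assuming jtorch is installed so that `import torch` is served by the jtorch-backed module, as the compatibility claim implies), ordinary PyTorch tensor code runs unchanged:

```python
import torch  # with jtorch installed, this import resolves to jtorch's torch module

# Plain PyTorch tensor code; nothing jtorch-specific is needed.
x = torch.randn(4, 3)
w = torch.randn(3, 2)
y = x.matmul(w)
print(tuple(y.shape))  # a (4, 3) x (3, 2) product gives shape (4, 2)
```

The same script runs under stock PyTorch or under jtorch, which is the point of the interface compatibility.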


JTorch links:

* [GitHub](https://github.com/JITTorch/jtorch)
* [Jittor forum](https://discuss.jittor.org/)
* Instant messaging: QQ group (761222083)

# Installation and Testing

Install with:

```
python3 -m pip install jtorch
```

Note: Python 3.7 or later is required.
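The Python 3.7 requirement can be checked programmatically before installing; a minimal sketch:

```python
import sys

# jtorch requires Python 3.7 or newer; fail early with a clear message otherwise.
if sys.version_info < (3, 7):
    raise RuntimeError(f"jtorch needs Python >= 3.7, found {sys.version.split()[0]}")
print("Python version OK:", sys.version.split()[0])
```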

Run a quick test:

```
python3 -m jtorch.test.test_tutorial
```

# Quick Start

## A Simple Dynamic Network with JTorch (PyTorch-Compatible)

```python
# -*- coding: utf-8 -*-
import random
import torch
import math


class DynamicNet(torch.nn.Module):
    def __init__(self):
        """
        In the constructor we instantiate five parameters and assign them as members.
        """
        super().__init__()
        self.a = torch.nn.Parameter(torch.randn(()))
        self.b = torch.nn.Parameter(torch.randn(()))
        self.c = torch.nn.Parameter(torch.randn(()))
        self.d = torch.nn.Parameter(torch.randn(()))
        self.e = torch.nn.Parameter(torch.randn(()))

    def forward(self, x):
        """
        For the forward pass of the model, we randomly choose either 4 or 5
        and reuse the e parameter to compute the contribution of these orders.

        Since each forward pass builds a dynamic computation graph, we can use normal
        Python control-flow operators like loops or conditional statements when
        defining the forward pass of the model.

        Here we also see that it is perfectly safe to reuse the same parameter many
        times when defining a computational graph.
        """
        y = self.a + self.b * x + self.c * x ** 2 + self.d * x ** 3
        for exp in range(4, random.randint(4, 6)):
            y = y + self.e * x ** exp
        return y

    def string(self):
        """
        Just like any class in Python, you can also define custom methods on PyTorch modules
        """
        return f'y = {self.a.item()} + {self.b.item()} x + {self.c.item()} x^2 + {self.d.item()} x^3 + {self.e.item()} x^4 ? + {self.e.item()} x^5 ?'


# Create Tensors to hold input and outputs.
x = torch.linspace(-math.pi, math.pi, 2000)
y = torch.sin(x)

# Construct our model by instantiating the class defined above
model = DynamicNet()

# Construct our loss function and an Optimizer. Training this strange model with
# vanilla stochastic gradient descent is tough, so we use momentum
criterion = torch.nn.MSELoss(reduction='sum')
optimizer = torch.optim.SGD(model.parameters(), lr=1e-8, momentum=0.9)
for t in range(60000):
    # Forward pass: Compute predicted y by passing x to the model
    y_pred = model(x)

    # Compute and print loss
    loss = criterion(y_pred, y)
    if t % 2000 == 1999:
        print(t, loss.item())

    # Zero gradients, perform a backward pass, and update the weights.
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    # print(torch.liveness_info())

print(f'Result: {model.string()}')
```
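For intuition about what the model above is fitting, here is a stdlib-only sketch (no training needed) that compares the 3rd-order Taylor polynomial of sine, x - x^3/6, used here as a hypothetical stand-in for the learned coefficients a..d, against sin(x) on the same 2000-point grid; the `poly` helper is illustrative and not part of the example above:

```python
import math

# Hypothetical stand-in for the trained cubic part: sine's Taylor polynomial x - x^3/6.
def poly(x):
    return x - x ** 3 / 6.0

# Same grid as the training example: 2000 points on [-pi, pi].
xs = [-math.pi + i * (2 * math.pi / 1999) for i in range(2000)]
max_err = max(abs(math.sin(x) - poly(x)) for x in xs)
print(f"max |sin(x) - poly(x)| on [-pi, pi]: {max_err:.2f}")  # about 2.03, at the endpoints
```

The large error near the ends of the interval is why the model benefits from occasionally adding 4th- and 5th-order terms.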

## Contact Us

File an issue: https://github.com/jittorch/jtorch/issues

QQ group: 761222083


## License

As stated in the LICENSE.txt file, JTorch is licensed under the Apache 2.0 license.