Commit ab9bbe4 (1 parent cbbaf70): add readme
File tree: 3 files changed, +129 −6 lines changed

LICENSE.txt (+2 −2)

```diff
@@ -1,4 +1,4 @@
-Copyright (c) 2022 JTorch. All Rights Reserved
+Copyright (c) 2022 FittenTech. All Rights Reserved

                                  Apache License
                            Version 2.0, January 2004
@@ -188,7 +188,7 @@ Copyright (c) 2022 JTorch. All Rights Reserved
       same "printed page" as the copyright notice for easier
       identification within third-party archives.

-   Copyright (c) 2022 JTorch. All Rights Reserved.
+   Copyright (c) 2022 FittenTech. All Rights Reserved.

    Licensed under the Apache License, Version 2.0 (the "License");
    you may not use this file except in compliance with the License.
```

README.md (+123 −1)

The previous one-line content (`# JTorch project`) is replaced by the new README:

# JTorch: A High-Performance Dynamically Compiled Deep-Learning Framework Fully Compatible with the PyTorch Interface

JTorch is a deep-learning framework fully compatible with the PyTorch interface. Powered by Jittor's meta-operators and unified computation graph, it achieves high-performance dynamic compilation: existing PyTorch code runs faster without any modification. In summary, JTorch offers the following advantages:

1. Zero cost: fully compatible with the native PyTorch interface; user code needs no changes.
2. Fast: through unified computation-graph execution, JTorch dynamically compiles and accelerates code, giving better performance than stock PyTorch.
3. Broad hardware support: through its meta-operator abstraction, JTorch can be quickly adapted to many AI accelerators.
4. Ecosystem compatibility: works with the existing PyTorch ecosystem, such as third-party PyTorch model libraries.
5. Jittor compatibility: JTorch is fully compatible with Jittor; Jittor interfaces can be mixed in, with high performance.
6. Fully independent: JTorch has fully independent intellectual property; users can use it directly without installing Torch.

JTorch links:

* [Github](https://github.com/JITTorch/jtorch)
* [Jittor forum](https://discuss.jittor.org/)
* Instant messaging: QQ group (761222083)

# Installation and testing

Install with:

```
python3 -m pip install jtorch
```

Note: Python 3.7 or later is required.

Run a simple test:

```
python3 -m jtorch.test.test_tutorial
```

# Quick start

## A simple dynamic network with JTorch (PyTorch-compatible)

```python
# -*- coding: utf-8 -*-
import math
import random

import torch


class DynamicNet(torch.nn.Module):
    def __init__(self):
        """
        In the constructor we instantiate five parameters and assign them as members.
        """
        super().__init__()
        self.a = torch.nn.Parameter(torch.randn(()))
        self.b = torch.nn.Parameter(torch.randn(()))
        self.c = torch.nn.Parameter(torch.randn(()))
        self.d = torch.nn.Parameter(torch.randn(()))
        self.e = torch.nn.Parameter(torch.randn(()))

    def forward(self, x):
        """
        For the forward pass of the model, we randomly choose either 4 or 5
        and reuse the e parameter to compute the contribution of these orders.

        Since each forward pass builds a dynamic computation graph, we can use normal
        Python control-flow operators like loops or conditional statements when
        defining the forward pass of the model.

        Here we also see that it is perfectly safe to reuse the same parameter many
        times when defining a computational graph.
        """
        y = self.a + self.b * x + self.c * x ** 2 + self.d * x ** 3
        for exp in range(4, random.randint(4, 6)):
            y = y + self.e * x ** exp
        return y

    def string(self):
        """
        Just like any class in Python, you can also define a custom method on PyTorch modules.
        """
        return f'y = {self.a.item()} + {self.b.item()} x + {self.c.item()} x^2 + {self.d.item()} x^3 + {self.e.item()} x^4 ? + {self.e.item()} x^5 ?'


# Create Tensors to hold input and outputs.
x = torch.linspace(-math.pi, math.pi, 2000)
y = torch.sin(x)

# Construct our model by instantiating the class defined above.
model = DynamicNet()

# Construct our loss function and an Optimizer. Training this strange model with
# vanilla stochastic gradient descent is tough, so we use momentum.
criterion = torch.nn.MSELoss(reduction='sum')
optimizer = torch.optim.SGD(model.parameters(), lr=1e-8, momentum=0.9)
for t in range(60000):
    # Forward pass: compute predicted y by passing x to the model.
    y_pred = model(x)

    # Compute and print loss.
    loss = criterion(y_pred, y)
    if t % 2000 == 1999:
        print(t, loss.item())

    # Zero gradients, perform a backward pass, and update the weights.
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    # print(torch.liveness_info())

print(f'Result: {model.string()}')
```
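The training loop above is plain SGD with momentum over a polynomial model. As a dependency-free illustration of the same update rule, the sketch below (an assumption-laden simplification: a fixed cubic `y = a + b*x + c*x**2 + d*x**3` with no random fourth/fifth-order term, pure Python instead of torch/jtorch, fewer sample points) fits `sin(x)` the same way:

```python
import math

# Sample sin(x) on [-pi, pi]; 200 points instead of the README's 2000, for speed.
N = 200
xs = [-math.pi + 2 * math.pi * i / (N - 1) for i in range(N)]
ys = [math.sin(x) for x in xs]

# Coefficients of y = a + b*x + c*x**2 + d*x**3, plus momentum buffers.
params = [0.0, 0.0, 0.0, 0.0]
vel = [0.0, 0.0, 0.0, 0.0]
lr, beta = 1e-5, 0.9

final_loss = 0.0
for step in range(2000):
    grads = [0.0] * 4
    final_loss = 0.0
    for x, y in zip(xs, ys):
        pred = sum(p * x ** k for k, p in enumerate(params))
        err = pred - y
        final_loss += err * err           # sum-of-squares loss, as in MSELoss(reduction='sum')
        for k in range(4):
            grads[k] += 2 * err * x ** k  # d(err**2) / d(params[k])
    for k in range(4):                    # momentum update, as in SGD(momentum=0.9)
        vel[k] = beta * vel[k] - lr * grads[k]
        params[k] += vel[k]

print("coefficients:", [round(p, 4) for p in params])
print("final loss:", round(final_loss, 3))
```

The fitted odd coefficients approach the least-squares cubic approximation of `sin` (b near 0.86), while the even ones stay near zero by symmetry.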
## Contact us

Email: [email protected]

File an issue: https://github.com/jittorch/jtorch/issues

QQ group: 761222083

## Copyright notice

As stated in LICENSE.txt, JTorch is released under the Apache License 2.0.

setup.py (+4 −3)

```diff
@@ -5,7 +5,7 @@

 setuptools.setup(
     name="jtorch",
-    version="0.0.1",
+    version="0.0.4",
     author="jtorch",
     author_email="[email protected]",
     description="jtorch project",
@@ -19,10 +19,11 @@
         "Programming Language :: Python :: 3",
         "Operating System :: OS Independent",
     ],
+    packages=["jtorch", "torch"],
     package_dir={"": "python"},
-    packages=setuptools.find_packages(where="python"),
+    package_data={'': ['*', '*/*', '*/*/*','*/*/*/*','*/*/*/*/*','*/*/*/*/*/*']},
     python_requires=">=3.7",
     install_requires=[
-        "jittor",
+        "jittor>=1.3.4.10",
     ],
 )
```
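The `package_data` change above lists one glob pattern per directory depth because setuptools globs, like `glob.glob`, do not cross path separators: a single `*` matches only one level. A small stdlib sketch (temporary files, illustrative only) shows why the patterns must be stacked:

```python
import glob
import os
import tempfile

# Build a tiny tree: root/a.txt, root/sub/b.txt, root/sub/deep/c.txt
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "sub", "deep"))
for rel in ["a.txt", "sub/b.txt", "sub/deep/c.txt"]:
    open(os.path.join(root, *rel.split("/")), "w").close()

# Each pattern only reaches entries at exactly its own depth.
matches = {}
for pat in ["*", "*/*", "*/*/*"]:
    hits = glob.glob(os.path.join(root, pat))
    matches[pat] = sorted(os.path.relpath(p, root).replace(os.sep, "/") for p in hits)

for pat, hits in matches.items():
    print(pat, "->", hits)
```

Here `*` finds only `a.txt` (and the `sub` directory), while `sub/deep/c.txt` is reachable only via `*/*/*`, which is why setup.py enumerates patterns down to six levels.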
