MXNet for Deep Learning

Lightweight, portable, flexible distributed/mobile deep learning with a dynamic, mutation-aware dataflow dependency scheduler; for Python, R, Julia, Scala, Go, JavaScript and more.


MXNet is a deep learning framework designed for both efficiency and flexibility. It allows you to mix symbolic and imperative programming to maximize efficiency and productivity. At its core, MXNet contains a dynamic dependency scheduler that automatically parallelizes both symbolic and imperative operations on the fly. A graph optimization layer on top of that makes symbolic execution fast and memory efficient. MXNet is portable and lightweight, scaling effectively to multiple GPUs and multiple machines.
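
As a minimal sketch of this imperative/symbolic mix (assuming the mxnet Python package is installed; the variable name x and the toy computation are illustrative only):

```python
import mxnet as mx

# Imperative: NDArray operations execute immediately.
a = mx.nd.ones((2, 3))
b = a * 2 + 1                     # result is available right away

# Symbolic: declare a graph first, bind it to a context, then execute.
x = mx.sym.Variable('x')
y = x * 2 + 1                     # same computation, declared lazily
exe = y.simple_bind(ctx=mx.cpu(), x=(2, 3))
exe.forward(x=b)                  # feed the imperative result into the graph
print(exe.outputs[0].asnumpy())   # [[7. 7. 7.], [7. 7. 7.]]
```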

MXNet is more than a deep learning project. It is also a collection of blueprints and guidelines for building deep learning systems, and a source of interesting insights into DL systems for hackers.

Join the chat at https://gitter.im/dmlc/mxnet

Features

  • Design notes providing useful insights that can be reused by other DL projects
  • Flexible configuration for arbitrary computation graphs
  • Mix and match imperative and symbolic programming to maximize flexibility and efficiency
  • Lightweight, memory efficient, and portable to smart devices
  • Scales up to multiple GPUs and distributed settings with automatic parallelism
  • Support for Python, R, Scala, C++ and Julia
  • Cloud-friendly and directly compatible with S3, HDFS, and Azure (see the data-loading sketch after this list)
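
For the cloud-compatibility point above, a hedged sketch of streaming training data straight from S3. This assumes MXNet was built with S3 support enabled and that AWS credentials are available in the environment; the bucket and file names are placeholders.

```python
import mxnet as mx

# Hypothetical S3 path; an hdfs:// or local path works the same way
# when the corresponding filesystem support is compiled in.
data_iter = mx.io.ImageRecordIter(
    path_imgrec='s3://my-bucket/train.rec',
    data_shape=(3, 224, 224),
    batch_size=32,
)

for batch in data_iter:
    # batch.data[0] is an NDArray of shape (32, 3, 224, 224), ready to train on
    pass
```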

Ask Questions

  • Please use mxnet/issues for questions about how to use MXNet and for reporting bugs

License

© Contributors, 2015-2017. Licensed under an Apache-2.0 license.

Reference Paper

Tianqi Chen, Mu Li, Yutian Li, Min Lin, Naiyan Wang, Minjie Wang, Tianjun Xiao, Bing Xu, Chiyuan Zhang, and Zheng Zhang. MXNet: A Flexible and Efficient Machine Learning Library for Heterogeneous Distributed Systems. In Neural Information Processing Systems, Workshop on Machine Learning Systems, 2015.

History

MXNet emerged from a collaboration by the authors of cxxnet, minerva, and purine2, and reflects what we learned from those projects. It combines aspects of each to achieve flexibility, speed, and memory efficiency.
