leaf: A Machine Learning Framework for Hackers
Introduction
Leaf is a Machine Intelligence Framework engineered by software developers, not scientists. It was inspired by the brilliant people behind TensorFlow, Torch, Caffe, Rust and numerous research papers and brings modularity, performance and portability to deep learning. Leaf is lean and tries to introduce minimal technical debt to your stack.
Leaf is only a few months old, but thanks to its architecture and Rust it is already one of the fastest Machine Intelligence Frameworks in the world.
See more deep neural network benchmarks on Deep Learning Benchmarks.
Leaf is portable. Run it on CPUs, GPUs, and FPGAs, on machines with an OS or on machines without one. Run it with OpenCL or CUDA. Credit goes to Collenchyma and Rust.
Leaf is part of the Autumn Machine Intelligence Platform, which is working on making AI algorithms 100x more computationally efficient, bringing real-time, offline AI to smartphones and embedded devices.
We see Leaf as the core for constructing high-performance machine intelligence applications. Leaf's design makes it easy to publish independent modules that make e.g. deep reinforcement learning, visualization and monitoring, network distribution, automated preprocessing, or scalable production deployment easily accessible for everyone.
For more information, refer to:
- the Leaf examples,
- the Leaf Documentation,
- the Autumn Website or
- the Q&A
Disclaimer: Leaf is currently in an early stage of development. If you are experiencing any bugs that are not due to not-yet-implemented features, feel free to create an issue.
Getting Started
If you are new to Rust, you can install it as detailed here, and we recommend taking a look at the official Getting Started Guide.
If you're using Cargo, just add Leaf to your Cargo.toml:

```toml
[dependencies]
leaf = "0.2.0"
```
If you're using Cargo Edit, you can call:

```sh
cargo add leaf
```
If you are on a machine that doesn't have support for CUDA or OpenCL, you can selectively enable just the backends you need in your Cargo.toml:

```toml
[dependencies]
leaf = { version = "0.2.0", default-features = false }

[features]
default = ["native"] # include only the ones you want to use, in this case "native"
native  = ["leaf/native"]
cuda    = ["leaf/cuda"]
opencl  = ["leaf/opencl"]
```
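To give a feel for the API, here is a minimal sketch of describing a small network with Leaf 0.2's configuration types (SequentialConfig, LayerConfig, LinearConfig). It is modelled on the Leaf examples; treat the names, shapes, and import paths as assumptions and consult the Leaf Documentation for the authoritative API.

```rust
extern crate leaf;

use leaf::layer::*;
use leaf::layers::*;

fn main() {
    // Describe a tiny multilayer perceptron: 784 inputs -> 500 hidden units -> 10 outputs.
    // Names and shapes here are illustrative assumptions based on the Leaf examples.
    let batch_size = 10;
    let mut net_cfg = SequentialConfig::default();
    net_cfg.add_input("data", &vec![batch_size, 784]);
    net_cfg.add_layer(LayerConfig::new("linear1", LinearConfig { output_size: 500 }));
    net_cfg.add_layer(LayerConfig::new("sigmoid", LayerType::Sigmoid));
    net_cfg.add_layer(LayerConfig::new("linear2", LinearConfig { output_size: 10 }));
    // From here a backend, a Layer, and a Solver would be built from `net_cfg`;
    // see the Leaf examples and documentation for the full training loop.
}
```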
Examples
We are providing a Leaf examples repository, where we and others publish executable machine learning models built with Leaf. It features a CLI for easy usage and has a detailed guide in the project README.md.
Leaf also comes with an examples directory, which features popular neural networks (e.g. AlexNet, Overfeat, VGG). To run them on your machine, just follow the install guide, clone this repository, and then run:

```sh
cargo run --release --example benchmarks
```
Ecosystem / Extensions
We design Leaf and the other crates of the Autumn Platform to be as modular and extensible as possible. More helpful crates you can use with Leaf:
- Cuticula: Preprocessing Framework for Machine Learning
- Collenchyma: Portable HPC framework that runs on any hardware via CUDA, OpenCL, or native Rust (see the sketch below)
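As a rough, hedged sketch of how a Collenchyma backend is created: the import path and the `Backend::<Native>::default()` call below are assumptions based on the Collenchyma README, not a verified API reference; `Cuda` or `OpenCL` would select a GPU framework instead, given the matching Cargo features.

```rust
extern crate collenchyma as co;

use co::prelude::*;

fn main() {
    // Construct a CPU backend; swapping `Native` for `Cuda` or `OpenCL` targets GPUs.
    // The `default()` constructor picks the framework's default device -- treat this
    // signature as an assumption and check the Collenchyma docs.
    let backend = Backend::<Native>::default().unwrap();
    // `backend` would then be shared with Leaf layers and solvers.
    let _ = backend;
}
```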
Support / Contact
- With a bit of luck, you can find us online on the #rust-machine-learning IRC channel at irc.mozilla.org,
- but we are always approachable on Gitter/Leaf
- For bugs and feature requests, you can create a GitHub issue
- And for more private matters, send us an email at developers@autumnai.com.
- Refer to Autumn for more information
Contributing
Want to contribute? Awesome! We have instructions to help you get started.
Leaf has a near real-time collaboration culture, and collaboration happens here on GitHub and on the Leaf Gitter channel.
Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in the work by you, as defined in the Apache-2.0 license, shall be dual licensed as below, without any additional terms or conditions.
Changelog
You can find the release history in CHANGELOG.md. We use Clog, the Rust tool for auto-generating CHANGELOG files.
Q&A
Why Rust?
Current hardware has only recently become powerful enough to support real-world usage of machine intelligence, e.g. super-human image recognition and self-driving cars. To take advantage of the computational power of the underlying hardware, from GPUs to clusters, you need a low-level language that allows control of memory. But to make machine intelligence widely accessible, you want a comfortable, high-level abstraction over the underlying hardware.
Rust allows us to cross this chasm. Rust promises performance comparable to C/C++, but with safe memory control. For now, we can use Rust wrappers around performant C libraries. In the future, libraries rewritten in Rust will have the advantage of zero-cost, safe memory control, which will make large, parallel learning networks across CPUs and GPUs more feasible and more reliable to develop. Development of such libraries is already underway, e.g. Glium.
On the usability side, Rust offers a trait system that makes it easy for researchers and hobbyists alike to extend and work with Leaf as if it had been written in a higher-level language such as Ruby, Python, or Java.
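As a rough, self-contained illustration of that trait pattern (the `Layer` and `Relu` names below are hypothetical placeholders, not Leaf's actual API): a layer abstraction is just a trait, any struct implementing it plugs into generic code, and monomorphization keeps the abstraction zero-cost.

```rust
// Hypothetical stand-in for the kind of trait-based extension point Leaf exposes.
trait Layer {
    /// Map an input activation vector to an output activation vector.
    fn forward(&self, input: &[f32]) -> Vec<f32>;
}

struct Relu;

impl Layer for Relu {
    fn forward(&self, input: &[f32]) -> Vec<f32> {
        // Element-wise rectified linear unit: max(x, 0).
        input.iter().map(|&x| x.max(0.0)).collect()
    }
}

// Generic code works with any `Layer` implementation; monomorphization keeps it zero-cost.
fn run<L: Layer>(layer: &L, input: &[f32]) -> Vec<f32> {
    layer.forward(input)
}

fn main() {
    let out = run(&Relu, &[-1.0, 0.5, 2.0]);
    assert_eq!(out, vec![0.0, 0.5, 2.0]);
}
```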
Who can use Leaf?
We develop Leaf under permissive open source licenses (MIT and Apache-2.0), which, paired with its easy access and performance, makes Leaf a first-choice option for researchers and developers alike.
Why did you open source Leaf?
We believe strongly in machine intelligence and think that it will have a major impact on future innovations, products, and our society. At Autumn, we experienced a lack of common, well-engineered tools for machine learning and therefore started to create a modular toolbox for machine learning in Rust. We hope that by making our work open source, we will speed up research and development of production-ready applications and make other people's work easier as well.
Who is Autumn?
Autumn is a startup working on automated decision making. Autumn was started by two developers, MJ and Max. The startup is located in Berlin and recently received a pre-seed investment from Axel Springer and Plug&Play.
License
Licensed under either of
- Apache License, Version 2.0, (LICENSE-APACHE or http://www.apache.org/licenses/LICENSE-2.0)
- MIT license (LICENSE-MIT or http://opensource.org/licenses/MIT)
at your option.