Commit 63fbd6bd authored by auphelia
[Sphinx documentation] Add end-to-end flow picture and started section about how to get started with FINN
parent 26031b5e
Showing with 70 additions and 5 deletions
**********************
FINN - End-to-End Flow
**********************
.. image:: ../../notebooks/finn-design-flow-example.svg
:scale: 50%
:align: center
***************
Getting Started
***************
How to use the FINN compiler
============================
The FINN compiler should not be thought of as a single push-button tool that does everything for you, but rather as a collection of scripts/tools that will help you convert a QNN into a custom FPGA accelerator that performs high-performance inference. We do provide several examples of taking trained networks all the way down to FPGA bitfiles, but if you are trying to do this for custom networks you will have to write your own Python scripts that call the appropriate FINN compiler functions to process your design correctly, or add new functions as required.
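As a rough illustration of what such a script looks like, the sketch below loads an ONNX model into FINN's ``ModelWrapper`` and applies a sequence of transformations to it. The module paths and transformation names used here are assumptions about a typical FINN setup and may differ in your version; the Jupyter notebooks in the repository show the exact API.

.. code-block:: python

   # Hypothetical outline of a user script driving the FINN compiler.
   # Module paths and transformation names are assumptions -- check the
   # repository's notebooks for the exact API of your FINN version.
   from finn.core.modelwrapper import ModelWrapper
   from finn.transformation.infer_shapes import InferShapes
   from finn.transformation.fold_constants import FoldConstants

   # Load a trained, quantized network exported to ONNX (e.g. from Brevitas).
   model = ModelWrapper("my_qnn.onnx")

   # Each transform() call returns a new, transformed model.
   model = model.transform(InferShapes())
   model = model.transform(FoldConstants())

   # Save the intermediate result for inspection, e.g. with Netron.
   model.save("my_qnn_transformed.onnx")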
Requirements
============
* Ubuntu 18.04
* Docker
* A working Vivado installation
* A `VIVADO_PATH` environment variable pointing to the Vivado installation directory (e.g. the directory where settings64.sh is located); a small sanity-check sketch is shown below the list
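Because the Vivado-dependent build steps rely on `VIVADO_PATH`, it can be worth checking the variable at the start of your own scripts. The helper below is a minimal, hypothetical check and is not part of the FINN code base.

.. code-block:: python

   # Minimal sanity check for the VIVADO_PATH requirement above.
   # This helper is illustrative only and not part of FINN itself.
   import os

   def check_vivado_path():
       vivado_path = os.environ.get("VIVADO_PATH")
       if vivado_path is None:
           raise EnvironmentError("VIVADO_PATH is not set")
       settings = os.path.join(vivado_path, "settings64.sh")
       if not os.path.isfile(settings):
           raise EnvironmentError(
               "settings64.sh not found in " + vivado_path
           )
       return vivado_path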
Running FINN in Docker
======================
We use Docker extensively for developing and deploying FINN. If you are not familiar with Docker, there are many excellent `online resources <https://docker-curriculum.com/>`_ to get started. There is a Dockerfile in the root of the repository, as well as a `run-docker.sh` script that can be launched in the following modes:
@@ -17,7 +17,7 @@ What is FINN?
 * The FINN project, which includes tools for training quantized neural networks such as Brevitas, the FINN compiler, and the finn-hlslib Vivado HLS library of FPGA components for QNNs. An overview of the project can be taken from the following graphic and details can be seen on the `FINN project homepage <https://xilinx.github.io/finn/>`_.

 .. image:: ../img/finn-stack.png
-   :scale: 75%
+   :scale: 50%
    :align: center

 * The repository, this Read the Docs website corresponds to. It is about the FINN compiler, which is the centerpiece of the FINN project. The GitHub repository can be viewed using the link in the upper right corner. To learn more about the FINN compiler, use this website and for a hands-on experience the repository contains some Jupyter notebooks which can be found under this `link <https://github.com/Xilinx/finn/tree/dev/notebooks>`_.