Build and setup
Prerequisites
The recommended development environment is a native Ubuntu 24.04 machine or WSL 2 with an Ubuntu 24.04 image.
Download the code
Ensure you have both `git` and `git-lfs` pre-installed on your machine:
$ git lfs --version
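A quick preflight check can confirm both tools are on the PATH before cloning (a sketch; the guard around `git-lfs` is only there so the check degrades gracefully when it is missing):

```shell
# Verify git is available; large files in the repository are stored
# with git-lfs, so it must be installed before the clone.
git --version
if command -v git-lfs >/dev/null 2>&1; then
    git lfs --version
else
    echo "git-lfs is missing: install it before cloning"
fi
```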
Clone the root repository:
$ git clone https://github.com/synaptics-torq/torq-compiler.git
Go to the directory that was cloned:
$ cd torq-compiler
Clone the required submodules (some submodules are not necessary for the build):
$ scripts/checkout_submodules.sh
Install required system packages
If you are using an Ubuntu 24.04 environment, you can install the system packages required for the build with the following command:
$ scripts/install_dependencies.sh
If you are using a different environment you can use a Docker image:
Log in to the GitHub Docker registry:
$ echo $CR_PAT | docker login ghcr.io -u $GITHUB_USERNAME --password-stdin
Use your GitHub username and a GitHub personal (classic) access token as the password. The access token must be configured with the following permissions: `read:packages`, `repo`. The `$CR_PAT` variable must be set to the GitHub access token.
Please refer to the GitHub documentation for the creation and usage of a personal access token.
Please refer to the official Docker documentation to install Docker on your machine. Some hints for Linux, Windows and macOS are also available in the SyNAP guide.
Start a development container with access to the current directory and your ssh configuration:
Note: To build and mount volumes correctly, Docker needs access to the entire `torq-compiler` project directory. Running this command from the parent directory allows Docker to mount the full project directory inside the container.
$ cd .. && torq-compiler/scripts/dev.sh
Alternatively, you can customize the Docker invocation with an alias such as the one in the example below:
$ alias torq-dev='docker run -it --rm -u $(id -u):$(id -g) -v $MOUNT_PATH:$MOUNT_PATH -w $(pwd) -e PATH=$VENV_PATH:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin -e CCACHE_DIR=$CCACHE_PATH -e HOME=$HOME_PATH -e IREE_BUILD_DIR=$BUILD_PATH -e ADB_SERVER_SOCKET=tcp:host.docker.internal:5037 ghcr.io/synaptics-torq/torq-compiler-dev/builder'
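The variables referenced by the alias must be exported before it is defined. A hypothetical example follows; every path below is an illustrative placeholder, not a required location, so adjust them to your own layout:

```shell
# All values are illustrative placeholders for the alias above.
export MOUNT_PATH="$HOME/work"               # root directory mounted into the container
export VENV_PATH="$MOUNT_PATH/venv/bin"      # bin directory of the Python venv to use
export CCACHE_PATH="$MOUNT_PATH/ccache"      # ccache working directory
export HOME_PATH="$HOME"                     # home directory seen inside the container
export BUILD_PATH="$MOUNT_PATH/iree-build"   # build directory
echo "$BUILD_PATH"
```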
where the variables have the following meaning:
`$MOUNT_PATH`: root directory to be mounted inside the Docker container
`$VENV_PATH`: path to the `bin` directory inside the Python virtual env to be used
`$CCACHE_PATH`: path to the `ccache` working directory
`$HOME_PATH`: path to the home directory (some utilities store info inside the home dir)
`$BUILD_PATH`: path to the build directory (if not using the default `iree-build`)
The alias can then be used directly to start the Torq development environment:
$ torq-dev
Some tests download items from Hugging Face; for this to work you have to log in to Hugging Face as well. This has to be done only once, using a Hugging Face token; the login information will be stored inside the `$HOME_PATH` directory:
$ hf auth login --token $HF_PAT
where `$HF_PAT` is the personal access token received from Hugging Face.
The Docker image comes with `adb` preinstalled, and it can be used to connect to an Astra board via TCP/IP or USB.
Inside the container, go to the `torq-compiler` directory and continue with the build steps:
$ cd torq-compiler
Build compiler and runtime for host
Set up a Python virtual environment with the packages required for development:
$ scripts/configure_python.sh ../venv ../iree-build
The first parameter is the location of the venv; the second is the location where the build will be performed, so that the build outputs can be made available in the environment.
Activate the environment:
$ source ../venv/bin/activate
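The effect of the two steps above can be sketched with a plain venv. This is an assumption-laden illustration: the real `configure_python.sh` additionally installs development packages and wires in the build directory, and a temporary path stands in for `../venv` here:

```shell
# Create and activate a throwaway venv purely to illustrate activation.
VENV_DIR="$(mktemp -d)/venv"     # hypothetical location, stands in for ../venv
python3 -m venv "$VENV_DIR"
. "$VENV_DIR/bin/activate"
# After activation, python resolves inside the venv, not the system prefix.
python -c 'import sys; print(sys.prefix)'
```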
(optional but strongly suggested) Set up `ccache` as follows. If running inside Docker, make sure `CCACHE_DIR` is configured correctly, e.g.:
export CCACHE_DIR=../iree-build/ccache
This can be configured when the Docker image is started, using the `-e` option.
Initialize the ccache:
$ ccache --max-size=20G
Setup the build system with the following command line:
$ scripts/configure_build.sh ../iree-build
Build the Torq compiler and the runtime:
$ cmake --build ../iree-build/ --target torq
Building IREE from source may take several hours, especially on a typical laptop, due to the project’s size and complexity.
Build runtime for target
To cross-compile the runtime for an embedded target, use the following commands:
Build the host version of the compiler as explained in the previous section (some host tools are required for the cross-build)
Configure the cross-compile build:
$ scripts/configure_soc_build.sh ../iree-build-soc ../iree-build astra_machina poky
Run the cross-compile build:
$ cmake --build ../iree-build-soc/ --target torq-run-module
The statically linked torq-run-module is available in ../iree-build-soc/third_party/iree/tools/torq-run-module.