Last edit: May 07, 2026

Run with Source Code

To install and run a subtensor node by compiling the source code, follow the steps below.

Not tested on cloud

We have not tested these subtensor node installation steps on any cloud service, including Runpod. Note also that Runpod is already containerized, so on Runpod the only available option is to install a subtensor node by compiling from source, as described below.

Install basic packages

Install the basic requirements by running the following commands in a Linux terminal.

Linux
sudo apt-get update
sudo apt install -y build-essential clang curl git make libssl-dev llvm libudev-dev protobuf-compiler pkg-config

Install the basic requirements by running the following command on macOS.

macOS
brew install protobuf

Install Rust

Next, install Rust and update the environment by running the following commands:

curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
source ~/.cargo/env

Next, install the Rust toolchains and add the WebAssembly build target:

rustup default stable
rustup update
rustup target add wasm32-unknown-unknown
rustup toolchain install nightly
rustup target add --toolchain nightly wasm32-unknown-unknown
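As a quick sanity check (a sketch; the exact output depends on your installation), you can confirm that the stable toolchain and the WebAssembly target are in place:

```shell
# Sanity check: confirm rustup is on PATH and the wasm target is installed.
# Falls back to a hint if rustup is not found (e.g. the env was not sourced).
if command -v rustup >/dev/null 2>&1; then
  rustup show active-toolchain
  rustup target list --installed | grep wasm32-unknown-unknown \
    || echo "wasm32-unknown-unknown target missing"
else
  echo "rustup not found; try: source ~/.cargo/env"
fi
```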

Compile subtensor code

Next, to compile the subtensor source code, follow the steps below:

  1. Clone the Subtensor repository:
git clone https://github.com/opentensor/subtensor.git
  2. Change to the Subtensor directory:
cd subtensor
  3. Switch to the main branch:
git checkout main
  4. Remove any previous chain state:
rm -rf /var/lib/subtensor
  5. Compile subtensor with Cargo:
cargo build -p node-subtensor --profile=production --features=metadata-hash
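The five steps above can be run as one sequence:

```shell
# Clone the repository, select the main branch, clear any old chain
# state, and build the node with the production profile.
git clone https://github.com/opentensor/subtensor.git
cd subtensor
git checkout main
rm -rf /var/lib/subtensor
cargo build -p node-subtensor --profile=production --features=metadata-hash
```

The production-profile build is heavily optimized and can take a while on modest hardware.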

Run the subtensor node

You can now run the public subtensor node either as a lite node or as an archive node. See below:

Using lite node

To run a lite node connected to the mainchain, execute the command below (note the --sync=warp flag, which runs the subtensor node in lite mode):

./target/production/node-subtensor --chain ./chainspecs/raw_spec_finney.json --base-path /var/lib/subtensor --sync=warp --port 30333 --max-runtime-instances 32 --database paritydb --db-cache 4096 --trie-cache-size 2048 --rpc-max-response-size 2048 --rpc-cors all --rpc-port 9944 --bootnodes /dns/bootnode.finney.chain.opentensor.ai/tcp/30333/ws/p2p/12D3KooWRwbMb85RWnT8DSXSYMWQtuDwh4LJzndoRrTDotTR5gDC --no-mdns --rpc-external
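Once the node is up, you can check its status over the RPC port (this assumes the node is running locally on the default RPC port 9944; system_health is a standard Substrate JSON-RPC method):

```shell
# Query the node's health: peer count and whether it is still syncing.
curl -s -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","id":1,"method":"system_health","params":[]}' \
  http://localhost:9944
```

A result containing "isSyncing": true means the node is still catching up with the chain.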

Using archive node

To run an archive node connected to the mainchain, execute the command below. Note the --sync=full flag, which syncs the node to the full chain, and the --pruning archive flag, which disables the node's automatic pruning of older historical data:

./target/production/node-subtensor --chain ./chainspecs/raw_spec_finney.json --base-path /var/lib/subtensor --sync=full --pruning archive --port 30333 --max-runtime-instances 32 --rpc-max-response-size 2048 --rpc-cors all --rpc-port 9944 --bootnodes /dns/bootnode.finney.chain.opentensor.ai/tcp/30333/ws/p2p/12D3KooWRwbMb85RWnT8DSXSYMWQtuDwh4LJzndoRrTDotTR5gDC --no-mdns --prometheus-external --rpc-external
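The archive command above also passes --prometheus-external, which exposes node metrics over HTTP (by default on port 9615 in Substrate-based nodes; verify against your build). A quick check, assuming the node is running locally:

```shell
# Fetch the first few Prometheus metrics lines from the running node.
# Prints nothing if no node is listening on port 9615.
curl -s http://localhost:9615/metrics | head -n 5
```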

Additional flags

Running a subtensor node from the source code offers various levels of customization, including the ability to modify runtime behavior by passing additional flags to the run command.

Some examples of these flags are:

| Flag | Description |
| --- | --- |
| --validator | Enable validator mode. The node starts with the authority role and actively participates in any consensus task. |
| --ws-external | Expose the WebSocket server to external connections. |
| --ws-port <port> | WebSocket port (default: 9944). |
| --rpc-rate-limit | RPC rate limiting (calls/minute) for each connection. Disabled by default. |
| --rpc-max-subscriptions-per-connection | Maximum concurrent subscriptions per connection (default: 1024). |
| --rpc-max-connections | Maximum number of RPC server connections (default: 100). |
| --log <target>=<level> | Logging configuration. |
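For example (a sketch combining flags from the table with the lite-node command above; adjust paths and values for your setup, and keep any bootnode flags you need), you could enable debug logging for the sync target and cap the per-connection RPC rate:

```shell
# Lite node with debug-level sync logging and an RPC rate limit
# of 100 calls per minute per connection.
./target/production/node-subtensor \
  --chain ./chainspecs/raw_spec_finney.json \
  --base-path /var/lib/subtensor \
  --sync=warp \
  --log sync=debug \
  --rpc-rate-limit 100
```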
All supported flags

You can view all flags supported by the subtensor node by running the following command in the subtensor directory:

./target/production/node-subtensor --help

This command requires a local Subtensor build. Ensure the Subtensor node has been compiled before running it.