Docker Setup
This guide will help you set up and run a Masa node with Docker, ready for configuration and deployment.
Prerequisites
Before you begin, ensure you have the following installed:
- Docker
- Docker Compose
- Sepolia ETH (you will need 0.01 Sepolia ETH in your node's wallet for the Sepolia MASA faucet to function and to stake Sepolia MASA with your node)
Docker Desktop for Windows and Mac includes Docker Compose. On Linux, you may need to install it separately.
Clone the repository
git clone https://github.com/masa-finance/masa-oracle.git
cd masa-oracle
Environment Configuration
Set up environment variables to run your node in the local bootnode configuration.
This guide configures your node as a local bootnode; for a list of network bootnodes, refer to the Bootnode Configuration documentation.
Create a .env file in the root directory with these essential variables:
# Default .env configuration
RPC_URL=https://ethereum-sepolia.publicnode.com
ENV=test
FILE_PATH=.
VALIDATOR=false
PORT=8080
API_ENABLED=true
For more .env options, see our Environment Configuration Guide.
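The variables above can also be written in one step from the repository root. This is just a convenience sketch of the same defaults; swap RPC_URL for your own Sepolia endpoint if you have one:

```shell
# Create the default .env in the current directory (run from masa-oracle/).
# These values mirror the defaults shown in this guide.
cat > .env <<'EOF'
RPC_URL=https://ethereum-sepolia.publicnode.com
ENV=test
FILE_PATH=.
VALIDATOR=false
PORT=8080
API_ENABLED=true
EOF
# Quick check: one variable per line, six lines in total.
grep -c '=' .env
```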
Building the Docker Image
With Docker and Docker Compose installed and your .env file configured, build the Docker image with:
docker-compose build
This command builds the Docker image based on the instructions in the provided Dockerfile and docker-compose.yaml.
Start the node
docker-compose up
You will see the following output:
#######################################
# __ __ _ ____ _ #
# | \/ | / \ / ___| / \ #
# | |\/| | / _ \ \___ \ / _ \ #
# | | | |/ ___ \ ___) / ___ \ #
# |_| |_/_/ \_\____/_/ \_\ #
# #
#######################################
Multiaddress: /ip4/192.168.1.8/udp/4001/quic-v1/p2p/16Uiu2HAmDXWNV9RXVoRsbt9z7pFSsKS2KdpN7HHFVLdFZmS7iCvo
IP Address: /ip4/127.0.0.1/udp/4001/quic-v1
Public Key: 0x5dA36a3eB07fd1624B054b99D6417DdF2904e826
Is Staked: false
Is Validator: false
Is TwitterScraper: false
Is DiscordScraper: false
Is TelegramScraper: false
You now have a running node in the local bootnode configuration.
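You will need the wallet address from this banner later, when funding and staking the node. A small sketch for pulling it out of the sample output above (in practice, pipe real logs in with docker-compose logs masa-node; depending on your Compose version, log lines may carry a service-name prefix, in which case drop the ^ anchor):

```shell
# Extract the node's wallet address from the startup banner.
# "$sample" stands in for real logs; in practice use:
#   docker-compose logs masa-node | awk -F': ' '/Public Key/ {print $2}'
sample='Multiaddress: /ip4/192.168.1.8/udp/4001/quic-v1/p2p/16Uiu2HAmDXWNV9RXVoRsbt9z7pFSsKS2KdpN7HHFVLdFZmS7iCvo
Public Key: 0x5dA36a3eB07fd1624B054b99D6417DdF2904e826'
printf '%s\n' "$sample" | awk -F': ' '/^Public Key/ {print $2}'
```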
Masa Protocol Configuration
You can now configure your node to start scraping data as a miner, to fetch data from the network or to start participating in the network as a validator.
Configure your node
Stake your node
The Masa Protocol currently supports staking on Sepolia only.
After starting the node, you must stake Sepolia MASA tokens to participate in the network. The node ships with a faucet to get Sepolia MASA tokens; you need Sepolia ETH in your wallet to claim Sepolia MASA from the faucet.
Get your node's public key from the logs.
Send Sepolia ETH to your node's public key address.
Run the make faucet command to get Sepolia MASA:
docker-compose run --rm masa-node /usr/bin/masa-node --faucet
Run the make stake command to stake your node:
docker-compose run --rm masa-node /usr/bin/masa-node --stake 1000
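The faucet-then-stake flow above can be wrapped in a small helper. This is our own convenience sketch, not part of the Makefile; it assumes the Compose service is named masa-node, as elsewhere in this guide:

```shell
# Hypothetical helper combining the two commands above.
# Requires the stack to be built and Sepolia ETH already in the wallet.
faucet_and_stake() {
  amount="${1:-1000}"
  docker-compose run --rm masa-node /usr/bin/masa-node --faucet &&
  docker-compose run --rm masa-node /usr/bin/masa-node --stake "$amount"
}
# Usage: faucet_and_stake 1000
```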
Run the container in detached mode:
docker-compose up -d
Check the logs to verify the node is running properly:
docker-compose logs -f masa-node
Accessing Your Node Keys
The node generates keys that are stored in /home/masa/.masa/ inside the Docker container; this directory is mounted to your host machine, so your keys persist outside the container.
Open a shell inside the container:
docker-compose exec masa-node /bin/sh
Navigate to the keys directory:
cd /home/masa/.masa/
List the keys:
ls -la
Copy the keys to your host machine (run this from the host, not from inside the container):
docker cp masa-node:/home/masa/.masa/ /path/to/your/host/machine/
Custom configuration
You can customize your node's configuration by modifying the .env file inside the Docker container. Follow these steps to make changes:
Open a shell inside the running container:
docker-compose exec masa-node /bin/sh
Navigate to the directory containing the .env file:
cd /home/masa
Edit the .env file using a text editor like nano or vi:
nano .env
Make your desired changes to the .env file. You can modify existing variables or add new ones as needed.
Save the changes and exit the text editor.
Exit the container:
exit
Restart the container to apply the changes:
docker-compose down
docker-compose up -d
After following these steps, your node will restart with the updated configuration from the modified .env file.
Remember to consult the Environment Configuration Guide for a list of available environment variables and their purposes.
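The in-container edit above can also be done from the host by changing the repository's .env before recreating the stack. The sketch below flips a variable in a scratch copy (the file name .env.example is illustrative); afterwards, apply a real change with docker-compose up -d --force-recreate:

```shell
# Host-side sketch: flip one variable in a scratch .env copy and verify it.
printf 'PORT=8080\nVALIDATOR=false\n' > .env.example
# In-place edit; -i.bak keeps a backup and works with both GNU and BSD sed.
sed -i.bak 's/^VALIDATOR=.*/VALIDATOR=true/' .env.example
grep '^VALIDATOR=' .env.example
```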
Configure a Twitter Scraper
To set up your node as a Twitter scraper, you need to add a twitter_cookies.json file to the container. For more information on obtaining Twitter cookies, refer to our Twitter Scraper Configuration Guide.
Follow these steps:
Prepare your twitter_cookies.json file on your local machine.
Copy the file into the running container:
docker cp /path/to/your/twitter_cookies.json masa-node:/home/masa/.masa/twitter_cookies.json
Open a shell inside the running container:
docker-compose exec masa-node /bin/sh
Verify the file has been copied correctly:
ls -l /home/masa/.masa/twitter_cookies.json
Ensure the file has the correct permissions:
chmod 600 /home/masa/.masa/twitter_cookies.json
Exit the container:
exit
Modify your .env file to enable Twitter scraping:
docker-compose exec masa-node /bin/sh -c "echo 'TWITTER_SCRAPER=true' >> /home/masa/.env"
Restart the container to apply the changes:
docker-compose down
docker-compose up -d
Your node should now be configured as a Twitter scraper. You can verify this by checking the logs:
docker-compose logs -f masa-node
Look for a line indicating that the Twitter scraper is active:
Is TwitterScraper: true
Ensure that your twitter_cookies.json file contains valid Twitter credentials. Using invalid or expired credentials may result in the scraper failing to function properly.
For more information on obtaining Twitter cookies and the format of the twitter_cookies.json file, refer to our Twitter Scraper Configuration Guide.
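Before copying twitter_cookies.json into the container, it is worth confirming the file parses as JSON at all. The sketch below uses a tiny illustrative sample; the real cookie field names depend on how you exported them, so treat the content here as a placeholder only:

```shell
# Sanity check: the cookie file must be valid JSON before it goes into the
# container. The sample content below is illustrative, not a real cookie set.
printf '[{"Name":"auth_token","Value":"example"}]' > twitter_cookies.json
# json.tool exits non-zero on malformed JSON (jq . works equally well).
python3 -m json.tool twitter_cookies.json >/dev/null && echo "valid JSON"
```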