37 Commits

Author SHA1 Message Date
a2efdb136a Update GitHub Actions workflow to specify Dockerfile path in the build context
Some checks failed
Docker Build Test / build (3.10) (push) Successful in 19s
Docker Build Test / build (3.13) (push) Successful in 27s
Docker Build Test / build (3.9) (push) Successful in 27s
Run Tests / test (3.10) (push) Successful in 37s
Run Tests / test (3.11) (push) Successful in 34s
Run Tests / test (3.12) (push) Successful in 55s
Run Tests / test (3.9) (push) Successful in 35s
Run Tests / test (3.13) (push) Successful in 45s
Publish Python 🐍 distribution 📦 to PyPI / Build distribution 📦 (push) Failing after 36s
Docker Build Test / build (3.11) (push) Successful in 23s
Docker Build Test / build (3.12) (push) Successful in 25s
Publish Python 🐍 distribution 📦 to PyPI / Publish Python 🐍 distribution 📦 to PyPI (push) Has been skipped
Publish Python 🐍 distribution 📦 to PyPI / Sign the Python 🐍 distribution 📦 and create GitHub Release (push) Has been skipped
Build and Publish Docker Image / build (push) Failing after 55s
2025-11-08 14:08:37 -06:00
001613b4fa add microvm
Some checks failed
Docker Build Test / build (3.10) (push) Successful in 3m4s
Docker Build Test / build (3.13) (push) Successful in 2m58s
Docker Build Test / build (3.11) (push) Successful in 3m3s
Docker Build Test / build (3.12) (push) Successful in 3m18s
Docker Build Test / build (3.9) (push) Successful in 1m32s
Build and Publish Docker Image / build (push) Has been cancelled
Run Tests / test (3.10) (push) Successful in 1m5s
Run Tests / test (3.11) (push) Successful in 1m19s
Run Tests / test (3.9) (push) Has been cancelled
Run Tests / test (3.12) (push) Has been cancelled
Run Tests / test (3.13) (push) Has been cancelled
2025-11-08 14:03:00 -06:00
74564d0ef2 Update Makefile to use docker buildx load command for building images and adjust Dockerfile paths to the new 'docker' directory. 2025-11-08 14:02:51 -06:00
81142ad194 Refactor GitHub Actions workflows to use Dockerfiles from the new 'docker' directory and add a new workflow for running tests across multiple Python versions. 2025-11-08 14:02:37 -06:00
fee1d2e2d6 Update package version to 1.2.0 and dependencies to rns 1.0.1 in pyproject.toml, requirements.txt, and setup.py; adjust poetry.lock accordingly. 2025-11-08 14:02:25 -06:00
7c93fdb71d move dockerfiles to docker folder 2025-11-08 14:02:06 -06:00
9e435eeebc Update test scripts to support environment variable processing and validate responses for different data types.
Some checks failed
Docker Build Test / build (3.11) (push) Successful in 9s
Docker Build Test / build (3.13) (push) Successful in 10s
Docker Build Test / build (3.12) (push) Successful in 39s
Docker Build Test / build (3.10) (push) Successful in 42s
Docker Build Test / build (3.9) (push) Successful in 38s
Build and Publish Docker Image / build (push) Failing after 9m35s
2025-10-05 16:41:01 -05:00
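These test scripts exercise the env-var convention visible in the diffs below: the node exports request fields to dynamic pages as `field_*`/`var_*` environment variables, plus the requester's hash as `remote_identity`. A minimal sketch of such a page (the file name `echo.mu` and its exact output are illustrative, not from this repo):

```python
#!/usr/bin/env python3
# Hypothetical pages/echo.mu: prints whatever request parameters the
# node exported into the environment, mirroring the test page added
# in tests/run_tests.sh further down.
import os

print(">Echo Page")
params = [f"{key} = {value}" for key, value in sorted(os.environ.items())
          if key.startswith(("field_", "var_"))]
print("\n".join(params) if params else "No parameters received")
print(f"remote_identity = {os.environ.get('remote_identity', 'unknown')}")
```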
5dfcc1f2ce Improve data processing in PageNode class to handle both dictionary and bytes input.
Some checks failed
Docker Build Test / build (3.10) (push) Successful in 11s
Docker Build Test / build (3.12) (push) Successful in 11s
Docker Build Test / build (3.13) (push) Successful in 35s
Docker Build Test / build (3.11) (push) Successful in 37s
Docker Build Test / build (3.9) (push) Successful in 37s
Build and Publish Docker Image / build (push) Has been cancelled
2025-10-05 16:35:38 -05:00
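A sketch of the normalization this commit describes, written as a standalone helper rather than the exact PageNode method (the `'|'` fallback delimiter comes from a related commit further down the log):

```python
def normalize_request_data(data):
    """Map dict or bytes request data to field_*/var_* env-style pairs."""
    env = {}
    if isinstance(data, dict):
        items = [(k, v) for k, v in data.items() if isinstance(v, str)]
    elif isinstance(data, bytes):
        text = data.decode("utf-8", errors="ignore")
        # '|' is accepted as an alternative pair delimiter to '&'
        pairs = text.split("|") if "|" in text and "&" not in text else text.split("&")
        items = [pair.split("=", 1) for pair in pairs if "=" in pair]
    else:
        return env
    for key, value in items:
        if key.startswith(("field_", "var_")):
            env[key] = value
        elif key == "action":
            env["var_action"] = value
        else:
            env[f"field_{key}"] = value
    return env
```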
2def60b457 Update GitHub Actions workflow to include Python 3.9 in the testing matrix
Some checks failed
Docker Build Test / build (3.11) (push) Successful in 28s
Docker Build Test / build (3.13) (push) Successful in 27s
Docker Build Test / build (3.10) (push) Successful in 3m19s
Docker Build Test / build (3.9) (push) Successful in 3m15s
Docker Build Test / build (3.12) (push) Successful in 3m26s
Build and Publish Docker Image / build (push) Failing after 9m33s
2025-10-05 16:10:18 -05:00
f708ad4ee1 Update python version requirement in setup.py to 3.9 2025-10-05 16:10:12 -05:00
f7568d81aa Adjust python version requirement to 3.9 2025-10-05 16:10:05 -05:00
251f9bacef Update .gitignore 2025-10-05 16:08:49 -05:00
07892dbfee Update README
Some checks failed
Docker Build Test / build (3.12) (push) Successful in 32s
Docker Build Test / build (3.10) (push) Successful in 36s
Docker Build Test / build (3.13) (push) Successful in 2m54s
Docker Build Test / build (3.11) (push) Successful in 2m57s
Build and Publish Docker Image / build (push) Has been cancelled
2025-10-05 16:04:33 -05:00
54e6849968 Improve path resolution in PageNode class to ensure security by validating file paths before serving. 2025-10-05 16:02:12 -05:00
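A minimal sketch of the guard this commit describes, matching the resolve-and-prefix-check pattern visible in the `serve_page`/`serve_file` diffs further down (the helper name is illustrative):

```python
from pathlib import Path
from typing import Optional

def resolve_served_path(root: str, relative: str) -> Optional[Path]:
    """Resolve a requested path, refusing anything outside the served root."""
    base = Path(root).resolve()
    candidate = (base / relative).resolve()
    # "../" segments resolve outside base and are rejected.
    if not str(candidate).startswith(str(base)):
        return None
    return candidate
```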
ea27c380cb update version to 1.1.0 in setup.py 2025-10-05 15:49:10 -05:00
Sudo-Ivan
a338be85e1 update uv commands
Some checks failed
Docker Build Test / build (3.11) (push) Successful in 17s
Docker Build Test / build (3.12) (push) Successful in 16s
Docker Build Test / build (3.13) (push) Successful in 17s
Docker Build Test / build (3.10) (push) Successful in 40s
Build and Publish Docker Image / build (push) Failing after 9m34s
2025-10-01 03:03:53 -05:00
Sudo-Ivan
e31cb3418b Update
Some checks failed
Docker Build Test / build (3.11) (push) Successful in 34s
Docker Build Test / build (3.13) (push) Successful in 33s
Docker Build Test / build (3.12) (push) Has been cancelled
Docker Build Test / build (3.10) (push) Has been cancelled
Build and Publish Docker Image / build (push) Has been cancelled
2025-10-01 03:02:14 -05:00
Sudo-Ivan
798725dca6 Update
Some checks failed
Docker Build Test / build (3.12) (push) Successful in 51s
Docker Build Test / build (3.10) (push) Successful in 1m3s
Docker Build Test / build (3.11) (push) Successful in 1m38s
Docker Build Test / build (3.13) (push) Successful in 1m36s
Build and Publish Docker Image / build (push) Failing after 10m2s
2025-09-30 21:43:41 -05:00
Sudo-Ivan
6f393497f0 Add docstring 2025-09-30 21:37:51 -05:00
Sudo-Ivan
14b5aabf2b Improved PageNode class with documentation, error handling, and path management.
Update file and page serving methods to utilize pathlib for modern python path handling
2025-09-30 21:37:41 -05:00
fb36907447 Improve path handling in PageNode class to ensure consistent formatting of served pages and files. 2025-09-23 04:13:37 -05:00
62fde2617b Fix remote_identity assignment in PageNode class to use hash attribute 2025-09-23 03:00:06 -05:00
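The fix amounts to exporting the identity's `.hash` as hex rather than the identity object itself; a small sketch grounded in the `serve_page` diff below (the helper is illustrative):

```python
import RNS

def export_remote_identity(env, remote_identity):
    # RNS.Identity is not a string; hex-encode its hash for the page env.
    if remote_identity:
        env["remote_identity"] = RNS.hexrep(remote_identity.hash, delimit=False)
```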
9f5ea23eb7 Improve request data parsing in PageNode class to support '|' delimiter and add handling for additional fields 2025-09-23 02:42:54 -05:00
19fad61706 Add Micron interactive features via environment variables 2025-09-23 02:28:55 -05:00
c900cf38c9 Bump to 1.1.0 2025-09-23 01:59:01 -05:00
014ebc25c6 Update default home page message and print node address in terminal output. 2025-09-23 01:57:19 -05:00
Sudo-Ivan
d5e9308fb5 Update GitHub Actions workflows to use updated action versions
- Updated actions/checkout to v5.0.0
- Updated actions/setup-python to v6.0.0
- Updated docker/build-push-action to v6.18.0
- Updated actions/upload-artifact to v4.6.2
- Updated actions/download-artifact to v5.0.0
- Updated sigstore/gh-action-sigstore-python to v3.0.1
2025-09-22 18:44:45 -05:00
Sudo-Ivan
7d5e891261 Update dependencies in poetry.lock and pyproject.toml
- Bump anyio from 4.9.0 to 4.10.0
- Bump authlib from 1.6.0 to 1.6.4
- Bump certifi from 2025.7.14 to 2025.8.3
- Bump cffi from 1.17.1 to 2.0.0
- Bump ruamel.yaml.clib from 0.2.12 to 0.2.13
- Bump ruff from 0.12.3 to 0.12.12
- Bump safety from 3.6.0 to 3.6.1
- Bump typer from 0.16.0 to 0.19.1
- Bump typing-extensions from 4.14.1 to 4.15.0
2025-09-22 14:24:57 -05:00
Sudo-Ivan
c382ed790f Update GitHub Actions workflows to use full-length commit hashes for actions 2025-09-22 14:24:40 -05:00
cb72e57da9 Merge pull request #2 from Sudo-Ivan/dependabot/github_actions/dot-github/workflows/pypa/gh-action-pypi-publish-1.13.0
Bump pypa/gh-action-pypi-publish from 1.12.3 to 1.13.0 in /.github/workflows
2025-09-15 02:24:07 -05:00
dependabot[bot]
aaf5ad23e2 Bump pypa/gh-action-pypi-publish in /.github/workflows
Bumps [pypa/gh-action-pypi-publish](https://github.com/pypa/gh-action-pypi-publish) from 1.12.3 to 1.13.0.
- [Release notes](https://github.com/pypa/gh-action-pypi-publish/releases)
- [Commits](https://github.com/pypa/gh-action-pypi-publish/compare/v1.12.3...v1.13.0)

---
updated-dependencies:
- dependency-name: pypa/gh-action-pypi-publish
  dependency-version: 1.13.0
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-04 15:36:31 +00:00
Sudo-Ivan
ce1b1dad7d Update Docker volume path 2025-08-18 03:17:23 -05:00
Sudo-Ivan
67ebc7e556 Fix formatting issues in test_client.py and test_client2.py for consistency in lambda function parameters. 2025-08-04 17:24:21 -05:00
Sudo-Ivan
b31fb748b8 Add node address to output and Fix formatting issues 2025-08-04 17:24:15 -05:00
Sudo-Ivan
eb27326763 Bump package version to 1.0.0. 2025-07-14 17:31:45 -05:00
Sudo-Ivan
f40d5a51ae Refactor main to improve readability and maintainability. 2025-07-14 17:27:17 -05:00
Sudo-Ivan
4aa83a2dfb Add badges to README.md. 2025-07-14 17:22:26 -05:00
21 changed files with 862 additions and 1999 deletions

.github/workflows/docker-test.yml
View File

@@ -15,13 +15,13 @@ jobs:
contents: read
strategy:
matrix:
python-version: ["3.10", "3.11", "3.12", "3.13"]
python-version: ["3.9", "3.10", "3.11", "3.12", "3.13"]
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v5
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: ${{ matrix.python-version }}
- name: Build Docker Image
run: docker build . --file Dockerfile --build-arg PYTHON_VERSION=${{ matrix.python-version }} --tag lxmfy-test:${{ matrix.python-version }}
run: docker build . --file docker/Dockerfile --build-arg PYTHON_VERSION=${{ matrix.python-version }} --tag lxmfy-test:${{ matrix.python-version }}

.github/workflows/docker.yml
View File

@@ -20,18 +20,18 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@v4
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Set up QEMU
uses: docker/setup-qemu-action@v3
uses: docker/setup-qemu-action@29109295f81e9208d7d86ff1c6c12d2833863392
with:
platforms: amd64,arm64
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435
- name: Log in to the Container registry
uses: docker/login-action@v3
uses: docker/login-action@184bdaa0721073962dff0199f1fb9940f07167d1
with:
registry: ${{ env.REGISTRY }}
username: ${{ github.actor }}
@@ -39,7 +39,7 @@ jobs:
- name: Extract metadata (tags, labels) for Docker
id: meta
uses: docker/metadata-action@v5
uses: docker/metadata-action@c1e51972afc2121e065aed6d45c65596fe445f3f
with:
images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
tags: |
@@ -51,9 +51,10 @@ jobs:
type=sha,format=short
- name: Build and push Docker image
uses: docker/build-push-action@v5
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
with:
context: .
file: ./docker/Dockerfile
platforms: linux/amd64,linux/arm64
push: ${{ github.event_name != 'pull_request' }}
tags: ${{ steps.meta.outputs.tags }}
@@ -63,7 +64,7 @@ jobs:
- name: Extract metadata (tags, labels) for Docker (rootless)
id: meta_rootless
uses: docker/metadata-action@v5
uses: docker/metadata-action@c1e51972afc2121e065aed6d45c65596fe445f3f
with:
images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}-rootless
tags: |
@@ -74,10 +75,10 @@ jobs:
type=sha,format=short,suffix=-rootless
- name: Build and push rootless Docker image
uses: docker/build-push-action@v5
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
with:
context: .
file: ./Dockerfile.rootless
file: ./docker/Dockerfile.rootless
platforms: linux/amd64,linux/arm64
push: ${{ github.event_name != 'pull_request' }}
tags: ${{ steps.meta_rootless.outputs.tags }}

View File

@@ -23,11 +23,11 @@ jobs:
contents: read
id-token: write
steps:
- uses: actions/checkout@v4.2.2
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
persist-credentials: false
- name: Set up Python
uses: actions/setup-python@v5.3.0
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: "3.13"
- name: Install pypa/build
@@ -35,7 +35,7 @@ jobs:
- name: Build a binary wheel and a source tarball
run: python3 -m build
- name: Store the distribution packages
uses: actions/upload-artifact@v4.5.0
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: python-package-distributions
path: dist/
@@ -55,12 +55,12 @@ jobs:
steps:
- name: Download all the dists
uses: actions/download-artifact@v4.1.8
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
with:
name: python-package-distributions
path: dist/
- name: Publish distribution 📦 to PyPI
uses: pypa/gh-action-pypi-publish@v1.12.3
uses: pypa/gh-action-pypi-publish@ed0c53931b1dc9bd32cbe73a98c7f6766f8a527e
github-release:
name: Sign the Python 🐍 distribution 📦 and create GitHub Release
@@ -73,12 +73,12 @@ jobs:
steps:
- name: Download all the dists
uses: actions/download-artifact@v4.1.8
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
with:
name: python-package-distributions
path: dist/
- name: Sign the dists with Sigstore
uses: sigstore/gh-action-sigstore-python@v3.0.0
uses: sigstore/gh-action-sigstore-python@f7ad0af51a5648d09a20d00370f0a91c3bdf8f84 # v3.0.1
with:
inputs: >-
./dist/*.tar.gz

44
.github/workflows/tests.yml vendored Normal file
View File

@@ -0,0 +1,44 @@
name: Run Tests
on:
push:
branches:
- main
pull_request:
branches:
- main
jobs:
test:
runs-on: ubuntu-latest
permissions:
contents: read
strategy:
matrix:
python-version: ["3.9", "3.10", "3.11", "3.12", "3.13"]
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install -e .
- name: Run tests
run: |
cd tests
chmod +x run_tests.sh
timeout 120 ./run_tests.sh
- name: Upload test logs on failure
if: failure()
uses: actions/upload-artifact@v4
with:
name: test-logs-python-${{ matrix.python-version }}
path: tests/node.log

11
.gitignore vendored
View File

@@ -3,3 +3,14 @@ node-config/
files/
.ruff_cache/
__pycache__/
dist/
*.egg-info/
.ruff_cache/
.venv/
.env
.env.local
.env.development.local
.env.test.local
.env.production.local
microvm/

Makefile
View File

@@ -2,6 +2,7 @@
# Detect if docker buildx is available
DOCKER_BUILD := $(shell docker buildx version >/dev/null 2>&1 && echo "docker buildx build" || echo "docker build")
DOCKER_BUILD_LOAD := $(shell docker buildx version >/dev/null 2>&1 && echo "docker buildx build --load" || echo "docker build")
.PHONY: all build sdist wheel clean install lint format docker-wheels docker-build docker-run docker-build-rootless docker-run-rootless help test docker-test
@@ -29,13 +30,13 @@ format:
ruff check --fix .
docker-wheels:
$(DOCKER_BUILD) --target builder -f Dockerfile.build -t rns-page-node-builder .
$(DOCKER_BUILD) --target builder -f docker/Dockerfile.build -t rns-page-node-builder .
docker create --name builder-container rns-page-node-builder true
docker cp builder-container:/src/dist ./dist
docker rm builder-container
docker-build:
$(DOCKER_BUILD) $(BUILD_ARGS) -f Dockerfile -t rns-page-node:latest .
$(DOCKER_BUILD_LOAD) $(BUILD_ARGS) -f docker/Dockerfile -t rns-page-node:latest .
docker-run:
docker run --rm -it \
@@ -50,7 +51,7 @@ docker-run:
--announce-interval 360
docker-build-rootless:
$(DOCKER_BUILD) $(BUILD_ARGS) -f Dockerfile.rootless -t rns-page-node-rootless:latest .
$(DOCKER_BUILD_LOAD) $(BUILD_ARGS) -f docker/Dockerfile.rootless -t rns-page-node-rootless:latest .
docker-run-rootless:
docker run --rm -it \
@@ -68,7 +69,7 @@ test:
bash tests/run_tests.sh
docker-test:
$(DOCKER_BUILD) -f tests/Dockerfile.tests -t rns-page-node-tests .
$(DOCKER_BUILD_LOAD) -f docker/Dockerfile.tests -t rns-page-node-tests .
docker run --rm rns-page-node-tests
help:

README.md
View File

@@ -1,14 +1,48 @@
# RNS Page Node
[Русская](README.ru.md)
[![Build and Publish Docker Image](https://github.com/Sudo-Ivan/rns-page-node/actions/workflows/docker.yml/badge.svg)](https://github.com/Sudo-Ivan/rns-page-node/actions/workflows/docker.yml)
[![Docker Build Test](https://github.com/Sudo-Ivan/rns-page-node/actions/workflows/docker-test.yml/badge.svg)](https://github.com/Sudo-Ivan/rns-page-node/actions/workflows/docker-test.yml)
[![DeepSource](https://app.deepsource.com/gh/Sudo-Ivan/rns-page-node.svg/?label=active+issues&show_trend=true&token=kajzd0SjJXSzkuN3z3kG9gQw)](https://app.deepsource.com/gh/Sudo-Ivan/rns-page-node/)
A simple way to serve pages and files over the [Reticulum network](https://reticulum.network/). Drop-in replacement for [NomadNet](https://github.com/markqvist/NomadNet) nodes that primarily serve pages and files.
## Features
- Static and dynamic pages
- Serve files
- Simple
## To-Do
- Parameter parsing for forums, chat, etc.
## Usage
```bash
# Pip
# May require --break-system-packages
pip install rns-page-node
# Pipx
pipx install rns-page-node
# uv
uv venv
source .venv/bin/activate
uv pip install rns-page-node
# Git
pipx install git+https://github.com/Sudo-Ivan/rns-page-node.git
```
```bash
# will use current directory for pages and files
rns-page-node
```
@@ -21,7 +55,7 @@ rns-page-node --node-name "Page Node" --pages-dir ./pages --files-dir ./files --
### Docker/Podman
```bash
docker run -it --rm -v ./pages:/app/pages -v ./files:/app/files -v ./node-config:/app/node-config -v ./config:/app/config ghcr.io/sudo-ivan/rns-page-node:latest
docker run -it --rm -v ./pages:/app/pages -v ./files:/app/files -v ./node-config:/app/node-config -v ./config:/root/.reticulum ghcr.io/sudo-ivan/rns-page-node:latest
```
### Docker/Podman Rootless
@@ -54,45 +88,7 @@ make docker-wheels
## Pages
Supports Micron `.mu` and dynamic pages with `#!` in the micron files.
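A minimal sketch of how the node tells the two apart, mirroring `PageNode.serve_page` in the diff below (the standalone function is illustrative): an executable file whose first line starts with `#!` is run and its stdout served; anything else is returned as raw bytes.

```python
import os
import subprocess
from pathlib import Path

def render_page(file_path: Path) -> bytes:
    # A leading "#!" on an executable file marks a dynamic page.
    with file_path.open("rb") as f:
        is_script = f.readline().startswith(b"#!")
    if is_script and os.access(str(file_path), os.X_OK):
        result = subprocess.run([str(file_path)], stdout=subprocess.PIPE, check=True)
        return result.stdout
    return file_path.read_bytes()
```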
## Statistics Tracking
The node now includes comprehensive statistics tracking for monitoring peer connections and page/file requests:
### Command Line Options for Stats
```bash
# Print stats every 60 seconds
rns-page-node --stats-interval 60
# Save stats to JSON file on shutdown
rns-page-node --save-stats node_stats.json
# Actively write stats to file (live updates)
rns-page-node --stats-file stats.json
# Combined: live stats file + periodic display + final save
rns-page-node --stats-file stats.json --stats-interval 300 --save-stats final_stats.json
```
### Docker Stats Usage
```bash
# With periodic stats display
docker run -it --rm -v ./pages:/app/pages -v ./files:/app/files -v ./node-config:/app/node-config -v ./config:/app/config ghcr.io/sudo-ivan/rns-page-node:latest --stats-interval 60
# Save stats to mounted volume
docker run -it --rm -v ./pages:/app/pages -v ./files:/app/files -v ./node-config:/app/node-config -v ./config:/app/config -v ./stats:/app/stats ghcr.io/sudo-ivan/rns-page-node:latest --save-stats /app/stats/node_stats.json
```
### Tracked Metrics
- **Connection Statistics**: Total connections, active connections, peer tracking
- **Request Statistics**: Page requests, file requests, requests by path and peer
- **Performance Metrics**: Requests per hour, uptime, response patterns
- **Historical Data**: Recent request history, hourly/daily aggregations
- **Top Content**: Most requested pages and files, most active peers
Supports dynamic pages but not request data parsing yet.
## Options
@@ -103,11 +99,9 @@ docker run -it --rm -v ./pages:/app/pages -v ./files:/app/files -v ./node-config
-f, --files-dir: The directory to serve files from.
-i, --identity-dir: The directory to persist the node's identity.
-a, --announce-interval: The interval to announce the node's presence.
--page-refresh-interval: The interval to refresh pages (seconds, 0 disables).
--file-refresh-interval: The interval to refresh files (seconds, 0 disables).
-r, --page-refresh-interval: The interval to refresh pages.
-f, --file-refresh-interval: The interval to refresh files.
-l, --log-level: The logging level.
--stats-interval: Print stats every N seconds (0 disables).
--save-stats: Save stats to JSON file on shutdown.
```
## License

94
README.ru.md Normal file
View File

@@ -0,0 +1,94 @@
# RNS Page Node
ΠŸΡ€ΠΎΡΡ‚ΠΎΠΉ способ для Ρ€Π°Π·Π΄Π°Ρ‡ΠΈ страниц ΠΈ Ρ„Π°ΠΉΠ»ΠΎΠ² Ρ‡Π΅Ρ€Π΅Π· ΡΠ΅Ρ‚ΡŒ [Reticulum](https://reticulum.network/). ΠŸΡ€ΡΠΌΠ°Ρ Π·Π°ΠΌΠ΅Π½Π° для ΡƒΠ·Π»ΠΎΠ² [NomadNet](https://github.com/markqvist/NomadNet), ΠΊΠΎΡ‚ΠΎΡ€Ρ‹Π΅ Π² основном слуТат для Ρ€Π°Π·Π΄Π°Ρ‡ΠΈ страниц ΠΈ Ρ„Π°ΠΉΠ»ΠΎΠ².
## ИспользованиС
```bash
# Pip
# ΠœΠΎΠΆΠ΅Ρ‚ ΠΏΠΎΡ‚Ρ€Π΅Π±ΠΎΠ²Π°Ρ‚ΡŒΡΡ --break-system-packages
pip install rns-page-node
# Pipx
pipx install rns-page-node
# uv
uv venv
source .venv/bin/activate
uv pip install rns-page-node
# Git
pipx install git+https://github.com/Sudo-Ivan/rns-page-node.git
```
```bash
# Π±ΡƒΠ΄Π΅Ρ‚ ΠΈΡΠΏΠΎΠ»ΡŒΠ·ΠΎΠ²Π°Ρ‚ΡŒ Ρ‚Π΅ΠΊΡƒΡ‰ΠΈΠΉ ΠΊΠ°Ρ‚Π°Π»ΠΎΠ³ для страниц ΠΈ Ρ„Π°ΠΉΠ»ΠΎΠ²
rns-page-node
```
## ИспользованиС
```bash
rns-page-node --node-name "Page Node" --pages-dir ./pages --files-dir ./files --identity-dir ./node-config --announce-interval 360
```
### Docker/Podman
```bash
docker run -it --rm -v ./pages:/app/pages -v ./files:/app/files -v ./node-config:/app/node-config -v ./config:/root/.reticulum ghcr.io/sudo-ivan/rns-page-node:latest
```
### Docker/Podman Π±Π΅Π· root
```bash
mkdir -p ./pages ./files ./node-config ./config
chown -R 1000:1000 ./pages ./files ./node-config ./config
podman run -it --rm -v ./pages:/app/pages -v ./files:/app/files -v ./node-config:/app/node-config -v ./config:/app/config ghcr.io/sudo-ivan/rns-page-node:latest-rootless
```
ΠœΠΎΠ½Ρ‚ΠΈΡ€ΠΎΠ²Π°Π½ΠΈΠ΅ Ρ‚ΠΎΠΌΠΎΠ² Π½Π΅ΠΎΠ±ΡΠ·Π°Ρ‚Π΅Π»ΡŒΠ½ΠΎ, Π²Ρ‹ Ρ‚Π°ΠΊΠΆΠ΅ ΠΌΠΎΠΆΠ΅Ρ‚Π΅ ΡΠΊΠΎΠΏΠΈΡ€ΠΎΠ²Π°Ρ‚ΡŒ страницы ΠΈ Ρ„Π°ΠΉΠ»Ρ‹ Π² ΠΊΠΎΠ½Ρ‚Π΅ΠΉΠ½Π΅Ρ€ с ΠΏΠΎΠΌΠΎΡ‰ΡŒΡŽ `podman cp` ΠΈΠ»ΠΈ `docker cp`.
## Π‘Π±ΠΎΡ€ΠΊΠ°
```bash
make build
```
Π‘Π±ΠΎΡ€ΠΊΠ° wheels:
```bash
make wheel
```
### Π‘Π±ΠΎΡ€ΠΊΠ° Wheels Π² Docker
```bash
make docker-wheels
```
## Π‘Ρ‚Ρ€Π°Π½ΠΈΡ†Ρ‹
ΠŸΠΎΠ΄Π΄Π΅Ρ€ΠΆΠΈΠ²Π°ΡŽΡ‚ΡΡ динамичСскиС страницы, Π½ΠΎ парсинг Π΄Π°Π½Π½Ρ‹Ρ… запроса ΠΏΠΎΠΊΠ° Π½Π΅ Ρ€Π΅Π°Π»ΠΈΠ·ΠΎΠ²Π°Π½.
## ΠžΠΏΡ†ΠΈΠΈ
```
-c, --config: ΠŸΡƒΡ‚ΡŒ ΠΊ Ρ„Π°ΠΉΠ»Ρƒ ΠΊΠΎΠ½Ρ„ΠΈΠ³ΡƒΡ€Π°Ρ†ΠΈΠΈ Reticulum.
-n, --node-name: Имя ΡƒΠ·Π»Π°.
-p, --pages-dir: ΠšΠ°Ρ‚Π°Π»ΠΎΠ³ для Ρ€Π°Π·Π΄Π°Ρ‡ΠΈ страниц.
-f, --files-dir: ΠšΠ°Ρ‚Π°Π»ΠΎΠ³ для Ρ€Π°Π·Π΄Π°Ρ‡ΠΈ Ρ„Π°ΠΉΠ»ΠΎΠ².
-i, --identity-dir: ΠšΠ°Ρ‚Π°Π»ΠΎΠ³ для сохранСния ΠΈΠ΄Π΅Π½Ρ‚ΠΈΡ„ΠΈΠΊΠ°Ρ†ΠΈΠΎΠ½Π½Ρ‹Ρ… Π΄Π°Π½Π½Ρ‹Ρ… ΡƒΠ·Π»Π°.
-a, --announce-interval: Π˜Π½Ρ‚Π΅Ρ€Π²Π°Π» анонсирования присутствия ΡƒΠ·Π»Π°.
-r, --page-refresh-interval: Π˜Π½Ρ‚Π΅Ρ€Π²Π°Π» обновлСния страниц.
-f, --file-refresh-interval: Π˜Π½Ρ‚Π΅Ρ€Π²Π°Π» обновлСния Ρ„Π°ΠΉΠ»ΠΎΠ².
-l, --log-level: Π£Ρ€ΠΎΠ²Π΅Π½ΡŒ логирования.
```
## ЛицСнзия
Π­Ρ‚ΠΎΡ‚ ΠΏΡ€ΠΎΠ΅ΠΊΡ‚ Π²ΠΊΠ»ΡŽΡ‡Π°Π΅Ρ‚ части ΠΊΠΎΠ΄ΠΎΠ²ΠΎΠΉ Π±Π°Π·Ρ‹ [NomadNet](https://github.com/markqvist/NomadNet), которая Π»ΠΈΡ†Π΅Π½Π·ΠΈΡ€ΠΎΠ²Π°Π½Π° ΠΏΠΎΠ΄ GNU General Public License v3.0 (GPL-3.0). Как производная Ρ€Π°Π±ΠΎΡ‚Π°, этот ΠΏΡ€ΠΎΠ΅ΠΊΡ‚ Ρ‚Π°ΠΊΠΆΠ΅ распространяСтся Π½Π° условиях GPL-3.0. ΠŸΠΎΠ»Π½Ρ‹ΠΉ тСкст Π»ΠΈΡ†Π΅Π½Π·ΠΈΠΈ смотритС Π² Ρ„Π°ΠΉΠ»Π΅ [LICENSE](LICENSE).


1521
poetry.lock generated
View File

File diff suppressed because it is too large

pyproject.toml
View File

@@ -1,15 +1,15 @@
[project]
name = "rns-page-node"
version = "0.2.0"
version = "1.2.0"
license = "GPL-3.0-only"
description = "A simple way to serve pages and files over the Reticulum network."
authors = [
{name = "Sudo-Ivan"}
]
readme = "README.md"
requires-python = ">=3.10"
requires-python = ">=3.9"
dependencies = [
"rns (>=1.0.0,<1.5.0)"
"rns (>=1.0.1,<1.5.0)"
]
[project.scripts]
@@ -20,6 +20,4 @@ requires = ["poetry-core>=2.0.0,<3.0.0"]
build-backend = "poetry.core.masonry.api"
[tool.poetry.group.dev.dependencies]
ruff = "^0.12.3"
safety = "^3.6.0"
ruff = "^0.13.3"

requirements.txt
View File

@@ -1 +1 @@
rns=1.0.0
rns=1.0.1

rns_page_node/__init__.py
View File

@@ -1,2 +1,6 @@
# rns_page_node package
__all__ = ['main']
"""RNS Page Node package.
A minimal Reticulum page node that serves .mu pages and files over RNS.
"""
__all__ = ["main"]

rns_page_node/main.py
View File

@@ -1,49 +1,66 @@
#!/usr/bin/env python3
"""
Minimal Reticulum Page Node
"""Minimal Reticulum Page Node.
Serves .mu pages and files over RNS.
"""
import os
import time
import threading
import subprocess
import RNS
import argparse
import logging
import json
from collections import defaultdict, deque
from datetime import datetime
import os
import subprocess
import threading
import time
from pathlib import Path
logger = logging.getLogger(__name__)
import RNS
DEFAULT_INDEX = '''>Default Home Page
DEFAULT_INDEX = """>Default Home Page
This node is serving pages using page node, but the home page file (index.mu) was not found in the pages directory. Please add an index.mu file to customize the home page.
'''
This node is serving pages using rns-page-node, but index.mu was not found.
Please add an index.mu file to customize the home page.
"""
DEFAULT_NOTALLOWED = '''>Request Not Allowed
DEFAULT_NOTALLOWED = """>Request Not Allowed
You are not authorised to carry out the request.
'''
"""
class PageNode:
def __init__(self, identity, pagespath, filespath, announce_interval=360, name=None, page_refresh_interval=0, file_refresh_interval=0, stats_file=None):
"""A Reticulum page node that serves .mu pages and files over RNS."""
def __init__(
self,
identity,
pagespath,
filespath,
announce_interval=360,
name=None,
page_refresh_interval=0,
file_refresh_interval=0,
):
"""Initialize the PageNode.
Args:
identity: RNS Identity for the node
pagespath: Path to directory containing .mu pages
filespath: Path to directory containing files to serve
announce_interval: Seconds between announcements (default: 360)
name: Display name for the node (optional)
page_refresh_interval: Seconds between page rescans (0 = disabled)
file_refresh_interval: Seconds between file rescans (0 = disabled)
"""
self._stop_event = threading.Event()
self._lock = threading.Lock()
self._stats_lock = threading.Lock()
self.logger = logging.getLogger(f"{__name__}.PageNode")
self.identity = identity
self.name = name
self.pagespath = pagespath
self.filespath = filespath
self.stats_file = stats_file
self.destination = RNS.Destination(
identity,
RNS.Destination.IN,
RNS.Destination.SINGLE,
"nomadnetwork",
"node"
"node",
)
self.announce_interval = announce_interval
self.last_announce = 0
@@ -52,506 +69,316 @@ class PageNode:
self.last_page_refresh = time.time()
self.last_file_refresh = time.time()
# Initialize stats tracking
self._init_stats()
self.register_pages()
self.register_files()
self.destination.set_link_established_callback(self.on_connect)
self._announce_thread = threading.Thread(target=self._announce_loop, daemon=True)
self._announce_thread = threading.Thread(
target=self._announce_loop,
daemon=True,
)
self._announce_thread.start()
self._refresh_thread = threading.Thread(target=self._refresh_loop, daemon=True)
self._refresh_thread.start()
def register_pages(self):
"""Scan pages directory and register request handlers for all .mu files."""
with self._lock:
self.servedpages = []
self._scan_pages(self.pagespath)
if not os.path.isfile(os.path.join(self.pagespath, "index.mu")):
pagespath = Path(self.pagespath)
if not (pagespath / "index.mu").is_file():
self.destination.register_request_handler(
"/page/index.mu",
response_generator=self.serve_default_index,
allow=RNS.Destination.ALLOW_ALL
allow=RNS.Destination.ALLOW_ALL,
)
for full_path in self.servedpages:
rel = full_path[len(self.pagespath):]
rel = full_path[len(str(pagespath)) :]
if not rel.startswith("/"):
rel = "/" + rel
request_path = f"/page{rel}"
self.destination.register_request_handler(
request_path,
response_generator=self.serve_page,
allow=RNS.Destination.ALLOW_ALL
allow=RNS.Destination.ALLOW_ALL,
)
def register_files(self):
"""Scan files directory and register request handlers for all files."""
with self._lock:
self.servedfiles = []
self._scan_files(self.filespath)
filespath = Path(self.filespath)
for full_path in self.servedfiles:
rel = full_path[len(self.filespath):]
rel = full_path[len(str(filespath)) :]
if not rel.startswith("/"):
rel = "/" + rel
request_path = f"/file{rel}"
self.destination.register_request_handler(
request_path,
response_generator=self.serve_file,
allow=RNS.Destination.ALLOW_ALL,
auto_compress=32_000_000
auto_compress=32_000_000,
)
def _scan_pages(self, base):
for entry in os.listdir(base):
if entry.startswith('.'):
base_path = Path(base)
for entry in base_path.iterdir():
if entry.name.startswith("."):
continue
path = os.path.join(base, entry)
if os.path.isdir(path):
self._scan_pages(path)
elif os.path.isfile(path) and not entry.endswith(".allowed"):
self.servedpages.append(path)
if entry.is_dir():
self._scan_pages(str(entry))
elif entry.is_file() and not entry.name.endswith(".allowed"):
self.servedpages.append(str(entry))
def _scan_files(self, base):
for entry in os.listdir(base):
if entry.startswith('.'):
base_path = Path(base)
for entry in base_path.iterdir():
if entry.name.startswith("."):
continue
path = os.path.join(base, entry)
if os.path.isdir(path):
self._scan_files(path)
elif os.path.isfile(path):
self.servedfiles.append(path)
if entry.is_dir():
self._scan_files(str(entry))
elif entry.is_file():
self.servedfiles.append(str(entry))
def _init_stats(self):
"""Initialize statistics tracking"""
self.stats = {
'start_time': time.time(),
'total_connections': 0,
'active_connections': 0,
'total_page_requests': 0,
'total_file_requests': 0,
'page_requests_by_path': defaultdict(int),
'file_requests_by_path': defaultdict(int),
'requests_by_peer': defaultdict(int),
'recent_requests': deque(maxlen=100), # Keep last 100 requests
'connected_peers': {}, # link_id -> peer_info
'hourly_stats': defaultdict(lambda: {'pages': 0, 'files': 0}),
'daily_stats': defaultdict(lambda: {'pages': 0, 'files': 0}),
}
# Initialize stats file if specified
if self.stats_file:
self._init_stats_file()
@staticmethod
def serve_default_index(
_path,
_data,
_request_id,
_link_id,
_remote_identity,
_requested_at,
):
"""Serve the default index page when no index.mu file exists."""
return DEFAULT_INDEX.encode("utf-8")
def _init_stats_file(self):
"""Initialize the stats file with basic structure"""
def serve_page(
self,
path,
data,
_request_id,
_link_id,
remote_identity,
_requested_at,
):
"""Serve a .mu page file, executing it as a script if it has a shebang."""
pagespath = Path(self.pagespath).resolve()
relative_path = path[6:] if path.startswith("/page/") else path[5:]
file_path = (pagespath / relative_path).resolve()
if not str(file_path).startswith(str(pagespath)):
return DEFAULT_NOTALLOWED.encode("utf-8")
try:
# Ensure directory exists
dir_path = os.path.dirname(os.path.abspath(self.stats_file))
if dir_path:
os.makedirs(dir_path, exist_ok=True)
# Create initial stats file
initial_stats = {
'node_info': {
'name': self.name or 'Unnamed',
'hash': RNS.hexrep(self.destination.hash, delimit=False),
'start_time': datetime.fromtimestamp(self.stats['start_time']).isoformat()
},
'connections': [],
'requests': [],
'summary': {
'total_connections': 0,
'total_page_requests': 0,
'total_file_requests': 0,
'last_updated': datetime.now().isoformat()
}
}
with open(self.stats_file, 'w') as f:
json.dump(initial_stats, f, indent=2)
self.logger.info(f"Initialized stats file: {self.stats_file}")
except Exception as e:
self.logger.error(f"Failed to initialize stats file {self.stats_file}: {e}")
def _write_stats_event(self, event_type, event_data):
"""Write a single stats event to the file"""
if not self.stats_file:
return
try:
# Read current stats
try:
with open(self.stats_file, 'r') as f:
stats_data = json.load(f)
except (FileNotFoundError, json.JSONDecodeError):
# If file doesn't exist or is corrupted, reinitialize
self._init_stats_file()
with open(self.stats_file, 'r') as f:
stats_data = json.load(f)
# Add the new event
if event_type == 'connection':
stats_data['connections'].append(event_data)
stats_data['summary']['total_connections'] += 1
elif event_type == 'request':
stats_data['requests'].append(event_data)
if event_data['type'] == 'page':
stats_data['summary']['total_page_requests'] += 1
elif event_data['type'] == 'file':
stats_data['summary']['total_file_requests'] += 1
# Update last_updated timestamp
stats_data['summary']['last_updated'] = datetime.now().isoformat()
# Keep only last 1000 events to prevent file from growing too large
if len(stats_data['connections']) > 1000:
stats_data['connections'] = stats_data['connections'][-1000:]
if len(stats_data['requests']) > 1000:
stats_data['requests'] = stats_data['requests'][-1000:]
# Write back to file
with open(self.stats_file, 'w') as f:
json.dump(stats_data, f, indent=2, default=str)
except Exception as e:
self.logger.error(f"Failed to write stats event to {self.stats_file}: {e}")
def _record_request(self, request_type, path, remote_identity, requested_at):
"""Record a request in statistics"""
with self._stats_lock:
# Get peer identity hash with better fallback
if remote_identity:
peer_hash = RNS.hexrep(remote_identity.hash, delimit=False)
# Try to get app_data name if available
try:
app_data = RNS.Identity.recall_app_data(remote_identity.hash)
if app_data:
peer_display = app_data.decode('utf-8', errors='ignore')[:32] # Limit length
else:
peer_display = peer_hash[:16] + "..." # Show first 16 chars
except:
peer_display = peer_hash[:16] + "..."
else:
peer_hash = "anonymous"
peer_display = "anonymous"
# Record basic stats
if request_type == 'page':
self.stats['total_page_requests'] += 1
self.stats['page_requests_by_path'][path] += 1
elif request_type == 'file':
self.stats['total_file_requests'] += 1
self.stats['file_requests_by_path'][path] += 1
self.stats['requests_by_peer'][peer_hash] += 1
# Record recent request
request_info = {
'type': request_type,
'path': path,
'peer': peer_display,
'peer_hash': peer_hash,
'timestamp': requested_at,
'datetime': datetime.fromtimestamp(requested_at).isoformat()
}
self.stats['recent_requests'].append(request_info)
# Record hourly and daily stats
dt = datetime.fromtimestamp(requested_at)
hour_key = dt.strftime('%Y-%m-%d %H:00')
day_key = dt.strftime('%Y-%m-%d')
if request_type == 'page':
self.stats['hourly_stats'][hour_key]['pages'] += 1
self.stats['daily_stats'][day_key]['pages'] += 1
elif request_type == 'file':
self.stats['hourly_stats'][hour_key]['files'] += 1
self.stats['daily_stats'][day_key]['files'] += 1
# Write to stats file immediately
self._write_stats_event('request', request_info)
def serve_default_index(self, path, data, request_id, link_id, remote_identity, requested_at):
self._record_request('page', path, remote_identity, requested_at)
return DEFAULT_INDEX.encode('utf-8')
def serve_page(self, path, data, request_id, link_id, remote_identity, requested_at):
self._record_request('page', path, remote_identity, requested_at)
file_path = path.replace("/page", self.pagespath, 1)
try:
with open(file_path, 'rb') as _f:
with file_path.open("rb") as _f:
first_line = _f.readline()
is_script = first_line.startswith(b'#!')
is_script = first_line.startswith(b"#!")
except Exception:
is_script = False
if is_script and os.access(file_path, os.X_OK):
# Note: You can remove the following try-except block if you just serve static pages.
if is_script and os.access(str(file_path), os.X_OK):
try:
result = subprocess.run([file_path], stdout=subprocess.PIPE)
env = os.environ.copy()
if remote_identity:
env["remote_identity"] = RNS.hexrep(
remote_identity.hash,
delimit=False,
)
if data:
try:
RNS.log(f"Processing request data: {data} (type: {type(data)})", RNS.LOG_DEBUG)
if isinstance(data, dict):
RNS.log(f"Data is dictionary with {len(data)} items", RNS.LOG_DEBUG)
for key, value in data.items():
if isinstance(value, str):
if key.startswith(("field_", "var_")):
env[key] = value
RNS.log(f"Set env[{key}] = {value}", RNS.LOG_DEBUG)
elif key == "action":
env["var_action"] = value
RNS.log(f"Set env[var_action] = {value}", RNS.LOG_DEBUG)
else:
env[f"field_{key}"] = value
RNS.log(f"Set env[field_{key}] = {value}", RNS.LOG_DEBUG)
elif isinstance(data, bytes):
data_str = data.decode("utf-8")
RNS.log(f"Data is bytes, decoded to: {data_str}", RNS.LOG_DEBUG)
if data_str:
if "|" in data_str and "&" not in data_str:
pairs = data_str.split("|")
else:
pairs = data_str.split("&")
for pair in pairs:
if "=" in pair:
key, value = pair.split("=", 1)
if key.startswith(("field_", "var_")):
env[key] = value
elif key == "action":
env["var_action"] = value
else:
env[f"field_{key}"] = value
except Exception as e:
RNS.log(f"Error parsing request data: {e}", RNS.LOG_ERROR)
result = subprocess.run( # noqa: S603
[str(file_path)],
stdout=subprocess.PIPE,
check=True,
env=env,
)
return result.stdout
except Exception:
pass
with open(file_path, 'rb') as f:
except Exception as e:
RNS.log(f"Error executing script page: {e}", RNS.LOG_ERROR)
with file_path.open("rb") as f:
return f.read()
def serve_file(self, path, data, request_id, link_id, remote_identity, requested_at):
self._record_request('file', path, remote_identity, requested_at)
file_path = path.replace("/file", self.filespath, 1)
return [open(file_path, 'rb'), {"name": os.path.basename(file_path).encode('utf-8')}]
def serve_file(
self,
path,
_data,
_request_id,
_link_id,
_remote_identity,
_requested_at,
):
"""Serve a file from the files directory."""
filespath = Path(self.filespath).resolve()
relative_path = path[6:] if path.startswith("/file/") else path[5:]
file_path = (filespath / relative_path).resolve()
if not str(file_path).startswith(str(filespath)):
return DEFAULT_NOTALLOWED.encode("utf-8")
return [
file_path.open("rb"),
{"name": file_path.name.encode("utf-8")},
]
def on_connect(self, link):
"""Called when a new link is established"""
connection_time = time.time()
with self._stats_lock:
self.stats['total_connections'] += 1
self.stats['active_connections'] += 1
# Get peer info with better identification
if link.get_remote_identity():
peer_hash = RNS.hexrep(link.get_remote_identity().hash, delimit=False)
# Try to get app_data name if available
try:
app_data = RNS.Identity.recall_app_data(link.get_remote_identity().hash)
if app_data:
peer_display = app_data.decode('utf-8', errors='ignore')[:32] # Limit length
else:
peer_display = peer_hash[:16] + "..." # Show first 16 chars
except:
peer_display = peer_hash[:16] + "..."
else:
peer_hash = "anonymous"
peer_display = "anonymous"
# Convert link_id to hex string properly
link_id_hex = RNS.hexrep(link.link_id, delimit=False) if hasattr(link, 'link_id') else "unknown"
self.stats['connected_peers'][link_id_hex] = {
'peer_hash': peer_hash,
'peer_display': peer_display,
'connected_at': connection_time,
'link_id': link_id_hex
}
# Write connection event to stats file
connection_info = {
'event': 'connected',
'peer': peer_display,
'peer_hash': peer_hash,
'timestamp': connection_time,
'datetime': datetime.fromtimestamp(connection_time).isoformat(),
'link_id': link_id_hex
}
self._write_stats_event('connection', connection_info)
self.logger.info(f"New connection established from peer {peer_display}")
# Set callback for when link closes
link.set_link_closed_callback(self._on_link_closed)
def _on_link_closed(self, link):
"""Called when a link is closed"""
with self._stats_lock:
if link.link_id in self.stats['connected_peers']:
peer_info = self.stats['connected_peers'].pop(link.link_id)
self.stats['active_connections'] = max(0, self.stats['active_connections'] - 1)
self.logger.info(f"Connection closed from peer {peer_info['peer_hash'][:16]}...")
"""Handle new link connections."""
def _announce_loop(self):
while not self._stop_event.is_set():
try:
try:
while not self._stop_event.is_set():
if time.time() - self.last_announce > self.announce_interval:
if self.name:
self.destination.announce(app_data=self.name.encode('utf-8'))
self.destination.announce(app_data=self.name.encode("utf-8"))
else:
self.destination.announce()
self.last_announce = time.time()
time.sleep(1)
except Exception:
self.logger.exception("Error in announce loop")
except Exception as e:
RNS.log(f"Error in announce loop: {e}", RNS.LOG_ERROR)
def _refresh_loop(self):
while not self._stop_event.is_set():
try:
try:
while not self._stop_event.is_set():
now = time.time()
if self.page_refresh_interval > 0 and now - self.last_page_refresh > self.page_refresh_interval:
if (
self.page_refresh_interval > 0
and now - self.last_page_refresh > self.page_refresh_interval
):
self.register_pages()
self.last_page_refresh = now
if self.file_refresh_interval > 0 and now - self.last_file_refresh > self.file_refresh_interval:
if (
self.file_refresh_interval > 0
and now - self.last_file_refresh > self.file_refresh_interval
):
self.register_files()
self.last_file_refresh = now
time.sleep(1)
except Exception:
self.logger.exception("Error in refresh loop")
def get_stats(self):
"""Get current statistics"""
with self._stats_lock:
# Calculate uptime
uptime = time.time() - self.stats['start_time']
# Get top requested pages and files
top_pages = sorted(self.stats['page_requests_by_path'].items(), key=lambda x: x[1], reverse=True)[:10]
top_files = sorted(self.stats['file_requests_by_path'].items(), key=lambda x: x[1], reverse=True)[:10]
top_peers = sorted(self.stats['requests_by_peer'].items(), key=lambda x: x[1], reverse=True)[:10]
return {
'uptime_seconds': uptime,
'uptime_formatted': self._format_duration(uptime),
'start_time': datetime.fromtimestamp(self.stats['start_time']).isoformat(),
'total_connections': self.stats['total_connections'],
'active_connections': self.stats['active_connections'],
'total_page_requests': self.stats['total_page_requests'],
'total_file_requests': self.stats['total_file_requests'],
'total_requests': self.stats['total_page_requests'] + self.stats['total_file_requests'],
'top_pages': top_pages,
'top_files': top_files,
'top_peers': [(peer[:16] + "..." if len(peer) > 16 else peer, count) for peer, count in top_peers],
'recent_requests': list(self.stats['recent_requests'])[-10:], # Last 10 requests
'connected_peers': len(self.stats['connected_peers']),
'requests_per_hour': self._calculate_requests_per_hour(),
}
def _format_duration(self, seconds):
"""Format duration in human readable format"""
days = int(seconds // 86400)
hours = int((seconds % 86400) // 3600)
minutes = int((seconds % 3600) // 60)
secs = int(seconds % 60)
if days > 0:
return f"{days}d {hours}h {minutes}m {secs}s"
elif hours > 0:
return f"{hours}h {minutes}m {secs}s"
elif minutes > 0:
return f"{minutes}m {secs}s"
else:
return f"{secs}s"
def _calculate_requests_per_hour(self):
"""Calculate average requests per hour"""
uptime_hours = (time.time() - self.stats['start_time']) / 3600
if uptime_hours < 0.1: # Less than 6 minutes
return 0
total_requests = self.stats['total_page_requests'] + self.stats['total_file_requests']
return round(total_requests / uptime_hours, 2)
def print_stats(self):
"""Print formatted statistics to console"""
stats = self.get_stats()
print("\n" + "="*60)
print("RNS PAGE NODE STATISTICS")
print("="*60)
print(f"Node Name: {self.name or 'Unnamed'}")
print(f"Started: {stats['start_time']}")
print(f"Uptime: {stats['uptime_formatted']}")
print(f"Node Hash: {RNS.hexrep(self.destination.hash, delimit=False)}")
print()
print("CONNECTION STATS:")
print(f" Total Connections: {stats['total_connections']}")
print(f" Active Connections: {stats['active_connections']}")
print()
print("REQUEST STATS:")
print(f" Total Requests: {stats['total_requests']}")
print(f" Page Requests: {stats['total_page_requests']}")
print(f" File Requests: {stats['total_file_requests']}")
print(f" Requests/Hour: {stats['requests_per_hour']}")
print()
if stats['top_pages']:
print("TOP REQUESTED PAGES:")
for path, count in stats['top_pages']:
print(f" {count:4d} - {path}")
print()
if stats['top_files']:
print("TOP REQUESTED FILES:")
for path, count in stats['top_files']:
print(f" {count:4d} - {path}")
print()
if stats['top_peers']:
print("TOP REQUESTING PEERS:")
for peer, count in stats['top_peers']:
print(f" {count:4d} - {peer}")
print()
if stats['recent_requests']:
print("RECENT REQUESTS:")
for req in stats['recent_requests']:
print(f" {req['datetime']} - {req['type'].upper()} {req['path']} from {req['peer'][:16]}...")
print("="*60)
def save_stats_to_file(self, filepath):
"""Save statistics to JSON file"""
try:
stats = self.get_stats()
# Ensure directory exists
dir_path = os.path.dirname(os.path.abspath(filepath))
if dir_path:
os.makedirs(dir_path, exist_ok=True)
# Convert defaultdict and other non-serializable objects to regular dicts
with self._stats_lock:
stats_copy = dict(stats)
stats_copy['page_requests_by_path'] = dict(self.stats['page_requests_by_path'])
stats_copy['file_requests_by_path'] = dict(self.stats['file_requests_by_path'])
stats_copy['requests_by_peer'] = dict(self.stats['requests_by_peer'])
stats_copy['hourly_stats'] = {k: dict(v) for k, v in self.stats['hourly_stats'].items()}
stats_copy['daily_stats'] = {k: dict(v) for k, v in self.stats['daily_stats'].items()}
stats_copy['connected_peers'] = dict(self.stats['connected_peers'])
stats_copy['recent_requests'] = list(self.stats['recent_requests'])
with open(filepath, 'w') as f:
json.dump(stats_copy, f, indent=2, default=str)
self.logger.info(f"Statistics saved to {filepath}")
return True
except Exception as e:
self.logger.error(f"Failed to save statistics to {filepath}: {e}")
import traceback
self.logger.error(f"Traceback: {traceback.format_exc()}")
return False
def reset_stats(self):
"""Reset all statistics"""
with self._stats_lock:
self._init_stats()
self.logger.info("Statistics reset")
RNS.log(f"Error in refresh loop: {e}", RNS.LOG_ERROR)
def shutdown(self):
self.logger.info("Shutting down PageNode...")
"""Gracefully shutdown the PageNode and cleanup resources."""
RNS.log("Shutting down PageNode...", RNS.LOG_INFO)
self._stop_event.set()
try:
self._announce_thread.join(timeout=5)
self._refresh_thread.join(timeout=5)
except Exception:
self.logger.exception("Error waiting for threads to shut down")
except Exception as e:
RNS.log(f"Error waiting for threads to shut down: {e}", RNS.LOG_ERROR)
try:
if hasattr(self.destination, 'close'):
if hasattr(self.destination, "close"):
self.destination.close()
except Exception:
self.logger.exception("Error closing RNS destination")
except Exception as e:
RNS.log(f"Error closing RNS destination: {e}", RNS.LOG_ERROR)
def main():
"""Run the RNS page node application."""
parser = argparse.ArgumentParser(description="Minimal Reticulum Page Node")
parser.add_argument('-c', '--config', dest='configpath', help='Reticulum config path', default=None)
parser.add_argument('-p', '--pages-dir', dest='pages_dir', help='Pages directory', default=os.path.join(os.getcwd(), 'pages'))
parser.add_argument('-f', '--files-dir', dest='files_dir', help='Files directory', default=os.path.join(os.getcwd(), 'files'))
parser.add_argument('-n', '--node-name', dest='node_name', help='Node display name', default=None)
parser.add_argument('-a', '--announce-interval', dest='announce_interval', type=int, help='Announce interval in seconds', default=360)
parser.add_argument('-i', '--identity-dir', dest='identity_dir', help='Directory to store node identity', default=os.path.join(os.getcwd(), 'node-config'))
parser.add_argument('--page-refresh-interval', dest='page_refresh_interval', type=int, default=0, help='Page refresh interval in seconds, 0 disables auto-refresh')
parser.add_argument('--file-refresh-interval', dest='file_refresh_interval', type=int, default=0, help='File refresh interval in seconds, 0 disables auto-refresh')
parser.add_argument('-l', '--log-level', dest='log_level', choices=['DEBUG','INFO','WARNING','ERROR','CRITICAL'], default='INFO', help='Logging level')
parser.add_argument('--stats-interval', dest='stats_interval', type=int, default=0, help='Print stats every N seconds (0 disables)')
parser.add_argument('--save-stats', dest='save_stats', help='Save stats to JSON file on shutdown')
parser.add_argument('--stats-file', dest='stats_file', help='Actively write stats to JSON file (live updates)')
parser.add_argument(
"-c",
"--config",
dest="configpath",
help="Reticulum config path",
default=None,
)
parser.add_argument(
"-p",
"--pages-dir",
dest="pages_dir",
help="Pages directory",
default=str(Path.cwd() / "pages"),
)
parser.add_argument(
"-f",
"--files-dir",
dest="files_dir",
help="Files directory",
default=str(Path.cwd() / "files"),
)
parser.add_argument(
"-n",
"--node-name",
dest="node_name",
help="Node display name",
default=None,
)
parser.add_argument(
"-a",
"--announce-interval",
dest="announce_interval",
type=int,
help="Announce interval in seconds",
default=360,
)
parser.add_argument(
"-i",
"--identity-dir",
dest="identity_dir",
help="Directory to store node identity",
default=str(Path.cwd() / "node-config"),
)
parser.add_argument(
"--page-refresh-interval",
dest="page_refresh_interval",
type=int,
default=0,
help="Page refresh interval in seconds, 0 disables auto-refresh",
)
parser.add_argument(
"--file-refresh-interval",
dest="file_refresh_interval",
type=int,
default=0,
help="File refresh interval in seconds, 0 disables auto-refresh",
)
parser.add_argument(
"-l",
"--log-level",
dest="log_level",
choices=["DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"],
default="INFO",
help="Logging level",
)
args = parser.parse_args()
configpath = args.configpath
@@ -562,64 +389,38 @@ def main():
identity_dir = args.identity_dir
page_refresh_interval = args.page_refresh_interval
file_refresh_interval = args.file_refresh_interval
stats_interval = args.stats_interval
save_stats_file = args.save_stats
stats_file = args.stats_file
numeric_level = getattr(logging, args.log_level.upper(), logging.INFO)
logging.basicConfig(level=numeric_level, format='%(asctime)s %(name)s [%(levelname)s] %(message)s')
RNS.Reticulum(configpath)
os.makedirs(identity_dir, exist_ok=True)
identity_file = os.path.join(identity_dir, 'identity')
if os.path.isfile(identity_file):
identity = RNS.Identity.from_file(identity_file)
Path(identity_dir).mkdir(parents=True, exist_ok=True)
identity_file = Path(identity_dir) / "identity"
if identity_file.is_file():
identity = RNS.Identity.from_file(str(identity_file))
else:
identity = RNS.Identity()
identity.to_file(identity_file)
identity.to_file(str(identity_file))
os.makedirs(pages_dir, exist_ok=True)
os.makedirs(files_dir, exist_ok=True)
Path(pages_dir).mkdir(parents=True, exist_ok=True)
Path(files_dir).mkdir(parents=True, exist_ok=True)
node = PageNode(identity, pages_dir, files_dir, announce_interval, node_name, page_refresh_interval, file_refresh_interval, stats_file)
logger.info("Page node running. Press Ctrl-C to exit.")
if stats_interval > 0:
logger.info(f"Stats will be printed every {stats_interval} seconds")
node = PageNode(
identity,
pages_dir,
files_dir,
announce_interval,
node_name,
page_refresh_interval,
file_refresh_interval,
)
RNS.log("Page node running. Press Ctrl-C to exit.", RNS.LOG_INFO)
RNS.log(f"Node address: {RNS.prettyhexrep(node.destination.hash)}", RNS.LOG_INFO)
last_stats_time = 0
try:
while True:
current_time = time.time()
# Print stats if interval is set and enough time has passed
if stats_interval > 0 and current_time - last_stats_time >= stats_interval:
node.print_stats()
last_stats_time = current_time
time.sleep(1)
except KeyboardInterrupt:
logger.info("Keyboard interrupt received, shutting down...")
# Print final stats
node.print_stats()
# Save stats if requested
if save_stats_file:
logger.info(f"Saving final statistics to {save_stats_file}")
if node.save_stats_to_file(save_stats_file):
logger.info(f"Statistics successfully saved to {save_stats_file}")
else:
logger.error(f"Failed to save statistics to {save_stats_file}")
RNS.log("Keyboard interrupt received, shutting down...", RNS.LOG_INFO)
node.shutdown()
finally:
# Ensure stats are saved even if something goes wrong
if save_stats_file and 'node' in locals():
try:
node.save_stats_to_file(save_stats_file)
logger.info(f"Final attempt: Statistics saved to {save_stats_file}")
except Exception as e:
logger.error(f"Final save attempt failed: {e}")
if __name__ == '__main__':
if __name__ == "__main__":
main()

setup.py
View File

@@ -1,31 +1,31 @@
from setuptools import setup, find_packages
from setuptools import find_packages, setup
with open('README.md', 'r', encoding='utf-8') as fh:
with open("README.md", encoding="utf-8") as fh:
long_description = fh.read()
setup(
name='rns-page-node',
version='0.2.0',
author='Sudo-Ivan',
author_email='',
description='A simple way to serve pages and files over the Reticulum network.',
name="rns-page-node",
version="1.2.0",
author="Sudo-Ivan",
author_email="",
description="A simple way to serve pages and files over the Reticulum network.",
long_description=long_description,
long_description_content_type='text/markdown',
url='https://github.com/Sudo-Ivan/rns-page-node',
long_description_content_type="text/markdown",
url="https://github.com/Sudo-Ivan/rns-page-node",
packages=find_packages(),
license="GPL-3.0",
python_requires='>=3.10',
python_requires=">=3.9",
install_requires=[
'rns>=1.0.0,<1.5.0',
"rns>=1.0.1,<1.5.0",
],
entry_points={
'console_scripts': [
'rns-page-node=rns_page_node.main:main',
"console_scripts": [
"rns-page-node=rns_page_node.main:main",
],
},
classifiers=[
'Programming Language :: Python :: 3',
'License :: OSI Approved :: GNU General Public License v3 (GPLv3)',
'Operating System :: OS Independent',
"Programming Language :: Python :: 3",
"License :: OSI Approved :: GNU General Public License v3 (GPLv3)",
"Operating System :: OS Independent",
],
)

28
tests/run_tests.sh Normal file → Executable file
View File

@@ -9,11 +9,33 @@ rm -rf config node-config pages files node.log
mkdir -p config node-config pages files
# Create a sample page and a test file
cat > pages/index.mu << EOF
>Test Page
This is a test page.
cat > pages/index.mu << 'EOF'
#!/usr/bin/env python3
import os
print("`F0f0`_`Test Page`_")
print("This is a test page with environment variable support.")
print()
print("`F0f0`_`Environment Variables`_")
params = []
for key, value in os.environ.items():
if key.startswith(('field_', 'var_')):
params.append(f"- `Faaa`{key}`f: `F0f0`{value}`f")
if params:
print("\n".join(params))
else:
print("- No parameters received")
print()
print("`F0f0`_`Remote Identity`_")
remote_id = os.environ.get('remote_identity', '33aff86b736acd47dca07e84630fd192') # Mock for testing
print(f"`Faaa`{remote_id}`f")
EOF
chmod +x pages/index.mu
cat > files/text.txt << EOF
This is a test file.
EOF

View File

@@ -1,29 +1,26 @@
#!/usr/bin/env python3
import os
import sys
import time
import threading
import time
import RNS
# Determine base directory for tests
dir_path = os.path.abspath(os.path.dirname(__file__))
config_dir = os.path.join(dir_path, 'config')
identity_dir = os.path.join(dir_path, 'node-config')
config_dir = os.path.join(dir_path, "config")
identity_dir = os.path.join(dir_path, "node-config")
# Initialize Reticulum with shared config
RNS.Reticulum(config_dir)
# Load server identity (created by the page node)
identity_file = os.path.join(identity_dir, 'identity')
identity_file = os.path.join(identity_dir, "identity")
server_identity = RNS.Identity.from_file(identity_file)
# Create a destination to the server node
destination = RNS.Destination(
server_identity,
RNS.Destination.OUT,
RNS.Destination.SINGLE,
'nomadnetwork',
'node'
server_identity, RNS.Destination.OUT, RNS.Destination.SINGLE, "nomadnetwork", "node",
)
# Ensure we know a path to the destination
@@ -39,66 +36,165 @@ global_link = RNS.Link(destination)
 responses = {}
 done_event = threading.Event()
+# Test data for environment variables
+test_data_dict = {
+    'var_field_test': 'dictionary_value',
+    'var_field_message': 'hello_world',
+    'var_action': 'test_action'
+}
+test_data_bytes = b'field_bytes_test=bytes_value|field_bytes_message=test_bytes|action=bytes_action'
 # Callback for page response
 def on_page(response):
     data = response.response
     if isinstance(data, bytes):
-        text = data.decode('utf-8')
+        text = data.decode("utf-8")
     else:
         text = str(data)
-    print('Received page:')
+    print("Received page (no data):")
     print(text)
-    responses['page'] = text
-    if 'file' in responses:
+    responses["page"] = text
+    check_responses()
+# Callback for page response with dictionary data
+def on_page_dict(response):
+    data = response.response
+    if isinstance(data, bytes):
+        text = data.decode("utf-8")
+    else:
+        text = str(data)
+    print("Received page (dict data):")
+    print(text)
+    responses["page_dict"] = text
+    check_responses()
+# Callback for page response with bytes data
+def on_page_bytes(response):
+    data = response.response
+    if isinstance(data, bytes):
+        text = data.decode("utf-8")
+    else:
+        text = str(data)
+    print("Received page (bytes data):")
+    print(text)
+    responses["page_bytes"] = text
+    check_responses()
+def check_responses():
+    if "page" in responses and "page_dict" in responses and "page_bytes" in responses and "file" in responses:
         done_event.set()
 # Callback for file response
 def on_file(response):
     data = response.response
     # Handle response as [fileobj, headers]
-    if isinstance(data, list) and len(data) == 2 and hasattr(data[0], 'read'):
+    if isinstance(data, list) and len(data) == 2 and hasattr(data[0], "read"):
         fileobj, headers = data
         file_data = fileobj.read()
-        filename = headers.get(b'name', b'').decode('utf-8')
-        print(f'Received file ({filename}):')
-        print(file_data.decode('utf-8'))
-        responses['file'] = file_data.decode('utf-8')
+        filename = headers.get(b"name", b"").decode("utf-8")
+        print(f"Received file ({filename}):")
+        print(file_data.decode("utf-8"))
+        responses["file"] = file_data.decode("utf-8")
     # Handle response as a raw file object
-    elif hasattr(data, 'read'):
+    elif hasattr(data, "read"):
         file_data = data.read()
-        filename = os.path.basename('text.txt')
-        print(f'Received file ({filename}):')
-        print(file_data.decode('utf-8'))
-        responses['file'] = file_data.decode('utf-8')
+        filename = os.path.basename("text.txt")
+        print(f"Received file ({filename}):")
+        print(file_data.decode("utf-8"))
+        responses["file"] = file_data.decode("utf-8")
     # Handle response as raw bytes
     elif isinstance(data, bytes):
-        text = data.decode('utf-8')
-        print('Received file:')
+        text = data.decode("utf-8")
+        print("Received file:")
         print(text)
-        responses['file'] = text
+        responses["file"] = text
     else:
-        print('Received file (unhandled format):', data)
-        responses['file'] = str(data)
-    if 'page' in responses:
-        done_event.set()
+        print("Received file (unhandled format):", data)
+        responses["file"] = str(data)
+    check_responses()
-# Request the page and file once the link is established
+# Request the pages and file once the link is established
 def on_link_established(link):
-    link.request('/page/index.mu', None, response_callback=on_page)
-    link.request('/file/text.txt', None, response_callback=on_file)
+    # Test page without data
+    link.request("/page/index.mu", None, response_callback=on_page)
+    # Test page with dictionary data (simulates MeshChat)
+    link.request("/page/index.mu", test_data_dict, response_callback=on_page_dict)
+    # Test page with bytes data (URL-encoded style)
+    link.request("/page/index.mu", test_data_bytes, response_callback=on_page_bytes)
+    # Test file serving
+    link.request("/file/text.txt", None, response_callback=on_file)
 # Register callbacks
 global_link.set_link_established_callback(on_link_established)
-global_link.set_link_closed_callback(lambda l: done_event.set())
+global_link.set_link_closed_callback(lambda link: done_event.set())
 # Wait for responses or timeout
 if not done_event.wait(timeout=30):
-    print('Test timed out.', file=sys.stderr)
+    print("Test timed out.", file=sys.stderr)
     sys.exit(1)
-if responses.get('page') and responses.get('file'):
-    print('Tests passed!')
+# Validate test results
+def validate_test_results():
+    """Validate that all responses contain expected content"""
+    # Check basic page response (no data)
+    if "page" not in responses:
+        print("ERROR: No basic page response received", file=sys.stderr)
+        return False
+    page_content = responses["page"]
+    if "No parameters received" not in page_content:
+        print("ERROR: Basic page should show 'No parameters received'", file=sys.stderr)
+        return False
+    if "33aff86b736acd47dca07e84630fd192" not in page_content:
+        print("ERROR: Basic page should show mock remote identity", file=sys.stderr)
+        return False
+    # Check page with dictionary data
+    if "page_dict" not in responses:
+        print("ERROR: No dictionary data page response received", file=sys.stderr)
+        return False
+    dict_content = responses["page_dict"]
+    if "var_field_test" not in dict_content or "dictionary_value" not in dict_content:
+        print("ERROR: Dictionary data page should contain processed environment variables", file=sys.stderr)
+        return False
+    if "33aff86b736acd47dca07e84630fd192" not in dict_content:
+        print("ERROR: Dictionary data page should show mock remote identity", file=sys.stderr)
+        return False
+    # Check page with bytes data
+    if "page_bytes" not in responses:
+        print("ERROR: No bytes data page response received", file=sys.stderr)
+        return False
+    bytes_content = responses["page_bytes"]
+    if "field_bytes_test" not in bytes_content or "bytes_value" not in bytes_content:
+        print("ERROR: Bytes data page should contain processed environment variables", file=sys.stderr)
+        return False
+    if "33aff86b736acd47dca07e84630fd192" not in bytes_content:
+        print("ERROR: Bytes data page should show mock remote identity", file=sys.stderr)
+        return False
+    # Check file response
+    if "file" not in responses:
+        print("ERROR: No file response received", file=sys.stderr)
+        return False
+    file_content = responses["file"]
+    if "This is a test file" not in file_content:
+        print("ERROR: File content doesn't match expected content", file=sys.stderr)
+        return False
+    return True
+if validate_test_results():
+    print("All tests passed! Environment variable processing works correctly.")
+    sys.exit(0)
 else:
-    print('Tests failed.', file=sys.stderr)
+    print("Tests failed.", file=sys.stderr)
     sys.exit(1)
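The dictionary and bytes payloads exercised above are the two request-data shapes the node is expected to handle. An illustrative sketch (not the actual PageNode code) of normalizing both into the environment variables the test page prints:

# Sketch: normalize request data to a flat str -> str mapping.
def request_data_to_env(data):
    if isinstance(data, dict):
        return {str(k): str(v) for k, v in data.items()}
    if isinstance(data, bytes):
        # URL-encoded style: b"a=1|b=2" -> {"a": "1", "b": "2"}
        pairs = (item.split("=", 1) for item in data.decode("utf-8").split("|") if "=" in item)
        return dict(pairs)
    return {}

assert request_data_to_env({"var_action": "test_action"}) == {"var_action": "test_action"}
assert request_data_to_env(b"field_a=1|field_b=2") == {"field_a": "1", "field_b": "2"}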

@@ -1,20 +1,26 @@
 #!/usr/bin/env python3
 import os
 import sys
-import time
 import threading
+import time
 import RNS
 dir_path = os.path.abspath(os.path.dirname(__file__))
-config_dir = os.path.join(dir_path, 'config')
+config_dir = os.path.join(dir_path, "config")
 RNS.Reticulum(config_dir)
-DESTINATION_HEX = '49b2d959db8528347d0a38083aec1042'  # Ivan's node that runs rns-page-node
+DESTINATION_HEX = (
    "49b2d959db8528347d0a38083aec1042"  # Ivan's node that runs rns-page-node
+)
 dest_len = (RNS.Reticulum.TRUNCATED_HASHLENGTH // 8) * 2
 if len(DESTINATION_HEX) != dest_len:
-    print(f"Invalid destination length (got {len(DESTINATION_HEX)}, expected {dest_len})", file=sys.stderr)
+    print(
+        f"Invalid destination length (got {len(DESTINATION_HEX)}, expected {dest_len})",
+        file=sys.stderr,
+    )
     sys.exit(1)
 destination_hash = bytes.fromhex(DESTINATION_HEX)
@@ -28,32 +34,32 @@ server_identity = RNS.Identity.recall(destination_hash)
 print(f"Recalled server identity for {DESTINATION_HEX}")
 destination = RNS.Destination(
-    server_identity,
-    RNS.Destination.OUT,
-    RNS.Destination.SINGLE,
-    'nomadnetwork',
-    'node'
+    server_identity, RNS.Destination.OUT, RNS.Destination.SINGLE, "nomadnetwork", "node",
 )
 link = RNS.Link(destination)
 done_event = threading.Event()
 def on_page(response):
     data = response.response
     if isinstance(data, bytes):
-        text = data.decode('utf-8')
+        text = data.decode("utf-8")
     else:
         text = str(data)
-    print('Fetched page content:')
+    print("Fetched page content:")
     print(text)
     done_event.set()
-link.set_link_established_callback(lambda l: l.request('/page/index.mu', None, response_callback=on_page))
-link.set_link_closed_callback(lambda l: done_event.set())
+link.set_link_established_callback(
+    lambda link: link.request("/page/index.mu", None, response_callback=on_page),
+)
+link.set_link_closed_callback(lambda link: done_event.set())
 if not done_event.wait(timeout=30):
-    print('Timed out waiting for page', file=sys.stderr)
+    print("Timed out waiting for page", file=sys.stderr)
     sys.exit(1)
-print('Done fetching page.')
+print("Done fetching page.")
 sys.exit(0)
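Between the two hunks above, the script presumably waits until Reticulum knows a path to the destination before recalling the identity. A minimal sketch of that step using the public RNS.Transport API (the elided code may differ in detail):

# Sketch: block until a path to the destination is known, or give up.
import time
import RNS

def wait_for_path(destination_hash, timeout=15.0):
    if not RNS.Transport.has_path(destination_hash):
        RNS.Transport.request_path(destination_hash)
    deadline = time.time() + timeout
    while time.time() < deadline:
        if RNS.Transport.has_path(destination_hash):
            return True
        time.sleep(0.1)
    return False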