365 Commits

SHA1 Message Date
6d86f6bc67 Remove entries from CommunityInterfacesManager initialization 2026-01-10 20:15:18 -06:00
6da36e2fb2 Fix version determination logic in build workflow 2026-01-10 19:01:43 -06:00
0b9faceb46 Update dependencies 2026-01-10 18:57:19 -06:00
7fb7543efe Update Taskfile.yml 2026-01-10 18:56:54 -06:00
004639e852 Update localization files for German, English, Italian, and Russian: modify restart instructions to recommend manual restarts and add a new "finish_setup" key for completion prompts. 2026-01-10 18:56:36 -06:00
0d05d348ef Update changelog retrieval logic in ReticulumMeshChat: check for CHANGELOG.md in public folder if not found in the default location, and adjust headless mode default based on application state. 2026-01-10 18:56:31 -06:00
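A minimal sketch of the fallback lookup this commit describes, assuming a repository-root default and a packaged public folder as the two candidate locations; the helper name and paths are illustrative, not the project's actual code:

```python
from pathlib import Path


def find_changelog(base_dir: Path) -> Path | None:
    """Return the first CHANGELOG.md found: the default location
    first, then the public-folder fallback the commit describes."""
    candidates = [
        base_dir / "CHANGELOG.md",             # assumed default location
        base_dir / "public" / "CHANGELOG.md",  # assumed packaged fallback
    ]
    for candidate in candidates:
        if candidate.is_file():
            return candidate
    return None
```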
7f5254b1b7 Refactor tutorial completion logic in TutorialModal.vue: rename finishAndRestart to finishTutorial, update button text, and streamline functionality. 2026-01-10 18:56:23 -06:00
7eed90249f Add CHANGELOG.md to include files in build process if it exists 2026-01-10 18:56:14 -06:00
eff722ee18 Update backend process spawning in Electron by adding error handling for failed process initiation and allowing the Python command in the build script to be overridden for cross-platform compatibility. 2026-01-10 18:35:43 -06:00
8489f4531f Add Wine support for building Windows executables and all Electron apps in Taskfile.yml 2026-01-10 18:35:15 -06:00
eac300c4fc Add cross-platform building instructions for Linux to Windows using Wine in README.md 2026-01-10 18:35:10 -06:00
f2bb8f8b23 Update README [skip-ci] 2026-01-10 18:16:04 -06:00
0297c33a26 Update Trivy download links in Gitea workflows to point to the correct repository 2026-01-10 18:11:00 -06:00
5628a8c137 Remove artifact upload step and ZIP build process from Gitea workflows 2026-01-10 18:04:10 -06:00
ba2f5b84d4 Update build workflow to set release draft status to true 2026-01-10 18:01:47 -06:00
3de559a976 Resolve conflict in README.md by using version from massive-changes 2026-01-10 17:57:45 -06:00
a914d68e49 locales: add connection method, serial, WiFi, and IP address fields in German, English, Italian, and Russian translations; include error message for failed OTA flashing 2026-01-10 17:54:33 -06:00
e83c36c664 Update .gitignore to exclude private scripts directory 2026-01-10 17:54:24 -06:00
e949ccf10a format 2026-01-10 17:53:31 -06:00
c209b84a96 rnode_flasher: add WiFi connection method and OTA firmware upload functionality (testing required) 2026-01-10 17:53:17 -06:00
68202620cf Update SECURITY 2026-01-10 17:51:56 -06:00
44a560c39f Update README 2026-01-10 17:51:51 -06:00
b009757253 fix windows builds 2026-01-10 17:51:30 -06:00
9a93bb35b3 Update README.md 2026-01-10 22:23:06 +00:00
7a419f96ee feat(tests): add unit tests for auto propagation API and logic 2026-01-08 19:29:30 -06:00
b8ef3d188d feat(tutorial): update tutorial modal with propagation mode step and auto-select option for preferred propagation node 2026-01-08 19:29:22 -06:00
fb790a4c08 feat(identity_context): integrate AutoPropagationManager for background node selection 2026-01-08 19:29:10 -06:00
e7beabba11 feat(config_manager): add auto-select configuration for preferred propagation node 2026-01-08 19:28:55 -06:00
20639fef0c feat(auto_propagation): implement AutoPropagationManager for dynamic propagation node selection 2026-01-08 19:28:47 -06:00
825ddd17fe feat(meshchat): add auto-select option for preferred propagation node 2026-01-08 19:27:23 -06:00
5d70e2c00f feat(locales): update 2026-01-08 19:27:09 -06:00
682ff4ddb7 feat(docs): cleanup 2026-01-08 16:41:10 -06:00
b3c6fd5e16 refactor(telephone_manager): format 2026-01-08 12:53:20 -06:00
566acf228d feat(IdentitiesPage): optimize rendering performance with memoization
- Added v-memo directive to improve rendering efficiency of identity components by caching their properties.
- Updated test to allow for a longer render time threshold, ensuring performance remains acceptable after changes.
2026-01-08 12:43:17 -06:00
2652f1dd87 chore(Dockerfile): upgrade pip to version 25.3 due to vuln in older version. 2026-01-08 12:39:31 -06:00
115b01ee65 chore(dependencies): update rns package to version 1.1.2 and specify lxst version 2026-01-08 12:14:26 -06:00
6498956903 chore(Dockerfile): downgrade Python image to 3.12.12 and add espeak-ng package 2026-01-08 12:14:16 -06:00
6860530217 feat(locales): add Reticulum documentation clearing functionality in multiple languages
- Introduced new localization strings for clearing Reticulum documentation in German, English, Italian, and Russian.
- Added success messages for the documentation clearing action across all supported languages.
- Included a new status message for establishing a connection in the user interface.
2026-01-07 19:52:08 -06:00
eef9872b71 feat(tests): add Italian localization tests
- Included Italian locale support in the i18n localization tests.
- Updated test suite to validate the new Italian translations alongside existing German and Russian locales.
2026-01-07 19:51:59 -06:00
1e5564cfa3 refactor(tests): formatting 2026-01-07 19:51:48 -06:00
e02e17d712 refactor(call_page): formatting 2026-01-07 19:51:35 -06:00
069865d444 refactor(call_page): streamline call status message rendering for improved readability 2026-01-07 19:46:10 -06:00
192ac21fb0 feat(docs): add API endpoints for deleting documentation versions and clearing Reticulum docs
- Implemented DELETE endpoints to allow users to delete specific documentation versions and clear all Reticulum documentation.
- Enhanced the DocsManager class with methods for version deletion and clearing documentation, including error handling and logging.
- Updated frontend components to support version deletion and clearing of Reticulum docs with user confirmation dialogs.
2026-01-07 19:46:01 -06:00
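A minimal sketch of DELETE endpoints along these lines, assuming an aiohttp backend (aiohttp appears in this log's dependency updates); the route paths and DocsManager method names are hypothetical stand-ins:

```python
from aiohttp import web


class DocsManager:
    """Stand-in for the project's DocsManager; method names are assumptions."""

    def delete_version(self, version: str) -> None:
        ...  # remove the stored documentation for one version

    def clear_reticulum_docs(self) -> None:
        ...  # remove all downloaded Reticulum documentation


docs = DocsManager()


async def delete_docs_version(request: web.Request) -> web.Response:
    docs.delete_version(request.match_info["version"])
    return web.json_response({"message": "version deleted"})


async def clear_reticulum_docs(request: web.Request) -> web.Response:
    docs.clear_reticulum_docs()
    return web.json_response({"message": "reticulum docs cleared"})


app = web.Application()
app.add_routes([
    web.delete("/api/v1/docs/reticulum", clear_reticulum_docs),  # literal route first
    web.delete("/api/v1/docs/{version}", delete_docs_version),
])
```

Registering the literal /reticulum route before the {version} route matters here: aiohttp matches resources in registration order, so the literal path would otherwise be captured as a version.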
f717d501d3 refactor(telephone_manager): improve call status updates for better user feedback 2026-01-07 19:45:47 -06:00
80ea5424fd fix(frontend): update call status messages for clarity 2026-01-07 19:45:31 -06:00
8bc7e39aee refactor(tests): clean up telemetry integration test by removing unused imports 2026-01-07 19:30:06 -06:00
75b17b44a1 update changelog [skip ci] 2026-01-07 19:29:07 -06:00
e2586e9052 feat(tests): add comprehensive telemetry and interface tests
- Introduced new test files for telemetry functionality, including integration, fuzzing, and extended tests to ensure robustness and performance.
- Added tests for parsing LXMF display names and telemetry data, addressing potential bugs and ensuring correct handling of various input formats.
- Implemented performance tests for the InterfacesPage component, validating rendering efficiency with a large number of discovered interfaces.
- Enhanced existing tests for markdown rendering and link utilities to cover additional edge cases and improve stability.
2026-01-07 19:22:00 -06:00
ecfd124f8f chore(workflows): remove push and pull_request triggers from bench.yml 2026-01-07 19:22:00 -06:00
d8af5509b9 feat(locales): update German and Russian translations, add Italian localization
- Enhanced German and Russian JSON files with new entries for telemetry, location management, and content security policy settings.
- Added Italian localization file with comprehensive translations for the application, covering various features and settings.
- Improved user experience by ensuring consistent terminology across all supported languages.
2026-01-07 19:22:00 -06:00
55f718c72b feat(frontend): enhance link rendering and markdown processing
- Added LinkUtils for detecting and rendering NomadNet and standard links in text.
- Introduced MarkdownRenderer for converting Markdown to HTML, including support for code blocks, headers, and inline formatting.
- Implemented escapeHtml utility function to prevent XSS in rendered text.
- Updated ToastUtils to support an optional key parameter for toast notifications.
- Included Italian language support in the frontend localization.
2026-01-07 19:22:00 -06:00
37d4b317b9 feat(ui): enhance user interface and functionality across multiple components
- Updated sidebar width in App.vue for better layout.
- Added navigation option for RNPath trace in CommandPalette.vue.
- Included Italian language support in LanguageSelector.vue.
- Improved Toast.vue to handle loading state for toasts and update existing toasts.
- Enhanced AboutPage.vue with download buttons for snapshots and backups.
- Refined InterfacesPage.vue to improve layout and filtering capabilities.
- Introduced MiniChat.vue for a compact chat interface on the map.
- Updated ConversationDropDownMenu.vue to include telemetry trust toggle.
- Enhanced ConversationViewer.vue with better telemetry handling and error notifications.
- Added RNPathTracePage.vue for tracing paths to destination hashes.
- Improved ToolsPage.vue to include RNPath trace functionality.
2026-01-07 19:22:00 -06:00
df306cc67b feat(telemetry): implement telemetry tracking and path tracing features
- Added telemetry tracking capabilities, allowing users to toggle tracking for specific peers and retrieve tracked peers.
- Introduced RNPathTraceHandler for tracing paths to destination hashes.
- Enhanced database schema to support telemetry tracking and added related fields in contacts.
- Updated configuration management to include telemetry settings.
- Implemented API endpoints for downloading database backups and snapshots, as well as for telemetry-related functionalities.
- Improved error handling and response messages for telemetry requests and path tracing.
2026-01-07 19:22:00 -06:00
ce568c2965 chore: add fuzz testing script to package.json 2026-01-07 19:21:59 -06:00
b683809713 chore: add backend-manifest.json to .gitignore 2026-01-07 19:21:59 -06:00
7304b373f6 chore: update TODO list by removing rootless docker images and adding sideband plugins support 2026-01-07 19:21:59 -06:00
19a2dc8403 update meshchatx docs 2026-01-07 19:21:59 -06:00
b56c004340 chore: remove backend-manifest.json file as part of project cleanup 2026-01-07 19:21:59 -06:00
c40ba80f8f chore(deps): update dependencies in poetry.lock, pyproject.toml, and requirements.txt to latest versions including hypothesis, lxmf, rns, and urllib3 2026-01-07 19:21:59 -06:00
326e80027e feat(ui): add handling for lxm.ingest_uri.result messages with success, error, warning, and info notifications 2026-01-07 19:21:59 -06:00
f0ab00e9cc style(audio): format 2026-01-07 19:21:59 -06:00
4b686e12c3 fix(ui): handle pop-up blocking in PaperMessageModal and improve QR code message sending logic in PaperMessagePage 2026-01-07 19:21:58 -06:00
19e94cfb6f feat(ui): add functionality to retry all failed or cancelled messages in ConversationViewer component 2026-01-07 19:21:58 -06:00
ffd405808d chore(deps): update pnpm version to 10.27.0 in package.json and workflow files 2026-01-07 19:21:58 -06:00
3b142e9dba feat(docs): add selectedReticulumPath state and update localDocsUrl logic for reticulum documentation handling 2026-01-07 19:21:58 -06:00
969da9a579 feat(audio): better ToneGenerator with high-quality audio processing, including dynamics compression, lowpass filtering, and improved stereo separation 2026-01-07 19:21:58 -06:00
f0e567fe8a feat(localization): add new German and Russian translations for backup settings, message font size, and audio input permissions 2026-01-07 19:21:58 -06:00
edc1e16e03 test(ui): enhance SettingsPage component tests to verify config and display name input 2026-01-07 19:21:58 -06:00
83f480be3c fix(ui): improve message font and icon size handling with error checking for invalid inputs 2026-01-07 19:21:57 -06:00
b46d018b26 feat(ui): update user interface with improved icon handling, audio permission checks, and dynamic message icon size adjustments 2026-01-07 19:21:57 -06:00
44608ffb36 feat(maintenance): add test for deleting all user icons in maintenance module 2026-01-07 19:21:57 -06:00
7aa6f5a467 feat(localization): add LXMF icon clearing options and enhance existing translations in German, English, and Russian 2026-01-07 19:21:57 -06:00
8bb38d3e51 feat(maintenance): add API endpoint to clear LXMF icons and enhance backup configuration options 2026-01-07 19:21:57 -06:00
2bef49de81 Merge branch 'master' into massive-changes 2026-01-08 01:11:25 +00:00
7d7cd7d487 feat(ui): enhance user experience with new features including QR code display, improved toast messages, and localized strings for various components 2026-01-05 19:22:25 -06:00
33cbe07750 chore(docker): remove espeak-ng 2026-01-05 17:39:06 -06:00
666c90875a interface discovery, folders for messages, map nodes from discovery, maintenance tools. 2026-01-05 17:38:52 -06:00
30cab64101 chore(docker): add git to build dependencies in Dockerfile 2026-01-05 12:01:59 -06:00
144cc53cd9 chore(security): lint 2026-01-05 11:48:12 -06:00
a9e881a095 chore(dependencies): update hypothesis to version 6.149.0 and add lxmfy package from git repository 2026-01-05 11:48:00 -06:00
5b25e3b4c0 feat(changelog): add new features including Web Audio Bridge, LXMFy for bots, and RNS Discoverable Interfaces 2026-01-05 11:47:52 -06:00
e7728696c8 chore(vite): add asset cleanup before build to prevent accumulation of old files 2026-01-05 11:47:46 -06:00
fda9187e95 numerous improvements 2026-01-05 11:47:35 -06:00
e8d8d64fc0 Update README.md 2026-01-05 16:29:02 +00:00
5694c1ee67 fix(router): set default propagation cost to 0 in create_lxmf_router function 2026-01-04 23:46:37 -06:00
c2652f72f5 fix(call): update call history management to reflect no more available history 2026-01-04 23:41:14 -06:00
f2bbff5c3d feat(interface): extend interface type handling to include 'RNodeIPInterface' and set default values for new interface properties 2026-01-04 23:39:57 -06:00
5a388d80ed feat(logging): implement dynamic log directory resolution and add rotating file handler for improved logging management 2026-01-04 23:39:50 -06:00
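A minimal sketch of that pattern: resolve a writable log directory, then attach a size-based rotating handler. The environment variable, default path, and rotation limits below are assumptions, not the project's values:

```python
import logging
import os
from logging.handlers import RotatingFileHandler
from pathlib import Path


def resolve_log_dir() -> Path:
    """Resolve a writable log directory dynamically: an env override
    (MESHCHAT_LOG_DIR, hypothetical) first, else a per-user default."""
    log_dir = Path(os.environ.get("MESHCHAT_LOG_DIR",
                                  Path.home() / ".meshchat" / "logs"))
    log_dir.mkdir(parents=True, exist_ok=True)
    return log_dir


handler = RotatingFileHandler(
    resolve_log_dir() / "meshchat.log",
    maxBytes=5 * 1024 * 1024,  # rotate after ~5 MB (illustrative)
    backupCount=3,             # keep three rotated files (illustrative)
)
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s"))
logging.getLogger().addHandler(handler)
```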
176642db75 fix(call): simplify error handling in audio processing by removing unused catch parameters 2026-01-04 23:21:26 -06:00
5d13b3e3f9 chore(manifest): update MANIFEST.in to include backend files and exclude frontend and cache directories 2026-01-04 23:19:59 -06:00
5f2aca4390 feat(tests): add unit tests for WebAudioSource and WebAudioSink functionality 2026-01-04 23:19:44 -06:00
f90fe55372 feat(styles): add spin-reverse animation and corresponding class for enhanced loading effects 2026-01-04 23:19:38 -06:00
1e98d8e859 format 2026-01-04 23:19:31 -06:00
0492e7d7bf refactor(components): update loading animations to use 'animate-spin-reverse' class for consistency 2026-01-04 23:19:17 -06:00
6a23727e55 feat(audio): add Web Audio configuration and device selection to CallPage component 2026-01-04 23:19:09 -06:00
9d8611bb97 feat(audio): implement WebAudioBridge for websocket audio transport and add configuration options 2026-01-04 23:18:55 -06:00
52e5a60724 refactor(docker): streamline multi-stage build process and optimize dependencies 2026-01-04 23:02:48 -06:00
194f467298 chore(task): remove 'out' directory cleanup from Taskfile.yml 2026-01-04 23:02:40 -06:00
a05fdee7e9 feat(interface): add RNodeIPInterface support 2026-01-04 22:39:37 -06:00
c9c2125e6f Merge branch 'master' into massive-changes 2026-01-05 04:26:54 +00:00
8b3d3c3e66 fix(electron): remove asarUnpack and update resource paths for extraResources in package.json 2026-01-04 22:26:29 -06:00
f2a93cbc98 fix(electron): update resource paths in main.js to reflect changes in packaging structure for extra resources 2026-01-04 22:26:07 -06:00
5100428b68 feat(config): update asar configuration to enable packing and include extra resources for build 2026-01-04 22:26:01 -06:00
0a40790338 feat(security): add security.md 2026-01-04 22:25:55 -06:00
4974ae0926 Update README.md 2026-01-05 04:15:18 +00:00
900da98ecb Update README.md 2026-01-05 03:06:35 +00:00
db89e2c86e fix(tests): fix test 2026-01-04 19:14:53 -06:00
629bbbc7c6 refactor(meshchat): update map to render online tiles faster and update message handling by adding context support to forwarding and delivery methods; improve LXMF message processing and router initialization 2026-01-04 19:10:22 -06:00
ff69de1346 feat(config): add gitea base URL and documentation download URLs configuration; update related components and logic for dynamic URL handling 2026-01-04 18:55:10 -06:00
2f65bde2d3 feat(call): update tone generator functionality with volume control and enable/disable settings; update frontend components to reflect new configurations 2026-01-04 17:48:07 -06:00
0a65619efb feat(call): update telephony handling by adding remote telephony hash retrieval and updating frontend components to utilize new data 2026-01-04 17:20:41 -06:00
d836e7a2e8 feat(call): improve call handling by adding remote destination hash tracking, improving initiation status checks, and refining ringtone management in the frontend 2026-01-04 17:16:23 -06:00
5ef41b84d5 feat(call): update call statistics tracking and improve hangup functionality; add tone generation for call states in frontend 2026-01-04 17:01:21 -06:00
ad928d1279 feat(voicemail): add silence file generation for missing recordings and improve logging for voicemail saving 2026-01-04 16:02:32 -06:00
c4674992e0 lots of fixes, changes, styling, fixing outbound calls, rnode-flasher. 2026-01-04 15:57:49 -06:00
f3ec20b14e feat(rnpath): improve get_path_table method with filtering, sorting, and pagination; include additional stats for path entries 2026-01-04 15:00:16 -06:00
014e463527 feat(api): add firmware download endpoint with GitHub URL validation and enhance initiation status response with target name 2026-01-04 14:59:47 -06:00
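A minimal sketch of the kind of URL validation such an endpoint needs, so the server only fetches firmware from GitHub; the allowlisted hosts are an assumption, not the project's actual list:

```python
from urllib.parse import urlparse

# Hosts permitted for firmware downloads (assumed allowlist).
ALLOWED_HOSTS = {"github.com", "api.github.com", "objects.githubusercontent.com"}


def is_allowed_firmware_url(url: str) -> bool:
    """Accept only HTTPS URLs whose host is on the GitHub allowlist,
    so the download endpoint cannot be pointed at arbitrary servers."""
    parsed = urlparse(url)
    return parsed.scheme == "https" and parsed.hostname in ALLOWED_HOSTS
```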
fd846e3ed2 feat(security): update Content Security Policy to allow connections to GitHub API and related domains 2026-01-04 14:59:36 -06:00
bbf61d88a5 feat(vitest): add setupFiles configuration to Vitest for frontend tests 2026-01-04 14:59:27 -06:00
8fac1134e2 feat(call): improve call initiation and status handling with new properties for target hash and name; improve UI modals for tutorial and changelog visibility based on URL parameters 2026-01-04 14:59:20 -06:00
dda8a58bb3 fix(rncp): update eslint comments to enable linting for v-html usage in RNCPPage component 2026-01-04 12:42:29 -06:00
162d7c14f9 feat(theme): refactor theme handling to utilize Vuetify's useTheme, streamline theme application logic, and enhance header styling for improved UX 2026-01-04 12:42:16 -06:00
54ccc03c4d feat(tests): add unit tests for BanishedPage and RNPathPage components, enhancing coverage for blocked items and path management 2026-01-04 12:42:02 -06:00
bc40dcff4e feat(tests): add comprehensive tests for blackhole integration, RNPath management, and RNStatus handling 2026-01-04 12:41:55 -06:00
4482ebf5cd feat(locales): update translations for emergency mode, blackhole integration, and RNCP usage instructions 2026-01-04 12:41:49 -06:00
4507a999fc feat(frontend): big updates (too many) 2026-01-04 12:41:34 -06:00
6a61441e73 style: enhance chip components with improved styling and hover effects for better user experience 2026-01-04 12:40:41 -06:00
f270160c6c feat(theme): add light and dark themes to Vuetify configuration and introduce new RNPath route 2026-01-04 12:40:32 -06:00
63d81a02c9 feat(rnpath): implement the blackhole, RNPathHandler and integrate path management APIs 2026-01-04 12:40:19 -06:00
306557c473 chore(ci): remove post CI result posting steps and delete related script 2026-01-04 11:27:05 -06:00
9b8086a855 chore(dependencies): update aiohttp to version 3.13.3, certifi to 2026.1.4, cx-freeze to 8.5.3, hypothesis to 6.148.11, and rns to 1.1.0 2026-01-04 11:06:57 -06:00
1e3eedadc8 chore(build-test): update dependencies for Windows build and add i386 architecture support 2026-01-04 10:52:32 -06:00
e60db20082 fix(post_ci_results.sh): improve CI result reporting and comment handling for Gitea 2026-01-04 00:34:24 -06:00
1dd6d93729 refactor(AboutPage): remove unused MaterialDesignIcon component from AboutPage.vue 2026-01-04 00:18:19 -06:00
23df1b0618 chore(eslint): add additional directories to ignore patterns 2026-01-04 00:18:11 -06:00
46872209d6 chore(eslint): update ignore patterns to exclude additional directories and files 2026-01-04 00:08:50 -06:00
c8a014b06e feat(AboutPage, MapPage): add identity hash and LXMF address display in AboutPage, and ensure axios is available before fetching peers in MapPage 2026-01-04 00:07:29 -06:00
5a995c7304 test(AboutPage, ConfirmDialog, MapPage): update tests for improved component behavior and UI consistency 2026-01-04 00:07:21 -06:00
c7c70a5868 feat(identity_manager, map_manager): add lxmf and lxst address handling in metadata and optimize tile downloading with parallel processing 2026-01-04 00:04:40 -06:00
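A minimal sketch of parallelized tile downloading, assuming a thread pool over the requests library; the worker count and return shape are illustrative choices:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

import requests  # assumed HTTP client for illustration


def download_tile(url: str) -> bytes:
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return response.content


def download_tiles(urls: list[str], workers: int = 8) -> dict[str, bytes]:
    """Fetch map tiles concurrently instead of one at a time."""
    tiles: dict[str, bytes] = {}
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = {pool.submit(download_tile, url): url for url in urls}
        for future in as_completed(futures):
            tiles[futures[future]] = future.result()
    return tiles
```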
2b6cef04d0 feat(App, ConfirmDialog, AboutPage, MapPage, ConversationViewer, IdentitiesPage): enhance UI components with improved styles, new features for note editing, and better user interactions 2026-01-04 00:04:32 -06:00
d0db79e4e4 feat(RNodeFlasherPage): add new RNode Flasher page component and update ToolsPage to link to it 2026-01-04 00:04:18 -06:00
c4f13e579b feat(GlobalState, KeyboardShortcuts): enhance global state management with new properties and improve keyboard shortcut handling for better modifier key detection 2026-01-04 00:03:58 -06:00
360dc92883 docs(docker-compose): update comments to clarify automated permission handling and user execution for reticulum-meshchatx 2026-01-04 00:03:36 -06:00
81dcc826d9 fix(Dockerfile): update user and group creation, enhance build process, and improve command execution for meshchat 2026-01-04 00:03:31 -06:00
76b2895569 feat(router): add new route for RNode Flasher tool component 2026-01-04 00:02:35 -06:00
3ba7de9df7 feat(style): add no-scrollbar class to hide scrollbars for a cleaner UI 2026-01-04 00:02:30 -06:00
b569371e3d fix(database): improve error handling during migrations to silence expected duplicate column errors 2026-01-04 00:02:17 -06:00
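A minimal sketch of that kind of migration guard, assuming SQLite (schema.py and PRAGMA table_info appear later in this log); the helper is hypothetical:

```python
import sqlite3


def add_column_if_missing(conn: sqlite3.Connection, table: str, column_def: str) -> None:
    """Apply an ALTER TABLE migration, treating SQLite's 'duplicate
    column name' error as the expected already-migrated case."""
    try:
        conn.execute(f"ALTER TABLE {table} ADD COLUMN {column_def}")
    except sqlite3.OperationalError as exc:
        if "duplicate column name" not in str(exc):
            raise  # a real failure, not a re-run of the same migration
```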
6b3957fe49 feat(TutorialModal, AddInterfacePage, InterfacesPage): enhance UI components and integrate global state management for interface changes 2026-01-04 00:01:59 -06:00
af6a2b7be1 feat(database): implement safe execution method for database queries for error handling and preventing crashes 2026-01-03 23:16:16 -06:00
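A minimal sketch of a safe execution wrapper in this spirit; the function name and the return-empty-rows-on-error policy are assumptions:

```python
import logging
import sqlite3

logger = logging.getLogger(__name__)


def safe_execute(conn: sqlite3.Connection, sql: str, params: tuple = ()) -> list:
    """Run a query and return its rows, logging failures instead of
    letting a database error crash the caller."""
    try:
        return conn.execute(sql, params).fetchall()
    except sqlite3.Error:
        logger.exception("query failed: %s", sql)
        return []  # assumed policy: degrade to no rows rather than crash
```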
4c94c69586 fix(lxmf_utils): improve error handling in message conversion and ensure safe decoding of title and content 2026-01-03 23:15:58 -06:00
a67e7da8af test(fuzzing): add extensive fuzzing tests for WebSocket API, LXMF message handling, and telemetry functions 2026-01-03 23:15:48 -06:00
83d18f4bb3 refactor(tests): remove unnecessary blank lines in AuthPage, CommandPalette, and ConfirmDialog tests 2026-01-03 23:02:33 -06:00
6173a65cc7 fix(SettingsPage.vue): correct config initialization to return default values directly 2026-01-03 23:02:27 -06:00
46b3b25631 docs(README): add 'check' task to automation section for format, lint, and test 2026-01-03 22:42:00 -06:00
571dba4715 feat(Taskfile): add new 'check' task to run format, lint, and test commands 2026-01-03 22:41:54 -06:00
2357fdb83c feat(SettingsPage.vue): update config handling with default values and safe access method 2026-01-03 22:41:15 -06:00
c83691d9e6 fix(CommandPalette.vue): ensure peers and contacts are arrays before processing to prevent errors 2026-01-03 22:40:49 -06:00
2210f10305 test(frontend): add comprehensive unit tests for AuthPage, CommandPalette, ConfirmDialog, UI components, and theme handling 2026-01-03 22:40:27 -06:00
86bbbb8003 test(app_shutdown): update shutdown endpoint test by mocking exit_app and improving sleep handling 2026-01-03 22:14:13 -06:00
997de18d78 refactor(ReticulumMeshChat): extract exit logic into a separate method for improved readability 2026-01-03 22:14:06 -06:00
7f30ffe07a chore(docker): update Docker configuration 2026-01-03 22:12:39 -06:00
e7d86e11d2 fix(SendMessageButton.vue): update deliveryMethod prop validation to allow null or string values 2026-01-03 22:11:49 -06:00
356baee37d chore(pytest.ini): add filterwarnings to suppress specific warnings during test execution 2026-01-03 22:10:43 -06:00
1463eb07bb test(Performance.test.js): increase performance threshold for message updates from 1500ms to 3000ms 2026-01-03 22:10:25 -06:00
4e308e427a test(tests): add global mocks and cleanup fixtures for improved test isolation and resource management 2026-01-03 22:10:19 -06:00
31525e2ede chore(ci workflows): update scripts to post CI results for linting, building, and testing phases 2026-01-03 21:56:53 -06:00
2955b5f2c2 chore(post_bench_results.sh): remove obsolete script for posting benchmark results 2026-01-03 21:56:47 -06:00
7bd681e217 refactor(NetworkVisualiser.vue): update physics simulation parameters and improve interface node positioning for better visualization 2026-01-03 21:53:57 -06:00
44c263ba96 chore(arch-package.yml): format 2026-01-03 21:46:40 -06:00
bcd9006f31 refactor(NetworkVisualiser.vue): restructure color and opacity properties for edges to improve clarity and maintainability 2026-01-03 21:45:39 -06:00
eaaaddbc06 chore(flake.nix): remove Flatpak packaging dependencies from flake configuration 2026-01-03 21:36:57 -06:00
241d385c17 chore(dockerignore, gitignore): add Arch Linux packaging artifacts to ignore lists 2026-01-03 21:36:53 -06:00
48b7def004 fix(schema.py): ensure cursor is closed after executing PRAGMA table_info to prevent resource leaks 2026-01-03 21:36:47 -06:00
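A minimal sketch of that fix, assuming SQLite's PRAGMA table_info row layout (column name at index 1); contextlib.closing guarantees the cursor is released even if the fetch raises:

```python
import sqlite3
from contextlib import closing


def table_columns(conn: sqlite3.Connection, table: str) -> list[str]:
    """Read a table's column names via PRAGMA table_info, closing the
    cursor on every path to avoid the resource leak."""
    with closing(conn.execute(f"PRAGMA table_info({table})")) as cursor:
        return [row[1] for row in cursor.fetchall()]  # row[1] = column name
```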
8c5a68a01f refactor(tests): replace db.close() with db.close_all() in multiple test files and ensure proper teardown of ReticulumMeshChat instances 2026-01-03 21:36:42 -06:00
151b69ad50 feat(docker): add Dockerfile for Arch Linux package build and CI workflow for automated packaging 2026-01-03 21:36:33 -06:00
c1d177a887 feat(App.vue): update app name click functionality and improve UI interactions 2026-01-03 21:18:31 -06:00
1075aef22a feat(locales): add syncing state message to German, English, and Russian translations 2026-01-03 21:18:20 -06:00
6d975a12c4 feat(tests): add comprehensive LXMF propagation and sync tests 2026-01-03 21:18:14 -06:00
409802465a fix(forge.config.js): remove darwin platform from ZIP maker configuration 2026-01-03 20:54:44 -06:00
e4be402510 feat(package.json): add Flatpak configuration and ZIP distribution task to build process 2026-01-03 20:54:39 -06:00
cc5b4a9f0d feat(workflows): add build-zip task for Electron ZIP archive and update build process to include ZIP artifact collection 2026-01-03 20:54:33 -06:00
c028da2485 feat(scripts): add post CI results script for Gitea PR comments 2026-01-03 20:54:20 -06:00
c100aefdd5 feat(CallPage): integrate GlobalEmitter for telephone history and voicemail updates 2026-01-03 20:54:08 -06:00
d2f5ef1ae1 fix(App.vue): update call overlay conditions and improve window opening logic 2026-01-03 20:54:00 -06:00
ccec0afa22 refactor(tests): remove unused asyncio import from test_propagation_nodes_robustness.py 2026-01-03 20:53:47 -06:00
d4ed2c1e8f chore(todos): format 2026-01-03 19:56:25 -06:00
7f9925bca2 fix(requirements): correct 2026-01-03 19:56:15 -06:00
f60431789d feat(workflows): add benchmarking PR result posting 2026-01-03 19:54:26 -06:00
7668ee5619 feat(workflows): add elfutils installation and Flathub remote setup to build process 2026-01-03 19:46:38 -06:00
961ad0f6ca feat(scripts): add OSV-Scanner script for vulnerability scanning and reporting
Some checks failed
CI / test-backend (push) Successful in 4s
Tests / test (push) Failing after 22m9s
CI / lint (push) Failing after 5m10s
Build Test / Build and Test (pull_request) Successful in 5m12s
CI / build-frontend (push) Successful in 9m43s
CI / test-lang (push) Successful in 9m40s
CI / lint (pull_request) Failing after 5m0s
CI / test-backend (pull_request) Successful in 4s
Build and Publish Docker Image / build (pull_request) Has been skipped
OSV-Scanner PR Scan / scan-pr (pull_request) Successful in 7s
Benchmarks / benchmark (push) Successful in 14m33s
Benchmarks / benchmark (pull_request) Successful in 14m36s
CI / build-frontend (pull_request) Successful in 9m40s
Tests / test (pull_request) Failing after 7m18s
Build Test / Build and Test (push) Successful in 17m57s
CI / test-lang (pull_request) Successful in 9m34s
Build and Publish Docker Image / build-dev (pull_request) Successful in 13m6s
2026-01-03 19:40:31 -06:00
371fc6137c feat(workflows): update build process to include RPM and Flatpak packaging, integrate SBOM generation, and refine version validation 2026-01-03 19:40:26 -06:00
d5fa65f6f3 feat(workflows): integrate Trivy for Docker image scanning in CI/CD pipeline 2026-01-03 19:40:21 -06:00
98c3c0194c feat(workflows): add OSV-Scanner workflows for pull request and scheduled scans 2026-01-03 19:40:13 -06:00
a18a19d625 feat(build): add new distribution tasks for Linux RPM and Flatpak packages in package.json
Some checks failed
Benchmarks / benchmark (push) Successful in 13m25s
Build and Publish Docker Image / build-dev (pull_request) Successful in 11m54s
Tests / test (pull_request) Failing after 19m18s
CI / test-backend (push) Successful in 4s
CI / build-frontend (push) Successful in 2m8s
CI / build-frontend (pull_request) Successful in 1m21s
CI / lint (pull_request) Failing after 2m44s
Build and Publish Docker Image / build (pull_request) Has been skipped
CI / test-backend (pull_request) Successful in 41s
CI / test-lang (pull_request) Successful in 1m24s
Benchmarks / benchmark (pull_request) Successful in 34m2s
CI / lint (push) Failing after 5m5s
Build Test / Build and Test (push) Successful in 5m39s
Build Test / Build and Test (pull_request) Successful in 5m37s
Tests / test (push) Failing after 8m13s
CI / test-lang (push) Successful in 9m36s
2026-01-03 19:28:49 -06:00
db6d8d590b feat(build): add tasks for building Linux RPM and Flatpak packages; update build workflow to include new packaging steps 2026-01-03 19:28:44 -06:00
d209c0c9ab feat(tests): add comprehensive tests for AnnounceDAO filtering functionality and enhance robustness checks for propagation nodes endpoint
Some checks failed
CI / build-frontend (push) Successful in 1m30s
CI / test-backend (push) Successful in 4s
CI / test-lang (push) Successful in 1m28s
CI / test-backend (pull_request) Successful in 45s
Build and Publish Docker Image / build (pull_request) Has been skipped
CI / test-lang (pull_request) Successful in 1m8s
Benchmarks / benchmark (push) Has been cancelled
Build Test / Build and Test (push) Has been cancelled
CI / lint (push) Has been cancelled
Build Test / Build and Test (pull_request) Has been cancelled
Tests / test (push) Has been cancelled
CI / lint (pull_request) Has been cancelled
Build and Publish Docker Image / build-dev (pull_request) Has been cancelled
Benchmarks / benchmark (pull_request) Has been cancelled
CI / build-frontend (pull_request) Has been cancelled
Tests / test (pull_request) Has been cancelled
2026-01-03 19:26:35 -06:00
c9c2aeac68 chore(workflows): format 2026-01-03 19:24:34 -06:00
8ac458bafd test(DebugLogsPage): improve log mock structure and enhance code formatting for better readability 2026-01-03 19:24:23 -06:00
2402b960f0 feat(keyboard): enhance keyboard shortcut handling by synchronizing modifier keys on keydown and keyup events; improve shortcut matching logic for better compatibility and layout independence 2026-01-03 19:24:15 -06:00
af51209c5b feat(icons): add Material Design Icons stylesheet to main.js for enhanced icon support 2026-01-03 19:24:06 -06:00
803eaba5b1 feat(locales): add 'generate_paper_message' key to German, English, and Russian locale files 2026-01-03 19:24:01 -06:00
925b7b2950 feat(announces): fix announce handling by adding identity and destination hash filters; ensure UTC formatting for created_at and updated_at timestamps; introduce message font size configuration 2026-01-03 19:23:53 -06:00
0aa0571403 chore(dependencies): update @mdi/font to version 7.4.47, @electron/notarize to version 2.5.0, and @electron/osx-sign to version 1.3.3; upgrade check-error to version 2.1.3 and @types/node to version 25.0.3 2026-01-03 19:22:48 -06:00
cf4c6ba8ea chore(dependencies): update @electron/fuses to version 1.8.0 and add @mdi/font as a new dependency 2026-01-03 19:22:42 -06:00
90e70d7787 docs(README): format 2026-01-03 19:22:36 -06:00
35e3566a63 feat(tailwind): add markdown renderer to content paths for Tailwind CSS configuration to support backend styling 2026-01-03 19:22:21 -06:00
1e8651c645 feat(icons): update icon sizes across various components for improved consistency and visual clarity; adjust padding and styles in documentation and settings pages 2026-01-03 19:22:14 -06:00
7abd0571c9 refactor(forge.config): standardize quotation marks and reorganize configuration structure for clarity and consistency
Some checks failed
CI / lint (push) Successful in 1m43s
Build Test / Build and Test (pull_request) Successful in 59s
CI / test-backend (pull_request) Successful in 39s
CI / lint (pull_request) Successful in 1m35s
Build and Publish Docker Image / build (pull_request) Has been skipped
CI / test-lang (pull_request) Successful in 1m22s
Tests / test (push) Failing after 8m4s
CI / build-frontend (push) Successful in 9m41s
CI / test-lang (push) Successful in 9m37s
Tests / test (pull_request) Failing after 7m4s
CI / build-frontend (pull_request) Successful in 9m37s
Benchmarks / benchmark (pull_request) Successful in 13m23s
Build and Publish Docker Image / build-dev (pull_request) Successful in 11m37s
Benchmarks / benchmark (push) Successful in 17m12s
CI / test-backend (push) Successful in 4s
Build Test / Build and Test (push) Successful in 1m12s
2026-01-03 18:43:31 -06:00
17d7ad86a0 feat(tests): update Taskfile to exclude i18n tests from frontend tests and add dedicated language tests for Node.js and Python 2026-01-03 18:43:24 -06:00
35476d0c0a feat(ci): add new test-lang job to CI workflow for running language tests with Node.js and Python setup 2026-01-03 18:43:20 -06:00
fd41a62bc1 refactor(tests): streamline test code by removing unused imports and optimizing function calls for performance benchmarks 2026-01-03 18:43:13 -06:00
a1c87bebf3 fix(App): refine notification logic for incoming messages; enhance button formatting in changelog modal; improve icon color handling in user icon component 2026-01-03 18:42:59 -06:00
392fe50f82 refactor(meshchat): clean up code formatting and enhance version retrieval for LXST; improve log handling and anomaly detection logic 2026-01-03 18:42:50 -06:00
b51d04953f feat(App): add emergency banner for active mode and enhance changelog modal with version tracking; improve notification logic for incoming messages
Some checks failed
Tests / test (pull_request) Failing after 12m30s
Benchmarks / benchmark (push) Successful in 15m13s
Benchmarks / benchmark (pull_request) Successful in 23m54s
CI / test-backend (push) Successful in 4s
CI / test-backend (pull_request) Successful in 4s
Build and Publish Docker Image / build (pull_request) Has been skipped
Build Test / Build and Test (pull_request) Failing after 1m23s
CI / build-frontend (push) Successful in 1m31s
CI / lint (push) Failing after 5m4s
CI / lint (pull_request) Failing after 5m5s
Build Test / Build and Test (push) Successful in 6m4s
CI / build-frontend (pull_request) Successful in 9m43s
Tests / test (push) Failing after 9m53s
Build and Publish Docker Image / build-dev (pull_request) Successful in 12m43s
2026-01-03 18:31:45 -06:00
d717679790 feat(logging): implement persistent logging with anomaly detection and database integration for debug logs 2026-01-03 18:31:38 -06:00
e1cc971cca feat(DebugLogsPage): update log management with search, filtering, and pagination features 2026-01-03 18:31:16 -06:00
f5950f9a8d test(DebugLogsPage): add unit tests for fetching, searching, and paginating debug logs 2026-01-03 18:31:01 -06:00
1418bb80f7 refactor(manifest): restructure backend integrity manifest to include metadata and adjust file verification logic 2026-01-03 18:30:55 -06:00
ed3cc4215a chore(forge.config): update runtime and base versions to 24.08 2026-01-03 18:30:47 -06:00
9ecdd157f3 refactor(tests): update backend integrity tests to include metadata in manifest and add new debug log tests for persistent logging functionality 2026-01-03 18:30:40 -06:00
842c4a3938 feat(locales): add "do not show ever again" message and emergency mode notification in German, English, and Russian translations 2026-01-03 18:30:33 -06:00
4ce3b1e65c fix(electron): update references to Reticulum MeshChatX in main files and frontend HTML
Some checks failed
CI / test-backend (push) Successful in 41s
Build Test / Build and Test (push) Successful in 1m11s
CI / test-backend (pull_request) Successful in 4s
Build and Publish Docker Image / build (pull_request) Has been skipped
Build Test / Build and Test (pull_request) Successful in 1m10s
CI / lint (push) Failing after 1m55s
CI / build-frontend (pull_request) Successful in 2m4s
CI / lint (pull_request) Failing after 5m4s
Tests / test (push) Failing after 8m25s
Tests / test (pull_request) Failing after 8m11s
CI / build-frontend (push) Successful in 9m38s
Benchmarks / benchmark (pull_request) Successful in 13m54s
Build and Publish Docker Image / build-dev (pull_request) Successful in 12m44s
Benchmarks / benchmark (push) Successful in 16m54s
2026-01-03 18:06:34 -06:00
82e55509e2 fix(locales): update restart description in German, English, and Russian to reflect the correct application name as MeshChatX 2026-01-03 17:49:06 -06:00
59919f9281 chore(pyproject): configure Ruff to exclude specific directories from linting
Some checks failed
Build and Publish Docker Image / build (pull_request) Has been skipped
Build Test / Build and Test (pull_request) Successful in 1m18s
CI / lint (pull_request) Failing after 2m9s
CI / test-backend (pull_request) Successful in 1m9s
CI / lint (push) Failing after 2m20s
Tests / test (push) Failing after 8m24s
Tests / test (pull_request) Failing after 8m5s
Benchmarks / benchmark (push) Successful in 9m35s
CI / build-frontend (push) Successful in 9m44s
CI / build-frontend (pull_request) Successful in 9m40s
Benchmarks / benchmark (pull_request) Successful in 13m44s
Build and Publish Docker Image / build-dev (pull_request) Successful in 12m36s
CI / test-backend (push) Successful in 1m16s
Build Test / Build and Test (push) Successful in 1m21s
2026-01-03 17:41:21 -06:00
246d9f5f74 chore(taskfile): simplify linting commands in Taskfile.yml to check all files 2026-01-03 17:41:16 -06:00
3ed9c96f6c fix(workflow): update build process to include zip creation for frontend assets and add zip to release artifacts 2026-01-03 17:41:12 -06:00
76b0e47a70 feat(workflow): add build and test workflow for CI/CD with NodeJS, Python, and system dependencies 2026-01-03 17:41:04 -06:00
00828e59a4 docs(MeshChatX): move
Some checks failed
CI / test-backend (push) Successful in 4s
Build and Publish Docker Image / build (pull_request) Has been skipped
CI / test-backend (pull_request) Successful in 4s
CI / lint (pull_request) Failing after 2m22s
CI / build-frontend (push) Successful in 2m57s
CI / lint (push) Failing after 5m3s
Tests / test (push) Failing after 7m27s
Tests / test (pull_request) Failing after 7m29s
CI / build-frontend (pull_request) Successful in 9m44s
Build and Publish Docker Image / build-dev (pull_request) Successful in 12m38s
Benchmarks / benchmark (pull_request) Successful in 16m59s
Benchmarks / benchmark (push) Successful in 17m8s
2026-01-03 17:24:24 -06:00
184f0dbf14 docs(README): update README 2026-01-03 17:23:52 -06:00
96f3f527f4 feat(config): add Electron Forge configuration for packaging and distribution, including support for multiple platforms and fuses plugin 2026-01-03 17:20:10 -06:00
655bf47dc1 feat(docker): add dev docker-compose 2026-01-03 17:20:09 -06:00
edc3f83dd5 chore(.gitignore): add 'out/' directory to .gitignore to exclude build output files 2026-01-03 17:20:09 -06:00
96a7df3bcb feat(taskfile): add Electron Forge tasks for starting, packaging, and generating distributables; remove obsolete Flatpak tasks 2026-01-03 17:20:09 -06:00
bb677c2e27 chore(dependencies): update package.json and pnpm-lock.yaml to include new Electron Forge plugins and dependencies for improved build and packaging processes 2026-01-03 17:20:09 -06:00
90bd917928 chore(npm): update .npmrc to set node-linker to hoisted and remove fetch configuration 2026-01-03 17:20:09 -06:00
5e08a87f70 chore(flatpak): remove flatpak build script and configuration file (in favor of electron forge) 2026-01-03 17:20:09 -06:00
e93e657ab0 Merge branch 'master' into massive-changes
Some checks failed
Build and Publish Docker Image / build (pull_request) Has been skipped
CI / test-backend (push) Successful in 4s
CI / test-backend (pull_request) Successful in 4s
CI / build-frontend (push) Successful in 2m15s
CI / lint (pull_request) Failing after 2m35s
CI / lint (push) Failing after 5m11s
Tests / test (push) Failing after 8m37s
Tests / test (pull_request) Failing after 8m35s
CI / build-frontend (pull_request) Successful in 9m41s
Build and Publish Docker Image / build-dev (pull_request) Successful in 12m43s
Benchmarks / benchmark (push) Successful in 15m14s
Benchmarks / benchmark (pull_request) Successful in 15m11s
2026-01-03 23:02:27 +00:00
978d917e89 chore(TODO): add new tasks
Some checks failed
CI / test-backend (push) Successful in 4s
CI / build-frontend (push) Successful in 59s
Tests / test (push) Successful in 1m36s
CI / lint (push) Failing after 4m56s
2026-01-03 16:55:01 -06:00
4aea5c09f3 docs(README): update task commands 2026-01-03 16:54:55 -06:00
6fbf9a3068 docs(README): fix
Some checks failed
CI / test-backend (push) Successful in 49s
CI / lint (push) Failing after 1m21s
CI / build-frontend (push) Successful in 9m36s
Tests / test (push) Successful in 9m50s
2026-01-03 16:51:44 -06:00
282fe4ca6a docs(README): update with upcoming v4 release information
Some checks failed
CI / test-backend (push) Successful in 3s
CI / build-frontend (push) Successful in 44s
Tests / test (push) Successful in 1m37s
CI / lint (push) Failing after 4m55s
2026-01-03 16:51:33 -06:00
3d7924dce6 chore(TODO): update tasks with progress and add new items
Some checks failed
CI / test-backend (push) Successful in 13s
CI / lint (push) Failing after 42s
CI / build-frontend (push) Successful in 9m33s
Tests / test (push) Successful in 9m52s
2026-01-03 16:38:35 -06:00
f46248490f feat(TODO): create todo
Some checks failed
CI / test-backend (push) Successful in 4s
CI / build-frontend (push) Successful in 55s
Tests / test (push) Successful in 1m33s
CI / lint (push) Failing after 4m54s
2026-01-03 16:36:53 -06:00
d6b2f7c8f7 docs(README): cleanup 2026-01-03 16:36:37 -06:00
0dc0d54f7a feat(changelog): add changelog
Some checks failed
CI / test-backend (push) Successful in 4s
Build and Publish Docker Image / build (pull_request) Has been skipped
CI / test-backend (pull_request) Successful in 4s
CI / build-frontend (push) Successful in 2m31s
CI / lint (push) Failing after 5m3s
CI / lint (pull_request) Failing after 5m6s
Tests / test (pull_request) Failing after 6m52s
CI / build-frontend (pull_request) Successful in 9m40s
Build and Publish Docker Image / build-dev (pull_request) Successful in 12m29s
Tests / test (push) Failing after 13m31s
Benchmarks / benchmark (push) Successful in 19m8s
Benchmarks / benchmark (pull_request) Successful in 18m45s
2026-01-03 16:08:26 -06:00
950abef79c feat(tests): add comprehensive benchmarks for database performance, memory usage, and application stability, including new test files for various frontend and backend functionalities 2026-01-03 16:08:07 -06:00
e88dad7a86 feat(scripts): add checksum generation script and locale template generator for asset management 2026-01-03 16:07:40 -06:00
54b81663bd feat(build): replace execSync with spawnSync for building backend and add integrity manifest generation for build files 2026-01-03 16:07:25 -06:00
1e87d633be fix(indexedDB): enhance compatibility by adding globalThis.indexedDB support in codec2-emscripten scripts 2026-01-03 16:07:17 -06:00
9afaa3f5ef chore(assets): update logo and favicon images for improved branding 2026-01-03 16:07:08 -06:00
dc54bd65a1 feat(locales): update German, English, and Russian translations with new terms for documentation, tutorials, banishment effects, drawing tools, and call functionalities 2026-01-03 16:06:54 -06:00
1fd1405e30 feat(storage): implement MicronStorage for tab management, enhance TileCache with state management, and update DialogUtils and NotificationUtils for improved user interactions 2026-01-03 16:06:45 -06:00
baa24e1cf9 feat(styles): enhance select input and add danger chip styles for improved UI consistency 2026-01-03 16:06:34 -06:00
21e29b34aa feat(router): add new routes for documentation, debug logs, changelog, and tutorial pages 2026-01-03 16:06:24 -06:00
7faa94f5f2 feat: Introduce new CommunityInterfacesManager and DocsManager for managing community interfaces and documentation, enhance AnnounceManager with pagination support, and implement CrashRecovery for improved error handling. Update various database schemas and add new fields for contacts and voicemails. 2026-01-03 16:06:16 -06:00
0c3a0e9a4c chore(version): bump version to 4.0.0 in flake.nix and package.json 2026-01-03 16:05:58 -06:00
1c153daf2a chore(version): update version to 4.0.0 2026-01-03 16:05:10 -06:00
20e4c42094 refactor(meshchat): better multi-identity support and improved logging capabilities and a billion other changes and cleanup 2026-01-03 16:05:04 -06:00
6ab46e8969 refactor(electron): standardize HTML structure and improve formatting in crash and loading pages 2026-01-03 16:03:54 -06:00
2f96ee07f3 feat: Update UI components and add new features including changelog modal, integrity warning, and voicemail playback functionality 2026-01-03 16:03:34 -06:00
46f2700770 feat(electron): add a new crash report HTML page for displaying error details and logs 2026-01-03 15:45:05 -06:00
b59c21f483 feat(Taskfile, workflows): add new benchmarking and integrity testing tasks, and create a Gitea workflow for automated benchmarks 2026-01-03 15:44:17 -06:00
b544108d4b chore: update package.json and pnpm-lock.yaml for dependency upgrades and new configurations 2026-01-03 15:44:06 -06:00
00af1e3b46 chore: update .gitignore and .prettierignore to include MagicMock and additional directories for better file management 2026-01-03 15:43:55 -06:00
dc7448d41e chore(Taskfile): remove redundant command to clean assets before building frontend
Some checks failed
CI / test-backend (push) Successful in 4s
CI / build-frontend (push) Successful in 47s
Tests / test (push) Successful in 1m51s
CI / lint (push) Failing after 4m55s
2026-01-03 10:53:25 -06:00
a1964f8807 fix(vite.config): change emptyOutDir option to false to retain previous build files
Some checks failed
CI / test-backend (push) Successful in 16s
CI / lint (push) Failing after 44s
CI / build-frontend (push) Successful in 9m35s
Tests / test (push) Successful in 9m49s
2026-01-03 10:38:27 -06:00
8d87a61e67 feat(tasks): improve Taskfile with new formatting and testing tasks for Python and frontend 2026-01-03 10:38:14 -06:00
5918d0bcbe docs: update README and Android installation guide with new features, installation instructions, and corrected links 2026-01-03 10:37:54 -06:00
11c4c6729c feat(workflows): add support for building and pushing development Docker images on pull requests 2026-01-03 10:35:15 -06:00
eeaabacf35 refactor(meshchat): improve RNS shutdown process and enhance RPC listener closure handling
Some checks failed
CI / test-backend (push) Successful in 16s
CI / lint (push) Failing after 44s
CI / build-frontend (push) Successful in 9m36s
Tests / test (push) Successful in 9m46s
2026-01-02 20:54:22 -06:00
43d1fcc91a refactor(interfaces, settings): comment out unused button sections in InterfacesPage and SettingsPage for future reference 2026-01-02 20:54:13 -06:00
beb86880e0 refactor(tests): format 2026-01-02 20:53:21 -06:00
5a9e066b10 feat(tests): add comprehensive unit tests for various components including AboutPage, CallPage, and MessagesPage
Some checks failed
CI / test-backend (push) Successful in 17s
CI / lint (push) Successful in 46s
Tests / test (push) Failing after 5m15s
CI / build-frontend (push) Successful in 9m33s
2026-01-02 20:36:58 -06:00
adbf0a9ce9 chore(tests): clean up test files by adding missing newlines and reordering imports for consistency 2026-01-02 20:36:42 -06:00
4ea47b9dcf fix(utils): handle null and undefined values in number formatting to ensure consistent output 2026-01-02 20:36:31 -06:00
c419fa48cf fix(call): ensure safe access to call history, announces, voicemails, and contacts by providing default values 2026-01-02 20:36:25 -06:00
09c89d09f0 fix(version): remove unnecessary newline in version string documentation 2026-01-02 20:36:19 -06:00
e988f7743f fix(ringtones): correct SQL argument formatting for consistency in query execution 2026-01-02 20:36:15 -06:00
0f90e7d459 fix(voicemail): correct argument formatting in OpusFileSource and OpusFileSink initialization for consistency 2026-01-02 20:36:10 -06:00
2e6cbd2c0f fix(parser): add validation for interface configuration to ensure it is a dictionary 2026-01-02 20:36:03 -06:00
39cd801354 refactor(config): format BoolConfig initialization for better readability 2026-01-02 20:35:59 -06:00
686b6374bf feat(meshchat): improve error handling and cleanup processes 2026-01-02 20:35:54 -06:00
e1cf6e4809 chore(tests): remove unnecessary blank line 2026-01-02 20:35:34 -06:00
ba35bc4250 chore(lint): exclude test directories from linting rules 2026-01-02 20:35:25 -06:00
1b7ad1cf61 chore(lint): update ruff commands to exclude tests during linting and formatting 2026-01-02 20:35:20 -06:00
51d705ffa4 feat(tests): add fuzzing tests for telemetry unpacking, interface configuration parsing, and LXMF message handling
Some checks failed
CI / test-backend (push) Successful in 4s
CI / build-frontend (push) Successful in 56s
Tests / test (push) Successful in 1m35s
CI / lint (push) Failing after 4m56s
2026-01-02 20:06:18 -06:00
6195b2e37f feat(dependencies): add hypothesis version 6.148.9 and sortedcontainers version 2.4.0 to project dependencies 2026-01-02 20:06:11 -06:00
8871010b97 chore(docker): update .dockerignore and .gitignore to include coverage and hypothesis files 2026-01-02 20:06:05 -06:00
5f32ae05f3 feat(meshchat): improve RPC address handling and error checking; improve rule data validation for forwarding operations (found with fuzzing) 2026-01-02 20:05:56 -06:00
d4ee7ac2d6 feat(ci): add pull_request trigger to CI workflow
All checks were successful
CI / test-backend (push) Successful in 17s
CI / lint (push) Successful in 46s
CI / build-frontend (push) Successful in 9m35s
Tests / test (push) Successful in 9m33s
2026-01-02 19:45:02 -06:00
1e0f61cbb5 feat(tests): add workflow for automated testing with Node.js and Python setup 2026-01-02 19:44:50 -06:00
00b4290735 feat(tests): add comprehensive test suite for backend functionality, including database, configuration, and telemetry utilities
All checks were successful
CI / test-backend (push) Successful in 15s
CI / lint (push) Successful in 42s
CI / build-frontend (push) Successful in 9m35s
2026-01-02 19:41:05 -06:00
d7a5926e6e feat(reticulum): implement hot reloading for Reticulum instance and enhance error handling; update configuration access to ensure stability 2026-01-02 19:40:53 -06:00
c8f49c349e chore(vitest): remove unnecessary blank line in vitest.config.js 2026-01-02 19:40:47 -06:00
3c43a3079c docs(README): update documentation to include RNS Hot Restart feature, configuration options, and Nix development support 2026-01-02 19:40:40 -06:00
fe55ec349b refactor(frontend): clean up and standardize code formatting across various JavaScript files and HTML templates 2026-01-02 19:40:33 -06:00
7aa1560523 feat(locales): add Reticulum Stack management translations for German, English, and Russian 2026-01-02 19:40:20 -06:00
3620643b92 feat(interfaces): add restart required handling and RNS reload functionality across interface components 2026-01-02 19:40:12 -06:00
d52849e832 chore(gitignore): add meshchat-config directory to .gitignore and ensure coverage files are excluded 2026-01-02 19:40:06 -06:00
f4e176394f chore(dependencies): change package groups in poetry.lock from "main" to "dev" for several libraries to better reflect their usage in development 2026-01-02 19:39:56 -06:00
d0bf9349d3 chore(dependencies): update pnpm-lock.yaml to include new testing libraries and various package updates 2026-01-02 19:31:00 -06:00
1477bd92d8 feat(tests): add tasks for running Python and frontend tests, including a combined test task 2026-01-02 19:30:53 -06:00
be338304b6 chore(dependencies): update package.json and poetry.lock to include testing libraries (pytest, pytest-asyncio, pytest-cov) and add coverage support; update .gitignore to exclude coverage files 2026-01-02 19:30:49 -06:00
9c09e18fa6 feat(flake): add flake.nix and flake.lock for Reticulum-MeshChatX development environment setup 2026-01-02 19:28:10 -06:00
7e571f516d feat(ci): add backend compilation step to CI workflow and introduce compile task for Python syntax checking
All checks were successful
CI / test-backend (push) Successful in 16s
CI / lint (push) Successful in 43s
CI / build-frontend (push) Successful in 9m34s
2026-01-02 18:38:16 -06:00
7dd5f5702a chore(package): add Linux and Windows distribution scripts and update artifact naming convention for builds
All checks were successful
CI / lint (push) Successful in 42s
CI / build-frontend (push) Successful in 9m35s
2026-01-02 18:35:59 -06:00
2e8853aa36 chore(workflows): update build and docker workflows to enable tag-based triggers and refine build steps for Electron and Docker images 2026-01-02 18:35:44 -06:00
e595f0a416 docs(README): add manual installation instructions and ARM64 support details
All checks were successful
CI / build-frontend (push) Successful in 36s
CI / lint (push) Successful in 9m34s
2026-01-02 18:20:37 -06:00
eeae2a9821 feat(frontend): add PWA support with manifest, service worker, and audio processing features 2026-01-02 18:20:18 -06:00
22bacfd944 chore(docker-compose): update image source for reticulum-meshchatx and add security options 2026-01-02 18:19:50 -06:00
41e838284c chore(.dockerignore): refine ignore patterns by adding wildcard for __pycache__ directory 2026-01-02 18:19:43 -06:00
8437b7b74c chore(.gitignore): remove 'public/' directory from ignore list (im a dumbass) 2026-01-02 18:19:37 -06:00
e8e12124b6 chore(docker): update docker-compose configuration to use local meshchat-config directory and add optional LibreTranslate service
All checks were successful
CI / lint (push) Successful in 41s
CI / build-frontend (push) Successful in 9m32s
2026-01-02 17:41:36 -06:00
022386aefd feat(workflow): add Docker build and publish workflow for meshchatx
All checks were successful
CI / build-frontend (push) Successful in 30s
CI / lint (push) Successful in 9m36s
2026-01-02 17:33:05 -06:00
b0a4e4d2d7 fix(build): comment out broken Appimage build triggers in workflow configuration 2026-01-02 17:31:38 -06:00
f1a8defad8 fix(build): ensure proper formatting in build workflow configuration 2026-01-02 17:28:59 -06:00
e27bedd54b feat(Routing): add new route for paper message page under tools section 2026-01-02 17:28:54 -06:00
a280e46fba chore(dependencies): update filelock package from version 3.20.1 to 3.20.2 in poetry.lock 2026-01-02 17:28:50 -06:00
d17804530c feat(LXMF): implement URI handling for LXMF messages, including ingestion and generation of paper message URIs, along with keyboard shortcut management enhancements 2026-01-02 17:28:42 -06:00
a4873493df feat(MessageHandler): optimize conversation fetching using window functions for improved performance and add unread message filtering capability 2026-01-02 17:28:27 -06:00
e81968b3dd feat(DatabaseSchema): update schema to version 23, adding keyboard_shortcuts table and optimizing conversation fetching with new indexes 2026-01-02 17:28:22 -06:00
dcb67d208e feat(KeyboardShortcuts): add methods for managing keyboard shortcuts, including retrieval, upsert, and deletion functionalities 2026-01-02 17:28:17 -06:00
c004820846 feat(Localization): update German, English, and Russian locale files with new strings for sync settings, paper messages, and command palette navigation 2026-01-02 17:28:11 -06:00
f789062616 feat(KeyboardShortcuts): implement keyboard shortcuts management with default actions, recording capabilities, and WebSocket integration for saving and deleting shortcuts 2026-01-02 17:28:04 -06:00
189baa3e26 refactor(WebSocketConnection): simplify connection logic by removing demo mode checks and streamlining connection process 2026-01-02 17:27:58 -06:00
b15586e2e9 feat(MessagesPage): implement ingest paper message modal for reading LXMF messages via URI input and clipboard support 2026-01-02 17:27:52 -06:00
f11235047a fix(SidebarLink): add transition effect to text display for improved UI responsiveness 2026-01-02 17:27:47 -06:00
c701c8071d feat(MessagesSidebar): enhance conversation search with ingest paper message button and loading skeleton for conversations 2026-01-02 17:27:40 -06:00
e974758d3e feat(SettingsPage): add keyboard shortcuts section for customizing workflow, including shortcut recording and management functionalities 2026-01-02 17:27:35 -06:00
6db10e3e8f feat(ConversationViewer): add functionality for sharing messages as paper messages, implement translation feature, and enhance message input with draft saving and dynamic height adjustment 2026-01-02 17:27:31 -06:00
ece1414079 feat(CommandPalette): introduce Command Palette component for enhanced navigation and action execution with search functionality 2026-01-02 17:27:21 -06:00
eb8d8ce75d feat(ShortcutRecorder): implement a new component for recording and managing keyboard shortcuts with clear and record functionalities 2026-01-02 17:27:15 -06:00
49d55ce28b feat(PaperMessagePage): add Paper Message Generator component for creating and ingesting signed LXMF messages with QR code support 2026-01-02 17:25:59 -06:00
abf54ed45f feat(PaperMessageModal): implement modal for generating and displaying paper messages with QR code functionality 2026-01-02 17:25:53 -06:00
106f11db2e feat(App): enhance user experience by adding keyboard shortcuts, updating UI elements for identity switching, and integrating Command Palette functionality 2026-01-02 17:25:48 -06:00
7d28291782 feat(ToolsPage): add new paper message tool card with title and description 2026-01-02 17:25:43 -06:00
0e6dac6440 feat(MapPage): add loading skeleton for map and track loading state 2026-01-02 17:25:36 -06:00
09ce40d073 chore(ci): clean up build.yml by removing commented-out Docker build steps to enhance clarity 2026-01-02 17:24:29 -06:00
d957d051ef chore(Taskfile): remove demo Docker build and run tasks to simplify configuration 2026-01-02 17:24:25 -06:00
35f17b59d5 chore(docker): update Dockerfile to streamline build process with new Node.js and Python versions, add missing configuration files, and optimize dependency installation 2026-01-02 17:24:16 -06:00
ba153e7bf4 chore(docker): remove demo configuration files for Docker and NGINX to streamline project structure 2026-01-02 16:49:04 -06:00
732f5e56b6 chore(docker): update .dockerignore to streamline build process by removing unnecessary entries and adding new directories
All checks were successful
CI / build-frontend (push) Successful in 9m35s
CI / lint (push) Successful in 9m38s
2026-01-02 15:15:59 -06:00
928fb6ac32 chore(ci): remove UV_LINK_MODE environment variable and replace setup step with Poetry installation
All checks were successful
CI / build-frontend (push) Successful in 9m33s
CI / lint (push) Successful in 9m35s
2026-01-02 15:15:47 -06:00
3ae9920d1f feat(Taskfile): add Docker build tasks and update Python environment setup to use Poetry 2026-01-02 12:20:41 -06:00
5730dbd93a style(meshchat): remove unnecessary whitespace for cleaner code formatting
All checks were successful
CI / lint (push) Successful in 9m33s
CI / build-frontend (push) Successful in 9m34s
2026-01-02 11:40:56 -06:00
7255170c86 feat(meshchat): fix WebSocket connection handling and expand public path checks for static assets
Some checks failed
CI / lint (push) Failing after 4m52s
CI / build-frontend (push) Successful in 9m30s
2026-01-02 11:26:24 -06:00
b8de232790 fix(README): update video link to use raw source for better accessibility
Some checks failed
CI / lint (push) Failing after 4m56s
CI / build-frontend (push) Successful in 9m33s
2026-01-02 11:13:02 -06:00
aaa1e9b0c3 chore(release): bump version to 3.3.2 in package.json, pyproject.toml, and version.py
Some checks failed
Build and Release / Build and Release (push) Successful in 2m43s
CI / lint (push) Failing after 4m56s
CI / build-frontend (push) Successful in 9m34s
2026-01-02 11:12:00 -06:00
8e976f39bb chore(ci): streamline Electron app build process and enhance artifact collection in CI workflow 2026-01-02 11:11:34 -06:00
9734c18c4f chore(Taskfile): consolidate build commands for Electron apps and add task for building all platforms 2026-01-02 11:11:30 -06:00
71476c9196 feat(meshchat): fix file serving capabilities and improve path handling 2026-01-02 11:11:26 -06:00
d259bd4839 fix(README): update video link and remove outdated showcase video file 2026-01-02 11:11:04 -06:00
1f9632b396 chore(release): bump version to 3.3.1 and update README with additional screenshots
All checks were successful
Build and Release / Build and Release (push) Successful in 7m22s
CI / lint (push) Successful in 9m28s
CI / build-frontend (push) Successful in 9m34s
2026-01-02 10:28:15 -06:00
6ecd46dcec fix(MicronEditorPage): disable and enable eslint rule for v-html usage
All checks were successful
CI / build-frontend (push) Successful in 9m33s
CI / lint (push) Successful in 9m35s
2026-01-02 09:40:02 -06:00
0b5c8e4e68 refactor(CallPage): cleanup 2026-01-02 09:39:48 -06:00
42e7c2cf3b chore(ci): refactor CI workflow for improved readability and maintainability 2026-01-02 09:39:33 -06:00
e23c5abdd9 fix(version): standardize version string quotes in version.py 2026-01-02 09:39:26 -06:00
fea9389a14 chore(ci): update build workflow to include Windows support and improve release asset handling
Some checks failed
CI / lint (push) Failing after 4m54s
CI / build-frontend (push) Successful in 9m32s
2026-01-02 09:31:16 -06:00
20c0e10767 chore(Taskfile): add tasks for setting up Python environment and linting 2026-01-02 09:31:11 -06:00
7618300619 chore(ci): add CI workflow for linting and building frontend 2026-01-02 09:31:08 -06:00
98092d3c77 chore(cx_setup): update include_files logic to conditionally add directories 2026-01-02 09:23:16 -06:00
2d8e050d61 chore(build): remove sync versions step from workflow 2026-01-02 09:23:10 -06:00
dc6f0cae29 chore(Taskfile): simplify install task and remove sync-version task 2026-01-02 09:23:01 -06:00
2f00c39aba chore(sync_version): remove sync_version.py script as it is no longer needed 2026-01-02 09:20:18 -06:00
92204ba16a docs(README): update 2026-01-02 09:18:19 -06:00
349 changed files with 154666 additions and 7412 deletions

.dockerignore

@@ -6,28 +6,24 @@ screenshots/
docs/
# Development files
.github/
electron/
android/
scripts/
Makefile
*.apk
*.aab
# Build artifacts and cache
build/
dist/
public/
meshchatx/public/
node_modules/
__pycache__/
/build/
/dist/
/build-dir/
/python-dist/
/node_modules/
**/__pycache__/
*.py[cod]
*$py.class
*.so
.Python
*.egg-info/
*.egg
python-dist/
# Virtual environments
env/
@@ -77,4 +73,16 @@ telemetry_test_lxmf/
# Environment variables
.env
.env.local
.env.*.local
.env.*.local
.coverage
.hypothesis
.hypothesis/
# Arch Linux packaging artifacts
/packaging/arch/src/
/packaging/arch/pkg/
/packaging/arch/*.pkg.tar.zst
/packaging/arch/MeshChatX/
/packaging/arch/reticulum-meshchatx/


@@ -57,7 +57,7 @@ jobs:
- name: Install pnpm
uses: https://git.quad4.io/actions/setup-pnpm@41ff72655975bd51cab0327fa583b6e92b6d3061 # v4
with:
version: 10.0.0
version: 10.27.0
- name: Install system dependencies
run: |


@@ -0,0 +1,27 @@
name: Arch Linux Package
on:
push:
tags:
- "*"
workflow_dispatch:
jobs:
build:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: https://git.quad4.io/actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
with:
fetch-depth: 0
- name: Build Arch Package
run: |
docker build -t arch-builder -f Dockerfile.arch-builder .
docker run --rm -v $(pwd):/home/build/project arch-builder
- name: Upload Artifact
uses: https://git.quad4.io/actions/upload-artifact@ff15f0306b3f739f7b6fd43fb5d26cd321bd4de5 # v3.2.1
with:
name: arch-package
path: packaging/arch/*.pkg.tar.zst
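The workflow only uploads the package as a CI artifact. For reference, a package built this way would be installed locally with pacman; a usage sketch (the glob assumes a single built package in packaging/arch/):

# Install the locally built Arch package:
sudo pacman -U packaging/arch/*.pkg.tar.zst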


@@ -0,0 +1,45 @@
name: Benchmarks
on:
workflow_dispatch:
jobs:
benchmark:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: https://git.quad4.io/actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- name: Setup Node.js
uses: https://git.quad4.io/actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6.1.0
with:
node-version: 22
cache: pnpm
- name: Setup Python
uses: https://git.quad4.io/actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
with:
python-version: "3.13"
- name: Setup Task
uses: https://git.quad4.io/actions/setup-task@0ab1b2a65bc55236a3bc64cde78f80e20e8885c2 # v1
with:
version: "3.46.3"
- name: Setup Poetry
run: pip install poetry
- name: Install dependencies
run: task install
- name: Run Benchmarks
id: bench
run: |
set -o pipefail
task bench 2>&1 | tee bench_results.txt
- name: Run Integrity Tests
id: integrity
run: |
set -o pipefail
task test-integrity 2>&1 | tee -a bench_results.txt
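Both steps set pipefail before piping into tee. Without it, the step's exit status would be tee's (effectively always 0), so a failing benchmark or integrity run would not fail the job. A quick demonstration of the difference:

false | tee /dev/null; echo "exit=$?"    # prints exit=0: the failure is masked by tee
set -o pipefail
false | tee /dev/null; echo "exit=$?"    # prints exit=1: the failure propagates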


@@ -0,0 +1,101 @@
name: Build Test
on:
push:
branches:
- "*"
pull_request:
branches:
- "*"
jobs:
build-test:
name: Build and Test
runs-on: ubuntu-latest
steps:
- name: Clone Repo
uses: https://git.quad4.io/actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
with:
fetch-depth: 0
- name: Install NodeJS
uses: https://git.quad4.io/actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6.1.0
with:
node-version: 22
- name: Install Python
uses: https://git.quad4.io/actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
with:
python-version: "3.13"
- name: Install Poetry
run: python -m pip install --upgrade pip poetry>=2.0.0
- name: Install pnpm
uses: https://git.quad4.io/actions/setup-pnpm@41ff72655975bd51cab0327fa583b6e92b6d3061 # v4
with:
version: 10.27.0
- name: Install system dependencies
run: |
sudo dpkg --add-architecture i386
sudo apt-get update
sudo apt-get install -y wine32:i386 wine64 patchelf libopusfile0 ffmpeg espeak-ng nsis zip rpm flatpak flatpak-builder elfutils appstream appstream-util
flatpak remote-add --if-not-exists --user flathub https://dl.flathub.org/repo/flathub.flatpakrepo
# Install runtimes required for Flatpak build
flatpak install --user -y flathub org.freedesktop.Platform//24.08 org.freedesktop.Sdk//24.08 org.electronjs.Electron2.BaseApp//24.08
- name: Setup Task
uses: https://git.quad4.io/actions/setup-task@0ab1b2a65bc55236a3bc64cde78f80e20e8885c2 # v1
with:
version: "3.46.3"
- name: Install dependencies
run: task install
- name: Build Frontend
run: task build-frontend
- name: Build Backend (Wheel)
run: task wheel
- name: Build Electron App (Linux)
run: pnpm run dist:linux
- name: Build Electron App (RPM - Experimental)
continue-on-error: true
run: task build-rpm
- name: Build Electron App (Flatpak - Experimental)
continue-on-error: true
run: task build-flatpak
- name: Build Electron App (Windows EXE and NSIS)
env:
WINEDEBUG: -all
run: pnpm run dist:windows
- name: Build Electron App (ZIP)
run: task build-zip
- name: Prepare release assets
run: |
mkdir -p release-assets
# Collect Linux artifacts
find dist -maxdepth 1 -type f \( -name "*-linux*.AppImage" -o -name "*-linux*.deb" -o -name "*-linux*.rpm" -o -name "*-linux*.flatpak" \) -exec cp {} release-assets/ \;
# Collect Windows artifacts
find dist -maxdepth 1 -type f \( -name "*-win*.exe" \) -exec cp {} release-assets/ \;
# Collect ZIP artifacts from Electron Forge
find out/make -type f -name "*.zip" -exec cp {} release-assets/ \;
# Collect Python artifacts
find python-dist -maxdepth 1 -type f -name "*.whl" -exec cp {} release-assets/ \;
# Create frontend zip
(cd meshchatx/public && zip -r ../../release-assets/meshchatx-frontend.zip .)
# Generate checksums
cd release-assets
for file in *; do
if [ -f "$file" ] && [[ "$file" != *.sha256 ]]; then
sha256sum "$file" | tee "${file}.sha256"
fi
done
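The sha256sum call above emits the standard "<hash>  <filename>" format, so a downloaded asset can be verified with sha256sum -c from the directory that contains it. A usage sketch (the asset name is one of the files produced above):

# Verify a downloaded release asset against its published checksum:
sha256sum -c meshchatx-frontend.zip.sha256   # prints "meshchatx-frontend.zip: OK"; exits nonzero on mismatch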


@@ -1,3 +1,4 @@
# Appimage builds produced by action are broken for now
name: Build and Release
on:
@@ -33,23 +34,30 @@ jobs:
- name: Determine version
id: version
run: |
if [ "${{ github.event_name }}" = "workflow_dispatch" ] && [ -n "${{ github.event.inputs.version }}" ]; then
echo "version=${{ github.event.inputs.version }}" >> $GITHUB_OUTPUT
VERSION=""
if [ "${{ github.event_name }}" = "workflow_dispatch" ]; then
VERSION="${{ inputs.version || github.event.inputs.version }}"
fi
if [ -n "$VERSION" ]; then
echo "Using version from input: $VERSION"
elif [[ "${{ github.ref }}" == refs/tags/* ]]; then
VERSION="${GITHUB_REF#refs/tags/}"
if [ -z "${VERSION}" ]; then
VERSION="${{ github.ref_name }}"
fi
if [ "${VERSION}" = "master" ] || [ "${VERSION}" = "main" ]; then
echo "Error: Invalid tag name '${VERSION}'. Tag name cannot be a branch name." >&2
exit 1
fi
echo "version=${VERSION}" >> $GITHUB_OUTPUT
echo "Using version from tag: $VERSION"
else
SHORT_SHA=$(git rev-parse --short HEAD)
echo "version=${SHORT_SHA}" >> $GITHUB_OUTPUT
VERSION=$(git rev-parse --short HEAD)
echo "Using version from SHA: $VERSION"
fi
if [ "${VERSION}" = "master" ] || [ -z "${VERSION}" ]; then
echo "Error: Invalid version '${VERSION}'. Version cannot be 'master' or empty." >&2
exit 1
fi
echo "version=${VERSION}" >> $GITHUB_OUTPUT
- name: Install NodeJS
uses: https://git.quad4.io/actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6.1.0
with:
@@ -66,21 +74,20 @@ jobs:
- name: Install pnpm
uses: https://git.quad4.io/actions/setup-pnpm@41ff72655975bd51cab0327fa583b6e92b6d3061 # v4
with:
version: 10.0.0
version: 10.27.0
- name: Install system dependencies
run: |
sudo dpkg --add-architecture i386
sudo apt-get update
sudo apt-get install -y patchelf libopusfile0 ffmpeg espeak-ng
sudo apt-get install -y wine32:i386 wine64 patchelf libopusfile0 ffmpeg espeak-ng nsis zip rpm flatpak flatpak-builder elfutils
flatpak remote-add --if-not-exists --user flathub https://dl.flathub.org/repo/flathub.flatpakrepo
- name: Setup Task
uses: https://git.quad4.io/actions/setup-task@0ab1b2a65bc55236a3bc64cde78f80e20e8885c2 # v1
with:
version: "3.46.3"
- name: Sync versions
run: python scripts/sync_version.py
- name: Install dependencies
run: task install
@@ -90,24 +97,60 @@ jobs:
- name: Build Python wheel
run: task wheel
- name: Build Electron App (Universal)
run: pnpm run dist-prebuilt
- name: Build Electron App (Appimage)
run: pnpm run dist:linux
- name: Build Electron App (RPM)
continue-on-error: true
run: task build-rpm
- name: Build Electron App (Flatpak)
continue-on-error: true
run: task build-flatpak
- name: Build Electron App (Windows EXE and NSIS)
env:
WINEDEBUG: -all
run: pnpm run dist:windows
- name: Prepare release assets
run: |
mkdir -p release-assets
# Collect artifacts
find dist -type f \( -name "*-linux*.AppImage" -o -name "*-linux*.deb" \) -exec cp {} release-assets/ \;
find python-dist -type f -name "*.whl" -exec cp {} release-assets/ \;
# Collect artifacts from dist/
# Linux artifacts
find dist -maxdepth 1 -type f \( -name "*-linux*.AppImage" -o -name "*-linux*.deb" -o -name "*-linux*.rpm" -o -name "*-linux*.flatpak" \) -exec cp {} release-assets/ \;
# Windows artifacts
find dist -maxdepth 1 -type f \( -name "*-win*.exe" \) -exec cp {} release-assets/ \;
# Python artifacts
find python-dist -maxdepth 1 -type f -name "*.whl" -exec cp {} release-assets/ \;
# Create frontend zip
(cd meshchatx/public && zip -r ../../release-assets/meshchatx-frontend.zip .)
# Generate SBOM (CycloneDX)
curl -L -o /tmp/trivy.deb https://git.quad4.io/Quad4-Software/Trivy-Assets/raw/commit/917e0e52b2f663cbbe13e63b7176262e248265ae/trivy_0.68.2_Linux-64bit.deb
sudo dpkg -i /tmp/trivy.deb || sudo apt-get install -f -y
trivy fs --format cyclonedx --include-dev-deps --output release-assets/sbom.cyclonedx.json .
# Generate checksums
cd release-assets
for file in *; do
if [ -f "$file" ] && [[ "$file" != *.sha256 ]]; then
sha256sum "$file" | tee "${file}.sha256"
fi
done
# Generate release notes (outside release-assets directory)
cd ..
echo "## SHA256 Checksums" > release-body.md
echo "" >> release-body.md
for file in *; do
if [ -f "$file" ] && [ "$file" != "release-body.md" ] && [[ "$file" != *.sha256 ]]; then
sha256sum "$file" | tee "${file}.sha256"
echo "\`$(cat "${file}.sha256")\`" >> release-body.md
for file in release-assets/*; do
if [ -f "$file" ] && [[ "$file" != *.sha256 ]] && [[ "$file" != *release-body.md* ]]; then
filename=$(basename "$file")
if [ -f "release-assets/${filename}.sha256" ]; then
# Extract just the filename and its sha256 (format: <sha256> <filename>)
echo "\`$(cat "release-assets/${filename}.sha256")\`" >> release-body.md
fi
fi
done
@@ -118,7 +161,7 @@ jobs:
echo "Error: Version is empty" >&2
exit 1
fi
if [ "${VERSION}" = "master" ] || [ "${VERSION}" = "main" ]; then
if [ "${VERSION}" = "master" ]; then
echo "Error: Invalid version '${VERSION}'. Version cannot be a branch name." >&2
exit 1
fi
@@ -132,47 +175,15 @@ jobs:
gitea_token: ${{ secrets.GITEA_TOKEN }}
title: ${{ steps.version.outputs.version }}
tag: ${{ steps.version.outputs.version }}
files: "release-assets/*"
bodyFile: "release-assets/release-body.md"
draft: false
files: |
release-assets/*.AppImage
release-assets/*.deb
release-assets/*.rpm
release-assets/*.flatpak
release-assets/*.exe
release-assets/*.whl
release-assets/*.sha256
release-assets/sbom.cyclonedx.json
body_path: "release-body.md"
draft: true
prerelease: false
# build_docker:
# runs-on: ubuntu-latest
# if: github.event_name == 'push' || (github.event_name == 'workflow_dispatch' && github.event.inputs.build_docker == 'true')
# permissions:
# packages: write
# contents: read
# steps:
# - name: Clone Repo
# uses: https://git.quad4.io/actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
# - name: Set lowercase repository owner
# run: echo "REPO_OWNER_LC=${GITHUB_REPOSITORY_OWNER,,}" >> $GITHUB_ENV
# - name: Set up QEMU
# uses: https://git.quad4.io/actions/setup-qemu-action@c7c53464625b32c7a7e944ae62b3e17d2b600130 # v3.7.0
# - name: Set up Docker Buildx
# uses: https://git.quad4.io/actions/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
# - name: Log in to the GitHub Container registry
# uses: https://git.quad4.io/actions/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3
# with:
# registry: ghcr.io
# username: ${{ github.actor }}
# password: ${{ secrets.GITHUB_TOKEN }}
# - name: Build and push Docker images
# uses: https://git.quad4.io/actions/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
# with:
# context: .
# platforms: linux/amd64,linux/arm64
# push: true
# tags: >-
# ghcr.io/${{ env.REPO_OWNER_LC }}/reticulum-meshchatx:latest,
# ghcr.io/${{ env.REPO_OWNER_LC }}/reticulum-meshchatx:${{ github.ref_name }}
# labels: >-
# org.opencontainers.image.title=Reticulum MeshChatX,
# org.opencontainers.image.description=Docker image for Reticulum MeshChatX,
# org.opencontainers.image.url=https://github.com/${{ github.repository }}/pkgs/container/reticulum-meshchatx/
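The reworked "Determine version" step at the top of this file resolves the release version with a fixed precedence: an explicit workflow_dispatch input wins, then the pushed tag name, then the short commit SHA, with a final guard against empty or branch-named versions. A minimal standalone sketch of that precedence for testing outside CI (the function and argument names are illustrative, not part of the workflow):

#!/usr/bin/env bash
# resolve_version INPUT GIT_REF -> echoes the resolved version, or fails
resolve_version() {
    local input="$1" ref="$2" version="$input"
    if [ -z "$version" ] && [[ "$ref" == refs/tags/* ]]; then
        version="${ref#refs/tags/}"              # tag push: strip refs/tags/
    fi
    if [ -z "$version" ]; then
        version="$(git rev-parse --short HEAD)"  # fallback: short commit SHA
    fi
    if [ -z "$version" ] || [ "$version" = "master" ]; then
        echo "Error: invalid version '$version'" >&2
        return 1
    fi
    echo "$version"
}
resolve_version "" "refs/tags/v4.0.0"         # -> v4.0.0
resolve_version "4.0.1" "refs/heads/master"   # -> 4.0.1 (input wins)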

.gitea/workflows/ci.yml

@@ -0,0 +1,111 @@
name: CI
on:
push:
branches:
- "*"
pull_request:
workflow_dispatch:
jobs:
lint:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: https://git.quad4.io/actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- name: Setup Node.js
uses: https://git.quad4.io/actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6.1.0
with:
node-version: 22
cache: pnpm
- name: Setup Python
uses: https://git.quad4.io/actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
with:
python-version: "3.13"
- name: Setup Task
uses: https://git.quad4.io/actions/setup-task@0ab1b2a65bc55236a3bc64cde78f80e20e8885c2 # v1
with:
version: "3.46.3"
- name: Setup Poetry
run: pip install poetry
- name: Setup Python environment
run: task setup-python-env
- name: Install Node dependencies
run: task node_modules
- name: Lint
run: |
set -o pipefail
task lint 2>&1 | tee lint_results.txt
build-frontend:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: https://git.quad4.io/actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- name: Setup Node.js
uses: https://git.quad4.io/actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6.1.0
with:
node-version: 22
cache: pnpm
- name: Setup Task
uses: https://git.quad4.io/actions/setup-task@0ab1b2a65bc55236a3bc64cde78f80e20e8885c2 # v1
with:
version: "3.46.3"
- name: Install dependencies
run: task node_modules
- name: Determine version
id: version
run: |
SHORT_SHA=$(git rev-parse --short HEAD)
echo "version=${SHORT_SHA}" >> $GITHUB_OUTPUT
- name: Build frontend
run: |
set -o pipefail
task build-frontend 2>&1 | tee build_results.txt
env:
VITE_APP_VERSION: ${{ steps.version.outputs.version }}
test-backend:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: https://git.quad4.io/actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- name: Setup Python
uses: https://git.quad4.io/actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
with:
python-version: "3.13"
- name: Setup Task
uses: https://git.quad4.io/actions/setup-task@0ab1b2a65bc55236a3bc64cde78f80e20e8885c2 # v1
with:
version: "3.46.3"
- name: Compile backend
run: |
set -o pipefail
task compile 2>&1 | tee compile_results.txt
test-lang:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: https://git.quad4.io/actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- name: Setup Node.js
uses: https://git.quad4.io/actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6.1.0
with:
node-version: 22
cache: pnpm
- name: Setup Python
uses: https://git.quad4.io/actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
with:
python-version: "3.13"
- name: Setup Task
uses: https://git.quad4.io/actions/setup-task@0ab1b2a65bc55236a3bc64cde78f80e20e8885c2 # v1
with:
version: "3.46.3"
- name: Setup Poetry
run: pip install poetry
- name: Install dependencies
run: task install
- name: Run language tests
run: |
set -o pipefail
task test-lang 2>&1 | tee lang_results.txt
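All four jobs delegate to Task targets, so the same checks can be reproduced locally before pushing (assuming Task, Poetry, pnpm, Node 22, and Python 3.13 are installed, matching the setup steps above):

pip install poetry    # Poetry, as in the workflow
task install          # dependencies (used by the test-lang job)
task lint             # lint job
task compile          # test-backend job: Python syntax check
task build-frontend   # build-frontend job
task test-lang        # test-lang job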

.gitea/workflows/docker.yml

@@ -0,0 +1,138 @@
name: Build and Publish Docker Image
on:
workflow_dispatch:
push:
tags:
- "*"
pull_request:
env:
REGISTRY: git.quad4.io
IMAGE_NAME: rns-things/meshchatx
DEV_IMAGE_NAME: rns-things/meshchatx-dev
jobs:
build:
if: github.event_name != 'pull_request'
runs-on: ubuntu-latest
permissions:
contents: read
packages: write
outputs:
image_digest: ${{ steps.build.outputs.digest }}
image_tags: ${{ steps.meta.outputs.tags }}
steps:
- name: Checkout repository
uses: https://git.quad4.io/actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
with:
fetch-depth: 0
- name: Set up QEMU
uses: https://git.quad4.io/actions/setup-qemu-action@c7c53464625b32c7a7e944ae62b3e17d2b600130 # v3.7.0
with:
platforms: amd64,arm64
- name: Set up Docker Buildx
uses: https://git.quad4.io/actions/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Log in to the Container registry
uses: https://git.quad4.io/actions/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
registry: ${{ env.REGISTRY }}
username: ${{ secrets.REGISTRY_USERNAME }}
password: ${{ secrets.REGISTRY_PASSWORD }}
- name: Extract metadata (tags, labels) for Docker
id: meta
uses: https://git.quad4.io/actions/metadata-action@c299e40c65443455700f0fdfc63efafe5b349051 # v5.10.0
with:
images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
tags: |
type=raw,value=latest,enable={{is_default_branch}}
type=ref,event=branch,prefix=,suffix=,enable={{is_default_branch}}
type=semver,pattern={{version}}
type=semver,pattern={{major}}.{{minor}}
type=sha,format=short
- name: Build and push Docker image
id: build
uses: https://git.quad4.io/actions/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
with:
context: .
file: ./Dockerfile
platforms: linux/amd64,linux/arm64
push: true
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
- name: Download Trivy
run: |
curl -L -o /tmp/trivy.deb https://git.quad4.io/Quad4-Software/Trivy-Assets/raw/commit/917e0e52b2f663cbbe13e63b7176262e248265ae/trivy_0.68.2_Linux-64bit.deb
sudo dpkg -i /tmp/trivy.deb || sudo apt-get install -f -y
- name: Scan Docker image
run: |
# Extract the first tag from the multi-line tags output
IMAGE_TAG=$(echo "${{ steps.meta.outputs.tags }}" | head -n 1)
trivy image --exit-code 1 "$IMAGE_TAG"
build-dev:
if: github.event_name == 'pull_request'
runs-on: ubuntu-latest
permissions:
contents: read
packages: write
steps:
- name: Checkout repository
uses: https://git.quad4.io/actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
with:
fetch-depth: 0
- name: Set up QEMU
uses: https://git.quad4.io/actions/setup-qemu-action@c7c53464625b32c7a7e944ae62b3e17d2b600130 # v3.7.0
with:
platforms: amd64,arm64
- name: Set up Docker Buildx
uses: https://git.quad4.io/actions/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
- name: Log in to the Container registry
uses: https://git.quad4.io/actions/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
registry: ${{ env.REGISTRY }}
username: ${{ secrets.REGISTRY_USERNAME }}
password: ${{ secrets.REGISTRY_PASSWORD }}
- name: Extract DEV metadata (tags, labels) for Docker
id: meta-dev
uses: https://git.quad4.io/actions/metadata-action@c299e40c65443455700f0fdfc63efafe5b349051 # v5.10.0
with:
images: ${{ env.REGISTRY }}/${{ env.DEV_IMAGE_NAME }}
tags: |
type=raw,value=dev
type=sha,format=short
- name: Build and push dev Docker image
id: build-dev
uses: https://git.quad4.io/actions/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
with:
context: .
file: ./Dockerfile
platforms: linux/amd64,linux/arm64
push: true
tags: ${{ steps.meta-dev.outputs.tags }}
labels: ${{ steps.meta-dev.outputs.labels }}
- name: Download Trivy
run: |
curl -L -o /tmp/trivy.deb https://git.quad4.io/Quad4-Software/Trivy-Assets/raw/commit/917e0e52b2f663cbbe13e63b7176262e248265ae/trivy_0.68.2_Linux-64bit.deb
sudo dpkg -i /tmp/trivy.deb || sudo apt-get install -f -y
- name: Scan Docker image (dev)
run: |
# Extract the first tag from the multi-line tags output
IMAGE_TAG=$(echo "${{ steps.meta-dev.outputs.tags }}" | head -n 1)
trivy image --exit-code 1 "$IMAGE_TAG"
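Both jobs scan the freshly pushed image with Trivy, and --exit-code 1 makes the scan step (and therefore the job) fail whenever known vulnerabilities are found. The local equivalent, using the dev image tag this workflow would produce:

# Fail with exit code 1 if the image contains known vulnerabilities:
trivy image --exit-code 1 git.quad4.io/rns-things/meshchatx-dev:dev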


@@ -0,0 +1,20 @@
name: OSV-Scanner PR Scan
on:
pull_request:
branches: [master]
merge_group:
branches: [master]
permissions:
contents: read
jobs:
scan-pr:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: https://git.quad4.io/actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- name: OSV scan
run: bash scripts/osv_scan.sh


@@ -0,0 +1,20 @@
name: OSV-Scanner Scheduled Scan
on:
schedule:
- cron: "30 12 * * 1"
push:
branches: [master]
permissions:
contents: read
jobs:
scan-scheduled:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: https://git.quad4.io/actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- name: OSV scan
run: bash scripts/osv_scan.sh
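Both OSV workflows delegate to `scripts/osv_scan.sh`, whose contents are not shown on this page. A plausible local equivalent — purely an assumption about what the script wraps — using the stock osv-scanner CLI:

```bash
# Assumption: scripts/osv_scan.sh wraps an invocation like this —
# recursively scan the checkout for known-vulnerable dependencies
osv-scanner --recursive .
```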


@@ -0,0 +1,42 @@
name: Tests
on:
push:
branches:
- "*"
pull_request:
workflow_dispatch:
jobs:
test:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: https://git.quad4.io/actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- name: Setup Node.js
uses: https://git.quad4.io/actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6.1.0
with:
node-version: 22
cache: pnpm
- name: Setup Python
uses: https://git.quad4.io/actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
with:
python-version: "3.13"
- name: Setup Task
uses: https://git.quad4.io/actions/setup-task@0ab1b2a65bc55236a3bc64cde78f80e20e8885c2 # v1
with:
version: "3.46.3"
- name: Setup Poetry
run: pip install poetry
- name: Install dependencies
run: task install
- name: Run tests
run: |
set -o pipefail
task test 2>&1 | tee test_results.txt
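A note on the `set -o pipefail` line above: without it, the step's exit status would be `tee`'s (always 0), hiding a failing `task test` from CI. A quick shell illustration:

```bash
false | tee /dev/null; echo "exit: $?"   # prints "exit: 0" — the failure is masked
set -o pipefail
false | tee /dev/null; echo "exit: $?"   # prints "exit: 1" — the failure propagates
```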

.gitignore

@@ -30,7 +30,6 @@ venv.bak/
/build/
/dist/
/meshchatx/public/
public/
/electron/build/exe/
python-dist/
@@ -70,4 +69,25 @@ Thumbs.db
# Environment variables
.env
.env.local
.env.*.local
.env.*.local
.coverage
meshchat-config/
.hypothesis
.hypothesis/
MagicMock/
out/
# Arch Linux packaging artifacts
/packaging/arch/src/
/packaging/arch/pkg/
/packaging/arch/*.pkg.tar.zst
/packaging/arch/MeshChatX/
/packaging/arch/reticulum-meshchatx/
electron/backend-manifest.json
scripts/private/

.npmrc

@@ -1,6 +1 @@
registry=https://registry.npmjs.org/
fetch-retries=5
fetch-retry-mintimeout=20000
fetch-retry-maxtimeout=120000
fetch-timeout=300000
node-linker=hoisted


@@ -1,9 +1,31 @@
dist
node_modules
build
electron/assets
meshchatx/public
pnpm-lock.yaml
poetry.lock
*.log
# Prettier ignore file
# Dependencies
node_modules/
pnpm-lock.yaml
# Build output
dist/
build/
linux-unpacked/
win-unpacked/
mac-unpacked/
# Public assets and libraries
meshchatx/public/
meshchatx/src/frontend/public/
meshchatx/src/frontend/style.css
# Other
storage/
__pycache__/
.venv/
MagicMock/
*.min.js
*.wasm
*.proto
# Documentation and misc
misc/README.md
android/README.md
CHANGELOG.md

CHANGELOG.md

@@ -0,0 +1,120 @@
# Changelog
All notable changes to this project will be documented in this file.
## [4.0.0] - 2026-01-03
Season 1 Episode 1 - A MASSIVE REFACTOR
### New Features
- **Banishment System (formerly Blocked):**
- Renamed all instances of "Blocked" to **"Banished"**; you can now banish really annoying people to the shadow realm.
- **Blackhole Integration:** Automatically blackholes identities at the RNS transport layer when they are banished in MeshChatX. This prevents their traffic from being relayed through your node and publishes the update to your interfaces (trusted interfaces will pull and enforce the banishment).
- Integrated RNS 1.1.0 Blackhole to display publishing status, sources, and current blackhole counts in the RNStatus page.
- **RNPath Management Tool:** New UI tool to manage the Reticulum path table, monitor announce rates (with rate-limit detection), and perform manual path requests or purges directly from the app.
- **Maps:** You can now draw and doodle directly on the map to mark locations or plan routes.
- **Calls & Audio:**
- Added support for custom ringtones and a brand-new ringtone editor.
- New **Audio Waveform Visualization** for voice messages, providing interactive playback with a visual waveform representation.
- **Paper Messages:** Introduced a tool for generating and scanning paper-based messages with built-in QR code generation for easy sharing.
- **LXMF Telemetry & Live Tracking**:
- Full implementation of Sideband-compatible telemetry (`FIELD_TELEMETRY` & `FIELD_TELEMETRY_STREAM`); Columba compatibility still needs testing.
- Live tracking with real-time map updates, distinct blue pulsing animations, and historical path tracing (breadcrumb trails).
- Mini-chat integrated into map markers for quick communication with telemetry peers.
- Privacy controls with global telemetry toggle and per-peer "Trust for Telemetry" settings.
- Detailed telemetry history timeline with interactive battery voltage/percentage sparkline charts.
- **Documentation:** You can now read all the project guides and help docs directly inside the app.
- **Reliability:**
- If the app ever crashes, it's now much better at picking up right where it left off without losing your data.
- Added **Identity Switch Recovery**: mechanism to restore previous identities or create emergency failsafes if a switch fails.
- Multi-Identity "Keep-Alive": Identities can now be kept active in the background when switching, ensuring you still receive messages and calls across all your personas.
- Added **Database Snapshotting & Auto-Backups**: You can now create named snapshots of your database and the app will perform automatic backups every 12 hours.
- Added **Emergency Comms Mode**: A lightweight mode that bypasses database storage and non-essential managers, useful for recovering from corrupted data or running in restricted environments. Can be engaged via the UI, a CLI flag (`--emergency`), or an environment variable (`MESHCHAT_EMERGENCY=1`).
- Added **Snapshot Restoration**: Ability to restore from a specific snapshot on startup via `--restore-from-snapshot` or the `MESHCHAT_RESTORE_SNAPSHOT` environment variable (see the example after this list).
- **Diagnostics:**
- New **Debug Logs Screen**: View and export internal system logs directly from the UI for easier troubleshooting.
- **Community:** Better support for community-run network interfaces and checking TCP ping status of suggested interfaces.
- **UI Tweaks:** Added a new confirmation box for important actions and a better sidebar for browsing your archived messages.
- **Micron Editor:** Added multi-tab support with IndexedDB persistence, tab renaming, and a full editor reset button.
- **Desktop Enhancements (Electron):**
- **Multi-Window Calls:** Optional support for popping active calls into a focused second window.
- **System Tray Integration:** The app now minimizes to the system tray, keeping you connected to the mesh in the background.
- **Native Notifications:** Switched to system-native notifications with deep-linking (click to focus conversation).
- **Protocol Handling:** Register as the default handler for `lxmf://` and `rns://` links for seamless cross-app navigation.
- **Hardware Acceleration Toggle:** Power-user setting to disable GPU acceleration if flickering or glitches occur.
- **Power Management:** Automatically prevents system sleep during active audio calls to maintain RNS path stability.
- **Added Web Audio Bridge**, which allows the web/Electron frontend to hook into the LXST backend for passing microphone and audio streams to active telephone calls.
- **Added LXMFy** for running bots.
- **Added RNS Discoverable Interfaces** (https://markqvist.github.io/Reticulum/manual/interfaces.html#discoverable-interfaces) and the ability to map those that publish a location.
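A hypothetical invocation of the recovery features above, assuming only the flags and environment variables named in this changelog (the snapshot name is illustrative):

```bash
# Start in Emergency Comms Mode (bypasses database storage and non-essential managers)
meshchat --emergency

# Equivalent via environment variable
MESHCHAT_EMERGENCY=1 meshchat

# Restore a named snapshot on startup ("pre-upgrade" is a made-up name)
meshchat --restore-from-snapshot pre-upgrade
```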
### Improvements
- **Blazingly Fast Performance:**
- **Network Rendering:** The Network Visualizer now uses intelligent batching to handle hundreds of nodes without freezing your screen.
- **Memory Optimization:** Added a smart icon cache that automatically clears itself to keep the app's memory footprint low.
- **Parallel Loading:** The app now fetches network data in parallel, cutting down startup and refresh times significantly.
- **Lazy Loading:** Documentation and other heavy components now load only when you need them, making the initial app launch much faster.
- **Smoother Settings:** Changing settings now uses "smart saving" (debouncing) to prevent unnecessary disk work and keep the interface responsive.
- **Backend Efficiency:** A massive core refactor and new database optimizations make message handling and search nearly instantaneous. Added pagination to announce and discovery lists to improve performance in large networks.
- **Calling:** The call screen and overlays have been completely redesigned to look better and work more smoothly.
- **Messaging:**
- Polished the message lists and archive views to make them easier to navigate.
- Added "Retry All" functionality for failed or cancelled messages in conversation views.
- Improved handling of `lxm.ingest_uri.result` with detailed notifications for success/error/warning states.
- **Maintenance Tools:** Added new maintenance utilities to clear LXMF user icon caches and manage backup configurations.
- **Network View:** The visualizer that shows your network connections is now much clearer and easier to understand.
- **Languages:** Updated translations for English, German, and Russian. Added **Italian (it-IT)** localization. Added a toggle to easily enable or disable translation services.
- **Search:** The command palette (quick search) and notification bell are now more useful.
- **CartoDB Tiles** - more map styles if OSM is not enough for you; the MBTiles exporter will pull tiles from the selected style.
- **Basic Markdown in Messages** - render basic markdown formatting in messages.
### Bug Fixes
- Fixed issues where switching between different identities could sometimes cause glitches.
- Fixed several small bugs that could cause messages to get stuck or out of order.
- Lots of small UI fixes to make buttons and menus look right on different screens.
- Fixed the glitchy messages page.
### Technical
- **Backend Architecture:**
- Decoupled logic into new specialized managers: `community_interfaces.py`, `docs_manager.py`, `identity_manager.py`, `voicemail_manager.py`, and `nomadnet_utils.py`.
- Added specialized utility modules: `meshchat_utils.py`, `lxmf_utils.py`, `async_utils.py`, and `identity_context.py`.
- Implemented a robust state-based crash recovery system in `src/backend/recovery/`.
- **Self-Healing Database Schema**: Enhanced `DatabaseSchema` with automatic column synchronization to prevent crashes when upgrading from older versions with missing columns.
- Enhanced database layer with `map_drawings.py` and improved `telephone.py` schema for call logging.
- Standardized markdown processing with a new `markdown_renderer.py`.
- Added pagination support for announce queries in `AnnounceManager`.
- **Performance Engineering & Memory Profiling:**
- Integrated a comprehensive backend benchmarking suite (`tests/backend/run_comprehensive_benchmarks.py`) with high-precision timing and memory delta tracking.
- Added an **EXTREME Stress Mode** to simulate ultra-high load scenarios (100,000+ messages and 50,000+ announces).
- Implemented automated memory leak detection and profiling tests using `psutil` and custom `MemoryTracker` utilities.
- **Full-Stack Integrity & Anti-Tampering:**
- Implemented **Backend Binary Verification**: The app now generates a SHA-256 manifest of the unpacked Python backend during build and verifies it on every startup in Electron.
- Added **Data-at-Rest Integrity Monitoring**: The backend now snapshots the state of identities and database files on clean shutdown and warns if they were modified while the app was closed.
- New **Security Integrity Modal**: Notifies the user via a persistent modal if any tampering is detected, with a version-specific "do not show again" option.
- **Frontend Refactor:**
- Migrated complex call logic into `CallOverlay.vue` and `CallPage.vue` with improved state management.
- Implemented modular UI components: `ArchiveSidebar.vue`, `RingtoneEditor.vue`, `ConfirmDialog.vue`, and `AudioWaveformPlayer.vue`.
- Integrated a new documentation browsing system in `src/frontend/components/docs/`.
- Added custom Leaflet integration for map drawing persistence in `MapPage.vue`.
- **Infrastructure:**
- Added `Dockerfile.build` for multi-stage container builds.
- Introduced `gen_checksums.sh` for release artifact integrity.
- **Comprehensive Testing Suite:**
- Added 80+ new unit, integration, and fuzz tests across `tests/backend/` and `tests/frontend/`.
- Implemented property-based fuzzing for LXMF message parsing and telemetry packing using `hypothesis`.
- Updated CI coverage for telemetry and network interface logic.
- Updated core dependencies: `rns`, `lxmf`, `aiohttp`, and `websockets`.
- **Developer Tools & CI:**
- New `task` commands: `bench-backend` (Standard suite), `bench-extreme` (Breaking Time and Space), `profile-memory` (Leak testing), and `bench` (Full run).
- Added Gitea Actions workflow (`bench.yml`) for automated performance regression tracking on every push.
- **Utilize Electron 39 features:**
- Enabled **ASAR Integrity Validation** (Stable in E39) to protect the application against tampering.
- Hardened security by disabling `runAsNode` and `nodeOptions` environment variables via Electron Fuses.
- Implemented **3-Layer CSP Hardening**: Multi-layered Content Security Policy protection across the entire application stack:
1. **Backend Server CSP** (`meshchatx/meshchat.py`): Applied via `security_middleware` to all HTTP responses, allowing localhost connections, websockets, and required external resources (OpenStreetMap tiles, etc.).
2. **Electron Session CSP** (`electron/main.js`): Shell-level fallback CSP applied via `webRequest.onHeadersReceived` handler to ensure coverage before the backend starts and for all Electron-rendered content.
3. **Loading Screen CSP** (`electron/loading.html`): Bootloader CSP defined in HTML meta tag to protect the initial loading screen while waiting for the backend API to come online.
- Added hardware acceleration monitoring to ensure the Network Visualiser and UI are performing optimally.
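A quick way to spot-check the first CSP layer is to inspect the headers served by the backend; a sketch assuming the default `127.0.0.1:8000` bind and the locally generated HTTPS certs (hence `-k`):

```bash
# Layer 1: the backend's security_middleware should attach a
# Content-Security-Policy header to every HTTP response
curl -skI https://127.0.0.1:8000/ | grep -i content-security-policy
```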


@@ -1,47 +1,57 @@
# Build arguments
ARG NODE_VERSION=20
ARG NODE_ALPINE_SHA256=sha256:6a91081a440be0b57336fbc4ee87f3dab1a2fd6f80cdb355dcf960e13bda3b59
ARG PYTHON_VERSION=3.11
ARG PYTHON_ALPINE_SHA256=sha256:822ceb965f026bc47ee667e50a44309d2d81087780bbbf64f2005521781a3621
# Build the frontend
FROM node:${NODE_VERSION}-alpine@${NODE_ALPINE_SHA256} AS build-frontend
ARG NODE_IMAGE=node:22-alpine
ARG NODE_HASH=sha256:0340fa682d72068edf603c305bfbc10e23219fb0e40df58d9ea4d6f33a9798bf
ARG PYTHON_IMAGE=python:3.12.12-alpine3.23
ARG PYTHON_HASH=sha256:68d81cd281ee785f48cdadecb6130d05ec6957f1249814570dc90e5100d3b146
# Stage 1: Build Frontend
FROM ${NODE_IMAGE}@${NODE_HASH} AS build-frontend
WORKDIR /src
COPY package.json pnpm-lock.yaml vite.config.js tailwind.config.js postcss.config.js ./
COPY meshchatx/src/frontend ./meshchatx/src/frontend
RUN corepack enable && corepack prepare pnpm@latest --activate && \
pnpm install --frozen-lockfile && \
pnpm run build-frontend
# Copy required source files
COPY package.json vite.config.js ./
COPY pnpm-lock.yaml ./
COPY meshchatx ./meshchatx
# Install pnpm
RUN corepack enable && corepack prepare pnpm@latest --activate
# Install Node.js deps, excluding Electron
RUN pnpm install --prod && \
pnpm run build-frontend
# Main app build
FROM python:${PYTHON_VERSION}-alpine@${PYTHON_ALPINE_SHA256}
WORKDIR /app
# Install Python deps
COPY ./requirements.txt .
RUN apk add --no-cache ffmpeg espeak-ng opusfile && \
apk add --no-cache --virtual .build-deps \
gcc \
musl-dev \
linux-headers \
python3-dev && \
pip install -r requirements.txt && \
apk del .build-deps
# Copy prebuilt frontend
COPY --from=build-frontend /src/meshchatx/public meshchatx/public
# Copy other required source files
COPY meshchatx ./meshchatx
# Stage 2: Build Backend & Virtual Environment
FROM ${PYTHON_IMAGE}@${PYTHON_HASH} AS builder
WORKDIR /build
# Install build dependencies for C-extensions
RUN apk add --no-cache gcc musl-dev linux-headers python3-dev libffi-dev openssl-dev git
# Setup venv and install dependencies
RUN python -m venv /opt/venv
ENV PATH="/opt/venv/bin:$PATH"
COPY pyproject.toml poetry.lock ./
RUN pip install --no-cache-dir "pip>=25.3" poetry setuptools wheel && \
poetry config virtualenvs.create false && \
poetry install --no-root --only main
CMD ["python", "-m", "meshchatx.meshchat", "--host=0.0.0.0", "--reticulum-config-dir=/config/.reticulum", "--storage-dir=/config/.meshchat", "--headless"]
# Copy source code and built frontend
COPY meshchatx ./meshchatx
COPY --from=build-frontend /src/meshchatx/public ./meshchatx/public
# Install the package itself into the venv
RUN pip install . && \
# Trigger LXST filter compilation while build tools are still present
python -c "import LXST.Filters; print('LXST Filters compiled successfully')" && \
python -m compileall /opt/venv/lib/python3.12/site-packages
# Stage 3: Final Runtime Image
FROM ${PYTHON_IMAGE}@${PYTHON_HASH}
WORKDIR /app
# Install runtime dependencies only
# We keep py3-setuptools because CFFI/LXST might need it at runtime on Python 3.12+
RUN apk add --no-cache ffmpeg opusfile libffi su-exec py3-setuptools espeak-ng && \
python -m pip install --no-cache-dir --upgrade "pip>=25.3" && \
addgroup -g 1000 meshchat && adduser -u 1000 -G meshchat -S meshchat && \
mkdir -p /config && chown meshchat:meshchat /config
# Copy the virtual environment from the build stage
COPY --from=builder --chown=meshchat:meshchat /opt/venv /opt/venv
# Set up environment
ENV PATH="/opt/venv/bin:$PATH"
ENV PYTHONUNBUFFERED=1
ENV PYTHONDONTWRITEBYTECODE=1
# Run the app using the installed 'meshchat' entrypoint
CMD ["sh", "-c", "chown -R meshchat:meshchat /config && exec su-exec meshchat meshchat --host=0.0.0.0 --reticulum-config-dir=/config/.reticulum --storage-dir=/config/.meshchat --headless"]

Dockerfile.arch-builder

@@ -0,0 +1,39 @@
FROM archlinux:latest
# Install build dependencies
RUN pacman -Syu --noconfirm --needed \
base-devel \
git \
sudo \
nodejs \
pnpm \
python \
python-poetry \
opus \
opusfile \
portaudio \
espeak-ng \
nss \
atk \
at-spi2-core \
libxcomposite \
libxrandr \
libxdamage \
mesa \
alsa-lib \
libx11
# Create a non-root user for makepkg
RUN useradd -m build && \
echo "build ALL=(ALL) NOPASSWD: ALL" > /etc/sudoers.d/build
# Set up build directory
USER build
WORKDIR /home/build/project
# Copy packaging files
COPY --chown=build:build packaging/arch /home/build/project/packaging/arch
# Default command to build the package
CMD ["/bin/bash", "-c", "cd packaging/arch && makepkg -s --noconfirm"]


@@ -1,32 +0,0 @@
# Build arguments
ARG NODE_VERSION=20
ARG NODE_ALPINE_SHA256=sha256:6a91081a440be0b57336fbc4ee87f3dab1a2fd6f80cdb355dcf960e13bda3b59
# Build the frontend
FROM node:${NODE_VERSION}-alpine@${NODE_ALPINE_SHA256} AS build-frontend
WORKDIR /src
COPY package.json vite.config.js pnpm-lock.yaml tailwind.config.js postcss.config.js ./
# Copy only the frontend source and version info to speed up builds and reduce image size
COPY meshchatx/src/frontend ./meshchatx/src/frontend
COPY meshchatx/src/version.py ./meshchatx/src/version.py
RUN corepack enable && corepack prepare pnpm@latest --activate
RUN pnpm install
RUN pnpm run build-frontend
RUN find /src/meshchatx/public -type d -exec chmod 755 {} + && \
find /src/meshchatx/public -type f -exec chmod 644 {} +
# Runtime stage
FROM nginxinc/nginx-unprivileged:alpine
COPY --from=build-frontend --chown=101:101 /src/meshchatx/public /usr/share/nginx/html
COPY nginx.demo.conf /etc/nginx/conf.d/default.conf
EXPOSE 8080
CMD ["nginx", "-g", "daemon off;"]


@@ -1,3 +1,12 @@
recursive-include meshchatx/public *
recursive-include meshchatx/src *
recursive-include meshchatx/src/backend *
include meshchatx/src/version.py
include meshchatx/src/__init__.py
include meshchatx/meshchat.py
include meshchatx/__init__.py
exclude meshchatx/src/frontend
recursive-exclude meshchatx/src/frontend *
recursive-exclude * __pycache__
recursive-exclude * *.py[co]

README.md

@@ -1,29 +1,72 @@
# Reticulum MeshChatX
> [!WARNING]
> Backup your reticulum-meshchat folder before using! MeshChatX will attempt to auto-migrate whatever it can from the old database without breaking things, but it is best to keep backups.
Contact me for any issues or ideas:
```
LXMF: 7cc8d66b4f6a0e0e49d34af7f6077b5a
```
[![CI](https://git.quad4.io/RNS-Things/MeshChatX/actions/workflows/ci.yml/badge.svg?branch=master)](https://git.quad4.io/RNS-Things/MeshChatX/actions/workflows/ci.yml)
[![Tests](https://git.quad4.io/RNS-Things/MeshChatX/actions/workflows/tests.yml/badge.svg?branch=master)](https://git.quad4.io/RNS-Things/MeshChatX/actions/workflows/tests.yml)
[![Build](https://git.quad4.io/RNS-Things/MeshChatX/actions/workflows/build.yml/badge.svg?branch=master)](https://git.quad4.io/RNS-Things/MeshChatX/actions/workflows/build.yml)
[![Docker](https://git.quad4.io/RNS-Things/MeshChatX/actions/workflows/docker.yml/badge.svg?branch=master)](https://git.quad4.io/RNS-Things/MeshChatX/actions/workflows/docker.yml)
A [Reticulum MeshChat](https://github.com/liamcottle/reticulum-meshchat) fork from the future.
<video src="showcase/showcase-video-call.mp4" controls="controls" style="max-width: 100%;"></video>
<video src="https://strg.0rbitzer0.net/raw/62926a2a-0a9a-4f44-a5f6-000dd60deac1.mp4" controls="controls" style="max-width: 100%;"></video>
This project is seperate from the original Reticulum MeshChat project, and is not affiliated with the original project.
This project is separate from the original Reticulum MeshChat project, and is not affiliated with the original project.
> [!WARNING]
> Backup your reticulum-meshchat folder before using, even though MeshChatX will attempt to auto-migrate whatever it can from the old database without breaking things. It's best to keep backups.
## Goal
To provide everything you need for Reticulum, LXMF, and LXST in one beautiful and feature-rich application.
- Desktop app (Linux, Windows, macOS)
- Self-host on your server easily with or without containers
- Mobile app (one can dream)
- Reliable, secure, fast and easy to use.
Note on macOS: You will need to manually build or use containers since I do not have a macOS machine or runner.
## Quick Start (Docker - Recommended)
The easiest way to get MeshChatX running is using Docker. Our official image is multi-arch and supports `linux/amd64` and `linux/arm64` (Raspberry Pi, etc.).
```bash
# Pull and run the latest image
docker pull git.quad4.io/rns-things/meshchatx:latest
# Run MeshChatX in a Docker container
# (add --network=host to the command below for autointerface support)
docker run -d \
--name=meshchatx \
-p 8000:8000 \
-v $PWD/storage:/app/storage \
git.quad4.io/rns-things/meshchatx:latest
# Or use Docker Compose for an even easier setup
docker compose up -d
```
Check [releases](https://git.quad4.io/RNS-Things/MeshChatX/releases) for pre-built binaries (AppImage, DEB, EXE) if you prefer standalone apps. (coming soon)
## Major Features
- Full LXST support w/ custom voicemail, phonebook, contacts, contact sharing and ringtone support.
- Multi-identity support.
- Authentication
- Map (OpenLayers w/ MBTiles upload and exporter for offline maps)
- Security improvements (automatic HTTPS, CORS, and much more)
- Modern Custom UI/UX
- More Tools (RNStatus, RNProbe, RNCP and Translator)
- Built-in page archiving and automatic crawler.
- Block LXMF users, Telephony and NomadNet Nodes
- Toast system for notifications
- i18n support (En, De, Ru)
- Raw SQLite database backend (replaced Peewee ORM)
- LXMF Telemetry support (WIP)
- **Full LXST Support**: Custom voicemail, phonebook, contact sharing, and ringtone support.
- **Interface Discovery and auto-connecting** - Discover interfaces, auto-connect or connect to trusted ones, map them all!
- **Multi-Identity**: Switch between multiple Reticulum identities seamlessly.
- **Modern UI/UX**: A completely redesigned, intuitive interface.
- **Integrated Maps**: OpenLayers with MBTiles support for offline maps.
- **Security**: Read more about it in the [Security](#security) section.
- **Offline Docs**: Access Reticulum documentation without an internet connection.
- **Expanded Tools**: Includes dozens of more tools.
- **Page Archiving**: Built-in crawler and browser for archived pages offline.
- **Banishment**: Banish LXMF users, Telephony, and NomadNet Nodes.
- **i18n**: Support for English, German, Italian, and Russian.
## Screenshots
@@ -31,18 +74,23 @@ This project is seperate from the original Reticulum MeshChat project, and is no
<summary>Telephony & Calling</summary>
### Phone
![Phone](screenshots/phone.png)
### Active Call
![Calling](screenshots/calling.png)
### Call Ended
![Call Ended](screenshots/calling-end.png)
### Voicemail
![Voicemail](screenshots/voicemail.png)
### Ringtone Settings
![Ringtone](screenshots/ringtone.png)
</details>
@@ -51,6 +99,7 @@ This project is seperate from the original Reticulum MeshChat project, and is no
<summary>Networking & Visualization</summary>
### Network Visualiser
![Network Visualiser](screenshots/network-visualiser.png)
![Network Visualiser 2](screenshots/network-visualiser2.png)
@@ -60,9 +109,11 @@ This project is seperate from the original Reticulum MeshChat project, and is no
<summary>Page Archives</summary>
### Archives Browser
![Archives](screenshots/archives.png)
### Viewing Archived Page
![Archive View](screenshots/archive-view.png)
</details>
@@ -71,137 +122,156 @@ This project is seperate from the original Reticulum MeshChat project, and is no
<summary>Tools & Identities</summary>
### Tools
![Tools](screenshots/tools.png)
### Identity Management
![Identities](screenshots/identities.png)
</details>
## TODO
### Pipx / Global Installation
- [ ] Tests and proper CI/CD pipeline.
- [ ] RNS hot reload fix
- [ ] Offline Reticulum documentation tool
- [ ] Spam filter (based on keywords)
- [ ] TAK tool/integration
- [ ] RNS Tunnel - tunnel your regular services over RNS to another MeshchatX user.
- [ ] RNS Filesync - P2P file sync
- [ ] RNS Page Node
- [x] Micron Editor (w/ [micron-parser](https://github.com/RFnexus/micron-parser) by [RFnexus](https://github.com/RFnexus))
If you prefer to install MeshChatX globally using `pipx`, `pip`, or `uv`, you can do so directly from the repository. However, you must specify the path to your built frontend files using the `--public-dir` flag or the `MESHCHAT_PUBLIC_DIR` environment variable, as the static files are not bundled with the source code. The release `.whl` packages include the built frontend files, and there is also a separate frontend zip to grab and use.
## Usage
1. **Install MeshChatX**:
Check [releases](https://git.quad4.io/Ivan/MeshChatX/releases) for pre-built binaries or appimages.
```bash
pipx install git+https://git.quad4.io/RNS-Things/MeshChatX
```
## Building
2. **Run with Frontend Path**:
```bash
# Replace /path/to/MeshChatX/meshchatx/public with your actual path
meshchat --public-dir /path/to/MeshChatX/meshchatx/public
```
This project uses [Task](https://taskfile.dev/) for build automation. Install Task first, then:
### Manual Installation (From Source)
```bash
task install # installs Python deps via Poetry and Node deps via pnpm
task build
```
If you want to run MeshChatX from the source code locally:
You can run `task run` or `task develop` (a thin alias) to start the backend + frontend loop locally through `poetry run meshchat`.
1. **Clone the repository**:
### Available Tasks
```bash
git clone https://git.quad4.io/RNS-Things/MeshChatX
cd MeshChatX
```
| Task | Description |
| ---------------------------- | ------------------------------------------------------------------------------- |
| `task install` | Install all dependencies (syncs version, installs node modules and python deps) |
| `task node_modules` | Install Node.js dependencies only |
| `task python` | Install Python dependencies using Poetry only |
| `task sync-version` | Sync version numbers across project files |
| `task run` | Run the application |
| `task develop` | Run the application in development mode (alias for `run`) |
| `task build` | Build the application (frontend and backend) |
| `task build-frontend` | Build only the frontend |
| `task clean` | Clean build artifacts and dependencies |
| `task wheel` | Build Python wheel package (outputs to `python-dist/`) |
| `task build-appimage` | Build Linux AppImage |
| `task build-exe` | Build Windows portable executable |
| `task dist` | Build distribution (defaults to AppImage) |
| `task electron-legacy` | Install legacy Electron version |
| `task build-appimage-legacy` | Build Linux AppImage with legacy Electron version |
| `task build-exe-legacy` | Build Windows portable executable with legacy Electron version |
| `task build-docker` | Build Docker image using buildx |
| `task run-docker` | Run Docker container using docker-compose |
2. **Build the Frontend**:
Requires Node.js and pnpm.
All tasks support environment variable overrides. For example:
```bash
corepack enable
pnpm install
pnpm run build-frontend
```
- `PYTHON=python3.12 task install`
- `DOCKER_PLATFORMS=linux/amd64,linux/arm64 task build-docker`
3. **Install & Run Backend**:
Requires Python 3.10+ and Poetry.
```bash
pip install poetry
poetry install
poetry run meshchat --headless --host 127.0.0.1
```
### Python Packaging
### Cross-Platform Building (Linux to Windows)
The backend uses Poetry with `pyproject.toml` for dependency management and packaging. Before building, run `python3 scripts/sync_version.py` (or `task sync-version`) to ensure the generated `src/version.py` reflects the version from `package.json` that the Electron artifacts use. This keeps the CLI release metadata, wheel packages, and other bundles aligned.
If you are on Linux and want to build the Windows `.exe` and installer locally, you can use **Wine**.
#### Build Artifact Locations
1. **Install Windows Python and Git inside Wine**:
```bash
# Download Python installer
wget https://www.python.org/ftp/python/3.13.1/python-3.13.1-amd64.exe
# Install Python to a specific path
wine python-3.13.1-amd64.exe /quiet InstallAllUsers=1 TargetDir=C:\Python313 PrependPath=1
Both `poetry build` and `python -m build` generate wheels inside the default `dist/` directory. The `task wheel` shortcut wraps `poetry build -f wheel` and then runs `python scripts/move_wheels.py` to relocate the generated `.whl` files into `python-dist/` (the layout expected by `scripts/test_wheel.sh` and the release automation). Use `task wheel` if you need the artifacts in `python-dist/`; `poetry build` or `python -m build` alone will leave them in `dist/`.
# Download Git installer
wget https://github.com/git-for-windows/git/releases/download/v2.52.0.windows.1/Git-2.52.0-64-bit.exe
# Install Git (quietly)
wine Git-2.52.0-64-bit.exe /VERYSILENT /NORESTART
```
#### Building with Poetry
2. **Install Build Dependencies in Wine**:
```bash
wine C:/Python313/python.exe -m pip install cx_Freeze poetry
wine C:/Python313/python.exe -m pip install -r requirements.txt
```
```bash
# Install dependencies
poetry install
3. **Run the Build Task**:
```bash
# Build only the Windows portable exe
WINE_PYTHON="wine C:/Python313/python.exe" task build-exe-wine
# Build the package (wheels land in dist/)
poetry build
# Or build everything (Linux + Windows) at once
WINE_PYTHON="wine C:/Python313/python.exe" task build-electron-all-wine
```
# Install locally for testing (consumes dist/)
pip install dist/*.whl
```
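The practical difference between the two build paths is only where the wheel lands; a short sketch of both, per the `task wheel` note earlier in this README:

```bash
# poetry/pip builds leave the wheel in dist/
poetry build -f wheel && ls dist/*.whl

# task wheel additionally runs scripts/move_wheels.py, relocating it
task wheel && ls python-dist/*.whl
```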
## Configuration
#### Building with pip (alternative)
MeshChatX can be configured via command-line arguments or environment variables.
If you prefer pip, you can build/install directly:
| Argument | Environment Variable | Default | Description |
| :-------------- | :--------------------- | :---------- | :------------------- |
| `--host` | `MESHCHAT_HOST` | `127.0.0.1` | Web server address |
| `--port` | `MESHCHAT_PORT` | `8000` | Web server port |
| `--no-https` | `MESHCHAT_NO_HTTPS` | `false` | Disable HTTPS |
| `--headless` | `MESHCHAT_HEADLESS` | `false` | Don't launch browser |
| `--auth` | `MESHCHAT_AUTH` | `false` | Enable basic auth |
| `--storage-dir` | `MESHCHAT_STORAGE_DIR` | `./storage` | Data directory |
| `--public-dir` | `MESHCHAT_PUBLIC_DIR` | - | Frontend files path |
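As a worked example of the table above, the same headless setup can be expressed with flags or environment variables (the `0.0.0.0` bind and the truthy `true` value are illustrative):

```bash
# Flags
meshchat --host 0.0.0.0 --port 8000 --headless

# Environment variables (assumed-truthy value for the boolean)
MESHCHAT_HOST=0.0.0.0 MESHCHAT_PORT=8000 MESHCHAT_HEADLESS=true meshchat
```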
```bash
# Build the wheel
pip install build
python -m build
## Development
# Install locally
pip install .
```
We use [Task](https://taskfile.dev/) for automation.
### Building in Docker
| Task | Description |
| :---------------------------- | :--------------------------------------------- |
| `task install` | Install all dependencies |
| `task run` | Run the application |
| `task dev` | Run the application in development mode |
| `task lint` | Run all linters (Python & Frontend) |
| `task lint-python` | Lint Python code only |
| `task lint-frontend` | Lint frontend code only |
| `task format` | Format all code (Python & Frontend) |
| `task format-python` | Format Python code only |
| `task format-frontend` | Format frontend code only |
| `task test` | Run all tests |
| `task test:cov` | Run tests with coverage reports |
| `task test-python` | Run Python tests only |
| `task test-frontend` | Run frontend tests only |
| `task build` | Build frontend and backend |
| `task build-frontend` | Build only the frontend |
| `task wheel` | Build Python wheel package |
| `task compile` | Compile Python code to check for syntax errors |
| `task build-docker` | Build Docker image using buildx |
| `task run-docker` | Run Docker container using docker-compose |
| `task build-appimage` | Build Linux AppImage |
| `task build-exe` | Build Windows portable executable |
| `task build-exe-wine` | Build Windows portable (Wine cross-build) |
| `task build-electron-linux` | Build Linux Electron app |
| `task build-electron-windows` | Build Windows Electron apps |
| `task build-electron-all-wine` | Build all Electron apps (Wine cross-build) |
| `task android-prepare` | Prepare Android build |
| `task android-build` | Build Android APK |
| `task build-flatpak` | Build Flatpak package |
| `task clean` | Clean build artifacts and dependencies |
```bash
task build-docker
```
## Security
`build-docker` creates `reticulum-meshchatx:local` (or `$DOCKER_IMAGE` if you override it) via `docker buildx`. Set `DOCKER_PLATFORMS` to `linux/amd64,linux/arm64` when you need multi-arch images, and adjust `DOCKER_BUILD_FLAGS`/`DOCKER_BUILD_ARGS` to control `--load`/`--push`.
### Running with Docker Compose
```bash
task run-docker
```
`run-docker` feeds the locally-built image into `docker compose -f docker-compose.yml up --remove-orphans --pull never reticulum-meshchatx`. The compose file uses the `MESHCHAT_IMAGE` env var so you can override the target image without editing the YAML (the default now points at `git.quad4.io/rns-things/meshchatx:latest`). Use `docker compose down` or `Ctrl+C` to stop the container.
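For example, to point the stock compose file at a locally built tag without editing the YAML (the tag name is illustrative):

```bash
MESHCHAT_IMAGE=meshchatx:local docker compose -f docker-compose.yml up -d reticulum-meshchatx
```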
The Electron build artifacts will still live under `dist/` for releases.
### Standalone Executables (cx_Freeze)
The `cx_setup.py` script uses cx_Freeze for creating standalone executables (AppImage for Linux, NSIS for Windows). This is separate from the Poetry/pip packaging workflow.
## Internationalization (i18n)
Multi-language support is in progress. We use `vue-i18n` for the frontend.
Translation files are located in `meshchatx/src/frontend/locales/`.
Currently supported languages:
- English (Primary)
- Russian
- German
- [ASAR Integrity](https://www.electronjs.org/docs/latest/tutorial/asar-integrity) (Stable as of Electron 39)
- Built-in automatic integrity checks on all files (frontend and backend)
- HTTPS by default (automated locally generated certs)
- Redundant CORS protection (loading.html, python backend server, electron main.js)
- Updated dependencies and daily scanning (OSV)
- Container image scanning (Trivy)
- SBOM for dependency observability and tracking
- Extensive testing and fuzzing
- Rootless Docker images
- Pinned actions and container images (supply chain security and deterministic builds)
## Credits
- [Liam Cottle](https://github.com/liamcottle) - Original Reticulum MeshChat
- [micron-parser-js](https://github.com/RFnexus/micron-parser-js) by [RFnexus](https://github.com/RFnexus)
- [RFnexus](https://github.com/RFnexus) - [micron-parser-js](https://github.com/RFnexus/micron-parser-js)
- [Mark Qvist](https://github.com/markqvist) - Reticulum, LXMF, LXST

SECURITY.md

@@ -0,0 +1,32 @@
# Security Policy
## Contact Information
If you discover a security vulnerability or have concerns about the security of Reticulum MeshChatX, please contact the lead developer using the following methods in order of preference:
1. **LXMF**: `7cc8d66b4f6a0e0e49d34af7f6077b5a`
## Security Overview
Reticulum MeshChatX is designed with a high degree of security in mind, leveraging multiple layers of protection and modern security practices. Detailed security enhancements are documented in the [CHANGELOG.md](CHANGELOG.md) and [README.md](README.md).
### Core Security Features
- **ASAR Integrity Validation**: Utilizes Electron 39 features to protect the application against tampering.
- **Backend Binary Verification**: Generates a SHA-256 manifest of the unpacked Python backend during build and verifies it on every startup (see the sketch after this list).
- **Data-at-Rest Integrity Monitoring**: Snapshots the state of identities and database files on clean shutdown and warns if they were modified while the app was closed.
- **Redundant CSP Hardening**: Multi-layered Content Security Policy protection across the entire application stack:
1. **Backend Server CSP**: Applied via security middleware to all HTTP responses.
2. **Electron Session CSP**: Shell-level fallback CSP applied via `webRequest.onHeadersReceived`.
3. **Loading Screen CSP**: Bootloader CSP defined in HTML meta tags.
- **Hardened Electron Environment**: Hardened security by disabling `runAsNode` and `nodeOptions` environment variables via Electron Fuses.
- **Rootless Docker Images**: Support for running in restricted environments with rootless container images.
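The manifest format itself is not documented here; a minimal sketch of the general idea, assuming a plain `sha256sum` manifest over the unpacked backend (the paths are illustrative, not the app's actual layout):

```bash
# Build time: record a hash for every backend file
find backend/ -type f -print0 | sort -z | xargs -0 sha256sum > backend-manifest.txt

# Startup: verify nothing changed; a non-zero exit signals tampering
sha256sum --check --quiet backend-manifest.txt || echo "integrity check failed" >&2
```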
### Automated Security Measures
The project employs continuous security monitoring and testing:
- **Security Scanning**: Automated daily scans using OSV-Scanner and Trivy for container image vulnerabilities.
- **Pinned Actions**: All CI/CD workflows use pinned actions with full URLs to forked, vetted actions hosted on our Gitea instance (`git.quad4.io`) to prevent supply chain attacks.
- **Extensive Testing & Fuzzing**: Comprehensive backend benchmarking suite with high-precision timing, memory delta tracking, and extreme stress modes to ensure stability and prevent resource exhaustion.
- **Linting & Code Quality**: Strict linting rules and static analysis are enforced on every push.

TODO.md

@@ -0,0 +1,6 @@
- [ ] RNS hot reload - partially done
- [ ] Spam filter (based on keywords) - partially done
- [ ] RNS Tunnel - tunnel regular internet services over RNS
- [ ] RNS Filesync - P2P file sync over RNS
- [ ] SBOM action
- [ ] Sideband plugins support


@@ -7,6 +7,8 @@ vars:
sh: echo "${NPM:-pnpm}"
LEGACY_ELECTRON_VERSION:
sh: echo "${LEGACY_ELECTRON_VERSION:-30.0.8}"
WINE_PYTHON:
sh: echo "${WINE_PYTHON:-wine python}"
DOCKER_COMPOSE_CMD:
sh: echo "${DOCKER_COMPOSE_CMD:-docker compose}"
DOCKER_COMPOSE_FILE:
@@ -25,6 +27,10 @@ vars:
sh: echo "${DOCKER_CONTEXT:-.}"
DOCKERFILE:
sh: echo "${DOCKERFILE:-Dockerfile}"
DOCKER_BUILD_IMAGE:
sh: echo "${DOCKER_BUILD_IMAGE:-reticulum-meshchatx-build:local}"
DOCKER_BUILD_FILE:
sh: echo "${DOCKER_BUILD_FILE:-Dockerfile.build}"
ANDROID_DIR:
sh: echo "${ANDROID_DIR:-android}"
PYTHON_SRC_DIR:
@@ -50,9 +56,118 @@ tasks:
cmds:
- task --list
setup-python-env:
desc: Setup Python environment using Poetry
cmds:
- poetry install
- poetry run pip install ruff
lint-python:
desc: Lint Python code using ruff
cmds:
- poetry run ruff check .
- poetry run ruff format --check .
lint-frontend:
desc: Lint frontend code
cmds:
- "{{.NPM}} run lint"
lint:
desc: Run all linters (frontend and Python)
deps: [lint-frontend, lint-python]
format-python:
desc: Format Python code using ruff
cmds:
- poetry run ruff format ./ --exclude tests
- poetry run ruff check --fix ./ --exclude tests
format-frontend:
desc: Format frontend code using Prettier and ESLint
cmds:
- "{{.NPM}} run format"
- "{{.NPM}} run lint:fix"
format:
desc: Format all code (Python and frontend)
deps: [format-python, format-frontend]
test-python:
desc: Run Python tests using pytest
cmds:
- poetry run pytest tests/backend --cov=meshchatx/src/backend
test-python-cov:
desc: Run Python tests with detailed coverage report
cmds:
- poetry run pytest tests/backend --cov=meshchatx/src/backend --cov-report=term-missing
test-frontend:
desc: Run frontend tests using vitest
cmds:
- "{{.NPM}} run test -- --exclude tests/frontend/i18n.test.js"
test-lang:
desc: Run language and localization tests
cmds:
- "{{.NPM}} run test tests/frontend/i18n.test.js"
- "poetry run pytest tests/backend/test_translator_handler.py"
gen-locale-template:
desc: Generate a locales.json template with empty values from en.json
cmds:
- "{{.PYTHON}} scripts/generate_locale_template.py"
test:
desc: Run all tests
deps: [test-python, test-frontend, test-lang]
test:cov:
desc: Run all tests with coverage reports
deps: [test-python-cov, test-frontend]
check:
desc: Run format, lint and test
cmds:
- task: format
- task: lint
- task: test
bench-backend:
desc: Run comprehensive backend benchmarks
cmds:
- poetry run python tests/backend/run_comprehensive_benchmarks.py
bench-extreme:
desc: Run extreme backend stress benchmarks (Breaking Space Mode)
cmds:
- poetry run python tests/backend/run_comprehensive_benchmarks.py --extreme
profile-memory:
desc: Run backend memory profiling tests
cmds:
- poetry run pytest tests/backend/test_memory_profiling.py
test-integrity:
desc: Run backend and data integrity tests
cmds:
- poetry run pytest tests/backend/test_integrity.py tests/backend/test_backend_integrity.py
bench:
desc: Run all backend benchmarks and memory profiling
cmds:
- task: bench-backend
- task: profile-memory
compile:
desc: Compile Python code to check for syntax errors
cmds:
- "{{.PYTHON}} -m compileall meshchatx/"
install:
desc: Install all dependencies (syncs version, installs node modules and python deps)
deps: [sync-version, node_modules, python]
desc: Install all dependencies (installs node modules and python deps)
deps: [node_modules, python]
node_modules:
desc: Install Node.js dependencies
@@ -97,18 +212,77 @@ tasks:
build-appimage:
desc: Build Linux AppImage
deps: [build]
deps: [build-frontend]
cmds:
- "{{.NPM}} run electron-postinstall"
- "PLATFORM=linux {{.NPM}} run build-backend"
- "{{.NPM}} run dist -- --linux AppImage"
build-exe:
desc: Build Windows portable executable
deps: [build]
deps: [build-frontend]
cmds:
- "{{.NPM}} run electron-postinstall"
- "PLATFORM=win32 {{.NPM}} run build-backend"
- "{{.NPM}} run dist -- --win portable"
build-exe-wine:
desc: Build Windows portable executable and NSIS installer using Wine
deps: [build-frontend]
cmds:
- "{{.NPM}} run electron-postinstall"
- "PLATFORM=win32 PYTHON_CMD='{{.WINE_PYTHON}}' {{.NPM}} run build-backend"
- "npx electron-builder --win portable nsis --publish=never"
build-electron-linux:
desc: Build Linux Electron app with prebuilt backend
deps: [build-frontend]
cmds:
- "{{.NPM}} run dist:linux"
build-rpm:
desc: Build Linux RPM package
deps: [build-frontend]
cmds:
- "{{.NPM}} run dist:rpm"
build-flatpak:
desc: Build Linux Flatpak package
deps: [build-frontend]
cmds:
- "{{.NPM}} run dist:flatpak"
build-electron-windows:
desc: Build Windows Electron apps (portable and installer)
deps: [build-frontend]
cmds:
- "{{.NPM}} run dist:windows"
build-zip:
desc: Build Electron ZIP archive using Electron Forge
deps: [build-frontend]
cmds:
- "PLATFORM=linux {{.NPM}} run build-backend"
- "{{.NPM}} run dist:zip"
build-electron-all:
desc: Build all Electron apps (Linux and Windows)
deps: [build-frontend]
cmds:
- "{{.NPM}} run electron-postinstall"
- "PLATFORM=linux {{.NPM}} run build-backend"
- "PLATFORM=win32 {{.NPM}} run build-backend"
- "{{.NPM}} run dist -- --linux AppImage deb --win portable nsis"
build-electron-all-wine:
desc: Build all Electron apps (Linux + Windows via Wine)
deps: [build-frontend]
cmds:
- "{{.NPM}} run electron-postinstall"
- "PLATFORM=linux {{.NPM}} run build-backend"
- "PLATFORM=win32 PYTHON_CMD='{{.WINE_PYTHON}}' {{.NPM}} run build-backend"
- "npx electron-builder --linux AppImage deb --win portable nsis --publish=never"
dist:
desc: Build distribution (defaults to AppImage)
cmds:
@@ -121,20 +295,37 @@ tasks:
build-appimage-legacy:
desc: Build Linux AppImage with legacy Electron version
deps: [build, electron-legacy]
deps: [build-frontend, electron-legacy]
cmds:
- "{{.NPM}} run electron-postinstall"
- "{{.NPM}} run dist -- --linux AppImage"
- "PLATFORM=linux {{.NPM}} run build-backend"
- "{{.NPM}} run dist -- --linux AppImage -c.extraMetadata.main=electron/main-legacy.js"
- "./scripts/rename_legacy_artifacts.sh"
build-exe-legacy:
desc: Build Windows portable executable with legacy Electron version
deps: [build, electron-legacy]
deps: [build-frontend, electron-legacy]
cmds:
- "{{.NPM}} run electron-postinstall"
- "{{.NPM}} run dist -- --win portable"
- "PLATFORM=win32 {{.NPM}} run build-backend"
- "{{.NPM}} run dist -- --win portable -c.extraMetadata.main=electron/main-legacy.js"
- "./scripts/rename_legacy_artifacts.sh"
forge-start:
desc: Run the application with Electron Forge
cmds:
- "{{.NPM}} run start"
forge-package:
desc: Package the application with Electron Forge
cmds:
- "{{.NPM}} run package"
forge-make:
desc: Generate distributables with Electron Forge
cmds:
- "{{.NPM}} run make"
clean:
desc: Clean build artifacts and dependencies
cmds:
@@ -143,21 +334,10 @@ tasks:
- rm -rf dist
- rm -rf python-dist
- rm -rf meshchatx/public
- rm -rf build-dir
- rm -rf out
- task: android-clean
sync-version:
desc: Sync version numbers across project files
cmds:
- "{{.PYTHON}} scripts/sync_version.py"
fix:
desc: Format and fix linting issues (Python and frontend)
cmds:
- "{{.PYTHON}} -m poetry run ruff format ./"
- "{{.PYTHON}} -m poetry run ruff check --fix ./"
- "{{.NPM}} run format"
- "{{.NPM}} run lint:fix"
build-docker:
desc: Build Docker image using buildx
cmds:
@@ -175,23 +355,30 @@ tasks:
-f {{.DOCKERFILE}} \
{{.DOCKER_CONTEXT}}
build-demo-docker:
desc: Build Frontend-only Demo Docker image
cmds:
- task: build-docker
vars:
DOCKERFILE: Dockerfile.demo
DOCKER_IMAGE: reticulum-meshchatx-demo:local
run-docker:
desc: Run Docker container using docker-compose
cmds:
- 'MESHCHAT_IMAGE="{{.DOCKER_IMAGE}}" {{.DOCKER_COMPOSE_CMD}} -f {{.DOCKER_COMPOSE_FILE}} up --remove-orphans --pull never reticulum-meshchatx'
run-demo-docker:
desc: Run Frontend-only Demo Docker container
run-docker-dev:
desc: Run Docker container in development mode using docker-compose.dev.yml
cmds:
- 'MESHCHAT_DEMO_IMAGE="reticulum-meshchatx-demo:local" {{.DOCKER_COMPOSE_CMD}} -f docker-compose.demo.yml up --remove-orphans'
- 'MESHCHAT_IMAGE="{{.DOCKER_IMAGE}}" {{.DOCKER_COMPOSE_CMD}} -f docker-compose.dev.yml up --build --remove-orphans reticulum-meshchatx'
docker-build-env:
desc: Build the Docker image for containerized builds
cmds:
- docker build -t {{.DOCKER_BUILD_IMAGE}} -f {{.DOCKER_BUILD_FILE}} .
docker-build-artifacts:
desc: Build wheels and Electron artifacts inside a container and export them
cmds:
- docker rm -f meshchat-build-temp || true
- docker run --name meshchat-build-temp {{.DOCKER_BUILD_IMAGE}}
- mkdir -p dist python-dist
- docker cp meshchat-build-temp:/app/dist/. ./dist/
- docker cp meshchat-build-temp:/app/python-dist/. ./python-dist/
- docker rm meshchat-build-temp
android-init:
desc: Initialize Gradle wrapper for Android project
@@ -266,41 +453,3 @@ tasks:
cmds:
- cd "{{.ANDROID_DIR}}" && ./gradlew clean
- rm -rf "{{.PYTHON_SRC_DIR}}/meshchatx"
flatpak-check-sdk:
desc: Check if required Flatpak SDK is installed
cmds:
- |
if ! flatpak info org.freedesktop.Sdk//24.08 >/dev/null 2>&1; then
echo "Flatpak SDK 24.08 is not installed."
echo "Install it with: flatpak install org.freedesktop.Sdk//24.08"
exit 1
fi
if ! flatpak info org.freedesktop.Platform//24.08 >/dev/null 2>&1; then
echo "Flatpak Platform runtime 24.08 is not installed."
echo "Install it with: flatpak install org.freedesktop.Platform//24.08"
exit 1
fi
if ! flatpak info org.freedesktop.Sdk.Extension.node20//24.08 >/dev/null 2>&1; then
echo "Flatpak Node.js 20 extension is not installed."
echo "Install it with: flatpak install org.freedesktop.Sdk.Extension.node20//24.08"
exit 1
fi
echo "Required Flatpak SDK, Platform runtime, and Node.js extension are installed."
build-flatpak:
desc: Build Flatpak package
deps: [flatpak-check-sdk]
cmds:
- flatpak-builder --force-clean build-dir flatpak.json
install-flatpak:
desc: Install Flatpak package locally
deps: [build-flatpak]
cmds:
- flatpak-builder --install --user --force-clean build-dir flatpak.json
run-flatpak:
desc: Run Flatpak application
cmds:
- flatpak run com.sudoivan.reticulummeshchatx


@@ -1,3 +1,4 @@
import os
import sys
from pathlib import Path
@@ -8,12 +9,24 @@ from meshchatx.src.version import __version__
ROOT = Path(__file__).resolve().parent
PUBLIC_DIR = ROOT / "meshchatx" / "public"
include_files = [
(str(PUBLIC_DIR), "public"),
("logo", "logo"),
]
target_name = os.environ.get("CX_FREEZE_TARGET_NAME", "ReticulumMeshChatX")
build_exe_dir = os.environ.get("CX_FREEZE_BUILD_EXE", "build/exe")
if (ROOT / "bin").exists():
include_files = []
changelog_path = ROOT / "CHANGELOG.md"
if changelog_path.exists():
include_files.append((str(changelog_path), "CHANGELOG.md"))
if PUBLIC_DIR.exists() and PUBLIC_DIR.is_dir():
include_files.append((str(PUBLIC_DIR), "public"))
logo_dir = ROOT / "logo"
if logo_dir.exists() and logo_dir.is_dir():
include_files.append(("logo", "logo"))
bin_dir = ROOT / "bin"
if bin_dir.exists() and bin_dir.is_dir():
include_files.append(("bin", "bin"))
packages = [
@@ -37,7 +50,7 @@ setup(
Executable(
script="meshchatx/meshchat.py",
base=None,
target_name="ReticulumMeshChatX",
target_name=target_name,
shortcut_name="ReticulumMeshChatX",
shortcut_dir="ProgramMenuFolder",
icon="logo/icon.ico",
@@ -51,7 +64,7 @@ setup(
"PIL",
],
"optimize": 1,
"build_exe": "build/exe",
"build_exe": build_exe_dir,
"replace_paths": [
("*", ""),
],


@@ -1,27 +0,0 @@
services:
meshchatx-demo:
build:
context: .
dockerfile: Dockerfile.demo
image: ${MESHCHAT_DEMO_IMAGE:-reticulum-meshchatx-demo:local}
container_name: reticulum-meshchatx-demo
restart: unless-stopped
# Security Hardening
security_opt:
- no-new-privileges:true
read_only: true
tmpfs:
- /tmp:mode=1777
- /var/cache/nginx:mode=1777
- /var/run:mode=1777
cap_drop:
- ALL
# Resource Limits
deploy:
resources:
limits:
memory: 128M
reservations:
memory: 64M

docker-compose.dev.yml

@@ -0,0 +1,38 @@
services:
reticulum-meshchatx:
build:
context: .
dockerfile: Dockerfile
container_name: reticulum-meshchatx
image: reticulum-meshchatx:local
restart: unless-stopped
# Permission handling is now automated in the Dockerfile via su-exec
# reticulum-meshchatx will run as user 'meshchat' (UID 1000)
security_opt:
- no-new-privileges:true
# Make the meshchat web interface accessible from the host on port 8000
ports:
- 127.0.0.1:8000:8000
volumes:
- ./meshchat-config:/config
# Uncomment if you have a USB device connected, such as an RNode
# devices:
# - /dev/ttyUSB0:/dev/ttyUSB0
#
# Host network for autointerface:
# network_mode: host
# LibreTranslate - optional
# libretranslate:
# container_name: libretranslate
# image: libretranslate/libretranslate:latest
# ports:
# - 127.0.0.1:5000:5000
# restart: unless-stopped
# healthcheck:
# test: ["CMD", "curl", "-f", "http://localhost:5000/health"]
# interval: 10s
# timeout: 4s
# retries: 4
# start_period: 5s


@@ -1,17 +1,33 @@
services:
reticulum-meshchatx:
container_name: reticulum-meshchatx
image: ${MESHCHAT_IMAGE:-ghcr.io/sudo-ivan/reticulum-meshchatx:latest}
pull_policy: always
image: ${MESHCHAT_IMAGE:-git.quad4.io/rns-things/meshchatx:latest}
restart: unless-stopped
security_opt:
- no-new-privileges:true
# Make the meshchat web interface accessible from the host on port 8000
ports:
- 127.0.0.1:8000:8000
volumes:
- meshchat-config:/config
- ./meshchat-config:/config
# Uncomment if you have a USB device connected, such as an RNode
# devices:
# - /dev/ttyUSB0:/dev/ttyUSB0
#
# Host network for autointerface:
# network_mode: host
volumes:
meshchat-config:
# LibreTranslate - optional
# libretranslate:
# container_name: libretranslate
# image: libretranslate/libretranslate:latest
# ports:
# - 127.0.0.1:5000:5000
# restart: unless-stopped
# healthcheck:
# test: ["CMD", "curl", "-f", "http://localhost:5000/health"]
# interval: 10s
# timeout: 4s
# retries: 4
# start_period: 5s


@@ -1,11 +0,0 @@
# MeshChat on Docker
A docker image is automatically built by GitHub actions, and can be downloaded from the GitHub container registry.
```
docker pull ghcr.io/liamcottle/reticulum-meshchat:latest
```
Additionally, an example [docker-compose.yml](../docker-compose.yml) is available.
The example automatically generates a new reticulum config file in the `meshchat-config` volume. The MeshChat database is also stored in this volume.

docs/meshchatx.md

@@ -0,0 +1,3 @@
# Welcome to MeshChatX
A fork of Reticulum Meshchat, with many more features, new UI/UX, better security and integrity.


@@ -20,7 +20,7 @@ pkg install build-essential
### Download and Install Wheel
Download the latest wheel from the [releases page](https://git.quad4.io/Ivan/MeshChatX/releases), then:
Download the latest wheel from the [releases page](https://git.quad4.io/RNS-Things/MeshChatX/releases), then:
```
pip install reticulum_meshchatx-*-py3-none-any.whl
@@ -62,7 +62,7 @@ corepack prepare pnpm@latest --activate
### Clone and Build
```
git clone https://git.quad4.io/Ivan/MeshChatX.git
git clone https://git.quad4.io/RNS-Things/MeshChatX.git
cd MeshChatX
pip install poetry
poetry install

File diff suppressed because one or more lines are too long

Binary file not shown (image changed: 109 KiB before → 289 KiB after)

electron/crash.html

@@ -0,0 +1,216 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta
http-equiv="Content-Security-Policy"
content="default-src 'self'; script-src 'self' 'unsafe-inline'; style-src 'self' 'unsafe-inline'; img-src 'self' data:;"
/>
<meta name="viewport" content="width=device-width, initial-scale=1, maximum-scale=1" />
<title>MeshChatX - Crash Report</title>
<script src="./assets/js/tailwindcss/tailwind-v3.4.3-forms-v0.5.7.js"></script>
</head>
<body
class="min-h-screen bg-slate-100 text-gray-900 antialiased dark:bg-zinc-950 dark:text-zinc-50 transition-colors"
>
<div class="absolute inset-0 -z-10 overflow-hidden">
<div
class="absolute -left-32 -top-40 h-80 w-80 rounded-full bg-gradient-to-br from-red-500/30 via-orange-500/20 to-rose-500/30 blur-3xl dark:from-red-600/25 dark:via-orange-600/25 dark:to-rose-600/25"
></div>
<div
class="absolute -right-24 top-20 h-64 w-64 rounded-full bg-gradient-to-br from-orange-400/30 via-red-500/20 to-rose-500/30 blur-3xl dark:from-orange-500/25 dark:via-red-500/25 dark:to-rose-500/25"
></div>
</div>
<main class="relative flex min-h-screen items-center justify-center px-4 py-6 sm:px-6">
<div class="w-full max-w-5xl">
<div
class="rounded-2xl border border-slate-200/80 bg-white/80 shadow-2xl backdrop-blur-xl ring-1 ring-white/60 dark:border-zinc-800/70 dark:bg-zinc-900/70 dark:ring-zinc-800/70 transition-colors overflow-hidden"
>
<div class="p-4 sm:p-6 space-y-4">
<div
class="flex flex-col sm:flex-row items-center sm:items-start gap-3 text-center sm:text-left"
>
<div
class="flex h-12 w-12 shrink-0 items-center justify-center rounded-xl bg-gradient-to-br from-red-500 via-orange-500 to-rose-500 shadow-lg ring-4 ring-white/60 dark:ring-zinc-800/70"
>
<svg
xmlns="http://www.w3.org/2000/svg"
class="h-8 w-8 text-white"
fill="none"
viewBox="0 0 24 24"
stroke="currentColor"
>
<path
stroke-linecap="round"
stroke-linejoin="round"
stroke-width="2"
d="M12 9v2m0 4h.01m-6.938 4h13.856c1.54 0 2.502-1.667 1.732-3L13.732 4c-.77-1.333-2.694-1.333-3.464 0L3.34 16c-.77 1.333.192 3 1.732 3z"
/>
</svg>
</div>
<div class="space-y-0.5">
<div class="text-xl font-semibold tracking-tight text-gray-900 dark:text-white">
MeshChatX Crashed
</div>
<div class="text-xs text-gray-600 dark:text-gray-300">
Critical error detected in backend service.
</div>
</div>
</div>
<div class="grid grid-cols-1 sm:grid-cols-2 gap-3 text-sm">
<div
class="rounded-xl border border-red-200/90 bg-red-50/70 p-3 dark:border-red-900/40 dark:bg-red-900/20 transition-colors"
>
<div
class="text-[10px] uppercase tracking-wide text-red-600 dark:text-red-400 font-semibold"
>
Exit Code
</div>
<div
class="mt-0.5 text-base font-mono font-bold text-red-700 dark:text-red-300"
id="exit-code"
>
--
</div>
</div>
<div
class="rounded-xl border border-slate-200/90 bg-white/70 p-3 text-center sm:text-right dark:border-zinc-800/80 dark:bg-zinc-900/70 transition-colors"
>
<div class="text-[10px] uppercase tracking-wide text-gray-500 dark:text-gray-400">
Status
</div>
<div class="mt-0.5 text-base font-semibold text-red-600 dark:text-red-400">Offline</div>
</div>
</div>
<div class="space-y-3">
<div class="flex flex-col sm:flex-row items-center justify-between gap-2 px-1">
<h3
class="text-[10px] font-semibold uppercase tracking-wider text-gray-500 dark:text-gray-400"
>
Diagnostic Logs
</h3>
<button
onclick="copyLogs()"
class="w-full sm:w-auto text-[10px] font-medium text-blue-600 hover:text-blue-500 dark:text-blue-400 dark:hover:text-blue-300 bg-blue-50 dark:bg-blue-900/30 px-3 py-1 rounded-lg transition-colors"
>
Copy all logs
</button>
</div>
<div class="space-y-1">
<div class="text-[10px] font-medium text-gray-500 dark:text-gray-400 px-1">
Standard Output (stdout)
</div>
<div class="relative group">
<pre
id="stdout"
class="h-52 overflow-auto rounded-xl border border-slate-200 bg-slate-50 p-3 font-mono text-[10px] text-slate-700 dark:border-zinc-800 dark:bg-zinc-950 dark:text-zinc-300 select-text scrollbar-thin scrollbar-thumb-slate-300 dark:scrollbar-thumb-zinc-800"
></pre>
</div>
</div>
<div class="space-y-1">
<div class="text-[10px] font-medium text-gray-500 dark:text-gray-400 px-1">
Standard Error (stderr)
</div>
<div class="relative group">
<pre
id="stderr"
class="h-64 overflow-auto rounded-xl border border-red-100 bg-red-50/50 p-3 font-mono text-[10px] text-red-700 dark:border-red-900/20 dark:bg-zinc-950 dark:text-red-400 select-text scrollbar-thin scrollbar-thumb-red-200 dark:scrollbar-thumb-zinc-800"
></pre>
</div>
</div>
</div>
<div class="flex flex-wrap items-center justify-center sm:justify-start gap-2 pt-2">
<button
onclick="window.electron.relaunch()"
class="w-full sm:w-40 rounded-xl bg-blue-600 px-4 py-2.5 text-xs font-semibold text-white shadow-lg shadow-blue-500/25 hover:bg-blue-500 focus:outline-none focus:ring-2 focus:ring-blue-500 focus:ring-offset-2 dark:focus:ring-offset-zinc-900 transition-all active:scale-[0.98]"
>
Relaunch
</button>
<button
onclick="window.electron.relaunchEmergency()"
class="w-full sm:w-48 rounded-xl bg-orange-600 px-4 py-2.5 text-xs font-semibold text-white shadow-lg shadow-orange-500/25 hover:bg-orange-500 focus:outline-none focus:ring-2 focus:ring-orange-500 focus:ring-offset-2 dark:focus:ring-offset-zinc-900 transition-all active:scale-[0.98]"
>
Engage Emergency Mode
</button>
<button
onclick="window.electron.shutdown()"
class="w-full sm:w-24 rounded-xl border border-slate-200 bg-white px-4 py-2.5 text-xs font-semibold text-gray-700 shadow-sm hover:bg-slate-50 dark:border-zinc-800 dark:bg-zinc-900 dark:text-zinc-300 dark:hover:bg-zinc-800 transition-all active:scale-[0.98]"
>
Exit
</button>
</div>
</div>
</div>
</div>
</main>
<script>
// Get data from URL parameters
const params = new URLSearchParams(window.location.search);
document.getElementById("exit-code").innerText = params.get("code") || "Unknown";
// Logs arrive base64-encoded in the URL; decode the bytes back to UTF-8 text
// (atob alone yields Latin-1, which would garble multi-byte characters).
const decodeBase64Utf8 = (value) =>
    new TextDecoder().decode(Uint8Array.from(atob(value), (c) => c.charCodeAt(0)));
try {
    const stdoutBase64 = params.get("stdout") || "";
    const stderrBase64 = params.get("stderr") || "";
    document.getElementById("stdout").innerText = stdoutBase64
        ? decodeBase64Utf8(stdoutBase64)
        : "No output recorded.";
    document.getElementById("stderr").innerText = stderrBase64
        ? decodeBase64Utf8(stderrBase64)
        : "No error output recorded.";
} catch (e) {
    document.getElementById("stdout").innerText = "Error decoding logs.";
    document.getElementById("stderr").innerText = "Error decoding logs.";
}
function copyLogs(event) {
const stdout = document.getElementById("stdout").innerText;
const stderr = document.getElementById("stderr").innerText;
const exitCode = document.getElementById("exit-code").innerText;
const fullReport = `MeshChatX Crash Report\nExit Code: ${exitCode}\n\n--- STDOUT ---\n${stdout}\n\n--- STDERR ---\n${stderr}`;
navigator.clipboard.writeText(fullReport).then(() => {
const btn = event.target;
const originalText = btn.innerText;
btn.innerText = "Copied!";
btn.classList.replace("text-blue-600", "text-emerald-600");
btn.classList.replace("dark:text-blue-400", "dark:text-emerald-400");
setTimeout(() => {
btn.innerText = originalText;
btn.classList.replace("text-emerald-600", "text-blue-600");
btn.classList.replace("dark:text-emerald-400", "dark:text-blue-400");
}, 2000);
});
}
function detectPreferredTheme() {
try {
const storedTheme =
localStorage.getItem("meshchat.theme") || localStorage.getItem("meshchatx.theme");
if (storedTheme === "dark" || storedTheme === "light") {
return storedTheme;
}
} catch (e) {}
return window.matchMedia && window.matchMedia("(prefers-color-scheme: dark)").matches
? "dark"
: "light";
}
function applyTheme(theme) {
const isDark = theme === "dark";
document.documentElement.classList.toggle("dark", isDark);
}
// Apply theme
applyTheme(detectPreferredTheme());
</script>
</body>
</html>
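For context, this page is fed by the Electron main process: when the backend exits unexpectedly, the last captured output lines are base64-encoded and handed over as query parameters, which the script above decodes. A condensed sketch of that hand-off, using the same names as the electron/main.js changes later in this diff:

```
// Main-process side (condensed from electron/main.js below): encode the last
// captured log lines as base64 and load crash.html with them as query params.
const stdoutBase64 = Buffer.from(stdoutLines.join("")).toString("base64");
const stderrBase64 = Buffer.from(stderrLines.join("")).toString("base64");
await mainWindow.loadFile(path.join(__dirname, "crash.html"), {
    query: {
        code: code.toString(),
        stdout: stdoutBase64,
        stderr: stderrBase64,
    },
});
```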


@@ -1,6 +1,10 @@
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta
http-equiv="Content-Security-Policy"
content="default-src 'self'; script-src 'self' 'unsafe-inline'; style-src 'self' 'unsafe-inline'; img-src 'self' data:; connect-src 'self' http://localhost:9337 https://localhost:9337;"
/>
<meta http-equiv="X-UA-Compatible" content="IE=edge" />
<meta name="viewport" content="width=device-width, initial-scale=1, maximum-scale=1" />
<meta name="color-scheme" content="light dark" />
@@ -19,15 +23,17 @@
></div>
</div>
<main class="relative flex min-h-screen items-center justify-center px-6 py-10">
<main class="relative flex min-h-screen items-center justify-center px-4 py-10 sm:px-6">
<div class="w-full max-w-xl">
<div
class="rounded-3xl border border-slate-200/80 bg-white/80 shadow-2xl backdrop-blur-xl ring-1 ring-white/60 dark:border-zinc-800/70 dark:bg-zinc-900/70 dark:ring-zinc-800/70 transition-colors"
>
<div class="p-8 space-y-6">
<div class="flex items-center gap-4">
<div class="p-6 sm:p-8 space-y-6">
<div
class="flex flex-col sm:flex-row items-center sm:items-start gap-4 text-center sm:text-left"
>
<div
class="flex h-16 w-16 items-center justify-center rounded-2xl bg-gradient-to-br from-blue-500 via-indigo-500 to-purple-500 shadow-lg ring-4 ring-white/60 dark:ring-zinc-800/70"
class="flex h-16 w-16 shrink-0 items-center justify-center rounded-2xl bg-gradient-to-br from-blue-500 via-indigo-500 to-purple-500 shadow-lg ring-4 ring-white/60 dark:ring-zinc-800/70"
>
<img
class="h-10 w-10 object-contain"
@@ -36,9 +42,6 @@
/>
</div>
<div class="space-y-1">
<p class="text-xs uppercase tracking-[0.2em] text-blue-600 dark:text-blue-300">
MeshChatX
</p>
<div class="text-2xl font-semibold tracking-tight text-gray-900 dark:text-white">
MeshChatX
</div>
@@ -47,11 +50,11 @@
</div>
<div
class="flex items-center justify-between rounded-2xl border border-dashed border-slate-200/90 bg-slate-50/70 px-4 py-3 text-sm text-gray-700 dark:border-zinc-800/80 dark:bg-zinc-900/70 dark:text-gray-200 transition-colors"
class="flex flex-col sm:flex-row items-center justify-between gap-3 rounded-2xl border border-dashed border-slate-200/90 bg-slate-50/70 px-4 py-3 text-sm text-gray-700 dark:border-zinc-800/80 dark:bg-zinc-900/70 dark:text-gray-200 transition-colors"
>
<div class="flex items-center gap-2">
<span class="h-2 w-2 rounded-full bg-blue-500 animate-pulse"></span>
<span>Preparing your node</span>
<span>Preparing your app</span>
</div>
<div
class="inline-flex items-center gap-2 rounded-full bg-blue-100/80 px-3 py-1 text-xs font-semibold text-blue-700 shadow-sm dark:bg-blue-900/50 dark:text-blue-200"
@@ -61,8 +64,10 @@
</div>
</div>
<div class="flex items-center gap-4">
<div class="relative inline-flex h-14 w-14 items-center justify-center">
<div
class="flex flex-col sm:flex-row items-center sm:items-start gap-4 text-center sm:text-left"
>
<div class="relative inline-flex h-14 w-14 shrink-0 items-center justify-center">
<span
class="absolute inset-0 rounded-full border-4 border-blue-500/25 dark:border-blue-500/20"
></span>
@@ -79,7 +84,7 @@
</div>
</div>
<div class="grid grid-cols-2 gap-4 text-sm">
<div class="grid grid-cols-1 sm:grid-cols-2 gap-4 text-sm">
<div
class="rounded-2xl border border-slate-200/90 bg-white/70 p-4 dark:border-zinc-800/80 dark:bg-zinc-900/70 transition-colors"
>

electron/main-legacy.js Normal file

@@ -0,0 +1,366 @@
const { app, BrowserWindow, dialog, ipcMain, shell, systemPreferences } = require("electron");
const electronPrompt = require("electron-prompt");
const { spawn } = require("child_process");
const fs = require("fs");
const path = require("node:path");
// remember main window
var mainWindow = null;
// remember child process for exe so we can kill it when app exits
var exeChildProcess = null;
// allow fetching app version via ipc
ipcMain.handle("app-version", () => {
return app.getVersion();
});
// allow fetching hardware acceleration status via ipc
ipcMain.handle("is-hardware-acceleration-enabled", () => {
// New in Electron 39, fallback for legacy
if (typeof app.isHardwareAccelerationEnabled === "function") {
return app.isHardwareAccelerationEnabled();
}
return true; // Assume true for older versions
});
// allow fetching integrity status (Stub for legacy)
ipcMain.handle("get-integrity-status", () => {
return {
backend: { ok: true, issues: ["Not supported in legacy mode"] },
data: { ok: true, issues: ["Not supported in legacy mode"] },
};
});
// ignore ssl errors
app.commandLine.appendSwitch("ignore-certificate-errors");
// add support for showing an alert window via ipc
ipcMain.handle("alert", async (event, message) => {
return await dialog.showMessageBox(mainWindow, {
message: message,
});
});
// add support for showing a confirm window via ipc
ipcMain.handle("confirm", async (event, message) => {
// show confirm dialog
const result = await dialog.showMessageBox(mainWindow, {
type: "question",
title: "Confirm",
message: message,
cancelId: 0, // esc key should press cancel button
defaultId: 1, // enter key should press ok button
buttons: [
"Cancel", // 0
"OK", // 1
],
});
// check if user clicked OK
return result.response === 1;
});
// add support for showing a prompt window via ipc
ipcMain.handle("prompt", async (event, message) => {
return await electronPrompt({
title: message,
label: "",
value: "",
type: "input",
inputAttrs: {
type: "text",
},
});
});
// allow relaunching app via ipc
ipcMain.handle("relaunch", () => {
app.relaunch();
app.exit();
});
// allow showing a file path in os file manager
ipcMain.handle("showPathInFolder", (event, path) => {
shell.showItemInFolder(path);
});
function log(message) {
// log to stdout of this process
console.log(message);
// make sure main window exists
if (!mainWindow) {
return;
}
// make sure window is not destroyed
if (mainWindow.isDestroyed()) {
return;
}
// log to web console
mainWindow.webContents.send("log", message);
}
function getDefaultStorageDir() {
// if we are running a windows portable exe, we want to use .reticulum-meshchat in the portable exe dir
// e.g if we launch "E:\Some\Path\MeshChat.exe" we want to use "E:\Some\Path\.reticulum-meshchat"
const portableExecutableDir = process.env.PORTABLE_EXECUTABLE_DIR;
if (process.platform === "win32" && portableExecutableDir != null) {
return path.join(portableExecutableDir, ".reticulum-meshchat");
}
// otherwise, we will fall back to putting the storage dir in the users home directory
// e.g: ~/.reticulum-meshchat
return path.join(app.getPath("home"), ".reticulum-meshchat");
}
function getDefaultReticulumConfigDir() {
// if we are running a windows portable exe, we want to use .reticulum in the portable exe dir
// e.g if we launch "E:\Some\Path\MeshChat.exe" we want to use "E:\Some\Path\.reticulum"
const portableExecutableDir = process.env.PORTABLE_EXECUTABLE_DIR;
if (process.platform === "win32" && portableExecutableDir != null) {
return path.join(portableExecutableDir, ".reticulum");
}
// otherwise, we will fall back to using the .reticulum folder in the users home directory
// e.g: ~/.reticulum
return path.join(app.getPath("home"), ".reticulum");
}
app.whenReady().then(async () => {
// get arguments passed to application, and remove the provided application path
const ignoredArguments = ["--no-sandbox", "--ozone-platform-hint=auto"];
const userProvidedArguments = process.argv.slice(1).filter((arg) => !ignoredArguments.includes(arg));
const shouldLaunchHeadless = userProvidedArguments.includes("--headless");
if (!shouldLaunchHeadless) {
// create browser window
mainWindow = new BrowserWindow({
width: 1500,
height: 800,
webPreferences: {
// used to inject logging over ipc
preload: path.join(__dirname, "preload.js"),
// Security: disable node integration in renderer
nodeIntegration: false,
// Security: enable context isolation (default in Electron 12+)
contextIsolation: true,
// Security: enable sandbox for additional protection
sandbox: true,
// Security: disable remote module (deprecated but explicit)
enableRemoteModule: false,
},
});
// open external links in default web browser instead of electron
mainWindow.webContents.setWindowOpenHandler(({ url }) => {
var shouldShowInNewElectronWindow = false;
// we want to open call.html in a new electron window
// but all other target="_blank" links should open in the system web browser
// we don't want /rnode-flasher/index.html to open in electron, otherwise user can't select usb devices...
if (
(url.startsWith("http://localhost") || url.startsWith("https://localhost")) &&
url.includes("/call.html")
) {
shouldShowInNewElectronWindow = true;
}
// we want to open blob urls in a new electron window
else if (url.startsWith("blob:")) {
shouldShowInNewElectronWindow = true;
}
// open in new electron window
if (shouldShowInNewElectronWindow) {
return {
action: "allow",
};
}
// fallback to opening any other url in external browser
shell.openExternal(url);
return {
action: "deny",
};
});
// navigate to loading page
await mainWindow.loadFile(path.join(__dirname, "loading.html"));
// ask mac users for microphone access for audio calls to work
if (process.platform === "darwin") {
await systemPreferences.askForMediaAccess("microphone");
}
}
// find path to python/cxfreeze reticulum meshchatx executable
// Note: setup.py creates ReticulumMeshChatX (with X), not ReticulumMeshChat
const exeName = process.platform === "win32" ? "ReticulumMeshChatX.exe" : "ReticulumMeshChatX";
// get app path (handles both development and packaged app)
const appPath = app.getAppPath();
// get resources path (where extraFiles are placed)
const resourcesPath = process.resourcesPath || path.join(appPath, "..", "..");
var exe = null;
// when packaged, extraFiles are placed at resources/app/electron/build/exe
// when packaged with asar, unpacked files are in app.asar.unpacked/ directory
// app.getAppPath() returns the path to app.asar, so unpacked is at the same level
const possiblePaths = [
// packaged app - extraFiles location (resources/app/electron/build/exe)
path.join(resourcesPath, "app", "electron", "build", "exe", exeName),
// packaged app with asar (unpacked files from asarUnpack)
path.join(appPath, "..", "app.asar.unpacked", "build", "exe", exeName),
// packaged app without asar (relative to app path)
path.join(appPath, "build", "exe", exeName),
// development mode (relative to electron directory)
path.join(__dirname, "build", "exe", exeName),
// development mode (relative to project root)
path.join(__dirname, "..", "build", "exe", exeName),
];
// find the first path that exists
for (const possibleExe of possiblePaths) {
if (fs.existsSync(possibleExe)) {
exe = possibleExe;
break;
}
}
// verify executable exists
if (!exe || !fs.existsSync(exe)) {
const errorMsg = `Could not find executable: ${exeName}\nChecked paths:\n${possiblePaths.join("\n")}\n\nApp path: ${appPath}\nResources path: ${resourcesPath}`;
log(errorMsg);
if (mainWindow) {
await dialog.showMessageBox(mainWindow, {
message: errorMsg,
});
}
app.quit();
return;
}
log(`Found executable at: ${exe}`);
try {
// arguments we always want to pass in
const requiredArguments = [
"--headless", // reticulum meshchatx usually launches default web browser, we don't want this when using electron
"--port",
"9337", // FIXME: let system pick a random unused port?
// '--test-exception-message', 'Test Exception Message', // uncomment to test the crash dialog
];
// if user didn't provide reticulum config dir, we should provide it
if (!userProvidedArguments.includes("--reticulum-config-dir")) {
requiredArguments.push("--reticulum-config-dir", getDefaultReticulumConfigDir());
}
// if user didn't provide storage dir, we should provide it
if (!userProvidedArguments.includes("--storage-dir")) {
requiredArguments.push("--storage-dir", getDefaultStorageDir());
}
// spawn executable
exeChildProcess = await spawn(exe, [
...requiredArguments, // always provide required arguments
...userProvidedArguments, // also include any user provided arguments
]);
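// note: spawn() returns a ChildProcess synchronously, so the await above is a no-op;
// the rewritten main.js later in this diff drops it.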
// log stdout
var stdoutLines = [];
exeChildProcess.stdout.setEncoding("utf8");
exeChildProcess.stdout.on("data", function (data) {
// log
log(data.toString());
// keep track of last 10 stdout lines
stdoutLines.push(data.toString());
if (stdoutLines.length > 10) {
stdoutLines.shift();
}
});
// log stderr
var stderrLines = [];
exeChildProcess.stderr.setEncoding("utf8");
exeChildProcess.stderr.on("data", function (data) {
// log
log(data.toString());
// keep track of last 10 stderr lines
stderrLines.push(data.toString());
if (stderrLines.length > 10) {
stderrLines.shift();
}
});
// log errors
exeChildProcess.on("error", function (error) {
log(error);
});
// quit electron app if exe dies
exeChildProcess.on("exit", async function (code) {
// if no exit code was provided, the exit was intentional, so do nothing
if (code == null) {
return;
}
// tell user that Visual C++ redistributable needs to be installed on Windows
if (code === 3221225781 && process.platform === "win32") {
await dialog.showMessageBox(mainWindow, {
message: "Microsoft Visual C++ redistributable must be installed to run this application.",
});
app.quit();
return;
}
// show crash log
const stdout = stdoutLines.join("");
const stderr = stderrLines.join("");
await dialog.showMessageBox(mainWindow, {
message: [
"MeshChat Crashed!",
"",
`Exit Code: ${code}`,
"",
`----- stdout -----`,
"",
stdout,
`----- stderr -----`,
"",
stderr,
].join("\n"),
});
// quit after dismissing error dialog
app.quit();
});
} catch (e) {
log(e);
}
});
function quit() {
// kill python process
if (exeChildProcess) {
exeChildProcess.kill("SIGKILL");
}
// quit electron app
app.quit();
}
// quit electron if all windows are closed
app.on("window-all-closed", () => {
quit();
});
// make sure child process is killed if app is quitting
app.on("quit", () => {
quit();
});


@@ -1,20 +1,195 @@
const { app, BrowserWindow, dialog, ipcMain, shell, systemPreferences } = require("electron");
const {
app,
BrowserWindow,
dialog,
ipcMain,
shell,
systemPreferences,
Tray,
Menu,
Notification,
powerSaveBlocker,
session,
} = require("electron");
const electronPrompt = require("electron-prompt");
const { spawn } = require("child_process");
const fs = require("fs");
const path = require("node:path");
const crypto = require("crypto");
// remember main window
var mainWindow = null;
// tray instance
var tray = null;
// power save blocker id
var activePowerSaveBlockerId = null;
// track if we are actually quitting
var isQuiting = false;
// remember child process for exe so we can kill it when app exits
var exeChildProcess = null;
// store integrity status
var integrityStatus = {
backend: { ok: true, issues: [] },
data: { ok: true, issues: [] },
};
// Check for hardware acceleration preference in storage dir
try {
const storageDir = getDefaultStorageDir();
const disableGpuFile = path.join(storageDir, "disable-gpu");
if (fs.existsSync(disableGpuFile)) {
app.disableHardwareAcceleration();
console.log("Hardware acceleration disabled via storage flag.");
}
} catch {
// ignore errors reading storage dir this early
}
// Handle hardware acceleration disabling via CLI
if (process.argv.includes("--disable-gpu") || process.argv.includes("--disable-software-rasterizer")) {
app.disableHardwareAcceleration();
}
// Protocol registration
if (process.defaultApp) {
if (process.argv.length >= 2) {
app.setAsDefaultProtocolClient("lxmf", process.execPath, [path.resolve(process.argv[1])]);
app.setAsDefaultProtocolClient("rns", process.execPath, [path.resolve(process.argv[1])]);
}
} else {
app.setAsDefaultProtocolClient("lxmf");
app.setAsDefaultProtocolClient("rns");
}
// Single instance lock
const gotTheLock = app.requestSingleInstanceLock();
if (!gotTheLock) {
app.quit();
} else {
app.on("second-instance", (event, commandLine) => {
// Someone tried to run a second instance; focus our existing window instead.
if (mainWindow) {
if (mainWindow.isMinimized()) mainWindow.restore();
mainWindow.show();
mainWindow.focus();
// Handle protocol links from second instance
const url = commandLine.pop();
if (url && (url.startsWith("lxmf://") || url.startsWith("rns://"))) {
mainWindow.webContents.send("open-protocol-link", url);
}
}
});
}
// Handle protocol links on macOS
app.on("open-url", (event, url) => {
event.preventDefault();
if (mainWindow) {
mainWindow.show();
mainWindow.webContents.send("open-protocol-link", url);
}
});
function verifyBackendIntegrity(exeDir) {
const manifestPath = path.join(exeDir, "backend-manifest.json");
if (!fs.existsSync(manifestPath)) {
log("Backend integrity manifest missing, skipping check.");
return { ok: true, issues: ["Manifest missing"] };
}
try {
const manifest = JSON.parse(fs.readFileSync(manifestPath, "utf8"));
const issues = [];
const filesToVerify = manifest.files || manifest;
const metadata = manifest._metadata || {};
// The exeDir is build/exe when running or unpacked
// we only care about files in the manifest
for (const [relPath, expectedHash] of Object.entries(filesToVerify)) {
const fullPath = path.join(exeDir, relPath);
if (!fs.existsSync(fullPath)) {
issues.push(`Missing: ${relPath}`);
continue;
}
const fileBuffer = fs.readFileSync(fullPath);
const actualHash = crypto.createHash("sha256").update(fileBuffer).digest("hex");
if (actualHash !== expectedHash) {
issues.push(`Modified: ${relPath}`);
}
}
if (issues.length > 0 && metadata.date && metadata.time) {
issues.unshift(`Backend build timestamp: ${metadata.date} ${metadata.time}`);
}
return {
ok: issues.length === 0,
issues: issues,
};
} catch (error) {
log(`Backend integrity check failed: ${error.message}`);
return { ok: false, issues: [error.message] };
}
}
// allow fetching app version via ipc
ipcMain.handle("app-version", () => {
return app.getVersion();
});
// allow fetching hardware acceleration status via ipc
ipcMain.handle("is-hardware-acceleration-enabled", () => {
return app.isHardwareAccelerationEnabled();
});
// allow fetching integrity status
ipcMain.handle("get-integrity-status", () => {
return integrityStatus;
});
// Native Notification IPC
ipcMain.handle("show-notification", (event, { title, body, silent }) => {
const notification = new Notification({
title: title,
body: body,
silent: silent,
});
notification.show();
notification.on("click", () => {
if (mainWindow) {
mainWindow.show();
mainWindow.focus();
}
});
});
// Power Management IPC
ipcMain.handle("set-power-save-blocker", (event, enabled) => {
if (enabled) {
if (activePowerSaveBlockerId === null) {
activePowerSaveBlockerId = powerSaveBlocker.start("prevent-app-suspension");
log("Power save blocker started.");
}
} else {
if (activePowerSaveBlockerId !== null) {
powerSaveBlocker.stop(activePowerSaveBlockerId);
activePowerSaveBlockerId = null;
log("Power save blocker stopped.");
}
}
return activePowerSaveBlockerId !== null;
});
// ignore ssl errors
app.commandLine.appendSwitch("ignore-certificate-errors");
@@ -63,6 +238,19 @@ ipcMain.handle("relaunch", () => {
app.exit();
});
ipcMain.handle("relaunch-emergency", () => {
app.relaunch({ args: process.argv.slice(1).concat(["--emergency"]) });
app.exit();
});
ipcMain.handle("shutdown", () => {
quit();
});
ipcMain.handle("get-memory-usage", async () => {
return process.getProcessMemoryInfo();
});
// allow showing a file path in os file manager
ipcMain.handle("showPathInFolder", (event, path) => {
shell.showItemInFolder(path);
@@ -112,7 +300,79 @@ function getDefaultReticulumConfigDir() {
return path.join(app.getPath("home"), ".reticulum");
}
function createTray() {
const iconPath = path.join(__dirname, "build", "icon.png");
const fallbackIconPath = path.join(__dirname, "assets", "images", "logo.png");
const trayIcon = fs.existsSync(iconPath) ? iconPath : fallbackIconPath;
tray = new Tray(trayIcon);
const contextMenu = Menu.buildFromTemplate([
{
label: "Show App",
click: function () {
if (mainWindow) {
mainWindow.show();
}
},
},
{
label: "Quit",
click: function () {
isQuiting = true;
quit();
},
},
]);
tray.setToolTip("Reticulum MeshChatX");
tray.setContextMenu(contextMenu);
tray.on("click", () => {
if (mainWindow) {
if (mainWindow.isVisible()) {
mainWindow.hide();
} else {
mainWindow.show();
}
}
});
}
app.whenReady().then(async () => {
// Security: Enforce CSP for all requests as a shell-level fallback
session.defaultSession.webRequest.onHeadersReceived((details, callback) => {
const responseHeaders = { ...details.responseHeaders };
// Define a robust fallback CSP that matches our backend's policy
const fallbackCsp = [
"default-src 'self'",
"script-src 'self' 'unsafe-inline' 'unsafe-eval'",
"style-src 'self' 'unsafe-inline'",
"img-src 'self' data: blob: https://*.tile.openstreetmap.org https://tile.openstreetmap.org https://*.cartocdn.com",
"font-src 'self' data:",
"connect-src 'self' http://localhost:9337 https://localhost:9337 ws://localhost:* wss://localhost:* blob: https://*.tile.openstreetmap.org https://tile.openstreetmap.org https://nominatim.openstreetmap.org https://git.quad4.io https://*.cartocdn.com",
"media-src 'self' blob:",
"worker-src 'self' blob:",
"frame-src 'self'",
"object-src 'none'",
"base-uri 'self'",
].join("; ");
// If the response doesn't already have a CSP, apply our fallback
if (!responseHeaders["Content-Security-Policy"] && !responseHeaders["content-security-policy"]) {
responseHeaders["Content-Security-Policy"] = [fallbackCsp];
}
callback({ responseHeaders });
});
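// Sketch: to confirm the fallback CSP actually lands, one can inspect a
// response from the devtools console (assumes the backend set no CSP of its own):
//   const res = await fetch(window.location.href);
//   console.log(res.headers.get("content-security-policy"));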
// Log Hardware Acceleration status (New in Electron 39)
const isHardwareAccelerationEnabled = app.isHardwareAccelerationEnabled();
log(`Hardware Acceleration Enabled: ${isHardwareAccelerationEnabled}`);
// Create system tray
createTray();
// get arguments passed to application, and remove the provided application path
const ignoredArguments = ["--no-sandbox", "--ozone-platform-hint=auto"];
const userProvidedArguments = process.argv.slice(1).filter((arg) => !ignoredArguments.includes(arg));
@@ -137,6 +397,15 @@ app.whenReady().then(async () => {
},
});
// minimize to tray behavior
mainWindow.on("close", (event) => {
if (!isQuiting) {
event.preventDefault();
mainWindow.hide();
return false;
}
});
// open external links in default web browser instead of electron
mainWindow.webContents.setWindowOpenHandler(({ url }) => {
var shouldShowInNewElectronWindow = false;
@@ -160,6 +429,16 @@ app.whenReady().then(async () => {
if (shouldShowInNewElectronWindow) {
return {
action: "allow",
overrideBrowserWindowOptions: {
autoHideMenuBar: true,
webPreferences: {
preload: path.join(__dirname, "preload.js"),
nodeIntegration: false,
contextIsolation: true,
sandbox: true,
enableRemoteModule: false,
},
},
};
}
@@ -179,7 +458,7 @@ app.whenReady().then(async () => {
}
}
// find path to python/cxfreeze reticulum meshchat executable
// find path to python/cxfreeze reticulum meshchatx executable
// Note: setup.py creates ReticulumMeshChatX (with X), not ReticulumMeshChat
const exeName = process.platform === "win32" ? "ReticulumMeshChatX.exe" : "ReticulumMeshChatX";
@@ -189,11 +468,16 @@ app.whenReady().then(async () => {
const resourcesPath = process.resourcesPath || path.join(appPath, "..", "..");
var exe = null;
// when packaged, extraFiles are placed at resources/app/electron/build/exe
// when packaged, extraResources are placed at resources/backend
// when packaged with extraFiles, they were at resources/app/electron/build/exe
// when packaged with asar, unpacked files are in app.asar.unpacked/ directory
// app.getAppPath() returns the path to app.asar, so unpacked is at the same level
const possiblePaths = [
// packaged app - extraFiles location (resources/app/electron/build/exe)
// packaged app - extraResources location (resources/backend)
path.join(resourcesPath, "backend", exeName),
// electron-forge extraResource location (resources/exe)
path.join(resourcesPath, "exe", exeName),
// legacy packaged app - extraFiles location (resources/app/electron/build/exe)
path.join(resourcesPath, "app", "electron", "build", "exe", exeName),
// packaged app with asar (unpacked files from asarUnpack)
path.join(appPath, "..", "app.asar.unpacked", "build", "exe", exeName),
@@ -228,10 +512,17 @@ app.whenReady().then(async () => {
log(`Found executable at: ${exe}`);
// Verify backend integrity before spawning
const exeDir = path.dirname(exe);
integrityStatus.backend = verifyBackendIntegrity(exeDir);
if (!integrityStatus.backend.ok) {
log(`INTEGRITY WARNING: Backend tampering detected! Issues: ${integrityStatus.backend.issues.join(", ")}`);
}
try {
// arguments we always want to pass in
const requiredArguments = [
"--headless", // reticulum meshchat usually launches default web browser, we don't want this when using electron
"--headless", // reticulum meshchatx usually launches default web browser, we don't want this when using electron
"--port",
"9337", // FIXME: let system pick a random unused port?
// '--test-exception-message', 'Test Exception Message', // uncomment to test the crash dialog
@@ -248,11 +539,15 @@ app.whenReady().then(async () => {
}
// spawn executable
exeChildProcess = await spawn(exe, [
exeChildProcess = spawn(exe, [
...requiredArguments, // always provide required arguments
...userProvidedArguments, // also include any user provided arguments
]);
if (!exeChildProcess || !exeChildProcess.pid) {
throw new Error("Failed to start backend process (no PID).");
}
// log stdout
var stdoutLines = [];
exeChildProcess.stdout.setEncoding("utf8");
@@ -260,9 +555,9 @@ app.whenReady().then(async () => {
// log
log(data.toString());
// keep track of last 10 stdout lines
// keep track of last 100 stdout lines
stdoutLines.push(data.toString());
if (stdoutLines.length > 10) {
if (stdoutLines.length > 100) {
stdoutLines.shift();
}
});
@@ -274,9 +569,9 @@ app.whenReady().then(async () => {
// log
log(data.toString());
// keep track of last 10 stderr lines
// keep track of last 100 stderr lines
stderrLines.push(data.toString());
if (stderrLines.length > 10) {
if (stderrLines.length > 100) {
stderrLines.shift();
}
});
@@ -293,35 +588,34 @@ app.whenReady().then(async () => {
return;
}
// tell user that Visual C++ redistributable needs to be installed on Windows
if (code === 3221225781 && process.platform === "win32") {
await dialog.showMessageBox(mainWindow, {
message: "Microsoft Visual C++ redistributable must be installed to run this application.",
});
app.quit();
return;
}
// show crash log
const stdout = stdoutLines.join("");
const stderr = stderrLines.join("");
await dialog.showMessageBox(mainWindow, {
message: [
"MeshChat Crashed!",
"",
`Exit Code: ${code}`,
"",
`----- stdout -----`,
"",
stdout,
`----- stderr -----`,
"",
stderr,
].join("\n"),
});
// quit after dismissing error dialog
app.quit();
// Base64 encode for safe URL passing
const stdoutBase64 = Buffer.from(stdout).toString("base64");
const stderrBase64 = Buffer.from(stderr).toString("base64");
// Load crash page if main window exists
if (mainWindow && !mainWindow.isDestroyed()) {
mainWindow.show(); // Ensure visible
mainWindow.focus();
await mainWindow.loadFile(path.join(__dirname, "crash.html"), {
query: {
code: code.toString(),
stdout: stdoutBase64,
stderr: stderrBase64,
},
});
} else {
// Fallback for cases where window is gone
await dialog.showMessageBox({
type: "error",
title: "MeshChatX Crashed",
message: `Backend exited with code: ${code}\n\nSTDOUT: ${stdout.slice(-500)}\n\nSTDERR: ${stderr.slice(-500)}`,
});
app.quit();
}
});
} catch (e) {
log(e);

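The backend-manifest.json consumed by verifyBackendIntegrity() above is produced at build time, but the generator is not part of this diff. Below is a hypothetical Node sketch that writes the shape the verifier expects ({ files: { relPath: sha256hex }, _metadata: { date, time } }); the script name and CLI argument are illustrative only:

```
// Hypothetical manifest generator; mirrors what verifyBackendIntegrity() reads.
const fs = require("fs");
const path = require("path");
const crypto = require("crypto");

function buildManifest(exeDir) {
    const files = {};
    const walk = (dir) => {
        for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
            const fullPath = path.join(dir, entry.name);
            if (entry.isDirectory()) {
                walk(fullPath);
            } else if (entry.name !== "backend-manifest.json") {
                // sha256 over file contents, keyed by path relative to the exe dir
                files[path.relative(exeDir, fullPath)] = crypto
                    .createHash("sha256")
                    .update(fs.readFileSync(fullPath))
                    .digest("hex");
            }
        }
    };
    walk(exeDir);
    const now = new Date();
    return {
        files,
        _metadata: {
            date: now.toISOString().slice(0, 10),
            time: now.toTimeString().slice(0, 8),
        },
    };
}

// usage: node generate-manifest.js build/exe/linux
const exeDir = path.resolve(process.argv[2]);
fs.writeFileSync(
    path.join(exeDir, "backend-manifest.json"),
    JSON.stringify(buildManifest(exeDir), null, 2),
);
```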

@@ -9,6 +9,21 @@ contextBridge.exposeInMainWorld("electron", {
return await ipcRenderer.invoke("app-version");
},
// allow fetching electron version
electronVersion: function () {
return process.versions.electron;
},
// allow fetching chrome version
chromeVersion: function () {
return process.versions.chrome;
},
// allow fetching node version
nodeVersion: function () {
return process.versions.node;
},
// show an alert dialog in electron browser window, this fixes a bug where alert breaks input fields on windows
alert: async function (message) {
return await ipcRenderer.invoke("alert", message);
@@ -29,8 +44,43 @@ contextBridge.exposeInMainWorld("electron", {
return await ipcRenderer.invoke("relaunch");
},
// allow relaunching app in emergency mode
relaunchEmergency: async function () {
return await ipcRenderer.invoke("relaunch-emergency");
},
// allow shutting down app in electron browser window
shutdown: async function () {
return await ipcRenderer.invoke("shutdown");
},
// allow getting memory usage in electron browser window
getMemoryUsage: async function () {
return await ipcRenderer.invoke("get-memory-usage");
},
// allow showing a file path in os file manager
showPathInFolder: async function (path) {
return await ipcRenderer.invoke("showPathInFolder", path);
},
// allow checking hardware acceleration status
isHardwareAccelerationEnabled: async function () {
return await ipcRenderer.invoke("is-hardware-acceleration-enabled");
},
// allow checking integrity status
getIntegrityStatus: async function () {
return await ipcRenderer.invoke("get-integrity-status");
},
// allow showing a native notification
showNotification: function (title, body, silent = false) {
ipcRenderer.invoke("show-notification", { title, body, silent });
},
// allow controlling power save blocker
setPowerSaveBlocker: async function (enabled) {
return await ipcRenderer.invoke("set-power-save-blocker", enabled);
},
// listen for protocol links
onProtocolLink: function (callback) {
ipcRenderer.on("open-protocol-link", (event, url) => callback(url));
},
});
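A renderer-side usage sketch of the bridge above; window.electron is the surface exposed by contextBridge, and these calls assume the IPC handlers added in main.js in this same diff:

```
// Renderer-side sketch (run from an async context in the web app).
async function demoElectronBridge() {
    // fire a native notification via the main process
    window.electron.showNotification("MeshChatX", "New announce received", false);

    // keep the app from being suspended while transfers are running
    const blocking = await window.electron.setPowerSaveBlocker(true);
    console.log("power save blocker active:", blocking);

    // react to lxmf:// and rns:// links forwarded by the main process
    window.electron.onProtocolLink((url) => {
        console.log("opened via protocol link:", url);
    });

    console.log("running on Electron", window.electron.electronVersion());
}
```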


@@ -9,6 +9,12 @@ export default [
"**/node_modules/**",
"**/dist/**",
"**/build/**",
"**/out/**",
"**/android/**",
"**/MagicMock/**",
"**/reticulum_meshchatx.egg-info/**",
"**/meshchat-config/**",
"**/screenshots/**",
"**/electron/assets/**",
"**/meshchatx/public/**",
"**/meshchatx/src/frontend/public/**",
@@ -25,6 +31,8 @@ export default [
"**/*.asar.unpacked/**",
"**/*.wasm",
"**/*.proto",
"**/tests/**",
"**/.pnpm-store/**",
],
},
{

flake.lock generated Normal file

@@ -0,0 +1,61 @@
{
"nodes": {
"flake-utils": {
"inputs": {
"systems": "systems"
},
"locked": {
"lastModified": 1731533236,
"narHash": "sha256-l0KFg5HjrsfsO/JpG+r7fRrqm12kzFHyUHqHCVpMMbI=",
"owner": "numtide",
"repo": "flake-utils",
"rev": "11707dc2f618dd54ca8739b309ec4fc024de578b",
"type": "github"
},
"original": {
"owner": "numtide",
"repo": "flake-utils",
"type": "github"
}
},
"nixpkgs": {
"locked": {
"lastModified": 1767116409,
"narHash": "sha256-5vKw92l1GyTnjoLzEagJy5V5mDFck72LiQWZSOnSicw=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "cad22e7d996aea55ecab064e84834289143e44a0",
"type": "github"
},
"original": {
"owner": "NixOS",
"ref": "nixos-unstable",
"repo": "nixpkgs",
"type": "github"
}
},
"root": {
"inputs": {
"flake-utils": "flake-utils",
"nixpkgs": "nixpkgs"
}
},
"systems": {
"locked": {
"lastModified": 1681028828,
"narHash": "sha256-Vy1rq5AaRuLzOxct8nz4T6wlgyUR7zLU309k9mBC768=",
"owner": "nix-systems",
"repo": "default",
"rev": "da67096a3b9bf56a91d16901293e51ba5b49a27e",
"type": "github"
},
"original": {
"owner": "nix-systems",
"repo": "default",
"type": "github"
}
}
},
"root": "root",
"version": 7
}

flake.nix Normal file

@@ -0,0 +1,106 @@
{
description = "Reticulum-MeshChatX development environment";
inputs = {
nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
flake-utils.url = "github:numtide/flake-utils";
};
outputs = { self, nixpkgs, flake-utils }:
flake-utils.lib.eachDefaultSystem (system:
let
pkgs = import nixpkgs {
inherit system;
config.allowUnfree = true;
};
python = pkgs.python312;
node = pkgs.nodejs_22;
in
{
devShells.default = pkgs.mkShell {
buildInputs = with pkgs; [
# Core
git
curl
go-task
pkg-config
libffi
openssl
# Audio (for LXST/Telephony)
libopus
portaudio
# Backend
python
poetry
ruff
# Frontend
node
pnpm
# Electron & Linux Packaging
electron
fakeroot
rpm
dpkg
wine
mono
# Android Development
gradle
openjdk17
# Containerization
docker
docker-compose
];
shellHook = ''
echo "Reticulum-MeshChatX development environment"
echo "Python version: $(${python}/bin/python --version)"
echo "Node version: $(${node}/bin/node --version)"
echo "Task version: $(task --version 2>/dev/null || echo 'not available')"
echo "Poetry version: $(poetry --version 2>/dev/null || echo 'not available')"
echo "PNPM version: $(pnpm --version 2>/dev/null || echo 'not available')"
# Set up development environment variables
export LD_LIBRARY_PATH="${pkgs.libopus}/lib:${pkgs.portaudio}/lib:$LD_LIBRARY_PATH"
'';
};
# Simple package definition for the backend
packages.default = pkgs.python312Packages.buildPythonPackage {
pname = "reticulum-meshchatx";
version = "4.0.0";
src = ./.;
format = "pyproject";
nativeBuildInputs = with pkgs; [
python312Packages.setuptools
python312Packages.wheel
];
propagatedBuildInputs = with pkgs.python312Packages; [
aiohttp
psutil
websockets
bcrypt
aiohttp-session
cryptography
requests
ply
# Note: rns, lxmf, lxst are handled via poetry or manual vendoring
];
buildInputs = [
pkgs.libopus
pkgs.portaudio
];
doCheck = false;
};
});
}


@@ -1,103 +0,0 @@
#!/bin/bash
set -e
export HOME=/tmp/build
export XDG_CONFIG_HOME=/tmp/build/.config
export XDG_DATA_HOME=/tmp/build/.local/share
mkdir -p /tmp/build/.config /tmp/build/.local/share
NODE_PATHS=(
"/usr/lib/sdk/node20/bin"
"/usr/lib/sdk/node20/root/usr/bin"
"/usr/lib/sdk/node/bin"
"/usr/lib/sdk/node/root/usr/bin"
)
NODE_BIN=""
NPM_BIN=""
for path in "${NODE_PATHS[@]}"; do
if [ -f "$path/node" ] && [ -f "$path/npm" ]; then
NODE_BIN="$path/node"
NPM_BIN="$path/npm"
export PATH="$path:$PATH"
break
fi
done
if [ -z "$NODE_BIN" ] || [ -z "$NPM_BIN" ]; then
if command -v node >/dev/null 2>&1 && command -v npm >/dev/null 2>&1; then
NODE_BIN=$(command -v node)
NPM_BIN=$(command -v npm)
else
echo "Error: Node.js binaries not found. Checking common locations..."
find /usr/lib/sdk -name node -type f 2>/dev/null | head -1
find /usr/lib/sdk -name npm -type f 2>/dev/null | head -1
exit 1
fi
fi
echo "Using Node.js: $NODE_BIN"
echo "Using npm: $NPM_BIN"
PNPM_VERSION="10.0.0"
NPM_PREFIX="$HOME/.local"
mkdir -p "$NPM_PREFIX"
export npm_config_prefix="$NPM_PREFIX"
$NPM_BIN config set prefix "$NPM_PREFIX"
echo "Installing pnpm via npm to $NPM_PREFIX..."
$NPM_BIN install -g pnpm@${PNPM_VERSION} || exit 1
export PATH="$NPM_PREFIX/bin:$PATH"
python3 scripts/sync_version.py
pnpm install --frozen-lockfile
pnpm run build
mkdir -p /tmp/electron-install
cd /tmp/electron-install
pnpm init
pnpm add electron@39.2.7
cd -
pip3 install poetry
poetry install --no-dev
poetry run python cx_setup.py build
mkdir -p /app/bin /app/lib/reticulum-meshchatx /app/share/applications /app/share/icons/hicolor/512x512/apps
cp -r electron /app/lib/reticulum-meshchatx/
cp -r build/exe /app/lib/reticulum-meshchatx/
mkdir -p /app/lib/reticulum-meshchatx/electron-bin
cp -r /tmp/electron-install/node_modules/electron/* /app/lib/reticulum-meshchatx/electron-bin/
cp logo/logo.png /app/share/icons/hicolor/512x512/apps/com.sudoivan.reticulummeshchat.png
cat > /app/share/applications/com.sudoivan.reticulummeshchat.desktop <<'EOF'
[Desktop Entry]
Type=Application
Name=Reticulum MeshChatX
Comment=A simple mesh network communications app powered by the Reticulum Network Stack
Exec=reticulum-meshchatx
Icon=com.sudoivan.reticulummeshchat
Categories=Network;InstantMessaging;
StartupNotify=true
EOF
cat > /app/bin/reticulum-meshchatx <<'EOF'
#!/bin/sh
export ELECTRON_IS_DEV=0
export APP_PATH=/app/lib/reticulum-meshchatx/electron
export EXE_PATH=/app/lib/reticulum-meshchatx/build/exe/ReticulumMeshChatX
ELECTRON_BIN=/app/lib/reticulum-meshchatx/electron-bin/dist/electron
if [ ! -f "$ELECTRON_BIN" ]; then
ELECTRON_BIN=$(find /app/lib/reticulum-meshchatx/electron-bin -name electron -type f 2>/dev/null | head -1)
fi
cd /app/lib/reticulum-meshchatx/electron
exec "$ELECTRON_BIN" . "$@"
EOF
chmod +x /app/bin/reticulum-meshchatx


@@ -1,37 +0,0 @@
{
"app-id": "com.sudoivan.reticulummeshchatx",
"runtime": "org.freedesktop.Platform",
"runtime-version": "24.08",
"sdk": "org.freedesktop.Sdk",
"sdk-extensions": ["org.freedesktop.Sdk.Extension.node20"],
"build-options": {
"env": {
"PYTHON": "/usr/bin/python3"
}
},
"command": "reticulum-meshchatx",
"finish-args": [
"--share=network",
"--socket=wayland",
"--socket=x11",
"--socket=pulseaudio",
"--device=all",
"--filesystem=home",
"--filesystem=host",
"--talk-name=org.freedesktop.NetworkManager",
"--talk-name=org.freedesktop.secrets"
],
"modules": [
{
"name": "reticulum-meshchatx",
"buildsystem": "simple",
"build-commands": ["bash flatpak-build.sh"],
"sources": [
{
"type": "dir",
"path": "."
}
]
}
]
}

forge.config.js Normal file

@@ -0,0 +1,74 @@
const { FusesPlugin } = require("@electron-forge/plugin-fuses");
const { FuseV1Options, FuseVersion } = require("@electron/fuses");
const platform = process.env.PLATFORM || process.platform;
const extraResourceDir = platform === "win32" || platform === "win" ? "build/exe/win32" : "build/exe/linux";
module.exports = {
packagerConfig: {
asar: true,
extraResource: [extraResourceDir],
executableName: "reticulum-meshchatx",
name: "Reticulum MeshChatX",
appBundleId: "com.sudoivan.reticulummeshchatx",
icon: "electron/build/icon",
// osxSign: {}, // Uncomment and configure for macOS signing
// osxNotarize: { ... }, // Uncomment and configure for macOS notarization
},
rebuildConfig: {},
makers: [
{
name: "@electron-forge/maker-squirrel",
config: {
name: "reticulum_meshchatx",
},
},
{
name: "@electron-forge/maker-zip",
},
{
name: "@electron-forge/maker-deb",
config: {
options: {
maintainer: "Sudo-Ivan",
homepage: "https://git.quad4.io/RNS-Things/MeshChatX",
categories: ["Network"],
},
},
},
{
name: "@electron-forge/maker-rpm",
config: {},
},
{
name: "@electron-forge/maker-flatpak",
config: {
options: {
categories: ["Network"],
runtime: "org.freedesktop.Platform",
runtimeVersion: "24.08",
sdk: "org.freedesktop.Sdk",
base: "org.electronjs.Electron2.BaseApp",
baseVersion: "24.08",
},
},
},
],
plugins: [
{
name: "@electron-forge/plugin-auto-unpack-natives",
config: {},
},
// Fuses are used to enable/disable various Electron functionality
// at package time, before code signing the application
new FusesPlugin({
version: FuseVersion.V1,
[FuseV1Options.RunAsNode]: false,
[FuseV1Options.EnableCookieEncryption]: true,
[FuseV1Options.EnableNodeOptionsEnvironmentVariable]: false,
[FuseV1Options.EnableNodeCliInspectArguments]: false,
[FuseV1Options.EnableEmbeddedAsarIntegrityValidation]: true,
[FuseV1Options.OnlyLoadAppFromAsar]: true,
}),
],
};
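For reference, the FusesPlugin above applies these settings to the packaged binary before signing; the same flip can be done directly with @electron/fuses. A sketch only; the output path is illustrative and varies per platform and maker:

```
// Direct equivalent of the FusesPlugin config, using @electron/fuses.
const { flipFuses, FuseVersion, FuseV1Options } = require("@electron/fuses");

flipFuses(
    "out/reticulum-meshchatx-linux-x64/reticulum-meshchatx", // illustrative path
    {
        version: FuseVersion.V1,
        [FuseV1Options.RunAsNode]: false,
        [FuseV1Options.EnableNodeOptionsEnvironmentVariable]: false,
        [FuseV1Options.OnlyLoadAppFromAsar]: true,
    },
).then(() => console.log("fuses applied"));
```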

(File diff suppressed: file too large.)


@@ -49,27 +49,43 @@ class AnnounceManager:
destination_hash=None,
query=None,
blocked_identity_hashes=None,
limit=None,
offset=0,
):
sql = "SELECT * FROM announces WHERE 1=1"
sql = """
SELECT a.*, c.custom_image as contact_image
FROM announces a
LEFT JOIN contacts c ON (
a.identity_hash = c.remote_identity_hash OR
a.destination_hash = c.lxmf_address OR
a.destination_hash = c.lxst_address
)
WHERE 1=1
"""
params = []
if aspect:
sql += " AND aspect = ?"
sql += " AND a.aspect = ?"
params.append(aspect)
if identity_hash:
sql += " AND identity_hash = ?"
sql += " AND a.identity_hash = ?"
params.append(identity_hash)
if destination_hash:
sql += " AND destination_hash = ?"
sql += " AND a.destination_hash = ?"
params.append(destination_hash)
if query:
like_term = f"%{query}%"
sql += " AND (destination_hash LIKE ? OR identity_hash LIKE ?)"
sql += " AND (a.destination_hash LIKE ? OR a.identity_hash LIKE ?)"
params.extend([like_term, like_term])
if blocked_identity_hashes:
placeholders = ", ".join(["?"] * len(blocked_identity_hashes))
sql += f" AND identity_hash NOT IN ({placeholders})"
sql += f" AND a.identity_hash NOT IN ({placeholders})"
params.extend(blocked_identity_hashes)
sql += " ORDER BY updated_at DESC"
sql += " ORDER BY a.updated_at DESC"
if limit is not None:
sql += " LIMIT ? OFFSET ?"
params.extend([limit, offset])
return self.db.provider.fetchall(sql, params)


@@ -1,4 +1,5 @@
import asyncio
import sys
from collections.abc import Coroutine
@@ -6,6 +7,44 @@ class AsyncUtils:
# remember main loop
main_loop: asyncio.AbstractEventLoop | None = None
@staticmethod
def apply_asyncio_313_patch():
"""Apply a patch for asyncio on Python 3.13 to avoid a bug in sendfile with SSL.
See: https://github.com/python/cpython/issues/124448
And: https://github.com/aio-libs/aiohttp/issues/8863
"""
if sys.version_info >= (3, 13):
import asyncio.base_events
# We need to patch the loop's sendfile to raise NotImplementedError for SSL transports.
# This will force aiohttp to use its own fallback which works correctly.
original_sendfile = asyncio.base_events.BaseEventLoop.sendfile
async def patched_sendfile(
self,
transport,
file,
offset=0,
count=None,
*,
fallback=True,
):
if transport.get_extra_info("sslcontext"):
raise NotImplementedError(
"sendfile is broken on SSL transports in Python 3.13",
)
return await original_sendfile(
self,
transport,
file,
offset,
count,
fallback=fallback,
)
asyncio.base_events.BaseEventLoop.sendfile = patched_sendfile
@staticmethod
def set_main_loop(loop: asyncio.AbstractEventLoop):
AsyncUtils.main_loop = loop


@@ -0,0 +1,124 @@
import asyncio
import time
import RNS
from meshchatx.src.backend.meshchat_utils import parse_lxmf_propagation_node_app_data
class AutoPropagationManager:
def __init__(self, app, context):
self.app = app
self.context = context
self.config = context.config
self.database = context.database
self.running = False
self._last_check = 0
self._check_interval = 300 # 5 minutes
def stop(self):
self.running = False
async def _run(self):
# Wait a bit after startup to allow propagation node announces to be discovered
await asyncio.sleep(10)
self.running = True
while self.running and self.context.running:
try:
if self.config.lxmf_preferred_propagation_node_auto_select.get():
await self.check_and_update_propagation_node()
except asyncio.CancelledError:
break
except Exception as e:
print(
f"Error in AutoPropagationManager for {self.context.identity_hash}: {e}",
)
await asyncio.sleep(self._check_interval)
async def check_and_update_propagation_node(self):
# Get all propagation node announces
announces = self.database.announces.get_announces(aspect="lxmf.propagation")
nodes_with_hops = []
for announce in announces:
dest_hash_hex = announce["destination_hash"]
dest_hash = bytes.fromhex(dest_hash_hex)
# Check if propagation is enabled for this node
node_data = parse_lxmf_propagation_node_app_data(announce["app_data"])
if not node_data or not node_data.get("enabled", False):
continue
if RNS.Transport.has_path(dest_hash):
hops = RNS.Transport.hops_to(dest_hash)
nodes_with_hops.append((hops, dest_hash_hex))
# Sort by hops (lowest first)
nodes_with_hops.sort()
current_node = (
self.config.lxmf_preferred_propagation_node_destination_hash.get()
)
if not nodes_with_hops:
return
# Try nodes in order of hops until we find a reachable one
for hops, node_hex in nodes_with_hops:
# If current node is already the best and we have it, check if we should keep it
if node_hex == current_node:
# We could probe it to be sure, but for now let's assume it's fine if it's the best
return
# Before switching to a new "best" node, try to probe it to ensure it's actually reachable
try:
dest_hash = bytes.fromhex(node_hex)
# We use a short timeout for the probe
if await self.probe_node(dest_hash):
print(
f"Auto-propagation: Switching to better node {node_hex} ({hops} hops) for {self.context.identity_hash}",
)
self.app.set_active_propagation_node(node_hex, context=self.context)
self.config.lxmf_preferred_propagation_node_destination_hash.set(
node_hex,
)
return
print(
f"Auto-propagation: Node {node_hex} announced but probe failed, trying next...",
)
except Exception as e:
print(f"Auto-propagation: Error probing node {node_hex}: {e}")
async def probe_node(self, destination_hash):
"""Probes a destination to see if it's reachable."""
try:
# We use the app's probe handler if available
if (
hasattr(self.context, "rnprobe_handler")
and self.context.rnprobe_handler
):
# Re-using the logic from RNProbeHandler but simplified
if not RNS.Transport.has_path(destination_hash):
RNS.Transport.request_path(destination_hash)
# Wait a bit for path
timeout = 5
start = time.time()
while (
not RNS.Transport.has_path(destination_hash)
and time.time() - start < timeout
):
await asyncio.sleep(0.5)
if not RNS.Transport.has_path(destination_hash):
return False
# If we have a path, it's a good sign.
# For propagation nodes, having a path is often enough to try using it.
return True
return RNS.Transport.has_path(destination_hash)
except Exception:
return False


@@ -0,0 +1,341 @@
import json
import logging
import os
import shutil
import subprocess
import sys
import time
import uuid
import RNS
logger = logging.getLogger("meshchatx.bots")
class BotHandler:
def __init__(self, identity_path, config_manager=None):
self.identity_path = os.path.abspath(identity_path)
self.config_manager = config_manager
self.bots_dir = os.path.join(self.identity_path, "bots")
os.makedirs(self.bots_dir, exist_ok=True)
self.running_bots = {}
self.state_file = os.path.join(self.bots_dir, "bots_state.json")
self.bots_state: list[dict] = []
self._load_state()
self.runner_path = os.path.join(
os.path.dirname(__file__),
"bot_process.py",
)
def _load_state(self):
try:
with open(self.state_file, encoding="utf-8") as f:
self.bots_state = json.load(f)
# Ensure all storage paths are absolute
for entry in self.bots_state:
if "storage_dir" in entry:
entry["storage_dir"] = os.path.abspath(entry["storage_dir"])
except FileNotFoundError:
self.bots_state = []
except Exception:
self.bots_state = []
def _save_state(self):
try:
with open(self.state_file, "w", encoding="utf-8") as f:
json.dump(self.bots_state, f, indent=2)
except Exception:
pass
def get_available_templates(self):
return [
{
"id": "echo",
"name": "Echo Bot",
"description": "Repeats any message it receives.",
},
{
"id": "note",
"name": "Note Bot",
"description": "Store and retrieve notes using JSON storage.",
},
{
"id": "reminder",
"name": "Reminder Bot",
"description": "Set and receive reminders using SQLite storage.",
},
]
def restore_enabled_bots(self):
for entry in list(self.bots_state):
if entry.get("enabled"):
try:
self.start_bot(
template_id=entry["template_id"],
name=entry["name"],
bot_id=entry["id"],
storage_dir=entry["storage_dir"],
)
except Exception as exc:
logger.warning("Failed to restore bot %s: %s", entry.get("id"), exc)
def get_status(self):
bots: list[dict] = []
for entry in self.bots_state:
bot_id = entry.get("id")
template = entry.get("template_id") or entry.get("template")
name = entry.get("name") or "Unknown"
pid = entry.get("pid")
running = False
if bot_id in self.running_bots:
running = True
elif pid:
running = self._is_pid_alive(pid)
address_pretty = None
address_full = None
# Try running instance first
instance = self.running_bots.get(bot_id, {}).get("instance")
if (
instance
and getattr(instance, "bot", None)
and getattr(instance.bot, "local", None)
):
try:
address_pretty = RNS.prettyhexrep(instance.bot.local.hash)
address_full = RNS.hexrep(instance.bot.local.hash, delimit=False)
except Exception:
pass
# Fallback to identity file on disk
if address_full is None:
identity = self._load_identity_for_bot(bot_id)
if identity:
try:
destination = RNS.Destination(identity, "lxmf", "delivery")
address_full = destination.hash.hex()
address_pretty = RNS.prettyhexrep(destination.hash)
except Exception:
pass
bots.append(
{
"id": bot_id,
"template": template,
"template_id": template,
"name": name,
"address": address_pretty or "Unknown",
"full_address": address_full,
"running": running,
"pid": pid,
"storage_dir": entry.get("storage_dir"),
},
)
return {
"has_lxmfy": True,
"detection_error": None,
"running_bots": [b for b in bots if b["running"]],
"bots": bots,
}
def start_bot(self, template_id, name=None, bot_id=None, storage_dir=None):
# Reuse existing entry or create new
entry = None
if bot_id:
for e in self.bots_state:
if e.get("id") == bot_id:
entry = e
break
if entry is None:
bot_id = bot_id or uuid.uuid4().hex
bot_storage_dir = storage_dir or os.path.join(self.bots_dir, bot_id)
bot_storage_dir = os.path.abspath(bot_storage_dir)
entry = {
"id": bot_id,
"template_id": template_id,
"name": name or f"{template_id.title()} Bot",
"storage_dir": bot_storage_dir,
"enabled": True,
"pid": None,
}
self.bots_state.append(entry)
else:
bot_storage_dir = entry["storage_dir"]
entry["template_id"] = template_id
entry["name"] = name or entry.get("name") or f"{template_id.title()} Bot"
entry["enabled"] = True
os.makedirs(bot_storage_dir, exist_ok=True)
cmd = [
sys.executable,
self.runner_path,
"--template",
template_id,
"--name",
entry["name"],
"--storage",
bot_storage_dir,
]
proc = subprocess.Popen(cmd, cwd=bot_storage_dir) # noqa: S603
entry["pid"] = proc.pid
self._save_state()
self.running_bots[bot_id] = {
"instance": None,
"thread": None,
"stop_event": None,
"template": template_id,
"pid": proc.pid,
}
logger.info(f"Started bot {bot_id} (template: {template_id}) pid={proc.pid}")
return bot_id
def stop_bot(self, bot_id):
entry = None
for e in self.bots_state:
if e.get("id") == bot_id:
entry = e
break
if entry is None:
return False
pid = entry.get("pid")
if pid:
try:
if sys.platform.startswith("win"):
subprocess.run(
["taskkill", "/PID", str(pid), "/T", "/F"],
check=False,
timeout=5,
)
else:
os.kill(pid, 15)
# brief wait
time.sleep(0.5)
# optional force kill if still alive
try:
os.kill(pid, 0)
os.kill(pid, 9)
except OSError:
pass
except Exception as exc:
logger.warning(
"Failed to terminate bot %s pid %s: %s",
bot_id,
pid,
exc,
)
entry["pid"] = None
entry["enabled"] = False
self._save_state()
if bot_id in self.running_bots:
del self.running_bots[bot_id]
logger.info("Stopped bot %s", bot_id)
return True
def restart_bot(self, bot_id):
entry = None
for e in self.bots_state:
if e.get("id") == bot_id:
entry = e
break
if entry is None:
raise ValueError(f"Unknown bot: {bot_id}")
self.stop_bot(bot_id)
return self.start_bot(
template_id=entry["template_id"],
name=entry["name"],
bot_id=bot_id,
storage_dir=entry["storage_dir"],
)
def delete_bot(self, bot_id):
# Stop it first
self.stop_bot(bot_id)
# Remove from state
entry = None
for i, e in enumerate(self.bots_state):
if e.get("id") == bot_id:
entry = e
del self.bots_state[i]
break
if entry:
# Delete storage dir
storage_dir = entry.get("storage_dir")
if storage_dir and os.path.exists(storage_dir):
try:
shutil.rmtree(storage_dir)
except Exception as exc:
logger.warning(
"Failed to delete storage dir for bot %s: %s",
bot_id,
exc,
)
self._save_state()
logger.info("Deleted bot %s", bot_id)
return True
return False
def get_bot_identity_path(self, bot_id):
entry = None
for e in self.bots_state:
if e.get("id") == bot_id:
entry = e
break
if not entry:
return None
storage_dir = entry.get("storage_dir")
if not storage_dir:
return None
# LXMFy stores identity in the 'config' subdirectory by default
id_path = os.path.join(storage_dir, "config", "identity")
if os.path.exists(id_path):
return id_path
# Fallback to direct identity file if it was moved or configured differently
id_path_alt = os.path.join(storage_dir, "identity")
if os.path.exists(id_path_alt):
return id_path_alt
# LXMFy may nest inside config/lxmf
id_path_lxmf = os.path.join(storage_dir, "config", "lxmf", "identity")
if os.path.exists(id_path_lxmf):
return id_path_lxmf
return None
def _load_identity_for_bot(self, bot_id):
identity_path = self.get_bot_identity_path(bot_id)
if not identity_path:
return None
try:
return RNS.Identity.from_file(identity_path)
except Exception:
return None
@staticmethod
def _is_pid_alive(pid):
if not pid:
return False
try:
os.kill(pid, 0)
return True
except OSError:
return False
def stop_all(self):
for bot_id in list(self.running_bots.keys()):
self.stop_bot(bot_id)


@@ -0,0 +1,45 @@
import argparse
import os
from meshchatx.src.backend.bot_templates import (
EchoBotTemplate,
NoteBotTemplate,
ReminderBotTemplate,
)
TEMPLATE_MAP = {
"echo": EchoBotTemplate,
"note": NoteBotTemplate,
"reminder": ReminderBotTemplate,
}
def main():
parser = argparse.ArgumentParser()
parser.add_argument("--template", required=True, choices=TEMPLATE_MAP.keys())
parser.add_argument("--name", required=True)
parser.add_argument("--storage", required=True)
args = parser.parse_args()
os.makedirs(args.storage, exist_ok=True)
os.chdir(args.storage)
BotCls = TEMPLATE_MAP[args.template]
# LXMFy hardcodes its config directory to os.path.join(os.getcwd(), 'config').
# By chdir'ing into args.storage, we ensure 'config' and data are kept within that folder.
bot_instance = BotCls(name=args.name, storage_path=args.storage, test_mode=False)
# Optional immediate announce for reachability
try:
if hasattr(bot_instance.bot, "announce_enabled"):
bot_instance.bot.announce_enabled = True
if hasattr(bot_instance.bot, "_announce"):
bot_instance.bot._announce()
except Exception:
pass
bot_instance.run()
if __name__ == "__main__":
main()
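# Illustrative invocation sketch; the module path is an assumption based on
# the import above, and the storage path is just an example:
#
#   python -m meshchatx.src.backend.bot_runner \
#       --template echo --name "Echo Bot" --storage /tmp/echo-bot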


@@ -0,0 +1,265 @@
import re
import time
from datetime import datetime, timedelta
from lxmfy import IconAppearance, LXMFBot, pack_icon_appearance_field
HAS_LXMFY = True
class StoppableBot:
def __init__(self):
self._stop_event = None
def set_stop_event(self, stop_event):
self._stop_event = stop_event
def should_stop(self):
return self._stop_event and self._stop_event.is_set()
class EchoBotTemplate(StoppableBot):
def __init__(self, name="Echo Bot", storage_path=None, test_mode=False):
super().__init__()
self.bot = LXMFBot(
name=name,
announce=600,
command_prefix="",
first_message_enabled=True,
test_mode=test_mode,
storage_path=storage_path,
)
self.setup_commands()
self.setup_message_handlers()
icon_data = IconAppearance(
icon_name="forum",
fg_color=b"\xad\xd8\xe6",
bg_color=b"\x3b\x59\x98",
)
self.icon_lxmf_field = pack_icon_appearance_field(icon_data)
def setup_message_handlers(self):
@self.bot.on_message()
def echo_non_command_messages(sender, message):
if self.should_stop():
return True
content = message.content.decode("utf-8").strip()
if not content:
return False
command_name = content.split()[0]
if command_name in self.bot.commands:
return False
self.bot.send(
sender,
content,
lxmf_fields=self.icon_lxmf_field,
)
return False
def setup_commands(self):
@self.bot.command(name="echo", description="Echo back your message")
def echo(ctx):
if self.should_stop():
return
if ctx.args:
ctx.reply(" ".join(ctx.args), lxmf_fields=self.icon_lxmf_field)
else:
ctx.reply("Usage: echo <message>", lxmf_fields=self.icon_lxmf_field)
@self.bot.on_first_message()
def welcome(sender, message):
if self.should_stop():
return True
content = message.content.decode("utf-8").strip()
self.bot.send(
sender,
f"Hi! I'm an echo bot, You said: {content}\n\n"
"Try: echo <message> to make me repeat things!",
lxmf_fields=self.icon_lxmf_field,
)
return True
def run(self):
self.bot.scheduler.start()
try:
while not self.should_stop():
for _ in range(self.bot.queue.qsize()):
lxm = self.bot.queue.get()
if self.bot.router:
self.bot.router.handle_outbound(lxm)
time.sleep(1)
finally:
self.bot.cleanup()
class NoteBotTemplate(StoppableBot):
def __init__(self, name="Note Bot", storage_path=None, test_mode=False):
super().__init__()
self.bot = LXMFBot(
name=name,
announce=600,
command_prefix="/",
storage_type="json",
storage_path=storage_path or "data/notes",
test_mode=test_mode,
)
self.setup_commands()
def setup_commands(self):
@self.bot.command(name="note", description="Save a note")
def save_note(ctx):
if self.should_stop():
return
if not ctx.args:
ctx.reply("Usage: /note <your note>")
return
note = {
"text": " ".join(ctx.args),
"timestamp": datetime.now().isoformat(),
"tags": [w[1:] for w in ctx.args if w.startswith("#")],
}
notes = self.bot.storage.get(f"notes:{ctx.sender}", [])
notes.append(note)
self.bot.storage.set(f"notes:{ctx.sender}", notes)
ctx.reply("Note saved!")
@self.bot.command(name="notes", description="List your notes")
def list_notes(ctx):
if self.should_stop():
return
notes = self.bot.storage.get(f"notes:{ctx.sender}", [])
if not notes:
ctx.reply("You haven't saved any notes yet!")
return
if not ctx.args:
response = "Your Notes:\n"
for i, note in enumerate(notes[-10:], 1):
tags = (
" ".join(f"#{tag}" for tag in note["tags"])
if note["tags"]
else ""
)
response += f"{i}. {note['text']} {tags}\n"
if len(notes) > 10:
response += f"\nShowing last 10 of {len(notes)} notes. Use /notes all to see all."
ctx.reply(response)
elif ctx.args[0] == "all":
response = "All Your Notes:\n"
for i, note in enumerate(notes, 1):
tags = (
" ".join(f"#{tag}" for tag in note["tags"])
if note["tags"]
else ""
)
response += f"{i}. {note['text']} {tags}\n"
ctx.reply(response)
def run(self):
self.bot.scheduler.start()
try:
while not self.should_stop():
for _ in range(self.bot.queue.qsize()):
lxm = self.bot.queue.get()
if self.bot.router:
self.bot.router.handle_outbound(lxm)
time.sleep(1)
finally:
self.bot.cleanup()
class ReminderBotTemplate(StoppableBot):
def __init__(self, name="Reminder Bot", storage_path=None, test_mode=False):
super().__init__()
self.bot = LXMFBot(
name=name,
announce=600,
command_prefix="/",
storage_type="sqlite",
storage_path=storage_path or "data/reminders.db",
test_mode=test_mode,
)
self.setup_commands()
self.bot.scheduler.add_task(
"check_reminders",
self._check_reminders,
"*/1 * * * *",
)
def setup_commands(self):
@self.bot.command(name="remind", description="Set a reminder")
def remind(ctx):
if self.should_stop():
return
if not ctx.args or len(ctx.args) < 2:
ctx.reply(
"Usage: /remind <time> <message>\nExample: /remind 1h30m Buy groceries",
)
return
time_str = ctx.args[0].lower()
message = " ".join(ctx.args[1:])
total_minutes = 0
time_parts = re.findall(r"(\d+)([dhm])", time_str)
for value, unit in time_parts:
if unit == "d":
total_minutes += int(value) * 24 * 60
elif unit == "h":
total_minutes += int(value) * 60
elif unit == "m":
total_minutes += int(value)
if total_minutes == 0:
ctx.reply("Invalid time format. Use combinations of d, h, m")
return
remind_time = datetime.now() + timedelta(minutes=total_minutes)
reminder = {
"user": ctx.sender,
"message": message,
"time": remind_time.timestamp(),
"created": time.time(),
}
reminders = self.bot.storage.get("reminders", [])
reminders.append(reminder)
self.bot.storage.set("reminders", reminders)
ctx.reply(
f"I'll remind you about '{message}' at {remind_time.strftime('%Y-%m-%d %H:%M:%S')}",
)
def _check_reminders(self):
if self.should_stop():
return
reminders = self.bot.storage.get("reminders", [])
current_time = time.time()
due_reminders = [r for r in reminders if r["time"] <= current_time]
remaining = [r for r in reminders if r["time"] > current_time]
for reminder in due_reminders:
self.bot.send(reminder["user"], f"Reminder: {reminder['message']}")
if due_reminders:
self.bot.storage.set("reminders", remaining)
def run(self):
self.bot.scheduler.start()
try:
while not self.should_stop():
for _ in range(self.bot.queue.qsize()):
lxm = self.bot.queue.get()
if self.bot.router:
self.bot.router.handle_outbound(lxm)
time.sleep(1)
finally:
self.bot.cleanup()
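# A minimal standalone sketch of the /remind time-string parsing used above;
# parse_duration_minutes is a hypothetical helper for illustration only.
import re

def parse_duration_minutes(time_str: str) -> int:
    # "1h30m" -> 90; input with no d/h/m parts yields 0, which the command treats as invalid
    minutes = 0
    for value, unit in re.findall(r"(\d+)([dhm])", time_str.lower()):
        minutes += int(value) * {"d": 24 * 60, "h": 60, "m": 1}[unit]
    return minutes

assert parse_duration_minutes("1h30m") == 90
assert parse_duration_minutes("2d") == 2 * 24 * 60
assert parse_duration_minutes("soon") == 0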


@@ -0,0 +1,82 @@
import asyncio
import time
from typing import Any
class CommunityInterfacesManager:
def __init__(self):
self.interfaces = [
{
"name": "RNS Testnet Amsterdam",
"type": "TCPClientInterface",
"target_host": "amsterdam.connect.reticulum.network",
"target_port": 4965,
"description": "Reticulum Testnet Hub",
},
{
"name": "RNS Testnet BetweenTheBorders",
"type": "TCPClientInterface",
"target_host": "reticulum.betweentheborders.com",
"target_port": 4242,
"description": "Reticulum Testnet Hub",
},
]
self.status_cache = {}
self.last_check = 0
self.check_interval = 600 # Check every 10 minutes
async def check_health(self, host: str, port: int) -> bool:
try:
# Simple TCP connect check as a proxy for "working"
# In a real RNS environment, we might want to use RNS.Transport.probe()
# but that requires Reticulum to be running with a configured interface to that target.
# For "suggested" interfaces, we just check if they are reachable.
reader, writer = await asyncio.wait_for(
asyncio.open_connection(host, port),
timeout=3.0,
)
writer.close()
await writer.wait_closed()
return True
except Exception:
return False
async def update_statuses(self):
tasks = [
self.check_health(iface["target_host"], iface["target_port"])
for iface in self.interfaces
]
results = await asyncio.gather(*tasks)
for iface, is_online in zip(self.interfaces, results):
self.status_cache[iface["name"]] = {
"online": is_online,
"last_check": time.time(),
}
self.last_check = time.time()
async def get_interfaces(self) -> list[dict[str, Any]]:
# If cache is old or empty, update it
if time.time() - self.last_check > self.check_interval or not self.status_cache:
# Ideally this check would run in the background so requests aren't blocked;
# for now we refresh inline.
await self.update_statuses()
results = []
for iface in self.interfaces:
status = self.status_cache.get(
iface["name"],
{"online": False, "last_check": 0},
)
results.append(
{
**iface,
"online": status["online"],
"last_check": status["last_check"],
},
)
# Sort so online ones are first
results.sort(key=lambda x: x["online"], reverse=True)
return results
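# Illustrative usage sketch (not part of the diff): print the reachability of
# the suggested interfaces, assuming this class is importable as defined above.
import asyncio

async def _demo():
    manager = CommunityInterfacesManager()
    for iface in await manager.get_interfaces():
        state = "online" if iface["online"] else "offline"
        print(f"{iface['name']} ({iface['target_host']}:{iface['target_port']}) -> {state}")

if __name__ == "__main__":
    asyncio.run(_demo())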


@@ -48,6 +48,11 @@ class ConfigManager:
"lxmf_preferred_propagation_node_destination_hash",
None,
)
self.lxmf_preferred_propagation_node_auto_select = self.BoolConfig(
self,
"lxmf_preferred_propagation_node_auto_select",
False,
)
self.lxmf_preferred_propagation_node_auto_sync_interval_seconds = (
self.IntConfig(
self,
@@ -60,6 +65,8 @@ class ConfigManager:
"lxmf_preferred_propagation_node_last_synced_at",
None,
)
self.lxmf_address_hash = self.StringConfig(self, "lxmf_address_hash", None)
self.lxst_address_hash = self.StringConfig(self, "lxst_address_hash", None)
self.lxmf_local_propagation_node_enabled = self.BoolConfig(
self,
"lxmf_local_propagation_node_enabled",
@@ -101,6 +108,7 @@ class ConfigManager:
"archives_max_storage_gb",
1,
)
self.backup_max_count = self.IntConfig(self, "backup_max_count", 5)
self.crawler_enabled = self.BoolConfig(self, "crawler_enabled", False)
self.crawler_max_retries = self.IntConfig(self, "crawler_max_retries", 3)
self.crawler_retry_delay_seconds = self.IntConfig(
@@ -112,6 +120,34 @@ class ConfigManager:
self.auth_enabled = self.BoolConfig(self, "auth_enabled", False)
self.auth_password_hash = self.StringConfig(self, "auth_password_hash", None)
self.auth_session_secret = self.StringConfig(self, "auth_session_secret", None)
self.docs_downloaded = self.BoolConfig(self, "docs_downloaded", False)
self.initial_docs_download_attempted = self.BoolConfig(
self,
"initial_docs_download_attempted",
False,
)
self.gitea_base_url = self.StringConfig(
self,
"gitea_base_url",
"https://git.quad4.io",
)
self.docs_download_urls = self.StringConfig(
self,
"docs_download_urls",
"https://git.quad4.io/Reticulum/reticulum_website/archive/main.zip,https://github.com/markqvist/reticulum_website/archive/refs/heads/main.zip",
)
# desktop config
self.desktop_open_calls_in_separate_window = self.BoolConfig(
self,
"desktop_open_calls_in_separate_window",
False,
)
self.desktop_hardware_acceleration_enabled = self.BoolConfig(
self,
"desktop_hardware_acceleration_enabled",
True,
)
# voicemail config
self.voicemail_enabled = self.BoolConfig(self, "voicemail_enabled", False)
@@ -130,19 +166,65 @@ class ConfigManager:
"voicemail_max_recording_seconds",
60,
)
self.voicemail_tts_speed = self.IntConfig(self, "voicemail_tts_speed", 130)
self.voicemail_tts_pitch = self.IntConfig(self, "voicemail_tts_pitch", 45)
self.voicemail_tts_voice = self.StringConfig(
self,
"voicemail_tts_voice",
"en-us+f3",
)
self.voicemail_tts_word_gap = self.IntConfig(self, "voicemail_tts_word_gap", 5)
# ringtone config
self.custom_ringtone_enabled = self.BoolConfig(
self, "custom_ringtone_enabled", False
self,
"custom_ringtone_enabled",
False,
)
self.ringtone_filename = self.StringConfig(self, "ringtone_filename", None)
self.ringtone_preferred_id = self.IntConfig(self, "ringtone_preferred_id", 0)
self.ringtone_volume = self.IntConfig(self, "ringtone_volume", 100)
# telephony config
self.do_not_disturb_enabled = self.BoolConfig(
self, "do_not_disturb_enabled", False
self,
"do_not_disturb_enabled",
False,
)
self.telephone_allow_calls_from_contacts_only = self.BoolConfig(
self, "telephone_allow_calls_from_contacts_only", False
self,
"telephone_allow_calls_from_contacts_only",
False,
)
self.telephone_audio_profile_id = self.IntConfig(
self,
"telephone_audio_profile_id",
2, # Default to Voice (profile 2)
)
self.telephone_web_audio_enabled = self.BoolConfig(
self,
"telephone_web_audio_enabled",
False,
)
self.telephone_web_audio_allow_fallback = self.BoolConfig(
self,
"telephone_web_audio_allow_fallback",
True,
)
self.call_recording_enabled = self.BoolConfig(
self,
"call_recording_enabled",
False,
)
self.telephone_tone_generator_enabled = self.BoolConfig(
self,
"telephone_tone_generator_enabled",
True,
)
self.telephone_tone_generator_volume = self.IntConfig(
self,
"telephone_tone_generator_volume",
50,
)
# map config
@@ -168,6 +250,60 @@ class ConfigManager:
"https://nominatim.openstreetmap.org",
)
# telemetry config
self.telemetry_enabled = self.BoolConfig(self, "telemetry_enabled", False)
# translator config
self.translator_enabled = self.BoolConfig(self, "translator_enabled", False)
self.libretranslate_url = self.StringConfig(
self,
"libretranslate_url",
"http://localhost:5000",
)
# location config
self.location_source = self.StringConfig(self, "location_source", "browser")
self.location_manual_lat = self.StringConfig(self, "location_manual_lat", "0.0")
self.location_manual_lon = self.StringConfig(self, "location_manual_lon", "0.0")
self.location_manual_alt = self.StringConfig(self, "location_manual_alt", "0.0")
# banishment config
self.banished_effect_enabled = self.BoolConfig(
self,
"banished_effect_enabled",
True,
)
self.banished_text = self.StringConfig(
self,
"banished_text",
"BANISHED",
)
self.banished_color = self.StringConfig(
self,
"banished_color",
"#dc2626",
)
self.message_font_size = self.IntConfig(self, "message_font_size", 14)
self.message_icon_size = self.IntConfig(self, "message_icon_size", 28)
# blackhole integration config
self.blackhole_integration_enabled = self.BoolConfig(
self,
"blackhole_integration_enabled",
True,
)
# csp config so users can set extra CSP sources for local offgrid environments (tile servers, etc.)
self.csp_extra_connect_src = self.StringConfig(
self,
"csp_extra_connect_src",
"",
)
self.csp_extra_img_src = self.StringConfig(self, "csp_extra_img_src", "")
self.csp_extra_frame_src = self.StringConfig(self, "csp_extra_frame_src", "")
self.csp_extra_script_src = self.StringConfig(self, "csp_extra_script_src", "")
self.csp_extra_style_src = self.StringConfig(self, "csp_extra_style_src", "")
def get(self, key: str, default_value=None) -> str | None:
return self.db.config.get(key, default_value)


@@ -1,7 +1,14 @@
import os
import shutil
import zipfile
from datetime import UTC, datetime
from .announces import AnnounceDAO
from .config import ConfigDAO
from .contacts import ContactsDAO
from .debug_logs import DebugLogsDAO
from .legacy_migrator import LegacyMigrator
from .map_drawings import MapDrawingsDAO
from .messages import MessageDAO
from .misc import MiscDAO
from .provider import DatabaseProvider
@@ -25,6 +32,8 @@ class Database:
self.voicemails = VoicemailDAO(self.provider)
self.ringtones = RingtoneDAO(self.provider)
self.contacts = ContactsDAO(self.provider)
self.map_drawings = MapDrawingsDAO(self.provider)
self.debug_logs = DebugLogsDAO(self.provider)
def initialize(self):
self.schema.initialize()
@@ -42,5 +51,288 @@ class Database:
def execute_sql(self, query, params=None):
return self.provider.execute(query, params)
def _tune_sqlite_pragmas(self):
try:
self.execute_sql("PRAGMA wal_autocheckpoint=1000")
self.execute_sql("PRAGMA temp_store=MEMORY")
self.execute_sql("PRAGMA journal_mode=WAL")
except Exception as exc:
print(f"SQLite pragma setup failed: {exc}")
def _get_pragma_value(self, pragma: str, default=None):
try:
cursor = self.execute_sql(f"PRAGMA {pragma}")
row = cursor.fetchone()
if row is None:
return default
return row[0]
except Exception:
return default
def _get_database_file_stats(self):
def size_for(path):
try:
return os.path.getsize(path)
except OSError:
return 0
db_path = self.provider.db_path
wal_path = f"{db_path}-wal"
shm_path = f"{db_path}-shm"
main_bytes = size_for(db_path)
wal_bytes = size_for(wal_path)
shm_bytes = size_for(shm_path)
return {
"main_bytes": main_bytes,
"wal_bytes": wal_bytes,
"shm_bytes": shm_bytes,
"total_bytes": main_bytes + wal_bytes + shm_bytes,
}
def _database_paths(self):
db_path = self.provider.db_path
return {
"main": db_path,
"wal": f"{db_path}-wal",
"shm": f"{db_path}-shm",
}
def get_database_health_snapshot(self):
page_size = self._get_pragma_value("page_size", 0) or 0
page_count = self._get_pragma_value("page_count", 0) or 0
freelist_pages = self._get_pragma_value("freelist_count", 0) or 0
free_bytes = (
page_size * freelist_pages if page_size > 0 and freelist_pages > 0 else 0
)
return {
"quick_check": self._get_pragma_value("quick_check", "unknown"),
"journal_mode": self._get_pragma_value("journal_mode", "unknown"),
"synchronous": self._get_pragma_value("synchronous", None),
"wal_autocheckpoint": self._get_pragma_value("wal_autocheckpoint", None),
"auto_vacuum": self._get_pragma_value("auto_vacuum", None),
"page_size": page_size,
"page_count": page_count,
"freelist_pages": freelist_pages,
"estimated_free_bytes": free_bytes,
"files": self._get_database_file_stats(),
}
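# Example shape of the snapshot returned above (all values illustrative):
# {
#     "quick_check": "ok", "journal_mode": "wal", "synchronous": 2,
#     "wal_autocheckpoint": 1000, "auto_vacuum": 0,
#     "page_size": 4096, "page_count": 1024, "freelist_pages": 3,
#     "estimated_free_bytes": 12288,
#     "files": {"main_bytes": 4194304, "wal_bytes": 32768, "shm_bytes": 32768, "total_bytes": 4259840},
# }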
def _checkpoint_wal(self, mode: str = "TRUNCATE"):
return self.execute_sql(f"PRAGMA wal_checkpoint({mode})").fetchall()
def run_database_vacuum(self):
try:
# Attempt to checkpoint WAL, ignore errors if busy
try:
self._checkpoint_wal()
except Exception as e:
print(
f"Warning: WAL checkpoint during vacuum failed (non-critical): {e}",
)
self.execute_sql("VACUUM")
self._tune_sqlite_pragmas()
return {
"health": self.get_database_health_snapshot(),
}
except Exception as e:
# Wrap in a cleaner error message
raise Exception(f"Database vacuum failed: {e!s}")
def run_database_recovery(self):
actions = []
actions.append(
{
"step": "quick_check_before",
"result": self._get_pragma_value("quick_check", "unknown"),
},
)
actions.append({"step": "wal_checkpoint", "result": self._checkpoint_wal()})
integrity_rows = self.provider.integrity_check()
integrity = [row[0] for row in integrity_rows] if integrity_rows else []
actions.append({"step": "integrity_check", "result": integrity})
self.provider.vacuum()
self._tune_sqlite_pragmas()
actions.append(
{
"step": "quick_check_after",
"result": self._get_pragma_value("quick_check", "unknown"),
},
)
return {
"actions": actions,
"health": self.get_database_health_snapshot(),
}
def _checkpoint_and_close(self):
try:
self._checkpoint_wal()
except Exception as e:
print(f"Failed to checkpoint WAL: {e}")
try:
self.close()
except Exception as e:
print(f"Failed to close database: {e}")
def close(self):
if hasattr(self, "provider"):
self.provider.close()
def close_all(self):
if hasattr(self, "provider"):
self.provider.close_all()
def _backup_to_zip(self, backup_path: str):
paths = self._database_paths()
os.makedirs(os.path.dirname(backup_path), exist_ok=True)
# ensure WAL is checkpointed to get a consistent snapshot
self._checkpoint_wal()
main_filename = os.path.basename(paths["main"])
with zipfile.ZipFile(backup_path, "w", compression=zipfile.ZIP_DEFLATED) as zf:
zf.write(paths["main"], arcname=main_filename)
if os.path.exists(paths["wal"]):
zf.write(paths["wal"], arcname=f"{main_filename}-wal")
if os.path.exists(paths["shm"]):
zf.write(paths["shm"], arcname=f"{main_filename}-shm")
return {
"path": backup_path,
"size": os.path.getsize(backup_path),
}
def backup_database(
self,
storage_path,
backup_path: str | None = None,
max_count: int | None = None,
):
default_dir = os.path.join(storage_path, "database-backups")
os.makedirs(default_dir, exist_ok=True)
if backup_path is None:
timestamp = datetime.now(UTC).strftime("%Y%m%d-%H%M%S")
backup_path = os.path.join(default_dir, f"backup-{timestamp}.zip")
result = self._backup_to_zip(backup_path)
# Cleanup old backups if a limit is set
if max_count is not None and max_count > 0:
try:
backups = []
for file in os.listdir(default_dir):
if file.endswith(".zip"):
full_path = os.path.join(default_dir, file)
stats = os.stat(full_path)
backups.append((full_path, stats.st_mtime))
if len(backups) > max_count:
# Sort by modification time (oldest first)
backups.sort(key=lambda x: x[1])
to_delete = backups[: len(backups) - max_count]
for path, _ in to_delete:
if os.path.exists(path):
os.remove(path)
except Exception as e:
print(f"Failed to cleanup old backups: {e}")
return result
def create_snapshot(self, storage_path, name: str):
"""Creates a named snapshot of the database."""
snapshot_dir = os.path.join(storage_path, "snapshots")
os.makedirs(snapshot_dir, exist_ok=True)
# Ensure name is safe for filesystem
safe_name = "".join(
[c for c in name if c.isalnum() or c in (" ", ".", "-", "_")],
).strip()
if not safe_name:
safe_name = "unnamed_snapshot"
snapshot_path = os.path.join(snapshot_dir, f"{safe_name}.zip")
return self._backup_to_zip(snapshot_path)
def list_snapshots(self, storage_path):
"""Lists all available snapshots."""
snapshot_dir = os.path.join(storage_path, "snapshots")
if not os.path.exists(snapshot_dir):
return []
snapshots = []
for file in os.listdir(snapshot_dir):
if file.endswith(".zip"):
full_path = os.path.join(snapshot_dir, file)
stats = os.stat(full_path)
snapshots.append(
{
"name": file[:-4],
"path": full_path,
"size": stats.st_size,
"created_at": datetime.fromtimestamp(
stats.st_mtime,
UTC,
).isoformat(),
},
)
return sorted(snapshots, key=lambda x: x["created_at"], reverse=True)
def delete_snapshot_or_backup(
self,
storage_path,
filename: str,
is_backup: bool = False,
):
"""Deletes a database snapshot or auto-backup."""
base_dir = "database-backups" if is_backup else "snapshots"
file_path = os.path.join(storage_path, base_dir, filename)
# Basic security check to ensure we stay within the intended directory
abs_path = os.path.abspath(file_path)
abs_base = os.path.abspath(os.path.join(storage_path, base_dir))
if not abs_path.startswith(abs_base):
msg = "Invalid path"
raise ValueError(msg)
if os.path.exists(abs_path):
os.remove(abs_path)
return True
return False
def restore_database(self, backup_path: str):
if not os.path.exists(backup_path):
msg = f"Backup not found at {backup_path}"
raise FileNotFoundError(msg)
paths = self._database_paths()
self._checkpoint_and_close()
# clean existing files
for p in paths.values():
if os.path.exists(p):
os.remove(p)
if zipfile.is_zipfile(backup_path):
with zipfile.ZipFile(backup_path, "r") as zf:
zf.extractall(os.path.dirname(paths["main"]))
else:
shutil.copy2(backup_path, paths["main"])
# reopen and retune
self.initialize()
self._tune_sqlite_pragmas()
integrity = self.provider.integrity_check()
return {
"restored_from": backup_path,
"integrity_check": integrity,
}
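# Illustrative backup/restore round trip (hypothetical paths; constructing the
# Database instance is outside this diff):
#
# info = db.backup_database("/path/to/storage", max_count=5)      # timestamped zip
# snap = db.create_snapshot("/path/to/storage", "before-upgrade")
# db.restore_database(snap["path"])  # checkpoints WAL, replaces the files, re-initializes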


@@ -30,7 +30,7 @@ class AnnounceDAO:
)
query = (
f"INSERT INTO announces ({columns}, created_at, updated_at) VALUES ({placeholders}, ?, ?) "
f"INSERT INTO announces ({columns}, created_at, updated_at) VALUES ({placeholders}, ?, ?) " # noqa: S608
f"ON CONFLICT(destination_hash) DO UPDATE SET {update_set}, updated_at = EXCLUDED.updated_at"
)
@@ -54,10 +54,21 @@ class AnnounceDAO:
(destination_hash,),
)
def delete_all_announces(self, aspect=None):
if aspect:
self.provider.execute(
"DELETE FROM announces WHERE aspect = ?",
(aspect,),
)
else:
self.provider.execute("DELETE FROM announces")
def get_filtered_announces(
self,
aspect=None,
search_term=None,
identity_hash=None,
destination_hash=None,
limit=None,
offset=0,
):
@@ -66,6 +77,12 @@ class AnnounceDAO:
if aspect:
query += " AND aspect = ?"
params.append(aspect)
if identity_hash:
query += " AND identity_hash = ?"
params.append(identity_hash)
if destination_hash:
query += " AND destination_hash = ?"
params.append(destination_hash)
if search_term:
query += " AND (destination_hash LIKE ? OR identity_hash LIKE ?)"
like_term = f"%{search_term}%"
@@ -129,3 +146,12 @@ class AnnounceDAO:
"DELETE FROM favourite_destinations WHERE destination_hash = ?",
(destination_hash,),
)
def delete_all_favourites(self, aspect=None):
if aspect:
self.provider.execute(
"DELETE FROM favourite_destinations WHERE aspect = ?",
(aspect,),
)
else:
self.provider.execute("DELETE FROM favourite_destinations")


@@ -5,16 +5,38 @@ class ContactsDAO:
def __init__(self, provider: DatabaseProvider):
self.provider = provider
def add_contact(
self,
name,
remote_identity_hash,
lxmf_address=None,
lxst_address=None,
preferred_ringtone_id=None,
custom_image=None,
is_telemetry_trusted=0,
):
self.provider.execute(
"""
INSERT INTO contacts (name, remote_identity_hash, lxmf_address, lxst_address, preferred_ringtone_id, custom_image, is_telemetry_trusted)
VALUES (?, ?, ?, ?, ?, ?, ?)
ON CONFLICT(remote_identity_hash) DO UPDATE SET
name = EXCLUDED.name,
lxmf_address = COALESCE(EXCLUDED.lxmf_address, contacts.lxmf_address),
lxst_address = COALESCE(EXCLUDED.lxst_address, contacts.lxst_address),
preferred_ringtone_id = EXCLUDED.preferred_ringtone_id,
custom_image = EXCLUDED.custom_image,
is_telemetry_trusted = EXCLUDED.is_telemetry_trusted,
updated_at = CURRENT_TIMESTAMP
""",
(
name,
remote_identity_hash,
lxmf_address,
lxst_address,
preferred_ringtone_id,
custom_image,
is_telemetry_trusted,
),
)
def get_contacts(self, search=None, limit=100, offset=0):
@@ -22,10 +44,17 @@ class ContactsDAO:
return self.provider.fetchall(
"""
SELECT * FROM contacts
WHERE name LIKE ? OR remote_identity_hash LIKE ? OR lxmf_address LIKE ? OR lxst_address LIKE ?
ORDER BY name ASC LIMIT ? OFFSET ?
""",
(f"%{search}%", f"%{search}%", limit, offset),
(
f"%{search}%",
f"%{search}%",
f"%{search}%",
f"%{search}%",
limit,
offset,
),
)
return self.provider.fetchall(
"SELECT * FROM contacts ORDER BY name ASC LIMIT ? OFFSET ?",
@@ -38,28 +67,58 @@ class ContactsDAO:
(contact_id,),
)
def update_contact(
self,
contact_id,
name=None,
remote_identity_hash=None,
lxmf_address=None,
lxst_address=None,
preferred_ringtone_id=None,
custom_image=None,
clear_image=False,
is_telemetry_trusted=None,
):
updates = []
params = []
if name is not None:
updates.append("name = ?")
params.append(name)
if remote_identity_hash is not None:
updates.append("remote_identity_hash = ?")
params.append(remote_identity_hash)
if lxmf_address is not None:
updates.append("lxmf_address = ?")
params.append(lxmf_address)
if lxst_address is not None:
updates.append("lxst_address = ?")
params.append(lxst_address)
if preferred_ringtone_id is not None:
updates.append("preferred_ringtone_id = ?")
params.append(preferred_ringtone_id)
if is_telemetry_trusted is not None:
updates.append("is_telemetry_trusted = ?")
params.append(1 if is_telemetry_trusted else 0)
if clear_image:
updates.append("custom_image = NULL")
elif custom_image is not None:
updates.append("custom_image = ?")
params.append(custom_image)
if not updates:
return
updates.append("updated_at = CURRENT_TIMESTAMP")
query = f"UPDATE contacts SET {', '.join(updates)} WHERE id = ?"
params.append(contact_id)
self.provider.execute(query, tuple(params))
def delete_contact(self, contact_id):
self.provider.execute("DELETE FROM contacts WHERE id = ?", (contact_id,))
def get_contact_by_identity_hash(self, remote_identity_hash):
return self.provider.fetchone(
"SELECT * FROM contacts WHERE remote_identity_hash = ?",
(remote_identity_hash,),
"SELECT * FROM contacts WHERE remote_identity_hash = ? OR lxmf_address = ? OR lxst_address = ?",
(remote_identity_hash, remote_identity_hash, remote_identity_hash),
)
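# Usage sketch for the dynamic UPDATE builder above (values hypothetical):
# only the fields actually passed end up in the SET clause, so partial
# updates leave every other column untouched.
#
# dao.update_contact(42, name="Alice", is_telemetry_trusted=True)
# dao.update_contact(42, clear_image=True)  # sets custom_image = NULL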


@@ -0,0 +1,98 @@
from datetime import UTC, datetime
from .provider import DatabaseProvider
class DebugLogsDAO:
def __init__(self, provider: DatabaseProvider):
self.provider = provider
def insert_log(self, level, module, message, is_anomaly=0, anomaly_type=None):
sql = """
INSERT INTO debug_logs (timestamp, level, module, message, is_anomaly, anomaly_type)
VALUES (?, ?, ?, ?, ?, ?)
"""
self.provider.execute(
sql,
(
datetime.now(UTC).timestamp(),
level,
module,
message,
is_anomaly,
anomaly_type,
),
)
def get_logs(
self,
limit=100,
offset=0,
search=None,
level=None,
module=None,
is_anomaly=None,
):
sql = "SELECT * FROM debug_logs WHERE 1=1"
params = []
if search:
sql += " AND (message LIKE ? OR module LIKE ?)"
params.extend([f"%{search}%", f"%{search}%"])
if level:
sql += " AND level = ?"
params.append(level)
if module:
sql += " AND module = ?"
params.append(module)
if is_anomaly is not None:
sql += " AND is_anomaly = ?"
params.append(1 if is_anomaly else 0)
sql += " ORDER BY timestamp DESC LIMIT ? OFFSET ?"
params.extend([limit, offset])
return self.provider.fetchall(sql, tuple(params))
def get_total_count(self, search=None, level=None, module=None, is_anomaly=None):
sql = "SELECT COUNT(*) as count FROM debug_logs WHERE 1=1"
params = []
if search:
sql += " AND (message LIKE ? OR module LIKE ?)"
params.extend([f"%{search}%", f"%{search}%"])
if level:
sql += " AND level = ?"
params.append(level)
if module:
sql += " AND module = ?"
params.append(module)
if is_anomaly is not None:
sql += " AND is_anomaly = ?"
params.append(1 if is_anomaly else 0)
row = self.provider.fetchone(sql, tuple(params))
return row["count"] if row else 0
def cleanup_old_logs(self, max_logs=10000):
"""Removes old logs keeping only the newest max_logs."""
count = self.get_total_count()
if count > max_logs:
# Find the timestamp of the N-th newest log
sql = "SELECT timestamp FROM debug_logs ORDER BY timestamp DESC LIMIT 1 OFFSET ?"
row = self.provider.fetchone(sql, (max_logs - 1,))
if row:
cutoff_ts = row["timestamp"]
self.provider.execute(
"DELETE FROM debug_logs WHERE timestamp < ?",
(cutoff_ts,),
)
def get_anomalies(self, limit=50):
return self.get_logs(limit=limit, is_anomaly=True)


@@ -0,0 +1,49 @@
from datetime import UTC, datetime
from .provider import DatabaseProvider
class MapDrawingsDAO:
def __init__(self, provider: DatabaseProvider):
self.provider = provider
def upsert_drawing(self, identity_hash, name, data):
now = datetime.now(UTC)
# Check if drawing with same name exists for this user
existing = self.provider.fetchone(
"SELECT id FROM map_drawings WHERE identity_hash = ? AND name = ?",
(identity_hash, name),
)
if existing:
self.provider.execute(
"UPDATE map_drawings SET data = ?, updated_at = ? WHERE id = ?",
(data, now, existing["id"]),
)
else:
self.provider.execute(
"""
INSERT INTO map_drawings (identity_hash, name, data, created_at, updated_at)
VALUES (?, ?, ?, ?, ?)
""",
(identity_hash, name, data, now, now),
)
def get_drawings(self, identity_hash):
return self.provider.fetchall(
"SELECT * FROM map_drawings WHERE identity_hash = ? ORDER BY updated_at DESC",
(identity_hash,),
)
def delete_drawing(self, drawing_id):
self.provider.execute(
"DELETE FROM map_drawings WHERE id = ?",
(drawing_id,),
)
def update_drawing(self, drawing_id, name, data):
now = datetime.now(UTC)
self.provider.execute(
"UPDATE map_drawings SET name = ?, data = ?, updated_at = ? WHERE id = ?",
(name, data, now, drawing_id),
)


@@ -18,6 +18,7 @@ class MessageDAO:
"hash",
"source_hash",
"destination_hash",
"peer_hash",
"state",
"progress",
"is_incoming",
@@ -39,7 +40,7 @@ class MessageDAO:
update_set = ", ".join([f"{f} = EXCLUDED.{f}" for f in fields if f != "hash"])
query = (
f"INSERT INTO lxmf_messages ({columns}, created_at, updated_at) VALUES ({placeholders}, ?, ?) "
f"INSERT INTO lxmf_messages ({columns}, created_at, updated_at) VALUES ({placeholders}, ?, ?) " # noqa: S608
f"ON CONFLICT(hash) DO UPDATE SET {update_set}, updated_at = EXCLUDED.updated_at"
)
@@ -62,30 +63,45 @@ class MessageDAO:
(message_hash,),
)
def delete_lxmf_messages_by_hashes(self, message_hashes):
if not message_hashes:
return
placeholders = ", ".join(["?"] * len(message_hashes))
self.provider.execute(
f"DELETE FROM lxmf_messages WHERE hash IN ({placeholders})",
tuple(message_hashes),
)
def delete_lxmf_message_by_hash(self, message_hash):
self.provider.execute(
"DELETE FROM lxmf_messages WHERE hash = ?",
(message_hash,),
)
def delete_all_lxmf_messages(self):
self.provider.execute("DELETE FROM lxmf_messages")
self.provider.execute("DELETE FROM lxmf_conversation_read_state")
def get_all_lxmf_messages(self):
return self.provider.fetchall("SELECT * FROM lxmf_messages")
def get_conversation_messages(self, destination_hash, limit=100, offset=0):
return self.provider.fetchall(
"SELECT * FROM lxmf_messages WHERE destination_hash = ? OR source_hash = ? ORDER BY timestamp DESC LIMIT ? OFFSET ?",
(destination_hash, destination_hash, limit, offset),
"SELECT * FROM lxmf_messages WHERE peer_hash = ? ORDER BY timestamp DESC LIMIT ? OFFSET ?",
(destination_hash, limit, offset),
)
def get_conversations(self):
# We need the latest message for each peer; this is a bit complex in raw SQL,
# but the peer_hash column lets us do it with a single self-join.
query = """
SELECT m1.* FROM lxmf_messages m1
INNER JOIN (
SELECT peer_hash, MAX(timestamp) as max_ts
FROM lxmf_messages
WHERE peer_hash IS NOT NULL
GROUP BY peer_hash
) m2 ON m1.peer_hash = m2.peer_hash AND m1.timestamp = m2.max_ts
GROUP BY m1.peer_hash
ORDER BY m1.timestamp DESC
"""
return self.provider.fetchall(query)
@@ -103,16 +119,32 @@ class MessageDAO:
(destination_hash, now, now, now),
)
def mark_conversations_as_read(self, destination_hashes):
if not destination_hashes:
return
now = datetime.now(UTC).isoformat()
for destination_hash in destination_hashes:
self.provider.execute(
"""
INSERT INTO lxmf_conversation_read_state (destination_hash, last_read_at, created_at, updated_at)
VALUES (?, ?, ?, ?)
ON CONFLICT(destination_hash) DO UPDATE SET
last_read_at = EXCLUDED.last_read_at,
updated_at = EXCLUDED.updated_at
""",
(destination_hash, now, now, now),
)
def is_conversation_unread(self, destination_hash):
row = self.provider.fetchone(
"""
SELECT m.timestamp, r.last_read_at
FROM lxmf_messages m
LEFT JOIN lxmf_conversation_read_state r ON r.destination_hash = ?
WHERE m.peer_hash = ?
ORDER BY m.timestamp DESC LIMIT 1
""",
(destination_hash, destination_hash),
)
if not row:
@@ -140,17 +172,75 @@ class MessageDAO:
def get_failed_messages_for_destination(self, destination_hash):
return self.provider.fetchall(
"SELECT * FROM lxmf_messages WHERE state = 'failed' AND destination_hash = ? ORDER BY id ASC",
"SELECT * FROM lxmf_messages WHERE state = 'failed' AND peer_hash = ? ORDER BY id ASC",
(destination_hash,),
)
def get_failed_messages_count(self, destination_hash):
row = self.provider.fetchone(
"SELECT COUNT(*) as count FROM lxmf_messages WHERE state = 'failed' AND destination_hash = ?",
"SELECT COUNT(*) as count FROM lxmf_messages WHERE state = 'failed' AND peer_hash = ?",
(destination_hash,),
)
return row["count"] if row else 0
def get_conversations_unread_states(self, destination_hashes):
if not destination_hashes:
return {}
placeholders = ", ".join(["?"] * len(destination_hashes))
query = f"""
SELECT peer_hash, MAX(timestamp) as latest_ts, last_read_at
FROM lxmf_messages m
LEFT JOIN lxmf_conversation_read_state r ON r.destination_hash = m.peer_hash
WHERE m.peer_hash IN ({placeholders})
GROUP BY m.peer_hash
""" # noqa: S608
rows = self.provider.fetchall(query, destination_hashes)
unread_states = {}
for row in rows:
peer_hash = row["peer_hash"]
latest_ts = row["latest_ts"]
last_read_at_str = row["last_read_at"]
if not last_read_at_str:
unread_states[peer_hash] = True
continue
last_read_at = datetime.fromisoformat(last_read_at_str)
if last_read_at.tzinfo is None:
last_read_at = last_read_at.replace(tzinfo=UTC)
unread_states[peer_hash] = latest_ts > last_read_at.timestamp()
return unread_states
def get_conversations_failed_counts(self, destination_hashes):
if not destination_hashes:
return {}
placeholders = ", ".join(["?"] * len(destination_hashes))
rows = self.provider.fetchall(
f"SELECT peer_hash, COUNT(*) as count FROM lxmf_messages WHERE state = 'failed' AND peer_hash IN ({placeholders}) GROUP BY peer_hash", # noqa: S608
tuple(destination_hashes),
)
return {row["peer_hash"]: row["count"] for row in rows}
def get_conversations_attachment_states(self, destination_hashes):
if not destination_hashes:
return {}
placeholders = ", ".join(["?"] * len(destination_hashes))
query = f"""
SELECT peer_hash, 1 as has_attachments
FROM lxmf_messages
WHERE peer_hash IN ({placeholders})
AND fields IS NOT NULL AND fields != '{{}}' AND fields != ''
GROUP BY peer_hash
""" # noqa: S608
rows = self.provider.fetchall(query, destination_hashes)
return {row["peer_hash"]: True for row in rows}
# Forwarding Mappings
def get_forwarding_mapping(
self,
@@ -232,3 +322,56 @@ class MessageDAO:
last_viewed_at = last_viewed_at.replace(tzinfo=UTC)
return message_timestamp <= last_viewed_at.timestamp()
# Folders
def get_all_folders(self):
return self.provider.fetchall("SELECT * FROM lxmf_folders ORDER BY name ASC")
def create_folder(self, name):
now = datetime.now(UTC).isoformat()
return self.provider.execute(
"INSERT INTO lxmf_folders (name, created_at, updated_at) VALUES (?, ?, ?)",
(name, now, now),
)
def rename_folder(self, folder_id, new_name):
now = datetime.now(UTC).isoformat()
self.provider.execute(
"UPDATE lxmf_folders SET name = ?, updated_at = ? WHERE id = ?",
(new_name, now, folder_id),
)
def delete_folder(self, folder_id):
self.provider.execute("DELETE FROM lxmf_folders WHERE id = ?", (folder_id,))
def get_conversation_folder(self, peer_hash):
return self.provider.fetchone(
"SELECT * FROM lxmf_conversation_folders WHERE peer_hash = ?",
(peer_hash,),
)
def move_conversation_to_folder(self, peer_hash, folder_id):
now = datetime.now(UTC).isoformat()
if folder_id is None:
self.provider.execute(
"DELETE FROM lxmf_conversation_folders WHERE peer_hash = ?",
(peer_hash,),
)
else:
self.provider.execute(
"""
INSERT INTO lxmf_conversation_folders (peer_hash, folder_id, created_at, updated_at)
VALUES (?, ?, ?, ?)
ON CONFLICT(peer_hash) DO UPDATE SET
folder_id = EXCLUDED.folder_id,
updated_at = EXCLUDED.updated_at
""",
(peer_hash, folder_id, now, now),
)
def move_conversations_to_folder(self, peer_hashes, folder_id):
for peer_hash in peer_hashes:
self.move_conversation_to_folder(peer_hash, folder_id)
def get_all_conversation_folders(self):
return self.provider.fetchall("SELECT * FROM lxmf_conversation_folders")


@@ -90,6 +90,24 @@ class MiscDAO:
(destination_hash,),
)
def get_user_icons(self, destination_hashes):
if not destination_hashes:
return []
placeholders = ", ".join(["?"] * len(destination_hashes))
return self.provider.fetchall(
f"SELECT * FROM lxmf_user_icons WHERE destination_hash IN ({placeholders})", # noqa: S608
tuple(destination_hashes),
)
def delete_user_icon(self, destination_hash):
self.provider.execute(
"DELETE FROM lxmf_user_icons WHERE destination_hash = ?",
(destination_hash,),
)
def delete_all_user_icons(self):
self.provider.execute("DELETE FROM lxmf_user_icons")
# Forwarding Rules
def get_forwarding_rules(self, identity_hash=None, active_only=False):
query = "SELECT * FROM lxmf_forwarding_rules WHERE 1=1"
@@ -165,8 +183,14 @@ class MiscDAO:
sql += " ORDER BY created_at DESC"
return self.provider.fetchall(sql, params)
def delete_archived_pages(self, destination_hash=None, page_path=None, ids=None):
if ids:
placeholders = ", ".join(["?"] * len(ids))
self.provider.execute(
f"DELETE FROM archived_pages WHERE id IN ({placeholders})", # noqa: S608
tuple(ids),
)
elif destination_hash and page_path:
self.provider.execute(
"DELETE FROM archived_pages WHERE destination_hash = ? AND page_path = ?",
(destination_hash, page_path),
@@ -185,13 +209,14 @@ class MiscDAO:
now = datetime.now(UTC)
self.provider.execute(
"""
INSERT INTO crawl_tasks (destination_hash, page_path, status, retry_count, created_at, updated_at)
VALUES (?, ?, ?, ?, ?, ?)
ON CONFLICT(destination_hash, page_path) DO UPDATE SET
status = EXCLUDED.status,
retry_count = EXCLUDED.retry_count,
updated_at = EXCLUDED.updated_at
""",
(destination_hash, page_path, status, retry_count, now, now),
)
def get_pending_crawl_tasks(self):
@@ -205,6 +230,8 @@ class MiscDAO:
"page_path",
"status",
"retry_count",
"last_retry_at",
"next_retry_at",
"updated_at",
}
filtered_kwargs = {k: v for k, v in kwargs.items() if k in allowed_keys}
@@ -252,7 +279,7 @@ class MiscDAO:
if notification_ids:
placeholders = ", ".join(["?"] * len(notification_ids))
self.provider.execute(
f"UPDATE notifications SET is_viewed = 1 WHERE id IN ({placeholders})",
f"UPDATE notifications SET is_viewed = 1 WHERE id IN ({placeholders})", # noqa: S608
notification_ids,
)
else:
@@ -263,3 +290,51 @@ class MiscDAO:
"SELECT COUNT(*) as count FROM notifications WHERE is_viewed = 0",
)
return row["count"] if row else 0
# Keyboard Shortcuts
def get_keyboard_shortcuts(self, identity_hash):
return self.provider.fetchall(
"SELECT * FROM keyboard_shortcuts WHERE identity_hash = ?",
(identity_hash,),
)
def upsert_keyboard_shortcut(self, identity_hash, action, keys):
now = datetime.now(UTC)
self.provider.execute(
"""
INSERT INTO keyboard_shortcuts (identity_hash, action, keys, created_at, updated_at)
VALUES (?, ?, ?, ?, ?)
ON CONFLICT(identity_hash, action) DO UPDATE SET
keys = EXCLUDED.keys,
updated_at = EXCLUDED.updated_at
""",
(identity_hash, action, keys, now, now),
)
def delete_keyboard_shortcut(self, identity_hash, action):
self.provider.execute(
"DELETE FROM keyboard_shortcuts WHERE identity_hash = ? AND action = ?",
(identity_hash, action),
)
# Last Sent Icon Hashes
def get_last_sent_icon_hash(self, destination_hash):
row = self.provider.fetchone(
"SELECT icon_hash FROM lxmf_last_sent_icon_hashes WHERE destination_hash = ?",
(destination_hash,),
)
return row["icon_hash"] if row else None
def update_last_sent_icon_hash(self, destination_hash, icon_hash):
now = datetime.now(UTC)
self.provider.execute(
"""
INSERT INTO lxmf_last_sent_icon_hashes (destination_hash, icon_hash, created_at, updated_at)
VALUES (?, ?, ?, ?)
ON CONFLICT(destination_hash) DO UPDATE SET icon_hash = EXCLUDED.icon_hash, updated_at = EXCLUDED.updated_at
""",
(destination_hash, icon_hash, now, now),
)
def clear_last_sent_icon_hashes(self):
self.provider.execute("DELETE FROM lxmf_last_sent_icon_hashes")


@@ -1,14 +1,17 @@
import sqlite3
import threading
import weakref
class DatabaseProvider:
_instance = None
_lock = threading.Lock()
_all_locals = weakref.WeakSet()
def __init__(self, db_path=None):
self.db_path = db_path
self._local = threading.local()
self._all_locals.add(self._local)
@classmethod
def get_instance(cls, db_path=None):
@@ -27,39 +30,114 @@ class DatabaseProvider:
@property
def connection(self):
if not hasattr(self._local, "connection"):
# isolation_level=None enables autocommit mode, letting us manage transactions manually
self._local.connection = sqlite3.connect(
self.db_path,
check_same_thread=False,
isolation_level=None,
)
self._local.connection.row_factory = sqlite3.Row
# Enable WAL mode for better concurrency
self._local.connection.execute("PRAGMA journal_mode=WAL")
return self._local.connection
def execute(self, query, params=None, commit=None):
cursor = self.connection.cursor()
# Convert any datetime objects in params to ISO strings to avoid DeprecationWarning in Python 3.12+
if params:
from datetime import datetime
if isinstance(params, dict):
params = {
k: (v.isoformat() if isinstance(v, datetime) else v)
for k, v in params.items()
}
else:
params = tuple(
(p.isoformat() if isinstance(p, datetime) else p) for p in params
)
if params:
cursor.execute(query, params)
else:
cursor.execute(query)
# In autocommit mode (isolation_level=None), in_transaction is True
# only if we explicitly started one with BEGIN and haven't committed/rolled back.
if commit is True:
self.connection.commit()
elif commit is False:
pass
# Default behavior: if a manual transaction is open (via begin()), leave
# the commit or rollback to the caller.
elif not self.connection.in_transaction:
# With isolation_level=None no statement opens a transaction implicitly,
# so the statement above has already been applied in autocommit mode.
pass
return cursor
def begin(self):
try:
self.connection.execute("BEGIN")
except sqlite3.OperationalError as e:
if "within a transaction" in str(e):
pass
else:
raise
def commit(self):
if self.connection.in_transaction:
self.connection.commit()
def rollback(self):
if self.connection.in_transaction:
self.connection.rollback()
def __enter__(self):
self.begin()
return self
def __exit__(self, exc_type, exc_val, exc_tb):
if exc_type:
self.rollback()
else:
self.commit()
def fetchone(self, query, params=None):
cursor = self.execute(query, params)
row = cursor.fetchone()
return dict(row) if row else None
def fetchall(self, query, params=None):
cursor = self.execute(query, params)
rows = cursor.fetchall()
return [dict(row) for row in rows]
def close(self):
if hasattr(self._local, "connection"):
try:
self.commit() # Ensure everything is saved
self._local.connection.close()
except Exception: # noqa: S110
pass
del self._local.connection
def close_all(self):
with self._lock:
for loc in self._all_locals:
if hasattr(loc, "connection"):
try:
loc.connection.commit()
loc.connection.close()
except Exception: # noqa: S110
pass
del loc.connection
def vacuum(self):
self.execute("VACUUM")
# VACUUM cannot run inside a transaction
self.commit()
self.connection.execute("VACUUM")
def integrity_check(self):
return self.fetchall("PRAGMA integrity_check")


@@ -9,12 +9,13 @@ class RingtoneDAO:
def get_all(self):
return self.provider.fetchall(
"SELECT * FROM ringtones ORDER BY created_at DESC"
"SELECT * FROM ringtones ORDER BY created_at DESC",
)
def get_by_id(self, ringtone_id):
return self.provider.fetchone(
"SELECT * FROM ringtones WHERE id = ?", (ringtone_id,)
"SELECT * FROM ringtones WHERE id = ?",
(ringtone_id,),
)
def get_primary(self):
@@ -42,7 +43,8 @@ class RingtoneDAO:
if is_primary == 1:
# reset others
self.provider.execute(
"UPDATE ringtones SET is_primary = 0, updated_at = ?", (now,)
"UPDATE ringtones SET is_primary = 0, updated_at = ?",
(now,),
)
if display_name is not None and is_primary is not None:


@@ -2,11 +2,22 @@ from .provider import DatabaseProvider
class DatabaseSchema:
LATEST_VERSION = 37
def __init__(self, provider: DatabaseProvider):
self.provider = provider
def _safe_execute(self, query, params=None):
try:
return self.provider.execute(query, params)
except Exception as e:
# Silence expected errors during migrations (e.g. duplicate columns/indexes)
err_msg = str(e).lower()
if "duplicate column name" in err_msg or "already exists" in err_msg:
return None
print(f"Database operation failed: {query[:100]}... Error: {e}")
return None
def initialize(self):
# Create core tables if they don't exist
self._create_initial_tables()
@@ -15,18 +26,124 @@ class DatabaseSchema:
current_version = self._get_current_version()
self.migrate(current_version)
def _ensure_column(self, table_name, column_name, column_type):
"""Add a column to a table if it doesn't exist."""
# First check if it exists using PRAGMA
cursor = self.provider.connection.cursor()
try:
cursor.execute(f"PRAGMA table_info({table_name})")
columns = [row[1] for row in cursor.fetchall()]
finally:
cursor.close()
if column_name not in columns:
try:
# SQLite has limitations on ALTER TABLE ADD COLUMN:
# 1. Cannot add UNIQUE or PRIMARY KEY columns
# 2. Cannot add columns with non-constant defaults (like CURRENT_TIMESTAMP)
# Strip non-constant defaults if present for the ALTER TABLE statement
stmt_type = column_type
forbidden_defaults = [
"CURRENT_TIMESTAMP",
"CURRENT_TIME",
"CURRENT_DATE",
]
for forbidden in forbidden_defaults:
if f"DEFAULT {forbidden}" in stmt_type.upper():
# Remove the DEFAULT part for the ALTER statement
import re
stmt_type = re.sub(
f"DEFAULT\\s+{forbidden}",
"",
stmt_type,
flags=re.IGNORECASE,
).strip()
# Use the connection directly to avoid any middle-ware issues
res = self._safe_execute(
f"ALTER TABLE {table_name} ADD COLUMN {column_name} {stmt_type}",
)
return res is not None
except Exception as e:
# Log but don't crash, we might be able to continue
print(
f"Unexpected error adding column {column_name} to {table_name}: {e}",
)
return False
return True
return True
def _sync_table_columns(self, table_name, create_sql):
"""Parses a CREATE TABLE statement and ensures all columns exist in the actual table.
This is a robust way to handle legacy tables that are missing columns.
"""
# Find the first '(' and the last ')'
start_idx = create_sql.find("(")
end_idx = create_sql.rfind(")")
if start_idx == -1 or end_idx == -1:
return
inner_content = create_sql[start_idx + 1 : end_idx]
# Split by comma but ignore commas inside parentheses (e.g. DECIMAL(10,2))
definitions = []
depth = 0
current = ""
for char in inner_content:
if char == "(":
depth += 1
elif char == ")":
depth -= 1
if char == "," and depth == 0:
definitions.append(current.strip())
current = ""
else:
current += char
if current.strip():
definitions.append(current.strip())
for definition in definitions:
definition = definition.strip()
# Skip table-level constraints
if not definition or definition.upper().startswith(
("PRIMARY KEY", "FOREIGN KEY", "UNIQUE", "CHECK"),
):
continue
parts = definition.split(None, 1)
if not parts:
continue
column_name = parts[0].strip('"').strip("`").strip("[").strip("]")
column_type = parts[1] if len(parts) > 1 else "TEXT"
# Special case for column types that are already PRIMARY KEY
if "PRIMARY KEY" in column_type.upper() and column_name.upper() != "ID":
# We usually don't want to ALTER TABLE ADD COLUMN with PRIMARY KEY
# unless it's the main ID which should already exist
continue
self._ensure_column(table_name, column_name, column_type)
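# Worked example for the parser above (hypothetical schema): given
#   CREATE TABLE IF NOT EXISTS contacts (
#       id INTEGER PRIMARY KEY AUTOINCREMENT,
#       name TEXT,
#       lxmf_address TEXT,
#       UNIQUE(name)
#   )
# the splitter yields ["id INTEGER PRIMARY KEY AUTOINCREMENT", "name TEXT",
# "lxmf_address TEXT"]; the UNIQUE(...) table constraint is skipped, and
# _ensure_column issues ALTER TABLE ADD COLUMN for any column missing from
# the live table.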
def _get_current_version(self):
try:
row = self.provider.fetchone(
"SELECT value FROM config WHERE key = ?",
("database_version",),
)
if row:
return int(row["value"])
except Exception as e:
print(f"Failed to get database version: {e}")
return 0
def _create_initial_tables(self):
# We create the config table first so we can track version
self.provider.execute("""
config_sql = """
CREATE TABLE IF NOT EXISTS config (
id INTEGER PRIMARY KEY AUTOINCREMENT,
key TEXT UNIQUE,
@@ -34,7 +151,9 @@ class DatabaseSchema:
created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP
)
""")
"""
self._safe_execute(config_sql)
self._sync_table_columns("config", config_sql)
# Other essential tables that were present from version 1
# Peewee automatically creates tables if they don't exist.
@@ -81,6 +200,7 @@ class DatabaseSchema:
hash TEXT UNIQUE,
source_hash TEXT,
destination_hash TEXT,
peer_hash TEXT,
state TEXT,
progress REAL,
is_incoming INTEGER,
@@ -155,6 +275,7 @@ class DatabaseSchema:
next_retry_at DATETIME,
status TEXT DEFAULT 'pending',
created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP,
UNIQUE(destination_hash, page_path)
)
""",
@@ -227,6 +348,17 @@ class DatabaseSchema:
UNIQUE(destination_hash, timestamp)
)
""",
"telemetry_tracking": """
CREATE TABLE IF NOT EXISTS telemetry_tracking (
id INTEGER PRIMARY KEY AUTOINCREMENT,
destination_hash TEXT UNIQUE,
is_tracking INTEGER DEFAULT 1,
interval_seconds INTEGER DEFAULT 60,
last_request_at REAL,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP
)
""",
"ringtones": """
CREATE TABLE IF NOT EXISTS ringtones (
id INTEGER PRIMARY KEY AUTOINCREMENT,
@@ -243,6 +375,8 @@ class DatabaseSchema:
id INTEGER PRIMARY KEY AUTOINCREMENT,
name TEXT,
remote_identity_hash TEXT UNIQUE,
lxmf_address TEXT,
lxst_address TEXT,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP
)
@@ -259,54 +393,139 @@ class DatabaseSchema:
created_at DATETIME DEFAULT CURRENT_TIMESTAMP
)
""",
"keyboard_shortcuts": """
CREATE TABLE IF NOT EXISTS keyboard_shortcuts (
id INTEGER PRIMARY KEY AUTOINCREMENT,
identity_hash TEXT,
action TEXT,
keys TEXT,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP,
UNIQUE(identity_hash, action)
)
""",
"map_drawings": """
CREATE TABLE IF NOT EXISTS map_drawings (
id INTEGER PRIMARY KEY AUTOINCREMENT,
identity_hash TEXT,
name TEXT,
data TEXT,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP
)
""",
"lxmf_last_sent_icon_hashes": """
CREATE TABLE IF NOT EXISTS lxmf_last_sent_icon_hashes (
destination_hash TEXT PRIMARY KEY,
icon_hash TEXT,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP
)
""",
"debug_logs": """
CREATE TABLE IF NOT EXISTS debug_logs (
id INTEGER PRIMARY KEY AUTOINCREMENT,
timestamp REAL,
level TEXT,
module TEXT,
message TEXT,
is_anomaly INTEGER DEFAULT 0,
anomaly_type TEXT,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP
)
""",
"lxmf_folders": """
CREATE TABLE IF NOT EXISTS lxmf_folders (
id INTEGER PRIMARY KEY AUTOINCREMENT,
name TEXT UNIQUE,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP
)
""",
"lxmf_conversation_folders": """
CREATE TABLE IF NOT EXISTS lxmf_conversation_folders (
id INTEGER PRIMARY KEY AUTOINCREMENT,
peer_hash TEXT UNIQUE,
folder_id INTEGER,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP,
FOREIGN KEY (folder_id) REFERENCES lxmf_folders(id) ON DELETE CASCADE
)
""",
}
for table_name, create_sql in tables.items():
self._safe_execute(create_sql)
# Robust self-healing: Ensure existing tables have all modern columns
self._sync_table_columns(table_name, create_sql)
# Create indexes that were present
if table_name == "announces":
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_announces_aspect ON announces(aspect)",
)
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_announces_identity_hash ON announces(identity_hash)",
)
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_announces_updated_at ON announces(updated_at)",
)
elif table_name == "lxmf_messages":
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_lxmf_messages_source_hash ON lxmf_messages(source_hash)",
)
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_lxmf_messages_destination_hash ON lxmf_messages(destination_hash)",
)
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_lxmf_messages_peer_hash ON lxmf_messages(peer_hash)",
)
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_lxmf_messages_timestamp ON lxmf_messages(timestamp)",
)
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_lxmf_messages_peer_ts ON lxmf_messages(peer_hash, timestamp)",
)
elif table_name == "blocked_destinations":
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_blocked_destinations_hash ON blocked_destinations(destination_hash)",
)
elif table_name == "spam_keywords":
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_spam_keywords_keyword ON spam_keywords(keyword)",
)
elif table_name == "notification_viewed_state":
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_notification_viewed_state_destination_hash ON notification_viewed_state(destination_hash)",
)
self._safe_execute(
"CREATE UNIQUE INDEX IF NOT EXISTS idx_notification_viewed_state_dest_hash_unique ON notification_viewed_state(destination_hash)",
)
elif table_name == "lxmf_telemetry":
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_lxmf_telemetry_destination_hash ON lxmf_telemetry(destination_hash)",
)
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_lxmf_telemetry_timestamp ON lxmf_telemetry(timestamp)",
)
self._safe_execute(
"CREATE UNIQUE INDEX IF NOT EXISTS idx_lxmf_telemetry_dest_ts_unique ON lxmf_telemetry(destination_hash, timestamp)",
)
elif table_name == "debug_logs":
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_debug_logs_timestamp ON debug_logs(timestamp)",
)
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_debug_logs_level ON debug_logs(level)",
)
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_debug_logs_anomaly ON debug_logs(is_anomaly)",
)
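# A minimal sketch of the _safe_execute helper these migrations rely on; its
# real definition is outside the lines shown here. Assuming it wraps
# provider.execute and tolerates re-runs, swallowing errors such as
# "duplicate column name" or "index already exists" so schema setup stays
# idempotent (a module-level `import logging` is also assumed):
def _safe_execute(self, sql, params=None):
    try:
        if params is not None:
            return self.provider.execute(sql, params)
        return self.provider.execute(sql)
    except Exception as exc:
        # Expected on re-runs: the column, index, or table already exists
        logging.debug(f"_safe_execute skipped statement: {exc}")
        return None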
def migrate(self, current_version):
if current_version < 7:
self._safe_execute("""
CREATE TABLE IF NOT EXISTS archived_pages (
id INTEGER PRIMARY KEY AUTOINCREMENT,
destination_hash TEXT,
@@ -316,18 +535,18 @@ class DatabaseSchema:
created_at DATETIME DEFAULT CURRENT_TIMESTAMP
)
""")
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_archived_pages_destination_hash ON archived_pages(destination_hash)",
)
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_archived_pages_page_path ON archived_pages(page_path)",
)
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_archived_pages_hash ON archived_pages(hash)",
)
if current_version < 8:
self._safe_execute("""
CREATE TABLE IF NOT EXISTS crawl_tasks (
id INTEGER PRIMARY KEY AUTOINCREMENT,
destination_hash TEXT,
@@ -339,15 +558,15 @@ class DatabaseSchema:
created_at DATETIME DEFAULT CURRENT_TIMESTAMP
)
""")
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_crawl_tasks_destination_hash ON crawl_tasks(destination_hash)",
)
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_crawl_tasks_page_path ON crawl_tasks(page_path)",
)
if current_version < 9:
self._safe_execute("""
CREATE TABLE IF NOT EXISTS lxmf_forwarding_rules (
id INTEGER PRIMARY KEY AUTOINCREMENT,
name TEXT,
@@ -359,11 +578,11 @@ class DatabaseSchema:
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP
)
""")
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_lxmf_forwarding_rules_identity_hash ON lxmf_forwarding_rules(identity_hash)",
)
self._safe_execute("""
CREATE TABLE IF NOT EXISTS lxmf_forwarding_mappings (
id INTEGER PRIMARY KEY AUTOINCREMENT,
alias_identity_private_key TEXT,
@@ -374,13 +593,13 @@ class DatabaseSchema:
created_at DATETIME DEFAULT CURRENT_TIMESTAMP
)
""")
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_lxmf_forwarding_mappings_alias_hash ON lxmf_forwarding_mappings(alias_hash)",
)
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_lxmf_forwarding_mappings_sender_hash ON lxmf_forwarding_mappings(original_sender_hash)",
)
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_lxmf_forwarding_mappings_recipient_hash ON lxmf_forwarding_mappings(final_recipient_hash)",
)
@@ -390,62 +609,58 @@ class DatabaseSchema:
# but a UNIQUE index works for ON CONFLICT.
# Clean up duplicates before adding unique indexes
self._safe_execute(
"DELETE FROM announces WHERE id NOT IN (SELECT MAX(id) FROM announces GROUP BY destination_hash)",
)
self._safe_execute(
"DELETE FROM crawl_tasks WHERE id NOT IN (SELECT MAX(id) FROM crawl_tasks GROUP BY destination_hash, page_path)",
)
self._safe_execute(
"DELETE FROM custom_destination_display_names WHERE id NOT IN (SELECT MAX(id) FROM custom_destination_display_names GROUP BY destination_hash)",
)
self._safe_execute(
"DELETE FROM favourite_destinations WHERE id NOT IN (SELECT MAX(id) FROM favourite_destinations GROUP BY destination_hash)",
)
self._safe_execute(
"DELETE FROM lxmf_user_icons WHERE id NOT IN (SELECT MAX(id) FROM lxmf_user_icons GROUP BY destination_hash)",
)
self._safe_execute(
"DELETE FROM lxmf_conversation_read_state WHERE id NOT IN (SELECT MAX(id) FROM lxmf_conversation_read_state GROUP BY destination_hash)",
)
self._safe_execute(
"DELETE FROM lxmf_messages WHERE id NOT IN (SELECT MAX(id) FROM lxmf_messages GROUP BY hash)",
)
self._safe_execute(
"CREATE UNIQUE INDEX IF NOT EXISTS idx_announces_destination_hash_unique ON announces(destination_hash)",
)
self._safe_execute(
"CREATE UNIQUE INDEX IF NOT EXISTS idx_crawl_tasks_destination_path_unique ON crawl_tasks(destination_hash, page_path)",
)
self._safe_execute(
"CREATE UNIQUE INDEX IF NOT EXISTS idx_custom_display_names_dest_hash_unique ON custom_destination_display_names(destination_hash)",
)
self._safe_execute(
"CREATE UNIQUE INDEX IF NOT EXISTS idx_favourite_destinations_dest_hash_unique ON favourite_destinations(destination_hash)",
)
self._safe_execute(
"CREATE UNIQUE INDEX IF NOT EXISTS idx_lxmf_messages_hash_unique ON lxmf_messages(hash)",
)
self._safe_execute(
"CREATE UNIQUE INDEX IF NOT EXISTS idx_lxmf_user_icons_dest_hash_unique ON lxmf_user_icons(destination_hash)",
)
self._safe_execute(
"CREATE UNIQUE INDEX IF NOT EXISTS idx_lxmf_conversation_read_state_dest_hash_unique ON lxmf_conversation_read_state(destination_hash)",
)
if current_version < 11:
# Add is_spam column to lxmf_messages if it doesn't exist
self._safe_execute(
"ALTER TABLE lxmf_messages ADD COLUMN is_spam INTEGER DEFAULT 0",
)
if current_version < 12:
self._safe_execute("""
CREATE TABLE IF NOT EXISTS call_history (
id INTEGER PRIMARY KEY AUTOINCREMENT,
remote_identity_hash TEXT,
@@ -457,15 +672,15 @@ class DatabaseSchema:
created_at DATETIME DEFAULT CURRENT_TIMESTAMP
)
""")
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_call_history_remote_hash ON call_history(remote_identity_hash)",
)
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_call_history_timestamp ON call_history(timestamp)",
)
if current_version < 13:
self._safe_execute("""
CREATE TABLE IF NOT EXISTS voicemails (
id INTEGER PRIMARY KEY AUTOINCREMENT,
remote_identity_hash TEXT,
@@ -477,15 +692,15 @@ class DatabaseSchema:
created_at DATETIME DEFAULT CURRENT_TIMESTAMP
)
""")
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_voicemails_remote_hash ON voicemails(remote_identity_hash)",
)
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_voicemails_timestamp ON voicemails(timestamp)",
)
if current_version < 14:
self._safe_execute("""
CREATE TABLE IF NOT EXISTS notification_viewed_state (
id INTEGER PRIMARY KEY AUTOINCREMENT,
destination_hash TEXT UNIQUE,
@@ -494,15 +709,15 @@ class DatabaseSchema:
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP
)
""")
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_notification_viewed_state_destination_hash ON notification_viewed_state(destination_hash)",
)
self._safe_execute(
"CREATE UNIQUE INDEX IF NOT EXISTS idx_notification_viewed_state_dest_hash_unique ON notification_viewed_state(destination_hash)",
)
if current_version < 15:
self._safe_execute("""
CREATE TABLE IF NOT EXISTS lxmf_telemetry (
id INTEGER PRIMARY KEY AUTOINCREMENT,
destination_hash TEXT,
@@ -515,26 +730,23 @@ class DatabaseSchema:
UNIQUE(destination_hash, timestamp)
)
""")
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_lxmf_telemetry_destination_hash ON lxmf_telemetry(destination_hash)",
)
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_lxmf_telemetry_timestamp ON lxmf_telemetry(timestamp)",
)
self._safe_execute(
"CREATE UNIQUE INDEX IF NOT EXISTS idx_lxmf_telemetry_dest_ts_unique ON lxmf_telemetry(destination_hash, timestamp)",
)
if current_version < 16:
self._safe_execute(
"ALTER TABLE lxmf_forwarding_rules ADD COLUMN name TEXT",
)
if current_version < 17:
self._safe_execute("""
CREATE TABLE IF NOT EXISTS ringtones (
id INTEGER PRIMARY KEY AUTOINCREMENT,
filename TEXT,
@@ -547,7 +759,7 @@ class DatabaseSchema:
""")
if current_version < 18:
self._safe_execute("""
CREATE TABLE IF NOT EXISTS contacts (
id INTEGER PRIMARY KEY AUTOINCREMENT,
name TEXT,
@@ -556,20 +768,20 @@ class DatabaseSchema:
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP
)
""")
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_contacts_name ON contacts(name)",
)
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_contacts_remote_identity_hash ON contacts(remote_identity_hash)",
)
if current_version < 19:
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_call_history_remote_name ON call_history(remote_identity_name)",
)
if current_version < 20:
self._safe_execute("""
CREATE TABLE IF NOT EXISTS notifications (
id INTEGER PRIMARY KEY AUTOINCREMENT,
type TEXT,
@@ -581,15 +793,214 @@ class DatabaseSchema:
created_at DATETIME DEFAULT CURRENT_TIMESTAMP
)
""")
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_notifications_remote_hash ON notifications(remote_hash)",
)
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_notifications_timestamp ON notifications(timestamp)",
)
if current_version < 21:
self._safe_execute("""
CREATE TABLE IF NOT EXISTS keyboard_shortcuts (
id INTEGER PRIMARY KEY AUTOINCREMENT,
identity_hash TEXT,
action TEXT,
keys TEXT,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP,
UNIQUE(identity_hash, action)
)
""")
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_keyboard_shortcuts_identity_hash ON keyboard_shortcuts(identity_hash)",
)
if current_version < 22:
# Optimize fetching conversations and favorites
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_lxmf_messages_timestamp ON lxmf_messages(timestamp)",
)
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_favourite_destinations_aspect ON favourite_destinations(aspect)",
)
# Add index on updated_at for faster filtering of announces by recency
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_announces_updated_at ON announces(updated_at)",
)
if current_version < 23:
# Further optimize conversation fetching
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_lxmf_messages_conv_optim ON lxmf_messages(source_hash, destination_hash, timestamp DESC)",
)
# Add index for unread message filtering
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_lxmf_messages_state_incoming ON lxmf_messages(state, is_incoming)",
)
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_announces_aspect ON announces(aspect)",
)
if current_version < 24:
self._safe_execute("""
CREATE TABLE IF NOT EXISTS call_recordings (
id INTEGER PRIMARY KEY AUTOINCREMENT,
remote_identity_hash TEXT,
remote_identity_name TEXT,
filename_rx TEXT,
filename_tx TEXT,
duration_seconds INTEGER,
timestamp REAL,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP
)
""")
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_call_recordings_remote_hash ON call_recordings(remote_identity_hash)",
)
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_call_recordings_timestamp ON call_recordings(timestamp)",
)
if current_version < 25:
# Add docs_downloaded to config if not exists
self._safe_execute(
"INSERT OR IGNORE INTO config (key, value) VALUES (?, ?)",
("docs_downloaded", "0"),
)
if current_version < 26:
# Add initial_docs_download_attempted to config if not exists
self._safe_execute(
"INSERT OR IGNORE INTO config (key, value) VALUES (?, ?)",
("initial_docs_download_attempted", "0"),
)
if current_version < 28:
# Add preferred_ringtone_id to contacts
self._safe_execute(
"ALTER TABLE contacts ADD COLUMN preferred_ringtone_id INTEGER DEFAULT NULL",
)
if current_version < 29:
# Performance optimization indexes
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_lxmf_messages_peer_hash ON lxmf_messages(peer_hash)",
)
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_lxmf_messages_timestamp ON lxmf_messages(timestamp)",
)
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_lxmf_messages_peer_ts ON lxmf_messages(peer_hash, timestamp)",
)
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_announces_updated_at ON announces(updated_at)",
)
if current_version < 30:
# Add custom_image to contacts
self._safe_execute(
"ALTER TABLE contacts ADD COLUMN custom_image TEXT DEFAULT NULL",
)
if current_version < 31:
self._safe_execute("""
CREATE TABLE IF NOT EXISTS lxmf_last_sent_icon_hashes (
destination_hash TEXT PRIMARY KEY,
icon_hash TEXT,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP
)
""")
if current_version < 32:
# Add tutorial_seen and changelog_seen_version to config
self._safe_execute(
"INSERT OR IGNORE INTO config (key, value) VALUES (?, ?)",
("tutorial_seen", "false"),
)
self._safe_execute(
"INSERT OR IGNORE INTO config (key, value) VALUES (?, ?)",
("changelog_seen_version", "0.0.0"),
)
if current_version < 33:
self._safe_execute("""
CREATE TABLE IF NOT EXISTS debug_logs (
id INTEGER PRIMARY KEY AUTOINCREMENT,
timestamp REAL,
level TEXT,
module TEXT,
message TEXT,
is_anomaly INTEGER DEFAULT 0,
anomaly_type TEXT,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP
)
""")
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_debug_logs_timestamp ON debug_logs(timestamp)",
)
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_debug_logs_level ON debug_logs(level)",
)
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_debug_logs_anomaly ON debug_logs(is_anomaly)",
)
if current_version < 34:
# Add updated_at to crawl_tasks
self._safe_execute(
"ALTER TABLE crawl_tasks ADD COLUMN updated_at DATETIME DEFAULT CURRENT_TIMESTAMP",
)
if current_version < 35:
# Add lxmf_address and lxst_address to contacts
self._safe_execute(
"ALTER TABLE contacts ADD COLUMN lxmf_address TEXT DEFAULT NULL",
)
self._safe_execute(
"ALTER TABLE contacts ADD COLUMN lxst_address TEXT DEFAULT NULL",
)
if current_version < 36:
self._safe_execute("""
CREATE TABLE IF NOT EXISTS lxmf_folders (
id INTEGER PRIMARY KEY AUTOINCREMENT,
name TEXT UNIQUE,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP
)
""")
self._safe_execute("""
CREATE TABLE IF NOT EXISTS lxmf_conversation_folders (
id INTEGER PRIMARY KEY AUTOINCREMENT,
peer_hash TEXT UNIQUE,
folder_id INTEGER,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP,
FOREIGN KEY (folder_id) REFERENCES lxmf_folders(id) ON DELETE CASCADE
)
""")
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_lxmf_conversation_folders_peer_hash ON lxmf_conversation_folders(peer_hash)",
)
self._safe_execute(
"CREATE INDEX IF NOT EXISTS idx_lxmf_conversation_folders_folder_id ON lxmf_conversation_folders(folder_id)",
)
if current_version < 37:
# Add is_telemetry_trusted to contacts
self._safe_execute(
"ALTER TABLE contacts ADD COLUMN is_telemetry_trusted INTEGER DEFAULT 0",
)
# Ensure telemetry_enabled exists in config and is false by default
self._safe_execute(
"INSERT OR IGNORE INTO config (key, value) VALUES (?, ?)",
("telemetry_enabled", "false"),
)
# Update version in config
self._safe_execute(
"""
INSERT INTO config (key, value, created_at, updated_at)
VALUES (?, ?, CURRENT_TIMESTAMP, CURRENT_TIMESTAMP)

View File

@@ -65,3 +65,42 @@ class TelemetryDAO:
"DELETE FROM lxmf_telemetry WHERE destination_hash = ?",
(destination_hash,),
)
def is_tracking(self, destination_hash):
row = self.provider.fetchone(
"SELECT is_tracking FROM telemetry_tracking WHERE destination_hash = ?",
(destination_hash,),
)
return bool(row["is_tracking"]) if row else False
def toggle_tracking(self, destination_hash, is_tracking=None):
if is_tracking is None:
is_tracking = not self.is_tracking(destination_hash)
now = datetime.now(UTC).isoformat()
self.provider.execute(
"""
INSERT INTO telemetry_tracking (destination_hash, is_tracking, updated_at)
VALUES (?, ?, ?)
ON CONFLICT(destination_hash) DO UPDATE SET
is_tracking = EXCLUDED.is_tracking,
updated_at = EXCLUDED.updated_at
""",
(destination_hash, int(is_tracking), now),
)
return is_tracking
def get_tracked_peers(self):
return self.provider.fetchall(
"SELECT * FROM telemetry_tracking WHERE is_tracking = 1",
)
def update_last_request_at(self, destination_hash, timestamp=None):
if timestamp is None:
import time
timestamp = time.time()
self.provider.execute(
"UPDATE telemetry_tracking SET last_request_at = ? WHERE destination_hash = ?",
(timestamp, destination_hash),
)
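# Hedged usage sketch, assuming `dao` is a TelemetryDAO bound to an open
# provider and "a1b2c3d4" stands in for a real destination hash. Note the
# upsert above relies on SQLite's ON CONFLICT ... EXCLUDED syntax (SQLite
# 3.24+), where EXCLUDED refers to the row that failed to insert:
#
# enabled = dao.toggle_tracking("a1b2c3d4")   # flips state, returns new value
# if dao.is_tracking("a1b2c3d4"):
#     for peer in dao.get_tracked_peers():
#         dao.update_last_request_at(peer["destination_hash"])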

View File

@@ -57,3 +57,65 @@ class TelephoneDAO:
def clear_call_history(self):
self.provider.execute("DELETE FROM call_history")
def add_call_recording(
self,
remote_identity_hash,
remote_identity_name,
filename_rx,
filename_tx,
duration_seconds,
timestamp,
):
from datetime import UTC, datetime
now = datetime.now(UTC)
self.provider.execute(
"""
INSERT INTO call_recordings (
remote_identity_hash,
remote_identity_name,
filename_rx,
filename_tx,
duration_seconds,
timestamp,
created_at
) VALUES (?, ?, ?, ?, ?, ?, ?)
""",
(
remote_identity_hash,
remote_identity_name,
filename_rx,
filename_tx,
duration_seconds,
timestamp,
now,
),
)
def get_call_recordings(self, search=None, limit=10, offset=0):
if search:
return self.provider.fetchall(
"""
SELECT * FROM call_recordings
WHERE remote_identity_name LIKE ? OR remote_identity_hash LIKE ?
ORDER BY timestamp DESC LIMIT ? OFFSET ?
""",
(f"%{search}%", f"%{search}%", limit, offset),
)
return self.provider.fetchall(
"SELECT * FROM call_recordings ORDER BY timestamp DESC LIMIT ? OFFSET ?",
(limit, offset),
)
def get_call_recording(self, recording_id):
return self.provider.fetchone(
"SELECT * FROM call_recordings WHERE id = ?",
(recording_id,),
)
def delete_call_recording(self, recording_id):
self.provider.execute(
"DELETE FROM call_recordings WHERE id = ?",
(recording_id,),
)
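# Hedged usage sketch, assuming `dao` is a TelephoneDAO instance and the
# search term is illustrative: recordings come back newest-first, so paging
# with limit/offset walks backwards through call history.
#
# page = dao.get_call_recordings(search="alice", limit=10, offset=0)
# for rec in page:
#     print(rec["remote_identity_name"], rec["duration_seconds"])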

View File

@@ -75,3 +75,9 @@ class VoicemailDAO:
"SELECT COUNT(*) as count FROM voicemails WHERE is_read = 0",
)
return row["count"] if row else 0
def get_latest_voicemail_id(self):
row = self.provider.fetchone(
"SELECT id FROM voicemails ORDER BY timestamp DESC LIMIT 1",
)
return row["id"] if row else None

View File

@@ -0,0 +1,664 @@
import html
import io
import logging
import os
import re
import shutil
import threading
import zipfile
import requests
from meshchatx.src.backend.markdown_renderer import MarkdownRenderer
class DocsManager:
def __init__(self, config, public_dir, project_root=None, storage_dir=None):
self.config = config
self.public_dir = public_dir
self.project_root = project_root
self.storage_dir = storage_dir
# Determine docs directories
if self.storage_dir:
self.docs_base_dir = os.path.join(self.storage_dir, "reticulum-docs")
self.meshchatx_docs_dir = os.path.join(self.storage_dir, "meshchatx-docs")
else:
self.docs_base_dir = os.path.join(self.public_dir, "reticulum-docs")
self.meshchatx_docs_dir = os.path.join(self.public_dir, "meshchatx-docs")
# The actual docs are served from this directory
# We will use a 'current' subdirectory for the active version
self.docs_dir = os.path.join(self.docs_base_dir, "current")
self.versions_dir = os.path.join(self.docs_base_dir, "versions")
self.download_status = "idle"
self.download_progress = 0
self.last_error = None
# Ensure docs directories exist
try:
for d in [
self.docs_base_dir,
self.versions_dir,
self.docs_dir,
self.meshchatx_docs_dir,
]:
if not os.path.exists(d):
os.makedirs(d)
# If 'current' doesn't exist but we have versions, pick the latest one
if not os.path.exists(self.docs_dir) or not os.listdir(self.docs_dir):
self._update_current_link()
except OSError as e:
logging.exception(f"Failed to create documentation directories: {e}")
self.last_error = str(e)
# Initial population of MeshChatX docs
if os.path.exists(self.meshchatx_docs_dir) and os.access(
self.meshchatx_docs_dir,
os.W_OK,
):
self.populate_meshchatx_docs()
def _update_current_link(self, version=None):
"""Updates the 'current' directory to point to the specified version or the latest one."""
if not os.path.exists(self.versions_dir):
return
versions = self.get_available_versions()
if not versions:
return
target_version = version
if not target_version:
# Pick latest version (alphabetically)
target_version = versions[-1]
version_path = os.path.join(self.versions_dir, target_version)
if not os.path.exists(version_path):
return
# meshchat.py serves the static docs route from self.docs_dir, so 'current'
# must always resolve to the active version. Symlinks can fail or be
# restricted on some platforms, so we clear 'current' and repopulate it:
# a symlink where supported, with a full directory copy as fallback.
if os.path.exists(self.docs_dir):
if os.path.islink(self.docs_dir):
os.unlink(self.docs_dir)
else:
shutil.rmtree(self.docs_dir)
try:
# Try symlink first as it's efficient
# We use a relative path for the symlink target to make the storage directory portable
# version_path is relative to CWD, so we need it relative to the parent of self.docs_dir
rel_target = os.path.relpath(version_path, os.path.dirname(self.docs_dir))
os.symlink(rel_target, self.docs_dir)
except (OSError, AttributeError):
# Fallback to copy
shutil.copytree(version_path, self.docs_dir)
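# Worked example of the relative symlink target above (paths illustrative):
# with docs_base_dir "/storage/reticulum-docs", version_path
# "/storage/reticulum-docs/versions/git-1700000000" and docs_dir
# "/storage/reticulum-docs/current", os.path.relpath(version_path,
# os.path.dirname(docs_dir)) yields "versions/git-1700000000", so the link
# stays valid even if the whole storage directory is moved.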
def get_available_versions(self):
if not os.path.exists(self.versions_dir):
return []
versions = [
d
for d in os.listdir(self.versions_dir)
if os.path.isdir(os.path.join(self.versions_dir, d))
]
return sorted(versions)
def get_current_version(self):
if not os.path.exists(self.docs_dir):
return None
if os.path.islink(self.docs_dir):
return os.path.basename(os.readlink(self.docs_dir))
# If it's a copy, we might need a metadata file to know which version it is
version_file = os.path.join(self.docs_dir, ".version")
if os.path.exists(version_file):
try:
with open(version_file) as f:
return f.read().strip()
except OSError:
pass
return "unknown"
def switch_version(self, version):
if version in self.get_available_versions():
self._update_current_link(version)
return True
return False
def delete_version(self, version):
"""Deletes a specific version of documentation."""
if version not in self.get_available_versions():
return False
version_path = os.path.join(self.versions_dir, version)
if not os.path.exists(version_path):
return False
try:
# If the deleted version is the current one, unlink 'current' first
current_version = self.get_current_version()
if current_version == version:
if os.path.exists(self.docs_dir):
if os.path.islink(self.docs_dir):
os.unlink(self.docs_dir)
else:
shutil.rmtree(self.docs_dir)
shutil.rmtree(version_path)
# If we just deleted the current version, try to pick another one as current
if current_version == version:
self._update_current_link()
return True
except Exception as e:
logging.exception(f"Failed to delete docs version {version}: {e}")
return False
def clear_reticulum_docs(self):
"""Clears all Reticulum documentation and versions."""
try:
if os.path.exists(self.docs_base_dir):
# Keep the base dir itself and delete only its contents, so the
# directory layout stays in place for re-creation below.
for item in os.listdir(self.docs_base_dir):
item_path = os.path.join(self.docs_base_dir, item)
if os.path.islink(item_path):
os.unlink(item_path)
elif os.path.isdir(item_path):
shutil.rmtree(item_path)
else:
os.remove(item_path)
# Re-create required subdirectories
for d in [self.versions_dir, self.docs_dir]:
if not os.path.exists(d):
os.makedirs(d)
self.config.docs_downloaded.set(False)
return True
except Exception as e:
logging.exception(f"Failed to clear Reticulum docs: {e}")
return False
def populate_meshchatx_docs(self):
"""Populates meshchatx-docs from the project's docs folder."""
# Try to find docs folder in several places
search_paths = []
if self.project_root:
search_paths.append(os.path.join(self.project_root, "docs"))
# Also try in the public directory
search_paths.append(os.path.join(self.public_dir, "meshchatx-docs"))
# Also try relative to this file
# This file is in meshchatx/src/backend/docs_manager.py
# Project root is 3 levels up
this_dir = os.path.dirname(os.path.abspath(__file__))
search_paths.append(
os.path.abspath(os.path.join(this_dir, "..", "..", "..", "docs")),
)
src_docs = None
for path in search_paths:
if os.path.exists(path) and os.path.isdir(path):
src_docs = path
break
if not src_docs:
logging.warning("MeshChatX docs source directory not found.")
return
try:
for file in os.listdir(src_docs):
if file.endswith(".md") or file.endswith(".txt"):
src_path = os.path.join(src_docs, file)
dest_path = os.path.join(self.meshchatx_docs_dir, file)
# Only copy if source and destination are different
if os.path.abspath(src_path) != os.path.abspath(
dest_path,
) and os.access(self.meshchatx_docs_dir, os.W_OK):
shutil.copy2(src_path, dest_path)
# Also pre-render to HTML for easy sharing/viewing
try:
with open(src_path, encoding="utf-8") as f:
content = f.read()
html_content = MarkdownRenderer.render(content)
# Basic HTML wrapper for standalone viewing
full_html = f"""<!DOCTYPE html>
<html class="dark">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1">
<title>{file}</title>
<script src="../assets/js/tailwindcss/tailwind-v3.4.3-forms-v0.5.7.js"></script>
<style>
body {{ background-color: #111827; color: #f3f4f6; }}
</style>
</head>
<body class="p-4 md:p-8 max-w-4xl mx-auto">
<div class="max-w-none break-words">
{html_content}
</div>
</body>
</html>"""
html_file = os.path.splitext(file)[0] + ".html"
with open(
os.path.join(self.meshchatx_docs_dir, html_file),
"w",
encoding="utf-8",
) as f:
f.write(full_html)
except Exception as e:
logging.exception(f"Failed to render {file} to HTML: {e}")
except Exception as e:
logging.exception(f"Failed to populate MeshChatX docs: {e}")
def get_status(self):
return {
"status": self.download_status,
"progress": self.download_progress,
"last_error": self.last_error,
"has_docs": self.has_docs(),
"has_meshchatx_docs": self.has_meshchatx_docs(),
"versions": self.get_available_versions(),
"current_version": self.get_current_version(),
}
def has_meshchatx_docs(self):
return (
any(
f.endswith((".md", ".txt")) for f in os.listdir(self.meshchatx_docs_dir)
)
if os.path.exists(self.meshchatx_docs_dir)
else False
)
def get_meshchatx_docs_list(self):
docs = []
if not os.path.exists(self.meshchatx_docs_dir):
return docs
docs.extend(
{
"name": file,
"path": file,
"type": "markdown" if file.endswith(".md") else "text",
}
for file in os.listdir(self.meshchatx_docs_dir)
if file.endswith((".md", ".txt"))
)
return sorted(docs, key=lambda x: x["name"])
def get_doc_content(self, path):
full_path = os.path.join(self.meshchatx_docs_dir, path)
if not os.path.exists(full_path):
return None
with open(full_path, encoding="utf-8", errors="ignore") as f:
content = f.read()
if path.endswith(".md"):
return {
"content": content,
"html": MarkdownRenderer.render(content),
"type": "markdown",
}
return {
"content": content,
"html": f"<pre class='whitespace-pre-wrap font-mono'>{html.escape(content)}</pre>",
"type": "text",
}
def export_docs(self):
"""Creates a zip of all docs and returns the bytes."""
buffer = io.BytesIO()
with zipfile.ZipFile(buffer, "w", zipfile.ZIP_DEFLATED) as zip_file:
# Add reticulum docs
for root, _, files in os.walk(self.docs_dir):
for file in files:
file_path = os.path.join(root, file)
rel_path = os.path.join(
"reticulum-docs",
os.path.relpath(file_path, self.docs_dir),
)
zip_file.write(file_path, rel_path)
# Add meshchatx docs
for root, _, files in os.walk(self.meshchatx_docs_dir):
for file in files:
file_path = os.path.join(root, file)
rel_path = os.path.join(
"meshchatx-docs",
os.path.relpath(file_path, self.meshchatx_docs_dir),
)
zip_file.write(file_path, rel_path)
buffer.seek(0)
return buffer.getvalue()
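# Hedged usage sketch, assuming `docs` is a DocsManager and the output path
# is illustrative:
#
# with open("/tmp/meshchat-docs-export.zip", "wb") as out:
#     out.write(docs.export_docs())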
def search(self, query, lang="en"):
if not query:
return []
results = []
query = query.lower()
# 1. Search MeshChatX Docs first
if os.path.exists(self.meshchatx_docs_dir):
for file in os.listdir(self.meshchatx_docs_dir):
if file.endswith((".md", ".txt")):
file_path = os.path.join(self.meshchatx_docs_dir, file)
try:
with open(
file_path,
encoding="utf-8",
errors="ignore",
) as f:
content = f.read()
if query in content.lower():
# Simple snippet
idx = content.lower().find(query)
start = max(0, idx - 80)
end = min(len(content), idx + len(query) + 120)
snippet = content[start:end]
if start > 0:
snippet = "..." + snippet
if end < len(content):
snippet = snippet + "..."
results.append(
{
"title": file,
"path": f"/meshchatx-docs/{file}",
"snippet": snippet,
"source": "MeshChatX",
},
)
except Exception as e:
logging.exception(f"Error searching MeshChatX doc {file}: {e}")
# 2. Search Reticulum Docs
if self.has_docs():
# Known language suffixes in Reticulum docs
known_langs = ["de", "es", "jp", "nl", "pl", "pt-br", "tr", "uk", "zh-cn"]
# Determine files to search
target_files = []
try:
for root, _, files in os.walk(self.docs_dir):
for file in files:
if file.endswith(".html"):
# Basic filtering for language if possible
if lang != "en":
if f"_{lang}.html" in file:
target_files.append(os.path.join(root, file))
else:
# For English, keep files that DON'T have a language suffix.
# This filename-based check is heuristic.
has_lang_suffix = False
for lang_code in known_langs:
if f"_{lang_code}.html" in file:
has_lang_suffix = True
break
if not has_lang_suffix:
target_files.append(os.path.join(root, file))
# If we found nothing for a specific language, fall back to English ONLY
if not target_files and lang != "en":
for root, _, files in os.walk(self.docs_dir):
for file in files:
if file.endswith(".html"):
has_lang_suffix = False
for lang_code in known_langs:
if f"_{lang_code}.html" in file:
has_lang_suffix = True
break
if not has_lang_suffix:
target_files.append(os.path.join(root, file))
for file_path in target_files:
try:
with open(file_path, encoding="utf-8", errors="ignore") as f:
content = f.read()
# Very basic HTML tag removal for searching
text_content = re.sub(r"<[^>]+>", " ", content)
text_content = " ".join(text_content.split())
if query in text_content.lower():
# Find title
title_match = re.search(
r"<title>(.*?)</title>",
content,
re.IGNORECASE | re.DOTALL,
)
title = (
title_match.group(1).strip()
if title_match
else os.path.basename(file_path)
)
# Remove " — Reticulum Network Stack ..." suffix often found in Sphinx docs
title = re.sub(r"\s+[\u2014-].*$", "", title)
# Find snippet
idx = text_content.lower().find(query)
start = max(0, idx - 80)
end = min(len(text_content), idx + len(query) + 120)
snippet = text_content[start:end]
if start > 0:
snippet = "..." + snippet
if end < len(text_content):
snippet = snippet + "..."
rel_path = os.path.relpath(file_path, self.docs_dir)
results.append(
{
"title": title,
"path": f"/reticulum-docs/{rel_path}",
"snippet": snippet,
"source": "Reticulum",
},
)
if len(results) >= 25: # Limit results
break
except Exception as e:
logging.exception(f"Error searching file {file_path}: {e}")
except Exception as e:
logging.exception(f"Search failed: {e}")
return results
def has_docs(self):
# Check if index.html exists in the docs folder or if we have any versions
return (
os.path.exists(os.path.join(self.docs_dir, "index.html"))
or len(self.get_available_versions()) > 0
)
def update_docs(self, version="latest"):
if (
self.download_status == "downloading"
or self.download_status == "extracting"
):
return False
thread = threading.Thread(target=self._download_task, args=(version,))
thread.daemon = True
thread.start()
return True
def _download_task(self, version="latest"):
self.download_status = "downloading"
self.download_progress = 0
self.last_error = None
# Get URLs from config
urls_str = self.config.docs_download_urls.get()
urls = [u.strip() for u in urls_str.replace("\n", ",").split(",") if u.strip()]
if not urls:
urls = ["https://git.quad4.io/Reticulum/reticulum_website/archive/main.zip"]
last_exception = None
for url in urls:
try:
logging.info(f"Attempting to download docs from {url}")
zip_path = os.path.join(self.docs_base_dir, "website.zip")
# Download ZIP
response = requests.get(url, stream=True, timeout=60)
response.raise_for_status()
total_size = int(response.headers.get("content-length", 0))
downloaded_size = 0
with open(zip_path, "wb") as f:
for chunk in response.iter_content(chunk_size=8192):
if chunk:
f.write(chunk)
downloaded_size += len(chunk)
if total_size > 0:
self.download_progress = int(
(downloaded_size / total_size) * 90,
)
# Extract
self.download_status = "extracting"
# For automatic downloads from git, we'll use a timestamp as version if none provided
if version == "latest":
import time
version = f"git-{int(time.time())}"
self._extract_docs(zip_path, version)
# Cleanup
if os.path.exists(zip_path):
os.remove(zip_path)
self.config.docs_downloaded.set(True)
self.download_progress = 100
self.download_status = "completed"
# Switch to the new version
self.switch_version(version)
return # Success, exit task
except Exception as e:
logging.warning(f"Failed to download docs from {url}: {e}")
last_exception = e
if os.path.exists(os.path.join(self.docs_base_dir, "website.zip")):
os.remove(os.path.join(self.docs_base_dir, "website.zip"))
continue # Try next URL
# If we got here, all URLs failed
self.last_error = str(last_exception)
self.download_status = "error"
logging.error(f"All docs download sources failed. Last error: {last_exception}")
def upload_zip(self, zip_bytes, version):
self.download_status = "extracting"
self.download_progress = 0
self.last_error = None
try:
zip_path = os.path.join(self.docs_base_dir, "uploaded.zip")
with open(zip_path, "wb") as f:
f.write(zip_bytes)
self._extract_docs(zip_path, version)
if os.path.exists(zip_path):
os.remove(zip_path)
self.download_status = "completed"
self.download_progress = 100
self.switch_version(version)
return True
except Exception as e:
self.last_error = str(e)
self.download_status = "error"
logging.exception(f"Failed to upload docs: {e}")
return False
def _extract_docs(self, zip_path, version):
# Target dir for this version
version_dir = os.path.join(self.versions_dir, version)
if os.path.exists(version_dir):
shutil.rmtree(version_dir)
os.makedirs(version_dir)
# Temp dir for extraction
temp_extract = os.path.join(self.docs_base_dir, "temp_extract")
if os.path.exists(temp_extract):
shutil.rmtree(temp_extract)
with zipfile.ZipFile(zip_path, "r") as zip_ref:
# Gitea/GitHub zips have a root folder
namelist = zip_ref.namelist()
if not namelist:
raise Exception("Zip file is empty")
root_folder = namelist[0].split("/")[0]
# Check if it's the reticulum_website repo (has docs/ folder)
docs_prefix = f"{root_folder}/docs/"
has_docs_subfolder = any(m.startswith(docs_prefix) for m in namelist)
if has_docs_subfolder:
members_to_extract = [m for m in namelist if m.startswith(docs_prefix)]
for member in members_to_extract:
zip_ref.extract(member, temp_extract)
src_path = os.path.join(temp_extract, root_folder, "docs")
# Move files from extracted docs to version_dir
for item in os.listdir(src_path):
s = os.path.join(src_path, item)
d = os.path.join(version_dir, item)
if os.path.isdir(s):
shutil.copytree(s, d)
else:
shutil.copy2(s, d)
else:
# Just extract everything directly to version_dir, but remove root folder if exists
zip_ref.extractall(temp_extract)
src_path = os.path.join(temp_extract, root_folder)
if os.path.exists(src_path) and os.path.isdir(src_path):
for item in os.listdir(src_path):
s = os.path.join(src_path, item)
d = os.path.join(version_dir, item)
if os.path.isdir(s):
shutil.copytree(s, d)
else:
shutil.copy2(s, d)
else:
# Fallback if no root folder
for item in os.listdir(temp_extract):
s = os.path.join(temp_extract, item)
d = os.path.join(version_dir, item)
if os.path.isdir(s):
shutil.copytree(s, d)
else:
shutil.copy2(s, d)
# Create a metadata file with the version name
with open(os.path.join(version_dir, ".version"), "w") as f:
f.write(version)
# Cleanup temp
if os.path.exists(temp_extract):
shutil.rmtree(temp_extract)
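# Hedged note: Gitea/GitHub archive zips nest all content under one root
# folder (e.g. "reticulum_website/docs/index.html"), which is why the code
# above derives root_folder from namelist()[0] before relocating files into
# version_dir.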

View File

@@ -1,10 +1,10 @@
import base64
import os
import LXMF
import RNS
from .database import Database
from .meshchat_utils import create_lxmf_router
class ForwardingManager:
@@ -34,7 +34,7 @@ class ForwardingManager:
)
os.makedirs(router_storage_path, exist_ok=True)
router = create_lxmf_router(
identity=alias_identity,
storagepath=router_storage_path,
)
@@ -79,7 +79,7 @@ class ForwardingManager:
)
os.makedirs(router_storage_path, exist_ok=True)
router = create_lxmf_router(
identity=alias_identity,
storagepath=router_storage_path,
)

View File

@@ -0,0 +1,540 @@
import asyncio
import os
import threading
import RNS
from meshchatx.src.backend.announce_handler import AnnounceHandler
from meshchatx.src.backend.announce_manager import AnnounceManager
from meshchatx.src.backend.archiver_manager import ArchiverManager
from meshchatx.src.backend.auto_propagation_manager import AutoPropagationManager
from meshchatx.src.backend.bot_handler import BotHandler
from meshchatx.src.backend.community_interfaces import CommunityInterfacesManager
from meshchatx.src.backend.config_manager import ConfigManager
from meshchatx.src.backend.database import Database
from meshchatx.src.backend.docs_manager import DocsManager
from meshchatx.src.backend.forwarding_manager import ForwardingManager
from meshchatx.src.backend.integrity_manager import IntegrityManager
from meshchatx.src.backend.map_manager import MapManager
from meshchatx.src.backend.meshchat_utils import create_lxmf_router
from meshchatx.src.backend.message_handler import MessageHandler
from meshchatx.src.backend.nomadnet_utils import NomadNetworkManager
from meshchatx.src.backend.ringtone_manager import RingtoneManager
from meshchatx.src.backend.rncp_handler import RNCPHandler
from meshchatx.src.backend.rnpath_handler import RNPathHandler
from meshchatx.src.backend.rnpath_trace_handler import RNPathTraceHandler
from meshchatx.src.backend.rnprobe_handler import RNProbeHandler
from meshchatx.src.backend.rnstatus_handler import RNStatusHandler
from meshchatx.src.backend.telephone_manager import TelephoneManager
from meshchatx.src.backend.translator_handler import TranslatorHandler
from meshchatx.src.backend.voicemail_manager import VoicemailManager
class IdentityContext:
def __init__(self, identity: RNS.Identity, app):
self.identity = identity
self.app = app
self.identity_hash = identity.hash.hex()
# Storage paths
self.storage_path = os.path.join(
app.storage_dir,
"identities",
self.identity_hash,
)
os.makedirs(self.storage_path, exist_ok=True)
self.database_path = os.path.join(self.storage_path, "database.db")
self.lxmf_router_path = os.path.join(self.storage_path, "lxmf_router")
# Identity backup
identity_backup_file = os.path.join(self.storage_path, "identity")
if not os.path.exists(identity_backup_file):
with open(identity_backup_file, "wb") as f:
f.write(identity.get_private_key())
# Session ID for this specific context instance
if not hasattr(app, "_identity_session_id_counter"):
app._identity_session_id_counter = 0
app._identity_session_id_counter += 1
self.session_id = app._identity_session_id_counter
# Initialized state
self.database = None
self.config = None
self.message_handler = None
self.announce_manager = None
self.archiver_manager = None
self.map_manager = None
self.docs_manager = None
self.nomadnet_manager = None
self.message_router = None
self.telephone_manager = None
self.voicemail_manager = None
self.ringtone_manager = None
self.auto_propagation_manager = None
self.rncp_handler = None
self.rnstatus_handler = None
self.rnpath_handler = None
self.rnpath_trace_handler = None
self.rnprobe_handler = None
self.translator_handler = None
self.bot_handler = None
self.forwarding_manager = None
self.community_interfaces_manager = None
self.local_lxmf_destination = None
self.announce_handlers = []
self.integrity_manager = IntegrityManager(
self.storage_path,
self.database_path,
self.identity_hash,
)
self.running = False
def setup(self):
print(f"Setting up Identity Context for {self.identity_hash}...")
# 0. Clear any previous integrity issues on the app
self.app.integrity_issues = []
# 1. Cleanup RNS state for this identity if any lingers
self.app.cleanup_rns_state_for_identity(self.identity.hash)
# 2. Initialize Database
if getattr(self.app, "emergency", False):
print("EMERGENCY MODE ENABLED: Using in-memory database.")
self.database = Database(":memory:")
else:
self.database = Database(self.database_path)
# Check Integrity (skip in emergency mode)
if not getattr(self.app, "emergency", False):
is_ok, issues = self.integrity_manager.check_integrity()
if not is_ok:
print(
f"INTEGRITY WARNING for {self.identity_hash}: {', '.join(issues)}",
)
if not hasattr(self.app, "integrity_issues"):
self.app.integrity_issues = []
self.app.integrity_issues.extend(issues)
try:
self.database.initialize()
if not getattr(self.app, "emergency", False):
self.database.migrate_from_legacy(
self.app.reticulum_config_dir,
self.identity_hash,
)
self.database._tune_sqlite_pragmas()
except Exception as exc:
if not self.app.auto_recover and not getattr(self.app, "emergency", False):
raise
print(
f"Database initialization failed for {self.identity_hash}, attempting recovery: {exc}",
)
if not getattr(self.app, "emergency", False):
self.app._run_startup_auto_recovery()
self.database.initialize()
self.database._tune_sqlite_pragmas()
# 3. Initialize Config and Managers
self.config = ConfigManager(self.database)
# Apply overrides from CLI/ENV if provided
if (
hasattr(self.app, "gitea_base_url_override")
and self.app.gitea_base_url_override
):
self.config.gitea_base_url.set(self.app.gitea_base_url_override)
if (
hasattr(self.app, "docs_download_urls_override")
and self.app.docs_download_urls_override
):
self.config.docs_download_urls.set(self.app.docs_download_urls_override)
self.message_handler = MessageHandler(self.database)
self.announce_manager = AnnounceManager(self.database)
self.archiver_manager = ArchiverManager(self.database)
self.map_manager = MapManager(self.config, self.app.storage_dir)
self.docs_manager = DocsManager(
self.config,
self.app.get_public_path(),
project_root=os.path.dirname(
os.path.dirname(
os.path.dirname(os.path.dirname(os.path.abspath(__file__))),
),
),
storage_dir=self.storage_path,
)
self.nomadnet_manager = NomadNetworkManager(
self.config,
self.archiver_manager,
self.database,
)
# Vacuum and mark stuck messages
self.database.provider.vacuum()
self.database.messages.mark_stuck_messages_as_failed()
# 4. Initialize LXMF Router
propagation_stamp_cost = self.config.lxmf_propagation_node_stamp_cost.get()
self.message_router = create_lxmf_router(
identity=self.identity,
storagepath=self.lxmf_router_path,
propagation_cost=propagation_stamp_cost,
)
self.message_router.PROCESSING_INTERVAL = 1
self.message_router.delivery_per_transfer_limit = (
self.config.lxmf_delivery_transfer_limit_in_bytes.get() / 1000
)
# Register LXMF delivery identity
inbound_stamp_cost = self.config.lxmf_inbound_stamp_cost.get()
self.local_lxmf_destination = self.message_router.register_delivery_identity(
identity=self.identity,
display_name=self.config.display_name.get(),
stamp_cost=inbound_stamp_cost,
)
# Forwarding Manager
self.forwarding_manager = ForwardingManager(
self.database,
self.lxmf_router_path,
lambda msg: self.app.on_lxmf_delivery(msg, context=self),
config=self.config,
)
self.forwarding_manager.load_aliases()
# Register delivery callback
self.message_router.register_delivery_callback(
lambda msg: self.app.on_lxmf_delivery(msg, context=self),
)
# Restore preferred propagation node on startup
try:
preferred_node = (
self.config.lxmf_preferred_propagation_node_destination_hash.get()
)
if preferred_node:
self.app.set_active_propagation_node(preferred_node, context=self)
except Exception:
pass
# 5. Initialize Handlers and Managers
self.rncp_handler = RNCPHandler(
reticulum_instance=getattr(self.app, "reticulum", None),
identity=self.identity,
storage_dir=self.app.storage_dir,
)
self.rnstatus_handler = RNStatusHandler(
reticulum_instance=getattr(self.app, "reticulum", None),
)
self.rnpath_handler = RNPathHandler(
reticulum_instance=getattr(self.app, "reticulum", None),
)
self.rnpath_trace_handler = RNPathTraceHandler(
reticulum_instance=getattr(self.app, "reticulum", None),
identity=self.identity,
)
self.rnprobe_handler = RNProbeHandler(
reticulum_instance=getattr(self.app, "reticulum", None),
identity=self.identity,
)
libretranslate_url = self.config.libretranslate_url.get()
translator_enabled = self.config.translator_enabled.get()
self.translator_handler = TranslatorHandler(
libretranslate_url=libretranslate_url,
enabled=translator_enabled,
)
self.bot_handler = BotHandler(
identity_path=self.storage_path,
config_manager=self.config,
)
try:
self.bot_handler.restore_enabled_bots()
except Exception as exc:
print(f"Failed to restore bots: {exc}")
# Initialize managers
self.telephone_manager = TelephoneManager(
self.identity,
config_manager=self.config,
storage_dir=self.storage_path,
db=self.database,
)
self.telephone_manager.get_name_for_identity_hash = (
self.app.get_name_for_identity_hash
)
self.telephone_manager.on_initiation_status_callback = (
lambda status, target: self.app.on_telephone_initiation_status(
status,
target,
context=self,
)
)
self.telephone_manager.register_ringing_callback(
lambda call: self.app.on_incoming_telephone_call(call, context=self),
)
self.telephone_manager.register_established_callback(
lambda call: self.app.on_telephone_call_established(call, context=self),
)
self.telephone_manager.register_ended_callback(
lambda call: self.app.on_telephone_call_ended(call, context=self),
)
# Only initialize telephone hardware/profile if not in emergency mode
if not getattr(self.app, "emergency", False):
self.telephone_manager.init_telephone()
self.voicemail_manager = VoicemailManager(
db=self.database,
config=self.config,
telephone_manager=self.telephone_manager,
storage_dir=self.storage_path,
)
self.voicemail_manager.get_name_for_identity_hash = (
self.app.get_name_for_identity_hash
)
self.voicemail_manager.on_new_voicemail_callback = (
lambda vm: self.app.on_new_voicemail_received(vm, context=self)
)
self.ringtone_manager = RingtoneManager(
config=self.config,
storage_dir=self.storage_path,
)
self.community_interfaces_manager = CommunityInterfacesManager()
self.auto_propagation_manager = AutoPropagationManager(
app=self.app,
context=self,
)
# 6. Register Announce Handlers
self.register_announce_handlers()
# 7. Start background threads
self.running = True
self.start_background_threads()
# 8. Handle initial documentation download
if (
not getattr(self.app, "emergency", False)
and not self.config.initial_docs_download_attempted.get()
):
if not self.docs_manager.has_docs():
print(
f"Triggering initial documentation download for {self.identity_hash}...",
)
self.docs_manager.update_docs()
self.config.initial_docs_download_attempted.set(True)
# Baseline integrity manifest after successful setup
if not getattr(self.app, "emergency", False):
self.integrity_manager.save_manifest()
print(f"Identity Context for {self.identity_hash} is now running.")
def start_background_threads(self):
# start background thread for auto announce loop
thread = threading.Thread(
target=asyncio.run,
args=(self.app.announce_loop(self.session_id, context=self),),
)
thread.daemon = True
thread.start()
# start background thread for auto syncing propagation nodes
thread = threading.Thread(
target=asyncio.run,
args=(
self.app.announce_sync_propagation_nodes(self.session_id, context=self),
),
)
thread.daemon = True
thread.start()
# start background thread for crawler loop
thread = threading.Thread(
target=asyncio.run,
args=(self.app.crawler_loop(self.session_id, context=self),),
)
thread.daemon = True
thread.start()
# start background thread for auto backup loop
thread = threading.Thread(
target=asyncio.run,
args=(self.app.auto_backup_loop(self.session_id, context=self),),
)
thread.daemon = True
thread.start()
# start background thread for telemetry tracking loop
thread = threading.Thread(
target=asyncio.run,
args=(self.app.telemetry_tracking_loop(self.session_id, context=self),),
)
thread.daemon = True
thread.start()
# start background thread for auto propagation node selection
thread = threading.Thread(
target=asyncio.run,
args=(self.auto_propagation_manager._run(),),
)
thread.daemon = True
thread.start()
def register_announce_handlers(self):
handlers = [
AnnounceHandler(
"lxst.telephony",
lambda aspect, dh, ai, ad, aph: self.app.on_telephone_announce_received(
aspect,
dh,
ai,
ad,
aph,
context=self,
),
),
AnnounceHandler(
"lxmf.delivery",
lambda aspect, dh, ai, ad, aph: self.app.on_lxmf_announce_received(
aspect,
dh,
ai,
ad,
aph,
context=self,
),
),
AnnounceHandler(
"lxmf.propagation",
lambda aspect,
dh,
ai,
ad,
aph: self.app.on_lxmf_propagation_announce_received(
aspect,
dh,
ai,
ad,
aph,
context=self,
),
),
AnnounceHandler(
"nomadnetwork.node",
lambda aspect,
dh,
ai,
ad,
aph: self.app.on_nomadnet_node_announce_received(
aspect,
dh,
ai,
ad,
aph,
context=self,
),
),
]
for handler in handlers:
RNS.Transport.register_announce_handler(handler)
self.announce_handlers.append(handler)
def teardown(self):
print(f"Tearing down Identity Context for {self.identity_hash}...")
self.running = False
if self.auto_propagation_manager:
self.auto_propagation_manager.stop()
# 1. Deregister announce handlers
for handler in self.announce_handlers:
try:
RNS.Transport.deregister_announce_handler(handler)
except Exception:
pass
self.announce_handlers = []
# 2. Cleanup RNS destinations and links
try:
if self.message_router:
if hasattr(self.message_router, "delivery_destinations"):
for dest_hash in list(
self.message_router.delivery_destinations.keys(),
):
dest = self.message_router.delivery_destinations[dest_hash]
RNS.Transport.deregister_destination(dest)
if (
hasattr(self.message_router, "propagation_destination")
and self.message_router.propagation_destination
):
RNS.Transport.deregister_destination(
self.message_router.propagation_destination,
)
if self.telephone_manager and self.telephone_manager.telephone:
if (
hasattr(self.telephone_manager.telephone, "destination")
and self.telephone_manager.telephone.destination
):
RNS.Transport.deregister_destination(
self.telephone_manager.telephone.destination,
)
self.app.cleanup_rns_state_for_identity(self.identity.hash)
except Exception as e:
print(f"Error during RNS cleanup for {self.identity_hash}: {e}")
# 3. Stop LXMF Router jobs
if self.message_router:
try:
self.message_router.jobs = lambda: None
if hasattr(self.message_router, "exit_handler"):
self.message_router.exit_handler()
# Give LXMF/RNS a moment to finish any final disk writes
import time
time.sleep(1.0)
except Exception as e:
print(
f"Error while tearing down LXMRouter for {self.identity_hash}: {e}",
)
# 4. Stop telephone and voicemail
if self.telephone_manager:
try:
self.telephone_manager.teardown()
except Exception as e:
print(
f"Error while tearing down telephone for {self.identity_hash}: {e}",
)
if self.bot_handler:
try:
self.bot_handler.stop_all()
except Exception as e:
print(f"Error while stopping bots for {self.identity_hash}: {e}")
if self.database:
try:
# 1. Checkpoint WAL and close database cleanly to ensure file is stable for hashing
self.database._checkpoint_and_close()
except Exception as e:
print(
f"Error closing database during teardown for {self.identity_hash}: {e}",
)
# 2. Save integrity manifest AFTER closing to capture final stable state
self.integrity_manager.save_manifest()
print(f"Identity Context for {self.identity_hash} torn down.")

View File

@@ -0,0 +1,228 @@
import base64
import json
import os
import shutil
import RNS
from meshchatx.src.backend.database.config import ConfigDAO
from meshchatx.src.backend.database.provider import DatabaseProvider
from meshchatx.src.backend.database.schema import DatabaseSchema
class IdentityManager:
def __init__(self, storage_dir: str, identity_file_path: str | None = None):
self.storage_dir = storage_dir
self.identity_file_path = identity_file_path
def get_identity_bytes(self, identity: RNS.Identity) -> bytes:
return identity.get_private_key()
def backup_identity(self, identity: RNS.Identity) -> dict:
identity_bytes = self.get_identity_bytes(identity)
target_path = self.identity_file_path or os.path.join(
self.storage_dir,
"identity",
)
os.makedirs(os.path.dirname(target_path), exist_ok=True)
with open(target_path, "wb") as f:
f.write(identity_bytes)
return {
"path": target_path,
"size": os.path.getsize(target_path),
}
def backup_identity_base32(self, identity: RNS.Identity) -> str:
return base64.b32encode(self.get_identity_bytes(identity)).decode("utf-8")
def list_identities(self, current_identity_hash: str | None = None):
identities = []
identities_base_dir = os.path.join(self.storage_dir, "identities")
if not os.path.exists(identities_base_dir):
return identities
for identity_hash in os.listdir(identities_base_dir):
identity_path = os.path.join(identities_base_dir, identity_hash)
if not os.path.isdir(identity_path):
continue
metadata_path = os.path.join(identity_path, "metadata.json")
metadata = None
if os.path.exists(metadata_path):
try:
with open(metadata_path) as f:
metadata = json.load(f)
except Exception:
pass
if metadata:
identities.append(
{
"hash": identity_hash,
"display_name": metadata.get("display_name", "Anonymous Peer"),
"icon_name": metadata.get("icon_name"),
"icon_foreground_colour": metadata.get(
"icon_foreground_colour",
),
"icon_background_colour": metadata.get(
"icon_background_colour",
),
"lxmf_address": metadata.get("lxmf_address"),
"lxst_address": metadata.get("lxst_address"),
"is_current": (
current_identity_hash is not None
and identity_hash == current_identity_hash
),
},
)
continue
# Fallback to DB if metadata.json doesn't exist
db_path = os.path.join(identity_path, "database.db")
if not os.path.exists(db_path):
continue
display_name = "Anonymous Peer"
icon_name = None
icon_foreground_colour = None
icon_background_colour = None
lxmf_address = None
lxst_address = None
try:
temp_provider = DatabaseProvider(db_path)
temp_config_dao = ConfigDAO(temp_provider)
display_name = temp_config_dao.get("display_name", "Anonymous Peer")
icon_name = temp_config_dao.get("lxmf_user_icon_name")
icon_foreground_colour = temp_config_dao.get(
"lxmf_user_icon_foreground_colour",
)
icon_background_colour = temp_config_dao.get(
"lxmf_user_icon_background_colour",
)
lxmf_address = temp_config_dao.get("lxmf_address_hash")
lxst_address = temp_config_dao.get("lxst_address_hash")
temp_provider.close()
# Save metadata for next time
metadata = {
"display_name": display_name,
"icon_name": icon_name,
"icon_foreground_colour": icon_foreground_colour,
"icon_background_colour": icon_background_colour,
"lxmf_address": lxmf_address,
"lxst_address": lxst_address,
}
with open(metadata_path, "w") as f:
json.dump(metadata, f)
except Exception as e:
print(f"Error reading config for {identity_hash}: {e}")
identities.append(
{
"hash": identity_hash,
"display_name": display_name,
"icon_name": icon_name,
"icon_foreground_colour": icon_foreground_colour,
"icon_background_colour": icon_background_colour,
"lxmf_address": lxmf_address,
"lxst_address": lxst_address,
"is_current": (
current_identity_hash is not None
and identity_hash == current_identity_hash
),
},
)
return identities
def create_identity(self, display_name=None):
new_identity = RNS.Identity(create_keys=True)
return self._save_new_identity(new_identity, display_name or "Anonymous Peer")
def _save_new_identity(self, identity, display_name):
identity_hash = identity.hash.hex()
identity_dir = os.path.join(self.storage_dir, "identities", identity_hash)
os.makedirs(identity_dir, exist_ok=True)
identity_file = os.path.join(identity_dir, "identity")
with open(identity_file, "wb") as f:
f.write(identity.get_private_key())
db_path = os.path.join(identity_dir, "database.db")
new_provider = DatabaseProvider(db_path)
new_schema = DatabaseSchema(new_provider)
new_schema.initialize()
if display_name:
new_config_dao = ConfigDAO(new_provider)
new_config_dao.set("display_name", display_name)
new_provider.close()
# Save metadata
metadata = {
"display_name": display_name,
"icon_name": None,
"icon_foreground_colour": None,
"icon_background_colour": None,
}
metadata_path = os.path.join(identity_dir, "metadata.json")
with open(metadata_path, "w") as f:
json.dump(metadata, f)
return {
"hash": identity_hash,
"display_name": display_name,
}
def update_metadata_cache(self, identity_hash: str, metadata: dict):
identity_dir = os.path.join(self.storage_dir, "identities", identity_hash)
if not os.path.exists(identity_dir):
return
metadata_path = os.path.join(identity_dir, "metadata.json")
# Merge with existing metadata if it exists
existing_metadata = {}
if os.path.exists(metadata_path):
try:
with open(metadata_path) as f:
existing_metadata = json.load(f)
except Exception:
pass
existing_metadata.update(metadata)
with open(metadata_path, "w") as f:
json.dump(existing_metadata, f)
def delete_identity(self, identity_hash: str, current_identity_hash: str | None):
if current_identity_hash and identity_hash == current_identity_hash:
raise ValueError("Cannot delete the current active identity")
identity_dir = os.path.join(self.storage_dir, "identities", identity_hash)
if os.path.exists(identity_dir):
shutil.rmtree(identity_dir)
return True
return False
def restore_identity_from_bytes(self, identity_bytes: bytes) -> dict:
try:
# We use RNS.Identity.from_bytes to validate and get the hash
identity = RNS.Identity.from_bytes(identity_bytes)
if not identity:
raise ValueError("Could not load identity from bytes")
return self._save_new_identity(identity, "Restored Identity")
except Exception as exc:
msg = f"Failed to restore identity: {exc}"
raise ValueError(msg) from exc
def restore_identity_from_base32(self, base32_value: str) -> dict:
try:
identity_bytes = base64.b32decode(base32_value, casefold=True)
return self.restore_identity_from_bytes(identity_bytes)
except Exception as exc:
msg = f"Invalid base32 identity: {exc}"
raise ValueError(msg) from exc
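A minimal usage sketch of the restore-and-list flow above, assuming `manager` is an instance of the identity manager class these methods belong to; the base32 value and hashes are illustrative, not real key material:

restored = manager.restore_identity_from_base32("GEZDGNBVGY3TQOJQ")  # hypothetical key material
print(restored["hash"], restored["display_name"])  # -> <hash> "Restored Identity"

for ident in manager.list_identities(current_identity_hash=restored["hash"]):
    marker = "*" if ident["is_current"] else " "
    print(f"{marker} {ident['hash']}: {ident['display_name']}")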

View File

@@ -0,0 +1,179 @@
import fnmatch
import hashlib
import json
import os
from datetime import UTC, datetime
from pathlib import Path
class IntegrityManager:
"""Manages the integrity of the database and identity files at rest."""
# Files and directories that are frequently modified by RNS/LXMF or SQLite
# and should be ignored during integrity checks.
IGNORED_PATTERNS = [
"*-wal",
"*-shm",
"*-journal",
"*.tmp",
"*.lock",
"*.log",
"*~",
".DS_Store",
"Thumbs.db",
"integrity-manifest.json",
]
def __init__(self, storage_dir, database_path, identity_hash=None):
self.storage_dir = Path(storage_dir)
self.database_path = Path(database_path)
self.identity_hash = identity_hash
self.manifest_path = self.storage_dir / "integrity-manifest.json"
self.issues = []
def _should_ignore(self, rel_path):
"""Determine if a file path should be ignored based on name or directory."""
path = Path(rel_path)
path_parts = path.parts
# Check for volatile LXMF/RNS directories
# We only ignore these if they are inside the lxmf_router directory
# to avoid accidentally ignoring important files with similar names.
if "lxmf_router" in path_parts:
if any(
part in ["announces", "storage", "identities"] for part in path_parts
):
return True
# Check for other generally ignored directories
if any(
part in ["tmp", "recordings", "greetings", "docs", "bots", "ringtones"]
for part in path_parts
):
return True
filename = path_parts[-1]
# Check against IGNORED_PATTERNS
if any(fnmatch.fnmatch(filename, pattern) for pattern in self.IGNORED_PATTERNS):
return True
return False
def _hash_file(self, file_path):
if not os.path.exists(file_path):
return None
sha256_hash = hashlib.sha256()
with open(file_path, "rb") as f:
for byte_block in iter(lambda: f.read(4096), b""):
sha256_hash.update(byte_block)
return sha256_hash.hexdigest()
def check_integrity(self):
"""Verify the current state against the last saved manifest."""
if not self.manifest_path.exists():
return True, ["Initial run - no manifest yet"]
try:
with open(self.manifest_path) as f:
manifest = json.load(f)
issues = []
manifest_files = manifest.get("files", {})
# Check Database
if self.database_path.exists():
db_rel = str(self.database_path.relative_to(self.storage_dir))
actual_db_hash = self._hash_file(self.database_path)
if actual_db_hash and actual_db_hash != manifest_files.get(db_rel):
issues.append(f"Database modified: {db_rel}")
# Check other critical files in storage_dir
for root, _, files_in_dir in os.walk(self.storage_dir):
for file in files_in_dir:
full_path = Path(root) / file
rel_path = str(full_path.relative_to(self.storage_dir))
if self._should_ignore(rel_path):
continue
# Database already checked separately, skip here to avoid double reporting
if full_path == self.database_path:
continue
actual_hash = self._hash_file(full_path)
if rel_path in manifest_files:
if actual_hash != manifest_files[rel_path]:
issues.append(f"File modified: {rel_path}")
else:
# New files are also a concern for integrity
# but we only report them if they are not in ignored dirs/patterns
issues.append(f"New file detected: {rel_path}")
# Check for missing files that were in manifest
for rel_path in manifest_files:
if self._should_ignore(rel_path):
continue
full_path = self.storage_dir / rel_path
if not full_path.exists():
issues.append(f"File missing: {rel_path}")
if issues:
m_date = manifest.get("date", "Unknown")
m_time = manifest.get("time", "Unknown")
m_id = manifest.get("identity", "Unknown")
issues.insert(
0,
f"Last integrity snapshot: {m_date} {m_time} (Identity: {m_id})",
)
# Check if identity matches
if (
self.identity_hash
and m_id != "Unknown"
and self.identity_hash != m_id
):
issues.append(f"Identity mismatch! Manifest belongs to: {m_id}")
self.issues = issues
return len(issues) == 0, issues
except Exception as e:
import traceback
traceback.print_exc()
return False, [f"Integrity check failed: {e!s}"]
def save_manifest(self):
"""Snapshot the current state of critical files."""
try:
files = {}
# Hash all critical files in storage_dir recursively
for root, _, files_in_dir in os.walk(self.storage_dir):
for file in files_in_dir:
full_path = Path(root) / file
rel_path = str(full_path.relative_to(self.storage_dir))
if self._should_ignore(rel_path):
continue
files[rel_path] = self._hash_file(full_path)
now = datetime.now(UTC)
manifest = {
"version": 1,
"timestamp": now.timestamp(),
"date": now.strftime("%Y-%m-%d"),
"time": now.strftime("%H:%M:%S"),
"identity": self.identity_hash,
"files": files,
}
with open(self.manifest_path, "w") as f:
json.dump(manifest, f, indent=2)
return True
except Exception as e:
print(f"Failed to save integrity manifest: {e}")
return False
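A short sketch of the intended lifecycle (paths are illustrative): verify the stored files on startup, then snapshot the current state on clean shutdown.

integrity = IntegrityManager(
    storage_dir="/path/to/storage",
    database_path="/path/to/storage/database.db",
    identity_hash="aabbccddeeff0011",
)
ok, issues = integrity.check_integrity()
if not ok:
    for issue in issues:
        print(issue)
# ... run the application ...
integrity.save_manifest()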

View File

@@ -30,6 +30,12 @@ class InterfaceConfigParser:
for interface_name in config_interfaces:
# ensure interface has a name
interface_config = config_interfaces[interface_name]
if not isinstance(interface_config, dict):
print(
f"Skipping invalid interface configuration for {interface_name}: expected dict, got {type(interface_config)}",
)
continue
interface_config["name"] = interface_name
interfaces.append(interface_config)

View File

@@ -3,9 +3,12 @@ import time
import RNS
from RNS.Interfaces.Interface import Interface
from src.backend.interfaces.WebsocketClientInterface import WebsocketClientInterface
from websockets.sync.server import Server, ServerConnection, serve
from meshchatx.src.backend.interfaces.WebsocketClientInterface import (
WebsocketClientInterface,
)
class WebsocketServerInterface(Interface):
# TODO: required?

View File

@@ -0,0 +1,337 @@
import base64
import json
import LXMF
from meshchatx.src.backend.telemetry_utils import Telemeter
def convert_lxmf_message_to_dict(
lxmf_message: LXMF.LXMessage,
include_attachments: bool = True,
reticulum=None,
):
# handle fields
fields = {}
message_fields = lxmf_message.get_fields()
for field_type in message_fields:
value = message_fields[field_type]
# handle file attachments field
if field_type == LXMF.FIELD_FILE_ATTACHMENTS:
# process file attachments
file_attachments = []
for file_attachment in value:
file_name = file_attachment[0]
file_data = file_attachment[1]
file_bytes = None
if include_attachments:
file_bytes = base64.b64encode(file_data).decode(
"utf-8",
)
file_attachments.append(
{
"file_name": file_name,
"file_size": len(file_data),
"file_bytes": file_bytes,
},
)
# add to fields
fields["file_attachments"] = file_attachments
# handle image field
if field_type == LXMF.FIELD_IMAGE:
image_type = value[0]
image_data = value[1]
image_bytes = None
if include_attachments:
image_bytes = base64.b64encode(image_data).decode("utf-8")
fields["image"] = {
"image_type": image_type,
"image_size": len(image_data),
"image_bytes": image_bytes,
}
# handle audio field
if field_type == LXMF.FIELD_AUDIO:
audio_mode = value[0]
audio_data = value[1]
audio_bytes = None
if include_attachments:
audio_bytes = base64.b64encode(audio_data).decode("utf-8")
fields["audio"] = {
"audio_mode": audio_mode,
"audio_size": len(audio_data),
"audio_bytes": audio_bytes,
}
# handle telemetry field
if field_type == LXMF.FIELD_TELEMETRY:
fields["telemetry"] = Telemeter.from_packed(value)
# handle commands field
if field_type in (LXMF.FIELD_COMMANDS, 0x01):
# value is usually a list of dicts, or a single dict
if isinstance(value, dict):
# convert dict keys back to ints if they look like hex or int strings
new_cmd = {}
for k, v in value.items():
try:
ki = None
if isinstance(k, int):
ki = k
elif isinstance(k, str):
if k.startswith("0x"):
ki = int(k, 16)
else:
ki = int(k)
if ki is not None:
new_cmd[f"0x{ki:02x}"] = v
else:
new_cmd[str(k)] = v
except (ValueError, TypeError):
new_cmd[str(k)] = v
fields["commands"] = [new_cmd]
elif isinstance(value, list):
processed_commands = []
for cmd in value:
if isinstance(cmd, dict):
new_cmd = {}
for k, v in cmd.items():
try:
ki = None
if isinstance(k, int):
ki = k
elif isinstance(k, str):
if k.startswith("0x"):
ki = int(k, 16)
else:
ki = int(k)
if ki is not None:
new_cmd[f"0x{ki:02x}"] = v
else:
new_cmd[str(k)] = v
except (ValueError, TypeError):
new_cmd[str(k)] = v
processed_commands.append(new_cmd)
else:
processed_commands.append(cmd)
fields["commands"] = processed_commands
else:
fields["commands"] = value
# convert 0.0-1.0 progress to a 0-100 percentage
progress_percentage = round(lxmf_message.progress * 100, 2)
# get rssi
rssi = lxmf_message.rssi
if rssi is None and reticulum:
rssi = reticulum.get_packet_rssi(lxmf_message.hash)
# get snr
snr = lxmf_message.snr
if snr is None and reticulum:
snr = reticulum.get_packet_snr(lxmf_message.hash)
# get quality
quality = lxmf_message.q
if quality is None and reticulum:
quality = reticulum.get_packet_q(lxmf_message.hash)
return {
"hash": lxmf_message.hash.hex(),
"source_hash": lxmf_message.source_hash.hex(),
"destination_hash": lxmf_message.destination_hash.hex(),
"is_incoming": lxmf_message.incoming,
"state": convert_lxmf_state_to_string(lxmf_message),
"progress": progress_percentage,
"method": convert_lxmf_method_to_string(lxmf_message),
"delivery_attempts": lxmf_message.delivery_attempts,
"next_delivery_attempt_at": getattr(
lxmf_message,
"next_delivery_attempt",
None,
), # attribute may not exist yet
"title": lxmf_message.title.decode("utf-8", errors="replace")
if lxmf_message.title
else "",
"content": lxmf_message.content.decode("utf-8", errors="replace")
if lxmf_message.content
else "",
"fields": fields,
"timestamp": lxmf_message.timestamp,
"rssi": rssi,
"snr": snr,
"quality": quality,
}
def convert_lxmf_state_to_string(lxmf_message: LXMF.LXMessage):
# convert state to string
lxmf_message_state = "unknown"
if lxmf_message.state == LXMF.LXMessage.GENERATING:
lxmf_message_state = "generating"
elif lxmf_message.state == LXMF.LXMessage.OUTBOUND:
lxmf_message_state = "outbound"
elif lxmf_message.state == LXMF.LXMessage.SENDING:
lxmf_message_state = "sending"
elif lxmf_message.state == LXMF.LXMessage.SENT:
lxmf_message_state = "sent"
elif lxmf_message.state == LXMF.LXMessage.DELIVERED:
lxmf_message_state = "delivered"
elif lxmf_message.state == LXMF.LXMessage.REJECTED:
lxmf_message_state = "rejected"
elif lxmf_message.state == LXMF.LXMessage.CANCELLED:
lxmf_message_state = "cancelled"
elif lxmf_message.state == LXMF.LXMessage.FAILED:
lxmf_message_state = "failed"
return lxmf_message_state
def convert_lxmf_method_to_string(lxmf_message: LXMF.LXMessage):
# convert method to string
lxmf_message_method = "unknown"
if lxmf_message.method == LXMF.LXMessage.OPPORTUNISTIC:
lxmf_message_method = "opportunistic"
elif lxmf_message.method == LXMF.LXMessage.DIRECT:
lxmf_message_method = "direct"
elif lxmf_message.method == LXMF.LXMessage.PROPAGATED:
lxmf_message_method = "propagated"
elif lxmf_message.method == LXMF.LXMessage.PAPER:
lxmf_message_method = "paper"
return lxmf_message_method
def convert_db_lxmf_message_to_dict(
db_lxmf_message,
include_attachments: bool = False,
):
try:
fields_str = db_lxmf_message.get("fields", "{}")
fields = json.loads(fields_str) if fields_str else {}
except (json.JSONDecodeError, TypeError):
fields = {}
if not isinstance(fields, dict):
fields = {}
# normalize commands if present
if "commands" in fields:
cmds = fields["commands"]
if isinstance(cmds, list):
new_cmds = []
for cmd in cmds:
if isinstance(cmd, dict):
new_cmd = {}
for k, v in cmd.items():
# normalize key to 0xXX format if it's a number string
try:
ki = None
if isinstance(k, int):
ki = k
elif isinstance(k, str):
if k.startswith("0x"):
ki = int(k, 16)
else:
ki = int(k)
if ki is not None:
new_cmd[f"0x{ki:02x}"] = v
else:
new_cmd[str(k)] = v
except (ValueError, TypeError):
new_cmd[str(k)] = v
new_cmds.append(new_cmd)
else:
new_cmds.append(cmd)
fields["commands"] = new_cmds
# strip attachments if requested
if not include_attachments:
if "image" in fields:
# keep type but strip bytes
image_size = fields["image"].get("image_size") or 0
b64_bytes = fields["image"].get("image_bytes")
if not image_size and b64_bytes:
# Optimized size calculation without full decoding
image_size = (len(b64_bytes) * 3) // 4
if b64_bytes.endswith("=="):
image_size -= 2
elif b64_bytes.endswith("="):
image_size -= 1
fields["image"] = {
"image_type": fields["image"].get("image_type"),
"image_size": image_size,
"image_bytes": None,
}
if "audio" in fields:
# keep mode but strip bytes
audio_size = fields["audio"].get("audio_size") or 0
b64_bytes = fields["audio"].get("audio_bytes")
if not audio_size and b64_bytes:
audio_size = (len(b64_bytes) * 3) // 4
if b64_bytes.endswith("=="):
audio_size -= 2
elif b64_bytes.endswith("="):
audio_size -= 1
fields["audio"] = {
"audio_mode": fields["audio"].get("audio_mode"),
"audio_size": audio_size,
"audio_bytes": None,
}
if "file_attachments" in fields:
# keep file names but strip bytes
for i in range(len(fields["file_attachments"])):
file_size = fields["file_attachments"][i].get("file_size") or 0
b64_bytes = fields["file_attachments"][i].get("file_bytes")
if not file_size and b64_bytes:
file_size = (len(b64_bytes) * 3) // 4
if b64_bytes.endswith("=="):
file_size -= 2
elif b64_bytes.endswith("="):
file_size -= 1
fields["file_attachments"][i] = {
"file_name": fields["file_attachments"][i].get("file_name"),
"file_size": file_size,
"file_bytes": None,
}
# ensure created_at and updated_at have Z suffix for UTC if they don't have a timezone
created_at = str(db_lxmf_message["created_at"])
if created_at and "+" not in created_at and "Z" not in created_at:
created_at += "Z"
updated_at = str(db_lxmf_message["updated_at"])
if updated_at and "+" not in updated_at and "Z" not in updated_at:
updated_at += "Z"
return {
"id": db_lxmf_message["id"],
"hash": db_lxmf_message["hash"],
"source_hash": db_lxmf_message["source_hash"],
"destination_hash": db_lxmf_message["destination_hash"],
"is_incoming": bool(db_lxmf_message["is_incoming"]),
"state": db_lxmf_message["state"],
"progress": db_lxmf_message["progress"],
"method": db_lxmf_message["method"],
"delivery_attempts": db_lxmf_message["delivery_attempts"],
"next_delivery_attempt_at": db_lxmf_message["next_delivery_attempt_at"],
"title": db_lxmf_message["title"],
"content": db_lxmf_message["content"],
"fields": fields,
"timestamp": db_lxmf_message["timestamp"],
"rssi": db_lxmf_message["rssi"],
"snr": db_lxmf_message["snr"],
"quality": db_lxmf_message["quality"],
"is_spam": bool(db_lxmf_message["is_spam"]),
"created_at": created_at,
"updated_at": updated_at,
}
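The padded-size arithmetic used when stripping attachments can be verified in isolation: a base64 string of length n decodes to (n * 3) // 4 bytes, minus one byte per trailing "=" of padding. A small self-check sketch:

import base64

for raw in (b"a", b"ab", b"abc", b"abcd"):
    b64 = base64.b64encode(raw).decode("ascii")
    size = (len(b64) * 3) // 4
    if b64.endswith("=="):
        size -= 2
    elif b64.endswith("="):
        size -= 1
    assert size == len(raw)  # matches the true decoded size without decoding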

View File

@@ -1,3 +1,5 @@
import base64
import concurrent.futures
import math
import os
import sqlite3
@@ -7,6 +9,11 @@ import time
import requests
import RNS
# 1x1 transparent PNG to return when a tile is not found in offline mode
TRANSPARENT_TILE = base64.b64decode(
"iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAADUlEQVR42mNkYAAAAAYAAjCB0C8AAAAASUVORK5CYII=",
)
class MapManager:
def __init__(self, config_manager, storage_dir):
@@ -178,14 +185,17 @@ class MapManager:
# bbox: [min_lon, min_lat, max_lon, max_lat]
min_lon, min_lat, max_lon, max_lat = bbox
# calculate total tiles
total_tiles = 0
# collect all tiles to download
tiles_to_download = []
zoom_levels = range(min_zoom, max_zoom + 1)
for z in zoom_levels:
x1, y1 = self._lonlat_to_tile(min_lon, max_lat, z)
x2, y2 = self._lonlat_to_tile(max_lon, min_lat, z)
total_tiles += (x2 - x1 + 1) * (y2 - y1 + 1)
tiles_to_download.extend(
(z, x, y) for x in range(x1, x2 + 1) for y in range(y1, y2 + 1)
)
total_tiles = len(tiles_to_download)
self._export_progress[export_id]["total"] = total_tiles
self._export_progress[export_id]["status"] = "downloading"
@@ -214,56 +224,97 @@ class MapManager:
("bounds", f"{min_lon},{min_lat},{max_lon},{max_lat}"),
]
cursor.executemany("INSERT INTO metadata VALUES (?, ?)", metadata)
conn.commit()
tile_server_url = self.config.map_tile_server_url.get()
current_count = 0
for z in zoom_levels:
x1, y1 = self._lonlat_to_tile(min_lon, max_lat, z)
x2, y2 = self._lonlat_to_tile(max_lon, min_lat, z)
for x in range(x1, x2 + 1):
for y in range(y1, y2 + 1):
# check if we should stop
if export_id in self._export_cancelled:
conn.close()
if os.path.exists(dest_path):
os.remove(dest_path)
if export_id in self._export_progress:
del self._export_progress[export_id]
self._export_cancelled.remove(export_id)
return
# download tiles in parallel
# using 10 workers for a good balance between speed and being polite
max_workers = 10
# download tile
tile_url = f"https://tile.openstreetmap.org/{z}/{x}/{y}.png"
try:
# wait a bit to be nice to OSM
time.sleep(0.1)
def download_tile(tile_coords):
if export_id in self._export_cancelled:
return None
response = requests.get(
tile_url,
headers={"User-Agent": "MeshChatX/1.0 MapExporter"},
timeout=10,
)
if response.status_code == 200:
# MBTiles uses TMS (y flipped)
tms_y = (1 << z) - 1 - y
cursor.execute(
"INSERT INTO tiles VALUES (?, ?, ?, ?)",
(z, x, tms_y, response.content),
)
except Exception as e:
RNS.log(
f"Export failed to download tile {z}/{x}/{y}: {e}",
RNS.LOG_ERROR,
)
z, x, y = tile_coords
tile_url = (
tile_server_url.replace("{z}", str(z))
.replace("{x}", str(x))
.replace("{y}", str(y))
)
current_count += 1
try:
# small per-thread delay to avoid overwhelming servers
time.sleep(0.02)
response = requests.get(
tile_url,
headers={"User-Agent": "MeshChatX/1.0 MapExporter"},
timeout=15,
)
if response.status_code == 200:
# MBTiles uses TMS (y flipped)
tms_y = (1 << z) - 1 - y
return (z, x, tms_y, response.content)
except Exception as e:
RNS.log(
f"Export failed to download tile {z}/{x}/{y}: {e}",
RNS.LOG_ERROR,
)
return None
with concurrent.futures.ThreadPoolExecutor(
max_workers=max_workers,
) as executor:
future_to_tile = {
executor.submit(download_tile, tile): tile
for tile in tiles_to_download
}
batch_size = 50
batch_data = []
for future in concurrent.futures.as_completed(future_to_tile):
if export_id in self._export_cancelled:
executor.shutdown(wait=False, cancel_futures=True)
break
result = future.result()
if result:
batch_data.append(result)
current_count += 1
# Update progress every few tiles or when batch is ready
if current_count % 5 == 0 or current_count == total_tiles:
self._export_progress[export_id]["current"] = current_count
self._export_progress[export_id]["progress"] = int(
(current_count / total_tiles) * 100,
)
# commit after each zoom level
conn.commit()
# Write batches to database
if len(batch_data) >= batch_size or (
current_count == total_tiles and batch_data
):
try:
cursor.executemany(
"INSERT INTO tiles VALUES (?, ?, ?, ?)",
batch_data,
)
conn.commit()
batch_data = []
except Exception as e:
RNS.log(f"Failed to insert map tiles: {e}", RNS.LOG_ERROR)
if export_id in self._export_cancelled:
conn.close()
if os.path.exists(dest_path):
os.remove(dest_path)
if export_id in self._export_progress:
del self._export_progress[export_id]
self._export_cancelled.remove(export_id)
return
conn.close()
self._export_progress[export_id]["status"] = "completed"
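The export relies on `_lonlat_to_tile`, which is not shown in this hunk; it is presumably the standard Web Mercator ("slippy map") projection. A sketch of that formula:

import math

def lonlat_to_tile(lon: float, lat: float, zoom: int) -> tuple[int, int]:
    n = 1 << zoom  # tiles per axis at this zoom level
    x = int((lon + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat))) / math.pi) / 2.0 * n)
    return x, y

The `(1 << z) - 1 - y` flip applied before the insert matches the MBTiles TMS scheme, which counts tile rows from the bottom rather than the top.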

View File

@@ -0,0 +1,190 @@
import html
import re
class MarkdownRenderer:
"""A simple Markdown to HTML renderer."""
@staticmethod
def render(text):
if not text:
return ""
# Escape HTML entities first to prevent XSS
# Use a more limited escape if we want to allow some things,
# but for docs, full escape is safest.
text = html.escape(text)
# Fenced code blocks - process these FIRST and replace with placeholders
# to avoid other regexes mangling the code content
code_blocks = []
def code_block_placeholder(match):
lang = match.group(1) or ""
code = match.group(2)
placeholder = f"[[CB{len(code_blocks)}]]"
code_blocks.append(
f'<pre class="bg-gray-800 dark:bg-zinc-900 text-zinc-100 dark:text-zinc-100 p-4 rounded-lg my-4 overflow-x-auto border border-gray-700 dark:border-zinc-800 font-mono text-sm"><code class="language-{lang} text-inherit">{code}</code></pre>',
)
return placeholder
text = re.sub(
r"```(\w+)?\n(.*?)\n```",
code_block_placeholder,
text,
flags=re.DOTALL,
)
# Horizontal Rules
text = re.sub(
r"^---+$",
r'<hr class="my-8 border-t border-gray-200 dark:border-zinc-800">',
text,
flags=re.MULTILINE,
)
# Headers
text = re.sub(
r"^# (.*)$",
r'<h1 class="text-3xl font-bold mt-8 mb-4 text-gray-900 dark:text-zinc-100">\1</h1>',
text,
flags=re.MULTILINE,
)
text = re.sub(
r"^## (.*)$",
r'<h2 class="text-2xl font-bold mt-6 mb-3 text-gray-900 dark:text-zinc-100">\1</h2>',
text,
flags=re.MULTILINE,
)
text = re.sub(
r"^### (.*)$",
r'<h3 class="text-xl font-bold mt-4 mb-2 text-gray-900 dark:text-zinc-100">\1</h3>',
text,
flags=re.MULTILINE,
)
text = re.sub(
r"^#### (.*)$",
r'<h4 class="text-lg font-bold mt-3 mb-2 text-gray-900 dark:text-zinc-100">\1</h4>',
text,
flags=re.MULTILINE,
)
# Bold and Italic
text = re.sub(r"\*\*\*(.*?)\*\*\*", r"<strong><em>\1</em></strong>", text)
text = re.sub(r"\*\*(.*?)\*\*", r"<strong>\1</strong>", text)
text = re.sub(r"\*(.*?)\*", r"<em>\1</em>", text)
text = re.sub(r"___(.*?)___", r"<strong><em>\1</em></strong>", text)
text = re.sub(r"__(.*?)__", r"<strong>\1</strong>", text)
text = re.sub(r"_(.*?)_", r"<em>\1</em>", text)
# Strikethrough
text = re.sub(r"~~(.*?)~~", r"<del>\1</del>", text)
# Inline code
text = re.sub(
r"`([^`]+)`",
r'<code class="bg-gray-100 dark:bg-zinc-800 px-1.5 py-0.5 rounded text-pink-600 dark:text-pink-400 font-mono text-[0.9em]">\1</code>',
text,
)
# Task lists
text = re.sub(
r"^[-*] \[ \] (.*)$",
r'<li class="flex items-start gap-2 list-none"><input type="checkbox" disabled class="mt-1"> <span>\1</span></li>',
text,
flags=re.MULTILINE,
)
text = re.sub(
r"^[-*] \[x\] (.*)$",
r'<li class="flex items-start gap-2 list-none"><input type="checkbox" checked disabled class="mt-1"> <span class="line-through opacity-50">\1</span></li>',
text,
flags=re.MULTILINE,
)
# Links
text = re.sub(
r"\[([^\]]+)\]\(([^)]+)\)",
r'<a href="\2" class="text-blue-600 dark:text-blue-400 hover:underline" target="_blank">\1</a>',
text,
)
# Images
text = re.sub(
r"!\[([^\]]*)\]\(([^)]+)\)",
r'<div class="my-6"><img src="\2" alt="\1" class="max-w-full h-auto rounded-xl shadow-lg border border-gray-100 dark:border-zinc-800"></div>',
text,
)
# Blockquotes
text = re.sub(
r"^> (.*)$",
r'<blockquote class="border-l-4 border-blue-500/50 pl-4 py-2 my-6 italic bg-gray-50 dark:bg-zinc-900/50 text-gray-700 dark:text-zinc-300 rounded-r-lg">\1</blockquote>',
text,
flags=re.MULTILINE,
)
# Lists - Simple single level for now to keep it predictable
def unordered_list_repl(match):
items = match.group(0).strip().split("\n")
html_items = ""
for i in items:
# Check if it's already a task list item
if 'type="checkbox"' in i:
html_items += i
else:
content = i[2:].strip()
html_items += f'<li class="ml-4 mb-1 list-disc text-gray-700 dark:text-zinc-300">{content}</li>'
return f'<ul class="my-4 space-y-1">{html_items}</ul>'
text = re.sub(
r"((?:^[*-] .*\n?)+)",
unordered_list_repl,
text,
flags=re.MULTILINE,
)
def ordered_list_repl(match):
items = match.group(0).strip().split("\n")
html_items = ""
for i in items:
content = re.sub(r"^\d+\. ", "", i).strip()
html_items += f'<li class="ml-4 mb-1 list-decimal text-gray-700 dark:text-zinc-300">{content}</li>'
return f'<ol class="my-4 space-y-1">{html_items}</ol>'
text = re.sub(
r"((?:^\d+\. .*\n?)+)",
ordered_list_repl,
text,
flags=re.MULTILINE,
)
# Paragraphs - double newline to p tag
parts = text.split("\n\n")
processed_parts = []
for part in parts:
part = part.strip()
if not part:
continue
# If it's a placeholder for code block, don't wrap in <p>
if part.startswith("[[CB") and part.endswith("]]"):
processed_parts.append(part)
continue
# If it already starts with a block tag, don't wrap in <p>
if re.match(r"^<(h\d|ul|ol|li|blockquote|hr|div)", part):
processed_parts.append(part)
else:
# Replace single newlines with <br> for line breaks within paragraphs
part = part.replace("\n", "<br>")
processed_parts.append(
f'<p class="my-4 leading-relaxed text-gray-800 dark:text-zinc-200">{part}</p>',
)
text = "\n".join(processed_parts)
# Restore code blocks
for i, code_html in enumerate(code_blocks):
text = text.replace(f"[[CB{i}]]", code_html)
return text
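A quick usage sketch; because the input is HTML-escaped up front, raw markup in the source is rendered as text rather than interpreted:

html_out = MarkdownRenderer.render(
    "# Title\n\nSome **bold** text with `code`.\n\n- item one\n- item two",
)
print(html_out)  # <h1 ...>Title</h1> ... <ul ...>...</ul>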

View File

@@ -0,0 +1,219 @@
import base64
import json
import signal
import threading
import LXMF
import RNS.vendor.umsgpack as msgpack
from LXMF import LXMRouter
def create_lxmf_router(identity, storagepath, propagation_cost=None):
"""Creates an LXMF.LXMRouter instance safely, avoiding signal handler crashes
when called from non-main threads.
"""
if propagation_cost is None:
propagation_cost = 0
if threading.current_thread() != threading.main_thread():
# signal.signal can only be called from the main thread in Python
# We monkeypatch it temporarily to avoid the ValueError
original_signal = signal.signal
try:
signal.signal = lambda s, h: None
return LXMF.LXMRouter(
identity=identity,
storagepath=storagepath,
propagation_cost=propagation_cost,
)
finally:
signal.signal = original_signal
else:
return LXMF.LXMRouter(
identity=identity,
storagepath=storagepath,
propagation_cost=propagation_cost,
)
def parse_bool_query_param(value: str | None) -> bool:
if value is None:
return False
value = value.lower()
return value in {"1", "true", "yes", "on"}
def message_fields_have_attachments(fields_json: str | None):
if not fields_json:
return False
try:
fields = json.loads(fields_json)
except Exception:
return False
if "image" in fields or "audio" in fields:
return True
if "file_attachments" in fields and isinstance(
fields["file_attachments"],
list,
):
return len(fields["file_attachments"]) > 0
return False
def has_attachments(lxmf_fields: dict) -> bool:
try:
if LXMF.FIELD_FILE_ATTACHMENTS in lxmf_fields:
return len(lxmf_fields[LXMF.FIELD_FILE_ATTACHMENTS]) > 0
if LXMF.FIELD_IMAGE in lxmf_fields:
return True
if LXMF.FIELD_AUDIO in lxmf_fields:
return True
return False
except Exception:
return False
def convert_propagation_node_state_to_string(state):
state_map = {
LXMRouter.PR_IDLE: "idle",
LXMRouter.PR_PATH_REQUESTED: "path_requested",
LXMRouter.PR_LINK_ESTABLISHING: "link_establishing",
LXMRouter.PR_LINK_ESTABLISHED: "link_established",
LXMRouter.PR_REQUEST_SENT: "request_sent",
LXMRouter.PR_RECEIVING: "receiving",
LXMRouter.PR_RESPONSE_RECEIVED: "response_received",
LXMRouter.PR_COMPLETE: "complete",
LXMRouter.PR_NO_PATH: "no_path",
LXMRouter.PR_LINK_FAILED: "link_failed",
LXMRouter.PR_TRANSFER_FAILED: "transfer_failed",
LXMRouter.PR_NO_IDENTITY_RCVD: "no_identity_received",
LXMRouter.PR_NO_ACCESS: "no_access",
LXMRouter.PR_FAILED: "failed",
}
if state in state_map:
return state_map[state]
return "unknown"
def convert_db_favourite_to_dict(favourite):
created_at = str(favourite["created_at"])
if created_at and "+" not in created_at and "Z" not in created_at:
created_at += "Z"
updated_at = str(favourite["updated_at"])
if updated_at and "+" not in updated_at and "Z" not in updated_at:
updated_at += "Z"
return {
"id": favourite["id"],
"destination_hash": favourite["destination_hash"],
"display_name": favourite["display_name"],
"aspect": favourite["aspect"],
"created_at": created_at,
"updated_at": updated_at,
}
def parse_lxmf_display_name(
app_data_base64: str | bytes | None,
default_value: str | None = "Anonymous Peer",
):
if app_data_base64 is None:
return default_value
try:
if isinstance(app_data_base64, bytes):
app_data_bytes = app_data_base64
else:
app_data_bytes = base64.b64decode(app_data_base64)
# Try using the library first
try:
display_name = LXMF.display_name_from_app_data(app_data_bytes)
if display_name is not None:
return display_name
except Exception:
# handle cases where the library fails, including the known "'str' object has no attribute 'decode'" bug
pass
# Fallback manual parsing if library failed or returned None
if len(app_data_bytes) > 0:
# Version 0.5.0+ announce format (msgpack list)
if (
app_data_bytes[0] >= 0x90 and app_data_bytes[0] <= 0x9F
) or app_data_bytes[0] == 0xDC:
try:
peer_data = msgpack.unpackb(app_data_bytes)
if isinstance(peer_data, list) and len(peer_data) >= 1:
dn = peer_data[0]
if dn is not None:
if isinstance(dn, bytes):
return dn.decode("utf-8")
return str(dn)
except Exception:
pass
except Exception as e:
print(f"Failed to parse LXMF display name: {e}")
return default_value
def parse_lxmf_stamp_cost(app_data_base64: str | bytes | None):
if app_data_base64 is None:
return None
try:
if isinstance(app_data_base64, bytes):
app_data_bytes = app_data_base64
else:
app_data_bytes = base64.b64decode(app_data_base64)
return LXMF.stamp_cost_from_app_data(app_data_bytes)
except Exception as e:
print(f"Failed to parse LXMF stamp cost: {e}")
return None
def parse_nomadnetwork_node_display_name(
app_data_base64: str | bytes | None,
default_value: str | None = "Anonymous Node",
):
if app_data_base64 is None:
return default_value
try:
if isinstance(app_data_base64, bytes):
app_data_bytes = app_data_base64
else:
app_data_bytes = base64.b64decode(app_data_base64)
return app_data_bytes.decode("utf-8")
except Exception as e:
print(f"Failed to parse NomadNetwork display name: {e}")
return default_value
def parse_lxmf_propagation_node_app_data(app_data_base64: str | bytes | None):
if app_data_base64 is None:
return None
try:
if isinstance(app_data_base64, bytes):
app_data_bytes = app_data_base64
else:
app_data_bytes = base64.b64decode(app_data_base64)
data = msgpack.unpackb(app_data_bytes)
if not isinstance(data, list) or len(data) < 4:
return None
return {
"enabled": bool(data[2]) if data[2] is not None else False,
"timebase": int(data[1]) if data[1] is not None else 0,
"per_transfer_limit": int(data[3]) if data[3] is not None else 0,
}
except Exception as e:
print(f"Failed to parse LXMF propagation node app data: {e}")
return None
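Since `create_lxmf_router` exists specifically to dodge the main-thread restriction on `signal.signal`, a sketch of calling it from a worker thread; this assumes a Reticulum instance has already been started elsewhere, and the storage path is illustrative:

import threading

import RNS

def start_router():
    identity = RNS.Identity(create_keys=True)
    router = create_lxmf_router(identity, storagepath="/tmp/lxmf-storage")
    print("router ready:", router)

threading.Thread(target=start_router, daemon=True).start()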

View File

@@ -16,10 +16,9 @@ class MessageHandler:
):
query = """
SELECT * FROM lxmf_messages
WHERE ((source_hash = ? AND destination_hash = ?)
OR (destination_hash = ? AND source_hash = ?))
WHERE peer_hash = ?
"""
params = [local_hash, destination_hash, local_hash, destination_hash]
params = [destination_hash]
if after_id:
query += " AND id > ?"
@@ -34,56 +33,110 @@ class MessageHandler:
return self.db.provider.fetchall(query, params)
def delete_conversation(self, local_hash, destination_hash):
query = """
DELETE FROM lxmf_messages
WHERE ((source_hash = ? AND destination_hash = ?)
OR (destination_hash = ? AND source_hash = ?))
"""
query = "DELETE FROM lxmf_messages WHERE peer_hash = ?"
self.db.provider.execute(query, [destination_hash])
# Also clean up folder mapping
self.db.provider.execute(
query,
[local_hash, destination_hash, local_hash, destination_hash],
"DELETE FROM lxmf_conversation_folders WHERE peer_hash = ?",
[destination_hash],
)
def search_messages(self, local_hash, search_term):
like_term = f"%{search_term}%"
query = """
SELECT source_hash, destination_hash, MAX(timestamp) as max_ts
SELECT peer_hash, MAX(timestamp) as max_ts
FROM lxmf_messages
WHERE (source_hash = ? OR destination_hash = ?)
AND (title LIKE ? OR content LIKE ? OR source_hash LIKE ? OR destination_hash LIKE ?)
GROUP BY source_hash, destination_hash
WHERE title LIKE ? OR content LIKE ? OR peer_hash LIKE ?
GROUP BY peer_hash
"""
params = [local_hash, local_hash, like_term, like_term, like_term, like_term]
params = [like_term, like_term, like_term]
return self.db.provider.fetchall(query, params)
def get_conversations(self, local_hash, filter_unread=False):
# Implementation moved from get_conversations DAO but with local_hash filter
def get_conversations(
self,
local_hash,
search=None,
filter_unread=False,
filter_failed=False,
filter_has_attachments=False,
folder_id=None,
limit=None,
offset=0,
):
# Optimized using peer_hash column and JOINs to avoid N+1 queries
query = """
SELECT m1.* FROM lxmf_messages m1
JOIN (
SELECT
CASE WHEN source_hash = ? THEN destination_hash ELSE source_hash END as peer_hash,
MAX(timestamp) as max_ts
SELECT
m1.*,
a.app_data as peer_app_data,
c.display_name as custom_display_name,
con.custom_image as contact_image,
i.icon_name, i.foreground_colour, i.background_colour,
r.last_read_at,
f.id as folder_id,
fn.name as folder_name,
(SELECT COUNT(*) FROM lxmf_messages m_failed
WHERE m_failed.peer_hash = m1.peer_hash AND m_failed.state = 'failed') as failed_count
FROM lxmf_messages m1
INNER JOIN (
SELECT peer_hash, MAX(timestamp) as max_ts
FROM lxmf_messages
WHERE source_hash = ? OR destination_hash = ?
WHERE peer_hash IS NOT NULL
GROUP BY peer_hash
) m2 ON (CASE WHEN m1.source_hash = ? THEN m1.destination_hash ELSE m1.source_hash END = m2.peer_hash
AND m1.timestamp = m2.max_ts)
WHERE (m1.source_hash = ? OR m1.destination_hash = ?)
) m2 ON m1.peer_hash = m2.peer_hash AND m1.timestamp = m2.max_ts
LEFT JOIN announces a ON a.destination_hash = m1.peer_hash
LEFT JOIN custom_destination_display_names c ON c.destination_hash = m1.peer_hash
LEFT JOIN contacts con ON (
con.remote_identity_hash = m1.peer_hash OR
con.lxmf_address = m1.peer_hash OR
con.lxst_address = m1.peer_hash
)
LEFT JOIN lxmf_user_icons i ON i.destination_hash = m1.peer_hash
LEFT JOIN lxmf_conversation_read_state r ON r.destination_hash = m1.peer_hash
LEFT JOIN lxmf_conversation_folders f ON f.peer_hash = m1.peer_hash
LEFT JOIN lxmf_folders fn ON fn.id = f.folder_id
"""
params = [
local_hash,
local_hash,
local_hash,
local_hash,
local_hash,
local_hash,
]
params = []
where_clauses = []
if folder_id is not None:
if folder_id == 0 or folder_id == "0":
# Special case: no folder (Uncategorized)
where_clauses.append("f.folder_id IS NULL")
else:
where_clauses.append("f.folder_id = ?")
params.append(folder_id)
if filter_unread:
query += " AND EXISTS (SELECT 1 FROM lxmf_messages m3 WHERE (m3.source_hash = m2.peer_hash AND m3.destination_hash = ?) AND m3.state = 'received' AND m3.is_incoming = 1)"
params.append(local_hash)
where_clauses.append(
"(r.last_read_at IS NULL OR m1.timestamp > strftime('%s', r.last_read_at))",
)
query += " ORDER BY m1.timestamp DESC"
if filter_failed:
where_clauses.append("m1.state = 'failed'")
if filter_has_attachments:
where_clauses.append(
"(m1.fields IS NOT NULL AND m1.fields != '{}' AND m1.fields != '')",
)
if search:
like_term = f"%{search}%"
# Search in latest message info OR search across ALL messages for this peer
where_clauses.append("""
(m1.title LIKE ? OR m1.content LIKE ? OR m1.peer_hash LIKE ? OR c.display_name LIKE ?
OR m1.peer_hash IN (SELECT peer_hash FROM lxmf_messages WHERE title LIKE ? OR content LIKE ?))
""")
params.extend(
[like_term, like_term, like_term, like_term, like_term, like_term],
)
if where_clauses:
query += " WHERE " + " AND ".join(where_clauses)
query += " GROUP BY m1.peer_hash ORDER BY m1.timestamp DESC"
if limit is not None:
query += " LIMIT ? OFFSET ?"
params.extend([limit, offset])
return self.db.provider.fetchall(query, params)
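A usage sketch for the rewritten query, assuming `handler` is a MessageHandler wired to the database and the hash is illustrative. Note that `local_hash` appears to be kept only for API compatibility, since the `peer_hash` query no longer needs it:

rows = handler.get_conversations(
    "aabbccddeeff00112233445566778899",  # local_hash, kept for API compatibility
    filter_unread=True,
    folder_id=0,  # 0 selects conversations without a folder ("Uncategorized")
    limit=25,
    offset=0,
)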

View File

@@ -0,0 +1,268 @@
import asyncio
import io
import os
import time
from collections.abc import Callable
import RNS
# global cache for nomadnet links to avoid re-establishing them for every request
nomadnet_cached_links = {}
class NomadnetDownloader:
def __init__(
self,
destination_hash: bytes,
path: str,
data: str | None,
on_download_success: Callable[[RNS.RequestReceipt], None],
on_download_failure: Callable[[str], None],
on_progress_update: Callable[[float], None],
timeout: int | None = None,
):
self.app_name = "nomadnetwork"
self.aspects = "node"
self.destination_hash = destination_hash
self.path = path
self.data = data
self.timeout = timeout
self._download_success_callback = on_download_success
self._download_failure_callback = on_download_failure
self.on_progress_update = on_progress_update
self.request_receipt = None
self.is_cancelled = False
self.link = None
# cancel the download
def cancel(self):
self.is_cancelled = True
# cancel the request if it exists
if self.request_receipt is not None:
try:
self.request_receipt.cancel()
except Exception as e:
print(f"Failed to cancel request: {e}")
# clean up the link if we created it
if self.link is not None:
try:
self.link.teardown()
except Exception as e:
print(f"Failed to teardown link: {e}")
# notify that download was cancelled
self._download_failure_callback("cancelled")
# setup link to destination and request download
async def download(
self,
path_lookup_timeout: int = 15,
link_establishment_timeout: int = 15,
):
# check if cancelled before starting
if self.is_cancelled:
return
# use existing established link if it's active
if self.destination_hash in nomadnet_cached_links:
link = nomadnet_cached_links[self.destination_hash]
if link.status is RNS.Link.ACTIVE:
print("[NomadnetDownloader] using existing link for request")
self.link_established(link)
return
# determine when to timeout
timeout_after_seconds = time.time() + path_lookup_timeout
# check if we have a path to the destination
if not RNS.Transport.has_path(self.destination_hash):
# we don't have a path, so we need to request it
RNS.Transport.request_path(self.destination_hash)
# wait until we have a path, or give up after the configured timeout
while (
not RNS.Transport.has_path(self.destination_hash)
and time.time() < timeout_after_seconds
):
# check if cancelled during path lookup
if self.is_cancelled:
return
await asyncio.sleep(0.1)
# if we still don't have a path, we can't establish a link, so bail out
if not RNS.Transport.has_path(self.destination_hash):
self._download_failure_callback("Could not find path to destination.")
return
# check if cancelled before establishing link
if self.is_cancelled:
return
# create destination to nomadnet node
identity = RNS.Identity.recall(self.destination_hash)
destination = RNS.Destination(
identity,
RNS.Destination.OUT,
RNS.Destination.SINGLE,
self.app_name,
self.aspects,
)
# create link to destination
print("[NomadnetDownloader] establishing new link for request")
link = RNS.Link(destination, established_callback=self.link_established)
self.link = link
# determine when to timeout
timeout_after_seconds = time.time() + link_establishment_timeout
# wait until we have established a link, or give up after the configured timeout
while (
link.status is not RNS.Link.ACTIVE and time.time() < timeout_after_seconds
):
# check if cancelled during link establishment
if self.is_cancelled:
return
await asyncio.sleep(0.1)
# if we still haven't established a link, bail out
if link.status is not RNS.Link.ACTIVE:
self._download_failure_callback("Could not establish link to destination.")
# link to destination was established, we should now request the download
def link_established(self, link):
# check if cancelled before requesting
if self.is_cancelled:
return
# cache link for using in future requests
nomadnet_cached_links[self.destination_hash] = link
# request download over link
self.request_receipt = link.request(
self.path,
data=self.data,
response_callback=self.on_response,
failed_callback=self.on_failed,
progress_callback=self.on_progress,
timeout=self.timeout,
)
# handle successful download
def on_response(self, request_receipt: RNS.RequestReceipt):
self._download_success_callback(request_receipt)
# handle failure
def on_failed(self, request_receipt=None):
self._download_failure_callback("request_failed")
# handle download progress
def on_progress(self, request_receipt):
self.on_progress_update(request_receipt.progress)
class NomadnetPageDownloader(NomadnetDownloader):
def __init__(
self,
destination_hash: bytes,
page_path: str,
data: str | None,
on_page_download_success: Callable[[str], None],
on_page_download_failure: Callable[[str], None],
on_progress_update: Callable[[float], None],
timeout: int | None = None,
):
self.on_page_download_success = on_page_download_success
self.on_page_download_failure = on_page_download_failure
super().__init__(
destination_hash,
page_path,
data,
self.on_download_success,
self.on_download_failure,
on_progress_update,
timeout,
)
# page download was successful, decode the response and send to provided callback
def on_download_success(self, request_receipt: RNS.RequestReceipt):
micron_markup_response = request_receipt.response.decode("utf-8")
self.on_page_download_success(micron_markup_response)
# page download failed, send error to provided callback
def on_download_failure(self, failure_reason):
self.on_page_download_failure(failure_reason)
class NomadnetFileDownloader(NomadnetDownloader):
def __init__(
self,
destination_hash: bytes,
page_path: str,
on_file_download_success: Callable[[str, bytes], None],
on_file_download_failure: Callable[[str], None],
on_progress_update: Callable[[float], None],
timeout: int | None = None,
):
self.on_file_download_success = on_file_download_success
self.on_file_download_failure = on_file_download_failure
super().__init__(
destination_hash,
page_path,
None,
self.on_download_success,
self.on_download_failure,
on_progress_update,
timeout,
)
# file download was successful, decode the response and send to provided callback
def on_download_success(self, request_receipt: RNS.RequestReceipt):
# get response
response = request_receipt.response
# handle buffered reader response
if isinstance(response, io.BufferedReader):
# get file name from metadata
file_name = "downloaded_file"
metadata = request_receipt.metadata
if metadata is not None and "name" in metadata:
file_path = metadata["name"].decode("utf-8")
file_name = os.path.basename(file_path)
# get file data
file_data: bytes = response.read()
self.on_file_download_success(file_name, file_data)
return
# check for list response with bytes in position 0, and metadata dict in position 1
# e.g: [file_bytes, {name: "filename.ext"}]
if isinstance(response, list) and len(response) >= 2 and isinstance(response[1], dict):
file_data: bytes = response[0]
metadata: dict = response[1]
# get file name from metadata
file_name = "downloaded_file"
if metadata is not None and "name" in metadata:
file_path = metadata["name"].decode("utf-8")
file_name = os.path.basename(file_path)
self.on_file_download_success(file_name, file_data)
return
# try using original response format
# unsure if this is actually used anymore now that a buffered reader is provided
# have left here just in case...
try:
file_name: str = response[0]
file_data: bytes = response[1]
self.on_file_download_success(file_name, file_data)
except Exception:
self.on_download_failure("unsupported_response")
# page download failed, send error to provided callback
def on_download_failure(self, failure_reason):
self.on_file_download_failure(failure_reason)
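A sketch of fetching a node page with the downloader above; the destination hash and path are illustrative, and in a real application the process keeps running so RNS can deliver the response to the callbacks:

import asyncio

def on_page(micron: str):
    print(micron)

def on_failure(reason: str):
    print(f"download failed: {reason}")

downloader = NomadnetPageDownloader(
    destination_hash=bytes.fromhex("aabbccddeeff00112233445566778899"),
    page_path="/page/index.mu",
    data=None,
    on_page_download_success=on_page,
    on_page_download_failure=on_failure,
    on_progress_update=lambda progress: None,
    timeout=30,
)
asyncio.run(downloader.download())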

View File

@@ -0,0 +1,61 @@
def convert_nomadnet_string_data_to_map(path_data: str | None):
data = {}
if path_data is not None:
for field in path_data.split("|"):
if "=" in field:
parts = field.split("=", 1)
if len(parts) == 2:
variable_name, variable_value = parts
data[f"var_{variable_name}"] = variable_value
else:
print(f"unhandled field: {field}")
return data
def convert_nomadnet_field_data_to_map(field_data):
data = {}
if field_data is not None:
try:
json_data = field_data
if isinstance(json_data, dict):
data = {f"field_{key}": value for key, value in json_data.items()}
else:
return None
except Exception as e:
print(f"skipping invalid field data: {e}")
return data
class NomadNetworkManager:
def __init__(self, config, archiver_manager, database):
self.config = config
self.archiver_manager = archiver_manager
self.database = database
def archive_page(
self,
destination_hash: str,
page_path: str,
content: str,
is_manual: bool = False,
):
if not is_manual and not self.config.page_archiver_enabled.get():
return
self.archiver_manager.archive_page(
destination_hash,
page_path,
content,
max_versions=self.config.page_archiver_max_versions.get(),
max_storage_gb=self.config.archives_max_storage_gb.get(),
)
def get_archived_page_versions(self, destination_hash: str, page_path: str):
return self.database.misc.get_archived_page_versions(
destination_hash,
page_path,
)
def flush_all_archived_pages(self):
self.database.misc.delete_archived_pages()
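For reference, the `|`-separated key=value format parsed by `convert_nomadnet_string_data_to_map` at the top of this file; keys come back prefixed with `var_`:

data = convert_nomadnet_string_data_to_map("field1=hello|field2=world")
assert data == {"var_field1": "hello", "var_field2": "world"}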

View File

@@ -0,0 +1,235 @@
import collections
import logging
import re
import threading
import time
from datetime import UTC, datetime
class PersistentLogHandler(logging.Handler):
def __init__(self, database=None, capacity=5000, flush_interval=5):
super().__init__()
self.database = database
self.logs_buffer = collections.deque(maxlen=capacity)
self.flush_interval = flush_interval
self.last_flush_time = time.time()
self.lock = threading.RLock()
self.flush_lock = threading.Lock()
# Anomaly detection state
self.recent_messages = collections.deque(maxlen=100)
self.flooding_threshold = 20 # messages per second
self.repeat_threshold = 5 # identical messages in a row
self.message_counts = collections.defaultdict(int)
self.last_reset_time = time.time()
# UA and IP tracking
self.known_ips = set()
self.known_uas = set()
def set_database(self, database):
with self.lock:
self.database = database
def emit(self, record):
try:
msg = self.format(record)
timestamp = datetime.now(UTC).timestamp()
is_anomaly, anomaly_type = self._detect_anomaly(record, msg, timestamp)
log_entry = {
"timestamp": timestamp,
"level": record.levelname,
"module": record.module,
"message": msg,
"is_anomaly": 1 if is_anomaly else 0,
"anomaly_type": anomaly_type,
}
with self.lock:
self.logs_buffer.append(log_entry)
# Periodically flush to database if available
if self.database and (
time.time() - self.last_flush_time > self.flush_interval
):
self._flush_to_db()
except Exception:
self.handleError(record)
def _detect_access_anomaly(self, message):
"""Detect anomalies in aiohttp access logs."""
# Regex to extract IP and User-Agent from aiohttp access log
# Format: IP [date] "GET ..." status size "referer" "User-Agent"
match = re.search(
r"^([\d\.\:]+) .* \"[^\"]+\" \d+ \d+ \"[^\"]*\" \"([^\"]+)\"",
message,
)
if match:
ip = match.group(1)
ua = match.group(2)
with self.lock:
is_anomaly = False
anomaly_type = None
# Detect if this is a different UA or IP from what we've seen recently
if len(self.known_ips) > 0 and ip not in self.known_ips:
is_anomaly = True
anomaly_type = "multi_ip"
if len(self.known_uas) > 0 and ua not in self.known_uas:
is_anomaly = True
if anomaly_type:
anomaly_type = "multi_ip_ua"
else:
anomaly_type = "multi_ua"
self.known_ips.add(ip)
self.known_uas.add(ua)
# Cap the tracking sets to prevent memory growth
if len(self.known_ips) > 100:
self.known_ips.clear()
if len(self.known_uas) > 100:
self.known_uas.clear()
return is_anomaly, anomaly_type
return False, None
def _detect_anomaly(self, record, message, timestamp):
# 1. Access anomaly detection (UA/IP) - checked for all levels of aiohttp.access
if record.name == "aiohttp.access":
is_acc_anomaly, acc_type = self._detect_access_anomaly(message)
if is_acc_anomaly:
return True, acc_type
# Only detect other anomalies for WARNING level and above
if record.levelno < logging.WARNING:
return False, None
now = time.time()
# 2. Detect log flooding
if now - self.last_reset_time > 1.0:
self.message_counts.clear()
self.last_reset_time = now
self.message_counts["total"] += 1
if self.message_counts["total"] > self.flooding_threshold:
return True, "flooding"
# 3. Detect repeats
if len(self.recent_messages) > 0:
repeat_count = 0
for prev_msg in reversed(self.recent_messages):
if prev_msg == message:
repeat_count += 1
else:
break
if repeat_count >= self.repeat_threshold:
return True, "repeat"
self.recent_messages.append(message)
return False, None
def _flush_to_db(self):
if not self.database:
return
# Ensure only one thread flushes at a time
if not self.flush_lock.acquire(blocking=False):
return
try:
items_to_flush = []
with self.lock:
while self.logs_buffer:
items_to_flush.append(self.logs_buffer.popleft())
if not items_to_flush:
return
# Batch insert for speed
for entry in items_to_flush:
try:
self.database.debug_logs.insert_log(
level=entry["level"],
module=entry["module"],
message=entry["message"],
is_anomaly=entry["is_anomaly"],
anomaly_type=entry["anomaly_type"],
)
except Exception as e:
print(f"Error inserting log: {e}")
# Periodically clean up old logs. This currently runs on every flush;
# it should be fast, but could be throttled (e.g. every Nth flush) later.
try:
self.database.debug_logs.cleanup_old_logs()
except Exception as e:
print(f"Error cleaning up logs: {e}")
self.last_flush_time = time.time()
except Exception as e:
print(f"Failed to flush logs to database: {e}")
finally:
self.flush_lock.release()
def get_logs(
self,
limit=100,
offset=0,
search=None,
level=None,
module=None,
is_anomaly=None,
):
if self.database:
# Flush current buffer first to ensure we have latest logs
self._flush_to_db()
with self.lock:
if self.database:
return self.database.debug_logs.get_logs(
limit=limit,
offset=offset,
search=search,
level=level,
module=module,
is_anomaly=is_anomaly,
)
# Fallback to in-memory buffer if DB not yet available
logs = list(self.logs_buffer)
if search:
logs = [
log
for log in logs
if search.lower() in log["message"].lower()
or search.lower() in log["module"].lower()
]
if level:
logs = [log for log in logs if log["level"] == level]
if is_anomaly is not None:
logs = [
log for log in logs if log["is_anomaly"] == (1 if is_anomaly else 0)
]
# Sort descending
logs.sort(key=lambda x: x["timestamp"], reverse=True)
return logs[offset : offset + limit]
def get_total_count(self, search=None, level=None, module=None, is_anomaly=None):
with self.lock:
if self.database:
return self.database.debug_logs.get_total_count(
search=search,
level=level,
module=module,
is_anomaly=is_anomaly,
)
return len(self.logs_buffer)
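An installation sketch: the handler can be attached before the database exists, buffering in memory, and given the database later via `set_database` (the `database` object here is hypothetical):

import logging

handler = PersistentLogHandler(capacity=5000, flush_interval=5)
handler.setFormatter(logging.Formatter("%(message)s"))
logging.getLogger().addHandler(handler)

# later, once the database layer is up:
# handler.set_database(database)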

View File

@@ -0,0 +1,3 @@
from .crash_recovery import CrashRecovery
__all__ = ["CrashRecovery"]

View File

@@ -0,0 +1,290 @@
import os
import platform
import shutil
import sqlite3
import sys
import traceback
import psutil
import RNS
class CrashRecovery:
"""A diagnostic utility that intercepts application crashes and provides
meaningful error reports and system state analysis.
"""
def __init__(
self,
storage_dir=None,
database_path=None,
public_dir=None,
reticulum_config_dir=None,
):
self.storage_dir = storage_dir
self.database_path = database_path
self.public_dir = public_dir
self.reticulum_config_dir = reticulum_config_dir
self.enabled = True
# Check environment variable to allow disabling the recovery system
env_val = os.environ.get("MESHCHAT_NO_CRASH_RECOVERY", "").lower()
if env_val in ("true", "1", "yes", "on"):
self.enabled = False
def install(self):
"""Installs the crash recovery exception hook into the system."""
if not self.enabled:
return
sys.excepthook = self.handle_exception
def disable(self):
"""Disables the crash recovery system manually."""
self.enabled = False
def update_paths(
self,
storage_dir=None,
database_path=None,
public_dir=None,
reticulum_config_dir=None,
):
"""Updates the internal paths used for system diagnosis."""
if storage_dir:
self.storage_dir = storage_dir
if database_path:
self.database_path = database_path
if public_dir:
self.public_dir = public_dir
if reticulum_config_dir:
self.reticulum_config_dir = reticulum_config_dir
def handle_exception(self, exc_type, exc_value, exc_traceback):
"""Intercepts unhandled exceptions to provide a detailed diagnosis report."""
# Let keyboard interrupts pass through normally
if issubclass(exc_type, KeyboardInterrupt):
sys.__excepthook__(exc_type, exc_value, exc_traceback)
return
# Use stderr for everything to ensure correct ordering in logs and console
out = sys.stderr
# Print visual separator
out.write("\n" + "=" * 70 + "\n")
out.write("!!! APPLICATION CRASH DETECTED !!!\n")
out.write("=" * 70 + "\n")
out.write("\nError Summary:\n")
out.write(f" Type: {exc_type.__name__}\n")
out.write(f" Message: {exc_value}\n")
out.write("\nSystem Environment Diagnosis:\n")
try:
self.run_diagnosis(file=out)
except Exception as e:
out.write(f" [ERROR] Failed to complete diagnosis: {e}\n")
out.write("\nTechnical Traceback:\n")
traceback.print_exception(exc_type, exc_value, exc_traceback, file=out)
out.write("\n" + "=" * 70 + "\n")
out.write("Recovery Suggestions:\n")
out.write(" 1. Review the 'System Environment Diagnosis' section above.\n")
out.write(
" 2. Verify that all dependencies are installed (poetry install or pip install -r requirements.txt).\n",
)
out.write(
" 3. If database corruption is suspected, try starting with --auto-recover.\n",
)
out.write(
" 4. If the issue persists, report it to Ivan over another LXMF client: 7cc8d66b4f6a0e0e49d34af7f6077b5a\n",
)
out.write("=" * 70 + "\n\n")
out.flush()
# Exit with error code
sys.exit(1)
def run_diagnosis(self, file=sys.stderr):
"""Performs a series of OS-agnostic checks on the application's environment."""
# Basic System Info
file.write(
f"- OS: {platform.system()} {platform.release()} ({platform.machine()})\n",
)
file.write(f"- Python: {sys.version.split()[0]}\n")
# Resource Monitoring
try:
mem = psutil.virtual_memory()
file.write(
f"- Memory: {mem.percent}% used ({mem.available / (1024**2):.1f} MB available)\n",
)
if mem.percent > 95:
file.write(" [CRITICAL] System memory is dangerously low!\n")
except Exception:
pass
# Filesystem Status
if self.storage_dir:
file.write(f"- Storage Path: {self.storage_dir}\n")
if not os.path.exists(self.storage_dir):
file.write(
" [ERROR] Storage path does not exist. Check MESHCHAT_STORAGE_DIR.\n",
)
else:
if not os.access(self.storage_dir, os.W_OK):
file.write(
" [ERROR] Storage path is NOT writable. Check filesystem permissions.\n",
)
try:
usage = shutil.disk_usage(self.storage_dir)
free_mb = usage.free / (1024**2)
file.write(f" - Disk Space: {free_mb:.1f} MB free\n")
if free_mb < 50:
file.write(
" [CRITICAL] Disk space is critically low (< 50MB)!\n",
)
except Exception:
pass
# Database Integrity
if self.database_path:
file.write(f"- Database: {self.database_path}\n")
if os.path.exists(self.database_path):
if os.path.getsize(self.database_path) == 0:
file.write(
" [WARNING] Database file exists but is empty (0 bytes).\n",
)
else:
try:
# Open in read-only mode for safety during crash handling
conn = sqlite3.connect(
f"file:{self.database_path}?mode=ro",
uri=True,
)
cursor = conn.cursor()
cursor.execute("PRAGMA integrity_check")
res = cursor.fetchone()[0]
if res != "ok":
file.write(
f" [ERROR] Database corruption detected: {res}\n",
)
else:
file.write(" - Integrity: OK\n")
conn.close()
except sqlite3.DatabaseError as e:
file.write(
f" [ERROR] Database is unreadable or not a SQLite file: {e}\n",
)
except Exception as e:
file.write(f" [ERROR] Database check failed: {e}\n")
else:
file.write(" - Database: File not yet created\n")
# Frontend Assets
if self.public_dir:
file.write(f"- Frontend Assets: {self.public_dir}\n")
if not os.path.exists(self.public_dir):
file.write(
" [ERROR] Frontend directory is missing. Web interface will fail to load.\n",
)
else:
index_path = os.path.join(self.public_dir, "index.html")
if not os.path.exists(index_path):
file.write(
" [ERROR] index.html not found in frontend directory!\n",
)
else:
file.write(" - Frontend Status: Assets verified\n")
# Reticulum Status
self.run_reticulum_diagnosis(file=file)
def run_reticulum_diagnosis(self, file=sys.stderr):
"""Diagnoses the Reticulum Network Stack environment."""
file.write("- Reticulum Network Stack:\n")
# Check config directory
config_dir = self.reticulum_config_dir or RNS.Reticulum.configpath
file.write(f" - Config Directory: {config_dir}\n")
if not os.path.exists(config_dir):
file.write(" [ERROR] Reticulum config directory does not exist.\n")
return
config_file = os.path.join(config_dir, "config")
if not os.path.exists(config_file):
file.write(" [ERROR] Reticulum config file is missing.\n")
else:
try:
# Basic config validation
with open(config_file) as f:
content = f.read()
if "[reticulum]" not in content:
file.write(
" [ERROR] Reticulum config file is invalid (missing [reticulum] section).\n",
)
else:
file.write(" - Config File: OK\n")
except Exception as e:
file.write(f" [ERROR] Could not read Reticulum config: {e}\n")
# Extract recent RNS log entries if possible
# Check common log file locations
log_paths = [
os.path.join(config_dir, "logfile"),
os.path.join(config_dir, "rnsd.log"),
"/var/log/rnsd.log",
]
found_logs = False
for logfile in log_paths:
if os.path.exists(logfile):
file.write(f" - Recent Log Entries ({logfile}):\n")
try:
with open(logfile) as f:
lines = f.readlines()
if not lines:
file.write(" (Log file is empty)\n")
else:
for line in lines[-15:]:
if "ERROR" in line or "CRITICAL" in line:
file.write(f" > [ALERT] {line.strip()}\n")
else:
file.write(f" > {line.strip()}\n")
found_logs = True
break # Stop at first found log file
except Exception as e:
file.write(f" [ERROR] Could not read logfile: {e}\n")
if not found_logs:
file.write(" - Logs: No RNS log files found in standard locations.\n")
# Check for interfaces and transport status
try:
# Try to get more info from RNS if it's already running
if hasattr(RNS.Transport, "interfaces") and RNS.Transport.interfaces:
file.write(f" - Active Interfaces: {len(RNS.Transport.interfaces)}\n")
for iface in RNS.Transport.interfaces:
status = "Active" if iface.online else "Offline"
file.write(f" > {iface} [{status}]\n")
else:
file.write(
" - Active Interfaces: None registered (Reticulum may not be initialized yet)\n",
)
except Exception:
pass
# Check for common port conflicts
common_ports = [4242, 8000, 8080] # Reticulum default is often 4242
try:
    # Scan the connection table once, guarding against sockets with no local address
    for conn in psutil.net_connections():
        if (
            conn.laddr
            and conn.laddr.port in common_ports
            and conn.status == "LISTEN"
        ):
            file.write(
                f"  [ALERT] Port {conn.laddr.port} is already in use by PID {conn.pid}. Potential conflict.\n",
            )
except Exception:
    pass
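# A minimal invocation sketch, assuming `handler` is an instance of the
# surrounding diagnosis class with storage_dir, database_path and public_dir set:
# handler.run_diagnosis(file=sys.stderr)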

View File

@@ -38,7 +38,7 @@ class RingtoneManager:
filename = f"ringtone_{secrets.token_hex(8)}.opus"
opus_path = os.path.join(self.storage_dir, filename)
subprocess.run(
subprocess.run( # noqa: S603
[
self.ffmpeg_path,
"-i",

View File

@@ -0,0 +1,139 @@
import RNS
class RNPathHandler:
def __init__(self, reticulum_instance: RNS.Reticulum):
self.reticulum = reticulum_instance
def get_path_table(
self,
max_hops: int = None,
search: str = None,
interface: str = None,
hops: int = None,
page: int = 1,
limit: int = 0,
):
table = self.reticulum.get_path_table(max_hops=max_hops)
formatted_table = []
for entry in table:
# Pull additional data directly from Transport.path_table when available
# to expose richer per-path stats.
dst_hash = entry["hash"]
announce_hash = None
state = RNS.Transport.STATE_UNKNOWN
if dst_hash in RNS.Transport.path_table:
pt_entry = RNS.Transport.path_table[dst_hash]
if len(pt_entry) > 6:
announce_hash = pt_entry[6].hex() if pt_entry[6] else None
if dst_hash in RNS.Transport.path_states:
state = RNS.Transport.path_states[dst_hash]
# Filtering
if search:
search = search.lower()
hash_str = entry["hash"].hex().lower()
via_str = entry["via"].hex().lower()
if search not in hash_str and search not in via_str:
continue
if interface and entry["interface"] != interface:
continue
if hops is not None and entry["hops"] != hops:
continue
formatted_table.append(
{
"hash": entry["hash"].hex(),
"hops": entry["hops"],
"via": entry["via"].hex(),
"interface": entry["interface"],
"expires": entry["expires"],
"timestamp": entry.get("timestamp"),
"announce_hash": announce_hash,
"state": state,
},
)
# Sort: Responsive first, then by hops, then by interface
formatted_table.sort(
key=lambda e: (
0 if e["state"] == RNS.Transport.STATE_RESPONSIVE else 1,
e["hops"],
e["interface"],
),
)
total = len(formatted_table)
responsive_count = len(
[
e
for e in formatted_table
if e["state"] == RNS.Transport.STATE_RESPONSIVE
],
)
unresponsive_count = len(
[
e
for e in formatted_table
if e["state"] == RNS.Transport.STATE_UNRESPONSIVE
],
)
# Pagination
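# e.g. page=2, limit=25 selects entries 25..49: start = (2 - 1) * 25 = 25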
if limit > 0:
start = (page - 1) * limit
end = start + limit
formatted_table = formatted_table[start:end]
return {
"table": formatted_table,
"total": total,
"responsive": responsive_count,
"unresponsive": unresponsive_count,
"page": page,
"limit": limit,
}
def get_rate_table(self):
table = self.reticulum.get_rate_table()
formatted_table = [
{
"hash": entry["hash"].hex(),
"last": entry["last"],
"timestamps": entry["timestamps"],
"rate_violations": entry["rate_violations"],
"blocked_until": entry["blocked_until"],
}
for entry in table
]
return sorted(formatted_table, key=lambda e: e["last"])
def drop_path(self, destination_hash: str) -> bool:
try:
dest_bytes = bytes.fromhex(destination_hash)
return self.reticulum.drop_path(dest_bytes)
except Exception:
return False
def drop_all_via(self, transport_instance_hash: str) -> bool:
try:
ti_bytes = bytes.fromhex(transport_instance_hash)
return self.reticulum.drop_all_via(ti_bytes)
except Exception:
return False
def drop_announce_queues(self):
self.reticulum.drop_announce_queues()
return True
def request_path(self, destination_hash: str):
try:
dest_bytes = bytes.fromhex(destination_hash)
RNS.Transport.request_path(dest_bytes)
return True
except Exception:
return False
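# A minimal usage sketch; RNS.Reticulum() assumes a valid local Reticulum config:
if __name__ == "__main__":
    reticulum = RNS.Reticulum()
    handler = RNPathHandler(reticulum)
    # First page of up to 25 paths; responsive entries sort first
    result = handler.get_path_table(page=1, limit=25)
    print(f"{result['responsive']}/{result['total']} paths responsive")
    for entry in result["table"]:
        print(entry["hash"], entry["hops"], "via", entry["via"])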

View File

@@ -0,0 +1,100 @@
import asyncio
import time
import traceback
import RNS
class RNPathTraceHandler:
def __init__(self, reticulum_instance, identity):
self.reticulum = reticulum_instance
self.identity = identity
async def trace_path(self, destination_hash_str):
try:
try:
destination_hash = bytes.fromhex(destination_hash_str)
except Exception:
return {"error": "Invalid destination hash"}
# Request path if we don't have it
if not RNS.Transport.has_path(destination_hash):
RNS.Transport.request_path(destination_hash)
timeout = 10
start_time = time.time()
while (
not RNS.Transport.has_path(destination_hash)
and time.time() - start_time < timeout
):
await asyncio.sleep(0.2)
if not RNS.Transport.has_path(destination_hash):
return {"error": "Path not found after timeout"}
hops = RNS.Transport.hops_to(destination_hash)
next_hop_bytes = None
next_hop_interface = None
if self.reticulum:
try:
next_hop_bytes = self.reticulum.get_next_hop(destination_hash)
next_hop_interface = self.reticulum.get_next_hop_if_name(
destination_hash,
)
except Exception as e:
print(f"Error calling reticulum methods: {e}")
path = []
# Start the path with ourselves (the local node)
local_hash = "unknown"
if self.identity and hasattr(self.identity, "hash"):
local_hash = self.identity.hash.hex()
elif (
self.reticulum
and hasattr(self.reticulum, "identity")
and self.reticulum.identity
):
local_hash = self.reticulum.identity.hash.hex()
path.append({"type": "local", "hash": local_hash, "name": "Local Node"})
if hops == 1:
# Direct
path.append(
{
"type": "destination",
"hash": destination_hash_str,
"hops": 1,
"interface": next_hop_interface,
},
)
elif hops > 1:
# Next hop
path.append(
{
"type": "hop",
"hash": next_hop_bytes.hex() if next_hop_bytes else None,
"name": "Next Hop",
"interface": next_hop_interface,
"hop_number": 1,
},
)
# Intermediate unknown hops
if hops > 2:
path.append({"type": "unknown", "count": hops - 2})
# Destination
path.append(
{"type": "destination", "hash": destination_hash_str, "hops": hops},
)
return {
"destination": destination_hash_str,
"hops": hops,
"path": path,
"interface": next_hop_interface,
"next_hop": next_hop_bytes.hex() if next_hop_bytes else None,
}
except Exception as e:
return {"error": f"Trace failed: {e}\n{traceback.format_exc()}"}

View File

@@ -1,6 +1,8 @@
import time
from typing import Any
import RNS
def size_str(num, suffix="B"):
units = ["", "K", "M", "G", "T", "P", "E", "Z"]
@@ -53,6 +55,19 @@ class RNStatusHandler:
"link_count": link_count,
}
blackhole_enabled = False
blackhole_sources = []
blackhole_count = 0
try:
blackhole_enabled = RNS.Reticulum.publish_blackhole_enabled()
blackhole_sources = [s.hex() for s in RNS.Reticulum.blackhole_sources()]
# Get count of blackholed identities
if self.reticulum and hasattr(self.reticulum, "get_blackholed_identities"):
blackhole_count = len(self.reticulum.get_blackholed_identities())
except Exception:
pass
interfaces = stats.get("interfaces", [])
if sorting and isinstance(sorting, str):
@@ -211,4 +226,7 @@ class RNStatusHandler:
"interfaces": formatted_interfaces,
"link_count": link_count,
"timestamp": time.time(),
"blackhole_enabled": blackhole_enabled,
"blackhole_sources": blackhole_sources,
"blackhole_count": blackhole_count,
}

View File

@@ -70,7 +70,7 @@ class Telemeter:
struct.pack("!I", int(round(speed, 2) * 1e2)),
struct.pack("!i", int(round(bearing, 2) * 1e2)),
struct.pack("!H", int(round(accuracy, 2) * 1e2)),
int(last_update or time.time()),
int(last_update) if last_update is not None else int(time.time()),
]
except Exception:
return None
@@ -84,15 +84,33 @@ class Telemeter:
res["time"] = {"utc": p[Sensor.SID_TIME]}
if Sensor.SID_LOCATION in p:
res["location"] = Telemeter.unpack_location(p[Sensor.SID_LOCATION])
if Sensor.SID_PHYSICAL_LINK in p:
pl = p[Sensor.SID_PHYSICAL_LINK]
if isinstance(pl, (list, tuple)) and len(pl) >= 3:
res["physical_link"] = {"rssi": pl[0], "snr": pl[1], "q": pl[2]}
if Sensor.SID_BATTERY in p:
b = p[Sensor.SID_BATTERY]
if isinstance(b, (list, tuple)) and len(b) >= 2:
res["battery"] = {"charge_percent": b[0], "charging": b[1]}
# Add other sensors as needed
return res
except Exception:
return None
@staticmethod
def pack(time_utc=None, location=None):
def pack(time_utc=None, location=None, battery=None, physical_link=None):
p = {}
p[Sensor.SID_TIME] = int(time_utc or time.time())
if location:
p[Sensor.SID_LOCATION] = Telemeter.pack_location(**location)
if battery:
# battery should be [charge_percent, charging]
p[Sensor.SID_BATTERY] = [battery["charge_percent"], battery["charging"]]
if physical_link:
# physical_link should be [rssi, snr, q]
p[Sensor.SID_PHYSICAL_LINK] = [
physical_link["rssi"],
physical_link["snr"],
physical_link["q"],
]
return umsgpack.packb(p)
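# A minimal packing sketch with placeholder values; location is omitted here
# since pack_location's keyword arguments are defined elsewhere in this file:
def _example_pack():
    return Telemeter.pack(
        time_utc=1700000000,
        battery={"charge_percent": 87, "charging": False},
        physical_link={"rssi": -71, "snr": 9.5, "q": 90},
    )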

View File

@@ -1,10 +1,35 @@
import asyncio
import base64
import os
import time
import RNS
from LXST import Telephone
class Tee:
def __init__(self, sink):
self.sinks = [sink]
def add_sink(self, sink):
if sink not in self.sinks:
self.sinks.append(sink)
def remove_sink(self, sink):
if sink in self.sinks:
self.sinks.remove(sink)
def handle_frame(self, frame, source):
for sink in self.sinks:
try:
sink.handle_frame(frame, source)
except Exception as e:
RNS.log(f"Tee: Error in sink handle_frame: {e}", RNS.LOG_ERROR)
def can_receive(self, from_source=None):
return any(sink.can_receive(from_source) for sink in self.sinks)
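# A minimal fan-out sketch, assuming speaker_sink and recorder_sink are any
# LXST-style sinks implementing handle_frame/can_receive:
def _example_tee(speaker_sink, recorder_sink, frame, source):
    tee = Tee(speaker_sink)  # speaker remains the primary sink
    tee.add_sink(recorder_sink)  # recorder gets a copy of every frame
    if tee.can_receive(from_source=source):
        tee.handle_frame(frame, source)  # fans out to both sinks
    tee.remove_sink(recorder_sink)  # detach when recording stops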
class TelephoneManager:
# LXST Status Constants for reference:
# 0: STATUS_BUSY
@@ -15,9 +40,24 @@ class TelephoneManager:
# 5: STATUS_CONNECTING
# 6: STATUS_ESTABLISHED
def __init__(self, identity: RNS.Identity, config_manager=None):
def __init__(
self,
identity: RNS.Identity,
config_manager=None,
storage_dir=None,
db=None,
):
self.identity = identity
self.config_manager = config_manager
self.storage_dir = storage_dir
self.db = db
self.get_name_for_identity_hash = None
self.recordings_dir = (
os.path.join(storage_dir, "recordings") if storage_dir else None
)
if self.recordings_dir:
os.makedirs(self.recordings_dir, exist_ok=True)
self.telephone = None
self.on_ringing_callback = None
self.on_established_callback = None
@@ -26,6 +66,23 @@ class TelephoneManager:
self.call_start_time = None
self.call_status_at_end = None
self.call_is_incoming = False
self.call_was_established = False
# Manual mute overrides in case LXST internal muting is buggy
self.transmit_muted = False
self.receive_muted = False
self.initiation_status = None
self.initiation_target_hash = None
self.on_initiation_status_callback = None
@property
def is_recording(self):
# Check if voicemail manager or this manager is recording
# This is a bit of a hack: we have no direct reference to the voicemail
# manager here, so we always return False and let meshchat.py report the
# combined recording status.
return False
def init_telephone(self):
if self.telephone is not None:
@@ -34,6 +91,14 @@ class TelephoneManager:
self.telephone = Telephone(self.identity)
# Disable busy tone played on caller side when remote side rejects, or doesn't answer
self.telephone.set_busy_tone_time(0)
# Increase connection timeout for slower networks
self.telephone.set_connect_timeout(30)
# Set initial profile from config
if self.config_manager:
profile_id = self.config_manager.telephone_audio_profile_id.get()
self.telephone.switch_profile(profile_id)
self.telephone.set_ringing_callback(self.on_telephone_ringing)
self.telephone.set_established_callback(self.on_telephone_call_established)
self.telephone.set_ended_callback(self.on_telephone_call_ended)
@@ -43,6 +108,16 @@ class TelephoneManager:
self.telephone.teardown()
self.telephone = None
def hangup(self):
if self.telephone:
try:
self.telephone.hangup()
except Exception as e:
RNS.log(f"TelephoneManager: Error during hangup: {e}", RNS.LOG_ERROR)
# Always clear initiation status on hangup to prevent "Dialing..." hang
self._update_initiation_status(None, None)
def register_ringing_callback(self, callback):
self.on_ringing_callback = callback
@@ -53,14 +128,29 @@ class TelephoneManager:
self.on_ended_callback = callback
def on_telephone_ringing(self, caller_identity: RNS.Identity):
if self.initiation_status:
# This is an outgoing call where the remote side is now ringing.
# We update the initiation status to "Ringing..." for the UI.
self._update_initiation_status("Ringing...")
return
self.call_start_time = time.time()
self.call_is_incoming = True
self.call_was_established = False
if self.on_ringing_callback:
self.on_ringing_callback(caller_identity)
def on_telephone_call_established(self, caller_identity: RNS.Identity):
# Update start time to when it was actually established for duration calculation
self.call_start_time = time.time()
self.call_was_established = True
# Track per-call stats from the active link (uses RNS Link counters)
link = getattr(self.telephone, "active_call", None)
self.call_stats = {
"link": link,
}
if self.on_established_callback:
self.on_established_callback(caller_identity)
@@ -69,37 +159,316 @@ class TelephoneManager:
if self.telephone:
self.call_status_at_end = self.telephone.call_status
# Ensure initiation status is cleared when call ends
self._update_initiation_status(None, None)
if self.on_ended_callback:
self.on_ended_callback(caller_identity)
def announce(self, attached_interface=None):
def start_recording(self):
# Disabled for now, as LXST does not provide a native Tee to record through
pass
def stop_recording(self):
# Disabled for now
pass
def announce(self, attached_interface=None, display_name=None):
if self.telephone:
self.telephone.announce(attached_interface=attached_interface)
if display_name:
import RNS.vendor.umsgpack as msgpack
# Pack display name in LXMF-compatible app data format
app_data = msgpack.packb([display_name, None, None])
self.telephone.destination.announce(
app_data=app_data,
attached_interface=attached_interface,
)
self.telephone.last_announce = time.time()
else:
self.telephone.announce(attached_interface=attached_interface)
def _update_initiation_status(self, status, target_hash=None):
self.initiation_status = status
if target_hash is not None or status is None:
self.initiation_target_hash = target_hash
if self.on_initiation_status_callback:
try:
self.on_initiation_status_callback(
self.initiation_status,
self.initiation_target_hash,
)
except Exception as e:
RNS.log(
f"TelephoneManager: Error in initiation status callback: {e}",
RNS.LOG_ERROR,
)
async def initiate(self, destination_hash: bytes, timeout_seconds: int = 15):
if self.telephone is None:
msg = "Telephone is not initialized"
raise RuntimeError(msg)
# Find destination identity
destination_identity = RNS.Identity.recall(destination_hash)
if destination_identity is None:
# If not found by identity hash, try as destination hash
destination_identity = RNS.Identity.recall(
destination_hash,
) # Identity.recall takes identity hash
if destination_identity is None:
msg = "Destination identity not found"
if self.telephone.busy or self.initiation_status:
msg = "Telephone is already in use"
raise RuntimeError(msg)
# In LXST, we just call the identity. Telephone class handles path requests.
# But we might want to ensure a path exists first for better UX,
# similar to how the old MeshChat did it.
destination_hash_hex = destination_hash.hex()
self._update_initiation_status("Resolving identity...", destination_hash_hex)
# For now, let's just use the telephone.call method which is threaded.
# We need to run it in a thread since it might block.
self.call_start_time = time.time()
self.call_is_incoming = False
await asyncio.to_thread(self.telephone.call, destination_identity)
return self.telephone.active_call
try:
def resolve_identity(target_hash_hex):
"""Resolve identity from multiple hints: direct recall, destination_hash announce, identity_hash announce, or public key."""
target_hash = bytes.fromhex(target_hash_hex)
# 1) Direct recall (identity hash)
ident = RNS.Identity.recall(target_hash)
if ident:
return ident
if not self.db:
return None
# 2) By destination_hash (could be lxst.telephony or lxmf.delivery hash)
announce = self.db.announces.get_announce_by_hash(target_hash_hex)
if not announce:
# 3) By identity_hash field (if user entered identity hash but we missed recall, or other announce types)
announces = self.db.announces.get_filtered_announces(
identity_hash=target_hash_hex,
)
if announces:
announce = announces[0]
if not announce:
return None
# Try identity_hash from announce
identity_hex = announce.get("identity_hash")
if identity_hex:
ident = RNS.Identity.recall(bytes.fromhex(identity_hex))
if ident:
return ident
# Try reconstructing from public key
if announce.get("identity_public_key"):
try:
return RNS.Identity.from_bytes(
base64.b64decode(announce["identity_public_key"]),
)
except Exception:
pass
return None
# Find destination identity
destination_identity = resolve_identity(destination_hash_hex)
if destination_identity is None:
self._update_initiation_status("Discovering path/identity...")
RNS.Transport.request_path(destination_hash)
# Wait for identity to appear
start_wait = time.time()
while time.time() - start_wait < timeout_seconds:
if not self.initiation_status: # Externally cancelled (hangup)
return None
await asyncio.sleep(0.5)
destination_identity = resolve_identity(destination_hash_hex)
if destination_identity:
break
if destination_identity is None:
self._update_initiation_status(None, None)
msg = "Destination identity not found"
raise RuntimeError(msg)
if not RNS.Transport.has_path(destination_hash):
self._update_initiation_status("Requesting path...")
RNS.Transport.request_path(destination_hash)
# Wait up to 10s for path discovery
path_wait_start = time.time()
while time.time() - path_wait_start < min(timeout_seconds, 10):
if not self.initiation_status: # Externally cancelled
return None
if RNS.Transport.has_path(destination_hash):
break
await asyncio.sleep(0.5)
self._update_initiation_status("Establishing link...", destination_hash_hex)
self.call_start_time = time.time()
self.call_is_incoming = False
# Use a thread for the blocking LXST call, but monitor status for early exit
# if established elsewhere or timed out/hung up
call_task = asyncio.create_task(
asyncio.to_thread(self.telephone.call, destination_identity),
)
start_wait = time.time()
# LXST telephone.call usually returns on establishment or timeout.
# We wait for it, but if status becomes established or ended, we can stop waiting.
while not call_task.done():
if not self.initiation_status: # Externally cancelled
break
# Update UI status based on current call state
if self.telephone.call_status == 2:
self._update_initiation_status("Calling...", destination_hash_hex)
elif self.telephone.call_status == 4:
self._update_initiation_status("Ringing...", destination_hash_hex)
elif self.telephone.call_status == 5:
self._update_initiation_status(
"Establishing link...",
destination_hash_hex,
)
if self.telephone.call_status in [
6,
0,
1,
]: # Established, Busy, Rejected
break
if self.telephone.call_status == 3 and (
time.time() - start_wait > 1.0
): # Available (ended/timeout)
break
await asyncio.sleep(0.5)
# If the task finished but we're still ringing or connecting,
# wait a bit more for establishment or definitive failure
if self.initiation_status and self.telephone.call_status in [
2,
4,
5,
]: # Calling, Ringing, Connecting
wait_until = time.time() + timeout_seconds
while time.time() < wait_until:
if not self.initiation_status: # Externally cancelled
break
if self.telephone.call_status == 2:
self._update_initiation_status(
"Calling...",
destination_hash_hex,
)
elif self.telephone.call_status == 4:
self._update_initiation_status(
"Ringing...",
destination_hash_hex,
)
elif self.telephone.call_status == 5:
self._update_initiation_status(
"Establishing link...",
destination_hash_hex,
)
if self.telephone.call_status in [
6,
0,
1,
3,
]: # Established, Busy, Rejected, Ended
break
await asyncio.sleep(0.5)
return self.telephone.active_call
except Exception as e:
self._update_initiation_status(f"Failed: {e!s}")
await asyncio.sleep(3)
raise
finally:
# Wait for either establishment, failure, or a timeout
# to ensure the UI has something to show (either active_call or initiation_status)
for _ in range(20): # Max 10 seconds of defensive waiting
if self.telephone and (
self.telephone.active_call
or self.telephone.call_status in [0, 1, 3, 6]
):
break
await asyncio.sleep(0.5)
# If call was successful, keep status for a moment to prevent UI flicker
# while the frontend picks up the new active_call state
if self.telephone and (
(self.telephone.active_call and self.telephone.call_status == 6)
or self.telephone.call_status in [2, 4, 5]
):
await asyncio.sleep(2.0)
self._update_initiation_status(None, None)
def mute_transmit(self):
if self.telephone:
# Manual override as LXST internal muting can be buggy
if hasattr(self.telephone, "audio_input") and self.telephone.audio_input:
try:
self.telephone.audio_input.stop()
except Exception as e:
RNS.log(f"Failed to stop audio input for mute: {e}", RNS.LOG_ERROR)
# Still call the internal method just in case it does something useful
try:
self.telephone.mute_transmit()
except Exception: # noqa: S110
pass
self.transmit_muted = True
def unmute_transmit(self):
if self.telephone:
# Manual override as LXST internal muting can be buggy
if hasattr(self.telephone, "audio_input") and self.telephone.audio_input:
try:
self.telephone.audio_input.start()
except Exception as e:
RNS.log(
f"Failed to start audio input for unmute: {e}",
RNS.LOG_ERROR,
)
# Still call the internal method just in case
try:
self.telephone.unmute_transmit()
except Exception: # noqa: S110
pass
self.transmit_muted = False
def mute_receive(self):
if self.telephone:
# Manual override as LXST internal muting can be buggy
if hasattr(self.telephone, "audio_output") and self.telephone.audio_output:
try:
self.telephone.audio_output.stop()
except Exception as e:
RNS.log(f"Failed to stop audio output for mute: {e}", RNS.LOG_ERROR)
# Still call the internal method just in case
try:
self.telephone.mute_receive()
except Exception: # noqa: S110
pass
self.receive_muted = True
def unmute_receive(self):
if self.telephone:
# Manual override as LXST internal muting can be buggy
if hasattr(self.telephone, "audio_output") and self.telephone.audio_output:
try:
self.telephone.audio_output.start()
except Exception as e:
RNS.log(
f"Failed to start audio output for unmute: {e}",
RNS.LOG_ERROR,
)
# Still call the internal method just in case
try:
self.telephone.unmute_receive()
except Exception: # noqa: S110
pass
self.receive_muted = False

View File

@@ -64,7 +64,8 @@ LANGUAGE_CODE_TO_NAME = {
class TranslatorHandler:
def __init__(self, libretranslate_url: str | None = None):
def __init__(self, libretranslate_url: str | None = None, enabled: bool = False):
self.enabled = enabled
self.libretranslate_url = libretranslate_url or os.getenv(
"LIBRETRANSLATE_URL",
"http://localhost:5000",
@@ -76,6 +77,9 @@ class TranslatorHandler:
def get_supported_languages(self, libretranslate_url: str | None = None):
languages = []
if not self.enabled:
return languages
url = libretranslate_url or self.libretranslate_url
if self.has_requests:
@@ -131,6 +135,10 @@ class TranslatorHandler:
use_argos: bool = False,
libretranslate_url: str | None = None,
) -> dict[str, Any]:
if not self.enabled:
msg = "Translator is disabled"
raise RuntimeError(msg)
if not text:
msg = "Text cannot be empty"
raise ValueError(msg)

View File

@@ -38,7 +38,7 @@ class VoicemailManager:
self.on_new_voicemail_callback = None
# stabilization delay for voicemail greeting
self.STABILIZATION_DELAY = 2.5
self.STABILIZATION_DELAY = 1.0
# Paths to executables
self.espeak_path = self._find_espeak()
@@ -141,8 +141,34 @@ class VoicemailManager:
wav_path = os.path.join(self.greetings_dir, "greeting.wav")
try:
# espeak-ng to WAV
subprocess.run([self.espeak_path, "-w", wav_path, text], check=True)
# espeak-ng to WAV with improved parameters
speed = str(self.config.voicemail_tts_speed.get())
pitch = str(self.config.voicemail_tts_pitch.get())
voice = self.config.voicemail_tts_voice.get()
gap = str(self.config.voicemail_tts_word_gap.get())
cmd = [
self.espeak_path,
"-s",
speed,
"-p",
pitch,
"-g",
gap,
"-k",
"10",
"-v",
voice,
"-w",
wav_path,
text,
]
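# espeak-ng flags used above: -s speaking rate (wpm), -p pitch (0-99),
# -g inter-word gap, -k capital-letter emphasis, -v voice, -w write WAV output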
RNS.log(
f"Voicemail: Generating greeting with command: {' '.join(cmd)}",
RNS.LOG_DEBUG,
)
subprocess.run(cmd, check=True) # noqa: S603
# Convert WAV to Opus
return self.convert_to_greeting(wav_path)
@@ -160,7 +186,7 @@ class VoicemailManager:
if os.path.exists(opus_path):
os.remove(opus_path)
subprocess.run(
subprocess.run( # noqa: S603
[
self.ffmpeg_path,
"-i",
@@ -169,6 +195,10 @@ class VoicemailManager:
"libopus",
"-b:a",
"16k",
"-ar",
"48000",
"-ac",
"1",
"-vbr",
"on",
opus_path,
@@ -214,11 +244,16 @@ class VoicemailManager:
RNS.LOG_DEBUG,
)
active_call_remote_identity = (
telephone.active_call.get_remote_identity()
if (telephone and telephone.active_call)
else None
)
if (
telephone
and telephone.active_call
and telephone.active_call.get_remote_identity().hash
== caller_identity.hash
and active_call_remote_identity
and active_call_remote_identity.hash == caller_identity.hash
and telephone.call_status == 4 # Ringing
):
RNS.log(
@@ -232,10 +267,17 @@ class VoicemailManager:
RNS.LOG_DEBUG,
)
if telephone.active_call:
RNS.log(
f"Voicemail: Active call remote: {RNS.prettyhexrep(telephone.active_call.get_remote_identity().hash)}",
RNS.LOG_DEBUG,
)
remote_identity = telephone.active_call.get_remote_identity()
if remote_identity:
RNS.log(
f"Voicemail: Active call remote: {RNS.prettyhexrep(remote_identity.hash)}",
RNS.LOG_DEBUG,
)
else:
RNS.log(
"Voicemail: Active call remote identity not found",
RNS.LOG_DEBUG,
)
threading.Thread(target=voicemail_job, daemon=True).start()
@@ -244,13 +286,24 @@ class VoicemailManager:
if not telephone:
return
# Answer the call
if not telephone.answer(caller_identity):
# Answer the call if it's still ringing
if telephone.call_status == 4: # STATUS_RINGING
if not telephone.answer(caller_identity):
RNS.log("Voicemail: Failed to answer call", RNS.LOG_ERROR)
return
elif telephone.call_status != 6: # STATUS_ESTABLISHED
RNS.log(
f"Voicemail: Cannot start session, call status is {telephone.call_status}",
RNS.LOG_DEBUG,
)
return
# Stop microphone if it's active to prevent local noise being sent or recorded
if telephone.audio_input:
telephone.audio_input.stop()
try:
telephone.audio_input.stop()
except Exception:
pass
# Play greeting
greeting_path = os.path.join(self.greetings_dir, "greeting.opus")
@@ -271,6 +324,13 @@ class VoicemailManager:
)
def session_job():
prev_receive_muted = self.telephone_manager.receive_muted
try:
# Prevent remote audio from playing locally while recording voicemail
self.telephone_manager.mute_receive()
except Exception:
pass
try:
# Wait for link to stabilize
RNS.log(
@@ -290,7 +350,8 @@ class VoicemailManager:
if os.path.exists(greeting_path):
try:
greeting_source = OpusFileSource(
greeting_path, target_frame_ms=60
greeting_path,
target_frame_ms=60,
)
# Attach to transmit mixer
greeting_pipeline = Pipeline(
@@ -355,6 +416,12 @@ class VoicemailManager:
RNS.log(f"Error during voicemail session: {e}", RNS.LOG_ERROR)
if self.is_recording:
self.stop_recording()
finally:
try:
if not prev_receive_muted:
self.telephone_manager.unmute_receive()
except Exception:
pass
threading.Thread(target=session_job, daemon=True).start()
@@ -369,17 +436,12 @@ class VoicemailManager:
try:
self.recording_sink = OpusFileSink(filepath)
# Ensure samplerate is set to avoid TypeError in LXST Opus codec
# which expects sink to have a valid samplerate attribute
self.recording_sink.samplerate = 48000
# Connect the caller's audio source to our sink
# active_call.audio_source is a LinkSource that feeds into receive_mixer
# We want to record what we receive.
self.recording_pipeline = Pipeline(
source=telephone.active_call.audio_source,
codec=Null(),
sink=self.recording_sink,
telephone.active_call.audio_source,
Null(),
self.recording_sink,
)
self.recording_pipeline.start()
@@ -401,26 +463,45 @@ class VoicemailManager:
try:
duration = int(time.time() - self.recording_start_time)
self.recording_pipeline.stop()
if self.recording_pipeline:
self.recording_pipeline.stop()
if self.recording_sink:
self.recording_sink.stop()
self.recording_sink = None
self.recording_pipeline = None
# Save to database if long enough
if duration >= 1:
filepath = os.path.join(self.recordings_dir, self.recording_filename)
self._fix_recording(filepath)
# If recording is missing or empty (no frames), synthesize a small silence file
if (not os.path.exists(filepath)) or os.path.getsize(filepath) == 0:
self._write_silence_file(filepath, max(duration, 1))
remote_name = self.get_name_for_identity_hash(
self.recording_remote_identity.hash.hex(),
)
self.db.voicemails.add_voicemail(
remote_identity_hash=self.recording_remote_identity.hash.hex(),
remote_identity_name=remote_name,
filename=self.recording_filename,
duration_seconds=duration,
timestamp=self.recording_start_time,
)
RNS.log(
f"Saved voicemail from {RNS.prettyhexrep(self.recording_remote_identity.hash)} ({duration}s)",
RNS.LOG_DEBUG,
)
if os.path.exists(filepath) and os.path.getsize(filepath) > 0:
self.db.voicemails.add_voicemail(
remote_identity_hash=self.recording_remote_identity.hash.hex(),
remote_identity_name=remote_name,
filename=self.recording_filename,
duration_seconds=duration,
timestamp=self.recording_start_time,
)
RNS.log(
f"Saved voicemail from {RNS.prettyhexrep(self.recording_remote_identity.hash)} ({duration}s)",
RNS.LOG_DEBUG,
)
else:
RNS.log(
f"Voicemail: Recording missing for {self.recording_filename}, skipping DB insert",
RNS.LOG_ERROR,
)
if self.on_new_voicemail_callback:
self.on_new_voicemail_callback(
@@ -444,6 +525,86 @@ class VoicemailManager:
RNS.log(f"Error stopping recording: {e}", RNS.LOG_ERROR)
self.is_recording = False
def _fix_recording(self, filepath):
"""Ensures the recording is a valid OGG/Opus file using ffmpeg."""
if not self.has_ffmpeg or not os.path.exists(filepath):
return
temp_path = filepath + ".fix"
try:
# The file may contain raw Opus packets or a slightly broken OGG container;
# re-encoding through ffmpeg re-wraps it into a valid OGG/Opus file.
cmd = [
self.ffmpeg_path,
"-y",
"-i",
filepath,
"-c:a",
"libopus",
"-b:a",
"16k",
"-ar",
"48000",
"-ac",
"1",
temp_path,
]
result = subprocess.run(cmd, capture_output=True, text=True, check=False) # noqa: S603
if result.returncode == 0 and os.path.exists(temp_path):
os.remove(filepath)
os.rename(temp_path, filepath)
RNS.log(
f"Voicemail: Fixed recording format for {filepath}",
RNS.LOG_DEBUG,
)
else:
RNS.log(
f"Voicemail: ffmpeg failed to fix {filepath}: {result.stderr}",
RNS.LOG_WARNING,
)
except Exception as e:
RNS.log(f"Voicemail: Error fixing recording {filepath}: {e}", RNS.LOG_ERROR)
finally:
if os.path.exists(temp_path):
os.remove(temp_path)
def _write_silence_file(self, filepath, seconds=1):
"""Creates a minimal OGG/Opus file with silence if recording is missing."""
if not self.has_ffmpeg:
return False
try:
cmd = [
self.ffmpeg_path,
"-y",
"-f",
"lavfi",
"-i",
"anullsrc=r=48000:cl=mono",
"-t",
str(max(1, seconds)),
"-c:a",
"libopus",
"-b:a",
"16k",
filepath,
]
result = subprocess.run(cmd, capture_output=True, text=True, check=False) # noqa: S603
if result.returncode == 0 and os.path.exists(filepath):
return True
RNS.log(
f"Voicemail: Failed to create silence file for {filepath}: {result.stderr}",
RNS.LOG_ERROR,
)
except Exception as e:
RNS.log(
f"Voicemail: Error creating silence file for {filepath}: {e}",
RNS.LOG_ERROR,
)
return False
def start_greeting_recording(self):
telephone = self.telephone_manager.telephone
if not telephone:
@@ -463,21 +624,23 @@ class VoicemailManager:
try:
self.greeting_recording_sink = OpusFileSink(
os.path.join(self.greetings_dir, "greeting.opus")
os.path.join(self.greetings_dir, "greeting.opus"),
)
self.greeting_recording_sink.samplerate = 48000
self.greeting_recording_pipeline = Pipeline(
source=telephone.audio_input,
codec=Null(),
sink=self.greeting_recording_sink,
telephone.audio_input,
Null(),
self.greeting_recording_sink,
)
self.greeting_recording_pipeline.start()
self.is_greeting_recording = True
RNS.log("Voicemail: Started recording greeting from mic", RNS.LOG_DEBUG)
except Exception as e:
RNS.log(
f"Voicemail: Failed to start greeting recording: {e}", RNS.LOG_ERROR
f"Voicemail: Failed to start greeting recording: {e}",
RNS.LOG_ERROR,
)
def stop_greeting_recording(self):
@@ -485,7 +648,12 @@ class VoicemailManager:
return
try:
self.greeting_recording_pipeline.stop()
if self.greeting_recording_pipeline:
self.greeting_recording_pipeline.stop()
if self.greeting_recording_sink:
self.greeting_recording_sink.stop()
self.greeting_recording_sink = None
self.greeting_recording_pipeline = None
self.is_greeting_recording = False

View File

@@ -0,0 +1,236 @@
import asyncio
import json
import threading
import numpy as np
import RNS
from LXST.Codecs import Null, Raw
from LXST.Mixer import Mixer
from LXST.Pipeline import Pipeline
from LXST.Sinks import LocalSink
from LXST.Sources import LocalSource
from .telephone_manager import Tee
def _log_debug(msg: str):
RNS.log(msg, RNS.LOG_DEBUG)
class WebAudioSource(LocalSource):
"""Injects PCM frames (int16 little-endian) received over websocket into the transmit mixer."""
def __init__(self, target_frame_ms: int, sink: Mixer):
self.target_frame_ms = target_frame_ms or 60
self.sink = sink
self.codec = Raw(channels=1, bitdepth=16)
self.channels = 1
self.samplerate = 48000
self.bitdepth = 16
def start(self):
# Nothing to start; frames are pushed from the websocket thread.
pass
def stop(self):
# Nothing to stop; kept for interface compatibility.
pass
def can_receive(self, from_source=None):
return True
def handle_frame(self, frame, source=None):
# Not used; frames are pushed via push_pcm.
pass
def push_pcm(self, pcm_bytes: bytes):
try:
samples = (
np.frombuffer(pcm_bytes, dtype=np.int16).astype(np.float32) / 32768.0
)
if samples.size == 0:
return
samples = samples.reshape(-1, 1)
frame = self.codec.encode(samples)
if self.sink and self.sink.can_receive(from_source=self):
self.sink.handle_frame(frame, self)
except Exception as exc: # noqa: BLE001
RNS.log(f"WebAudioSource: failed to push pcm: {exc}", RNS.LOG_ERROR)
class WebAudioSink(LocalSink):
"""Pushes received PCM frames to websocket clients."""
def __init__(self, loop: asyncio.AbstractEventLoop, send_bytes):
self.loop = loop
self.send_bytes = send_bytes
def can_receive(self, from_source=None):
return True
def handle_frame(self, frame, source):
try:
# frame is expected to be numpy float PCM from receive mixer
if hasattr(frame, "astype"):
samples = np.clip(frame, -1.0, 1.0).astype(np.float32)
pcm = (samples * 32767.0).astype(np.int16).tobytes()
else:
pcm = frame
self.loop.call_soon_threadsafe(asyncio.create_task, self.send_bytes(pcm))
except Exception as exc: # noqa: BLE001
RNS.log(f"WebAudioSink: failed to handle frame: {exc}", RNS.LOG_ERROR)
class WebAudioBridge:
"""Coordinates websocket audio transport with an active LXST telephone call."""
def __init__(self, telephone_manager, config_manager):
self.telephone_manager = telephone_manager
self.config_manager = config_manager
self.clients = set()
self.tx_source: WebAudioSource | None = None
self.rx_sink: WebAudioSink | None = None
self.rx_tee: Tee | None = None
self.loop = asyncio.get_event_loop()
self.lock = threading.Lock()
def _tele(self):
return getattr(self.telephone_manager, "telephone", None)
def config_enabled(self):
return (
self.config_manager
and hasattr(self.config_manager, "telephone_web_audio_enabled")
and self.config_manager.telephone_web_audio_enabled.get()
)
def allow_fallback(self):
return (
self.config_manager
and hasattr(self.config_manager, "telephone_web_audio_allow_fallback")
and self.config_manager.telephone_web_audio_allow_fallback.get()
)
def attach_client(self, client):
with self.lock:
self.clients.add(client)
tele = self._tele()
if not tele or not tele.active_call:
return False
self._ensure_remote_tx(tele)
self._ensure_rx_tee(tele)
return True
def detach_client(self, client):
with self.lock:
if client in self.clients:
self.clients.remove(client)
if not self.clients and self.allow_fallback():
self._restore_host_audio()
async def send_status(self, client):
tele = self._tele()
frame_ms = getattr(tele, "target_frame_time_ms", None) or 60
await client.send_str(
json.dumps(
{
"type": "web_audio.ready",
"frame_ms": frame_ms,
},
),
)
def push_client_frame(self, pcm_bytes: bytes):
with self.lock:
if not self.tx_source:
return
self.tx_source.push_pcm(pcm_bytes)
async def _send_bytes_to_all(self, pcm_bytes: bytes):
stale = []
for ws in list(self.clients):
try:
await ws.send_bytes(pcm_bytes)
except Exception:
stale.append(ws)
for ws in stale:
self.detach_client(ws)
def _ensure_remote_tx(self, tele):
# Rebuild transmit path with websocket-backed source
if self.tx_source:
return
try:
if hasattr(tele, "audio_input") and tele.audio_input:
tele.audio_input.stop()
self.tx_source = WebAudioSource(
target_frame_ms=getattr(tele, "target_frame_time_ms", 60),
sink=tele.transmit_mixer,
)
tele.audio_input = self.tx_source
if tele.transmit_mixer and not tele.transmit_mixer.should_run:
tele.transmit_mixer.start()
except Exception as exc: # noqa: BLE001
RNS.log(
f"WebAudioBridge: failed to swap transmit path: {exc}",
RNS.LOG_ERROR,
)
def _ensure_rx_tee(self, tele):
if self.rx_sink:
return
try:
send_fn = lambda pcm: self._send_bytes_to_all(pcm) # noqa: E731
self.rx_sink = WebAudioSink(self.loop, send_fn)
# Build tee with existing audio_output as first sink to preserve speaker
base_sink = tele.audio_output
self.rx_tee = Tee(base_sink) if base_sink else Tee(self.rx_sink)
if base_sink:
self.rx_tee.add_sink(self.rx_sink)
tele.audio_output = self.rx_tee
if tele.receive_pipeline:
tele.receive_pipeline.stop()
tele.receive_pipeline = Pipeline(
source=tele.receive_mixer,
codec=Null(),
sink=self.rx_tee,
)
tele.receive_pipeline.start()
except Exception as exc: # noqa: BLE001
RNS.log(f"WebAudioBridge: failed to tee receive path: {exc}", RNS.LOG_ERROR)
def _restore_host_audio(self):
tele = self._tele()
if not tele:
return
try:
if hasattr(tele, "_Telephony__reconfigure_transmit_pipeline"):
tele._Telephony__reconfigure_transmit_pipeline()
except Exception:
pass
try:
if tele.receive_pipeline:
tele.receive_pipeline.stop()
if tele.audio_output and self.rx_tee:
# If the tee kept the original sink as its first element, revert to it
primary = self.rx_tee.sinks[0] if self.rx_tee.sinks else None
if primary is not None:
tele.audio_output = primary
if tele.receive_mixer:
tele.receive_pipeline = Pipeline(
source=tele.receive_mixer,
codec=Null(),
sink=tele.audio_output,
)
tele.receive_pipeline.start()
except Exception:
pass
self.tx_source = None
self.rx_sink = None
self.rx_tee = None
def on_call_ended(self):
with self.lock:
self.tx_source = None
self.rx_sink = None
self.rx_tee = None

View File

@@ -3,11 +3,15 @@
:class="{ dark: config?.theme === 'dark' }"
class="h-screen w-full flex flex-col bg-slate-50 dark:bg-zinc-950 transition-colors"
>
<!-- emergency banner -->
<div
v-if="appInfo?.is_demo"
class="relative z-[100] bg-blue-600/90 backdrop-blur-sm text-white text-[10px] font-bold uppercase tracking-[0.2em] py-1 text-center select-none border-b border-white/10 shadow-sm"
v-if="appInfo?.emergency"
class="relative z-[100] bg-red-600 text-white px-4 py-2 text-center text-sm font-bold shadow-md animate-pulse"
>
Demo Mode &bull; Read Only
<div class="flex items-center justify-center gap-2">
<MaterialDesignIcon icon-name="alert-decagram" class="size-5" />
<span>{{ $t("app.emergency_mode_active") }}</span>
</div>
</div>
<RouterView v-if="$route.name === 'auth'" />
@@ -18,11 +22,10 @@
</div>
<template v-else>
<!-- header -->
<div
class="relative z-[60] flex bg-white/80 dark:bg-zinc-900/70 backdrop-blur border-gray-200 dark:border-zinc-800 border-b min-h-16 shadow-sm transition-colors"
class="sticky top-0 z-[100] flex bg-white/80 dark:bg-zinc-900/80 backdrop-blur-lg border-gray-200 dark:border-zinc-800 border-b min-h-16 shadow-sm transition-colors overflow-x-hidden"
>
<div class="flex w-full px-4">
<div class="flex w-full px-2 sm:px-4 overflow-x-auto no-scrollbar">
<button
type="button"
class="sm:hidden my-auto mr-4 text-gray-500 hover:text-gray-600 dark:text-gray-400 dark:hover:text-gray-300"
@@ -31,13 +34,14 @@
<MaterialDesignIcon :icon-name="isSidebarOpen ? 'close' : 'menu'" class="size-6" />
</button>
<div
class="hidden sm:flex my-auto w-12 h-12 mr-2 rounded-xl overflow-hidden bg-white/70 dark:bg-white/10 border border-gray-200 dark:border-zinc-700 shadow-inner"
class="hidden sm:flex cursor-pointer my-auto w-12 h-12 mr-2 rounded-xl overflow-hidden bg-white/70 dark:bg-white/10 border border-gray-200 dark:border-zinc-700 shadow-inner"
@click="onAppNameClick"
>
<img class="w-12 h-12 object-contain p-1.5" :src="logoUrl" />
</div>
<div class="my-auto">
<div
class="font-semibold cursor-pointer text-gray-900 dark:text-zinc-100 tracking-tight text-lg"
class="font-semibold cursor-pointer text-gray-900 dark:text-zinc-100 hover:text-blue-600 dark:hover:text-blue-400 transition-colors tracking-tight text-lg"
@click="onAppNameClick"
>
{{ $t("app.name") }}
@@ -46,7 +50,7 @@
{{ $t("app.custom_fork_by") }}
<a
target="_blank"
href="https://github.com/Sudo-Ivan"
:href="`${giteaBaseUrl}/Sudo-Ivan`"
class="text-blue-500 dark:text-blue-300 hover:underline"
>Sudo-Ivan</a
>
@@ -78,11 +82,17 @@
<span
class="flex text-gray-800 dark:text-zinc-100 bg-white dark:bg-zinc-800/80 border border-gray-200 dark:border-zinc-700 hover:border-blue-400 dark:hover:border-blue-400/60 px-3 py-1.5 rounded-full shadow-sm transition"
>
<span :class="{ 'animate-spin': isSyncingPropagationNode }">
<MaterialDesignIcon icon-name="refresh" class="size-6" />
</span>
<MaterialDesignIcon
icon-name="refresh"
class="size-6"
:class="{ 'animate-spin': isSyncingPropagationNode }"
/>
<span class="hidden sm:inline-block my-auto mx-1 text-sm font-medium">{{
$t("app.sync_messages")
isSyncingPropagationNode
? $t("app.syncing_node", {
state: propagationNodeStatus?.state ?? "...",
})
: $t("app.sync_messages")
}}</span>
</span>
</button>
@@ -119,11 +129,11 @@
class="fixed inset-y-0 left-0 z-[70] transform transition-all duration-300 ease-in-out sm:relative sm:z-0 sm:flex sm:translate-x-0"
:class="[
isSidebarOpen ? 'translate-x-0' : '-translate-x-full',
isSidebarCollapsed ? 'w-20' : 'w-72',
isSidebarCollapsed ? 'w-16' : 'w-80',
]"
>
<div
class="flex h-full w-full flex-col overflow-y-auto border-r border-gray-200/70 bg-white dark:border-zinc-800 dark:bg-zinc-900 backdrop-blur"
class="flex h-full w-full flex-col overflow-y-auto border-r border-gray-200/70 bg-white dark:border-zinc-800 dark:bg-zinc-900 backdrop-blur pt-16 sm:pt-0"
>
<!-- toggle button for desktop -->
<div class="hidden sm:flex justify-end p-2 border-b border-gray-100 dark:border-zinc-800">
@@ -233,7 +243,7 @@
>
<template #icon>
<MaterialDesignIcon
icon-name="diagram-projector"
icon-name="hub"
class="w-6 h-6 text-gray-700 dark:text-gray-200"
/>
</template>
@@ -311,6 +321,7 @@
:icon-name="config?.lxmf_user_icon_name"
:icon-foreground-colour="config?.lxmf_user_icon_foreground_colour"
:icon-background-colour="config?.lxmf_user_icon_background_colour"
icon-class="size-7"
/>
</RouterLink>
</div>
@@ -342,8 +353,9 @@
<div class="p-2 dark:border-zinc-900 overflow-hidden text-xs">
<div>{{ $t("app.identity_hash") }}</div>
<div
class="text-[10px] text-gray-700 dark:text-zinc-400 truncate font-mono"
class="text-[10px] text-gray-700 dark:text-zinc-400 truncate font-mono cursor-pointer"
:title="config.identity_hash"
@click="copyValue(config.identity_hash, $t('app.identity_hash'))"
>
{{ config.identity_hash }}
</div>
@@ -351,11 +363,22 @@
<div class="p-2 dark:border-zinc-900 overflow-hidden text-xs">
<div>{{ $t("app.lxmf_address") }}</div>
<div
class="text-[10px] text-gray-700 dark:text-zinc-400 truncate font-mono"
class="text-[10px] text-gray-700 dark:text-zinc-400 truncate font-mono cursor-pointer"
:title="config.lxmf_address_hash"
@click="copyValue(config.lxmf_address_hash, $t('app.lxmf_address'))"
>
{{ config.lxmf_address_hash }}
</div>
<div class="flex items-center justify-end pt-1">
<button
type="button"
class="p-1 rounded-lg text-gray-500 hover:text-blue-500 dark:hover:text-blue-400 transition-colors"
:title="$t('app.show_qr')"
@click.stop="openLxmfQr"
>
<MaterialDesignIcon icon-name="qrcode" class="size-4" />
</button>
</div>
</div>
</div>
</div>
@@ -426,13 +449,74 @@
</template>
</template>
<CallOverlay
v-if="(activeCall || isCallEnded || wasDeclined) && $route.name !== 'call'"
v-if="
(activeCall || isCallEnded || wasDeclined || initiationStatus) &&
!$route.meta.isPopout &&
(!['call', 'call-popout'].includes($route.name) || activeCallTab !== 'phone') &&
(!config?.desktop_open_calls_in_separate_window || !ElectronUtils.isElectron())
"
:active-call="activeCall || lastCall"
:is-ended="isCallEnded"
:was-declined="wasDeclined"
:voicemail-status="voicemailStatus"
:initiation-status="initiationStatus"
:initiation-target-hash="initiationTargetHash"
:initiation-target-name="initiationTargetName"
@hangup="onOverlayHangup"
@toggle-mic="onToggleMic"
@toggle-speaker="onToggleSpeaker"
/>
<Toast />
<ConfirmDialog />
<CommandPalette />
<IntegrityWarningModal />
<ChangelogModal ref="changelogModal" :app-version="appInfo?.version" />
<TutorialModal ref="tutorialModal" />
<!-- LXMF QR modal -->
<div
v-if="showLxmfQr"
class="fixed inset-0 z-[190] flex items-center justify-center p-4 bg-black/60 backdrop-blur-sm"
@click.self="showLxmfQr = false"
>
<div class="w-full max-w-sm bg-white dark:bg-zinc-900 rounded-2xl shadow-2xl overflow-hidden">
<div class="px-4 py-3 border-b border-gray-100 dark:border-zinc-800 flex items-center justify-between">
<h3 class="text-sm font-semibold text-gray-900 dark:text-white">LXMF Address QR</h3>
<button
type="button"
class="text-gray-400 hover:text-gray-600 dark:hover:text-zinc-300 transition-colors"
@click="showLxmfQr = false"
>
<MaterialDesignIcon icon-name="close" class="size-5" />
</button>
</div>
<div class="p-4 space-y-3">
<div class="flex justify-center">
<img
v-if="lxmfQrDataUrl"
:src="lxmfQrDataUrl"
alt="LXMF QR"
class="w-48 h-48 bg-white rounded-xl border border-gray-200 dark:border-zinc-800"
/>
</div>
<div
v-if="config?.lxmf_address_hash"
class="text-xs font-mono text-gray-700 dark:text-zinc-200 text-center break-words"
>
{{ config.lxmf_address_hash }}
</div>
<div class="flex justify-center">
<button
type="button"
class="px-3 py-1.5 text-xs font-semibold text-blue-600 dark:text-blue-400 hover:underline"
@click="copyValue(config?.lxmf_address_hash, $t('app.lxmf_address'))"
>
{{ $t("common.copy") }}
</button>
</div>
</div>
</div>
</div>
<!-- identity switching overlay -->
<transition name="fade-blur">
@@ -450,9 +534,9 @@
</div>
</div>
<div class="mt-6 text-xl font-bold text-gray-900 dark:text-white tracking-tight">
Switching Identity...
{{ $t("app.switching_identity") }}
</div>
<div class="mt-2 text-sm text-gray-500 dark:text-gray-400">Loading your identity</div>
<div class="mt-2 text-sm text-gray-500 dark:text-gray-400">{{ $t("app.loading_identity") }}</div>
</div>
</div>
</transition>
@@ -460,6 +544,7 @@
</template>
<script>
import { useTheme } from "vuetify";
import SidebarLink from "./SidebarLink.vue";
import DialogUtils from "../js/DialogUtils";
import WebSocketConnection from "../js/WebSocketConnection";
@@ -469,11 +554,20 @@ import GlobalEmitter from "../js/GlobalEmitter";
import NotificationUtils from "../js/NotificationUtils";
import LxmfUserIcon from "./LxmfUserIcon.vue";
import Toast from "./Toast.vue";
import ConfirmDialog from "./ConfirmDialog.vue";
import ToastUtils from "../js/ToastUtils";
import MaterialDesignIcon from "./MaterialDesignIcon.vue";
import QRCode from "qrcode";
import NotificationBell from "./NotificationBell.vue";
import LanguageSelector from "./LanguageSelector.vue";
import CallOverlay from "./call/CallOverlay.vue";
import CommandPalette from "./CommandPalette.vue";
import IntegrityWarningModal from "./IntegrityWarningModal.vue";
import ChangelogModal from "./ChangelogModal.vue";
import TutorialModal from "./TutorialModal.vue";
import KeyboardShortcuts from "../js/KeyboardShortcuts";
import ElectronUtils from "../js/ElectronUtils";
import ToneGenerator from "../js/ToneGenerator";
import logoUrl from "../assets/images/logo.png";
export default {
@@ -482,14 +576,26 @@ export default {
LxmfUserIcon,
SidebarLink,
Toast,
ConfirmDialog,
MaterialDesignIcon,
NotificationBell,
LanguageSelector,
CallOverlay,
CommandPalette,
IntegrityWarningModal,
ChangelogModal,
TutorialModal,
},
setup() {
const vuetifyTheme = useTheme();
return {
vuetifyTheme,
};
},
data() {
return {
logoUrl,
ElectronUtils,
reloadInterval: null,
appInfoInterval: null,
@@ -504,17 +610,33 @@ export default {
displayName: "Anonymous Peer",
config: null,
appInfo: null,
hasCheckedForModals: false,
showLxmfQr: false,
lxmfQrDataUrl: null,
activeCall: null,
propagationNodeStatus: null,
isCallEnded: false,
wasDeclined: false,
lastCall: null,
voicemailStatus: null,
isMicMuting: false,
isSpeakerMuting: false,
endedTimeout: null,
ringtonePlayer: null,
toneGenerator: new ToneGenerator(),
isFetchingRingtone: false,
initiationStatus: null,
initiationTargetHash: null,
initiationTargetName: null,
isCallWindowOpen: false,
};
},
computed: {
giteaBaseUrl() {
return this.config?.gitea_base_url || "https://git.quad4.io";
},
currentPopoutType() {
if (this.$route?.meta?.popoutType) {
return this.$route.meta.popoutType;
@@ -535,9 +657,11 @@ export default {
"request_sent",
"receiving",
"response_received",
"complete",
].includes(this.propagationNodeStatus?.state);
},
activeCallTab() {
return GlobalState.activeCallTab;
},
},
watch: {
$route() {
@@ -551,12 +675,8 @@ export default {
if (newConfig && newConfig.custom_ringtone_enabled !== undefined) {
this.updateRingtonePlayer();
}
if (newConfig && newConfig.theme) {
if (newConfig.theme === "dark") {
document.documentElement.classList.add("dark");
} else {
document.documentElement.classList.remove("dark");
}
if (newConfig && "theme" in newConfig) {
this.applyThemePreference(newConfig.theme ?? "light");
}
},
deep: true,
@@ -567,9 +687,11 @@ export default {
clearInterval(this.appInfoInterval);
if (this.endedTimeout) clearTimeout(this.endedTimeout);
this.stopRingtone();
this.toneGenerator.stop();
// stop listening for websocket messages
WebSocketConnection.off("message", this.onWebsocketMessage);
GlobalEmitter.off("config-updated", this.onConfigUpdatedExternally);
},
mounted() {
// listen for websocket messages
@@ -586,12 +708,43 @@ export default {
}, 10000);
});
GlobalEmitter.on("sync-propagation-node", () => {
this.syncPropagationNode();
});
GlobalEmitter.on("config-updated", this.onConfigUpdatedExternally);
GlobalEmitter.on("keyboard-shortcut", (action) => {
this.handleKeyboardShortcut(action);
});
GlobalEmitter.on("block-status-changed", () => {
this.getBlockedDestinations();
});
GlobalEmitter.on("show-changelog", () => {
this.$refs.changelogModal.show();
});
GlobalEmitter.on("show-tutorial", () => {
this.$refs.tutorialModal.show();
});
this.getAppInfo();
this.getConfig();
this.getBlockedDestinations();
this.getKeyboardShortcuts();
this.updateRingtonePlayer();
this.updateTelephoneStatus();
this.updatePropagationNodeStatus();
// listen for protocol links in electron
if (ElectronUtils.isElectron()) {
window.electron.onProtocolLink((url) => {
this.handleProtocolLink(url);
});
}
// update info every few seconds
this.reloadInterval = setInterval(() => {
this.updateTelephoneStatus();
@@ -602,6 +755,21 @@ export default {
}, 15000);
},
methods: {
onConfigUpdatedExternally(newConfig) {
if (!newConfig) return;
this.config = newConfig;
GlobalState.config = newConfig;
this.displayName = newConfig.display_name;
},
applyThemePreference(theme) {
const mode = theme === "dark" ? "dark" : "light";
if (typeof document !== "undefined") {
document.documentElement.classList.toggle("dark", mode === "dark");
}
if (this.vuetifyTheme?.global?.name) {
this.vuetifyTheme.global.name.value = mode;
}
},
getHashPopoutValue() {
const hash = window.location.hash || "";
const match = hash.match(/popout=([^&]+)/);
@@ -612,14 +780,12 @@ export default {
switch (json.type) {
case "config": {
this.config = json.config;
GlobalState.config = json.config;
this.displayName = json.config.display_name;
if (this.config?.theme) {
if (this.config.theme === "dark") {
document.documentElement.classList.add("dark");
} else {
document.documentElement.classList.remove("dark");
}
}
break;
}
case "keyboard_shortcuts": {
KeyboardShortcuts.setShortcuts(json.shortcuts);
break;
}
case "announced": {
@@ -631,6 +797,10 @@ export default {
if (this.config?.do_not_disturb_enabled) {
break;
}
// If we are the caller (outgoing initiation), skip playing the incoming ringtone
if (this.initiationStatus) {
break;
}
NotificationUtils.showIncomingCallNotification();
this.updateTelephoneStatus();
this.playRingtone();
@@ -642,6 +812,21 @@ export default {
);
break;
}
case "telephone_initiation_status": {
this.initiationStatus = json.status;
this.initiationTargetHash = json.target_hash;
this.initiationTargetName = json.target_name;
if (this.initiationStatus === "Ringing...") {
if (this.config?.telephone_tone_generator_enabled) {
this.toneGenerator.setVolume(this.config.telephone_tone_generator_volume);
this.toneGenerator.playRingback();
}
} else if (this.initiationStatus === null) {
this.toneGenerator.stop();
}
break;
}
case "new_voicemail": {
NotificationUtils.showNewVoicemailNotification(
json.remote_identity_name || json.remote_identity_hash
@@ -649,12 +834,50 @@ export default {
this.updateTelephoneStatus();
break;
}
case "telephone_call_established":
case "telephone_call_established": {
this.stopRingtone();
this.ringtonePlayer = null;
this.toneGenerator.stop();
this.updateTelephoneStatus();
break;
}
case "telephone_call_ended": {
this.stopRingtone();
this.ringtonePlayer = null;
if (this.config?.telephone_tone_generator_enabled) {
this.toneGenerator.setVolume(this.config.telephone_tone_generator_volume);
this.toneGenerator.playBusyTone();
}
this.updateTelephoneStatus();
break;
}
case "lxmf.delivery": {
if (this.config?.do_not_disturb_enabled) {
break;
}
// show notification for new messages if window is not focused
// only for incoming messages
if (!document.hasFocus() && json.lxmf_message?.is_incoming === true) {
NotificationUtils.showNewMessageNotification(
json.remote_identity_name,
json.lxmf_message?.content
);
}
break;
}
case "lxm.ingest_uri.result": {
if (json.status === "success") {
ToastUtils.success(json.message);
} else if (json.status === "error") {
ToastUtils.error(json.message);
} else if (json.status === "warning") {
ToastUtils.warning(json.message);
} else {
ToastUtils.info(json.message);
}
break;
}
case "identity_switched": {
ToastUtils.success(`Switched to identity: ${json.display_name}`);
@@ -679,6 +902,35 @@ export default {
try {
const response = await window.axios.get(`/api/v1/app/info`);
this.appInfo = response.data.app_info;
// check URL params for modal triggers
const urlParams = new URLSearchParams(window.location.search);
if (urlParams.has("show-guide")) {
this.$refs.tutorialModal.show();
// remove param from URL
urlParams.delete("show-guide");
const newUrl = window.location.pathname + (urlParams.toString() ? `?${urlParams.toString()}` : "");
window.history.replaceState({}, "", newUrl);
} else if (urlParams.has("changelog")) {
this.$refs.changelogModal.show();
// remove param from URL
urlParams.delete("changelog");
const newUrl = window.location.pathname + (urlParams.toString() ? `?${urlParams.toString()}` : "");
window.history.replaceState({}, "", newUrl);
} else if (!this.hasCheckedForModals) {
// check if we should show tutorial or changelog (only on first load)
this.hasCheckedForModals = true;
if (this.appInfo && !this.appInfo.tutorial_seen) {
this.$refs.tutorialModal.show();
} else if (
this.appInfo &&
this.appInfo.changelog_seen_version !== "999.999.999" &&
this.appInfo.changelog_seen_version !== this.appInfo.version
) {
// show changelog if version changed and not silenced forever
this.$refs.changelogModal.show();
}
}
} catch (e) {
// do nothing if failed to load app info
console.log(e);
@@ -688,39 +940,59 @@ export default {
try {
const response = await window.axios.get(`/api/v1/config`);
this.config = response.data.config;
if (this.config?.theme) {
if (this.config.theme === "dark") {
document.documentElement.classList.add("dark");
} else {
document.documentElement.classList.remove("dark");
}
}
GlobalState.config = response.data.config;
this.displayName = response.data.config.display_name;
} catch (e) {
// do nothing if failed to load config
console.log(e);
}
},
async getBlockedDestinations() {
try {
const response = await window.axios.get("/api/v1/blocked-destinations");
GlobalState.blockedDestinations = response.data.blocked_destinations || [];
} catch (e) {
console.log("Failed to load blocked destinations:", e);
}
},
async getKeyboardShortcuts() {
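// the response arrives asynchronously via the "keyboard_shortcuts" websocket message handled above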
WebSocketConnection.send(
JSON.stringify({
type: "keyboard_shortcuts.get",
})
);
},
async sendAnnounce() {
try {
await window.axios.get(`/api/v1/announce`);
} catch (e) {
ToastUtils.error("failed to announce");
ToastUtils.error(this.$t("app.failed_announce"));
console.log(e);
}
// fetch config so it updates last announced timestamp
await this.getConfig();
},
async copyValue(value, label) {
if (!value) return;
try {
await navigator.clipboard.writeText(value);
ToastUtils.success(`${label} copied`);
} catch {
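// clipboard API unavailable (e.g. insecure context); show the raw value in a toast so the user can copy it manually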
ToastUtils.success(value);
}
},
async openLxmfQr() {
if (!this.config?.lxmf_address_hash) return;
try {
const uri = `lxmf://${this.config.lxmf_address_hash}`;
this.lxmfQrDataUrl = await QRCode.toDataURL(uri, { margin: 1, scale: 6 });
this.showLxmfQr = true;
} catch {
ToastUtils.error(this.$t("common.error"));
}
},
async updateConfig(config, label = null) {
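// when a label is provided, a localized "setting auto-saved" toast names the changed setting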
try {
WebSocketConnection.send(
JSON.stringify({
@@ -728,6 +1000,13 @@ export default {
config: config,
})
);
if (label) {
ToastUtils.success(
this.$t("app.setting_auto_saved", {
label: this.$t(`app.${label.toLowerCase().replace(/ /g, "_")}`),
})
);
}
} catch (e) {
console.error(e);
}
@@ -738,23 +1017,32 @@ export default {
});
},
async onAnnounceIntervalSecondsChange() {
await this.updateConfig(
{
auto_announce_interval_seconds: this.config.auto_announce_interval_seconds,
},
"announce_interval"
);
},
async toggleTheme() {
if (!this.config) {
return;
}
const newTheme = this.config.theme === "dark" ? "light" : "dark";
await this.updateConfig(
{
theme: newTheme,
},
"theme"
);
},
async onLanguageChange(langCode) {
await this.updateConfig(
{
language: langCode,
},
"language"
);
this.$i18n.locale = langCode;
},
async composeNewMessage() {
@@ -767,7 +1055,7 @@ export default {
async syncPropagationNode() {
// ask to stop syncing if already syncing
if (this.isSyncingPropagationNode) {
if (await DialogUtils.confirm("Are you sure you want to stop syncing?")) {
if (await DialogUtils.confirm(this.$t("app.stop_sync_confirm"))) {
await this.stopSyncingPropagationNode();
}
return;
@@ -777,7 +1065,7 @@ export default {
try {
await axios.get("/api/v1/lxmf/propagation-node/sync");
} catch (e) {
const errorMessage = e.response?.data?.message ?? "Something went wrong. Try again later.";
const errorMessage = e.response?.data?.message ?? this.$t("app.sync_error_generic");
ToastUtils.error(errorMessage);
return;
}
@@ -799,9 +1087,9 @@ export default {
const status = this.propagationNodeStatus?.state;
const messagesReceived = this.propagationNodeStatus?.messages_received ?? 0;
if (status === "complete" || status === "idle") {
ToastUtils.success(this.$t("app.sync_complete", { count: messagesReceived }));
} else {
ToastUtils.error(this.$t("app.sync_error", { status: status }));
}
}, 500);
},
@@ -841,6 +1129,9 @@ export default {
if (status.has_custom_ringtone && status.id) {
this.ringtonePlayer = new Audio(`/api/v1/telephone/ringtones/${status.id}/audio`);
this.ringtonePlayer.loop = true;
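// apply the per-ringtone volume if the backend supplied one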
if (status.volume !== undefined) {
this.ringtonePlayer.volume = status.volume;
}
}
} catch (e) {
console.error("Failed to update ringtone player:", e);
@@ -849,15 +1140,21 @@ export default {
},
playRingtone() {
if (this.ringtonePlayer) {
if (this.ringtonePlayer.paused) {
this.ringtonePlayer.play().catch((e) => {
console.log("Failed to play custom ringtone:", e);
});
}
}
},
stopRingtone() {
if (this.ringtonePlayer) {
try {
this.ringtonePlayer.pause();
this.ringtonePlayer.currentTime = 0;
} catch {
// ignore errors during pause
}
}
},
async updateTelephoneStatus() {
@@ -865,22 +1162,31 @@ export default {
// fetch status
const response = await axios.get("/api/v1/telephone/status");
const oldCall = this.activeCall;
const newCall = response.data.active_call;
// update ui
this.activeCall = newCall;
if (this.activeCall) {
this.toneGenerator.stop();
}
this.voicemailStatus = response.data.voicemail;
this.initiationStatus = response.data.initiation_status;
this.initiationTargetHash = response.data.initiation_target_hash;
this.initiationTargetName = response.data.initiation_target_name;
// Update call ended state if needed
const justEnded = oldCall != null && this.activeCall == null;
if (justEnded) {
this.lastCall = oldCall;
if (this.config?.telephone_tone_generator_enabled) {
this.toneGenerator.setVolume(this.config.telephone_tone_generator_volume);
this.toneGenerator.playBusyTone();
}
// Trigger history refresh
GlobalEmitter.emit("telephone-history-updated");
if (!this.wasDeclined) {
this.isCallEnded = true;
}
@@ -890,6 +1196,90 @@ export default {
this.wasDeclined = false;
this.lastCall = null;
}, 5000);
}
// Handle outgoing ringback tone
if (this.initiationStatus === "Ringing...") {
if (this.config?.telephone_tone_generator_enabled) {
this.toneGenerator.setVolume(this.config.telephone_tone_generator_volume);
this.toneGenerator.playRingback();
}
} else if (!this.initiationStatus && !this.activeCall && !this.isCallEnded) {
// Only stop if we're not ringing, in a call, or just finished a call (busy tone playing)
this.toneGenerator.stop();
}
// Handle power management for calls
if (ElectronUtils.isElectron()) {
if (this.activeCall) {
window.electron.setPowerSaveBlocker(true);
} else if (!this.initiationStatus) {
window.electron.setPowerSaveBlocker(false);
}
}
// Handle opening call in separate window if enabled
if (
(this.activeCall || this.initiationStatus) &&
this.config?.desktop_open_calls_in_separate_window &&
ElectronUtils.isElectron()
) {
if (!this.isCallWindowOpen && !this.$route.meta.isPopout) {
this.isCallWindowOpen = true;
window.open("/call.html", "MeshChatXCallWindow", "width=600,height=800");
}
} else {
this.isCallWindowOpen = false;
}
// Handle ringtone (only for incoming ringing)
if (this.activeCall?.status === 4 && this.activeCall?.is_incoming) {
// Call is ringing
if (!this.ringtonePlayer && this.config?.custom_ringtone_enabled && !this.isFetchingRingtone) {
this.isFetchingRingtone = true;
try {
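// try/finally with no catch: isFetchingRingtone is always reset, even if the ringtone lookup fails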
const caller_hash = this.activeCall.remote_identity_hash;
const ringResponse = await window.axios.get(
`/api/v1/telephone/ringtones/status?caller_hash=${caller_hash}`
);
const status = ringResponse.data;
if (status.has_custom_ringtone && status.id) {
// Double check if we still need to play it (call might have ended during await)
if (this.activeCall?.status === 4) {
// Stop any existing player just in case
this.stopRingtone();
this.ringtonePlayer = new Audio(`/api/v1/telephone/ringtones/${status.id}/audio`);
this.ringtonePlayer.loop = true;
if (status.volume !== undefined) {
this.ringtonePlayer.volume = status.volume;
}
this.playRingtone();
}
}
} finally {
this.isFetchingRingtone = false;
}
} else if (this.ringtonePlayer && this.activeCall?.status === 4) {
this.playRingtone();
}
} else {
// Not ringing
if (this.ringtonePlayer) {
this.stopRingtone();
this.ringtonePlayer = null;
}
}
// Preserve optimistic local mute state while a recent toggle is still settling (see onToggleMic/onToggleSpeaker)
if (newCall && oldCall) {
if (this.isMicMuting) newCall.is_mic_muted = oldCall.is_mic_muted;
if (this.isSpeakerMuting) newCall.is_speaker_muted = oldCall.is_speaker_muted;
}
// If call just ended, show ended state for a few seconds
if (justEnded) {
// Handled above
} else if (this.activeCall != null) {
// if a new call starts, clear ended state
this.isCallEnded = false;
@@ -911,19 +1301,107 @@ export default {
this.wasDeclined = true;
}
},
onToggleMic(isMuted) {
this.isMicMuting = true;
if (this.activeCall) {
this.activeCall.is_mic_muted = isMuted;
}
setTimeout(() => {
this.isMicMuting = false;
}, 2000);
},
onToggleSpeaker(isMuted) {
this.isSpeakerMuting = true;
if (this.activeCall) {
this.activeCall.is_speaker_muted = isMuted;
}
setTimeout(() => {
this.isSpeakerMuting = false;
}, 2000);
},
onAppNameClick() {
// user may be on mobile, and is unable to scroll back to sidebar, so let them tap app name to do it
this.$refs["middle"].scrollTo({
this.$refs["middle"]?.scrollTo({
top: 0,
left: 0,
behavior: "smooth",
});
this.$router.push("/messages");
},
handleProtocolLink(url) {
try {
// lxmf://<hash> or rns://<hash>
const hash = url.replace("lxmf://", "").replace("rns://", "").split("/")[0].replace("/", "");
if (hash && hash.length === 32) {
this.$router.push({
name: "messages",
params: { destinationHash: hash },
});
}
} catch (e) {
console.error("Failed to handle protocol link:", e);
}
},
handleKeyboardShortcut(action) {
switch (action) {
case "nav_messages":
this.$router.push({ name: "messages" });
break;
case "nav_nomad":
this.$router.push({ name: "nomadnetwork" });
break;
case "nav_map":
this.$router.push({ name: "map" });
break;
case "nav_paper":
this.$router.push({ name: "paper-message" });
break;
case "nav_archives":
this.$router.push({ name: "archives" });
break;
case "nav_calls":
this.$router.push({ name: "call" });
break;
case "nav_settings":
this.$router.push({ name: "settings" });
break;
case "compose_message":
this.composeNewMessage();
break;
case "sync_messages":
this.syncPropagationNode();
break;
case "command_palette":
// Command palette handles its own shortcut but we emit it just in case
break;
case "toggle_sidebar":
this.isSidebarCollapsed = !this.isSidebarCollapsed;
break;
}
},
},
};
</script>
<style>
.banished-overlay {
@apply absolute inset-0 z-[100] flex items-center justify-center overflow-hidden pointer-events-none rounded-[inherit];
background: rgba(220, 38, 38, 0.12);
backdrop-filter: blur(3px) saturate(180%);
}
.banished-text {
@apply font-black tracking-[0.3em] uppercase pointer-events-none opacity-40;
font-size: clamp(1.5rem, 8vw, 6rem);
color: #dc2626;
transform: rotate(-12deg);
text-shadow: 0 0 15px rgba(220, 38, 38, 0.4);
border: 0.2em solid #dc2626;
padding: 0.15em 0.4em;
border-radius: 0.15em;
background: rgba(255, 255, 255, 0.05);
}
.fade-blur-enter-active,
.fade-blur-leave-active {
transition: all 0.5s ease;


@@ -0,0 +1,304 @@
<template>
<v-dialog
v-if="!isPage"
v-model="visible"
:fullscreen="isMobile"
max-width="800"
transition="dialog-bottom-transition"
class="changelog-dialog"
@update:model-value="onVisibleUpdate"
>
<v-card class="flex flex-col h-full bg-white dark:bg-zinc-900 border-0 overflow-hidden">
<!-- Header -->
<v-toolbar flat color="transparent" class="px-4 border-b dark:border-zinc-800">
<div class="flex items-center">
<div class="p-1 mr-3">
<img src="../public/favicons/favicon-512x512.png" class="w-8 h-8 object-contain" alt="Logo" />
</div>
<v-toolbar-title class="text-xl font-bold tracking-tight text-gray-900 dark:text-white">
{{ $t("app.changelog_title", "What's New") }}
</v-toolbar-title>
<span
v-if="version"
class="ml-3 font-black text-[10px] px-2 h-5 tracking-tighter uppercase rounded-sm bg-blue-600 text-white inline-flex items-center"
>
v{{ version }}
</span>
</div>
<v-spacer></v-spacer>
<button
type="button"
class="v-btn text-gray-400 hover:text-gray-900 dark:hover:text-white hover:bg-black/5 dark:hover:bg-white/10 p-2 transition-colors"
@click="close"
>
<v-icon>mdi-close</v-icon>
</button>
</v-toolbar>
<!-- Content -->
<v-card-text class="flex-1 overflow-y-auto px-6 py-8">
<div v-if="loading" class="flex flex-col items-center justify-center h-full space-y-4">
<v-progress-circular indeterminate color="blue" size="64"></v-progress-circular>
<div class="text-gray-500 dark:text-zinc-400 font-medium">Loading changelog...</div>
</div>
<div v-else-if="error" class="flex flex-col items-center justify-center h-full text-center space-y-4">
<v-icon icon="mdi-alert-circle-outline" size="64" color="red"></v-icon>
<div class="text-red-500 font-bold text-lg">{{ error }}</div>
<button type="button" class="primary-chip !px-6" @click="fetchChangelog">Retry</button>
</div>
<div
v-else
class="changelog-content max-w-none prose dark:prose-invert text-gray-900 dark:text-zinc-100"
>
<!-- eslint-disable-next-line vue/no-v-html -->
<div v-html="changelogHtml"></div>
</div>
</v-card-text>
<!-- Footer -->
<v-divider class="dark:border-zinc-800"></v-divider>
<v-card-actions class="px-6 py-4 bg-gray-50 dark:bg-zinc-950/50 flex-wrap gap-y-2">
<div class="flex flex-col">
<v-checkbox
v-model="dontShowAgain"
:label="$t('app.do_not_show_again', 'Do not show again for this version')"
density="compact"
hide-details
color="blue"
class="my-0 text-gray-700 dark:text-zinc-300 font-medium"
></v-checkbox>
<v-checkbox
v-model="dontShowEver"
:label="$t('app.do_not_show_ever', 'Do not show ever again')"
density="compact"
hide-details
color="red"
class="my-0 text-gray-700 dark:text-zinc-300 font-medium"
></v-checkbox>
</div>
<v-spacer></v-spacer>
<button type="button" class="primary-chip !px-8 !h-10 !rounded-xl" @click="close">
{{ $t("common.close", "Close") }}
</button>
</v-card-actions>
</v-card>
</v-dialog>
<div v-else class="flex flex-col h-full bg-white dark:bg-zinc-950 overflow-hidden">
<div class="flex-1 overflow-y-auto px-6 md:px-12 py-10">
<div class="max-w-4xl mx-auto">
<div class="flex items-center gap-4 mb-8">
<div class="p-2">
<img src="../public/favicons/favicon-512x512.png" class="w-16 h-16 object-contain" alt="Logo" />
</div>
<div>
<h1 class="text-4xl font-black text-gray-900 dark:text-white tracking-tighter uppercase mb-1">
{{ $t("app.changelog_title", "What's New") }}
</h1>
<div class="flex items-center gap-2">
<span
class="font-black text-[10px] px-2 h-5 rounded-sm bg-blue-600 text-white inline-flex items-center"
>
v{{ version }}
</span>
<span class="text-sm text-gray-500 font-medium">Full release history</span>
</div>
</div>
</div>
<div v-if="loading" class="flex flex-col items-center justify-center py-20 space-y-4">
<v-progress-circular indeterminate color="blue" size="64"></v-progress-circular>
</div>
<div v-else-if="error" class="flex flex-col items-center justify-center py-20 text-center space-y-4">
<v-icon icon="mdi-alert-circle-outline" size="64" color="red"></v-icon>
<div class="text-red-500 font-bold text-lg">{{ error }}</div>
<button type="button" class="primary-chip !px-6" @click="fetchChangelog">Retry</button>
</div>
<div v-else class="changelog-content max-w-none prose dark:prose-invert pb-20">
<!-- eslint-disable-next-line vue/no-v-html -->
<div v-html="changelogHtml"></div>
</div>
</div>
</div>
</div>
</template>
<script>
export default {
name: "ChangelogModal",
props: {
appVersion: {
type: String,
default: "",
},
},
data() {
return {
visible: false,
loading: true,
error: null,
changelogHtml: "",
version: "",
dontShowAgain: false,
dontShowEver: false,
};
},
computed: {
currentVersion() {
return this.version || this.appVersion;
},
isPage() {
return this.$route?.meta?.isPage === true;
},
isMobile() {
return window.innerWidth < 640;
},
},
mounted() {
if (this.isPage) {
this.fetchChangelog();
}
},
methods: {
async show() {
this.visible = true;
await this.fetchChangelog();
},
async fetchChangelog() {
this.loading = true;
this.error = null;
try {
const response = await window.axios.get("/api/v1/app/changelog");
this.version = response.data.version;
// Process HTML to make version headers look better
// Find [x.x.x] and wrap in a styled span
let html = response.data.html;
html = html.replace(/\[(\d+\.\d+\.\d+)\]/g, '<span class="version-tag">$1</span>');
this.changelogHtml = html;
} catch (e) {
this.error = "Failed to load changelog.";
console.error(e);
} finally {
this.loading = false;
}
},
async close() {
// mark as seen for current version automatically on close if not already marked
if (!this.dontShowEver && !this.dontShowAgain) {
try {
await window.axios.post("/api/v1/app/changelog/seen", {
version: this.currentVersion || "0.0.0",
});
} catch (e) {
console.error("Failed to auto-mark changelog as seen:", e);
}
} else {
await this.markAsSeen();
}
this.visible = false;
},
async markAsSeen() {
if (this.dontShowEver) {
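// "999.999.999" is the sentinel version meaning "never show the changelog again"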
try {
await window.axios.post("/api/v1/app/changelog/seen", {
version: "999.999.999",
});
} catch (e) {
console.error("Failed to mark changelog as seen forever:", e);
}
} else if (this.dontShowAgain) {
try {
await window.axios.post("/api/v1/app/changelog/seen", {
version: this.currentVersion,
});
} catch (e) {
console.error("Failed to mark changelog as seen for this version:", e);
}
}
},
async onVisibleUpdate(val) {
if (!val) {
// handle case where dialog is closed by clicking outside or ESC
await this.markAsSeen();
}
},
},
};
</script>
<style>
.changelog-dialog .v-overlay__content {
border-radius: 0.5rem !important;
overflow: hidden;
}
.changelog-content {
@apply leading-relaxed !important;
}
.changelog-content h1 {
@apply text-3xl font-black mt-2 mb-6 text-gray-900 dark:text-white tracking-tight uppercase border-b-2 border-gray-100 dark:border-zinc-800 pb-2 !important;
}
.changelog-content h2 {
@apply flex items-center gap-3 text-xl font-bold mt-8 mb-4 text-gray-900 dark:text-white !important;
}
/* Style for [v4.0.0] style headers in markdown */
.changelog-content h2::before {
content: "VERSION";
@apply text-[10px] font-black bg-blue-500 text-white px-1.5 py-0.5 rounded-sm tracking-tighter !important;
}
.changelog-content h3 {
@apply text-lg font-bold mt-6 mb-3 text-blue-600 dark:text-blue-400 flex items-center gap-2 !important;
}
.changelog-content h3::before {
content: "•";
@apply text-blue-500 font-black !important;
}
.changelog-content p {
@apply my-4 text-gray-700 dark:text-zinc-300 leading-relaxed !important;
}
.changelog-content ul {
@apply my-6 space-y-3 list-disc pl-6 !important;
}
.changelog-content li {
@apply text-gray-600 dark:text-zinc-400 transition-colors hover:text-gray-900 dark:hover:text-white !important;
}
.changelog-content strong {
@apply font-bold text-gray-900 dark:text-zinc-100 !important;
}
.changelog-content code {
@apply bg-blue-50 dark:bg-blue-900/20 px-1.5 py-0.5 rounded-sm text-blue-700 dark:text-blue-300 font-mono text-[0.85em] border border-blue-100 dark:border-blue-800/30 !important;
}
.changelog-content hr {
@apply my-10 border-gray-100 dark:border-zinc-800 !important;
}
.changelog-content h2 {
counter-increment: version-counter;
@apply py-2 px-4 bg-gray-50 dark:bg-zinc-800/50 rounded-md border border-gray-100 dark:border-zinc-800 !important;
}
/* Highlight tags like [4.0.0] if they appear inside the text */
.changelog-content .version-tag {
@apply bg-blue-600 text-white px-2 py-0.5 rounded-sm font-black text-sm tracking-tighter !important;
}
</style>


@@ -22,7 +22,7 @@
leave-from-class="transform opacity-100 scale-100"
leave-to-class="transform opacity-0 scale-95"
>
<div v-if="isShowingMenu" class="absolute left-0 z-10 ml-4">
<div v-if="isShowingMenu" class="absolute left-0 z-[100] mt-2">
<v-color-picker
v-model="colourPickerValue"
:modes="['hex']"


@@ -0,0 +1,514 @@
<template>
<transition name="slide-down">
<div
v-if="isOpen"
class="fixed inset-x-0 top-0 z-[200] flex items-start justify-center p-4 pointer-events-none"
>
<div
v-click-outside="close"
class="w-full max-w-2xl bg-white/95 dark:bg-zinc-900/95 backdrop-blur-md rounded-2xl shadow-2xl border border-gray-200 dark:border-zinc-800 overflow-hidden flex flex-col max-h-[70vh] pointer-events-auto mt-2 sm:mt-8"
>
<!-- search input -->
<div class="relative flex items-center p-4 border-b border-gray-100 dark:border-zinc-800">
<MaterialDesignIcon icon-name="magnify" class="size-6 text-gray-400 mr-3" />
<input
ref="input"
v-model="query"
type="text"
class="w-full bg-transparent border-none focus:ring-0 text-gray-900 dark:text-white placeholder-gray-400 text-lg"
:placeholder="$t('command_palette.search_placeholder')"
@keydown.down.prevent="moveHighlight(1)"
@keydown.up.prevent="moveHighlight(-1)"
@keydown.enter="executeAction"
@keydown.esc="close"
/>
<div class="flex items-center gap-1 ml-2">
<kbd
class="px-2 py-1 text-xs font-semibold text-gray-500 bg-gray-100 dark:bg-zinc-800 border border-gray-200 dark:border-zinc-700 rounded-lg shadow-sm"
>ESC</kbd
>
</div>
</div>
<!-- results -->
<div class="flex-1 overflow-y-auto p-2 min-h-0">
<div v-if="filteredResults.length === 0" class="p-8 text-center text-gray-500 dark:text-gray-400">
{{ $t("command_palette.no_results", { query: query }) }}
</div>
<div v-else class="space-y-1">
<div v-for="(group, groupName) in groupedResults" :key="groupName">
<div class="px-3 py-2 text-[10px] font-bold text-gray-400 uppercase tracking-widest">
{{ $t(`command_palette.${groupName}`) }}
</div>
<button
v-for="result in group"
:key="result.id"
type="button"
class="w-full flex items-center gap-3 p-3 rounded-xl transition-all text-left group"
:class="[
highlightedId === result.id
? 'bg-blue-50 dark:bg-blue-900/20 text-blue-600 dark:text-blue-400'
: 'hover:bg-gray-50 dark:hover:bg-zinc-800/50 text-gray-700 dark:text-zinc-300',
]"
@click="executeResult(result)"
@mousemove="highlightedId = result.id"
>
<div
class="size-10 rounded-xl flex items-center justify-center shrink-0 border transition-colors"
:class="[
highlightedId === result.id
? 'bg-blue-100 dark:bg-blue-900/40 border-blue-200 dark:border-blue-800'
: 'bg-gray-100 dark:bg-zinc-800 border-gray-200 dark:border-zinc-700',
]"
>
<LxmfUserIcon
v-if="result.type === 'contact' || result.type === 'peer'"
:custom-image="result.type === 'contact' ? result.contact.custom_image : ''"
:icon-name="result.icon"
:icon-foreground-colour="result.iconForeground"
:icon-background-colour="result.iconBackground"
icon-class="size-5"
/>
<MaterialDesignIcon v-else :icon-name="result.icon" class="size-5" />
</div>
<div class="min-w-0 flex-1">
<div class="font-bold truncate">{{ result.title }}</div>
<div class="text-xs opacity-60 truncate">{{ result.description }}</div>
</div>
<MaterialDesignIcon
v-if="highlightedId === result.id"
icon-name="arrow-right"
class="size-4 animate-in slide-in-from-left-2"
/>
</button>
</div>
</div>
</div>
<!-- footer -->
<div
class="p-3 bg-gray-50/50 dark:bg-zinc-900/50 border-t border-gray-100 dark:border-zinc-800 flex justify-center gap-6 text-[10px] font-bold text-gray-400 uppercase tracking-widest"
>
<div class="flex items-center gap-1.5">
<kbd
class="px-1.5 py-0.5 bg-white dark:bg-zinc-800 border border-gray-200 dark:border-zinc-700 rounded shadow-sm"
>↑↓</kbd
>
<span>{{ $t("command_palette.footer_navigate") }}</span>
</div>
<div class="flex items-center gap-1.5">
<kbd
class="px-1.5 py-0.5 bg-white dark:bg-zinc-800 border border-gray-200 dark:border-zinc-700 rounded shadow-sm"
>Enter</kbd
>
<span>{{ $t("command_palette.footer_select") }}</span>
</div>
</div>
</div>
</div>
</transition>
</template>
<script>
import MaterialDesignIcon from "./MaterialDesignIcon.vue";
import LxmfUserIcon from "./LxmfUserIcon.vue";
import GlobalEmitter from "../js/GlobalEmitter";
import ToastUtils from "../js/ToastUtils";
export default {
name: "CommandPalette",
components: { MaterialDesignIcon, LxmfUserIcon },
data() {
return {
isOpen: false,
query: "",
highlightedId: null,
peers: [],
contacts: [],
actions: [
{
id: "nav-messages",
title: "nav_messages",
description: "nav_messages_desc",
icon: "message-text",
type: "navigation",
route: { name: "messages" },
},
{
id: "nav-nomad",
title: "nav_nomad",
description: "nav_nomad_desc",
icon: "earth",
type: "navigation",
route: { name: "nomadnetwork" },
},
{
id: "nav-map",
title: "nav_map",
description: "nav_map_desc",
icon: "map",
type: "navigation",
route: { name: "map" },
},
{
id: "nav-paper",
title: "nav_paper",
description: "nav_paper_desc",
icon: "qrcode",
type: "navigation",
route: { name: "paper-message" },
},
{
id: "nav-call",
title: "nav_call",
description: "nav_call_desc",
icon: "phone",
type: "navigation",
route: { name: "call" },
},
{
id: "nav-settings",
title: "nav_settings",
description: "nav_settings_desc",
icon: "cog",
type: "navigation",
route: { name: "settings" },
},
{
id: "nav-ping",
title: "nav_ping",
description: "nav_ping_desc",
icon: "radar",
type: "navigation",
route: { name: "ping" },
},
{
id: "nav-rnprobe",
title: "nav_rnprobe",
description: "nav_rnprobe_desc",
icon: "radar",
type: "navigation",
route: { name: "rnprobe" },
},
{
id: "nav-rncp",
title: "nav_rncp",
description: "nav_rncp_desc",
icon: "swap-horizontal",
type: "navigation",
route: { name: "rncp" },
},
{
id: "nav-rnstatus",
title: "nav_rnstatus",
description: "nav_rnstatus_desc",
icon: "chart-line",
type: "navigation",
route: { name: "rnstatus" },
},
{
id: "nav-rnpath",
title: "nav_rnpath",
description: "nav_rnpath_desc",
icon: "route",
type: "navigation",
route: { name: "rnpath" },
},
{
id: "nav-rnpath-trace",
title: "nav_rnpath_trace",
description: "nav_rnpath_trace_desc",
icon: "map-marker-path",
type: "navigation",
route: { name: "rnpath-trace" },
},
{
id: "nav-translator",
title: "nav_translator",
description: "nav_translator_desc",
icon: "translate",
type: "navigation",
route: { name: "translator" },
},
{
id: "nav-forwarder",
title: "nav_forwarder",
description: "nav_forwarder_desc",
icon: "email-send-outline",
type: "navigation",
route: { name: "forwarder" },
},
{
id: "nav-documentation",
title: "nav_documentation",
description: "nav_documentation_desc",
icon: "book-open-variant",
type: "navigation",
route: { name: "documentation" },
},
{
id: "nav-micron-editor",
title: "nav_micron_editor",
description: "nav_micron_editor_desc",
icon: "code-tags",
type: "navigation",
route: { name: "micron-editor" },
},
{
id: "nav-rnode-flasher",
title: "nav_rnode_flasher",
description: "nav_rnode_flasher_desc",
icon: "flash",
type: "navigation",
route: { name: "rnode-flasher" },
},
{
id: "nav-debug-logs",
title: "nav_debug_logs",
description: "nav_debug_logs_desc",
icon: "console",
type: "navigation",
route: { name: "debug-logs" },
},
{
id: "action-sync",
title: "action_sync",
description: "action_sync_desc",
icon: "refresh",
type: "action",
action: "sync",
},
{
id: "action-compose",
title: "action_compose",
description: "action_compose_desc",
icon: "email-plus",
type: "action",
action: "compose",
},
{
id: "action-orbit",
title: "action_orbit",
description: "action_orbit_desc",
icon: "orbit",
type: "action",
action: "toggle-orbit",
},
{
id: "action-bouncing-balls",
title: "action_bouncing_balls",
description: "action_bouncing_balls_desc",
icon: "bounce",
type: "action",
action: "toggle-bouncing-balls",
},
{
id: "action-getting-started",
title: "action_getting_started",
description: "action_getting_started_desc",
icon: "help-circle",
type: "action",
action: "show-tutorial",
},
{
id: "action-changelog",
title: "action_changelog",
description: "action_changelog_desc",
icon: "history",
type: "action",
action: "show-changelog",
},
],
};
},
computed: {
allResults() {
const results = this.actions.map((action) => ({
...action,
title: this.$t(`command_palette.${action.title}`),
description: this.$t(`command_palette.${action.description}`),
}));
// add peers
if (Array.isArray(this.peers)) {
for (const peer of this.peers) {
results.push({
id: `peer-${peer.destination_hash}`,
title: peer.custom_display_name ?? peer.display_name,
description: peer.destination_hash,
icon: peer.lxmf_user_icon?.icon_name ?? "account",
iconForeground: peer.lxmf_user_icon?.foreground_colour,
iconBackground: peer.lxmf_user_icon?.background_colour,
type: "peer",
peer: peer,
});
}
}
// add contacts
if (Array.isArray(this.contacts)) {
for (const contact of this.contacts) {
results.push({
id: `contact-${contact.id}`,
title: contact.name,
description: this.$t("app.call") + ` ${contact.remote_identity_hash}`,
icon: "phone",
type: "contact",
contact: contact,
});
}
}
return results;
},
filteredResults() {
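// with an empty query, only static navigation/action entries are shown; peers and contacts appear once the user types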
if (!this.query) return this.allResults.filter((r) => r.type === "navigation" || r.type === "action");
const q = this.query.toLowerCase();
return this.allResults.filter(
(r) => r.title.toLowerCase().includes(q) || r.description.toLowerCase().includes(q)
);
},
groupedResults() {
const groups = {};
for (const result of this.filteredResults) {
const groupName =
result.type === "peer"
? "group_recent"
: result.type === "contact"
? "group_contacts"
: "group_actions";
if (!groups[groupName]) groups[groupName] = [];
groups[groupName].push(result);
}
return groups;
},
},
watch: {
filteredResults: {
handler(newResults) {
if (
newResults.length > 0 &&
(!this.highlightedId || !newResults.find((r) => r.id === this.highlightedId))
) {
this.highlightedId = newResults[0].id;
}
},
immediate: true,
},
},
mounted() {
window.addEventListener("keydown", this.handleGlobalKeydown);
},
beforeUnmount() {
window.removeEventListener("keydown", this.handleGlobalKeydown);
},
methods: {
handleGlobalKeydown(e) {
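// Ctrl+K (Cmd+K on macOS) toggles the palette from anywhere in the app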
if ((e.metaKey || e.ctrlKey) && e.key === "k") {
e.preventDefault();
this.toggle();
}
},
async toggle() {
if (this.isOpen) {
this.close();
} else {
await this.open();
}
},
async open() {
this.query = "";
this.isOpen = true;
this.loadPeersAndContacts();
this.$nextTick(() => {
this.$refs.input?.focus();
});
},
close() {
this.isOpen = false;
},
async loadPeersAndContacts() {
try {
// fetch announces for "lxmf.delivery" aspect to get peers
const peerResponse = await window.axios.get(`/api/v1/announces`, {
params: { aspect: "lxmf.delivery", limit: 20 },
});
this.peers = peerResponse.data.announces;
// fetch telephone contacts
const contactResponse = await window.axios.get("/api/v1/telephone/contacts");
this.contacts = Array.isArray(contactResponse.data) ? contactResponse.data : [];
} catch (e) {
console.error("Failed to load command palette data:", e);
}
},
moveHighlight(step) {
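// wrap the highlight around when stepping past either end of the results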
const index = this.filteredResults.findIndex((r) => r.id === this.highlightedId);
let nextIndex = index + step;
if (nextIndex < 0) nextIndex = this.filteredResults.length - 1;
if (nextIndex >= this.filteredResults.length) nextIndex = 0;
this.highlightedId = this.filteredResults[nextIndex].id;
},
executeAction() {
const result = this.filteredResults.find((r) => r.id === this.highlightedId);
if (result) this.executeResult(result);
},
executeResult(result) {
this.close();
if (result.type === "navigation") {
this.$router.push(result.route);
} else if (result.type === "peer") {
this.$router.push({ name: "messages", params: { destinationHash: result.peer.destination_hash } });
} else if (result.type === "contact") {
this.dialContact(result.contact.remote_identity_hash);
} else if (result.type === "action") {
if (result.action === "sync") {
GlobalEmitter.emit("sync-propagation-node");
} else if (result.action === "compose") {
this.$router.push({ name: "messages" });
this.$nextTick(() => {
const input = document.getElementById("compose-input");
input?.focus();
});
} else if (result.action === "toggle-orbit") {
GlobalEmitter.emit("toggle-orbit");
} else if (result.action === "toggle-bouncing-balls") {
GlobalEmitter.emit("toggle-bouncing-balls");
} else if (result.action === "show-tutorial") {
GlobalEmitter.emit("show-tutorial");
} else if (result.action === "show-changelog") {
GlobalEmitter.emit("show-changelog");
}
}
},
async dialContact(hash) {
try {
await window.axios.get(`/api/v1/telephone/call/${hash}`);
if (this.$route.name !== "call") {
this.$router.push({ name: "call" });
}
} catch (e) {
ToastUtils.error(e.response?.data?.message || "Failed to initiate call");
}
},
},
};
</script>
<style scoped>
.slide-down-enter-active,
.slide-down-leave-active {
transition: all 0.3s cubic-bezier(0.16, 1, 0.3, 1);
}
.slide-down-enter-from,
.slide-down-leave-to {
opacity: 0;
transform: translateY(-20px) scale(0.98);
}
kbd {
font-family: inherit;
}
</style>


@@ -0,0 +1,107 @@
<template>
<Transition name="confirm-dialog">
<div v-if="pendingConfirm" class="fixed inset-0 z-[9999] flex items-center justify-center p-4">
<div class="fixed inset-0 bg-black/50 backdrop-blur-sm shadow-2xl" @click="cancel"></div>
<div
class="relative w-full sm:w-auto sm:min-w-[400px] sm:max-w-md bg-white dark:bg-zinc-900 sm:rounded-3xl rounded-3xl shadow-2xl border border-gray-200 dark:border-zinc-800 overflow-hidden transform transition-all"
@click.stop
>
<div class="p-8">
<div class="flex items-start mb-6">
<div
class="flex-shrink-0 flex items-center justify-center w-12 h-12 rounded-2xl bg-red-100 dark:bg-red-900/30 text-red-600 dark:text-red-400 mr-4"
>
<MaterialDesignIcon icon-name="alert-circle" class="w-6 h-6" />
</div>
<div class="flex-1 min-w-0">
<h3 class="text-xl font-black text-gray-900 dark:text-white mb-2">Confirm Action</h3>
<p class="text-gray-600 dark:text-zinc-300 whitespace-pre-wrap leading-relaxed">
{{ pendingConfirm.message }}
</p>
</div>
</div>
<div class="flex flex-col sm:flex-row gap-3 sm:justify-end mt-8">
<button
type="button"
class="px-6 py-3 text-sm font-bold text-gray-700 dark:text-zinc-300 bg-gray-100 dark:bg-zinc-800 rounded-xl hover:bg-gray-200 dark:hover:bg-zinc-700 transition-all active:scale-95"
@click="cancel"
>
Cancel
</button>
<button
type="button"
class="px-6 py-3 text-sm font-bold text-white bg-red-600 hover:bg-red-700 rounded-xl shadow-lg shadow-red-600/20 transition-all active:scale-95"
@click="confirm"
>
Confirm
</button>
</div>
</div>
</div>
</div>
</Transition>
</template>
<script>
import GlobalEmitter from "../js/GlobalEmitter";
import MaterialDesignIcon from "./MaterialDesignIcon.vue";
export default {
name: "ConfirmDialog",
components: {
MaterialDesignIcon,
},
data() {
return {
pendingConfirm: null,
resolvePromise: null,
};
},
mounted() {
GlobalEmitter.on("confirm", this.show);
},
beforeUnmount() {
GlobalEmitter.off("confirm", this.show);
},
methods: {
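// shown via the global "confirm" event (presumably emitted by DialogUtils.confirm), which carries the message and a promise resolver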
show({ message, resolve }) {
this.pendingConfirm = { message };
this.resolvePromise = resolve;
},
confirm() {
if (this.resolvePromise) {
this.resolvePromise(true);
this.resolvePromise = null;
}
this.pendingConfirm = null;
},
cancel() {
if (this.resolvePromise) {
this.resolvePromise(false);
this.resolvePromise = null;
}
this.pendingConfirm = null;
},
},
};
</script>
<style scoped>
.confirm-dialog-enter-active,
.confirm-dialog-leave-active {
transition: all 0.2s ease;
}
.confirm-dialog-enter-from,
.confirm-dialog-leave-to {
opacity: 0;
}
.confirm-dialog-enter-from .relative,
.confirm-dialog-leave-to .relative {
transform: scale(0.95);
opacity: 0;
}
</style>


@@ -0,0 +1,122 @@
<template>
<v-dialog v-model="visible" persistent max-width="500">
<v-card color="warning" class="pa-4">
<v-card-title class="headline text-white">
<v-icon start icon="mdi-alert-decagram" class="mr-2"></v-icon>
{{ $t("about.security_integrity") }}
</v-card-title>
<v-card-text class="text-white mt-2">
<p v-if="integrity.backend && !integrity.backend.ok">
<strong>{{ $t("about.tampering_detected") }}</strong
><br />
{{ $t("about.integrity_backend_error") }}
</p>
<p v-if="integrity.data && !integrity.data.ok" class="mt-2">
<strong>{{ $t("about.tampering_detected") }}</strong
><br />
{{ $t("about.integrity_data_error") }}
</p>
<v-expansion-panels v-if="issues.length > 0" variant="inset" class="mt-4">
<v-expansion-panel :title="$t('about.technical_issues')" bg-color="warning-darken-1">
<v-expansion-panel-text>
<ul class="text-caption">
<li v-for="(issue, index) in issues" :key="index">{{ issue }}</li>
</ul>
</v-expansion-panel-text>
</v-expansion-panel>
</v-expansion-panels>
<p class="mt-4 text-caption">
{{ $t("about.integrity_warning_footer") }}
</p>
</v-card-text>
<v-card-actions>
<v-checkbox
v-model="dontShowAgain"
:label="$t('app.do_not_show_again')"
density="compact"
hide-details
class="text-white"
></v-checkbox>
<v-spacer></v-spacer>
<v-btn variant="text" color="white" @click="close"> {{ $t("common.continue") }} </v-btn>
<v-btn
v-if="integrity.data && !integrity.data.ok"
variant="flat"
color="white"
class="text-warning font-bold"
@click="acknowledgeAndReset"
>
{{ $t("common.acknowledge_reset") }}
</v-btn>
</v-card-actions>
</v-card>
</v-dialog>
</template>
<script>
import ToastUtils from "../js/ToastUtils";
export default {
name: "IntegrityWarningModal",
data() {
return {
visible: false,
dontShowAgain: false,
integrity: {
backend: { ok: true, issues: [] },
data: { ok: true, issues: [] },
},
};
},
computed: {
issues() {
return [...this.integrity.backend.issues, ...this.integrity.data.issues];
},
},
async mounted() {
if (window.electron && window.electron.getIntegrityStatus) {
this.integrity = await window.electron.getIntegrityStatus();
const isOk = this.integrity.backend.ok && this.integrity.data.ok;
if (!isOk) {
// Check if user has already dismissed this
const dismissed = localStorage.getItem("integrity_warning_dismissed");
const appVersion = await window.electron.appVersion();
if (dismissed !== appVersion) {
this.visible = true;
}
}
}
},
methods: {
async close() {
if (this.dontShowAgain && window.electron) {
const appVersion = await window.electron.appVersion();
localStorage.setItem("integrity_warning_dismissed", appVersion);
}
this.visible = false;
},
async acknowledgeAndReset() {
try {
await window.axios.post("/api/v1/app/integrity/acknowledge");
ToastUtils.success(this.$t("about.integrity_acknowledged_reset"));
this.visible = false;
} catch (e) {
ToastUtils.error(this.$t("about.failed_acknowledge_integrity"));
console.error(e);
}
},
},
};
</script>
<style scoped>
.text-white {
color: white !important;
}
</style>


@@ -4,34 +4,37 @@
type="button"
class="relative rounded-full p-1.5 sm:p-2 text-gray-600 dark:text-gray-300 hover:bg-gray-100 dark:hover:bg-zinc-800 transition-colors"
:title="$t('app.language')"
@click="toggleDropdown"
@click.stop="toggleDropdown"
>
<MaterialDesignIcon icon-name="translate" class="w-5 h-5 sm:w-6 sm:h-6" />
</button>
<Teleport to="body">
<div
v-if="isDropdownOpen"
v-click-outside="closeDropdown"
class="fixed w-48 bg-white dark:bg-zinc-900 border border-gray-200 dark:border-zinc-800 rounded-2xl shadow-xl z-[9999] overflow-hidden"
:style="dropdownStyle"
>
<div class="p-2">
<button
v-for="lang in languages"
:key="lang.code"
type="button"
class="w-full px-4 py-2 text-left rounded-lg hover:bg-gray-100 dark:hover:bg-zinc-800 transition-colors flex items-center justify-between"
:class="{
'bg-blue-50 dark:bg-blue-900/20 text-blue-600 dark:text-blue-400':
currentLanguage === lang.code,
'text-gray-900 dark:text-zinc-100': currentLanguage !== lang.code,
}"
@click="selectLanguage(lang.code)"
>
<span class="font-medium">{{ lang.name }}</span>
<MaterialDesignIcon v-if="currentLanguage === lang.code" icon-name="check" class="w-5 h-5" />
</button>
</div>
</div>
</Teleport>
</div>
</template>
@@ -62,10 +65,12 @@ export default {
data() {
return {
isDropdownOpen: false,
dropdownPosition: { top: 0, left: 0 },
languages: [
{ code: "en", name: "English" },
{ code: "de", name: "Deutsch" },
{ code: "ru", name: "Русский" },
{ code: "it", name: "Italiano" },
],
};
},
@@ -73,10 +78,27 @@ export default {
currentLanguage() {
return this.$i18n.locale;
},
dropdownStyle() {
return {
top: `${this.dropdownPosition.top}px`,
left: `${this.dropdownPosition.left}px`,
};
},
},
methods: {
toggleDropdown(event) {
this.isDropdownOpen = !this.isDropdownOpen;
if (this.isDropdownOpen) {
this.updateDropdownPosition(event);
}
},
updateDropdownPosition(event) {
const button = event.currentTarget;
const rect = button.getBoundingClientRect();
this.dropdownPosition = {
top: rect.bottom + 8,
left: Math.max(8, rect.right - 192), // 192px is w-48
};
},
closeDropdown() {
this.isDropdownOpen = false;


@@ -1,13 +1,27 @@
<template>
<div
v-if="iconName"
class="p-2 rounded-full"
:style="{ color: iconForegroundColour, 'background-color': iconBackgroundColour }"
v-if="customImage"
class="rounded-full overflow-hidden shrink-0 flex items-center justify-center"
:class="iconClass || 'size-6'"
:style="iconStyle"
>
<MaterialDesignIcon :icon-name="iconName" :class="iconClass" />
<img :src="customImage" class="w-full h-full object-cover" />
</div>
<div v-else class="bg-gray-200 dark:bg-zinc-700 text-gray-500 dark:text-gray-400 p-2 rounded-full">
<MaterialDesignIcon icon-name="account-outline" :class="iconClass" />
<div
v-else-if="iconName"
class="p-[10%] rounded-full shrink-0 flex items-center justify-center"
:style="[iconStyle, { 'background-color': finalBackgroundColor }]"
:class="iconClass || 'size-6'"
>
<MaterialDesignIcon :icon-name="iconName" class="size-full" :style="{ color: finalForegroundColor }" />
</div>
<div
v-else
class="bg-gray-100 dark:bg-zinc-800 text-gray-400 dark:text-zinc-500 p-[15%] rounded-full shrink-0 flex items-center justify-center border border-gray-200 dark:border-zinc-700"
:class="iconClass || 'size-6'"
:style="iconStyle"
>
<MaterialDesignIcon icon-name="account" class="w-full h-full" />
</template>
@@ -19,21 +33,41 @@ export default {
MaterialDesignIcon,
},
props: {
customImage: {
type: String,
default: "",
},
iconName: {
type: String,
default: "",
},
iconForegroundColour: {
type: String,
default: "",
default: "#6b7280",
},
iconBackgroundColour: {
type: String,
default: "",
default: "#e5e7eb",
},
iconClass: {
type: String,
default: "size-6",
default: "",
},
iconStyle: {
type: Object,
default: () => ({}),
},
},
computed: {
finalForegroundColor() {
return this.iconForegroundColour && this.iconForegroundColour !== ""
? this.iconForegroundColour
: "#6b7280";
},
finalBackgroundColor() {
return this.iconBackgroundColour && this.iconBackgroundColour !== ""
? this.iconBackgroundColour
: "#e5e7eb";
},
},
};
