Data Management Procedures
Careful management of the records within Atslegas keeps user access, key assignments, and audit logs consistent and recoverable. This section covers the procedures for backing up the system state and performing bulk data operations.
System Backups
The application utilizes a SQLite database for all persistent storage. Because SQLite stores the entire database in a single file, the backup process is straightforward.
Creating a Manual Backup
To create a backup of the current system state:
- Locate the database file: Refer to your `config.json` file under the `db.db_file` key (default is `database.db`).
- Copy the file: Stop the application service to ensure no active write operations are occurring, then copy the `.db` file to a secure location.
- Journal files: If the system is running in WAL (Write-Ahead Logging) mode, you may see files named `database.db-wal` and `database.db-shm`. Make sure the application is fully shut down before copying so that these logs are committed to the main file.
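The copy step above can also be scripted with Python's built-in `sqlite3` online backup API, which snapshots the database page by page and folds in any pending WAL frames. This is a minimal sketch; the file paths are assumptions and should match the `db.db_file` value in your `config.json`:

```python
import sqlite3

def backup_database(src_path: str, dst_path: str) -> None:
    """Snapshot src_path into dst_path using SQLite's online backup API."""
    src = sqlite3.connect(src_path)
    dst = sqlite3.connect(dst_path)
    try:
        src.backup(dst)  # copies every page; pending WAL frames are included
    finally:
        dst.close()
        src.close()

# Example paths; adjust to your deployment layout.
backup_database("database.db", "database-backup.db")
```

Unlike a plain file copy, the backup API produces a consistent snapshot even if a reader still has the database open, though stopping the application first remains the safest option.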
Restoring from Backup
- Stop the Flask application.
- Replace the existing `database.db` in the root directory with your backup file.
- Restart the application. The system will automatically detect the schema and resume operations.
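The replace step can be scripted as well. A minimal sketch, assuming the file paths shown; run it only while the application is stopped so no connection holds the database open:

```python
import shutil

def restore_backup(backup_path: str, db_path: str) -> None:
    """Overwrite the live SQLite file with a backup copy.

    Run this only while the Flask application is stopped, so no
    connection is holding the database (or its WAL files) open.
    """
    shutil.copyfile(backup_path, db_path)

# Example paths; adjust to your deployment layout.
# restore_backup("backups/database-backup.db", "database.db")
```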
Bulk Data Import
While the Admin Dashboard provides interfaces for individual entry creation, bulk loading of users or keys is typically handled via the REST API or direct database manipulation.
Programmatic Import via API
You can automate the import of users, groups, or keys by sending POST requests to the API endpoints defined in `api_server.py`.
Example: Bulk User Import Script (Python)
```python
import requests

API_BASE = "http://localhost:5000/api"
SESSION_COOKIE = "your_admin_session_cookie"  # an authenticated admin session cookie

users_to_import = [
    {"username": "jdoe", "password": "securepassword123"},
    {"username": "asmith", "password": "securepassword456"},
]

for user in users_to_import:
    response = requests.post(
        f"{API_BASE}/user/add",
        json=user,
        cookies={"session": SESSION_COOKIE},
    )
    if response.status_code == 201:
        print(f"Successfully imported {user['username']}")
    else:
        print(f"Failed to import {user['username']}: {response.status_code}")
```
Database Initialization
For a fresh installation, the system automatically initializes the schema using `schema.sql`. You can pre-populate this SQL file with `INSERT` statements if you need the system to launch with a specific dataset.
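As an illustration of seeding via the schema file, the snippet below runs a schema-plus-`INSERT` script against a fresh database. The table and column names here are made up for the example; the real definitions live in `schema.sql`:

```python
import sqlite3

# Illustrative schema with a seed row, mimicking INSERT statements
# appended to schema.sql (names are examples, not the real schema).
SCHEMA = """
CREATE TABLE IF NOT EXISTS users (
    id INTEGER PRIMARY KEY,
    username TEXT UNIQUE NOT NULL
);
INSERT INTO users (username) VALUES ('admin');
"""

conn = sqlite3.connect(":memory:")
conn.executescript(SCHEMA)  # runs DDL and seed INSERTs in one pass
seeded = conn.execute("SELECT username FROM users").fetchall()
```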
Client-Side Data Sync
The Admin Dashboard uses the browser's `localStorage` to cache data for performance and offline viewing.
- Storage Keys: Data is stored under versioned keys such as `atslegas_users_v1`, `atslegas_keys_v1`, and `atslegas_persons_v1`.
- Forcing a Refresh: If the dashboard displays stale data, the application automatically attempts to synchronize with the API on page load. To force a hard reset of client-side data, clear the browser's Local Storage or use the "Reload" functionality within the admin interface, which triggers the `loadData()` routine in `admin-common.js`.
Exporting Logs for Audit
To export the system audit logs (Key Logs):
- Navigate to the Admin Dashboard.
- The logs are fetched from the `/api/keyLog/list` endpoint.
- To save these for external analysis (e.g., Excel), you can capture the JSON response from the browser's network tab or use the API directly:
Note: Timestamps are accepted in Unix format.

```shell
curl -X GET "http://localhost:5000/api/keyLog/list?start_time=1704067200&end_time=1735689600"
```
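A saved JSON response can then be flattened to CSV for spreadsheet tools. This is a minimal sketch assuming the log entries carry `timestamp`, `key`, `user`, and `action` fields; the real field names come from the actual API response:

```python
import csv
import json

# Example payload; the actual field names depend on the
# /api/keyLog/list response, so adjust fieldnames accordingly.
raw = '[{"timestamp": 1704067200, "key": "K-101", "user": "jdoe", "action": "checkout"}]'
logs = json.loads(raw)

with open("key_logs.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["timestamp", "key", "user", "action"])
    writer.writeheader()
    writer.writerows(logs)
```

The resulting `key_logs.csv` opens directly in Excel or LibreOffice with one log entry per row.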