Compare commits
26 Commits
f4d0399771 ... creation-s

SHA1:
07eae87855
080537fe6d
ecd84bf242
981a5d34c5
681cc7cb83
fbc3be564f
4871187726
67b482ff13
34bdea2269
20e16f6ae2
f42e38228a
3712667a04
345190dfeb
5712d898a5
3bd0a02b62
167a1fbbc2
f11e2502dd
43bb2c40de
54870b0d0f
a50d951af7
2e057eee01
bc33bd48e8
62decb3314
339377b838
71ea6423bc
cad2390649
@@ -1,3 +1,2 @@
SESAM_FSV_VERSION=1.40.13
SESAM_INI_PATH=/etc/opt/santesocial/fsv/${SESAM_FSV_VERSION}/conf/sesam.ini
DATABASE_URL=sqlite://p4pillon.sqlite?mode=rwc

@@ -1,3 +1,2 @@
SESAM_FSV_VERSION=1.40.13
SESAM_INI_PATH=${ALLUSERSPROFILE}\\santesocial\\fsv\\${SESAM_FSV_VERSION}\\conf\\sesam.ini
DATABASE_URL=sqlite://p4pillon.sqlite?mode=rwc
.gitignore (vendored): 3
@@ -23,6 +23,3 @@ target/

# Ignore .env files
.env

# Development Database
*.sqlite

Cargo.lock (generated): 2601
File diff suppressed because it is too large
Cargo.toml: 17
@@ -1,20 +1,9 @@
[workspace]
resolver = "2"
members = [
    "crates/app",
    "crates/sesam-vitale",
    "crates/backend",
    "crates/desktop",
    "crates/sesam-vitale",
    "crates/services-sesam-vitale-sys",
    "crates/utils",
    "migration",
    "entity",
    ".",
]

[workspace.dependencies]
anyhow = "1.0"
axum = "0.7.5"
dotenv = "0.15"
sea-orm-cli = "1.0.1"
sea-orm = "1.0.1"
thiserror = "1.0"
tokio = "1.39.1"
README.md: 106
@@ -2,12 +2,14 @@

Free and open-source pharmacy software.

## Crates
## Application modules

- `app`: The software's interface, served by an Axum-powered web server. Can be used as a standalone endpoint or embedded in the `desktop` client
- `desktop`: Desktop client powered by Tauri, embedding the `app` web server
- `sesam-vitale`: Library managing the SESAM-Vitale services (reading CPS and Vitale cards, teleservices, ...)
- `utils`: Utility function library
- `crates`: Root folder of the Rust modules
- `crates/backend`: Backend server powered by Axum, exposing a REST API
- `crates/desktop`: Desktop client powered by Tauri, exposing the `frontend`
- `crates/sesam-vitale`: Library managing the SESAM-Vitale services (reading CPS and Vitale cards, teleservices, ...)
- `crates/utils`: Utility function library
- `frontend`: The software's web interface, powered by Nuxt.js

## Installation

@@ -26,98 +28,50 @@ Des exemples de fichiers de configuration sont disponibles à la racine du proje

### Prerequisites

#### Frontend (Nuxt + Typescript)

The frontend is powered by Nuxt.js, a TypeScript framework for Vue.js. For development, the following dependencies need to be installed:
- [Bun](https://bun.sh/docs/installation), a package manager equivalent to `npm`, but faster

#### Tauri CLI

TODO: Tauri CLI, is it really needed?

The Tauri CLI is required to run the `desktop` client. It can be installed via Cargo:

```bash
cargo install tauri-cli --version "^2.0.0-beta"
cargo install tauri-cli --version "^2.0.0-rc"
```

#### Tailwindcss CLI

The Tailwindcss CLI is required to generate the `crates/app/assets/css/style.css` file.

The installation documentation is available on the official Tailwindcss website: https://tailwindcss.com/blog/standalone-cli

The version currently in use is [`v3.4.7`](https://github.com/tailwindlabs/tailwindcss/releases/tag/v3.4.7)

#### SeaORM CLI

SeaORM is our ORM. The SeaORM CLI is required to generate the database models and the associated migrations. It can be installed via Cargo:

```bash
cargo install sea-orm-cli
```

The application looks up the database connection information in the `DATABASE_URL` variable, imported from the [configuration files](#fichiers-de-configuration).

```.env
DATABASE_URL=sqlite://p4pillon.sqlite?mode=rwc
```

However, using the SeaORM CLI requires the database connection information to be provided in a `.env` file located at the root of the project.

> Tip: use a symbolic link to avoid duplicating the `.env` file.
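For illustration only, here is a minimal sketch of how a crate in this workspace could open the connection described by `DATABASE_URL` (it assumes the `dotenv` and `sea-orm` dependencies declared in the workspace `Cargo.toml`; the function name is illustrative and not part of this changeset):

```rust
use sea_orm::{Database, DatabaseConnection, DbErr};

/// Loads `.env` if present, then opens a SeaORM connection from DATABASE_URL.
pub async fn connect_from_env() -> Result<DatabaseConnection, DbErr> {
    dotenv::dotenv().ok();
    let url = std::env::var("DATABASE_URL")
        .expect("DATABASE_URL must be set, e.g. sqlite://p4pillon.sqlite?mode=rwc");
    Database::connect(url).await
}
```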
#### SESAM-Vitale

The `sesam-vitale` crate requires the dynamic libraries provided by the FSV package and the CryptolibCPS. The installation instructions are available in the `sesam-vitale` crate's [README](crates/sesam-vitale/README.md).

#### Backend Hot-reload

See the `backend` crate's [README](crates/backend/README.md) for the development prerequisites of the backend server.

### Running

The software as a whole can be started with the following command:
To run the application in development mode, several components have to be run simultaneously:

```bash
# Start the backend server
systemfd --no-pid -s http::3030 -- cargo watch -x 'run --bin backend'
```

```bash
# Start the user interface (frontend or desktop)
# - frontend (web server, reachable from a browser)
bun run --cwd frontend/ dev
# - desktop (desktop client, based on Tauri)
cargo tauri dev
```
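For context on the `systemfd`/`cargo watch` line above: the backend binary reuses the socket passed by `systemfd` when one is available and otherwise binds its own port. A condensed sketch of that pattern follows (the full version is `crates/backend/src/main.rs`, added later in this changeset):

```rust
use listenfd::ListenFd;
use tokio::net::TcpListener;

#[tokio::main]
async fn main() {
    // Reuse the listener handed over by systemfd (hot reload), or bind a fresh one.
    let mut listenfd = ListenFd::from_env();
    let listener = match listenfd.take_tcp_listener(0).unwrap() {
        Some(std_listener) => {
            std_listener.set_nonblocking(true).unwrap();
            TcpListener::from_std(std_listener).unwrap()
        }
        None => TcpListener::bind("0.0.0.0:8080").await.unwrap(),
    };
    axum::serve(listener, backend::get_router()).await.unwrap();
}
```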
/!\ Warning: for now, starting the `desktop` client does not automatically generate the `crates/app/assets/css/style.css` file. If the web interfaces are modified, it therefore has to be regenerated as described in the `app` crate's [README](crates/app/README.md).

If you want to start the components separately, launch instructions are available in the READMEs of the individual crates.

- [app](crates/app/README.md)
- [sesam-vitale](crates/sesam-vitale/README.md)

## Automatic reloading

To speed up development, there is a library that automatically recompiles pending changes: [`cargo-watch`](https://github.com/watchexec/cargo-watch) re-runs a `cargo` command whenever a file is modified (example: `cargo run` --> `cargo watch -x run`).

Here is the command to install it in a _package_:
```bash
cargo add cargo-watch --dev --package app
```

The [`.ignore`](./ignore) file makes it possible to exclude certain files so that recompilation is not triggered needlessly.

⚠️ The library is not compatible with _Windows 7_ and earlier versions of _Windows_.

## Build

Package the desktop client
To package the `desktop` client, the Tauri CLI has to be used; it takes care of building the `frontend` and integrating it into the bundle:

```bash
cargo tauri build
```

## Database management

### Creating a migration

```bash
sea-orm-cli migrate generate <nom_de_la_migration>
```

This command generates a migration file, to be adapted, in the `migration/src` folder.
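For orientation, the generated file roughly follows the standard `sea-orm-migration` skeleton below before being adapted (a hedged sketch; the `example` table and its columns are placeholders, not part of this changeset):

```rust
use sea_orm_migration::prelude::*;

#[derive(DeriveMigrationName)]
pub struct Migration;

#[async_trait::async_trait]
impl MigrationTrait for Migration {
    async fn up(&self, manager: &SchemaManager) -> Result<(), DbErr> {
        // Replace with the actual schema change for this migration.
        manager
            .create_table(
                Table::create()
                    .table(Alias::new("example"))
                    .if_not_exists()
                    .col(
                        ColumnDef::new(Alias::new("id"))
                            .integer()
                            .not_null()
                            .auto_increment()
                            .primary_key(),
                    )
                    .col(ColumnDef::new(Alias::new("title")).string().not_null())
                    .to_owned(),
            )
            .await
    }

    async fn down(&self, manager: &SchemaManager) -> Result<(), DbErr> {
        manager
            .drop_table(Table::drop().table(Alias::new("example")).to_owned())
            .await
    }
}
```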
### Applying the migrations

```bash
sea-orm-cli migrate up
```

### Generating the entities

```bash
sea-orm-cli generate entity -o entity/src/entities
```
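As a point of reference, an entity generated by the command above has roughly the following shape (a hedged sketch of the standard SeaORM entity layout; the `debug` table with its `title`/`text` columns matches the `entity::debug` model used elsewhere in this changeset):

```rust
use sea_orm::entity::prelude::*;

#[derive(Clone, Debug, PartialEq, DeriveEntityModel)]
#[sea_orm(table_name = "debug")]
pub struct Model {
    #[sea_orm(primary_key)]
    pub id: i32,
    pub title: String,
    pub text: String,
}

#[derive(Copy, Clone, Debug, EnumIter, DeriveRelation)]
pub enum Relation {}

impl ActiveModelBehavior for ActiveModel {}
```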
@@ -6,28 +6,16 @@ edition = "2021"
[dependencies]
askama = "0.12.1"
askama_axum = "0.4.0"
axum.workspace = true
axum = "0.7.5"
axum-htmx = { version = "0.6", features = ["auto-vary"] }
futures = "0.3.30"
listenfd = "1.0.1"
notify = "6.1.1"
sea-orm = { workspace = true, features = [
    # Same `ASYNC_RUNTIME` and `DATABASE_DRIVER` as in the migration crate
    "sqlx-sqlite",
    "runtime-tokio-rustls",
    "macros",
] }
serde = { version = "1.0.204", features = ["derive"] }
thiserror.workspace = true
tokio = { workspace = true, features = ["macros", "rt-multi-thread"] }
thiserror = "1.0.63"
tokio = { version = "1.39.1", features = ["macros", "rt-multi-thread"] }
tower-http = { version = "0.5.2", features = ["fs"] }
tower-livereload = "0.9.3"

entity = { path = "../../entity" }
migration = { path = "../../migration" }
utils = { path = "../utils" }

[dev-dependencies]
cargo-watch = "8.5.1"
systemfd = "0.4.0"
sea-orm-cli.workspace = true
|
@@ -2,15 +2,6 @@

- Get the TailwindCSS binary: https://tailwindcss.com/blog/standalone-cli

## Configuration

> Tip: when the `App` crate is run directly for development purposes, the configuration system will not use a `.env` file located at the root of the Rust workspace. To avoid duplicating the `.env` file, a symbolic link to it can be created inside the `App` crate:

```bash
cd crates/app
ln -s ../../.env .env
```

## Execution

- Run tailwindcss in watch mode in a terminal:
@@ -580,10 +580,6 @@ video {
  margin-bottom: 1rem;
}

.mb-1 {
  margin-bottom: 0.25rem;
}

.mb-2 {
  margin-bottom: 0.5rem;
}

@@ -608,10 +604,6 @@ video {
  margin-top: 1rem;
}

.mb-5 {
  margin-bottom: 1.25rem;
}

.block {
  display: block;
}

@@ -624,10 +616,6 @@ video {
  display: inline-flex;
}

.table {
  display: table;
}

.grid {
  display: grid;
}

@@ -708,10 +696,6 @@ video {
  width: 100%;
}

.max-w-3xl {
  max-width: 48rem;
}

.max-w-7xl {
  max-width: 80rem;
}

@@ -772,12 +756,6 @@ video {
  margin-left: calc(0.75rem * calc(1 - var(--tw-space-x-reverse)));
}

.space-y-4 > :not([hidden]) ~ :not([hidden]) {
  --tw-space-y-reverse: 0;
  margin-top: calc(1rem * calc(1 - var(--tw-space-y-reverse)));
  margin-bottom: calc(1rem * var(--tw-space-y-reverse));
}

.divide-y > :not([hidden]) ~ :not([hidden]) {
  --tw-divide-y-reverse: 0;
  border-top-width: calc(1px * calc(1 - var(--tw-divide-y-reverse)));

@@ -823,10 +801,6 @@ video {
  border-width: 2px;
}

.border-b {
  border-bottom-width: 1px;
}

.border-dashed {
  border-style: dashed;
}

@@ -876,11 +850,6 @@ video {
  background-color: rgb(255 255 255 / var(--tw-bg-opacity));
}

.bg-blue-600 {
  --tw-bg-opacity: 1;
  background-color: rgb(37 99 235 / var(--tw-bg-opacity));
}

.p-2 {
  padding: 0.5rem;
}

@@ -889,10 +858,6 @@ video {
  padding: 1rem;
}

.p-6 {
  padding: 1.5rem;
}

.px-3 {
  padding-left: 0.75rem;
  padding-right: 0.75rem;

@@ -903,59 +868,26 @@ video {
  padding-right: 1rem;
}

.px-5 {
  padding-left: 1.25rem;
  padding-right: 1.25rem;
}

.px-6 {
  padding-left: 1.5rem;
  padding-right: 1.5rem;
}

.py-10 {
  padding-top: 2.5rem;
  padding-bottom: 2.5rem;
}

.py-12 {
  padding-top: 3rem;
  padding-bottom: 3rem;
}

.py-2 {
  padding-top: 0.5rem;
  padding-bottom: 0.5rem;
}

.py-2\.5 {
  padding-top: 0.625rem;
  padding-bottom: 0.625rem;
}

.py-3 {
  padding-top: 0.75rem;
  padding-bottom: 0.75rem;
}

.py-4 {
  padding-top: 1rem;
  padding-bottom: 1rem;
}

.py-8 {
  padding-top: 2rem;
  padding-bottom: 2rem;
}

.text-left {
  text-align: left;
}

.text-center {
  text-align: center;
}

.text-2xl {
  font-size: 1.5rem;
  line-height: 2rem;

@@ -976,24 +908,10 @@ video {
  line-height: 1.25rem;
}

.text-xl {
  font-size: 1.25rem;
  line-height: 1.75rem;
}

.text-xs {
  font-size: 0.75rem;
  line-height: 1rem;
}

.font-bold {
  font-weight: 700;
}

.font-light {
  font-weight: 300;
}

.font-medium {
  font-weight: 500;
}

@@ -1002,10 +920,6 @@ video {
  font-weight: 600;
}

.uppercase {
  text-transform: uppercase;
}

.leading-tight {
  line-height: 1.25;
}

@@ -1050,11 +964,6 @@ video {
  background-color: rgb(243 244 246 / var(--tw-bg-opacity));
}

.hover\:bg-blue-700:hover {
  --tw-bg-opacity: 1;
  background-color: rgb(29 78 216 / var(--tw-bg-opacity));
}

.focus\:outline-none:focus {
  outline: 2px solid transparent;
  outline-offset: 2px;

@@ -1082,24 +991,11 @@ video {
  --tw-ring-color: rgb(209 213 219 / var(--tw-ring-opacity));
}

.focus\:ring-blue-300:focus {
  --tw-ring-opacity: 1;
  --tw-ring-color: rgb(147 197 253 / var(--tw-ring-opacity));
}

@media (min-width: 640px) {
  .sm\:max-w-md {
    max-width: 28rem;
  }

  .sm\:grid-cols-2 {
    grid-template-columns: repeat(2, minmax(0, 1fr));
  }

  .sm\:p-8 {
    padding: 2rem;
  }

  .sm\:px-6 {
    padding-left: 1.5rem;
    padding-right: 1.5rem;

@@ -1155,12 +1051,6 @@ video {
  margin-left: calc(2rem * calc(1 - var(--tw-space-x-reverse)));
}

.md\:space-y-5 > :not([hidden]) ~ :not([hidden]) {
  --tw-space-y-reverse: 0;
  margin-top: calc(1.25rem * calc(1 - var(--tw-space-y-reverse)));
  margin-bottom: calc(1.25rem * var(--tw-space-y-reverse));
}

.md\:border-0 {
  border-width: 0px;
}

@@ -1182,11 +1072,6 @@ video {
  padding: 1.5rem;
}

.md\:text-2xl {
  font-size: 1.5rem;
  line-height: 2rem;
}

.md\:text-blue-700 {
  --tw-text-opacity: 1;
  color: rgb(29 78 216 / var(--tw-text-opacity));

@@ -1203,10 +1088,6 @@ video {
}

@media (min-width: 1024px) {
  .lg\:mt-5 {
    margin-top: 1.25rem;
  }

  .lg\:grid-cols-4 {
    grid-template-columns: repeat(4, minmax(0, 1fr));
  }

@@ -1221,20 +1102,12 @@ video {
  --tw-space-x-reverse: 1;
}

.rtl\:text-right:where([dir="rtl"], [dir="rtl"] *) {
  text-align: right;
}

@media (prefers-color-scheme: dark) {
  .dark\:divide-gray-600 > :not([hidden]) ~ :not([hidden]) {
    --tw-divide-opacity: 1;
    border-color: rgb(75 85 99 / var(--tw-divide-opacity));
  }

  .dark\:border {
    border-width: 1px;
  }

  .dark\:border-gray-600 {
    --tw-border-opacity: 1;
    border-color: rgb(75 85 99 / var(--tw-border-opacity));

@@ -1260,11 +1133,6 @@ video {
  background-color: rgb(17 24 39 / var(--tw-bg-opacity));
}

.dark\:bg-blue-600 {
  --tw-bg-opacity: 1;
  background-color: rgb(37 99 235 / var(--tw-bg-opacity));
}

.dark\:text-gray-200 {
  --tw-text-opacity: 1;
  color: rgb(229 231 235 / var(--tw-text-opacity));

@@ -1300,11 +1168,6 @@ video {
  background-color: rgb(55 65 81 / var(--tw-bg-opacity));
}

.dark\:hover\:bg-blue-700:hover {
  --tw-bg-opacity: 1;
  background-color: rgb(29 78 216 / var(--tw-bg-opacity));
}

.dark\:hover\:text-white:hover {
  --tw-text-opacity: 1;
  color: rgb(255 255 255 / var(--tw-text-opacity));

@@ -1314,11 +1177,6 @@ video {
  --tw-ring-opacity: 1;
  --tw-ring-color: rgb(75 85 99 / var(--tw-ring-opacity));
}

.dark\:focus\:ring-blue-800:focus {
  --tw-ring-opacity: 1;
  --tw-ring-color: rgb(30 64 175 / var(--tw-ring-opacity));
}
}

@media (min-width: 768px) {
@@ -1,11 +0,0 @@
use migration::{Migrator, MigratorTrait};
use sea_orm::{Database, DatabaseConnection, DbErr};
use std::env;

pub async fn get_connection() -> Result<DatabaseConnection, DbErr> {
    let database_url = env::var("DATABASE_URL").expect("DATABASE_URL must be set");

    let db_connection = Database::connect(database_url).await?;
    Migrator::up(&db_connection, None).await?;
    Ok(db_connection)
}
@@ -2,14 +2,8 @@ use std::path::PathBuf;

use axum::http::{StatusCode, Uri};
use axum_htmx::AutoVaryLayer;
use sea_orm::DatabaseConnection;
use thiserror::Error;
use tower_http::services::ServeDir;

use ::utils::config::{load_config, ConfigError};

pub mod db;

mod menu;
mod pages;

@@ -17,31 +11,11 @@ async fn fallback(uri: Uri) -> (StatusCode, String) {
    (StatusCode::NOT_FOUND, format!("No route for {uri}"))
}

#[derive(Error, Debug)]
pub enum InitError {
    #[error(transparent)]
    ConfigError(#[from] ConfigError),
}

pub fn init() -> Result<(), InitError> {
    load_config(None)?;
    Ok(())
}

#[derive(Clone)]
pub struct AppState {
    db_connection: DatabaseConnection,
}

pub async fn get_router(assets_path: PathBuf) -> axum::Router<()> {
    let db_connection = db::get_connection().await.unwrap();
    let state: AppState = AppState { db_connection };

    axum::Router::new()
        .nest_service("/assets", ServeDir::new(assets_path))
        .merge(pages::get_routes())
        .fallback(fallback)
        .with_state(state)
        // The AutoVaryLayer is used to avoid cache issues with htmx (cf: https://github.com/robertwayne/axum-htmx?tab=readme-ov-file#auto-caching-management)
        .layer(AutoVaryLayer)
}
@@ -10,7 +10,7 @@ use tokio::net::TcpListener;
use tower_livereload::predicate::Predicate;
use tower_livereload::LiveReloadLayer;

use ::app::{get_router, init, InitError};
use ::app::get_router;

#[derive(Error, Debug)]
pub enum AppError {
@@ -20,10 +20,6 @@ pub enum AppError {
    NotifyWatcher(#[from] notify::Error),
    #[error("Missing environment variable {var}")]
    MissingEnvVar { var: &'static str },
    #[error("Error with the database connection")]
    DatabaseConnection(#[from] sea_orm::DbErr),
    #[error("Error while initialising the app")]
    Initialisation(#[from] InitError),
}

/// Nous filtrons les requêtes de `htmx` pour ne pas inclure le script _JS_ qui gère le rechargement
@@ -65,8 +61,6 @@ fn get_livereload_layer(

#[tokio::main]
async fn main() -> Result<(), AppError> {
    init()?;

    let manifest_dir = env::var("CARGO_MANIFEST_DIR").map_err(|_| AppError::MissingEnvVar {
        var: "CARGO_MANIFEST_DIR",
    })?;
@@ -19,10 +19,5 @@ pub fn get_menu_items() -> Vec<MenuItem> {
            label: "CPS".to_string(),
            href: "/cps".to_string(),
        },
        MenuItem {
            id: "debug".to_string(),
            label: "DEBUG".to_string(),
            href: "/debug".to_string(),
        },
    ]
}
@@ -1,72 +0,0 @@
{% extends "base.html" %}
{% import "navbar/navbar.html" as navbar -%}

{% block title %}Pharma Libre - Debug{% endblock %}

{% block body %}
{% call navbar::navbar(current="debug") %}
<div class="py-10">
    <header id="page-header">
        <div class="mx-auto max-w-7xl px-4 sm:px-6 lg:px-8">
            <h1 id="page-title" class="text-3xl font-bold leading-tight tracking-tight text-gray-900">
                DEBUG
            </h1>
        </div>
    </header>
    <main id="page-main">
        <div class="mx-auto max-w-7xl py-12 sm:px-6 lg:px-8">
            <div class="mx-auto max-w-3xl">
                <div
                    class="w-full p-6 bg-white rounded-lg shadow dark:border md:mt-0 sm:max-w-md dark:bg-gray-800 dark:border-gray-700 sm:p-8">
                    <h1
                        class="mb-1 text-xl font-bold leading-tight tracking-tight text-gray-900 md:text-2xl dark:text-white">
                        Base de données
                    </h1>
                    <p class="font-light text-gray-500 dark:text-gray-400 mb-5">
                        Données extraites de la base de donnée à des fins de debug
                    </p>
                    <table class="w-full text-sm text-left rtl:text-right text-gray-500 dark:text-gray-400">
                        <thead class="text-xs text-gray-700 uppercase bg-gray-50 dark:bg-gray-700 dark:text-gray-400">
                            <tr>
                                <th scope="col" class="px-6 py-3">
                                    ID
                                </th>
                                <th scope="col" class="px-6 py-3">
                                    Value
                                </th>
                            </tr>
                        </thead>
                        <tbody>
                            <tr class="bg-white border-b dark:bg-gray-800 dark:border-gray-700">
                                <th scope="row"
                                    class="px-6 py-4 font-medium text-gray-900 whitespace-nowrap dark:text-white">
                                    db_ping_status
                                </th>
                                <td class="px-6 py-4">
                                    {{ db_ping_status }}
                                </td>
                            </tr>
                            <tr class="bg-white border-b dark:bg-gray-800 dark:border-gray-700">
                                <th scope="row"
                                    class="px-6 py-4 font-medium text-gray-900 whitespace-nowrap dark:text-white">
                                    debug_entries_count
                                </th>
                                <td class="px-6 py-4">
                                    {{ debug_entries_count }}
                                </td>
                            </tr>
                        </tbody>
                    </table>
                    <div class="mt-4 space-y-4 lg:mt-5 md:space-y-5">
                        <button type="button"
                            class="w-full text-white bg-blue-600 hover:bg-blue-700 focus:ring-4 focus:outline-none focus:ring-blue-300 font-medium rounded-lg text-sm px-5 py-2.5 text-center dark:bg-blue-600 dark:hover:bg-blue-700 dark:focus:ring-blue-800"
                            hx-trigger="click" hx-post="/debug/add_random" hx-swap="none">
                            Add random debug entry
                        </button>
                    </div>
                </div>
            </div>
        </div>
    </main>
</div>
{% endblock %}
@@ -1,48 +0,0 @@
use askama_axum::Template;
use axum::{extract::State, routing};

use ::entity::{debug, debug::Entity as DebugEntity};
use axum_htmx::HxRequest;
use sea_orm::*;

use crate::AppState;

async fn get_debug_entries(db: &DatabaseConnection) -> Result<Vec<debug::Model>, DbErr> {
    DebugEntity::find().all(db).await
}

async fn add_random_debug_entry(State(AppState { db_connection }): State<AppState>) {
    let random_entry = debug::ActiveModel {
        title: Set("Random title".to_string()),
        text: Set("Random text".to_string()),
        ..Default::default()
    };
    random_entry.insert(&db_connection).await.unwrap();
}

#[derive(Template)]
#[template(path = "debug.html")]
struct GetDebugTemplate {
    hx_request: bool,
    db_ping_status: bool,
    debug_entries_count: usize,
}

async fn debug(
    HxRequest(hx_request): HxRequest,
    State(AppState { db_connection }): State<AppState>,
) -> GetDebugTemplate {
    let db_ping_status = db_connection.ping().await.is_ok();
    let debug_entries = get_debug_entries(&db_connection).await.unwrap();
    GetDebugTemplate {
        hx_request,
        db_ping_status,
        debug_entries_count: debug_entries.len(),
    }
}

pub fn get_routes() -> axum::Router<crate::AppState> {
    axum::Router::new()
        .route("/", routing::get(debug))
        .route("/add_random", routing::post(add_random_debug_entry))
}
@@ -1,14 +1,10 @@
use axum::{routing, Router};

use crate::AppState;

mod cps;
mod debug;
mod home;

pub fn get_routes() -> Router<AppState> {
pub fn get_routes() -> Router {
    Router::new()
        .route("/", routing::get(home::home))
        .route("/cps", routing::get(cps::cps))
        .nest("/debug", debug::get_routes())
}
crates/backend/Cargo.toml (Normal file): 14
@@ -0,0 +1,14 @@
[package]
name = "backend"
version = "0.1.0"
edition = "2021"

[dependencies]
anyhow = "1.0.89"
axum = "0.7.6"
listenfd = "1.0.1"
tokio = { version = "1.40.0", features = ["macros", "rt-multi-thread"] }

[dev-dependencies]
cargo-watch = "8.5.2"
systemfd = "0.4.3"
crates/backend/README.md (Normal file): 19
@@ -0,0 +1,19 @@
# Backend

This is a backend server, based on axum, providing centralised management of data access.

## Prerequisites

In development, the hot-reload mechanism requires `cargo-watch` and `systemfd`. To install them, run the following command:

```bash
cargo install cargo-watch systemfd
```

## Development

To start the server in development mode, run the following command:

```bash
systemfd --no-pid -s http::3030 -- cargo watch -x 'run --bin backend'
```
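Because the router lives in `crates/backend/src/lib.rs` (added below), it can also be exercised without binding a port. A minimal sketch using axum's `oneshot` testing pattern (illustrative only; it assumes `tower` is available as a dev-dependency, which this changeset does not add):

```rust
use axum::body::Body;
use axum::http::{Request, StatusCode};
use tower::ServiceExt; // provides `oneshot`

use backend::get_router;

#[tokio::test]
async fn root_route_answers() {
    // Drive a single request through the router without starting a server.
    let response = get_router()
        .oneshot(Request::builder().uri("/").body(Body::empty()).unwrap())
        .await
        .unwrap();
    assert_eq!(response.status(), StatusCode::OK);
}
```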
crates/backend/src/lib.rs (Normal file): 37
@@ -0,0 +1,37 @@
use anyhow::Error as AnyError;
use axum::http::{StatusCode, Uri};
use axum::response::{IntoResponse, Response};
use axum::{routing::get, Router};

pub fn get_router() -> Router {
    Router::new()
        .route("/", get(|| async { "Hello, world!" }))
        .fallback(fallback)
}

async fn fallback(uri: Uri) -> (StatusCode, String) {
    (StatusCode::NOT_FOUND, format!("No route for {uri}"))
}

struct AppError(AnyError);

// To automatically convert `AppError` into a response
impl IntoResponse for AppError {
    fn into_response(self) -> Response {
        (
            StatusCode::INTERNAL_SERVER_ERROR,
            format!("Internal Server Error: {}", self.0),
        )
            .into_response()
    }
}

// To automatically convert `AnyError` into `AppError`
impl<E> From<E> for AppError
where
    E: Into<AnyError>,
{
    fn from(err: E) -> Self {
        Self(err.into())
    }
}
crates/backend/src/main.rs (Normal file): 24
@@ -0,0 +1,24 @@
use listenfd::ListenFd;
use tokio::net::TcpListener;

use backend::get_router;

#[tokio::main]
async fn main() {
    let app = get_router();

    let mut listenfd = ListenFd::from_env();

    let listener = match listenfd.take_tcp_listener(0).unwrap() {
        // if we are given a tcp listener on listen fd 0, we use that one
        Some(listener) => {
            listener.set_nonblocking(true).unwrap();
            TcpListener::from_std(listener).unwrap()
        }
        // otherwise fall back to local listening
        None => TcpListener::bind("0.0.0.0:8080").await.unwrap(),
    };

    println!("Listening on {}", listener.local_addr().unwrap());
    axum::serve(listener, app).await.unwrap();
}
@@ -10,16 +10,11 @@ name = "desktop_lib"
crate-type = ["lib", "cdylib", "staticlib"]

[build-dependencies]
tauri-build = { version = "2.0.0-beta", features = [] }
tauri-build = { version = "2.0.0-rc", features = [] }

[dependencies]
axum.workspace = true
bytes = "1.6.1"
http = "1.1.0"
tauri = { version = "2.0.0-beta", features = [] }
thiserror.workspace = true
tower = "0.4.13"
tokio.workspace = true

app = { path = "../app" }
tauri = { version = "2.0.0-rc", features = [] }
tauri-plugin-shell = "2.0.0-rc"
serde = { version = "1", features = ["derive"] }
serde_json = "1"
crates/desktop/capabilities/default.json (Normal file): 10
@@ -0,0 +1,10 @@
{
  "$schema": "../gen/schemas/desktop-schema.json",
  "identifier": "default",
  "description": "Capability for the main window",
  "windows": ["main"],
  "permissions": [
    "core:default",
    "shell:allow-open"
  ]
}
@@ -1,92 +1,14 @@
use axum::body::{to_bytes, Body};
use axum::Router;
use bytes::Bytes;
use http::{request, response, Request, Response};
use std::path::PathBuf;
use std::sync::Arc;
use tauri::path::BaseDirectory;
use tauri::Manager;
use thiserror::Error;
use tokio::sync::{Mutex, MutexGuard};
use tower::{Service, ServiceExt};

use ::app::init;

#[derive(Error, Debug)]
pub enum DesktopError {
    #[error("Axum error:\n{0}")]
    Axum(#[from] axum::Error),
    #[error("Infallible error")]
    Infallible(#[from] std::convert::Infallible),
}

/// Process requests sent to Tauri (with the `axum://` protocol) and handle them with Axum
/// When an error occurs, this function is expected to panic, which should result in a 500 error
/// being sent to the client, so we let the client handle the error recovering
async fn process_tauri_request(
    tauri_request: Request<Vec<u8>>,
    mut router: MutexGuard<'_, Router>,
) -> Result<Response<Vec<u8>>, DesktopError> {
    let (parts, body): (request::Parts, Vec<u8>) = tauri_request.into_parts();
    let axum_request: Request<Body> = Request::from_parts(parts, body.into());

    let axum_response: Response<Body> = router
        .as_service()
        .ready()
        .await
        .map_err(DesktopError::Infallible)?
        .call(axum_request)
        .await
        .map_err(DesktopError::Infallible)?;

    let (parts, body): (response::Parts, Body) = axum_response.into_parts();
    let body: Bytes = to_bytes(body, usize::MAX).await?;

    let tauri_response: Response<Vec<u8>> = Response::from_parts(parts, body.into());
    Ok(tauri_response)
// Learn more about Tauri commands at https://tauri.app/v1/guides/features/command
#[tauri::command]
fn greet(name: &str) -> String {
    format!("Hello, {}! You've been greeted from Rust!", name)
}

#[cfg_attr(mobile, tauri::mobile_entry_point)]
pub fn run() {
    init().expect("Failed to initialize the application");

    tauri::Builder::default()
        .setup(|app| {
            let assets_path: PathBuf = app
                .path()
                .resolve("assets", BaseDirectory::Resource)
                .expect("Assets path should be resolvable");

            // Adds Axum router to application state
            // This makes it so we can retrieve it from any app instance (see bellow)
            let router = Arc::new(Mutex::new(app::get_router(assets_path)));

            app.manage(router);

            Ok(())
        })
        .register_asynchronous_uri_scheme_protocol("axum", move |app, request, responder| {
            // Retrieve the router from the application state and clone it for the async block
            let router = Arc::clone(&app.state::<Arc<Mutex<axum::Router>>>());

            // Spawn a new async task to process the request
            tauri::async_runtime::spawn(async move {
                let router = router.lock().await;
                match process_tauri_request(request, router).await {
                    Ok(response) => responder.respond(response),
                    Err(err) => {
                        let body = format!("Failed to process an axum:// request:\n{}", err);
                        responder.respond(
                            http::Response::builder()
                                .status(http::StatusCode::BAD_REQUEST)
                                .header(http::header::CONTENT_TYPE, "text/plain")
                                .body::<Vec<u8>>(body.into())
                                .expect("BAD_REQUEST response should be valid"),
                        )
                    }
                }
            });
        })
        .plugin(tauri_plugin_shell::init())
        .invoke_handler(tauri::generate_handler![greet])
        .run(tauri::generate_context!())
        .expect("error while running tauri application");
}
@@ -1,20 +1,24 @@
{
  "productName": "Logiciel Pharma",
  "$schema": "https://schema.tauri.app/config/2.0.0-rc",
  "productName": "Chrys4lide LGO",
  "version": "0.0.1",
  "identifier": "org.p4pillon.pharma.desktop",
  "identifier": "org.p4pillon.chrys4lide.lgo",
  "build": {
    "beforeDevCommand": {
      "cwd": "../app",
      "script": "cargo run"
      "cwd": "../../frontend",
      "script": "bun run dev"
    },
    "devUrl": "http://localhost:3000",
    "frontendDist": "axum://place.holder/"
    "devUrl": "http://localhost:1420",
    "beforeBuildCommand": {
      "cwd": "../../frontend",
      "script": "bun run generate"
    },
    "frontendDist": "../../frontend/dist"
  },
  "app": {
    "withGlobalTauri": true,
    "windows": [
      {
        "title": "Logiciel Pharma",
        "title": "Chrys4lide | LG0",
        "width": 800,
        "height": 600
      }
@@ -25,9 +29,6 @@
  },
  "bundle": {
    "active": true,
    "resources": {
      "../app/assets/": "./assets/"
    },
    "targets": "all",
    "icon": [
      "icons/32x32.png",
@@ -37,5 +38,4 @@
      "icons/icon.ico"
    ]
  }

}
}
crates/services-sesam-vitale-sys/Cargo.toml (Normal file): 12
@@ -0,0 +1,12 @@
[package]
name = "services-sesam-vitale-sys"
version = "0.1.0"
edition = "2021"
#links= "ssvlux64"

[dependencies]
bitvec = "1.0.1"
deku = "0.17.0"
libc = "0.2.155"
num_enum = { version = "0.7.3", features = ["complex-expressions"] }
thiserror = "1.0.63"
crates/services-sesam-vitale-sys/src/api.rs (Normal file): 104
@@ -0,0 +1,104 @@
use deku::{deku_derive, DekuContainerRead, DekuError, DekuReader};
use std::{ffi::CString, fmt, path::Path, ptr};
use thiserror::Error;

use crate::{
    bindings::{SSV_InitLIB2, SSV_LireConfig, SSV_TermLIB},
    types::{common::read_from_buffer, configuration::Configuration},
};
use num_enum::FromPrimitive;

#[derive(Error, Debug)]
pub struct SesamVitaleError {
    code: u16,
}

#[derive(Debug, Eq, PartialEq, FromPrimitive)]
#[repr(u16)]
enum SSVIntError {
    CPSNotInserted = 61441,

    #[num_enum(catch_all)]
    NotImplemented(u16),
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_sesam_vitale_error() {
        let int_error = SSVIntError::from(61441);
        assert_eq!(int_error, SSVIntError::CPSNotInserted);

        let int_error = SSVIntError::from(123);
        assert_eq!(int_error, SSVIntError::NotImplemented(123));
        println!("{:?}", int_error);
    }
}

#[derive(Error, Debug)]
enum SSVError {
    #[error("Erreur standard de la librairie SSV")]
    SSVStandard,
    // #[error("Erreur de parsing")]
    // Parsing(#[from] ParsingError),
    #[error("Erreur inattendue de la librairie SSV (TMP)")]
    SSVUnknownTmp,
}

impl fmt::Display for SesamVitaleError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "Got error code {} from SSV_LireConfig", self.code)
    }
}

pub fn init_library(sesam_ini_path: &Path) -> Result<(), SesamVitaleError> {
    // TODO: better error handling
    let path_str = sesam_ini_path.to_str().unwrap();
    let path_ptr = CString::new(path_str).expect("failed to create cstring");

    let exit_code: u16 = unsafe { SSV_InitLIB2(path_ptr.as_ptr()) };
    if exit_code != 0 {
        let error = SesamVitaleError { code: exit_code };
        return Err(error);
    };

    Ok(())
}
pub fn close_library() -> Result<(), SesamVitaleError> {
    let exit_code: u16 = unsafe { SSV_TermLIB() };
    if exit_code != 0 {
        let error = SesamVitaleError { code: exit_code };
        return Err(error);
    };

    Ok(())
}

pub fn read_config() -> Result<Configuration, SesamVitaleError> {
    let mut buffer_ptr: *mut libc::c_void = ptr::null_mut();
    let mut size: libc::size_t = 0;

    let buffer_ptr_ptr: *mut *mut libc::c_void = &mut buffer_ptr;
    let size_ptr: *mut libc::size_t = &mut size;

    // Need to add proper error handling -> return a result with error code pointing to an error
    // enum
    let exit_code: u16 = unsafe { SSV_LireConfig(buffer_ptr_ptr, size_ptr) };

    if exit_code != 0 {
        let error = SesamVitaleError { code: exit_code };
        return Err(error);
    };

    let buffer: &[u8] = unsafe { std::slice::from_raw_parts(buffer_ptr as *const u8, size) };

    // TODO: Improve error handling
    let configuration: Configuration = read_from_buffer(buffer).unwrap();

    // TODO: Call library function for memory delocating
    unsafe { libc::free(buffer_ptr) };

    Ok(configuration)
}
crates/services-sesam-vitale-sys/src/bindings.rs (Normal file): 288
@@ -0,0 +1,288 @@
|
||||
#![allow(non_upper_case_globals)]
|
||||
#![allow(non_camel_case_types)]
|
||||
#![allow(non_snake_case)]
|
||||
#![allow(dead_code)]
|
||||
|
||||
// Generated using bindgen
|
||||
|
||||
extern "C" {
|
||||
// Fonctions de gestion des données
|
||||
|
||||
pub fn SSV_LireCartePS(
|
||||
NomRessourcePS: *const ::std::os::raw::c_char,
|
||||
NomRessourceLecteur: *const ::std::os::raw::c_char,
|
||||
CodePorteurPS: *const ::std::os::raw::c_char,
|
||||
pZDataOut: *mut *mut ::std::os::raw::c_void,
|
||||
pTailleZone: *mut usize,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
|
||||
pub fn SSV_LireDroitsVitale(
|
||||
NomRessourcePS: *const ::std::os::raw::c_char,
|
||||
NomRessourceLecteur: *const ::std::os::raw::c_char,
|
||||
CodePorteurPS: *const ::std::os::raw::c_char,
|
||||
DateConsultation: *const ::std::os::raw::c_char,
|
||||
pZDataOut: *mut *mut ::std::os::raw::c_void,
|
||||
pTailleZone: *mut usize,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
|
||||
pub fn SSV_FormaterFactures(
|
||||
cFactureACreer: ::std::os::raw::c_char,
|
||||
cModeSecur: ::std::os::raw::c_char,
|
||||
cTypeFlux: ::std::os::raw::c_char,
|
||||
pZDataIn: *mut ::std::os::raw::c_void,
|
||||
TailleDataIn: usize,
|
||||
pZDataOut: *mut *mut ::std::os::raw::c_void,
|
||||
pTailleZone: *mut usize,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
|
||||
pub fn SSV_ChiffrerFacture(
|
||||
pZDataIn: *mut ::std::os::raw::c_void,
|
||||
TailleDataIn: usize,
|
||||
pZDataOut: *mut *mut ::std::os::raw::c_void,
|
||||
pTailleZone: *mut usize,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
|
||||
pub fn SSV_SignerFactureVitale(
|
||||
pcNomRessourceVitale: *const ::std::os::raw::c_char,
|
||||
pZDataIn: *mut ::std::os::raw::c_void,
|
||||
szTailleDataIn: usize,
|
||||
pZDataOut: *mut *mut ::std::os::raw::c_void,
|
||||
pszTailleZone: *mut usize,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
|
||||
pub fn SSV_CalculerHashFactureAssure(
|
||||
pcNumSerie: *const ::std::os::raw::c_char,
|
||||
pZDataIn: *mut ::std::os::raw::c_void,
|
||||
szTailleDataIn: usize,
|
||||
pZDataOut: *mut *mut ::std::os::raw::c_void,
|
||||
pszTailleZone: *mut usize,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
|
||||
pub fn SSV_AjouterSignatureAssureDansFacture(
|
||||
pZDataIn: *mut ::std::os::raw::c_void,
|
||||
szTailleDataIn: usize,
|
||||
pZDataOut: *mut *mut ::std::os::raw::c_void,
|
||||
pszTailleZone: *mut usize,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
|
||||
pub fn SSV_SignerFactureCPS(
|
||||
pcNomRessourcePS: *const ::std::os::raw::c_char,
|
||||
pcNomRessourceLecteur: *const ::std::os::raw::c_char,
|
||||
pcCodePorteurPS: *const ::std::os::raw::c_char,
|
||||
cNologSituation: ::std::os::raw::c_char,
|
||||
pZDataIn: *mut ::std::os::raw::c_void,
|
||||
szTailleDataIn: usize,
|
||||
pZDataOut: *mut *mut ::std::os::raw::c_void,
|
||||
pszTailleZone: *mut usize,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
|
||||
pub fn SSV_FormaterLot(
|
||||
NBZDataIn: ::std::os::raw::c_short,
|
||||
TZDataIn: *mut *mut ::std::os::raw::c_void,
|
||||
TTailleZoneIn: *mut usize,
|
||||
pNbZDataOut: *mut ::std::os::raw::c_short,
|
||||
TZDataOut: *mut *mut ::std::os::raw::c_void,
|
||||
TTailleZoneOut: *mut usize,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
|
||||
pub fn SSV_SignerLotCPS(
|
||||
pcNomRessourcePS: *const ::std::os::raw::c_char,
|
||||
pcNomRessourceLecteur: *const ::std::os::raw::c_char,
|
||||
pcCodePorteurPS: *const ::std::os::raw::c_char,
|
||||
cNologSituation: ::std::os::raw::c_char,
|
||||
pZDataIn: *mut ::std::os::raw::c_void,
|
||||
szTailleDataIn: usize,
|
||||
pZDataOut: *mut *mut ::std::os::raw::c_void,
|
||||
pszTailleZone: *mut usize,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
|
||||
pub fn SSV_FormaterFichier(
|
||||
pZDataIn: *mut ::std::os::raw::c_void,
|
||||
TailleDataIn: usize,
|
||||
pZDataOut: *mut *mut ::std::os::raw::c_void,
|
||||
pTailleZone: *mut usize,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
|
||||
pub fn SSV_TraduireARL(
|
||||
NbZDonneesEntree: ::std::os::raw::c_short,
|
||||
TZDataIn: *mut *mut ::std::os::raw::c_void,
|
||||
TTailleZoneIn: *mut usize,
|
||||
pZDataOut: *mut *mut ::std::os::raw::c_void,
|
||||
pTailleZoneOut: *mut usize,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
|
||||
pub fn SSV_LireNumSerieCarteVitale(
|
||||
pcNomRessource: *mut ::std::os::raw::c_char,
|
||||
numeroSerie: *mut ::std::os::raw::c_uchar,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
|
||||
pub fn SSV_CalculerHashFacturePS(
|
||||
pcNumSerieCPS: *const ::std::os::raw::c_char,
|
||||
pZDataIn: *mut ::std::os::raw::c_void,
|
||||
usTailleDataIn: usize,
|
||||
pZDataOut: *mut *mut ::std::os::raw::c_void,
|
||||
pusTailleZone: *mut usize,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
pub fn SSV_AjouterSignaturePSFacture(
|
||||
pZDataIn: *mut ::std::os::raw::c_void,
|
||||
szTailleDataIn: usize,
|
||||
pZDataOut: *mut *mut ::std::os::raw::c_void,
|
||||
pszTailleZone: *mut usize,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
pub fn SSV_DechargerFacturesPdT(
|
||||
NomRessourcePS: *const ::std::os::raw::c_char,
|
||||
NomRessourceLecteur: *const ::std::os::raw::c_char,
|
||||
CodePorteurPS: *const ::std::os::raw::c_char,
|
||||
pcNumFact: *const ::std::os::raw::c_char,
|
||||
sNbZDataIn: ::std::os::raw::c_short,
|
||||
pvTZDataIn: *mut *mut ::std::os::raw::c_void,
|
||||
psTTailleDataIn: *mut usize,
|
||||
pNbZDataOut: *mut ::std::os::raw::c_short,
|
||||
TZDataOut: *mut *mut ::std::os::raw::c_void,
|
||||
TTailleZoneOut: *mut usize,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
pub fn SSV_TraduireFSE(
|
||||
pZDataIn: *mut ::std::os::raw::c_void,
|
||||
TailleDataIn: usize,
|
||||
pZDataOut: *mut *mut ::std::os::raw::c_void,
|
||||
pTailleZone: *mut usize,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
|
||||
// Fonctions TLA
|
||||
// TLA (Terminal Lecteur Applicatif) -> lecteur autre que PC-SC, on ne prend pas en compte cela
|
||||
|
||||
pub fn SSV_IdentifierTLA(
|
||||
pcNomRessourceLecteur: *const ::std::os::raw::c_char,
|
||||
NumVersionCDC: *const ::std::os::raw::c_char,
|
||||
pZDataOut: *mut *mut ::std::os::raw::c_void,
|
||||
tailleDataOut: *mut usize,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
pub fn SSV_ChargerDonneesTLA(
|
||||
pcNomRessourceLecteur: *const ::std::os::raw::c_char,
|
||||
sNbZDataIn: ::std::os::raw::c_short,
|
||||
pvTZDataIn: *mut *mut ::std::os::raw::c_void,
|
||||
psTTailleDataIn: *mut usize,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
pub fn SSV_ChargerFacturesPdT(
|
||||
pcNomRessourceLecteur: *const ::std::os::raw::c_char,
|
||||
pcNumFacturation: *const ::std::os::raw::c_char,
|
||||
sNbZDataIn: ::std::os::raw::c_short,
|
||||
pvTZDataIn: *mut *mut ::std::os::raw::c_void,
|
||||
psTTailleDataIn: *mut usize,
|
||||
pNbZDataOut: *mut ::std::os::raw::c_short,
|
||||
TZDataOut: *mut *mut ::std::os::raw::c_void,
|
||||
TTailleZoneOut: *mut usize,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
pub fn SSV_DechargerFSETLA(
|
||||
NomRessourcePS: *const ::std::os::raw::c_char,
|
||||
NomRessourceLecteur: *const ::std::os::raw::c_char,
|
||||
CodePorteurPS: *const ::std::os::raw::c_char,
|
||||
pcNumFact: *const ::std::os::raw::c_char,
|
||||
pNbZDataOut: *mut ::std::os::raw::c_short,
|
||||
TZDataOut: *mut *mut ::std::os::raw::c_void,
|
||||
TTailleZoneOut: *mut usize,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
pub fn SSV_DechargerFSETLANC(
|
||||
NomRessourcePS: *const ::std::os::raw::c_char,
|
||||
NomRessourceLecteur: *const ::std::os::raw::c_char,
|
||||
CodePorteurPS: *const ::std::os::raw::c_char,
|
||||
pcNumFact: *const ::std::os::raw::c_char,
|
||||
pNbZDataOut: *mut ::std::os::raw::c_short,
|
||||
TZDataOut: *mut *mut ::std::os::raw::c_void,
|
||||
TTailleZoneOut: *mut usize,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
pub fn SSV_DechargerBeneficiaires(
|
||||
NomRessourcePS: *const ::std::os::raw::c_char,
|
||||
NomRessourceLecteur: *const ::std::os::raw::c_char,
|
||||
CodePorteurPS: *const ::std::os::raw::c_char,
|
||||
cNumFacturation: *const ::std::os::raw::c_char,
|
||||
sNbZDataOut: *mut ::std::os::raw::c_short,
|
||||
pTZDataOut: *mut *mut ::std::os::raw::c_void,
|
||||
sTTailleDataOut: *mut usize,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
pub fn SSV_EffacerTLA(
|
||||
NomRessourcePS: *const ::std::os::raw::c_char,
|
||||
NomRessourceLecteur: *const ::std::os::raw::c_char,
|
||||
CodePorteurPS: *const ::std::os::raw::c_char,
|
||||
cNumFacturation: *const ::std::os::raw::c_char,
|
||||
cTypeDonnee: *const ::std::os::raw::c_char,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
pub fn SSV_SecuriserFacture(
|
||||
pcNomRessourcePS: *const ::std::os::raw::c_char,
|
||||
pcNomRessourceLecteur: *const ::std::os::raw::c_char,
|
||||
pcCodePorteurPS: *const ::std::os::raw::c_char,
|
||||
cNologSituation: ::std::os::raw::c_char,
|
||||
pcNumFact: *const ::std::os::raw::c_char,
|
||||
pvDataIn: *mut ::std::os::raw::c_void,
|
||||
szTailleDataIn: usize,
|
||||
pvDataOut: *mut *mut ::std::os::raw::c_void,
|
||||
pszTailleDataOut: *mut usize,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
|
||||
// Fonctions de gestion de configuration (GALSS)
|
||||
|
||||
pub fn SSV_LireConfig(
|
||||
pZDataOut: *mut *mut ::std::os::raw::c_void,
|
||||
psTailleDataOut: *mut usize,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
|
||||
pub fn SSV_LireDateLecteur(
|
||||
pcNomRessourceLecteur: *const ::std::os::raw::c_char,
|
||||
pcDateHeure: *mut ::std::os::raw::c_char,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
|
||||
pub fn SSV_MajDateLecteur(
|
||||
pcNomRessourceLecteur: *const ::std::os::raw::c_char,
|
||||
pcDateHeure: *const ::std::os::raw::c_char,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
|
||||
pub fn SSV_ChargerAppli(
|
||||
pcNomRessourceLecteur: *const ::std::os::raw::c_char,
|
||||
sNbZDataIn: ::std::os::raw::c_short,
|
||||
pvTZDataIn: *mut *mut ::std::os::raw::c_void,
|
||||
psTTailleDataIn: *mut usize,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
|
||||
// Fonctions techniques
|
||||
|
||||
// La fonction Initialiser Librairie a pour objet de charger et d’initialiser dans la mémoire du système :
|
||||
// - dans le cas où le GALSS est installé sur le poste :
|
||||
// - la bibliothèque du Gestionnaire d’Accès au Lecteur Santé Social (GALSS),
|
||||
// - qui charge la bibliothèque du Protocole Santé Social (PSS),
|
||||
// - la configuration du poste de travail à l’aide du fichier galssinf,
|
||||
// - les variables globales communes aux différents Services SESAM-Vitale,
|
||||
// - les fichiers de tables et scripts des répertoires par défaut.
|
||||
// Cette fonction accède au référentiel électronique en utilisant le chemin complet indiqué dans le fichier sesam.ini.
|
||||
pub fn SSV_InitLIB2(pcFichierSesam: *const ::std::os::raw::c_char) -> ::std::os::raw::c_ushort;
|
||||
|
||||
// La fonction Terminer a pour objet de décharger de la mémoire du système les éléments
|
||||
// chargés par la fonction Initialiser Librairie, qui ne sont plus utiles.
|
||||
pub fn SSV_TermLIB() -> ::std::os::raw::c_ushort;
|
||||
|
||||
/// Fonctions de Tracage
|
||||
//La fonction Allouer Zone Mémoire a un rôle purement technique : elle permet d’allouer, autrement dit de réserver une zone ou partie de la mémoire du poste de travail pour y écrire les données à passer en entrée d’un Service SESAM-Vitale.
|
||||
// Cette fonction doit être utilisée pour allouer toutes les zones de mémoire requises en entrée des Services SESAM-Vitale de manière à permettre un diagnostic fiable par le « mode trace » en cas de dysfonctionnement. En effet, son mode d’exécution est susceptible de fournir des informations utiles au « mode trace » lorsqu’il est activé.
|
||||
pub fn SSV_AllouerZoneMem(
|
||||
pZDataIn: *mut *mut ::std::os::raw::c_void,
|
||||
taille: usize,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
|
||||
// La fonction Libérer Zone Mémoire a un rôle purement technique : elle permet de libérer une zone de mémoire du poste de travail précédemment allouée après exploitation des données qu’elle contient.
|
||||
// Cette fonction doit être utilisée pour libérer toutes les zones de mémoire :
|
||||
// - celles qui ont été allouées par le progiciel de santé pour fournir les données nécessaires à l’entrée des Services SESAM-Vitale, avant leur appel, celles qui ont été allouées par les Services SESAM-Vitale pour fournir en sortie les données utiles au progiciel de santé qui a fait appel à ces services,
|
||||
// - de façon à permettre un diagnostic fiable par le mode trace en cas de dysfonctionnement
|
||||
//En effet, son exécution est susceptible de fournir des informations utiles au « mode trace » lorsqu’il est activé.
|
||||
pub fn SSV_LibererZoneMem(pZone: *mut ::std::os::raw::c_void);
|
||||
|
||||
// La fonction Initialiser Trace a pour objet de permettre l’activation du « mode trace ».
|
||||
// Ce mode de fonctionnement est prévu pour permettre à l’assistance technique du GIE
|
||||
// SESAM-Vitale d’analyser les problèmes de mise en œuvre des Services SESAM-Vitale,
|
||||
// notamment lorsque une fonction retourne un code d’erreur de valeur hexadécimale supérieure à FF00.
|
||||
pub fn SSV_InitTrace(
|
||||
pathConf: *mut ::std::os::raw::c_char,
|
||||
ModeOuvertureFicherLog: *mut ::std::os::raw::c_char,
|
||||
ModuleLog: ::std::os::raw::c_ushort,
|
||||
NiveauLog: ::std::os::raw::c_uchar,
|
||||
) -> ::std::os::raw::c_ushort;
|
||||
|
||||
}
|
crates/services-sesam-vitale-sys/src/lib.rs (Normal file): 6
@@ -0,0 +1,6 @@
pub mod api;
mod bindings;
pub mod types;

#[cfg(test)]
mod tests {}
crates/services-sesam-vitale-sys/src/types/common.rs (Normal file): 144
@@ -0,0 +1,144 @@
use crate::types::configuration::{
    ConfigurationHeader, PCSCReader, ReaderConfiguration, SESAMVitaleComponent,
};

use std::{error::Error, str::FromStr};

use bitvec::index::BitIdx;
use deku::{
    bitvec::{BitStore, Msb0},
    ctx::ByteSize,
    deku_derive,
    reader::{Reader, ReaderRet},
    DekuContainerRead, DekuError, DekuReader,
};

#[deku_derive(DekuRead)]
#[derive(Debug, Clone, PartialEq)]
pub(crate) struct NumericString(#[deku(map = "convert_from_data_field")] String);

#[deku_derive(DekuRead)]
#[derive(Debug, Clone, PartialEq)]
pub(crate) struct AlphaNumericString(#[deku(map = "convert_from_data_field")] String);

#[deku_derive(DekuRead)]
#[derive(Debug, Clone, PartialEq)]
pub(crate) struct BinaryData(#[deku(map = "extract_from_data_field")] Vec<u8>);

#[deku_derive(DekuRead)]
#[derive(Debug, Clone, Copy, PartialEq)]
#[deku(endian = "big")]
pub(crate) struct GroupId(u16);

trait MapToDekuParseError<T> {
    fn map_to_deku_parse_error(self) -> Result<T, DekuError>;
}

impl<T, E: Error> MapToDekuParseError<T> for Result<T, E> {
    fn map_to_deku_parse_error(self) -> Result<T, DekuError> {
        self.map_err(|e| DekuError::Parse(e.to_string().into()))
    }
}

fn read_size<R: std::io::Read>(reader: &mut Reader<R>) -> Result<ByteSize, DekuError> {
    let first_byte: u8 = u8::from_reader_with_ctx(reader, ())?;

    let is_length_expanded = first_byte.get_bit::<Msb0>(BitIdx::new(0).map_to_deku_parse_error()?);

    match is_length_expanded {
        true => {
            let size_of_data_size: ByteSize = ByteSize((first_byte & 0b0111_1111) as usize);

            if size_of_data_size.0 > 4 {
                return Err(DekuError::Parse("Size of the length encoding is > 4, this is not normal. Probable parsing error".to_string().into()));
            };

            // maximum size of the buffer is 4, we use the offset to read values less than 4 bytes
            let buffer: &mut [u8; 4] = &mut [0; 4];
            let write_offset = 4 - size_of_data_size.0;

            match reader.read_bytes(size_of_data_size.0, &mut buffer[write_offset..])? {
                ReaderRet::Bits(_bit_vec) => Err(DekuError::Parse("Got bits when trying to read bytes -> reader is unaligned, this is not normal.".to_string().into())),
                ReaderRet::Bytes => Ok(ByteSize(u32::from_be_bytes(*buffer) as usize)),
            }
        }
        false => Ok(ByteSize(first_byte as usize)),
    }
}

// Using this as the map function asks deku to parse a datafield
// We then use the datafield and convert it to the corresponding value
pub(super) fn convert_from_data_field<T>(data_field: DataField) -> Result<T, DekuError>
where
    T: FromStr,
    T::Err: Error,
{
    let text = String::from_utf8(data_field.data).map_to_deku_parse_error()?;
    T::from_str(&text).map_to_deku_parse_error()
}

pub(crate) fn extract_from_data_field(data_field: DataField) -> Result<Vec<u8>, DekuError> {
    Ok(data_field.data)
}

#[deku_derive(DekuRead)]
#[derive(Debug, PartialEq)]
pub(crate) struct DataField {
    #[deku(reader = "read_size(deku::reader)")]
    pub(crate) data_size: ByteSize,

    #[deku(bytes_read = "data_size.0")]
    pub(crate) data: Vec<u8>,
}

#[deku_derive(DekuRead)]
#[derive(Debug, PartialEq)]
pub(crate) struct BlockHeader {
    pub(crate) group_id: GroupId,

    #[deku(reader = "read_size(deku::reader)")]
    pub(crate) data_size: ByteSize,
}

#[deku_derive(DekuRead)]
#[derive(Debug, PartialEq)]
pub(crate) struct DataBlock {
    pub(crate) header: BlockHeader,

    #[deku(ctx = "header.group_id")]
    pub(crate) inner: DataGroup,
}

#[deku_derive(DekuRead)]
#[derive(Debug, PartialEq)]
#[deku(ctx = "group_id: GroupId", id = "group_id.0")]
pub enum DataGroup {
    #[deku(id = 60)]
    ConfigurationHeader(ConfigurationHeader),
    #[deku(id = 61)]
    ReaderConfiguration(ReaderConfiguration),
    #[deku(id = 64)]
    SESAMVitaleComponent(SESAMVitaleComponent),
    #[deku(id = 67)]
    PCSCReader(PCSCReader),
}
pub(crate) fn read_from_buffer<T>(buffer: &[u8]) -> Result<T, T::Error>
where
    T: TryFrom<Vec<DataBlock>>,
{
    let mut data_blocks: Vec<DataBlock> = Vec::new();
    let mut offset = 0;

    let mut remaining_buffer = buffer;

    while !remaining_buffer.is_empty() {
        // TODO: properly handle errors
        let (rest, data_block) = DataBlock::from_bytes((remaining_buffer, offset)).unwrap();

        data_blocks.push(data_block);

        (remaining_buffer, offset) = rest;
    }

    T::try_from(data_blocks)
}
137  crates/services-sesam-vitale-sys/src/types/configuration.rs  Normal file
@@ -0,0 +1,137 @@
use crate::types::common::DataBlock;
use std::{error::Error, fmt, vec::Vec};

use crate::types::common::convert_from_data_field;
use deku::{deku_derive, DekuReader};

use super::common::{AlphaNumericString, DataGroup};

#[deku_derive(DekuRead)]
#[derive(Debug, PartialEq)]
pub struct SSVVersionNumber(#[deku(map = "convert_from_data_field")] u16);

#[deku_derive(DekuRead)]
#[derive(Debug, PartialEq)]
pub struct GALSSVersionNumber(#[deku(map = "convert_from_data_field")] u16);

#[deku_derive(DekuRead)]
#[derive(Debug, PartialEq)]
pub struct PSSVersionNumber(#[deku(map = "convert_from_data_field")] u16);

#[deku_derive(DekuRead)]
#[derive(Debug, PartialEq)]
pub struct ConfigurationHeader {
    pub ssv_version: SSVVersionNumber,
    pub galss_version: GALSSVersionNumber,
    pub pss_version: PSSVersionNumber,
}

#[deku_derive(DekuRead)]
#[derive(Debug, PartialEq)]
pub struct PCSCReaderName(AlphaNumericString);

#[deku_derive(DekuRead)]
#[derive(Debug, PartialEq)]
pub struct CardType(#[deku(map = "convert_from_data_field")] u8);

#[deku_derive(DekuRead)]
#[derive(Debug, PartialEq)]
pub struct PCSCReader {
    pub name: PCSCReaderName,
    pub card_type: CardType,
}

#[deku_derive(DekuRead)]
#[derive(Debug, PartialEq)]
pub struct SESAMVitaleComponentID(#[deku(map = "convert_from_data_field")] u16);

#[deku_derive(DekuRead)]
#[derive(Debug, PartialEq)]
pub struct SESAMVitaleComponentDescription(AlphaNumericString);

#[deku_derive(DekuRead)]
#[derive(Debug, PartialEq)]
pub struct SESAMVitaleComponentVersion(AlphaNumericString);

#[deku_derive(DekuRead)]
#[derive(Debug, PartialEq)]
pub struct SESAMVitaleComponent {
    pub id: SESAMVitaleComponentID,
    pub description: SESAMVitaleComponentDescription,
    pub version: SESAMVitaleComponentVersion,
}

#[deku_derive(DekuRead)]
#[derive(Debug, PartialEq)]
pub struct ReaderConfiguration {}

#[derive(Debug)]
pub enum ConfigurationError {
    MultipleConfigurationHeaders,
    MissingConfigurationHeader,
}

impl fmt::Display for ConfigurationError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            ConfigurationError::MultipleConfigurationHeaders => {
                write!(f, "Multiple ConfigurationHeader blocks found")
            }
            ConfigurationError::MissingConfigurationHeader => {
                write!(f, "Missing ConfigurationHeader block")
            }
        }
    }
}

impl Error for ConfigurationError {}

#[derive(Debug)]
pub struct Configuration {
    pub configuration_header: ConfigurationHeader,
    pub reader_configurations: Vec<ReaderConfiguration>,
    pub sesam_vitale_components: Vec<SESAMVitaleComponent>,
    pub pcsc_readers: Vec<PCSCReader>,
}

impl TryFrom<Vec<DataBlock>> for Configuration {
    type Error = ConfigurationError;

    fn try_from(data_blocks: Vec<DataBlock>) -> Result<Self, Self::Error> {
        let mut configuration_header: Option<ConfigurationHeader> = None;
        let mut reader_configurations: Vec<ReaderConfiguration> = Vec::new();
        let mut sesam_vitale_components: Vec<SESAMVitaleComponent> = Vec::new();
        let mut pcsc_readers: Vec<PCSCReader> = Vec::new();

        for block in data_blocks {
            match block.inner {
                DataGroup::ConfigurationHeader(header) => {
                    if configuration_header.is_some() {
                        return Err(ConfigurationError::MultipleConfigurationHeaders);
                    }
                    configuration_header = Some(header);
                }
                DataGroup::ReaderConfiguration(configuration) => {
                    reader_configurations.push(configuration)
                }
                DataGroup::SESAMVitaleComponent(component) => {
                    sesam_vitale_components.push(component);
                }
                DataGroup::PCSCReader(reader) => {
                    pcsc_readers.push(reader);
                }
            }
        }
        let configuration_header = match configuration_header {
            Some(header) => header,
            None => return Err(ConfigurationError::MissingConfigurationHeader),
        };

        Ok(Self {
            configuration_header,
            reader_configurations,
            sesam_vitale_components,
            pcsc_readers,
        })
    }
}
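A minimal usage sketch of how these two modules fit together (an assumed caller inside the crate; `parse_configuration` and `raw` are illustrative names, not part of the files above — the buffer would typically come from the SSV library):

```rust
use crate::types::common::read_from_buffer;
use crate::types::configuration::{Configuration, ConfigurationError};

// Hypothetical helper: split the raw buffer into DataBlocks, then let
// Configuration::try_from(Vec<DataBlock>) dispatch the blocks by group id.
fn parse_configuration(raw: &[u8]) -> Result<Configuration, ConfigurationError> {
    let configuration = read_from_buffer::<Configuration>(raw)?;

    // Exactly one ConfigurationHeader is expected; none or several surface as
    // the two ConfigurationError variants defined above.
    println!(
        "{} PC/SC reader(s), {} SESAM-Vitale component(s)",
        configuration.pcsc_readers.len(),
        configuration.sesam_vitale_components.len()
    );

    Ok(configuration)
}
```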
3  crates/services-sesam-vitale-sys/src/types/mod.rs  Normal file
@@ -0,0 +1,3 @@
pub mod common;
pub mod configuration;
// pub mod droits_vitale;
@@ -4,11 +4,10 @@ version = "0.1.0"
 edition = "2021"
 
 [dependencies]
-anyhow.workspace = true
+anyhow = "1.0"
 libc = "0.2"
-thiserror.workspace = true
-
+thiserror = "1.0"
 utils = { path = "../utils" }
 
 [build-dependencies]
-dotenv.workspace = true
+dotenv = "0.15"
@@ -9,7 +9,7 @@ use thiserror::Error;
 use crate::cps::lire_carte;
 use crate::libssv::{SSV_InitLIB2, SSV_LireConfig};
 
-use ::utils::config::{load_config, ConfigError};
+use ::utils::config::load_config;
 
 #[derive(Error, Debug)]
 pub enum SSVDemoError {
@@ -18,7 +18,7 @@ pub enum SSVDemoError {
     #[error(transparent)]
     SSVLibErrorCode(#[from] crate::libssv::LibSSVError),
     #[error(transparent)]
-    Configuration(#[from] ConfigError),
+    Anyhow(#[from] anyhow::Error),
 }
 
 fn ssv_init_lib_2() -> Result<(), SSVDemoError> {
@@ -71,7 +71,7 @@ pub fn demo() -> Result<(), SSVDemoError> {
 
     println!("------- Demo for the SSV library --------");
 
-    load_config(None)?;
+    load_config()?;
 
     ssv_init_lib_2()?;
 
@@ -4,7 +4,6 @@ version = "0.1.0"
 edition = "2021"
 
 [dependencies]
-anyhow.workspace = true
+anyhow = "1.0"
 directories = "5.0"
-dotenv.workspace = true
-thiserror.workspace = true
+dotenv = "0.15"
@@ -1,23 +1,11 @@
-use std::{env, path::PathBuf, sync::atomic::AtomicBool};
+use std::{env, path::PathBuf};
 
+use anyhow::{bail, Context, Result};
 use directories::ProjectDirs;
 use dotenv::from_path;
-use thiserror::Error;
 
 const CONFIG_FILE_NAME: &str = ".env";
 
-static CONFIG_INITIALIZED: AtomicBool = AtomicBool::new(false);
-
-#[derive(Debug, Error)]
-pub enum ConfigError {
-    #[error("No config file {0} found in the following directories: {1:#?}")]
-    ConfigFileNotFound(String, Vec<PathBuf>),
-    #[error("Failed to load config file: {0}")]
-    LoadConfigError(#[from] dotenv::Error),
-    #[error("Environment variable error: {0}")]
-    EnvVarError(#[from] std::env::VarError),
-}
-
 pub fn get_config_dirs() -> Vec<PathBuf> {
     let mut config_dirs = vec![
         PathBuf::from(""), // Current directory
@@ -31,7 +19,7 @@ pub fn get_config_dirs() -> Vec<PathBuf> {
     config_dirs
 }
 
-pub fn get_config_files() -> Result<Vec<PathBuf>, ConfigError> {
+pub fn get_config_files() -> Result<Vec<PathBuf>> {
     let config_dirs = get_config_dirs();
     let mut config_files = Vec::new();
     for config_dir in config_dirs.iter() {
@@ -41,20 +29,14 @@ pub fn get_config_files() -> Result<Vec<PathBuf>, ConfigError> {
         }
     }
     if config_files.is_empty() {
-        return Err(ConfigError::ConfigFileNotFound(
-            CONFIG_FILE_NAME.to_string(),
-            config_dirs,
-        ));
+        bail!(
+            "No config file {CONFIG_FILE_NAME} found in the following directories: {config_dirs:#?}"
+        );
     }
     Ok(config_files)
 }
 
-pub fn load_config(force: Option<bool>) -> Result<(), ConfigError> {
-    let force = force.unwrap_or(false);
-    if CONFIG_INITIALIZED.load(std::sync::atomic::Ordering::Relaxed) && force {
-        println!("DEBUG: Config already initialized, skipping");
-        return Ok(());
-    }
+pub fn load_config() -> Result<()> {
     let config_files = get_config_files()?;
     // Load the first config file found
     // TODO: add a verbose log to list all config files found
@@ -62,7 +44,5 @@ pub fn load_config(force: Option<bool>) -> Result<(), ConfigError> {
         "DEBUG: Config files found (1st loaded): {:#?}",
         config_files
     );
-    from_path(config_files[0].as_path()).map_err(ConfigError::LoadConfigError)?;
-    CONFIG_INITIALIZED.store(true, std::sync::atomic::Ordering::Relaxed);
-    Ok(())
+    from_path(config_files[0].as_path()).context("Failed to load config file")
 }
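With the `ConfigError` enum gone, callers lean on `anyhow` for error propagation. A minimal sketch of a hypothetical call site in a dependent crate (the `init` function is an assumed example, not part of this diff):

```rust
use anyhow::{Context, Result};
use utils::config::load_config;

fn init() -> Result<()> {
    // load_config() now returns anyhow::Result<()>, so extra context can be
    // layered on at the call site instead of matching dedicated error variants.
    load_config().context("unable to initialise the runtime configuration")?;
    Ok(())
}
```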
@@ -1,14 +0,0 @@
[package]
name = "entity"
version = "0.1.0"
edition = "2021"

[lib]
name = "entity"
path = "src/lib.rs"

[dependencies]
sea-orm.workspace = true

[dev-dependencies]
sea-orm-cli.workspace = true
@@ -1,17 +0,0 @@
//! `SeaORM` Entity, @generated by sea-orm-codegen 1.0.1

use sea_orm::entity::prelude::*;

#[derive(Clone, Debug, PartialEq, DeriveEntityModel, Eq)]
#[sea_orm(table_name = "debug")]
pub struct Model {
    #[sea_orm(primary_key)]
    pub id: i32,
    pub title: String,
    pub text: String,
}

#[derive(Copy, Clone, Debug, EnumIter, DeriveRelation)]
pub enum Relation {}

impl ActiveModelBehavior for ActiveModel {}
@@ -1,5 +0,0 @@
//! `SeaORM` Entity, @generated by sea-orm-codegen 1.0.1

pub mod prelude;

pub mod debug;
@@ -1,3 +0,0 @@
//! `SeaORM` Entity, @generated by sea-orm-codegen 1.0.1

pub use super::debug::Entity as Debug;
@@ -1,2 +0,0 @@
mod entities;
pub use entities::*;
24  frontend/.gitignore  vendored  Normal file
@@ -0,0 +1,24 @@
# Nuxt dev/build outputs
.output
.data
.nuxt
.nitro
.cache
dist

# Node dependencies
node_modules

# Logs
logs
*.log

# Misc
.DS_Store
.fleet
.idea

# Local env files
.env
.env.*
!.env.example
77  frontend/README.md  Normal file
@@ -0,0 +1,77 @@
# Nuxt 3 Minimal Starter

TODO: Write a proper README for `frontend` (Nuxt 3)

Look at the [Nuxt 3 documentation](https://nuxt.com/docs/getting-started/introduction) to learn more.

## Setup

Make sure to install the dependencies:

```bash
# npm
npm install

# pnpm
pnpm install

# yarn
yarn install

# bun
bun install
```

## Development Server

Start the development server on `http://localhost:3000`:

```bash
# npm
npm run dev

# pnpm
pnpm run dev

# yarn
yarn dev

# bun
bun run dev
```

## Production

Build the application for production:

```bash
# npm
npm run build

# pnpm
pnpm run build

# yarn
yarn build

# bun
bun run build
```

Locally preview production build:

```bash
# npm
npm run preview

# pnpm
pnpm run preview

# yarn
yarn preview

# bun
bun run preview
```

Check out the [deployment documentation](https://nuxt.com/docs/getting-started/deployment) for more information.
7  frontend/app.vue  Normal file
@@ -0,0 +1,7 @@
<template>
  <div>
    <NuxtRouteAnnouncer />
    <NavBar />
    <NuxtPage />
  </div>
</template>
43  frontend/app/spa-loading-template.html  Normal file
@@ -0,0 +1,43 @@
<!--
  This component is used to show a loading spinner when the SPA is loading.
  Source: https://github.com/barelyhuman/snips/blob/dev/pages/css-loader.md
-->
<div class="loader"></div>
<style>
  .loader {
    display: block;
    position: fixed;
    z-index: 1031;
    top: 50%;
    left: 50%;
    transform: translate(-50%, -50%);
    width: 18px;
    height: 18px;
    box-sizing: border-box;
    border: solid 2px transparent;
    border-top-color: #000;
    border-left-color: #000;
    border-bottom-color: #efefef;
    border-right-color: #efefef;
    border-radius: 50%;
    -webkit-animation: loader 400ms linear infinite;
    animation: loader 400ms linear infinite;
  }

  @-webkit-keyframes loader {
    0% {
      -webkit-transform: translate(-50%, -50%) rotate(0deg);
    }
    100% {
      -webkit-transform: translate(-50%, -50%) rotate(360deg);
    }
  }
  @keyframes loader {
    0% {
      transform: translate(-50%, -50%) rotate(0deg);
    }
    100% {
      transform: translate(-50%, -50%) rotate(360deg);
    }
  }
</style>
BIN  frontend/bun.lockb  Executable file
Binary file not shown.
22  frontend/components/Avatar.vue  Normal file
@@ -0,0 +1,22 @@
<template>
  <div class="avatar">
    <div class="rounded-full">
      <img :src="getAvatarUrl(user)" />
    </div>
  </div>
</template>

<script setup lang="ts">
import type { User } from '~/types/user';

const props = defineProps<{
  user: User,
}>();

const getAvatarUrl = (user: User) => {
  if (user.avatar) {
    return user.avatar;
  }
  return 'https://avatar.iran.liara.run/username?username=' + user.name;
};
</script>
71  frontend/components/LoginModal.vue  Normal file
@@ -0,0 +1,71 @@
<template>
  <dialog
    id="login_modal"
    ref="login_modal"
    @cancel.prevent=""
    @keyup="handleKeyPress"
    class="modal"
  >
    <div class="modal-box">
      <h3 class="text-3xl text-center mb-6">Connexion</h3>
      <div class="flex flex-wrap gap-5 justify-center">
        <template v-for="(user, index) in users" :key="user.id">
          <LoginModalAvatar
            :user="user"
            :rank="index+1"
            @selectUser="login"
          />
        </template>
      </div>
    </div>
    <form method="dialog" class="modal-backdrop">
      <button class="cursor-default">close</button>
    </form>
  </dialog>
</template>

<script setup lang="ts">
import type { User } from '~/types/user';

const users: User[] = [
  { id: 1, name: 'John Doe', avatar: 'https://img.daisyui.com/images/stock/photo-1534528741775-53994a69daeb.webp' },
  { id: 2, name: 'Jane Doe', avatar: 'https://avatar.iran.liara.run/public' },
  { id: 3, name: 'Michel Moulin', avatar: '' },
  { id: 4, name: 'Jean Paris', avatar: '' },
  { id: 5, name: 'Marie Dupont', avatar: '' },
  { id: 6, name: 'Émilie Fournier', avatar: '' },
  { id: 7, name: 'Pierre Lefevre', avatar: '' },
  { id: 8, name: 'Sophie Lemoine', avatar: '' },
  { id: 9, name: 'Lucie Simon', avatar: '' },
  { id: 10, name: 'Kevin Boucher', avatar: '' },
];

const loginModal = useTemplateRef('login_modal');

const current_user = useCurrentUser();

const login = (user: User) => {
  console.log('login', user);
  current_user.value = user;
  loginModal.value?.close();
};

const handleKeyPress = (event: KeyboardEvent) => {
  // Extract the rank from the event.code : Digit7 -> 7
  const rank = event.code.match(/\d/);
  if (!rank) {
    console.debug('Not handled key event', { event });
    return;
  }
  const user = getUserByRank(parseInt(rank[0]));
  if (user) {
    login(user);
  } else {
    console.debug('Not handled key event', { event });
  }
};

const getUserByRank = (rank: number): User => {
  return users[rank - 1];
};
</script>
17  frontend/components/LoginModalAvatar.vue  Normal file
@@ -0,0 +1,17 @@
<template>
  <button class="relative" @click="$emit('selectUser', user)">
    <Avatar class="w-24" :user="user" />
    <div class="absolute w-fit mx-auto bottom-0 inset-x-0">
      <kbd class="kbd kbd-sm">{{ rank }}</kbd>
    </div>
  </button>
</template>

<script setup lang="ts">
import type { User } from '~/types/user';

const props = defineProps<{
  user: User,
  rank: Number,
}>();
</script>
35  frontend/components/NavBar.vue  Normal file
@@ -0,0 +1,35 @@
<template>
  <div class="navbar">
    <div class="navbar-start">
      <a class="btn btn-ghost text-xl" href="/">Chrys4lide</a>
    </div>
    <nav class="navbar-center">
      <NuxtLink to="/" class="btn btn-ghost">Accueil</NuxtLink>
      <NuxtLink to="/CPS" class="btn btn-ghost">Carte CPS</NuxtLink>
    </nav>
    <div class="navbar-end">
      <template v-if="!current_user">
        <button class="btn btn-ghost" type="button" onclick="login_modal.showModal()">
          Connexion
        </button>
      </template>
      <template v-else>
        <details class="dropdown dropdown-end">
          <summary class="block"><Avatar :user="current_user" class="w-12" role="button" /></summary>
          <ul class="menu dropdown-content bg-base-100 rounded-box z-[1] w-52 p-2 shadow">
            <li><a @click="logout">Déconnexion</a></li>
          </ul>
        </details>
      </template>
    </div>
  </div>
  <LoginModal />
</template>

<script setup lang="ts">
const current_user = useCurrentUser();

const logout = () => {
  current_user.value = null;
};
</script>
3  frontend/composables/currentUser.ts  Normal file
@@ -0,0 +1,3 @@
import type { User } from '@/types/user';

export const useCurrentUser = () => useState<User | null>('currentUser', () => null);
39  frontend/nuxt.config.ts  Normal file
@@ -0,0 +1,39 @@
// https://nuxt.com/docs/api/configuration/nuxt-config
export default defineNuxtConfig({
  compatibilityDate: '2024-04-03',
  // Enables the development server to be discoverable by other devices for mobile development
  devServer: { host: '0.0.0.0', port: 1420 },
  devtools: { enabled: true },
  modules: [
    '@nuxtjs/tailwindcss',
    '@nuxtjs/color-mode',
  ],
  // Disable SSR for Tauri
  ssr: false,
  vite: {
    // Better support for Tauri CLI output
    clearScreen: false,
    // Enable environment variables
    // Additional environment variables can be found at
    // https://v2.tauri.app/reference/environment-variables/
    envPrefix: ['VITE_', 'TAURI_'],
    server: {
      // Tauri requires a consistent port
      strictPort: true,
      hmr: {
        // Use websocket for mobile hot reloading
        protocol: 'ws',
        // Make sure it's available on the network
        host: '0.0.0.0',
        // Use a specific port for hmr
        port: 5183,
      },
    },
  },
  colorMode: {
    // Add `data-theme` attribute to the `html` tag, allowing DaisyUI to handle dark mode automatically
    dataValue: 'theme',
    // Remove the default `-mode` suffix from the class name, so `dark` and `light` are used as class names, for DaisyUI compatibility
    classSuffix: '',
  },
})
22  frontend/package.json  Normal file
@@ -0,0 +1,22 @@
{
  "name": "nuxt-app",
  "private": true,
  "type": "module",
  "scripts": {
    "build": "nuxi build",
    "dev": "nuxi dev",
    "generate": "nuxi generate",
    "preview": "nuxi preview",
    "postinstall": "nuxi prepare"
  },
  "dependencies": {
    "@nuxtjs/color-mode": "^3.5.1",
    "daisyui": "^4.12.10",
    "nuxt": "^3.13.0",
    "vue": "latest",
    "vue-router": "latest"
  },
  "devDependencies": {
    "@nuxtjs/tailwindcss": "^6.12.1"
  }
}
8  frontend/pages/CPS.vue  Normal file
@@ -0,0 +1,8 @@
<template>
  <div>
    <h1 class="text-xl">Carte CPS</h1>
  </div>
</template>

<script setup>
</script>
17  frontend/pages/index.vue  Normal file
@@ -0,0 +1,17 @@
<template>
  <div>
    <h1 class="text-xl">Welcome to your {{ appName }}!</h1>
    <p v-if="current_user">Logged in as {{ current_user.name }}</p>
  </div>
</template>

<script setup lang="ts">
const current_user = useCurrentUser();
const appName = 'Nuxt App';
</script>

<style scoped>
h1 {
  color: #42b983;
}
</style>
BIN  frontend/public/favicon.ico  Normal file
Binary file not shown. (After: 4.2 KiB)
1  frontend/public/robots.txt  Normal file
@@ -0,0 +1 @@
3  frontend/server/tsconfig.json  Normal file
@@ -0,0 +1,3 @@
{
  "extends": "../.nuxt/tsconfig.server.json"
}
7  frontend/tailwind.config.ts  Normal file
@@ -0,0 +1,7 @@
import type { Config } from 'tailwindcss'

export default <Partial<Config>>{
  plugins: [
    require('daisyui'),
  ],
}
4  frontend/tsconfig.json  Normal file
@@ -0,0 +1,4 @@
{
  // https://nuxt.com/docs/guide/concepts/typescript
  "extends": "./.nuxt/tsconfig.json"
}
6  frontend/types/user.ts  Normal file
@@ -0,0 +1,6 @@
export declare interface User {
  id: number;
  name: string;
  email?: string;
  avatar?: string;
}
@@ -1,24 +0,0 @@
[package]
name = "migration"
version = "0.1.0"
edition = "2021"
publish = false

[lib]
name = "migration"
path = "src/lib.rs"

[dependencies]
async-std = { version = "1", features = ["attributes", "tokio1"] }

[dev-dependencies]
sea-orm-cli.workspace = true

[dependencies.sea-orm-migration]
version = "1.0.0"
features = [
  # `ASYNC_RUNTIME` and `DATABASE_DRIVER` are required to run migration using the cli
  # They must be the same as the features in the `sea-orm` dependency in the `app` crate
  "sqlx-sqlite",          # `DATABASE_DRIVER` feature
  "runtime-tokio-rustls", # `ASYNC_RUNTIME` feature
]
@@ -1,41 +0,0 @@
# Running Migrator CLI

- Generate a new migration file
  ```sh
  cargo run -- generate MIGRATION_NAME
  ```
- Apply all pending migrations
  ```sh
  cargo run
  ```
  ```sh
  cargo run -- up
  ```
- Apply first 10 pending migrations
  ```sh
  cargo run -- up -n 10
  ```
- Rollback last applied migrations
  ```sh
  cargo run -- down
  ```
- Rollback last 10 applied migrations
  ```sh
  cargo run -- down -n 10
  ```
- Drop all tables from the database, then reapply all migrations
  ```sh
  cargo run -- fresh
  ```
- Rollback all applied migrations, then reapply all migrations
  ```sh
  cargo run -- refresh
  ```
- Rollback all applied migrations
  ```sh
  cargo run -- reset
  ```
- Check the status of all migrations
  ```sh
  cargo run -- status
  ```
@@ -1,12 +0,0 @@
pub use sea_orm_migration::prelude::*;

mod m20220101_000001_create_debug_table;

pub struct Migrator;

#[async_trait::async_trait]
impl MigratorTrait for Migrator {
    fn migrations() -> Vec<Box<dyn MigrationTrait>> {
        vec![Box::new(m20220101_000001_create_debug_table::Migration)]
    }
}
@@ -1,35 +0,0 @@
use sea_orm_migration::{prelude::*, schema::*};

#[derive(DeriveMigrationName)]
pub struct Migration;

#[async_trait::async_trait]
impl MigrationTrait for Migration {
    async fn up(&self, manager: &SchemaManager) -> Result<(), DbErr> {
        manager
            .create_table(
                Table::create()
                    .table(Debug::Table)
                    .if_not_exists()
                    .col(pk_auto(Debug::Id))
                    .col(string(Debug::Title))
                    .col(string(Debug::Text))
                    .to_owned(),
            )
            .await
    }

    async fn down(&self, manager: &SchemaManager) -> Result<(), DbErr> {
        manager
            .drop_table(Table::drop().table(Debug::Table).to_owned())
            .await
    }
}

#[derive(DeriveIden)]
enum Debug {
    Table,
    Id,
    Title,
    Text,
}
@@ -1,6 +0,0 @@
use sea_orm_migration::prelude::*;

#[async_std::main]
async fn main() {
    cli::run_cli(migration::Migrator).await;
}