Initial commit
8 .gitignore vendored Normal file
@@ -0,0 +1,8 @@
/target
/dist
/src/dist
/src/target
/src-tauri/target
/node_modules
/src-tauri/pdfjs/node_modules
/src-tauri/pdfjs/package-lock.json
3 .gitmodules vendored Normal file
@@ -0,0 +1,3 @@
[submodule "vendor/pdfjs-dist"]
	path = vendor/pdfjs-dist
	url = https://github.com/mozilla/pdfjs-dist
84 CLAUDE.md Normal file
@@ -0,0 +1,84 @@
# Brittle — Literature Management System

A desktop application for managing academic references, PDFs, and annotations. Written in Rust.

## Project Vision

Brittle is a personal literature management tool in the spirit of Zotero. Core capabilities:

- Organize references in nested libraries (libraries contain sub-libraries)
- Display and annotate PDFs (create, edit, view annotations)
- Export references as BibTeX
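The nested-library model above can be pictured as a parent-pointer tree. This is an illustrative sketch only; the type and field names are hypothetical, not the actual `brittle-core` API:

```rust
// Hypothetical sketch of a parent-pointer library tree. A library is
// top-level when `parent` is None; sub-libraries point at their parent.
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
struct LibraryId(u64);

#[derive(Debug)]
struct Library {
    id: LibraryId,
    name: String,
    parent: Option<LibraryId>, // None for a top-level library
}

/// Collect the direct children of `parent` by scanning parent pointers.
fn children(libs: &[Library], parent: LibraryId) -> Vec<&Library> {
    libs.iter().filter(|l| l.parent == Some(parent)).collect()
}

fn main() {
    let libs = vec![
        Library { id: LibraryId(1), name: "Computer Science".into(), parent: None },
        Library { id: LibraryId(2), name: "Machine Learning".into(), parent: Some(LibraryId(1)) },
    ];
    let kids = children(&libs, LibraryId(1));
    assert_eq!(kids.len(), 1);
    println!("{}", kids[0].name); // prints "Machine Learning"
}
```

A parent-pointer layout keeps each library a plain record, which serializes trivially; the tree shape is derived, not stored.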

This is a hobby project built primarily through AI-assisted development. The human operator makes architectural decisions; the agent writes the code.

## Architecture

### Mandatory: Library-First Design

The backend is a **Rust crate** (`brittle-core`) that exposes a public API. This is the primary interface. A frontend like [iced](https://github.com/iced-rs/iced) links directly against this crate.

The API must be designed so that wrapping it in REST, gRPC, IPC, or any other transport layer requires **zero changes to `brittle-core`**. This means:

- No UI concepts leak into the core (no widget types, no rendering logic, no event loops)
- Return types must be transport-agnostic: no framework-specific channels, callbacks, or handle types; plain data that any layer can serialize or forward
- Errors are structured and meaningful, not stringly-typed
- The public API surface is the **single source of truth** for what Brittle can do

If a frontend needs something that isn't in the core API, the correct response is to extend the core API — not to hack around it.
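A minimal sketch of what "structured, not stringly-typed" means here, using std only (the commit's actual `brittle-core/src/error.rs` derives the same impls with `thiserror`):

```rust
// Sketch of a structured error in the style of brittle-core/src/error.rs.
// Callers can match on the variant and its fields instead of parsing a
// message string; Display is only for presentation.
use std::fmt;

#[derive(Debug)]
enum BibtexError {
    MissingRequiredField { cite_key: String, field: String },
}

impl fmt::Display for BibtexError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            BibtexError::MissingRequiredField { cite_key, field } => {
                write!(f, "reference '{cite_key}': missing required field '{field}'")
            }
        }
    }
}

impl std::error::Error for BibtexError {}

fn main() {
    let e = BibtexError::MissingRequiredField {
        cite_key: "turing1950".into(),
        field: "journal".into(),
    };
    println!("{e}"); // prints "reference 'turing1950': missing required field 'journal'"
}
```

Any transport layer can map such variants to its own error codes without string matching.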

### Dependency Philosophy

Use external crates where they provide real value (PDF rendering, BibTeX parsing, SQLite bindings, etc.). Do not pull in a dependency for something the standard library handles or for trivial utility code. When choosing between crates, prefer those that are well-maintained, have minimal transitive dependencies, and are widely used in the Rust ecosystem.

### Database Versioning

The database must support some form of version control or change tracking. The specific mechanism (event sourcing, snapshot-based, migration-based, audit log, etc.) is an **open design question** to be resolved during planning before implementation begins.

## Development Workflow

Every non-trivial feature follows this process end-to-end. Do not skip steps.

### 1. Requirements (what, not how)

Start from user-facing behavior. What does this feature need to do? Define concrete acceptance criteria. Do not think about implementation yet.

### 2. Architectural Fit

Before designing anything, assess how this feature relates to what already exists:

- **Fit**: Does it integrate cleanly with the current structure?
- **Risk**: If restructuring is needed, what existing functionality could break?
- **Benefit**: What do we gain from restructuring vs. working within the current design?
- **Verdict**: Restructure, adapt, or flag as tech debt?

Write this assessment as part of the plan. Do not silently restructure.

### 3. Design (top-down)

Now decide *how*: types, traits, modules, public API surface. Work downward from the API the feature needs, not upward from utility code you think might be useful.

### 4. Tests

Write failing tests that encode the acceptance criteria from step 1. These tests define "done."
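A sketch of what an acceptance-criteria test can look like before the feature exists. `cite_key_is_valid` and its rule are hypothetical examples, not Brittle API; in the test-first flow the function body would be written only after these tests fail:

```rust
// Hypothetical acceptance criterion: a cite key is non-empty ASCII
// alphanumeric. The tests encode the criterion; the implementation
// exists here only so the sketch compiles.
fn cite_key_is_valid(key: &str) -> bool {
    !key.is_empty() && key.chars().all(|c| c.is_ascii_alphanumeric())
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn rejects_empty_cite_key() {
        assert!(!cite_key_is_valid(""));
    }

    #[test]
    fn rejects_keys_with_spaces() {
        assert!(!cite_key_is_valid("turing 1950"));
    }

    #[test]
    fn accepts_plain_ascii_key() {
        assert!(cite_key_is_valid("turing1950"));
    }
}

fn main() {
    // Same checks runnable outside `cargo test`.
    assert!(!cite_key_is_valid(""));
    assert!(cite_key_is_valid("turing1950"));
    println!("ok");
}
```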

### 5. Implementation

Make the tests pass. Then refactor with confidence.

Use **Plan mode** at the start of any multi-file task.

## Agent Behavior

Do not blindly implement what is asked. Consider whether the request is actually what the user needs, whether there's a better approach, and whether there are edge cases or implications the user may not have considered.

If you see a better path, **argue for it**. Explain your reasoning clearly. But if the user disagrees after hearing your case, accept their decision and implement it well.

## Code Style

- Prioritize readability and testability over cleverness
- Use meaningful names; avoid abbreviations unless they're domain-standard (e.g., `bib`, `doi`, `pdf`)
- Keep functions focused — if a function needs a paragraph-long comment to explain it, it should probably be split
- Use Rust idioms: `Result` for fallible operations, strong types over stringly-typed data, `enum` for state machines
- Run `cargo clippy` and `cargo fmt` before considering any task complete
- Document public API items with doc comments
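The "strong types over stringly-typed data" idiom pairs naturally with `Result`: validate once at the boundary and pass a typed value afterwards. A minimal sketch with hypothetical names (`Doi` here is not a `brittle-core` type):

```rust
// Hypothetical newtype for DOIs instead of passing bare strings around.
// Constructing a Doi goes through `parse`, so any function taking a Doi
// can assume the "10." prefix invariant already holds.
#[derive(Debug, Clone, PartialEq, Eq)]
struct Doi(String);

#[derive(Debug)]
enum DoiParseError {
    MissingPrefix,
}

impl Doi {
    fn parse(s: &str) -> Result<Doi, DoiParseError> {
        // Real DOI syntax is richer; the prefix check stands in for it.
        if s.starts_with("10.") {
            Ok(Doi(s.to_owned()))
        } else {
            Err(DoiParseError::MissingPrefix)
        }
    }
}

fn main() {
    assert!(Doi::parse("10.1093/mind/LIX.236.433").is_ok());
    assert!(Doi::parse("not-a-doi").is_err());
    println!("ok");
}
```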
5224 Cargo.lock generated Normal file
File diff suppressed because it is too large
3 Cargo.toml Normal file
@@ -0,0 +1,3 @@
[workspace]
members = ["brittle-core", "brittle-keymap", "src-tauri"]
resolver = "2"
7 Trunk.toml Normal file
@@ -0,0 +1,7 @@
[build]
target = "src/index.html"
dist = "dist"

[serve]
port = 1420
open = false
22 brittle-core/Cargo.toml Normal file
@@ -0,0 +1,22 @@
[package]
name = "brittle-core"
version = "0.1.0"
edition = "2024"

[dependencies]
chrono = { version = "0.4", features = ["serde"] }
git2 = { version = "0.20", features = ["vendored-libgit2"] }
serde = { version = "1", features = ["derive"] }
sha2 = "0.10"
thiserror = "2"
toml = "0.8"
ureq = "2"
uuid = { version = "1", features = ["v7", "serde"] }

[[bin]]
name = "brittle-seed"
path = "src/bin/seed.rs"

[dev-dependencies]
serde_json = "1"
tempfile = "3"
179 brittle-core/src/bibtex/export.rs Normal file
@@ -0,0 +1,179 @@
use crate::bibtex::validation::validate_for_export;
use crate::error::BibtexError;
use crate::model::{Person, Reference};

/// Escape a BibTeX field value by wrapping it in braces.
fn escape_field(value: &str) -> String {
    // Wrap the whole value in braces — simple and safe for most content.
    // This prevents BibTeX from case-folding titles and handles special chars.
    format!("{{{value}}}")
}

/// Format a list of persons as a BibTeX "and"-separated author string.
fn format_persons(persons: &[Person]) -> String {
    persons
        .iter()
        .map(|p| p.to_bibtex())
        .collect::<Vec<_>>()
        .join(" and ")
}

/// Export a single reference as a BibTeX entry string.
///
/// Returns an error if required fields are missing.
pub fn export_reference(reference: &Reference) -> Result<String, BibtexError> {
    validate_for_export(reference)?;

    let mut out = String::new();

    out.push('@');
    out.push_str(reference.entry_type.bibtex_name());
    out.push('{');
    out.push_str(&reference.cite_key);
    out.push_str(",\n");

    // Authors and editors come first for readability.
    if !reference.authors.is_empty() {
        let authors = format_persons(&reference.authors);
        out.push_str(&format!("  author = {},\n", escape_field(&authors)));
    }
    if !reference.editors.is_empty() {
        let editors = format_persons(&reference.editors);
        out.push_str(&format!("  editor = {},\n", escape_field(&editors)));
    }

    // All other fields in sorted order (BTreeMap guarantees this).
    for (key, value) in &reference.fields {
        out.push_str(&format!("  {key} = {},\n", escape_field(value)));
    }

    out.push('}');

    Ok(out)
}

/// Export multiple references as a `.bib` file string.
///
/// Skips references with missing required fields and collects all errors.
/// Returns the BibTeX string and a list of any export errors.
pub fn export_references(references: &[Reference]) -> (String, Vec<BibtexError>) {
    let mut out = String::new();
    let mut errors = Vec::new();

    for reference in references {
        match export_reference(reference) {
            Ok(entry) => {
                if !out.is_empty() {
                    out.push('\n');
                }
                out.push_str(&entry);
                out.push('\n');
            }
            Err(e) => errors.push(e),
        }
    }

    (out, errors)
}

#[cfg(test)]
mod tests {
    use super::*;
    use crate::model::{EntryType, Person, Reference};

    fn make_article() -> Reference {
        let mut r = Reference::new("turing1950", EntryType::Article);
        r.authors.push(Person {
            family: "Turing".into(),
            given: Some("Alan M.".into()),
            prefix: None,
            suffix: None,
        });
        r.fields.insert(
            "title".into(),
            "Computing Machinery and Intelligence".into(),
        );
        r.fields.insert("journal".into(), "Mind".into());
        r.fields.insert("year".into(), "1950".into());
        r.fields.insert("volume".into(), "59".into());
        r
    }

    #[test]
    fn article_export() {
        let r = make_article();
        let bibtex = export_reference(&r).unwrap();
        assert!(bibtex.starts_with("@article{turing1950,"));
        assert!(bibtex.contains("author = {Turing, Alan M.}"));
        assert!(bibtex.contains("title = {Computing Machinery and Intelligence}"));
        assert!(bibtex.contains("journal = {Mind}"));
        assert!(bibtex.contains("year = {1950}"));
    }

    #[test]
    fn multi_author_formatting() {
        let mut r = Reference::new("ab2024", EntryType::Article);
        r.authors.push(Person {
            family: "Doe".into(),
            given: Some("Jane".into()),
            prefix: None,
            suffix: None,
        });
        r.authors.push(Person {
            family: "Smith".into(),
            given: Some("John".into()),
            prefix: None,
            suffix: None,
        });
        r.fields.insert("title".into(), "A Paper".into());
        r.fields.insert("journal".into(), "Nature".into());
        r.fields.insert("year".into(), "2024".into());

        let bibtex = export_reference(&r).unwrap();
        assert!(bibtex.contains("Doe, Jane and Smith, John"));
    }

    #[test]
    fn missing_required_field_returns_error() {
        let mut r = make_article();
        r.fields.remove("journal");
        assert!(export_reference(&r).is_err());
    }

    #[test]
    fn book_with_editor() {
        let mut r = Reference::new("knuth1986", EntryType::Book);
        r.editors.push(Person::new("Knuth"));
        r.fields.insert("title".into(), "The TeXbook".into());
        r.fields.insert("publisher".into(), "Addison-Wesley".into());
        r.fields.insert("year".into(), "1986".into());

        let bibtex = export_reference(&r).unwrap();
        assert!(bibtex.starts_with("@book{"));
        assert!(bibtex.contains("editor = {Knuth}"));
    }

    #[test]
    fn fields_appear_in_sorted_order() {
        let r = make_article();
        let bibtex = export_reference(&r).unwrap();
        let journal_pos = bibtex.find("journal").unwrap();
        let title_pos = bibtex.find("title").unwrap();
        let year_pos = bibtex.find("year").unwrap();
        // BTreeMap order: journal < title < volume < year (alphabetical)
        assert!(journal_pos < title_pos);
        assert!(title_pos < year_pos);
    }

    #[test]
    fn export_references_collects_errors() {
        let good = make_article();
        let bad = Reference::new("incomplete", EntryType::Article);
        // Missing author, title, journal, year

        let (bibtex, errors) = export_references(&[good, bad]);
        assert_eq!(errors.len(), 1);
        assert!(bibtex.contains("@article{turing1950,"));
    }
}
5 brittle-core/src/bibtex/mod.rs Normal file
@@ -0,0 +1,5 @@
pub mod export;
pub mod validation;

pub use export::{export_reference, export_references};
pub use validation::validate_for_export;
104 brittle-core/src/bibtex/validation.rs Normal file
@@ -0,0 +1,104 @@
use crate::error::BibtexError;
use crate::model::{EntryType, Reference};

/// Returns the required fields for a given BibTeX entry type.
fn required_fields(entry_type: &EntryType) -> &'static [&'static str] {
    match entry_type {
        EntryType::Article => &["author", "title", "journal", "year"],
        EntryType::Book => &["title", "publisher", "year"],
        EntryType::Booklet => &["title"],
        EntryType::InBook => &["title", "publisher", "year", "chapter"],
        EntryType::InCollection => &["author", "title", "booktitle", "publisher", "year"],
        EntryType::InProceedings => &["author", "title", "booktitle", "year"],
        EntryType::Manual => &["title"],
        EntryType::MastersThesis => &["author", "title", "school", "year"],
        EntryType::Misc => &[],
        EntryType::PhdThesis => &["author", "title", "school", "year"],
        EntryType::Proceedings => &["title", "year"],
        EntryType::TechReport => &["author", "title", "institution", "year"],
        EntryType::Unpublished => &["author", "title", "note"],
        EntryType::Online => &["title", "url"],
    }
}

/// Validate that a reference has all required fields for BibTeX export.
/// Returns an error describing the first missing required field found.
pub fn validate_for_export(reference: &Reference) -> Result<(), BibtexError> {
    let required = required_fields(&reference.entry_type);

    for &field in required {
        let present = match field {
            "author" => !reference.authors.is_empty(),
            "editor" => !reference.editors.is_empty(),
            _ => reference.fields.contains_key(field),
        };
        if !present {
            return Err(BibtexError::MissingRequiredField {
                cite_key: reference.cite_key.clone(),
                entry_type: reference.entry_type.bibtex_name().to_owned(),
                field: field.to_owned(),
            });
        }
    }

    Ok(())
}

#[cfg(test)]
mod tests {
    use super::*;
    use crate::model::{EntryType, Person, Reference};

    fn make_article() -> Reference {
        let mut r = Reference::new("doe2024", EntryType::Article);
        r.authors.push(Person::new("Doe"));
        r.fields.insert("title".into(), "A Paper".into());
        r.fields.insert("journal".into(), "Nature".into());
        r.fields.insert("year".into(), "2024".into());
        r
    }

    #[test]
    fn valid_article_passes() {
        let r = make_article();
        assert!(validate_for_export(&r).is_ok());
    }

    #[test]
    fn article_missing_author_fails() {
        let mut r = make_article();
        r.authors.clear();
        let err = validate_for_export(&r).unwrap_err();
        assert!(
            matches!(err, BibtexError::MissingRequiredField { field, .. } if field == "author")
        );
    }

    #[test]
    fn article_missing_journal_fails() {
        let mut r = make_article();
        r.fields.remove("journal");
        let err = validate_for_export(&r).unwrap_err();
        assert!(
            matches!(err, BibtexError::MissingRequiredField { field, .. } if field == "journal")
        );
    }

    #[test]
    fn misc_has_no_required_fields() {
        let r = Reference::new("anon", EntryType::Misc);
        assert!(validate_for_export(&r).is_ok());
    }

    #[test]
    fn phd_thesis_requires_school() {
        let mut r = Reference::new("smith2020", EntryType::PhdThesis);
        r.authors.push(Person::new("Smith"));
        r.fields.insert("title".into(), "A Thesis".into());
        r.fields.insert("year".into(), "2020".into());
        let err = validate_for_export(&r).unwrap_err();
        assert!(
            matches!(err, BibtexError::MissingRequiredField { field, .. } if field == "school")
        );
    }
}
286 brittle-core/src/bin/seed.rs Normal file
@@ -0,0 +1,286 @@
//! Creates an example Brittle repository with realistic academic references.
//!
//! For references that have freely available PDFs (arXiv preprints and open
//! author copies), the script downloads the PDF and attaches it to the
//! reference. Downloads that fail are skipped with a warning so the seed
//! always completes even without network access.
//!
//! Usage:
//!     brittle-seed [PATH]
//!
//! PATH defaults to `~/brittle-example`. The directory must not already
//! contain a git repository.

use std::io::Read;
use std::path::PathBuf;

use brittle_core::{Brittle, EntryType, FsStore, Person, ReferenceId};

fn main() {
    let path = match std::env::args().nth(1) {
        Some(p) => PathBuf::from(p),
        None => {
            let home = std::env::var("HOME").expect("HOME not set");
            PathBuf::from(home).join("brittle-example")
        }
    };

    if path.join(".git").exists() {
        eprintln!("error: {} already contains a git repository", path.display());
        std::process::exit(1);
    }

    std::fs::create_dir_all(&path).expect("could not create directory");

    println!("Creating repository at {} …", path.display());
    let mut b = Brittle::create(&path).expect("create repository");

    // ── Libraries ─────────────────────────────────────────────────────────────

    let cs = b.create_library("Computer Science", None).unwrap();
    let ml = b.create_library("Machine Learning", Some(cs.id)).unwrap();
    let sys = b.create_library("Systems", Some(cs.id)).unwrap();
    let math = b.create_library("Mathematics", None).unwrap();
    let pl = b.create_library("Programming Languages", Some(cs.id)).unwrap();

    // ── References ────────────────────────────────────────────────────────────

    // -- Machine Learning --

    let mut r = b.create_reference("lecun1998gradient", EntryType::Article).unwrap();
    r.authors = vec![
        person("LeCun", "Yann"),
        person("Bottou", "Léon"),
        person("Bengio", "Yoshua"),
        person("Haffner", "Patrick"),
    ];
    r.fields.insert("title".into(), "Gradient-based learning applied to document recognition".into());
    r.fields.insert("journal".into(), "Proceedings of the IEEE".into());
    r.fields.insert("volume".into(), "86".into());
    r.fields.insert("number".into(), "11".into());
    r.fields.insert("pages".into(), "2278--2324".into());
    r.fields.insert("year".into(), "1998".into());
    let id = r.id;
    b.update_reference(r).unwrap();
    b.add_to_library(ml.id, id).unwrap();
    attach_pdf(&mut b, id, "http://yann.lecun.com/exdb/publis/pdf/lecun-01a.pdf");

    let mut r = b.create_reference("vaswani2017attention", EntryType::InProceedings).unwrap();
    r.authors = vec![
        person("Vaswani", "Ashish"),
        person("Shazeer", "Noam"),
        person("Parmar", "Niki"),
        person("Uszkoreit", "Jakob"),
        person("Jones", "Llion"),
        person("Gomez", "Aidan N."),
        person("Kaiser", "Łukasz"),
        person("Polosukhin", "Illia"),
    ];
    r.fields.insert("title".into(), "Attention Is All You Need".into());
    r.fields.insert("booktitle".into(), "Advances in Neural Information Processing Systems".into());
    r.fields.insert("volume".into(), "30".into());
    r.fields.insert("year".into(), "2017".into());
    let id = r.id;
    b.update_reference(r).unwrap();
    b.add_to_library(ml.id, id).unwrap();
    attach_pdf(&mut b, id, "https://arxiv.org/pdf/1706.03762");

    let mut r = b.create_reference("goodfellow2016deep", EntryType::Book).unwrap();
    r.authors = vec![
        person("Goodfellow", "Ian"),
        person("Bengio", "Yoshua"),
        person("Courville", "Aaron"),
    ];
    r.fields.insert("title".into(), "Deep Learning".into());
    r.fields.insert("publisher".into(), "MIT Press".into());
    r.fields.insert("year".into(), "2016".into());
    r.fields.insert("url".into(), "http://www.deeplearningbook.org".into());
    let id = r.id;
    b.update_reference(r).unwrap();
    b.add_to_library(ml.id, id).unwrap();
    // No freely available PDF for this book.

    let mut r = b.create_reference("ho2020denoising", EntryType::InProceedings).unwrap();
    r.authors = vec![
        person("Ho", "Jonathan"),
        person("Jain", "Ajay"),
        person("Abbeel", "Pieter"),
    ];
    r.fields.insert("title".into(), "Denoising Diffusion Probabilistic Models".into());
    r.fields.insert("booktitle".into(), "Advances in Neural Information Processing Systems".into());
    r.fields.insert("volume".into(), "33".into());
    r.fields.insert("pages".into(), "6840--6851".into());
    r.fields.insert("year".into(), "2020".into());
    let id = r.id;
    b.update_reference(r).unwrap();
    b.add_to_library(ml.id, id).unwrap();
    attach_pdf(&mut b, id, "https://arxiv.org/pdf/2006.11239");

    // -- Systems --

    let mut r = b.create_reference("lamport1978time", EntryType::Article).unwrap();
    r.authors = vec![person("Lamport", "Leslie")];
    r.fields.insert("title".into(), "Time, Clocks, and the Ordering of Events in a Distributed System".into());
    r.fields.insert("journal".into(), "Communications of the ACM".into());
    r.fields.insert("volume".into(), "21".into());
    r.fields.insert("number".into(), "7".into());
    r.fields.insert("pages".into(), "558--565".into());
    r.fields.insert("year".into(), "1978".into());
    let id = r.id;
    b.update_reference(r).unwrap();
    b.add_to_library(sys.id, id).unwrap();
    attach_pdf(&mut b, id, "https://lamport.azurewebsites.net/pubs/time-clocks.pdf");

    let mut r = b.create_reference("rosenblum1992lfs", EntryType::Article).unwrap();
    r.authors = vec![
        person("Rosenblum", "Mendel"),
        person("Ousterhout", "John K."),
    ];
    r.fields.insert("title".into(), "The Design and Implementation of a Log-Structured File System".into());
    r.fields.insert("journal".into(), "ACM Transactions on Computer Systems".into());
    r.fields.insert("volume".into(), "10".into());
    r.fields.insert("number".into(), "1".into());
    r.fields.insert("pages".into(), "26--52".into());
    r.fields.insert("year".into(), "1992".into());
    let id = r.id;
    b.update_reference(r).unwrap();
    b.add_to_library(sys.id, id).unwrap();
    // Paywalled; no freely available PDF.

    let mut r = b.create_reference("dean2004mapreduce", EntryType::InProceedings).unwrap();
    r.authors = vec![
        person("Dean", "Jeffrey"),
        person("Ghemawat", "Sanjay"),
    ];
    r.fields.insert("title".into(), "MapReduce: Simplified Data Processing on Large Clusters".into());
    r.fields.insert("booktitle".into(), "OSDI".into());
    r.fields.insert("pages".into(), "137--150".into());
    r.fields.insert("year".into(), "2004".into());
    let id = r.id;
    b.update_reference(r).unwrap();
    b.add_to_library(sys.id, id).unwrap();
    attach_pdf(&mut b, id, "https://static.googleusercontent.com/media/research.google.com/en//archive/mapreduce-osdi04.pdf");

    // -- Programming Languages --

    let mut r = b.create_reference("milner1978polymorphism", EntryType::Article).unwrap();
    r.authors = vec![person("Milner", "Robin")];
    r.fields.insert("title".into(), "A Theory of Type Polymorphism in Programming".into());
    r.fields.insert("journal".into(), "Journal of Computer and System Sciences".into());
    r.fields.insert("volume".into(), "17".into());
    r.fields.insert("number".into(), "3".into());
    r.fields.insert("pages".into(), "348--375".into());
    r.fields.insert("year".into(), "1978".into());
    let id = r.id;
    b.update_reference(r).unwrap();
    b.add_to_library(pl.id, id).unwrap();
    // Paywalled; no freely available PDF.

    let mut r = b.create_reference("matsakis2014rust", EntryType::InProceedings).unwrap();
    r.authors = vec![
        person("Matsakis", "Nicholas D."),
        person("Klock", "Felix S."),
    ];
    r.fields.insert("title".into(), "The Rust Language".into());
    r.fields.insert("booktitle".into(), "ACM SIGAda Annual Conference on High Integrity Language Technology".into());
    r.fields.insert("pages".into(), "103--104".into());
    r.fields.insert("year".into(), "2014".into());
    let id = r.id;
    b.update_reference(r).unwrap();
    b.add_to_library(pl.id, id).unwrap();
    // Paywalled; no freely available PDF.

    // -- Mathematics --

    let mut r = b.create_reference("turing1936computable", EntryType::Article).unwrap();
    r.authors = vec![person("Turing", "Alan M.")];
    r.fields.insert("title".into(), "On Computable Numbers, with an Application to the Entscheidungsproblem".into());
    r.fields.insert("journal".into(), "Proceedings of the London Mathematical Society".into());
    r.fields.insert("volume".into(), "42".into());
    r.fields.insert("number".into(), "1".into());
    r.fields.insert("pages".into(), "230--265".into());
    r.fields.insert("year".into(), "1936".into());
    let id = r.id;
    b.update_reference(r).unwrap();
    b.add_to_library(math.id, id).unwrap();
    // No freely available PDF.

    let mut r = b.create_reference("knuth1984texbook", EntryType::Book).unwrap();
    r.authors = vec![person("Knuth", "Donald E.")];
    r.fields.insert("title".into(), "The TeXbook".into());
    r.fields.insert("publisher".into(), "Addison-Wesley".into());
    r.fields.insert("year".into(), "1984".into());
    r.fields.insert("series".into(), "Computers and Typesetting".into());
    r.fields.insert("volume".into(), "A".into());
    let id = r.id;
    b.update_reference(r).unwrap();
    b.add_to_library(math.id, id).unwrap();
    // Copyrighted book; no freely available PDF.

    // A reference in both ML and Mathematics (cross-library membership).
    let mut r = b.create_reference("cybenko1989approximation", EntryType::Article).unwrap();
    r.authors = vec![person("Cybenko", "George")];
    r.fields.insert("title".into(), "Approximation by Superpositions of a Sigmoidal Function".into());
    r.fields.insert("journal".into(), "Mathematics of Control, Signals, and Systems".into());
    r.fields.insert("volume".into(), "2".into());
    r.fields.insert("number".into(), "4".into());
    r.fields.insert("pages".into(), "303--314".into());
    r.fields.insert("year".into(), "1989".into());
    let id = r.id;
    b.update_reference(r).unwrap();
    b.add_to_library(ml.id, id).unwrap();
    b.add_to_library(math.id, id).unwrap();
    // Paywalled; no freely available PDF.

    println!();
    println!("Done.");
    println!();
    println!("  Libraries : Computer Science (Machine Learning, Systems, Programming Languages), Mathematics");
    println!("  References: 12 across all libraries");
    println!();
    println!("Open the repository in Brittle with:  :open {}", path.display());
}

// ── PDF download ──────────────────────────────────────────────────────────────

/// Download the PDF at `url` and attach it to `id`. Prints progress and,
/// on any error, skips with a short message so the seed always completes.
fn attach_pdf(b: &mut Brittle<FsStore>, id: ReferenceId, url: &str) {
    let label = url.rsplit('/').next().unwrap_or(url);
    print!("  ↓ {label} … ");
    std::io::Write::flush(&mut std::io::stdout()).ok();

    match download(url) {
        Err(e) => println!("skipped ({e})"),
        Ok(bytes) => {
            let tmp = std::env::temp_dir().join(format!("{id}.pdf"));
            if let Err(e) = std::fs::write(&tmp, &bytes) {
                println!("skipped (write: {e})");
                return;
            }
            match b.attach_pdf(id, &tmp) {
                Ok(_) => println!("{} KB", bytes.len() / 1024),
                Err(e) => println!("skipped (attach: {e})"),
            }
            let _ = std::fs::remove_file(&tmp);
        }
    }
}

fn download(url: &str) -> Result<Vec<u8>, Box<dyn std::error::Error>> {
    let resp = ureq::get(url).call()?;
    let mut buf = Vec::new();
    resp.into_reader().read_to_end(&mut buf)?;
    Ok(buf)
}

// ── Helpers ───────────────────────────────────────────────────────────────────

fn person(family: &str, given: &str) -> Person {
    Person {
        family: family.into(),
        given: Some(given.into()),
        prefix: None,
        suffix: None,
    }
}
104
brittle-core/src/error.rs
Normal file
104
brittle-core/src/error.rs
Normal file
@@ -0,0 +1,104 @@
use std::path::PathBuf;
use thiserror::Error;

/// Top-level error returned from all public Brittle API methods.
#[derive(Debug, Error)]
pub enum BrittleError {
    #[error("{0}")]
    Store(#[from] StoreError),

    #[error("{0}")]
    Validation(#[from] ValidationError),

    #[error("{0}")]
    BibTeX(#[from] BibtexError),
}

/// Errors from the storage layer.
#[derive(Debug, Error)]
pub enum StoreError {
    #[error("{entity_type} not found: {id}")]
    NotFound { entity_type: EntityType, id: String },

    #[error("I/O error: {0}")]
    Io(#[from] std::io::Error),

    #[error("serialization error: {message}")]
    Serialization { message: String },

    // `PathBuf` does not implement `Display`, so path fields are
    // interpolated with `{path:?}` (Debug) in these messages.
    #[error("deserialization error for {path:?}: {message}")]
    Deserialization { path: PathBuf, message: String },

    #[error("git error: {0}")]
    Git(#[from] git2::Error),

    #[error("repository not found at {path:?}")]
    RepoNotFound { path: PathBuf },

    #[error("repository already exists at {path:?}")]
    RepoAlreadyExists { path: PathBuf },
}

/// The kind of entity involved in a not-found error.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum EntityType {
    Reference,
    Library,
    Annotation,
    Snapshot,
}

impl std::fmt::Display for EntityType {
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        match self {
            EntityType::Reference => write!(f, "Reference"),
            EntityType::Library => write!(f, "Library"),
            EntityType::Annotation => write!(f, "Annotation"),
            EntityType::Snapshot => write!(f, "Snapshot"),
        }
    }
}

/// Business logic validation errors.
#[derive(Debug, Error)]
pub enum ValidationError {
    #[error(
        "library cycle detected: moving library {library_id} under {parent_id} would create a cycle"
    )]
    LibraryCycle {
        library_id: String,
        parent_id: String,
    },

    #[error("library {id} has children and cannot be deleted; delete or move children first")]
    LibraryHasChildren { id: String },

    #[error("cite key already exists: {cite_key}")]
    DuplicateCiteKey { cite_key: String },

    #[error("cite key cannot be empty")]
    EmptyCiteKey,

    #[error("library name cannot be empty")]
    EmptyLibraryName,

    #[error("reference {reference_id} has no PDF attached")]
    NoPdfAttached { reference_id: String },

    #[error("PDF file not found: {path:?}")]
    PdfNotFound { path: PathBuf },

    #[error("there are uncommitted changes; create a snapshot or call discard_changes() first")]
    UncommittedChanges,
}

/// Errors specific to BibTeX export.
#[derive(Debug, Error)]
pub enum BibtexError {
    #[error("reference '{cite_key}' ({entry_type}): missing required field '{field}'")]
    MissingRequiredField {
        cite_key: String,
        entry_type: String,
        field: String,
    },
}
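The structured variants above are what keep `brittle-core` transport-agnostic: an adapter can branch on the error *kind* instead of parsing message strings. A std-only sketch of that pattern (the mini enum and the HTTP mapping are illustrative stand-ins, not part of the crate):

```rust
// Std-only stand-in for `StoreError`, to show the idea in isolation.
#[derive(Debug)]
enum MiniStoreError {
    NotFound { entity_type: String, id: String },
    Io(std::io::Error),
}

/// Hypothetical REST adapter: pick an HTTP status from the error *variant*,
/// which stringly-typed errors cannot support reliably.
fn http_status(err: &MiniStoreError) -> u16 {
    match err {
        MiniStoreError::NotFound { .. } => 404,
        MiniStoreError::Io(_) => 500,
    }
}

fn main() {
    let err = MiniStoreError::NotFound {
        entity_type: "Reference".into(),
        id: "abc".into(),
    };
    println!("{}", http_status(&err)); // prints 404
}
```

A gRPC or Tauri IPC layer would do the same mapping to its own status codes, with no change to the core.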
brittle-core/src/lib.rs (new file, 1051 lines; diff suppressed because it is too large)
brittle-core/src/model/annotation.rs (new file, 229 lines)
use crate::model::ids::{AnnotationId, ReferenceId};
use chrono::{DateTime, Utc};
use serde::{Deserialize, Serialize};

/// A point in PDF coordinate space.
/// Origin is bottom-left; units are points (1/72 inch), matching ISO 32000.
#[derive(Debug, Clone, Copy, PartialEq, Serialize, Deserialize)]
pub struct Point {
    pub x: f64,
    pub y: f64,
}

/// A rectangle in PDF coordinate space.
#[derive(Debug, Clone, Copy, PartialEq, Serialize, Deserialize)]
pub struct Rect {
    pub x: f64,
    pub y: f64,
    pub width: f64,
    pub height: f64,
}

/// A quadrilateral for text markup annotations (highlight, underline, etc.).
/// Four points define one region, typically one line of text.
/// Matches the PDF spec QuadPoints representation (4 vertices per quad).
#[derive(Debug, Clone, Copy, PartialEq, Serialize, Deserialize)]
pub struct Quad {
    pub points: [Point; 4],
}

/// RGBA color.
#[derive(Debug, Clone, Copy, PartialEq, Eq, Serialize, Deserialize)]
pub struct Color {
    pub r: u8,
    pub g: u8,
    pub b: u8,
    pub a: u8,
}

impl Color {
    pub const YELLOW: Color = Color {
        r: 255,
        g: 255,
        b: 0,
        a: 128,
    };
    pub const RED: Color = Color {
        r: 255,
        g: 0,
        b: 0,
        a: 128,
    };
    pub const GREEN: Color = Color {
        r: 0,
        g: 255,
        b: 0,
        a: 128,
    };
}

/// The four text markup annotation types defined in ISO 32000.
/// All share the same QuadPoints-based geometry.
#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
#[serde(rename_all = "lowercase")]
pub enum TextMarkupType {
    Highlight,
    Underline,
    Squiggly,
    StrikeOut,
}

/// The kind of annotation and its type-specific geometry/data.
#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(tag = "type", rename_all = "lowercase")]
pub enum AnnotationType {
    /// Text markup (highlight, underline, squiggly, strikeout).
    /// Uses QuadPoints per PDF spec for precise multi-line region selection.
    TextMarkup {
        markup_type: TextMarkupType,
        quads: Vec<Quad>,
        color: Color,
        /// The selected text, stored for search and export without re-reading the PDF.
        selected_text: Option<String>,
    },
    /// Sticky note (popup comment).
    Note { position: Point },
    /// Inline text box.
    FreeText { rect: Rect },
    /// Freehand ink drawing (e.g., circling a diagram).
    Ink {
        /// Multiple strokes, each a sequence of connected points.
        paths: Vec<Vec<Point>>,
        color: Color,
        /// Stroke width in points.
        width: f64,
    },
    /// Area/image selection for extracting figures from PDFs.
    Area { rect: Rect },
}

/// A single annotation on a PDF page.
#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
pub struct Annotation {
    pub id: AnnotationId,
    pub reference_id: ReferenceId,
    /// 0-indexed physical page number.
    pub page: u32,
    /// Display page label (e.g., "iv", "23") — may differ from the physical page index.
    pub page_label: Option<String>,
    pub annotation_type: AnnotationType,
    /// Free-form text: note body, comment on a highlight, etc.
    pub content: Option<String>,
    pub created_at: DateTime<Utc>,
    pub modified_at: DateTime<Utc>,
}

impl Annotation {
    pub fn new(reference_id: ReferenceId, page: u32, annotation_type: AnnotationType) -> Self {
        let now = Utc::now();
        Self {
            id: AnnotationId::new(),
            reference_id,
            page,
            page_label: None,
            annotation_type,
            content: None,
            created_at: now,
            modified_at: now,
        }
    }
}

/// All annotations for a single reference, stored as one file.
#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
pub struct AnnotationSet {
    pub reference_id: ReferenceId,
    pub annotations: Vec<Annotation>,
}

impl AnnotationSet {
    pub fn new(reference_id: ReferenceId) -> Self {
        Self {
            reference_id,
            annotations: Vec::new(),
        }
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    fn make_highlight() -> AnnotationType {
        AnnotationType::TextMarkup {
            markup_type: TextMarkupType::Highlight,
            quads: vec![Quad {
                points: [
                    Point { x: 10.0, y: 20.0 },
                    Point { x: 100.0, y: 20.0 },
                    Point { x: 10.0, y: 30.0 },
                    Point { x: 100.0, y: 30.0 },
                ],
            }],
            color: Color::YELLOW,
            selected_text: Some("important text".into()),
        }
    }

    #[test]
    fn annotation_serde_round_trip_highlight() {
        let ref_id = ReferenceId::new();
        let set = AnnotationSet {
            reference_id: ref_id,
            annotations: vec![Annotation::new(ref_id, 3, make_highlight())],
        };

        let toml_str = toml::to_string(&set).expect("serialize");
        let set2: AnnotationSet = toml::from_str(&toml_str).expect("deserialize");

        assert_eq!(set.reference_id, set2.reference_id);
        assert_eq!(set.annotations.len(), set2.annotations.len());
        assert_eq!(set.annotations[0].page, set2.annotations[0].page);
    }

    #[test]
    fn annotation_serde_round_trip_ink() {
        let ref_id = ReferenceId::new();
        let ink = AnnotationType::Ink {
            paths: vec![vec![Point { x: 0.0, y: 0.0 }, Point { x: 10.0, y: 10.0 }]],
            color: Color::RED,
            width: 2.0,
        };
        let set = AnnotationSet {
            reference_id: ref_id,
            annotations: vec![Annotation::new(ref_id, 0, ink)],
        };

        let toml_str = toml::to_string(&set).expect("serialize");
        let set2: AnnotationSet = toml::from_str(&toml_str).expect("deserialize");
        assert_eq!(set, set2);
    }

    #[test]
    fn all_markup_types_serialize() {
        let ref_id = ReferenceId::new();
        for markup_type in [
            TextMarkupType::Highlight,
            TextMarkupType::Underline,
            TextMarkupType::Squiggly,
            TextMarkupType::StrikeOut,
        ] {
            let ann = Annotation::new(
                ref_id,
                0,
                AnnotationType::TextMarkup {
                    markup_type,
                    quads: vec![],
                    color: Color::GREEN,
                    selected_text: None,
                },
            );
            let set = AnnotationSet {
                reference_id: ref_id,
                annotations: vec![ann],
            };
            let toml_str = toml::to_string(&set).expect("serialize");
            let _: AnnotationSet = toml::from_str(&toml_str).expect("deserialize");
        }
    }
}
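Because `Quad` stores four explicit vertices, consumers often need the axis-aligned bounding box of a quad, e.g. for hit-testing a highlight. A std-only sketch with local stand-ins for the `Point` and `Rect` shapes above (the `bounding_rect` helper is hypothetical, not part of annotation.rs):

```rust
// Local copies of the shapes, so the sketch compiles standalone.
#[derive(Debug, Clone, Copy)]
struct Point { x: f64, y: f64 }

#[derive(Debug, PartialEq)]
struct Rect { x: f64, y: f64, width: f64, height: f64 }

/// Hypothetical helper: smallest axis-aligned rectangle containing the quad.
fn bounding_rect(points: &[Point; 4]) -> Rect {
    let (mut min_x, mut min_y) = (f64::INFINITY, f64::INFINITY);
    let (mut max_x, mut max_y) = (f64::NEG_INFINITY, f64::NEG_INFINITY);
    for p in points {
        min_x = min_x.min(p.x);
        min_y = min_y.min(p.y);
        max_x = max_x.max(p.x);
        max_y = max_y.max(p.y);
    }
    Rect { x: min_x, y: min_y, width: max_x - min_x, height: max_y - min_y }
}

fn main() {
    // Same quad as in the `make_highlight` test fixture above.
    let quad = [
        Point { x: 10.0, y: 20.0 },
        Point { x: 100.0, y: 20.0 },
        Point { x: 10.0, y: 30.0 },
        Point { x: 100.0, y: 30.0 },
    ];
    let r = bounding_rect(&quad);
    println!("{} {} {} {}", r.x, r.y, r.width, r.height); // prints 10 20 90 10
}
```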
brittle-core/src/model/ids.rs (new file, 67 lines)
use serde::{Deserialize, Serialize};
use std::fmt;
use uuid::Uuid;

macro_rules! define_id {
    ($name:ident) => {
        #[derive(
            Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Hash, Serialize, Deserialize,
        )]
        pub struct $name(pub Uuid);

        impl $name {
            pub fn new() -> Self {
                Self(Uuid::now_v7())
            }
        }

        impl Default for $name {
            fn default() -> Self {
                Self::new()
            }
        }

        impl From<Uuid> for $name {
            fn from(uuid: Uuid) -> Self {
                Self(uuid)
            }
        }

        impl fmt::Display for $name {
            fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
                write!(f, "{}", self.0)
            }
        }
    };
}

define_id!(ReferenceId);
define_id!(LibraryId);
define_id!(AnnotationId);

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn new_generates_unique_ids() {
        let a = ReferenceId::new();
        let b = ReferenceId::new();
        assert_ne!(a, b);
    }

    #[test]
    fn display_is_uuid_format() {
        let id = ReferenceId::new();
        let s = id.to_string();
        assert_eq!(s.len(), 36); // UUID hyphenated format
    }

    #[test]
    fn serde_round_trip() {
        let id = LibraryId::new();
        let json = serde_json::to_string(&id).unwrap();
        let id2: LibraryId = serde_json::from_str(&json).unwrap();
        assert_eq!(id, id2);
    }
}
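The point of generating a distinct newtype per entity, rather than passing bare `Uuid`s around, is that the compiler rejects mixed-up id kinds at the call site. A std-only sketch with `u128` standing in for `Uuid` (the functions and types here are illustrative, not crate API):

```rust
// Two id newtypes over the same underlying value type.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
struct ReferenceId(u128);

#[derive(Debug, Clone, Copy, PartialEq, Eq)]
struct LibraryId(u128);

/// Hypothetical API that only accepts a ReferenceId.
fn load_reference(id: ReferenceId) -> String {
    format!("reference {}", id.0)
}

fn main() {
    let r = ReferenceId(7);
    let _l = LibraryId(7);
    // load_reference(_l); // compile error: expected `ReferenceId`, found `LibraryId`
    println!("{}", load_reference(r)); // prints "reference 7"
}
```

`Uuid::now_v7()` additionally makes ids time-ordered, which is why the sorted filenames in the store roughly follow creation order.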
brittle-core/src/model/library.rs (new file, 66 lines)
use crate::model::ids::{LibraryId, ReferenceId};
use chrono::{DateTime, Utc};
use serde::{Deserialize, Serialize};
use std::collections::BTreeSet;

/// A named collection of references. Forms a tree via `parent_id`.
///
/// References are not "owned" by a library — they exist in a flat pool.
/// A reference can appear in multiple libraries (multi-membership).
#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
pub struct Library {
    pub id: LibraryId,
    pub name: String,
    /// `None` means this is a root library (no parent).
    pub parent_id: Option<LibraryId>,
    /// The set of references that are members of this library.
    /// BTreeSet for deterministic serialization order.
    pub members: BTreeSet<ReferenceId>,
    pub created_at: DateTime<Utc>,
    pub modified_at: DateTime<Utc>,
}

impl Library {
    pub fn new(name: impl Into<String>, parent_id: Option<LibraryId>) -> Self {
        let now = Utc::now();
        Self {
            id: LibraryId::new(),
            name: name.into(),
            parent_id,
            members: BTreeSet::new(),
            created_at: now,
            modified_at: now,
        }
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn library_serde_round_trip() {
        let mut lib = Library::new("Machine Learning", None);
        let ref_id = ReferenceId::new();
        lib.members.insert(ref_id);

        let toml_str = toml::to_string(&lib).expect("serialize to TOML");
        let lib2: Library = toml::from_str(&toml_str).expect("deserialize from TOML");

        assert_eq!(lib.id, lib2.id);
        assert_eq!(lib.name, lib2.name);
        assert_eq!(lib.members, lib2.members);
        assert_eq!(lib.parent_id, lib2.parent_id);
    }

    #[test]
    fn nested_library_serde_round_trip() {
        let parent = Library::new("Science", None);
        let child = Library::new("Physics", Some(parent.id));

        let toml_str = toml::to_string(&child).expect("serialize to TOML");
        let child2: Library = toml::from_str(&toml_str).expect("deserialize from TOML");

        assert_eq!(child2.parent_id, Some(parent.id));
    }
}
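Multi-membership in practice: because a library stores ids rather than owning references, the same id can sit in several `members` sets at once. A std-only sketch (`MiniLibrary` and the `u128` ids are illustrative stand-ins for `Library` and `ReferenceId`):

```rust
use std::collections::BTreeSet;

// Minimal stand-in for `Library`: a name plus a set of member ids.
struct MiniLibrary {
    name: String,
    members: BTreeSet<u128>,
}

fn main() {
    let shared = 42u128; // one reference id, shared by two libraries
    let mut ml = MiniLibrary { name: "Machine Learning".into(), members: BTreeSet::new() };
    let mut stats = MiniLibrary { name: "Statistics".into(), members: BTreeSet::new() };
    ml.members.insert(shared);
    stats.members.insert(shared);
    // Both libraries now list the same reference; neither owns it.
    println!("{} {}", ml.name.len() > 0 && ml.members.contains(&shared),
             stats.members.contains(&shared)); // prints "true true"
}
```

Deleting the reference itself would then mean removing its id from every library's `members` set, which is core-crate business logic rather than store logic.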
brittle-core/src/model/mod.rs (new file, 13 lines)
pub mod annotation;
pub mod ids;
pub mod library;
pub mod reference;
pub mod snapshot;

pub use annotation::{
    Annotation, AnnotationSet, AnnotationType, Color, Point, Quad, Rect, TextMarkupType,
};
pub use ids::{AnnotationId, LibraryId, ReferenceId};
pub use library::Library;
pub use reference::{EntryType, PdfAttachment, Person, Reference};
pub use snapshot::Snapshot;
brittle-core/src/model/reference.rs (new file, 241 lines)
use crate::model::ids::ReferenceId;
use chrono::{DateTime, Utc};
use serde::{Deserialize, Serialize};
use std::collections::BTreeMap;
use std::fmt;
use std::path::PathBuf;

/// A person (author, editor, translator, etc.).
#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
pub struct Person {
    pub family: String,
    pub given: Option<String>,
    /// Name prefix: "von", "de", "van der", etc.
    pub prefix: Option<String>,
    /// Name suffix: "Jr.", "III", etc.
    pub suffix: Option<String>,
}

impl Person {
    pub fn new(family: impl Into<String>) -> Self {
        Self {
            family: family.into(),
            given: None,
            prefix: None,
            suffix: None,
        }
    }

    /// Format for display: "Given prefix Family, Suffix" — natural reading order.
    pub fn display_name(&self) -> String {
        let mut parts = Vec::new();
        if let Some(given) = &self.given {
            parts.push(given.as_str());
        }
        if let Some(prefix) = &self.prefix {
            parts.push(prefix.as_str());
        }
        parts.push(self.family.as_str());
        let mut name = parts.join(" ");
        if let Some(suffix) = &self.suffix {
            name.push_str(", ");
            name.push_str(suffix);
        }
        name
    }

    /// Format as BibTeX expects: "{prefix} {family}, {suffix}, {given}".
    /// Falls back gracefully when optional parts are absent.
    pub fn to_bibtex(&self) -> String {
        let mut family_part = String::new();
        if let Some(prefix) = &self.prefix {
            family_part.push_str(prefix);
            family_part.push(' ');
        }
        family_part.push_str(&self.family);

        match (&self.suffix, &self.given) {
            (Some(suffix), Some(given)) => {
                format!("{family_part}, {suffix}, {given}")
            }
            (Some(suffix), None) => format!("{family_part}, {suffix}"),
            (None, Some(given)) => format!("{family_part}, {given}"),
            (None, None) => family_part,
        }
    }
}

impl fmt::Display for Person {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "{}", self.display_name())
    }
}

/// Standard BibTeX and common BibLaTeX entry types.
#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
#[serde(rename_all = "lowercase")]
pub enum EntryType {
    Article,
    Book,
    Booklet,
    InBook,
    InCollection,
    InProceedings,
    Manual,
    MastersThesis,
    Misc,
    PhdThesis,
    Proceedings,
    TechReport,
    Unpublished,
    Online,
}

impl EntryType {
    /// The BibTeX entry type name as it appears in `.bib` files.
    pub fn bibtex_name(&self) -> &'static str {
        match self {
            EntryType::Article => "article",
            EntryType::Book => "book",
            EntryType::Booklet => "booklet",
            EntryType::InBook => "inbook",
            EntryType::InCollection => "incollection",
            EntryType::InProceedings => "inproceedings",
            EntryType::Manual => "manual",
            EntryType::MastersThesis => "mastersthesis",
            EntryType::Misc => "misc",
            EntryType::PhdThesis => "phdthesis",
            EntryType::Proceedings => "proceedings",
            EntryType::TechReport => "techreport",
            EntryType::Unpublished => "unpublished",
            EntryType::Online => "online",
        }
    }
}

/// A PDF file stored inside the Brittle repository.
#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
pub struct PdfAttachment {
    /// Path relative to the repository root (e.g., `"pdfs/550e8400-....pdf"`).
    pub stored_path: PathBuf,
    /// SHA-256 hex digest of the file contents for integrity verification.
    pub content_hash: String,
}

/// A citable work. The core entity of Brittle.
#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
pub struct Reference {
    pub id: ReferenceId,
    /// The BibTeX cite key (e.g., `"knuth1984texbook"`). User-facing and mutable.
    pub cite_key: String,
    pub entry_type: EntryType,
    /// Authors listed in order.
    pub authors: Vec<Person>,
    /// Editors (for edited books, proceedings, etc.).
    pub editors: Vec<Person>,
    /// All other fields (title, year, journal, volume, etc.) as plain strings.
    /// BTreeMap for deterministic serialization order (important for git diffs).
    pub fields: BTreeMap<String, String>,
    pub pdf: Option<PdfAttachment>,
    pub created_at: DateTime<Utc>,
    pub modified_at: DateTime<Utc>,
}

impl Reference {
    pub fn new(cite_key: impl Into<String>, entry_type: EntryType) -> Self {
        let now = Utc::now();
        Self {
            id: ReferenceId::new(),
            cite_key: cite_key.into(),
            entry_type,
            authors: Vec::new(),
            editors: Vec::new(),
            fields: BTreeMap::new(),
            pdf: None,
            created_at: now,
            modified_at: now,
        }
    }

    /// Returns the value of the `title` field, if present.
    pub fn title(&self) -> Option<&str> {
        self.fields.get("title").map(String::as_str)
    }

    /// Returns the value of the `year` field, if present.
    pub fn year(&self) -> Option<&str> {
        self.fields.get("year").map(String::as_str)
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn person_bibtex_full() {
        let p = Person {
            family: "Dijkstra".into(),
            given: Some("Edsger W.".into()),
            prefix: None,
            suffix: None,
        };
        assert_eq!(p.to_bibtex(), "Dijkstra, Edsger W.");
    }

    #[test]
    fn person_bibtex_with_prefix() {
        let p = Person {
            family: "Beethoven".into(),
            given: Some("Ludwig".into()),
            prefix: Some("van".into()),
            suffix: None,
        };
        assert_eq!(p.to_bibtex(), "van Beethoven, Ludwig");
    }

    #[test]
    fn person_bibtex_with_suffix() {
        let p = Person {
            family: "King".into(),
            given: Some("Martin Luther".into()),
            prefix: None,
            suffix: Some("Jr.".into()),
        };
        assert_eq!(p.to_bibtex(), "King, Jr., Martin Luther");
    }

    #[test]
    fn person_bibtex_family_only() {
        let p = Person::new("Aristotle");
        assert_eq!(p.to_bibtex(), "Aristotle");
    }

    #[test]
    fn reference_serde_round_trip() {
        let mut r = Reference::new("doe2024", EntryType::Article);
        r.authors.push(Person {
            family: "Doe".into(),
            given: Some("Jane".into()),
            prefix: None,
            suffix: None,
        });
        r.fields.insert("title".into(), "A Great Paper".into());
        r.fields.insert("year".into(), "2024".into());

        let toml_str = toml::to_string(&r).expect("serialize to TOML");
        let r2: Reference = toml::from_str(&toml_str).expect("deserialize from TOML");

        assert_eq!(r.id, r2.id);
        assert_eq!(r.cite_key, r2.cite_key);
        assert_eq!(r.authors, r2.authors);
        assert_eq!(r.fields, r2.fields);
    }

    #[test]
    fn entry_type_bibtex_names() {
        assert_eq!(EntryType::Article.bibtex_name(), "article");
        assert_eq!(EntryType::InProceedings.bibtex_name(), "inproceedings");
        assert_eq!(EntryType::PhdThesis.bibtex_name(), "phdthesis");
    }
}
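For experimenting with the name-formatting rules outside the crate, the `to_bibtex` logic above can be reduced to a free function over string slices. A std-only sketch mirroring the same match (the function name is illustrative, not crate API):

```rust
/// "{prefix} {family}, {suffix}, {given}", dropping whichever parts are absent.
fn bibtex_name(
    family: &str,
    given: Option<&str>,
    prefix: Option<&str>,
    suffix: Option<&str>,
) -> String {
    let family_part = match prefix {
        Some(p) => format!("{p} {family}"),
        None => family.to_string(),
    };
    match (suffix, given) {
        (Some(s), Some(g)) => format!("{family_part}, {s}, {g}"),
        (Some(s), None) => format!("{family_part}, {s}"),
        (None, Some(g)) => format!("{family_part}, {g}"),
        (None, None) => family_part,
    }
}

fn main() {
    println!("{}", bibtex_name("King", Some("Martin Luther"), None, Some("Jr.")));
    // prints "King, Jr., Martin Luther"
    println!("{}", bibtex_name("Beethoven", Some("Ludwig"), Some("van"), None));
    // prints "van Beethoven, Ludwig"
}
```

The comma-first form is what lets BibTeX distinguish the family part (including "von"-style prefixes) from the given names.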
brittle-core/src/model/snapshot.rs (new file, 12 lines)
use chrono::{DateTime, Utc};
use serde::Serialize;

/// Metadata about a stored snapshot (git commit).
/// Not serialized to files — read directly from git history.
#[derive(Debug, Clone, PartialEq, Eq, Serialize)]
pub struct Snapshot {
    /// Git commit SHA (hex string).
    pub id: String,
    pub message: String,
    pub timestamp: DateTime<Utc>,
}
brittle-core/src/store/fs.rs (new file, 448 lines)
use crate::error::{EntityType, StoreError};
use crate::model::{AnnotationSet, Library, LibraryId, Reference, ReferenceId, Snapshot};
use crate::store::Store;
use chrono::{DateTime, TimeZone, Utc};
use git2::{IndexAddOption, Repository, Signature};
use std::path::{Path, PathBuf};

const REFERENCES_DIR: &str = "references";
const LIBRARIES_DIR: &str = "libraries";
const ANNOTATIONS_DIR: &str = "annotations";
const PDFS_DIR: &str = "pdfs";

/// Filesystem + git-backed store. Each entity is a TOML file.
/// Snapshots are git commits; time travel is git checkout.
pub struct FsStore {
    root: PathBuf,
    repo: Repository,
}

impl FsStore {
    /// Create a new Brittle repository at the given path.
    /// Fails if the path already contains a git repository.
    pub fn create(path: &Path) -> Result<Self, StoreError> {
        if path.join(".git").exists() {
            return Err(StoreError::RepoAlreadyExists {
                path: path.to_owned(),
            });
        }

        let repo = Repository::init(path).map_err(StoreError::Git)?;

        // Create subdirectories.
        for dir in [REFERENCES_DIR, LIBRARIES_DIR, ANNOTATIONS_DIR, PDFS_DIR] {
            std::fs::create_dir_all(path.join(dir))?;
        }

        let mut store = Self {
            root: path.to_owned(),
            repo,
        };

        // Create the initial commit so the repo has a HEAD.
        store.commit_all("Initialize Brittle repository")?;

        Ok(store)
    }

    /// Open an existing Brittle repository.
    pub fn open(path: &Path) -> Result<Self, StoreError> {
        let repo = Repository::open(path).map_err(|_| StoreError::RepoNotFound {
            path: path.to_owned(),
        })?;
        Ok(Self {
            root: path.to_owned(),
            repo,
        })
    }

    /// Stage all changes and create a git commit. Returns the commit OID as hex.
    fn commit_all(&mut self, message: &str) -> Result<String, StoreError> {
        let mut index = self.repo.index().map_err(StoreError::Git)?;
        index
            .add_all(["*"].iter(), IndexAddOption::DEFAULT, None)
            .map_err(StoreError::Git)?;
        index.write().map_err(StoreError::Git)?;

        let tree_oid = index.write_tree().map_err(StoreError::Git)?;
        let tree = self.repo.find_tree(tree_oid).map_err(StoreError::Git)?;

        let sig = Signature::now("Brittle", "brittle@local").map_err(StoreError::Git)?;

        let parent_commit = self.repo.head().ok().and_then(|h| h.peel_to_commit().ok());

        let oid = match &parent_commit {
            Some(parent) => self
                .repo
                .commit(Some("HEAD"), &sig, &sig, message, &tree, &[parent])
                .map_err(StoreError::Git)?,
            None => self
                .repo
                .commit(Some("HEAD"), &sig, &sig, message, &tree, &[])
                .map_err(StoreError::Git)?,
        };

        Ok(oid.to_string())
    }

    fn reference_path(&self, id: ReferenceId) -> PathBuf {
        self.root.join(REFERENCES_DIR).join(format!("{id}.toml"))
    }

    fn library_path(&self, id: LibraryId) -> PathBuf {
        self.root.join(LIBRARIES_DIR).join(format!("{id}.toml"))
    }

    fn annotation_path(&self, ref_id: ReferenceId) -> PathBuf {
        self.root
            .join(ANNOTATIONS_DIR)
            .join(format!("{ref_id}.toml"))
    }

    pub fn pdf_dir(&self) -> PathBuf {
        self.root.join(PDFS_DIR)
    }

    /// Returns the repository root directory.
    pub fn root(&self) -> &Path {
        &self.root
    }

    fn write_toml<T: serde::Serialize>(&self, path: &Path, value: &T) -> Result<(), StoreError> {
        let content = toml::to_string(value).map_err(|e| StoreError::Serialization {
            message: e.to_string(),
        })?;
        std::fs::write(path, content)?;
        Ok(())
    }

    fn read_toml<T: serde::de::DeserializeOwned>(&self, path: &Path) -> Result<T, StoreError> {
        let content = std::fs::read_to_string(path)?;
        toml::from_str(&content).map_err(|e| StoreError::Deserialization {
            path: path.to_owned(),
            message: e.to_string(),
        })
    }

    fn ids_from_dir<T, F>(&self, dir: &str, parse: F) -> Result<Vec<T>, StoreError>
    where
        F: Fn(&str) -> Option<T>,
    {
        let dir_path = self.root.join(dir);
        let mut ids = Vec::new();
        for entry in std::fs::read_dir(&dir_path)? {
            let entry = entry?;
            let name = entry.file_name();
            let name = name.to_string_lossy();
            if let Some(stem) = name.strip_suffix(".toml")
                && let Some(id) = parse(stem)
            {
                ids.push(id);
            }
        }
        Ok(ids)
    }
}

impl Store for FsStore {
    fn save_reference(&mut self, reference: &Reference) -> Result<(), StoreError> {
        self.write_toml(&self.reference_path(reference.id), reference)
    }

    fn load_reference(&self, id: ReferenceId) -> Result<Reference, StoreError> {
        let path = self.reference_path(id);
        if !path.exists() {
            return Err(StoreError::NotFound {
                entity_type: EntityType::Reference,
                id: id.to_string(),
            });
        }
        self.read_toml(&path)
    }

    fn delete_reference(&mut self, id: ReferenceId) -> Result<(), StoreError> {
        let path = self.reference_path(id);
        if !path.exists() {
            return Err(StoreError::NotFound {
                entity_type: EntityType::Reference,
                id: id.to_string(),
            });
        }
        std::fs::remove_file(path)?;
        Ok(())
    }

    fn list_reference_ids(&self) -> Result<Vec<ReferenceId>, StoreError> {
        self.ids_from_dir(REFERENCES_DIR, |s| {
            s.parse::<uuid::Uuid>().ok().map(ReferenceId::from)
        })
    }

    fn save_library(&mut self, library: &Library) -> Result<(), StoreError> {
        self.write_toml(&self.library_path(library.id), library)
    }

    fn load_library(&self, id: LibraryId) -> Result<Library, StoreError> {
        let path = self.library_path(id);
        if !path.exists() {
            return Err(StoreError::NotFound {
                entity_type: EntityType::Library,
                id: id.to_string(),
            });
        }
        self.read_toml(&path)
    }

    fn delete_library(&mut self, id: LibraryId) -> Result<(), StoreError> {
        let path = self.library_path(id);
        if !path.exists() {
            return Err(StoreError::NotFound {
                entity_type: EntityType::Library,
                id: id.to_string(),
            });
        }
        std::fs::remove_file(path)?;
        Ok(())
    }

    fn list_library_ids(&self) -> Result<Vec<LibraryId>, StoreError> {
        self.ids_from_dir(LIBRARIES_DIR, |s| {
            s.parse::<uuid::Uuid>().ok().map(LibraryId::from)
        })
    }

    fn load_annotations(&self, ref_id: ReferenceId) -> Result<AnnotationSet, StoreError> {
        let path = self.annotation_path(ref_id);
        if !path.exists() {
            return Ok(AnnotationSet::new(ref_id));
        }
        self.read_toml(&path)
    }

    fn save_annotations(&mut self, set: &AnnotationSet) -> Result<(), StoreError> {
        self.write_toml(&self.annotation_path(set.reference_id), set)
    }

    fn delete_annotations(&mut self, ref_id: ReferenceId) -> Result<(), StoreError> {
        let path = self.annotation_path(ref_id);
        if path.exists() {
            std::fs::remove_file(path)?;
        }
        Ok(())
    }

    fn create_snapshot(&mut self, message: &str) -> Result<Snapshot, StoreError> {
        let oid = self.commit_all(message)?;
        let commit = self
            .repo
            .find_commit(git2::Oid::from_str(&oid).map_err(StoreError::Git)?)
            .map_err(StoreError::Git)?;
        let timestamp = commit_timestamp(&commit)?;

        Ok(Snapshot {
            id: oid,
            message: message.to_owned(),
            timestamp,
        })
    }

    fn list_snapshots(&self) -> Result<Vec<Snapshot>, StoreError> {
        let mut revwalk = self.repo.revwalk().map_err(StoreError::Git)?;
        revwalk.push_head().map_err(StoreError::Git)?;
        revwalk
            .set_sorting(git2::Sort::TIME)
            .map_err(StoreError::Git)?;

        let mut snapshots = Vec::new();
        for oid in revwalk {
            let oid = oid.map_err(StoreError::Git)?;
            let commit = self.repo.find_commit(oid).map_err(StoreError::Git)?;
            let message = commit.message().unwrap_or("").to_owned();
            let timestamp = commit_timestamp(&commit)?;
            snapshots.push(Snapshot {
                id: oid.to_string(),
                message,
                timestamp,
            });
        }

        Ok(snapshots)
    }

    fn restore_snapshot(&mut self, snapshot_id: &str) -> Result<(), StoreError> {
        let oid = git2::Oid::from_str(snapshot_id).map_err(|_| StoreError::NotFound {
            entity_type: EntityType::Snapshot,
            id: snapshot_id.to_owned(),
        })?;

        let commit = self
            .repo
            .find_commit(oid)
            .map_err(|_| StoreError::NotFound {
                entity_type: EntityType::Snapshot,
                id: snapshot_id.to_owned(),
            })?;

        let tree = commit.tree().map_err(StoreError::Git)?;

        // Checkout the tree, updating both the index and the working directory.
        // `force` overwrites modified tracked files; `remove_untracked` removes
        // files that were written since the last snapshot but never committed.
        let mut checkout_opts = git2::build::CheckoutBuilder::new();
        checkout_opts.force().remove_untracked(true);
        self.repo
            .checkout_tree(tree.as_object(), Some(&mut checkout_opts))
            .map_err(StoreError::Git)?;

        // Move HEAD to point at the restored commit.
        self.repo.set_head_detached(oid).map_err(StoreError::Git)?;

        Ok(())
    }

    fn has_uncommitted_changes(&self) -> Result<bool, StoreError> {
        let statuses = self
            .repo
            .statuses(Some(
                git2::StatusOptions::new()
                    .include_untracked(true)
                    .recurse_untracked_dirs(true),
            ))
            .map_err(StoreError::Git)?;
        Ok(!statuses.is_empty())
    }
}

fn commit_timestamp(commit: &git2::Commit<'_>) -> Result<DateTime<Utc>, StoreError> {
    let time = commit.time();
    Utc.timestamp_opt(time.seconds(), 0)
        .single()
        .ok_or_else(|| StoreError::Serialization {
            message: "invalid commit timestamp".into(),
        })
}

#[cfg(test)]
mod tests {
    use super::*;
    use crate::model::{EntryType, Library, Reference};

    fn make_store(dir: &Path) -> FsStore {
        FsStore::create(dir).expect("create store")
    }

    #[test]
    fn create_and_open() {
        let tmp = tempfile::tempdir().unwrap();
        let store = make_store(tmp.path());
        drop(store);
        FsStore::open(tmp.path()).expect("re-open store");
    }

    #[test]
    fn create_fails_if_repo_exists() {
        let tmp = tempfile::tempdir().unwrap();
        make_store(tmp.path());
        assert!(FsStore::create(tmp.path()).is_err());
    }

    #[test]
    fn save_load_delete_reference() {
        let tmp = tempfile::tempdir().unwrap();
        let mut store = make_store(tmp.path());
        let r = Reference::new("test2024", EntryType::Article);
        let id = r.id;

        store.save_reference(&r).unwrap();
        let loaded = store.load_reference(id).unwrap();
        assert_eq!(loaded.cite_key, "test2024");

        store.delete_reference(id).unwrap();
        assert!(store.load_reference(id).is_err());
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn list_reference_ids() {
|
||||
let tmp = tempfile::tempdir().unwrap();
|
||||
let mut store = make_store(tmp.path());
|
||||
let r1 = Reference::new("a2024", EntryType::Article);
|
||||
let r2 = Reference::new("b2024", EntryType::Book);
|
||||
store.save_reference(&r1).unwrap();
|
||||
store.save_reference(&r2).unwrap();
|
||||
|
||||
let ids = store.list_reference_ids().unwrap();
|
||||
assert_eq!(ids.len(), 2);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn save_load_library() {
|
||||
let tmp = tempfile::tempdir().unwrap();
|
||||
let mut store = make_store(tmp.path());
|
||||
let lib = Library::new("ML Papers", None);
|
||||
let id = lib.id;
|
||||
|
||||
store.save_library(&lib).unwrap();
|
||||
let loaded = store.load_library(id).unwrap();
|
||||
assert_eq!(loaded.name, "ML Papers");
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn annotations_missing_returns_empty_set() {
|
||||
let tmp = tempfile::tempdir().unwrap();
|
||||
let store = make_store(tmp.path());
|
||||
let ref_id = ReferenceId::new();
|
||||
let set = store.load_annotations(ref_id).unwrap();
|
||||
assert!(set.annotations.is_empty());
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn create_and_list_snapshot() {
|
||||
let tmp = tempfile::tempdir().unwrap();
|
||||
let mut store = make_store(tmp.path());
|
||||
|
||||
// Save something so there's content to commit beyond the initial commit.
|
||||
let r = Reference::new("snap2024", EntryType::Misc);
|
||||
store.save_reference(&r).unwrap();
|
||||
let snap = store.create_snapshot("my first snapshot").unwrap();
|
||||
|
||||
let snapshots = store.list_snapshots().unwrap();
|
||||
assert!(snapshots.iter().any(|s| s.id == snap.id));
|
||||
assert!(snapshots.iter().any(|s| s.message == "my first snapshot"));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn restore_snapshot_reverts_state() {
|
||||
let tmp = tempfile::tempdir().unwrap();
|
||||
let mut store = make_store(tmp.path());
|
||||
|
||||
let r = Reference::new("before2024", EntryType::Article);
|
||||
let ref_id = r.id;
|
||||
store.save_reference(&r).unwrap();
|
||||
let snap = store.create_snapshot("baseline").unwrap();
|
||||
|
||||
// Modify state: add another reference.
|
||||
let r2 = Reference::new("after2024", EntryType::Book);
|
||||
store.save_reference(&r2).unwrap();
|
||||
assert_eq!(store.list_reference_ids().unwrap().len(), 2);
|
||||
|
||||
// Restore to baseline — should have only 1 reference.
|
||||
store.restore_snapshot(&snap.id).unwrap();
|
||||
|
||||
let store2 = FsStore::open(tmp.path()).unwrap();
|
||||
assert_eq!(store2.list_reference_ids().unwrap().len(), 1);
|
||||
assert!(store2.load_reference(ref_id).is_ok());
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn has_uncommitted_changes_detects_new_files() {
|
||||
let tmp = tempfile::tempdir().unwrap();
|
||||
let mut store = make_store(tmp.path());
|
||||
|
||||
assert!(!store.has_uncommitted_changes().unwrap());
|
||||
|
||||
let r = Reference::new("new2024", EntryType::Misc);
|
||||
store.save_reference(&r).unwrap();
|
||||
|
||||
assert!(store.has_uncommitted_changes().unwrap());
|
||||
}
|
||||
}
|
||||
302
brittle-core/src/store/memory.rs
Normal file
@@ -0,0 +1,302 @@
use crate::error::{EntityType, StoreError};
use crate::model::{AnnotationSet, Library, LibraryId, Reference, ReferenceId, Snapshot};
use crate::store::Store;
use chrono::Utc;
use std::collections::HashMap;

/// In-memory store for testing. Not suitable for production use.
#[derive(Debug, Default)]
pub struct MemoryStore {
    references: HashMap<ReferenceId, Reference>,
    libraries: HashMap<LibraryId, Library>,
    annotations: HashMap<ReferenceId, AnnotationSet>,
    /// Checkpoints for snapshot simulation: (id, message, cloned state).
    snapshots: Vec<(String, String, Box<MemorySnapshot>)>,
    next_snapshot_idx: usize,
}

#[derive(Debug)]
struct MemorySnapshot {
    references: HashMap<ReferenceId, Reference>,
    libraries: HashMap<LibraryId, Library>,
    annotations: HashMap<ReferenceId, AnnotationSet>,
}

impl MemoryStore {
    pub fn new() -> Self {
        Self::default()
    }
}

impl Store for MemoryStore {
    fn save_reference(&mut self, reference: &Reference) -> Result<(), StoreError> {
        self.references.insert(reference.id, reference.clone());
        Ok(())
    }

    fn load_reference(&self, id: ReferenceId) -> Result<Reference, StoreError> {
        self.references
            .get(&id)
            .cloned()
            .ok_or_else(|| StoreError::NotFound {
                entity_type: EntityType::Reference,
                id: id.to_string(),
            })
    }

    fn delete_reference(&mut self, id: ReferenceId) -> Result<(), StoreError> {
        self.references
            .remove(&id)
            .ok_or_else(|| StoreError::NotFound {
                entity_type: EntityType::Reference,
                id: id.to_string(),
            })?;
        Ok(())
    }

    fn list_reference_ids(&self) -> Result<Vec<ReferenceId>, StoreError> {
        Ok(self.references.keys().copied().collect())
    }

    fn save_library(&mut self, library: &Library) -> Result<(), StoreError> {
        self.libraries.insert(library.id, library.clone());
        Ok(())
    }

    fn load_library(&self, id: LibraryId) -> Result<Library, StoreError> {
        self.libraries
            .get(&id)
            .cloned()
            .ok_or_else(|| StoreError::NotFound {
                entity_type: EntityType::Library,
                id: id.to_string(),
            })
    }

    fn delete_library(&mut self, id: LibraryId) -> Result<(), StoreError> {
        self.libraries
            .remove(&id)
            .ok_or_else(|| StoreError::NotFound {
                entity_type: EntityType::Library,
                id: id.to_string(),
            })?;
        Ok(())
    }

    fn list_library_ids(&self) -> Result<Vec<LibraryId>, StoreError> {
        Ok(self.libraries.keys().copied().collect())
    }

    fn load_annotations(&self, ref_id: ReferenceId) -> Result<AnnotationSet, StoreError> {
        Ok(self
            .annotations
            .get(&ref_id)
            .cloned()
            .unwrap_or_else(|| AnnotationSet::new(ref_id)))
    }

    fn save_annotations(&mut self, set: &AnnotationSet) -> Result<(), StoreError> {
        self.annotations.insert(set.reference_id, set.clone());
        Ok(())
    }

    fn delete_annotations(&mut self, ref_id: ReferenceId) -> Result<(), StoreError> {
        self.annotations.remove(&ref_id);
        Ok(())
    }

    fn create_snapshot(&mut self, message: &str) -> Result<Snapshot, StoreError> {
        let id = format!("mem-snapshot-{:04}", self.next_snapshot_idx);
        self.next_snapshot_idx += 1;
        let snapshot_data = Box::new(MemorySnapshot {
            references: self.references.clone(),
            libraries: self.libraries.clone(),
            annotations: self.annotations.clone(),
        });
        let timestamp = Utc::now();
        self.snapshots
            .push((id.clone(), message.to_owned(), snapshot_data));
        Ok(Snapshot {
            id,
            message: message.to_owned(),
            timestamp,
        })
    }

    fn list_snapshots(&self) -> Result<Vec<Snapshot>, StoreError> {
        let snapshots = self
            .snapshots
            .iter()
            .rev()
            .map(|(id, message, _)| Snapshot {
                id: id.clone(),
                message: message.clone(),
                timestamp: Utc::now(), // timestamps not stored in MemoryStore
            })
            .collect();
        Ok(snapshots)
    }

    fn restore_snapshot(&mut self, snapshot_id: &str) -> Result<(), StoreError> {
        let snapshot = self
            .snapshots
            .iter()
            .find(|(id, _, _)| id == snapshot_id)
            .ok_or_else(|| StoreError::NotFound {
                entity_type: EntityType::Snapshot,
                id: snapshot_id.to_owned(),
            })?;
        self.references = snapshot.2.references.clone();
        self.libraries = snapshot.2.libraries.clone();
        self.annotations = snapshot.2.annotations.clone();
        Ok(())
    }

    fn has_uncommitted_changes(&self) -> Result<bool, StoreError> {
        // MemoryStore has no concept of uncommitted changes.
        Ok(false)
    }
}

#[cfg(test)]
mod tests {
    use super::*;
    use crate::model::{AnnotationType, EntryType, Library, Reference, TextMarkupType};

    fn make_reference() -> Reference {
        Reference::new("test2024", EntryType::Article)
    }

    fn make_library() -> Library {
        Library::new("Test Library", None)
    }

    #[test]
    fn save_and_load_reference() {
        let mut store = MemoryStore::new();
        let r = make_reference();
        let id = r.id;
        store.save_reference(&r).unwrap();
        let r2 = store.load_reference(id).unwrap();
        assert_eq!(r.cite_key, r2.cite_key);
    }

    #[test]
    fn load_missing_reference_returns_error() {
        let store = MemoryStore::new();
        let id = ReferenceId::new();
        let err = store.load_reference(id).unwrap_err();
        assert!(matches!(
            err,
            StoreError::NotFound {
                entity_type: EntityType::Reference,
                ..
            }
        ));
    }

    #[test]
    fn delete_reference() {
        let mut store = MemoryStore::new();
        let r = make_reference();
        let id = r.id;
        store.save_reference(&r).unwrap();
        store.delete_reference(id).unwrap();
        assert!(store.load_reference(id).is_err());
    }

    #[test]
    fn list_reference_ids() {
        let mut store = MemoryStore::new();
        let r1 = make_reference();
        let r2 = make_reference();
        store.save_reference(&r1).unwrap();
        store.save_reference(&r2).unwrap();
        let ids = store.list_reference_ids().unwrap();
        assert_eq!(ids.len(), 2);
        assert!(ids.contains(&r1.id));
        assert!(ids.contains(&r2.id));
    }

    #[test]
    fn save_and_load_library() {
        let mut store = MemoryStore::new();
        let lib = make_library();
        let id = lib.id;
        store.save_library(&lib).unwrap();
        let lib2 = store.load_library(id).unwrap();
        assert_eq!(lib.name, lib2.name);
    }

    #[test]
    fn delete_library() {
        let mut store = MemoryStore::new();
        let lib = make_library();
        let id = lib.id;
        store.save_library(&lib).unwrap();
        store.delete_library(id).unwrap();
        assert!(store.load_library(id).is_err());
    }

    #[test]
    fn annotations_default_to_empty_set() {
        let store = MemoryStore::new();
        let ref_id = ReferenceId::new();
        let set = store.load_annotations(ref_id).unwrap();
        assert_eq!(set.reference_id, ref_id);
        assert!(set.annotations.is_empty());
    }

    #[test]
    fn save_and_load_annotations() {
        use crate::model::{Annotation, Color};

        let mut store = MemoryStore::new();
        let ref_id = ReferenceId::new();
        let ann = Annotation::new(
            ref_id,
            0,
            AnnotationType::TextMarkup {
                markup_type: TextMarkupType::Highlight,
                quads: vec![],
                color: Color::YELLOW,
                selected_text: None,
            },
        );
        let set = AnnotationSet {
            reference_id: ref_id,
            annotations: vec![ann],
        };
        store.save_annotations(&set).unwrap();
        let set2 = store.load_annotations(ref_id).unwrap();
        assert_eq!(set2.annotations.len(), 1);
    }

    #[test]
    fn snapshot_create_and_restore() {
        let mut store = MemoryStore::new();
        let r = make_reference();
        let ref_id = r.id;
        store.save_reference(&r).unwrap();

        let snap = store.create_snapshot("first snapshot").unwrap();

        // Modify state after snapshot.
        store.delete_reference(ref_id).unwrap();
        assert!(store.load_reference(ref_id).is_err());

        // Restore snapshot.
        store.restore_snapshot(&snap.id).unwrap();
        assert!(store.load_reference(ref_id).is_ok());
    }

    #[test]
    fn list_snapshots_in_reverse_order() {
        let mut store = MemoryStore::new();
        store.create_snapshot("first").unwrap();
        store.create_snapshot("second").unwrap();
        let snaps = store.list_snapshots().unwrap();
        assert_eq!(snaps.len(), 2);
        assert_eq!(snaps[0].message, "second"); // most recent first
    }
}
44
brittle-core/src/store/mod.rs
Normal file
@@ -0,0 +1,44 @@
pub mod fs;
pub mod memory;

use crate::error::StoreError;
use crate::model::{AnnotationSet, Library, LibraryId, Reference, ReferenceId, Snapshot};

/// Abstraction over the storage backend.
///
/// The git-backed filesystem (`FsStore`) is the production implementation.
/// An in-memory implementation (`MemoryStore`) exists for testing.
pub trait Store {
    // ---- References ----

    fn save_reference(&mut self, reference: &Reference) -> Result<(), StoreError>;
    fn load_reference(&self, id: ReferenceId) -> Result<Reference, StoreError>;
    fn delete_reference(&mut self, id: ReferenceId) -> Result<(), StoreError>;
    fn list_reference_ids(&self) -> Result<Vec<ReferenceId>, StoreError>;

    // ---- Libraries ----

    fn save_library(&mut self, library: &Library) -> Result<(), StoreError>;
    fn load_library(&self, id: LibraryId) -> Result<Library, StoreError>;
    fn delete_library(&mut self, id: LibraryId) -> Result<(), StoreError>;
    fn list_library_ids(&self) -> Result<Vec<LibraryId>, StoreError>;

    // ---- Annotations ----

    /// Load the annotation set for a reference. Returns an empty set if none exists.
    fn load_annotations(&self, ref_id: ReferenceId) -> Result<AnnotationSet, StoreError>;
    fn save_annotations(&mut self, set: &AnnotationSet) -> Result<(), StoreError>;
    fn delete_annotations(&mut self, ref_id: ReferenceId) -> Result<(), StoreError>;

    // ---- Snapshots ----

    fn create_snapshot(&mut self, message: &str) -> Result<Snapshot, StoreError>;
    fn list_snapshots(&self) -> Result<Vec<Snapshot>, StoreError>;
    /// Restore to a previous snapshot. Caller must ensure no uncommitted changes exist.
    fn restore_snapshot(&mut self, snapshot_id: &str) -> Result<(), StoreError>;
    fn has_uncommitted_changes(&self) -> Result<bool, StoreError>;
}

// Re-export concrete types for convenience.
pub use fs::FsStore;
pub use memory::MemoryStore;
163
brittle-core/tests/end_to_end.rs
Normal file
@@ -0,0 +1,163 @@
//! End-to-end integration test using a real Brittle<FsStore> repository.
//!
//! Exercises the full workflow: create repo, add references with authors,
//! organize in libraries, export BibTeX, create a snapshot, modify state,
//! restore the snapshot, and verify everything reverted correctly.

use brittle_core::{
    AnnotationType, Brittle, BrittleError, Color, EntryType, Person, TextMarkupType,
    ValidationError,
};

#[test]
fn full_workflow() {
    let tmp = tempfile::tempdir().unwrap();
    let mut db = Brittle::create(tmp.path()).unwrap();

    // ---- Create references ----
    let mut turing = db
        .create_reference("turing1950", EntryType::Article)
        .unwrap();
    turing.authors.push(Person {
        family: "Turing".into(),
        given: Some("Alan M.".into()),
        prefix: None,
        suffix: None,
    });
    turing.fields.insert(
        "title".into(),
        "Computing Machinery and Intelligence".into(),
    );
    turing.fields.insert("journal".into(), "Mind".into());
    turing.fields.insert("year".into(), "1950".into());
    let turing = db.update_reference(turing).unwrap();

    let mut knuth = db.create_reference("knuth1984", EntryType::Book).unwrap();
    knuth.authors.push(Person::new("Knuth"));
    knuth.fields.insert("title".into(), "The TeXbook".into());
    knuth
        .fields
        .insert("publisher".into(), "Addison-Wesley".into());
    knuth.fields.insert("year".into(), "1984".into());
    let knuth = db.update_reference(knuth).unwrap();

    // ---- Organize in libraries ----
    let cs = db.create_library("Computer Science", None).unwrap();
    let ai = db.create_library("AI", Some(cs.id)).unwrap();

    db.add_to_library(cs.id, turing.id).unwrap();
    db.add_to_library(ai.id, turing.id).unwrap(); // multi-membership
    db.add_to_library(cs.id, knuth.id).unwrap();

    // Both references are in CS.
    let cs_refs = db.list_library_references(cs.id).unwrap();
    assert_eq!(cs_refs.len(), 2);

    // Turing is in both CS and AI.
    let turing_libs = db.list_reference_libraries(turing.id).unwrap();
    assert_eq!(turing_libs.len(), 2);

    // ---- BibTeX export ----
    let (bibtex, errors) = db.export_library_bibtex(cs.id).unwrap();
    assert!(errors.is_empty(), "unexpected BibTeX errors: {errors:?}");
    assert!(bibtex.contains("@article{turing1950,"));
    assert!(bibtex.contains("Turing, Alan M."));
    assert!(bibtex.contains("Computing Machinery and Intelligence"));
    assert!(bibtex.contains("@book{knuth1984,"));

    // ---- Annotations ----
    let _ann = db
        .create_annotation(
            turing.id,
            0,
            AnnotationType::TextMarkup {
                markup_type: TextMarkupType::Highlight,
                quads: vec![],
                color: Color::YELLOW,
                selected_text: Some("The Imitation Game".into()),
            },
            Some("Key concept".into()),
        )
        .unwrap();

    let annotations = db.get_annotations(turing.id).unwrap();
    assert_eq!(annotations.len(), 1);
    assert_eq!(annotations[0].content.as_deref(), Some("Key concept"));

    // ---- Snapshot ----
    let snap = db
        .create_snapshot("Baseline with Turing and Knuth")
        .unwrap();
    assert!(!snap.id.is_empty());

    let snapshots = db.list_snapshots().unwrap();
    // At least our named snapshot + the initial "Initialize Brittle repository" commit.
    assert!(snapshots.len() >= 2);
    assert!(
        snapshots
            .iter()
            .any(|s| s.message == "Baseline with Turing and Knuth")
    );

    // ---- Modify state after snapshot ----
    db.delete_reference(knuth.id).unwrap();
    assert!(db.get_reference(knuth.id).is_err());

    let cs_refs_after = db.list_library_references(cs.id).unwrap();
    assert_eq!(
        cs_refs_after.len(),
        1,
        "Knuth should have been removed from library"
    );

    // ---- Restore snapshot ----
    // The knuth deletion is written to disk but not committed — verify this.
    assert!(db.has_uncommitted_changes().unwrap());

    // restore_snapshot errors on uncommitted changes; use discard_changes instead.
    db.discard_changes().unwrap();

    // After restore, Knuth should be back.
    let knuth_restored = db.get_reference(knuth.id).unwrap();
    assert_eq!(knuth_restored.cite_key, "knuth1984");

    // CS library should have 2 members again.
    let cs_refs_restored = db.list_library_references(cs.id).unwrap();
    assert_eq!(cs_refs_restored.len(), 2);

    // Inspect git log to verify history is human-readable.
    let snapshots_after = db.list_snapshots().unwrap();
    assert!(!snapshots_after.is_empty());
}

#[test]
fn get_pdf_path_returns_error_when_no_pdf_attached() {
    let tmp = tempfile::tempdir().unwrap();
    let mut db = Brittle::create(tmp.path()).unwrap();
    let r = db.create_reference("nopdf2024", EntryType::Misc).unwrap();

    let err = db.get_pdf_path(r.id).unwrap_err();
    assert!(
        matches!(
            err,
            BrittleError::Validation(ValidationError::NoPdfAttached { .. })
        ),
        "expected NoPdfAttached, got {err}"
    );
}

#[test]
fn get_pdf_path_returns_path_after_attach() {
    let tmp = tempfile::tempdir().unwrap();
    let mut db = Brittle::create(tmp.path()).unwrap();
    let r = db.create_reference("withpdf2024", EntryType::Misc).unwrap();

    // Write a dummy PDF file.
    let source = tmp.path().join("dummy.pdf");
    std::fs::write(&source, b"%PDF-1.4 dummy").unwrap();

    db.attach_pdf(r.id, &source).unwrap();

    let path = db.get_pdf_path(r.id).unwrap();
    assert!(path.exists(), "PDF path {path:?} should exist on disk");
    assert_eq!(path.extension().and_then(|e| e.to_str()), Some("pdf"));
}
4
brittle-keymap/Cargo.toml
Normal file
@@ -0,0 +1,4 @@
[package]
name = "brittle-keymap"
version = "0.1.0"
edition = "2021"
72
brittle-keymap/src/actions.rs
Normal file
@@ -0,0 +1,72 @@
//! Canonical action name constants used by both the default bindings and the UI.
//!
//! Every string that the keymap can emit as an action name is defined here.
//! The UI dispatches on these to decide what to do.

// ── Navigation (within the focused pane) ─────────────────────────────────────

/// Move the selection cursor one step down (repeated `count` times).
pub const NAV_DOWN: &str = "nav.down";
/// Move the selection cursor one step up (repeated `count` times).
pub const NAV_UP: &str = "nav.up";
/// Jump to the first item.
pub const NAV_TOP: &str = "nav.top";
/// Jump to the last item.
pub const NAV_BOTTOM: &str = "nav.bottom";
/// Scroll / page down.
pub const NAV_PAGE_DOWN: &str = "nav.page.down";
/// Scroll / page up.
pub const NAV_PAGE_UP: &str = "nav.page.up";

// ── Pane focus ────────────────────────────────────────────────────────────────

/// Focus the left pane (library tree).
pub const FOCUS_LEFT: &str = "focus.left";
/// Focus the centre pane (reference list).
pub const FOCUS_CENTER: &str = "focus.center";
/// Focus the right pane (reference detail / editor).
pub const FOCUS_RIGHT: &str = "focus.right";
/// Move focus to the next pane (cycles left → centre → right → left).
pub const FOCUS_NEXT: &str = "focus.next";
/// Move focus to the previous pane.
pub const FOCUS_PREV: &str = "focus.prev";

// ── Library tree ──────────────────────────────────────────────────────────────

/// Expand the selected tree node.
pub const TREE_EXPAND: &str = "tree.expand";
/// Collapse the selected tree node.
pub const TREE_COLLAPSE: &str = "tree.collapse";
/// Toggle the selected tree node open/closed.
pub const TREE_TOGGLE: &str = "tree.toggle";

// ── Item actions ──────────────────────────────────────────────────────────────

/// Open the selected item (load PDF, expand library, etc.).
pub const ACTION_OPEN: &str = "action.open";
/// Begin editing the selected item.
pub const ACTION_EDIT: &str = "action.edit";
/// Delete the selected item.
pub const ACTION_DELETE: &str = "action.delete";
/// Create a new item in the current context.
pub const ACTION_NEW: &str = "action.new";

// ── Tabs ──────────────────────────────────────────────────────────────────────

/// Cycle to the next tab (wraps around).
pub const TAB_NEXT: &str = "tab.next";
/// Cycle to the previous tab (wraps around).
pub const TAB_PREV: &str = "tab.prev";
/// Close the current tab (no-op on the Library tab).
pub const TAB_CLOSE: &str = "tab.close";

// ── Input modes ───────────────────────────────────────────────────────────────

/// Enter command mode (the `:` prompt).
pub const MODE_COMMAND: &str = "mode.command";
/// Enter search / filter mode (the `/` prompt).
pub const MODE_SEARCH: &str = "mode.search";
/// Return to normal mode (dismiss any prompt, clear pending sequence).
pub const MODE_NORMAL: &str = "mode.normal";
/// Export current view as BibTeX.
pub const MODE_BIBTEX: &str = "mode.bibtex";
198
brittle-keymap/src/binding.rs
Normal file
@@ -0,0 +1,198 @@
//! Binding set: maps key sequences to action names.
|
||||
|
||||
use crate::key::{Key, ParseError};
|
||||
|
||||
/// A named binding: a key sequence that triggers an action.
|
||||
#[derive(Clone, Debug)]
|
||||
pub struct Binding {
|
||||
/// The ordered sequence of keys that must be pressed.
|
||||
pub keys: Vec<Key>,
|
||||
/// The action name emitted when the sequence completes.
|
||||
pub action: String,
|
||||
}
|
||||
|
||||
/// Result of looking up a (partial) key sequence in a [`BindingSet`].
|
||||
#[derive(Debug, PartialEq, Eq)]
|
||||
pub enum LookupResult {
|
||||
/// The sequence is an exact match for a binding.
|
||||
Exact(String),
|
||||
/// The sequence is a valid prefix of one or more bindings; keep waiting.
|
||||
Prefix,
|
||||
/// The sequence matches nothing.
|
||||
NoMatch,
|
||||
}
|
||||
|
||||
/// A collection of key→action bindings.
|
||||
///
|
||||
/// Internally stored as a flat `Vec`; acceptable because the number of
|
||||
/// bindings is small (typically < 100) and sequences are short (≤ 4 keys).
|
||||
#[derive(Default, Clone, Debug)]
|
||||
pub struct BindingSet {
|
||||
bindings: Vec<Binding>,
|
||||
}
|
||||
|
||||
impl BindingSet {
|
||||
pub fn new() -> Self {
|
||||
Self::default()
|
||||
}
|
||||
|
||||
/// Add a binding from pre-parsed keys.
|
||||
pub fn add(&mut self, keys: Vec<Key>, action: impl Into<String>) {
|
||||
self.bindings.push(Binding {
|
||||
keys,
|
||||
action: action.into(),
|
||||
});
|
||||
}
|
||||
|
||||
/// Add a binding by parsing the key sequence string.
|
||||
///
|
||||
/// Returns the `ParseError` if the string is not valid.
|
||||
pub fn add_parsed(
|
||||
&mut self,
|
||||
key_sequence: &str,
|
||||
action: impl Into<String>,
|
||||
) -> Result<(), ParseError> {
|
||||
let keys = crate::key::parse_sequence(key_sequence)?;
|
||||
self.add(keys, action);
|
||||
Ok(())
|
||||
}
|
||||
|
||||
/// Look up a (possibly partial) key sequence.
|
||||
///
|
||||
/// If a sequence is both an exact match *and* a prefix of longer bindings,
|
||||
/// the exact match takes priority (no ambiguity).
|
||||
pub fn lookup(&self, keys: &[Key]) -> LookupResult {
|
||||
let mut found_prefix = false;
|
||||
|
||||
for binding in &self.bindings {
|
||||
if binding.keys == keys {
|
||||
return LookupResult::Exact(binding.action.clone());
|
||||
}
|
||||
if binding.keys.starts_with(keys) && binding.keys.len() > keys.len() {
|
||||
found_prefix = true;
|
||||
}
|
||||
}
|
||||
|
||||
if found_prefix {
|
||||
LookupResult::Prefix
|
||||
} else {
|
||||
LookupResult::NoMatch
|
||||
}
|
||||
}
|
||||
|
||||
/// Apply user-defined overrides on top of this binding set.
|
||||
///
|
||||
/// For each `(action, key_sequence)` pair in `overrides`:
|
||||
/// - All existing bindings for that action are removed.
|
||||
/// - A new binding from the parsed sequence is added.
|
||||
///
|
||||
    /// Unknown action names are added as new bindings; parse errors are skipped.
    pub fn apply_overrides<'a>(&mut self, overrides: impl IntoIterator<Item = (&'a str, &'a str)>) {
        for (action, key_seq) in overrides {
            // Remove existing bindings for this action.
            self.bindings.retain(|b| b.action != action);
            // Add the new binding (skip on parse error).
            if let Ok(keys) = crate::key::parse_sequence(key_seq) {
                self.add(keys, action);
            }
        }
    }

    /// Return all bindings for inspection.
    pub fn bindings(&self) -> &[Binding] {
        &self.bindings
    }
}

// ── Tests ─────────────────────────────────────────────────────────────────────

#[cfg(test)]
mod tests {
    use super::*;
    use crate::key::Key;

    fn set_with_defaults() -> BindingSet {
        let mut s = BindingSet::new();
        s.add_parsed("j", "nav.down").unwrap();
        s.add_parsed("k", "nav.up").unwrap();
        s.add_parsed("<Down>", "nav.down").unwrap();
        s.add_parsed("gg", "nav.top").unwrap();
        s.add_parsed("G", "nav.bottom").unwrap();
        s.add_parsed("zo", "tree.expand").unwrap();
        s
    }

    #[test]
    fn exact_single_key() {
        let s = set_with_defaults();
        let keys = vec![Key::char('j')];
        assert_eq!(s.lookup(&keys), LookupResult::Exact("nav.down".into()));
    }

    #[test]
    fn exact_multi_key() {
        let s = set_with_defaults();
        let keys = vec![Key::char('g'), Key::char('g')];
        assert_eq!(s.lookup(&keys), LookupResult::Exact("nav.top".into()));
    }

    #[test]
    fn prefix_of_multi_key_binding() {
        let s = set_with_defaults();
        let keys = vec![Key::char('g')];
        assert_eq!(s.lookup(&keys), LookupResult::Prefix);
    }

    #[test]
    fn no_match() {
        let s = set_with_defaults();
        let keys = vec![Key::char('x')];
        assert_eq!(s.lookup(&keys), LookupResult::NoMatch);
    }

    #[test]
    fn no_match_wrong_continuation() {
        let s = set_with_defaults();
        // "gx" should not match anything.
        let keys = vec![Key::char('g'), Key::char('x')];
        assert_eq!(s.lookup(&keys), LookupResult::NoMatch);
    }

    #[test]
    fn apply_overrides_replaces_action_binding() {
        let mut s = set_with_defaults();
        // Replace nav.down from "j" to "n".
        s.apply_overrides([("nav.down", "n")]);

        // Old "j" binding is gone.
        assert_eq!(s.lookup(&[Key::char('j')]), LookupResult::NoMatch);
        // New "n" binding is present.
        assert_eq!(
            s.lookup(&[Key::char('n')]),
            LookupResult::Exact("nav.down".into())
        );
    }

    #[test]
    fn apply_overrides_removes_all_bindings_for_action() {
        let mut s = set_with_defaults();
        // nav.down is bound to both "j" and "<Down>".
        s.apply_overrides([("nav.down", "n")]);

        // Both old bindings should be gone.
        use crate::key::{Key as K, KeyCode};
        assert_eq!(
            s.lookup(&[K::plain(KeyCode::ArrowDown)]),
            LookupResult::NoMatch
        );
    }

    #[test]
    fn apply_overrides_bad_sequence_is_skipped() {
        let mut s = set_with_defaults();
        // "<Bogus>" is not a valid key sequence — the override should be silently skipped.
        s.apply_overrides([("nav.down", "<Bogus>")]);
        // The original binding is still gone: it was removed before the parse failed.
        // This is an acceptable edge case — the user has a bad config.
        assert_eq!(s.lookup(&[Key::char('j')]), LookupResult::NoMatch);
    }
}
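The override mechanism above follows a simple pattern: remove every existing binding for the action, then insert the replacement. A minimal standalone sketch of that pattern, using a simplified stand-in for the crate's real `Binding` type (the struct and function names here are illustrative, not the crate's API):

```rust
// Simplified stand-in for the crate's Binding type.
#[derive(Debug, PartialEq)]
struct Binding {
    keys: String, // simplified: the key sequence as a raw string
    action: String,
}

// Replace all bindings for `action` with a single new one.
fn apply_override(bindings: &mut Vec<Binding>, action: &str, keys: &str) {
    // Remove every existing binding for this action...
    bindings.retain(|b| b.action != action);
    // ...then add the replacement.
    bindings.push(Binding {
        keys: keys.to_string(),
        action: action.to_string(),
    });
}

fn main() {
    let mut bindings = vec![
        Binding { keys: "j".into(), action: "nav.down".into() },
        Binding { keys: "<Down>".into(), action: "nav.down".into() },
        Binding { keys: "k".into(), action: "nav.up".into() },
    ];
    apply_override(&mut bindings, "nav.down", "n");
    // Both old nav.down bindings are gone; only "n" remains for it.
    assert_eq!(bindings.iter().filter(|b| b.action == "nav.down").count(), 1);
    println!("{}", bindings.len()); // 2
}
```

Note that the real `apply_overrides` removes the old bindings before attempting to parse the new sequence, which is why a failed parse leaves the action unbound.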
208
brittle-keymap/src/defaults.rs
Normal file
@@ -0,0 +1,208 @@
//! Built-in default keybindings.
//!
//! These are Sioyek/zathura/vim-inspired defaults. Every binding refers to an
//! action constant from [`crate::actions`].

use crate::{actions as a, binding::BindingSet};

/// Return a [`BindingSet`] populated with the built-in default bindings.
///
/// Multiple key sequences may map to the same action (e.g. both `j` and
/// `<Down>` trigger `nav.down`).
pub fn default_bindings() -> BindingSet {
    let mut set = BindingSet::new();

    // Helper to register a binding, panicking if the sequence is invalid.
    // Invalid sequences in defaults are a programming error, not a user error.
    macro_rules! bind {
        ($seq:expr => $action:expr) => {
            set.add_parsed($seq, $action)
                .unwrap_or_else(|e| panic!("invalid default binding '{}': {}", $seq, e));
        };
    }

    // ── Navigation ────────────────────────────────────────────────────────────
    bind!("j" => a::NAV_DOWN);
    bind!("<Down>" => a::NAV_DOWN);
    bind!("k" => a::NAV_UP);
    bind!("<Up>" => a::NAV_UP);
    bind!("gg" => a::NAV_TOP);
    bind!("G" => a::NAV_BOTTOM);
    bind!("<C-d>" => a::NAV_PAGE_DOWN);
    bind!("<C-u>" => a::NAV_PAGE_UP);
    bind!("<PageDown>" => a::NAV_PAGE_DOWN);
    bind!("<PageUp>" => a::NAV_PAGE_UP);

    // ── Pane focus ────────────────────────────────────────────────────────────
    bind!("<Tab>" => a::FOCUS_NEXT);
    bind!("<S-Tab>" => a::FOCUS_PREV);
    bind!("H" => a::FOCUS_LEFT);
    bind!("M" => a::FOCUS_CENTER);
    bind!("L" => a::FOCUS_RIGHT);

    // ── Library tree ──────────────────────────────────────────────────────────
    bind!("zo" => a::TREE_EXPAND);
    bind!("zc" => a::TREE_COLLAPSE);
    bind!("za" => a::TREE_TOGGLE);
    // Arrow-style tree navigation: l expands, h collapses.
    bind!("l" => a::TREE_EXPAND);
    bind!("h" => a::TREE_COLLAPSE);

    // ── Item actions ──────────────────────────────────────────────────────────
    bind!("<Enter>" => a::ACTION_OPEN);
    bind!("e" => a::ACTION_EDIT);
    bind!("d" => a::ACTION_DELETE);
    bind!("n" => a::ACTION_NEW);

    // ── Tabs ──────────────────────────────────────────────────────────────────
    // vim-style gt/gT; g is already a prefix from gg (nav.top).
    bind!("gt" => a::TAB_NEXT);
    bind!("gT" => a::TAB_PREV);
    bind!("q" => a::TAB_CLOSE);

    // ── Input modes ───────────────────────────────────────────────────────────
    bind!(":" => a::MODE_COMMAND);
    bind!("/" => a::MODE_SEARCH);
    bind!("<Esc>" => a::MODE_NORMAL);
    bind!("b" => a::MODE_BIBTEX);

    set
}

// ── Tests ─────────────────────────────────────────────────────────────────────

#[cfg(test)]
mod tests {
    use super::*;
    use crate::{
        binding::LookupResult,
        key::{Key, KeyCode},
    };

    fn defaults() -> BindingSet {
        default_bindings()
    }

    #[test]
    fn j_maps_to_nav_down() {
        let d = defaults();
        assert_eq!(
            d.lookup(&[Key::char('j')]),
            LookupResult::Exact(a::NAV_DOWN.into())
        );
    }

    #[test]
    fn arrow_down_maps_to_nav_down() {
        let d = defaults();
        assert_eq!(
            d.lookup(&[Key::plain(KeyCode::ArrowDown)]),
            LookupResult::Exact(a::NAV_DOWN.into())
        );
    }

    #[test]
    fn gg_maps_to_nav_top() {
        let d = defaults();
        // 'g' alone is a prefix.
        assert_eq!(d.lookup(&[Key::char('g')]), LookupResult::Prefix);
        // 'gg' is the full binding.
        assert_eq!(
            d.lookup(&[Key::char('g'), Key::char('g')]),
            LookupResult::Exact(a::NAV_TOP.into())
        );
    }

    #[test]
    fn capital_g_maps_to_nav_bottom() {
        let d = defaults();
        assert_eq!(
            d.lookup(&[Key::char('G')]),
            LookupResult::Exact(a::NAV_BOTTOM.into())
        );
    }

    #[test]
    fn zo_maps_to_tree_expand() {
        let d = defaults();
        assert_eq!(d.lookup(&[Key::char('z')]), LookupResult::Prefix);
        assert_eq!(
            d.lookup(&[Key::char('z'), Key::char('o')]),
            LookupResult::Exact(a::TREE_EXPAND.into())
        );
    }

    #[test]
    fn colon_maps_to_mode_command() {
        let d = defaults();
        assert_eq!(
            d.lookup(&[Key::char(':')]),
            LookupResult::Exact(a::MODE_COMMAND.into())
        );
    }

    #[test]
    fn escape_maps_to_mode_normal() {
        let d = defaults();
        assert_eq!(
            d.lookup(&[Key::plain(KeyCode::Escape)]),
            LookupResult::Exact(a::MODE_NORMAL.into())
        );
    }

    #[test]
    fn tab_maps_to_focus_next() {
        let d = defaults();
        assert_eq!(
            d.lookup(&[Key::plain(KeyCode::Tab)]),
            LookupResult::Exact(a::FOCUS_NEXT.into())
        );
    }

    #[test]
    fn shift_tab_maps_to_focus_prev() {
        use crate::key::parse_sequence;
        let d = defaults();
        let shift_tab = &parse_sequence("<S-Tab>").unwrap()[0];
        assert_eq!(
            d.lookup(&[shift_tab.clone()]),
            LookupResult::Exact(a::FOCUS_PREV.into())
        );
    }

    #[test]
    fn gt_maps_to_tab_next() {
        let d = defaults();
        // g is still a prefix (gg, gt, gT all share it).
        assert_eq!(d.lookup(&[Key::char('g')]), LookupResult::Prefix);
        assert_eq!(
            d.lookup(&[Key::char('g'), Key::char('t')]),
            LookupResult::Exact(a::TAB_NEXT.into())
        );
    }

    #[test]
    fn capital_gt_maps_to_tab_prev() {
        let d = defaults();
        assert_eq!(
            d.lookup(&[Key::char('g'), Key::char('T')]),
            LookupResult::Exact(a::TAB_PREV.into())
        );
    }

    #[test]
    fn q_maps_to_tab_close() {
        let d = defaults();
        assert_eq!(
            d.lookup(&[Key::char('q')]),
            LookupResult::Exact(a::TAB_CLOSE.into())
        );
    }

    #[test]
    fn all_default_sequences_are_valid() {
        // Constructing the defaults panics on any invalid sequence, so this
        // test implicitly validates every binding in default_bindings().
        let _ = default_bindings();
    }
}
371
brittle-keymap/src/key.rs
Normal file
@@ -0,0 +1,371 @@
//! Key representation and string parsing.
//!
//! Key sequences are written in a vim-inspired notation:
//!
//! | Notation                  | Meaning                        |
//! |---------------------------|--------------------------------|
//! | `j`                       | the letter j                   |
//! | `G`                       | capital G (no modifier needed) |
//! | `<Enter>` / `<CR>`        | Enter / Return                 |
//! | `<Esc>`                   | Escape                         |
//! | `<Tab>`                   | Tab                            |
//! | `<BS>`                    | Backspace                      |
//! | `<Del>`                   | Delete                         |
//! | `<Space>`                 | Space bar                      |
//! | `<Up/Down/Left/Right>`    | Arrow keys                     |
//! | `<Home>` / `<End>`        | Home / End                     |
//! | `<PageUp>` / `<PageDown>` | Page Up / Down                 |
//! | `<C-x>`                   | Ctrl+x                         |
//! | `<S-Tab>`                 | Shift+Tab                      |
//! | `<M-x>` / `<A-x>`         | Alt+x                          |
//!
//! Sequences are formed by concatenating specs: `gg`, `zo`, `<C-d>`.

use std::fmt;

/// A single key press, including its modifiers.
#[derive(Clone, PartialEq, Eq, Hash, Debug)]
pub struct Key {
    pub code: KeyCode,
    pub ctrl: bool,
    pub shift: bool,
    pub alt: bool,
    pub meta: bool,
}

impl Key {
    /// Create an unmodified character key.
    pub fn char(c: char) -> Self {
        Key {
            code: KeyCode::Char(c),
            ctrl: false,
            shift: false,
            alt: false,
            meta: false,
        }
    }

    /// Create a key with the given code and no modifiers.
    pub fn plain(code: KeyCode) -> Self {
        Key {
            code,
            ctrl: false,
            shift: false,
            alt: false,
            meta: false,
        }
    }
}

impl fmt::Display for Key {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        let needs_brackets = self.ctrl
            || self.shift
            || self.alt
            || self.meta
            || !matches!(self.code, KeyCode::Char(_));

        if needs_brackets {
            write!(f, "<")?;
            if self.ctrl {
                write!(f, "C-")?;
            }
            if self.shift {
                write!(f, "S-")?;
            }
            if self.alt {
                write!(f, "M-")?;
            }
            if self.meta {
                write!(f, "D-")?;
            }
            write!(f, "{}>", self.code)
        } else {
            write!(f, "{}", self.code)
        }
    }
}

/// The logical key code, independent of platform.
#[derive(Clone, PartialEq, Eq, Hash, Debug)]
pub enum KeyCode {
    /// A Unicode character key (letters, digits, punctuation).
    Char(char),
    Enter,
    Escape,
    Tab,
    Backspace,
    Delete,
    Space,
    ArrowUp,
    ArrowDown,
    ArrowLeft,
    ArrowRight,
    Home,
    End,
    PageUp,
    PageDown,
    F(u8),
}

impl fmt::Display for KeyCode {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            KeyCode::Char(c) => write!(f, "{}", c),
            KeyCode::Enter => write!(f, "Enter"),
            KeyCode::Escape => write!(f, "Esc"),
            KeyCode::Tab => write!(f, "Tab"),
            KeyCode::Backspace => write!(f, "BS"),
            KeyCode::Delete => write!(f, "Del"),
            KeyCode::Space => write!(f, "Space"),
            KeyCode::ArrowUp => write!(f, "Up"),
            KeyCode::ArrowDown => write!(f, "Down"),
            KeyCode::ArrowLeft => write!(f, "Left"),
            KeyCode::ArrowRight => write!(f, "Right"),
            KeyCode::Home => write!(f, "Home"),
            KeyCode::End => write!(f, "End"),
            KeyCode::PageUp => write!(f, "PageUp"),
            KeyCode::PageDown => write!(f, "PageDown"),
            KeyCode::F(n) => write!(f, "F{}", n),
        }
    }
}

// ── Parsing ───────────────────────────────────────────────────────────────────

/// Error type for key sequence parsing.
#[derive(Debug, PartialEq, Eq, Clone)]
pub enum ParseError {
    /// A `<` was not closed with a `>`.
    UnclosedBracket,
    /// A `<...>` block contained an unrecognized key name.
    UnknownKey(String),
    /// An empty `<>` was encountered.
    EmptyBracket,
}

impl fmt::Display for ParseError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            ParseError::UnclosedBracket => write!(f, "unclosed '<' in key sequence"),
            ParseError::UnknownKey(k) => write!(f, "unknown key name: '{}'", k),
            ParseError::EmptyBracket => write!(f, "empty '<>' in key sequence"),
        }
    }
}

/// Parse a key sequence string (e.g. `"gg"`, `"<C-d>"`, `"zo"`) into a list of [`Key`]s.
pub fn parse_sequence(s: &str) -> Result<Vec<Key>, ParseError> {
    let mut keys = Vec::new();
    let mut chars = s.chars().peekable();

    while let Some(c) = chars.next() {
        if c == '<' {
            // Consume until the matching '>'.
            let mut spec = String::new();
            loop {
                match chars.next() {
                    Some('>') => break,
                    Some(c) => spec.push(c),
                    None => return Err(ParseError::UnclosedBracket),
                }
            }
            if spec.is_empty() {
                return Err(ParseError::EmptyBracket);
            }
            keys.push(parse_bracket_spec(&spec)?);
        } else {
            keys.push(Key::char(c));
        }
    }

    Ok(keys)
}

/// Parse the interior of a `<...>` bracket, e.g. `"C-d"`, `"S-Tab"`, `"Enter"`.
fn parse_bracket_spec(spec: &str) -> Result<Key, ParseError> {
    let mut ctrl = false;
    let mut shift = false;
    let mut alt = false;
    let mut meta = false;
    let mut rest = spec;

    // Strip modifier prefixes in any order.
    loop {
        if let Some(s) = rest.strip_prefix("C-") {
            ctrl = true;
            rest = s;
        } else if let Some(s) = rest.strip_prefix("S-") {
            shift = true;
            rest = s;
        } else if let Some(s) = rest.strip_prefix("M-").or_else(|| rest.strip_prefix("A-")) {
            alt = true;
            rest = s;
        } else if let Some(s) = rest.strip_prefix("D-") {
            meta = true;
            rest = s;
        } else {
            break;
        }
    }

    let code = match rest {
        "Enter" | "CR" | "Return" => KeyCode::Enter,
        "Esc" | "Escape" => KeyCode::Escape,
        "Tab" => KeyCode::Tab,
        "BS" | "Backspace" => KeyCode::Backspace,
        "Del" | "Delete" => KeyCode::Delete,
        "Space" => KeyCode::Space,
        "Up" => KeyCode::ArrowUp,
        "Down" => KeyCode::ArrowDown,
        "Left" => KeyCode::ArrowLeft,
        "Right" => KeyCode::ArrowRight,
        "Home" => KeyCode::Home,
        "End" => KeyCode::End,
        "PageUp" => KeyCode::PageUp,
        "PageDown" => KeyCode::PageDown,
        s if s.starts_with('F') && s.len() > 1 => {
            let n: u8 = s[1..]
                .parse()
                .map_err(|_| ParseError::UnknownKey(spec.to_owned()))?;
            KeyCode::F(n)
        }
        s if s.chars().count() == 1 => KeyCode::Char(s.chars().next().unwrap()),
        _ => return Err(ParseError::UnknownKey(spec.to_owned())),
    };

    Ok(Key {
        code,
        ctrl,
        shift,
        alt,
        meta,
    })
}

// ── Tests ─────────────────────────────────────────────────────────────────────

#[cfg(test)]
mod tests {
    use super::*;

    fn seq(s: &str) -> Vec<Key> {
        parse_sequence(s).unwrap()
    }

    #[test]
    fn single_char() {
        assert_eq!(seq("j"), vec![Key::char('j')]);
        assert_eq!(seq("G"), vec![Key::char('G')]);
    }

    #[test]
    fn multi_char_sequence() {
        assert_eq!(seq("gg"), vec![Key::char('g'), Key::char('g')]);
        assert_eq!(seq("zo"), vec![Key::char('z'), Key::char('o')]);
    }

    #[test]
    fn special_keys() {
        assert_eq!(seq("<Enter>"), vec![Key::plain(KeyCode::Enter)]);
        assert_eq!(seq("<CR>"), vec![Key::plain(KeyCode::Enter)]);
        assert_eq!(seq("<Esc>"), vec![Key::plain(KeyCode::Escape)]);
        assert_eq!(seq("<Tab>"), vec![Key::plain(KeyCode::Tab)]);
        assert_eq!(seq("<BS>"), vec![Key::plain(KeyCode::Backspace)]);
        assert_eq!(seq("<Del>"), vec![Key::plain(KeyCode::Delete)]);
        assert_eq!(seq("<Space>"), vec![Key::plain(KeyCode::Space)]);
        assert_eq!(seq("<Up>"), vec![Key::plain(KeyCode::ArrowUp)]);
        assert_eq!(seq("<Down>"), vec![Key::plain(KeyCode::ArrowDown)]);
        assert_eq!(seq("<Home>"), vec![Key::plain(KeyCode::Home)]);
        assert_eq!(seq("<End>"), vec![Key::plain(KeyCode::End)]);
        assert_eq!(seq("<PageUp>"), vec![Key::plain(KeyCode::PageUp)]);
        assert_eq!(seq("<PageDown>"), vec![Key::plain(KeyCode::PageDown)]);
    }

    #[test]
    fn ctrl_modifier() {
        let key = &seq("<C-d>")[0];
        assert_eq!(key.code, KeyCode::Char('d'));
        assert!(key.ctrl);
        assert!(!key.shift);
    }

    #[test]
    fn shift_modifier() {
        let key = &seq("<S-Tab>")[0];
        assert_eq!(key.code, KeyCode::Tab);
        assert!(key.shift);
    }

    #[test]
    fn alt_modifier() {
        let key = &seq("<M-j>")[0];
        assert_eq!(key.code, KeyCode::Char('j'));
        assert!(key.alt);
    }

    #[test]
    fn combined_modifiers() {
        let key = &seq("<C-S-Tab>")[0];
        assert_eq!(key.code, KeyCode::Tab);
        assert!(key.ctrl);
        assert!(key.shift);
    }

    #[test]
    fn function_key() {
        assert_eq!(seq("<F1>"), vec![Key::plain(KeyCode::F(1))]);
        assert_eq!(seq("<F12>"), vec![Key::plain(KeyCode::F(12))]);
    }

    #[test]
    fn mixed_sequence() {
        let keys = seq("<C-d>j<Enter>");
        assert_eq!(keys.len(), 3);
        assert_eq!(keys[0].code, KeyCode::Char('d'));
        assert!(keys[0].ctrl);
        assert_eq!(keys[1], Key::char('j'));
        assert_eq!(keys[2].code, KeyCode::Enter);
    }

    #[test]
    fn unclosed_bracket_error() {
        assert_eq!(parse_sequence("<Enter"), Err(ParseError::UnclosedBracket));
    }

    #[test]
    fn unknown_key_error() {
        assert!(matches!(
            parse_sequence("<Foobar>"),
            Err(ParseError::UnknownKey(_))
        ));
    }

    #[test]
    fn empty_bracket_error() {
        assert_eq!(parse_sequence("<>"), Err(ParseError::EmptyBracket));
    }

    #[test]
    fn display_char_key() {
        assert_eq!(Key::char('j').to_string(), "j");
    }

    #[test]
    fn display_ctrl_key() {
        let k = seq("<C-d>")[0].clone();
        assert_eq!(k.to_string(), "<C-d>");
    }

    #[test]
    fn display_shift_tab() {
        let k = seq("<S-Tab>")[0].clone();
        assert_eq!(k.to_string(), "<S-Tab>");
    }

    #[test]
    fn display_special_key() {
        let k = Key::plain(KeyCode::Enter);
        assert_eq!(k.to_string(), "<Enter>");
    }
}
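The core of `parse_sequence` is its bracket tokenizer: plain characters become single keys, while a `<...>` block is collected whole and handed to `parse_bracket_spec`. A self-contained sketch of just that tokenization step (function name and error strings here are illustrative, not the crate's API):

```rust
// Minimal sketch of the bracket tokenizer used by parse_sequence above:
// plain characters are single-key tokens, and a <...> block becomes one
// spec token, with errors for unclosed or empty brackets.
fn tokenize(s: &str) -> Result<Vec<String>, String> {
    let mut out = Vec::new();
    let mut chars = s.chars();
    while let Some(c) = chars.next() {
        if c == '<' {
            let mut spec = String::new();
            loop {
                match chars.next() {
                    Some('>') => break,
                    Some(c) => spec.push(c),
                    None => return Err("unclosed '<' in key sequence".into()),
                }
            }
            if spec.is_empty() {
                return Err("empty '<>' in key sequence".into());
            }
            out.push(spec);
        } else {
            out.push(c.to_string());
        }
    }
    Ok(out)
}

fn main() {
    // "<C-d>j<Enter>" splits into three tokens.
    assert_eq!(tokenize("<C-d>j<Enter>").unwrap(), vec!["C-d", "j", "Enter"]);
    // An unterminated bracket is an error.
    assert!(tokenize("<Enter").is_err());
    println!("ok");
}
```

Modifier stripping (`C-`, `S-`, `M-`/`A-`, `D-`) then happens per token, which is why modifiers can appear in any order inside one bracket.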
23
brittle-keymap/src/lib.rs
Normal file
@@ -0,0 +1,23 @@
//! Keymap engine for Brittle.
//!
//! Provides vim-style key sequence parsing, binding sets, and a stateful
//! input processor that supports count prefixes and multi-key sequences.
//!
//! # Quick start
//!
//! ```rust
//! use brittle_keymap::{KeymapState, default_bindings};
//!
//! let mut state = KeymapState::new(default_bindings());
//! ```

pub mod actions;
pub mod binding;
pub mod defaults;
pub mod key;
pub mod state;

pub use binding::{BindingSet, LookupResult};
pub use defaults::default_bindings;
pub use key::{parse_sequence, Key, KeyCode, ParseError};
pub use state::{KeymapState, Outcome};
354
brittle-keymap/src/state.rs
Normal file
@@ -0,0 +1,354 @@
//! The keymap state machine.
//!
//! `KeymapState` processes a stream of [`Key`] presses and emits [`Outcome`]s.
//! It supports:
//!
//! - **Count prefixes**: digits build a numeric multiplier before the binding
//!   fires (e.g. `5j` → `nav.down` with count 5). `0` alone is treated as a
//!   regular key (not a count) so it can be bound; `10`, `20`, … work normally.
//!
//! - **Multi-key sequences**: bindings like `gg`, `zo`, `zc` are resolved by
//!   accumulating pressed keys until an exact match is found. If the
//!   accumulated sequence stops being a prefix of any binding, the machine
//!   discards the prefix and retries with just the latest key.
//!
//! - **Configurable bindings**: the `BindingSet` is injected at construction;
//!   user overrides are applied before constructing the state.

use crate::{
    binding::{BindingSet, LookupResult},
    key::Key,
};

/// What the state machine decided after processing one key press.
#[derive(Debug, Clone, PartialEq, Eq)]
pub enum Outcome {
    /// An action was fully resolved.
    Action {
        /// The action name (see [`crate::actions`]).
        name: String,
        /// The count prefix (always ≥ 1; `1` if no prefix was typed).
        count: u32,
    },
    /// The key was consumed but we are waiting for more keys.
    Pending,
    /// The key (or the accumulated sequence) did not match any binding.
    Unbound,
}

/// The keymap state machine.
#[derive(Debug)]
pub struct KeymapState {
    bindings: BindingSet,
    /// Keys accumulated so far in the current multi-key sequence.
    pending: Vec<Key>,
    /// The count prefix typed before the current sequence (0 = none).
    count: u32,
}

impl KeymapState {
    pub fn new(bindings: BindingSet) -> Self {
        Self {
            bindings,
            pending: Vec::new(),
            count: 0,
        }
    }

    /// Process one key press and return the outcome.
    pub fn process(&mut self, key: Key) -> Outcome {
        // Count-building: only when no sequence is in progress.
        if self.pending.is_empty() {
            if let Some(digit) = digit_of(&key) {
                // '0' alone is not a count-start — it's treated as a key.
                // Once a count > 0 has started, '0' extends it (e.g., "10j").
                if self.count > 0 || digit != 0 {
                    self.count = self.count.saturating_mul(10).saturating_add(digit);
                    return Outcome::Pending;
                }
            }
        }

        // Append to the in-progress sequence and look it up.
        self.pending.push(key.clone());

        match self.bindings.lookup(&self.pending) {
            LookupResult::Exact(action) => {
                let count = if self.count == 0 { 1 } else { self.count };
                let outcome = Outcome::Action {
                    name: action,
                    count,
                };
                self.reset();
                outcome
            }
            LookupResult::Prefix => Outcome::Pending,
            LookupResult::NoMatch => {
                // The accumulated sequence is a dead end.
                // Discard the prefix and retry with only the last key,
                // unless this is already a single-key sequence.
                if self.pending.len() > 1 {
                    let last = key; // the key we just pushed
                    self.pending.clear();
                    self.count = 0;
                    return self.process(last);
                }
                // Single key and still no match → unbound.
                self.reset();
                Outcome::Unbound
            }
        }
    }

    /// Reset the state machine to idle (clears pending keys and count).
    pub fn reset(&mut self) {
        self.pending.clear();
        self.count = 0;
    }

    /// Keys accumulated so far (useful for displaying a "pending sequence" hint).
    pub fn pending_keys(&self) -> &[Key] {
        &self.pending
    }

    /// Count prefix typed so far (0 = none).
    pub fn current_count(&self) -> u32 {
        self.count
    }
}

/// If `key` is an unmodified digit, return its numeric value; otherwise `None`.
fn digit_of(key: &Key) -> Option<u32> {
    if key.ctrl || key.shift || key.alt || key.meta {
        return None;
    }
    if let crate::key::KeyCode::Char(c) = key.code {
        c.to_digit(10)
    } else {
        None
    }
}

// ── Tests ─────────────────────────────────────────────────────────────────────

#[cfg(test)]
mod tests {
    use super::*;
    use crate::key::{Key, KeyCode};

    fn make_state() -> KeymapState {
        let mut bindings = BindingSet::new();
        bindings.add_parsed("j", "nav.down").unwrap();
        bindings.add_parsed("k", "nav.up").unwrap();
        bindings.add_parsed("<Down>", "nav.down").unwrap();
        bindings.add_parsed("gg", "nav.top").unwrap();
        bindings.add_parsed("G", "nav.bottom").unwrap();
        bindings.add_parsed("zo", "tree.expand").unwrap();
        bindings.add_parsed("zc", "tree.collapse").unwrap();
        bindings.add_parsed("za", "tree.toggle").unwrap();
        bindings.add_parsed("<Esc>", "mode.normal").unwrap();
        bindings.add_parsed(":", "mode.command").unwrap();
        KeymapState::new(bindings)
    }

    fn key(c: char) -> Key {
        Key::char(c)
    }

    fn special(code: KeyCode) -> Key {
        Key::plain(code)
    }

    // ── Single-key bindings ───────────────────────────────────────────────────

    #[test]
    fn single_key_fires_action() {
        let mut s = make_state();
        assert_eq!(
            s.process(key('j')),
            Outcome::Action {
                name: "nav.down".into(),
                count: 1
            }
        );
    }

    #[test]
    fn unbound_key_returns_unbound() {
        let mut s = make_state();
        assert_eq!(s.process(key('x')), Outcome::Unbound);
    }

    #[test]
    fn special_key_fires_action() {
        let mut s = make_state();
        assert_eq!(
            s.process(special(KeyCode::Escape)),
            Outcome::Action {
                name: "mode.normal".into(),
                count: 1
            }
        );
    }

    // ── Count prefix ──────────────────────────────────────────────────────────

    #[test]
    fn count_prefix_single_digit() {
        let mut s = make_state();
        assert_eq!(s.process(key('5')), Outcome::Pending);
        assert_eq!(s.current_count(), 5);
        assert_eq!(
            s.process(key('j')),
            Outcome::Action {
                name: "nav.down".into(),
                count: 5
            }
        );
    }

    #[test]
    fn count_prefix_multi_digit() {
        let mut s = make_state();
        assert_eq!(s.process(key('1')), Outcome::Pending);
        assert_eq!(s.process(key('0')), Outcome::Pending);
        assert_eq!(s.current_count(), 10);
        assert_eq!(
            s.process(key('j')),
            Outcome::Action {
                name: "nav.down".into(),
                count: 10
            }
        );
    }

    #[test]
    fn zero_alone_is_not_a_count() {
        // '0' alone with no prior count should be treated as a key, not a count-start.
        let mut bindings = BindingSet::new();
        bindings.add_parsed("0", "nav.top").unwrap();
        let mut s = KeymapState::new(bindings);
        assert_eq!(
            s.process(key('0')),
            Outcome::Action {
                name: "nav.top".into(),
                count: 1
            }
        );
    }

    #[test]
    fn count_resets_after_action() {
        let mut s = make_state();
        s.process(key('3'));
        s.process(key('j'));
        // Next key should have count=1 again.
        assert_eq!(
            s.process(key('j')),
            Outcome::Action {
                name: "nav.down".into(),
                count: 1
            }
        );
    }

    // ── Multi-key sequences ───────────────────────────────────────────────────

    #[test]
    fn multi_key_first_key_is_pending() {
        let mut s = make_state();
        assert_eq!(s.process(key('g')), Outcome::Pending);
        assert_eq!(s.pending_keys(), &[Key::char('g')]);
    }

    #[test]
    fn multi_key_sequence_fires_on_completion() {
        let mut s = make_state();
        s.process(key('g'));
        assert_eq!(
            s.process(key('g')),
            Outcome::Action {
                name: "nav.top".into(),
                count: 1
            }
        );
        // State is cleared after action.
        assert!(s.pending_keys().is_empty());
    }

    #[test]
    fn multi_key_wrong_continuation_retries_last_key() {
        let mut s = make_state();
        // 'g' starts a sequence; 'j' does not continue it → retry 'j' alone.
        s.process(key('g'));
        assert_eq!(
            s.process(key('j')),
            Outcome::Action {
                name: "nav.down".into(),
                count: 1
            }
        );
    }

    #[test]
    fn multi_key_wrong_continuation_unbound_if_retry_also_fails() {
        let mut s = make_state();
        // 'g' starts a sequence; 'x' does not continue it and is also unbound alone.
        s.process(key('g'));
        assert_eq!(s.process(key('x')), Outcome::Unbound);
    }

    #[test]
    fn zo_sequence() {
        let mut s = make_state();
        assert_eq!(s.process(key('z')), Outcome::Pending);
        assert_eq!(
            s.process(key('o')),
            Outcome::Action {
                name: "tree.expand".into(),
                count: 1
            }
        );
    }

    #[test]
    fn count_with_multi_key_sequence() {
        let mut s = make_state();
        s.process(key('3')); // count = 3
        s.process(key('z')); // pending = [z]
        assert_eq!(
            s.process(key('o')),
            Outcome::Action {
                name: "tree.expand".into(),
                count: 3
            }
        );
    }

    // ── Reset ─────────────────────────────────────────────────────────────────

    #[test]
    fn reset_clears_pending_and_count() {
        let mut s = make_state();
        s.process(key('5'));
        s.process(key('g'));
        s.reset();
        assert_eq!(s.current_count(), 0);
        assert!(s.pending_keys().is_empty());
    }

    #[test]
    fn after_reset_processes_normally() {
        let mut s = make_state();
        s.process(key('5'));
        s.process(key('g'));
        s.reset();
        assert_eq!(
            s.process(key('j')),
            Outcome::Action {
                name: "nav.down".into(),
                count: 1
            }
        );
    }
}
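The count-prefix rule described in the module doc ('0' alone is bindable; digits otherwise accumulate with saturating arithmetic) can be isolated as a tiny pure function. A standalone sketch (the function name is illustrative; the crate inlines this logic in `process`):

```rust
// Sketch of the count-accumulation rule from KeymapState::process:
// returns the new count if the digit extends (or starts) a count,
// or None if '0' should instead be treated as a bindable key.
fn push_digit(count: u32, digit: u32) -> Option<u32> {
    if count == 0 && digit == 0 {
        None // '0' with no count in progress is a regular key
    } else {
        // Saturating arithmetic so absurd prefixes cannot overflow.
        Some(count.saturating_mul(10).saturating_add(digit))
    }
}

fn main() {
    assert_eq!(push_digit(0, 0), None);     // '0' alone: not a count-start
    assert_eq!(push_digit(0, 5), Some(5));  // "5"
    assert_eq!(push_digit(1, 0), Some(10)); // "10"
    assert_eq!(push_digit(u32::MAX, 9), Some(u32::MAX)); // saturates
    println!("ok");
}
```

Keeping the rule total and saturating means a pathological prefix like eleven 9s degrades to `u32::MAX` instead of panicking in debug builds.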
30
src-tauri/Cargo.toml
Normal file
@@ -0,0 +1,30 @@
[package]
name = "brittle-app"
version = "0.1.0"
edition = "2021"

[lib]
name = "brittle_app"

[[bin]]
name = "brittle"
path = "src/main.rs"

[build-dependencies]
tauri-build = { version = "2", features = [] }

[dependencies]
brittle-core = { path = "../brittle-core" }
tauri = { version = "2", features = [] }
serde = { version = "1", features = ["derive"] }
serde_json = "1"
toml = "0.8"
dirs = "6"
tokio = { version = "1", features = ["rt-multi-thread", "macros"] }
thiserror = "2"
url = "2"
urlencoding = "2"
uuid = { version = "1", features = ["v7"] }

[dev-dependencies]
tempfile = "3"
7
src-tauri/Trunk.toml
Normal file
@@ -0,0 +1,7 @@
[build]
target = "../src/index.html"
dist = "../dist"

[serve]
port = 1420
open = false
3
src-tauri/build.rs
Normal file
@@ -0,0 +1,3 @@
fn main() {
    tauri_build::build()
}
7
src-tauri/capabilities/default.json
Normal file
@@ -0,0 +1,7 @@
{
  "$schema": "https://schema.tauri.app/config/2/capability.json",
  "identifier": "default",
  "description": "Default capability for the main window",
  "windows": ["main"],
  "permissions": ["core:default"]
}
1
src-tauri/gen/schemas/acl-manifests.json
Normal file
File diff suppressed because one or more lines are too long
1
src-tauri/gen/schemas/capabilities.json
Normal file
@@ -0,0 +1 @@
{"default":{"identifier":"default","description":"Default capability for the main window","local":true,"windows":["main"],"permissions":["core:default"]}}
2244
src-tauri/gen/schemas/desktop-schema.json
Normal file
File diff suppressed because it is too large
2244
src-tauri/gen/schemas/linux-schema.json
Normal file
File diff suppressed because it is too large
BIN
src-tauri/icons/icon.png
Normal file
Binary file not shown.
Size: 103 B
8
src-tauri/pdfjs/package.json
Normal file
@@ -0,0 +1,8 @@
{
  "name": "brittle-pdfjs",
  "private": true,
  "description": "Pre-built pdfjs-dist for the Brittle PDF viewer.",
  "dependencies": {
    "pdfjs-dist": "3.11.174"
  }
}
41
src-tauri/src/commands/annotation.rs
Normal file
@@ -0,0 +1,41 @@
//! Tauri commands for PDF annotation CRUD.

use crate::state::AppState;
use brittle_core::{Annotation, AnnotationId, AnnotationType, ReferenceId};
use tauri::State;

#[tauri::command]
pub fn create_annotation(
    state: State<AppState>,
    reference_id: ReferenceId,
    page: u32,
    annotation_type: AnnotationType,
    content: Option<String>,
) -> Result<Annotation, String> {
    state.with_repo(|b| b.create_annotation(reference_id, page, annotation_type, content))
}

#[tauri::command]
pub fn get_annotations(
    state: State<AppState>,
    reference_id: ReferenceId,
) -> Result<Vec<Annotation>, String> {
    state.with_repo_read(|b| b.get_annotations(reference_id))
}

#[tauri::command]
pub fn update_annotation(
    state: State<AppState>,
    annotation: Annotation,
) -> Result<Annotation, String> {
    state.with_repo(|b| b.update_annotation(annotation))
}

#[tauri::command]
pub fn delete_annotation(
    state: State<AppState>,
    reference_id: ReferenceId,
    annotation_id: AnnotationId,
) -> Result<(), String> {
    state.with_repo(|b| b.delete_annotation(reference_id, annotation_id))
}
44
src-tauri/src/commands/bibtex.rs
Normal file
@@ -0,0 +1,44 @@
//! Tauri commands for BibTeX export.

use crate::state::AppState;
use brittle_core::{LibraryId, ReferenceId};
use serde::Serialize;
use tauri::State;

/// Result of a BibTeX export: the formatted string plus any non-fatal errors.
#[derive(Serialize)]
pub struct BibtexExportResult {
    pub bibtex: String,
    /// Warnings for references that were skipped due to missing required fields.
    pub errors: Vec<String>,
}

/// Export a list of references as BibTeX.
#[tauri::command]
pub fn export_bibtex(
    state: State<AppState>,
    reference_ids: Vec<ReferenceId>,
) -> Result<BibtexExportResult, String> {
    state.with_repo_read(|b| {
        let (bibtex, errors) = b.export_bibtex(&reference_ids)?;
        Ok(BibtexExportResult {
            bibtex,
            errors: errors.iter().map(|e| e.to_string()).collect(),
        })
    })
}

/// Export all references in a library as BibTeX.
#[tauri::command]
pub fn export_library_bibtex(
    state: State<AppState>,
    library_id: LibraryId,
) -> Result<BibtexExportResult, String> {
    state.with_repo_read(|b| {
        let (bibtex, errors) = b.export_library_bibtex(library_id)?;
        Ok(BibtexExportResult {
            bibtex,
            errors: errors.iter().map(|e| e.to_string()).collect(),
        })
    })
}
73
src-tauri/src/commands/config.rs
Normal file
@@ -0,0 +1,73 @@
//! Tauri commands for reading and writing configuration.

use std::path::Path;

use crate::config::{GlobalConfig, ProjectConfig};

/// Load the global config from `~/.config/brittle/config.toml`.
/// Returns the default config if the file does not yet exist.
#[tauri::command]
pub fn load_global_config() -> Result<GlobalConfig, String> {
    GlobalConfig::load().map_err(|e| e.to_string())
}

/// Persist the global config to `~/.config/brittle/config.toml`.
#[tauri::command]
pub fn save_global_config(config: GlobalConfig) -> Result<(), String> {
    config.save().map_err(|e| e.to_string())
}

/// Load the per-project config from `{repo_path}/.brittle/config.toml`.
/// Returns the default config if the file does not yet exist.
#[tauri::command]
pub fn load_project_config(repo_path: String) -> Result<ProjectConfig, String> {
    ProjectConfig::load(Path::new(&repo_path)).map_err(|e| e.to_string())
}

/// Persist the per-project config to `{repo_path}/.brittle/config.toml`.
#[tauri::command]
pub fn save_project_config(repo_path: String, config: ProjectConfig) -> Result<(), String> {
    config
        .save(Path::new(&repo_path))
        .map_err(|e| e.to_string())
}

/// Return the current theme name (`"dark"` or `"light"`) from the global config.
#[tauri::command]
pub fn get_theme() -> Result<String, String> {
    use crate::config::Theme;
    GlobalConfig::load()
        .map(|c| match c.appearance.theme {
            Theme::Dark => "dark".to_string(),
            Theme::Light => "light".to_string(),
        })
        .map_err(|e| e.to_string())
}

/// Persist a new theme choice to the global config.
///
/// `theme` must be `"dark"` or `"light"`.
#[tauri::command]
pub fn set_theme(theme: String) -> Result<(), String> {
    use crate::config::Theme;
    let parsed = match theme.as_str() {
        "dark" => Theme::Dark,
        "light" => Theme::Light,
        other => return Err(format!("unknown theme '{other}'")),
    };
    let mut config = GlobalConfig::load().map_err(|e| e.to_string())?;
    config.appearance.theme = parsed;
    config.save().map_err(|e| e.to_string())
}

/// Return the user's keybinding overrides from the global config.
///
/// Map keys are action names in snake_case (e.g. `"tab_next"`); values are
/// key-sequence strings (e.g. `"<C-Right>"`). Returns an empty map if no
/// config file exists or the `[keybindings]` section is absent.
#[tauri::command]
pub fn get_keybindings() -> Result<std::collections::HashMap<String, String>, String> {
    GlobalConfig::load()
        .map(|c| c.keybindings.0)
        .map_err(|e| e.to_string())
}
97
src-tauri/src/commands/library.rs
Normal file
@@ -0,0 +1,97 @@
//! Tauri commands for library CRUD, hierarchy, and membership.

use crate::state::AppState;
use brittle_core::{Library, LibraryId, ReferenceId};
use tauri::State;

#[tauri::command]
pub fn create_library(
    state: State<AppState>,
    name: String,
    parent_id: Option<LibraryId>,
) -> Result<Library, String> {
    state.with_repo(|b| b.create_library(name, parent_id))
}

#[tauri::command]
pub fn get_library(state: State<AppState>, id: LibraryId) -> Result<Library, String> {
    state.with_repo_read(|b| b.get_library(id))
}

#[tauri::command]
pub fn rename_library(
    state: State<AppState>,
    id: LibraryId,
    new_name: String,
) -> Result<Library, String> {
    state.with_repo(|b| b.rename_library(id, new_name))
}

#[tauri::command]
pub fn move_library(
    state: State<AppState>,
    id: LibraryId,
    new_parent: Option<LibraryId>,
) -> Result<Library, String> {
    state.with_repo(|b| b.move_library(id, new_parent))
}

/// Delete a library. Fails if it has child libraries.
#[tauri::command]
pub fn delete_library(state: State<AppState>, id: LibraryId) -> Result<(), String> {
    state.with_repo(|b| b.delete_library(id))
}

/// Delete a library and all its descendants (recursive).
#[tauri::command]
pub fn force_delete_library(state: State<AppState>, id: LibraryId) -> Result<(), String> {
    state.with_repo(|b| b.force_delete_library(id))
}

#[tauri::command]
pub fn list_root_libraries(state: State<AppState>) -> Result<Vec<Library>, String> {
    state.with_repo_read(|b| b.list_root_libraries())
}

#[tauri::command]
pub fn list_child_libraries(
    state: State<AppState>,
    parent_id: LibraryId,
) -> Result<Vec<Library>, String> {
    state.with_repo_read(|b| b.list_child_libraries(parent_id))
}

/// Return the ancestor chain of a library, ordered root → direct parent.
#[tauri::command]
pub fn get_library_ancestors(
    state: State<AppState>,
    id: LibraryId,
) -> Result<Vec<Library>, String> {
    state.with_repo_read(|b| b.get_library_ancestors(id))
}

#[tauri::command]
pub fn add_to_library(
    state: State<AppState>,
    library_id: LibraryId,
    reference_id: ReferenceId,
) -> Result<(), String> {
    state.with_repo(|b| b.add_to_library(library_id, reference_id))
}

#[tauri::command]
pub fn remove_from_library(
    state: State<AppState>,
    library_id: LibraryId,
    reference_id: ReferenceId,
) -> Result<(), String> {
    state.with_repo(|b| b.remove_from_library(library_id, reference_id))
}

#[tauri::command]
pub fn list_reference_libraries(
    state: State<AppState>,
    reference_id: ReferenceId,
) -> Result<Vec<Library>, String> {
    state.with_repo_read(|b| b.list_reference_libraries(reference_id))
}
9
src-tauri/src/commands/mod.rs
Normal file
@@ -0,0 +1,9 @@
pub mod annotation;
pub mod bibtex;
pub mod config;
pub mod library;
pub mod pdf;
pub mod reference;
pub mod repository;
pub mod snapshot;
pub mod window;
31
src-tauri/src/commands/pdf.rs
Normal file
@@ -0,0 +1,31 @@
//! Tauri commands for PDF attachment and retrieval.

use crate::state::AppState;
use brittle_core::{PdfAttachment, ReferenceId};
use std::path::PathBuf;
use tauri::State;

/// Attach a PDF file to a reference by copying it into the repository.
///
/// `source_path` is the absolute path to the file to copy.
/// Returns the stored attachment metadata.
#[tauri::command]
pub fn attach_pdf(
    state: State<AppState>,
    reference_id: ReferenceId,
    source_path: String,
) -> Result<PdfAttachment, String> {
    let path = PathBuf::from(&source_path);
    state.with_repo(|b| b.attach_pdf(reference_id, &path))
}

/// Return the absolute filesystem path to the PDF attached to a reference.
///
/// Returns an error if no PDF is attached.
#[tauri::command]
pub fn get_pdf_path(state: State<AppState>, reference_id: ReferenceId) -> Result<String, String> {
    state.with_repo_read(|b| {
        b.get_pdf_path(reference_id)
            .map(|p| p.to_string_lossy().into_owned())
    })
}
87
src-tauri/src/commands/reference.rs
Normal file
@@ -0,0 +1,87 @@
//! Tauri commands for reference CRUD and search.

use crate::state::AppState;
use brittle_core::{EntryType, LibraryId, Reference, ReferenceId, ReferenceSummary};
use tauri::State;

#[tauri::command]
pub fn create_reference(
    state: State<AppState>,
    cite_key: String,
    entry_type: EntryType,
) -> Result<Reference, String> {
    state.with_repo(|b| b.create_reference(cite_key, entry_type))
}

#[tauri::command]
pub fn get_reference(state: State<AppState>, id: ReferenceId) -> Result<Reference, String> {
    state.with_repo_read(|b| b.get_reference(id))
}

#[tauri::command]
pub fn update_reference(state: State<AppState>, reference: Reference) -> Result<Reference, String> {
    state.with_repo(|b| b.update_reference(reference))
}

#[tauri::command]
pub fn delete_reference(state: State<AppState>, id: ReferenceId) -> Result<(), String> {
    state.with_repo(|b| b.delete_reference(id))
}

#[tauri::command]
pub fn list_references(state: State<AppState>) -> Result<Vec<ReferenceSummary>, String> {
    state.with_repo_read(|b| b.list_references())
}

#[tauri::command]
pub fn set_field(
    state: State<AppState>,
    id: ReferenceId,
    field: String,
    value: String,
) -> Result<(), String> {
    state.with_repo(|b| b.set_field(id, &field, value))
}

#[tauri::command]
pub fn remove_field(state: State<AppState>, id: ReferenceId, field: String) -> Result<(), String> {
    state.with_repo(|b| b.remove_field(id, &field))
}

#[tauri::command]
pub fn search_references(
    state: State<AppState>,
    query: String,
) -> Result<Vec<ReferenceSummary>, String> {
    state.with_repo_read(|b| b.search_references(&query))
}

#[tauri::command]
pub fn search_library_references(
    state: State<AppState>,
    library_id: LibraryId,
    query: String,
) -> Result<Vec<ReferenceSummary>, String> {
    state.with_repo_read(|b| b.search_library_references(library_id, &query))
}

#[tauri::command]
pub fn list_library_references(
    state: State<AppState>,
    library_id: LibraryId,
) -> Result<Vec<ReferenceSummary>, String> {
    state.with_repo_read(|b| b.list_library_references(library_id))
}

#[tauri::command]
pub fn list_library_references_recursive(
    state: State<AppState>,
    library_id: LibraryId,
) -> Result<Vec<ReferenceSummary>, String> {
    let result = state.with_repo_read(|b| b.list_library_references_recursive(library_id));
    match &result {
        Ok(refs) => eprintln!(
            "[brittle] list_library_references_recursive({library_id}): {} refs",
            refs.len()
        ),
        Err(e) => eprintln!("[brittle] list_library_references_recursive({library_id}): ERROR: {e}"),
    }
    result
}
60
src-tauri/src/commands/repository.rs
Normal file
@@ -0,0 +1,60 @@
//! Tauri commands for repository lifecycle (create, open, close).

use crate::state::AppState;
use brittle_core::Brittle;
use std::path::PathBuf;
use tauri::State;

fn expand_tilde(path: &str) -> PathBuf {
    if let Some(rest) = path.strip_prefix("~/") {
        if let Ok(home) = std::env::var("HOME") {
            return PathBuf::from(home).join(rest);
        }
    }
    if path == "~" {
        if let Ok(home) = std::env::var("HOME") {
            return PathBuf::from(home);
        }
    }
    PathBuf::from(path)
}
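Note that `expand_tilde` only consults `HOME`, so on platforms where that variable is unset (typically Windows, where `USERPROFILE` is the convention) the path falls through unchanged. The helper is pure stdlib and easy to exercise standalone; a sketch that duplicates it for experimentation:

```rust
use std::path::PathBuf;

// Standalone copy of the helper above, for experimentation only.
fn expand_tilde(path: &str) -> PathBuf {
    if let Some(rest) = path.strip_prefix("~/") {
        if let Ok(home) = std::env::var("HOME") {
            return PathBuf::from(home).join(rest);
        }
    }
    if path == "~" {
        if let Ok(home) = std::env::var("HOME") {
            return PathBuf::from(home);
        }
    }
    PathBuf::from(path)
}

fn main() {
    // Pin HOME so the results are deterministic.
    std::env::set_var("HOME", "/home/alice");
    assert_eq!(expand_tilde("~/papers"), PathBuf::from("/home/alice/papers"));
    assert_eq!(expand_tilde("~"), PathBuf::from("/home/alice"));
    // A tilde anywhere but the start passes through untouched.
    assert_eq!(expand_tilde("/tmp/~x"), PathBuf::from("/tmp/~x"));
}
```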
/// Create a new Brittle repository at `path` and open it.
#[tauri::command]
pub fn create_repository(state: State<AppState>, path: String) -> Result<(), String> {
    let path = expand_tilde(&path);
    let brittle = Brittle::create(&path).map_err(|e| e.to_string())?;
    *state
        .brittle
        .lock()
        .map_err(|_| "lock poisoned".to_string())? = Some(brittle);
    Ok(())
}

/// Open an existing Brittle repository at `path`.
#[tauri::command]
pub fn open_repository(state: State<AppState>, path: String) -> Result<(), String> {
    let path = expand_tilde(&path);
    let brittle = Brittle::open(&path).map_err(|e| e.to_string())?;
    *state
        .brittle
        .lock()
        .map_err(|_| "lock poisoned".to_string())? = Some(brittle);
    Ok(())
}

/// Close the currently open repository. No-op if none is open.
#[tauri::command]
pub fn close_repository(state: State<AppState>) -> Result<(), String> {
    *state
        .brittle
        .lock()
        .map_err(|_| "lock poisoned".to_string())? = None;
    Ok(())
}

/// Return the filesystem path of the currently open repository.
#[tauri::command]
pub fn repository_root(state: State<AppState>) -> Result<String, String> {
    state.with_repo_read(|b| Ok(b.repository_root().to_string_lossy().into_owned()))
}
33
src-tauri/src/commands/snapshot.rs
Normal file
@@ -0,0 +1,33 @@
//! Tauri commands for snapshotting (git-backed history).

use crate::state::AppState;
use brittle_core::Snapshot;
use tauri::State;

#[tauri::command]
pub fn create_snapshot(state: State<AppState>, message: String) -> Result<Snapshot, String> {
    state.with_repo(|b| b.create_snapshot(&message))
}

#[tauri::command]
pub fn list_snapshots(state: State<AppState>) -> Result<Vec<Snapshot>, String> {
    state.with_repo_read(|b| b.list_snapshots())
}

/// Restore to a named snapshot. Fails if there are uncommitted changes.
/// Use `discard_changes` first if needed.
#[tauri::command]
pub fn restore_snapshot(state: State<AppState>, snapshot_id: String) -> Result<(), String> {
    state.with_repo(|b| b.restore_snapshot(&snapshot_id))
}

#[tauri::command]
pub fn has_uncommitted_changes(state: State<AppState>) -> Result<bool, String> {
    state.with_repo_read(|b| b.has_uncommitted_changes())
}

/// Discard all uncommitted changes, reverting to the last snapshot.
#[tauri::command]
pub fn discard_changes(state: State<AppState>) -> Result<(), String> {
    state.with_repo(|b| b.discard_changes())
}
54
src-tauri/src/commands/window.rs
Normal file
@@ -0,0 +1,54 @@
//! Tauri commands for managing PDF viewer windows.

use tauri::{AppHandle, Manager, WebviewUrl, WebviewWindowBuilder};

/// Open a PDF viewer window for the given reference ID.
///
/// If a window for this reference is already open it is focused instead of
/// creating a duplicate.
///
/// The window loads `brittle://app/viewer?ref_id=<ref_id>`.
#[tauri::command]
pub fn open_pdf_window(app: AppHandle, ref_id: String) -> Result<(), String> {
    let label = format!("pdf-{}", ref_id);

    // If already open, just focus it.
    if let Some(win) = app.get_webview_window(&label) {
        win.set_focus().map_err(|e| e.to_string())?;
        return Ok(());
    }

    let url_str = format!(
        "brittle://app/viewer?ref_id={}",
        urlencoding::encode(&ref_id)
    );
    let url = url_str.parse::<url::Url>().map_err(|e| e.to_string())?;

    WebviewWindowBuilder::new(&app, &label, WebviewUrl::External(url))
        .title("PDF Viewer — Brittle")
        .inner_size(900.0, 750.0)
        .min_inner_size(600.0, 400.0)
        .build()
        .map_err(|e| e.to_string())?;

    Ok(())
}

/// Close the PDF viewer window for the given reference ID, if open.
#[tauri::command]
pub fn close_pdf_window(app: AppHandle, ref_id: String) -> Result<(), String> {
    let label = format!("pdf-{}", ref_id);
    if let Some(win) = app.get_webview_window(&label) {
        win.close().map_err(|e| e.to_string())?;
    }
    Ok(())
}

/// Return the labels of all currently open PDF viewer windows.
#[tauri::command]
pub fn list_pdf_windows(app: AppHandle) -> Vec<String> {
    app.webview_windows()
        .into_keys()
        .filter(|label| label.starts_with("pdf-"))
        .collect()
}
183
src-tauri/src/config/global.rs
Normal file
@@ -0,0 +1,183 @@
//! Global (user-wide) configuration.

use std::collections::HashMap;
use std::path::{Path, PathBuf};

use serde::{Deserialize, Serialize};

use super::{AppearanceConfig, ConfigError, KeybindingsConfig, LayoutConfig};

/// Record of recently opened repositories, stored in the global config.
#[derive(Debug, Clone, PartialEq, Serialize, Deserialize, Default)]
#[serde(default)]
pub struct ProjectsState {
    pub recent: Vec<PathBuf>,
    pub last_opened: Option<PathBuf>,
}

/// User-wide configuration, stored at `~/.config/brittle/config.toml`.
///
/// All fields are optional in the file; missing fields fall back to their
/// `Default` implementations.
#[derive(Debug, Clone, PartialEq, Serialize, Deserialize, Default)]
#[serde(default)]
pub struct GlobalConfig {
    pub appearance: AppearanceConfig,
    pub layout: LayoutConfig,
    pub projects: ProjectsState,
    /// Flat map of action name → key combo string for user-defined overrides.
    #[serde(with = "keybindings_map")]
    pub keybindings: KeybindingsConfig,
}

impl GlobalConfig {
    /// Load from the standard platform config directory.
    ///
    /// Returns `Default` if the file does not yet exist.
    pub fn load() -> Result<Self, ConfigError> {
        Self::load_from(&global_config_path()?)
    }

    /// Load from an explicit path.
    ///
    /// Returns `Default` if the file does not exist.
    pub fn load_from(path: &Path) -> Result<Self, ConfigError> {
        if !path.exists() {
            return Ok(Self::default());
        }
        let content = std::fs::read_to_string(path)?;
        Ok(toml::from_str(&content)?)
    }

    /// Save to the standard platform config directory,
    /// creating parent directories as needed.
    pub fn save(&self) -> Result<(), ConfigError> {
        self.save_to(&global_config_path()?)
    }

    /// Save to an explicit path, creating parent directories as needed.
    pub fn save_to(&self, path: &Path) -> Result<(), ConfigError> {
        if let Some(parent) = path.parent() {
            std::fs::create_dir_all(parent)?;
        }
        std::fs::write(path, toml::to_string_pretty(self)?)?;
        Ok(())
    }
}

fn global_config_path() -> Result<PathBuf, ConfigError> {
    dirs::config_dir()
        .ok_or(ConfigError::NoConfigDir)
        .map(|d| d.join("brittle").join("config.toml"))
}

/// Custom serde module so `KeybindingsConfig` round-trips as a flat TOML table.
///
/// Stored in the file as:
/// ```toml
/// [keybindings]
/// focus_left = "H"
/// tab_next = "gt"
/// ```
mod keybindings_map {
    use super::*;
    use serde::{Deserializer, Serializer};

    pub fn serialize<S: Serializer>(kc: &KeybindingsConfig, s: S) -> Result<S::Ok, S::Error> {
        kc.0.serialize(s)
    }

    pub fn deserialize<'de, D: Deserializer<'de>>(d: D) -> Result<KeybindingsConfig, D::Error> {
        HashMap::<String, String>::deserialize(d).map(KeybindingsConfig)
    }
}

// ── Tests ─────────────────────────────────────────────────────────────────────

#[cfg(test)]
mod tests {
    use super::*;
    use crate::config::Theme;
    use tempfile::TempDir;

    #[test]
    fn global_config_defaults() {
        let cfg = GlobalConfig::default();
        assert_eq!(cfg.appearance.theme, Theme::Dark);
        assert_eq!(cfg.appearance.font_size, 14);
        assert!(cfg.projects.recent.is_empty());
        assert!(cfg.keybindings.0.is_empty());
    }

    #[test]
    fn global_config_round_trips() {
        let mut cfg = GlobalConfig::default();
        cfg.appearance.theme = Theme::Light;
        cfg.appearance.font_size = 16;
        cfg.keybindings
            .0
            .insert("focus_left".to_string(), "C-h".to_string());

        let s = toml::to_string_pretty(&cfg).unwrap();
        let parsed: GlobalConfig = toml::from_str(&s).unwrap();
        assert_eq!(parsed, cfg);
    }

    #[test]
    fn empty_toml_uses_all_defaults() {
        let cfg: GlobalConfig = toml::from_str("").unwrap();
        assert_eq!(cfg.appearance.font_size, 14);
        assert!((cfg.layout.left_pane_fraction - 0.20).abs() < f32::EPSILON);
    }

    #[test]
    fn partial_toml_uses_defaults_for_missing_sections() {
        let toml = "[appearance]\ntheme = \"light\"\n";
        let cfg: GlobalConfig = toml::from_str(toml).unwrap();
        assert_eq!(cfg.appearance.theme, Theme::Light);
        // layout not specified — should be default
        assert!((cfg.layout.left_pane_fraction - 0.20).abs() < f32::EPSILON);
    }

    #[test]
    fn load_from_nonexistent_path_returns_default() {
        let tmp = TempDir::new().unwrap();
        let path = tmp.path().join("does_not_exist.toml");
        let cfg = GlobalConfig::load_from(&path).unwrap();
        assert_eq!(cfg, GlobalConfig::default());
    }

    #[test]
    fn save_to_and_load_from_round_trip() {
        let tmp = TempDir::new().unwrap();
        let path = tmp.path().join("config.toml");

        let mut original = GlobalConfig::default();
        original.appearance.theme = Theme::Light;
        original.appearance.font_size = 18;
        original.save_to(&path).unwrap();

        let loaded = GlobalConfig::load_from(&path).unwrap();
        assert_eq!(loaded, original);
    }

    #[test]
    fn save_to_creates_parent_directories() {
        let tmp = TempDir::new().unwrap();
        let path = tmp.path().join("nested").join("dirs").join("config.toml");
        GlobalConfig::default().save_to(&path).unwrap();
        assert!(path.exists());
    }

    #[test]
    fn keybinding_overrides_round_trip() {
        let mut cfg = GlobalConfig::default();
        cfg.keybindings
            .0
            .insert("tab_next".to_string(), "C-Right".to_string());

        let s = toml::to_string_pretty(&cfg).unwrap();
        let parsed: GlobalConfig = toml::from_str(&s).unwrap();
        assert_eq!(parsed.keybindings.0["tab_next"], "C-Right");
    }
}
279
src-tauri/src/config/mod.rs
Normal file
@@ -0,0 +1,279 @@
//! Configuration types for Brittle.
//!
//! Two levels of config exist:
//! - [`GlobalConfig`] — stored at `~/.config/brittle/config.toml`; applies to all projects.
//! - [`ProjectConfig`] — stored at `{repo}/.brittle/config.toml`; overrides globals per project.
//!
//! Use [`MergedConfig::merge`] to produce the effective config the app should use.

mod global;
mod project;

pub use global::GlobalConfig;
pub use project::ProjectConfig;

use std::collections::HashMap;

use serde::{Deserialize, Serialize};

// ── Shared types ──────────────────────────────────────────────────────────────

/// Application colour theme.
#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize, Default)]
#[serde(rename_all = "lowercase")]
pub enum Theme {
    #[default]
    Dark,
    Light,
}

/// Appearance settings (font size, theme).
#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(default)]
pub struct AppearanceConfig {
    pub theme: Theme,
    pub font_size: u32,
}

impl Default for AppearanceConfig {
    fn default() -> Self {
        Self {
            theme: Theme::Dark,
            font_size: 14,
        }
    }
}

/// Partial appearance override from a project config.
/// Only `Some` fields replace the global value.
#[derive(Debug, Clone, PartialEq, Serialize, Deserialize, Default)]
#[serde(default)]
pub struct AppearanceOverride {
    pub theme: Option<Theme>,
    pub font_size: Option<u32>,
}

/// Pane layout proportions (fractions of the window width, 0..1).
#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(default)]
pub struct LayoutConfig {
    pub left_pane_fraction: f32,
    pub right_pane_fraction: f32,
}

impl Default for LayoutConfig {
    fn default() -> Self {
        Self {
            left_pane_fraction: 0.20,
            right_pane_fraction: 0.35,
        }
    }
}

/// Keybinding overrides — maps action name to key combo string.
///
/// Only actions the user wants to rebind need an entry here; everything else
/// falls back to the built-in defaults defined in `keymap`.
#[derive(Debug, Clone, PartialEq, Serialize, Deserialize, Default)]
pub struct KeybindingsConfig(pub HashMap<String, String>);

// ── Merged config ─────────────────────────────────────────────────────────────

/// The effective config the application uses at runtime, produced by merging
/// global defaults with per-project overrides.
#[derive(Debug, Clone)]
#[allow(dead_code)] // consumed in Phase 3 when AppState is wired up
pub struct MergedConfig {
    pub appearance: AppearanceConfig,
    pub layout: LayoutConfig,
    pub keybindings: KeybindingsConfig,
    /// Reference IDs of tabs that should be restored on launch.
    pub open_tabs: Vec<String>,
}

impl MergedConfig {
    /// Merge `global` with an optional `project` override.
    ///
    /// Project values take precedence for appearance; layout and keybindings
    /// are always taken from the global config (per-project overrides for those
    /// are intentionally not supported — it would be confusing).
    #[allow(dead_code)] // consumed in Phase 3 when AppState is wired up
    pub fn merge(global: &GlobalConfig, project: Option<&ProjectConfig>) -> Self {
        let appearance = match project.and_then(|p| p.appearance.as_ref()) {
            Some(ov) => AppearanceConfig {
                theme: ov
                    .theme
                    .clone()
                    .unwrap_or_else(|| global.appearance.theme.clone()),
                font_size: ov.font_size.unwrap_or(global.appearance.font_size),
            },
            None => global.appearance.clone(),
        };

        Self {
            appearance,
            layout: global.layout.clone(),
            keybindings: global.keybindings.clone(),
            open_tabs: project
                .map(|p| p.session.open_tabs.clone())
                .unwrap_or_default(),
        }
    }
}
|
||||
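The precedence rule in `merge`, project value where set, otherwise global, is just `Option::unwrap_or` applied per field. A self-contained sketch with simplified stand-in structs (the `Appearance`/`Override` types below are illustrative, not the crate's real `AppearanceConfig`/`AppearanceOverride`):

```rust
// Simplified stand-ins for the real config types, for illustration only.
#[derive(Clone, Debug, PartialEq)]
struct Appearance {
    theme: String,
    font_size: u8,
}

struct Override {
    theme: Option<String>,
    font_size: Option<u8>,
}

// Per-field precedence: take the override if present, else the global value.
fn merge(global: &Appearance, ov: Option<&Override>) -> Appearance {
    match ov {
        Some(o) => Appearance {
            theme: o.theme.clone().unwrap_or_else(|| global.theme.clone()),
            font_size: o.font_size.unwrap_or(global.font_size),
        },
        None => global.clone(),
    }
}

fn main() {
    let global = Appearance { theme: "dark".into(), font_size: 14 };
    let ov = Override { theme: Some("light".into()), font_size: None };
    let merged = merge(&global, Some(&ov));
    assert_eq!(merged.theme, "light"); // the project override wins where set
    assert_eq!(merged.font_size, 14);  // unset fields fall back to global
    println!("ok");
}
```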
// ── Error type ────────────────────────────────────────────────────────────────

#[derive(Debug, thiserror::Error)]
pub enum ConfigError {
    #[error("no config directory found on this platform")]
    NoConfigDir,
    #[error("I/O error: {0}")]
    Io(#[from] std::io::Error),
    #[error("TOML parse error: {0}")]
    Parse(#[from] toml::de::Error),
    #[error("TOML serialize error: {0}")]
    Serialize(#[from] toml::ser::Error),
}

// ── Tests ─────────────────────────────────────────────────────────────────────

#[cfg(test)]
mod tests {
    use super::*;
    use crate::config::project::SessionConfig;

    // ── AppearanceConfig ──────────────────────────────────────────────────────

    #[test]
    fn appearance_defaults_are_dark_14pt() {
        let a = AppearanceConfig::default();
        assert_eq!(a.theme, Theme::Dark);
        assert_eq!(a.font_size, 14);
    }

    #[test]
    fn appearance_round_trips() {
        let original = AppearanceConfig {
            theme: Theme::Light,
            font_size: 16,
        };
        let s = toml::to_string_pretty(&original).unwrap();
        let parsed: AppearanceConfig = toml::from_str(&s).unwrap();
        assert_eq!(parsed, original);
    }

    #[test]
    fn appearance_missing_fields_use_defaults() {
        // Only theme specified; font_size should fall back to 14.
        let parsed: AppearanceConfig = toml::from_str("theme = \"light\"").unwrap();
        assert_eq!(parsed.theme, Theme::Light);
        assert_eq!(parsed.font_size, 14);
    }

    // ── LayoutConfig ─────────────────────────────────────────────────────────

    #[test]
    fn layout_defaults() {
        let l = LayoutConfig::default();
        assert!((l.left_pane_fraction - 0.20).abs() < f32::EPSILON);
        assert!((l.right_pane_fraction - 0.35).abs() < f32::EPSILON);
    }

    #[test]
    fn layout_round_trips() {
        let original = LayoutConfig {
            left_pane_fraction: 0.25,
            right_pane_fraction: 0.40,
        };
        let s = toml::to_string_pretty(&original).unwrap();
        let parsed: LayoutConfig = toml::from_str(&s).unwrap();
        assert_eq!(parsed, original);
    }

    // ── MergedConfig ─────────────────────────────────────────────────────────

    #[test]
    fn merge_without_project_uses_globals() {
        let global = GlobalConfig {
            appearance: AppearanceConfig {
                theme: Theme::Light,
                font_size: 16,
            },
            ..Default::default()
        };
        let merged = MergedConfig::merge(&global, None);
        assert_eq!(merged.appearance.theme, Theme::Light);
        assert_eq!(merged.appearance.font_size, 16);
        assert!(merged.open_tabs.is_empty());
    }

    #[test]
    fn merge_project_overrides_theme() {
        let global = GlobalConfig {
            appearance: AppearanceConfig {
                theme: Theme::Dark,
                font_size: 14,
            },
            ..Default::default()
        };
        let project = ProjectConfig {
            appearance: Some(AppearanceOverride {
                theme: Some(Theme::Light),
                font_size: None,
            }),
            ..Default::default()
        };
        let merged = MergedConfig::merge(&global, Some(&project));
        assert_eq!(merged.appearance.theme, Theme::Light);
        // font_size not overridden — inherits from global
        assert_eq!(merged.appearance.font_size, 14);
    }

    #[test]
    fn merge_project_overrides_font_size_only() {
        let global = GlobalConfig {
            appearance: AppearanceConfig {
                theme: Theme::Dark,
                font_size: 14,
            },
            ..Default::default()
        };
        let project = ProjectConfig {
            appearance: Some(AppearanceOverride {
                theme: None,
                font_size: Some(18),
            }),
            ..Default::default()
        };
        let merged = MergedConfig::merge(&global, Some(&project));
        assert_eq!(merged.appearance.theme, Theme::Dark);
        assert_eq!(merged.appearance.font_size, 18);
    }

    #[test]
    fn merge_project_open_tabs_are_included() {
        let global = GlobalConfig::default();
        let project = ProjectConfig {
            session: SessionConfig {
                open_tabs: vec!["tab-a".to_string(), "tab-b".to_string()],
            },
            ..Default::default()
        };
        let merged = MergedConfig::merge(&global, Some(&project));
        assert_eq!(merged.open_tabs, ["tab-a", "tab-b"]);
    }

    #[test]
    fn merge_layout_always_from_global() {
        let global = GlobalConfig {
            layout: LayoutConfig {
                left_pane_fraction: 0.30,
                right_pane_fraction: 0.40,
            },
            ..Default::default()
        };
        // Project config has no layout override capability.
        let merged = MergedConfig::merge(&global, None);
        assert!((merged.layout.left_pane_fraction - 0.30).abs() < f32::EPSILON);
    }
}
160
src-tauri/src/config/project.rs
Normal file
@@ -0,0 +1,160 @@
//! Per-project configuration.

use std::path::{Path, PathBuf};

use serde::{Deserialize, Serialize};

use super::{AppearanceOverride, ConfigError};

/// Reference IDs of tabs to restore when the project is next opened.
#[derive(Debug, Clone, PartialEq, Serialize, Deserialize, Default)]
#[serde(default)]
pub struct SessionConfig {
    pub open_tabs: Vec<String>,
}

/// Per-project configuration, stored at `{repo}/.brittle/config.toml`.
///
/// This file should be in the project's `.gitignore` — it holds local session
/// state and optional appearance overrides that are personal to this machine.
#[derive(Debug, Clone, PartialEq, Serialize, Deserialize, Default)]
#[serde(default)]
pub struct ProjectConfig {
    /// Optional per-project appearance overrides. `None` means "use globals."
    pub appearance: Option<AppearanceOverride>,
    pub session: SessionConfig,
}

impl ProjectConfig {
    /// Load from `{repo_root}/.brittle/config.toml`.
    ///
    /// Returns `Default` if the file does not yet exist.
    pub fn load(repo_root: &Path) -> Result<Self, ConfigError> {
        Self::load_from(&project_config_path(repo_root))
    }

    /// Load from an explicit path.
    ///
    /// Returns `Default` if the file does not exist.
    pub fn load_from(path: &Path) -> Result<Self, ConfigError> {
        if !path.exists() {
            return Ok(Self::default());
        }
        let content = std::fs::read_to_string(path)?;
        Ok(toml::from_str(&content)?)
    }

    /// Save to `{repo_root}/.brittle/config.toml`.
    pub fn save(&self, repo_root: &Path) -> Result<(), ConfigError> {
        self.save_to(&project_config_path(repo_root))
    }

    /// Save to an explicit path, creating parent directories as needed.
    pub fn save_to(&self, path: &Path) -> Result<(), ConfigError> {
        if let Some(parent) = path.parent() {
            std::fs::create_dir_all(parent)?;
        }
        std::fs::write(path, toml::to_string_pretty(self)?)?;
        Ok(())
    }
}

/// Returns the canonical path for a project's config file.
pub fn project_config_path(repo_root: &Path) -> PathBuf {
    repo_root.join(".brittle").join("config.toml")
}

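The join chain in `project_config_path` composes purely with `std::path`, so its behavior is easy to check in isolation. A small sketch (the `/home/alice/papers` repo root is a hypothetical example path; the output below assumes a Unix path separator):

```rust
use std::path::{Path, PathBuf};

// Same join chain as project_config_path: {repo_root}/.brittle/config.toml.
fn project_config_path(repo_root: &Path) -> PathBuf {
    repo_root.join(".brittle").join("config.toml")
}

fn main() {
    // Hypothetical repo root, for illustration only.
    let p = project_config_path(Path::new("/home/alice/papers"));
    assert_eq!(p, PathBuf::from("/home/alice/papers/.brittle/config.toml"));
    println!("{}", p.display());
}
```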
// ── Tests ─────────────────────────────────────────────────────────────────────

#[cfg(test)]
mod tests {
    use super::*;
    use crate::config::Theme;
    use tempfile::TempDir;

    #[test]
    fn project_config_defaults() {
        let cfg = ProjectConfig::default();
        assert!(cfg.appearance.is_none());
        assert!(cfg.session.open_tabs.is_empty());
    }

    #[test]
    fn project_config_round_trips() {
        let cfg = ProjectConfig {
            appearance: Some(AppearanceOverride {
                theme: Some(Theme::Light),
                font_size: Some(16),
            }),
            session: SessionConfig {
                open_tabs: vec!["abc-123".to_string()],
            },
        };
        let s = toml::to_string_pretty(&cfg).unwrap();
        let parsed: ProjectConfig = toml::from_str(&s).unwrap();
        assert_eq!(parsed, cfg);
    }

    #[test]
    fn empty_toml_uses_defaults() {
        let cfg: ProjectConfig = toml::from_str("").unwrap();
        assert_eq!(cfg, ProjectConfig::default());
    }

    #[test]
    fn partial_appearance_override_keeps_none_fields() {
        // Only theme specified; font_size should remain None.
        let toml = "[appearance]\ntheme = \"light\"\n";
        let cfg: ProjectConfig = toml::from_str(toml).unwrap();
        let ov = cfg.appearance.unwrap();
        assert_eq!(ov.theme, Some(Theme::Light));
        assert!(ov.font_size.is_none());
    }

    #[test]
    fn load_from_nonexistent_path_returns_default() {
        let tmp = TempDir::new().unwrap();
        let path = tmp.path().join("nope.toml");
        let cfg = ProjectConfig::load_from(&path).unwrap();
        assert_eq!(cfg, ProjectConfig::default());
    }

    #[test]
    fn save_to_and_load_from_round_trip() {
        let tmp = TempDir::new().unwrap();
        let path = tmp.path().join("config.toml");

        let original = ProjectConfig {
            appearance: Some(AppearanceOverride {
                theme: Some(Theme::Dark),
                font_size: None,
            }),
            session: SessionConfig {
                open_tabs: vec!["ref-1".to_string()],
            },
        };
        original.save_to(&path).unwrap();

        let loaded = ProjectConfig::load_from(&path).unwrap();
        assert_eq!(loaded, original);
    }

    #[test]
    fn load_and_save_use_brittle_subdir() {
        let tmp = TempDir::new().unwrap();
        let repo = tmp.path();

        let original = ProjectConfig {
            session: SessionConfig {
                open_tabs: vec!["x".to_string()],
            },
            ..Default::default()
        };
        original.save(repo).unwrap();

        assert!(repo.join(".brittle").join("config.toml").exists());

        let loaded = ProjectConfig::load(repo).unwrap();
        assert_eq!(loaded, original);
    }
}
87
src-tauri/src/lib.rs
Normal file
@@ -0,0 +1,87 @@
mod commands;
mod config;
mod pdf_protocol;
pub mod state;

use state::AppState;

pub fn run() {
    tauri::Builder::default()
        .manage(AppState::new())
        .setup(|app| {
            #[cfg(debug_assertions)]
            {
                use tauri::Manager;
                if let Some(win) = app.get_webview_window("main") {
                    win.open_devtools();
                }
            }
            Ok(())
        })
        .register_uri_scheme_protocol("brittle", |ctx, req| {
            pdf_protocol::handle(ctx.app_handle(), &req)
        })
        .invoke_handler(tauri::generate_handler![
            // config
            commands::config::load_global_config,
            commands::config::save_global_config,
            commands::config::load_project_config,
            commands::config::save_project_config,
            commands::config::get_theme,
            commands::config::set_theme,
            commands::config::get_keybindings,
            // repository
            commands::repository::create_repository,
            commands::repository::open_repository,
            commands::repository::close_repository,
            commands::repository::repository_root,
            // reference
            commands::reference::create_reference,
            commands::reference::get_reference,
            commands::reference::update_reference,
            commands::reference::delete_reference,
            commands::reference::list_references,
            commands::reference::set_field,
            commands::reference::remove_field,
            commands::reference::search_references,
            commands::reference::search_library_references,
            commands::reference::list_library_references,
            commands::reference::list_library_references_recursive,
            // library
            commands::library::create_library,
            commands::library::get_library,
            commands::library::rename_library,
            commands::library::move_library,
            commands::library::delete_library,
            commands::library::force_delete_library,
            commands::library::list_root_libraries,
            commands::library::list_child_libraries,
            commands::library::get_library_ancestors,
            commands::library::add_to_library,
            commands::library::remove_from_library,
            commands::library::list_reference_libraries,
            // annotation
            commands::annotation::create_annotation,
            commands::annotation::get_annotations,
            commands::annotation::update_annotation,
            commands::annotation::delete_annotation,
            // pdf
            commands::pdf::attach_pdf,
            commands::pdf::get_pdf_path,
            // snapshot
            commands::snapshot::create_snapshot,
            commands::snapshot::list_snapshots,
            commands::snapshot::restore_snapshot,
            commands::snapshot::has_uncommitted_changes,
            commands::snapshot::discard_changes,
            // bibtex
            commands::bibtex::export_bibtex,
            commands::bibtex::export_library_bibtex,
            // window
            commands::window::open_pdf_window,
            commands::window::close_pdf_window,
            commands::window::list_pdf_windows,
        ])
        .run(tauri::generate_context!())
        .expect("error while running tauri application");
}
5
src-tauri/src/main.rs
Normal file
@@ -0,0 +1,5 @@
#![cfg_attr(not(debug_assertions), windows_subsystem = "windows")]

fn main() {
    brittle_app::run()
}
320
src-tauri/src/pdf_protocol.rs
Normal file
@@ -0,0 +1,320 @@
//! Custom `brittle://` URI scheme handler.
//!
//! Routes:
//!   `brittle://app/viewer?ref_id=<uuid>` — the PDF viewer HTML page
//!   `brittle://app/pdfjs/<path>`         — pdfjs-dist static assets
//!   `brittle://app/pdf?ref_id=<uuid>`    — raw PDF bytes from the repository
//!
//! The pure routing and path-resolution logic lives in the `routing` sub-module
//! so it can be unit-tested without a running Tauri application.

use std::path::PathBuf;
use tauri::{
    http::{header, Request, Response, StatusCode},
    AppHandle, Manager, Runtime,
};

use crate::state::AppState;

// ── Embedded assets ───────────────────────────────────────────────────────────

static VIEWER_HTML: &[u8] = include_bytes!("pdf_viewer.html");

// ── Public API ────────────────────────────────────────────────────────────────

/// Entry point registered with `tauri::Builder::register_uri_scheme_protocol`.
///
/// Generic over the Tauri runtime so the function can be used from a closure
/// in the builder without knowing the concrete runtime at compile time.
pub fn handle<R: Runtime>(app: &AppHandle<R>, req: &Request<Vec<u8>>) -> Response<Vec<u8>> {
    let uri = req.uri();

    match routing::classify(uri.path(), uri.query()) {
        routing::Route::Viewer { ref_id } => serve_viewer(&ref_id),
        routing::Route::PdfjsAsset { rel_path } => {
            if rel_path.contains("..") {
                return response_403();
            }
            serve_pdfjs_file(&pdfjs_root(app), &rel_path)
        }
        routing::Route::Pdf { ref_id } => serve_pdf(app, &ref_id),
        routing::Route::NotFound => response_404(),
    }
}

// ── Route handlers ────────────────────────────────────────────────────────────

fn serve_viewer(ref_id: &str) -> Response<Vec<u8>> {
    // Substitute the ref_id into the HTML template so the viewer knows which PDF to load.
    let html = String::from_utf8_lossy(VIEWER_HTML)
        .replace("ref_id=\"\"", &format!("ref_id=\"{}\"", ref_id));
    response_ok(html.into_bytes(), "text/html; charset=utf-8")
}

fn serve_pdfjs_file(pdfjs_root: &std::path::Path, rel_path: &str) -> Response<Vec<u8>> {
    let full_path = pdfjs_root.join(rel_path);
    match std::fs::read(&full_path) {
        Ok(bytes) => {
            let mime = routing::mime_for_path(&full_path);
            let mut resp = response_ok(bytes, mime);
            resp.headers_mut().insert(
                header::CACHE_CONTROL,
                "public, max-age=3600".parse().unwrap(),
            );
            resp
        }
        Err(_) => response_404(),
    }
}

fn serve_pdf<R: Runtime>(app: &AppHandle<R>, ref_id: &str) -> Response<Vec<u8>> {
    use brittle_core::{model::ids::ReferenceId, store::FsStore, Brittle};
    use uuid::Uuid;

    let uuid = match Uuid::parse_str(ref_id) {
        Ok(u) => u,
        Err(_) => return response_400("invalid ref_id: not a valid UUID"),
    };
    let rid = ReferenceId::from(uuid);

    let state = app.state::<AppState>();
    let pdf_path: Result<PathBuf, String> =
        state.with_repo_read(|b: &Brittle<FsStore>| b.get_pdf_path(rid));

    match pdf_path {
        Err(e) => response_404_msg(&e),
        Ok(path) => match std::fs::read(&path) {
            Ok(bytes) => {
                let mut resp = response_ok(bytes, "application/pdf");
                resp.headers_mut()
                    .insert(header::CACHE_CONTROL, "no-store".parse().unwrap());
                resp
            }
            Err(e) => response_500(&e.to_string()),
        },
    }
}

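The error mapping in `serve_pdf` is worth stating explicitly: a failed repository lookup becomes 404, a path that exists in the repository but cannot be read from disk becomes 500, and only a successful read returns 200. A dependency-free sketch of that decision table (the `Status` enum and `status_for` helper are hypothetical stand-ins for the HTTP responses, not crate API):

```rust
// Hypothetical stand-in for the three HTTP outcomes in serve_pdf.
#[derive(Debug, PartialEq)]
enum Status {
    Ok,          // 200: bytes served
    NotFound,    // 404: repository has no PDF for this reference
    ServerError, // 500: path known, but the file read failed
}

// lookup: result of asking the repository for the PDF path.
// readable: whether the file at that path could actually be read.
fn status_for(lookup: Result<&str, &str>, readable: bool) -> Status {
    match lookup {
        Err(_) => Status::NotFound,
        Ok(_) if !readable => Status::ServerError,
        Ok(_) => Status::Ok,
    }
}

fn main() {
    assert_eq!(status_for(Err("no such reference"), true), Status::NotFound);
    assert_eq!(status_for(Ok("/tmp/a.pdf"), false), Status::ServerError);
    assert_eq!(status_for(Ok("/tmp/a.pdf"), true), Status::Ok);
    println!("ok");
}
```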
// ── Path resolution ───────────────────────────────────────────────────────────

fn pdfjs_root<R: Runtime>(app: &AppHandle<R>) -> PathBuf {
    if cfg!(debug_assertions) {
        PathBuf::from(env!("CARGO_MANIFEST_DIR"))
            .join("pdfjs")
            .join("node_modules")
            .join("pdfjs-dist")
    } else {
        app.path()
            .resource_dir()
            .unwrap_or_default()
            .join("pdfjs-dist")
    }
}

// ── Response builders ─────────────────────────────────────────────────────────

fn response_ok(body: Vec<u8>, content_type: &str) -> Response<Vec<u8>> {
    Response::builder()
        .header(header::CONTENT_TYPE, content_type)
        .header(header::ACCESS_CONTROL_ALLOW_ORIGIN, "*")
        .status(StatusCode::OK)
        .body(body)
        .unwrap()
}

fn response_404() -> Response<Vec<u8>> {
    Response::builder()
        .status(StatusCode::NOT_FOUND)
        .body(b"Not Found".to_vec())
        .unwrap()
}

fn response_404_msg(msg: &str) -> Response<Vec<u8>> {
    Response::builder()
        .status(StatusCode::NOT_FOUND)
        .body(msg.as_bytes().to_vec())
        .unwrap()
}

fn response_400(msg: &str) -> Response<Vec<u8>> {
    Response::builder()
        .status(StatusCode::BAD_REQUEST)
        .body(msg.as_bytes().to_vec())
        .unwrap()
}

fn response_403() -> Response<Vec<u8>> {
    Response::builder()
        .status(StatusCode::FORBIDDEN)
        .body(b"Forbidden".to_vec())
        .unwrap()
}

fn response_500(msg: &str) -> Response<Vec<u8>> {
    Response::builder()
        .status(StatusCode::INTERNAL_SERVER_ERROR)
        .body(msg.as_bytes().to_vec())
        .unwrap()
}

// ── Pure routing logic (unit-testable) ───────────────────────────────────────

pub mod routing {
    use std::path::Path;

    #[derive(Debug, PartialEq)]
    pub enum Route {
        Viewer { ref_id: String },
        PdfjsAsset { rel_path: String },
        Pdf { ref_id: String },
        NotFound,
    }

    /// Classify a `brittle://app/{path}?{query}` request into a `Route`.
    pub fn classify(path: &str, query: Option<&str>) -> Route {
        let ref_id = extract_ref_id(query);

        if path == "/viewer" {
            Route::Viewer { ref_id }
        } else if let Some(rel) = path.strip_prefix("/pdfjs/") {
            Route::PdfjsAsset {
                rel_path: rel.to_owned(),
            }
        } else if path == "/pdf" {
            Route::Pdf { ref_id }
        } else {
            Route::NotFound
        }
    }

    /// Extract the value of `ref_id=…` from a URL query string.
    pub fn extract_ref_id(query: Option<&str>) -> String {
        query
            .unwrap_or("")
            .split('&')
            .find_map(|part| part.strip_prefix("ref_id="))
            .unwrap_or("")
            .to_owned()
    }

    /// Return the appropriate MIME type for a file path based on its extension.
    pub fn mime_for_path(path: &Path) -> &'static str {
        match path.extension().and_then(|e| e.to_str()) {
            Some("js") | Some("mjs") => "application/javascript; charset=utf-8",
            Some("css") => "text/css; charset=utf-8",
            Some("html") => "text/html; charset=utf-8",
            Some("pdf") => "application/pdf",
            Some("woff2") => "font/woff2",
            Some("woff") => "font/woff",
            Some("png") => "image/png",
            Some("jpg") | Some("jpeg") => "image/jpeg",
            Some("svg") => "image/svg+xml",
            Some("map") => "application/json",
            _ => "application/octet-stream",
        }
    }

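One property of `extract_ref_id` worth noting: it returns the value verbatim and does no percent-decoding, which is safe here only because UUIDs never contain characters that need encoding. The standalone sketch below reproduces the same parsing with only `std` so the behavior can be checked in isolation:

```rust
// Same parsing as routing::extract_ref_id: split on '&', take the first
// "ref_id=" pair, return its value verbatim (no percent-decoding).
fn extract_ref_id(query: Option<&str>) -> String {
    query
        .unwrap_or("")
        .split('&')
        .find_map(|part| part.strip_prefix("ref_id="))
        .unwrap_or("")
        .to_owned()
}

fn main() {
    assert_eq!(extract_ref_id(Some("foo=1&ref_id=abc-123")), "abc-123");
    assert_eq!(extract_ref_id(None), "");
    // A percent-encoded value comes back still encoded:
    assert_eq!(extract_ref_id(Some("ref_id=a%20b")), "a%20b");
    println!("ok");
}
```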
    #[cfg(test)]
    mod tests {
        use super::*;

        #[test]
        fn viewer_route() {
            let r = classify("/viewer", Some("ref_id=abc-123"));
            assert_eq!(
                r,
                Route::Viewer {
                    ref_id: "abc-123".into()
                }
            );
        }

        #[test]
        fn viewer_route_no_ref_id() {
            let r = classify("/viewer", None);
            assert_eq!(r, Route::Viewer { ref_id: "".into() });
        }

        #[test]
        fn pdfjs_asset_route_build_file() {
            let r = classify("/pdfjs/build/pdf.min.js", None);
            assert_eq!(
                r,
                Route::PdfjsAsset {
                    rel_path: "build/pdf.min.js".into()
                }
            );
        }

        #[test]
        fn pdfjs_asset_route_nested() {
            let r = classify("/pdfjs/web/pdf_viewer.css", None);
            assert_eq!(
                r,
                Route::PdfjsAsset {
                    rel_path: "web/pdf_viewer.css".into()
                }
            );
        }

        #[test]
        fn pdf_route() {
            let r = classify("/pdf", Some("ref_id=01234567-89ab-cdef-0123-456789abcdef"));
            assert_eq!(
                r,
                Route::Pdf {
                    ref_id: "01234567-89ab-cdef-0123-456789abcdef".into()
                }
            );
        }

        #[test]
        fn unknown_paths_are_not_found() {
            assert_eq!(classify("/unknown", None), Route::NotFound);
            assert_eq!(classify("/", None), Route::NotFound);
            assert_eq!(classify("/favicon.ico", None), Route::NotFound);
        }

        #[test]
        fn extract_ref_id_from_compound_query() {
            let id = extract_ref_id(Some("foo=bar&ref_id=my-id&baz=1"));
            assert_eq!(id, "my-id");
        }

        #[test]
        fn extract_ref_id_missing_returns_empty() {
            assert_eq!(extract_ref_id(None), "");
            assert_eq!(extract_ref_id(Some("foo=bar")), "");
        }

        #[test]
        fn mime_for_js_files() {
            assert!(mime_for_path(Path::new("pdf.min.js")).contains("javascript"));
        }

        #[test]
        fn mime_for_css_files() {
            assert!(mime_for_path(Path::new("viewer.css")).contains("css"));
        }

        #[test]
        fn mime_for_unknown_extension() {
            assert_eq!(
                mime_for_path(Path::new("data.bin")),
                "application/octet-stream"
            );
        }

        #[test]
        fn path_traversal_rel_path_contains_dotdot() {
            // The handler rejects rel_paths containing ".."; verify the routing
            // surfaces them so the handler can block them.
            if let Route::PdfjsAsset { rel_path } = classify("/pdfjs/../secret", None) {
                assert!(rel_path.contains(".."));
            } else {
                panic!("expected PdfjsAsset route");
            }
        }
    }
}
288
src-tauri/src/pdf_viewer.html
Normal file
@@ -0,0 +1,288 @@
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <title>PDF Viewer — Brittle</title>
  <style>
    *, *::before, *::after { margin: 0; padding: 0; box-sizing: border-box; }

    html, body {
      height: 100%;
      overflow: hidden;
      background: #3a3a3a;
      font-family: system-ui, -apple-system, sans-serif;
      color: #ccc;
    }

    body { display: flex; flex-direction: column; }

    #error-banner {
      display: none;
      flex-shrink: 0;
      background: #5c1a1a;
      color: #f99;
      padding: 10px 20px;
      font-size: 13px;
      border-bottom: 1px solid #7a2222;
    }

    #toolbar {
      flex-shrink: 0;
      background: #252525;
      border-bottom: 1px solid #444;
      padding: 5px 14px;
      display: flex;
      align-items: center;
      gap: 14px;
      font-size: 13px;
    }

    #zoom-controls { display: flex; align-items: center; gap: 6px; }

    button {
      background: #3d3d3d;
      color: #ccc;
      border: 1px solid #555;
      border-radius: 4px;
      padding: 3px 10px;
      cursor: pointer;
      font-size: 13px;
      line-height: 1.4;
      min-width: 28px;
    }
    button:hover { background: #4a4a4a; }

    #zoom-label {
      min-width: 46px;
      text-align: center;
      font-size: 12px;
      color: #bbb;
    }

    #page-indicator {
      color: #888;
      font-size: 12px;
    }

    #status {
      margin-left: auto;
      color: #888;
      font-size: 12px;
    }

    /* Scrollable page stack */
    #canvas-container {
      flex: 1;
      overflow-y: auto;
      overflow-x: auto;
      padding: 20px 0;
      display: flex;
      flex-direction: column;
      align-items: center;
      gap: 12px;
    }

    .page-wrapper { flex-shrink: 0; }

    .page-wrapper canvas {
      display: block;
      background: #fff;
      box-shadow: 0 3px 14px rgba(0, 0, 0, 0.55);
    }
  </style>
</head>
<body>

<div id="error-banner"></div>

<div id="toolbar">
  <div id="zoom-controls">
    <button id="btn-zoom-out" title="Zoom out [ − ]">−</button>
    <span id="zoom-label">—</span>
    <button id="btn-zoom-in" title="Zoom in [ + ]">+</button>
    <button id="btn-zoom-fit" title="Fit width [ 0 ]">Fit</button>
  </div>
  <span id="page-indicator">— / —</span>
  <span id="status">Loading PDF.js…</span>
</div>

<div id="canvas-container"></div>

<script src="brittle://app/pdfjs/build/pdf.min.js"></script>
<script>
  "use strict";

  const refId = new URLSearchParams(location.search).get("ref_id") || "";

  const pdfjsLib = window.pdfjsLib;
  if (!pdfjsLib) {
    showError("PDF.js failed to load. Make sure the app is running inside Brittle.");
  } else {
    pdfjsLib.GlobalWorkerOptions.workerSrc = "";

    const container = document.getElementById("canvas-container");
    const statusEl = document.getElementById("status");
    const zoomLabel = document.getElementById("zoom-label");
    const pageIndicator = document.getElementById("page-indicator");

    const DPR = window.devicePixelRatio || 1;
    const ZOOM_MIN = 0.1;
    const ZOOM_MAX = 5.0;

    let pdfDoc = null;
    let scale = 1.0;   // display scale (not multiplied by DPR yet)
    let renderGen = 0; // incremented on each render pass to cancel stale ones
    let renderTimer = null;

    // ── Utilities ──────────────────────────────────────────────────────────

    function setStatus(msg) { statusEl.textContent = msg; }

    function clampScale(s) {
      return Math.max(ZOOM_MIN, Math.min(ZOOM_MAX, s));
    }

    function updateZoomLabel() {
      zoomLabel.textContent = Math.round(scale * 100) + "%";
    }

    // ── Zoom ───────────────────────────────────────────────────────────────

    async function applyScale(newScale) {
      scale = clampScale(newScale);
      updateZoomLabel();
      scheduleRender();
    }

    async function fitToWidth() {
      // Keep the current scale until a document is loaded, so callers that
      // feed the result back into applyScale never receive undefined.
      if (!pdfDoc) return scale;
      const page = await pdfDoc.getPage(1);
      const vp = page.getViewport({ scale: 1.0 });
      const avail = container.clientWidth - 40; // padding
      return clampScale(avail / vp.width);
    }

    // ── Rendering ──────────────────────────────────────────────────────────

    function scheduleRender() {
      clearTimeout(renderTimer);
      renderTimer = setTimeout(renderAll, 60);
    }

    async function renderAll() {
      if (!pdfDoc) return;
      const gen = ++renderGen;

      // Replace the entire container contents with fresh wrappers.
      container.innerHTML = "";
      const wrappers = [];
      for (let i = 1; i <= pdfDoc.numPages; i++) {
        const wrap = document.createElement("div");
        wrap.className = "page-wrapper";
        wrap.dataset.page = String(i);
        const canvas = document.createElement("canvas");
        wrap.appendChild(canvas);
        container.appendChild(wrap);
        wrappers.push(wrap);
      }

      // Render pages one by one; abort if a newer render was requested.
      for (let i = 0; i < wrappers.length; i++) {
        if (renderGen !== gen) return;
        await renderPage(wrappers[i], i + 1);
      }

      setStatus("Ready");
      refreshPageIndicator();
    }

    async function renderPage(wrapper, pageNum) {
      try {
        const page = await pdfDoc.getPage(pageNum);
        const vp = page.getViewport({ scale: scale * DPR });
        const canvas = wrapper.querySelector("canvas");
        if (!canvas) return;

        canvas.width = vp.width;
        canvas.height = vp.height;
        canvas.style.width = Math.round(vp.width / DPR) + "px";
        canvas.style.height = Math.round(vp.height / DPR) + "px";

        await page.render({ canvasContext: canvas.getContext("2d"), viewport: vp }).promise;
      } catch (e) {
        if (e?.name !== "RenderingCancelledException") console.warn("render:", e);
      }
    }

    // ── Page indicator (updates on scroll) ─────────────────────────────────

    function refreshPageIndicator() {
      if (!pdfDoc) return;
      const top = container.getBoundingClientRect().top;
      let current = 1;
      for (const wrap of container.querySelectorAll(".page-wrapper")) {
        if (wrap.getBoundingClientRect().bottom > top + 4) {
          current = parseInt(wrap.dataset.page, 10);
          break;
        }
      }
      pageIndicator.textContent = current + " / " + pdfDoc.numPages;
    }

    container.addEventListener("scroll", refreshPageIndicator, { passive: true });

    // ── Load ───────────────────────────────────────────────────────────────

    async function load() {
      if (!refId) { showError("No ref_id in URL."); return; }
      setStatus("Loading…");
      try {
        const url = "brittle://app/pdf?ref_id=" + encodeURIComponent(refId);
        pdfDoc = await pdfjsLib.getDocument({ url, disableWorker: true }).promise;

        scale = await fitToWidth();
        updateZoomLabel();
        setStatus("Rendering…");
        await renderAll();
      } catch (e) {
        showError("Could not load PDF: " + e.message);
      }
    }

    // ── Toolbar buttons ────────────────────────────────────────────────────

    document.getElementById("btn-zoom-out").addEventListener("click",
      () => applyScale(scale / 1.25));
    document.getElementById("btn-zoom-in").addEventListener("click",
      () => applyScale(scale * 1.25));
    document.getElementById("btn-zoom-fit").addEventListener("click",
      async () => applyScale(await fitToWidth()));

    // Ctrl+Scroll zoom
    container.addEventListener("wheel", ev => {
      if (!ev.ctrlKey) return;
      ev.preventDefault();
      applyScale(scale * (ev.deltaY < 0 ? 1.1 : 1 / 1.1));
    }, { passive: false });

    // Keyboard shortcuts (active when the iframe has focus)
    document.addEventListener("keydown", ev => {
      if (ev.target.tagName === "INPUT") return;
      if (ev.key === "+" || ev.key === "=") { ev.preventDefault(); applyScale(scale * 1.25); }
      if (ev.key === "-") { ev.preventDefault(); applyScale(scale / 1.25); }
      if (ev.key === "0") { ev.preventDefault(); fitToWidth().then(applyScale); }
    });

    load();
  }

  function showError(msg) {
    const b = document.getElementById("error-banner");
    b.textContent = msg;
    b.style.display = "block";
    document.getElementById("status").textContent = "Error";
  }
</script>
|
||||
</body>
|
||||
</html>
|
||||
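The `renderGen`/`gen` comparison in the render loop above is a generation-token cancellation pattern: each new render claims a fresh generation number, and any older loop notices the counter has moved on and bails out between pages. A minimal standalone sketch of that pattern (the names `renderAll` and `renderOne` here are illustrative, not the viewer's actual API):

```javascript
// Generation-token cancellation: the most recent caller wins,
// and any older in-flight loop stops at its next page boundary.
let renderGen = 0;

async function renderAll(pages, renderOne) {
  const gen = ++renderGen;               // claim the current generation
  for (const p of pages) {
    if (gen !== renderGen) return false; // a newer render superseded us
    await renderOne(p);
  }
  return true;                           // finished without being superseded
}
```

Calling `renderAll` a second time while the first is still awaiting a page makes the first call return `false` at its next check, so stale pages are never drawn over fresh ones.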
103
src-tauri/src/state.rs
Normal file
@@ -0,0 +1,103 @@
//! Managed application state for the Tauri backend.

use brittle_core::{store::FsStore, Brittle, BrittleError};
use std::sync::Mutex;

/// Shared state held by the Tauri runtime for the lifetime of the application.
///
/// `brittle` is `None` until the user opens or creates a repository.
pub struct AppState {
    pub brittle: Mutex<Option<Brittle<FsStore>>>,
}

impl Default for AppState {
    fn default() -> Self {
        Self::new()
    }
}

impl AppState {
    pub fn new() -> Self {
        Self {
            brittle: Mutex::new(None),
        }
    }

    /// Run a closure with mutable access to the open repository.
    ///
    /// Errors if:
    /// - the mutex is poisoned,
    /// - no repository is currently open, or
    /// - the operation itself returns an error.
    pub fn with_repo<T, F>(&self, f: F) -> Result<T, String>
    where
        F: FnOnce(&mut Brittle<FsStore>) -> Result<T, BrittleError>,
    {
        let mut guard = self
            .brittle
            .lock()
            .map_err(|_| "internal: state lock poisoned".to_string())?;
        let brittle = guard
            .as_mut()
            .ok_or_else(|| "no repository open".to_string())?;
        f(brittle).map_err(|e| e.to_string())
    }

    /// Run a closure with read-only access to the open repository.
    pub fn with_repo_read<T, F>(&self, f: F) -> Result<T, String>
    where
        F: FnOnce(&Brittle<FsStore>) -> Result<T, BrittleError>,
    {
        let guard = self
            .brittle
            .lock()
            .map_err(|_| "internal: state lock poisoned".to_string())?;
        let brittle = guard
            .as_ref()
            .ok_or_else(|| "no repository open".to_string())?;
        f(brittle).map_err(|e| e.to_string())
    }
}

#[cfg(test)]
mod tests {
    use super::*;
    use brittle_core::EntryType;

    fn open_state() -> (AppState, tempfile::TempDir) {
        let tmp = tempfile::tempdir().unwrap();
        let state = AppState::new();
        let brittle = Brittle::create(tmp.path()).unwrap();
        *state.brittle.lock().unwrap() = Some(brittle);
        (state, tmp)
    }

    #[test]
    fn with_repo_fails_when_no_repo_open() {
        let state = AppState::new();
        let result = state.with_repo(|_| Ok(()));
        assert_eq!(result.unwrap_err(), "no repository open");
    }

    #[test]
    fn with_repo_read_fails_when_no_repo_open() {
        let state = AppState::new();
        let result = state.with_repo_read(|_| Ok(()));
        assert_eq!(result.unwrap_err(), "no repository open");
    }

    #[test]
    fn with_repo_succeeds_when_open() {
        let (state, _tmp) = open_state();
        let result = state.with_repo(|b| b.create_reference("test2024", EntryType::Article));
        assert!(result.is_ok());
        assert_eq!(result.unwrap().cite_key, "test2024");
    }

    #[test]
    fn with_repo_read_can_list_references() {
        let (state, _tmp) = open_state();
        let result = state.with_repo_read(|b| b.list_references());
        assert!(result.is_ok());
    }
}
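The `Mutex<Option<T>>` plus closure shape in `state.rs` above is self-contained enough to sketch without the crate: callers never see the guard, and the "nothing open yet" case collapses into one error string. A standalone sketch of the same pattern, with a hypothetical `Repo` standing in for `Brittle<FsStore>`:

```rust
use std::sync::Mutex;

// Hypothetical stand-in for Brittle<FsStore>; any inner type fits the pattern.
struct Repo {
    refs: Vec<String>,
}

struct AppState {
    repo: Mutex<Option<Repo>>,
}

impl AppState {
    // Same shape as with_repo: lock, check for an open repo, run the closure.
    fn with_repo<T>(&self, f: impl FnOnce(&mut Repo) -> Result<T, String>) -> Result<T, String> {
        let mut guard = self
            .repo
            .lock()
            .map_err(|_| "internal: state lock poisoned".to_string())?;
        let repo = guard.as_mut().ok_or_else(|| "no repository open".to_string())?;
        f(repo)
    }
}

fn main() {
    let state = AppState { repo: Mutex::new(None) };
    // Closed: the closure is never invoked, only the error string comes back.
    assert_eq!(state.with_repo(|_| Ok(0)).unwrap_err(), "no repository open");

    // Open: the closure gets mutable access while the lock is held.
    *state.repo.lock().unwrap() = Some(Repo { refs: Vec::new() });
    state
        .with_repo(|r| {
            r.refs.push("turing1950".to_string());
            Ok(0)
        })
        .unwrap();
}
```

The payoff of the closure API is that the lock's lifetime is confined to one expression, so no Tauri command can accidentally hold the guard across an `await`.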
39
src-tauri/tauri.conf.json
Normal file
@@ -0,0 +1,39 @@
{
  "$schema": "https://schema.tauri.app/config/2",
  "productName": "Brittle",
  "version": "0.1.0",
  "identifier": "dev.brittle.app",
  "build": {
    "beforeDevCommand": {
      "script": "trunk serve",
      "cwd": "."
    },
    "beforeBuildCommand": {
      "script": "trunk build --release",
      "cwd": "."
    },
    "devUrl": "http://localhost:1420",
    "frontendDist": "../dist"
  },
  "app": {
    "withGlobalTauri": true,
    "windows": [
      {
        "label": "main",
        "title": "Brittle",
        "width": 1200,
        "height": 800,
        "minWidth": 800,
        "minHeight": 600
      }
    ],
    "security": {
      "csp": null
    }
  },
  "bundle": {
    "active": true,
    "targets": "all",
    "icon": []
  }
}
269
src-tauri/tests/commands.rs
Normal file
@@ -0,0 +1,269 @@
//! Integration tests for the Tauri command layer.
//!
//! These tests exercise `AppState` and the command logic end-to-end using a
//! real `Brittle<FsStore>` repository in a temp directory. Tauri IPC is not
//! involved — the functions under test are plain Rust.

use brittle_app::state::AppState;
use brittle_core::{Brittle, EntryType, Person};

fn open_state() -> (AppState, tempfile::TempDir) {
    let tmp = tempfile::tempdir().unwrap();
    let state = AppState::new();
    let brittle = Brittle::create(tmp.path()).unwrap();
    *state.brittle.lock().unwrap() = Some(brittle);
    (state, tmp)
}

// ── AppState ─────────────────────────────────────────────────────────────────

#[test]
fn no_repo_open_returns_error() {
    let state = AppState::new();
    let err = state.with_repo(|_| Ok(())).unwrap_err();
    assert_eq!(err, "no repository open");
}

#[test]
fn with_repo_propagates_brittle_errors() {
    let (state, _tmp) = open_state();
    // Trying to get a non-existent reference propagates the StoreError.
    let err = state
        .with_repo_read(|b| {
            use brittle_core::model::ids::ReferenceId;
            b.get_reference(ReferenceId::new())
        })
        .unwrap_err();
    assert!(!err.is_empty());
}

// ── Repository lifecycle ──────────────────────────────────────────────────────

#[test]
fn create_and_reopen_repository() {
    let tmp = tempfile::tempdir().unwrap();
    let state = AppState::new();

    // Create
    {
        let brittle = Brittle::create(tmp.path()).unwrap();
        *state.brittle.lock().unwrap() = Some(brittle);
    }

    // Close
    *state.brittle.lock().unwrap() = None;
    assert!(state.with_repo_read(|_| Ok(())).is_err());

    // Reopen
    {
        let brittle = Brittle::open(tmp.path()).unwrap();
        *state.brittle.lock().unwrap() = Some(brittle);
    }
    assert!(state.with_repo_read(|_| Ok(())).is_ok());
}

// ── Reference CRUD ───────────────────────────────────────────────────────────

#[test]
fn create_and_list_references() {
    let (state, _tmp) = open_state();

    state
        .with_repo(|b| b.create_reference("turing1950", EntryType::Article))
        .unwrap();
    state
        .with_repo(|b| b.create_reference("knuth1984", EntryType::Book))
        .unwrap();

    let refs = state.with_repo_read(|b| b.list_references()).unwrap();
    assert_eq!(refs.len(), 2);
    let keys: Vec<&str> = refs.iter().map(|r| r.cite_key.as_str()).collect();
    assert!(keys.contains(&"turing1950"));
    assert!(keys.contains(&"knuth1984"));
}

#[test]
fn delete_reference_removes_it() {
    let (state, _tmp) = open_state();

    let r = state
        .with_repo(|b| b.create_reference("gone2024", EntryType::Misc))
        .unwrap();

    state.with_repo(|b| b.delete_reference(r.id)).unwrap();

    let refs = state.with_repo_read(|b| b.list_references()).unwrap();
    assert!(refs.is_empty());
}

#[test]
fn set_and_remove_field() {
    let (state, _tmp) = open_state();

    let r = state
        .with_repo(|b| b.create_reference("fields2024", EntryType::Article))
        .unwrap();

    state
        .with_repo(|b| b.set_field(r.id, "title", "A Test Title"))
        .unwrap();

    let fetched = state.with_repo_read(|b| b.get_reference(r.id)).unwrap();
    assert_eq!(
        fetched.fields.get("title").map(String::as_str),
        Some("A Test Title")
    );

    state.with_repo(|b| b.remove_field(r.id, "title")).unwrap();

    let fetched2 = state.with_repo_read(|b| b.get_reference(r.id)).unwrap();
    assert!(!fetched2.fields.contains_key("title"));
}

#[test]
fn search_references_filters_by_query() {
    let (state, _tmp) = open_state();

    state
        .with_repo(|b| b.create_reference("turing1950", EntryType::Article))
        .unwrap();
    state
        .with_repo(|b| b.create_reference("knuth1984", EntryType::Book))
        .unwrap();

    let results = state
        .with_repo_read(|b| b.search_references("turing"))
        .unwrap();
    assert_eq!(results.len(), 1);
    assert_eq!(results[0].cite_key, "turing1950");
}

// ── Library ───────────────────────────────────────────────────────────────────

#[test]
fn create_nested_libraries_and_query_hierarchy() {
    let (state, _tmp) = open_state();

    let root = state.with_repo(|b| b.create_library("Root", None)).unwrap();
    let child = state
        .with_repo(|b| b.create_library("Child", Some(root.id)))
        .unwrap();

    let roots = state.with_repo_read(|b| b.list_root_libraries()).unwrap();
    assert_eq!(roots.len(), 1);
    assert_eq!(roots[0].id, root.id);

    let children = state
        .with_repo_read(|b| b.list_child_libraries(root.id))
        .unwrap();
    assert_eq!(children.len(), 1);
    assert_eq!(children[0].id, child.id);

    let ancestors = state
        .with_repo_read(|b| b.get_library_ancestors(child.id))
        .unwrap();
    assert_eq!(ancestors.len(), 1);
    assert_eq!(ancestors[0].id, root.id);
}

#[test]
fn add_reference_to_library_and_query() {
    let (state, _tmp) = open_state();

    let r = state
        .with_repo(|b| b.create_reference("member2024", EntryType::Article))
        .unwrap();
    let lib = state.with_repo(|b| b.create_library("Lib", None)).unwrap();

    state.with_repo(|b| b.add_to_library(lib.id, r.id)).unwrap();

    let members = state
        .with_repo_read(|b| b.list_library_references(lib.id))
        .unwrap();
    assert_eq!(members.len(), 1);
    assert_eq!(members[0].id, r.id);
}

#[test]
fn force_delete_library_removes_subtree() {
    let (state, _tmp) = open_state();

    let root = state.with_repo(|b| b.create_library("Root", None)).unwrap();
    state
        .with_repo(|b| b.create_library("Child", Some(root.id)))
        .unwrap();

    state
        .with_repo(|b| b.force_delete_library(root.id))
        .unwrap();

    let all = state.with_repo_read(|b| b.list_root_libraries()).unwrap();
    assert!(all.is_empty());
}

// ── BibTeX export ─────────────────────────────────────────────────────────────

#[test]
fn export_library_bibtex_contains_entries() {
    let (state, _tmp) = open_state();

    let mut r = state
        .with_repo(|b| b.create_reference("turing1950", EntryType::Article))
        .unwrap();
    r.authors.push(Person::new("Turing"));
    r.fields.insert(
        "title".into(),
        "Computing Machinery and Intelligence".into(),
    );
    r.fields.insert("journal".into(), "Mind".into());
    r.fields.insert("year".into(), "1950".into());
    let r = state.with_repo(|b| b.update_reference(r)).unwrap();

    let lib = state.with_repo(|b| b.create_library("CS", None)).unwrap();
    state.with_repo(|b| b.add_to_library(lib.id, r.id)).unwrap();

    let (bibtex, errors) = state
        .with_repo_read(|b| b.export_library_bibtex(lib.id))
        .unwrap();

    assert!(errors.is_empty());
    assert!(bibtex.contains("@article{turing1950,"));
    assert!(bibtex.contains("Computing Machinery and Intelligence"));
}

// ── Snapshot ──────────────────────────────────────────────────────────────────

#[test]
fn snapshot_and_discard_changes() {
    let (state, _tmp) = open_state();

    state
        .with_repo(|b| b.create_reference("snap2024", EntryType::Misc))
        .unwrap();

    let snap = state.with_repo(|b| b.create_snapshot("baseline")).unwrap();
    assert!(!snap.id.is_empty());

    let snapshots = state.with_repo_read(|b| b.list_snapshots()).unwrap();
    assert!(snapshots.iter().any(|s| s.message == "baseline"));

    // Delete the reference (uncommitted change).
    let r_id = state
        .with_repo_read(|b| b.list_references())
        .unwrap()
        .into_iter()
        .next()
        .unwrap()
        .id;
    state.with_repo(|b| b.delete_reference(r_id)).unwrap();

    assert!(state
        .with_repo_read(|b| b.has_uncommitted_changes())
        .unwrap());

    // Discard → reference comes back.
    state.with_repo(|b| b.discard_changes()).unwrap();

    let refs = state.with_repo_read(|b| b.list_references()).unwrap();
    assert_eq!(refs.len(), 1);
}
1988
src/Cargo.lock
generated
Normal file
File diff suppressed because it is too large
19
src/Cargo.toml
Normal file
@@ -0,0 +1,19 @@
[package]
name = "brittle-ui"
version = "0.1.0"
edition = "2021"

[workspace]

[dependencies]
brittle-keymap = { path = "../brittle-keymap" }
js-sys = "0.3"
leptos = { version = "0.7", features = ["csr"] }
wasm-bindgen = "0.2"
serde = { version = "1", features = ["derive"] }
serde-wasm-bindgen = "0.6"
wasm-bindgen-futures = "0.4"
web-sys = { version = "0.3", features = ["DataTransfer", "DragEvent", "KeyboardEvent"] }

[dev-dependencies]
serde_json = "1"
11
src/index.html
Normal file
@@ -0,0 +1,11 @@
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <title>Brittle</title>
    <link data-trunk rel="css" href="style.css" />
    <link data-trunk rel="rust" data-wasm-opt="z" />
  </head>
  <body></body>
</html>
147
src/src/command_bar.rs
Normal file
@@ -0,0 +1,147 @@
//! The command / search bar shown at the bottom of the screen.
//!
//! Visible only in [`AppMode::Command`] and [`AppMode::Search`] modes.
//! Handles its own keyboard events (Enter, Escape) and stops propagation
//! so the global keymap does not also process those keys.

use leptos::prelude::*;
use web_sys::KeyboardEvent;

use crate::{commands::{self, CommandEffect}, mode::AppMode, ThemeContext};

/// Shared search query, provided in context so other components can read it.
///
/// Set by the search bar when the user commits a search (`<Enter>`).
/// Cleared when the user cancels with `<Esc>`.
#[derive(Clone, Copy)]
pub struct SearchQuery(pub RwSignal<String>);

pub fn provide_search_query() -> RwSignal<String> {
    let sig = RwSignal::new(String::new());
    provide_context(SearchQuery(sig));
    sig
}

/// The command/search bar component.
///
/// Pass the application mode signal; the bar appears/disappears reactively.
#[component]
pub fn CommandBar(mode: RwSignal<AppMode>) -> impl IntoView {
    let input_ref = NodeRef::<leptos::html::Input>::new();
    let (input_val, set_input_val) = signal(String::new());
    let (status_msg, set_status_msg) = signal(Option::<String>::None);

    let search_query = use_context::<SearchQuery>().map(|sq| sq.0);
    let theme = use_context::<ThemeContext>().map(|tc| tc.0);
    let reload_trigger = use_context::<crate::ReloadTrigger>().map(|r| r.0);

    // Clear input and status when the mode changes; autofocus on open.
    Effect::new(move |_| {
        let m = mode.get();
        set_input_val.set(String::new());
        set_status_msg.set(None);
        if m != AppMode::Normal {
            if let Some(el) = input_ref.get() {
                let _ = el.focus();
            }
        }
    });

    let prefix = move || match mode.get() {
        AppMode::Normal => "",
        AppMode::Command => ":",
        AppMode::Search => "/",
    };

    let on_keydown = move |ev: KeyboardEvent| {
        match ev.key().as_str() {
            "Escape" => {
                ev.prevent_default();
                ev.stop_propagation();
                // Cancel: clear search query and return to normal.
                if let Some(sq) = search_query {
                    sq.set(String::new());
                }
                mode.set(AppMode::Normal);
            }
            "Enter" => {
                ev.prevent_default();
                ev.stop_propagation();
                let val = input_val.get_untracked();
                match mode.get_untracked() {
                    AppMode::Command => {
                        let outcome = commands::dispatch(&val);
                        if let Some(msg) = outcome.message {
                            // Show error; stay in command mode.
                            set_status_msg.set(Some(msg));
                            return;
                        }
                        if let Some(effect) = outcome.effect {
                            match effect {
                                CommandEffect::SetTheme(t) => {
                                    if let Some(sig) = theme {
                                        sig.set(t.clone());
                                    }
                                    leptos::task::spawn_local(async move {
                                        let _ = crate::tauri::set_theme(&t).await;
                                    });
                                }
                                CommandEffect::OpenRepository(path) => {
                                    let reload = reload_trigger;
                                    leptos::task::spawn_local(async move {
                                        match crate::tauri::open_repository(&path).await {
                                            Ok(()) => {
                                                if let Some(t) = reload {
                                                    t.update(|n| *n += 1);
                                                }
                                                mode.set(AppMode::Normal);
                                            }
                                            Err(e) => set_status_msg.set(Some(e)),
                                        }
                                    });
                                    return; // stay open until async resolves
                                }
                            }
                        }
                        mode.set(AppMode::Normal);
                    }
                    AppMode::Search => {
                        // Commit the search query and return to normal.
                        if let Some(sq) = search_query {
                            sq.set(val);
                        }
                        mode.set(AppMode::Normal);
                    }
                    AppMode::Normal => {}
                }
            }
            _ => {}
        }
    };

    let on_input = move |ev: web_sys::Event| {
        set_input_val.set(event_target_value(&ev));
        set_status_msg.set(None);
    };

    view! {
        <Show when=move || mode.get() != AppMode::Normal>
            <div class="command-bar">
                <span class="command-prefix">{prefix}</span>
                <input
                    node_ref=input_ref
                    type="text"
                    class="command-input"
                    prop:value=input_val
                    on:input=on_input
                    on:keydown=on_keydown
                    autocomplete="off"
                    spellcheck="false"
                />
                {move || status_msg.get().map(|m| view! {
                    <span class="command-status">" — "{m}</span>
                })}
            </div>
        </Show>
    }
}
128
src/src/commands.rs
Normal file
@@ -0,0 +1,128 @@
//! Command dispatch for command mode (the `:` prompt).
//!
//! Each command returns a [`DispatchOutcome`] containing an optional status
//! message and an optional side-effect for the UI layer to perform.
//! A `None` message means the command succeeded silently; `Some(msg)` is shown
//! in the command bar as an error or confirmation.
//!
//! Commands are simple strings; arguments follow a space: `:theme dark`.

/// A side-effect that the UI layer must perform after a successful command.
pub enum CommandEffect {
    /// Apply and persist the given theme (`"dark"` or `"light"`).
    SetTheme(String),
    /// Open the repository at the given filesystem path.
    OpenRepository(String),
}

/// Result of dispatching a command.
pub struct DispatchOutcome {
    /// Message to show in the command bar; `None` = silent success.
    pub message: Option<String>,
    /// Side-effect for the UI layer to carry out.
    pub effect: Option<CommandEffect>,
}

impl DispatchOutcome {
    fn ok() -> Self {
        Self { message: None, effect: None }
    }
    fn err(msg: impl Into<String>) -> Self {
        Self { message: Some(msg.into()), effect: None }
    }
    fn with_effect(effect: CommandEffect) -> Self {
        Self { message: None, effect: Some(effect) }
    }
}

/// Execute a command entered in command mode.
pub fn dispatch(input: &str) -> DispatchOutcome {
    let input = input.trim();
    if input.is_empty() {
        return DispatchOutcome::ok();
    }

    let (cmd, args) = input
        .split_once(' ')
        .map(|(c, a)| (c, a.trim()))
        .unwrap_or((input, ""));

    match cmd {
        // ── Lifecycle ────────────────────────────────────────────────────────
        "q" | "quit" => {
            // TODO Phase 8: call tauri::window::close()
            DispatchOutcome::ok()
        }

        // ── Theme ────────────────────────────────────────────────────────────
        "theme" => match args {
            "dark" | "light" => {
                DispatchOutcome::with_effect(CommandEffect::SetTheme(args.to_string()))
            }
            "" => DispatchOutcome::err("usage: theme <dark|light>"),
            a => DispatchOutcome::err(format!("unknown theme '{a}'")),
        },

        // ── Repository ───────────────────────────────────────────────────────
        "open" => {
            if args.is_empty() {
                DispatchOutcome::err("usage: open <path>")
            } else {
                DispatchOutcome::with_effect(CommandEffect::OpenRepository(args.to_string()))
            }
        }

        _ => DispatchOutcome::err(format!("unknown command '{cmd}'")),
    }
}

// ── Tests ─────────────────────────────────────────────────────────────────────

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn empty_input_is_silent() {
        assert!(dispatch("").message.is_none());
        assert!(dispatch(" ").message.is_none());
    }

    #[test]
    fn quit_is_silent() {
        assert!(dispatch("q").message.is_none());
        assert!(dispatch("quit").message.is_none());
    }

    #[test]
    fn valid_theme_produces_effect() {
        let dark = dispatch("theme dark");
        assert!(dark.message.is_none());
        assert!(matches!(&dark.effect, Some(CommandEffect::SetTheme(t)) if t == "dark"));

        let light = dispatch("theme light");
        assert!(light.message.is_none());
        assert!(matches!(&light.effect, Some(CommandEffect::SetTheme(t)) if t == "light"));
    }

    #[test]
    fn invalid_theme_returns_error() {
        let msg = dispatch("theme solarized").message;
        assert!(msg.is_some());
        assert!(msg.unwrap().contains("solarized"));
    }

    #[test]
    fn theme_without_args_returns_usage() {
        let msg = dispatch("theme").message;
        assert!(msg.is_some());
        assert!(msg.unwrap().contains("usage"));
    }

    #[test]
    fn unknown_command_returns_error() {
        let msg = dispatch("frobnicate").message;
        assert!(msg.is_some());
        assert!(msg.unwrap().contains("frobnicate"));
    }
}
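The `split_once(' ')` line in `dispatch` is the whole command grammar: the first word is the command, the trimmed remainder is its argument string. That step can be isolated as a standalone function (the name `parse` is just for this sketch, not part of the module):

```rust
/// Split ":cmd args" input the way dispatch() does: first word,
/// then the trimmed remainder (empty string if there is none).
fn parse(input: &str) -> (&str, &str) {
    let input = input.trim();
    input
        .split_once(' ')
        .map(|(c, a)| (c, a.trim()))
        .unwrap_or((input, ""))
}

fn main() {
    assert_eq!(parse("theme dark"), ("theme", "dark"));
    assert_eq!(parse("quit"), ("quit", ""));
    // Extra interior whitespace is absorbed by the trim on the remainder.
    assert_eq!(parse("  open   /tmp/repo  "), ("open", "/tmp/repo"));
}
```

Because the remainder is kept as one trimmed string rather than re-split, commands like `open` accept paths containing spaces without any quoting rules.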
337
src/src/lib_tab.rs
Normal file
@@ -0,0 +1,337 @@
//! The Library tab: three-pane layout with tree, list, and detail panes.

use std::collections::{HashMap, HashSet};

use brittle_keymap::actions;
use leptos::prelude::*;
use leptos::task::spawn_local;

use crate::{
    command_bar::SearchQuery,
    lib_tree::{flatten_tree, LibraryTree, TreeRow},
    models::{Library, LibraryId, Reference, ReferenceId, ReferenceSummary},
    pub_detail::PubDetail,
    pub_list::PubList,
    ActionEvent,
};

// ── Pane enum ─────────────────────────────────────────────────────────────────

/// Which of the three panes currently has keyboard focus.
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
pub enum Pane {
    Tree,
    List,
    Detail,
}

impl Pane {
    pub fn next(self) -> Self {
        match self {
            Pane::Tree => Pane::List,
            Pane::List => Pane::Detail,
            Pane::Detail => Pane::Tree,
        }
    }

    pub fn prev(self) -> Self {
        match self {
            Pane::Tree => Pane::Detail,
            Pane::List => Pane::Tree,
            Pane::Detail => Pane::List,
        }
    }
}

// ── Component ─────────────────────────────────────────────────────────────────

/// Root component for the Library tab.
///
/// Owns all UI state, wires keymap actions, and drives async data loading.
#[component]
pub fn LibTab() -> impl IntoView {
    // ── Context signals ───────────────────────────────────────────────────────
    let keymap_action = use_context::<crate::KeymapAction>()
        .expect("KeymapAction context missing")
        .0;

    let search_query = use_context::<SearchQuery>()
        .map(|sq| sq.0)
        .unwrap_or_else(|| RwSignal::new(String::new()));

    // ── UI state ──────────────────────────────────────────────────────────────
    let focused = RwSignal::new(Pane::Tree);

    // Tree state
    let root_libs = RwSignal::new(Vec::<Library>::new());
    let children_cache = RwSignal::new(HashMap::<String, Vec<Library>>::new());
    let expanded = RwSignal::new(HashSet::<String>::new());
    let tree_cursor = RwSignal::new(0usize);

    // List state
    let list_items = RwSignal::new(Vec::<ReferenceSummary>::new());
    let list_cursor = RwSignal::new(0usize);

    // Detail state
    let detail_ref = RwSignal::new(Option::<Reference>::None);

    // ── Derived / computed ────────────────────────────────────────────────────

    // Flattened visible tree rows (recomputed when tree data or expand set changes).
    let tree_rows = Memo::new(move |_| {
        flatten_tree(&root_libs.get(), &children_cache.get(), &expanded.get(), 0)
    });

    // The library selected by the tree cursor.
    let selected_library = Memo::new(move |_| {
        let pos = tree_cursor.get();
        tree_rows.with(|rows| rows.get(pos).map(|r| LibraryId(r.id.clone())))
    });

    // The reference ID selected by the list cursor.
    let selected_ref_id = Memo::new(move |_| {
        let pos = list_cursor.get();
        list_items.with(|items| items.get(pos).map(|r| r.id.clone()))
    });

    // ── Initial data load (and reload on repository change) ───────────────────
    let reload_trigger = use_context::<crate::ReloadTrigger>().map(|r| r.0);
    Effect::new(move |_| {
        if let Some(t) = reload_trigger { t.get(); } // track the trigger
        spawn_local(async move {
            match crate::tauri::list_root_libraries().await {
                Ok(libs) => root_libs.set(libs),
                Err(e) => leptos::logging::log!("load libraries: {e}"),
            }
        });
    });

    // ── Reactive data loads ───────────────────────────────────────────────────

    // Reload list whenever the selected library or search query changes.
    Effect::new(move |_| {
        let lib_id = selected_library.get();
        let query = search_query.get();
        spawn_local(async move {
            let result = match (&lib_id, query.is_empty()) {
                (Some(id), true) => crate::tauri::list_library_references_recursive(id).await,
                (Some(id), false) => crate::tauri::search_library_references(id, &query).await,
                (None, true) => crate::tauri::list_references().await,
                (None, false) => crate::tauri::search_references(&query).await,
            };
            match result {
                Ok(items) => list_items.set(items),
                Err(e) => {
                    leptos::logging::error!("load publications: {e}");
                    list_items.set(vec![]);
                }
            }
            list_cursor.set(0);
        });
    });

    // Load full reference when list cursor moves.
    Effect::new(move |_| {
        let ref_id: Option<ReferenceId> = selected_ref_id.get();
        spawn_local(async move {
            let loaded = match ref_id {
                Some(id) => crate::tauri::get_reference(&id).await.ok(),
                None => None,
            };
            detail_ref.set(loaded);
        });
    });

    // ── Keymap wiring ─────────────────────────────────────────────────────────
    Effect::new(move |_| {
        let Some(ev) = keymap_action.get() else { return };

        // ACTION_OPEN in the List or Detail pane opens a PDF tab for the selected reference.
        if ev.name == actions::ACTION_OPEN
            && matches!(focused.get_untracked(), Pane::List | Pane::Detail)
        {
            if let Some(ref_id) = selected_ref_id.get_untracked() {
                let title = detail_ref.with_untracked(|r| {
                    r.as_ref()
                        .map(|r| r.cite_key.clone())
                        .unwrap_or_else(|| ref_id.0.clone())
                });
                if let Some(ctx) = use_context::<crate::OpenPdfContext>() {
                    ctx.0.set(Some(crate::PdfOpenRequest {
                        ref_id: ref_id.0.clone(),
                        title,
                    }));
                }
            }
            return;
        }

        handle_action(
            &ev,
            focused,
            TreeState { cursor: tree_cursor, rows: tree_rows, expanded },
            ListState { cursor: list_cursor, items: list_items },
        );
    });

    // ── View ──────────────────────────────────────────────────────────────────
    view! {
        <div class="lib-tab">
            <div class="pane pane-left">
                <LibraryTree
                    root_libs=root_libs
                    children_cache=children_cache
                    expanded=expanded
                    cursor=tree_cursor
                    focused=focused
                />
            </div>
            <div class="pane pane-center">
                <PubList
                    items=list_items
                    cursor=list_cursor
                    focused=focused
                />
            </div>
            <div class="pane pane-right">
                <PubDetail reference=detail_ref />
            </div>
        </div>
    }
}

// ── Action handler ────────────────────────────────────────────────────────────

struct TreeState {
    cursor: RwSignal<usize>,
    rows: Memo<Vec<TreeRow>>,
    expanded: RwSignal<HashSet<String>>,
}

struct ListState {
    cursor: RwSignal<usize>,
    items: RwSignal<Vec<ReferenceSummary>>,
}

fn handle_action(
    ev: &ActionEvent,
    focused: RwSignal<Pane>,
    tree: TreeState,
    list: ListState,
) {
    let count = (ev.count as usize).max(1);
    let cur = focused.get_untracked();

    match ev.name.as_str() {
        // ── Focus switching ───────────────────────────────────────────────────
        actions::FOCUS_LEFT => focused.set(Pane::Tree),
        actions::FOCUS_CENTER => focused.set(Pane::List),
        actions::FOCUS_RIGHT => focused.set(Pane::Detail),
        actions::FOCUS_NEXT => focused.update(|p| *p = p.next()),
        actions::FOCUS_PREV => focused.update(|p| *p = p.prev()),

        // ── Tree navigation ───────────────────────────────────────────────────
        actions::NAV_DOWN if cur == Pane::Tree => {
            let len = tree.rows.with_untracked(Vec::len);
|
||||
if len > 0 {
|
||||
tree.cursor.update(|c| *c = (*c + count).min(len - 1));
|
||||
}
|
||||
}
|
||||
actions::NAV_UP if cur == Pane::Tree => {
|
||||
tree.cursor.update(|c| *c = c.saturating_sub(count));
|
||||
}
|
||||
actions::NAV_TOP if cur == Pane::Tree => tree.cursor.set(0),
|
||||
actions::NAV_BOTTOM if cur == Pane::Tree => {
|
||||
let len = tree.rows.with_untracked(Vec::len);
|
||||
if len > 0 {
|
||||
tree.cursor.set(len - 1);
|
||||
}
|
||||
}
|
||||
actions::NAV_PAGE_DOWN if cur == Pane::Tree => {
|
||||
let len = tree.rows.with_untracked(Vec::len);
|
||||
if len > 0 {
|
||||
tree.cursor.update(|c| *c = (*c + 10 * count).min(len - 1));
|
||||
}
|
||||
}
|
||||
actions::NAV_PAGE_UP if cur == Pane::Tree => {
|
||||
tree.cursor.update(|c| *c = c.saturating_sub(10 * count));
|
||||
}
|
||||
|
||||
// ── Tree expand / collapse ────────────────────────────────────────────
|
||||
n if (n == actions::TREE_EXPAND || n == actions::ACTION_OPEN) && cur == Pane::Tree => {
|
||||
tree_expand(tree.cursor, tree.rows, tree.expanded);
|
||||
}
|
||||
actions::TREE_COLLAPSE if cur == Pane::Tree => {
|
||||
tree_collapse(tree.cursor, tree.rows, tree.expanded);
|
||||
}
|
||||
actions::TREE_TOGGLE if cur == Pane::Tree => {
|
||||
let id = tree.rows.with_untracked(|rows| {
|
||||
rows.get(tree.cursor.get_untracked()).map(|r| r.id.clone())
|
||||
});
|
||||
if let Some(id) = id {
|
||||
let is_open = tree.expanded.with_untracked(|e| e.contains(&id));
|
||||
if is_open {
|
||||
tree.expanded.update(|e| { e.remove(&id); });
|
||||
} else {
|
||||
tree.expanded.update(|e| { e.insert(id.clone()); });
|
||||
// Child loading is handled reactively by LibraryTree's Effect.
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// ── List navigation ───────────────────────────────────────────────────
|
||||
actions::NAV_DOWN if cur == Pane::List => {
|
||||
let len = list.items.with_untracked(Vec::len);
|
||||
if len > 0 {
|
||||
list.cursor.update(|c| *c = (*c + count).min(len - 1));
|
||||
}
|
||||
}
|
||||
actions::NAV_UP if cur == Pane::List => {
|
||||
list.cursor.update(|c| *c = c.saturating_sub(count));
|
||||
}
|
||||
actions::NAV_TOP if cur == Pane::List => list.cursor.set(0),
|
||||
actions::NAV_BOTTOM if cur == Pane::List => {
|
||||
let len = list.items.with_untracked(Vec::len);
|
||||
if len > 0 {
|
||||
list.cursor.set(len - 1);
|
||||
}
|
||||
}
|
||||
actions::NAV_PAGE_DOWN if cur == Pane::List => {
|
||||
let len = list.items.with_untracked(Vec::len);
|
||||
if len > 0 {
|
||||
list.cursor.update(|c| *c = (*c + 10 * count).min(len - 1));
|
||||
}
|
||||
}
|
||||
actions::NAV_PAGE_UP if cur == Pane::List => {
|
||||
list.cursor.update(|c| *c = c.saturating_sub(10 * count));
|
||||
}
|
||||
|
||||
_ => {}
|
||||
}
|
||||
}
|
||||
|
||||
// ── Tree helpers ──────────────────────────────────────────────────────────────
|
||||
|
||||
fn tree_expand(
|
||||
cursor: RwSignal<usize>,
|
||||
rows: Memo<Vec<TreeRow>>,
|
||||
expanded: RwSignal<HashSet<String>>,
|
||||
) {
|
||||
let id = rows.with_untracked(|r| r.get(cursor.get_untracked()).map(|row| row.id.clone()));
|
||||
if let Some(id) = id {
|
||||
expanded.update(|e| { e.insert(id.clone()); });
|
||||
// Child loading is handled reactively by LibraryTree's Effect.
|
||||
}
|
||||
}
|
||||
|
||||
fn tree_collapse(
|
||||
cursor: RwSignal<usize>,
|
||||
rows: Memo<Vec<TreeRow>>,
|
||||
expanded: RwSignal<HashSet<String>>,
|
||||
) {
|
||||
let id = rows.with_untracked(|r| r.get(cursor.get_untracked()).map(|row| row.id.clone()));
|
||||
if let Some(id) = id {
|
||||
expanded.update(|e| { e.remove(&id); });
|
||||
}
|
||||
}
|
||||
|
||||
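The navigation arms in `handle_action` share two pieces of cursor arithmetic: downward motion clamps to the last valid index, upward motion saturates at zero. A standalone sketch of just that arithmetic (the helper names `nav_down`/`nav_up` are illustrative, not part of the source):

```rust
/// Move a cursor `count` steps down a list of `len` items, clamped to the end.
/// An empty list leaves the cursor unchanged, mirroring the `len > 0` guards.
fn nav_down(cursor: usize, count: usize, len: usize) -> usize {
    if len == 0 { cursor } else { (cursor + count).min(len - 1) }
}

/// Move a cursor `count` steps up, saturating at the top.
fn nav_up(cursor: usize, count: usize) -> usize {
    cursor.saturating_sub(count)
}

fn main() {
    assert_eq!(nav_down(3, 10, 5), 4); // clamped to last index
    assert_eq!(nav_up(3, 10), 0);      // saturates at zero
    assert_eq!(nav_down(0, 1, 0), 0);  // empty list: no movement
}
```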
323
src/src/lib_tree.rs
Normal file
@@ -0,0 +1,323 @@
//! Library tree: flattening logic and the left-pane component.

use std::collections::{HashMap, HashSet};

use leptos::prelude::*;
use leptos::task::spawn_local;
use web_sys::DragEvent;

use crate::models::{Library, LibraryId, ReferenceId};

// ── Helpers ───────────────────────────────────────────────────────────────────

/// Load `id`'s children from the backend and cache them, unless already cached.
pub fn load_children_if_needed(id: String, children_cache: RwSignal<HashMap<String, Vec<Library>>>) {
    let already = children_cache.with_untracked(|c| c.contains_key(&id));
    if !already {
        spawn_local(async move {
            let lib_id = LibraryId(id.clone());
            match crate::tauri::list_child_libraries(&lib_id).await {
                Ok(children) => children_cache.update(|c| { c.insert(id, children); }),
                Err(e) => leptos::logging::error!("load children ({id}): {e}"),
            }
        });
    }
}

// ── Tree logic ────────────────────────────────────────────────────────────────

/// A single visible row in the rendered library tree.
#[derive(Debug, Clone, PartialEq, Eq)]
pub struct TreeRow {
    /// Library ID string (UUID).
    pub id: String,
    /// Display name.
    pub name: String,
    /// Nesting depth (0 = root).
    pub depth: usize,
    /// Whether this node is currently expanded.
    pub expanded: bool,
    /// `true` if children have not been loaded yet, or the node has children.
    /// `false` only when children were loaded and the result was empty.
    pub may_have_children: bool,
}

/// Flatten the visible portion of the library hierarchy into an ordered list.
///
/// Only nodes whose ancestors are all expanded appear in the output.
/// Children are ordered as returned by `children_cache`.
pub fn flatten_tree(
    libs: &[Library],
    children_cache: &HashMap<String, Vec<Library>>,
    expanded: &HashSet<String>,
    depth: usize,
) -> Vec<TreeRow> {
    let mut rows = Vec::new();
    for lib in libs {
        let id = lib.id.0.clone();
        let is_expanded = expanded.contains(&id);
        let children = children_cache.get(&id);
        let may_have_children = children.is_none_or(|c| !c.is_empty());

        rows.push(TreeRow {
            id: id.clone(),
            name: lib.name.clone(),
            depth,
            expanded: is_expanded,
            may_have_children,
        });

        if is_expanded {
            if let Some(child_libs) = children {
                rows.extend(flatten_tree(child_libs, children_cache, expanded, depth + 1));
            }
        }
    }
    rows
}
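The `may_have_children` field in `TreeRow` encodes a tri-state: an absent cache entry means "not loaded yet" and is treated as "may have children" so the expand arrow stays visible; only a loaded-and-empty child list marks a confirmed leaf. A minimal standalone sketch of that rule (the helper name and string slices are illustrative, not part of the source):

```rust
/// Decide whether to draw an expand arrow for a node whose children may or
/// may not have been fetched yet. `None` = not loaded; `Some(&[])` = loaded
/// and empty (a confirmed leaf).
fn may_have_children(loaded: Option<&[&str]>) -> bool {
    loaded.map_or(true, |c| !c.is_empty())
}

fn main() {
    let not_loaded: Option<&[&str]> = None;
    let loaded_empty: &[&str] = &[];
    let loaded_some: &[&str] = &["child"];

    assert!(may_have_children(not_loaded));          // unknown: keep the arrow
    assert!(may_have_children(Some(loaded_some)));   // known children
    assert!(!may_have_children(Some(loaded_empty))); // confirmed leaf
}
```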
// ── Component ─────────────────────────────────────────────────────────────────

/// Left pane: renders the library tree.
///
/// Navigation (j/k, expand/collapse) is driven entirely from the parent via
/// `cursor` and `expanded`. Click on a row updates `cursor`.
///
/// Each tree row is also a drag-drop target: dropping a publication item onto a
/// row calls `add_to_library`, adding the reference to that library.
#[component]
pub fn LibraryTree(
    root_libs: RwSignal<Vec<Library>>,
    children_cache: RwSignal<HashMap<String, Vec<Library>>>,
    expanded: RwSignal<HashSet<String>>,
    cursor: RwSignal<usize>,
    focused: RwSignal<crate::lib_tab::Pane>,
) -> impl IntoView {
    use crate::lib_tab::Pane;
    use leptos::either::Either;

    let rows = Memo::new(move |_| {
        flatten_tree(&root_libs.get(), &children_cache.get(), &expanded.get(), 0)
    });

    // Reactively load children for any newly-expanded node.
    // Using an Effect guarantees this runs in a proper reactive owner context,
    // which makes `spawn_local` and signal updates flush correctly.
    Effect::new(move |_| {
        let exp = expanded.get(); // subscribe to expansion changes
        for id in exp {
            load_children_if_needed(id, children_cache);
        }
    });

    // Which tree-row ID (if any) is the current drag-over target.
    let drag_over_id: RwSignal<Option<String>> = RwSignal::new(None);

    view! {
        <div class="tree-pane" class:pane-focused=move || focused.get() == Pane::Tree>
            {move || {
                let row_list = rows.get();
                if row_list.is_empty() {
                    Either::Left(view! {
                        <div class="empty-state">"Open a repository to see libraries"</div>
                    })
                } else {
                    let cursor_pos = cursor.get();
                    Either::Right(view! {
                        <ul class="tree-list">
                            {row_list.into_iter().enumerate().map(|(i, row)| {
                                let indent_px = row.depth * 16;
                                let icon = if row.may_have_children {
                                    if row.expanded { "▾ " } else { "▸ " }
                                } else {
                                    "  "
                                };
                                let is_cursor = i == cursor_pos;
                                let row_may_have_children = row.may_have_children;

                                // Clone row.id for each closure that captures it.
                                let row_id_click = row.id.clone();
                                let row_id_class = row.id.clone();
                                let row_id_over = row.id.clone();
                                let row_id_drop = row.id.clone();

                                view! {
                                    <li
                                        class="tree-item"
                                        class:tree-cursor=is_cursor
                                        class:tree-drop-target=move || {
                                            drag_over_id.get().as_deref() == Some(row_id_class.as_str())
                                        }
                                        style=format!("padding-left: {indent_px}px")
                                        on:click=move |_| {
                                            cursor.set(i);
                                            if row_may_have_children {
                                                let is_open = expanded.with_untracked(|e| e.contains(&row_id_click));
                                                if is_open {
                                                    expanded.update(|e| { e.remove(&row_id_click); });
                                                } else {
                                                    expanded.update(|e| { e.insert(row_id_click.clone()); });
                                                    // Child loading is handled by the reactive Effect above.
                                                }
                                            }
                                        }
                                        on:dragover=move |ev: DragEvent| {
                                            ev.prevent_default();
                                            drag_over_id.set(Some(row_id_over.clone()));
                                        }
                                        on:dragleave=move |_: DragEvent| {
                                            drag_over_id.set(None);
                                        }
                                        on:drop=move |ev: DragEvent| {
                                            ev.prevent_default();
                                            drag_over_id.set(None);
                                            let Some(dt) = ev.data_transfer() else { return };
                                            let Ok(ref_id_str) = dt.get_data("application/brittle-ref-id") else { return };
                                            if ref_id_str.is_empty() { return }
                                            let lib_id = LibraryId(row_id_drop.clone());
                                            let ref_id = ReferenceId(ref_id_str);
                                            spawn_local(async move {
                                                if let Err(e) = crate::tauri::add_to_library(&lib_id, &ref_id).await {
                                                    leptos::logging::warn!("add_to_library: {e}");
                                                }
                                            });
                                        }
                                    >
                                        <span class="tree-icon">{icon}</span>
                                        <span class="tree-name">{row.name.clone()}</span>
                                    </li>
                                }
                            }).collect::<Vec<_>>()}
                        </ul>
                    })
                }
            }}
        </div>
    }
}

// ── Tests ─────────────────────────────────────────────────────────────────────

#[cfg(test)]
mod tests {
    use super::*;
    use crate::models::LibraryId;

    fn lib(id: &str, name: &str, parent: Option<&str>) -> Library {
        Library {
            id: LibraryId(id.into()),
            name: name.into(),
            parent_id: parent.map(|p| LibraryId(p.into())),
        }
    }

    #[test]
    fn empty_root_produces_empty_rows() {
        let rows = flatten_tree(&[], &Default::default(), &Default::default(), 0);
        assert!(rows.is_empty());
    }

    #[test]
    fn single_root_unloaded_children() {
        let root = lib("r", "Root", None);
        let rows = flatten_tree(&[root], &Default::default(), &Default::default(), 0);
        assert_eq!(rows.len(), 1);
        assert_eq!(rows[0].id, "r");
        assert_eq!(rows[0].name, "Root");
        assert_eq!(rows[0].depth, 0);
        assert!(!rows[0].expanded);
        // Unknown children → may_have_children = true
        assert!(rows[0].may_have_children);
    }

    #[test]
    fn collapsed_node_hides_children() {
        let parent = lib("p", "Parent", None);
        let child = lib("c", "Child", Some("p"));
        let mut cache = HashMap::new();
        cache.insert("p".into(), vec![child]);
        // Not in expanded set → children hidden
        let rows = flatten_tree(&[parent], &cache, &Default::default(), 0);
        assert_eq!(rows.len(), 1);
        assert!(rows[0].may_have_children);
    }

    #[test]
    fn expanded_node_shows_children() {
        let parent = lib("p", "Parent", None);
        let child = lib("c", "Child", Some("p"));
        let mut cache = HashMap::new();
        cache.insert("p".into(), vec![child]);
        let mut expanded = HashSet::new();
        expanded.insert("p".into());

        let rows = flatten_tree(&[parent], &cache, &expanded, 0);
        assert_eq!(rows.len(), 2);
        assert_eq!(rows[0].name, "Parent");
        assert!(rows[0].expanded);
        assert_eq!(rows[1].name, "Child");
        assert_eq!(rows[1].depth, 1);
    }

    #[test]
    fn loaded_empty_children_marks_leaf() {
        let node = lib("n", "Node", None);
        let mut cache = HashMap::new();
        cache.insert("n".into(), vec![]); // explicitly empty
        let rows = flatten_tree(&[node], &cache, &Default::default(), 0);
        assert_eq!(rows.len(), 1);
        assert!(!rows[0].may_have_children);
    }

    #[test]
    fn multi_level_nesting() {
        let root = lib("r", "Root", None);
        let mid = lib("m", "Mid", Some("r"));
        let leaf = lib("l", "Leaf", Some("m"));

        let mut cache = HashMap::new();
        cache.insert("r".into(), vec![mid]);
        cache.insert("m".into(), vec![leaf]);

        let mut expanded = HashSet::new();
        expanded.insert("r".into());
        expanded.insert("m".into());

        let rows = flatten_tree(&[root], &cache, &expanded, 0);
        assert_eq!(rows.len(), 3);
        assert_eq!(rows[0].depth, 0);
        assert_eq!(rows[1].depth, 1);
        assert_eq!(rows[2].depth, 2);
        assert_eq!(rows[2].name, "Leaf");
    }

    #[test]
    fn multiple_roots_ordered() {
        let a = lib("a", "Alpha", None);
        let b = lib("b", "Beta", None);
        let rows = flatten_tree(&[a, b], &Default::default(), &Default::default(), 0);
        assert_eq!(rows.len(), 2);
        assert_eq!(rows[0].name, "Alpha");
        assert_eq!(rows[1].name, "Beta");
    }

    #[test]
    fn only_expanded_subtrees_included() {
        // Parent expanded, but sibling collapsed
        let p = lib("p", "P", None);
        let s = lib("s", "S", None); // sibling root, also collapsed
        let c = lib("c", "C", Some("p"));
        let mut cache = HashMap::new();
        cache.insert("p".into(), vec![c]);
        cache.insert("s".into(), vec![lib("sc", "SC", Some("s"))]);

        let mut expanded = HashSet::new();
        expanded.insert("p".into()); // expand only P

        let rows = flatten_tree(&[p, s], &cache, &expanded, 0);
        assert_eq!(rows.len(), 3); // P, C, S (SC hidden)
        assert_eq!(rows[0].name, "P");
        assert_eq!(rows[1].name, "C");
        assert_eq!(rows[2].name, "S");
    }
}
419
src/src/main.rs
Normal file
@@ -0,0 +1,419 @@
mod command_bar;
mod commands;
mod lib_tab;
mod lib_tree;
mod models;
mod mode;
mod pdf_viewer;
mod pub_detail;
mod pub_list;
mod tab_bar;
mod tauri;

use std::{
    cell::{Cell, RefCell},
    rc::Rc,
};

use brittle_keymap::{actions, Key, KeyCode, KeymapState, Outcome, default_bindings};
use command_bar::{CommandBar, provide_search_query};
use leptos::prelude::*;
use lib_tab::LibTab;
use mode::{AppMode, provide_mode};
use pdf_viewer::PdfViewer;
use tab_bar::TabBar;
use web_sys::KeyboardEvent;

fn main() {
    leptos::mount::mount_to_body(App);
}

// ── Action event ──────────────────────────────────────────────────────────────

/// A dispatched keymap action.
///
/// The `seq` field increments monotonically so that firing the same action
/// twice in succession produces a distinct signal value and triggers
/// reactive updates both times.
#[derive(Clone, PartialEq, Eq)]
pub struct ActionEvent {
    pub name: String,
    pub count: u32,
    seq: u32,
}

thread_local! {
    static ACTION_SEQ: Cell<u32> = const { Cell::new(0) };
}

pub fn next_seq() -> u32 {
    ACTION_SEQ.with(|s| {
        let n = s.get();
        s.set(n.wrapping_add(1));
        n
    })
}

/// Context handle giving components read access to the last dispatched action.
#[derive(Clone, Copy)]
pub struct KeymapAction(pub ReadSignal<Option<ActionEvent>>);

// ── Tab types ─────────────────────────────────────────────────────────────────

/// A single open tab in the application.
#[derive(Clone, PartialEq, Eq)]
pub enum AppTab {
    /// The persistent library / reference browser tab (always index 0).
    Library,
    /// A PDF viewer tab for a specific reference.
    Pdf {
        /// UUID string of the reference whose PDF is being viewed.
        ref_id: String,
        /// Short display name shown in the tab (typically the cite key).
        title: String,
    },
}

// ── Theme context ─────────────────────────────────────────────────────────────

/// Context handle for the current colour theme (`"dark"` or `"light"`).
///
/// Writable so `CommandBar` can update it when the user types `:theme …`.
#[derive(Clone, Copy)]
pub struct ThemeContext(pub RwSignal<String>);

fn provide_theme() {
    let theme = RwSignal::new("dark".to_string());
    provide_context(ThemeContext(theme));

    // Reactively apply the `data-theme` attribute to `<html>`.
    Effect::new(move |_| {
        let t = theme.get();
        if let Some(doc) = web_sys::window().and_then(|w| w.document()) {
            if let Some(el) = doc.document_element() {
                let _ = el.set_attribute("data-theme", &t);
            }
        }
    });

    // Load the persisted theme from backend config; replace the default if
    // found. Runs after the effect is live, so the DOM is updated immediately
    // once the IPC call returns.
    leptos::task::spawn_local(async move {
        if let Ok(t) = crate::tauri::get_theme().await {
            theme.set(t);
        }
    });
}

// ── Reload trigger ────────────────────────────────────────────────────────────

/// Incrementing counter that components watch to know when to reload their data.
///
/// Incremented whenever the open repository changes (`:open <path>`).
#[derive(Clone, Copy)]
pub struct ReloadTrigger(pub RwSignal<u32>);

// ── PDF open context ──────────────────────────────────────────────────────────

/// Request to open (or switch to) a PDF tab, posted by child components.
#[derive(Clone, PartialEq, Eq)]
pub struct PdfOpenRequest {
    pub ref_id: String,
    pub title: String,
}

/// Context handle that child components write to when they want to open a PDF tab.
#[derive(Clone, Copy)]
pub struct OpenPdfContext(pub RwSignal<Option<PdfOpenRequest>>);

// ── Keymap provider ───────────────────────────────────────────────────────────

fn provide_keymap() {
    let state: Rc<RefCell<KeymapState>> =
        Rc::new(RefCell::new(KeymapState::new(default_bindings())));

    let (action_read, action_write) = signal::<Option<ActionEvent>>(None);
    provide_context(KeymapAction(action_read));

    let state_for_listener = state.clone();
    let listener = window_event_listener(leptos::ev::keydown, move |ev: KeyboardEvent| {
        if is_input_focused() {
            return;
        }
        if let Some(key) = key_from_event(&ev) {
            ev.prevent_default();
            let outcome = state_for_listener.borrow_mut().process(key);
            if let Outcome::Action { name, count } = outcome {
                action_write.set(Some(ActionEvent { name, count, seq: next_seq() }));
            }
        }
    });

    on_cleanup(move || listener.remove());

    // Asynchronously load user keybinding overrides and hot-swap the state.
    // Runs after the listener is already live, so the app is usable with
    // defaults while the async IPC call is in flight.
    let state_for_overrides = state.clone();
    leptos::task::spawn_local(async move {
        let overrides = match crate::tauri::get_keybindings().await {
            Ok(map) if !map.is_empty() => map,
            _ => return, // no config or error — keep defaults
        };

        let mut bindings = default_bindings();
        // Config keys use snake_case; action names use dot.notation.
        let pairs: Vec<(String, String)> = overrides
            .into_iter()
            .map(|(k, v)| (action_key_to_name(&k), v))
            .collect();
        bindings.apply_overrides(pairs.iter().map(|(k, v)| (k.as_str(), v.as_str())));
        *state_for_overrides.borrow_mut() = KeymapState::new(bindings);
    });
}
/// Convert a config keybinding key (snake_case) to an action name (dot.notation).
///
/// ```ignore
/// assert_eq!(action_key_to_name("tab_next"), "tab.next");
/// assert_eq!(action_key_to_name("nav_page_down"), "nav.page.down");
/// ```
fn action_key_to_name(key: &str) -> String {
    key.replace('_', ".")
}

// ── Helpers ───────────────────────────────────────────────────────────────────

fn is_input_focused() -> bool {
    let Some(doc) = web_sys::window().and_then(|w| w.document()) else {
        return false;
    };
    let Some(el) = doc.active_element() else {
        return false;
    };
    let tag = el.tag_name().to_uppercase();
    if tag == "INPUT" || tag == "TEXTAREA" {
        return true;
    }
    el.get_attribute("contenteditable")
        .map(|v| v != "false")
        .unwrap_or(false)
}

fn key_from_event(ev: &KeyboardEvent) -> Option<Key> {
    let key_str = ev.key();
    let code = match key_str.as_str() {
        "Enter" => KeyCode::Enter,
        "Escape" => KeyCode::Escape,
        "Tab" => KeyCode::Tab,
        "Backspace" => KeyCode::Backspace,
        "Delete" => KeyCode::Delete,
        " " => KeyCode::Space,
        "ArrowUp" => KeyCode::ArrowUp,
        "ArrowDown" => KeyCode::ArrowDown,
        "ArrowLeft" => KeyCode::ArrowLeft,
        "ArrowRight" => KeyCode::ArrowRight,
        "Home" => KeyCode::Home,
        "End" => KeyCode::End,
        "PageUp" => KeyCode::PageUp,
        "PageDown" => KeyCode::PageDown,
        s if s.starts_with('F') && s.len() > 1 => KeyCode::F(s[1..].parse::<u8>().ok()?),
        s if s.chars().count() == 1 => KeyCode::Char(s.chars().next().unwrap()),
        _ => return None,
    };

    // For Char keys the character already encodes the shift state:
    // ':' is the shifted ';', 'G' is the shifted 'g', etc.
    // Including shift: true would produce <S-:> which never matches a binding.
    // For special keys (Tab, arrows, …) shift must be preserved (<S-Tab>).
    let shift = match code {
        KeyCode::Char(_) => false,
        _ => ev.shift_key(),
    };

    Some(Key {
        code,
        ctrl: ev.ctrl_key(),
        shift,
        alt: ev.alt_key(),
        meta: ev.meta_key(),
    })
}
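The shift-normalisation rule in `key_from_event` is worth isolating: for printable characters the character itself already encodes shift (':' is the shifted ';'), so the modifier is dropped; for special keys it is preserved. A minimal standalone sketch of that rule (the local `KeyCode` enum and `effective_shift` helper are illustrative stand-ins for the `brittle_keymap` types):

```rust
/// Illustrative subset of the keymap's key codes.
#[derive(Debug, PartialEq)]
enum KeyCode {
    Char(char),
    Tab,
}

/// Drop the shift modifier for printable characters (the char encodes it
/// already); keep it for special keys so <S-Tab> still matches bindings.
fn effective_shift(code: &KeyCode, raw_shift: bool) -> bool {
    match code {
        KeyCode::Char(_) => false,
        _ => raw_shift,
    }
}

fn main() {
    assert!(!effective_shift(&KeyCode::Char(':'), true)); // ':' already implies shift
    assert!(effective_shift(&KeyCode::Tab, true));        // <S-Tab> preserved
}
```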

// ── Root component ────────────────────────────────────────────────────────────

#[component]
fn App() -> impl IntoView {
    provide_keymap();
    provide_theme();
    let mode = provide_mode();
    provide_search_query();

    // ── Tab state ─────────────────────────────────────────────────────────────
    // The Library tab is always present at index 0 and cannot be closed.
    let tabs = RwSignal::new(vec![AppTab::Library]);
    let active_tab = RwSignal::new(0usize);

    // Provide the open-PDF context so LibTab can post open requests.
    let open_pdf_req = RwSignal::new(Option::<PdfOpenRequest>::None);
    provide_context(OpenPdfContext(open_pdf_req));

    // Reload trigger: increment when a repository is opened.
    provide_context(ReloadTrigger(RwSignal::new(0u32)));

    // ── Keymap wiring ─────────────────────────────────────────────────────────
    let keymap_action = use_context::<KeymapAction>().unwrap().0;
    Effect::new(move |_| {
        let Some(ev) = keymap_action.get() else { return };
        match ev.name.as_str() {
            // Mode switching
            actions::MODE_COMMAND => mode.set(AppMode::Command),
            actions::MODE_SEARCH => mode.set(AppMode::Search),
            actions::MODE_NORMAL => mode.set(AppMode::Normal),

            // Tab cycling
            actions::TAB_NEXT => {
                let len = tabs.with_untracked(Vec::len);
                if len > 1 {
                    active_tab.update(|i| *i = (*i + 1) % len);
                }
            }
            actions::TAB_PREV => {
                let len = tabs.with_untracked(Vec::len);
                if len > 1 {
                    active_tab.update(|i| *i = if *i == 0 { len - 1 } else { *i - 1 });
                }
            }

            // Close the active tab (Library tab is immortal)
            actions::TAB_CLOSE => {
                let idx = active_tab.get_untracked();
                if idx > 0 {
                    tabs.update(|t| { t.remove(idx); });
                    active_tab.update(|i| *i = i.saturating_sub(1));
                }
            }

            _ => {}
        }
    });

    // ── Open-PDF request handler ──────────────────────────────────────────────
    // Watches for requests from LibTab and opens or activates the PDF tab.
    Effect::new(move |_| {
        let Some(req) = open_pdf_req.get() else { return };

        let existing = tabs.with_untracked(|t| {
            t.iter().position(|tab| {
                matches!(tab, AppTab::Pdf { ref_id: r, .. } if *r == req.ref_id)
            })
        });

        if let Some(idx) = existing {
            active_tab.set(idx);
        } else {
            let new_idx = tabs.with_untracked(Vec::len);
            tabs.update(|t| t.push(AppTab::Pdf {
                ref_id: req.ref_id.clone(),
                title: req.title.clone(),
            }));
            active_tab.set(new_idx);
        }

        // Clear so the same ref_id can re-trigger (e.g. switch away, then re-open).
        open_pdf_req.set(None);
    });

    // ── View ──────────────────────────────────────────────────────────────────
    view! {
        <div class="app">
            <TabBar tabs=tabs active_tab=active_tab />
            <div class="app-body">
                // Library tab — always mounted; hidden when a PDF tab is active.
                <div style=move || if active_tab.get() == 0 { "height:100%" } else { "display:none" }>
                    <LibTab />
                </div>
                // PDF tabs — each mounted once (keyed by ref_id) and hidden
                // when not active, so iframe state is preserved across switches.
                <For
                    each=move || {
                        tabs.get()
                            .into_iter()
                            .enumerate()
                            .filter_map(|(i, tab)| match tab {
                                AppTab::Pdf { ref_id, .. } => Some((i, ref_id)),
                                AppTab::Library => None,
                            })
                            .collect::<Vec<_>>()
                    }
                    key=|(_, ref_id)| ref_id.clone()
                    children=move |(i, ref_id)| {
                        // Derive visibility reactively: look up the live tabs
                        // so the style updates correctly after tab close/reorder.
                        let ref_id_vis = ref_id.clone();
                        let is_visible = move || {
                            let active = active_tab.get();
                            tabs.with(|t| {
                                t.get(active)
                                    .map(|tab| matches!(
                                        tab,
                                        AppTab::Pdf { ref_id: r, .. } if *r == ref_id_vis
                                    ))
                                    .unwrap_or(false)
                            })
                        };
                        // `i` at creation time is used as a fallback for the
                        // close button in TabBar; here we only need visibility.
                        let _ = i;
                        view! {
                            <div style=move || if is_visible() { "height:100%" } else { "display:none" }>
                                <PdfViewer ref_id=ref_id />
                            </div>
                        }
                    }
                />
            </div>
            <CommandBar mode />
        </div>
    }
}

// ── Tests ─────────────────────────────────────────────────────────────────────

#[cfg(test)]
mod tests {
    use super::action_key_to_name;

    #[test]
    fn single_segment_unchanged() {
        assert_eq!(action_key_to_name("quit"), "quit");
    }

    #[test]
    fn two_segment_conversion() {
        assert_eq!(action_key_to_name("tab_next"), "tab.next");
        assert_eq!(action_key_to_name("tab_prev"), "tab.prev");
        assert_eq!(action_key_to_name("tab_close"), "tab.close");
    }

    #[test]
    fn three_segment_conversion() {
        assert_eq!(action_key_to_name("nav_page_down"), "nav.page.down");
        assert_eq!(action_key_to_name("nav_page_up"), "nav.page.up");
    }

    #[test]
    fn focus_actions() {
        assert_eq!(action_key_to_name("focus_left"), "focus.left");
        assert_eq!(action_key_to_name("focus_center"), "focus.center");
        assert_eq!(action_key_to_name("focus_right"), "focus.right");
    }

    #[test]
    fn mode_actions() {
        assert_eq!(action_key_to_name("mode_command"), "mode.command");
        assert_eq!(action_key_to_name("mode_normal"), "mode.normal");
    }
}
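The TAB_NEXT / TAB_PREV arms in the App component cycle through tabs with wrap-around: next wraps from the last tab back to the first, prev wraps from the first to the last. A standalone sketch of that index arithmetic (the helper names `tab_next`/`tab_prev` are illustrative, not part of the source):

```rust
/// Advance to the next tab, wrapping past the end. Assumes `len > 0`,
/// matching the `len > 1` guard in the real handler.
fn tab_next(i: usize, len: usize) -> usize {
    (i + 1) % len
}

/// Step to the previous tab, wrapping from the first to the last.
fn tab_prev(i: usize, len: usize) -> usize {
    if i == 0 { len - 1 } else { i - 1 }
}

fn main() {
    assert_eq!(tab_next(2, 3), 0); // last tab wraps to the first
    assert_eq!(tab_prev(0, 3), 2); // first tab wraps to the last
    assert_eq!(tab_next(0, 3), 1);
}
```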
24
src/src/mode.rs
Normal file
@@ -0,0 +1,24 @@
//! Application mode state.
//!
//! Brittle has three modes:
//! - **Normal** — keyboard shortcuts are active; the default state.
//! - **Command** — the `:` prompt is open; user types a command.
//! - **Search** — the `/` prompt is open; user types a search/filter query.

use leptos::prelude::*;

/// The current input mode of the application.
#[derive(Clone, Debug, PartialEq, Eq)]
pub enum AppMode {
    Normal,
    Command,
    Search,
}

/// Create the application mode signal.
///
/// Returns the [`RwSignal`] which should be passed to components that need it.
/// Call once from the root component.
pub fn provide_mode() -> RwSignal<AppMode> {
    RwSignal::new(AppMode::Normal)
}
256
src/src/models.rs
Normal file
@@ -0,0 +1,256 @@
//! Frontend mirror of `brittle-core` data types.
//!
//! These types are intentionally minimal — they carry only the fields the
//! frontend needs. Unknown fields coming from the Tauri IPC are silently
//! ignored by serde's default behaviour.

use serde::{Deserialize, Serialize};
use std::collections::BTreeMap;

// ── IDs ───────────────────────────────────────────────────────────────────────

/// A library identifier (UUID string).
#[derive(Debug, Clone, PartialEq, Eq, Hash, Deserialize, Serialize)]
#[serde(transparent)]
pub struct LibraryId(pub String);

/// A reference identifier (UUID string).
#[derive(Debug, Clone, PartialEq, Eq, Hash, Deserialize, Serialize)]
#[serde(transparent)]
pub struct ReferenceId(pub String);

// ── Supporting types ──────────────────────────────────────────────────────────

/// A person (author, editor, etc.).
#[derive(Debug, Clone, PartialEq, Eq, Deserialize)]
pub struct Person {
    pub family: String,
    pub given: Option<String>,
    pub prefix: Option<String>,
    pub suffix: Option<String>,
}

impl Person {
    /// "Given Family" or just "Family" when given is absent.
    pub fn display_name(&self) -> String {
        match &self.given {
            Some(g) => format!("{} {}", g, self.family),
            None => self.family.clone(),
        }
    }
}

/// BibTeX entry type.
#[derive(Debug, Clone, PartialEq, Eq, Deserialize)]
#[serde(rename_all = "lowercase")]
pub enum EntryType {
    Article,
    Book,
    Booklet,
    InBook,
    InCollection,
    InProceedings,
    Manual,
    MastersThesis,
    Misc,
    PhdThesis,
    Proceedings,
    TechReport,
    Unpublished,
    Online,
}

impl std::fmt::Display for EntryType {
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        let s = match self {
            EntryType::Article => "article",
            EntryType::Book => "book",
            EntryType::Booklet => "booklet",
            EntryType::InBook => "inbook",
            EntryType::InCollection => "incollection",
            EntryType::InProceedings => "inproceedings",
            EntryType::Manual => "manual",
            EntryType::MastersThesis => "mastersthesis",
            EntryType::Misc => "misc",
            EntryType::PhdThesis => "phdthesis",
            EntryType::Proceedings => "proceedings",
            EntryType::TechReport => "techreport",
            EntryType::Unpublished => "unpublished",
            EntryType::Online => "online",
        };
        write!(f, "{s}")
    }
}

// ── Library ───────────────────────────────────────────────────────────────────

/// A library node in the tree (subset of `brittle_core::Library`).
#[derive(Debug, Clone, PartialEq, Eq, Deserialize)]
pub struct Library {
    pub id: LibraryId,
    pub name: String,
    pub parent_id: Option<LibraryId>,
}

// ── References ────────────────────────────────────────────────────────────────

/// A lightweight reference record used in the publication list.
#[derive(Debug, Clone, PartialEq, Eq, Deserialize)]
pub struct ReferenceSummary {
    pub id: ReferenceId,
    pub cite_key: String,
    pub entry_type: EntryType,
    pub title: Option<String>,
    pub authors: Vec<Person>,
    pub year: Option<String>,
}

impl ReferenceSummary {
    /// Title or "[no title]" fallback.
    pub fn title_display(&self) -> &str {
        self.title.as_deref().unwrap_or("[no title]")
    }

    /// Compact author string: "Family", "A & B", or "A et al."
    pub fn author_display(&self) -> String {
        match self.authors.len() {
            0 => "—".into(),
            1 => self.authors[0].family.clone(),
            2 => format!("{} & {}", self.authors[0].family, self.authors[1].family),
            _ => format!("{} et al.", self.authors[0].family),
        }
    }
}

/// Full reference record used in the detail pane.
#[derive(Debug, Clone, PartialEq, Deserialize)]
pub struct Reference {
    pub id: ReferenceId,
    pub cite_key: String,
    pub entry_type: EntryType,
    pub authors: Vec<Person>,
    pub editors: Vec<Person>,
    /// BibTeX fields: title, year, journal, doi, abstract, …
    pub fields: BTreeMap<String, String>,
    pub created_at: String,
    pub modified_at: String,
}

// ── Tests ─────────────────────────────────────────────────────────────────────

#[cfg(test)]
mod tests {
    use super::*;
    use serde_json::from_str;

    #[test]
    fn library_id_transparent() {
        let id: LibraryId = from_str(r#""abc-123""#).unwrap();
        assert_eq!(id.0, "abc-123");

        // Also verify round-trips back to a plain string.
        let back = serde_json::to_string(&id).unwrap();
        assert_eq!(back, r#""abc-123""#);
    }

    #[test]
    fn library_ignores_extra_fields() {
        // brittle-core sends members, timestamps, etc. — we ignore them.
        let json = r#"{
            "id": "lib-1",
            "name": "Physics",
            "parent_id": null,
            "members": ["ref-x"],
            "created_at": "2024-01-01T00:00:00Z",
            "modified_at": "2024-01-01T00:00:00Z"
        }"#;
        let lib: Library = from_str(json).unwrap();
        assert_eq!(lib.id.0, "lib-1");
        assert_eq!(lib.name, "Physics");
        assert_eq!(lib.parent_id, None);
    }

    #[test]
    fn library_with_parent() {
        let json = r#"{"id": "child-1", "name": "QM", "parent_id": "lib-1"}"#;
        let lib: Library = from_str(json).unwrap();
        assert_eq!(lib.parent_id, Some(LibraryId("lib-1".into())));
    }

    #[test]
    fn reference_summary_fields() {
        let json = r#"{
            "id": "ref-1",
            "cite_key": "einstein1905",
            "entry_type": "article",
            "title": "On the Electrodynamics of Moving Bodies",
            "authors": [{"family": "Einstein", "given": "Albert", "prefix": null, "suffix": null}],
            "year": "1905"
        }"#;
        let rs: ReferenceSummary = from_str(json).unwrap();
        assert_eq!(rs.cite_key, "einstein1905");
        assert_eq!(rs.title_display(), "On the Electrodynamics of Moving Bodies");
        assert_eq!(rs.author_display(), "Einstein");
        assert_eq!(rs.year.as_deref(), Some("1905"));
    }

    #[test]
    fn reference_summary_no_title() {
        let rs = ReferenceSummary {
            id: ReferenceId("x".into()),
            cite_key: "x".into(),
            entry_type: EntryType::Misc,
            title: None,
            authors: vec![],
            year: None,
        };
        assert_eq!(rs.title_display(), "[no title]");
        assert_eq!(rs.author_display(), "—");
    }

    #[test]
    fn author_display_variants() {
        fn p(family: &str) -> Person {
            Person { family: family.into(), given: None, prefix: None, suffix: None }
        }
        let base = ReferenceSummary {
            id: ReferenceId("x".into()),
            cite_key: "x".into(),
            entry_type: EntryType::Article,
            title: None,
            authors: vec![],
            year: None,
        };
        let one = ReferenceSummary { authors: vec![p("Smith")], ..base.clone() };
        assert_eq!(one.author_display(), "Smith");

        let two = ReferenceSummary { authors: vec![p("Smith"), p("Jones")], ..base.clone() };
        assert_eq!(two.author_display(), "Smith & Jones");

        let many = ReferenceSummary {
            authors: vec![p("Smith"), p("Jones"), p("Brown")],
            ..base
        };
        assert_eq!(many.author_display(), "Smith et al.");
    }

    #[test]
    fn person_display_name() {
        let p = Person {
            family: "Turing".into(),
            given: Some("Alan".into()),
            prefix: None,
            suffix: None,
        };
        assert_eq!(p.display_name(), "Alan Turing");

        let p2 = Person { given: None, ..p };
        assert_eq!(p2.display_name(), "Turing");
    }

    #[test]
    fn entry_type_display() {
        assert_eq!(EntryType::Article.to_string(), "article");
        assert_eq!(EntryType::InProceedings.to_string(), "inproceedings");
    }
}
28
src/src/pdf_viewer.rs
Normal file
@@ -0,0 +1,28 @@
//! PDF viewer tab: embeds the Tauri-served PDF viewer in an iframe.
//!
//! The custom `brittle://` URI scheme serves:
//! - `brittle://app/viewer?ref_id=<uuid>` — the viewer HTML page (PDF.js)
//! - `brittle://app/pdf?ref_id=<uuid>` — the raw PDF bytes
//!
//! Using an `<iframe>` keeps the viewer alive when the tab is hidden (via
//! `display:none`), so scrolling position and zoom are preserved across
//! tab switches.

use leptos::prelude::*;

/// Renders the PDF viewer for a single reference.
///
/// `ref_id` must be the UUID string of a reference that has an attached PDF.
/// The iframe loads the viewer HTML served by the `brittle://` custom protocol.
#[component]
pub fn PdfViewer(ref_id: String) -> impl IntoView {
    let url = format!("brittle://app/viewer?ref_id={ref_id}");
    view! {
        <iframe
            class="pdf-frame"
            src=url
            // Intentionally no `sandbox` attribute — the brittle:// protocol
            // and PDF.js require unrestricted access within the webview.
        />
    }
}
108
src/src/pub_detail.rs
Normal file
@@ -0,0 +1,108 @@
//! Publication detail: right pane showing full reference fields.

use leptos::prelude::*;

use crate::models::Reference;

/// Right pane: displays the fields of the currently selected reference.
///
/// When `reference` is `None`, a placeholder is shown.
#[component]
pub fn PubDetail(reference: RwSignal<Option<Reference>>) -> impl IntoView {
    use leptos::either::Either;

    view! {
        <div class="pub-detail-pane">
            {move || match reference.get() {
                None => Either::Left(view! {
                    <div class="empty-state">"Select a publication to see details"</div>
                }),
                Some(r) => Either::Right(view! {
                    <div class="detail-content">
                        <h2 class="detail-title">
                            {r.fields.get("title").cloned().unwrap_or_else(|| "[no title]".into())}
                        </h2>
                        <dl class="detail-fields">
                            // Entry type + cite key
                            <dt>"Type"</dt>
                            <dd>{r.entry_type.to_string()}</dd>

                            <dt>"Cite key"</dt>
                            <dd class="mono">{r.cite_key.clone()}</dd>

                            // Authors
                            {if r.authors.is_empty() {
                                None
                            } else {
                                let names = r.authors.iter()
                                    .map(|p| p.display_name())
                                    .collect::<Vec<_>>()
                                    .join("; ");
                                Some(view! {
                                    <dt>"Authors"</dt>
                                    <dd>{names}</dd>
                                })
                            }}

                            // Editors
                            {if r.editors.is_empty() {
                                None
                            } else {
                                let names = r.editors.iter()
                                    .map(|p| p.display_name())
                                    .collect::<Vec<_>>()
                                    .join("; ");
                                Some(view! {
                                    <dt>"Editors"</dt>
                                    <dd>{names}</dd>
                                })
                            }}

                            // Prioritised well-known fields
                            {PRIORITY_FIELDS.iter().filter_map(|&key| {
                                r.fields.get(key).map(|val| view! {
                                    <dt>{field_label(key)}</dt>
                                    <dd>{val.clone()}</dd>
                                })
                            }).collect::<Vec<_>>()}

                            // Remaining fields in alphabetical order
                            {r.fields.iter()
                                .filter(|(k, _)| !PRIORITY_FIELDS.contains(&k.as_str())
                                    && *k != "title")
                                .map(|(k, v)| view! {
                                    <dt>{field_label(k)}</dt>
                                    <dd>{v.clone()}</dd>
                                })
                                .collect::<Vec<_>>()
                            }
                        </dl>
                        <p class="detail-timestamps">
                            "Modified: "{r.modified_at.clone()}
                        </p>
                    </div>
                }),
            }}
        </div>
    }
}

/// Fields shown before the alphabetical remainder (excluding "title").
const PRIORITY_FIELDS: &[&str] = &["year", "journal", "booktitle", "volume", "doi", "abstract"];

/// Pretty-print a BibTeX field key.
fn field_label(key: &str) -> String {
    match key {
        "doi" => "DOI".into(),
        "isbn" => "ISBN".into(),
        "issn" => "ISSN".into(),
        "url" => "URL".into(),
        _ => {
            let mut s = key.replace('_', " ");
            if let Some(c) = s.get_mut(0..1) {
                c.make_ascii_uppercase();
            }
            s
        }
    }
}
67
src/src/pub_list.rs
Normal file
@@ -0,0 +1,67 @@
//! Publication list: centre pane showing filtered references.

use leptos::prelude::*;

use crate::{lib_tab::Pane, models::ReferenceSummary};

/// Centre pane: a scrollable list of publication summaries.
///
/// `cursor` tracks which row is selected; clicking a row updates it.
/// Items are draggable — dropping onto a library tree node calls
/// `add_to_library` via the Tauri IPC.
#[component]
pub fn PubList(
    items: RwSignal<Vec<ReferenceSummary>>,
    cursor: RwSignal<usize>,
    focused: RwSignal<Pane>,
) -> impl IntoView {
    use leptos::either::Either;

    view! {
        <div class="pub-list-pane" class:pane-focused=move || focused.get() == Pane::List>
            {move || {
                let list = items.get();
                if list.is_empty() {
                    Either::Left(view! {
                        <div class="empty-state">"No publications"</div>
                    })
                } else {
                    let cursor_pos = cursor.get();
                    Either::Right(view! {
                        <ul class="pub-list">
                            {list.into_iter().enumerate().map(|(i, item)| {
                                let is_cursor = i == cursor_pos;
                                let title = item.title_display().to_owned();
                                let authors = item.author_display();
                                let year = item.year.clone().unwrap_or_default();
                                let kind = item.entry_type.to_string();
                                let ref_id = item.id.0.clone();
                                view! {
                                    <li
                                        class="pub-item"
                                        class:pub-cursor=is_cursor
                                        draggable="true"
                                        on:dragstart=move |ev| {
                                            let ev: &web_sys::DragEvent = &ev;
                                            if let Some(dt) = ev.data_transfer() {
                                                let _ = dt.set_data(
                                                    "application/brittle-ref-id",
                                                    &ref_id,
                                                );
                                            }
                                        }
                                        on:click=move |_| cursor.set(i)
                                    >
                                        <span class="pub-type">{kind}</span>
                                        <span class="pub-title">{title}</span>
                                        <span class="pub-meta">{authors}{" "}{year}</span>
                                    </li>
                                }
                            }).collect::<Vec<_>>()}
                        </ul>
                    })
                }
            }}
        </div>
    }
}
57
src/src/tab_bar.rs
Normal file
@@ -0,0 +1,57 @@
//! Tab bar component rendered at the top of the app.

use leptos::prelude::*;

use crate::AppTab;

/// Renders the horizontal tab bar.
///
/// Clicking a tab activates it. The Library tab has no close button;
/// PDF tabs show a `×` close button on hover.
#[component]
pub fn TabBar(
    tabs: RwSignal<Vec<AppTab>>,
    active_tab: RwSignal<usize>,
) -> impl IntoView {
    view! {
        <div class="tab-bar">
            {move || {
                tabs.get().into_iter().enumerate().map(|(i, tab)| {
                    let is_active = i == active_tab.get();
                    match tab {
                        AppTab::Library => view! {
                            <button
                                class="tab tab-library"
                                class:tab-active=is_active
                                on:click=move |_| active_tab.set(i)
                            >
                                "Library"
                            </button>
                        }.into_any(),
                        AppTab::Pdf { title, .. } => view! {
                            <span
                                class="tab tab-pdf"
                                class:tab-active=is_active
                                on:click=move |_| active_tab.set(i)
                            >
                                <span class="tab-label">{title}</span>
                                <button
                                    class="tab-close"
                                    on:click=move |ev| {
                                        ev.stop_propagation();
                                        tabs.update(|t| { t.remove(i); });
                                        active_tab.update(|a| *a = (*a).min(
                                            tabs.with_untracked(|t| t.len().saturating_sub(1))
                                        ));
                                    }
                                >
                                    "×"
                                </button>
                            </span>
                        }.into_any(),
                    }
                }).collect::<Vec<_>>()
            }}
        </div>
    }
}
181
src/src/tauri.rs
Normal file
@@ -0,0 +1,181 @@
//! Wrappers around the Tauri `invoke` API.
//!
//! Each function maps directly to a Tauri command defined in `src-tauri`.
//! On non-wasm32 targets (native unit test runs) every function returns an
//! error immediately — they are never called in that context.

use std::collections::HashMap;

use serde::Serialize;

use crate::models::{Library, LibraryId, Reference, ReferenceId, ReferenceSummary};

// ── Low-level invoke ───────────────────────────────────────────────────────────

/// Empty argument struct for commands that take no user parameters.
#[derive(Serialize)]
struct NoArgs {}

#[cfg(target_arch = "wasm32")]
mod ffi {
    use wasm_bindgen::prelude::*;

    #[wasm_bindgen]
    extern "C" {
        /// `window.__TAURI__.core.invoke(cmd, args)`
        ///
        /// `catch` turns a JS exception (e.g. `window.__TAURI__` undefined when
        /// running in a plain browser) into `Err(JsValue)` instead of a WASM
        /// trap, so callers can degrade gracefully.
        #[wasm_bindgen(js_namespace = ["window", "__TAURI__", "core"], js_name = invoke, catch)]
        pub fn invoke_js(cmd: &str, args: JsValue) -> Result<js_sys::Promise, JsValue>;
    }
}

/// Invoke a Tauri command, serialise `args` as JSON and deserialise the result.
async fn invoke<T: for<'de> serde::Deserialize<'de>, A: Serialize>(
    cmd: &str,
    args: &A,
) -> Result<T, String> {
    #[cfg(target_arch = "wasm32")]
    {
        use wasm_bindgen_futures::JsFuture;
        let args_js =
            serde_wasm_bindgen::to_value(args).map_err(|e| e.to_string())?;
        let promise = ffi::invoke_js(cmd, args_js)
            .map_err(|_| "Tauri IPC unavailable (not running inside the desktop app)".to_string())?;
        let result = JsFuture::from(promise)
            .await
            .map_err(|e| format!("{e:?}"))?;
        serde_wasm_bindgen::from_value::<T>(result).map_err(|e| e.to_string())
    }
    #[cfg(not(target_arch = "wasm32"))]
    {
        let _ = (cmd, args);
        Err("tauri not available outside of wasm32".into())
    }
}

// ── Config commands ────────────────────────────────────────────────────────────

/// Return the user's keybinding overrides from `~/.config/brittle/config.toml`.
///
/// Keys are snake_case action names (e.g. `"tab_next"`); values are
/// key-sequence strings (e.g. `"<C-Right>"`).
pub async fn get_keybindings() -> Result<HashMap<String, String>, String> {
    invoke("get_keybindings", &NoArgs {}).await
}

// ── Appearance commands ────────────────────────────────────────────────────────

/// Return the current theme name (`"dark"` or `"light"`) from the global config.
pub async fn get_theme() -> Result<String, String> {
    invoke("get_theme", &NoArgs {}).await
}

/// Persist a new theme choice (`"dark"` or `"light"`) to the global config.
pub async fn set_theme(theme: &str) -> Result<(), String> {
    #[derive(Serialize)]
    struct Args<'a> {
        theme: &'a str,
    }
    invoke("set_theme", &Args { theme }).await
}

// ── Repository commands ────────────────────────────────────────────────────────

pub async fn open_repository(path: &str) -> Result<(), String> {
    #[derive(Serialize)]
    struct Args<'a> {
        path: &'a str,
    }
    invoke("open_repository", &Args { path }).await
}

// ── Library commands ───────────────────────────────────────────────────────────

pub async fn list_root_libraries() -> Result<Vec<Library>, String> {
    invoke("list_root_libraries", &NoArgs {}).await
}

pub async fn list_child_libraries(parent_id: &LibraryId) -> Result<Vec<Library>, String> {
    #[derive(Serialize)]
    #[serde(rename_all = "camelCase")]
    struct Args<'a> {
        parent_id: &'a LibraryId,
    }
    invoke("list_child_libraries", &Args { parent_id }).await
}

// ── Library membership commands ────────────────────────────────────────────────

/// Add a reference to a library (additive; does not move the reference).
pub async fn add_to_library(
    library_id: &LibraryId,
    reference_id: &ReferenceId,
) -> Result<(), String> {
    #[derive(Serialize)]
    #[serde(rename_all = "camelCase")]
    struct Args<'a> {
        library_id: &'a LibraryId,
        reference_id: &'a ReferenceId,
    }
    invoke("add_to_library", &Args { library_id, reference_id }).await
}

// ── Reference commands ─────────────────────────────────────────────────────────

pub async fn list_references() -> Result<Vec<ReferenceSummary>, String> {
    invoke("list_references", &NoArgs {}).await
}

pub async fn list_library_references(
    library_id: &LibraryId,
) -> Result<Vec<ReferenceSummary>, String> {
    #[derive(Serialize)]
    #[serde(rename_all = "camelCase")]
    struct Args<'a> {
        library_id: &'a LibraryId,
    }
    invoke("list_library_references", &Args { library_id }).await
}

pub async fn list_library_references_recursive(
    library_id: &LibraryId,
) -> Result<Vec<ReferenceSummary>, String> {
    #[derive(Serialize)]
    #[serde(rename_all = "camelCase")]
    struct Args<'a> {
        library_id: &'a LibraryId,
    }
    invoke("list_library_references_recursive", &Args { library_id }).await
}

pub async fn search_references(query: &str) -> Result<Vec<ReferenceSummary>, String> {
    #[derive(Serialize)]
    struct Args<'a> {
        query: &'a str,
    }
    invoke("search_references", &Args { query }).await
}

pub async fn search_library_references(
    library_id: &LibraryId,
    query: &str,
) -> Result<Vec<ReferenceSummary>, String> {
    #[derive(Serialize)]
    #[serde(rename_all = "camelCase")]
    struct Args<'a> {
        library_id: &'a LibraryId,
        query: &'a str,
    }
    invoke("search_library_references", &Args { library_id, query }).await
}

pub async fn get_reference(id: &ReferenceId) -> Result<Reference, String> {
    #[derive(Serialize)]
    struct Args<'a> {
        id: &'a ReferenceId,
    }
    invoke("get_reference", &Args { id }).await
}
373
src/style.css
Normal file
@@ -0,0 +1,373 @@
/* ── Reset / base ─────────────────────────────────────────────────────────── */

*, *::before, *::after { box-sizing: border-box; margin: 0; padding: 0; }

:root {
  --bg: #1e1e2e;
  --bg-surface: #24243a;
  --bg-overlay: #2e2e44;
  --border: #3a3a55;
  --accent: #7c6af7;
  --accent-dim: #5a4dcc;
  --text: #cdd6f4;
  --text-muted: #7f849c;
  --text-subtle: #585b70;
  --cursor-bg: #3a3a70;
  --cursor-fg: #ffffff;
  --focused-ring: #7c6af7;
  --font-mono: "JetBrains Mono", "Fira Code", "Cascadia Code", monospace;
  --font-ui: system-ui, -apple-system, "Segoe UI", sans-serif;
  --radius: 4px;
  --tab-h: 32px;
  --cmd-h: 28px;
}

html, body {
  height: 100%;
  background: var(--bg);
  color: var(--text);
  font-family: var(--font-ui);
  font-size: 13px;
  line-height: 1.5;
  overflow: hidden;
}

ul { list-style: none; }

/* ── App shell ────────────────────────────────────────────────────────────── */

.app {
  display: flex;
  flex-direction: column;
  height: 100vh;
}

.app-body {
  flex: 1;
  overflow: hidden;
  position: relative;
}

/* ── Tab bar ──────────────────────────────────────────────────────────────── */

.tab-bar {
  display: flex;
  align-items: stretch;
  height: var(--tab-h);
  background: var(--bg-surface);
  border-bottom: 1px solid var(--border);
  padding: 0 4px;
  gap: 2px;
  flex-shrink: 0;
}

.tab {
  display: inline-flex;
  align-items: center;
  padding: 0 12px;
  background: transparent;
  border: none;
  border-bottom: 2px solid transparent;
  color: var(--text-muted);
  cursor: pointer;
  font: inherit;
  font-size: 12px;
  white-space: nowrap;
  user-select: none;
  border-radius: var(--radius) var(--radius) 0 0;
  gap: 6px;
}

.tab:hover { color: var(--text); background: var(--bg-overlay); }

.tab.tab-active {
  color: var(--text);
  border-bottom-color: var(--accent);
}

.tab-pdf { cursor: pointer; }

.tab-label { pointer-events: none; }

.tab-close {
  display: inline-flex;
  align-items: center;
  justify-content: center;
  width: 16px;
  height: 16px;
  background: transparent;
  border: none;
  border-radius: 2px;
  color: var(--text-muted);
  font-size: 14px;
  line-height: 1;
  cursor: pointer;
  padding: 0;
  opacity: 0;
  transition: opacity 0.1s, background 0.1s;
}

.tab-pdf:hover .tab-close { opacity: 1; }
.tab-close:hover { background: var(--bg-overlay); color: var(--text); }

/* ── Three-pane library layout ────────────────────────────────────────────── */

.lib-tab {
  display: flex;
  height: 100%;
}

.pane {
  display: flex;
  flex-direction: column;
  overflow: hidden;
  border-right: 1px solid var(--border);
}

.pane:last-child { border-right: none; }

.pane-left { width: 220px; flex-shrink: 0; }
.pane-center { flex: 1; min-width: 0; }
.pane-right { width: 300px; flex-shrink: 0; }

.pane-focused { outline: 1px solid var(--focused-ring); outline-offset: -1px; }

/* ── Library tree pane ────────────────────────────────────────────────────── */

.tree-pane {
  height: 100%;
  overflow-y: auto;
  padding: 4px 0;
}

.tree-list { padding: 0; }

.tree-item {
  display: flex;
  align-items: center;
  gap: 2px;
  padding: 3px 8px;
  cursor: pointer;
  white-space: nowrap;
  overflow: hidden;
  text-overflow: ellipsis;
  border-radius: 0;
}

.tree-item:hover { background: var(--bg-overlay); }

.tree-item.tree-cursor {
  background: var(--cursor-bg);
  color: var(--cursor-fg);
}

.tree-icon {
  color: var(--text-muted);
  font-size: 11px;
  width: 14px;
  flex-shrink: 0;
}

.tree-name {
  overflow: hidden;
  text-overflow: ellipsis;
}

/* ── Publication list pane ────────────────────────────────────────────────── */

.pub-list-pane {
  height: 100%;
  overflow-y: auto;
}

.pub-list { padding: 4px 0; }

.pub-item {
  display: flex;
  flex-direction: column;
  padding: 6px 10px;
  cursor: pointer;
  border-bottom: 1px solid var(--border);
  gap: 1px;
}

.pub-item:hover { background: var(--bg-overlay); }

.pub-item.pub-cursor {
  background: var(--cursor-bg);
  color: var(--cursor-fg);
}

.pub-item.pub-cursor .pub-meta { color: rgba(255,255,255,0.6); }
.pub-item.pub-cursor .pub-type { color: rgba(255,255,255,0.7); }

.pub-type {
  font-size: 10px;
  text-transform: uppercase;
  letter-spacing: 0.06em;
  color: var(--accent);
  font-weight: 600;
}

.pub-title {
  font-size: 12px;
  font-weight: 500;
  line-height: 1.4;
}

.pub-meta {
  font-size: 11px;
  color: var(--text-muted);
}

/* ── Publication detail pane ──────────────────────────────────────────────── */

.pub-detail-pane {
  height: 100%;
  overflow-y: auto;
  padding: 12px;
}

.detail-title {
  font-size: 14px;
  font-weight: 600;
  line-height: 1.4;
  margin-bottom: 10px;
  color: var(--text);
}

.detail-fields {
  display: grid;
  grid-template-columns: auto 1fr;
  gap: 4px 12px;
  align-items: baseline;
}

.detail-fields dt {
  color: var(--text-muted);
  font-size: 11px;
  white-space: nowrap;
  text-align: right;
}

.detail-fields dd {
  color: var(--text);
  font-size: 12px;
  word-break: break-word;
  min-width: 0;
}

.detail-timestamps {
  margin-top: 12px;
  font-size: 10px;
  color: var(--text-subtle);
}

.mono {
  font-family: var(--font-mono);
  font-size: 11px;
}

/* ── Empty state ──────────────────────────────────────────────────────────── */

.empty-state {
  display: flex;
  align-items: center;
  justify-content: center;
  height: 100%;
  color: var(--text-subtle);
  font-size: 12px;
  padding: 24px;
  text-align: center;
}

/* ── Command / search bar ─────────────────────────────────────────────────── */

.command-bar {
  display: flex;
  align-items: center;
  height: var(--cmd-h);
  background: var(--bg-surface);
  border-top: 1px solid var(--border);
  padding: 0 8px;
  gap: 4px;
  flex-shrink: 0;
}

.command-prefix {
  color: var(--accent);
  font-family: var(--font-mono);
  font-size: 14px;
  font-weight: bold;
  line-height: 1;
}

.command-input {
  flex: 1;
  background: transparent;
  border: none;
  outline: none;
  color: var(--text);
  font: inherit;
  font-family: var(--font-mono);
  font-size: 12px;
  caret-color: var(--accent);
}

.command-status {
  color: #f38ba8;
  font-size: 12px;
}

/* ── PDF viewer frame ─────────────────────────────────────────────────────── */

.pdf-frame {
  width: 100%;
  height: 100%;
  border: none;
  display: block;
}

/* ── Drag-and-drop ────────────────────────────────────────────────────────── */

/* Suppress pointer events on text spans inside tree rows so that
   dragleave / dragover never fire on child elements — avoids highlight flicker
   when the cursor moves between the icon and the label inside one row. */
.tree-icon,
.tree-name { pointer-events: none; }

/* Publication items are drag sources. */
.pub-item[draggable="true"] { cursor: grab; }
.pub-item[draggable="true"]:active { cursor: grabbing; }

/* Library tree rows highlighted as drop targets. */
.tree-item.tree-drop-target {
  background: var(--accent-dim);
  color: var(--cursor-fg);
  outline: 1px dashed var(--accent);
  outline-offset: -1px;
}

/* ── Light theme (Catppuccin Latte) ──────────────────────────────────────── */

[data-theme="light"] {
  --bg: #eff1f5;
  --bg-surface: #e6e9ef;
  --bg-overlay: #dce0e8;
  --border: #ccd0da;
  --accent: #7287fd;
  --accent-dim: #5c6bc0;
  --text: #4c4f69;
  --text-muted: #8c8fa1;
  --text-subtle: #acb0be;
  --cursor-bg: #7287fd;
  --cursor-fg: #eff1f5;
  --focused-ring: #7287fd;
}

/* ── Scrollbar (WebKit) ───────────────────────────────────────────────────── */

::-webkit-scrollbar { width: 6px; height: 6px; }
::-webkit-scrollbar-track { background: transparent; }
::-webkit-scrollbar-thumb { background: var(--border); border-radius: 3px; }
::-webkit-scrollbar-thumb:hover { background: var(--text-subtle); }
|
||||
1
vendor/pdfjs-dist
vendored
Submodule
Submodule vendor/pdfjs-dist added at af64149885