FEAT Delete & Update note is now complete.

Well, as complete as it can be without proper automated testing.
I think there'll be more testing soon, as it doesn't make sense
for this to hang out untested so blatantly.

Both a fmt and a clippy pass have shaken all the lint off, and right
now it builds without warnings or lints.  Wheee!
Elf M. Sternberg 2020-10-26 18:54:56 -07:00
parent 739ff93427
commit 72fb3b11ee
8 changed files with 709 additions and 617 deletions

View File

@@ -0,0 +1,46 @@
# Storage layer for Notesmachine

This library implements the core functionality of Notesmachine and maps
that functionality onto a storage layer. There's a bit of intermingling
in here which can't be helped, although it may make sense in the future
to move the decomposition of note content into a higher layer.

Notesmachine storage consists of two kinds of object: Zettel and Kasten,
which are German for "Note" and "Box". Here are the basic rules:

- Boxes have titles (and date metadata)
- Notes have content and a type (and date metadata)
- Notes are stored in boxes
- Notes are positioned with respect to other notes.
  - There are two positions:
    - Siblings, creating lists
    - Children, creating trees like this one
- Notes may have references (pointers) to other boxes
- Notes may be moved around
- Notes may be deleted
- Boxes may be deleted
- When a box is renamed, every reference to that box is auto-edited to
  reflect the change. If a box is renamed to match an existing box, the
  notes in both boxes are merged.

Note-to-note relationships form trees, and are kept in a SQL database of
(`parent_id`, `child_id`, `position`, `relationship_type`). The
`position` is a monotonic index within the parent (that is, every pair
(`parent_id`, `position`) must be unique). The `relationship_type` is
an enum and can specify that the relationship is *original*,
*embedding*, or *referencing*. An embedded or referenced note may be
read/write or read-only with respect to the original, but there is only
one original note at any time.

Note-to-box relationships form a graph, and are kept in the SQL database
as a collection of *edges* from the note to the box (and naturally
vice-versa).

- Decision: When an original note is deleted, do all references and
  embeddings also get deleted, or is the oldest one elevated to be a new
  "original"? Or is that something the user may choose?
- Decision: Should the merging issue be handled at this layer, or would
  it make sense to move this to a higher layer, and only provide the
  hooks for it here?
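
As a rough illustration of the relationship model described above, the edges
could be modeled like this. This is a sketch only: the type and field names
below are illustrative, the real rows live in SQLite (see
`sql/initialize_database.sql`), and the code in `store.rs` keeps the
relationship type as a plain string column called `nature`.

```rust
// Illustrative sketch of the two kinds of relationship rows; not the crate's
// actual types (those are in structs.rs and are read straight from SQLite).

/// The three relationship natures described above.
enum RelationshipType {
    Original,
    Embedding,
    Referencing,
}

/// A note-to-note edge. `position` orders children under one parent, so the
/// pair (parent_id, position) must remain unique.
struct NoteEdge {
    parent_id: i64,
    child_id: i64,
    position: i64,
    relationship_type: RelationshipType,
}

/// A note-to-box edge; boxes are called "pages" throughout the code.
struct NoteToBoxEdge {
    note_id: i64,
    box_id: i64,
}
```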

View File

@@ -1,4 +1,3 @@
-use sqlx;
 use thiserror::Error;

 /// All the ways looking up objects can fail

View File

@ -1,8 +1,7 @@
mod errors; mod errors;
mod row_structs; mod reference_parser;
mod store; mod store;
mod structs; mod structs;
mod reference_parser;
pub use crate::errors::NoteStoreError; pub use crate::errors::NoteStoreError;
pub use crate::store::NoteStore; pub use crate::store::NoteStore;

View File

@@ -37,7 +37,7 @@ pub(crate) fn find_links(document: &str) -> Vec<String> {
         }
         match &node.data.borrow().value {
-            &NodeValue::Text(ref text) => Some(
+            NodeValue::Text(ref text) => Some(
                 RE_REFERENCES
                     .captures_iter(text)
                     .map(|t| String::from_utf8_lossy(&t.get(1).unwrap().as_bytes()).to_string())
@@ -68,19 +68,16 @@ fn recase(title: &str) -> String {
     RE_PASS3.replace_all(&pass, " ").trim().to_string()
 }

-fn build_page_titles(references: &Vec<String>) -> Vec<String> {
+fn build_page_titles(references: &[String]) -> Vec<String> {
     references
         .iter()
-        .map(|s| {
-            let c = s.chars().nth(0);
-            match c {
+        .map(|s| match s.chars().next() {
             Some('#') => recase(s),
             Some('[') => s.strip_prefix("[[").unwrap().strip_suffix("]]").unwrap().to_string(),
             Some(_) => s.clone(),
             _ => "".to_string(),
-            }
         })
-        .filter(|s| s.len() > 0)
+        .filter(|s| s.is_empty())
         .collect()
 }
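
For a sense of what `build_page_titles` does with each reference, here is a
small standalone sketch of the same dispatch. The function name and the
example strings are illustrative only, and the `#hashtag` arm is omitted
because `recase` is not reproduced here.

```rust
// Standalone sketch of the dispatch inside build_page_titles. The real code
// also routes "#hashtag"-style references through recase(), omitted here,
// and drops empty results afterwards.
fn normalize_reference(s: &str) -> String {
    match s.chars().next() {
        // Wiki-style reference: strip the surrounding [[ ]].
        Some('[') => s
            .strip_prefix("[[")
            .and_then(|t| t.strip_suffix("]]"))
            .unwrap_or(s)
            .to_string(),
        // Anything else is treated as a plain title.
        Some(_) => s.to_string(),
        // Empty input produces an empty title, filtered out by the caller.
        None => String::new(),
    }
}

fn main() {
    assert_eq!(normalize_reference("[[Storage Layer]]"), "Storage Layer");
    assert_eq!(normalize_reference("Plain title"), "Plain title");
}
```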

View File

@@ -1,111 +0,0 @@
use chrono::{DateTime, Utc};
use derive_builder::Builder;
use serde::{Deserialize, Serialize};
use sqlx::{self, FromRow};

#[derive(Clone, Serialize, Deserialize, Debug, FromRow)]
pub struct RawPage {
    pub id: i64,
    pub slug: String,
    pub title: String,
    pub note_id: i64,
    pub creation_date: DateTime<Utc>,
    pub updated_date: DateTime<Utc>,
    pub lastview_date: DateTime<Utc>,
    pub deleted_date: Option<DateTime<Utc>>,
}

#[derive(Clone, Serialize, Deserialize, Debug, FromRow)]
pub struct RawNote {
    pub id: i64,
    pub uuid: String,
    pub parent_id: i64,
    pub parent_uuid: String,
    pub content: String,
    pub position: i64,
    pub notetype: String,
    pub creation_date: DateTime<Utc>,
    pub updated_date: DateTime<Utc>,
    pub lastview_date: DateTime<Utc>,
    pub deleted_date: Option<DateTime<Utc>>,
}

#[derive(Clone, Serialize, Deserialize, Debug, Builder)]
pub struct NewPage {
    pub slug: String,
    pub title: String,
    pub note_id: i64,
    #[builder(default = r#"chrono::Utc::now()"#)]
    pub creation_date: DateTime<Utc>,
    #[builder(default = r#"chrono::Utc::now()"#)]
    pub updated_date: DateTime<Utc>,
    #[builder(default = r#"chrono::Utc::now()"#)]
    pub lastview_date: DateTime<Utc>,
    #[builder(default = r#"None"#)]
    pub deleted_date: Option<DateTime<Utc>>,
}

#[derive(Clone, Serialize, Deserialize, Debug, Builder)]
pub struct NewNote {
    #[builder(default = r#""".to_string()"#)]
    pub uuid: String,
    pub content: String,
    #[builder(default = r#""note".to_string()"#)]
    pub notetype: String,
    #[builder(default = r#"chrono::Utc::now()"#)]
    pub creation_date: DateTime<Utc>,
    #[builder(default = r#"chrono::Utc::now()"#)]
    pub updated_date: DateTime<Utc>,
    #[builder(default = r#"chrono::Utc::now()"#)]
    pub lastview_date: DateTime<Utc>,
    #[builder(default = r#"None"#)]
    pub deleted_date: Option<DateTime<Utc>>,
}

#[derive(Clone, Serialize, Deserialize, Debug, FromRow)]
pub(crate) struct JustSlugs {
    pub slug: String,
}

#[derive(Clone, Serialize, Deserialize, Debug, FromRow)]
pub(crate) struct JustTitles {
    title: String,
}

#[derive(Clone, Serialize, Deserialize, Debug, FromRow)]
pub(crate) struct JustId {
    pub id: i64,
}

#[derive(Clone, Serialize, Deserialize, Debug, FromRow)]
pub(crate) struct PageTitles {
    pub id: i64,
    pub title: String,
}

#[derive(Clone, Serialize, Deserialize, Debug, FromRow)]
pub(crate) struct NoteRelationship {
    pub parent_id: i64,
    pub note_id: i64,
    pub position: i64,
    pub nature: String,
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn can_build_new_note() {
        let now = chrono::Utc::now();
        let newnote = NewNoteBuilder::default()
            .uuid("foo".to_string())
            .content("bar".to_string())
            .build()
            .unwrap();
        assert!((newnote.creation_date - now).num_minutes() < 1);
        assert!((newnote.updated_date - now).num_minutes() < 1);
        assert!((newnote.lastview_date - now).num_minutes() < 1);
        assert!(newnote.deleted_date.is_none());
    }
}

View File

@@ -1,8 +1,8 @@
 DROP TABLE IF EXISTS notes;
 DROP TABLE IF EXISTS note_relationships;
 DROP TABLE IF EXISTS pages;
-DROP TABLE IF EXISTS favorites;
 DROP TABLE IF EXISTS page_relationships;
+DROP TABLE IF EXISTS favorites;

 CREATE TABLE notes (
     id INTEGER PRIMARY KEY AUTOINCREMENT,

View File

@@ -1,22 +1,19 @@
 use crate::errors::NoteStoreError;
-use crate::row_structs::{
-    JustId, JustSlugs, NewNote, NewNoteBuilder, NewPage, NewPageBuilder, NoteRelationship, RawNote,
-    PageTitles,
-    RawPage,
-};
 use crate::reference_parser::build_references;
-use friendly_id;
+use crate::structs::{
+    JustId, JustSlugs, NewNote, NewNoteBuilder, NewPage, NewPageBuilder, NoteRelationship, PageTitles, RawNote,
+    RawPage, RowCount,
+};
 use lazy_static::lazy_static;
 use regex::Regex;
-use std::collections::HashSet;
 use shrinkwraprs::Shrinkwrap;
 use slug::slugify;
-use sqlx;
 use sqlx::{
     sqlite::{Sqlite, SqlitePool, SqliteRow},
     Done, Executor, Row,
 };
 use std::collections::HashMap;
+use std::collections::HashSet;
 use std::sync::Arc;

 #[derive(Shrinkwrap, Copy, Clone)]
@@ -45,9 +42,7 @@ impl NoteStore {
     // to its original empty form. Do not use unless you
     // really, really want that to happen.
     pub async fn reset_database(&self) -> NoteResult<()> {
-        reset_database(&*self.0)
-            .await
-            .map_err(NoteStoreError::DBError)
+        reset_database(&*self.0).await.map_err(NoteStoreError::DBError)
     }

     /// Fetch page by slug
@@ -78,10 +73,7 @@ impl NoteStore {
         let (page, notes) = match select_page_by_title(&mut tx, title).await {
             Ok(page) => {
                 let note_id = page.note_id;
-                (
-                    page,
-                    select_note_collection_from_root(&mut tx, note_id).await?,
-                )
+                (page, select_note_collection_from_root(&mut tx, note_id).await?)
             }
             Err(sqlx::Error::RowNotFound) => {
                 let page = {
@@ -93,10 +85,7 @@ impl NoteStore {
                     select_page_by_title(&mut tx, &title).await?
                 };
                 let note_id = page.note_id;
-                (
-                    page,
-                    select_note_collection_from_root(&mut tx, note_id).await?,
-                )
+                (page, select_note_collection_from_root(&mut tx, note_id).await?)
             }
             Err(e) => return Err(NoteStoreError::DBError(e)),
         };
@@ -118,14 +107,20 @@ impl NoteStore {
         let mut tx = self.0.begin().await?;
         // Start by building the note and putting it into its relationship.
-        let parent_id = select_note_id_for_uuid(&mut tx, parent_note_uuid).await?;
+        let parent_id: ParentId = select_note_id_for_uuid(&mut tx, parent_note_uuid).await?;
+        let parent_max_position = assert_max_child_position_for_note(&mut tx, parent_id).await?;
+        let position = if position > parent_max_position {
+            parent_max_position + 1
+        } else {
+            position
+        };
         let new_note_id = insert_one_new_note(&mut tx, &new_note).await?;
         let _ = make_room_for_new_note(&mut tx, parent_id, position).await?;
-        let _ = insert_note_note_relationship(&mut tx, parent_id, new_note_id, position).await?;
+        let _ = insert_note_to_note_relationship(&mut tx, parent_id, new_note_id, position, "note").await?;
         // From the references, make lists of pages that exist, and pages
         // that do not.
-        let found_references = find_all_references_for(&mut tx, &references).await?;
+        let found_references = find_all_page_references_for(&mut tx, &references).await?;
         let new_references = diff_references(&references, &found_references);
         let mut known_reference_ids: Vec<PageId> = Vec::new();
@@ -136,17 +131,21 @@ impl NoteStore {
             let new_page_slug = generate_slug(&mut tx, &one_reference).await?;
             let new_page = create_new_page_for(&one_reference, &new_page_slug, new_root_note_id);
             known_reference_ids.push(insert_one_new_page(&mut tx, &new_page).await?)
-        };
+        }
         // And associate the note with all the pages.
         known_reference_ids.append(&mut found_references.iter().map(|r| PageId(r.id)).collect());
-        let _ = insert_note_page_references(&mut tx, new_note_id, &known_reference_ids).await?;
+        let _ = insert_note_to_page_relationships(&mut tx, new_note_id, &known_reference_ids).await?;
         tx.commit().await?;
         Ok(new_note.uuid)
     }

-    // TODO: Make sure the new position is sane.
+    // This doesn't do anything with the references, as those are
+    // dependent entirely on the *content*, and not the *position*, of
+    // the note and the referenced page.
+    //
+    // TODO: Ensure the position is sane.
     /// Move a note from one location to another.
     pub async fn move_note(
         &self,
@@ -155,14 +154,14 @@ impl NoteStore {
         new_parent_uuid: &str,
         new_position: i64,
     ) -> NoteResult<()> {
-        let sample = vec![note_uuid, old_parent_uuid, new_parent_uuid];
+        let all_uuids = vec![note_uuid, old_parent_uuid, new_parent_uuid];
         let mut tx = self.0.begin().await?;
         // This is one of the few cases where we we're getting IDs for
         // notes, but the nature of the ID isn't known at this time.
         // This has to be handled manually, in the next paragraph
         // below.
-        let found_id_vec = bulk_select_ids_for_note_uuids(&mut tx, &sample).await?;
+        let found_id_vec = bulk_select_ids_for_note_uuids(&mut tx, &all_uuids).await?;
         let found_ids: HashMap<String, i64> = found_id_vec.into_iter().collect();
         if found_ids.len() != 3 {
             return Err(NoteStoreError::NotFound);
@@ -172,15 +171,21 @@ impl NoteStore {
         let new_parent_id = ParentId(*found_ids.get(new_parent_uuid).unwrap());
         let note_id = NoteId(*found_ids.get(note_uuid).unwrap());
-        let old_note_position = get_note_note_relationship(&mut tx, old_parent_id, note_id)
-            .await?
-            .position;
-        let _ = delete_note_note_relationship(&mut tx, old_parent_id, note_id).await?;
+        let old_note = get_note_to_note_relationship(&mut tx, old_parent_id, note_id).await?;
+        let old_note_position = old_note.position;
+        let old_note_nature = &old_note.nature;
+        let _ = delete_note_to_note_relationship(&mut tx, old_parent_id, note_id).await?;
         let _ = close_hole_for_deleted_note(&mut tx, old_parent_id, old_note_position).await?;
+        let parent_max_position = assert_max_child_position_for_note(&mut tx, new_parent_id).await?;
+        let new_position = if new_position > parent_max_position {
+            parent_max_position + 1
+        } else {
+            new_position
+        };
         let _ = make_room_for_new_note(&mut tx, new_parent_id, new_position).await?;
         let _ =
-            insert_note_note_relationship(&mut tx, new_parent_id, note_id, new_position).await?;
+            insert_note_to_note_relationship(&mut tx, new_parent_id, note_id, new_position, old_note_nature).await?;
         tx.commit().await?;
         Ok(())
     }
@@ -193,17 +198,58 @@ impl NoteStore {
         new_position: i64,
         new_nature: &str,
     ) -> NoteResult<()> {
-        todo!()
+        let mut tx = self.0.begin().await?;
+        let existing_note_id: NoteId = NoteId(select_note_id_for_uuid(&mut tx, note_uuid).await?.0);
+        let new_parent_id: ParentId = select_note_id_for_uuid(&mut tx, new_parent_uuid).await?;
+        let _ = make_room_for_new_note(&mut tx, new_parent_id, new_position).await?;
+        let _ = insert_note_to_note_relationship(&mut tx, new_parent_id, existing_note_id, new_position, new_nature)
+            .await?;
+        tx.commit().await?;
+        Ok(())
     }

     /// Delete a note
     pub async fn delete_note(&self, note_uuid: &str, note_parent_uuid: &str) -> NoteResult<()> {
-        todo!()
+        let mut tx = self.0.begin().await?;
+        let condemned_note_id: NoteId = NoteId(select_note_id_for_uuid(&mut tx, note_uuid).await?.0);
+        let note_parent_id: ParentId = select_note_id_for_uuid(&mut tx, note_parent_uuid).await?;
+        let _ = delete_note_to_note_relationship(&mut tx, note_parent_id, condemned_note_id);
+        if count_existing_note_relationships(&mut tx, condemned_note_id).await? == 0 {
+            let _ = delete_note_to_page_relationships(&mut tx, condemned_note_id).await?;
+            let _ = delete_note(&mut tx, condemned_note_id).await?;
+        }
+        tx.commit().await?;
+        Ok(())
     }

     /// Update a note's content
     pub async fn update_note_content(&self, note_uuid: &str, content: &str) -> NoteResult<()> {
-        todo!()
+        let references = build_references(&content);
+        let mut tx = self.0.begin().await?;
+        let note_id: NoteId = NoteId(select_note_id_for_uuid(&mut tx, note_uuid).await?.0);
+        let _ = update_note_content(&mut tx, note_id, &content).await?;
+        let found_references = find_all_page_references_for(&mut tx, &references).await?;
+        let new_references = diff_references(&references, &found_references);
+        let mut known_reference_ids: Vec<PageId> = Vec::new();
+        // Create the pages that don't exist
+        for one_reference in new_references.iter() {
+            let new_root_note = create_unique_root_note();
+            let new_root_note_id = insert_one_new_note(&mut tx, &new_root_note).await?;
+            let new_page_slug = generate_slug(&mut tx, &one_reference).await?;
+            let new_page = create_new_page_for(&one_reference, &new_page_slug, new_root_note_id);
+            known_reference_ids.push(insert_one_new_page(&mut tx, &new_page).await?)
+        }
+        // And associate the note with all the pages.
+        known_reference_ids.append(&mut found_references.iter().map(|r| PageId(r.id)).collect());
+        let _ = insert_note_to_page_relationships(&mut tx, note_id, &known_reference_ids).await?;
+        tx.commit().await?;
+        Ok(())
     }
 }
// ___ _ _ // ___ _ _
@@ -222,10 +268,7 @@ where
     E: Executor<'a, Database = Sqlite>,
 {
     let initialize_sql = include_str!("sql/initialize_database.sql");
-    sqlx::query(initialize_sql)
-        .execute(executor)
-        .await
-        .map(|_| ())
+    sqlx::query(initialize_sql).execute(executor).await.map(|_| ())
 }

 async fn select_page_by_slug<'a, E>(executor: E, slug: &str) -> SqlResult<RawPage>
@@ -268,11 +311,7 @@ where
     Ok(ParentId(id.id))
 }

-async fn make_room_for_new_note<'a, E>(
-    executor: E,
-    parent_id: ParentId,
-    position: i64,
-) -> SqlResult<()>
+async fn make_room_for_new_note<'a, E>(executor: E, parent_id: ParentId, position: i64) -> SqlResult<()>
 where
     E: Executor<'a, Database = Sqlite>,
 {
@@ -290,25 +329,26 @@ where
         .map(|_| ())
 }

-async fn insert_note_note_relationship<'a, E>(
+async fn insert_note_to_note_relationship<'a, E>(
     executor: E,
     parent_id: ParentId,
     note_id: NoteId,
     position: i64,
+    nature: &str,
 ) -> SqlResult<()>
 where
     E: Executor<'a, Database = Sqlite>,
 {
-    let insert_note_note_relationship_sql = concat!(
+    let insert_note_to_note_relationship_sql = concat!(
         "INSERT INTO note_relationships (parent_id, note_id, position, nature) ",
         "values (?, ?, ?, ?)"
     );
-    sqlx::query(insert_note_note_relationship_sql)
+    sqlx::query(insert_note_to_note_relationship_sql)
         .bind(&*parent_id)
         .bind(&*note_id)
         .bind(&position)
-        .bind("note")
+        .bind(&nature)
         .execute(executor)
         .await
         .map(|_| ())
@@ -318,8 +358,7 @@ async fn select_note_collection_from_root<'a, E>(executor: E, root: i64) -> SqlR
 where
     E: Executor<'a, Database = Sqlite>,
 {
-    let select_note_collection_from_root_sql =
-        include_str!("sql/select_note_collection_from_root.sql");
+    let select_note_collection_from_root_sql = include_str!("sql/select_note_collection_from_root.sql");
     Ok(sqlx::query_as(&select_note_collection_from_root_sql)
         .bind(&root)
         .fetch_all(executor)
@@ -355,12 +394,14 @@ where
     ))
 }

-fn find_maximal_slug(slugs: &Vec<JustSlugs>) -> Option<u32> {
+// Given a possible slug, find the slug with the highest
+// uniquification number, and return that number, if any.
+fn find_maximal_slug(slugs: &[JustSlugs]) -> Option<u32> {
     lazy_static! {
         static ref RE_CAP_NUM: Regex = Regex::new(r"-(\d+)$").unwrap();
     }
-    if slugs.len() == 0 {
+    if slugs.is_empty() {
         return None;
     }
@@ -427,15 +468,33 @@ where
     ))
 }

-async fn bulk_select_ids_for_note_uuids<'a, E>(
+async fn insert_note_to_page_relationships<'a, E>(
     executor: E,
-    ids: &Vec<&str>,
-) -> SqlResult<Vec<(String, i64)>>
+    note_id: NoteId,
+    references: &[PageId],
+) -> SqlResult<()>
 where
     E: Executor<'a, Database = Sqlite>,
 {
-    let bulk_select_ids_for_note_uuids_sql = "SELECT uuid, id FROM notes WHERE uuid IN ("
-        .to_string()
+    let insert_note_page_references_sql = "INSERT INTO page_relationships (note_id, page_id) VALUES ".to_string()
+        + &["(?, ?)"].repeat(references.len()).join(", ")
+        + &";".to_string();
+    let mut request = sqlx::query(&insert_note_page_references_sql);
+    for reference in references {
+        request = request.bind(*note_id).bind(**reference);
+    }
+    request.execute(executor).await.map(|_| ())
+}
+
+// For a given collection of uuids, retrieve the internal ID used by
+// the database.
+async fn bulk_select_ids_for_note_uuids<'a, E>(executor: E, ids: &[&str]) -> SqlResult<Vec<(String, i64)>>
+where
+    E: Executor<'a, Database = Sqlite>,
+{
+    let bulk_select_ids_for_note_uuids_sql = "SELECT uuid, id FROM notes WHERE uuid IN (".to_string()
         + &["?"].repeat(ids.len()).join(",")
         + &");".to_string();
@@ -455,7 +514,11 @@ where
         .collect())
 }

-async fn get_note_note_relationship<'a, E>(
+// Used by move_note to identify the single note to note relationship
+// by the original parent and child pair. Used mostly to find the
+// position for recalculation, to create a new gap or close an old
+// one.
+async fn get_note_to_note_relationship<'a, E>(
     executor: E,
     parent_id: ParentId,
     note_id: NoteId,
@@ -463,33 +526,29 @@ async fn get_note_note_relationship<'a, E>(
 where
     E: Executor<'a, Database = Sqlite>,
 {
-    let get_note_note_relationship_sql = concat!(
+    let get_note_to_note_relationship_sql = concat!(
         "SELECT parent_id, note_id, position, nature ",
         "FROM note_relationships ",
         "WHERE parent_id = ? and note_id = ? ",
         "LIMIT 1"
     );
-    sqlx::query_as(get_note_note_relationship_sql)
+    sqlx::query_as(get_note_to_note_relationship_sql)
         .bind(&*parent_id)
         .bind(&*note_id)
         .fetch_one(executor)
         .await
 }

-async fn delete_note_note_relationship<'a, E>(
-    executor: E,
-    parent_id: ParentId,
-    note_id: NoteId,
-) -> SqlResult<()>
+async fn delete_note_to_note_relationship<'a, E>(executor: E, parent_id: ParentId, note_id: NoteId) -> SqlResult<()>
 where
     E: Executor<'a, Database = Sqlite>,
 {
-    let delete_note_note_relationship_sql = concat!(
+    let delete_note_to_note_relationship_sql = concat!(
         "DELETE FROM note_relationships ",
         "WHERE parent_id = ? and note_id = ? "
     );
-    let count = sqlx::query(delete_note_note_relationship_sql)
+    let count = sqlx::query(delete_note_to_note_relationship_sql)
         .bind(&*parent_id)
         .bind(&*note_id)
         .execute(executor)
@@ -502,11 +561,70 @@ where
     }
 }

-async fn close_hole_for_deleted_note<'a, E>(
-    executor: E,
-    parent_id: ParentId,
-    position: i64,
-) -> SqlResult<()>
+async fn delete_note_to_page_relationships<'a, E>(executor: E, note_id: NoteId) -> SqlResult<()>
+where
+    E: Executor<'a, Database = Sqlite>,
+{
+    let delete_note_to_page_relationships_sql = concat!("DELETE FROM page_relationships ", "WHERE note_id = ? ");
+    let _ = sqlx::query(delete_note_to_page_relationships_sql)
+        .bind(&*note_id)
+        .execute(executor)
+        .await?;
+    Ok(())
+}
+
+async fn delete_note<'a, E>(executor: E, note_id: NoteId) -> SqlResult<()>
+where
+    E: Executor<'a, Database = Sqlite>,
+{
+    let delete_note_sql = concat!("DELETE FROM notes WHERE note_id = ?");
+    let count = sqlx::query(delete_note_sql)
+        .bind(&*note_id)
+        .execute(executor)
+        .await?
+        .rows_affected();
+    match count {
+        1 => Ok(()),
+        _ => Err(sqlx::Error::RowNotFound),
+    }
+}
+
+async fn count_existing_note_relationships<'a, E>(executor: E, note_id: NoteId) -> SqlResult<i64>
+where
+    E: Executor<'a, Database = Sqlite>,
+{
+    let count_existing_note_relationships_sql = "SELECT COUNT(*) as count FROM page_relationships WHERE note_id = ?";
+    let count: RowCount = sqlx::query_as(count_existing_note_relationships_sql)
+        .bind(&*note_id)
+        .fetch_one(executor)
+        .await?;
+    Ok(count.count)
+}
+
+async fn assert_max_child_position_for_note<'a, E>(executor: E, note_id: ParentId) -> SqlResult<i64>
+where
+    E: Executor<'a, Database = Sqlite>,
+{
+    let assert_max_child_position_for_note_sql =
+        "SELECT MAX(position) as count FROM note_relationships WHERE parent_id = ?";
+    let count: RowCount = sqlx::query_as(assert_max_child_position_for_note_sql)
+        .bind(&*note_id)
+        .fetch_one(executor)
+        .await?;
+    Ok(count.count)
+}
+
+// After removing a note, recalculate the position of all notes under
+// the parent note, such that there order is now completely
+// sequential.
+async fn close_hole_for_deleted_note<'a, E>(executor: E, parent_id: ParentId, position: i64) -> SqlResult<()>
 where
     E: Executor<'a, Database = Sqlite>,
 {
@@ -524,49 +642,37 @@ where
         .map(|_| ())
 }

-async fn find_all_references_for<'a, E>(
-    executor: E,
-    references: &Vec<String>,
-) -> SqlResult<Vec<PageTitles>>
+async fn find_all_page_references_for<'a, E>(executor: E, references: &[String]) -> SqlResult<Vec<PageTitles>>
 where
     E: Executor<'a, Database = Sqlite>,
 {
-    let find_all_references_for_sql = "SELECT id, title FROM pages WHERE title IN ("
-        .to_string() +
-        &["?"].repeat(references.len()).join(",") +
-        &");".to_string();
+    let find_all_references_for_sql = "SELECT id, title FROM pages WHERE title IN (".to_string()
+        + &["?"].repeat(references.len()).join(",")
+        + &");".to_string();
     let mut request = sqlx::query_as(&find_all_references_for_sql);
     for id in references.iter() {
         request = request.bind(id);
     }
-    request
-        .fetch_all(executor)
-        .await
+    request.fetch_all(executor).await
 }

-async fn insert_note_page_references<'a, E>(
-    executor: E,
-    note_id: NoteId,
-    references: &Vec<PageId>,
-) -> SqlResult<()>
+async fn update_note_content<'a, E>(executor: E, note_id: NoteId, content: &str) -> SqlResult<()>
 where
     E: Executor<'a, Database = Sqlite>,
 {
-    let insert_note_page_references_sql =
-        "INSERT INTO note_page_references (note_id, page_id) VALUES ".to_string() +
-        &["(?, ?)"].repeat(references.len()).join(", ") +
-        &";".to_string();
-    let mut request = sqlx::query(&insert_note_page_references_sql);
-    for reference in references {
-        request = request.bind(&*note_id).bind(**reference);
-    }
-    request
+    let update_note_content_sql = "UPDATE notes SET content = ? WHERE note_id = ?";
+    let count = sqlx::query(update_note_content_sql)
+        .bind(content)
+        .bind(&*note_id)
         .execute(executor)
-        .await
-        .map(|_| ())
+        .await?
+        .rows_affected();
+    match count {
+        1 => Ok(()),
+        _ => Err(sqlx::Error::RowNotFound),
+    }
 }

 fn create_unique_root_note() -> NewNote {
@@ -589,10 +695,8 @@ fn create_new_page_for(title: &str, slug: &str, note_id: NoteId) -> NewPage {
 // Given the references supplied, and the references found in the datastore,
 // return a list of the references not found in the datastore.
-fn diff_references(references: &Vec<String>, found_references: &Vec<PageTitles>) -> Vec<String> {
+fn diff_references(references: &[String], found_references: &[PageTitles]) -> Vec<String> {
     let all: HashSet<String> = references.iter().cloned().collect();
     let found: HashSet<String> = found_references.iter().map(|r| r.title.clone()).collect();
     all.difference(&found).cloned().collect()
 }
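
To show how the newly implemented methods fit together, here is a hedged usage
sketch (for example, from a test inside this crate). The surrounding setup, the
UUID strings, and the assumption that NoteResult is a Result alias over
NoteStoreError are illustrative; the commit does not show how a NoteStore is
constructed.

```rust
// Sketch only: assumes `store` is an already-constructed NoteStore and that
// the UUIDs below exist. Neither assumption is shown in this commit.
async fn exercise_delete_and_update(store: &NoteStore) -> Result<(), NoteStoreError> {
    // update_note_content rewrites the body and re-resolves its [[page]]
    // references, creating any referenced pages that don't exist yet.
    store
        .update_note_content("some-note-uuid", "New text linking to [[Some Box]]")
        .await?;

    // delete_note detaches the note from the given parent; the note row itself
    // is removed only when no other relationships still point at it.
    store.delete_note("some-note-uuid", "its-parent-uuid").await?;

    Ok(())
}
```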

View File

@@ -1,58 +1,116 @@
 use chrono::{DateTime, Utc};
+use derive_builder::Builder;
+use serde::{Deserialize, Serialize};
 use sqlx::{self, FromRow};

-// // A Resource is either content or a URL to content that the
-// // user embeds in a note. TODO: I have no idea how to do this yet,
-// // but I'll figure it out.
-// #[derive(Clone, Serialize, Deserialize, Debug)]
-// pub struct Resource {
-//     pub id: String,
-//     pub content: String,
-// }
-//
-// // A Breadcrumb is a component of a reference. Every element should
-// // be clickable, although in practice what's going to happen is that
-// // the user will be sent to the *page* with that note, then *scrolled*
-// // to that note via anchor.
-// #[derive(Clone, Debug)]
-// pub struct Breadcrumb {
-//     pub note_id: String,
-//     pub summary: String,
-// }
-//
-// // A Note is the heart of our system. It is a single object that has
-// // a place in our system; it has a parent, but it also has embedded
-// // references that allow it to navigate through a web of related
-// // objects. It may have children. *AT THIS LAYER*, though, it is
-// // returned as an array. It is up to the
-// #[derive(Clone, Debug)]
-// pub struct Note {
-//     pub id: String,
-//     pub parent_id: String,
-//     pub content: String,
-//     pub resources: Vec<Resource>,
-//     pub note_type: String, // Describes the relationship to the parent note.
-//     pub created: DateTime<Utc>,
-//     pub updated: DateTime<Utc>,
-//     pub viewed: DateTime<Utc>,
-//     pub deleted: Option<DateTime<Utc>>,
-// }
-//
-// pub struct Reference {
-//     pub page_id: String,
-//     pub page_title: String,
-//     pub reference_summary_titles: Vec<Breadcrumbs>,
-//     pub reference_summary: String,
-// }
-
-pub struct Page {
+#[derive(Clone, Serialize, Deserialize, Debug, FromRow)]
+pub struct RawPage {
+    pub id: i64,
     pub slug: String,
     pub title: String,
-    // pub notes: Vec<Notes>, // The actual notes on this page.
-    // pub references: Vec<Reference>, // All other notes that reference this page.
-    // pub unlinked_references: Vec<Reference>,
-    pub created: DateTime<Utc>,
-    pub updated: DateTime<Utc>,
-    pub viewed: DateTime<Utc>,
-    pub deleted: Option<DateTime<Utc>>,
+    pub note_id: i64,
+    pub creation_date: DateTime<Utc>,
+    pub updated_date: DateTime<Utc>,
+    pub lastview_date: DateTime<Utc>,
+    pub deleted_date: Option<DateTime<Utc>>,
+}
+
+#[derive(Clone, Serialize, Deserialize, Debug, FromRow)]
+pub struct RawNote {
+    pub id: i64,
+    pub uuid: String,
+    pub parent_id: i64,
+    pub parent_uuid: String,
+    pub content: String,
+    pub position: i64,
+    pub notetype: String,
+    pub creation_date: DateTime<Utc>,
+    pub updated_date: DateTime<Utc>,
+    pub lastview_date: DateTime<Utc>,
+    pub deleted_date: Option<DateTime<Utc>>,
+}
+
+#[derive(Clone, Serialize, Deserialize, Debug, Builder)]
+pub struct NewPage {
+    pub slug: String,
+    pub title: String,
+    pub note_id: i64,
+    #[builder(default = r#"chrono::Utc::now()"#)]
+    pub creation_date: DateTime<Utc>,
+    #[builder(default = r#"chrono::Utc::now()"#)]
+    pub updated_date: DateTime<Utc>,
+    #[builder(default = r#"chrono::Utc::now()"#)]
+    pub lastview_date: DateTime<Utc>,
+    #[builder(default = r#"None"#)]
+    pub deleted_date: Option<DateTime<Utc>>,
+}
+
+#[derive(Clone, Serialize, Deserialize, Debug, Builder)]
+pub struct NewNote {
+    #[builder(default = r#""".to_string()"#)]
+    pub uuid: String,
+    pub content: String,
+    #[builder(default = r#""note".to_string()"#)]
+    pub notetype: String,
+    #[builder(default = r#"chrono::Utc::now()"#)]
+    pub creation_date: DateTime<Utc>,
+    #[builder(default = r#"chrono::Utc::now()"#)]
+    pub updated_date: DateTime<Utc>,
+    #[builder(default = r#"chrono::Utc::now()"#)]
+    pub lastview_date: DateTime<Utc>,
+    #[builder(default = r#"None"#)]
+    pub deleted_date: Option<DateTime<Utc>>,
+}
+
+#[derive(Clone, Serialize, Deserialize, Debug, FromRow)]
+pub(crate) struct JustSlugs {
+    pub slug: String,
+}
+
+#[derive(Clone, Serialize, Deserialize, Debug, FromRow)]
+pub(crate) struct JustTitles {
+    title: String,
+}
+
+#[derive(Clone, Serialize, Deserialize, Debug, FromRow)]
+pub(crate) struct JustId {
+    pub id: i64,
+}
+
+#[derive(Clone, Serialize, Deserialize, Debug, FromRow)]
+pub(crate) struct PageTitles {
+    pub id: i64,
+    pub title: String,
+}
+
+#[derive(Clone, Serialize, Deserialize, Debug, FromRow)]
+pub(crate) struct NoteRelationship {
+    pub parent_id: i64,
+    pub note_id: i64,
+    pub position: i64,
+    pub nature: String,
+}
+
+#[derive(Clone, Serialize, Deserialize, Debug, FromRow)]
+pub(crate) struct RowCount {
+    pub count: i64,
+}
+
+#[cfg(test)]
+mod tests {
+    use super::*;
+
+    #[test]
+    fn can_build_new_note() {
+        let now = chrono::Utc::now();
+        let newnote = NewNoteBuilder::default()
+            .uuid("foo".to_string())
+            .content("bar".to_string())
+            .build()
+            .unwrap();
+        assert!((newnote.creation_date - now).num_minutes() < 1);
+        assert!((newnote.updated_date - now).num_minutes() < 1);
+        assert!((newnote.lastview_date - now).num_minutes() < 1);
+        assert!(newnote.deleted_date.is_none());
+    }
 }