This is nice.
However, how does one access their diary once you've stopped maintaining it? Is this targeted more at technically inclined, high-profile people who need to keep secrets?
Personally, I believe something like a diary/journal should be in a format easily readable by most tools (plain text, or Markdown at best), kept in a container/folder. Then encrypt that container/folder instead. In the future, when you need to change the encryption/decryption tool, just move the container/folder.
For instance, tools such as https://cryptomator.org come to mind.
I think there are two different concerns mixed together:
1) Can I still read my data in 10 years? That’s mostly about open, well-documented formats + an export path. A journaling app can still be “safe” here if it can export to Markdown/plain-text (or at least JSON) and the on-disk schema is documented.
2) Can I decrypt it in 10 years? That’s about using boring primitives (AES-GCM, Argon2/scrypt/PBKDF2) and keeping the crypto layer simple. If it’s standard crypto, you’re not locked to one vendor the way you might be with a bespoke format.
The “plain files in an encrypted folder” approach (Cryptomator/VeraCrypt) is totally reasonable—and arguably the simplest threat model—but you do give up a lot of what makes a journal app nice (full-text search, tags, structured metadata, conflict handling, etc.). SQLite + client-side encryption is a fine compromise if there’s a solid export and the KDF/password story is strong.
The biggest real risk is still: losing the password. A printable recovery key / key export would help more than switching formats.
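To make "boring primitives" concrete, here's a minimal sketch of the KDF half using Python's stdlib `hashlib.scrypt` (the cost parameters and passphrase are illustrative, not a recommendation):

```python
import hashlib
import secrets

def derive_key(password: str, salt: bytes) -> bytes:
    """Derive a 256-bit key from a password with scrypt (stdlib since 3.6)."""
    return hashlib.scrypt(
        password.encode("utf-8"),
        salt=salt,
        n=2**14, r=8, p=1,        # illustrative cost parameters (~16 MiB)
        maxmem=2**26,             # allow up to 64 MiB of scrypt working memory
        dklen=32,
    )

salt = secrets.token_bytes(16)    # stored alongside the ciphertext; not secret
key = derive_key("correct horse battery staple", salt)

# Same password + same salt -> same key; that's what makes decryption
# repeatable a decade later with any scrypt implementation.
assert derive_key("correct horse battery staple", salt) == key
```

Because scrypt is specified independently of any one library, this is the part that stays decryptable even if the original app disappears.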
Make the journal app store its data in plain-text Markdown files in an encrypted folder (or ZIP).
If necessary for things like search, add a cache file to the folder.
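The "cache file for search" can be as small as an SQLite FTS index that is rebuilt from the Markdown files and is safe to delete at any time. A rough sketch (assumes your Python's SQLite is compiled with FTS5, which standard builds are; the filenames and entries are made up):

```python
import sqlite3

# Throwaway search cache: the Markdown files stay the source of truth,
# and this index can be deleted and rebuilt whenever it goes stale.
db = sqlite3.connect(":memory:")  # in practice: a cache.db inside the folder
db.execute("CREATE VIRTUAL TABLE entries USING fts5(path, body)")

# Pretend these were read from the plain-text journal files on disk.
journal = {
    "2024-01-03.md": "Started learning Rust today. Ownership is confusing.",
    "2024-01-04.md": "Long walk, then more Rust. Borrow checker finally clicked.",
}
db.executemany("INSERT INTO entries VALUES (?, ?)", journal.items())

# Full-text query against the cache; only paths come back, so losing the
# cache loses nothing.
hits = db.execute(
    "SELECT path FROM entries WHERE entries MATCH ? ORDER BY rank",
    ("borrow",),
).fetchall()
```

This keeps the plain-files guarantee while recovering most of the search convenience a dedicated app provides.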
Yeah, currently you can export your journal to JSON or Markdown files, so you can walk away at any point. Vendor lock-in is one of the main things I wanted to avoid; that's why I stuck with boring, standard libraries and encryption as much as possible. Thanks for the feedback!
> Every entry is encrypted with AES-256-GCM before it touches disk
Until the OS needs more memory and swaps your secrets out.
Protected memory can be used to fix that. Working on a related project that I'm planning to share soon.
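For reference, "protected memory" here usually means something like POSIX `mlock`: pin the pages holding key material so the kernel never writes them to swap, and wipe them before unpinning. A rough ctypes sketch (Linux/macOS assumption; `mlock` can fail under `RLIMIT_MEMLOCK`, so real code needs a fallback):

```python
import ctypes
import ctypes.util

# Caveat: CPython copies bytes objects freely, so locking one buffer is
# best-effort; a serious implementation keeps secrets in locked C buffers only.
libc = ctypes.CDLL(ctypes.util.find_library("c") or "libc.so.6", use_errno=True)

key = ctypes.create_string_buffer(32)              # holds the derived key
locked = libc.mlock(key, ctypes.sizeof(key)) == 0  # may fail under RLIMIT_MEMLOCK

key.raw = b"\xab" * 32    # ... key material lives here while in use ...
key.raw = b"\x00" * 32    # wipe before the pages can be released

if locked:
    libc.munlock(key, ctypes.sizeof(key))
```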
But so what? Another app can't really read the swap file/partition unless it runs with elevated privileges like root, in which case the system is compromised anyway.
I thought we were all supposed to be encrypting our swap. Or is there something better an app can do about this?
I'm using Obsidian and CryFS. Nothing has access to those except a few programs. I'm storing notes, files, documents, whatever is important, and everything is synced to the cloud.
This is the beauty of it. If it works for you it's great. If this new app works for others, then it's great.
That's a good win-win situation.
As a fellow obsidian user, I wouldn't scoff at a simple app which does one thing well.
I love the minimalism of the UI.
Here's a tip: GitHub now allows you to embed a proper video in your README (https://stackoverflow.com/questions/4279611/how-to-embed-a-v...). The quality would be much better, and people can navigate back and forth in the video.
Thanks! I will check that out
The dual-auth approach (password + recovery key) is smart, but I think the thread is missing a bigger design question: what happens when the encryption scheme itself needs to evolve?
AES-256-GCM is rock solid today, but 10 years from now you might want to migrate to something post-quantum. If the crypto layer is tightly coupled to the storage format, you end up needing to decrypt everything with the old scheme and re-encrypt with the new one.
Better pattern: separate the key-wrapping layer from the data encryption layer. Wrap your data encryption key (DEK) with a key-encryption key (KEK) derived from the password. When you need to upgrade crypto, you only re-wrap the DEK. This is basically what LUKS does.
Also +1 to the SQLite choice. The Library of Congress recommends it as an archival format. SQLite + client-side AES + a documented schema is genuinely more future-proof than most plaintext approaches, because you get ACID transactions and full-text search for free.
Damn, that’s a fancy README.md, love it
thanks!
Looks really cool, I like the pretty but minimalist interface. Could I store the SQLite file on, say, Google Drive so that I could access my journal from different devices while the contents are still kept secure because they’re encrypted?
Yes, you definitely can! Currently you can see the location of the .db file in the preferences while your journal is open.
I will improve the experience for this use case in follow-up releases, for example by letting you define an arbitrary path for your db file.
Thanks for the feedback!
The biggest problem is that this is not available on mobile platforms. Most people do this on their phones, not their laptops.
Support for it is planned. It was designed from the beginning with all the major platforms in mind; I just started with desktop support because that was my best use case. Mobile support is planned for the near future: Android will follow shortly, and an iOS version can be done if there is demand for it. Thanks!
Here's another approach using Rclone and an editor of your choice. Rclone has a built-in crypt backend that can encrypt your data and store it with a cloud provider. I use it along with Sublime Text to journal, and store my encrypted data on Dropbox.
More here: https://alabhya.me/rclone
Obsidian.md
One major problem: I don't want a journal with unbreakable encryption where I lose all my data if I ever lose the key.
I already pay for a journaling website where I know I can always recover my journals as long as I have access to my Gmail.
So, while I appreciate this security-first mindset, for me it actually becomes less interesting. I want my journal to sync to the cloud, I want to be able to unlock it, and I don't want to risk losing years of journals because I forget a single key.
>as long as I have access to my Gmail
I think you should be more cautious about relying on the services of a company like Google, which can arbitrarily decide to remove your account data or access. A similar story, though the person was fortunate enough to regain access: https://hey.paris/posts/appleid/
You can mitigate hardware failure and data loss, especially for a simple key, but you may not be able to prevent Google from deciding your account is gone one day.
Thanks for the feedback! That point is super valid; that's why I created it with multiple authentication slots in mind (currently, it supports both password and public-key authentication), so you can use several simultaneously and don't need to rely on a single point of failure.
For example, if you set up a password and a key, you can use your key, and if it gets lost or compromised, you can still log in with the password, remove the old key, and generate a new one.
You can do the same in reverse: just use the password and keep the key in a safe place (like a password manager or a physical USB), and if you lose your password, you can still get access with the key.
Thanks again!
Nice project. The SQLite-on-cloud-drive approach mentioned in another comment is actually pretty solid — if the encryption is done client-side before the file hits the cloud, it doesn't matter where it's stored. The key thing is making sure the key derivation is robust enough that a compromised cloud account doesn't compromise journal contents.
One thing I'd push back on regarding the "what if you stop maintaining it" concern: SQLite with AES-256-GCM is about as future-proof as you can get. Both are standards with multiple implementations. The real risk isn't the format dying — it's losing the password. A recovery key export (even just a paper backup of the key material) would go a long way.
For the cross-device case, you might also consider something like Syncthing for sync without any cloud intermediary. Keeps the threat model simpler.
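On the "paper backup of the key material" point, a printable recovery key can be as simple as the raw key rendered in grouped hex, plus a parser that round-trips it. A minimal sketch (the grouping format is arbitrary):

```python
import secrets

def to_printable(key: bytes) -> str:
    """Render a key as dash-grouped hex suitable for a paper backup."""
    h = key.hex()
    return "-".join(h[i:i + 8] for i in range(0, len(h), 8))

def from_printable(text: str) -> bytes:
    """Parse the paper form back into key bytes (ignores dashes/whitespace)."""
    return bytes.fromhex("".join(text.split()).replace("-", ""))

key = secrets.token_bytes(32)        # 256-bit recovery key
printable = to_printable(key)        # 8 groups of 8 hex chars

# Whatever gets typed back in from paper must recover the exact key bytes.
assert from_printable(printable) == key
```

Transcription-friendly grouping matters more than the encoding itself; anything a user can retype a decade later works.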
Cloud aside, SQLite in iCloud Drive is the reason I am not using the Bear notes app. After losing data to convoluted file formats a couple of times, I do not consider any journal or notes app that doesn’t let me see/edit plain-text files on disk. I will deal with encryption, storage, etc. on my own. These files are too personal to be locked away or put behind any amount of friction. I still have tons of files locked in Dyrii, which was abandoned.
Hey, thanks for the feedback! Yes, currently in the preferences you can see the path of your local SQLite DB file, so you could definitely sync that to the cloud.
I will improve it further in the next releases to make it even simpler (for example, by letting you define a custom path for the store, which cannot be done currently), but it can definitely be done already.
Regarding the key for recovery: you can already do it. Mini-Diarium already supports both password and public key authentication. So you can use the password and generate the .key file and keep it in a secure place as a backup in case you forget your password (or do it in reverse: use the key file and have the password as a backup).
Thanks again!