- A little update: I added privacy-focused optional shorter URLs to SDocs.
You can read more about the implementation here: https://sdocs.dev/#sec=short-links
Briefly:
We encrypt your document client side. The encrypted document is sent to the server with an id to save it against. The encryption key stays client side in the URL fragment. (And, probably very obviously, the encryption key is required to make the server-stored text readable again.)

    https://sdocs.dev/s/{short id}#k={encryption key}
                        └────┬───┘   └───────┬──────┘
                             │               │
                          sent to       never leaves
                           server       your browser

You can test this by opening your browser's developer tools, switching to the Network tab, clicking Generate next to the "Short URL" heading, and inspecting the request body. You will see a base64-encoded blob of random bytes, not your document.
- URL data sites are always very cool to me. The offline service worker part is great.
The analytics[1] is incredible. Thank you for sharing (and explaining)! I love this implementation.
I'm a little confused about the privacy mention. Maybe the fragment data isn't passed, but that's not a particularly strong guarantee. The JavaScript still has access, so privacy is just a promise as far as I can tell.
Am I misunderstanding something and is there a stronger mechanism in browsers preserving the fragment data's isolation? Or is there some way to prove a url is running a github repo without modification?
- Thanks for the kind words re the analytics!
You are right re privacy. It is possible to go from URL hash -> parse -> server (that's not what SDocs does, to be clear).
I’ve been thinking about how to prove our privacy mechanism. The idea I have in my head at the moment is to have 2+ established coding agents review the code after every merge to the codebase and to provide a signal (maybe visible in the footer) that, according to them, it is secure and that the check was made after the latest merge. Maybe overkill?! Or maybe a new way to “prove” things?? If you have other ideas please let me know.
- No, I don't have any good ideas. Just hoping someone else does, or that I'm missing something.
I think it's in the hands of browser vendors.
The agent review a la socket.dev probably doesn't address all the gaps. I think you're already doing about as much as you reasonably can.
- Thanks. The question has made me wonder about the value of some sort of real time verification service.
- I also used the fragment technique for sharing HTML snippets, but URLs became very long; I had to implement an optional URL shortener after users complained. Unfortunately that meant server interaction.
- (I left a stand alone comment, but:) A little update: I added privacy-focused optional shorter URLs to SDocs.
You can read more about the implementation here: https://sdocs.dev/#sec=short-links
Briefly:
We encrypt your document client side. The encrypted document is sent to the server with an id to save it against. The encryption key stays client side in the URL fragment. (And, probably very obviously, the encryption key is required to make the server-stored text readable again.)

    https://sdocs.dev/s/{short id}#k={encryption key}
                        └────┬───┘   └───────┬──────┘
                             │               │
                          sent to       never leaves
                           server       your browser

You can test this by opening your browser's developer tools, switching to the Network tab, clicking Generate next to the "Short URL" heading, and inspecting the request body. You will see a base64-encoded blob of random bytes, not your document.
- Really nice implementation by the way.
Re URL length: Yes... I have a feeling it could become an issue. I was wondering if a browser extension might give users the ability to have shorter urls without losing privacy... but haven't looked into it deeply/don't know if it would be possible (browser extensions are decent bridges between the local machine and the browser, so maybe some sort of decryption key could be used to allow for more compressed urls...)
- I doubt it would be possible; it boils down to a compression problem: compressing x amount of content into y bits. Since the content is unpredictable, it cannot be done without an intermediary to store it.
- For this use-case, maybe compression and then encoding would get more data into the URL before you hit a limit (or before users complain)?
I.e. .md -> gzip -> base64
- This is a neat tool. I always had to manually copy-paste long texts into Notepad and convert them into md format. Obviously I couldn't parse complex sites with lots of images or those that had weird editing. This will be useful.
- Thank you. If you use an AI agent you might be able to tell it to curl the target website, extract the content into a markdown file, and then sdoc it. It might have some interesting ideas with images (using the hosted URLs or hosting them yourself somehow).
- I had not heard of url fragments before. Is there a size cap?
- Ish, but the cap is the length of URL that the browser can handle. For desktop Chrome it's 2MB, but for mobile Safari it's 80KB.
The compression algo SDocs uses reduces the size of your markdown file by ~10x, so the 80KB cap still fits ~800KB of markdown, which is fairly beefy.
- Cool project. Heads up - there’s a commercial company with a very similar name that might decide to hassle you about it:
- Thanks + thanks for the heads up. I will see what happens. It's a domain-name war out there!