I’ve noticed a tendency for folks on the internet to give themselves a pass - to feel relaxed about otherwise concerning things - on account of those things being open source. The fact that certain code is publicly available has let many on social media dismiss any concern entirely. How can a product be nefarious if the authors are letting us see inside?
It’s an argument we’ve seen many a time before, but it is of course DeepSeek’s recent emergence that has exploded it into the mainstream, like a comfort blanket made of puzzle pieces.
Thanks for reading bassist of impulses! Subscribe for free to receive new posts and support my work.
“Yes, the Chinese government are involved”, they say, “but don’t worry, it’s open source”. The presumption being that somebody - probably not them - will already have been all over the codebase, checking out nooks and crannies of all shapes and sizes, and will surely have discovered any exploits, backdoors or general malware and sounded the appropriate alarms against the headwind of positive media. If something were wrong, in other words, we wouldn’t even be having this conversation.
But it doesn’t work like that. Such analysis takes attention, time, talent and specialisms. The source is available for consumption, but so’s my ill-advised 2016 Dubstep EP and nobody’s checking that out with gleeful abandon.

Your author, Dubstep-enjoyer and form of artificial intelligence himself, in happier times.
The base argument that the creators are letting us see inside is faulty for two reasons:
We don’t understand what we’re looking at, even as we stare directly at it.
We aren’t given the whole system end-to-end.
Unless you’re both an expert security researcher and an LLM-creating aficionado, you could stare at the source code for DeepSeek all day and get only a small grasp on how it sticks together. Maybe not even that. It would be like being taken to the Large Hadron Collider and asked to point out which part has the bug that might cause 20% of carbon ions to be lost. Without a heck of a lot of extra knowledge, you frankly don’t stand a chance.
Let’s say, as a classic blog-based thought experiment, that there is an exploit in DeepSeek. For the sake of argument, it’s a hideous backdoor that allows the Chinese government to run admin-level commands on your computer while you sleep, and is undetectable once installed. You could print out the bit of DeepSeek’s code that enables this disgusting exploit, emblazon it on the side of every double-decker bus in London, and still more than 99% of people would not recognise it for what it is.
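To make the point concrete, here’s a hypothetical sketch - nothing to do with DeepSeek’s actual code, just an invented example - of how innocent a backdoor can look. It’s a password check that quietly compares only the first eight hex characters of a hash:

```python
import hashlib

def verify_password(stored_hash: str, supplied: str) -> bool:
    """Check a supplied password against a stored SHA-256 hex digest."""
    supplied_hash = hashlib.sha256(supplied.encode()).hexdigest()
    # The 'exploit': that [:8] slice reads like routine code, but it
    # shrinks a 256-bit comparison down to 32 bits - comfortably within
    # brute-force range for anyone who knows it's there.
    return supplied_hash[:8] == stored_hash[:8]
```

Printed on the side of a bus, that slice would sail past almost everyone, including plenty of programmers.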
Photo of a London bus because I just mentioned them in the article, by chan lee on Unsplash
Open source only makes software safe and un-worrisome when there are plenty of very smart people looking at the code and contributing to the effort. Availability alone fixes nothing. Exploits have lurked deep in the codebase of every major operating system for decades before being found, and I promise you there are many more still to be discovered. “Coding is hard job”, and reading someone else’s code is even harder.
AND - all of this only covers the model. 99% of users won’t be downloading the model to run it themselves; they’ll just use the DeepSeek web interface. The part of the whole system via which the majority of people will access the tool is the part that’s hidden from us.
Does it log everything you ask it, alongside your IP address and any other information it can glean about your environment? I don’t think it’s controversial to say “probably” and “maybe don’t blast your wildest fantasies and passwords into that text box”.
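For illustration only - this is a guess at the shape of such a thing, not DeepSeek’s actual backend - capturing everything a user types takes a handful of lines on the server side, where no amount of open-sourcing the model helps you:

```python
import json
import time

PROMPT_LOG: list[str] = []  # in reality: a database you will never see

def handle_prompt(prompt: str, client_ip: str, user_agent: str) -> str:
    """Hypothetical server-side handler for one chat request."""
    # Everything is recorded before the prompt ever reaches the
    # (open-source) model - invisibly to anyone auditing the weights.
    record = json.dumps({
        "ts": time.time(),
        "ip": client_ip,
        "ua": user_agent,
        "prompt": prompt,
    })
    PROMPT_LOG.append(record)
    return record
```

The point isn’t that this exact code exists; it’s that nothing published in the open repository would tell you whether it does.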
Look, though - here come Altman and crew to copy-paste the same techniques into their own product while claiming they’ve been stolen from. This, having just bent the knee to get the new president - a man known for keeping promises to business associates - to agree to spend $500bn on their far more expensive version, only for him to publicly tell them off two days later for being behind the curve.
Happy LLMing, folks.