You might as well run our insecure cloud . . .
Because we are busy wrecking Linux security anyway!
Everybody dies ;)
I think Zig is different. As long as it gets to an actual release soon, it occupies a unique position among the "new" languages. I'm not a fanboy so I won't preach, but there is very little hype and a lot of good work going into it.
FWIW the Ubuntu packages won't install on my Debian system, and I can't be bothered to install the dependencies & build it myself, so I'll have to take the word of others here as to whether the TE is any good!
Natural language is generally horrible for expressing logic. "Which language?", you might ask.
I suspect that people using this "technology" in future will need to be fluent in at least one of the _very few_ languages with enough internet data to snaffle^H^H^H^H^H^H^Htrain on . . . the others (languages, not people!) will simply wither & die.
On the contrary.
In the "days when there wasn't a lot of memory", running a desktop was pretty pointless, so in fact there was quite a lot of memory to spare for non-gui stuff!
I used a ramdisk (tmpfs) even on the 512MB Pi 1B.
Read that again, 512 _Megabytes_! Imagine having that much memory when Linux first came out.
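For anyone who hasn't tried it, a tmpfs ramdisk really is a one-liner; a minimal sketch, assuming a 64MB ceiling and a mount point of /mnt/ramdisk (both just examples):

$ sudo mkdir -p /mnt/ramdisk
$ sudo mount -t tmpfs -o size=64m tmpfs /mnt/ramdisk

Make it permanent with an /etc/fstab line like "tmpfs /mnt/ramdisk tmpfs size=64m 0 0". Note that tmpfs only takes RAM for what is actually stored in it, so the size is a limit, not a reservation - which is why it was viable even on the Pi.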
Sounds like a "nice to have", but that is exactly the sort of procedure that is likely to go wrong IMO, or more likely just not be used. How much "training" do you think would be necessary to get that process anywhere near right? What if code or products (or anything else) change - yes, retrain again, then patch up mistakes, etc. etc., ad infinitum.
I really do think the "translation" route is the right one: the docs and code are right there and can be directly compared! Why add layers of expensive and error-prone fuzzing?
I find your first sentence puzzling. Surely this is as it _should_ be, and translating docs to code should be a deterministic process. We all know that this technology is going to be most "useful" to those idiot company bosses who see documentation as an avoidable cost (in other words, pretty much all of them!).
$ ls -la /bin/bash /bin/dash
-rwxr-xr-x 1 root root 1234376 Mar 27 2022 /bin/bash
-rwxr-xr-x 1 root root 125560 Dec 10 2020 /bin/dash
Pretty similar situation with memory usage. All that extra code is for interactive features, none of which is useful for batch processing (plus one or two trivial syntactic "features" to convince you that you need it).
My _shell_ is bash; all my batch scripts are run in dash.
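If anyone wants to try the same split, a minimal sketch (the script name is hypothetical): give batch scripts a plain #!/bin/sh shebang and keep them POSIX - on Debian, /bin/sh is dash anyway.

$ head -n 1 backup.sh
#!/bin/sh
$ checkbashisms backup.sh   # from the devscripts package; flags bash-only syntax
$ dash -n backup.sh         # parse the script without executing it

checkbashisms will catch things like [[ ]] and arrays before dash chokes on them at 3am.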
Perhaps it means: "this is not for you - try OpenRC instead" ;)
Seriously though, I have a lot of respect for the GNU folks, but IMO they have hamstrung acceptance of their software in two ways:
* Clinging to Lisp - hardly anyone used it in the '90s either
* The Info system - a rival to man pages, using an obscure mark-up, with obscure keystrokes, and all the docs split up into tiny pages of about five lines each
The Debian "base stack" is pretty complicated these days, what with systemd & other things.
Have you considered a more streamlined base OS, something like Alpine Linux, with BusyBox instead of the GNU utils (they can still be installed!) and musl libc instead of glibc? As a Debian user for three decades, I am very impressed with Alpine, and I keep it to hand on an SD card for experimenting.
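In case it helps, a minimal sketch of what "they can still be installed" looks like (these are the standard Alpine package names, to the best of my knowledge):

$ apk add coreutils bash    # GNU userland alongside the BusyBox applets
$ apk add gcompat           # glibc compatibility shim, if some binary insists on it

Once installed, the GNU binaries stand in for the corresponding BusyBox applets, so most Debian-flavoured scripts should just work.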
This approach might be more appropriate to the "maker" use case, and more maintainable for you guys in the longer term.
Just a suggestion . . .