back to article The US government wants developers to stop using C and C++

I must be a glutton for punishment. Not only was my first programming language IBM 360 Assembler, but my second was C. Programming anything in them wasn't easy; programming safely in either is much harder. So when the US Cybersecurity and Infrastructure Security Agency (CISA) and the FBI announced they were doubling down …

  1. Version 1.0 Silver badge
    Facepalm

    It's not the language, it's just the way it's "talking"

Years ago, after talking with the US Cybersecurity and Infrastructure Security Agency when our applications were being used by a few military medical labs, we removed all access to Microsoft DLLs and Windows features, returning to our original approach of just writing internal code independent of any operating system "features". They were happy and approved everything, and I've had nothing hacked since.

Basically, I don't see the language as the problem; it's normally just the way the code is written to access external environment "features" (which are, too often, problems).

    1. Claptrap314 Silver badge
      Facepalm

      Re: It's not the language, it's just the way it's "talking"

      The problem with making things idiot proof is that we keep getting better idiots.

The fact that Rust has this "unsafe" directive (or whatever it is called) means that the language designers absolutely know that the language cannot do what people want to do with it in a memory-safe manner. The claim that reasoning about unsafe behavior can be limited to these blocks is farcical. If you want to argue otherwise, pass your mathematics prelims & get back to me.

      There are plenty of reasons to avoid C++. And, for many applications C. But using memory safety as the reason to switch to a language which proponents claim is safe when it is plainly not? Pull the other one.
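For what it's worth, the mechanism being argued about can be sketched in a few lines of Rust. The `unsafe` block marks exactly where the compiler's checks are suspended, and the surrounding safe code is expected to re-establish the contract (a sketch; `first_byte` is an invented example, not anything from the article):

```rust
// A safe function wrapping a minimal unsafe block. The bounds check
// before the block is what upholds the contract `get_unchecked` requires.
fn first_byte(bytes: &[u8]) -> Option<u8> {
    if bytes.is_empty() {
        None
    } else {
        // SAFETY: index 0 is in bounds because the slice is non-empty.
        Some(unsafe { *bytes.get_unchecked(0) })
    }
}
```

Whether such local contracts are always upheld in practice is, of course, exactly what is being argued about here.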

      1. Michael Strorm Silver badge
        Trollface

        No, of course I've no idea if this remotely resembles the actual syntax used...

#BEGIN UNSAFE BLOCK
...
(your entire Rusty program here)
...
#END UNSAFE BLOCK

        Problem solved, and CISA are happy you've used their approved language!

        1. Ian Mason

          Re: No, of course I've no idea if this remotely resembles the actual syntax used...

          It might just be a bit easier to:

#pragma pretend-to-be-rust=on
int main()
....
return 0;
#pragma pretend-to-be-rust=off

          1. ChoHag Silver badge

            Re: No, of course I've no idea if this remotely resembles the actual syntax used...

            ON ERROR RESUME NEXT;

            1. MonkeyJuice Bronze badge

              Re: No, of course I've no idea if this remotely resembles the actual syntax used...

              This code stops for no man.

            2. Sceptic Tank Silver badge
              Headmaster

              Re: No, of course I've no idea if this remotely resembles the actual syntax used...

int main()
{
    try
    {
        ...
    }
    catch (...) {}
    return 0;
}

        2. Anonymous Coward
          Anonymous Coward

          Re: No, of course I've no idea if this remotely resembles the actual syntax used...

          I remember when the US DoD was ADA ON EVERYTHING

          There was a missile guidance system that was "written in Ada" that just called a C routine that never returned.

          "It's compliant!"

          1. Someone Else Silver badge

            What is old is new again

            I agree, AC. I was part of the military-software complex during the time that the DoD "insisted" that all new projects were to be done in Ada. (Never mind that Ada was, much like Rust is now, a language under development, and oh, what a mess Ada 3.0 became...but I digress.) Contractors would dutifully come back with bids that were some 3-5 times higher (and 2 to 3 times longer) than the original ones, and the DoD suddenly came up with a "waiver program" that would allow said contractors to develop the code as they originally would have.

            Expect to see this lather, rinse, repeat cycle replayed in all its capitalist glory under CISA's latest fiat.

            Just who do they think they are...tRump?

            1. vtcodger Silver badge

              Re: What is old is new again

I was also around for the Ada fiasco. Aside from the fact that the minuscule (by modern standards) embedded computers of the late 1970s probably wouldn't have had the resources to run Ada software (which was purportedly designed for embedded applications), the first DoD must_use_Ada dictates were issued years before there was an available Ada compiler. And in fact Ada compilers turned out to be rather difficult to build; it took several years for them to appear. By that time, applying for a waiver to use Fortran or whatever had become just another few pages in the large stack of boilerplate forms attached to every proposal, promising that one's software wouldn't pollute the local streams/rivers (much), no endangered species would be inconvenienced, any dedicated facilities would be constructed according to all applicable codes, etc, etc, etc.

              Nothing against Ada although I doubt that when they first started "requiring" it, any sane contractor would have signed up to use it unless the government covered the risks via a cost plus contract.

              I agree with CISA that memory unsafe languages are indeed a security issue. OTOH, I agree with the article that massive recoding of C/C++ into memory safe languages is improbable for economic reasons and that the switch over, to the extent it's actually possible, will be very gradual.

              1. bazza Silver badge

                Re: What is old is new again

One of the problems with Ada was that it featured parallelism (tasking) in an era when most OSes had no such concept. It was only when Green Hills did an Ada compiler that targeted VxWorks and used OS threads (tasks, in that OS) for Ada tasks that Ada became a practical (for some measure of practical) language.

                Prior to then Ada compilers had had to implement their own mini OS / scheduler runtime to be parcelled up with the program. And that didn’t work very well. In that era I worked for a bunch who’d bet a whole project on Ada but had so much of it that the program couldn’t be compiled…

                1. cageordie

                  Re: What is old is new again

When did Green Hills introduce their Ada? When I was shopping for compilers in the late 80s they didn't come up in any searches. I had already delivered 15,000 executable lines of Ada using the DEC Ada compiler, and proper operating systems like VAX/VMS were very much multi-tasking. I had 70 tasks in the safety-and-arming system demonstrator I built. We chose the Verdix compiler targeted at the 68020 on top of VxWorks; VxWorks was also a multi-tasking operating system, whose tasking core was written by John Fogelin. We got to see Verdix optimizing the compiler over the year and a half I was working on the stand-off weapon we were developing. I've read a bunch of Green Hills marketing; like most American marketing it is economical with, or seriously abuses, the truth.

            2. rafff

              Re: What is old is new again

At the time that the UK MoD was mandating Ada, a friend was in charge of software procurement for the Royal Navy. He asked me one day what questions he should ask the Ada "experts" who were bidding for RN contracts. I told him: ask them what language they use to write their in-house accounting software.

              1. Mage Silver badge
                Windows

                Re: in-house accounting software

                Cobol.

                1. An_Old_Dog Silver badge
                  Windows

                  Re: in-house accounting software

                  These days, it's COBOL with New Miracle Whitener Object Orientation features grafted onto it.

                  (Icon for, "I Wasn't Afraid to Admit I'd Programmed in COBOL.")

          2. chololennon

            Re: No, of course I've no idea if this remotely resembles the actual syntax used...

            > I remember when the US DoD was ADA ON EVERYTHING

That reminds me that Ada is the perfect example of "developers are more important than programming languages": Ada is safer than C or C++, but how about this catastrophic bug? https://en.wikipedia.org/wiki/Ariane_flight_V88.

Rewriting code in Rust is not the solution: it is a costly process (in money and time) and an error-prone one as well. Who can assure that old, well-tested C or C++ code will be rewritten in Rust (or any other "memory safe" language) without introducing new bugs?

            1. bazza Silver badge

              Re: No, of course I've no idea if this remotely resembles the actual syntax used...

              The Ariane 5 first launch failure was primarily a management failure. The developers had made a mistake that was assessed and accepted for Ariane 4, which was very successful. Management didn’t allow a reassessment of this for Ariane 5!

            2. flayman

              Re: No, of course I've no idea if this remotely resembles the actual syntax used...

              If an AI can't already do it, there will be one soon that can do it with passable reliability. That's at least a starting point.

              1. flayman

                Re: No, of course I've no idea if this remotely resembles the actual syntax used...

                Piss off with your thumbs down. I'll bet money an AI will be able to translate most C code into the language of your choice within the next 3-5 years. It wouldn't be good enough for production, but it would save an enormous amount of time.

                1. Dagg Silver badge

                  Re: No, of course I've no idea if this remotely resembles the actual syntax used...

So... the problem is that if there is an error in the source code it will just be transferred into the destination code. It's just GIGO.

                  To actually fix any potential memory errors will require extensive refactoring. This will require access to the original high level/business requirements that were used to create the original code. You will also require ALL of the subsequent change requests and process changes.

                  And then you will need to try to track down the wildcard changes that you cannot find any reason as to why they were made.

        3. rafff

          Re: No, of course I've no idea if this remotely resembles the actual syntax used...

          Many moons ago, before my beard was grey, the UK MoD insisted that everything had to be written in Coral66* and not in assembler. The software providers responded by providing programs in the form:

begin code;
... assembly language program ...
end code;

          What goes around comes around.

          *Anyone else remember Coral66 and the Pink Peril (the language 'spec' - it had more holes than a fishing net)?

          1. Ferry Michael

            Re: No, of course I've no idea if this remotely resembles the actual syntax used...

            I worked in PO-Coral which became BT-Coral.

My first big programming task was to replace some badly performing Coral with assembly code in POPUS2 Assembler, except for a few weird instructions that were so rare the CPU designers hadn't even given them an assembly opcode. The code was just SPECL(<hex code>), where SPECL was a "special" opcode whose parameter was the hex for the instruction.

            The language did not last long enough for there to be an OPENREACH-Coral

            1. An_Old_Dog Silver badge
              Windows

              Re: No, of course I've no idea if this remotely resembles the actual syntax used...

              Google doesn't give me any relevant hits on POPUS2. Which computer/CPU did POPUS2 generate code for?

              1. John Smith 19 Gold badge
                IT Angle

                Google doesn't give me any relevant hits on POPUS2.

                Could be POP2 or POP11 or POPLog.

POP2 was an AI language developed (IIRC) at Edinburgh in the early 1970s. The "11" refers to a version on the PDP-11 series. Later someone grafted Prolog onto it; I thought there was something called PopTalk that incorporated Smalltalk features.

"POPUS2" could be a mis-remembering, or a version hosted specifically on US hardware.

          2. smudge

            Re: No, of course I've no idea if this remotely resembles the actual syntax used...

            *Anyone else remember Coral66 and the Pink Peril (the language 'spec' - it had more holes than a fishing net)?

            I worked with it on the Ferranti Argus 700 - that was a government, not MoD, programme. And on CTL/ITL machines (successors to the Modular One) - that was a pathology lab database and reporting system.

            The Pink Peril rings no bells with me. But I do still have a copy of the HMG Official Definition of it - the Blue Book.

        4. Phil Lord

          Re: No, of course I've no idea if this remotely resembles the actual syntax used...

          It doesn't resemble the actual syntax. The largest block that you can put unsafe on is a function. Making every function unsafe would be a complete PITA.
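For reference, the actual syntax looks roughly like this: `unsafe` marks either a block or a single function, and nothing larger (a sketch; the function names are invented):

```rust
// `unsafe` on a block: the smallest possible region is marked, inside an
// otherwise safe function.
fn deref_local() -> i32 {
    let x: i32 = 42;
    let p: *const i32 = &x; // creating a raw pointer is safe...
    // SAFETY: `p` points to `x`, which is alive for this whole function.
    unsafe { *p } // ...dereferencing it is not
}

// `unsafe` on a whole function, the largest unit it applies to; every
// call site must then opt in with its own `unsafe` block.
unsafe fn deref(p: *const i32) -> i32 {
    *p
}
```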

      2. This post has been deleted by its author

      3. containerizer

        Re: It's not the language, it's just the way it's "talking"

> The fact that Rust has this "unsafe" directive (or whatever it is called) means that the language designers absolutely know that the language cannot do what people want to do with it in a memory-safe manner.

        I can think of very few cases where this kind of operation would be necessary. The one obvious one is where you have to manipulate memory that isn't really memory, such as when you're accessing memory-mapped hardware within a kernel, or certain other low-level operations where something else is handling memory for you.

This does not mean the language is bad. At the very least it shows that unusual memory usage which cannot be tracked by the compiler is auditable, and the compiler can emit diagnostic warnings when such blocks are used (or enforce against their use). In C there is no standard way to do this.

        You cannot build a programming language which is foolproof in all scenarios. You can, however, build one which minimises bugs caused by human error. I think that's arguably a win over a language which pointedly does not.

      4. DrXym

        Re: It's not the language, it's just the way it's "talking"

Maybe you should ask why the unsafe keyword exists in Rust. It is because Rust sometimes has to talk to bare metal, or to external libraries (written in C/C++). These are external to Rust itself, and therefore a mechanism is supplied to make those calls.

        So this is not some "ahah gotcha!" moment.

        The vast majority of Rust code is safe by default and doesn't need to use the unsafe keyword ever. And if it does need to use unsafe, then it is a sliver of code, a tiny fraction of the overall code base.

        So this is an absurd argument. Memory safety is a very big part of the language and the benefits are obvious to anyone who programs in it, which I assume you haven't.
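The FFI case described above looks roughly like this, calling `abs()` from the C standard library (a sketch; the `c_abs` wrapper name is invented):

```rust
// Declaring a C function: the declaration is just a promise about the
// foreign symbol. Calling it requires `unsafe`, because the compiler
// cannot verify anything about the C side.
extern "C" {
    fn abs(input: i32) -> i32; // from the C standard library
}

fn c_abs(x: i32) -> i32 {
    // SAFETY: C's `abs` takes its argument by value and touches no memory.
    unsafe { abs(x) }
}
```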

    2. Anonymous Coward
      Anonymous Coward

      Re: CISA may well be for the chop

      Under Trump V2.0

Just make all your apps supplied to the US Gov do two things.

      1) Play the Jan 6th anthem on startup

      2) Start every message with 'Trump is God'

      And you will be safe... for the time being, as long as you are a White Male, born in the lower 48.

      1. MattAvan

        Re: CISA may well be for the chop

        That's fine, you just need to do a search and replace on "DEI, straight white scum!" in the existing code.

        Imagine playing identity politics games against the largest voter demographics in a country. Play stupid games, win stupid orange prizes. That's how we now more or less permanently have the Hindu Right in power here in India.

    3. eldakka
      Unhappy

      Re: It's not the language, it's just the way it's "talking"

      > Basically I don't see the language as the problem,

      Amen.

As a Java application server administrator of hundreds of appservers and thousands of JVMs, I can tell you our single biggest root cause of incidents is memory leaks. Because the devs don't have to micromanage memory, they get lazy and expect the garbage collector to clean up, which it does, provided the developers remove all references to the memory in question. But often they don't. We have cases where we have to schedule proactive restarts of JVMs (often after already increasing the heap size significantly, to extend the time it takes to fill) to prevent the heap filling up: weekly, sometimes daily, and in a few cases hourly restarts. The hourly and daily ones are usually temporary until the code is fixed, but some applications have required weekly JVM restarts for years.
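The point generalises beyond Java: garbage collection (or, in Rust's case, reference counting) only reclaims what the program stops referencing, so a "memory safe" language can still leak indefinitely. A sketch of the Rust version of the same disease, a reference-counting cycle that is never freed (the `Node` type is invented for illustration):

```rust
use std::cell::RefCell;
use std::rc::Rc;

// Two nodes that point at each other. Each keeps the other's strong
// count above zero, so neither allocation is ever freed: a leak with
// no `unsafe` anywhere in sight.
struct Node {
    next: RefCell<Option<Rc<Node>>>,
}

fn make_cycle() -> Rc<Node> {
    let a = Rc::new(Node { next: RefCell::new(None) });
    let b = Rc::new(Node { next: RefCell::new(Some(a.clone())) });
    *a.next.borrow_mut() = Some(b); // completes the cycle
    a
}
```

Dropping the returned handle still leaves both nodes alive, which is exactly the "forgot to remove the last reference" failure mode described above.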

      1. G40

        Re: It's not the language, it's just the way it's "talking"

        If this is true those developers are demonstrably idiots. Fire them and start improving your own internal standards.

        1. OhForF' Silver badge

          Demonstrably idiots

          Our code base still contains stuff originally developed for Java 1.1 and back then the operational requirement needed the system to run for a maximum of 12h per day. There were known corner cases that would have required special treatment to clear those orders out of the system and remove all references.

          As time to market is important and developer time is expensive it was decided to handle those corner cases using a daily cleanup including restarting all of the JVMs.

As time went on the system became ever more complex and the initial few kilobytes of leaked memory grew a lot, but new functionality was always deemed more important than implementing proper handling for all the corner cases. Operational requirements changed to 24/6, but the project proposal to implement handling of the known corner cases and to identify and clean up any remaining resource leaks was denied, as management was convinced that just allowing the JVM to use more memory was sufficient and much cheaper.

I agree that the first crash in operation on Wednesday did demonstrate there were idiots involved, but I wouldn't point at anyone in development.

        2. JoeCool Silver badge

          Re: It's not the language, it's just the way it's "talking"

          Wrong.

          The whole intent of Java is to lower the bar for app developers; calling them idiots is plainly prejudicial when that is the skill level dictated by the language. It really is just a sellout to financial management.

That's the attraction of using a language that requires a certain skill level, for both the individual and the management that is doing the development: they are actually required to understand such issues, which is only achievable through hands-on experience. I.e., they are professionals, not just sending in patches from their garage.

      2. Ferry Michael

        Re: It's not the language, it's just the way it's "talking"

I remember a supplier that used .NET for a program that processed video packets and used the default single-threaded .NET garbage collector. The program was performance-limited by the garbage collector maxing out a CPU core.

We had to spread the load across many more computers to compensate, and choose processors based on single-threaded performance.

        Switching to a multi-threaded .NET garbage collector meant multiple cores fully occupied doing garbage collection, but at least performance was slightly better and we could get by with fewer computers.

    4. JoeCool Silver badge

      Re: It's not the language, it's just the way it's "talking"

      This is correct.

The precise issue is that C the language is not "memory unsafe", because it doesn't do memory management; the libraries and apps do that.

      So it is in fact the libraries and apps that are not memory safe.

      Anyone writing some version of "C isn't memory safe" instantly loses a chunk of credibility.

      Anyone writing some version of "C/C++ isn't memory safe" instantly loses most of their credibility.

      1. Jellied Eel Silver badge

        Re: It's not the language, it's just the way it's "talking"

        So it is in fact the libraries and apps that are not memory safe.

        So I guess one solution would be to create a DoDlib and declare everything else NSFW. Host it in the library of congress, with congress being used in the biblical sense.

        Anyone writing some version of "C isn't memory safe" instantly loses a chunk of credibility.

To be fair, didn't K&R warn about this in the first page or chapter of their book? It's been a long time since I've read it, but I also remember my lecturers warning about this, i.e. it'll let you overwrite your OS if you're not careful. Then again, the course was aimed at designing safety-critical systems. I do however like the idea that F-35s etc. may in future be powered by Rust, if they're not already.

        1. bazza Silver badge

          Re: It's not the language, it's just the way it's "talking"

          >I do however like the idea that F-35s etc may in future be powered by Rust, if they're not already.

Not yet; Rust is too new. Parts of the F-35 rely on INTEGRITY, the OS from Green Hills, and that's written in C (using Green Hills' own C/C++ compiler and C/C++ runtime libraries). If you want compilers, libraries and an OS with strong assurances of being correct, look no further.

But it does smell like the US Gov (which will include the DoD) will start getting quite insistent on Rust being used, and I can see why.

          What Else?

          Rust is one component of it. Adopt it, forbid the "unsafe" keyword, and in theory you end up with code far less prone to memory mis-use errors.
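Forbidding the keyword is, today, a compiler-enforced switch rather than a policy document. A sketch using the built-in `unsafe_code` lint (the module name is invented; the crate-wide form is `#![forbid(unsafe_code)]` at the top of the file):

```rust
// With this attribute, any `unsafe` block or function inside `vetted`
// is a hard compile error, not just a warning.
#[forbid(unsafe_code)]
mod vetted {
    pub fn sum(a: i32, b: i32) -> i32 {
        a + b
        // Uncommenting the following would fail to compile:
        // unsafe { *(0x1234 as *const i32) }
    }
}
```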

However, when one looks at today's hardware, MELTDOWN / SPECTRE and similar are all about memory misuse / mishandling within CPUs, and it's interesting to consider what can be done about that. There have been articles here on El Reg on the need to get rid of C in the hardware sense too. C / C++ and today's libraries for them all assume that code is running in a Symmetric Multi Processing (SMP) hardware environment (for multicore hardware). But the hardware hasn't actually looked like that for decades; SMP is a synthetic environment built on top of things like QPI or HyperTransport (or newer equivalents), and these cache-coherency networks are what is causing the MELTDOWN / SPECTRE faults which the CPU designers seem powerless to fix. Apple's own silicon has recently been found to have such faults: they're unfixable in M1 and M2, and the miscreant feature hasn't been disabled in M3 even though it can be.

          So, it looks like we should be getting rid of SMP. That would leave us with - NUMA.

          We've had such systems before - Transputers are one such example. The Cell processor in PS3 was a bit NUMAry also (in how one used the SPEs). Super Computer clusters are like this too (no direct addressability of data held on another node). Various researchers are getting re-enthused about such architectures, pointing out that even 7 year old kids can be taught how to program for them.

          Of course, such a hardware switch devastates existing SMP-centric code bases, like Linux.

What Does This Have to do with Rust? I hear you ask? Well, Rust has (in theory) perfect knowledge of what owns which data and when. You pass ownership of a piece of data from one thread to another, and it knows you've done this. An object cannot be mutable in two places at once in Rust. It is almost completely ideal for conversion from running in an SMP environment to running in a purely NUMA environment. Whereas passing ownership at present is simply used to determine what code can use some memory, it could also serve to prod the runtime that data needs to be moved from one NUMA node to another.
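That ownership transfer is already visible in ordinary Rust code; a sketch using a channel (the function name is invented):

```rust
use std::sync::mpsc;
use std::thread;

// Ownership of `data` moves through the channel: after `send`, the
// calling thread can no longer touch the buffer, which is exactly the
// property that would let a runtime relocate it to another NUMA node.
fn sum_in_worker(data: Vec<u64>) -> u64 {
    let (tx, rx) = mpsc::channel();
    tx.send(data).unwrap(); // ownership transferred here
    let handle = thread::spawn(move || {
        let data: Vec<u64> = rx.recv().unwrap(); // worker now owns the buffer
        data.iter().sum::<u64>()
    });
    handle.join().unwrap()
}
```

The compiler rejects any attempt to use `data` in the sending thread after the `send`, so "who may touch this memory" is statically known.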

In other words, Rust is a pretty good candidate as a language that suits both SMP and NUMA architectures.

Golang is another; in fact, Golang makes no bones about being a recreation of the Transputer's CSP architecture. Golang is quite hilarious / ironic in the sense that it implements CSP, and has to do so in a faked SMP hardware environment, when most of today's hardware has more in common with the Transputers than with actual SMP hardware of the 1980s / 1990s.

          Python multiprocessing is another. Copy data from process to process - don't share it.

          This then opens up the possibility that the US Gov - having "forced" Rust on to the software world, got a Rust OS - might then start requiring hardware architectures to drop SMP too.

          The Future

          That hardware shift is some way off, and a bit of a long shot. However, if it does come, persisting with C / C++ code bases for whatever reason could become an even bigger liability in the future than anyone is thinking of at the moment. Not only might it become hard to find developers willing to write in it, or customers willing to accept code written in it, it may become difficult to find hardware to run it on.

That ought to worry the likes of Linux more than it appears to.

To be certain that today's SMP environments will survive and will be able to keep running C/C++, these projects need the hardware manufacturers to fix cache coherency / hardware memory faults once and for all. There seems little prospect of that, though.

          Shared Memory is, Today, no Different to Copied Memory

The classic "don't copy data, send a pointer to the data if you want it to be fast" is a maxim that should have died decades ago. It was only ever true in actual SMP environments, like Intel's NetBurst era of the early 2000s.

Today, for one core to access data in memory attached to a different core, pretty much the same microelectronic transactions have to take place as would be required simply to copy the data. If code on Core 0 accesses an address actually in memory attached to Core 1, then a copy of that data somehow has to find its way into Core 0's L1 cache before the code can actually do anything with it. But that is a copy. The problem today is that, because this is an SMP environment and Core 1 has also (probably) accessed that address recently, there has to be a lot of transaction traffic between Core 0's memory subsystem and Core 1's, to make sure that all the caches everywhere have the current content of that address.

          If you look at any Intel CPU today, you may end up with copies of the data in CPU0 / Core0's L1 and L2 caches, and in CPU1 / Core2's L1 and L2 caches, and probably also in one of the L3 caches somewhere as well as in some DDR5 memory chips. That's six copies of the data, all of which need to be kept in sync with each other.

          Just think how much easier the hardware would be, if such sharing were entirely up to the software to resolve, and how sweet that'd be if it was all resolved because the programming language's syntax made it very clear where the data needed to physically be?

          That's how important Rust could end up being, giving us a bridge from yesteryear's outdated and difficult hardware architectures to tomorrow's.

          1. StargateSg7 Bronze badge

            Re: It's not the language, it's just the way it's "talking"

You are CORRECT in that future architectures will be truly MANY-CORE, with each core being hyperthreaded / multi-pipelined (i.e. from 2 to 256 threads per core!) and with a seriously ENFORCED division between: local on-chip cache memory assigned to each thread in a hard-sandbox environment, so no crossing of security lines is possible; a local application non-shared heap, where data arrays and executable code are stored on the local machine within a secured sandbox mode of operation; a shared application heap, so threads and sub-tasks can send/receive/share data with each other; AND finally a global shared heap within a NETWORK environment (i.e. kinda like shared hard drives, but for super-fast system RAM that is petabytes+ in size!), so that multiple applications or batch jobs can access/process large data sets all at the same time, using multiple cores and multiple threads, with semaphore-based read/write/modify data-access security applied to individual and groups of records and even down to individual data fields!

            Local sub-routines (i.e. individual procedures and functions) would use a local-thread-only STACK MEMORY that is also assigned ONLY within the L1 CPU cache with access only allowed by the currently assigned runtime thread.

            Once those hard divisions and data security semaphores are put into place AT THE HARDWARE LEVEL, then data security will be much easier to implement and ENFORCE with little to no runtime-speed penalty! We need to put ALL data security AT THE HARDWARE level using hardware-based Group-ID/Individual User-ID/Application-ID/Job-or-Thread-ID/Record-ID/Field-ID/Field-or-Object-Type-ID codes and data read/write/modify semaphores so the code-speed-penalty of using a SOFTWARE-based data access security semaphore infrastructure is eliminated!

            Once the hardware-level security is ENFORCED and ALWAYS-ON, the upper-level application code doesn't have to worry about the actual data security implementation ensuring that DATA ACCESS SPEED is no longer a concern! Once that concern is taken off the table, programmers and end-users can focus on user-interfaces and subject matter expertise rather than application security since access security will be ALWAYS ON and ALWAYS ENFORCED at the underlying CPU/GPU/DSP/Vector Processor hardware-level!

            Using an analogy, MANY banking services used today, now have Two-Factor Authentication (TFA) which uses something WE HAVE (i.e. our physical smartphones or a smart-card) and something WE KNOW (i.e. a username and password) to ensure the data we access and use is truly authorized for us and by us! By taking that sort of infrastructure and putting that same TFA security infrastructure into a CPU HARDWARE-level context, speed of data-security-enforcement no longer is at the easy-to-compromise end-user-or-application-level but now put at the much-more-difficult-to-subvert microchip-level. Once security analysts from around the world can finally agree on a COMMON hardware-level TFA schema that is put through a fully public and extensive security analysis THEN we the public AND the programming community can finally stop worrying AS MUCH on low-level-data-security and focus more on the actual data we are processing and using!

            V

      2. bazza Silver badge

        Re: It's not the language, it's just the way it's "talking"

Just because the means to allocate memory is a function in a library does not mean that the language itself is memory safe. You can declare a pointer, assign some random address to it, and dereference it, all without any external libraries. The code may even work, sorta, on some platforms, depending; but quite a lot of the time it's either going to crash, or cause havoc, or both.

        Such code would never pass any reasonable programmer's definition of "memory safe code". C is not an inherently "safe" language. Anyone claiming that it is has a rather too blinkered view of the language.

        C++ is even worse, as new is a keyword and part of the language from day one.

        1. JoeCool Silver badge

          Re: It's not the language, it's just the way it's "talking"

          I am not stating that C is "inherently memory safe"; I am saying that it is irrelevant to discuss the language itself as "memory unsafe", as it puts the target of the discussion in the wrong place.

          The right place, as your comment shows, is the programmer intentionally writing bad code in their app or library.

          Also modern compilers will flag your scenario as multiple errors or warnings that should not be ignored.

          C++ is even better as the strong typing requires a malicious programmer to take extreme measures to achieve illegal memory accesses.

          Finally, there is no "sorta works"; there is only do as intended or do not.

      3. containerizer

        Re: It's not the language, it's just the way it's "talking"

        > The precise issue is that C the language is not "memory unsafe" because it doesn't do memory management, the libraries and apps do that.

        I am sorry, but you are very wrong.

        Managing the stack is a form of memory management and the C language does this. If you declare a variable or an array etc on the stack, or take a pointer to it or dereference it, the compiler generates code for this purpose. Off-by-one errors and other stack-related bugs are a very common cause of instability or security bugs. That's a fundamental feature of C.

        malloc() and free() are discussed in the original K&R book and are defined in the ISO C99 spec (safe to assume they are in ANSI C89 and in subsequent ISO specs). They may not be operations that are directly converted into machine code by the compiler, but that doesn't mean you can get away with saying they are not part of the language.

        If you want to talk about losing credibility, scoring pedantic points by trying to suggest that the language and the support library which forms part of the language specification are not intrinsically linked seems like a good way to do that.

    5. StargateSg7 Bronze badge

      Re: It's not the language, it's just the way it's "talking"

      The problem is the PROGRAMMERS and PROJECT MANAGERS themselves! They don't comment or even perform Systems Analysis on ANY of their code!

      Back in MY DAY, we had an actual Systems Analyst / Code Editor who usually led a set of multiple teams of 10 programmers whose team leads were tasked with enforcing a SINGLE CODING style that EVERYONE adhered to, including function names that made sense i.e. Save_File(); Save_Record(); Get_Record(); Get_Square_Root(); Get_Cube_Root(); Set_File_Position_In_Bytes(); etc. so that everyone's code was in the same style with the beginning and ending brackets properly positioned and corresponding to the start and end of programming blocks.

      Then the Analyst ENSURED when the coders submitted ALL functions that each one ensured all local variables had zeroed-out or properly-initialized starting and ending values AND that proper exception-handling blocks were implemented to ENSURE that every function would return a valid default value or a properly defined error code that ALWAYS was handled by a master error-handling routine. Each function was ALSO tested to ensure minimum and maximum range limits which were clipped-to, rolled-over or error-handled within underflow or overflow exception-handling routines!

      We also made sure that almost ALL functions were re-factored BY HAND to ensure that each function was 100 lines of code or less and called OTHER well-tested functions, to ensure source code readability and lower code complexity. At the top of each function, the programmer was FORCED to comment and describe the underlying mathematics AND underlying logic of the function in plain English and to detail its final purpose and HOW it worked at the lowest level. The Systems Analyst had to be able to understand the code from the comment alone; if he couldn't understand HOW it worked, the comment was revised until he could, AND the source code function was tested to ENSURE it actually MATCHED what the comment said it did!

      Within each code block's comments, you could NOT simply repeat the function name itself but had to make a logical comment that described WHY we called or coded the specific sub-routine or sub-block in the way we did. Then, when all this coding was done to create an application such as a Fast Program Trader App, we TESTED the code on worst case scenarios that overloaded and underloaded network communications, disk reads/writes, memory reads, writes and memory allocations/deallocations and ENSURED that hardware and software failures would allow the code to FAIL GRACEFULLY and allow the logging and saving of interim or end-user data that could be easily recoverable so as to not waste further time or money. Even if there was a catastrophic hardware failure, we could always use the data logs and automatically recover to just before the failure happened and re-use the recovered data as-is or redo and correct it using a different mode of processing.

      The Systems Analyst also made sure that they found the most COMMONLY USED functions from everyone's code that became part of a master common functions library that EVERYONE USED in the same manner using the same parameter style and naming convention so that we could reduce the coding footprint and better test for common failure points. Code that DEPENDED on those common functions was rewritten and recoded to conform to the master common functions library conventions. In 99.99% of cases, those common functions were used in the same manner by programmers over and over again but written in a different style, so the Systems Analyst ENSURED that those functions were REDUCED to a well-thought-out-parameter-list that ended up as a single callable common function!

      Coding projects FAIL due to bad management! A systems analyst that has serious code editing chops AND knows how to RE-FACTOR the source code of many individually written functions into a single common function library is literally worth their weight in GOLD! We also figured out that an app only has SEVEN TASKS to accomplish:

      1) End-User Interface for Web, Mobile and Desktop usage

      2) Application Management User Interface for Web, Desktop and Mobile usage

      3) Common local machine and network-based File Processing and Database Record Input/Output routines

      4) RAM memory allocation, block memory usage/counting routines and in-memory data management and in-memory and disk-based garbage collection routines.

      5) Security and Encryption of saved and sent data packets and security/encryption of in-memory blocks and end-user/customer fields and records

      6) Exception Handling, End-user Logging and Management Access Logging plus Graceful Failure application handling

      7) Actual application-specific end-user data processing, complex math and bitwise manipulation and more-specific data logic and data fields handling

      Number SEVEN was always the hardest to do BUT we got smart enough to ALWAYS have and save a master library of well-tested common routines within our common functions library that could be used over and over again on different projects. We never had to buy or licence EXTERNAL object-code libraries but rather made our OWN INTERNAL MASTER LIBRARY that was used for DECADES!

      And BECAUSE we separated out the major functions of any given application into the above seven layers, it was EASY and FAST to update to new technologies such as mobile phone/tablet-specific HTML-5 web-front-ends and massively parallel GPU-based back-end grid-processing! We ended up working on 150+ applications, some with over ONE MILLION LINES OF APPLICATION-SPECIFIC CODE, that all used the same library, built since the early 1980's and updated to this day in 2024 and beyond! The new guys even said just how good and well-tested those common routines were, even for use in their largest coding projects!

      Even when security changed from 256-bit AES encryption to Quantum Computing-Resistant encryption (i.e. Anti-SHOR's algorithm encryption!), it literally took less than one week of new code writing and three weeks of testing to ensure ALL the older programs and newer programs could switch easily and quickly to the latest in data security! None of the upper-level code or user interface code needed to be re-written, just the encryption routines themselves which could STILL call the legacy routines to ensure that older encrypted data could still be accessed and re-encrypted/re-formatted using the NEW routines! The end-users had no idea ANYTHING changed since the production executable code changeover was done on the beginning of a long-weekend and then tested for the extra two days before it was let loose on the employees and customers! It went smooth as silk DUE TO PROPER SYSTEMS ANALYSIS and proper code and application implementation management!

      The coders can now concentrate on the complex application-specific data processing itself and the types of results needed for that processed data. User interfaces, file and record Input/Output and memory handling is all taken care of by the common functions library and is so WELL-ABSTRACTED that it runs on older hardware AND the latest hardware and on MULTIPLE operating systems. The code was so well written and tested that we made this all possible even for today's programmers who STILL use all our code!

      My manager was one of the BEST I have ever seen and no wonder he got to retire after 40 years of management as a mid-single-digits multi-millionaire! He well-earned that money! And I notice the NEW SYSTEMS ANALYSIS GUY is just as good because he is now just as well-trained! I can see they pay him enough to stay AND let him manage such that the company will do very well for another 40 years!

      V

      1. An_Old_Dog Silver badge

        Re: It's not the language, it's just the way it's "talking"

        I want to work where you worked, provided MBAs and beancounters haven't taken over and destroyed the engineering culture.

        ... Oops, too late. They've taken over.

        1. StargateSg7 Bronze badge

          Re: It's not the language, it's just the way it's "talking"

          Luckily the beancounters and MBAs/C-Suite have been smart enough to ALLOW the head Systems Analyst AND the programmer teams to KEEP using the master common library and keep using the system where all individual coders submit their code to the head-honcho Systems Analyst for code editing and coding-style enforcement. The eight teams of 9 coders and one team lead/analyst are now so well trained that coding style enforcement doesn't have to be done so much AND since we have a SEPARATE QA group that tests every line of code using all sorts of overlimit, underlimit, and min/max/mean/average plus optimal, near-upper and near-lower limit test data samples that STRESS the hardware and software above and beyond the typical end-customer usage limits, we tend to have VERY high quality source code and SMOOTH deployment. We usually also deploy the final application in BLOCKS OF TEN USERS so that in case anything happens, only a FEW USERS need to be rolled back!

          It usually takes about 3 months to one year to fully deploy our mostly RDBMS-oriented client/server and grid-processing-based financial processing and commodities/trading applications! While I do STILL consult to my old workplace, I am usually used as a user-interface guidelines designer and rapid-application-front-end developer who can get an initial web or desktop user interface setup within a week so that the other programmers and subject matter experts can EVALUATE if my front end design WORKS for most end users and managers. They then make a big 100 to 1000 item notations list which lets me CHANGE that user-interface and then export a master graphical page description code file that lets the SUBJECT EXPERTS change the final front end to their liking, usually within a month of my change submissions, and then include the final interface into the source code master for Alpha/Beta/Release Candidate QA testing.

          A typical project starts with a full APPLICATION outline that determines what the application does, who it is for, how fast the final results are needed and what regulatory bodies we need to log for and conform to. Using an analogy, we do a complete PAPER EDIT using real paper yellow index cards that get pasted on a HUUUUUUGE whiteboard that outlines EXACTLY what the program does logic-wise and front-end/back-end-wise.

          Only when the customer signs off on the PAPER EDIT of the entire Application Front End Design, main Application Logic Tree and ONLY WHEN the head Systems Analyst signs off on the master Sub-routines naming convention that gets connected together to perform the application logic tree tasks list, are individual programming teams assigned to CODE the actual project itself. Again, as much of the common function library as possible is used to complete as many of the front end/back end user interface tasks and program logic tree as possible. The master code tree is DESIGNED to always contain sub-routines that are mostly LESS THAN 100 lines each! If-Then-Else/Switch-Tree branches, array handling plus file, record and field handling are ALL designed to call as many common functions as possible to complete a specific task. Most of the coders end up doing one new application per every 12 to 18 months so time schedules are quite realistic and not overly stressful.

          I now do work for an Aerospace company doing synthetic vision systems research and grid-processing-oriented video/audio/vector/pixel filtering and DSP processing but all that training from people who go all the way back to COBOL and VAX VMS days, provided me with a SOLID BACKGROUND in proper Systems Analysis and code evaluation. I sure did learn a LOT of that from the lead Analyst which has done me very well for front end design and coding. That previous company I worked for had and still has MANY programmers who have been with the company for sometimes 35+ years! They get PAID VERY WELL and they LIKE seeing projects be deployed that SUCCEED and do well for the clients!

          We usually on-board a new programmer only when someone retires and whenever they do on-board, they get a FULLY-PAID 6-month training supervised programming learning curriculum that forces them to learn the entire common functions library tree using rote memory and online learning/testing techniques. They get put in a digital sandbox to play with all that code in a SAFE and well-designed/quiet coding environment! Everyone gets their own office with great ergonomic chair and desk, a set of high-end Windows, MacOS and Linux desktops, a new Windows, MacOS and Linux Laptop, a set of Windows/Android/iOS tablets and custom Android/iPhone/Windows smartphone devices so they can ALWAYS "Eat Their Own Dogfood" when it comes to personal testing of their code on multiple platforms.

          After the 6-months of training is up, the newbies are assigned a set of application-specific programming tasks that gets supervised by ALL team leads to ensure they understand and use the BASE COMPANY CODING STYLE! They tend to end up LIKING the fact that other people's code is so easy to read and understand how it works DUE TO the enforced verbose coding style that all the apps and sub-routines use! Once you learn the entire common functions library, application development is such a breeze and you can concentrate and focus on coding the more-fun-to-do CUSTOM application logic and custom data processing functionality. It keeps your mind VERY SHARP when you don't have to keep doing boring old front/back-facing code over-and-over again!

          In case you're wondering about onboard salary for someone who has two to three years of previous out-of-university experience, we start at $80,000 CDN with the more experienced coders earning around $170,000 CDN after 10 years! Team Lead Analysts get $230,000+ CDN so they are doing VERY WELL! Everyone works 8:30 am to 4:30 pm Monday to Friday with a one hour lunch and all Stat Holidays Off AND everyone gets 3 weeks scheduled summer vacation plus one week off at Christmas and one week off during Spring Break (5 weeks total paid vacation!). Bonuses for fully deployed NEW applications can be as much as 25% of base yearly salary so that helps too! They also do 1:1 RRSP (i.e. 401k) matching where every dollar paid by employee is matched by employer for RRSP retirement funding. There is also a full life insurance, travel insurance and extended medical/dental benefits package! There are also two-person free tickets for up to six games per year of Vancouver Canucks hockey games, six family passes per year for local mountain skiing lift tickets and six per year of $150 CDN gift cards to local restaurants. (Employee can choose which weeks to get the cards and gifts!)

          YES! It is a VERY NICE and relaxed place to work!

          Now You Know!

      2. Anonymous Coward
        Anonymous Coward

        Re: It's not the language, it's just the way it's "talking"

        An old metric I've heard is that properly-written code gets written at the rate of one or two lines of code per developer, per day. Which is why (at least in the old days) systems that mattered cost a fortune. And, unsurprisingly, such systems could be extremely robust.

        The problem these days is that bean counters look at crappy software that mushrooms up overnight on some repo or website, like the pretty colours and the coolness, and think that's "normal".

        There's a lot done even today that inherently makes software a whole lot harder to get right. This article is all about using "safer" languages. A pet topic of mine is data interchange, interfaces. How many times have you seen bugs caused by one program sending malformed data to another? The (very severe) Heartbleed bug was one example of that. Heartbleed happened because - in true RFC fashion - someone decided to define their interface in the worst way possible - hand-cranked code.

        The distressing thing is that there are technologies that go a very long way to solving such problems, yet a lot of projects go a long way out of their way to avoid using them. I'm thinking of tools like ASN.1 - the constraints expression and checking is superb, at least in good tools - yet lots of other serialisation systems have no such concept. When you look at projects like D-Bus (which plays some important roles in system security), of all the myriad choices of extant data serialisation standards they could have picked up, they decided to write their own! Google did the same thing, "inventing" Google Protocol Buffers when ASN.1 had already existed for quite a while, did the same job, and did it a whole lot better and more comprehensively.

        Had Google used Google to Google for "binary serialisation standards", come across ASN.1, and put the effort into their own (OSS?) implementation of that instead of inventing something brand new, they'd have been far better off. As it happened, Google didn't use Google to Google for this, something admitted to in the developer conference where GPB was first announced (someone asked, "why didn't you use ASN.1?", and the Google bod had to admit to not having heard of it...). This has cost Google real money. Their motivation for GPB was to save storage money; that's what lies behind the varint wireformat for integers, which can get low-value ints down to a couple of bytes. Yet, with constraints and PER/uPER encoding, ASN.1 can do far better. Had Google used uPER encoding, they'd have been able to save an awful lot more on storage than they achieved.

        It's like a large proportion of the world's software projects deliberately set out to make getting it right as hard and as complicated as possible. It's pretty negligent, not researching the best way to accomplish a software task, and it's not surprising that bean counters prefer to limit costs.

        1. Anonymous Coward
          Anonymous Coward

          Re: It's not the language, it's just the way it's "talking"

          Upvote for ASN.1 ...

        2. Pier Reviewer

          Re: It's not the language, it's just the way it's "talking"

          ASN.1 parsers written in C/C++ are a fantastic source of vulnerabilities! When I’m doing a secure code review of a C/C++ code base that includes a custom ASN.1 parser, that’s the first thing I’m looking at. So many devs screw it up.

          That’s the problem with C and C++ - reality doesn’t necessarily meet spec, which is basically the definition of a software vulnerability. Rust’s compile-time constraint validation straight up guarantees you don’t get undefined behaviour wiping out the types of vulnerabilities responsible for ~70% of all published vulnerabilities.

          The folk citing the “unsafe” keyword’s existence clearly don’t understand its use case. In C/C++ code bases the entire thing is “unsafe”. You’ve got millions of lines of code to worry about. In Rust, your primary concern is the few thousand lines explicitly flagged as “unsafe”. That’s a far more tractable problem for a human to review, so you’ll get far fewer memory safety issues making it to production.

          1. Anonymous Coward
            Anonymous Coward

            Re: It's not the language, it's just the way it's "talking"

            Well, if a developer chose to spin their own ASN.1 parsers rather than use a reputable tool suite, they are indeed asking for trouble!

            Of course, good tools are commercial / proprietary, which is off-putting for a lot of developers. Good tools for XML are hard to come by too, and I've yet to find comprehensive tools for JSON either (though JSON and XML are targets for ASN.1 too). There are some pretty affordable options. Fabrice Bellard (he of QEMU fame and other amazing projects) sells a pretty decent ASN.1 implementation for C.

            What’s weird about Google is that they’ve put in as much effort in tooling up GPB as they’d have needed to do a good OSS ASN.1 implementation. If they did just a couple of things to Google Protocol Buffers, it'd actually be a pretty good shout for replacing ASN.1 in a lot of applications (but not radio interfaces).

    6. Wayland

      Re: It's not the language, it's just the way it's "talking"

      RUST is WOKE.

    7. hoola Silver badge

      Re: It's not the language, it's just the way it's "talking"

      But this is the entire problem now.

      Developers are increasingly lazy and few appear to have any concept of security, testing or regression testing.

      The more you dumb things down the worse it gets because the entire process becomes tied to the latest fad that is spewed out of university Computer Science courses.

      Sure the hackers are getting smarter but the number of holes is also getting larger and the rank stupidity of development teams getting worse. Until there is real accountability for the shite (mostly from Agile teams) that is pushed out nothing will change.

      The only side effect now appears to be:

      "we will do better next time"

      "This was a one off" (that similar things have happened passes the morons by).

      "It was a sophisticated attack"

      "No sensitive data was compromised" (We have not yet found it on the dark web for sale)

      I might be sceptical about this, but I have been in IT for too long; the mistakes are simply getting too numerous to count and the quality appalling.

  2. Will Godfrey Silver badge
    Meh

    Is this a bandwagon I see before me?

    ... and do those wheels look just a wee bit wobbly?

    1. MatthewSt Silver badge

      Re: Is this a bandwagon I see before me?

      No... Just Rusty

      1. Michael Strorm Silver badge

        Re: Is this a bandwagon I see before me?

        We need a "Ba Dum Tsssscchh" icon.

        1. chivo243 Silver badge
          Trollface

          Re: Is this a bandwagon I see before me?

          surely this works--->

  3. mtrantalainen

    Which language do you think is used to implement all those memory-safe languages?

    All the listed memory-safe language examples are themselves implemented in C or C++. This is because historically C and C++ have been the only languages that had good enough performance to implement the runtime engine. Nowadays we have Rust and Zig, and it's still too early to say if Zig is safe enough in practice. Rust is safe enough, but it's a harder language to use than C and C++. I personally feel that Rust was a much easier language to understand than Haskell, so it's definitely not an impossible language to learn. And even vanilla Rust cannot handle out-of-memory situations gracefully – the default implementation simply kills the whole program. Sure, that avoids remote code execution but still allows for DoS problems.

    1. mevets

      Re: Which language do you think is used to implement all those memory-safe languages?

      Go is written in Go.

      The early ones were bootstrapped in C; just like C was bootstrapped in machine code.

      1. Ian Mason

        Re: Which language do you think is used to implement all those memory-safe languages?

        C was bootstrapped using its predecessor B, which itself was bootstrapped via TMG (TransMoGrifier, a compiler generator). C was not bootstrapped in machine code, or even assembler.

        1. elDog

          Ancient B (and BCPL) programmer here.

          Lots of good stuff came out the University of Waterloo (and still does.)

          "Fred" - the Friendly Editor.

          1. StargateSg7 Bronze badge

            Re: Ancient B (and BCPL) programmer here.

            Thank You and Thank You Again UofW! We probably worked on it! Actually, you are probably STILL using our packet and records-based data encryption, audio/video MPEG-1/2/4/MP3/H.264/H.265 codecs AND even our DSP Convolution Filter Image and CNN coding and Wavelets code DAILY in all your major apps! It wasn't just Fraunhofer Institute who did all that! And YEAH! Our B-Tree database code AND Binary Search and Hard Disk Block Management Algorithms that EVERY RDBMS system uses since the 1980's! Arse-Hops! It's scary to think we had THE MAJOR HAND in EVERY ONE of those technologies used today!

            Yay Canada! Thank You All For Using Our Stuff!

        2. bixbyru

          Re: Which language do you think is used to implement all those memory-safe languages?

          Kinda depends on the implementation.

          Typically, for any new architecture C is first built in "Tiny C," which is in turn built in assembler; eventually you've a clunky, slow-but-full C implementation which then builds a proper set of useful tools.

          Might wanna read Ken Thompson's essay "Reflections on Trusting Trust."

    2. Dinanziame Silver badge
      Trollface

      Re: Which language do you think is used to implement all those memory-safe languages?

      I personally feel that Rust was much easier language to understand than Haskell

      Wow low bar

      1. Zoopy

        Re: Which language do you think is used to implement all those memory-safe languages?

        I don't think that's fair.

        Any child who's familiar with the basics of category theory, lambda calculus, Peano notation, and algebraic hypo-static transductive applicators of Curried semi-ring monads can learn himself a Haskell in mere decades!

        Although to be fair, it's helpful if they were exposed to compilers a bit during kindergarten.

        1. An_Old_Dog Silver badge
          Joke

          Learning Haskell

          I once knew APL. Would that be helpful to me learning Haskell?

          1. Peter Gathercole Silver badge

            Re: Learning Haskell

            The only thing knowing APL may help with is S or R, and only because of the way that data is handled. They're not really anywhere in this equation.

    3. iron

      Re: Which language do you think is used to implement all those memory-safe languages?

      No they aren't.

      .NET has been written in C# for some years now.

      Prior to that Delphi was written in Delphi from the start. (mentioned because it had the same creator as C#)

      And Go is written in Go.

      1. StargateSg7 Bronze badge

        Re: Which language do you think is used to implement all those memory-safe languages?

        Yup! Microsoft brought the Delphi lead over from Embarcadero (aka Borland/Codegear at the time) for the Microsoft Visual Basic IDE re-do and paid him like $500,000 USD per year to lead the project! It was quite the project success, although I do like Delphi much better due to the TYPE of programming language that Object Pascal is!

        In the Canadian Aerospace company I now consult to, we wrote our own custom in-house-designed and coded from-scratch 128-bits wide Object Pascal compiler and IDE for our combined-CPU/GPU/DSP/Vector super-processor that FURTHER LIMITS even the structure data-types of Object Pascal that Delphi uses to more-data-specific keywords and more-modern multi-language string handling and much-safer compiler-enforced-structure-data-type pointer-handling. (i.e. Signed_Integer_128_Bits_Type, UnSigned_Integer_128_Bits_Type, Floating_Point_128_Bits_Type, Signed_Fixed_Point_128_Bits_Type, Memory_Address_128_Bits_Type, Pointer_Four_State_Boolean_Type = TRUE, FALSE, COULD_BE_EITHER, HAS_ERROR, etc). Our KEY data types are very limited in scope and handling ENSURING application security and all local-stack and shared or non-shared/application-specific local machine heap and global network heap memory allocation is further monitored and garbage-collected to ensure no memory leaks! All RDBMS records and individual fields are AUTOMATICALLY saved-to-file and in-memory-stored as encrypted quantum-computing resistant encrypted bytes to ensure RAM scavenger programs can't get to those always-encrypted fields or records!

        We also added expanded easier-to-read #IF_DEFINED, #IF_NOT_DEFINED and #DEFINE/#UNDEFINE keywords plus some of the direct #DEFINE-based code-substitution-handling specifications and operator handling of C/C++ to our Object Pascal version AND we added HUGE 6D ARRAY HANDLING (i.e. 2^127 x 2^127 x 2^127 x 2^127 x 2^127 x 2^127 sub-dimension array indexes) for our MASSIVE multi-level databases! Plus we rewrote ALL the world's major audio/video/still image pixel processing, line and curve drawing and codecs routines for 2D-XY and 3D-XYZ capable multi-frame and multi-sample video, audio and metadata layering, enhancement/filtering and compositing using 128-bits wide RGBA (32-bits per channel) numbers so that massive-sized high-dynamic range imagery can be processed and displayed in real-time!

        We simplified and LIMITED our version of Object Pascal to handle highly size-specific Signed/Unsigned Integer, Floating Point, Fixed Point, ASCII and UNICODE strings, Multi-state Boolean Types and Memory Addresses (aka Pointers), custom-sized RGBA/HSLA/YCbCrA/CMYKA/CIE_XYZA pixels and accelerated RDBMS Records and Fields handling in a very specific manner so that programmers can direct their attention to high performance data processing and complex math for realtime data sensing and filtering.

        You can STILL use the compiler for other CPUs but the NEW Object-Pascal language we created is much smaller and simpler to learn and ENFORCES code readability and ENFORCED code commenting! For graphics, DSP and audio/video we have a super-fast hardware-accelerated windowing, user-interface handling and low-level pixel/line/curve drawing and metadata processing library that is much more readable and easier to understand than Direct-X, DirectShow or Windows Presentation Manager!

        Procedure Play_Media_File_In_Window(
            Destination_Network_Identifier,
            Destination_Computer_Identifier,
            Destination_Display_Number,
            X1_Coordinate_In_Pixels,
            Y1_Coordinate_In_Pixels,
            X2_Coordinate_In_Pixels,
            Y2_Coordinate_In_Pixels : Signed_Integer_128_Bits_Type;
            Transparency_Level_In_Percent : Fixed_Point_128_Bits_Type;
            Name_Of_Media_File : UNICODE_Character_String_Type;
            Starting_Playback_Position : Signed_Integer_128_Bits_Type;
            Playback_Speed_In_Percent : Signed_Fixed_Point_128_Bits_Type;
            How_To_Playback_Media : Semaphores_Type = [ WINDOW_HAS_NO_BORDER_OR_CAPTION_BAR,
                START_FROM_DESIGNATED_VIDEO_FRAME,
                KEEP_ASPECT_RATIO_WITHIN_WINDOW,
                PLAY_CANADIAN_ENGLISH_VERSION_VIDEO_AND_AUDIO_ONLY,
                LOOP_VIDEO_CONTINUOUSLY,
                LOCK_PLAYBACK_WINDOW_IN_PLACE,
                SHOW_ONLY_BASIC_VIDEO_AND_AUDIO_PLAYBACK_CONTROLS ] );

        I would say that the above procedure is MUCH easier to read and understand than what Microsoft provides!


      2. [email protected]

        Re: Which language do you think is used to implement all those memory-safe languages?

        .NET can't be written in C# since .NET requires hooks into the OS. C# is interpreted at run time, so can't have anything to do with the OS.

        Delphi cannot be written in Delphi and Go cannot be written in Go.

        You cannot make memory safe language with a memory safe compiler.

        It's not possible. A compiler has to have the lowest level, direct address access.

      3. bazza Silver badge

        Re: Which language do you think is used to implement all those memory-safe languages?

        >.NET has been written in C# for some years now.

        >Prior to that Delphi was written in Delphi from the start.

        >(mentioned becuse it had the same creator as C#)

        >And Go is written in Go.

        Er, apologies for the pedantry, but if I take even the merest of casual glances around https://github.com/dotnet/runtime, I seem to find an awful lot of C and C++ files. There is C# code there too, but it's clearly a long long way from being entirely written in C#. I gather that the C# compiler is written in C#, but the .NET runtime is (at least in large parts) not. But, more on that later.

        However, I firmly agree in general terms. Once a language and/or environment is bootstrapped into itself, you're up and running. One might have got to a running Go compiler by writing one in C++ first, but indeed, so what? That's not the compiler that matters, it's the one that's in use (the one written in Go) that matters.

        Runtimes

        Runtimes are interesting, because of what they have to do. For example, a C program cannot print anything on a terminal without using functions in (for example) glibc, and those functions work only because they've been written to load up the appropriate registers with the appropriate data and generate an interrupt to initiate a kernel / system call. In theory you don't need glibc for C, because if one really wanted to your own program could make the system call itself.

        Such things are not options for all languages. There's many languages that have no concept of registers and interrupts within them, and don't include any means of inlining assembler to do it either. So, one cannot write their runtimes (that have to interact with an operating system) purely in that language; there has to be a language to native transition somewhere within the runtime. Presently, that means having to use C, C++ or assembler (or other language that lets one mess around with actual registers and interrupts) for the runtime.

        Rust is interesting because, whilst it shares much of the look and feel of a high level language, it is possible in Rust to make operating system calls directly from within Rust. So, all of Rust (including its runtime) can be written in Rust, with no trace of C / C++ at all (unless one starts considering the implementation of the syscall inside the OS itself!). So, one could perhaps "improve" languages / platforms like C#, Java, by re-writing their runtimes in Rust.
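        The point that Rust can talk to the kernel without any C in between can be sketched directly. A minimal illustration (Linux x86-64 only, using stable Rust's `std::arch::asm`, with a hypothetical `raw_write` helper): it invokes the `write(2)` system call by loading the syscall-number and argument registers itself, no libc wrapper involved.

```rust
use std::arch::asm;

// Invoke the Linux x86-64 write(2) system call directly -- no libc.
// Syscall number 1 goes in rax; fd, buffer pointer and length go in
// rdi, rsi and rdx. The kernel clobbers rcx and r11.
fn raw_write(fd: i32, buf: &[u8]) -> isize {
    let ret: isize;
    unsafe {
        asm!(
            "syscall",
            inlateout("rax") 1isize => ret, // 1 = SYS_write; result returns in rax
            in("rdi") fd as isize,
            in("rsi") buf.as_ptr(),
            in("rdx") buf.len(),
            out("rcx") _,
            out("r11") _,
        );
    }
    ret
}

fn main() {
    let msg = b"hello from a raw syscall\n";
    let written = raw_write(1, msg); // fd 1 = stdout
    assert_eq!(written as usize, msg.len());
}
```

        A runtime built this way still depends on the OS's syscall ABI, of course, but not on any C code in userspace.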

        Which, if done, would eliminate the "Which language do you think is used to implement all those memory safe languages?" question altogether.

        Has anyone done a Python implementation in Rust yet?!?!?!? Oh yes!

        System Calls from High Level Languages?

        C# is interesting, because of C++/CLI. I've often used C++/CLI as a means to integrate calls to native code into .NET, because it makes it so trivial to do so. C# can do it, but there's a lot of marshalling and P/Invoking going on. With C++/CLI, you just call it (and, I guess, it's hiding all the marshalling and P/Invoking from you).

        What I've not explored is whether or not one could make a system call in C++/CLI. One doesn't need to (or indeed, know how to) on Windows (that's what Win32.dll is there for), and I don't know if C++/CLI can run in .NET Core on Linux (I think it might be from .NET Core 3.1). However, my point is that there is a .NET language that can (or very nearly can) make system calls directly from within the language. That suggests that without too much extension or difficulty, a high level language like C# could be given the means to make system calls, and then both it and its runtime could be written entirely in the language itself.

    4. StrangerHereMyself Silver badge

      Re: Which language do you think is used to implement all those memory-safe languages?

      Rust handles out-of-memory similar to C, an allocation function returns an Err Result. The code would then need to act accordingly.
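      For reference: Rust's standard collections abort on allocation failure by default, but the fallible path described here does exist via `try_reserve`, which returns a `Result` the caller must act on. A minimal sketch:

```rust
// Fallible allocation in Rust: Vec::try_reserve returns a Result
// instead of aborting the process, so out-of-memory (or a capacity
// overflow) can be handled like any other error.
fn main() {
    let mut v: Vec<u8> = Vec::new();

    match v.try_reserve(64) {
        Ok(()) => println!("reserved room for 64 bytes"),
        Err(e) => eprintln!("allocation failed: {e}"),
    }

    // An absurd request exercises the error path without actually
    // exhausting memory: the required capacity overflows.
    assert!(v.try_reserve(usize::MAX).is_err());
}
```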

    5. DrXym

      Re: Which language do you think is used to implement all those memory-safe languages?

      Rust is actually easier to write than C/C++ in a lot ways but it has a steep unlearning curve from C/C++.

      I say unlearning because the C/C++ compiler doesn't care about a lot of things that the Rust compiler will punish you for until you stop doing them, e.g. abusing ownership rules, memory safety etc. The advantage however is that once you do learn the Rust way, a lot of the practices are transferable back into C/C++ to make the code you write in those languages less unsafe than it was before. I say less unsafe, because the language is always going to be broken, but some badness can be mitigated.

      I say easier aside from the unlearning because the language is terse, the tools are simple to use, the compiler is friendly, stuff like unit testing & package management is built into the tools, async programming is part of the language and code tends to be more portable thanks to a better standard library.
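      The "things the Rust compiler will punish you for" can be made concrete with the ownership rules. A minimal sketch of what the borrow checker rejects, and the ordering it accepts:

```rust
// The borrow checker forbids mutating a Vec while a shared borrow of
// its contents is still alive -- a pattern C/C++ compiles happily
// (and which dangles if the buffer reallocates).
fn main() {
    let mut names = vec![String::from("ada"), String::from("grace")];

    // This would NOT compile:
    //   let first = &names[0];
    //   names.push(String::from("edsger")); // error[E0502]: cannot borrow
    //   println!("{first}");                //   `names` as mutable...

    // The accepted ordering: the shared borrow ends before the mutation.
    {
        let first = &names[0];
        println!("first = {first}");
    } // borrow of `names` ends here
    names.push(String::from("edsger"));
    assert_eq!(names.len(), 3);
}
```

      The same discipline -- don't mutate a container while something still points into it -- is exactly the habit that transfers back into C/C++.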

  4. b0llchit Silver badge

    Stop with the useless A better than B crap

    The programming language you use is the right one for the task at hand. That can be any language, be it, to name a few, C, C++, Rust, Java, Python, C# or Assembly. You need to know which tool to use for the project or task. Finding that out is part of being a programmer.

    You need a real programmer, regardless of the language used. You have to learn the language and build with it to become experienced. All programmers need supervision and all programmers need to supervise. That is how you learn and improve to the expertise level of the language at hand.

    1. SammyB

      Re: Stop with the useless A better than B crap

      If I had them, I would give you 10 thumbs up, as it is you get 2.

    2. Someone Else Silver badge

      Re: Stop with the useless A better than B crap

      That can be any language, be it, to name a few, C, C++, Rust, Java, Python, C# or Assembly.

      But specifically excludes APL.

      And probably that thing called "sappeur", which likely is even less mature than Rust, and except for one particular commentard, has never been heard of, much less used, in any commercial project, ever, anywhere.

      1. Gary Stewart

        Re: Stop with the useless A better than B crap

        In my Introduction to Computer Science class (circa 1976) we had to write programs in 2 languages. One was Fortran, using 80 column punch cards of course, and the other was APL that used DECwriters with APL keyboards. I can't imagine a more useless language for that class. And to add insult to injury the APL program was a get the bot to navigate a maze program (CIA looking for exceptional talent?). Well I did manage to get the Fortran program running correctly, I don't remember what the program did but it was fairly simple, but did not have a clue on the APL program. I think I got a C (not sure where this grade lands in the UK system) in the class and was grateful for that.

        1. DoctorNine

          Re: Stop with the useless A better than B crap

          Jeebus... Reading that gave me the same sort of sensation as those dreams of high school where you're standing naked outside your locker, and you realize you didn't do your math homework.

          1. stiine Silver badge
            Unhappy

            Re: Stop with the useless A better than B crap

            Lucky you. I could never find my damn locker. When I could find my damn locker I could never remember my lock's combination. After a week, I just carried all of my books to every class.

        2. Peter Gathercole Silver badge

          Re: Stop with the useless A better than B crap

          The secret with APL is to forget traditional program techniques with loops iterating over a data set, and learn how to transform the data using something akin to vector and matrix manipulation.

          I did my first year university project in APL, which was to write a system to allow APL code to be printed on normal line printers, replacing the special characters with mnemonics. Started writing it as a traditional looping approach, and it got to the point where a single run of the major program would take minutes to complete and wipe out my whole day's computing budget. It was also many hundreds of lines of quite intractable APL.

          Went away at Easter, and it finally dawned on me how to do the task by setting up a set of transformation data objects, and reading the entire data to be worked on into another, and then just used the built in operators to do each step of the transformation in simple steps.

          The whole project then dropped to about 50 lines of APL, most of which were just doing the setup and outputting the results, and it ran almost instantly.

          I got marked down, because the assessor considered such a small program to be too trivial to be a suitable output from the project (even though he had actually been supervising it, and didn't say anything while I was working), but I feel that he overlooked the write-up of the failed first attempt and the subsequent rethink and understanding of the language that the report demonstrated.

          He was also the lecturer teaching APL to us, so I have sometimes wondered whether he really understood the language in enough detail himself (he taught it alongside relational algebra, so I think he wanted to treat it like a database manipulation language).

          1. Someone Else Silver badge

            Re: Stop with the useless A better than B crap

            The secret with APL is to forget traditional program techniques with loops iterating over a data set, and learn how to transform the data using something akin to vector and matrix manipulation.

            From my point of view, the secret to APL is to not use it.

            YMMV, of course...

            1. Peter Gathercole Silver badge

              Re: Stop with the useless A better than B crap @Someone Else

              I never used APL after I left University (although when I worked for IBM in the AIX Support Centre, the fact that I had used it made me the person designated to handle calls that came in about APL/2 (on the PS/2 running AIX) and APL/6000 on the RISC System/6000, not that any did).

              But I've never completely forgotten about it because I thought it was interesting (and I've forgotten several other languages almost completely since so, there must have been something about it), and when I came to learn S and then GNU R, some of the thinking was transferable.

          2. Anonymous IV

            Re: Stop with the useless A better than B crap

            I thought that it was compulsory in APL that all programs must occupy one line only?

            1. Peter Gathercole Silver badge

              Re: Stop with the useless A better than B crap

              No, that's just an elitist desire. From memory, you can do lots of nested operations on a single line, as each function returns an object that can be the input to another function. But this is not hugely different from most other languages (I often write whole quite complex functional programs as single lines of UNIX shell).

              The difference is that each object can be a whole set of data in a multi-dimensional array, so as long as you get the shape of each object compatible, you can operate on whole sets of data as parameters to single functions, and each return value can be used as a parameter to other functions. And these can be nested together to create unreadable code that can do whole programs on just a single line, as you said.

              In the example I quoted, the actual functional element of my project was about 5 lines of code, surrounded by some set-up and some clean-up. And if I had desired to make it completely unreadable, I may just have been able to reduce the essence of the program to a single functional line, but you have to remember this was an assessed project, and writing clever but completely unreadable code would probably have resulted in a low mark from the outset, irrespective of how 'clever' it was.

              I'm not a C++ or any other type of object oriented programmer, but I'll bet that you can do even more convoluted things to complex objects in any number of OO languages, probably even PowerShell. But APL is old. Very old (first implemented in the 1960s). And it has been able to work on multi-dimensional data objects as single entities for almost all of its existence. The only things that are different are that the built-in functions are represented by single characters written in something that looks like over-struck Greek, plus the fact that you have dyadic operators (they take parameters from both before and after the operator function name), which is unusual in most languages.

          3. rafff

            Re: Stop with the useless A better than B crap

            "reading the entire data to be worked on"

            So how long does it take to process a non-terminating (potentially infinite) data feed?

      2. Blue Pumpkin

        Re: Stop with the useless A better than B crap

        But specifically excludes APL.

        And pothibly Lithp ath well …

    3. Anonymous Coward
      Anonymous Coward

      Re: Stop with the useless A better than B crap

      Well said. Maybe if companies didn't contract out to the lowest bidder, or employ programmers for peanuts, they'd get the results they want without forcing programmers to wear inefficient buoyancy aids.

    4. bazza Silver badge

      Re: Stop with the useless A better than B crap

      I remember reading an article in a journal from the UK's IEE (now, IET), back in the days when the IEE really was a hotbed of engineering study and standards setting. The article was the results of a study and analysis of real safety critical systems in actual use. This was a very long time ago - early 1990s I think - and I'm afraid I can't remember the title or any reference, and it's pre-Internet.

      The systems were mixed - air traffic control, nuclear reactor control, flight control, etc. All stuff that mattered. A variety of languages had been used - C, Ada, even Fortran I think. All had cost a large amount of money, and all had good operational reputations. The systems were studied against two measures - semantic errors per 1000 lines of code (i.e. code that compiled, but was wrong against the specification), and operational errors per 1000 hours of operation.

      They were all pretty good, but none were perfect. The interesting thing was that the very best was an air traffic control system, written by IBM, in C. The worst was some system written in Ada (I can't remember what it was).

      The analysis considered this, and determined that for the IBM ATC system, IBM had rolled out its very best, most experienced A-Team developers, and the dangerous-chainsaw view of C was referenced (if you have a chainsaw with zero safety features, blatantly dangerous to use, you may very well take a lot of care in using it!). The Ada system did less well because they'd had a less experienced team. The IBM ATC system was (I think) also the most expensive...

      Modern World

      One of the challenges these days is getting anyone to do any coding in anything like a rigorous manner. For a lot of systems that get built these days, there's little expectation of producing a high quality result, and the end users have little faith in getting one. Bugs / system cruddiness has become far too "normal".

    5. StrangerHereMyself Silver badge

      Re: Stop with the useless A better than B crap

      This is nonsense. Just like technology in general, programming languages have improved too. People have had new insights over the years, and the results are languages like Java, C# and Rust, which build upon previous improvements like C (which was built on assembly language and the insight that high-level Systems Programming Languages could improve readability and portability).

      I keep stressing that C is a Systems Programming Language which is erroneously being used as an Application Programming Language. Rust too makes the mistake that it wants to be memory-safe Systems Programming Language instead of an Application Programming Language.

      1. Peter Gathercole Silver badge

        Re: Stop with the useless A better than B crap

        I think you have to be a bit careful when grouping code functions into categories such as "Application Programming Languages".

        Even back in the day of classic languages, before the proliferation of sooo many different languages, you would say that something like Fortran, Cobol or Algol were all application programming languages, but you probably wouldn't have used Cobol for your engineering modelling, or Fortran to print out bank statements from the customer database (although I'm sure you can find examples of both of these things in history).

        And then you had languages such as PL/I, which really was designed to be both a systems programming language and an application programming language, yet was never widely adopted (outside of IBM) as either, even though it was actually competent at both, in a rather heavyweight way.

        Modern applications, like those supporting on-line, web-based applications or AI, have introduced more, and different, types of challenges. Languages such as C, and even more so C++, can handle them without huge amounts of difficulty, but often specific languages have been written to specialise in those problems. But that does not make these new languages suitable for all application types.

        If you were to try to draw a Venn diagram of application types (and don't ask me how to categorise them, I'm just talking generically), and then see which languages spanned the most different types of applications, I'm pretty sure that C, and maybe even more so C++, would have the largest footprint of any language. Whether this is still a good thing or not is the essence of this debate, but what I think it underlines is that it is still horses-for-courses, and with language development going as it is, we will never get to the point where there is a single, generic application programming language that is good for all application types.

        1. StrangerHereMyself Silver badge

          Re: Stop with the useless A better than B crap

          C was specifically designed as a high-level language for programming operating systems with. It was essentially a "high level assembly language" which enabled porting of operating systems to different architectures with a minimum of effort.

          Therefore, all the snafus in the language such as pointer arithmetic, undefined behavior, pointer confusion, casting any-way-you-like, lack of bounds checking are there for a reason! They weren't oversights by the people that created the C language. These were needed for operating system development and they had the added advantage of increased performance compared to languages which would prevent you from making such mistakes.

          Now for a general purpose Application Programming Language such features aren't needed. You would trade some speed for a vastly more reliable and safe programming environment.

          The universal usage of C/C++ has clouded the judgement of many programmers who fail to see the distinction and try to shoot for a "one language that does it all" type of solution and Band-Aid fixes for its drawbacks.

  5. Chris Gray 1
    WTF?

    Why?

    Can someone give me a summary of just what about Rust is causing the translation problems? Likely there are multiple reasons. I don't think I've seen a summary of what the issues are. Yes, Rust has a different syntax. But seriously, that can be overcome. Is it the semantics of borrowing?

    I've written stuff in several different assemblers (thousands of lines in IBM-370), C, Lisp, Basic, Forth, AlgolW, Pascal, Fortran, a bunch of my own languages, and likely some I've forgotten. Many of those don't have pointer-like things so you can't do stuff you'd want. Is that the main issue? Is it how borrowing works?

    (I've written a memory safe compiler with restricted pointers, structs, safe unions, allocated records, arrays, etc. etc. If I can get it to full maturity, would something like that work? See http://www.graysage.com/Zed/New . I'm close to finished doing all the hacks in Elf .o files to match gcc/ld's desires.)

    1. Paul Crawford Silver badge

      Re: Why?

      Can someone give me a summary of just what about Rust is causing the translation problems?

      Effort. Which in turn requires time and resources to train staff, to translate/replicate existing code, and potentially to maintain duplicates, etc. And that has a significant cost attached to it that is often not granted, as customers generally demand fast & cheap.

      I'm not saying memory-safe is a bad goal, it is clearly very good, but sometimes it is going to be easier to slap an AppArmor profile around some existing code than to re-write it in a safer language and then have to debug the new bugs, etc.

    2. Gene Cash Silver badge

      Re: Why?

      Another problem is learning Rust in today's Rust community.

      You ask noob questions because you're a noob, and you get told you're an idiot, instead of being pointed to a FAQ, documentation, or given helpful advice.

      Or you ask "is this the right way to do it?" and the same thing happens.

      I heard about Rust, went to the forums and saw that, and that's as far as it went.

      1. Proton_badger

        Re: Why?

        Bah, you'll see that for any language/library/etc. People are people and there are some like that but there are also lots of friendly devs who do their best to help.

      2. Phil Lord

        Re: Why?

        This runs completely counter to my experience. The Rust forums have been extremely helpful, with some folks going to a lot of effort to answer even in-depth technical questions.

        In fact, they have a strong policy that says "don't tell people to do their homework". The logic is that if you don't want to answer a question, you don't have to. Just move on rather than take the time to say it shouldn't have been asked in the first place.

        1. Caspian Prince

          Re: Why?

          I don't know if it's Reg readers being deliberately ironic for shits and giggles but as if to hammer home the point, the post asking the question has received 10 downvotes and only 2 up as of the time of writing...

    3. thames

      Re: Why?

      The issue is that to convert a C program to a "proper" Rust program you have to approach a lot of things in a different way. Rust is not simply C with different key words.

      If you want an analogy, then if you've done much Python programming you have probably seen Python programs that were either directly translated from C or Java, or written by people familiar with C or Java and are just writing their Python program the same way they write a C or Java program. This is a well known phenomenon in Python and is immediately recognizable by people with experience with the language. The end result is something very verbose and slow because the Python language is being used in ways it was never meant to be in order to make it work like a different language.

      With C and Rust, there have been multiple machine translation projects, including AI based, but the results have been very disappointing. You simply end up with a C program that is written in Rust rather than a "proper" Rust program, and you probably haven't made much if any use of the memory safety features which you were converting to Rust for. So why bother?
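      The "C program written in Rust" phenomenon is easy to illustrate. Both functions below compile and give the same answer; only the second is idiomatic Rust, letting the iterator machinery stand in for manual index bookkeeping (function names here are invented for the example):

```rust
// The same task twice: "C written in Rust" versus idiomatic Rust.

// C-style: manual index, mutable loop state, per-access indexing.
fn sum_of_squares_c_style(data: &[i64]) -> i64 {
    let mut total = 0;
    let mut i = 0;
    while i < data.len() {
        total += data[i] * data[i];
        i += 1;
    }
    total
}

// Idiomatic: iterator chain; no indexing, no explicit loop state.
fn sum_of_squares(data: &[i64]) -> i64 {
    data.iter().map(|x| x * x).sum()
}

fn main() {
    let data = [1, 2, 3, 4];
    assert_eq!(sum_of_squares_c_style(&data), 30);
    assert_eq!(sum_of_squares(&data), 30);
    println!("sum of squares = {}", sum_of_squares(&data));
}
```

      A machine translation of the first style works, but gains almost nothing; the benefits come from restructuring into the second.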

      The answer would seem to come down to either completely replacing all existing C and C++ programs with ones written in other language (like that's ever going to happen), or else address the problem in a different way, one that nobody has really come up with yet.

      1. Chris Gray 1

        Re: Why?

        Thank you.

        There have been attempts in the past to make a "safe C", but I haven't been tracking them. My guess is that they don't have answers for some of the "creative" things that C programmers know how to do, and will continue to do until stopped. :/)

        In many ways, my Zed language is a follow-on to C. Most of the things you can do in C, a privileged Zed programmer can do. But you can also do safe things with slightly different techniques. For example my '@' values are much like C++ refs, but are explicit, not implicit, so programmers are less likely to mess things up (although many will complain about the extra character).

        Similarly, I use the 'nonNil' concept to let many runtime checks for NULL be omitted. (Note: there was a previous language that used the word "nonNil" for something quite different.) This also clarifies the requirements on parameters, etc.

        As for bounds checking, I've got some simple "constraints" that the semantic code can identify and thus inform code generators that some runtime bounds checks are not needed. Part of doing that is that the iterator variable of a 'for' loop is 'con', i.e. not writeable. If your C "for" loop is actually a "while" loop, then just rewrite it as a 'while' loop.

        Also, Zed 'enum' values are guaranteed to be in-range, so their use in 'case' constructs, as array indexes, etc. does not require extra checking (checking is done when arithmetic is done on them). Zed has 'oneof' types that are like C "enum" types.

        Switching to using this stuff, and more, does take effort - I've done some of it when I translated my initial C parser to Zed, but if management demands it, it's not going to kill you, so long as time is budgeted. You can even do a bunch of prep work in C (like the 'for' loop stuff), checking that your program keeps working. For 'nonNil', you could add asserts at the start of functions where you want one or more parameters to always be non-NULL.

      2. Michael Strorm Silver badge

        Re: Why?

        > So why bother?

        Because someone required you to write it in Rust, and doing so lets you tick that box?

      3. Old Used Programmer

        Re: Why?

        The old version of that is... A *real* programmer can write a FORTRAN program in any language.

    4. Richard 12 Silver badge

      Re: Why?

      Rust is both incomplete and unstable.

      There is exactly one implementation, and there is no standard, or even a specification - so there cannot be another implementation.

      That doesn't mean it's bad. It means it is not ready.

      There are some design decisions that mean it cannot replace C or C++.

      1) It cannot replace C because it only supports static linking. All external interfaces are C.

      2) Rust cannot ever replace C++, and trying is horrifically unsafe, because it has no inheritance. Anyone claiming you can do that does not understand either language - this was never a goal.

      Given another decade, Rust may fix the stability and dynamic linking issues.

      Given another decade, C++ will be provably safer than Rust.

      1. Jamie Jones Silver badge

        Re: Why?

        > There exactly one implementation, and there is no standard, or even specification - so there cannot be another implementation.

        Until tomorrow. At which point, there will be another implementation, a variation of the previous implementation. But it will be the only implementation, because it will replace the implementation from the day before, and yet still, there cannot be another implementation...

        Until tomorrow...

      2. tttonyyy

        Re: Why?

        While it's true that Rust has no inheritance, there are traits and supertraits that allow polymorphic behaviour.

        This was apparently an early design decision to avoid the classic C++ diamond inheritance problem, but I've not used the language enough to understand if the same problem exists in a different way through traits.
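        For what it's worth, the trait/supertrait mechanism looks like this. A minimal sketch (all type names invented for illustration): `Shape` declares `Named` as a supertrait, so every `Shape` implementor must also be `Named`, and trait objects give dynamic dispatch without any inheritance hierarchy.

```rust
// Supertrait: every Shape must also implement Named, roughly where
// C++ would reach for a base class.
trait Named {
    fn name(&self) -> &'static str;
}

trait Shape: Named {
    fn area(&self) -> f64;
}

struct Circle { radius: f64 }
struct Square { side: f64 }

impl Named for Circle { fn name(&self) -> &'static str { "circle" } }
impl Shape for Circle { fn area(&self) -> f64 { std::f64::consts::PI * self.radius * self.radius } }

impl Named for Square { fn name(&self) -> &'static str { "square" } }
impl Shape for Square { fn area(&self) -> f64 { self.side * self.side } }

fn main() {
    // Dynamic dispatch over trait objects -- polymorphism without
    // an inheritance hierarchy.
    let shapes: Vec<Box<dyn Shape>> = vec![
        Box::new(Circle { radius: 1.0 }),
        Box::new(Square { side: 2.0 }),
    ];
    for s in &shapes {
        println!("{}: area {:.2}", s.name(), s.area());
    }
}
```

        Because a type implements traits rather than inheriting state, the diamond problem as C++ knows it doesn't arise, though method-name collisions between traits still need explicit disambiguation.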

        So far I've not seen anything in Rust that couldn't be achieved in C++, the difference is that a C++ programmer would need the right disciplines to create Rust-like behaviour (not enforced by a compiler) whereas Rust forces the coder to follow the regime.

        Will it turn out to be the next "fad" language? I remember learning (and loving) Ruby, though that didn't live up to its promises in the way that Rust has - so far. We will have to wait and see.

      3. AdelaideSimone

        Re: Why?

        Rust has guaranteed backwards compatibility, except for a few small things, so its "instability" doesn't matter. They also readily provide prebuilt binary toolchains, even for old versions, so you can simply specify whatever version you absolutely have to use.

        Rust also does support dynamic linking. There's just very little reason to use it most of the time. Android actually provides some interfaces and such through Rust, and they are shared libraries.

        You're really only correct about the lack of inheritance, and that limits replacing some C++ programs; however, Rust can provide some similar functionality with its traits, which are a form of composition. Some of it is down to personal choice and debatable, but many cases of inheritance can be replaced by, and better designed with, composition.

        1. Dagg Silver badge
          Black Helicopters

          Re: Why?

          Rust has guaranteed backwards compatibility

          Yea, right until the next version is released and a previous feature is deprecated...

      4. bazza Silver badge

        Re: Why?

        If it comes to that, neither C nor C++ supports dynamic linking. Dynamic linking is really an OS thing, and a question of whether or not there are library functions for tapping that OS functionality.

        There's another way of doing inheritance. Rust - thanks to its strong knowledge of ownership - is an ideal language in which to implement CSP (like, Golang has also implemented CSP). And, there are CSP implementations for Rust. CSP is interesting because it opens up a whole different view of Object Orientated Programming, Encapsulation, Inheritance, etc.

        Encapsulation is a CSP process with its own internal data. That data is not directly accessible to another CSP process (like it would be from another thread in the same process in an SMP environment). A process is an opaque "Object" with interfaces (channels). A method is invoked by sending messages to the process via a CSP channel, causing the process to do something. A result may be returned down another channel. Inheritance is easy; a process incorporates another process, and passes any messages it receives that it doesn't recognise down into that inner process. Multiple inheritance is easy - a process incorporates two or more other processes.
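
        That message-passing style can be sketched in safe Rust, with `std::sync::mpsc` channels standing in for CSP channels (a minimal illustration, not a full CSP library; `Counter` and its messages are made up for the example):

        ```rust
        use std::sync::mpsc;
        use std::thread;

        // The messages a "Counter" process understands - its only interface.
        enum Msg {
            Add(i64),
            Get(mpsc::Sender<i64>), // carries a reply channel for the result
            Stop,
        }

        // Spawn the process; its state (`total`) is unreachable from outside.
        fn spawn_counter() -> mpsc::Sender<Msg> {
            let (tx, rx) = mpsc::channel::<Msg>();
            thread::spawn(move || {
                let mut total: i64 = 0; // encapsulated internal data
                for msg in rx {
                    match msg {
                        Msg::Add(n) => total += n,
                        Msg::Get(reply) => {
                            let _ = reply.send(total);
                        }
                        Msg::Stop => break,
                    }
                }
            });
            tx
        }

        fn main() {
            let counter = spawn_counter();
            counter.send(Msg::Add(2)).unwrap();
            counter.send(Msg::Add(40)).unwrap();
            // "Invoke a method" by sending a message with a reply channel.
            let (rtx, rrx) = mpsc::channel();
            counter.send(Msg::Get(rtx)).unwrap();
            println!("{}", rrx.recv().unwrap()); // prints 42
            let _ = counter.send(Msg::Stop);
        }
        ```

        Sending the reply channel inside the message is how a "method" returns a result; forwarding unrecognised messages to an inner process's sender would give the inheritance described above.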

        This way of looking at CSP is quite useful for helping those with long and deep experience in object orientated programming to switch over to CSP. It's also very useful in that, with a collection of CSP processes instead of a collection of C++ objects, the deployment of those processes suddenly becomes very "agile". Go and Rust do not extend CSP channels over network connections, but it would be very nice if they did (and there's already enough in the language to allow them to work that out for themselves if that's what they wanted to do). Erlang does (I think).

        A closely related idea - Actor model programming - is more easily network distributed. Libraries such as ZeroMQ make that veeeery easy indeed.

        The interesting thing to consider is whether - in this day and age - ideas like CSP / Actor model are any slower than conventional code written for SMP environments (i.e. memory shared between processes / threads, guarded by locks, etc). In today's multi-core, multi-CPU machines, data is not really shared; to be "shared", it has to be copied into all the relevant caches, and there's a lot of cache-coherency traffic between cores to make sure they all have the same "version" of the data. Whereas with CSP / Actor model programming, you just copy it from A to B; there is no cache coherency to guarantee. So down at the microelectronic level, there can be precious little difference. There are quite a few computer scientists these days who think we should go the whole hog and abandon SMP, simply to get rid of defects like MELTDOWN and SPECTRE forever.

        C++ can never be provably safer than Rust. They'd have to deprecate most of the language to achieve that. It could equal Rust, but I can't see how it could ever be safer (unless it becomes Rust without the unsafe keyword). As others have said, Rust does support dynamic linking. It would be nice if there were an ISO Rust, but it's

    5. Orv Silver badge

      Re: Why?

      I think at this point a lot of C programmers have seen new programming languages come and go, and they view Rust as just the latest fad. No point in learning it when it'll be gone in another five or six years.

      1. bazza Silver badge

        Re: Why?

        There is no doubt some element of that. The difficulty this time is that Rust might not go away in five or six years. It's certainly the strongest challenger there's ever been.

        I think the strength of Rust is causing a lot of understandable angst. If one has decades of C/C++ behind one's career - like I do - Rust is threatening to reduce us all back to square one. That is threatening, as one can no longer say that one has 20+ years more experience than, say, a kid fresh out of college. However, it's largely a matter of attitude. Learning it is achievable, and if achieved, one is ready to go. And, there's an awful lot more to programming experience than just "what language one talks".

        We have been here before. Once, there were a lot of assembler programmers convinced they'd got jobs for life. That didn't last...

        For projects, it's a real problem. Staying in C, now, could mean having a dead project in 5 to 6 years if the world decides to go for Rust. Yet, converting to Rust now might be a wasted effort. Some cautious transitioning (given that it's achievable) is probably a good idea, just to get a toe-hold in the Rust world. Linus seems to think along those lines, though not everyone in the LKP seems convinced.

        I think that the real indicator would be if Rust got standardised. Once there is ISO Rust (or, similar), then it's something that becomes very hard to say is temporary. In fact, I'd not wait for it to become standardised; I'd start when it became clear that ISO were starting to standardise it.

        1. Dagg Silver badge

          Re: Why?

          Yet, converting to Rust now might be a wasted effort

          Yep, with you on that one. I spent considerable time and effort upskilling in Java and never managed to use it. There was way more work and money in C/C++.

      2. karlnapf42

        Re: Why?

        The problem is that Rust has proven that you can have memory safety without a garbage collector or other runtime costs. It does not matter whether Rust stays or gets replaced with something else; it has moved the bar already.
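
        A minimal sketch of what that means in practice: the compiler tracks ownership, so the buffer below is freed exactly once, at a point known at compile time, with no collector involved (`consume` is a made-up function for illustration):

        ```rust
        // Taking `Vec` by value takes ownership: the heap buffer is freed
        // exactly once, when this function returns. No GC, no refcounting.
        fn consume(v: Vec<i32>) -> usize {
            v.len()
        } // `v` is dropped here

        fn main() {
            let v = vec![1, 2, 3];
            let n = consume(v); // ownership moves into `consume`
            // println!("{:?}", v); // would not compile: value used after move
            println!("{}", n); // prints 3
        }
        ```

        The commented-out line is the point: a use-after-free is a compile error rather than a runtime exploit.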

        Languages now need to deal with that, one way or the other.

    6. MikeTheHill

      Re: Why?

      > Can someone give me a summary of just what about Rust is causing the translation problems?

      Data structures and ABIs. Rust has its own way of representing data structures and its own way of calling functions - it needs this as part of its borrow-checker machinery, and possibly for other reasons.

      That means the cross-over between Rust and C/C++ is a pain as there has to be an unsafe conversion in both directions. And in some cases there are no easy conversions so the C function has to adapt and thus all the callers of that C function also have to adapt.

      So if you are completely replacing a standalone C/C++ executable *and* all its dependencies at the same time, then no big deal. But, if you are replacing a single source module in a morass of C/C++ then it's much harder as you have to write unsafe Rust wrappers for every C/C++ function called and for every C function you replace. And you may have to adapt some C functions to suit Rust.
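
      A minimal sketch of one such wrapper, assuming we bind C's `strlen` (any real kernel interface would be far hairier, but the shape - an `unsafe` call hidden behind a safe function - is the same):

      ```rust
      use std::ffi::CString;
      use std::os::raw::c_char;

      // Declare the C function; the compiler cannot check anything about it.
      extern "C" {
          fn strlen(s: *const c_char) -> usize;
      }

      // The safe wrapper upholds C's contract (valid, NUL-terminated pointer)
      // so callers never need to write `unsafe` themselves.
      fn safe_strlen(s: &str) -> usize {
          let c = CString::new(s).expect("no interior NUL bytes");
          unsafe { strlen(c.as_ptr()) }
      }

      fn main() {
          println!("{}", safe_strlen("hello")); // prints 5
      }
      ```

      Every C function a Rust module touches needs this treatment, which is the "fleet of unsafe wrappers" problem: the wrappers must track the C side's every interface change.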

      This is why Rust in Linux is so hard as essentially every source module in Linux calls many other C functions so replacing even a single source module with a Rust version requires writing a fleet of unsafe Rust wrappers and potentially modifying the C functions to be more amenable to Rust callers.

      Even if someone goes to the trouble of writing a bunch of wrappers to, say, the networking stack in Linux, that's not the end of the job by a long-shot. Quite the opposite in fact.

      The question arises as to who changes these Rust wrappers when the networking folk want to change their C interfaces? The networking folk argue they don't know Rust, so they can't do it, but if they push their change to the repo, suddenly the kernel build fails due to the now-incompatible Rust wrappers.

      So do the networking folk just sit around and hope there is a Rust developer on hand to respond to their every whim? That's a pretty uncomfortable and risky dependency if you're trying to get your code out the door. Or are the networking folk expected to learn Rust so that they can spend time changing the wrappers? And it's not just the wrappers they'll have to change, they have to change the Rust code which calls the wrappers and the associated test programs. And if one of the Rust tests now fails, who debugs and fixes that?

      Whichever way you look at it, the C developers end up with brittle dependencies, stalls in their work-flow, more work and possibly a whole other language to learn that they may not want to learn.

      In an Open Source project that unwelcome burden is highly likely to disengage many C programmers, not a great strategy where engagement is already a problem. In an Enterprise environment, you can probably coerce your C/C++ programmers to learn enough rudimentary Rust to support the gun Rust programmers writing the cool stuff, but that is a pretty risky manoeuvre unless you are absolutely certain of the cost/benefits.

      Which brings up the resource issue. Once you start down the Rust path, the end result is that ultimately you expect to be able to completely replace your C/C++ workforce with a Rust workforce and that 10 years down the road, when your code-base is predominantly converted, that Rust is still popular with the cool kids. If I were a CTO, I'd want to think long and hard about the implications, costs and risks of that little jaunt into the unknown.

    7. bazza Silver badge

      Re: Why?

      >Can someone give me a summary of just what about Rust is causing the translation problems?

      I gather that one issue is that it won't let you get away with things that are commonly done in C/C++ but are actually quite dangerous. I think this can mean that there's a lot of re-work to do before the code is translatable.

  6. Anonymous Coward
    Anonymous Coward

    Well, until the RedoxOS project is mature enough to be viable as an OS, the US government is going to be stuck with C just like the rest of the industry.

    Is the US government even helping to fund RedoxOS? Or are they just too busy writing statements telling other people to use an immature language platform.

    1. Roland6 Silver badge

      “ Is the US government even helping to fund RedoxOS?”

      Alternatively, there is Ada, which is notably omitted from the CISA’s list for no given reason (in the CISA reports).

      Obviously, as you allude to, there is limited value in using memory safe languages unless the underlying OS is also memory safe; so I'm not sure if there is a project for a Unix/Linux-like OS written in Ada.

      Not saying Ada is better than Rust, just noting it has been around a long time and is required by some other branches of government.

      1. Anonymous Coward
        Anonymous Coward

        Re: “ Is the US government even helping to fund RedoxOS?”

        Indeed. Where Ada (and SPARK) both let memory errors creep in is around the binding layers needed to do... well... anything with the OS. Rust has not done well here either.

        Something more akin to Carbon, Circle or Cppfront (evolution rather than revolution) will be the way forward. Rust is just a distraction wasting everyone's time for another half a decade.

  7. alain williams Silver badge

    Re: Stop with the useless A better than B crap

    All programmers need supervision and all programmers need to supervise.

    Supervisors, or rather managers, are a part of the problem that is all too often forgotten. Many were never programmers and so do not understand what makes a good environment to program in; or if they do, their own managers will not let them do the right thing.

    A large part of the problem is budgets and timescales. Managers want programs produced in as short a time as possible and as cheaply as possible. The result is that proper design and documentation are not done, and testing is skimped.

    Some of that is understandable in a competitive environment: if Microsoft had ensured that its products were properly tested someone else would have beaten them to market and they would not be where they are today.

    1. Paul Crawford Silver badge
      Facepalm

      Re: Stop with the useless A better than B crap

      if Microsoft had ensured that its products were properly tested someone else would have beaten them to market and they would not be where they are today.

      Oh if only!

      1. Ken Hagan Gold badge

        Re: Stop with the useless A better than B crap

        Be careful what you wish for. They would have been beaten by someone who didn't properly test their products.

    2. Anonymous Coward
      Anonymous Coward

      Re: Stop with the useless A better than B crap

      @Alain_Williams

      Quote: "....Managers want programs produced in as short a time as possible and as cheaply as possible....."

      Ah yes:

      (1) Keep the PRODUCER COST as low as possible.....

      (2) ...and export the problems (and the COST of those problems)...to the customer!!

      Fantastic plan!! Keep up the good work!!

    3. Ian Mason

      Re: Stop with the useless A better than B crap

      As any engineer will tell any manager "Products can be had good, cheaply, safe, or soon. Now pick which two out of those four that you want because it is not possible to have all four."

    4. b0llchit Silver badge
      Boffin

      Re: Stop with the useless A better than B crap

      You missed the point. Supervising programmers is not about managing them.

      A supervising programmer is looking at the code. Just like programmers must accept that other programmers supervise them. It is also called peer review, except that there is an implicit educational purpose to the setup. A novice is not a real "peer", but will become one in due time when you all accept the learning curve.

    5. Anonymous Coward
      Anonymous Coward

      Re: Stop with the useless A better than B crap

      "if Microsoft had ensured that its products were properly tested someone else would have beaten them to market and they would not be where they are today"

      Nope. M$ got to be where they are today by eliminating competitors who (might) beat them to the market. Sometimes they just assimilated the competition. For others, they abused their market dominance to put rivals like Netscape and Novell effectively out of business.

      Testing (or not) had no bearing on any of that. Which has always been the M$ way.

  8. rgjnk
    Mushroom

    'Memory safe'

    Shame about the horrible, horrible things some of these languages do to achieve their 'safety' or ease of use. Plus the inevitable bloat.

    I use all sorts for all kinds of things but don't try to tell me some of these options are inherently better because they made a different compromise on one feature and should become the One True Path.

    It's not like most of the advocates are actually going to churn out better code even after the swap; they'll just have a different flavour of problem while nanny guides them away from a particular type of careless error. Git issue lists are stuffed full of evidence that language choice != better code.

  9. Anonymous Coward
    Anonymous Coward

    Heh

    In my glorious place of employment it's not even possible to move from C to C++ified C (replacing buffers and linked lists with strings and vectors, and printfs with formats). The first step - modifying the toolchain to compile with the C++ compiler instead of the C compiler, and fixing the minimum amount of code that trips up the C++ compiler - is still as far away as it ever was. If that can't be done then Rust is just not happening, and I don't suppose my employer is particularly special. If an end client isn't paying for it, they just don't want to know.

    1. Richard 12 Silver badge

      Re: Heh

      Which is why the real way to improve safety is to continue improving the safety of existing languages.

      That way, existing codebases can slowly transition to the "safe" subset, one function, one class, one subsystem at a time.

      New languages are exciting and fun, but rewriting an entire something in a new language is almost always an incredibly bad idea. It's expensive, time consuming and absolutely certain to introduce errors.

      Which also means writing a new application in a new language is very risky - if Rust gets replaced by Zig* next year, all your Rusty stuff needs rewriting...

      *I have no idea how Zig compares to Rust

      1. Chris Gray 1

        Re: Heh

        Just scanned the Wikipedia entry for Zig. Will have to look more closely.

        Just from the summary, I'd say Zed and Zig are semantically similar, but Zed uses reserved words where Zig uses punctuation.

        I initially planned to use compile-time procs to implement generics, but it got icky(tm) and I made them part of the language. I don't think I regret that choice.

        Oh, on the issue of polymorphism, Zed has "capsules" (single-inheritance classes) and "interfaces" much like Java ones.

        Zed and C mix well - in the initial implementation it's a mixture of the two. Currently, my Sys and RunTime libraries are C code, but bigger stuff like Fmt, Elf and Terminal are Zed code. The current "zedc" compiler is all Zed except for that runtime stuff.

      2. Anonymous Coward
        Anonymous Coward

        Re: Heh

        Which is why the real way to improve safety is to continue improving the safety of existing languages.

        That way, existing codebases can slowly transition to the "safe" subset, one function, one class, one subsystem at a time.

        That's the thing: converting the codebase to compile with the C++ compiler is the first step, and unfortunately in our case it's a bit of a job to do that and nobody's giving out a timesheet code for it. Otherwise I'd have already been refactoring stuff for years.

    2. JoeCool Silver badge

      Re: Heh

      Try simply adding support for C++ modules to the toolchain.

      That allows building Libs in C++ and linking them to C programs as desired.

      Far less traumatic and "big bang" that way.

  10. chuckufarley Silver badge

    There is one way to make businesses adapt...

    ...Just stop paying them to produce crap. They can't have a bottom line if they can't find the bottom.

  11. Michael Hoffmann Silver badge

    But, but, but...

    ... going by my various video feeds and incoming article alerts, it's all Zig, all the time, now!

    Don't these guys keep up with the cool kids(tm)?

  12. Anonymous Coward
    Anonymous Coward

    This doesn't bother me. I code in HTML.

    1. the spectacularly refined chap Silver badge

      Oh hi Lottie Dexter, I always wondered what happened to you after the Year of Code.

      (Search for the details if you can't remember them, in brief she was the figurehead for a government coding initiative who couldn't code and failed to appreciate what the term meant.)

  13. DS999 Silver badge

    That's a lot of work for fixing only one class of security issues

    If almost every security bug was memory safety related, and it was possible to re-implement everything without using the "unsafe" qualifier in Rust, then at least there would be a light at the end of the tunnel. But even if it was universally applied it addresses a minority of security issues. I'm less concerned about performance since CPUs are still getting faster even with Moore's Law running out of gas - I'd happily give up half my performance for 100% protection against all possible attacks but sadly that's not an option and never will be.

    Fixing every memory related issue wouldn't reduce the flood of exploits, because there are so many non memory related exploits discovered all the time that ne'er-do-wells will have plenty of ammo left for doing their evil deeds.

    What's worse we discover new classes of exploits all the time, and increasingly they are hardware related like Rowhammer and all the speculative execution attacks. I think the inclusion of mandatory additional bits that can be used for host based ECC in LPDDR6 is effectively an admission that no one knows how to completely fix Rowhammer so having systems with RAM that has no possible way to detect attempted attacks is no longer a good option.

    1. GioCiampa

      Re: That's a lot of work for fixing only one class of security issues

      Exactly! Can't see them mandating Intel et al to create "immune" processors.

      1. DS999 Silver badge

        Re: That's a lot of work for fixing only one class of security issues

        Even if they sold a CPU that's immune from all known attacks today, it won't stay immune once someone clever enough comes along.

        The hardware attacks greatly expand the problem space - even a microkernel that's mathematically proven correct/secure (yes, those exist) is no longer secure if you're able to use hardware attacks on the system it is running on to steal secrets or flip bits.

  14. Tim Chuma

    Yeah sure, are you going to rewrite all the APIs?

    LOL NO.

  15. Rick Mo

    Well, at least we're not talking COBOL here...

    We've been working to replace that, with JAVA & GO, for DECADES now!!!

    ;-]

    post script: I'd recommend you switch to "GO"; it is simple to learn and easy to use - even the old guys in our shop like it!!!

    ;-]]

    1. StargateSg7 Bronze badge

      LOL! My old boss STILL has 1990-era 4-cpu VAX-9000 machines (i.e. a 32-bit supercomputer!) in some of his Toronto, Chicago and New York banking sites that STILL runs financial code written in COBOL which STILL works great that is used as an interface to older IBM System-370 and AS/400 JCL-coded applications which have been running since the 1960's! It has taken 35+ Years to move some of those System 370 and AS/400 apps over to VMS COBOL and then to C++, then to JAVA and now to pure HTML-5 front end with Delphi Object Pascal custom-coded RDBMS-based GPU server-farm grid-processing back-ends that DO NOT USE ANY Oracle database technology! It's a completely GPU grid-processing-specific RDBMS using a 5th Generation Multi-Language Query Language so NO MORE SQL needed with a planned simultaneous user connection limit of 16 BILLION end-user initiated batch-jobs spread out over a planned 100,000 set of AMD GPU Server Cards (160,000 financial-trading and spreadsheet-calculation batch-jobs per GPU card with each job using 512 Kilobytes of VRAM)!

      I would say probably there is about 15 MILLION LINES of COBOL code still running on the eight still-running multi-user VAX-9000 supercomputers which can STILL in the Year 2024 handle over 1000 simultaneous commodities trader users each on a VAX VMS Operating System back-end. It's considered MISSION-CRITICAL CODE which means it CANNOT be shut down for any reason and has been in CONTINUOUS operation 24/7/365 since 1990 (34 YEARS!). There is a 20 YEAR ongoing project to convert that 15,000,000 lines of mission-critical COBOL code to highly-abstracted Delphi Object Pascal back-end and a now pure HTML-5 front-end! It is expected to be finished by late 2028 with a full TWO YEARS of beta-testing so six more years to go until the 2030 Release Date with the same 50 well-experienced programmers assigned to the conversion job (i.e. 375,000 lines of code each to convert) since 2008!

      What a code conversion job! I wish them well! I KNOW they will succeed but it is DEFINITELY a slog to do as each programmer has to convert and TEST at least 50 lines of code per day from Monday to Friday for 20 YEARS! That is a VERY AMBITIOUS PROJECT even when using semi-automated code conversion tools!

      V

  16. Bump in the night
    Joke

    Unsafety First

    All use of computer languages should be avoided, just to be safe.

  17. eldakka
    Joke

    > post script: I'd recommend you switch to "GO"; it is simple to learn and easy to use - even the old guys in our shop like it!!!

    I don't want to go to Go because my lecturers always banged on about how go to's are bad.

    1. yetanotheraoc Silver badge
      Joke

      tldr;

      Go, too, considered harmful.

  18. John Geek

    The entire PostgreSQL database server is written in pure C. It has been upgraded and expanded for 30 years now, and has maintained a very high level of stability and security throughout its life. It's quite portable and runs natively on most every popular operating system. It can be compiled with GCC, Clang/LLVM, Sun Studio C, IBM AIX XLC, Microsoft MSVC, and numerous other compilers.

  19. CowHorseFrog Silver badge

    Strange how the title of the publication only mentions "in Open Source"... it's like saying that if you are not communist open source, then your program must automatically be safe.

  20. jake Silver badge

    I'm surprised nobody's pointed out the obvious ...

    "Well, because they don't have those years of experience, for one thing."

    Well, yeah. The REAL reason governments and management want to move to Rust and away from C is that they have bought into the myth that Rust allows you to hire young, cheap, inexperienced programmers, and thus they will no longer have to pay for older, expensive, more experienced programmers in order to have bug-free code.

    Which is bullshit, as we all know. Simple truth be told, wet behind the ears coders write more bugs into code than veteran coders, regardless of language.

  21. AndyMTB

    Bandwagon a Bandwagon

    I see someone earlier commented "Is this a bandwagon I see?" - well why not combine one bandwagon with another? Simply tell AI "Convert all this mass of code into <your language of choice>". I mean, AI can do everything else, surely something so procedural as this would be a doddle?

    1. Ken Hagan Gold badge

      Re: Bandwagon a Bandwagon

      Be sure to also ask AI to create a comprehensive test suite (with bindings for both languages) as well.

      Then make it a requirement that both codebases pass the test suite.

      1. stiine Silver badge
        Holmes

        Re: Bandwagon a Bandwagon

        Didn't Brian & Dennis already write one of them?

  22. BPontius

    From a Government with 67 departments, agencies and sub-agencies still reliant on COBOL, including the U.S. Treasury, IRS, State Department, and NASA. Not to mention 45 of the 50 states and the District of Columbia that also run COBOL, most of which crashed during the Covid pandemic under the surge of unemployment filings. Transitioning platforms to another language will take decades in business alone. It could take Government centuries, if ever. The oldest is the 56-year-old mainframe at the Treasury/IRS which runs the Individual Master File program key to processing tax returns and refunds, written in COBOL and assembly.

  23. dippy1
    Joke

    AI is the answer............It will resolve all these problems.......no need for developers.

    Copilot create me the best program ever in $LANGUAGE.

  24. zebm

    Should they rewrite it all in Fortran? Lots of US government code still is.

    Must admit I choose a language based on library support as well as speed of execution.

    1. Blue Pumpkin

      Obligatory FORTRAN text here ….

      1. Anonymous Coward
        Anonymous Coward

        A great link by Ed Post (1982)! I particularly liked:

        "Some programming tools NOT used by Real Programmers:

        [... 3.] Compilers with array bounds checking. They stifle creativity, destroy most of the interesting uses for EQUIVALENCE, and make it impossible to modify the operating system code with negative subscripts. Worst of all, bounds checking is inefficient."

        ... and ...

        "even C programming can be appreciated by the Real Programmer [...]. It's like having the best parts of FORTRAN and assembly language in one place."

  25. Bebu sa Ware
    Headmaster

    "The whole Rust versus C discussion has taken almost religious overtones"

    The conflict that followed the Reformation was more rational than much of the nonsense flying about in the "discussion" referred to in the Title.

    To be honest, most theological disputes are argued more reasonably, and certainly more respectfully.

  26. OllieJones

    This gives us coders cover with front office people

    Look, the CISA's latest missive on memory safe(r) languages isn't news to people who have worked in our trade for a long time. I've known about it ever since I forgot to do malloc() correctly a half century ago in some student problem-set code. Yeah yeah we know.

    But, here's the thing. It IS news to some front-office people. It's not addressed to us. But to them. As such, it's a valuable tool to support the business case for reworking legacy code in memory-safe(r) languages.

    People who do, I dunno, control software for municipal water plants, now have another way to pitch security-driven rewrites for their products to executives. And another way to avoid getting all the blame if the execs don't accept their pitches and the cybercreeps break in and make trouble.

    1. Cliffwilliams44 Silver badge

      Re: This gives us coders cover with front office people

      They will still get the blame!

  27. John Navas
    FAIL

    CISA is right

    This entire piece is based on specious arguments. When software is developed properly, coding is less than 20% of the total task, and it's faster to write code in a modern type- and memory-safe language than in C or C++. Worse, maintenance tends to be a nightmare that often costs much more than the original coding, and it is much more expensive in C or C++ than in a modern type- and memory-safe language. That's why switching can actually save time and money in the (not so) long run. The actual primary reasons for not switching are (a) sloth and (b) disclaimers of software liability. See https://www.lawfaremedia.org/article/the-eu-throws-a-hand-grenade-on-software-liability

  28. martinusher Silver badge

    What For?

    This is the sort of edict that gets handed down to the lower orders by people who really don't have much knowledge of the subject. They get a "Rust Good, C Bad" type memo up the management chain and make an executive decision. It's what they do.

    In real life all languages have a place. C is a systems language; it's 'unsafe' by design because it's what you use to write the underpinnings for the 'safe' languages that people should use to write applications.

    All obvious stuff. But I think there's been a cultural change that may be behind all this. I've noticed that recent versions of software are getting a whole lot larger and more complex than the components they're replacing. They're also getting a whole lot less reliable - stuff that I'm used to 'just working' now 'works maybe most of the time', and the Internet threads on the subject tend to be a mixture of speculation ("wrong hardware, bad processor" etc.) and disdain ("you don't understand" etc.). I've done a lot of software development (and testing, obviously) and I easily recognize the "hose the stuff at the barn wall and see what sticks" school of software development. So I reckon a lot of this 'memory unsafe' stuff and the like is really "can't get the stuff to work reliably, fed up with bashing gophers, but a new toolset will fix everything (at least that's what we'll tell the boss in meetings)".

    (There is a solution to all this. But you won't like it......)

  29. MrZoolook

    "The US government wants developers to stop using C and C++"

    Sounds like a good incentive to start using it.

  30. Anonymous Coward
    Anonymous Coward

    Torvalds bemoaning the bickering between the C and Rust camps as "religious" seems a little ironic given he's the father of Linux, an OS that's provoked so many religious flame wars...

  31. An_Old_Dog Silver badge
    Facepalm

    Beancounters' Pipe Dream

    They want a rubber-padded-room-like language which won't let programmers hurt themselves; they want the language to be simple and easy to learn, so they can use cheap programmers with minimal training and experience; they want the language to be self-documenting; they want small runtime routines; they want small compiled code sizes; and they want blistering-fast execution times.

    Did I leave anything out here?

  32. jlaustill

    The issue with rust

    Yes, Rust is memory safe. But the syntax is just painful. What we need is next-C, a memory-safe version of C++ with a familiar syntax; then I'll switch.

    1. diodesign (Written by Reg staff) Silver badge

      Re: The issue with rust

      Is this a good time to mention Zig? C like with correctness checks, tho without Rust's nannying and also without Rust's strict safety features...

    2. nijam Silver badge

      Re: The issue with rust

      > ...rust is memory safe...

      Rust is mostly memory-safe. But not entirely, of course.
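
      The escape hatch in question is the `unsafe` keyword; a small illustration of where the compiler's checks hand over to the programmer (`second_byte` is a made-up example, not a real API):

      ```rust
      // Safe Rust would write `bytes[1]` and get a bounds check.
      // Inside `unsafe`, upholding the invariant is on us, not the compiler.
      fn second_byte(bytes: &[u8]) -> u8 {
          assert!(bytes.len() >= 2); // the invariant the unsafe block relies on
          let p = bytes.as_ptr();
          unsafe { *p.add(1) } // raw pointer arithmetic: no checks here
      }

      fn main() {
          println!("{}", second_byte(&[10, 20, 30])); // prints 20
      }
      ```

      The claimed benefit - disputed above - is that any memory bug must live inside one of these greppable blocks, rather than anywhere in the codebase.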

    3. Caspian Prince

      Re: The issue with rust

      That would be Java or C# then.

    4. StrangerHereMyself Silver badge

      Re: The issue with rust

      I agree. I love Rust but I hate the syntax which to me, in this world of almost universal C-like syntax languages, is different merely to be different.

  33. eDiamond

    Perspective

    OK, why don't we step back a bit. What about the millions of lines of COBOL that are still running!

    Our biggest institutions, government and financial, are still running decades old code because they have nothing to replace it with. They don't have the budgets to convert them to C or C++, much less 'memory safe' languages.

    1. martinusher Silver badge

      Re: Perspective

      A lot of older and embedded code just doesn't use dynamic memory so is inherently memory safe.

      Unfortunately dynamic memory is a must if you're working with objects; there's just no other way to create instances of an object. Since everything has to be built around objects (regardless of whether this is architecturally appropriate for the application, the tools we use demand that it's so), all programs have to use dynamic memory, with all the little gotchas and idiosyncrasies that come with it. I personally don't like using malloc pools in real-time code for anything critical, because you just can't determine when pool housekeeping operations are going to be performed unless you manage your use of memory as if it were a static resource.

    2. StrangerHereMyself Silver badge

      Re: Perspective

      I personally believe AI can help in porting those millions of lines of COBOL code to Java or C#. I hate the AI hype just as much as the next guy, but I really believe AI can relieve some of the drudgery here and cut lead times by two orders of magnitude.

      Porting code is currently more or less manual work, which takes too long and is therefore prohibitively expensive. AI can make this feasible.

  34. Anonymous Coward
    Anonymous Coward

    Software Engineer

    sadly, open is becoming the new closed.

    the whole point of open source is to give users and developers control of the software controlling their hardware.

    mandates are more about taking away control than producing better software.

    this reminds me of the ada mandate in the past which later had to be withdrawn.

    telling developers which language they should use is against open source.

    i have had conversations with both stroustrup and van hoff.

    objective c++ and java are beautiful.

  35. Remurkable1

    It All Comes Down To . . .

    As was said in the dark ages, a sufficiently skilled Fortran programmer will write Fortran code in any language. (Substitute the language of your choice.)

  36. Bsil

    The Smithsonian is doomed.

    I hate these kind of mandates "For safety and security". I remember, having to update a company product for the military based on the "All Army systems running windows must use Windows 7". I had nothng to do with the original project, but all it really did was run the hardware for an emergency engine shut off button on a touchscreen in a jet engine test cell. While I questioned the original product's use of Windows XP to run a touchscreen and at least it was not networked, I really really questioned the requirement to just update without changes to windows 7, not to use a directly embedded baremetal custom software which we did for aircraft parts, but instead, no. Just upgrade it and make people log in like the mandate requires. Yes. Somehow they think it made it safer to require you to login to your touchscreen, and pray nobody turns on the power management or leaves the screensaver going, so you can let it boot up the touchscreen, that let's you hit the all stop emergency shutdown button. It did require a change. Another keyboard was added so the user could log into the screen for the "button" embedded into the wall. The whole project hurt my pride as a developer so much to do, when when a government contract is that specific, you have no room to make it make sense. Here's what I have to say to all this. Computer science is sometimes an art, which is why it's often taught in the Arts and Sciences category. It's is a functional art blending lots of what probably were bad ideas with really good ones to manage the risk and just, make, it, work. Forget the project I just mentioned, except that's how the maintainers are going to asked to "fix" these products. What really worries me is that sometimes the fact they pulled off the requirements at all in that hardware makes it a masterpiece for the time and money they were given to do it and resources and practices used at the time. 
Have you ever been given 90 days to have a product go from inception to full factory runoff coordinating mechanics and passing of an FAA code audit and new project management system, every single line of code having to match to a requirement in the requirements document made on inception and tested in the test plan, and fully QA tested with each test performed to prove use and all requirements met? And then being placed in 50 military helicopters and not needing updates? Going back and "fixing" programs does not acknowledge everything that made those programs legal to be used on an aircraft in the first place, nor does it pay to do them again. Any change invalidates the product, requiring recertification. But here's the point: This is like going through the Smithsonian and redoing all the art pieces because some used lead paint, and asking them to redo it all in crayon, to make it more "safe" for the workers. It's a foolish endeavor, because those curators are not the original artists, may not understand the techniques or skill needed to remake them and the paintings were safe enough for their original intent, to have on the wall producing a specific textured visual and emotional effect, so long as they didn't each the paint chips that fell off on the floor. And like it always is, you can't get the same colors of red without using lead or mercury. The colors are physical properties of the elements themselves. And they're going to complain it's not the same, eventually ending in mandating production of crayon colors made in lead and mercury so they can remake the art correctly, poisoning children who buy these new crayons and destroying the point of even having safe crayons in the first place. To train the little toddlers that want to make art too and not kill them while doing it. Just make a new project to replace the old one, with actual proper requirements, and make your new language baby a requirement. 
If you are convincing enough and this isn't just whining about people saying no, because you didn't understand why they're saying no, then they'll get approved to be paid for all over again with real requirements.

    1. Anonymous Coward
      Anonymous Coward

      Re: The Smithsonian is doomed.

      I down voted you because, while your post looks interesting, it was too painful to read. With run-on sentences and no paragraphs to organize a sequence of thoughts, I just couldn't get through reading it.

      If you write code for a living, hopefully this is not how you write code. If it is, I pity the person having to code review or maintain your code. If you don't write code for a living, please don't start until you improve your organization of thoughts.

      Sorry for the criticism if English is not your primary language, but your post is a good example of poor communication skills. Please take a little bit more time to put your thoughts in some kind of coherent form so the rest of us can understand your thoughts.

      I am probably going to get flamed for this, big time, but it had to said.

  37. Hawks-eye

    The minute I read Java as memory safe I switch off. Java is unsafe in its entirety. As for C and C++ being unsafe all languages are unsafe the minute you have a conditional statement. And surely a simple library and code checker can make memory management safe?

  38. efa

    All modern C compilers (at the beginning it was Clang/LLVM, but for some years now also GCC) have an option to generate runtime code that checks for boundary violations, double frees, use after free, missing frees, stack overflows and so on. This code slows C down to the level of Java, C# and Rust, so it should only be used when building for debug with -g. Once the code is clean, remove the address-sanitizer option and get pure, fast, memory-safe C code. So today there's no need for slow memory-safe languages.

    1. An_Old_Dog Silver badge
      Unhappy

      Runtime Checks

      "THIS time we've got all the bugs out. We swear it, we really did! So, release the kraken! Remove the runtime checks!"

      Sarcasm aside, how much help on this are we getting from the current x86 architecture? Hardware-based runtime checks should be much-faster than software-only-based runtime checks.

      1. gnasher729 Silver badge

        Re: Runtime Checks

        "Sarcasm aside, how much help on this are we getting from the current x86 architecture? Hardware-based runtime checks should be much-faster than software-only-based runtime checks."

        Says who? Software-based checks can run through an optimiser. So in a loop for (i = 0; i < n; ++i) a[i] = 0; I can get away with exactly one check in software (is n greater than the size of the array?). And there has been tons of effort to make the normal code paths used by software-only runtime checks run fast. With hardware checks, good luck integrating this into the normal code flow.

  39. Blackjack Silver badge

    Hey, remember JavaScript, that security blackhole in every web browser that's over two decades old? Are we doing something about that? No? Oh okay.

    1. Anonymous Coward
      Anonymous Coward

      The one for which "compilation" is basically syntax checking as you don't find out until run time what it's going to do - or not going to do ?

      1. Anonymous Coward
        Anonymous Coward

        re: The one for which "compilation" is basically syntax checking

        That's right! We say this type of language is "Interpreted" as there is no "compilation".

        We understand that compilation is nothing to do with whether the code works or not. Compilation only proves that the syntax is correct, not that the code functions correctly.

        We run the tests to find out what it's going to do, like we do with all the other languages.

        Are you sure you know what you're talking about?

        1. CowHorseFrog Silver badge

          Re: re: The one for which "compilation" is basically syntax checking

          Compilation often does a lot more than syntax checking; it can also analyze and complain... that's what a lot of IDEs do.

  40. Keybounce.two

    First Fortran, then C, then ...

    Let's see. You want us to stop using language X, which is fully understood now and has a massive library ecosystem out there, in favor of Y, which is not yet fully understood and does not have that library?

    Tell you what. You tell me how many years it took after "Stop using Fortran/Cobol" before those languages actually stopped being used. That is how long it will take for C/C++ to be done.

    And if you really want it to be changed? Well, tell you what. Write a replacement for the Linux kernel in your language of choice. When that is functional and stable, then we can get programmers to use it.

    Be sure to also replace all of Git, as that is required to manage a group project of that size.

    Get all of the libraries and support code replaced? Ok, then we can talk.

    =====

    There is no such thing as a safe/unsafe language.

    ** The libraries are far more important **

    C's standard libraries were never designed for memory safety, thread safety, etc.

    Heck, at least one -- gets() -- was never designed in the first place. The manual flat out says,

    SYNOPSIS

    #include <stdio.h>

    [[deprecated]] char *gets(char *s);

    DESCRIPTION

    **Never use this function.**

    This is a function that, from day 1, was never memory safe.

    You have all the tools you need in C to write memory safe code. You just have to put in the effort.

    Apple went to a great deal of work to make Objective C memory safe. 95% of memory issues are handled automatically. (automatic reference counting and smart copy of data for instance variables).

    But languages like Java can leak memory because no one ever cleans things up. "The language automatically tosses garbage, I don't have to worry". Well, thinking like that can kill the Objective C memory stuff as well.

    And the rest of the claim that "newer designs are better"?

    Cobol and Fortran are still around, because migrating those databases is harder than migrating the code base.

  41. Anonymous Coward
    Anonymous Coward

    "minimum viable product"

    Amazing.....no mention of "agile"!!

    Oh....and no definition of "viable" either!!

  42. Groo The Wanderer

    If the USG wants a say in how I code, the USG can pony up a few hundred grand a year for the right to tell me how to code... it'll have to be in cash, though - I don't trust American government cheques not to bounce!

    1. CowHorseFrog Silver badge

      How do you know you can trust that the cash the USG gives you isn't fake?

  43. munnoch Bronze badge

    Here we go again...

    Conflating C and C++...

  44. ForthIsNotDead
    Stop

    Safe C++

    Safe C++ is on its way and will be standardised soon enough. This is all a fuss over not very much. Safe C++ takes concepts from Rust (borrow checking etc) so it's likely that the whole Linux in Rust thing will lose steam as Safe C/C++ gains traction.

    1. Anonymous Coward
      Anonymous Coward

      Re: Safe C++

      That's Safe C++. Let's not mix them up, C's sitting this one out.

    2. davoy

      Re: Safe C++

      You're right !

      Safe C++ will gain traction as this is on everyone's table right now. The effort of moving to Rust would be massive in all respects. Safe C++ will allow a gradual and, this is important, in-place upgrade of existing code on a per-module or even per-section basis.

      I also believe that C++ would have been a way better language for additional inclusion in the Linux kernel than Rust. It would have allowed a slow, rupture-free transition without api-bridges, drama and duplicated effort. Even the smallest C++ subset using only RAII with guaranteed destruction on scope-end would have been a huge benefit. No need to use exceptions, new/delete, templates or the STL. Heck, C doesn't even have references.

      After a switch to a C++ compiler this new subset could have been introduced anywhere in the code, existing or new without having to have an api, separate tool-chain, etc. No drama, no hassle, immediate benefit.

      So yes, I'm with you. Rust will be remembered as the language that inspired C++ to become safer.

    3. karlnapf42

      Re: Safe C++

      So far there is just a paper -- and statements from influential committee members that other venues should be checked first. It will probably be decades before this gets accepted, if at all. That the author recently wrote that Safe C++ should be a vehicle to off-ramp code to Rust, and to use the Rust standard library, is probably not going to help, I'm afraid.

      You are also underestimating the effort involved in going from unsafe to safe C++. It needs a new type of reference, new move semantics, an entirely new standard library with new vector and option types and lots more (or outright using the standard library from Rust), and requires rearchitecting the entire codebase to be more like Rust than C++.

      There is also no Safe C proposal at all, this is all C++ only.

  45. [email protected]

    C and C++ can be memory safe

    The idea Rust is more memory safe than C or C++ is foolish. Rust itself has to be written in C. In fact, all drivers and operating systems have to be written in C.

    All you have to do is store information about the memory allocations, check for null pointers and length of allocated space, and there is no problem.

    But there is no way Rust could be used to do things like access hardware registers. Only C can do that, and it will always be that way.

    1. Chris Gray 1
      Stop

      Re: C and C++ can be memory safe

      Nonsense. Don't try to be definitive about things you know nothing about.

      The first version of my Zed language started in C (actually it started in my Draco language since I started with my AmigaMUD language). Stayed that way for a couple of years as the language evolved. Eventually I added dependency on compile-time operations, and they were only in Zed itself. The standalone "zedc" compiler (Zed => X86-64 .o files) is about 99.99% Zed. It'll be 100% once I get around to translating a few small run-time routines. It does not translate to C or even to assembler code - it goes from source to .o files.

      My first compiler was in fact written in AlgolW. C has never been big on mainframes. My second compiler was written in that first one.

  46. anonymous boring coward Silver badge

    "Take Rust in Linux, for example. Even with support from Linux's creator, Linus Torvalds, Rust is moving into Linux at a snail's pace."

    This is about using Rust in the kernel.

    Using it for applications shouldn't be a problem.

  47. ShortStuff
    Alien

    This Is A Job For AI

    Just make AI useful for once and have it convert all the C/C++ code to whatever memory safe language you want. Isn't it that simple?

  48. ChrisBedford

    Aah, the ambiguity of English

    "Or, at the least, companies must come up with roadmaps for moving their existing codebases by January 1st, 2026."

    They must come up with roadmaps by Jan '26? Or move the codebases by Jan '26?

    1. Paul Crawford Silver badge
      Joke

      Re: Aah, the ambiguity of English

      Coming up with a roadmap is easy, just say budgets allocated in 2080 and it will be done by Jan 1st 2100 and don't worry. After all you won't likely see that date...

  49. Anonymous Coward
    Anonymous Coward

    Is Go really memory-safe?

    I’ve seen a library that has some parts written in Go repeatedly Panic.

    1. tracker1

      Re: Is Go really memory-safe?

      Odds are that's happening at a C library boundary. Possibly intentionally.

  50. Charles Smith

    Warmer coding

    One of my systems, written in compiled COBOL, running on a PC (Intel 486) with 512 KB of memory, served the trading activities of 100+ foreign exchange traders. As things have progressed with "modern" languages, processor power and memory sizes have all bloated, morphing into racks of servers consuming more power and generating more heat. Now officialdom wants programming to migrate to even less efficient methods that require even more processing power to operate.

    Maybe we can get AI to do the task of migrating the C and C++ to some form of modern grossero coding. :-)

  51. RockBit

    Don't forget about Coq

    It doesn't matter what language is used.

    Program verification with Coq should discover vulnerabilities even in C program.

    Program verification is expensive but it is a required part of mission critical software development.

    1. Torben Mogensen

      Re: Don't forget about Coq

      "Program verification with Coq should discover vulnerabilities even in C program."

      Coq and similar verification systems can be used to prove absence of certain vulnerabilities, but it is not an entirely automatic process. In all but the simplest cases, it requires considerable manual work and even re-coding programs to be better behaved. And it often uses a lot of compute resources.

      Part of the reason for this is that absence of vulnerabilities is an undecidable property if at least one program in the language in question can have vulnerabilities. For example, if any program can follow a pointer to freed memory, deciding whether a given program can do this is undecidable. Sure, you can often show that particular programs do or do not have these problems, but doing it in general for all programs is not possible. And as programs increase in size, the probability that one escapes proof increases.

      The only reasonable way to guarantee against specific vulnerabilities such as read-after-free is to ensure that they can never happen in any program. This requires new languages and not attempts to prove absence of vulnerabilities in programs written in languages that allow such vulnerabilities.

      The only other alternative is to reduce the computational power of programming languages to make the properties decidable for all programs, i.e., using languages that are not Turing complete. And even then, decidability does not imply effective decidability. Deciding a property may take an extremely long time, even if you are guaranteed to eventually get an answer.

  52. StanT68

    Hey, I think I remember writing that! One of my simpler Ada programs :)

  53. Torben Mogensen

    Temporary fix

    For low-level system work, such as the Linux kernel, you need hard real-time constraints, so you can not use garbage collection where there can be unpredictable pauses of a few milliseconds. Yes, concurrent GC exists, and it can reduce most of these pauses, but it can not eliminate them entirely. This is where the Rust model or similar is required: Memory safety without unpredictable pauses.

    But C and C++ are used for many applications that do not need hard real-time constraints. For these applications, it would be acceptable to compile C/C++ with index checks and similar to prevent buffer overruns and a conservative garbage collector, which manages memory to avoid use-after-free and never-free errors (which cause space leaks). It will impose a small performance overhead, but not too much. Most index checks can be eliminated by fairly simple analyses.

    But, let's face it, C and C++ were designed to be close to a machine model (single core, flat memory) that was outdated 20 years ago, so it is only a matter of time before they become irrelevant. Sure, a lot of people will continue using them, but new programmers will choose languages that are a better fit for the highly parallel compute models of today. And, sure, a lot of C/C++ code will stick around even longer, as the effort of rewriting it will be prohibitive. But they will no longer be used for tasks where performance is critical, nor for tasks where safety is critical.

    1. ChrisC Silver badge

      Re: Temporary fix

      "C and C++ were designed to be close to a machine model (single core, flat memory) that was outdated 20 years ago"

      Outdated it might be if your concept of coding stretches no further than what's running on a desktop/server or similarly well-endowed piece of hardware. Outdated it most assuredly is not if your concept of coding includes the myriad of embedded systems and the equally extensive choice of single core flat memory model microcontrollers on which to base them...

  54. Locomotion69 Bronze badge

    As time progresses, (AI) tools are going to be more powerful in detecting suspicious code -in any language-.

    Once the Safe C++ spec gets done, one could make a lint (or whatever) version that supports checks to the specification. I expect this to happen in the near future anyway.

    Abandoning C/C++ does not solve the problem today, or in 2026, or 2030. Some solution architectures cannot be applied 1:1 in another environment. That is not just a rewrite - it is a fundamental redesign from the early lifecycle phases of a product, and it takes time, a lot of it, to catch up to and overtake the mature level the product has today.

    My bet is on better tooling supporting "memory safety" in the near future. It will cause a lot of headaches, but less than redesigning the product.

    1. CowHorseFrog Silver badge

      Says who ?

      Because nearly all the big AI tools have warnings about how they could be wrong...

  55. P@ulos

    What about using COBOL ??

  56. [email protected]

    Change takes time

    Porting a massive installed base takes forever. The first step is to stop creating new projects in C/C++

  57. 10111101101

    These new-age programmers want everything done for them like newborn babies because they can't take the time to write properly structured code.

    1. tracker1

      Aren't C# and Java about as old now as C was when they were created?

    2. CowHorseFrog Silver badge

      Exactly

      Most complaints about a language are almost always that it's too much typing, because "I'm too fucking lazy to type a few extra words or lines"... but I will happily type much more on social.

  58. tracker1

    Not just Rust

    While articles like this tend to put the focus on Rust specifically, Go, C#, Java and other languages are also on the table and in broad acceptance.

    I like Rust a lot. I find it's pretty good even for mundane tasks like web API development. But it's far from the only option being considered and used.

  59. rwessman

    Conversion is not an option

    My company has (literally) millions of lines of C/C++. Conversion would take years and introduce so much instability that it would be impractical.

    Instead, we’ve invested in memory checking tools. It’s not perfect, but it’s better than turning the code base upside down.

  60. MangyDog

    Am I the only one who also sees this as a ploy by the US security agencies to get people to use "safe" virtual-machine-based languages like C#, because it's easier for them to slip surveillance middleware into those packages?

    I mean, it's not a stretch, is it...

  61. cantankerous swineherd

    anyone got a cobol to rust converter? asking for a bank.

  62. steviebuk Silver badge

    A country

    that is stupid enough not only to re-elect Trump but to allow him even to run when he was/is a convicted felon and insurrectionist can be ignored.

    1. CowHorseFrog Silver badge

      Re: A country

      Nothing about Vlad ?

      1. PassiveSmoking

        Re: A country

        Who voted for Vlad (Of their own free will and not at gun-point)?

        1. CowHorseFrog Silver badge

          Re: A country

          Vlad is honest, he never points guns at voters. He does, however, switch ballot boxes or just make shite up at the counting offices.

  63. dauerweck

    C-C++-Ada?

    Perhaps we should go back to programming in Ada...afterall it was the first "AI" language LOL!

  64. StrangerHereMyself Silver badge

    Overnight

    This will take time to enact, but I already see more and more open-source projects switching to Rust.

    And Java and C# have been around for ages (3 decades for Java and 2 decades for C#) and both of these have made computing much more secure than it would otherwise have been. But they're not popular choices for performance oriented stuff since they have large startup times and need a runtime to function. I often wonder how the world would've looked if Microsoft had decided to make C# an AOT (ahead of time) compiled language. Most likely there would not have been a need for Rust.

  65. Breen Whitman

    Torvalds opinion is irrelevant. He either gets with the game or moves aside. His relevance has diminished and moving forward from 2025, he will contribute little.

    1. StargateSg7 Bronze badge

      Linus literally writes and MAINTAINS the ENTIRE KERNEL of Linux itself! He is literally the MOST IMPORTANT PERSON in all of programming and OS development alive today! He ain't going ANYWHERE soon except through illness and/or death or forced-retirement!

      1. CowHorseFrog Silver badge

        Linus does not write any code for the kernel; he only reviews and accepts commits, which is a big difference. Other people submit their changes... they write the code, not Linus.

  66. gnasher729 Silver badge

    So let's say I have an array with 100 elements and do an assignment a[101] = 0. Undefined behaviour in C or C++. In Swift I know it will crash the application. Instantly and guaranteed. Which is likely the best possible outcome, but obviously not good. In Objective-C it throws an exception which is unlikely to get handled properly. No matter what, the user is f****d.

    So what exactly does it do in Rust, and how is that safe?

    1. PassiveSmoking

      A program failing completely on an illegal memory operation is still safer than one that continues to operate with a corrupted working set.

      I've not looked into Rust too deeply, but I believe that in general the design philosophy is to make such illegal accesses impossible in the first place (for example, all "variables" are immutable unless you expressly tell the compiler otherwise, and only one variable can "own" a given data point at a time, so that double-frees should be difficult to cause). I do know that it's possible to turn off memory safety when absolutely necessary, but if you do then you're back in the same situation you are with C-type languages and you're going to get no help from the language.

      Of course it won't be 100% foolproof, and you can turn off memory safety features if necessary, but it probably will catch most of the errors that occur due to programmer mistakes and improper peer review of the code.

    2. Dagg Silver badge

      That is actually a serious question

      So what exactly does it do in Rust, and how is that safe?

      That is actually a serious question. What actually does happen?

      * Does the whole program crash immediately?

      * Does it throw an exception?

      * Does it just block the access and blindly continue?

      The answer to this is extremely important in a production system. And the impact of the resolution will cause an issue depending on the function of the actual package.

      * Game

      * Word processor

      * Banking

      * EFTPOS

      * Retail POS

      * Realtime - industrial process control

      * Realtime - nuclear reactor

      * Realtime - fire control platform

      * etc

      So will someone who knows Rust please indicate how it will respond?

  67. PassiveSmoking

    "After all, they can write memory-safe code in C, so why can't you?"

    No, they THINK that they can do that. But the number of memory-related CVEs suggests otherwise.

    I'm not dissing C, honestly I'm not. It's a language whose greatest strength is that you can do almost anything in it, but it's also a language that's very unforgiving of very easily made mistakes. And let's be real, a developer unwilling to learn new things is probably not a developer who will be employed in 20 years.

    We have to acknowledge human failings and design languages to take them into account. Should we ditch close-to-the-metal languages like the C family? No, of course not; there are times when you need their features and the performance they can achieve in an experienced developer's hands. But should we not consider friendlier languages for tasks that don't need to be quite so tightly optimised? Most applications don't need to be tuned to within an inch of their lives, and the extra safety those languages offer may be worth the slight performance hit.

  68. cassandratoday

    Retired C programmer and *nix admin

    Why do we assume that the runtime memory-management layer for Rust, Java, et al. is itself free of memory management bugs?

    1. CowHorseFrog Silver badge

      Re: Retired C programmer and *nix admin

      Lets be fair most programmers are lazy fuckers who write shitty code. The Rust and java runtimes are written by far more capable programmers who will do a better job.

  69. Anonymous Coward
    Anonymous Coward

    The problem is not the language but the programmers.

    Years ago, scientists and engineers did the programming; nowadays even children are programming.

    Are children incapable? No, they're not. Scientists and engineers do things wrong too.

    But because we had a lack of resources back then, programs were written with much more attention.

    They didn't have an IDE telling them there's a possible leak on line 123 of file foo.c.

    They needed to know exactly what they were doing, how the computer works, and what they needed as output.

    1. CowHorseFrog Silver badge

      Exactly.

      Most people in all disciplines are lazy fuckers who can't be trusted; programming and bugs are mostly caused by this basic fact.

  70. plrndl

    Deja Vu All Over Again

    Software quality is a function of quality control, not of language choice. Mandating a "safer" language will translate into more "minor" bugs allowed into production.

  71. lukabloomrox

    There are very well known safe libraries for c/c++

    For example: https://github.com/rurban/safeclib or Microsoft’s Secure C Runtime Library https://learn.microsoft.com/en-us/cpp/c-runtime-library/security-features-in-the-crt?view=msvc-170
