How a good business deal made us underestimate BASIC

A generation of gray-haired IT folks learned computing using BASIC on 1980s home computers. Every pro since then holds it in disdain. What happened? Fifty years ago, the Altair 8800 computer from Ed Roberts's MITS was the cover star of the January 1975 edition of Popular Electronics magazine. The appearance inspired two …

  1. RJW

    My favourite flavour of BASIC was HP BASIC, which came on HP computers in the '80s and early '90s and connected up the various electronic measurement devices that HP sold. HP BASIC came with matrix operations. The language was very advanced, not basic at all.

    1. Headley_Grange Silver badge

      Came here to say the same. In my early career I used HP BASIC on HP 98xx series minicomputers for driving test gear, microwave circuit simulation, drawing 3D graphs of EM fields on the brilliant HP plotter. One clever chap in the lab even managed to write a programme that could re-write itself and change what it did.

      1. John Smith 19 Gold badge
        Boffin

        "write a programme that could re-write itself and change what it did."

        Ahhh.

        Self-modifying code. Now viewed as deeply unsafe (because it is).

        I learned about this in High School, but it took me years to finally find any actual uses of it "In the wild".

        It was key to the (for its size) large instruction set of the Apollo guidance computer and of the Bell Labs Blit terminal.

        1. that one in the corner Silver badge

          Re: "write a programme that could re-write itself and change what it did."

          > Self-modifying code ... but it took me years to finally find any actual uses of it "In the wild".

          Oh, you innocent, to have escaped COBOL and the infamous ALTER statement. Yes, built directly into the language.

        2. Jou (Mxyzptlk) Silver badge

          Re: "write a programme that could re-write itself and change what it did."

          > Self-modifying code. it took me years to finally find any actual uses of it "In the wild".

          Which generation are you? Those who were there around the C64 (and before) era should have seen it quite often since it saved space and sped things up. Switch the code 10 bytes ahead between INX and DEX, and/or between BPL and BMI.

          When the PC, Amiga and Atari ST came along with their huge amounts of RAM, that started to fade, or at least became very uncommon outside the demo scene.

          1. Simon Harris

            Re: "write a programme that could re-write itself and change what it did."

            One use of self modifying code I’ve seen is fairly particular to the 6502.

            The 6502 had indirect instructions, e.g.

            LDA (addr),Y

            LDA (addr,X)

            but no index-free version

            LDA (addr)

            (An omission corrected on the 65C02)

            If you wanted an indirection without having to manage the X or Y register, I have seen code that patched the destination address into the two bytes after the LDA instruction. It also had the advantage that if you needed to access that address a lot, it saved a couple of cycles over the standard indirect instructions.

            Similarly there was a JMP (indirect), but no JSR (indirect) - so again if you have a table of call addresses, you could patch the appropriate one into the code.
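
            The same patching trick can even be sketched from BASIC on a 6502 machine: build a tiny LDA-absolute/RTS stub in memory, then overwrite its two operand bytes before each call. A minimal sketch; the addresses here (49152 for the stub, 1024 as the target) are hypothetical C64-style choices, not anything from the post ($AD is LDA absolute, $60 is RTS):

            ```basic
            10 REM BUILD A 4-BYTE STUB: LDA $XXXX / RTS
            20 POKE 49152,173 : REM $AD = LDA ABSOLUTE OPCODE
            30 POKE 49155,96  : REM $60 = RTS OPCODE
            40 AD = 1024 : REM ADDRESS WE WANT TO READ THIS TIME ROUND
            50 POKE 49153, AD - INT(AD/256)*256 : REM PATCH OPERAND LOW BYTE
            60 POKE 49154, INT(AD/256)          : REM PATCH OPERAND HIGH BYTE
            70 SYS 49152 : REM A NOW HOLDS THE BYTE AT AD, NO X OR Y NEEDED
            ```

            Each change of AD rewrites the stub's operand in place, which is exactly the self-modification being described.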

    2. Happy Lemming

      Yes. After my first computer encounter - Fortran on an IBM 360/30, punch cards - I switched from engineering to computing science and subsequently found myself using HP Basic on an HP-85 to glue a specialized weather computer to a satellite uplink. Good times. Sometimes I play with an HP-85 emulator, just for fun.

      1. Hawks-eye

        Oh yes, HP-85. I worked on the HP-86. Wrote an unpurge program in Basic. And loved the HP-IB. Used it to measure shearing of soil samples.

    3. Anonymous Coward
      Anonymous Coward

      My favorite BASIC is Power Basic, which is still around and still generates binaries that run almost as fast as programming in Assembly.

      1. md56

        Yes, absolutely, I've been using this for decades. It's fast and you can do anything with it. But, simple and straightforward though it is, I seem unable to convince any of my students to use it. They all turn into horrible Pythonistas.

    4. gerryg

      But

      IIRC there was a small glitch, (it was the 1980s) HP Basic on the 9000 series didn't like returning number.zero results from functions.

      Otherwise they were great, with a novel user interface, the knob. The IEEE-488 interface was super useful in the lab.

      1. RJW

        Re: But

        The IEEE-488 bus - I was trying to remember what that interface was called when I wrote my comment.

        It was like an ancient predecessor of the USB bus.

        1. Headley_Grange Silver badge

          Re: But

          HPIB for short in my world.

  2. Dan 55 Silver badge

    Anyone who has a blanket rule banning GO TOs...

    ... doesn't know how to program.

    Change my mind.

    1. Joey Potato

      Re: Anyone who has a blanket rule banning GO TOs...

      That is why we have the rule ... interns and junior engineers don't yet know how to program, and write hard-to-fix bugs with them.

      1. Dan 55 Silver badge

        Re: Anyone who has a blanket rule banning GO TOs...

        So... if the rule is "don't use GO TO", how will they ever learn how to use GO TO?

        1. LionelB Silver badge

          Re: Anyone who has a blanket rule banning GO TOs...

          Surely the rule is "Don't use GO TO unless it makes the intent of the code clearer"?

          1. Dan 55 Silver badge

            Re: Anyone who has a blanket rule banning GO TOs...

            If only Dijkstra had been so specific.

    2. david 12 Silver badge

      Re: Anyone who has a blanket rule banning GO TOs...

      GOTO Considered Harmful

      It's a cute meme, but the headline applied to Dijkstra's letter misses the underlying problem: it's not the GOTO, it's the Where From.

      When using GOTO, the destination is unlinked. You look at a line of code (BASIC was a line-oriented language), and you have no idea how you got there.

      That is also the problem people were addressing when they said "a subroutine must have only one exit point". The confusing practice is not when there are multiple "exit" statements in a subroutine; it's when the exits from a subroutine don't go to a common point on exit.

      1. Dan 55 Silver badge

        Re: Anyone who has a blanket rule banning GO TOs...

        When using GOTO, the destination is unlinked. You look at a line of code (BASIC was a line-oriented language), and you have no idea how you got there.

        There are few suspects... either it's the previous line or a GO TO with the line number or label name.

        That is also the problem people were addressing when they said "a subroutine must have only one exit point". The confusing practice is not when there are multiple "exit" statements in a subroutine. It's when the exits from a subroutine don't go to a common point on exit

        Which is just what good use of a GO TO can do, it can jump to a common exit point for that subroutine which tidies up the heap/closes files/etc... before returning. No need for extra variables to jump out of loops, no need for nested ifs, and no need for elses for each if to clean up in a slightly different way before exiting the subroutine.
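
        In a line-numbered BASIC that pattern might look something like this (a minimal sketch; the file-handling syntax varies by dialect):

        ```basic
        1000 REM SUBROUTINE: PROCESS A RECORD FILE
        1010 OPEN "DATA",1
        1020 IF A < 0 THEN GOTO 1090 : REM BAD INPUT - BAIL OUT, BUT STILL TIDY UP
        1030 REM ... NORMAL PROCESSING ...
        1080 REM FALL THROUGH TO THE COMMON EXIT
        1090 CLOSE 1 : REM SINGLE TIDY-UP POINT FOR EVERY PATH
        1100 RETURN
        ```

        Every path out of the routine, early or normal, funnels through line 1090, so the clean-up lives in exactly one place.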

        1. Apocalypso - a cheery end to the world

          Re: Anyone who has a blanket rule banning GO TOs...

          > There are few suspects... either it's the previous line or a GO TO with the line number or label name.

          A GOTO with a hard-coded number is almost structured programming. :-) It's GOTO N where N is a variable that's the killer.

          1. Dan 55 Silver badge

            Re: Anyone who has a blanket rule banning GO TOs...

            Forgot about that; the better BBC and Sinclair BASICs could use an expression, but the MS BASICs needed ON n GO TO a, b, c, so there were still few suspects for MS BASICs.
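
            A minimal sketch of the MS-style form, where the only possible targets are the ones listed after ON ... GOTO:

            ```basic
            10 PRINT "OPTION"; : INPUT N
            20 ON N GOTO 100, 200, 300 : REM N=1,2,3 JUMP; ANYTHING ELSE FALLS THROUGH
            30 GOTO 10
            100 PRINT "FIRST" : GOTO 10
            200 PRINT "SECOND" : GOTO 10
            300 PRINT "BYE" : END
            ```

            Since the target list is written out in full, anyone reading line 20 knows every place control can go.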

          2. Roland6 Silver badge

            Re: Anyone who has a blanket rule banning GO TOs...

            No, a GoTo <label> is more akin to structured programming; it's a GoTo <line number> which is the real problem.

            Remember, it was considered good practice to number BASIC statements in steps of 10, to leave room for the insertion of (a few) new lines without disrupting all subsequent line numbers…

            1. This post has been deleted by its author

          3. Torben Mogensen

            Re: Anyone who has a blanket rule banning GO TOs...

            "A GOTO with a hard-coded number is almost structured programming. :-) It's GOTO N where N is a variable that's the killer."

            FORTRAN required that such a computed GOTO statement listed all the possible values of N. This made control-flow analysis easier.

        2. munnoch Silver badge

          Re: Anyone who has a blanket rule banning GO TOs...

          But RAII is a much better way of ensuring the clean-up. Works in the face of exceptions etc. I'd suggest that if you need that level of clean-up then you should be using a more sophisticated language - I say that as a child of the 80's who got his start with BASIC. Contrary to what Dijkstra asserted, I think I turned out alright. I do still detest early returns from functions; they can be confusing as fuck.

          1. Dan 55 Silver badge

            Re: Anyone who has a blanket rule banning GO TOs...

            I'm absolutely in favour of RAII, but in those languages which require you to tidy up resources and don't allow RAII, I find myself using gotos to jump to the "tidy up and return" code at the end of each function/subroutine, as I've yet to find anything better.

        3. Simon Harris

          Re: Anyone who has a blanket rule banning GO TOs...

          I think possibly worse, from a structured programming point of view, is misuse of GOSUB linenumber. Since BASIC (until things like BBC BASIC came along) didn't have a mechanism for defining a subroutine as a distinct block of code, you could GOSUB to multiple points within a subroutine, not just a nominal first line, so not only could your routine have multiple exit points, it could have multiple entry points too.

          Also since there was no defined block structure, one subroutine could GOTO somewhere inside another and borrow its RETURN, so it was quite possible for one exit point to serve many subroutines.

          Of course, with memory at a premium in the late 70s/early 80s, people would make clever use of these ‘poor programming practices’ to cram as much as possible into the small space available, even if it was a nightmare to try to work out what was going on.
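
          A minimal sketch of both abuses - a second entry point into a "subroutine", and another routine borrowing its RETURN (line numbers illustrative):

          ```basic
          10 GOSUB 100 : REM NORMAL ENTRY: PRINTS HEADING AND BODY
          20 GOSUB 110 : REM SECOND ENTRY POINT: SKIPS THE HEADING
          30 GOSUB 200 : REM A ROUTINE WITH NO RETURN OF ITS OWN
          40 END
          100 PRINT "HEADING"
          110 PRINT "BODY"
          120 RETURN
          200 PRINT "OTHER ROUTINE" : GOTO 120 : REM BORROWS THE RETURN AT 120
          ```

          The RETURN at 120 serves three different "subroutines", which saves bytes but makes the control flow very hard to follow.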

          1. Torben Mogensen

            Re: Anyone who has a blanket rule banning GO TOs...

            "Also since there was no defined block structure, one subroutine could GOTO somewhere inside another and borrow its RETURN, so it was quite possible for one exit point to serve many subroutines."

            Another common practice was source-code level tail-call optimisation: The sequence GOSUB N: RETURN was replaced by GOTO N. Sure, it saves stack space and time, but it makes the code harder to read.
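
            Sketched in BASIC (line numbers illustrative):

            ```basic
            10 GOSUB 100
            20 END
            100 PRINT "OUTER"
            110 REM THE READABLE FORM WOULD BE: GOSUB 200 : RETURN
            120 GOTO 200 : REM 'OPTIMISED' TO A BARE GOTO - 200'S RETURN NOW RETURNS FOR US
            200 PRINT "INNER"
            210 RETURN : REM POPS THE ADDRESS PUSHED AT LINE 10, LANDING AT LINE 20
            ```

            The GOTO at 120 saves one stack frame, but the reader now has to notice that line 210 is doing double duty as the exit for both routines.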

            1. that one in the corner Silver badge

              Re: Anyone who has a blanket rule banning GO TOs...

              > Another common practice was source-code level tail-call optimisation ... but it makes the code harder to read.

              That is a very good illustration of what the "ban" on GOTO is all about:

              Yes, as keeps being pointed out, the CPU does a lot of GOTO (well, JMP) and even GOSUB (JSR) can be decomposed into separate PUSH and JMP instructions (and you can decompose the PUSH into...).

              But we don't hear calls, even in these comments, that we should be writing out a GOSUB in decomposed form - with, of course, the decomposed RETURN to match. Why not?

              Of course, the point is that providing composed operations makes life easier - both writing and reading. And for "convoluted" thinking, like tail call optimisations, life is far, far easier if we can leave the conversion into GOTO to the language processor.

              99.99 (add as many 8s as you like) % of GOTOs are best left to the language, auto-replacing the WHILE-DO, REPEAT-UNTIL, FOR-NEXT, GOSUB/RETURN with the actual low-level JMPs.

              Of course, when the profiler shows a hot-spot, dive in and use a GOTO, even add some assembler. IFF it gives a worthwhile improvement (or you hate the guy who is trying to read your code, in which case go wild).

        4. big_D Silver badge

          Re: Anyone who has a blanket rule banning GO TOs...

          There are few suspects... either it's the previous line or a GO TO with the line number or label name.

          Or a statement like: GO TO A*1000

          Go Tos on their own aren't always bad, but lazy or untrained programmers have a habit of using them badly and they thus get a bad reputation.

      2. Anonymous Coward
        Anonymous Coward

        Re: Anyone who has a blanket rule banning GO TOs...

        David12, you may be correct but Bjarne came off as a total d-bag.

        His OOP has taken multiple decades of fixing, and his scorn belies the fact that BASIC, as others have opined, brought actual computing to the masses. And a heck of a lot of us started there and were successful enough that we ended up far more successful than we otherwise would have.

        The fact that his famous screed on GOTO is directly contradicted by the reality of how CPUs actually work shows either ignorance or, more likely, some misdirected envy.

        Mooks who spout similarly simply show themselves as nothing more than rote-repeating, smooth-brained monkeys.

        1. that one in the corner Silver badge

          Re: Anyone who has a blanket rule banning GO TOs...

          > Bjarne came off as a total d-bag ... his famous screed on GOTO

          Write 1000 times: "I must read TFA before commenting"

          > His OOP has been multiple decades in the fixing off

          Ah, merely someone who hasn't been able to understand even the basics of C++ and what has been happening to it across the decades; as in, what a "multi-paradigm language" might be (I know, I know, "paradigm", a Big Word, all very confusing).

      3. Anonymous Coward
        Anonymous Coward

        Re: Anyone who has a blanket rule banning GO TOs...

        Starstroupe was an elitest blowhard.

        BASIC served its purpose and brought probably the largest number of people into the field in the history of the world.

        His scorn probably derives from the fact that his baby C++ never came near the heights of popularity and common use that BASIC reached. It's not hard to imagine the legions of people who started their first CS class with C++ and promptly dropped it after week 1.

        He knew Goto was almost 1-1 with how CPUs actually work, so his blathering on about it as a defect is obviously not rational from that perspective. He could, as an expert, have as easily bemoaned the fact that Gosub wasn't taught much earlier, with Goto reserved for the end of the course as a final option with demonstrable potential to spaghettize code.

        But, that would have meant taking the high road as professionals are expected to act, and treating BASIC as a legitimate Peer.

        He couldn't, and didn't, and that's about all you need to know when you see the later endless excuses and revisions to C++.

        1. Dan 55 Silver badge

          Re: Anyone who has a blanket rule banning GO TOs...

          Section 9.6 (goto) of The C++ Programming Language is pretty tame. Are you sure you're not confusing your forrin names?

        2. that one in the corner Silver badge

          Re: Anyone who has a blanket rule banning GO TOs...

          > Starstroupe was an elitest blowhard. ... He knew Goto was almost 1-1 ... treating BASIC as a legitimate Peer

          You might want to revise your basic computer history - or even just, you know, read TFA!

          It was Edsger Dijkstra who wrote the screed against GOTO.

          And he did so in 1968, way before C++ was A Thing (started in 1979, cfront 1.0 released around 1983).

      4. Torben Mogensen

        Re: Anyone who has a blanket rule banning GO TOs...

        "It's a cute meme, but the headline applied to Dijkstra's letter misses the underlying problem: it's not the GOTO, it's the Where From.

        When using GOTO, the destination is unlinked. You look at a line of code (BASIC was a line-oriented language), and you have no idea how you got there."

        I have designed and implemented several (low-level) languages using GOTO-like jumps to labels, but the rule is that every label must occur exactly twice: Once in a jump, and once in an entry point to a statement. If you want several places to jump to the same statement, you must specify multiple labels at that statement. This makes it easy to find where jumps to a statement come from. This also makes some transformations (such as program inversion and specialisation) simpler.

      5. James Anderson Silver badge

        Re: Anyone who has a blanket rule banning GO TOs...

        "you don't know how you got there". But this is true of GOSUB, PERFORM and any subroutine call.

        It's part of the point of having subroutines: they can be used by code you have no knowledge of.

    3. gerryg

      Re: Anyone who has a blanket rule banning GO TOs...

      If I recall correctly even Ada had a goto. Long time ago though.

    4. StuartMcL

      Re: Anyone who has a blanket rule banning GO TOs...

      Guess we need to ban JMP in Assembler as well :)

    5. big_D Silver badge

      Re: Anyone who has a blanket rule banning GO TOs...

      When I first started work, my first assignment was to update a corporate monthly reporting tool (DIVAT - Data Input, Validation And Transmission), which came in various "numbers", defining what financial reporting was being done...

      It used a definition file to define the input fields that should be displayed and how they should be validated... All very nice and advanced. The problem was, it was written in MS BASIC and had been written by FORTRAN programmers and maintained by COBOL programmers. I was the first developer to work on it that had even opened a BASIC manual, let alone actually understood how to program in BASIC...

      It didn't have any For...Next loops or, God forbid, a Repeat or While loop. It did:

      10 a = 1

      20 do something: do something else: and something else...

      30 a = a + 1

      40 if a < 25 goto 20

      It also used computed GOSUBs to action menu points:

      100 PRINT "menu option: "; : INPUT a

      110 GOSUB a * 1000

      120 GOTO 100

      Over the years, great swathes of code had been REMmed out and the code was pushing the 64KB code limit. No problem, I thought, delete all the commented-out code... Nope, the program stopped working. It had computed GOTOs and GOSUBs jumping into the middle of the commented-out code; deleting the unused code caused it to fail because the destinations no longer existed! It took months to untangle the code!
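
      For contrast, the IF/GOTO counting loop above in the FOR...NEXT form MS BASIC already had (the loop body is a placeholder from the original):

      ```basic
      10 FOR A = 1 TO 24 : REM SAME RANGE: THE GOTO VERSION RAN THE BODY FOR A = 1..24
      20 REM DO SOMETHING : DO SOMETHING ELSE : AND SOMETHING ELSE...
      30 NEXT A
      ```

      Three statements instead of four, and no line-number target to break when the code around it moves.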

    6. ForthIsNotDead
      Meh

      Re: Anyone who has a blanket rule banning GO TOs...

      Indeed. The only reason line numbers existed in BASIC was to serve as references for GOTO and GOSUB. If you ban them, then you don't need line numbers. But it's all bullshit anyway. There's nothing wrong with GOTO. It was only the snobby academic types who wanted to look superior to everyone else that had an issue with it. It was just code snobbery.

      In 1991 a mate and I wrote a full management system for a nursing agency including payroll. It was written in Microsoft Quick Basic. It was used for over a decade. It was later re-written in VB6 (look ma, no line numbers!) and for all I know it's still in use!

      1. Farmer Fred

        Re: Anyone who has a blanket rule banning GO TOs...

        Not true. Line numbers also told the interpreter the order in which the program should be evaluated - particularly important when entering and editing code via a command line rather than an editor. If you don't have an editor, how do you replace or alter a line of code? How do you insert a new line of code between two already existing lines?

  3. Pascal Monett Silver badge

    GOTO

    Yeah, dump on GOTO.

    Let's completely ignore that Assembly language doesn't have any other means of pointing a program to another part of code . . .

    Assembly. It's the language the computer understands.

    Everything else is made for your convenience.

    1. Boris the Cockroach Silver badge
      Boffin

      Re: GOTO

      Even in assembly, there are times where the jump command is a right royal pain in the butt

      I always preferred to use a Gosub routine, or a conditional jump to exit a routine (think of that as a repeat... until condition = zero type structure).

      And my personal favourite... the jump table (after pushing the return point onto the stack so that every routine on the jump table could return to the same point)

      But that C64 basic....... my god what a bad implementation on such a capable machine. 67 poke commands to define and move one screen sprite.... no wonder everyone hated it and moved straight into 6502 assembly helll, even the Zx81's basic was more advanced and capable than the C64.... and thats saying something

      1. David 132 Silver badge
        Happy

        Re: GOTO

        ZX81 BASIC even had FAST and SLOW commands. What other implementations allowed you to flip between "slug in oobleck" and "slug in treacle" speeds with a single command?

        1. Stumpy

          Re: GOTO

          And with the correct timing, you could even get the ZX81 to play tunes by switching between FAST and SLOW modes ... purely from the whining of the monitor's LOPT as it switched.

      2. Rich 2 Silver badge

        Re: GOTO

        “6502 assembly helll”?

        At least 6502 assembler is incredibly simple - you can easily write all the opcodes (plus addressing mode options) on the back of a fag packet.

        I remember making my first C64 animated alien sprite, written in assembler. It had a club foot and a bent antenna but I loved it and it was great fun to play with.

        1. This post has been deleted by its author

          1. Rich 2 Silver badge

            Re: Pfft, Sprites!

            Ah PMG :-)

            I had an Atari later on. But documentation wasn’t easy to come by and I had to rely on magazine articles etc to try and work out how to do stuff. As it was (by then) very close to the end of the 8 bit era, I decided life was too short and sold it for a song

            1. This post has been deleted by its author

            2. John Brown (no body) Silver badge

              Re: Pfft, Sprites!

              "sold it for a song"

              Was it this song?

              Sorry, yes, this is the 8-bit equivalent of a rick-roll. But you'll click it anyway just to hear what it is :-)

              1. Jou (Mxyzptlk) Silver badge

                Re: Pfft, Sprites!

                You are running down a dark alleyway at night. You hear the sound of plastic scraping against concrete. You look over your shoulder and see it. The Commodore ALWAYS keeps up with you.

        2. big_D Silver badge

          Re: GOTO

          We had used PETs in school and when I got to college, they had PETs as well. Our first lesson we had to write a simple program to calculate the minimum number of coins to give as change for a given value, so the lecturer could see what our understanding of computers (BASIC) was.

          As a "veteran" PET programmer, I was finished in under 10 minutes, so what to do with the remaining hour?

          A bit of machine code to "tidy up" the presentation, drawing a border around the screen and split it into two windows, the classic PET code for 8x8 matrix characters and Get to make the input large at the top and then piles of coins at the bottom of the screen...

          When the lecturer looked at my screen, she said, "ooh, I didn't know you could do that with a computer!" <facepalm> I was supposed to be the one there to learn! :-D

          No, I wasn't some hot demo or games programmer, I was just a mediocre student who had spent too many hours reading the PET machine code handbook and had memorized enough to program it off the top of my head.

      3. HorseflySteve

        Re: GOTO

        There are some really fun tricks in the ZX Spectrum ROM, like the USR function for calling machine code routines from BASIC, which:

        evaluates the argument to an absolute 16 bit value & puts it into the BC register pair,

        pushes the address of the routine that returns the BC register contents to BASIC as a floating point value onto the stack,

        pushes the BC register contents onto the stack,

        executes RET

        The RETs might not look like GOTOs, but they are...

        I learnt Data General Time Shared BASIC at college in the 70s, then Sinclair ZX BASIC & QL SuperBASIC, before moving on to Z80 assembly and later C. But I've never forgotten that, at the bottom of all that high-level procedural stuff, it's all just loading a Program Counter with a new value, which is a GOTO whatever you call it.

        1. John Brown (no body) Silver badge

          Re: GOTO

          "There some really fun tricks in the ZX Spectrum ROM, like the USR function for calling machine code routines from BASIC"

          TRS-80 Level II BASIC has prior art on that :-)

          Admittedly only one USR() function and you had to poke the start address before calling it and the argument went into and was returned from the HL register.

          But one neat trick was creating a string variable (up to 255 characters long), using VARPTR to find where it is in memory, and then poking your code into that area, usually from READ/DATA statements, then deleting the DATA statements before [C]SAVEing the program, allowing for multiple Z80 "subroutines" to be called from BASIC.

          Disk BASIC extended on that with USR0-9, acting as a replacement for what many cassette BASIC users did, which was to re-purpose the (non-functional) built-in ROM Disk BASIC commands: each Disk BASIC command was in a look-up table which had a JMP/RET instruction, so you just had to POKE your own JMP address in and remember to RET correctly :-)
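
          A hedged sketch of the VARPTR trick on a Level II machine. The string layout (length byte, then address low/high at VARPTR+1/+2) and the USR vector at 16526/16527 are from memory, so treat the details as illustrative; the "machine code" is a single Z80 RET (201):

          ```basic
          10 CLEAR 200 : REM RESERVE STRING SPACE
          20 A$ = STRING$(1,0) : REM ONE-BYTE BUFFER THAT WILL HOLD THE ROUTINE
          30 V = VARPTR(A$) : REM DESCRIPTOR: LENGTH, ADDRESS LO, ADDRESS HI
          40 AD = PEEK(V+1) + 256*PEEK(V+2) : REM WHERE THE STRING'S TEXT LIVES
          50 POKE AD, 201 : REM $C9 = Z80 RET, A DO-NOTHING ROUTINE
          60 POKE 16526, AD - INT(AD/256)*256 : POKE 16527, INT(AD/256) : REM USR VECTOR
          70 X = USR(0) : REM CALL IT; REAL CODE WOULD USE THE ROM HELPERS FOR HL
          ```

          The string body acts as BASIC-managed memory for the machine code, so nothing outside the program's own variables needs to be reserved.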

      4. toejam++

        Re: GOTO

        "But that C64 basic....... my god what a bad implementation on such a capable machine. 67 poke commands to define and move one screen sprite.... no wonder everyone hated it and moved straight into 6502 assembly helll, even the Zx81's basic was more advanced and capable than the C64.... and thats saying something"

        What is especially frustrating is that Commodore opted to use BASIC 2.0 in the C64 instead of the more capable BASIC 4.0 that was released with the CBM-II. Nor did they include any of the graphics or sound extensions for BASIC 2.0 that were included in the Super Expander cartridge for the VIC-20. All because Jack Tramiel was a cheap SOB and didn't want to use a larger ROM.

        Commodore did release a Super Expander cartridge for the C64, but I don't recall it being very popular. I do recall a number of people with Simons' BASIC, which was powerful enough to write games with. And Simons' BASIC 2 was even more powerful, and fully supported structured programming, IIRC.

    2. doublelayer Silver badge

      Re: GOTO

      There is a difference between writing goto in a language where you have better tools and writing it when that is your only tool. There are lots of things in assembly that we insulate ourselves from because it was causing preventable problems. When Dijkstra wrote his famous criticism, he was talking about people who had better tools and he wanted them to use those. He was not insisting that someone write assembly without any jumps.

    3. Orv Silver badge

      Re: GOTO

      Most chips I've programmed for have a jump-to-subroutine command that's a little more structured; unlike GOTO it pushes a return location onto the stack.

      1. runt row raggy

        Re: GOTO

        Pushing the PC as the return address is only helpful if you want to return, so it's not useful if you want to loop or break from a loop, which are literally the two best uses of goto.

    4. Roland6 Silver badge

      Re: GOTO

      Since at least the mid-1970s, assembler languages have permitted the use of labels rather than requiring programmers to insert magic numbers. Now, debugging and amending hex code on the fly…

    5. demon driver

      Re: GOTO

      "Assembly. It's the language the computer understands. Everything else is made for your convenience."

      Actually that's not correct. Besides what others already said: binary machine language is the language the computer understands. Assembler, its mnemonics and other stuff that is not machine language, was already made exclusively "for our convenience". Some assembler languages (like those for some mainframes) actually look like high-level languages compared to machine language. And obviously "our convenience" is the core purpose behind the concept of a "programming language"...

    6. LionelB Silver badge

      Re: GOTO

      > Everything else is made for your convenience.

      Yes, but isn't that the entire point of higher-level languages? Programming in assembler is, apart from some special (usually, but not always) hardware-related cases, generally not the most convenient option. And once you're using a higher-level language, there is generally no obligation to use GO TO, and the fact that it is "the language the computer understands" becomes irrelevant; so you use GO TO if and only if it makes the code intent clearer (to yourself and others).

    7. werdsmith Silver badge

      Re: GOTO

      Assembly. It's the language the computer understands.

      Everything else is made for your convenience.

      Assembly is for your convenience. The computer understands only machine code. It’s hex numbers in the memory locations it’s reading, not text representations of op codes.

  4. Rattus

    I cut my teeth on BBC BASIC, learned 6502 and Z80 assembly at college, and COBOL, Pascal & C at uni.

    I like what you say about making things easy for beginners, but I positively hate loosely typed languages (although my reason for that is that I tend to (a) write control systems and (b) write them on very, very small, deeply embedded devices - you know the sort of machine I'm talking about; there are probably a dozen or so lurking inside your laptop).

    /Rattus

    1. John Brown (no body) Silver badge
      Happy

      So you love BASIC then? Two data types: number and string :-)

      Some, like TRS-80 BASIC, even went further and had integer, single precision and double precision, such as A%, A! and A#.

      1. StuartMcL

        Must be a long time since you actually looked at a BASIC implementation.

        Today you will find most or all of the following:

        Integers

        BYTE, WORD, DWORD, QWORD, INTEGER, LONG, QUAD

        Floats

        SINGLE, DOUBLE, EXTENDED, CURRENCY, CURRENCYEX, DECIMAL

        Strings

        STRING, STRINGZ, WSTRING, WSTRINGZ

        VARIANTS

        ...

  5. steelpillow Silver badge

    Really?

    Not sure how much of that I agree with.

    This old fool was once a beginner. Everything got saved to media on principle, weeks before any of us got enough bugs out that it would actually run. When I got my own cheapo Science of Cambridge Mk.14, the lack of non-volatile storage made it almost useless as a beginner's tool.

    Then there was Mallard BASIC, a particularly fast implementation optimised for business applications, including filesystems and databases. Ran on CP/M if you booted an Amstrad PCW from the other disc. Also on the Amstrad-owned Speccys and the BBC Micro if you plugged in the Z80 option.

    And SAM BASIC, which did procedures and smart auto-renumbering (also updating the GOTOs for example) and all kinds of other Advanced Good Things. You could compile it too, with a BLITZ compile function so chunks of your code ran like hot shit (if you had designed it properly, as always!). I used it as a poor man's FORTRAN.

    I suspect that the "BASIC is crap" mantra came about because an entire generation suddenly wanted 16/32 bit boxen, and dissing everything from the 8-bit era was a prerequisite. Some guru peddling his own 16/32-bit shit just happened to provide the quote everybody wanted to hear. There was certainly never anything rational behind it, at least I agree there.

    1. Liam Proven (Written by Reg staff) Silver badge

      Re: Really?

      > Then there was Mallard BASIC, a particularly fast implementation optimised for business applications, including filesystems and databases. Ran on CP/M if you booted an Amstrad PCW from the other disc.

      Yes -- and a now-forgotten addon for it called Lightning Extended BASIC. It overrode the (optional) LET keyword with LEB, and then that could be followed by graphics commands.

      I wrote a Wikipedia page for it but it was deleted as not notable enough.

      1. Anonymous Coward
        Anonymous Coward

        Re: Really?

        Couldn't LEB have been included/retained as a section within the Mallard BASIC article (and its article "page" turned into a redirect to that section) keeping most people happy?

    2. Nugry Horace

      Re: Really?

      BASIC 2 for GEM was a development of Locomotive / Mallard BASIC, with labels instead of (or as well as) line numbers and all the GEM graphics API natively supported. Apparently Microsoft tried to get Locomotive to do a Windows port, but it never happened.

  6. Chris Gray 1

    pot?

    You on pot when writing this article Liam?

    Historical stuff is interesting, but the part about line numbers and no files is bonkers.

    Actually, I think this is an issue that comes up more often than folks think. Some people, like Liam, appear to have excellent memories for bare facts. He's good with administering tons of Linux variants (lots and lots of weird command names and syntaxes). Now he wants folks to remember functionality by number! Maybe Liam can do that. Lots of folks can't. Using files lets one use *names*. Most folks can remember names much better than numbers.

    You see the same sort of effect in programming languages. Some use a small number of words and lots of punctuation and other syntax. C++ is an example. Others use lots of reserved words - e.g. Pascal. Which kind you prefer (all other aspects being roughly equal!) may well depend on your ability to remember bare facts.

    As for GOTO's... Many organizations and programming standards forbid them. One of the main reasons is not the GOTO itself, it is the place that it transfers you to. You see the label, so you know that there is probably one or more GOTO's that land there. What you can't readily know is what conditions exist (variables, etc.) when landing there. Again, a very good memory for facts can let you know that with some accuracy. A poor memory could require you to scan for that label and examine the conditions active at all of those GOTO's.

    1. Flocke Kroes Silver badge

      Re: pot?

      INTERCAL has a COME FROM statement...

    2. Michael Strorm Silver badge

      Re: pot?

      > the part about line numbers and no files is bonkers

      Line numbers existed in large part because of the limitations of the machines, memory, the basic, er, BASIC editors and screen displays back then.

      I grew up using line-numbered 8-bit BASICs- that was all I knew at the time, so I never really thought about it.

      I returned to Atari BASIC recently (*) and dealing with line numbers was *not* easier than organising text with a modern editor. Quite the opposite: it was more obviously just a means to get the lines in the right order, and it added the complexity of having to worry about numbering the lines such that there was room for reorganisation. (I ended up using Turbo BASIC - a much improved and expanded Atari BASIC - in part because it had built-in line-renumbering.)

      I don't want to have to list a range of lines and cursor over them so they can be edited (and re-entered).

      Also, I found editing via the 40-column display annoyingly cramped by modern standards, but that's another kettle of fish.

      As someone else noted, the lack of "files" was because those BASICs were designed before disk drives were commonplace.

      (*) Albeit in this case more as a means to call and test machine code I'd never learned properly back in the day, and to experiment with things that didn't need to be written entirely via assembly language.

      1. steelpillow Silver badge

        Re: pot? - calling the kettle?

        Line numbers and GOTOs are - or ought to be - an option to use like any other tool. I started out on punched cards. When the technician dropped all our homework in a heap on the floor, I was one of two in the class who had numbered our cards and could get all the lines back in the right order. We were also the only two who ever got our code to run.

        Later, moving from ZX BASIC to a procedural SAM BASIC, I pretty much stopped using GOTOs. I'd use one occasionally within a procedure, sometimes to the line number and sometimes to a label, whichever suited the structure at that point. It also had line renumbering where you could specify the step size. I'd frequently renumber to a step of 10. Inserting an extra line just meant typing out the number and the code, and hitting return. Need more than ten inserting? Just RENUM 10 again. No need to think ahead, no need to highlight with a cursor or scroll to bring the location into view.

        Nowadays it is mostly the occasional shell script; my text editor has optional line number display but is altogether more user-friendly than a terminal display, and I almost never turn on numbering.

        But a power cut before I can take a non-volatile copy is still as pissing as ever, even if I am just trying to get python to say "Hello world".

        1. Dan 55 Silver badge

          Re: pot? - calling the kettle?

          Not many BASICs had a good renumber: either you couldn't choose a section of the program, or it didn't modify the calling GO TOs or GO SUBs, or it didn't check for overlapping, or something got mangled in some other way. It really is a pain in the arse moving the cursor over lines and changing the line numbers to make a copy, then deleting the original lines. The antithesis of user friendly.
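          For what it's worth, the core of what a well-behaved renumber has to do fits in a few lines of Python. This is a toy sketch with made-up names, not how any real BASIC did it (they worked over tokenised program text), and computed jumps like ON X GOTO are exactly the cases that tripped up real implementations:

          ```python
          import re

          def renumber(program, start=10, step=10):
              """Renumber a BASIC program (dict of line number -> source text),
              patching GOTO/GOSUB targets to match the new numbering."""
              old_numbers = sorted(program)
              mapping = {old: start + i * step for i, old in enumerate(old_numbers)}

              def patch(match):
                  keyword, target = match.group(1), int(match.group(2))
                  # A target with no matching line is left alone (it was already broken).
                  return f"{keyword} {mapping.get(target, target)}"

              ref = re.compile(r"(GO\s*TO|GO\s*SUB)\s*(\d+)", re.IGNORECASE)
              return {mapping[old]: ref.sub(patch, text) for old, text in program.items()}

          prog = {5: 'PRINT "HELLO"', 7: "GOTO 5", 12: "GOSUB 7"}
          print(renumber(prog))  # {10: 'PRINT "HELLO"', 20: 'GOTO 10', 30: 'GOSUB 20'}
          ```

          Both steps - remapping the lines and patching every reference - have to happen together, which is why renumberers that did only the first one mangled programs.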

          1. John Brown (no body) Silver badge

            Re: pot? - calling the kettle?

            On the other hand, it taught you to think in a more structured fashion and to plan ahead. Otherwise you end up in the situation you just described. I remember a book I bought (aimed at TRS-80 users; there was a range of them, with a wizard or sorcerer on the covers) that spent a chapter or two teaching a system of variable naming and line numbering to make things sane :-) It was a long time ago, but something like major code blocks using 4 digit numbers, smaller blocks using the 3 less significant digits, subroutines all in the 5 digit range and the main program loop at the start in the 3 digit range. Probably other guidelines on setting up constants, DATA statements etc in their own number ranges.

            Ah yes, just found it. The ...& Other Mysteries series, specifically "BASIC Faster & Better..." :-) Ah, is that the smell of nostalgia in the air? Oh, and here is the inevitable scan I found once I got the name of the book from Ira Goldklang's site :-)

        2. STOP_FORTH Silver badge
          Happy

          We don't need no steenkin' numbers

          Grab punch cards in one hand. Align them by bashing one end against a hard, straight surface.

          Draw a sloping line on the edges of the cards with a pencil.

          No need to even learn numbers, though they might be helpful for programming, especially bloody FORTRAN.

      2. david 12 Silver badge

        Re: pot?

        Line numbers form an "artificial key". That is, a level of indirection. Back in the days of the database wars, it was interesting to observe that people coming from the programming side preferred indirection (and artificial keys), and people from the database side preferred natural keys.

        In database schemas, an artificial key duplicates data. DBAs from a data background didn't like it because it made the data bigger, could become out of sync with the actual data, and was "just wrong". DBAs from a programming background liked it because "indirection solves problems", and named constants are "just right".

        I learned to love artificial keyed source-code when I was supporting a very large BASIC program, under constant development by multiple teams, deployed in multiple versions at geographically diverse sites. Our exception logging scheme gave us logic line identifiers that were fixed over multiple years and multiple versions. I didn't need confidential user data or a remote debug session into an air-gapped machine in another city to tell me what operation had failed on user-input fuzzing.

      3. Liam Proven (Written by Reg staff) Silver badge

        Re: pot?

        > I returned to Atari BASIC recently

        Yes. You *returned.*

        You forgot that you have already mastered the more complex tool.

        1. Michael Strorm Silver badge

          Re: pot?

          Just because I learned to edit with line numbers first, doesn't inherently mean that the line-less editing I used more later on was (in itself) more "complex".

          I think you - like I would have done back then - mistake the familiarity of what we first knew for inherent simplicity, like one's native language.

          If line numbers ever seemed "simpler" to me, that's likely only because it's what I learned first and the "default" I took for granted as to how programming and computers worked.

          Yes, I *did* learn more complex tools and programming techniques later on, but that "complexity" had nothing to do with whether the editor used line numbers or not.

          I don't see that a text editor displaying a bunch of lines in easily-accessible (and easily-modified) order is inherently more "complex" than having to refer to and navigate code via line numbers, a page and/or range of lines at a time, then having to use one of the various 8-bit-style techniques to edit a line: call up the line or range you want to make sure it's on the screen, then put it in edit mode (Sinclair), navigate over the displayed line and edit/insert (Atari), or retype/copy with changes (BBC).

          (Yes, I'm sure the former style of editor might benefit from line numbering as an aid to navigation if it doesn't have any more advanced tools - just like many modern editors optionally include - but that isn't the same thing as the line numbers themselves being - or *having* to be - an integral part of the code itself.)

    3. JRStern Bronze badge

      Re: pot?

      There was a time, circa 1975, when the "files" thing was considered an issue.

      The first versions of Microsoft's NT operating system, and the infamous and never-released NT 5.0, were going to, I dunno, do away with them, or something.

      And watch for this idea to come around again, it kind of has, with "data lakes" and such instead of data bases.

      But I agree with you, files are a good idea not a bad idea.

      1. steelpillow Silver badge

        Re: pot?

        The original MacOS tried desperately to hide the filesystem, so you just clicked arrays of icons to do stuff. Despite a neat and elegant feel, I found it unusable.

        Today's MS Office 365 SharePoint collaboration space does much the same, only for neat and elegant read chaotic and shite.

        1. Dan 55 Silver badge

          Re: pot?

          There was a short happy time when MS was forced to open up SMB, so then they went all out on SharePoint, and of course businesses marched from SMB into SharePoint like lemmings. Everything safely locked up once again in the next walled garden.

        2. fromxyzzy

          Re: pot?

          iOS has tried to do this as well, it fights you hard to keep you from seeing the filesystem structure. They claim it's for 'security' (which surely means 'so you can't jailbreak it') but it only really makes sense if you understand it as being philosophical.

          Makes it impossible to use iPhones/iPads to get any serious work done.

    4. doublelayer Silver badge

      Re: pot?

      This is not the first time that Liam has argued that files are harmful. When he wrote his eulogy for Optane, he was unhappy with its death because he hoped it would kill the file system by uniting disk and RAM, a similar approach to what he's said here. I didn't understand it then and I don't understand it now.

      Files are useful. They let you store things much more conveniently than an internal structure that something else writes to storage for you. You can find a file, copy it, duplicate it, transfer it onto disks or over networks; it's the ultimate form of portability. We only managed without them because resource limitations were so restrictive, but as soon as the machines made storing separate data files feasible, people did so. Once you have separate data files, having code as data was useful in so many ways.

      Related to this article, early students of programming get two additional benefits from files. The first is that they can review and edit their work, even if their first attempt crashed the computer. They don't need to write down what they're typing so they can see what it was later. The second is that they can have multi-file programs, or in other words they can have a piece of code provided to them which they don't have to type in from scratch. Want to know how it did what it did? Open that file. Want to use it? Just type in an import statement as supported by your language of choice, BASIC or otherwise. No need to renumber something or manually type in that code.

      1. Anonymous Coward
        Anonymous Coward

        Re: pot?

        Not aware of the previous files controversy.

        So what's the result if no files - going with some sort of uber-Registry containing it all?

        Files exist to fit a purpose, and they're understandable by both humans and computers.

        Not sure what's at the bottom of this distaste, nor what/how a supposed better way of doing it could work as effectively.

        1. doublelayer Silver badge

          Re: pot?

          "So whats the result if no files, going with some sort of uber-Registry containing it all?"

          That's one of the several things I don't get. As far as I can tell, it's objects held by software in their indexed, ready-to-edit form. What happens when you want to use those objects in other software, I don't know. I would guess that you're supposed to export them from the software into some format that you can import into another piece, and I'm not sure where the data is supposed to be while you do that. I'm still trying to figure out how you promote data interchange when the simplest mechanism is explicitly denied. I think it's up to Liam to explain what you would do where files aren't available, and so far, I've found that part either vague or missing entirely in any proposal he has posted.

    5. Liam Proven (Written by Reg staff) Silver badge

      Re: pot?

      > Historical stuff is interesting, but the part about line numbers and no files is bonkers.

      That's such an over-simplification that I wonder if you were on the jazz woodbines while reading it. ;-)

      2 unrelated things, linked by a theme.

      1: line numbers in BASIC are a useful thing for beginners. Those who wanted to get rid of them don't realise that they are replacing a simple metaphor with a very complex one, because they have forgotten that "files" _are a metaphor_.

      2: This one is more difficult. Take it step by step.

      [a] More advanced languages, such as Logo and some Lisps, make compilation invisible. Edit a module's code and run it, and while you're working on it, it's interpreted; but when you _give it a *name*_ it's compiled in the background and kept around, so suddenly the same module is now effectively a binary.

      [b] OSes built around this kind of tech blurred the lines between "interpreting code" and "compiling code" away so much that you could no longer tell _and did not need to._ If the OS is good enough then tools like "compilers" disappear because they are a hidden background function like keeping recently-accessed data in a cache: you never see it and never need to. It's accessible to experts doing low-level fettling but hidden from ordinary Joes writing a payroll routine.

      [c] Once you accept this idea then we can imagine a more advanced OS that also blurs away the distinction between temporary data and permanent data. It's all permanent unless you get rid of it. A terabyte of PMEM -- persistent memory, such as Optane -- and you don't need to track files any more. It's all just storage, and whether it's kept when the power is off becomes an engineering implementation detail.

      That's why I linked to my Optane comment piece:

      https://www.theregister.com/2022/08/01/optane_intel_cancellation/

      The point was a metaphor comparing the simplicity of the line numbers model and the theoretical simplicity of an OS that uses persistent memory where "files" and "folders" fade away like the technical debt they are.

      1. doublelayer Silver badge

        Re: pot?

        The problem is that we've seen what happens when you blur away the existence of files. You get the smartphone. It's your data, you know. No need to wonder about where it is, Google's on it, you don't have access to the raw data anyway, the apps know where to find it.

        And that is wonderful, assuming that you only ever want to do the thing the app gives you controls for. If you want to have some of those files on portable storage because your phone doesn't have enough internal memory, but the app didn't give you that button? Sorry, you can't do it. Do you want to back up your data and restore it to a different device? You could copy all the files and transfer them over, if we let you see them, but since you can't, you only have a one-size-fits-all backup method which might work, or might simply restore the app, helpfully back to factory settings except it remembers that you asked for dark mode. Maybe the app is doing something wrong? On a computer, I could edit a configuration file to make it do something differently. But even though the same configuration file exists on the phone, I'm denied access to it, so I'm out of luck. This is the kind of stuff that breaks when you try to pretend that files are unimportant.

        A lot of the time, you don't have to look at or worry about internal files, but you really do want to have access to yours, and having the internal ones is always an option that's useful to have around in case you find a reason to want it. It's necessary if you're going to write good, structured code. Partially it's because it makes organization of something complex more feasible, and partially, it's because portability and reproducibility are critical here. Programming involves multiple layers of understanding how the machine works at one level, ripping that away and learning what's under that, then putting back the levels until you have the level of abstraction relevant to your task. Hiding something as simple as where your data is under a layer of vagueness is not helpful.

        1. Anonymous Coward
          Anonymous Coward

          Re: pot?

          Excellent comment describing current times. Multiple stories recently about just this: files/folders are an alien concept to most of today's 'computer expert' children/young adults their parents want to gush about.

          If the apps won't let you export data to anywhere aside from their mothership OS/SM site, you're cooked.

          Then it's on to Google to find out how to connect your shiny to a laptop or weird desktop computer thing, and navigate this weird conglomeration of spaces, die rector ees, and files?

          Seems pretty clear most education systems are flat out failing basic educational duties in preparing kids for anything but how to USE applications. All that money and largesse being flung at them, and kids are confused by files and directories.....

      2. Dan 55 Silver badge

        Re: pot?

        Optane was dropped because there was no real reason for it. All it is is a RAM cache on top of storage, perhaps useful for spinning rust but by the time DDR5 and faster SSDs came along there was no point. It required a special CPU and motherboard, it was expensive, and anyone who really needs to access files on SSD at RAM speeds can write their own solution to stream them in.

        I mean, if you wanted to only address data on storage by track, sector, and byte offset like some kind of memory map in RAM you could, but there's a reason why everyone uses a filesystem.

      3. Chris Gray 1
        Coat

        Re: pot?

        I'd have to Google or something to know what "Jazz Woodbine" is.... :-)

        The concept of being able to execute expressions, etc. right at the usual command prompt is indeed useful. It's one of the first things I put into my AmigaMUD system (late 80's?). In systems that allow significant syntax for non-expression command lines you usually want a prefix character to select expression-evaluation mode to make it unambiguous. Argghh - my old brain can't remember, but there was an early system that used a close parenthesis (')') at the beginning of a line as such an escape.

        Are files really an abstraction? They are usually implemented via a name lookup table (file names) pointing at a description of the storage that contains the file's data. Given that an externally visible representation of that storage could get real ugly, and almost no-one would want to type it directly, I don't see this as an abstraction situation. It's more of a "We need names for such things!".

        Also, the generalization from just file names to full file paths (and the concept of current working directory in a hierarchy) quickly became mandatory - very quickly the ability to organize your names (and re-use names in different contexts) made things much easier, even though the concept of a name hierarchy added complexity. The lack of a file hierarchy on early systems like CP/M worked when all you had was floppy disks with little capacity. Even an early 5MB hard drive got annoying - you had to remember the (numeric!) USER groups to go beyond the one set of names. And, given the filename length limitations, creating your own personal hierarchy within the names you used wouldn't be very satisfactory.

        As for using numbers of varying lengths as a kind of hierarchy, I know *I* would have had to have a piece of paper beside the teletype (hey, we are going back as far as we can, but I'll ignore front-panel input switches on computers) telling me what the various numbers were supposed to mean. If I was paranoid enough, there would have been copies of that in several places since it would have been crucial to the use of the computer. Nope, give me file names any day!
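        The "name lookup table pointing at a description of the storage" view can be sketched in a few lines of Python. This is a toy illustration only - the names, block numbers and sizes are all made up, and no real filesystem is this simple:

        ```python
        # Toy "file table": names map to a description of raw storage
        # (block numbers on a disk) that no-one would want to type by hand.
        BLOCK_SIZE = 512

        file_table = {
            "payroll.bas": {"blocks": [17, 42, 43], "size": 1280},
            "readme.txt":  {"blocks": [5], "size": 312},
        }

        def read_file(name, disk):
            entry = file_table[name]                        # the name lookup
            data = b"".join(disk[b] for b in entry["blocks"])
            return data[:entry["size"]]                     # trim the partly-used final block

        # A fake disk: block number -> 512 bytes of data.
        disk = {n: bytes([n % 256]) * BLOCK_SIZE for n in range(64)}
        print(len(read_file("payroll.bas", disk)))  # 1280
        ```

        The point being: everything below the name lookup (block lists, sizes, free space) is exactly the ugly machinery nobody wants to address directly, whether you call the name layer an "abstraction" or not.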

      4. steelpillow Silver badge
        Flame

        Re: pot?

        > "files" _are a metaphor_

        Only if you have been smoking the propaganda Woodbines.

        Most words in computing are metaphors in that sense. It's how we build our vocabulary of understanding; the metaphor is used as a label for a specific technical construct. Please show me a real "cloud", a real "mouse", a stored "image" that is not just a string (sic) of bits (sic). If you want to see a real filesystem, go find an old library that still uses Dewey card indexing. Metaphor my arse! Using such an argument to diss virtual filesystems - or anything else for that matter - is just semantic handwaving.

    6. ibmalone

      Re: pot?

      Some interesting history that then runs into weird opinion piece:

      Type a bare expression, the computer does it. Number it, the computer remembers it for later. That is all you need to get writing software.

      It eliminates the legacy concept of a "file". Files were invented as an indirection mechanism.

      This is both oversimplification and obfuscation at the same time. First part: Python and many other interpreters have an interactive mode which works as described (bash shell scripting could be described as just this). Run ipython or a Jupyter notebook and you get that same experience. Files are a storage mechanism, a way to store those programs, just as I had to learn to save to tape when trying to program the ZX Spectrum.

      Second, maybe you will decide to call something like Jupyter Notebook extra levels of indirection. Now the thing is, so are those BASIC interpreters. This is the thing that it took me a long time to unlearn: the ZX's BASIC interpreter, MS-DOS, the Linux command line - none of those are in any way some kind of direct interface or native link into "the computer" in a way that Harrier Attack, Windows 95 or Plasma aren't. The hardware is running some combination of that software and its own microcode that provides APIs to the hardware. You are not any more directly entering commands into a Z80 with BASIC than you are in something like ipython or Bash. To this point, some Python starter guides actually start with interactive mode.
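      Indeed, the quoted two-mode rule - a bare statement runs now, a numbered one is remembered for RUN - fits in a few lines of Python. A toy sketch only, nothing like how a real BASIC worked internally:

      ```python
      # Toy sketch of BASIC's two input modes: a bare statement runs
      # immediately; a line starting with a number is stored for RUN.
      program = {}

      def enter(line, env):
          head, _, rest = line.partition(" ")
          if head.isdigit():
              program[int(head)] = rest   # deferred: remember it for later
          else:
              exec(line, env)             # immediate: do it now

      def run(env):
          for number in sorted(program):  # line numbers give the ordering
              exec(program[number], env)

      env = {}
      enter("x = 2", env)                 # immediate mode
      enter("20 print(x * y)", env)       # stored
      enter("10 y = 21", env)             # stored; runs first despite entry order
      run(env)                            # prints 42
      ```

      The entry order doesn't matter, only the numbers do - which is both the charm of the scheme and the reason you had to leave gaps for later insertions.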

  7. tiggity Silver badge

    AtomBASIC

    Mention of AtomBASIC sent me down memory lane.

    Though TBF, I did not use that BASIC much. I started out using it but soon hit the limitations of memory and performance, and so started using more and more assembler to get more bang for my memory buck (AtomBASIC was great for really easily letting you call assembler code from the "high level language" - probably the easiest-to-use "assembler & high level language mix & match" capability of any language I have used). It wasn't long before I ended up just doing pretty much everything in assembler, but AtomBASIC was a good introduction to programming (never understood the GOTO hate BTW - you could argue GOTO helped you get to grips with the concepts of jump / branch calls common in most assemblers).

    1. Pat 13

      Re: AtomBASIC

      I was never a programmer but using my Atom's Basic and a thick book on Programming the 6502, I did write a program which moved a blocky spaceship smoothly and quickly across the screen. So, at least I got a brief glimpse of knowledge about programming languages and their levels.

  8. Andy 73 Silver badge

    Hmmm... two different things..

    Sure, line numbers, lack of files and folders and the immediacy of BASIC are advantageous for beginners.

    But those same omissions are probably the root cause of the decline of BASIC - they make modular programs impossible. Even with BBC BASIC, the best you could hope for was one long, long screed of text that did everything. It stopped beginner programmers from moving to the important next step of programming: breaking down a problem into smaller units that could be solved once, stored and shared.

    Hence the association of BASIC with a lack of structured thinking. There was no 'meta programming' (in the sense of organising a group of program units, libraries and input data), just a big blob of BASIC.

    It is also true that Commodore BASIC was abysmal. Though we might argue that Tramiel was right in that no-one really bought their computer to write BASIC programs on by the mid eighties - so providing a perfunctory implementation probably met the required criteria even if it left a small number of kids feeling incapable of making their computer "do something". It's still the case that we could make much more welcoming 'beginner' languages and computers, but the budget is rarely there to do so.

    1. Dan 55 Silver badge

      Re: Hmmm... two different things..

      Sure, line numbers, lack of files and folders and the immediacy of BASIC are advantageous for beginners.

      It's just a side effect of these systems' base configurations being cassette rather than any thought about the user. There were no folders, and the file was the next screeching noise you heard after pressing play. All the computer could interpret was what was in memory at the time, and the editor was simple, so the user had to order the commands... but that doesn't mean that newer BASICs still required line numbers (QL SuperBASIC didn't) or had to be so limited.

      Though we might argue that Tramiel was right in that no-one really bought their computer to write BASIC programs on by the mid eighties

      Commodore put BASIC 3.5 in the TED machines (developed under Tramiel but released after he left) and BASIC 7.0 in the C128. The C128's BASIC had improved so much it even had FAST and SLOW like the ZX81. ;)

    2. heyrick Silver badge

      Re: Hmmm... two different things..

      "Even with BBC BASIC, the best you could hope for was one long, long screed of text that did everything."

      RISC OS would like to introduce you to the LIBRARY keyword, to add extra bits of code to a program. It is frequently used for writing multitasking programs because using the indirection to fiddle around in blocks of memory to set up icons and windows and such is icky, so people use libraries to save having to get their hands dirty over and over and over until insanity ensues.

      There's also OVERLAY, which is similar but different.

      Oh, and while BBC BASIC technically needs line numbers, my editor doesn't show them. It doesn't matter what they are so long as GOTO/GOSUB aren't being used. Indeed, one can hack a program to set all of the line numbers to '1' and it'll still work.

    3. lordminty

      Re: Hmmm... two different things..

      I was introduced to structured programming AND subroutines while at school using BASIC+2 on a PDP 11/70.

      The thing that didn't have "structured thinking" was the PICNIC, not BASIC the language.

      1. Bebu sa Ware
        Windows

        Re: Hmmm... two different things..

        BASIC+2 on a PDP 11/70.

        Brings back memories of writing a BASIC program to convert BASIC+2 programs to vanilla BASIC on an RSX-11 system (the +2 license lapsed, I think).

        Pretty much assigning line numbers to labels, functions; converting functions to gosubs etc ... most of the elements of a rudimentary compiler I guess.

        The DEC manuals are preserved in the Internet Archive: PDP11 BASIC PLUS 2 Users Guide. Nowadays it seems a rather peculiar language, but on machines that sported 64KB of memory you could actually do real work using these tools.

    4. Bebu sa Ware
      Coat

      Re: Hmmm... two different things..

      "programming: breaking down a problem into smaller units that could be solved once, stored and shared."

      This was something largely absent from computer science education in the '70s.

      Really brought home to me at the time after reading a newly printed copy of K&R's The C Programming Language (1/e), where pretty much each small example function implemented some part of the standard library (of 7th Edition Unix?)

      Being able to use the RMAC assembler on CP/M 3 to build relocatable objects and libraries meant I could compile and assemble programs in a C subset using the existing abstractions in my libraries which would be linked into a executable. This stuff just wasn't being taught.

      As well as using indirection to solve problems, most problems would benefit from delaying binding a particular implementation or solution. ELF shared libraries are a good example of using both indirection and delay. (Indirection is arguably also a delay.)

      Even without "files" you still have to identify, distinguish and presumably name objects - I guess a file system is just one method of doing this; another could be a database, which I recall Hans Reiser giving as a rationale for some of the design decisions in reiserfs. Multics also did things differently.

    5. Anonymous Coward
      Anonymous Coward

      Re: Hmmm... two different things..

      As others have said, there are plenty of BASICs with a variety of structures, and ways of being modular. And plenty of BASICs that don't have line numbers (at least in the GOTO sense)

  9. David 132 Silver badge
    Pint

    Upvote for the mention of Blitz BASIC on the Spectrum

    BB was one of my first purchases* after I received my Spectrum+ in 84/85. It allowed flood-fill (well, slowly-rising-ooze fill, technically), true circles, 80-column text, and all sorts of other enhancements.

    The manual, as I recall, was printed as an A5 booklet, in black ink on red paper to deter photocopying.

    A pint for the BB author!

    *my very first purchase was Elite, from a shop in London on a day-trip to the Big Smoke. I remember opening the box and admiring the LensLok on the train on the way back North. And then cursing that same LensLok when I couldn't get the darn thing to let me play the game :)

    1. Dan 55 Silver badge

      Re: Upvote for the mention of Blitz BASIC on the Spectrum

      Beta BASIC on the Spectrum, Blitz BASIC on the Amiga? Though both are worthy of mention.

      1. David 132 Silver badge
        Pint

        Re: Upvote for the mention of Blitz BASIC on the Spectrum

        D'oh! You're absolutely right, and I clearly had a brainfart when I typed that. I meant Beta Basic of course.

        In mitigation I can only offer that I was intrigued by Steelpillow's mention of the Blitz mode on SAM Basic and clearly that word was on my mind. I sort of wanted a SAM Coupe for a while, but by the time I'd saved up almost enough pennies for one, I'd discovered the existence of the Amiga and took a different track!

  10. Roopee Silver badge
    WTF?

    English is one of the easiest human languages...

    Do you have a source for that statement? I think many, many people would beg to differ!

    1. HuBo Silver badge
      Trollface

      Re: English is one of the easiest human languages...

      Well, civilization came very slowly to the Saxons (originally Vandals and Barbarians), from their Greco-Latin neighbors, who themselves got it from their near-east and far-east neighbors (al'gebra, al'gorithms, right-to-left Hindu–Arabic numerals with zero, Chinese noodles, gunpowder, ...) and so language had to be simplified along the way as well, from ideograms, to the Latin alphabet and declensions, and eventually English as a great improvement over cave-dwellers' grunts and suchlike.

      It's a simple language, but it's also quite expressive, concise, and flexible ... maybe not as romantic, or poetic, as others though (I'm told Persian poetry is particularly subtle and enjoyable). No one should be particularly vexed at having such a primitive language as their mother tongue imho (we don't really get to choose that)!

    2. Bebu sa Ware
      Coat

      Re: English is one of the easiest human languages...

      "English is one of the easiest human languages, both to read and write as well as to understand."

      "many, many people would beg to differ!"

      Just about everyone outside the anglosphere, I should think.

      Just taking the size of the English vocabulary one only has to look at the paper OED to realise that English has begged, borrowed and stolen its way to the largest lexicon of any language. As I frequently have to remind my non native English speaking spouse that English has few actual synonyms as most are differentiated by very fine shades meaning modulated by context.

      The syntax of English is simpler than, say, German, but still has the joys of the genitive and, for the non-left-pondian, the subjunctive - hence a long explanation to her indoors of why, in a particular text, "I" followed by "were" wasn't incorrect.

      The spelling of English isn't as haphazard as it might appear if you understand the evolution of the language from Old English and have some exposure to those texts, but for non-native speakers it's a complete dog's breakfast.

      As a native English speaker (albeit of an antipodean dialect), I would contend that French is a much easier language for non francophones to learn and understand, possibly Spanish even more so.

      Ironically, the dominance of English, while initially due to the far-flung British Empire, was cemented by the post-WW2 economic and geopolitical supremacy of the US - arguably a nation which subsequently hasn't done the language any favours.

      1. BILL_ME

        Re: English is one of the easiest human languages...

        Native English speaker here, and I also think it's the easiest language in the world...

        Every non-English speaker is going to have a different opinion on what language is easier to learn.

        German was easier for me than Japanese because of the reversed sentence structure, though the actual language is easier, as pronunciation is the same as what is written, without the weird exceptions English has.

        If pronunciation of the new language is not too dissimilar for the native speaker, then probably the most important criterion in classifying it as 'easy' is whether the new language's sentence structure follows the native one.

        Aside from having to learn new words, there's training yourself to reverse Subject–Object–Verb to Subject–Verb–Object, or other differences like high-context vs low-context.

        Not sure how/why this is in the discussion though; BASIC was written in English because that's what its creators spoke.

        1. This post has been deleted by its author

          1. ebruce613

            Re: English is one of the easiest human languages...

            That would be 'weird'?

            1. that one in the corner Silver badge

              Re: English is one of the easiest human languages...

              What would be weird about a Brit using the word weird? I would personally consider it a peculiarly uneventful week if nothing happened to make me say "that was - weird". One doesn't use it every day, of course - sometimes the thing is more bizarre than weird, on occasion merely odd or, on the best days, curious (and then curiouser). But once a week seems about right.

        2. TheMeerkat Silver badge

          Re: English is one of the easiest human languages...

          > Native English speaker here, and I also think its the easiest language in the world.......

          That is because you are a naive speaker.

          For non-speakers, English is quite difficult. More difficult than, say, Italian. Just the fact that spelling is quite random makes it really hard.

          > If pronounciation of the new language is not too dissimilar to the native speaker, then probably the most important criteria in classifying it as 'easy' is does the new language sentence structure follow the native one.

          You are wrong again.

          The most difficult and time-consuming part when you actually learn a language is to learn vocabulary. So the farther away the language is from your native one, the more difficult and frustrating it is to learn. There is no magic way to go around learning the vocabulary.

          As for pronunciation, English is not easy for speakers of other languages. Plus it is more difficult to understand by ear compared with, say, Spanish or German, due to its pronunciation.

      2. LionelB Silver badge

        Re: English is one of the easiest human languages...

        > As a native English speaker (albeit of an antipodean dialect), I would contend that French is a much easier language for non francophones to learn and understand, possibly Spanish even more so.

        Spanish has the advantage that the orthography is pretty close to phonetic (as long as you pay attention to the accents), and in particular (with a few very specific exceptions) all the letters are at least pronounced. French... not so much; to my mind the gap in French between what a word looks like it ought to sound like and what it actually sounds like is almost as wide as in English.

        The thing about English is that, as a hybrid between two language groups, on the one hand its grammar has been forced to simplify (in comparison with both its Anglo-Saxon and Romance progenitors), and on the other it jointly inherits a particularly rich, expressive and nuanced vocabulary - but the spelling is a car crash.

      3. Michael Strorm Silver badge

        Re: English is one of the easiest human languages...

        > the US - arguably a nation which subsequently hasn't done the language any favours

        Ironically enough...!

    3. F. Frederick Skitty Silver badge

      Re: English is one of the easiest human languages...

      Yup, Liam is clearly not a linguist (nor am I, but I studied linguistics as part of my degree). English is an absolute mess, with more exceptions than any other European language. Its pervasiveness has nothing to do with it being a particularly well-structured or easy-to-learn language.

    4. Richard 12 Silver badge

      Re: English is one of the easiest human languages...

      Indeed.

      English is objectively one of the hardest languages to learn, because it's a giant hairball of a language. For a start, English is Germanic, except where it's Romance or Greek. Or words and phrases nicked from elsewhere.

      Spanish, French, and Italian are all far easier to learn because they're entirely Romance - except for recent technical loanwords from English of course.

      For a concrete example, the concept of a "spelling bee" is almost uniquely English. In Spanish the idea is nonsensical as almost all phonemes have a single spelling.

      English is only easy if you already know it.

      1. Caver_Dave Silver badge

        Re: English is one of the easiest human languages...

        "English is objectively one of the hardest languages to learn", but I will add the word "correctly" to the end of that statement.

        I talk and write daily with people all over the world. They mess up large chunks of vocabulary, grammar and spelling, but the meaning is easy to work out as so much of the communication is contextual (although that itself is often seen as an impediment to 'correct' English usage.)

        Esperanto was an attempt to provide a simplified language (IIRC about 900 base words and 2000 in total), but a phrase like "blood, sweat and tears" famously translates as "body water, body water and body water".

        "If you learn only 800 of the most frequently-used lemmas in English, you'll be able to understand 75% of the language as it is spoken in normal life." Prof Stuart Webb

        The preferred tabloid newspaper of the UK working class, The Sun, often has around 8,000 lexemes (walk/walks/walking/walked count as one lexeme); the more high-brow newspapers often contain over 20,000 lexemes.

        Is this 10 times difference between 75% and 100% understanding really a problem for the average man in the street?

        BTW. I get by in France, shopping, eating and even trying to find Diesel in a fuel strike, with a vocabulary of probably less than 100 words. (I chose French as my example as that is the only country I have visited where the locals don't immediately switch to speaking English when they realise you are not local.)

        I did manage a short conversation in a Dutch restaurant last summer, but it turned out the waitress was from New Zealand and also wanting to practice Flemish!!!

  11. Doctor Syntax Silver badge

    "A generation of gray-haired IT folks learned computing using BASIC on 1980s home computers"

    Just don't say such things. This is my children's generation.

    1. Dan 55 Silver badge

      If only I had known learning BASIC on 1980s home computers would be responsible for so many ills. GO TO, gray hair...

    2. steelpillow Silver badge
      Windows

      A generation of gray-haired IT folks

      So what colour hair do those of us have, who learned computing in the 1960s?

      1. Philo T Farnsworth Silver badge

        Re: A generation of gray-haired IT folks

        Hair? What's that?

        1. David 132 Silver badge
          Windows

          Re: A generation of gray-haired IT folks

          Hair? What's that?

          It's the thing that, after a certain age, sprouts wildly from one's nostrils.

          (Old git icon is a self-portrait.)

          1. Fruit and Nutcase Silver badge
            Happy

            Re: A generation of gray-haired IT folks

            I wonder... Do they do hair transplants using hair from the nostrils?

            Icon - bald head

            1. This post has been deleted by its author

              1. David 132 Silver badge
                Coffee/keyboard

                Re: A generation of gray-haired IT folks

                You bastard. You owe me a new keyboard and possibly monitor for the effect that mental image had on me - see icon --->

                :)

                Also @Fruit and Nutcase - if they did, and such a transplant was performed on me, I suspect I'd end up looking like Cousin It. There is no shortage of donor material.

            2. John Brown (no body) Silver badge

              Re: A generation of gray-haired IT folks

              "I wonder... Do they do hair transplants using hair from the nostrils?"

              ISTR a comedy sketch, possibly Alexei Sayle, in a photo booth, noticing a nasal hair which might spoil his passport photo and so deciding to pull it out. He pulled, and instead of the teary-eyed pain most of us would expect, he just kept pulling and pulling as it became longer and longer and the hair on his head disappeared :-)

              1. HorseflySteve

                Re: A generation of gray-haired IT folks

                That was Gregor Fisher as The Baldy Man and the advert was for Hamlet cigars.

                The Baldy Man character was originally seen in BBC Scotland's Naked Video comedy sketch show & had a spin-off series of his own (2 seasons totaling 13 shows).

                1. John Brown (no body) Silver badge

                  Re: A generation of gray-haired IT folks

                  Thanks! :-)

                  That was a *long* time ago when tobacco advertising was still allowed on TV :-)

          2. Tim99 Silver badge

            Re: A generation of gray-haired IT folks

            and ears and eyebrows...

          3. StuartMcL

            Re: A generation of gray-haired IT folks

            > It's the thing that, after a certain age, sprouts wildly from one's nostrils.

            And ears and eyebrows and the tip of your nose!

            The stuff on the top of your head doesn't disappear, it migrates.

      2. Doctor Syntax Silver badge

        Re: A generation of gray-haired IT folks

        It doesn't matter. We're just pleased if we've got any left.

  12. gaffe77

    Education should involve mistakes...

    ...made in the safety of the classroom, not just an appeal to the authority of some white-bearded wizard. Let students program with GOTO until its "harms" become obvious, then introduce them to Edsger Dijkstra. No velociraptor will take them.

    Brag: I first wrote 30 GOTO 10 on a tty (no, a real one with oiled parts and paper dust) in 1977. Some kind of Digital kit, I think, though I was far from concerns about architecture at that point. Later, a Commodore PET arrived in the lab (no, a real one, with microscopes, test tubes and dye-spattered white coats), placed on a bench and mainly ignored. But I RTFM-ed....

    The problem with the gloriously uncomplicated world of educational BASIC described in the article is that it doesn't last longer than about a working day. Anyone who is going to end up writing worthwhile code will go mad with boredom if it goes on much longer than that. Whether you want to call it a filesystem or a workspace, some kind of persistent context for the student becomes necessary very early.

    I never wrote anything longer than about ten lines on the tty (to be fair, I could barely type). The PET had a five-and-a-quarter FDD. Posh, eh? But it enabled even a poor typist to reproduce a program listing in a magazine, over a series of lunch times.

    But something like workspaced BBC BASIC on a 6502 emulator should last for quite a while, and provide an introduction to assembler.

    1. Anonymous Coward
      Anonymous Coward

      Re: Education should involve mistakes...

      reproduce a program listing in a magazine

      One summer, I typed a game into a handed-down Dragon 32. Apart from some very limited time spent on someone else's PET, I had no prior experience or instruction. Correcting the syntax errors on input was key - at first by looking up the source code; later, depending on the error, I was correcting without looking it up (usually things like missing spaces/brackets etc). I got the program working eventually, and the journey of correcting all those mistakes and learning from their effects (as not all errors on input led to syntax errors) set me up for my future career.

      1. John Brown (no body) Silver badge
        Joke

        Re: Education should involve mistakes...

        "set me up for my future career"

        You're a proof reader? :-)

  13. disgruntled yank Silver badge

    Basic, etc.

    I wish that I had learned BASIC, given its widespread availability. Eventually I learned VBScript and VB.NET, but I don't know how much they had in common with the older BASICs.

    It is unfortunate that Dijkstra is so much remembered as an insult comic. I suppose that to some degree he brought it on himself.

    1. Dan 55 Silver badge

      Re: Basic, etc.

      Dijkstra also argued in the same letter that loop constructs like "while ... repeat" or "repeat ... until" could all be replaced with recursive procedures. I can't understand how anyone took him seriously about GO TO.

      1. Philo T Farnsworth Silver badge

        Re: Basic, etc.

        Did Dijkstra like anything?

        I was thinking perhaps Modula but that was Niklaus Wirth's baby.

        I remember him dumping on the first machine I ever programmed, the IBM 1620, which was admittedly a unique beast*, but it was a fun machine to program.

        Of the OG computer scientists, my favorite was always Donald Knuth. At least he had a sense of humor.

        ____________________

        * It was a decimal-based, variable word length machine, somewhat similar to its sibling, the IBM 1401, that had to have addition and multiplication lookup tables loaded into its lower memory, thus earning for itself the sobriquet CADET -- Can't Add, Doesn't Even Try.

        1. david 12 Silver badge

          Re: Basic, etc.

          Dijkstra was an algorithms guy. He thought that the proper subject of Computer Science was algorithms, and that everything else was not CS. The idea of "functional languages" was not unique to him: a lot of people still think that functional languages are a simple and natural way to implement algorithms. And although he is now remembered for a headline, that was actually written by the editor of the journal.

          1. Philo T Farnsworth Silver badge

            Re: Basic, etc.

            I had the privilege of meeting with the late Fred (Mythical Man-Month) Brooks a few times at UNC when I was working in the Research Triangle Park area in North Carolina.

            I don't exactly recall the context but I do remember him telling me that any discipline that has to call itself a science probably isn't one. Computer science should more properly be called computer engineering -- at least most of it.

            Someone (not Brooks) also told me that computer science, which often started out as the poor stepchild of the mathematics departments, suffered from math envy, and tended to obscure simple procedural concepts in what yet a third person liked to call epsilon sauce and gamma grease.

          2. that one in the corner Silver badge

            Re: Basic, etc.

            > Dijkstra was an algorithms guy...

            And he was also writing back in the 1960s, when people were able to understand the use of hyperbole and didn't just start whining about the nasty mean man (followed by examples from decades after the article was written to show "where he was wrong").

  14. Jou (Mxyzptlk) Silver badge

    C16/Plus4 (and C128) Basic were much better

    Not only did they offer more commands than C64 BASIC, fewer bugs, and commands for graphics, they also offered a built-in assembler/disassembler as an easy way to do some things faster. Mix BASIC as the structured (sort of) vehicle, and ASM for the special things.

    What BASIC teaches: you can concentrate on the actual algorithm, since you can skip variable initialization and many other things. Many "better" languages need 50% or more of the code just for definitions and/or structure setup, and the actual algorithm is hidden somewhere in there. There are enough Hello World examples out there demonstrating that.

  15. Baudwalk

    You had me...

    ...until "line numbers good - files bad".

    Beginners should definitely not care about line numbers. And some sort of storage is needed.

    My favourite beginner platform for teaching to code is Makecode Arcade.

    Drag-and-drop code that isn't gimped but pretty fully functional.

    It does support multiple files, but that's well hidden at first. Just name your project and go.

    Sure, it's by Microsoft, but it's great for getting kids interested and the graphical coding concepts map directly to "real code" ideas. In fact, you can toggle between (a subset of) TypeScript and the graphical representation.

    1. Fruit and Nutcase Silver badge
      Mushroom

      Re: You had me...

      Beginners should definitely not care about line numbers

      or the number of leading spaces on a line

    2. HorseflySteve

      Re: You had me...

      @baudwalk I think that you missed the point Liam was making. Imagine a world without pocket calculators where you've never interacted with a minicomputer and you are sitting at a teletype which is waiting for a BASIC command.

      You type PRINT 2+6 & press the return button

      The paper feeds up a line and 8 is printed

      You have learned how to make it do something immediately.

      You type 10 PRINT 2+6 (return), then 20 GOTO 10 (return), then RUN, and you have learned (a) how to create a sequence of instructions, (b) how to execute that sequence, and (c) how to waste a lot of paper. You now also need to learn how to break out of the loop.

      You got the thing to do something without needing to start an editor you don't know how to use, save the file somewhere, compile it somehow, find where the compiled result is, and run it - only to discover that you accidentally typed PRONT instead of PRINT and all you get is ERROR 02 - SYNTAX.

      Remember, BASIC was intended as a tool that allowed a complete novice to learn how to make a computer do something useful with immediate feedback, and the line numbers were a brilliant method to distinguish between immediate and sequential execution.

      I don't have to imagine the scenario I described at the start as I lived it...


      1. Baudwalk

        Re: You had me...

        I think I got his point. I had a C=64 too 40 years ago.

        I just don't think the scenario is still valid, or at least not optimal, today, as we have better options for getting beginners started on coding today than was available back then.

      2. doublelayer Silver badge

        Re: You had me...

        I think we understand that it worked that way. I can even agree that, at the time, that made sense with how to do it. Most of the reasons no longer apply. For one thing, we have better error messages. If they typed "pront" in at the terminal, it still wasn't going to work, but now we'll write out the part that wasn't recognized so they aren't as confused. Whether that happens when they run the file or when they make a typo at the terminal, they're still going to have to learn that the machine won't fix your mistakes for you.

        In many ways, all the things you praise about BASIC are maintained by Python, possibly one reason why people keep using that as an introductory language. Sit down at a Python REPL and you can also issue one-line commands and see results. Instead of line numbers, you write your blocks:

        for i in [1,2,3]:
            print("Hello")

        The main difference is that you don't have to re-enter everything, because we've decided that storing your work to be edited at will is something you're too stupid to have right now. That made sense when there wasn't nonvolatile storage except for a tape, or when the editor couldn't be used freely because you had to wait for lines to be printed one by one onto paper. Neither applies anymore, and any student will have used free editing of files many times before they write their first code. The argument that we should go back to those things sounds like someone who thinks the way they did it must be the best way in the world, in which case I have to ask them why they weren't using the older methods that real programmers did in the 1950s and 1960s. Let me guess: those were superseded by better technology, but technology stopped getting better when you touched it.

        1. HorseflySteve

          @doublelayer I agree with all that you've said. However, my experience with early BASIC did help with my subsequent career which involved writing firmware for small microcontrollers in assembly language as well as designing hardware.

          For modern multitasking, multithreaded applications, especially those with a graphical UI, Python is a great language to learn on, though I don't like the significant indentation; it seems an unnecessarily easy way to introduce bugs by accidentally deleting a space or two...

          1. Anonymous Coward
            Anonymous Coward

            I usually use the BASIC argument to separate the wheat from the chaff when it comes to assessing the native intelligence of programmers.

            Anyone alive during the period when BASIC was seen as what it was, a simple learning language, knows how and why it was popular with the masses.

            Kids/people are trained from an early age on steps to take to do something, and lists are a common starting point.

            Early on, what do children do when they make a list of things that need to be done in a certain order?

            They write a 1 and the step, then a 2, etc, etc.

            Not difficult to grasp, and certainly a lot easier to follow than a page full of lines of text a la C with more sophisticated flow mechanics.

            So people who spout off about its stupidity do me a favor.

            The second test is the monkeys who rote-repeat Bjarne's famous Goto nonsense.

            Goto/Gosub are direct takes on the CPU's native JMP/JSR instructions. Without them, s/w at the lower level would be far, far harder to write.

            All the goto/gosub spaghetti-code complaints really highlighted was a failure at the teaching level: how to use them properly, and why using them improperly simply makes your coding work harder.
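            The JMP/JSR parallel is easy to see in a toy interpreter. Here is a hedged sketch (Python, purely illustrative; the opcode tuples and program format are invented for the example): GOSUB pushes a return address and jumps, exactly as JSR pushes onto the CPU stack, while GOTO is a bare jump like JMP.

            ```python
            # Toy BASIC-style dispatcher showing how GOTO maps to JMP and
            # GOSUB/RETURN map to JSR/RTS. Program format is made up for
            # this sketch: a list of (opcode, argument) tuples.

            def run(program):
                output = []
                pc = 0                       # program counter
                return_stack = []            # return addresses, like the CPU stack
                while pc < len(program):
                    op, arg = program[pc]
                    if op == "PRINT":
                        output.append(arg)
                        pc += 1
                    elif op == "GOTO":       # unconditional jump == JMP
                        pc = arg
                    elif op == "GOSUB":      # push return address, jump == JSR
                        return_stack.append(pc + 1)
                        pc = arg
                    elif op == "RETURN":     # pop return address == RTS
                        pc = return_stack.pop()
                    elif op == "END":
                        break
                return output

            # 0: GOSUB 3 / 1: PRINT "BACK" / 2: END / 3: PRINT "SUB" / 4: RETURN
            demo = [("GOSUB", 3), ("PRINT", "BACK"), ("END", None),
                    ("PRINT", "SUB"), ("RETURN", None)]
            print(run(demo))   # -> ['SUB', 'BACK']
            ```

            Used badly (jumps into the middle of other routines), the same mechanism produces spaghetti; used with the stack discipline above, it is just a subroutine call.
            
            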

            Considering the absolute mess C++ has been for multiple decades, the guy simply showed himself to be an arrogant prick whose masterpiece has to this day been subject to revision after revision to try to fix intrinsic issues.

            Wait,what? Sorry, what were we discussing?

            1. that one in the corner Silver badge

              > Considering the absolute mess C++ ...

              Ah ha, it is you again! Still haven't bothered reading TFA and looking at the names of who said what and when.

              > Wait,what? Sorry, what were we discussing?

              You were telling us where the nasty C++ compiler touched you.

        2. Jou (Mxyzptlk) Silver badge

          Re: You had me...

          > all the things you praise about BASIC are maintained by Python

          but added a few loopholes on top. Has anybody seen my space?

      3. Doctor Syntax Silver badge

        Re: You had me...

        Imagine a world without pocket calculators, where you've never interacted with a minicomputer, you've never sat at a teletype waiting for a BASIC command, and you're sitting at a desk with a pencil and a stack of coding sheets in front of you.

        That's where some of us started. What was worse, once the compiler (for that's what it was) had encountered PRONT, it would issue an error but then continue; as, after PRONT, it had no continuing context which made sense, it would issue an error on the next line and each succeeding line until it was knocked on the head by a counter which only allowed it to issue 100 errors or the like. Then you had a little while to correct it - if you were very lucky you'd be allowed to punch the card yourself - before the next deadline to put the job through again, followed by another two-hour wait.

        You either made very slow progress or you got fairly good very quickly. You also really appreciated every step towards a personal computer far more than anyone who started out later than that.

      4. Anonymous Coward
        Anonymous Coward

        Re: You had me...

        That's a long-winded excuse that still fails to explain much.

        BASIC is/was always intended to introduce students to the real-world, hands-on concepts of the then-'new' computing world becoming available. And I was there as well.

        The issue is that once you've plumbed the depths of BASIC, including peek/poking in your assembly language, congratulations!

        You've passed the initiation, and are now ready, Player 1, for the next level of interaction and control, with far more detail, which will include IDEs, compilers, etc. It's called skills progression.

        If yours and Liam's main complaint were about introducing kids to computing, you wouldn't be wrong that some simple, interpreted language like BASIC is an excellent introduction point.

        Problem is, you're still going to need to know some basics like files, directories, etc to save work in progress after week 1.

        We all did it without losing our minds in the late 70's, 80's, 90's.

        Granted, 50% of potentially good programmers probably dropped their first CS class and future careers when crap like C++ became the de facto CS starting point. If you'd argued that, I'd happily agree.

        But arguing that having to learn some domain-specific language and concepts is too much is inane; every job from plumber to roofer to sales has a similar DSL to learn if they want to succeed.

  16. Antony Shepherd

    First BASIC I used

    The 10K Microsoft BASIC ROMs which you plugged into the TANEX expansion card for the Tangerine Microtan 65 were the first BASIC I used.

    If at the "MEMORY SIZE?" prompt you typed A it said "WRITTEN BY WEILAND AND GATES".

    No idea who Weiland was.

    1. Phil O'Sophical Silver badge

      Re: First BASIC I used

      No idea who Weiland was.

      2nd person hired at MS, apparently: https://en.wikipedia.org/wiki/Ric_Weiland

    2. Neil Barnes Silver badge

      Re: First BASIC I used

      The Microtan system was a surprisingly good system, particularly in its ability to be expanded. At one time I was using a CP/M machine as data storage and a text editor for my Microtan 10K BASIC programs!

      (I see there are a few MT65 users still around here).

    3. ZX8301

      Re: First BASIC I used

      Weiland was the 6502 programmer. Gates wouldn’t touch the ‘brain damaged’ CPU (preferring the 8080, and thus showing little respect for addressing modes) but had a point about the 8-bit stack pointer. Hence Weiland’s 6502 ports were almost given away by Microsoft, who copied their BASIC from the one DEC bought in, right down to quirks like byte-length limited strings, the ambiguous RIGHT$ and the Cobolesque FIELD. Atari, Sinclair and ANSI stuck with the more general slicing of Dartmouth BASIC and string length limited only by RAM. C victims could learn useful lessons from BASIC’s string-handling.

      1. that one in the corner Silver badge

        Re: First BASIC I used

        > C victims could learn useful lessons from BASIC’s string-handling.

        Flippin' 'eck, not this canard again.

        C has types (structs, typedefs) and libraries, lots of libraries, which are integral to *using* the language (not part of the language proper, although you can *generally*, not always, get an initial set of "get you started with 'Hello World'" standardised libraries with the compiler).

        One subset of those My First C Libraries is a few routines that let you do basic manipulation on strings represented in the way the compiler presents the constant strings you put into the source code. After all, the compiler needs *some* way of presenting those to you, and there is no point in doing anything complicated. Because...

        Anyone who wants to do anything more exciting with strings in a C program can simply use a library that provides a representation that best fits their needs.

        I am quite fond of using structs for the (buffer, buffer length, how full now) representation. But I also like a one-codepoint-per-cons-cell approach when digging in and doing macro expansions: less space efficient, but constant-time insertions and deletions.
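        The (buffer, length, fill) idea above can be sketched in a few lines of C. This is a minimal illustration only; the `lstr` type and function names are invented for the example, not from any standard library:

        ```c
        #include <assert.h>
        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>

        /* A length-tracked string: one of many possible C representations. */
        typedef struct {
            char  *buf;   /* storage */
            size_t cap;   /* buffer capacity in bytes */
            size_t len;   /* bytes currently in use */
        } lstr;

        static lstr lstr_new(size_t cap) {
            lstr s = { malloc(cap), cap, 0 };
            return s;
        }

        static void lstr_append(lstr *s, const char *text) {
            size_t n = strlen(text);
            if (s->len + n > s->cap) {            /* grow the buffer as needed */
                s->cap = (s->len + n) * 2;
                s->buf = realloc(s->buf, s->cap);
            }
            memcpy(s->buf + s->len, text, n);     /* length is known: no scan for a NUL */
            s->len += n;
        }

        int main(void) {
            lstr s = lstr_new(8);
            lstr_append(&s, "hello, ");
            lstr_append(&s, "world");
            printf("%.*s (%zu bytes)\n", (int)s.len, s.buf, s.len);
            free(s.buf);
            return 0;
        }
        ```

        Because the length is stored rather than discovered by scanning for a terminating NUL, appends and length queries are cheap, which is exactly the trade-off being discussed.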

        The same goes for C++ by the way: the compiler uses the simplest possible form for constants, you get to choose which library to use for your code (many people use the STL).

        Any language that promotes the use of libraries can do the same, although more recent languages do promote a more complicated compiler/interpreter representation for constants - even though there is not, and can never be, One True Representation Of Strings that is Optimal In All Circumstances. The same goes for numbers as well, of course.

  17. Tron Silver badge

    I did comment the other day....

    ...that someone could write a modern (internet-capable, web-capable) BASIC for the Pi. Computing is just data in, data out, storage and processing. Any language can be configured for any system.

    I would point out that the implementation of the Sinclair BASIC editor was particularly good. Syntax errors were still possible, but the one-key commands were a delight and reduced mistakes. And you could easily call machine code routines from within BASIC.

    Back in the day, the abuse directed at BASIC came from the OOP lobby, who looked down upon everything that came before.

    BASIC taught a generation of kids how to think logically and actually do something with their computer. What followed was the most extraordinary development of technology we may have ever had, because next generation stuff was being produced and consumed, not by scientists or academics or the rich, but by the mass market. Ordinary people on their kitchen tables were taking consumer tech and making it do things nobody thought possible. Things that would become standard in consequence.

    And you can say that most languages since BASIC simply implement it differently.

    The complexity of modern systems is what makes them insecure and takes away their resilience. The bloat wastes resources. In some ways, we had to buy faster PCs to accommodate more bloated OSs and applications that did pretty much the same as previous ones. Now we are being told to buy AI PCs when we don't even want to use it. We need Pi PCs and we need a new, simpler, closer-to-the-silicon OS. Even Linux may be too bloated for its own good.

    1. doublelayer Silver badge

      Re: I did comment the other day....

      "someone could write a modern (internet-capable, web-capable) BASIC for the Pi."

      Of course someone could. You can make a language with any set of capabilities you want. The question is what you put in it and how you make certain things convenient and powerful. If it's not at least one, the language is doomed, and some languages get close enough to both that they'll get used unless yours does too. BASIC's structure doesn't lend itself well to scaling the number of options, because you either end up creating hundreds of builtin commands or you try to make all your functionality as a library and force it to work in the small subset of commands you're willing to include. That means that, if you want to be able to send TCP and UDP and ICMP packets, you're going to need to plan out a lot of syntax. That's not the biggest problem though. The biggest problem is that you likely also want to receive those, which means you're most likely to build a network stack in a different language that can handle this better and just abstract it out for the BASIC program. That works for students, but it's very different from the BASIC that let you drop to assembly when you wanted.

      "The complexity of modern systems is what makes them insecure and takes away their resilience."

      I disagree. That sometimes happens, but the simplicity of early systems often also made them insecure. Complexity means that interprocess communication mechanisms can have vulnerabilities that people didn't catch before they were put into production, but before we had those, inter-process communication was unlimited, meaning that those vulnerabilities couldn't exist because everyone was allowed to do that. Before that, inter-process communication was impossible because running multiple processes was impossible. So did that make us more secure? No, it did not, because it meant that any process running could do anything it wanted to any piece of data. For example, consider the ubiquitous DOS malware that copied itself onto boot sectors and autoexec files, which could not be prevented because there were no per-file security mechanisms and no monitoring while the process ran. Complexity didn't just add more features, although it definitely did that. It also made the structures necessary to have security in a world where not every line is trusted.

      1. that one in the corner Silver badge

        Re: I did comment the other day....

        >> The complexity of modern systems is what makes them insecure and takes away their resilience

        > I disagree. That sometimes happens, but the simplicity of early systems often also made them insecure.... consider the ubiquitous DOS...

        If we look back a year, to Liam's article on War of the workstations: How the lowest bidders shaped today's tech landscape, you have fallen into the trap that Liam was pointing out then: DOS - and, from it, all too much of what we put up with today - is *NOT* the epitome of computer design.

        If you go back just a tiny way beyond the creation of the - trivialised - "computer" that was the IBM PC (and, yes, every other microcomputer out there[1]) then you can find the more secure and resilient systems of yore.

        [1] The microprocessor was terribly clever, *but* when they were made available to the great unwashed it was still far too expensive to provide the full grown-up computer experience to everyone; we could, however, provide an ALU, a program counter and a few kilobytes of RAM at a price that a few really keen people could afford - think the IMSAI and the MITS Altair. We took decades to get much further than that, other than simple bulk, as RAM prices (so very slowly at first...) dropped.

        1. doublelayer Silver badge

          Re: I did comment the other day....

          My argument did not rely on DOS being the epitome of anything. It applies equally well to any of the other operating systems that had no security in their design. A lot of the operating systems of the time, especially on the home computers that this article and the comments are talking about, were exactly the same. Mac OS did it. OS/2 did it. RiscOS, BeOS, AmigaOS, all of them did it. This means that, when comparing any of those systems to modern ones, they'll always come up short when comparing their security chops because zero is an easy number to beat.

          But your point is correct: not every operating system was like that. Some of them did implement inter-process security. They did not do it perfectly. Some of the places where they had vulnerabilities were due to things that cause modern ones: coding mistakes, resource limits, or just making it easy for admins to mess up. Others were a result of the restrictive design, which is less of a factor with modern operating systems. Their simple design did prevent some classes of vulnerability, but not because the design handled it, but because the features that are necessary to have the vulnerability weren't available. It's similar to saying that my house with no windows is more secure than your house with windows because you can't throw a rock through my nonexistent windows. That is true as far as it goes, but by being able to look through your windows, you are more able to see and respond to possible threats. A system with no networking wasn't vulnerable to attacks over that connection, but it's not because their code was any better.

    2. Liam Proven (Written by Reg staff) Silver badge

      Re: I did comment the other day....

      > ...that someone could write a modern (internet-capable, web-capable) BASIC for the Pi.

      I agree.

      There was the beginning of the germ of such an idea:

      https://www.riscosopen.org/wiki/documentation/show/Software%20information:%20RaspberryPi:%20RISC%20OS%20Pico%20RC5

      1. graemep

        Re: I did comment the other day....

        There is at least one modern BASIC available for the Raspberry Pi - Gambas.

        However, it, like other modern BASIC variants, has dropped line numbers.

        1. werdsmith Silver badge

          Re: I did comment the other day....

          You can use the very simple single-core Raspberry Pi Zero, run RISC OS on it, and use BASIC all you want.

    3. that one in the corner Silver badge

      Re: I did comment the other day....

      > Back in the day, the abuse directed at BASIC came from the OOP lobby, who looked down upon everything that came before.

      What, the guys who created SIMULA (started in 1962) were being abusive about BASIC, which debuted in, um, 1964?

      Unless, by "OOP lobby", you mean the pile of people who didn't actually create OOP but just built a bandwagon so that they could shout how clever they all were (which is just something that keeps on happening in the tech world and actually had nothing to do with OOP qua OOP).

  18. heyrick Silver badge
    Happy

    BBC BASIC is still around today, maintained by Richard Russell

    Actual BBC BASIC, as in the one created by Sophie Wilson (who wrote the original 6502 version), is still around today - in RISC OS.

    Couple that with Russell's versions and there's plenty of BASIC to go around.

  19. ComicalEngineer

    1987 - 1989 I was working at a university on a UK government program to introduce computer aids to the batch chemical industry. We were writing models to simulate what happened in a batch chemical reactor. The main software we used was Mitchell & Gauthier's ACSL (Advanced Continuous Simulation Language), which compiled the Fortran 77 source code into machine code and then executed the model. But, as I have alluded to previously, you could buy a Ford Fiesta or Vauxhall Nova for the price of an IBM PC (10MB HDD, 4-colour graphics - luxury). We shared 2 PCs between 4 of us.

    BBC Basic is, however, very similar to F77 and includes use of subroutines, although there are minor differences in the way variables are passed in and out of the subroutine. Thus we were able to develop the routines (heat transfer, chemical reaction, distillation etc) on cheaply available BBCs before porting the code to the PC.

    We even had a program that did graphical output on the BBC screen which could be printed. Some of our programs were sufficiently complex that they took 8 - 12 hours to run on a BBC B with maths co-processor. That wasn't a problem as we had access to a dozen BBCs in the computer lab (out of university term time) and so would set half a dozen of them running on variations of the same program.

    In any case the 5-5 rule applies: if it takes less than 5 seconds, you can wait; if it takes more than 5 minutes, you can get a coffee or do something else useful whilst waiting.

    In the end we had developed a series of modules which could successfully model a chemical reactor with heating / cooling, distillation and simultaneous removal of distillation products. The model used a Runge-Kutta 4th order solution of a number of simultaneous nonlinear differential equations. Some pharmaceuticals in common use today used our models to simulate and develop the process.
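    For anyone unfamiliar with the method mentioned above, a single classic 4th-order Runge-Kutta step looks like the sketch below. This is a generic illustration of the algorithm only, not the reactor models described (the `rk4_step` and `growth` names are invented for the example):

    ```c
    #include <math.h>
    #include <stdio.h>

    /* One classic RK4 step for dy/dt = f(t, y): four slope samples,
       combined with weights 1, 2, 2, 1. */
    static double rk4_step(double (*f)(double, double),
                           double t, double y, double h) {
        double k1 = f(t, y);
        double k2 = f(t + h / 2, y + h * k1 / 2);
        double k3 = f(t + h / 2, y + h * k2 / 2);
        double k4 = f(t + h, y + h * k3);
        return y + h * (k1 + 2 * k2 + 2 * k3 + k4) / 6;
    }

    /* Example ODE: dy/dt = y, whose exact solution is y = e^t. */
    static double growth(double t, double y) { (void)t; return y; }

    int main(void) {
        double y = 1.0, t = 0.0, h = 0.1;
        for (int i = 0; i < 10; i++, t += h)   /* integrate from t=0 to t=1 */
            y = rk4_step(growth, t, y, h);
        printf("y(1) ~= %f, exact e = %f\n", y, exp(1.0));
        return 0;
    }
    ```

    For a system of simultaneous equations, as in the reactor models, `y` becomes a vector and the same four-stage pattern is applied component-wise.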

    My personal opinion is that there is little wrong with a decent implementation of BASIC (such as BBC) for a range of computer problems, especially where mathematical modelling is required.

    1. steelpillow Silver badge
      Joke

      Did you ever try controlling a nuclear power station with a ZX Spectrum?

      1. seldom

        3 Mile Island

        It wasn't my Sinclair Spectrum, I didn't write the programme, and anyway, why would anyone mix up US, imperial and metric in the valve control systems?

      2. HorseflySteve

        "Did you ever try controlling a nuclear power station with a ZX Spectrum?"

        No, but I did make a concept demonstrator of a car sensor system tester that plugged into the wiring harness of a Rover Montego in place of the Lucas Hot-Wire EFI ECU, performed 13 real tests on the sensors in a guided sequence, and produced a diagnostic printout on an OKI Microline 182.

        It was shown to various people @ Rover group, including the chairman of the board and resulted in a contract for the creation of the real thing, called COBEST, from which the company made an awful lot of money!

        This was 10 years before OBD existed.

        1. HorseflySteve

          COBEST clarification

          Just to note that the product I was talking about has nothing to do with any company, group or product that is currently using that name.

          This was the mid to late 1980s and the product name was an acronym of Crypton On-Board Electronic Systems Tester, Crypton being a well known UK brand of garage equipment (now owned by Continental AG)

      3. Rob Daglish

        Having been in the control room of a nuclear reactor... nothing that advanced was in use. A well-known facility near me was controlled and monitored by some very clever pneumatic, hydraulic and electromechanical systems, and was consequently pretty much immune to EMP - a side effect of being designed in the 1950s, when computer control wasn't really a thing!

    2. martinusher Silver badge

      The IBM-PC is really just an Apple ][ built with Intel chips that wasn't as flimsy as most small systems were at the time. Having "IBM" behind it held a promise of standardization and so potential future proofing. From a performance perspective the PC running a 4.7MHz processor with 56KByte of RAM was about the same performance level (at best) as a BBC Micro.

      Software -- languages -- have always suffered from a bit of snobbery. It seems that unless you're using 'the latest' in whatever language is fashionable then you're some kind of subhuman, barely worth the time of day let alone be talked to by a person**. I've always had a more pragmatic view of programming -- there are languages I prefer to use, languages that I think are good for teaching but overall I tend to use what's available. Original BASIC was pretty naff, that's true, but by the time it appeared in PCs and the like it was half-decent and only continued to improve. The real problem with it is common to all programming -- a perfect storm of clever people solving problems using tools and techniques not intended for, or up to, the task invariably results in frightful code.

      (**Ever tried calling support from a language vendor in the early to mid 80s?)

      1. that one in the corner Silver badge

        > Software -- languages -- have always suffered from a bit of snobbery. It seems that unless you're using 'the latest' in whatever language is fashionable then you're some kind of subhuman, barely worth the time of day let alone be talked to by a person

        Curiously, I have never seen that sort of faddism when talking to people who actually created code that worked, worked well and could be respected.

        On the other hand, the attitude of the consultants who insisted on shoe-horning a Python interpreter into an otherwise C & C++[2] embedded system because "that will make it configurable"[1]...

        [1] You know, INI files are pretty good at that, or - if and only if you need the extra complexity - there are a range of other simple declarative formats, with well-proven parsers.

        [2] And their C++ was very faddy as well; shame it didn't run very well.

  20. Anonymous Coward
    Anonymous Coward

    My very first commercial programming task back in the '70s involved using BASIC; a version that only allowed two-character variable names ( letter followed by digit, IIRC - we had a paper sheet with a grid on it so we could tick off which names we'd used ). Oh, and the RETURN statement took an optional numeric value which set how many lines before/after the matching call it should return to ( so a bit like a GOTO, but without the clarity ). Also the system imposed a strict size limit on the code text, including the comments. Oddly, I was never a BASIC fan after that

  21. heyrick Silver badge

    English is one of the easiest human languages

    Oh really?!?

    [rolls up sleeves]

    I live and work in France and a topic of discussion from time to time is about how awful English can be. I find it easy as I grew up speaking it (and by consequence I really can't get my head around all this gender crap, it's a book, it's a thing, it's not masculine or feminine, it's flattened bits of squashed tree flesh with ink splatter in various patterns that magically impart information).

    English grammar is a mess riddled with exceptions to the rule.

    English is a stress-timed language (that's how Shakespearean stuff works), which can be very confusing for somebody from a syllable-timed language, and there are plenty of arcane rules about where to put the stress. CONtract (the noun) is not the same word as conTRACT (the verb), for example.

    To ram this home, say this sentence seven times, and each time put the emphasis on the next word along: I never said she stole my money. You've just said seven different things.

    Pronunciation is a mess. It's not just that there are words that look identical that are completely different (like "wind" (that which blows) and "wind" (what you'd do to fishing line)), even certain letter sequences are batshit crazy, like "ough" (rough, thought, through, though, etc etc). On the flip side, "wine" and "whine" sound identical but are two completely different, unrelated, words.

    And good luck explaining to a foreigner words like "choir". Really, English is a language crying out for a whole pile of accents so people can look at a word and know how to say it.

    Speaking of nonsense words: Queue. That is all.

    But this wouldn't help as, thanks to the widespread use of English and people's regional variations, there are hundreds of similar but notably different ways to say the same things. Hell, in England there can be a dozen entirely different accents in a hundred mile radius.

    Oh, and this means that the same words can be pronounced in entirely unexpected and interesting ways because of these differences. My mother, born and raised in Maryland, found out the hard way that "Woking" and "Basingstoke" are not said like she expected (for non-Brits: woe-king and bay-zing-stow-k).

    We shall politely gloss over the extra Us in English words like "colour", and the use of "s" for a "z" sound, like "realise", and the missing words in American English, such that "write me" is a complete sentence. But we shall take a moment to consider the startling array of sounds that an English speaker needs to master in order to speak good English. Nobody at work, and there are like a hundred and fifty people, is capable of saying my name correctly. It's just a bunch of sounds that literally don't exist in French. Related to this, a cow-orker was annoyed and decided to swear in English...but the best she could do was "oh sheet". Or should that be "oh sheat"?

    Word order is always a fun one. While there are many variations in the world's languages about which way to put the subject, the object, and the verb; English has rules and is quite happy to break them because screw you.

    English also has a long long long history of pillaging words and grammatical prefixes from wherever and whenever. This gives us complete bullshit like flammable and inflammable meaning the same thing, while visible and invisible are complete opposites.

    Lots of homophones to make things harder to understand. You might see the word "date". You'll need context to know whether it's a day, a romantic outing, some sort of fruit, or a vague concept of time (like carbon dated), or just referring to something old fashioned.

    Sometimes one needs to use "that that" in a sentence and even though it is correct it just looks peculiar.

    Adjectives... Oh my. In order to speak "correct" English, there's a big list of what adjective goes where in relation to the rest. It's why we might talk about the small round plastic mirror, but not the round plastic small mirror.

    Mouse pluralises to mice, but "house" pluralises to "houses". It's regional (and possibly old fashioned) that roof pluralises to rooves instead of roofs.

    If you feel proper chuffed that you made it this far, it doesn't matter whether you're pleased or annoyed. Chuffed is a contronym, so it'll happily mean whichever you want.

    Oh, and one final thing about English. Always remember, it's i before e except after c... right? right?

    .

    I'm not familiar enough with languages to know what would be an easy language to learn, but it's not English. The only benefit that English has is that so many people speak so many variations (some more broken than others) that English speaking people generally understand even somewhat messed up English.

    As opposed to French where I (living and working in Brittany) had the bizarre experience of watching my cow-orkers completely fail to understand a girl from Toulouse. Yes, her French sounded different enough that I noticed it was different, but it wasn't that different. So there's me, qui parle comme une vache espagnol, translating between the two. A bloody Brit interpreting a Frenchie for another Frenchie. What. The. Actual.....

    I'll leave you with this: "tittynope" is a word. You're welcome.

    1. Neil Barnes Silver badge

      Re: English is one of the easiest human languages

      English is (at a guess) about sixty percent Germanic, thirty percent French, and the remainder odd bits of Celtic, Old English, Scandinavian, and imports from various bits of the British Empire (as was). It's picked up spelling and grammar choices from all of them, and that's probably why there's such a confusion. Learning German as an English-native pensioner is an, um, interesting challenge. All those words hanging around agreeing with each other and having sex with each other all the time...

      English is of course remarkably easy to speak, if you happen to have been born in the UK (USAin is a different language in many respects, thanks to Noah Webster's changes and different immigration patterns over the years). When I was at school in the sixties and seventies, English was not taught, in spite of there being classes entitled 'English Language' and 'English Literature'. You were just exposed to it and basically expected to pick it up as you went along. I don't know if this has changed in recent decades...

      The nice thing about it is that although it has 'rules', of a sort, they can largely be ignored while still allowing comprehension. After all, English has something over a million words to choose from; one of them must be right.

      It has been said that most languages borrow from other languages, but that English follows other languages into dark alleys and beats them over the head, rifling their pockets for anything that seems useful...

      1. steelpillow Silver badge

        Re: English is one of the easiest human languages

        A significant amount of English has Latin and Greek origins. Besides the church Latin of the medieval period, which we also imported indirectly by way of French, the Victorians tried hard to crush English grammar into the Latin model, and promoted many words deriving from Classical vocabulary and spelling. Much of that still hangs around today, in one form or another.

        Old English was already a mashup of Celtic, old Norse (aka Scandinavian), Saxon/Old Frisian (aka Germanic), and the Dwarvish runes stolen from an early draft of The Lord of the Rings.

        1. werdsmith Silver badge

          Re: English is one of the easiest human languages

          It’s fascinating to see English being changed by social media, because people repeat what they read and “incorrects” are becoming correct.

          Talking of corrects

          “qui parle comme une vache espagnol”

          Needs an extra e to be gender consistent.

          1. that one in the corner Silver badge

            Re: English is one of the easiest human languages

            > because people repeat what they read

            Repeat, without attempting to think if what they are saying makes basic sense as a sequence of words (looking at you, "I could care less").

    2. Liam Proven (Written by Reg staff) Silver badge

      Re: English is one of the easiest human languages

      > English is one of the easiest human languages

      > Oh really?!?

      All I will say is this...

      I'm a qualified TESOL teacher and have done it professionally.

      I'm also a hobby linguist and can communicate successfully, if poorly, in 6 foreign languages. I've studied 4 more that I can't usefully speak.

      I stand by what I wrote.

      1. doublelayer Silver badge

        Re: English is one of the easiest human languages

        Could you explain what you wrote, though? In addition to all the valid criticisms of English, it seems to be missing the point that English wasn't used for the sake of simplicity, whether it has it or not. It was used because the people who wrote the language had English as a main language and were planning to have the software used by people who had English as a main language. Soviet programming languages used Russian as their main language, not because Russian is simple (anyone who tries to spell things will find that it's not, not that English should get many orthography points either), but because everyone involved spoke Russian, either because they were Russian or because they had to learn it in order to get to the places where the computers were available.

        When dealing with a language like BASIC, language choice is actually quite unimportant. The small number of keywords means that you could use a language you don't speak because memorizing the keywords won't take very long. This is, for example, why all the words commonly used in music are Italian. You don't need to learn to speak Italian to memorize maybe fifty of them, and that's only if you study it for a few years. Modern languages, with lots more library functions, do need to pick a language that the users understand, and English was chosen, again not for simplicity's sake, but because most of the people using and making it already spoke it to some extent and no other language met those criteria.

        1. Liam Proven (Written by Reg staff) Silver badge

          Re: English is one of the easiest human languages

          > Could you explain what you wrote, though?

          Here is the bit I edited out while writing this:

          You are talking to it in words, words which resemble English because that is one of the simplest world languages when you think about scripts -- writing systems -- as well as sounds. Hangeul is easier but only works for Korean which is harder than English. Grammatically Chinese is simpler, but spoken Chinese has tones which are very hard, and written Chinese is insane. Typed Cyrillic is no harder but handwritten gets weird and complicated and Russian is much harder than English. And so on.

          1. This post has been deleted by its author

            1. doublelayer Silver badge

              Re: English is one of the easiest human languages

              "I also think that agrees with doublelayer's view (and others'), reading between his lines (my bad if wrong) that English is in part, widely spoken for the same reason - its ease of use."

              Sorry, it seems we did misunderstand one another a bit. I don't think English is used for its simplicity. I think English was used in BASIC because it was written by people in the US and UK to be sold to people in the US and UK. If there had been a much more straightforward language, it still wouldn't have been used because English is what they spoke. If people were building the early microcomputers on which BASIC would be used in Argentina and planning to sell there, then the language used would be Spanish and people would have lived with it.

              English does have several simpler grammatical structures that make it easier to use and understand than other languages. I agree that simplicity is one of the reasons that English is popular in the world today, though to some extent I think the simplicity is the result rather than the cause of that global success. However, I don't think that explains most of it. Most of the popular languages are popular regardless of their complexity. English, Spanish, Arabic, and French are four out of the top six languages by number of speakers because those are languages that spread all over the place and were held there for long enough that many had to learn it. Chinese and Hindi are to some extent an exception, but they had periods of forcing others to learn it as well. Simpler languages failed to achieve success because they had fewer people forcing others to speak the language, fewer incentives for learning it voluntarily, and fewer people expanding it to be more versatile.

              1. This post has been deleted by its author

              2. This post has been deleted by its author

              3. Doctor Syntax Silver badge

                Re: English is one of the easiest human languages

                "I think English was used in BASIC because it was written by people in the US and UK to be sold to people in the US and UK."

                A little more indirection might be necessary. It was more likely to have been because English was used in FORTRAN & COBOL but used there for the reasons you give.

                It becomes a self-sustaining process. People whose native language is not English are likely to create programming languages that use English words, simply because the programming languages they and their potential users learned, possibly even BASIC, also use English words.

                1. This post has been deleted by its author

                  1. Jou (Mxyzptlk) Silver badge

                    Re: English is one of the easiest human languages

                    > Considering the upswing of BRICS will they "disown" English as the... goto (soz!) language?

                    They will and do and have done so for decades. It is the practical, pragmatic and efficient way. The comments and variable names are different, but the code is not. Assembled machine code has no human language. Wasting resources on such a political agenda is economic nonsense, and narrow-minded.

                    1. This post has been deleted by its author

                      1. This post has been deleted by its author

                      2. doublelayer Silver badge

                        Re: English is one of the easiest human languages

                        My guess is that it will still be English for a long time. The people designing these things have seen lots of examples, and those examples were in English. If they're working internationally, the lingua franca they're going to use is English. For example, the Loongson architecture, which should theoretically be easy to do in Chinese because it's only developed there, still uses English-style mnemonics. You could argue that they copied a lot of that from MIPS, and since MIPS used English maybe that explains it, but I bet it was considered easier to use Latin letters to avoid anything weird with encoding of Chinese characters, and if you're going to use Latin letters and you speak English, why not use ones that make sense there?

          2. Dan 55 Silver badge

            Re: English is one of the easiest human languages

            Using a bunch of English infinitives for commands in a programming language is easy, as it would be for any language. Now if we turn to page 437 of the documentation, with its irregular conjugation and phrasal verbs, that could be more challenging for the non-English speaker.

            And not even English people can get the pronunciation right - how do you say sudo? It shouldn't rhyme with Nintendo.

      2. steelpillow Silver badge

        Re: All I will say is this...

        I am a professional Technical Author, qualified in Business French, and have translated various minor French texts into English. Etc.

        STE - Simplified Technical English - is defined in one giant cockroach-crusher (ASD-STE100. Joy unbounded) yet barely resembles most of what we techies all say to each other.

        French is a significantly more minimalist language, with both simpler grammar and smaller vocabulary, carefully if pompously regulated by the Academie Francaise. It doesn't need to be simplified.

        1. Doctor Syntax Silver badge

          Re: All I will say is this...

          "French is a significantly more minimalist language, with both simpler grammar and smaller vocabulary"

          OTOH it suffers from serious inflectitis* and has to borrow vocabulary from other languages for new concepts until the Academie Francaise gets round to dealing with the matter.

          * But at least it handles 2nd person better. English has removed the intimate tu/thee.

      3. F. Frederick Skitty Silver badge

        Re: English is one of the easiest human languages

        You're still wrong. Try learning a consistent and well structured language like Finnish or Estonian.

      4. LikeAStone

        Re: English is one of the easiest human languages

        You are not only saying that English is one of the easiest human languages*, you are claiming causation here. You claim that is _the_ reason most programming languages use English characters and words. How about the bias of the people who designed the languages/computers?

        I'm sorry, but that is one of the most outlandish paragraphs I have read here.

        * OK, there are about 7000 human languages, and between 200 and 400 alphabets. If you meant "if English is the 99th easiest language to read and write then the assertion is correct", I'd say that seems reasonable. In other words, it all depends on your sampling.

    3. seldom

      Re: English is one of the easiest human languages

      I have to confess that I'm very curious. Is "tittynope" English, French or just a word that wards females from Gérard Depardieu? And what does it mean?

      Chuffed is always positive.

      P.S.

      You completely forgot the measurements.

      1. Jou (Mxyzptlk) Silver badge

        Re: English is one of the easiest human languages

        Aw, come on, just a copy paste into a search engine away. https://en.wiktionary.org/wiki/tittynope

        A typical local word used in one region, which somehow escaped and then gets used in books as a synonym, simply 'cause the writer liked the sound of it.

      2. Rob Daglish

        Re: English is one of the easiest human languages

        Chuffed is, in my experience at least, always positive, although you can be dischuffed at something.

        1. This post has been deleted by its author

    4. JRStern Bronze badge

      Re: English is one of the easiest human languages

      Thanks. Enjoyed that. But the French can't even count to a hundred without invoking Roman numerals or something.

      I just happened to trip across some YouTube video the other day that listed all the features English lacks, for better or worse, compared with other world languages.

      People, huh.

    5. This post has been deleted by its author

    6. Jou (Mxyzptlk) Silver badge

      Re: English is one of the easiest human languages

      I totally agree on English being more and more confusing. It is very simple at the start, you learn it, you love its first perceived simplicity. And after > 30 years of heavy usage I know so many cases of illogical nonsense, and you cannot unsee it once you start to notice.

      Typical and most prominent example: why is "i" spoken as "ai" or "i", and why is "e" spoken as "i" and "e"? "Wikipedia" = i. "Wide" = ai. "Tissue" = i, "Tiger" = ai, "Tin" = i. "Write" = ai, "written" = i... Tons of stuff out there. I have to use dict.cc regularly just to hear how a word is spoken; the last example was "precipice", used in "The Windup Girl" (https://en.wikipedia.org/wiki/The_Windup_Girl). Until then I used abyss, pit, chasm, edge and so on - no chance of knowing how it is spoken until you hear it and go "huh?".

      Why is it "Sai-Fai" when it is short for "Science Fiction"? It should be "Sai-Fi"... same with "HiFi". Why is "w" a "dabbl-you" instead of "dabbl-v" when it is written as "VV" and not as "UU"? (Though maybe it was written "UU" a few hundred years ago?)

      But French is not an easy language either. The connection between written and spoken is quite often even more out of whack. And there is that French pride which must force-French-ize words that we Germans simply take over 1:1 from other languages, usually including the pronunciation. Oh, and German is not easy either. There is a reason why most non-German speakers do not get the wordplay in "Du hast" from Rammstein: "hast" ("have") is not "hasst" ("hate"), but it is deliberately sung in a way that creates that confusion.

    7. Rob Daglish

      Re: English is one of the easiest human languages

      A problem not confined to the French... My (English) mother in law once acted as a translator in Germany between a Glaswegian coach driver, and the cockney mechanic who came to repair the coach when it broke down.

      1. Doctor Syntax Silver badge

        Re: English is one of the easiest human languages

        My wife had to translate between her Co Antrim father and me, although I'd have thought that a few years in London would have ameliorated the Yorkshire accent.

        1. This post has been deleted by its author

    8. Andy Landy

      Re: English is one of the easiest human languages

      Always remember, it's i before e except after c... right? right?

      Except for weird, science, caffeine, sovereign and a couple dozen others. It's a really simple rule as any fule kno!

      1. Neil Barnes Silver badge
        Headmaster

        Re: English is one of the easiest human languages

        I think I've read that there are slightly more 'ei' words than 'ie' in English. I speculate that it is an introduction from German, which is at least consistent in its pronunciation of that pair...

        Trying to learn German has completely buggered my English spelling in the matter of 'ei' and 'ie', and it usually takes me two goes at them nowadays.

    9. that one in the corner Silver badge
      Joke

      Re: English is one of the easiest human languages

      English is The Language to use for BASIC (and COBOL, and FORTRAN and LISP[1] before that) because it can be fit into 7-bit ASCII.

      Icon: only half joking

      [1] if you can call CAR, CDR and SETQ "English"!

      1. Jou (Mxyzptlk) Silver badge

        Re: English is one of the easiest human languages

        No, you are probably more right than you want to be... six bits would be enough, but they made it with five bits (plus a shift code when switching to numbers)... Albeit that started back in the telegraph era, it was kept for a very long time in the computing world. Maybe a bit extended, but still five bits for a long time.

        1. Neil Barnes Silver badge

          Re: English is one of the easiest human languages

          How common was EBCDIC (or its six-bit predecessor) compared to ASCII at the time Basic was written?

      2. Anonymous Coward
        Anonymous Coward

        Re: English is one of the easiest human languages

        >> [1] if you can call CAR, CDR and SETQ "English"!

        I once saw some Lisp written in Chinese characters and they used the characters for 'head' and 'tail' for 'car' and 'cdr', which one could argue is much clearer.

    10. Anonymous Coward
      Anonymous Coward

      Re: English is one of the easiest human languages

      >> Always remember, it's i before e except after c... right? right?

      It's :-

      i before e

      except after c

      when the sound is ee

      which works for :-

      field

      receive

      height

      Although there are a small number of exceptions...

  22. JRStern Bronze badge

    Basic will rise again!

    Hey, so Basic. IBM released the BC compiler, I think it was (I may still have the box up in a closet), circa 1985, to compile many/most GWBasic programs, and I made a good gig from it, speeding up execution by about 100x on that speed demon of a 4.77 MHz PC.

    Quickbasic came out somewhat later, interactive editing and compile-speed execution. Not bad.

    And then Visual Basic, that Microsoft bought from what's his name (Alan Cooper?), was only the best app development product Microsoft ever had - for Win apps. They could bring it back today, though it would have to be retargeted for web apps as well. They should certainly have kept it running in parallel with the insipid dotnet Visual Basic, whatever the exact name is, or was. But Microsoft had made some blunders in non-standard "object" features and they just washed their hands of it.

    You could easily extend it to use statistical, math, and AI packages like people do with Python, but keep simpler syntax.

    In fact I shall predict: Basic will rise again!

    1. This post has been deleted by its author

      1. Neil Barnes Silver badge

        Re: Basic will rise again!

        I wonder how much of 'basic is 'orrible' was down to the differences between dialects, rather than the language itself?

        And that the core of the language had so little structure: for/next, if/then, gosub/return, (cough) goto. Though that's enough to write complex code, of course, the claims that one dialect was better than another seem to be down to extensions to the language - particularly sound and graphics commands - which on a {} language might be expected to be handled with a library.

        (Declaration: last month I wrote a simple BASIC, in C. I'm currently porting it to the 65C02 for a single-board computer, with no sound and no pictures, so it's pretty simple.)

        1. This post has been deleted by its author

          1. Neil Barnes Silver badge

            Re: Basic will rise again!

            I disliked Basic for its insistence on all uppercase, which to my eyes made the screens unreadable. The first change I made to the MS 10k basic on my Microtan was to make the keywords lower case. Though to be fair, on many of the domestic systems at the time, lower case was simply not available to display.

            FORD=STOP. Hmm. That was often quoted as a how-not-to-do-it example, but when you had tiny memories, people squeezed out all they could. And while spaces vastly improve legibility, they're around 20% of the text. In general, whitespace is your friend.

            My progress is being documented on 6502.org; the (tiny) BASIC has for/next, if/then, do/while, gosub/return, and of course goto. All recursively called to allow proper nesting. And a whole 26 signed 16-bit variables!

            1. This post has been deleted by its author

            2. Persona Silver badge

              Re: Basic will rise again!

              I learnt BASIC at school and typed my first program onto a Model 33 teleprinter at the local technical college. Don't knock it: UPPERCASE was all we had back then. The teletype had a paper tape punch.

              I took my rolled up paper tape from the terminal room to the computer room and gave it to the white-coat-wearing "almighty operator". Without uttering a word he took my tape and fed it into the reader on the mainframe, then wound it up on a winding machine and handed it back to me. All the time I was expecting him to press some button to run my program and the line printer to burst into life. Instead he said "Parity Error", turned his back and I was dismissed. Being a teenager with no computer education I had no idea what those two words meant. It's amazing I persisted and went on to work with computers. I can only attribute it to the technical college also having a "glass terminal" with a 300-baud modem link to the Open University computer that had a lunar lander game.

          2. HorseflySteve

            Re: Basic will rise again!

            "FORD=1TO10"

            Data General TS BASIC was like that. It recognised its keywords & ignored whitespace so

            FORK = 1 TON STEPH

            was syntactically valid
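This keyword-greedy, whitespace-blind scanning can be sketched in a few lines of Python. This is a hypothetical toy tokenizer, not the actual Data General (or Microsoft) scanner, and the keyword list is illustrative:

```python
# Toy sketch of a whitespace-blind BASIC scanner: strip all spaces,
# then greedily match keywords at each position, treating anything
# else as a single-character variable, digit or operator.
# Hypothetical illustration only, not a real BASIC implementation.

KEYWORDS = ["FOR", "TO", "STEP", "NEXT", "IF", "THEN", "GOTO", "PRINT"]

def tokenize(line):
    line = line.replace(" ", "")       # whitespace is discarded up front
    tokens, i = [], 0
    while i < len(line):
        for kw in KEYWORDS:            # try each keyword at this position
            if line.startswith(kw, i):
                tokens.append(kw)
                i += len(kw)
                break
        else:                          # no keyword matched: take one char
            tokens.append(line[i])
            i += 1
    return tokens

print(tokenize("FORK = 1 TON STEPH"))  # ['FOR', 'K', '=', '1', 'TO', 'N', 'STEP', 'H']
print(tokenize("FORD=STOP"))           # ['FOR', 'D', '=', 'S', 'TO', 'P']
```

Which is also exactly why FORD=STOP parses as FOR D = S TO P.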

            1. JRStern Bronze badge

              Re: Basic will rise again!

              +1 for Data General!

              though not particularly for that version of Basic

              DG worked on a couple of much improved versions, but I think none was ever released ...

          3. sw guy

            Re: Basic will rise again!

            The "spaces do not matter" quirk comes from FORTRAN

        2. JRStern Bronze badge

          Re: Basic will rise again!

          --

          I wonder how much of 'basic is 'orrible' was down to the differences between dialects, rather than the language itself?

          --

          Indeed.

          I actually wrote a fairly massive amount of code on HP Basic for the HP3000 circa 1980, longest variable name was A1$.

          SMH

          Those were the days.

    2. Tim99 Silver badge
      Windows

      Re: Basic will rise again!

      I used QuickBASIC a lot in the mid-late 1980s to connect technical equipment to PC networks. It could easily open a serial port, get data from the equipment, manipulate the data, save it to file and then write back to clear the downloaded data. Similarly setting up an operational sequence from a database avoided a lot of typing on the equipment (often fitted with a rudimentary membrane keyboard/screen).

      A lot of mission critical stuff for public utilities used it - Some of which was converted to QBasic to run on NT4 and Windows95/98/Me. After I had retired, a younger friend was asked to oversee the transition of older equipment to PCs for a water company - Everything had to run on Windows 7. Much of their kit was Vista, with some XP/2000. He mentioned to me that a lot of equipment was written with a "funny MS language" that his team of (.NET?) "children" had not seen before. I suggested that it was probably QuickBASIC, and pointed him towards relevant manuals and training literature. Apparently some very expensive equipment took months to transition...

      1. Mage Silver badge

        Re: Basic will rise again!

        Forth was far better than Quick Basic for I/O.

        1. Tim99 Silver badge

          Re: Basic will rise again!

          Probably true. The user base was mostly science and engineering but many older staff had not used computers, or were used to PDPs etc. I, like some of my contemporaries, came from FORTRAN. Many of the instruments could use BASIC, so we had some background and had the resources to use/teach BASIC. We were standardizing on MS/PC DOS based networking to link everything together. For political, as well as operational reasons, Forth was never going to happen.

    3. KeepItSimple
      Thumb Up

      Re: Basic will rise again!

      BASIC never left, it's just been sleeping.

      If you want to experience a top-flight "boot to BASIC" computer then look no further than MMBasic running on a Raspberry Pi Pico. You can download MMBasic (and its excellent, over 200 page manual) for free, and it will run happily on an original Pico or clone. It will run even better on a Pico 2. :)

      You can access MMBasic's console interface via a terminal emulator (Tera Term works well) over its USB port.

      The "embedded" version - "PicoMite" - supports TFT LCD displays.

      The "PicoMite VGA" version supports 16-colour VGA using a simple 4-resistor interface.

      The "PicoMite HDMI" version (Pico 2 only) is pretty self-explanatory. It uses the new HSTX peripheral in the RP2350.

      The "Webmite" (Pico W only) supports local WiFi network connections.

      PS/2 keyboards can be used. It is also possible, with some versions, to connect the console via a USB-TTL converter and run the on-board USB socket in host mode. A hub can then be used to attach USB keyboards, mice and game controllers.

      There are a lot of commands for hardware support in MMBasic. The user can even write programs for the Pico's PIO modules using built-in commands. There are options for several different audio systems, RTC chips etc.

      MMBasic itself stores the user program in flash so it isn't lost when you switch off. There are two more flash areas, each capable of holding a full program; CHAIN can link them into one huge program if required. A fourth flash area can be used as a "library" of subroutines and functions that are accessible from all the other areas without taking up space in them. There is also a flash area used as an A: drive for program and data storage. Additionally you can add an SD card as a B: drive.

      MMBasic is a relative of Microsoft GW-BASIC, but has additions to support structured programming, similar to BBC BASIC. You can use line numbers if you really insist, but they aren't needed anywhere. As it was originally written for microcontrollers, it is rich in hardware support. Variables are 64-bit floating point or integer. Strings can be up to 255 characters long, but "longstrings" use arrays for storage and can be any length up to the limit of the available memory.

      I know of one engineer who is using it for industrial machine control. He uses low cost tablets as HMI panels, which are linked to the MMBasic platform using a Bluetooth module. He finds these systems to be simple to set up, low cost and rugged. His customers are more than happy!

      There are several other MMBasic platforms; I just thought the Pico may be of interest because of its very low cost for experimenting. Even if you don't bother with a display and keyboard, just using a terminal emulator, it's a very powerful system. The editor is simple but gives a fast program development flow as, of course, MMBasic is interpreted.

  23. lordminty

    BASIC+2 was the cool kid on the block

    I learnt BASIC at school in the late 1970s on a DEC System/10 using coding sheets, punched cards and paper tape.

    Then we got regular access to a swanky new DEC PDP-11 running RSTS/E that came with BASIC+2, which was a massive leap into the future. The arrival of a Research Machines 380Z brought 8-bit BASIC, but compared to BASIC+2 it was quite poor.

    Next stop was uni and a new-fangled VAX-11/780, with Pascal, Fortran, COBOL, VAX MACRO-32 assembler (and Z80 on an Exidy machine).

    In my first job at an engineering consultancy I wrote a fair bit of Fortran, but I just couldn't get my head around code to calculate all the natural frequencies of the steel cylinder for a nuclear reactor pressure vessel.

    So I wrote it in BASIC+2, all nicely structured, with subroutines.

    I kicked the run off on a Friday afternoon thinking it would run before I went home.

    It ran all weekend. And finally finished on Monday afternoon.

    The engineers I worked with didn't believe the result and the thousands of natural frequencies of the cylinder.

    Not long after, I got made redundant, and a few years later on TV there was an animation of a wobbly reactor pressure vessel and a load of political finger-pointing over who hadn't spotted the problem!

    By the time the micros were all the rage I'd graduated to working on IBM big iron mainframes, so micros were just toys to me.

    IBM has its own BASIC-like language, REXX, which is super-powerful; so much so that a colleague and I wrote an entire job scheduling system for VM/VSE in it.

    1. Rob Daglish

      Re: BASIC+2 was the cool kid on the block

      Oh god, I remember trying to learn ReXX on my (dual boot with Win3.11) OS/2 running 486/25, then on to Delphi, but I did love BBC Basic!

  24. anthonyhegedus Silver badge

    FORTH anyone

    I used to have a Jupiter Ace. For those who aren't old enough to remember, it was a crappy little rubber-keyed plastic thing developed by a company called Jupiter Cantab. Its main USP was that it jumped on the "Basic is bad, don't use GOTO" bandwagon by using a rather good implementation of the FORTH language.

    FORTH could be interpreted like BASIC. You could type commands in to calculate stuff, display stuff etc. and they would be interpreted and executed immediately. However, you could also define procedures/functions, called "words", at the command line. Instead of giving commands line numbers, you defined words by starting the definition with a : and ending with a ;

    It semi-compiled these word definitions into a sort of code that would be interpreted rapidly at runtime. Words could be run just by typing the name. The clever thing with FORTH is that the entire language is written in FORTH - and interpreted by itself. There are just a few core words that are written in machine code.

    On the Jupiter Ace, the entire OS *IS* the FORTH language. It's like BASIC, in that you are stuck in the FORTH environment because there is no other environment. So it maintained the same file-less paradigm of early BASIC computers but implemented a language so different from BASIC that it was jarring to the non-technically-minded general owner who just wanted to play games. You see, one of its strengths was that it used RPN - Reverse Polish Notation - where you specify parameters before the function. So to add 1 and 2, you use "1 2 +". That was also its downfall.
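The stack discipline behind "1 2 +" can be sketched as a tiny RPN evaluator in Python. This is purely an illustration of the idea, not of the Ace's actual Forth:

```python
# Minimal RPN evaluator: numbers are pushed onto a stack; an operator
# pops its two operands and pushes the result. "1 2 +" therefore means
# push 1, push 2, then add.

def rpn(expr):
    """Evaluate a whitespace-separated RPN expression."""
    stack = []
    ops = {
        "+": lambda a, b: a + b,
        "-": lambda a, b: a - b,
        "*": lambda a, b: a * b,
        "/": lambda a, b: a / b,
    }
    for tok in expr.split():
        if tok in ops:
            b = stack.pop()           # top of stack is the second operand
            a = stack.pop()
            stack.append(ops[tok](a, b))
        else:
            stack.append(float(tok))  # anything else is a number: push it
    return stack[-1]

print(rpn("1 2 +"))      # 3.0
print(rpn("3 4 + 2 *"))  # (3 + 4) * 2 = 14.0
```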

    Using RPN enabled one to make "write-only code" that was unintelligible later on. For example, here is a relatively short word to calculate square roots:

    : sqrt ( n -- root )
      DUP 0= IF DROP 0 EXIT THEN
      1 SWAP
      BEGIN
        DUP DUP ROT / + 2 /
        OVER - ABS 0.00001 <
      UNTIL
      DROP ;

    This assumes that that version of FORTH could handle floating point by default.
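For comparison, the Newton iteration that the word above is reaching for looks like this in Python. This is a sketch of the algorithm, not a transliteration; the names and the loop structure are mine, and the comments map only loosely onto the Forth:

```python
# Newton's method for square roots: repeatedly average the guess
# with n/guess until successive guesses differ by less than eps.

def newton_sqrt(n, eps=0.00001):
    if n == 0:
        return 0.0          # DUP 0= IF DROP 0 EXIT THEN
    guess = 1.0             # start with guess = 1
    while True:
        new_guess = (guess + n / guess) / 2   # DUP DUP ROT / + 2 /
        if abs(new_guess - guess) < eps:      # ... ABS 0.00001 <
            return new_guess
        guess = new_guess

print(newton_sqrt(2))   # roughly 1.4142136
```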

    So despite FORTH actually making very structured programs, it completely failed because BASIC was so easy by comparison. It's an interesting aside. At the time, it was hoped that FORTH could even start to supplant BASIC but it was never going to be. Home users needed easy programming and games. Business users needed an OS with files and a word processor. FORTH couldn't really handle any of this. So BASIC continued its dominance of the home computer industry with hardly a blink.

    1. Anonymous Coward
      Anonymous Coward

      Re: FORTH anyone

      I wrote a few Forth programs at uni in 1975 or 76 ; Chuck Moore gave us a copy and a brief overview ( I think he was trying to raise interest at various UK universities ). I remember thinking it was pretty neat, but at the time there was no development environment other than the interpreter itself - which encouraged concise coding if nothing else.

    2. Blue Pumpkin

      Re: FORTH anyone

      It also had an interesting way of storing and addressing data on "screens" or blocks if I remember.

      RPN is never going to endear people to your language - though it does tend to make them think more about what they are trying to do

    3. Mage Silver badge

      Re: FORTH anyone

      I used a Jupiter Ace with Forth for industrial control / Test. Very cost effective, easy to develop and reliable.

      Most BASIC users learned to use BASIC and few learned to program.

    4. Chris Gray 1
      Go

      Re: FORTH anyone

      Yeesh. I don't remember enough of the base words to figure that out...

      I did a bit of programming in Forth, way back. The lack of an easy way to store the program is what killed it for me.

      I had read somewhere (USENET?) that the inner kernel of Forth could be one instruction on a DEC PDP-11. So I gave it a try. You can, but you are better off not doing that, since it makes it too hard to access stuff on the stack (or something like that; it's been a long time).

  25. Orv Silver badge

    A large reason for games mostly being written in machine language on the C64 was that the BASIC interpreter was slow, and when you're using a chip clocked at 1 MHz you really want to save those cycles.

    That said, the C64 and the Apple II (another machine with Microsoft BASIC) both had some commercial software written in BASIC. Just not usually anything speed-critical.

    Some versions of the Apple IIe also had a mini-assembler in ROM, which was handy for putting together short machine language routines. Traditionally the dumping ground for those was the tape buffer starting at 0x300, since no one really used tape on an Apple II system.

  26. Gene Cash Silver badge

    A novice does not know the difference between RAM and disk, and they should not have to

    NO.

    There are important differences between RAM and disk. One is fast and volatile, and one is slow and permanent.

    You don't want a novice writing all your payroll records to RAM and exiting, right?

    I can't see how experienced people can gloss over that sort of thing.

    1. timrowledge

      Re: A novice does not know the difference between RAM and disk, and they should not have to

      I don’t want a *novice* having anything at all to do with the payroll software except possibly reading the code as part of learning about it.

      And in the bigger scale, I hate having to think about anything to do with files; a barbaric idea that the system should handle for me, preferably with a good (and hidden) database. My stuff should just be there as and when I want to fiddle with it.

      1. This post has been deleted by its author

        1. timrowledge

          Re: A novice does not know the difference between RAM and disk, and they should not have to

          Yes, files do exist. I’m not convinced they need to. Why isn’t a storage thing (disk, ssd, network connection, magic container as yet unknown) just a chunk of memory space? Files are merely one possible abstraction for storage. Allow for 128bit addressing and we could cover most plausible scenarios. 256 bits would address every atom in the observable universe which ought to even handle Windows 29 downloads. An acquaintance by the name of Ted Nelson worked on number systems for that a few decades ago.

      2. doublelayer Silver badge

        Re: A novice does not know the difference between RAM and disk, and they should not have to

        And in what form should your stuff be returned to you? Because you can insulate yourself from files whenever you want, but before you can, you have to decide on a format. Have a database server and you never have to manually read or write to the disk. You don't have to worry about endianness of your numbers or packing your data. But that only works if you've agreed to use the database's formats instead. You can go one level up and never serialize any of your types because a library is doing that for you, but you still need to limit yourself to things that library understands. You don't have to worry about any of those things, but you can stop worrying only after you've decided how they will be handled for you. Trying to opt out of handling them at all is laziness, the bad kind, and works the same way (i.e. badly) as people who insist that you just solve the problem without being clear on what the problem is.
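        A concrete sketch of that point in Python, using the stdlib json module: the library hides the disk from you, but you have still signed up for JSON's data model:

```python
# Even when a library does the serializing, you have chosen a format.
# json round-trips dicts, lists, strings and numbers...
import json

record = {"name": "payroll", "amount": 1234.56}
blob = json.dumps(record)          # the library decides the byte layout
assert json.loads(blob) == record  # round-trips fine

# ...but a Python set is "your data" too, and the chosen format refuses it:
try:
    json.dumps({"tags": {1, 2}})
except TypeError:
    print("sets are not in JSON's data model")
```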

        1. timrowledge

          Re: A novice does not know the difference between RAM and disk, and they should not have to

          What format? Well, the one I need in memory in order to use the information it represents. All I care about is my data going somewhere and coming back the same. What happens in between is irrelevant. For all I care it could be converted into big endian just so long as it comes back as proper little-endian.

    2. Anonymous Coward
      Anonymous Coward

      Re: A novice does not know the difference between RAM and disk, and they should not have to

      There are important differences between RAM and disk. One is fast and volatile, and one is slow and permanent.

      It never used to be like this when your RAM was made from ferrite cores .... bloody new-fangled technology

    3. Hawks-eye

      Re: A novice does not know the difference between RAM and disk, and they should not have to

      Please buy an SSD and you won't notice the speed diff between ram and "disk".

  27. This post has been deleted by its author

  28. EduQuint

    Python is the current Basic

    Python is as low level as Basic was. Or maybe Visual Basic.

    1. Hawks-eye

      Re: Python is the current Basic

      Python is an unneeded, ugly-to-read language. No need for it. Use C or VB

      1. that one in the corner Silver badge

        Re: Python is the current Basic

        Yuck. From that choice, I'm far more a C person and am forever struggling to "get into" Python[1], but I'd rather keep doing that than go back to VB! And not just because Python is available on so many more platforms than VB.

        [1] just trying to be down wid da yoof - and have a language[2] I can share with family who don't seem to understand the sheer beauty of driving the C compiler.

        [2] oh, if only Lua were more widely used; I liked using Lua.

  29. Maximus Decimus Meridius
    Facepalm

    Comments about BASIC

    Surely these should be REM's?

  30. VicMortimer Silver badge
    Megaphone

    You don't want files? Really?

    That's just insane. PEOPLE KNOW WHAT FILES ARE. The concept has a real-world analog that makes sense to anybody. And what's the alternative, the electronic equivalent of a desk covered in massive piles of paper? Yes, I know some people work like that, but it's NOT efficient or sane. And yes, I know some computer companies have tried that stupidity at the user-facing level (looking at you, Apple) but they ultimately had to give up and put a file manager in and set up a folder mechanism for apps.

    Ultimately, data has to be stored and just throwing everything into a giant pile is the stupidest possible way to do it. Yes, people SHOULD have to know what a file is, it's a simple concept and it takes minutes to teach the dumbest noob about files. If you can't grasp the concept of a file, you have no business using a computer and you should probably be in a care home so you don't hurt yourself.

    1. Anonymous Coward
      Anonymous Coward

      Re: You don't want files? Really?

      "If you can't grasp the concept of a file, you have no business using a computer and you should probably be in a care home so you don't hurt yourself."

      I understand why you said that ... BUT ... there are some people who do not get the mapping from the 'real world' physical file to the computer version, which is 'invisible' to them, being on a 'disk' somewhere.

      I have, fleetingly, had the same thought about people who 'just don't get it' BUT it just requires some extra effort to help them understand.

      :)

  31. Anonymous Coward
    Anonymous Coward

    WTF?

    Liam,

    You normally write excellent articles with an interesting mix of commentary and actual information.

    This piece was your normal tour de force, until the last 1/5.

    Oddly enough, you seem to have gone off the rails with the... files and directories?

    I know there've been numerous articles recently pointing out that today's young'uns are mystified by such things, as all they've used are apps which handle their files intra-app. This is true, and I think the adults gushing over their children's 'expertise' with phones has sadly inflated into an incorrect all-in belief that encompasses computers et al.

    Kids today are mostly far less computer savvy than Gen-X and Millennials; they're just expert at certain GUIs.

    The bemoaning of HW abstraction levels, compilers, etc., I get.

    Re-use is great and allows for incredibly fast RAD; however, much as in the physical world, re-use entails negatives that keep increasing as re-use is crafted on top of re-use.

    The technical debt of all the layers becomes increasingly difficult to remember and manage, and every addition decreases the efficiency of the growing monolithic Tower of Babel.

    OSS is great, however it does have its drawbacks as any system does, as Proprietary does as well.

    One of those is that a lot less foundational writing is being done, as there is 'already good enough' work freely available to borrow, so one can get straight on to the cool part one wants to write.

    Sometimes this is good, sometimes not so much. The problem is, even when it is good, after 3, 4, 5 generations of adding functionality on top of functionality, the intrinsic operations it was built on from its lowly beginnings simply aren't as efficient as some less obvious path the original developer chose not to follow.

    A whole article could be written on that subject alone; not sure why the sudden angst at the end of the article.

  32. idler

    PureBasic is alive and kicking after 20+ years. If you want something easy to use, easy to learn and quick to develop with, and still want bare-metal performance with the ability to use inline C or asm, take a gander at PureBasic. Cross-platform, comes with ~1600 built-in commands, and compiles x86/x64 for Windows, Linux and macOS, plus Arm for Mac and Raspberry Pi.

    Maybe it's because I've been using it for 15 years, but it's just so easy to develop with compared to C, which is a dyslexic's nightmare.

    1. Anonymous Coward
      Anonymous Coward

      PureBasic

      This. It's so good, and actively developed. Windows on ARM support in the latest beta. As a sysadmin, I've used it for many small utilities, but it's also great for larger projects (I have one at around 60,000 lines).

      If you don't like C-style syntax (all those {}s and ;s ), but you need self-contained binaries, then it's well worth a look.

  33. Herby

    B A S I C ???

    Just remember what the 'B' stood for. Of course the next letter in the alphabet is 'C'.

  34. CorwinX

    I'd argue GOTO or equivalent still has a place

    Structured is fine, but there needs to be an exit strategy...

    Along the lines of ON ERROR GOTO x (whatever language).

    x being a hard-coded destination that does analysis, clean-up, notification and restart.

    Otherwise you can end up in an infinite loop.

    1. Richard 12 Silver badge
      Boffin

      Re: I'd argue GOTO or equivalent still has a place

      The problem there is fixing it afterwards.

      The "analysis" part is impossible unless the computer knows where it just came from.

      GOTO erases the "from" by design, so the context is lost. That alone makes it unusable for the purpose you just described, as the analysis cannot include what exactly failed.

      Many languages use "exceptions" for this purpose - explicitly marking the instruction that failed, along with the full callstack of how it got there. They don't always call them exceptions, of course - trap, signal, bork...

      Really, GOTO is an implementation detail - eg for loops and conditions. The human should never write them.
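      To make the contrast concrete, here's a minimal Python sketch (illustrative only - the function names are made up): the exception object carries both the failing instruction and the route taken to reach it, which is exactly the context a bare GOTO throws away.

```python
import traceback

def parse_config(text):
    # The failing instruction is marked explicitly at the point it happens.
    raise ValueError(f"bad config line: {text!r}")

def load():
    # An intermediate caller, so there is a call stack worth inspecting.
    return parse_config("speed = fast")

try:
    load()
except ValueError as exc:
    # The "analysis" step: we know what failed and how we got there.
    stack = traceback.format_exc()
    print("recovering from:", exc)                 # what failed
    print("reached via load():", "load" in stack)  # how we got there
```

      With ON ERROR GOTO, the handler would only know that *something* failed somewhere; here both pieces of context arrive for free.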

  35. toejam++

    It didn't have to be so terrible.

    Commodore BASIC 2.0 (introduced in '79) wasn't just lacking commands for graphics and sound. It also lacked commands for managing filesystems. That's why you had to use weird LOAD/LIST commands for displaying directories and third party tools for copying and deleting files.

    Commodore BASIC 4.0 (introduced in '80 with the PET 4000) included 18 new commands (vs 2.0), mostly for managing filesystems. The Commodore Super Expander cartridge for the VIC20 (introduced in '81) included 17 new commands for graphics, sound, and input. The Super Expander cart for the C64 increased it to 32 new commands. Commodore BASIC 3.5 (introduced in '84 with the C16 and Plus4) included 37 new commands, mostly taken from BASIC 4.0 and Super Expander.

    So, why didn't the C64 either use something like BASIC 3.5 or at least the Super Expander extensions, since they were available at the time? My guess is cost. BASIC 2.0 was 8KB, BASIC 4.0 was 12KB, BASIC 3.5 was 16KB, and the Super Expander 64 cart was 9KB. Tramiel was a notorious cheapskate. So, BASIC 2.0 was what we received.

    But even with BASIC 7.0 in the C128, Commodore BASIC was still lacking when it came to array and string manipulation and structured programming control. That's where Simons' BASIC really shined.

  36. Jason Hindle Silver badge

    I found Amiga Basic surprisingly decent

    Notwithstanding the occasional Guru Meditation.

  37. Acrimonius

    Earliest Programming Language

    My earliest attempt at programming was at school with a language called CECIL. Much like BASIC. Then I bought a BBC Micro and that got me hooked, especially being able to embed Assembly. Then, doing something for real at work, it paved the way to Assembly on the Commodore PET. Later, Excel dominated my use of computers at work and another much-reviled language, VBA, became my obsession. I had completely moved away from Goto and in-memory thinking by then.

    1. that one in the corner Silver badge

      Re: Earliest Programming Language

      > a lanuguage called CECIL

      I think you meant CESIL, if you later moved on to the BBC Micro (that article starts by pointing you at the link for CECIL). Although, if you went from the Beeb to a PET then things were going backwards, so maybe...

      > Much like BASIC

      Huh? From what I recall, the only similarity was that we had to use character entry forms for both languages, then ship them out from school to the card punch operators at the closest university. CESIL is way cruder than the worst BASIC, more akin to assembler for a vastly underpowered CPU.

      1. Acrimonius

        Re: Earliest Programming Language

        Thanks, I was trying to remember the early days. Yes, now I remember, CESIL was much like assembler. My move from Beeb to PET was mainly in connection with 6502 Assembler

  38. Hawks-eye

    Funny how uppity and snotty-nosed some self-proclaimed "language" experts, like my fellow Dutchman, are. I remember that ridiculous statement. You can program beautifully in Basic. Started in Algol-68, then Pascal, then Fortran and Basic on the HP-86. Worked well. Did C, which is a horrible language to read, full of bugs that can hang you, and which later on morphed into C#. What have all these language wars brought us over the past 40 years? Nothing other than endless rewrites of libraries, never yielding a faultless library of code. Now it is Rust, apparently better than C because you can't trust a programmer with malloc(). What nonsense. Rather use ChatGPT to check the code.

    My preferred language is VB and VBA. Beautiful to read. Easy to understand and thus to maintain for non-experts. Remember, code is read 95% of the time! As for speed and integration, what you write in VB integrates at the opcode level with C#. As for VBA, it is about 50% more productive than Python, and just as fast, if you use Excel as your display library.

    So go VB. Good stuff :)

  39. Bsquared

    Replace "Beginner" with "casual programmer" and I am 100% on board

    "Beginner programmers should not have to know what a "file" is, or what an "editor" is, or that they need an "editor" to manipulate their "source code" in "files". This is pure undiluted technical debt: these are implementation details, which should be invisible.

    This goes double for "compilers" versus "interpreters", and for "source code" versus "binary code". These are legacy implementation details, ones that had been hidden from view by systems decades ago. Both ordinary users and beginner programmers shouldn't have to know about them."

    Yes, YES a thousand times this! I cut my teeth on ZX Spectrum BASIC and still kind of miss it. Since then, I have dabbled with assembler, C, C#, PERL, VB, Java, Python, Arduino and R. And probably others I've forgotten.

    I am not a programmer - I'm a medical research scientist. I pick up these tools when I need to do a job and put them down as quickly as I can and get on with my job. Which is, y'know - actually COLLECTING DATA.. I do not have the mental bandwidth to become fluent in a bunch of languages and hold them in my head.

    And I definitely never wanted to know the nerdy details of the filesystem (yeah, thanks a bunch Unix). The only one of those languages I have reasonable fluency in is R. I hate it, its weird syntax has pitfalls all over the shop, but it's critical for medical research statistics these days, and I grok it.

    After R, Python is probably the one I use the most and I hate that too. Wrapping my head around the syntax to interrogate those stupid dictionary datatypes was just painful. "undiluted technical debt" indeed.

    It may be partly nostalgia, but I would love to have a clean, intuitive programming language like BASIC that would run on my Windows workstations and let me hack quick, simple programs together with simple GUIs.. The alternative seems to be that you have to have a wizard onboard who is fluent in these languages, which disempowers me and gives power (and employment) to the tech priests.

    I did try to use .NET VBA (or whatever it's calling itself this year), but after a day wrangling with the weird IDE environment, I gave up and went back to C. Which gave me little pleasure.

    1. Jou (Mxyzptlk) Silver badge

      Re: Replace "Beginner" with "casual programmer" and I am 100% on board

      > intuitive programming language like BASIC that would run on my Windows workstations

      Actually Powershell (5.1, which behaves the same from Windows 7 up to Server 2025) is VERY close to being exactly that. Clear distinction between variables and non-variables, which I miss in many other languages. Not having to care about initialization. Many datatypes auto-convert silently, but be aware that as strings 1+1+1 = 111, not 3 :D. No wasting time on an enforced structure around your algorithm. Powershell ISE gives you an integrated IDE which, once you get used to it, is much better than its reputation. High-precision math: it's there. If something is missing (very rare for me) you often have a module freely available.

      Even if you have to dabble in DOTNET for a few things, to avoid needing modules or to work around limitations of the builtin commands, the documentation is there. I have to avoid needing modules quite often since many customers don't like "external untrusted" stuff.

      Make graphics (the last thing I needed: make my solar data colorful, antialiased please)? [System.Drawing.*]. In Powershell ISE, just type [System.Drawing. and after the dot it will show everything that is available. Transform / invert colors of images and so on at reasonable speed with [System.Drawing.Imaging.ColorMatrix], 'cause most PS examples, even DOTNET examples, do it pixel by pixel.

      Read/write big binary files at the speed of the storage? And above 2 GB in size? $FileStream = [System.IO.File]::Open($Path, [System.IO.FileMode]::Create) to create, or [System.IO.FileMode]::Open to open, then $BinaryWriter = [System.IO.BinaryWriter]::new($FileStream), and you have your $BinaryWriter.* methods to read/write big chunks of [byte[]].

      TCP / UDP (including UDP ping for DNS, NTP etc.) / serial direct communication? Easy. Even WakeOnLan does not need extra tools. Simple web calls to read and control devices like Shelly? Easy. XML/JSON on board? Yes.

      Calling .DLL functions directly? Yep. Can create nice crashes of your shell when done wrong :D.

      To me, Powershell is what BASIC for the C16/Plus4/C128 was, but with a lot more capabilities. And if speed is a real concern you can switch to Powershell 7, or use C# / C inline.

      > and let me hack quick, simple programs together with simple GUIs.

      Does work too, but many use VStudio or other Powershell Tools for that. Saves time.

    2. doublelayer Silver badge

      Re: Replace "Beginner" with "casual programmer" and I am 100% on board

      When you're collecting data, how do you keep track of it? Do you store it in a location with a name, where you don't store unrelated data? Congratulations, you know as much about files as you'll need while writing software. Unless you're writing a filesystem, kernel module, or something that's trying to deal with disks, all you need to know is that files have names and contain bytes and directories have names and contain files. That last bit is not technically true, but you don't have to care. I'm sure you know both of those things already.

      There are a lot of languages that are simpler for the novice than the ones you listed, but they all have limitations, which is why more general and powerful ones as you have listed are popular. Nothing requires you to use them. I would also encourage you to find the many tutorials that are out there for all the languages you name, because we really aren't trying to keep you away from knowing how to write them. There will be many that describe how Python syntax handles dictionaries which could help explain it*. Having a data structure like that is necessary for most programs, so having one provided for you is generally helpful. If you got a modern BASIC, then you'd have to build that yourself, and I'm guessing you'd find that a little trickier than learning the Python syntax for them.

      * The simple version is that you name your dictionary and put your key in brackets. If your dictionary is d and your key is "hello", you can set a value with d["hello"] = "world" and retrieve it with d["hello"]. There are other options, but that version is all you need to use dictionaries at the start. Or, if these aren't useful, you could just not use any dictionaries. Python is a lot like BASIC if you just don't use the structures you dislike.
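      For anyone who wants to try it, the footnote above runs as-is; a slightly fuller sketch (the names are arbitrary):

```python
d = {}                        # an empty dictionary
d["hello"] = "world"          # set a value for the key "hello"
print(d["hello"])             # prints: world
print(d.get("missing", "?"))  # .get() gives back a default instead of an error
del d["hello"]                # keys can be removed again
print("hello" in d)           # prints: False
```

      That really is the whole of what a casual user needs; everything else about dictionaries is optional extras.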

    3. HorseflySteve

      Re: Replace "Beginner" with "casual programmer" and I am 100% on board

      For knocking together quick programs to do simple jobs on Windows PCs, try AutoIt.

      It's got some quirks but it's quite powerful.

      I used it to make a product installer CD/DVD/USB stick that would take a clean Windows PC and install all the required software, libraries and company branding to make it into a UK MOT emissions testing workstation in less than 8 minutes (<2 minutes with SSD instead of spinning rust) when coupled with our gas analyser and/or smoke meter hardware modules.

      I tried using HDD images but preparation took far longer and installation took >30 minutes

      1. HorseflySteve

        Re: Replace "Beginner" with "casual programmer" and I am 100% on board

        @Bsquared "I would love to have a clean, intuitive programming language like BASIC that would run on my Windows workstations and let me hack quick, simple programs together with simple GUIs.."

        AutoIt is exactly this. It's a scripting language originally intended for making automation tools but it can do much more than that and it's freeware. When a script is fully debugged and working, it can be compiled to a stand-alone exe.

        Have a look at https://www.autoitscript.com/site/autoit/

        1. Bsquared

          Re: Replace "Beginner" with "casual programmer" and I am 100% on board

          Ooh, AutoIt looks very useful indeed - thanks!

    4. StuartMcL

      Re: Replace "Beginner" with "casual programmer" and I am 100% on board

      @Bsquared "I would love to have a clean, intuitive programming language like BASIC that would run on my Windows workstations and let me hack quick, simple programs together with simple GUIs.."

      I do that frequently with PowerBASIC :)

      (Sadly no longer available for purchase)

  40. very cowardly anonymous

    Ignoring AmigaBasic, Liam?

    AmigaBasic was IMHO very nice. And guess what it did not have: line numbers.

    I always hated line numbers - such an idiotic concept and a waste of my time.

    AmigaBasic had labels, of course. But it does not fit your single-minded agenda, so you chose to ignore it. A full, line-number-free BASIC on a 16/32-bit computer....

  41. Anonymous Coward
    Anonymous Coward

    Oh Dijkstra...

    If not for BASIC, I might never have learnt any programming at all.

    When I was a kid, BASIC was pretty much the only easily accessible programming language. At one stage I was almost fluent in it...as the internet progressed in the mid to late 90s I started learning other languages and BASIC was never really a barrier I had to overcome...the main barrier I had to overcome was actually getting access to other development tools...because back then it was neither easy nor cheap to get access to compilers, interpreters, IDEs etc etc...I pretty much had to stick with anything I could write using a text editor.

    These days I'm reasonably comfortable in a wide range of languages...PHP, C, Python, Javascript (bleugh), Ruby etc etc...the ones I use on a regular basis though don't require any fancy setup or anything though...I still find myself mostly using PHP for a lot of things....C I really only use for embedded device stuff (ESP32, Arduino etc)...Javascript I only use for client side stuff, it makes me feel queasy using Javascript server side...it's just a horrible idea in my opinion...especially in a world where PHP and Python exist.

    I've recently started giving Rust a go.

    There are also plenty of other languages that I used to build things in that I don't touch anymore...like Java, C# and indeed BASIC. Java I dropped because it fucking sucks in more ways than it is good, C# I dropped simply because it takes so much longer than other languages to achieve the same result (also, it's not broken, just use Newtonsoft) and BASIC...well, time just moved on. I still have a soft spot for it, but I can't code in it nearly as well as I could 25 years ago, too much time has passed to pick it up again and I don't really have any interest in it anymore.

    There was a period in the early 00s where I was a wizard at throwing together VB scripts for managing Active Directory and such. I wrote one of the most glorious printer management scripts Earth has ever seen - it was legendary amongst my colleagues at the time - and I also wrote a few BASIC-based maintenance tools that cut routine customer maintenance down from about 2 hours to less than 1 hour.

    Unfortunately, I worked for a shitty MSP that had no respect for engineers (especially young ones - I was 19 when I started there) who went off piste to solve problems, mostly because the CTO was a massive prick...if he didn't think of it, it wasn't a valid solution...so he'd reject it, ignore it for a few months, then come back with a slightly changed version of the same solution and call it his own...massive wanker of the highest order...probably why he ended up at Rackspace. If you work underneath a complete and utter bonehead at Rackspace - the kind of tosser that, if Harrods stocked the highest quality overpriced tossers, would be stocked at Harrods - you're probably working under my former boss. You'll know it's him because he'll make you angry with ease but drain so much energy and willpower from you that all you can think in your head is "what a knob".

  42. EddieC

    BASIC on home PCs came later for me

    I first learned BASIC on a PDP-10 that I connected to via teleprinter. Wrote a pretty decent Pontoon game on it. Gave me a real leg up when I started at University with learning Fortran, then at work writing COBOL for a living. What cured me of using GOTOs wasn't Dijkstra, it was trying to understand deliberately obfuscated code written by a paranoid alcoholic for a voice response system which used EVERYTHING.

  43. Mage Silver badge
    Flame

    BASIC in disdain

    FROM the BEGINNING on 6502 & 8080 MS BASIC WAS terrible and obsolete.

    Gates ported Dartmouth's Beginner's All-purpose Symbolic Instruction Code, an interpreted language based on a cut-down Fortran (compiled) so as to reduce resource requirements. It was garbage from the start and totally obsolete by 1976. It bootstrapped Microsoft.

    The GOTO was only one issue.

    However, VB with Option Explicit, no GOTOs and a small piece of code per visual element was practically not BASIC. The demise of VB6 was sad, because VB.net couldn't do the same stuff and was really a C programmer's idea of VB. It made more sense to ignore VB.net and simply re-write and write new code in C#, which was really MS's version of Java.

    I used Prolog, Pascal, Modula-2 and Forth on 8-bit micros and CP/M. BBC BASIC was a stupid decision but better than the previous MS Basic on the Apple II, PET, CP/M and DOS.

  44. Caspian Prince

    See Pico-8 for how to do it properly

    Without line numbers. Also, Lua is the new BASIC, amirite?

  45. sw guy
    Mushroom

    GOTO is not the worst

    It's a surprise to me that nobody has had a word to say about longjmp (or its sibling siglongjmp)

    1. HorseflySteve

      Re: longjmp

      Longjmp isn't a command I've encountered in any BASIC variant I've used. It sounds very close to the CPU machine instructions, where a relative jump opcode would adjust the program counter by +/-128 for loops, jump tables or FSMs, and an absolute jump opcode would handle anything further, at the cost of additional program memory and CPU cycles.

      A bit too close to the metal for a Beginner's language, I'd have thought; you wouldn't want to be bothered about those sorts of things until you started working with real-time or embedded code.

  46. Anonymous Coward
    Anonymous Coward

    FORTRAN -> BASIC

    I learned FORTRAN at university in 1971 (it was a mandatory first year module on all science or engineering courses there). We started with Waterloo FORTRAN, where we were limited to 100 lines of execution (it ensured that any infinite loops were quickly terminated); once we could prove ourselves we were allowed to use the full FORTRAN IV. In my first job, the site had a networked system where we could use PROCALC for immediate execution on a PDP 9, or queue longer programs in FORTRAN for batch processing on the mainframe (I don't recall what it was). However, one of my early projects was to use a Varian (V/70?) micro in our department to explore the potential for automated inspection using a camera. The unit was the size of three office filing cabinets, had 32k of ROM, a twin cassette drive and disk drive (not sure of the data capacity, but the disks were, physically, quite big), a VDU and a teletype. It could be booted from paper tape although, if you broke the tape, you had to program a new one using a large row of switches on the front panel (I made sure I never broke the tape). I had to teach myself BASIC from the computer's manual - not too difficult as I saw it as quite similar to FORTRAN. It ran slow (watching images build up on the VDU was like watching paint dry), but BASIC was just to prove an idea - if something worked it would be reprogrammed in assembler.

    Several years later, I used BASIC on my ZX81 and then BBC BASIC on my Acorn Electron. BBC BASIC had subroutines, so GOTO's could be largely dispensed with, but the ace in it, for me, was the ability to include assembler routines in the BASIC code. I used these for various routines, especially when I added a 6502 co-processor to upgrade various magazine games.

    I was never "officially" an IT bod, and the last time I did anything remotely like programming was almost 20 years ago when I used VB to program a spreadsheet application for auditors (and the programming was fairly simple, limited to validation checks and compiling reports). I leave the difficult stuff to others.

  47. Missing Semicolon Silver badge
    Happy

    No files.

    Ok, so we had no files. What we had was endless cassette tapes, with handwritten labels!

  48. Blackjack Silver badge

    I haven't coded in BASIC in a long-ass time. The thing stopping me getting back to it is all the different versions available; I cannot decide.

  49. NickHolland

    Giving BASIC a bad reputation

    I never programmed in C64 BASIC. Based on the statements in this article, I'll just grant it was bad. But I don't think I believe that was really the "killer" for BASIC's reputation. No one bought a C64 thinking it was the ultimate computer to write business applications on, and no one was going to be writing cool games in BASIC on home computers. It was colorful, it had good sound, it was a game machine you could justify because it also tried to be "practical". I'm not glad Commodore skimped on the BASIC for it, but...it was a budget machine, we all knew that. This is why I never had any interest in the C64 at the time.

    If you wanted to do serious programming on an 8-bit machine, it was generally assumed you would have to use either a compiled language or assembly. The computers were just too small for interpreters. Not unique to Commodore at all here. I believe most "serious" programs were developed by cross-compiling from "big iron" systems, so the micros were probably too small for compilers, too.

    In my mind, it was Microsoft's MSDOS BASIC (IBM PC BASIC, BASICA, GWBASIC, whatever) that was the "BASIC that killed BASIC being taken seriously", because it just wasn't what it could/should have been. For small-systems users, MSDOS systems had huge (for the time) memory capabilities, but the BASIC interpreter was a machine rework of the 8-bit CP/M BASIC, with a lousy 8-bit 64k program space (one 8086 RAM segment for the interpreter's code, one segment for your BASIC code and data). So...this was the Commodore "BASIC that doesn't support the hardware" all over again, but without the "shoestring budget" justification. For MSDOS systems where BASIC wasn't included, it was a non-trivial-cost product, and an utter disappointment. Just that CP/M product with graphics. I get their need to "get it out the door quickly" in 1981, but ... it got very little development after that. (There was a much-later "QBASIC", but that was a decade late, the world had given up on BASIC by that point, and I don't know if it fixed the memory limitations.)

    So..I'm gonna blame MS BASIC on the PC for giving BASIC a bad reputation.

  50. Frank Leonhardt

    CBM64/Pet BASIC wasn't the start

    The original Microsoft BASIC (written by Bill Gates) was for the 8080; the 6502 port was done by Ric Weiland in 1976 for the OSI (those boards in the background during OS/2 drinking club meetings). It was already old software when Uncle Jack bought a license for the PET.

    The important thing is that it was shoehorned into four 2K ROMs, which given the code density on the 6502 was remarkable. It's not really the case that the language lacked features by design; there just weren't the bytes available. Error message strings only got two characters, with the high bit set on the last one to save a termination byte or string length.
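    That packing trick is easy to demonstrate in a few lines of Python (an illustrative reconstruction, not the actual ROM code): the top bit of the final character marks the end of each message, so no terminator or length byte is spent.

```python
def pack(msgs):
    # Concatenate messages; mark the end of each by setting bit 7 of its
    # final character (ASCII is 7-bit, so the top bit is spare).
    out = bytearray()
    for m in msgs:
        b = m.encode("ascii")
        out += b[:-1] + bytes([b[-1] | 0x80])
    return bytes(out)

def unpack(table):
    # Walk the table; a byte with the high bit set ends the current message.
    msgs, cur = [], []
    for byte in table:
        cur.append(chr(byte & 0x7F))  # strip the marker bit
        if byte & 0x80:               # high bit set: message ends here
            msgs.append("".join(cur))
            cur = []
    return msgs

table = pack(["SN", "OV", "FC"])   # two-character codes, as described above
assert len(table) == 6             # 3 messages in 6 bytes, zero overhead
assert unpack(table) == ["SN", "OV", "FC"]
```

    Three two-character messages fit in exactly six bytes; with C-style terminators they would have needed nine.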

    Incidentally, the Atari ST didn't come with ST Basic. In fact I never saw a copy, and I was developing the launch software with Atari in 1985; I'd have known. There were a few BASICs that appeared later, the most popular in Europe being GFA BASIC. But the ST was launched with the Alcyon (Motorola) 'C' compiler and the Metacomco editor (both pretty rubbish - sorry, Peter). It was quite pricey unless you were in the club, in which case it was free.

  51. fg_swe Silver badge

    Zwiebelbert Eater Here

    Our version of the Quiche is

    https://omasrezeptewelt.de/schwaebischer-zwiebelkuchen/

    We are also linguistically and nationally close to Mr Wirth of ETH Zürich.

    I did some Basic programming on a C64 and some kind of Schneider CPC128(?) in 8th class or so. I can remember the practice of entering line numbers in increments of 10 "in case we need to expand". We already stored these programs on 3-inch disks. No proper CS teachers for these Basic machines, though.

    Then we had a proper CS course in grammar school, using Turbo Pascal and an 8MHz 80286 machine with 3MB RAM. I never really had a problem with files or the lack of hard-coded line numbers. After all, I would also use the computer to write texts, which are stored in files and directories.

    Now, after a CS degree and more than 25 years of software engineering (yes, different from "coding"), I still remember Turbo Pascal as an excellent, fast IDE and compiler. It showed me the light of Algol-type languages with strong typing, abstract data types, number domains, functions, very nice control structures, and at the same time efficiency.

    Currently I work mainly with my own "memory safe C++" and with plain C++ on Linux and Windows. I feel I learned the basics of program construction with Turbo Pascal, and it still is an excellent teaching system (besides Pascal being used for the heavyweight semi-mainframe OS HP MPE and the revolutionary Apple Lisa).

    So, directly go for the Quiche and skip the Doughnut, that is my honest advice.

  52. fg_swe Silver badge

    No Files, No Serious Work

    All serious work on even small programs requires an iteration of development sessions. Of course one would also save BASIC programs to floppy disk, even then.

    Except for

    10 Print "hello"

    20 Goto 10

    But what exactly is the value of such a program ?

    1. CorwinX

      Re: No Files, No Serious Work

      It's actually "Hello World!" and it's used to introduce the basic syntax structure of the language in question.

      1. that one in the corner Silver badge

        Re: No Files, No Serious Work

        A "Hello World" is also what I'll still use whenever starting a new program that I expect to grow into something non-trivial.

        That way, I can set up the project directories, Makefiles, doxygen etc etc and get all that infrastructure working and committed before writing any "real" code. Vast overkill for the end result, at that point, but it means Working On The Fun Stuff won't be suddenly interrupted by some faff when a lib's header isn't on the include path.

        Plus it acts as a reminder when switching languages again (so I do actually type it, rather than copying a pre-written hello.some_language)

  53. Henry Wertz 1 Gold badge

    Atari BASIC

    Atari BASIC was a tad interesting: they didn't use Microsoft BASIC at all and wrote their BASIC themselves. It was pretty compatible -- Compute! magazine would have a common listing, then whatever changes were needed for Atari, PET, etc., which were pretty small.

    I do recall people commenting that the divide and (I think?) multiply on it were APPALLINGLY slow; if your program was going to do serious amounts of math, it was apparently faster to write code in BASIC to do long division (let alone use some optimized assembly). The 6502 didn't have hardware multiply or divide; it was all bit shifts and add/subtract internally. Apparently it was common to use some tables to speed this up, but they ran low on ROM space and used a smaller but much slower algorithm to shave off some bytes. (This didn't make much practical difference, since most software might have a few multiplies or divides, but not so many that you'd notice a slowdown.)
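    The shift-and-add technique mentioned above is easy to sketch. This is an illustrative Python version of what a CPU without a multiply instruction has to do in software, not Atari's actual ROM routine:

    ```python
    def mul_shift_add(a: int, b: int) -> int:
        """Multiply two unsigned integers using only shifts and adds,
        the way software on a 6502 (no MUL instruction) has to."""
        result = 0
        while b:
            if b & 1:       # low bit of multiplier set: add partial product
                result += a
            a <<= 1         # shift multiplicand left each round
            b >>= 1         # consume one multiplier bit
        return result

    print(mul_shift_add(6, 7))  # → 42
    ```

    One add and two shifts per multiplier bit, so an 8-bit multiply costs dozens of instructions — which is why lookup tables were the usual speed-up.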

    On the bright side, although Atari BASIC didn't support player/missile graphics ("sprites" on other platforms that supported this) or any too-exotic use of the very flexible graphics hardware (like changing modes every 2 scan lines and the like), it did at least have graphics modes (including the option of a graphics mode with 3 or 4 lines of text at the bottom... or full-screen graphics, your choice), plotting, sound support, etc. There were aftermarket BASICs and languages like Action! that were faster and supported more features.

    1. ZX8301

      Re: Atari BASIC

      ATARI bought that BASIC in, they didn’t write it. With long variable names (tokenised for constant speed) and long strings it was substantially more readable and capable than MBASIC, and closer to the Dartmouth standard rather than the DEC BASIC+ fork (also bought in, BTW, and the model for Gates and Allen’s subset copy).

      ATARI BASIC arithmetic was slow because it used BCD rather than binary floating-point. Microsoft copied that for Z80 MSX BASIC, perhaps to make their machine-translated 8088 BASIC look less embarrassingly slow in comparison.
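      For anyone who never met BCD: each byte holds two decimal digits, and every addition needs a decimal fix-up per nibble. A rough Python sketch of how the 6502's decimal mode adds one packed-BCD byte (a hypothetical helper for illustration, not any shipping ROM code):

      ```python
      def bcd_add_byte(a: int, b: int, carry: int = 0) -> tuple:
          """Add two packed-BCD bytes (two decimal digits each):
          binary add each nibble, then decimal-adjust if it exceeds 9."""
          lo = (a & 0x0F) + (b & 0x0F) + carry
          carry_lo = 0
          if lo > 9:
              lo += 6          # decimal adjust: skip hex codes A-F
              carry_lo = 1
          lo &= 0x0F
          hi = (a >> 4) + (b >> 4) + carry_lo
          carry_out = 0
          if hi > 9:
              hi += 6
              carry_out = 1
          hi &= 0x0F
          return (hi << 4) | lo, carry_out

      # 47 + 38 = 85 in decimal: 0x47 + 0x38 gives 0x85, no carry out
      print(hex(bcd_add_byte(0x47, 0x38)[0]))  # → 0x85
      ```

      All that per-nibble adjustment, repeated across every byte of a multi-digit mantissa, is where the speed goes compared with plain binary floating point.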

  54. CorwinX

    WTF?

    The file structure came about because of the real-world analogue of a Filing Cabinet (drive), a Folder, and a Document within said folder.

    It's a concept a 5-year-old can grasp.

    My music collection is filed Artist/Group > Album Name (year) > 0x Song Title

    My book collection is filed as Author > Series Name (1-x -- y expected Month 200x) > Series Name 0x - Title - Author

    Talking over 6000 files here!

    I'd love to hear your opinion on a more sane approach.

    1. fg_swe Silver badge

      Flat File System

      HP MPE had only a single directory and thousands of files.

      Managing this complexity was done by filename prefixes for each sub-aspect of the system.

      But I agree with your observation that nested folders are not too hard to learn.

  55. Torben Mogensen

    C16 and Plus4

    Commodore tried to replace the VIC-20 and C64 with the C16 and Plus4 computers. These had much better BASIC and graphics, but they couldn't run C64 games, so they never became hugely successful. The later C128 added compatibility with C64, and was somewhat more successful, but it was essentially too little too late.

    I actually won a C16 at a computer fair when it was first released, so I played a bit with it. But since I already had a BBC Computer, I sold it off fairly soon. I did like the larger colour palette of the C16, but apart from that the BBC was far better.

  56. Torben Mogensen

    My first BASIC

    was called "RC BASIC", where RC was short for Regnecentralen, a Danish computer company. RC BASIC ran on my high school's RC7000 computer, which was a rebadged Data General Nova (with ferrite core memory and no screen, so all interaction was through a paper teletype terminal). RC BASIC was actually just Regnecentralen's version of COMAL, a structured BASIC supporting while and repeat loops and named procedures/functions with parameters and local variables, much like the later BBC BASIC. Line numbers were optional in COMAL.

    The next BASIC I learned was on a Commodore PET which one of my friends bought. This was vastly inferior: not only did it lack structured statements, but variable names were limited to two significant characters. Still, it had a screen and limited block graphics, so it was fun to play with. I did some professional BASIC programming, first for a CP/M machine and later for the Swedish ABC80 home computer. Their BASIC versions were not significantly better than Commodore BASIC, though.

    The last BASIC I used in a significant way was BBC BASIC. First on a BBC Computer that I bought shortly after its release (a group of friends imported a number from England, as there was no Danish retailer). My friend with the PET sold it and bought a BBC after he saw how superior it was. Next, I bought an Archimedes, which also used BBC BASIC. After this, I haven't used BASIC much in any version.

  57. GrahamRJ

    Files considered harmful?!

    In his rant about files, Liam has missed a fundamental difference between old desktop computers like a C64 and anything modern - or for that matter, between a C64 and any commercial computer at the time. The reason you could just use line numbers like that is that your C64 could only run one program at a time. Multitasking of any form was impossible. Multi-user sessions of any form were impossible. As soon as you wanted to do anything more useful, you needed to abandon that idea - which is why AmigaBASIC and other later BASICs didn't use line numbers and did use files. It's also why every language developed on larger machines - C for a prime example - used files with human-readable text from day one.

    Line numbers also directly encouraged spaghetti code and directly discouraged structured programming. Sure, you could do structured programming if you knew those structures, but it was nothing like common.
