Re: Searches for import PST to Thunderbird. Nope, nothing.
Not directly, maybe, but PST -> MSG or similar is a possibility.
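For what it's worth, one route (a sketch only, assuming the readpst utility from the libpst package is installed - check your version's man page for the exact flags) is to convert the PST into mbox folders, which Thunderbird will take into Local Folders, or via the ImportExportTools NG add-on:

# Rough sketch: convert a PST into mbox files that Thunderbird can import.
# Assumes the readpst utility (libpst package) is on the PATH; the -o flag
# (output directory) is as I remember it, so verify before relying on it.
import subprocess
from pathlib import Path

def pst_to_mbox(pst_path: str, out_dir: str) -> None:
    Path(out_dir).mkdir(parents=True, exist_ok=True)
    # readpst writes one mbox file per PST folder into out_dir
    subprocess.run(["readpst", "-o", out_dir, pst_path], check=True)

pst_to_mbox("archive.pst", "mbox_out")   # hypothetical file names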
Oh dear, it is common practice to keep the initials the same, e.g. Embrace, Extend, Extinguish. Looks like they need some assistance in finding something with three L's. Let me think now...
Listen, Learn and Lunch
Listen, Learn and Lock-in, ah, yes, that's much more appropriate.
Your turn, fellow commentards, for that final L...
Which reminds me. I wonder how many systems out there can cope with hyperinflation. This will be where we find out whether systems use "lossless" number formats, and whether they have thought properly about boundary conditions. On another note, I'm sure there were inconsistencies when the UK changed its VAT rate after a long period of stability, particularly with regard to transactions that straddled the point at which VAT is accounted for by different traders.
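A quick illustration of the "lossless" point (a minimal sketch, nothing more): once hyperinflation pushes amounts past the precision of binary floating point, the pennies silently vanish, whereas a decimal type keeps them.

# Binary floating point vs Decimal at hyperinflation-sized magnitudes:
# a float near 10^16 can no longer represent a single penny; Decimal can.
from decimal import Decimal

amount_f = 10_000_000_000_000_000.00
amount_d = Decimal("10000000000000000.00")

print(amount_f + 0.01 == amount_f)              # True: the penny vanishes
print(amount_d + Decimal("0.01") == amount_d)   # False: the penny is kept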
I did my fair share of Y2K consultancy.
I admit it amounted to paying off some technical debt, though. I knew for a fact that there would be big problems on 1/1/2000 if appropriate action was not taken.
When I worked on London Underground's computerisation of the Northern/Victoria lines (way before the millennium) we had a problem that the computers being used didn't have the oomph to manage all the tests needed to perform some of the required processes. On at least one program I worked on (to do with programme machines: see Google) there were several tests that needed to be repeated on demand every couple of seconds. If the processing of those tests (manifested as inter-process messages in the system) took longer than the allowed time then the message system would collapse (with a negative queue count error, for those of you familiar with GEC 4XXX architecture), taking the whole system irretrievably down with it. It was like a snowball effect, which could be set off by a completely unrelated process taking slightly longer to run.
Among the discussions we had to resolve the issue was my suggestion to treat 1/1/2000 as a "special value", which meant that one comparison would suffice for at least two purposes. It doesn't sound like much of an optimisation, but it made a lot of difference to performance. I had already asked what the expected lifetime of the project would be and was reassured that the system would be upgraded well before 2000.
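I won't pretend to remember exactly what the two purposes were, so purely as a generic illustration of the sentinel trick (a sketch, nothing like the actual GEC code): if "never expires" is encoded as a special date beyond the system's expected lifetime, one "is it still current?" comparison also covers the permanent entries, with no separate flag to test.

# Hypothetical illustration of the "special value" idea: encode "no expiry"
# as a sentinel date beyond the system's planned lifetime, so a single
# comparison does double duty.
from datetime import date

NEVER_EXPIRES = date(2000, 1, 1)   # sentinel: beyond the life of the system

def is_current(expiry: date, today: date) -> bool:
    # One test answers both "has this dated entry expired?" and
    # "is this a permanent entry?" (permanent entries carry the sentinel).
    return today < expiry

print(is_current(date(1989, 6, 30), date(1989, 7, 1)))   # False: dated entry, expired
print(is_current(NEVER_EXPIRES, date(1989, 7, 1)))       # True: permanent, still current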
An example of another kludge we had to do in the original system was also to do with programme machines. IIRC there was not enough space in core to deal with the total number of these on-site, so special code was created to deal with those situated at Kennington (there was an inordinate number located there, compared to other locations). I was totally unaware of this code until I was tasked with taking on and upgrading the program for a replacement system. Performance was dogged by these special tests, so I asked if there was any other reason for treating Kennington in this way. The answer was no, so we took up more core to make all programme machines behave in a more homogeneous way, and performance improved.
In the old days, if you got an email from a Big Corp with spelling mistakes and/or grammatical errors in it, you would identify it straight away as fraudulent in some way.
Now, if the prose is too perfect, then we're getting to the stage where it is put in the dodgy category.
The ocassional spilling misteak nowadays is good to prove some text was written by a meatsack... Until such errors start to find their way into AI systems, and round the loop we go again.
I don't quite see what all the fuss is about. In my day we had things called Diaries and, in generations since, people might have had a thing called a Filofax.
Has society moved on so much that we have forgotten about these things? I appreciate Sam is at more of a disadvantage than most of us, but sometimes just picking up an old diary might remind us (him?) of a lot more than what is formally recorded - that hastily scribbled phone number on a beer coaster wedged inside, for example.
Maybe this is a wake-up call to get your butt down to Rymans to pick up a 2025 Diary - size is your call. Week to a view or two pages per day, or anything in between. Be inspired...
Hmm, there are "concepts" in between that adequately describe a process to a "lay" person.
I've been thinking about your comment today. Is it important to understand how things work... to a point? To what level? Very briefly, I think the deciding factor for me is that when our ancestors used to roam the jungle we would automatically associate "food" with "food", but if someone had laid a trap for us to fall into, then we could easily become the "food". Evolution engenders an interest, for survival's sake, in looking beyond how things are presented to us, just in case. If we don't ever need to use that information, fine, but it is there, for future reference, if necessary. In today's society it is paramount that we understand the prize of a free iPhone is likely to be a trap, so we must be alert, not merely to "face values", but also to what might lurk behind them. When Armageddon hits then arguably we're all going to have to brush up on our foraging, building skills, etc., because humans will adapt.
(1) Mullard's Electronic Counting book. The frustration with that book, though, was that it omitted the "debounce" circuitry essential to make things count without skipping.
(2) How A Computer Works by W.L.B. Nixon, publisher: University of London, Institute of Computer Science
https://www.computinghistory.org.uk/cgi/archive.pl?type=Books&author=W.L.B.%20Nixon
Typewritten and mass-produced using a Gestetner machine (eh? what's a gestetner machine? Oh Lord, how we suffered without laser printers. Now listen, you've got a browser, go google it). Seriously though, extremely well written, with exactly the right depth to it.
Bought from the bargain basement of Pooles, Charing Cross Road. My 2nd edition copy stays with me until they wheel me out in a wooden box.
(3) Texas Instruments orange hardback guide to the SN7400 TTL range of ICs.
When I was at South Bank Poly, no matter how good your labwork was, you only got 9/10 max. Even going down to the British Library to seek out papers by Turner on his method of measuring inductance got me no further. When we did a lab on TTL, though, I went the extra five Irish miles and I got a 10! (No, that is not a factorial sign; please be realistic.)
(4) Motorola M6800 Microprocessor manuals. IIRC one of the breakthroughs Motorola had with the M6800 was their PIA and ACIA peripheral interface chips which got rid of the need for any Input/Output instructions, and you could have as many of these chips as could be addressed by the bus.
(5) Software Tools/Software Tools in Pascal by Kernighan and Plauger. A few here, I think, have alluded to these.
(6) The course book we used for digital design was Douglas Lewin's Logical Design of Switching Circuits (I still have a 2nd Edition copy, with dust jacket!). One of the interesting projects outlined in this book is the semi-automated method of optimising Karnaugh maps devised by McCluskey (there's a rough sketch of the idea just after this list).
I may add to this list.
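Re the McCluskey method in (6), here's a rough sketch of the core step - repeatedly merging implicants that differ in one bit until only prime implicants remain. This is my own back-of-envelope rendering, not anything lifted from Lewin's book.

# Prime-implicant step of the (Quine-)McCluskey method: merge implicants
# that differ in exactly one defined bit, replacing that bit with '-'.
# Anything that never merges is a prime implicant.
from itertools import combinations

def merge(a: str, b: str):
    diffs = [i for i, (x, y) in enumerate(zip(a, b)) if x != y]
    if len(diffs) == 1 and '-' not in (a[diffs[0]], b[diffs[0]]):
        i = diffs[0]
        return a[:i] + '-' + a[i + 1:]
    return None

def prime_implicants(minterms, nbits):
    terms = {format(m, f'0{nbits}b') for m in minterms}
    primes = set()
    while terms:
        merged, used = set(), set()
        for a, b in combinations(sorted(terms), 2):
            c = merge(a, b)
            if c is not None:
                merged.add(c)
                used.update((a, b))
        primes |= terms - used     # anything left unmerged is prime
        terms = merged
    return primes

# Minterms 4..7 of a three-variable function collapse to the single
# implicant '1--', i.e. just A - the same answer a Karnaugh map gives.
print(prime_implicants({4, 5, 6, 7}, 3))   # {'1--'}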
Maybe some of us commentards should have a discussion about providing some kind of legacy resource. In the coming years this kind of knowledge will be lost forever. Google is on a trajectory to forget material it thinks is no longer relevant. Who determines that, then? This is not just I.T. I've come across people who are amazed that Plate Tectonics is a recent development in science. It is, I feel, essential that the evolution of such technology is not lost. It highlights how dogmatic and arrogant the experts were, and how inspirational it was for someone to gradually change everyone's minds until, now, we can't imagine thinking anything different.
My first lessons on programming were at South Bank Poly. I remember one incident very vividly, and it has stood me in good stead ever since. When we'd written our programs we'd submit them as two Hollerith decks: 1. the program and 2. the data (showing my age).
So we all eventually got clean results (a week or so later, at one run a day if you were lucky); then the lecturer asked us to submit some data he had prepared, and we were aghast.
The program he had asked us to submit implemented the customary quadratic formula (-b +- sqrt(b^2 - 4ac)) / (2a).
We had all used sensible values for a, b and c. But he wanted us to submit a = 0, for example (I bet the computer operators all loved us).
We all know what happens when the tyres are not properly kicked prior to putting code into production (ask Microsoft about buffer overflows, for example). But far too many coders are guilty of coding to run cleanly with no thought to rogue input at all. Nowadays coders use try/except constructs to bat away erroneous input, but this can have unintended consequences if not properly thought out, and I've seen too many examples of swallowed exceptions to believe that isn't a commonplace 'solution' for anything that can't be immediately solved.
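To make that concrete (a sketch, not the original Hollerith-deck program): a quadratic solver that actually copes with the lecturer's rogue data - a = 0, a = b = 0, a negative discriminant - instead of dividing by zero or hiding behind a bare try/except.

# A quadratic solver that handles the rogue inputs explicitly rather than
# crashing or swallowing the exception.
import math

def solve_quadratic(a: float, b: float, c: float):
    if a == 0:
        if b == 0:
            # 0 = c: either no solution or every x; flag it, don't guess.
            raise ValueError("not an equation in x (a and b are both zero)")
        return (-c / b,)                  # degenerates to a linear equation
    disc = b * b - 4 * a * c
    if disc < 0:
        return ()                         # no real roots: report, don't crash
    root = math.sqrt(disc)
    return ((-b + root) / (2 * a), (-b - root) / (2 * a))

print(solve_quadratic(1, -3, 2))   # (2.0, 1.0)
print(solve_quadratic(0, 2, -4))   # (2.0,)  - the a = 0 case that caught us out
print(solve_quadratic(1, 0, 1))    # ()      - negative discriminant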
When I worked for A Big Organisation my boss would test his programs on the test system with his fingers crossed. If it 'held up' for a 'decent' length of time then 'it was fine'. If it crashed he would restart the system and assume it was a minor glitch. Only if it did that a couple of times would he investigate. Me? If the system fell over once I would immediately get a core dump and go through it until I found the culprit.
This is another thing: Log files imho are completely useless these days. I worked with a company that had a Silverlight application installed. My $Deity, please spare me from users reciting Silverlight error codes to me. Even the designer didn't want to know what they were.
Anyone who touched my programs would know of my love of the circular buffer for anything my code eyeballed, in the order it was to be dealt with. Conversations about race conditions were easily settled by a look at the core dump (this was a real-time messaging-driven system btw).
And that's another thing: Do students still learn about things like circular buffers?
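For anyone who hasn't met one, a minimal sketch (nothing to do with the original real-time system): a fixed-size ring buffer where the read position wraps around modulo the capacity, so items come out strictly in arrival order and the memory footprint never grows.

# Minimal fixed-size circular (ring) buffer: first in, first out, with
# storage that never grows beyond the chosen capacity.
class CircularBuffer:
    def __init__(self, capacity: int):
        self.buf = [None] * capacity
        self.capacity = capacity
        self.head = 0      # next slot to read
        self.count = 0     # number of items currently held

    def put(self, item) -> None:
        if self.count == self.capacity:
            raise OverflowError("buffer full")   # or overwrite the oldest - a design choice
        self.buf[(self.head + self.count) % self.capacity] = item
        self.count += 1

    def get(self):
        if self.count == 0:
            raise IndexError("buffer empty")
        item = self.buf[self.head]
        self.head = (self.head + 1) % self.capacity
        self.count -= 1
        return item

rb = CircularBuffer(3)
for msg in ("msg1", "msg2", "msg3"):
    rb.put(msg)
print(rb.get(), rb.get())   # msg1 msg2 - strictly first in, first out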
Not going to embarrass the vendor here, but I used to have Wireshark logs showing a WellKnownRouterBrand router dishing out the same IP address to two devices in close sequence. I notified them, but I've never seen this advertised as "fixed"; then again, I've moved on from using their products...
I can't remember the nitty gritty now but one of my customers had a problem with their website. Anywhere in the UK it worked fine, but in the USA (their target market) surfers were being served up the International Herald Tribune homepage. I managed to run traceroute from locations across the USA to determine where the fault manifested itself, then contacted the relevant site. It turned out that a server along the path had been compromised; it had to be rebuilt from scratch and then forced to respond properly to DNS requests. In those days it could take several days for bad DNS data to be properly squashed, which was a frustration.
I suspect W7 is where the problem lies in your case. HMRC Devs were given details, but that message may not have found its way to end users.
https://www.accountingweb.co.uk/any-answers/moneysoft-payroll-wont-connect-to-hmrc
The reason this was left in limbo was that my client unfortunately died while this matter was in progress. His successor had the same problem, but it suddenly rectified itself of its own accord many months later.
The latest problem I've wasted hours on - still ongoing - is a program that needs Office 365 32-bit. So you uninstall the 64-bit version, then install the 32-bit version, which it says can't be installed because the 64-bit one is installed. But it's not on the Apps list. Trying to run MS's Easy Fix [sic] didn't work either. MS seem completely unable to uninstall and reinstall programs cleanly.
Unfortunately Windows belongs in Witchcraft territory.
One of my customers has been complaining that their payroll package can't connect to HMRC. This has been going on since April (a workaround has been in place). Every so often we'd give it another look, but same result, the manufacturer's test utility, and everything else says it should work. Today we go to Factory Reset his pc (for other traumas) and I'm told "oh btw, Payroll's now working." He swears he did nothing. Unlike another of my clients, where things right themselves after a hefty effluxion of time, this client does not have a cat. Now I'm not suggesting there is any connection between witchcraft and cats, but...
(Can we have a cat icon?)
Reminds me of the conundrum of someone wanting to send a pole through the post.
The pole was 5 ft 5 in length, but the maximum allowable package size was 4 ft by 3 ft by 1 ft 6 in.
(I think I've got that right). This was the sort of question that used to get asked in interviews, is that still true?
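For what it's worth, the arithmetic behind the trick is that the pole goes in along the box's diagonal: the 4 ft x 3 ft face diagonal is exactly 5 ft, and the full space diagonal is about 5 ft 2.6 in, so if the pole really was 5 ft 5 in then one of my remembered figures is slightly off.

# Quick check of the pole-in-a-box arithmetic: lay the pole along a diagonal.
import math

face_diagonal = math.hypot(4, 3)                   # 4 ft x 3 ft face: exactly 5.0 ft
space_diagonal = math.sqrt(4**2 + 3**2 + 1.5**2)   # full box diagonal: ~5.22 ft
print(face_diagonal, round(space_diagonal, 2))     # 5.0 5.22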
Order identical stuff from Amazon at the same time and get it delivered to different places: is there consistency in arrival time/date?
This does have to be done over the course of several deliveries, as things can go pear-shaped in isolated cases. I have to say delivery mistakes are rare with Amazon, but where mistakes are made they tend to happen when most inconvenient. The worst one of recent times was ordering two 230 litre water butts (a few days apart). One arrived in the advertised time frame; the other went for a tour of mainland England. Luckily, when it did arrive, they dumped it in my front garden rather than getting my neighbour to take custody of it, lol.
The way data is extracted from the evidential hard drive is not by sticking it in a PC and doing a copy. First off, the drive should be cloned under careful supervision, and that cloned copy should solely be used to issue copies to plaintiff and defence alike. We're talking about the entire history of everything that was ever written to the drive: deletion of files, extension of edited documents that have outgrown their initial allocation, copying of new files, and overwriting of deleted files, because that is how file allocation algorithms may work. SSDs have their own peculiarities regarding these algorithms, but this is of no consequence so long as the clones have faithfully captured the results. This is data forensics 101, no?
Whether either party wishes to stick their copy in a PC and copy files from it is entirely up to them, but that ignores such important things as chronology, and what is now no longer there on the surface.
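To put a concrete shape on "cloned under careful supervision" (a sketch only, assuming the acquisition has already produced raw image files; real work also involves write blockers and chain-of-custody paperwork): hash the master image once, then verify every working copy against that digest before it goes anywhere.

# Verify that a working copy of an evidential image matches the master,
# by comparing SHA-256 digests computed in chunks.
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_clone(master_image: str, copy_image: str) -> bool:
    return sha256_of(master_image) == sha256_of(copy_image)

# Hypothetical file names, purely for illustration:
# print(verify_clone("evidence_master.img", "working_copy_defence.img"))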
To my mind cybersecurity is a difficult field to start a business in, for the simple reason that your actions have to be formally and clearly defined and approved before you can start tinkering with other people's systems. There have been cases where, even with that explicit approval, pentesters have been arrested because different departments within the target company (for understandable reasons) were not notified of the intent.