Re: Game
Putting buckets over the heads of real life NPCs so you can steal from their shops doesn't work. Ask me how I know.
Age verification is stupid but especially here. Operating systems are disproportionately installed by adults. Even if a child installed one, what purpose is age verification solving? What even is an operating system? Is it firmware? Is it a docker image? Does it have to connect to the internet? Does it have to have a browser? Do firmwares for consoles count as operating systems? Do in car entertainment systems? Or phones? Or LeapFrog gizmos aimed at toddlers? Or Raspberry Pis?
It's just perplexing, stupid, unenforceable legislation that serves no purpose.
Ah yes, the 5000-page specification for MS Office formats compared to the 760-page specification for OpenOffice formats. And it was a belated specification tossed out there as a sop, simply so Microsoft could proclaim they're a "standard", even though it is so impossibly complex that few competitors would bother attempting to implement it.
I think in some cases it's a sign of extremely conservative development - the company is literally paralysed, fearing any change could generate support calls. Or if they do change the UX it's a thin veneer over the terrible old system. I saw this first hand with Lotus Notes which got a facelift but only until you opened a dialog and then it was as awful as it ever was. It's like baking a cake with a dog turd in the middle.
That's all enterprise level software. A salesperson beguiles the CTO / CEO / CFO with horribly expensive software which looks good on paper but isn't fit for purpose. They'll sell it on bullshit claims like it reduces infrastructure costs, or improves metrics or whatever when it will do none of those things. Instead it will become a money pit requiring expensive contractors just to make the thing work at all and thereafter a constant drain on productivity because the UX is unforgiving arcane dogshit and the software needs constant nursing by full time employees.
What data???? All there is in the McDonalds app is discount vouchers and the occasional promotion. I had to double check, and there is NO WAY to put any payment information in there. There is literally nothing of value to protect beyond the login email address and the most basic of information.
So the 2FA is just pointless, at least for retrieving the vouchers.
I haven't entered any credit card details into my McDonalds app so my wallet is quite safe. Attempting to login to get a discount off a burger is not something that should require having to check email for a code to paste into the other window. Aside from anything else it means I'm wasting an extra minute at the kiosk for no benefit to McDonalds.
And if there was something of value in the app, e.g. Monopoly tokens, the simple answer is to surround that stuff with the 2FA.
The McDonalds app insists on 2FA just so I can avail of some stupid hamburger deal. If there was any reason their app had to be protected for some niche reason (e.g. maybe some people order through the app) they should protect that rather than the innocuous stuff. Sometimes security has to be proportionate to what it is protecting.
That's not what they said, and there's no reason to assume it either. Media files are complex, with all sorts of values and buffers whose sizes could be manipulated to trigger writing data beyond the end of a buffer for remote-execution attacks. So rewriting the filter in a safer language is a desirable security mitigation.
Mozilla Firefox replaced its media filter all the way back in version 48 for exactly the same reason. You can see some of the code that Firefox wrote in Rust here - https://github.com/mozilla/mp4parse-rust. I assume Whatsapp implements something similar and probably for a variety of common file types.
Whenever I used BoundsChecker or Purify on a real-world project the machine usually ran out of memory. The overheads of these tools and the instrumentation they added were so bloated and slow that the machine died trying to build the code. Even when they actually worked they generated so many false positives that any genuine issues were drowned out in the noise. Given how obscenely expensive the tools were, I found very little value in them at all.
I think a better approach would be linters that enforce rules like MISRA (rules that make C programming safer). It's still not perfect but at least it would focus on questionable programming practices.
Imagine a cold war between the US and Europe - not much of a stretch. The US government could well enact laws allowing law enforcement far easier access to foreign accounts. If Microsoft / Apple / Meta / Google comply with legal demands then they will be handing over information like candy.
So entrusting some big US corporation with recovery keys is probably a terrible idea, especially for individuals or companies whose line of work could be considered in the national interest.
There is an obvious conflict of interest when Oracle controls one of the most popular open source databases that happens to compete with its own proprietary, expensive database.
So it's not surprising that volunteers withered on the vine and decamped to the fork, along with almost everyone else who had an interest in MySQL.
I'm not "demanding" anything. The reason PowerShell is still niche on Windows is that it's long-winded and provides no migration path. When I or anyone else opens a shell we want to type a few terse commands, not participate in a game of Verby-NounyNoun. Most devs know bash and/or command prompt and they just want brevity and familiarity. And for anything complex they'd write a script of some sort in NodeJS, Python, Perl or whatever, which would have the added bonus of being portable.
PowerShell could easily have implemented "dir" the way people expect - write a Get-DosChildItem cmdlet which supported the old arguments and alias "dir" to that. All of cmd's commands could have been supported, and people would have had no reason to use command prompt because the commands would be unsurprising and PowerShell a superset. They could have done the same for Linux/Unix file utils for that matter, as long as there was a way to switch personas / aliases.
But that didn't happen. So I and probably a lot of people only bother with PowerShell when they have to dip into the bowels of Windows to do some specific thing that is achievable no other way.
I don't use Powershell, which is what I was getting at. Occasionally there will be some weird admin command that requires I use it but not often enough to want to stay there when it's so verbose and strangely arcane - like a mainframe. The aliases suck and the cmdlets are verbose and unfamiliar.
It should have been a seamless transition had Microsoft decided to provide cmdlets that mimicked the old command prompt but they didn't.
Powershell supports cmdlets (mini scripts) so they should have written cmdlets for legacy commands like dir, move, deltree etc. If they strove for compatibility there wouldn't be a need for a command prompt today since they could have implemented everything over Powershell.
Instead they just lazily aliased commands onto the nearest Powershell equivalent even if it behaved differently. So "dir" is not "dir", it's "Get-ChildItem" and none of the arguments work the same.
Shame really. I suppose it's more powerful, but it's also a very long-winded and different way of doing what command prompt is capable of most of the time.
Red Hat take the source code from upstream projects and build their own packages from them. Presumably their build pipeline would just do a pull and rebuild and ignore the rpms Firefox supplies.
This effort appears to be more for users who want to refresh their browser with nightly builds to be on the bleeding edge. Although that comes with an added risk of instability, data corruption etc.
The way to approach a "metaverse" is to give people something fun to do and come back, e.g. an MMO with quests and progression, places to explore, danger. Then let users go wild creating races, factions, clans, bases etc to organically grow out this place. They'll come back for the progression, to hang out, for the raids etc. Plenty of opportunity to monetise the experience without being too heavy handed. It should feel slightly chaotic and mad, but fun.
OR go the Meta route. Create a bunch of boring zones where there is NOTHING to do except play some lame and broken minigames. A metaverse so profoundly dull that all the avatars look like humans, and it was a big deal when they got legs. A metaverse where Zuckerberg envisioned people gathering in virtual conference rooms to look at presentations. God knows how many focus groups and committees Meta ran their vision through to get where they got, but dear god did it fail hard.
I was suggesting this last year. Just generate sites full of garbage - mislabeled or badly mangled images, false content about people and places, false movie trivia, false research papers, false sports & hobbies.
An AI can even generate large quantities of this garbage based on prompts. It works better the more esoteric the subject, since the garbage isn't competing with genuine information.
Then abide by its rules. They are set in place to ensure privacy and security and no more discriminate against US firms than they do European ones. If a firm steps out of line and does so egregiously it'll get a heavy fine for its troubles.
But it does emphasize that Europe (and the UK) really need to reduce their dependence on US firms. e.g. there are European-based cloud services who do take the law seriously, and maybe it's time for governments and companies to think long and hard about digital sovereignty.
If you're using a smart pointer then you're using the heap and have a new somewhere, or something like a std::make_unique or a std::move going on. If you just declare your class / struct in a block of code then it'll be on the stack, but of course member variables might use the heap, e.g. if you had a string or collection in your struct.
The danger with C++ is that using smart pointers is optional and it's still possible to dereference a pointer or subvert reference counting. It's still better than GC in most cases but object ownership and destruction order can get pretty disgusting for shared (ref counted) pointers.
Theoretically Java could do limited lifetime analysis during compilation and zap some shortlived objects programmatically rather than in a GC. e.g. maybe there is a loop where a temporary collection gets dereferenced and it and the objects inside are obviously not referenced elsewhere so just force their destruction from the bytecode. Or maybe an object which is expended / drained / nilled or whatever you want to call it could say so to the runtime to expedite its removal.
But this is only going to work in trivial cases because Java and similar high level languages don't enforce how many references an object has at a time so either the object says "I'm done", or the language has to track and enforce it.
Garbage collection can also be a huge problem for things like games, or real time, or running on anything which only has physical RAM where you have to strictly budget heap to avoid out of memory situations. If you can reasonably foresee GC being an issue, then don't use that language to write the thing.
User experience is definitely something some open source products forget. Making the UI only show things for the task in hand, removing clutter, being discoverable, being forgiving / helpful, and being internally consistent are really important. Sometimes they're more important than the power under the covers.
For example I'd love to use FreeCAD more but the UX is so awful that I'll stick to a free tier on OnShape or Fusion 360. Not because I want to but because those products are way easier to use despite being functionally similar.
Linux (disambiguated here to mean a common dist with a Linux kernel + user land) is a Unix lookalike. It is not compiled or derived from BSD or System V source code so technically it is not Unix. But it implements the same functions, commands and concepts as Unix so for all intents and purposes it's Unix even though it isn't.
About 20 years ago SCO claimed they owned Unix and threw sueballs at everyone to try and wipe out Linux but failed hard because the implementation was different. They also lost ownership of Unix in the process to Novell - oops.
Unix lives on in various derivatives. The most mainstream one in existence is Mac OS since it is a BSD derivative. Most *nix systems implement POSIX standards so there is a lot of similarity regardless of how they came into being.
It's also worth mentioning, this is one guy saying his goal is this, not Microsoft's.
But even if it were Microsoft's, the real issue isn't whether Rust is safer than C (it obviously is) but whether it's worth rewriting existing code for the sake of it. I could entirely see the point of writing new code in Rust but not necessarily some ancient Win32 DLL which hasn't had a vulnerability for as long as anyone can remember.
Cloud has a place for some companies for uptime / disaster recovery and other conveniences. For most companies it probably doesn't matter much who their hosting provider is because their data isn't worth spit to anyone else.
But when a company is big and can afford to run and operate its own servers cloud becomes a really bad idea. Especially for a company like Airbus. It must be aware of a lot of threats from competitors and governments and must be subject to a lot of regulations regarding cybersecurity & resilience. The cloud becomes a point of failure and unique threat that has to be considered very carefully. At the very least, don't use an adversary's cloud hosting - and that includes the USA these days - and ensure digital sovereignty. But even better, bring stuff in house. I'm sure Airbus has honking big server rooms and hundreds of IT staff. They have the means and motivation to do stuff in-house and probably should.
It means the data is stored in a centre somewhere in Europe and not in the hands of a potential adversary. It's not just that it could be stolen, but could be basically held to ransom. e.g. maybe Trump decides to be a dick to Airbus and threatens their data in some way as leverage in a "deal".
Personally if I were any company with concerns about foreign adversaries stealing their data I would want to bring as much of it in house as possible, or at least host it as securely as possible. For a BIG company like Airbus I don't know why they'd want to use the cloud anyway if they could avoid it.
The boards of these companies should be legally and criminally liable for accidents their product causes. i.e. if their vehicle runs a red light and hits a pedestrian then they're on the hook for the bills and potential charges for failing to stop.
Perhaps the law cannot be applied the same way as when a person causes an accident or breaks the law, but there has to be something. There has to be serious risk to the company in terms of fines and personal risk to the board for criminal negligence. There has to be a real possibility of the board of Waymo / Tesla (including Musk) ending up doing time if their product kills a family through negligence or oversight.
The answer is "money" and the question is "how do we take it from people?".
Samsung knows there are people who'll pay two and a half grand for some flimsy gadget that will break and need replacing in a year, even though they could buy a separate phone and tablet for half the price.
Most tinned "cream of chicken" soup is chicken stock, chicken fat, corn starch, milk / cream powder, salt and a few specks of meat. So basically the byproduct of chicken carcasses which have been stripped of their high-quality meat. What's left is used either directly or through reconstituted products like stock powder, fat or meat granules thrown into the mix.
So high quality it is not. Chopping, shaping, extruding or 3d printing the meat would be a costly extravagance for these companies.
It is, but it still goes further than this. It boots up as an Amiga, takes Amiga peripherals like 9-pin controllers and floppy drives, and is designed to go in an A1200 case. There are FPGA-style solutions too which go further again. Nobody realistically thinks OCS / AGA custom chips are going to be fabricated any time soon, but I see the THEA1200 as a pretty lazy effort.
All of which are better than what this thing will be, which is basically just a custom keyboard with a tiny ARM board rattling around inside. No doubt they'll have to toss in some metal bars to make it feel more substantial than it is.
I'd consider myself part of the nostalgia market. I have a gazillion Amiga games, workbench, kickstart roms etc. and they run on my PC. They run on a RPi even.
THEA1200 has a superficial resemblance to an Amiga, but it's a keyboard with some ARM board inside. Like I said, if it were like an A1200 NG (or a Minimig) it might have some merit, but as it is, it doesn't, at least for me. I honestly don't get the appeal.
I asked an AI last night to generate a Python script that draws some boxes with PIL (Pillow) for an e-ink display I'm playing around with. It managed it and saved me faffing around writing the same code by hand. HOWEVER, I only trust AI for this sort of noddy programming - as a glorified wizard for making simple code samples or throwaway stuff. When tasks are complex, or require security / safety I wouldn't trust it as far as I could throw it.