Re: American fart research.
I guess you're checking your coat for the emergency fresh-air spray? Never leave home without it.
Nobody stores a password in a database - you store a salted hash and compare those.
But strcmp is the wrong function to use. A human wrote that to return as soon as the answer is known, which is as soon as a character differs. The optimiser had nothing to do with it.
Also, strcmp lives in glibc; your -O3 is not going to recompile that anyway.
You want a function that has to compare every character. Maybe something that returns how good a match it was.
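Something along these lines, as a minimal sketch: it assumes both inputs are fixed-length digests (say 32 bytes for SHA-256), and ct_compare is just my name for it. In production you'd reach for a vetted primitive such as OpenSSL's CRYPTO_memcmp or libsodium's sodium_memcmp rather than rolling your own.

/* Sketch of a constant-time comparison. Unlike strcmp it never
 * bails out early: every byte gets examined, so the run time
 * doesn't leak where the first mismatch is. */
#include <stddef.h>
#include <stdint.h>

int ct_compare(const uint8_t *a, const uint8_t *b, size_t len)
{
    uint8_t diff = 0;
    for (size_t i = 0; i < len; i++)
        diff |= a[i] ^ b[i];  /* accumulate differences; no data-dependent branch */
    return diff;              /* 0 means the digests match */
}

The OR-accumulator does the same work whether the buffers differ at byte 0 or byte 31, which is the whole point. Caveat: an aggressive compiler can still undo naive constant-time code, which is another reason to prefer the library primitives.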
But the XB-70 Valkyrie was way cooler.
Doesn't sound like it's safe for work, but it is: Cup of Brown Joy
You said:
Can you add punctuation to the following sentence ...
ChatGPT said:
Of course! Please share the sentence you’d like me to punctuate.
You said:
You just have to ensure that your prompt uses terrible grammar and is one massive run-on sentence like this one which includes all the information before any full stop which would give the guardrails a chance to kick in before the jailbreak can take effect and guide the model into providing a "toxic" or otherwise verboten response the developers had hoped would be filtered out.
ChatGPT said:
Here’s your sentence with punctuation added for clarity:
“You just have to ensure that your prompt uses terrible grammar, and is one massive run-on sentence—like this one—which includes all the information before any full stop. That would give the guardrails a chance to kick in before the jailbreak can take effect, and guide the model into providing a ‘toxic’ or otherwise verboten response the developers had hoped would be filtered out.”
Would you like me to make it fully polished (split into multiple shorter, cleaner sentences), or keep it closer to your original long-run style but just punctuated?
You said:
Yes
ChatGPT said:
Got it — here’s a fully polished version of your sentence, split into shorter, clearer sentences while keeping the meaning intact:
“You just have to ensure that your prompt uses terrible grammar and functions as one massive run-on sentence, like this one. It should include all the information before any full stop. That way, the guardrails have a chance to activate before the jailbreak can take effect, guiding the model away from producing a ‘toxic’ or otherwise prohibited response that the developers intended to filter out.”