Spontaneous boiling is the scientifically correct term, so, yes, he is wrong for correcting her.
https://scholar.google.com/scholar?hl=en&q="spontaneous+boiling"
The only reason this is “clickbait” is that someone chose to do this deliberately, rather than their own mental instability bringing it out organically.
This is my point. The case we are discussing now isn’t noteworthy, because doing it deliberately is about as “impressive” as writing out a disturbing sentence in MS Paint. One cannot create a useful “answer engine” without it being capable of producing something that looks weird/provocative/offensive when taken out of context, any more than one can create a useful drawing program that blocks out all offensive content. Nor is that a worthwhile goal.
The cases to care about are those where the LLM takes a perfectly reasonable conversation off the rails. Clickbait like the one in the OP is actually harmful in that it drowns out such real cases, and is therefore deserving of ridicule.
Does the marketing matter when the reason for the offending output is that the user spent significant deliberate effort in coaxing the LLM to output what it did? It still seems like MS Paint with extra steps to me.
I get not wanting LLMs to output “offensive content” unprompted. Just like it would be noteworthy if “Clear canvas” in MS Paint sometimes yielded a violent, bloody photograph. But that isn’t what is going on in OP’s clickbait.
And, the thing is, LLMs are quite well protected. Look what I coaxed MS Paint to say with almost no effort! Don’t get me started on plain pen and paper! Which we put in the hands of TODDLERS!
If someone is trying to do the most good with their money, it seems logical to give via an organization that distributes the funds according to a plan. To instead hand out money to the people closest at hand seems like it could be motivated more by making me feel good than by actually making a difference.
Furthermore, there are larger-scale systemic issues. Begging takes up a lot of time. It becomes a problem if it pays someone enough to outcompete more productive uses of time that could, in some cases, pay and, in other cases, at least be more useful: childcare/teaching kids, home maintenance, cooking, cleaning, etc. In contrast, state welfare programs and aid organizations usually do not condition help on the recipient sitting idle for long stretches. Add to this that begging really only works in crowded areas, which may limit the possibility of relocating somewhere where living might be more sustainable. Hence, in the worst case, handing out money to those who beg for it could actually make it harder for people stuck in a very difficult situation to get out of it.
This “analysis” of course skips over the many, many individual circumstances that get people into a situation where begging seems the right choice. What we should be doing is investing public funds even more heavily in social programs and other aid to (1) prevent, as far as possible, people ending up in these situations; and (2) get people out of these situations as effectively as possible.
Crush both apples with the blunt side of the knife. Divide applesauce equally.
I very much understand wanting to have a say against our data being freely harvested for AI training. But this article’s call for a general opt-out of interacting with AI seems a bit regressive. Many aspects of this and other discussions about the “AI revolution” remind me of the Mitchell and Webb sketch on the start of the Bronze Age: https://youtu.be/nyu4u3VZYaQ
On the topic of things never to forgive Red Hat for, aren’t there other things that are more pressing? Like inventing a whole scheme to circumvent the spirit of the GPL license via service-contract blackmail?
I couldn’t agree more. If only there were a somewhat user-friendly setting that allowed the OOM killer to be far more aggressive, killing or freezing processes as soon as their memory use starts to affect system responsiveness, and then just told me that this is what happened.
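For what it’s worth, here is a rough userspace sketch of the behaviour I’m wishing for, close in spirit to what tools like earlyoom already do: poll MemAvailable and freeze the largest resident process once available memory drops below a threshold, and say so. The 10% threshold and 1-second polling interval are arbitrary placeholders, not recommendations.

    import os
    import signal
    import time

    # Sketch only: freeze (SIGSTOP) the biggest process when free memory gets low.
    THRESHOLD = 0.10  # fraction of total memory that must stay available (placeholder)

    def meminfo():
        # Parse /proc/meminfo into {field: kB} (lines look like "MemTotal: 16384256 kB").
        info = {}
        with open("/proc/meminfo") as f:
            for line in f:
                key, value = line.split(":", 1)
                info[key] = int(value.split()[0])
        return info

    def biggest_process():
        # Return the PID with the largest resident set, per /proc/<pid>/statm (field 2 = RSS pages).
        best_pid, best_rss = None, 0
        for pid in filter(str.isdigit, os.listdir("/proc")):
            try:
                with open(f"/proc/{pid}/statm") as f:
                    rss_pages = int(f.read().split()[1])
            except (OSError, ValueError, IndexError):
                continue  # process vanished or is unreadable
            if rss_pages > best_rss:
                best_pid, best_rss = int(pid), rss_pages
        return best_pid

    while True:
        m = meminfo()
        if m["MemAvailable"] / m["MemTotal"] < THRESHOLD:
            pid = biggest_process()
            # A real tool would also skip critical system processes and track already-frozen PIDs.
            if pid is not None and pid != os.getpid():
                os.kill(pid, signal.SIGSTOP)  # freeze it rather than kill it
                print(f"Froze PID {pid}: available memory dropped below threshold")
        time.sleep(1)

Freezing rather than killing at least leaves the choice of what to do next with the user, which is the part I actually want.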
Ada should be the lawful good.
Bash is chaotic neutral.
Java is lawful neutral.
JavaScript fits OK as chaotic evil.
Move ASM to neutral evil.
And maybe F77 as lawful evil.
Indeed; the way forward would instead be regulation to weaken the record labels’ hold on licensing rights: basically, if they license music X to streaming service A, any other reasonable competing service must be allowed to license the same piece of music for the same price. This would open the space up to real competition in a way that doesn’t necessarily drive down license fees.
Something utterly meaningless, like a bag of generic candy from the closest corner store, “wrapped” only in that store’s type of plastic bag, clearly purchased last-minute on your way over to them. As they unwrap it, you slip in an “oh, I forgot to take that out” and snatch up the receipt you’ve left in the bag, but only after they’ve seen that the item was on sale for $0.99.
“Dude, Where’s My Car?” turning into prophecy wasn’t on my bingo card:
https://youtu.be/iuDML4ADIvk