• 0 Posts
  • 21 Comments
Joined 3 years ago
Cake day: June 9th, 2023

  • I don’t understand why this works, but it does

    What was happening before was this: Git received your commits and ran the shell script, with the script’s output stream connected back to Git so it could relay it over the connection and display it on your local terminal with remote: stuck in front of it. Backgrounding the npx command was working: the shell was quitting without waiting for npx to finish. However, Git wasn’t waiting for the shell script to finish; it was waiting for the output stream to close, and npx was still writing to it. You backgrounded the task but didn’t give npx a new place to write its output, so it kept using the same output stream as the shell.

    Running it via bash -c means “don’t run this under this bash shell, start a new one and have it just run this one command rather than waiting for a human to type a command into it.”

    The & inside the quote is doing what you expect, telling the subshell to background this task. As before, it’ll quit once the command is running, as you told it not to wait.

    The last bit, &> /dev/null, tells your original shell that this command’s output (both stdout and stderr) should go somewhere else instead: the special file /dev/null, which, like basically everything else in /dev/, is not really a file but special kernel magic. Its particular trick is that writes to it succeed as normal, except all the data is just thrown away. Great for output like this that you don’t want to keep.

    So the reason this works is that you’re redirecting the npx output elsewhere; it just goes into a black hole that nothing is waiting on. The subshell isn’t waiting for the command to finish, so it quits almost immediately, and then the top-level shell moves on once the subshell has finished.

    I don’t think the subshell is necessary; if you do &> /dev/null & directly, I think it’d be the same effect. But spawning a useless shell for a split second happens all the time anyway, so it’s probably not worth worrying about too much.
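    The same idea shows up outside the shell, too. Just as an illustration (not how the hook has to be written), here’s a rough Kotlin sketch of the inherit-the-stream versus discard-the-stream difference using ProcessBuilder; the sleep command is a hypothetical stand-in for the real npx invocation:

        fun main() {
            // Hypothetical long-running command standing in for the real `npx ...` call.
            val cmd = listOf("sleep", "30")

            // Like `npx ... &` with no redirect: the child inherits this process's
            // stdout/stderr, so whoever is reading our output (Git, in the hook case)
            // won't see the stream close until the child exits, even though we never wait.
            ProcessBuilder(cmd)
                .redirectOutput(ProcessBuilder.Redirect.INHERIT)
                .redirectError(ProcessBuilder.Redirect.INHERIT)
                .start()                                      // note: no waitFor()

            // Like &> /dev/null: the child writes to the null device instead,
            // so nothing is left holding our output stream open after we exit.
            ProcessBuilder(cmd)
                .redirectOutput(ProcessBuilder.Redirect.DISCARD)
                .redirectError(ProcessBuilder.Redirect.DISCARD)
                .start()
        }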





  • Opt-out means “we will be doing this, without permission, unless you tell us not to,” and opt-in means “if you give us permission, we will do this.” Codebases can contain important and sensitive information, and sending it off to some server to be shoved into an LLM is something that should be done with care. Getting affirmative consent is the bare minimum.


  • The right thing is to make it opt-in for everyone, simple as that. The entire controversy goes away immediately if they do. If they really believe it’s a good value proposition for their users, and want to avoid collecting data from people who didn’t actually want to give it, they should have faith that their users will agree and affirmatively check the box.

    If free users are really such a drain on them, why have they been offering a free version for so long before it became a conduit to that sweet, sweet data? Because it isn’t a drain; it’s a win-win. They want people using their IDE, even for free: they don’t get money from it, but they get market share, broad familiarity with their tool amongst software engineers, a larger user base that can support each other on third-party sites and provide free advertising, and more.







  • chaos@beehaw.org to Science Memes@mander.xyz · the welsh are fish

    There isn’t a simple evolutionary definition of “fish”, not the same way there is for, say, mammals. If you found the common ancestor of everything we call a mammal and said “everything descended from this one is also a mammal”, you’d be correct. If you did that for everything we call fish, every animal in the world would be a fish. Also, we decided which animals were fish mostly on vibes, so without a clear definition you can pedantically argue that everything is a fish including mammals.




  • I'm pretty sure the ball landed in

    C3.

    Albert is very sure that Bernard doesn’t know either. Bernard would know the location outright if it were in column 5 or 6, so Albert’s certainty tells all of us that he was told a row that isn’t A or B (the rows those cells are in).

    Now that Bernard can also deduce that it’s not in A or B, he’s narrowed it down to one possibility. That means all of us now know it can’t be in column 1 either, because if it were, ruling out rows A and B wouldn’t have been enough for him to pin it down.

    Finally, now that column 1 is eliminated, Albert has deduced the location. Row D would still have left two possibilities, but row C leaves just one, so we know it must be in row C.

    For the rest, well, there isn’t actually even a question. I suspect you’d open a door, pick a box, and hope you’ve got a gold ball to pick, and it’s not clear that he’s following Monty Hall rules and always opening a bad door, but I think knowing which ball got thrown would make the rest of the odds fall into place.
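    If you wanted to check that chain of deductions mechanically, the three steps translate pretty directly into filters. Here’s a rough Kotlin sketch, assuming the puzzle’s candidate cells (which I haven’t reproduced here) are supplied as row/column pairs:

        data class Cell(val row: Char, val col: Int)

        fun solve(candidates: List<Cell>): List<Cell> {
            // Bernard (told only the column) could name the cell outright if his
            // column contains exactly one candidate.
            fun uniqueInCol(cs: List<Cell>, c: Cell) = cs.count { it.col == c.col } == 1

            // Step 1: Albert knows Bernard doesn't know, so Albert's row can't
            // contain any cell that is unique in its column.
            val step1 = candidates.filter { c ->
                candidates.none { o -> o.row == c.row && uniqueInCol(candidates, o) }
            }

            // Step 2: Bernard now knows, so the cell is unique in its column among what's left.
            val step2 = step1.filter { c -> step1.count { it.col == c.col } == 1 }

            // Step 3: Albert now knows too, so the cell is unique in its row among what's left.
            return step2.filter { c -> step2.count { it.row == c.row } == 1 }
        }

        fun main() {
            val candidates = listOf<Cell>()   // fill in the grid cells from the puzzle here
            println(solve(candidates))        // with the real grid, this should print exactly one cell
        }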


  • More specifically, they’re borrowing the more mathematical meaning of variables, where if you say x equals 5, you can’t later say x is 6, and where a statement like “x = x + 1” is nonsense. Using “let” means you’re setting the value once and that’s what it’s going to remain as long as it exists, while “var” variables can be changed later. Functional languages, which are usually made by very math-y people, will often push back on how programmers use the operators: = stays strictly for equality and assignment is written :=, instead of the == and = of most C-style languages.
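    The thread’s language isn’t shown here, but Kotlin draws the same line with val and var, which makes for a quick sketch of the idea:

        fun main() {
            val x = 5       // bound once; the compiler rejects any reassignment
            // x = 6        // won't compile: "val cannot be reassigned"
            // x = x + 1    // likewise rejected, just as it's nonsense in math

            var y = 5       // a mutable variable
            y = y + 1       // fine: reassignment is allowed for var
            println(y)      // prints 6
        }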


  • Do you think there’s any reason to believe that these tools are going to continue their breakneck progress? It seems like we’ve reached a point where throwing more GPUs and text at these things isn’t yielding better results, and they still don’t have the problem-solving skills to work out tasks outside of their training set. It’s closer to a StackOverflow that magically has the answers to most questions you ask than a replacement for proper software engineering. I know you never know if a breakthrough is around the corner, but it feels like we’ve hit a plateau for the foreseeable future.



  • Yeah, they’re probably talking about nulls. In Java, object references (simplified pointers, really) can be null: pointing nowhere, and throwing an exception if you try to access them. That’s fine when you genuinely don’t have a value for that reference (for example, you asked for a thing that doesn’t exist, or you haven’t made the thing yet), but it means that every time you interact with an object that turns out to be null, a NullPointerException gets thrown and likely crashes your program. You can check first if you think a value might be null, but if you miss one, it explodes.

    Kotlin has nulls too, but the type system helps track where they could be. If a variable can be null, it’ll have a type like String?, and if not, the type is String. With that distinction, a function can explicitly say “I need a non-null value here” and if your value could be null, the type system will make you check first before you can use it.

    Kotlin also has some nice quality-of-life improvements over Java: it’s less verbose (not a hard task), doesn’t force everything to belong to a class, and supports data classes, which come with equality, copying, and printing built in and behave more like plain values than objects, among other improvements.
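    As a small sketch of what the String? distinction looks like in practice (readLine() here is just a convenient source of a value that might be missing):

        fun shout(name: String): String = name.uppercase() + "!"   // demands a non-null String

        fun main() {
            val maybeName: String? = readLine()   // String? means "might be null"

            // shout(maybeName)                   // won't compile: String? isn't a String

            if (maybeName != null) {
                println(shout(maybeName))         // smart-cast to String inside the check
            }

            // Or skip the explicit check with the safe-call and elvis operators:
            println(maybeName?.length ?: 0)       // 0 if maybeName is null
        }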


  • I see this as an accessibility problem: computers have incredible power, but taking advantage of it requires a very specific way of thinking and the drive to push through adversity (the computer constantly, and correctly, telling you “you’re doing it wrong”) that a lot of people can’t or don’t want to summon. I don’t think they’re wrong or lazy to feel that way, and it’s a barrier to entry just like a set of stairs is to a wheelchair user.

    The question is what to do about it, and there’s so much we as an industry should be doing before we even start to think about getting “normies” writing code or automating their phones. Using a computer sucks ass in so many ways for regular people: you buy something cheap and it’s slow as hell, it’s crapped up with adware and spyware out of the box, and scammers are everywhere ready to cheat you out of your money… anyone here is likely immune to all that or knows how to navigate it, but most people are just muddling by.

    If we got past all that, I think it’d be a question of meeting users where they are. I have a car but I couldn’t replace the brakes, nor do I want to learn or try to learn, but that’s okay. My car is as accessible as I want it to be, and the parts that aren’t accessible, I go another route (bring it to a mechanic who can do the things I can’t). We can do this with computers too, make things easy for regular people but don’t try to make them all master programmers or tell them they aren’t “really” using it unless they’re coding. Bring the barrier down as low is it can go but don’t expect everyone to be trying to jump over it all the time, because they likely care about other things more.