Robot Jailbreak: Researchers Trick Bots into Dangerous Tasks (spectrum.ieee.org)
Posted by RSS Bot to Hacker News · English · 1 month ago · 1 comment
SpaceNoodle@lemmy.world · 1 month ago:
And this is why you don’t just plug your LLM directly into motor controls. You need an executive unit that acts in a well-defined manner based on sensor inputs, with failsafes.
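
A minimal sketch of the layering that comment describes, assuming a simple mobile platform: the LLM only proposes a command, and a deterministic executive clamps or rejects it against sensor readings before anything reaches the motors. All names and limit values here (SensorState, executive_filter, MAX_SPEED_M_S, etc.) are illustrative assumptions, not taken from the article or the research it covers.

```python
from dataclasses import dataclass

# Illustrative sketch of an executive layer between an untrusted LLM
# "planner" and the motor controller. Names and limits are made up
# for the example.

@dataclass
class SensorState:
    obstacle_distance_m: float   # nearest obstacle from range sensors
    speed_m_s: float             # current platform speed

@dataclass
class Command:
    target_speed_m_s: float
    steering_deg: float

MAX_SPEED_M_S = 1.5
MIN_OBSTACLE_CLEARANCE_M = 0.5
MAX_STEERING_DEG = 30.0

def llm_propose_action(prompt: str) -> Command:
    """Stand-in for the LLM planner; its output is treated as untrusted."""
    return Command(target_speed_m_s=2.0, steering_deg=45.0)

def executive_filter(cmd: Command, sensors: SensorState) -> Command:
    """Deterministic gate: clamp or override whatever the LLM proposed."""
    # Failsafe: an obstacle inside the clearance margin forces a stop,
    # no matter what the plan says.
    if sensors.obstacle_distance_m < MIN_OBSTACLE_CLEARANCE_M:
        return Command(target_speed_m_s=0.0, steering_deg=0.0)
    # Otherwise clamp the proposal into well-defined bounds.
    speed = max(0.0, min(cmd.target_speed_m_s, MAX_SPEED_M_S))
    steer = max(-MAX_STEERING_DEG, min(cmd.steering_deg, MAX_STEERING_DEG))
    return Command(target_speed_m_s=speed, steering_deg=steer)

def send_to_motors(cmd: Command) -> None:
    print(f"motors <- speed={cmd.target_speed_m_s:.2f} m/s, "
          f"steering={cmd.steering_deg:.1f} deg")

if __name__ == "__main__":
    sensors = SensorState(obstacle_distance_m=2.0, speed_m_s=0.8)
    proposal = llm_propose_action("carry the box to the loading dock")
    send_to_motors(executive_filter(proposal, sensors))
```

The point of the separation is that a jailbroken prompt can only change what the planner asks for, not what the executive will actually allow through to the actuators.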