@[email protected] to [email protected] • English • 3 months ago
Research AI model unexpectedly modified its own code to extend runtime (arstechnica.com)
26 comments • cross-posted to: technology
@[email protected] • 3 months ago
“We put literally no safeguards on the bot and were surprised it did unsafe things!”
Article in a nutshell
magnetosphere • 3 months ago
Not quite. The whole reason they isolated the bot in the first place was because they knew it could do unsafe things. Now they know which unsafe things are most likely, and can refine their restrictions accordingly.