- cross-posted to:
- hackernews
Copilot purposely stops working on code that contains words from GitHub's hardcoded ban list, such as "gender" or "sex". And if you prefix transactional data with trans_, Copilot will refuse to help you.  😑


Because the AI itself is a black box, there will always be a "jailbreak" unless a hard filter is applied as a post-processing step.
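
For illustration only, a naive hard post-processing filter of this kind might look like the sketch below (the word list, function name, and matching logic are assumptions, not GitHub's actual implementation). It also shows why plain substring matching trips on harmless identifiers like `trans_action_total`:

```python
# Hypothetical sketch of a naive hard filter applied to model output.
# The banned-word list and matching logic are assumptions for illustration,
# not GitHub's actual implementation.
BANNED_SUBSTRINGS = ["gender", "sex", "trans"]

def is_blocked(completion: str) -> bool:
    """Return True if the completion contains any banned substring."""
    lowered = completion.lower()
    return any(word in lowered for word in BANNED_SUBSTRINGS)

# A legitimate identifier trips the filter purely by substring match:
print(is_blocked("trans_action_total = sum(t.amount for t in txns)"))  # True
print(is_blocked("def compute_tax(amount): return amount * 0.2"))      # False
```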