Pro@programming.dev to Technology@lemmy.world · English · 2 days ago
CursorAI "unlimited" plan rug pull: Cursor AI silently changed their "unlimited" Pro plan to severely rate-limited without notice, locking users out after 3-7 requests (consumerrights.wiki)
QuadratureSurfer@lemmy.world · 2 days ago
Someone just got the AWS bill.
douglasg14b@lemmy.world · 1 hour ago
More like they just got their Anthropic bill. Cloud compute is gonna be cheap compared to the API costs for the LLMs they use/offer.
crunchy@lemmy.dbzer0.com · 2 days ago
That’s got to be it. Cloud compute is expensive when you’re not being funded in Azure credits. Once the dust settles from the AI bubble bursting, most of the AI we’ll see will probably be specialized agents running small models locally.
fmstrat@lemmy.nowsci.com · 2 days ago
I’m still running Qwen32b-coder on a Mac mini. Works great, a little slow, but fine.
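(For anyone curious how a local setup like that is typically wired up: the commenter doesn't say which runtime they use, so the sketch below assumes Ollama serving on its default port and a `qwen2.5-coder:32b` model tag, both of which are assumptions, not their actual config.)

```python
# Minimal sketch: querying a locally hosted coder model through Ollama's
# OpenAI-compatible chat endpoint. Runtime, port, and model tag are assumptions;
# swap in whatever server and model you actually run.
import requests

OLLAMA_URL = "http://localhost:11434/v1/chat/completions"  # Ollama's default local port
MODEL = "qwen2.5-coder:32b"  # illustrative tag; use the model you've pulled locally

def ask_local_model(prompt: str) -> str:
    """Send one chat prompt to the local model and return its reply text."""
    resp = requests.post(
        OLLAMA_URL,
        json={
            "model": MODEL,
            "messages": [{"role": "user", "content": prompt}],
            "stream": False,
        },
        timeout=300,  # a 32B model on a Mac mini can be slow, as the comment notes
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask_local_model("Write a Python function that reverses a string."))
```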