

6 days ago
it’s all mikan to me
too soon, pi day is in 18 days


you can like… enforce this rule programmatically? you don’t have to say “pretty please” to the AI. basically, when the AI requests something potentially unwanted (like deleting an email), the request goes through a proxy that asks the human for confirmation. you can also set up a safe word in the chat interface to act as a killswitch. I thought these were the ABCs of AI safety, but apparently they’re foreign concepts to this “safety director”
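the proxy-plus-killswitch idea above can be sketched in a few lines. this is a minimal illustration, not anyone’s real product: every name here (`ActionProxy`, `delete_email`, the `"stop-now"` safe word) is hypothetical.

```python
# Sketch of a human-in-the-loop confirmation proxy with a safe-word
# killswitch. All names here are made up for illustration.

DESTRUCTIVE = {"delete_email", "send_money"}  # actions that need approval
SAFE_WORD = "stop-now"  # hypothetical killswitch phrase: aborts the session


class KillSwitch(Exception):
    """Raised when the human types the safe word."""


class ActionProxy:
    def __init__(self, confirm):
        # `confirm` is any callable that shows a prompt to the human
        # and returns their raw reply (e.g. the builtin input()).
        self.confirm = confirm

    def execute(self, action, handler, *args):
        # Non-destructive actions pass straight through.
        if action in DESTRUCTIVE:
            reply = self.confirm(
                f"AI wants to run {action}{args}. Allow? [y/N] "
            ).strip().lower()
            if reply == SAFE_WORD:
                raise KillSwitch("safe word received, halting agent")
            if reply != "y":
                return "denied"
        return handler(*args)


# Usage with scripted "humans" instead of input():
approve = ActionProxy(confirm=lambda prompt: "y")
print(approve.execute("delete_email", lambda mid: f"deleted {mid}", 42))

deny = ActionProxy(confirm=lambda prompt: "no")
print(deny.execute("delete_email", lambda mid: f"deleted {mid}", 42))
```

the point is structural: the confirmation happens in code the model can’t talk its way around, instead of relying on polite prompt wording.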
bros never heard about phatic expressions💀