language interface
examining how different languages and mediums serve as interfaces for expressing human intent to AI systems
I’ve been using custom instructions with ChatGPT. Today, Cursor shipped a new feature for setting a repo-level system prompt. Both got me thinking about the role of language in the world of AIs.
Rule of thumb:
- Human to AI: the purpose of language is to encode clear and accurate intent.
- AI to human: the purpose of language is to produce interpretable and useful output.
I find this framing useful for evaluating questions like:
- Should one learn English, Chinese …?
- Should one learn to code?
- Should one code in python, js, sql, C, cuda, assembly …?
For humans, language is the medium for encoding intent so the AI can achieve the goal. How effective one medium is versus another depends on the goal and the required level of detail and control.
LLM compiler
Take Karpathy’s LLM compiler idea, for example.
Since the goal is high-performance cuda code, English alone can’t convey the ideal level of control and detail. In this case python, a higher “bit rate” programming language, is a better medium for interfacing with AI.
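To make the contrast concrete, here is a hypothetical sketch of what the “high bit rate” medium buys you. A short Python reference for softmax, handed to an LLM compiler as the spec for a cuda kernel, pins down details that the English sentence “compute softmax” leaves open, such as the max-subtraction trick for numerical stability:

```python
import math

def softmax(xs):
    """Reference semantics for a hypothetical LLM compiler to translate to cuda.

    The English intent is just "compute softmax". The code additionally
    encodes the numerically stable variant: subtract the max before
    exponentiating, so large inputs don't overflow.
    """
    m = max(xs)                                 # shift for numerical stability
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

print(softmax([1.0, 2.0, 3.0]))
```

The English prompt states the goal; the Python spec states the goal plus the edge-case behavior, and the compiler only has to preserve it.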
Image as foreign language
draw a flying bison in anime style
could be an initial prompt to an image generation model. However, for finer control and detail, one may have to switch to image-to-image, inpainting, and more advanced interfaces to get the ideal artistic rendering out of the model.
tldr
Choose the medium that offers enough detail and control to achieve the goal. Prompt engineering may fade away, but the need to express clear intent never will.