Most modern programming languages let you define macros, new symbols, or other extensions. Generally, you shouldn't use these features. Here's why:

First, sticking to the standards of the language you choose makes the code more readable. Not necessarily for you, but certainly for anybody else. This saves time and effort whenever someone else has to go through the code, whether it's a peer reviewing a large enterprise project or a volunteer helping out with the open source app you put on the web. And because others spend less time and pain working through the code, they can help you faster and will be more willing to do so again than with hard-to-read code.

Second, you can keep up to date more efficiently. If you have written lots of code to work around some shortcoming of the programming language, and a new version ships a feature that makes those problems go away, how are you going to find the problematic spots if you buried them under custom extensions? Such releases often come with migration guides containing examples of the new features. If you can go through the code semi-automatically right away, why spend time adapting those examples to your customized language additions?

Finally, why bother at all? Is there a real need to add new features, macros, or other constructs to the programming language? Or would it be easier to address the problem with another, better-suited language? Be clear about what you want: solving the problem, or extending your favorite programming language? If it's the latter, why are you doing it within the code of an application?