Earlier today I had a very interesting discussion with another tech local to my area. His working career in electronics started sometime in the late '60s or early '70s and is still going strong today, so I, and many others nationally, value his vast experience.

When we digressed briefly onto the subject of voltage ratings, he reminded me that excessively high capacitor voltage ratings can work to the detriment of the circuit they are used in.

Using a 50V-rated cap in a 12V or 9V circuit is reasonable, but using 600V caps on the small 100mV signals found inside a guitar can actually de-rate the effective capacitance of the cap. The problem is hundreds of times worse with electrolytic caps, as they require a minimum voltage across them to maintain their capacitance. So right here and now I'll offer what I strongly suspect: every cap has an ideal working voltage range, and a bigger voltage rating isn't always better.
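To put a rough number on what losing effective capacitance does in a signal path, here's a quick Python sketch (purely illustrative values, nothing measured): it works out the -3dB corner of a basic RC high-pass coupling stage and shows how far it moves if the cap only delivers a fraction of its marked value.

```python
import math

def highpass_corner_hz(r_ohms, c_farads):
    """-3dB corner of a simple RC high-pass (coupling cap into a resistive load)."""
    return 1.0 / (2.0 * math.pi * r_ohms * c_farads)

# Illustrative values only: a 0.022uF coupling cap into a 1M grid-leak resistor.
r_load = 1_000_000      # ohms
c_marked = 0.022e-6     # farads, the value printed on the cap

for fraction in (1.0, 0.8, 0.5):
    c_effective = c_marked * fraction
    fc = highpass_corner_hz(r_load, c_effective)
    print(f"cap at {fraction:.0%} of marked value -> corner ≈ {fc:.1f} Hz")
```

The same arithmetic applies anywhere the cap sets a roll-off, which is why a cap that isn't delivering its marked value can change how a circuit behaves and feels.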

As an example, if you look at most tube amp circuits, the cathode bypass caps are typically rated at 25V or 40V. That is fine, as most cathode voltages in normal operation are on the order of a few volts, even when there are a few hundred volts on the anode. It is always a case of choosing an appropriate voltage rating and capacitance value for the job at hand. Throwing a 450V cap on a cathode bypass may work for a while, or during testing, but that cap will fail reasonably quickly: a 450V cap probably needs at least 50V or more across it to maintain its internal condition, and even the 10V or so on a self-biased EL84 cathode will not be enough to keep it healthy long term.
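The cathode bypass numbers are easy to check with the same sort of arithmetic. The sketch below assumes a self-biased EL84 with roughly 10V across a cathode resistor in the low hundreds of ohms (ballpark figures, not from any particular schematic), and uses the common rough approximation f = 1/(2*pi*Rk*C) for the bypass corner, ignoring the valve's own impedance.

```python
import math

# Ballpark self-bias EL84 figures (assumptions, check your own schematic):
v_cathode = 10.0      # volts across the cathode resistor
r_cathode = 220.0     # ohms, cathode resistor
c_bypass = 100e-6     # farads, a common 100uF bypass cap

i_cathode = v_cathode / r_cathode   # standing cathode current, amps

# Rough bypass corner, ignoring the valve's internal impedance:
f_corner = 1.0 / (2.0 * math.pi * r_cathode * c_bypass)

print(f"cathode current ≈ {i_cathode * 1000:.0f} mA")
print(f"cap only ever sees ≈ {v_cathode:.0f} V, so a 25V part has plenty of headroom")
print(f"bypass corner ≈ {f_corner:.1f} Hz, well below the guitar range")
```

The point being: the voltage rating has nothing to do with the audible job the cap does; it just has to comfortably cover the 10V or so the cap actually sits at.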

So, if the range of your tone pot in your build is not up to your expectations with that 600V orange drop or whatever cap you're using, then considering the job to be done, a 200V or 100V version might be a better choice...
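For what it's worth, you can sanity-check the tone-pot behaviour the same way. The sketch below treats the control as a tone cap to ground fed through the pickup resistance plus whatever pot resistance remains in series, with the pickup modelled as a simple resistive source (a big simplification, since it ignores pickup inductance, and the resistances and the 0.022uF value are just common assumed figures): if the cap in circuit only behaves as part of its marked value, the treble cut moves up and the pot feels like it does less.

```python
import math

def tone_rolloff_hz(r_series_ohms, c_farads):
    """Very rough treble roll-off of a guitar tone control treated as a simple RC low-pass."""
    return 1.0 / (2.0 * math.pi * r_series_ohms * c_farads)

# Assumed values: ~8k pickup resistance plus ~10k of pot left in circuit.
r_total = 8_000 + 10_000      # ohms
c_marked = 0.022e-6           # farads

for fraction in (1.0, 0.7, 0.5):
    fc = tone_rolloff_hz(r_total, c_marked * fraction)
    print(f"cap at {fraction:.0%} of marked value -> treble cut starts around {fc:.0f} Hz")
```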