## Introduction and background

As described in the previous article, while reworking the DSO138 oscilloscope toy, the idea arose at some point to try doubling the maximum sampling frequency in the DSO303 firmware, to achieve sweep times of 500 and 200 nanoseconds per division. For the STM32F303, the theoretical maximum sampling rate, as seen from the ADC input, is determined by the minimum aperture time of the ADC sample-and-hold unit: 1.5 clock cycles × (1/72 MHz) ≈ 20.8 nanoseconds, which corresponds to 48 MSPS (million samples per second). However, with four ADCs running in parallel at 6 MSPS each, only 24 MSPS can be achieved, because the conversion speed of the ADCs is the limiting factor.
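The timing arithmetic above can be checked in a few lines; the numbers come straight from the text, and nothing here is device-specific code:

```python
# Minimum aperture time of the sample-and-hold unit: 1.5 ADC clock cycles at 72 MHz.
ADC_CLOCK_HZ = 72e6
APERTURE_CYCLES = 1.5

aperture_s = APERTURE_CYCLES / ADC_CLOCK_HZ   # ~20.8 ns
max_input_rate = 1.0 / aperture_s             # input-side limit: 48 MSPS

# Four ADCs interleaved, each converting at 6 MSPS.
parallel_rate = 4 * 6e6                       # conversion-side limit: 24 MSPS

print(f"aperture: {aperture_s * 1e9:.1f} ns")
print(f"input-limited rate: {max_input_rate / 1e6:.0f} MSPS")
print(f"4-ADC interleaved rate: {parallel_rate / 1e6:.0f} MSPS")
```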

Let's imagine that we are dealing with a strictly periodic signal that is also stable, i.e. its frequency and amplitude do not fluctuate over time. Is it possible to digitize it not in one but in several passes, thereby increasing the effective sampling frequency?

## Triple pass digitization – the idea

Let's turn to Fig. 1, which illustrates the idea of multi-pass digitization. Suppose that each ADC is triggered by its own signal from a timer running at twice the clock frequency of the microcontroller and ADC (for the STM32F303: 72 and 144 MHz, respectively). In this case, one digitization cycle takes 12 ADC clock cycles (24 timer cycles) at 10-bit resolution. We divide this cycle into 8 slots of 3 timer cycles (1.5 ADC cycles) each, which exactly matches the aperture time of the sample-and-hold unit.
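The slot grid described above can be written out explicitly (the constant names are mine, the numbers are from the text):

```python
TIMER_CYCLES_PER_ADC_CYCLE = 2     # timer at 144 MHz, ADC at 72 MHz
ADC_CYCLES_PER_CONVERSION = 12     # one 10-bit digitization cycle
SLOT_TIMER_CYCLES = 3              # 1.5 ADC cycles = aperture time

cycle_timer = ADC_CYCLES_PER_CONVERSION * TIMER_CYCLES_PER_ADC_CYCLE  # 24 timer cycles
slots = [s * SLOT_TIMER_CYCLES for s in range(cycle_timer // SLOT_TIMER_CYCLES)]
print(slots)  # 8 trigger offsets, in timer cycles, within one cycle
```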

Channels ADC 1 and ADC 3 will be the reference channels: they are always triggered at the same point of the cycle, while the trigger times of channels ADC 2 and ADC 4 are shifted within the cycle depending on the pass. That's essentially it: after performing 3 such digitization passes, we get the final result.
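The counts work out neatly: 2 fixed reference slots plus 2 moving channels × 3 passes fill all 8 slots. The concrete offsets below are my assumption for illustration (the real ones are defined by Fig. 1, which is not reproduced here); only the fixed/moving split is taken from the text:

```python
# Slots are indices 0..7 within one digitization cycle (3 timer cycles each).
# ADC1 and ADC3 are the fixed reference channels; ADC2 and ADC4 move per pass.
# NOTE: the specific slot numbers are a hypothetical example, not the firmware's.
SCHEDULE = {
    1: {"ADC1": 0, "ADC2": 1, "ADC3": 4, "ADC4": 5},
    2: {"ADC1": 0, "ADC2": 2, "ADC3": 4, "ADC4": 6},
    3: {"ADC1": 0, "ADC2": 3, "ADC3": 4, "ADC4": 7},
}

covered = sorted({slot for p in SCHEDULE.values() for slot in p.values()})
print(covered)  # all 8 slots covered -> 8 samples per 166.7 ns cycle = 48 MSPS
```

With any schedule of this shape, three passes cover every slot once, which is where the doubled effective rate comes from.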

"However," an attentive reader will say: "but pardon me, why do you assume that your digitization cycle in the next pass will be in the same position with respect to the signal? After all, the start of the digitization process occurs at a random moment in time...". And, of course, he will be absolutely right.

But this is exactly why we have two reference channels out of the four available, and time to run the digitization process an unlimited number of times. We just need to rerun the next pass over and over until the position of the digitization cycle relative to the signal is close enough to the one in the first pass, checking the match via the correlation of the reference channels. Thus, our digitization process looks like this:

1. Digitize and record the data of Pass 1.
2. Digitize the data of Pass 2.
3. Compute the correlation of the reference channels; if it is good enough, record the data of Pass 2, otherwise repeat step 2.
4. Digitize the data of Pass 3.
5. Compute the correlation; if everything is good, record the data, start the final assembly and processing of the digitization data, and display the resulting image; otherwise repeat step 4.
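The acceptance loop above can be sketched as follows. This is a simulation, not the firmware: `acquire_pass` stands in for the real ADC read (here it just samples a sine at a random phase), the correlation threshold is a value I picked, and `max_tries` is a safety bound I added:

```python
import math
import random

def acquire_pass():
    """Simulate one pass: the cycle lands at a random phase relative to the signal.
    Returns (reference_samples, interleaved_samples)."""
    phase = random.uniform(0.0, 2.0 * math.pi)
    ref = [math.sin(phase + k) for k in range(16)]         # reference channels
    data = [math.sin(phase + k + 0.5) for k in range(16)]  # shifted channels
    return ref, data

def correlation(a, b):
    """Normalized cross-correlation of two equal-length sample vectors."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def multi_pass_capture(threshold=0.99, max_tries=100_000):
    ref1, out = acquire_pass()
    record = [out]                      # Pass 1 is always recorded
    for _ in range(2):                  # Passes 2 and 3
        for _ in range(max_tries):
            ref, data = acquire_pass()
            if correlation(ref1, ref) >= threshold:  # cycle landed close enough
                record.append(data)
                break
        else:
            raise RuntimeError("could not align pass with reference")
    return record                       # three aligned passes, ready to interleave

passes = multi_pass_capture()
print(len(passes))  # 3
```

The key point the simulation shows: alignment is purely statistical, so the number of retries per pass is unbounded in principle, which is why capture can take a while.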

## Results

Yes, sometimes the process takes quite a long time, and digitization may be repeated tens of thousands of times before a result is achieved; yes, this method is not suitable for all signals; yes, the result is not free of artifacts, as in Fig. 2, but quite often it turns out as in Fig. 3. In both figures, a 1 MHz square wave (meander) is applied to the input.

As expected, the worst results are obtained for signals with rapid shape changes and sharp edges (square wave), and the best for smooth signals (sine wave). For illustration, Fig. 4 shows a sine signal with a frequency of 200 kHz. The signal is "slightly" noisy, which unfortunately leads to quite large distortions: as mentioned above, a prerequisite for a correct image with this digitization method is signal stability and the absence of fluctuations over time, and this picture illustrates that as well.

For the correlation algorithm to work correctly, the signal period theoretically cannot be shorter than the digitization cycle time, but a practically acceptable result is obtained only when the signal period exceeds 3 cycles, i.e. >500 nanoseconds. Thus, unfortunately, signals with frequencies above 2 MHz still cannot be examined. But at least such signals can now be seen in sufficient detail, which is undoubtedly useful.
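The 2 MHz limit follows directly from the cycle timing (numbers from the text):

```python
ADC_CLOCK_HZ = 72e6
CYCLE_ADC_CLOCKS = 12                        # one digitization cycle

cycle_s = CYCLE_ADC_CLOCKS / ADC_CLOCK_HZ    # ~166.7 ns per cycle
min_period_s = 3 * cycle_s                   # practical limit: 3 cycles = 500 ns
max_freq_hz = 1.0 / min_period_s             # -> 2 MHz

print(f"{min_period_s * 1e9:.0f} ns minimum period -> {max_freq_hz / 1e6:.0f} MHz maximum")
```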

That is how the DSO303 got sweep times of 500 and 200 nanoseconds per division. In the 500 ns/div mode, the image is an honest digitization result: each screen point corresponds to one sample. Then I thought it would be nice to magnify this image as well, and so the 200 ns/div mode appeared. It uses exactly the same data as the 500 ns/div sweep, only stretched so that one sample corresponds to 3 points on the screen. It turned out to be quite convenient and earned its place.

All of the above is an experimental idea. The author does not claim that this method of digitization is fully operational and accepts no responsibility for the results. Anyone using this method of increasing the effective sampling rate, as implemented in the DSO303 oscilloscope firmware, does so exclusively at their own risk.

Any questions, comments and suggestions are welcome.