Variable Digital Delay

Hi all,

How do we implement a variable digital delay?

Hi Sachin,

It is not very clear what you want to do. What do you want to model?

A changing interrupt or execution rate of the algorithm?
Multiple interrupt or execution rates for different parts of the algorithm?
Do you want to model a changing sample point of the analog-to-digital conversion?
What sort of processor is this going to be implemented on?
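
In case what you are after is simply a delay line whose length can change at run time (for example in an audio or communications path), a common starting point is a circular buffer with a movable, possibly fractional, read tap. Below is a minimal sketch in C under that assumption; the buffer size, the names, and the use of linear interpolation are illustrative choices only, not anything you have specified.

```c
#include <stddef.h>

#define MAX_DELAY 1024              /* assumed maximum delay in samples */

typedef struct {
    float  buf[MAX_DELAY];          /* circular sample history */
    size_t write_idx;               /* where the next input sample goes */
} delay_line;

static void delay_init(delay_line *d)
{
    for (size_t i = 0; i < MAX_DELAY; ++i)
        d->buf[i] = 0.0f;
    d->write_idx = 0;
}

/* Push one input sample and return the output delayed by delay_samples.
   The delay may be fractional and may change on every call, as long as
   it stays below MAX_DELAY - 1. */
static float delay_process(delay_line *d, float in, float delay_samples)
{
    d->buf[d->write_idx] = in;

    /* Split the requested delay into integer and fractional parts. */
    size_t whole = (size_t)delay_samples;
    float  frac  = delay_samples - (float)whole;

    /* Taps at 'whole' and 'whole + 1' samples of delay. */
    size_t idx0 = (d->write_idx + MAX_DELAY - whole) % MAX_DELAY;
    size_t idx1 = (idx0 + MAX_DELAY - 1) % MAX_DELAY;

    /* Linear interpolation between the two bracketing taps. */
    float out = (1.0f - frac) * d->buf[idx0] + frac * d->buf[idx1];

    d->write_idx = (d->write_idx + 1) % MAX_DELAY;
    return out;
}
```

Linear interpolation keeps the sketch short; if the delay is modulated smoothly (chorus, Doppler, resampling-style applications), an allpass or higher-order fractional-delay interpolator is often used instead to reduce artifacts.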