It all comes from the defined requirements and specifications.
i.e. "You shall handle x messages in y milliseconds."
from that, you derive your worst case buffer size given that you can service that buffer every 'z' milliseconds at most (Note, that involves a hard-real-time requirement as it is a bounded maximum time).
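As a rough sketch of that derivation: if up to x messages can arrive in any y-millisecond window, and the consumer may be away from the buffer for as long as z milliseconds, the buffer must hold at least ceil(x * z / y) messages. The numbers below (MSGS_X, WINDOW_Y_MS, SERVICE_Z_MS) are illustrative placeholders, not values from any actual requirement:

```c
#include <stdio.h>

/* Placeholder figures standing in for the requirement's x, y and z. */
#define MSGS_X        100u   /* "x messages ..."                  (assumed) */
#define WINDOW_Y_MS   50u    /* "... in y milliseconds"           (assumed) */
#define SERVICE_Z_MS  10u    /* worst-case service interval z, ms (assumed) */

/* Messages that can pile up while the consumer is away for z ms,
 * rounded up so a partially elapsed message period still gets a slot:
 * ceil(x * z / y). */
#define BUFFER_SLOTS  ((MSGS_X * SERVICE_Z_MS + WINDOW_Y_MS - 1u) / WINDOW_Y_MS)

int main(void)
{
    printf("Worst-case buffer size: %u messages\n", (unsigned)BUFFER_SLOTS);
    return 0;
}
```

With these placeholder numbers that works out to 20 messages; in practice you would also add margin for jitter and any in-flight message being processed when the service window starts.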
i.e. "You shall handle x messages in y milliseconds."
from that, you derive your worst case buffer size given that you can service that buffer every 'z' milliseconds at most (Note, that involves a hard-real-time requirement as it is a bounded maximum time).