
It all comes from the defined requirements and specifications.

i.e. "You shall handle x messages in y milliseconds."

From that, you derive your worst-case buffer size, given that you can service that buffer at most every z milliseconds. (Note that this implies a hard real-time requirement, since z is a bounded maximum time.)
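
A minimal sketch of that derivation, with illustrative numbers (the constants and names below are assumptions, not from the requirement itself): if up to x messages can arrive per y milliseconds, and the consumer is guaranteed to drain the buffer at least once every z milliseconds, the buffer must hold the worst-case arrivals over one service interval, ceil(x * z / y) messages.

  /* Sketch only: constants are hypothetical example values. */
  #include <stdio.h>

  /* Requirement: handle up to X messages per Y milliseconds. */
  #define MSGS_X        100   /* messages that may arrive...     */
  #define WINDOW_Y_MS    10   /* ...within this window (ms)      */

  /* Guarantee: buffer is serviced at least every Z milliseconds
     (the hard real-time bound mentioned above).                 */
  #define SERVICE_Z_MS   25

  /* Worst case: peak arrival rate X/Y sustained for a full
     service interval Z -> ceil(X * Z / Y) messages.
     Integer ceiling division below.                             */
  #define BUFFER_SLOTS ((MSGS_X * SERVICE_Z_MS + WINDOW_Y_MS - 1) \
                        / WINDOW_Y_MS)

  int main(void) {
      printf("worst-case buffer size: %d messages\n", BUFFER_SLOTS);
      return 0;
  }

In practice you might add one extra slot for a message in flight while the consumer runs, but the bound itself falls straight out of the stated requirement.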


