
It uses about 750KB total, spread over a dozen or so tiny programs. Breaking it into so many pieces probably costs 2x the memory and extra CPU time, and at least 2x-3x the development time.

It pushed me to work on minimum-memory maze solving. It costs a lot to test a cell (this involves ray-casting in the simulated world), so something like A*, which examines most of the cells, is out. An algorithm in Wikipedia turned out not to work; some anon had snuck in a reference to an obscure paper of their own. Had to fix that. What I'm doing is "head for the goal; when you hit something, follow the wall; if heading for the goal again will get you closer, do that". Wall following runs both ways simultaneously, so you don't spend too long going the long way around when a short path is available. After getting a path, it's tightened up to take out the jaggies. This is not optimal but is usually reasonably good.
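A minimal 2-D grid sketch of that bug-style rule (all names here are made up for illustration; the real thing runs against ray-casts in a continuous world, follows the wall in both directions at once, and tightens the path afterward, while this sketch follows one direction only and skips the tightening):

```python
# Bug2-style planner on a 2-D grid ('#' = wall, '.' = free).
GRID = [
    "..........",
    "..........",
    "...####...",
    "...#..#...",
    "...#..#...",
    "...####...",
    "..........",
]

DIRS = [(1, 0), (0, 1), (-1, 0), (0, -1)]   # headings: 0=E, 1=S, 2=W, 3=N

def free(grid, p):
    x, y = p
    return 0 <= y < len(grid) and 0 <= x < len(grid[0]) and grid[y][x] != '#'

def dist(a, b):
    return abs(a[0] - b[0]) + abs(a[1] - b[1])   # Manhattan distance

def greedy_step(grid, pos, goal):
    """Best free 4-neighbour that strictly reduces distance to goal, or None."""
    best = None
    for dx, dy in DIRS:
        nxt = (pos[0] + dx, pos[1] + dy)
        if free(grid, nxt) and dist(nxt, goal) < dist(pos, goal):
            if best is None or dist(nxt, goal) < dist(best, goal):
                best = nxt
    return best

def wall_follow(grid, pos, heading, leave_dist, goal, limit=500):
    """Keep the wall on the left; stop once strictly closer to the goal than
    at the hit point and a greedy step is open again. Returns cells walked."""
    path = []
    for _ in range(limit):
        for turn in (-1, 0, 1, 2):           # try left, straight, right, back
            h = (heading + turn) % 4
            nxt = (pos[0] + DIRS[h][0], pos[1] + DIRS[h][1])
            if free(grid, nxt):
                pos, heading = nxt, h
                path.append(pos)
                break
        else:
            return path                      # completely boxed in
        if dist(pos, goal) < leave_dist and greedy_step(grid, pos, goal):
            return path                      # closer than hit point: leave wall
    return path

def bug_path(grid, start, goal, limit=1000):
    """Head for the goal; on collision, wall-follow until closer, repeat."""
    pos, path = start, [start]
    while pos != goal and len(path) < limit:
        nxt = greedy_step(grid, pos, goal)
        if nxt:
            pos = nxt
            path.append(pos)
            continue
        # Blocked: face the goal, then turn right so the wall is on our left.
        dx, dy = goal[0] - pos[0], goal[1] - pos[1]
        heading = (0 if dx > 0 else 2) if abs(dx) >= abs(dy) else (1 if dy > 0 else 3)
        followed = wall_follow(grid, pos, (heading + 1) % 4, dist(pos, goal), goal)
        if not followed:
            return None                      # no way out
        path.extend(followed)
        pos = followed[-1]
    return path if pos == goal else None
```

Here `bug_path(GRID, (0, 3), (9, 3))` detours around the box and reaches the goal. As noted above, the result is not optimal, just usually reasonably good, which is why the untightened path still has jaggies worth removing.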

The rest of it is more or less routine, though a pain to break into separate programs. They have to communicate with JSON, over a weak IPC system with bad scheduling.

I always liked the 3D "metaverse" concept. Second Life, which has about 30,000 to 50,000 users online at any one time, comparable to GTA Online, is the biggest virtual world around. Everything bigger is sharded, but all SL users are in one world. The technology needs a refresh, but every competitor who's tried to build a big virtual world has been unable to get many users. So you're stuck with outdated tech if you want to get something used in a virtual world. I think the servers are even still running in 32-bit mode.


