I've been creating something akin to Minecraft creative mode in the browser since 2016: https://github.com/alanszlosek/voxeling. It all started when I found the VoxelJS project. The possibilities really hooked me so I forked it, learned a ton about WebGL, then started my own repo and rewrote nearly everything from scratch. I've been having a blast. There are endless opportunities for optimization (must maintain 60fps!), which is extremely rewarding to me.
Great timing on this post, as I'm looking for something to replace my aging Dell Inspiron 1420n from 2008. Thanks for asking! It sounds like you just need something to run an editor (maybe Vim?), PHP, and a browser... Like others have mentioned, if you need to run Docker it will require something quite modern.
What's your budget?
Can anyone chime in about the Asus Eee PCs, or the Dell Inspiron Mini series? Or anything else near the 10 to 11-inch form factor? Would they be hard to type on for someone under 6ft tall?
I'm personally looking for something that can run Linux (perhaps even something as slimmed down as Puppy Linux) and Vim (or maybe VSCode), and can run a browser (preferably Firefox, but Pale Moon or another might be fine). I just bought an Asus VivoBook X202E on eBay, but that's an 11.6" display, so I'll see how it goes.
What I'd really love is to find a blog post along the lines of "Best Linux-Compatible Netbooks Through the Ages". Has anyone come across anything like this? It would help me search eBay for successively older machines until I get to the price point I'm looking for.
I second the point that it's helpful to know the actual landscape, so the division within GE should definitely be listed. After all, if they end up contributing something positive it deserves visibility.
And your mention of Zipcar, Maven, etc. makes me think it'd be great to include the entities that own or fund each organization. It definitely means there's more information to hunt down, but crowd-sourcing can lessen that pain.
I wrote a self-hosted tool that fills this need for me. Been using it since 2008. Content is written in markdown, though it doesn't have a fancy editor. Supports tagging and searching too. Since you mentioned self-hosted, I thought you might be interested.
Seconded. He was on episode 199 of Software Engineering Radio two years ago (http://www.se-radio.net/2013/12/episode-199-michael-stonebra...), which I thoroughly enjoyed. It was so informative that I took a page of notes while listening! Really great stuff.
Two projects you may want to review for ideas are Redis and toybox.
Redis comes to mind because it started out as largely a single file of code that has since been split and organized into multiple files. The code is quite approachable; you'll likely understand how most of it works after a day of casual browsing.
http://redis.io
Toybox comes to mind because it's insanely modular and aggressive about code re-use. The logic can feel a bit dense at times, but the author is going for size and speed. I'm a big fan of Rob's efforts.
http://landley.net/code/toybox/
Thanks for this pointer. We also tend to be more organic in modularity. Start with a big chunk of code and split it when the time is right.
>> Toybox, "insanely modular"
I haven't read the toybox code (so maybe it's not that bad), but we've had some folks go to extremes in the modularity/code-reuse direction. Their code is NOT easy to read.
Reading through this book[1] was a really mind-expanding experience for me. It's an overview of the kernel and doesn't dive into any one part in depth, but it was extremely valuable for learning how to structure a large C project, since the same techniques and ideas are useful for many large projects.
I take the snapshots. I also have servers send backups to each other every night, and a nightly cron job rotates backups of the most critical databases onto an external drive on my home network. A Tonido Plug does that job (Ubuntu on a tiny ARM plug computer that costs virtually nothing to run).
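For anyone curious, the nightly server-to-server push and the pull down to the home box can be as simple as a couple of crontab entries. Everything here (hosts, paths, schedule) is made up for illustration:

```
# on each app server: push last night's dumps to a peer server
30 2 * * * rsync -az /var/backups/ backup@peer.example.com:/srv/peer-backups/

# on the plug computer at home: pull the most critical dumps onto the external drive
45 3 * * * rsync -az backup@web1.example.com:/var/backups/critical/ /mnt/external/backups/
```

Pull-from-home is nice because the home box needs no inbound ports open; only its SSH key lives on the servers.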
Now, some of the databases are simply too large or under too much load to take a live backup while the sites are running. Those I run on Amazon RDS with the MultiAZ feature enabled.
With MultiAZ there should be two copies of the database running at all times, each keeping a 3-day binlog for point-in-time recovery and taking a nightly snapshot to S3. I have to rely on Amazon for that part.
But I still take daily home backups of the most valuable individual tables off those servers, like user registrations and payment records. Even if I can't have off-site backups of the whole database, I'll have off-site copies of the part I'd need most in case of an Amazon-entirely-offline catastrophe.
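A sketch of that "dump only what you'd need most" nightly job, assuming MySQL. The database name, table names, paths, and retention count are all hypothetical; `mysqldump` accepts a database name followed by individual table names, which is what makes grabbing just the critical tables cheap:

```shell
#!/bin/sh
# Nightly: dump the critical tables, then rotate old dumps.
# BACKUP_DIR, KEEP, TABLES, and the database name "mydb" are placeholders.

BACKUP_DIR="${BACKUP_DIR:-/tmp/db-backups}"   # e.g. a mount on the home-network box
KEEP=7                                        # nights of history to retain
TABLES="users payments"                       # the tables worth an off-site copy
STAMP=$(date +%Y%m%d)

mkdir -p "$BACKUP_DIR"

# --single-transaction gives a consistent InnoDB dump without locking the tables
mysqldump --single-transaction mydb $TABLES \
    | gzip > "$BACKUP_DIR/critical-$STAMP.sql.gz"

# Rotate: list dumps newest-first, delete everything past the $KEEP newest
ls -1t "$BACKUP_DIR"/critical-*.sql.gz \
    | tail -n +$((KEEP + 1)) \
    | xargs -r rm -f
```

Even gzipped, a users-plus-payments dump is usually tiny compared to the full database, so pulling it home nightly stays practical as the site grows.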