Hacker News

Yeah, fair enough. I'd argue this is something of a double standard (a traditional RDBMS may well have copies of "deleted" data on dirty pages, and may well leave that data on the physical disk indefinitely, for much the same reasons as Kafka does - it just makes that data a bit fiddlier for you to access), but your legal team may decide that it's required.

I don't think your overall scorn is warranted - there are bigger problems endemic to RDBMS deployments, and the advantages of a stream-first architecture are very real - but there are genuine difficulties around handling data obliteration, and it's something you need to design for carefully if you're using an immutable-first architecture and have that requirement.
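For what it's worth, the usual starting point in Kafka is log compaction with tombstones: writing a null value for a key marks all earlier records for that key for removal the next time compaction runs. A minimal sketch of the idea (the function and data names here are hypothetical, and this simulates compaction in memory rather than anything Kafka-specific):

```python
# Sketch of Kafka-style log compaction with tombstones: a record whose
# value is None is a tombstone, and compaction drops every record for
# that key, including the tombstone itself (eventually).

def compact(log):
    """Keep only the latest record per key; drop keys whose latest
    value is a tombstone (None)."""
    latest = {}
    for key, value in log:  # replay in order; later records win
        latest[key] = value
    return {k: v for k, v in latest.items() if v is not None}

log = [
    ("user-1", {"email": "a@example.com"}),
    ("user-2", {"email": "b@example.com"}),
    ("user-1", None),  # tombstone: user-1 requested deletion
]

print(compact(log))  # → {'user-2': {'email': 'b@example.com'}}
```

The caveat, per the point above, is that the old records persist in uncompacted segments until compaction actually runs - much like an RDBMS leaving deleted rows on disk - so if you need guaranteed obliteration on a deadline you have to reason about compaction timing (or encrypt per-subject and destroy keys).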


