Homebrew is a new package management system for OS X based on git + ruby (github.com/mxcl)
99 points by blasdel on Sept 23, 2009 | 46 comments


Someone's finally gone and written the package manager I've always wanted -- one that can just run in your home directory unprivileged and reuse most of the system packages. Also source-based and plays along with upstream by default!

I'd really love to see someone make this portable to other unixes -- it'd be lovely to finally have good package management for the ~/mine/bin dirs I have on servers everywhere (though a major complication is Debian's obnoxious segregation of headers into separate packages)


"a major complication is Debian's obnoxious segregation of headers into separate packages"

What's so obnoxious about this? The -dev packages will depend on libfooSONAME (and have far more stable names) so it's hardly more effort to keep them installed.


I don't know of any way to have apt always pull them in as dependencies, so I constantly find myself installing them manually.

Debian's policy of splitting all features off into their own packages (or disabling them entirely for ideological reasons), and having duplicates of many packages (with mutually-exclusive dependents) constantly fucks with me. I can't ever depend on its dependency tracking to actually give me what I want. It's not a good sign when all the digg-bait walkthroughs always instruct you to paste in a huge list of packages.


Aw, your argument was good until you referred to digg-bait walkthroughs. :] Ideological and/or legal disabling aside, these seem to be implicit drawbacks of using a binary distribution. If you link some app against libfoo, you will almost certainly need libfoo at runtime.

Hm, no apt configuration needed, just install the -dev packages and you'll get the relevant .so too. This seems to be the correct case to optimise for, not the other way around.


Except that other packages always depend on foo, and never foo-dev as a matter of policy, even where it makes no goddamn sense (setuptools doesn't depend on python-dev, rubygems doesn't depend on ruby-dev, etc.).

I'd still have to either constantly be installing foo-dev packages despite already having foo, or succumb to the digg-bait technique of never letting apt think for itself.
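For illustration, the two apt idioms at issue — package names are placeholders, Debian/Ubuntu syntax assumed:

```shell
# Installing a -dev package pulls in the matching runtime library
# automatically, via its dependency on the libfooSONAME package:
sudo apt-get install libfoo-dev    # 'libfoo' is a hypothetical name

# When building something from source, build-dep fetches all of that
# package's declared build dependencies (headers included) in one step
# -- this requires deb-src lines in sources.list:
sudo apt-get build-dep ruby
```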


Please give up on "digg bait." It's an unhelpful, useless addition that only serves to insult people. What does insulting digg users get you? It's now gotten you two negative replies, and derailed the conversation from its original purpose.

Sure, I didn't have to reply, but I'd like to keep this community interesting and most importantly, constructive.


Is there a better term for the standard adsense-fueled "Hack your Ubuntu iPod with Compiz Fusion!" articles that always contain big <pre> blocks of "sudo apt-get install ..." for the reader to mindlessly copy and paste?


I'd really love to see someone make this portable to other unixes -- it'd be lovely to finally have good package management for the ~/mine/bin dirs I have on servers everywhere

Amen. That could be hugely powerful.


Hi, I wrote Homebrew, and this is one of my goals: I've always wanted installing from source on Linux to get package management too. I do a lot of cross-platform dev, and would also love for this to work on Windows, but that may be too lofty a goal.

I also added the feature where you can specify a git:// or svn:// protocol for a formula so you can easily keep up with HEAD development for your favourite project.
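For example — the exact invocation has varied over Homebrew's history, and `foo` is a placeholder formula name:

```shell
# Install the HEAD (repository tip) version of a formula whose
# definition declares a git:// or svn:// URL:
brew install foo --HEAD
```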


For home directory stuff, I've found that just git can be quite sufficient for tracking and syncing dotfiles and ~/bin.
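A minimal sketch of that approach — paths are illustrative, and a throwaway directory stands in for $HOME so the example is self-contained:

```shell
home=$(mktemp -d)
echo 'set number' > "$home/.vimrc"            # an existing dotfile

mkdir "$home/dotfiles"
cd "$home/dotfiles"
git init -q
mv "$home/.vimrc" vimrc                       # move the file into the repo
ln -s "$home/dotfiles/vimrc" "$home/.vimrc"   # symlink the live path back

git add vimrc
git -c user.name=me -c user.email=me@example.com commit -qm 'track vimrc'
# on other machines: git clone <remote> ~/dotfiles and recreate the symlinks
```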


Naively syncing ~/bin only works when all your systems run the same kernel, on the same architecture, using intercompatible versions of system libraries and programming language runtimes. It also doesn't solve the package management problem at all!

With homebrew, you could sync just the dotfiles and the list of installed packages, and let it take care of installing/upgrading packages on each machine. It'd be even better if it was smart enough to not shadow system packages when they are present+adequate.
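That could be sketched with brew's own commands (hypothetical for the 2009 version — file path illustrative):

```shell
# Record the installed formulae alongside the dotfiles...
brew list > ~/dotfiles/packages.txt

# ...then replay the list on another machine:
xargs brew install < ~/dotfiles/packages.txt
```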


> That only works when all your systems run the same kernel, on the same architecture, using intercompatible versions of system libraries

If dotfiles aren't compatible across OS kernels, you're doing something wrong. I keep scripts in ~/bin under VC, but with anything that needs to be compiled I just track the source and build with make (or emacs, etc.). You don't need to put everything in ~/bin in the repository!

From the summary, I don't really see how the system would work transparently across OS X, Linux, and BSD. It looks like it expects git + Ruby + OS X, and I don't use the latter two. (I prefer Lua and OpenBSD, respectively, but that's me.) Also, I work with i386, amd64, and occasionally sparc, and the platform thing isn't an issue since I'm not syncing binaries.


Yes, right now homebrew is specific to Mac OS 10.5+ -- the second paragraph of my original comment is pining for portability, so it could be used on shell accounts and beyond.

A major use case is the ability for your app to pull in its own sandboxed dependencies in a cross-platform way -- so anyone can develop on their MacBook independently of how their system is set up, and deploy on any worthwhile server regardless of its setup.

gem does a pretty damn good job of this for Ruby libraries, but it doesn't help at all for the native libraries and utilities that the gems are wrappers around :)


Well, there's the Unix distro package system and another repository for Ruby gems and CPAN and LuaRocks and Hackage and ... They keep stepping on each others' toes, but I'm not sure the solution is to add even more mutually incompatible packaging systems. By their nature, they need to see the system (inc. what dependencies are available) as a whole to be really useful, and fragmenting that picture makes the situation worse for all involved. (On second reading, it looks like it's designed to try to work with CPAN et al, so that's an improvement.)

It would be nice if it actually worked, but coming up with a perfectly portable global packaging system would be hard enough if it was working across platforms for one vendor, let alone the big mob that is the open source world.


Any idea why the Mac crowd seems to shun binary packages? Before playing with MacPorts I wasn't even aware of the concept of a source-only package manager. Each time I spend 20 minutes recompiling the entire xorg dependency chain because the minor version number got bumped from .1 to .2, I cringe at the thought of how many CPU cycles are being wasted all over the world to produce essentially the same thing.


Simply because there isn't an up-to-date package management system that provides binary packages. Typing "brew install <whatever>" is a million times easier than searching out the software package's website and hunting around for a binary package that's for your OS version.


...and that works together with all the dependencies you've also installed, and the set of configure --options they were built with.

It's also a strong guarantee that if the package is built any different than the upstream default, you know exactly what changes are being made, since they happen on your machine.


Personally there were two reasons:

1) I don't want bandwidth costs

2) It's much harder to get contributions for a binary system (I expect). If you're installing foo and it doesn't work for some reason you probably won't want to install Xcode and Homebrew-dev in order to try and fix it. When you're installing from source you're already a `brew edit foo && git push` away from helping fix it.

It's also worth noting that most packages don't have dependencies in Homebrew. This is because we don't duplicate what is already there and OS X comes with quite a lot of the common low level libraries. So compiling foo takes significantly less time than it does with Macports.
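The contribution loop described above, sketched — formula name and commit message are hypothetical:

```shell
brew edit foo          # opens the formula -- a short Ruby file -- in $EDITOR
brew install foo       # rebuild from the edited formula to verify the fix

cd "$(brew --prefix)"  # the Homebrew checkout is itself a git repo
git commit -am 'foo: fix build' && git push   # push your fix to your fork
```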


For the last couple of years there have been different CPUs for Macs. First we had the G4, then the G5, followed by the switch to Intel. And even for the Intels there's lots of different subtypes.

And of course now with the switch to Snow Leopard we've got the 64-bit issue as well.

Yes, it does take a while, but these days it doesn't take that long anymore.


Yeah, but as of right now you could basically call it PPC/Intel, and most users are probably running Intel. Binaries (universal or not) are good enough for every single application distributed for the Mac platform. Why would command-line tools or libraries need to be any different?


The majority of these packages link against non-system libraries, and the ones that don't are there to be linked against by other packages.

OS X is unique among modern unixes in that absolutely nobody distributes app bundles that have external dependencies on non-system libraries (except for a few that are too freetarded to bundle ffmpeg).


> Any idea why the Mac crowd seems to shun binary packages?

4 different architectures (PPC/32, PPC/64, Intel/32 and Intel/64) are probably hell on a binary-only package manager. Source-only is simpler and lighter on the architecture side, though it's much heavier on the client one.

Fink does binary packages though, I think.


the openbsd project builds 1700 to 5600 binary packages for each of 12 architectures on a weekly/monthly basis and continuously mirrors them throughout dozens of ftp sites. building a few dozen or even a few hundred packages for 4 architectures isn't that hard, it's just a matter of resources. someone has to have 4 of those machines sitting around doing nothing but building packages, uploading them somewhere, and then take the time to fix ports and dependencies that don't build on certain architectures.

fwiw, i use fink on mac os and it supports binary packages. many of the ports in its stable branch are available as binary packages.


From reading the article it doesn't seem like it does a whole lot more than MacPorts... most of what is discussed is possible by tweaking port settings (prefix, partial/custom builds, having multiple versions of things, etc).

Maybe the CLI app exposes some of this a little more easily? I am not sure but I am curious why they built an entirely new setup rather than just improving/fixing up MacPorts...

Nonetheless a promising start!


The philosophy of Homebrew is radically different from MacPorts or Fink. Homebrew tries to work as far as possible with the pre-installed big items (Perl, Ruby, Python). It drags in as few dependencies as possible, in general, from what I can tell.

Both Fink and MacPorts, on the other hand, install virtually an entire second world of software - parallel to the built-in Apple stuff - in their respective sub-directories (/sw for Fink and /opt/local for MacPorts).

If you install Getmail with MacPorts, for example, you instantly drag in a full Python installation as a dependency (even though OSX already has Python installed). Similarly, if you install Weechat with Perl support, MacPorts installs its own version of Perl to build Weechat against. There are pros and cons, I think, to the Homebrew and MacPorts/Fink methods, but they are very different.


There was an attempt to get gentoo's portage working on OS X with the same philosophy, but they gave up and said that it had been a mistake to go in that direction and not just install parallel copies of everything.

I don't know the specific problems they had.


Historically OS X shipped with ancient versions of Python that were not always complete. Same with Perl and others. It's only since the last two or three releases of OS X that they included halfway decent versions of them.

I agree that it may seem somewhat stupid now, but there was (and perhaps still is) a very good reason for installing them separately.


> It's only since the last two or three releases of OS X that they included halfway decent versions of them.

And even then, sadly the pretty decent versions don't get updated (OSX 10.6 comes with Python 2.6.1… 2.6.2 was released in April…), so you're essentially SOL between two versions of OSX.


Well, it looks like the entire footprint (Ruby LOC) on github is smaller than port.tcl in MacPorts, so I imagine it seemed more approachable.


Interesting.

I tend to just build stuff myself from source and use GNU stow. I'm amazed I hear so little about stow, it's so useful.

I just configure with a prefix arg of something like /usr/local/stow/package-version, then use stow to install in /usr/local/{bin,etc,lib} etc. etc. Works equally well in one's home directory, which is what I used to do at work.
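That workflow, spelled out — package name and version are illustrative:

```shell
# Build into a per-package tree under the stow directory:
./configure --prefix=/usr/local/stow/hello-2.10
make
make install

# Then let stow symlink it into /usr/local/{bin,lib,share,...}:
cd /usr/local/stow
stow hello-2.10

# Uninstalling cleanly is just removing the symlinks:
stow -D hello-2.10
```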


Heh, it's handy I didn't know about this, or I might never have started Homebrew.

Still at this point our ambitions are quite vast, so I expect Homebrew will prove more useful in the long run.


I'm sure it will. I'm generally only installing fairly small packages, with bigger stuff the stow method would probably get tiresome.


  > Homebrew helps get you chicks
  > There's no conclusive scientific evidence as yet,
  > but I firmly believe it's just a matter of time and statistics.

Geek humour is awesome!


How difficult is it to port something from Unix to BSD?



How did your answer get up-voted 7 times? This confuses me; you put me down AND didn't answer my question.


You asked the equivalent of 'how is babby formed?'


It's not that hard, provided the people who wrote it are aware that other Unices exist besides Linux.

(I've done a bit of porting to OpenBSD. It's a pet peeve of mine when people hardcode references to bash for no good reason.)


It doesn't help matters at all that bash leaks bashism-support when run as /bin/sh -- it's much better that people specify bash than ship scripts that falsely claim to be bourne-compatible (since they are unlikely to test against a pure-bourne shell).

Not having a /bin/bash is a losing battle -- it's way too late to get people to use the theoretically more correct #!/usr/bin/env bash just because the greybeard unices don't ship with it (despite shipping with plenty of other GPL software) and their package managers put it (when they are nice) in /usr/local/bin

Edit: That's not a diss on the greybeard unices -- I used FreeBSD and loved it dearly for years. I actually really like that they segregate all user-installed packages into a separate root at /usr/local/ -- but for practical reasons I always make symlinks for binaries with well known locations (to use the robots.txt / favicon.ico pejorative).
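A small demonstration of the leak: this script uses bash-only constructs, so a #!/bin/sh shebang would be a false claim on systems where sh is a stricter POSIX shell (script contents are illustrative):

```shell
demo=$(mktemp)
cat > "$demo" <<'EOF'
#!/usr/bin/env bash
# [[ ]] and arrays are bashisms; dash or a pure Bourne shell rejects them,
# even though this would "work" as /bin/sh on distros where sh is bash
words=(portable scripts matter)
if [[ ${#words[@]} -eq 3 ]]; then
  echo "running under bash"
fi
EOF
chmod +x "$demo"
"$demo"    # prints: running under bash
```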


It's about as easy as translating from American into English.


Did I seriously get down-voted for asking a legitimate technical question? That's an abusive down-vote in my opinion, people should be able to ask technical questions without being ridiculed because they don't know the specific inner-workings of different OS kernels.


Um, BSD is a unix, right? Did you mean to ask about porting from Linux to BSD?


Ah ok, fair enough. How hard is porting from Linux to BSD?


In my experience not hard as long as you wrote portable code.

The hardest bit is compiling a universal binary.


It looks cool, but I wish the early adopters the best. It will be a little while before I let this near my system.


When I buy a new MBP I'm going to use this, it's pretty much perfect for my use case and I'm not much of an early adopter as far as developers go. I've tried MacPorts and Fink, but they always inevitably had some issue that made me switch back to manually compiling things to /usr/local.

Homebrew seems like a very nice thin layer over manual builds. Some people maybe can't live without dependency resolution, but the lack of it is one of the main reasons I'm confident that Homebrew is simple enough to help me with a lot of the mundane and annoying issues (i.e. getting compile flags right for OS X, etc.) without hamstringing me when I need to go outside the system.



