Like last year, here’s a list of Debian-related projects that I’d like to tackle this year. I might not do them all, but I like to write them down: I am more productive when I have concrete objectives.
- Translate my Debian book into English.
I will run a fundraising campaign on Ulule.com, and if enough people are interested, I will spend a few months with Roland Mas translating the book. Hopefully this project can be completed by the summer.
- Finish multiarch support in dpkg.
I’m working on this with Guillem Jover right now, thanks to Linaro who is sponsoring my work.
- Make deb files use XZ compression by default.
It’s a simple change in dpkg-deb and it would literally save gigabytes of space on mirrors. It’s particularly important so that a single CD/DVD can hold a more complete set of software. However, #556407 (on DAK) needs to be fixed first, and a project-wide discussion is in order: some architectures might want a different default.
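For context, dpkg-deb already lets a maintainer pick the compressor per package with the `-Z` option (e.g. `dpkg-deb -Zxz --build pkgdir`), so the change is really about the default. A rough illustration of the gzip/xz size gap, using synthetic redundant text as a stand-in for a package payload (real payloads will compress differently):

```shell
# Generate ~1.4 MB of redundant text standing in for a package's data.tar.
seq 1 200000 > sample.txt

gzip -9 -c sample.txt > sample.txt.gz   # gzip, what dpkg-deb produces today
xz -6 -c sample.txt > sample.txt.xz     # xz at its default preset

# Compare the resulting sizes.
ls -l sample.txt sample.txt.gz sample.txt.xz
```

On highly redundant data like this, xz comes out well ahead of gzip; the interesting question for the archive is how the gap looks across real packages.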
- Be more reactive to review/merge dpkg patches.
While we welcome help, we have a backlog of patches sitting in the BTS, and several times we have failed to review/merge submitted branches in a reasonable time. That is very frustrating for the contributors. I already tried to improve the situation by creating the Review/Merge Queue, but nobody stepped up to help review and update the patches.
As I am getting more familiar with the C part of dpkg, I hope to be able to jump in when Guillem doesn’t have the time or the interest.
- Implement the rolling distribution proposed as part of the CUT project and try to improve the release process.
I really want the new rolling distribution but it will inevitably have an impact on the release process. It can’t be done as a standalone project. I would also like to see progress in the way we deal with transitions (see discussion here).
- Work more regularly on the developers-reference.
Hopefully I will be able to combine this with my blog writing activities, i.e. write blog articles on the topics that the developers-reference should cover, and then adapt those articles with some DocBook markup.
To the above list, I shall add a few supplementary goals related to funding my Debian work:
- Write a 10-lesson course called “Smart Package Management”.
It will be delivered by email to my newsletter subscribers.
- Create an information product (most likely an ebook or an online training) and sell it on my blog.
The precise topic has not yet been defined although I have a few ideas. Is there something that you would like me to teach you? Leave your suggestions in a comment.
- By the end of the year, have at least 1/3 of my time funded by donations and/or earnings of my information products.
More concretely it means 700 € each month or a 9-fold increase compared to the current income level (around 80 €/month mostly via Flattr).
That makes for a lot of challenges this year. You can help me reach those goals: join my Debian Supporters Guild and you’ll be informed every time I start something new or when you can help in specific ways.
Hoping for the best; may you reach your goals and make Debian GNU/Linux easier than Windows 🙂
> Make deb files use XZ compression by default.
Please don’t! Uncompressing XZ archives takes much more time, switching to XZ would be really painful for users of older computers.
Raphaël Hertzog says
Obviously we’d have to gather statistics about such issues before taking the decision. Do you have numbers? Is there an optimization level that provides better speed while still compressing much better than gzip?
But I also don’t think we should block the decision because it will be slower for some users. I’m more concerned by the memory consumption than by the slowness factor. Because lack of memory might mean you can’t unpack a package at all.
Here are some numbers:
$ du -sh data.tar.*
data.tar.gz is taken from linux-image-2.6.32-5-686_2.6.32-30_i386.deb (I believe it’s a good package for this test — not too small, not too big, and quite popular ;)); the *.xz was created with default options.
$ time tar xJf data.tar.xz
tar xJf data.tar.xz 10,00s user 2,63s system 20% cpu 1:00,31 total
$ time tar xzf data.tar.gz
tar xzf data.tar.gz 3,53s user 2,41s system 10% cpu 55,400 total
> I’m more concerned by the memory consumption
Since I don’t know of a good method of measuring per-process memory consumption other than htop and top (and it’s rather hard to copy & paste their output), here are some of the highest readings from free -m taken while uncompressing the tar.xz:
total used free shared buffers cached
Mem: 246 242 4 0 50 131
-/+ buffers/cache: 60 185
Swap: 486 0 486
total used free shared buffers cached
Mem: 246 242 3 0 44 144
-/+ buffers/cache: 53 192
Swap: 486 0 486
(the most memory-hungry process running at the time was Xorg)
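Incidentally, eyeballing `free` is imprecise: xz itself can report how much memory decompressing a given .xz file will need, via `xz --list -vv`. A small sketch on a synthetic file (the figure for a real data.tar.xz will differ, since it depends on the dictionary size used at compression time):

```shell
# Create a small .xz file at the default preset (-6, 8 MiB dictionary).
seq 1 200000 | xz -6 > sample.xz

# --list -vv prints per-file details, including a "Memory needed" line;
# force the C locale so the label is not translated.
LC_ALL=C xz --list -vv sample.xz | grep 'Memory needed'
```

This is the number that matters for the low-memory concern: it is determined by the compressor’s settings, so the archive could cap it by policy.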
> Is there an optimization level that provides better speed while
> still compressing much better than gzip?
AFAIK (I don’t want to pretend I’m some compression expert) there is no magical --make-it-friendly-for-old-machines switch; XZ’s algorithm is simply designed with modern multi-core CPUs in mind. You can change the numbers above with the -[0-9] flag, but decompression is still much slower than with gzip.
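For what it’s worth, the preset mainly trades compression time and dictionary size (and thus decompression memory) against ratio. A quick loop over a few levels on synthetic data shows the effect; exact sizes for real payloads will of course differ:

```shell
# Synthetic payload, ~1.4 MB of redundant text.
seq 1 200000 > sample.txt

# Compare a few presets; higher levels use bigger dictionaries and more CPU.
for level in 1 6 9; do
    xz -$level -c sample.txt > sample-$level.xz
    echo "preset -$level: $(wc -c < sample-$level.xz) bytes"
done
```

Lower presets also shrink the decompressor’s memory needs (the dictionary at -1 is 1 MiB versus 64 MiB at -9), which is the knob that matters for old machines.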
> But I also don’t think we should block the decision because it
> will be slower for some users.
Sure, I just want to make sure you know there are also some minuses. 🙂
> More concretely it means 700 € each month or a 9-fold increase compared to the current income level (around 80 €/month mostly via Flattr).
Thank God, I was starting to worry there was no hint to Flattr this time, but: TA-DA! There it is. Nice goal for the new year, man!
xz is a *much* better choice than bzip2, and I think a compression factor
of -1 (about the same ratio as bzip2 with MUCH faster decompression) or
even -2 (much better compression and not much slower than bzip2;
the decompression/compression speed ratio in particular is better with xz)
should be viable for all architectures. (Even m68k.) Up to -6e on “modern
beasts” sounds reasonable, but maybe not as default (think of LibreOffice).
Also, talk to Lasse (larhzu on Freenode), he’s really approachable.
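A rough way to sanity-check the bzip2 comparison above is to run both tools over the same file; this sketch uses synthetic data, so the ratios are only indicative of the shape of the trade-off, not of real packages:

```shell
# Synthetic payload, ~1.4 MB of redundant text.
seq 1 200000 > sample.txt

bzip2 -9 -c sample.txt > sample.bz2     # bzip2 at its maximum level
xz -1 -c sample.txt > sample-1.xz       # xz at its cheapest normal preset

# Compare sizes, then time the decompression of each.
ls -l sample.bz2 sample-1.xz
time bzip2 -dc sample.bz2 > /dev/null
time xz -dc sample-1.xz > /dev/null
```

On real package payloads the interesting comparison is exactly this pair of numbers per architecture: size on the mirror versus decompression cost at install time.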