Announcing walctl for PostgreSQL Transaction Log Management

March 27th, 2014 | Published in Database, Tech Talk | 2 Comments


I’ve managed to convince my employer to open source one of the tools I recently wrote. That tool goes by the name of walctl, and I believe the importance of this kind of tool cannot be overstated.

The PostgreSQL Write Ahead Log (WAL) files are key to crash recovery, point-in-time recovery, and all standby use not derived from streaming replication. WAL files are critical to proper database operation, yet their archival is treated as an afterthought. Some people use regular cp, while others go as far as to use rsync to send data to a replica server.
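
For illustration, this is roughly what that ad-hoc approach looks like in postgresql.conf. The paths and hostnames here are hypothetical, and archiving like this offers no durability guarantees beyond the command’s exit status:

    # postgresql.conf -- common ad-hoc WAL archiving (hypothetical paths)
    wal_level = archive
    archive_mode = on

    # The "cp" approach: copy each segment to a local directory
    archive_command = 'cp %p /mnt/wal_archive/%f'

    # The "rsync" approach: push each segment directly to one replica
    #archive_command = 'rsync -a %p replica:/var/lib/postgresql/archive/%f'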

This isn’t enough. A much safer architecture involves three servers: one produces WAL files, one acts as a storage and distribution location, and one (or more) consumes the WAL files as necessary. When streaming replication gets disconnected and the replica has fallen too far behind the WAL files the master still has available, such an archive is a good mechanism for catching up.
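
Here’s a minimal sketch of that three-server flow, assuming a central archive host named wal-store. These are generic commands for illustration only, not walctl’s actual implementation:

    # Master, postgresql.conf: push WAL to the archive server
    archive_command = 'rsync -a %p wal-store:/db/wal_archive/%f'

    # Each replica, recovery.conf: pull WAL from the archive server,
    # so the master never needs to know how many replicas exist
    restore_command = 'rsync -a wal-store:/db/wal_archive/%f %p'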

Well, walctl does all of that. It forces you to do all of that. Just set up a couple of SSH keys prior to installation, and it needs nothing else. I added a database clone tool as a convenience, which needs superuser access to the master database, but otherwise, walctl is very unobtrusive. This toolkit is simple, and updated versions will see only a few limited advancements. It’s meant to be small, fast, and light, following the UNIX philosophy of doing one thing well. In fact, I wrote this tool after examining several other PostgreSQL WAL and replication management systems. Almost all of them require transmitting WAL files directly from the master server to one or more slave systems. But this presumes you only have one replica, or it forces the master server to do much more work by contacting several systems in a row. Why not let the slaves do all the hard work?

I highly recommend all DBAs use a WAL management tool of some kind, no matter which one. Barman and repmgr are great alternatives that do much more than walctl. But if all you want is to stash your WAL files in a safe location that multiple replica servers can utilize, this is the easier path.



Review: PostgreSQL Server Programming

December 12th, 2013 | Published in Book, Review | 1 Comment


There comes a time in every DBA’s life when he needs to add functionality to his database software. For most DBAs, and indeed for most databases, this amounts to writing a few stored procedures or triggers. In extremely advanced cases, the database may provide an API for direct C-language calls. PostgreSQL, however, has gone above and beyond this for several years, and has continuously made the process easier with each iteration.

So once again, I’m glad to review a book by three authors in the industry who either work directly on PostgreSQL internals or use it extensively enough to contribute vastly important functionality. Hannu Krosing, Jim Mlodgenski, and Kirk Roybal collaborated to produce PostgreSQL Server Programming, a necessary and refreshing addition to the PostgreSQL compendium. I don’t know who contributed each individual chapter, but I can make a fairly educated guess that anything PL/Proxy related came from Mr. Krosing, its original designer.

As usual for a book of this type, things start off with relative simplicity. The authors make a very important note I try to convey to staff developers regularly: let the database do its job. The database is there to juggle data, handle set theory, and otherwise reduce traffic to and from the application to a minimum. This saves both network bandwidth and processing time on the front end, which can be combined with caching to service vastly larger infrastructures than otherwise possible.
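
As a trivial example of the principle, compare shipping raw rows to the application against letting the database aggregate them; the table and column names here are hypothetical:

    -- Wasteful: fetch every row and sum in application code
    -- SELECT amount FROM orders WHERE customer_id = 42;

    -- Better: let the database do its job and return a single row
    SELECT count(*) AS order_count, sum(amount) AS total
      FROM orders
     WHERE customer_id = 42;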

Beyond this are the basics: using stored procedures, taking advantage of triggers, and writing functions that can return sets. The gamut of examples runs from data auditing and logging to integrity control and a certain extent of business logic. One or more of the authors suggests that functions are the proper interface to the database: they reduce overhead and provide an abstract API that can change without directly altering the application code. A function is, after all, the common denominator in any library or tool dealing with the data. While I personally don’t agree with this type of approach, the reasoning is sound, and it can help simplify and prevent many future headaches.
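
As a minimal sketch of the auditing pattern, here is a toy example of my own (hypothetical tables, not the book’s code):

    -- A table to protect, and a log to capture every change to it
    CREATE TABLE accounts (id int PRIMARY KEY, balance numeric);
    CREATE TABLE accounts_log (
        changed_at timestamptz NOT NULL DEFAULT now(),
        operation  text NOT NULL,
        old_row    text,
        new_row    text
    );

    -- Trigger function: record the operation and both row images
    CREATE FUNCTION audit_accounts() RETURNS trigger AS $$
    BEGIN
        INSERT INTO accounts_log (operation, old_row, new_row)
        VALUES (TG_OP, OLD::text, NEW::text);
        RETURN NEW;
    END;
    $$ LANGUAGE plpgsql;

    CREATE TRIGGER accounts_audit
    AFTER UPDATE ON accounts
    FOR EACH ROW EXECUTE PROCEDURE audit_accounts();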

But then? Then comes the real nitty-gritty. PostgreSQL allows interfacing with the database through several languages, including Python, Perl, and even Tcl/Tk, and the authors want you to know it. Databases like Oracle have long allowed C-level calls to the database, often joined by Java in later incarnations. PostgreSQL, though, is the only RDBMS that acts almost like its own middle layer. It’s a junction that allows JSON (JavaScript Object Notation) accessed via Python to be filtered by a Tcl trigger, on a table that matched data through an index produced by a Perl function. The authors provide Python and C examples for much of this scenario, including the JSON!
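
For a taste of what that looks like, here’s a rough PL/Python sketch of my own (not the authors’ code) that pulls a key out of a JSON document:

    -- Requires the PL/Python language: CREATE EXTENSION plpythonu;
    CREATE FUNCTION json_get(doc text, key text) RETURNS text AS $$
        import json
        parsed = json.loads(doc)   # parse the JSON document
        return parsed.get(key)     # missing keys come back as NULL
    $$ LANGUAGE plpythonu;

    SELECT json_get('{"user": "hannu", "msg": "hello"}', 'user');  -- hannu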

And that’s where this book really shines: examples. There’s Python, C, PL/pgSQL, triggers, procedures, compiled code, variants of several difficult techniques, and more. In the C case, things start with a simple “Hello World” of the sort you might see in a beginning programming class, and the author steps through increasingly complex examples. Eventually, the C code is returning entire sets of data per call, as if simulating several table rows.
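
The C listings are too long to reproduce here, but the destination of that progression, a function that emits multiple rows per call, looks like this in PL/pgSQL (a sketch of the pattern, not the book’s code):

    -- Each RETURN NEXT adds one row to the function's result set
    CREATE FUNCTION fibonacci(n int) RETURNS SETOF int AS $$
    DECLARE
        a int := 0;
        b int := 1;
    BEGIN
        FOR i IN 1 .. n LOOP
            RETURN NEXT a;
            SELECT b, a + b INTO a, b;
        END LOOP;
    END;
    $$ LANGUAGE plpgsql;

    SELECT * FROM fibonacci(8);  -- 0, 1, 1, 2, 3, 5, 8, 13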

On a more concrete note, the authors provide copious links to external documentation and wiki pages for those who want to explore this territory in more depth. Beyond that, they want readers to know about the major sources of contributed code and extensions, all of which make the database more useful, and which may entice the reader to join in the fun. Everything from installing extensions to the details necessary for writing them is covered, so that is well within the realm of possibility!
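
The installation side really is nearly effortless these days. With a contributed module like hstore, for example:

    -- Install a contrib extension into the current database...
    CREATE EXTENSION hstore;

    -- ...and use it immediately
    SELECT 'a=>1, b=>2'::hstore -> 'b';  -- returns 2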

I already mentioned that at least one of the authors encourages functional database access instead of direct SQL. Well, there’s more than the obvious reason for this: PL/Proxy is a procedural language that uses functions to facilitate database sharding for horizontal scalability. Originally designed for Skype, PL/Proxy has been used by many other projects. While it might not apply to everyone, sharding is a very real technique with non-trivial implementation details that have stymied the efforts of many development teams.
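
To give a flavor of it, a PL/Proxy function body contains routing rules rather than logic; this hypothetical example runs the identically named function on whichever shard the username hashes to:

    CREATE FUNCTION get_user_messages(i_username text)
    RETURNS SETOF text AS $$
        CLUSTER 'chat_cluster';      -- hypothetical cluster name
        RUN ON hashtext(i_username); -- pick the shard by hash
    $$ LANGUAGE plproxy;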

I actually would have liked a more extensive chapter or two on PL/Proxy. While several examples of functional access are outlined for a chat server, none of these functions are later modified in a way that would obviously leverage PL/Proxy. Further, Krosing doesn’t address how sequences should be distributed so that data on the various servers gets non-conflicting surrogate keys. It would have been nice to see an end-to-end implementation.
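
One common approach to that problem (my suggestion, not something the book covers) is to interleave the sequences: every shard uses the same increment with a different starting offset, so generated keys never collide:

    -- With four shards, each produces a disjoint stream of IDs:
    -- shard 0 yields 1, 5, 9, ...  shard 1 yields 2, 6, 10, ...
    CREATE SEQUENCE user_id_seq INCREMENT BY 4 START WITH 1;  -- on shard 0
    -- CREATE SEQUENCE user_id_seq INCREMENT BY 4 START WITH 2; -- on shard 1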

All in all, anyone serious about their PostgreSQL database should take a look. Getting a server up and running is only half the story; making the database an integral component of an application instead of a mere junk drawer provides more functionality with fewer resources. It’s good to see a book that not only emphasizes this, but conveys the knowledge in order to accomplish such a feat. Hannu, Jim, and Kirk have all done the community a great service. I hope to see revisions of this in the future as PostgreSQL matures.



Lamentations of a Budding Chilihead

August 13th, 2013 | Published in News | 4 Comments


Recently, I’ve come to the conclusion that my taste buds are changing somewhat drastically. How so? As it turns out, where once I couldn’t even tolerate medium heat, and Taco Bell Medium sauce packets were the equivalent of agony, now all commonly available hot sauces impart only a mild zip.

Tabasco? Watery vinegar. Frank’s Red Hot? Tomato sauce. Cholula? Ketchup. Sriracha? Garlic ketchup. Insanity, to put it bluntly, and it was becoming a problem. What do I put on my tacos, pizza, and salad when anything I can buy at a supermarket is the equivalent of tepid bathwater?

It just so happens that Pepper Palace is just down Michigan Ave. from the hotel where I regularly stay when I’m in Chicago for work. So I went there for a visit, because they had to have something I could enjoy with a little heat.

Boy did they! I bought a few infamous samples, along with a couple of Pepper Palace’s own branded sauces.

But this isn’t a review. I loved them all, though the Blair’s was rather painful at first. I purchased the Mango Habanero because it has a great flavor. The Kutbil-Ik is the hottest blend El Yucateco makes, and it’s one of my favorites. The Jolokia sauce was smoky and probably my favorite of them all, so much so that I’ve already gone through almost half the bottle. The Iguana goes well in salads and has a standard medium heat, and Blair’s I’ve reserved strictly for burritos, or for when my tolerance builds a bit more.

Now I’m running into a similar problem as before. The more of these sauces I use because I like their flavors, the less hot they are. A couple weeks ago, I could barely tolerate Iguana Radioactive, and now I can only describe it as a strong medium heat. How is this possible? I have a theory.

Every Saturday, Urbana holds a farmers’ market that draws dozens of stalls. One vendor in particular had a pretty sad assortment of random peppers in a small cardboard box, but unlike the others, he had things hotter than jalapeños. I saw some cayenne, which can reach 50,000 Scoville units, so I had to grab a couple. What I didn’t expect was that he also had a couple of peppers I couldn’t recognize. So what else could I do? I bought them.

One of these mystery peppers was hotter than the cayenne, but not remarkably so. I still don’t know what it was, but I ate the whole thing and suffered for a half hour with no other ill effects. The other was a jagged, studded, curled-up, mean-looking bastard. I don’t think it was a genuine Bhut Jolokia, but it was probably a variant of Naga. The vendor only had one of them, and he clearly didn’t know what it was, or he wouldn’t have sold it to me for a quarter.

When it came time to test the second mystery pepper, I nibbled a bit off the end, chewed for a moment, and shrugged. But the heat built. Then it kept building. Then my lips started to burn. About fifteen minutes later, my stomach started to cramp. Just the tiny end of this thing was as hot as the cayenne I’d tested, and it imparted a more drastic overall effect. It was also extremely pungent. I cut it open to investigate, and the entire inside was slick with capsaicin oil. It didn’t have the smokiness often associated with Jolokias, which is why I think it’s only in the Naga family. Still, that’s more than enough!

I was shocked, and left with a single question: How on Earth did a random stand at a farmer’s market get ahold of a fresh pepper related to one of the hottest strains in the world? Was it just growing on one of his plants and he harvested it? Did one of his friends put it in the box as a prank? I have no clue.

But after abusing myself with it, everything I’ve tasted since registers lower on the scale. Jalapeños are like bell peppers to me, except they impart a mild warmth. My coveted Jolokia sauce? Smoky, medium-spicy ketchup.

From what I’ve heard, there is a limit to this tolerance effect. Eventually I’ll stop acclimating and be able to enjoy a hot sauce without having to step up the heat. Still, I am very much unaccustomed to eating spicy food and finding it mild or bland. I’m not sure I qualify as a chilihead just yet, but I may be by the time I’m done.

Until Tomorrow



I Love Nintendo, and That’s Why It Needs to Die

June 8th, 2013 | Published in Rant | No Comments


I’ve been a fan of Nintendo and its content since I first played Super Mario Brothers in a 7-11 back in the ’80s. I slaved over my Nintendo Entertainment System in 1988 to master Super Mario to such a degree that I could play through the entire game without warps, all on one life. I was awed by The Legend of Zelda, subscribed to Nintendo Power for the free copy of Dragon Warrior, and made Contra my bitch after months of practicing with the aid of the infamous Konami Code. Then I watched a 90-minute Super Mario 3 commercial disguised as a summer movie, and reached a new level of devotion. When I was sick, the only thing I did aside from barfing my guts out and sleeping was abuse my NES.

When the Super Nintendo came out, it was like a new and golden dawn. Super Mario World was amazing. Zelda reinvented itself to much acclaim with A Link to the Past, which I still consider one of their unparalleled classics. I played and replayed Final Fantasy II/IV and Final Fantasy III/VI like they allowed my very survival. Then came Chrono Trigger, which many consider one of the best RPGs ever released, and I’ve forgotten how many times I’ve bought and played its various incarnations.

Nintendo was a blockbuster, an unstoppable colossus that put enough pressure on Sega (with some help from Sony, more on that later) that it eventually dropped out of the console business entirely. And that’s when things started to go wrong. With Sega flailing haphazardly through its weird and half-baked console incarnations, Nintendo got cocky. After breaking its contract with Sony in the great schism that eventually birthed the Sony Playstation, Nintendo stuck with cartridges for the Nintendo 64 when discs were clearly the future of gaming.

Among the strengths of cartridges are quick boot times and the ability to include extra chips for customized processing if a game requires it. However, memory comes at a premium, and Final Fantasy VII and its ilk would never have been possible on anything but CDs at the time. Once the home of the most critically acclaimed RPGs, Nintendo watched almost all of their makers flee to the new Playstation, including Squaresoft and Enix. This clearly hurt Nintendo, but it had plenty of momentum from two successful consoles. Yet the console wars continued, and the Nintendo 64 suffered a 33% decline in sales compared to the Super Nintendo, amounting to almost a third of total Playstation sales.

This slide continued with the GameCube, which again bucked the trend of standard discs in favor of proprietary mini-discs. Sales saw yet another 33% drop, while the Playstation 2 went up by 50%. If not for the runaway success of Nintendo’s handhelds, the company may have ended up in a similar situation as Sega. Maybe it was that doomsday scenario that forced Nintendo to reinvent itself and go after a new market: casual players.

Unlike devoted fans, casual players exist in great, untapped numbers. With the Wii, Nintendo pursued a kitschy, populist take on gaming that paid off tremendously. It won the console wars for the first time in two system iterations, with sales figures more than quadrupling since the GameCube. It was a new dawn for Nintendo’s invasion of the living room, and that could have been the resurgence it needed to resume its former domination.

But something was already rotten in the state of Denmark. The critical third-party developers shed during Nintendo’s previous two missteps never really returned, and the vast majority of Wii game sales came from Nintendo’s own in-house titles. Regardless of their quality, Nintendo can’t develop games fast enough to keep an entire console alive by itself. As early as 2011, Wiis gathering dust had become the prevalent complaint about the console. This should have been a clue to Nintendo, and I really wish they’d taken it.

Nintendo, like Sega before it, is now synonymous with its various franchises. Nintendo means Zelda, Mario, Metroid, and Pokemon. Right now, the only way to enjoy a game featuring these dynasties is to buy a Nintendo console. Is this enough to support a whole console generation? Maybe not, but the Wii had stupendous sales thanks to the new casual market and the Wii’s motion-controller gimmick. But there’s a very subtle problem lurking in the word casual, a complication computer retailers have long suffered: without enthusiasm or need, there is no drive to upgrade. For some, the Wii can fill a role for occasional gaming much longer than Nintendo would like. Casual players live up to their name, and see no need to upgrade their game-o-matic.

But it gets worse. Tablets, phones, and similar devices can now fill that role without requiring a TV or specialized hardware. Want to play a quick game? Pull out a phone, bang out a round or two, and put it away for later. Games like this can even be engineered to be collaborative or competitive from any distance, again without the need for a console. The Nintendo 3DS can help deflect pressure from that vector, but for all the sales the new portable has racked up, that’s only a tiny fraction of the mobile market.

With all of this in mind, it was strange when Nintendo announced the Wii U. Compatible with older Wii games and saddled with another gimmicky controller, it struck potential buyers as an upgrade or an add-on rather than a new console. I personally fell into this group until a few months ago. As a result, sales have been dismal. The ridiculous and confusing name didn’t help the situation, either. Super Wii? Wii 2? Swiit? No? Literally anything else would have implied a new console generation better than Wii U. I can’t help but wonder if the Wii name itself backed them into a corner. In Japanese, two is pronounced “nee,” making Wii 2 sound like WeeNee, or Weenie for those in the dick-joke crowd. The problem stems from Nintendo allowing its marketing team to dictate too much of the corporate direction, and now it’s going to suffer for letting a one-time resurgence embed that kind of hokum.

It doesn’t have to be this way. Ever since the NES, Nintendo has been identified by its franchises. When Sega called the console business quits, it transitioned to developing the games it was known for. By the end of 2005, it was once again profitable and actually showing strong sales. The main difference, of course, is that Nintendo has always been profitable, even when it had the smallest piece of the console-war pie. That is irrelevant in the long run, though. If buyers purchase Nintendo consoles mainly to play Nintendo franchises, why not cut out the extra step? There are far more phones and tablets than any Nintendo platform will ever sell. The proliferation of mobile Nintendo emulators alone attests that consumers are clamoring for Nintendo games on their mobile devices.

No other gaming company has that kind of strong identification and user nostalgia, aside from possibly Sega and its Sonic games. Even Square Enix, long a mainstay of consoles, is testing the mobile market with releases on iOS and Android. At this point, Nintendo is leaving money on the table for no other reason than stubbornness. My Wii quickly became a glorified Roku after the novelty wore off, because Nintendo couldn’t produce enough games to keep it viable in the absence of third-party developers. The latest generation with the Wii U is reportedly worse, with publishers like EA going so far as to announce no further plans for any future games on the platform.

I’d never call for Nintendo itself to go out of business. That’s not what I want. In fact, considering the power of its various properties, that might not even be possible. However, it needs to get out of the console business. It had its day, and with physical media slowly being phased out in favor of downloads, consoles themselves may see their last generation with this latest batch. If Nintendo doesn’t transition to software, it may not survive that migration, and that would be a loss to us all.

Please, Nintendo. Stick to what you are good at, before it’s too late.



Free Copy of Instant PostgreSQL Starter

June 5th, 2013 | Published in Database, News, Tech Talk | 13 Comments


Another free book giveaway? What, am I running a library here? Well, it turns out Packt liked my review of Instant PostgreSQL Starter so much, they want me to host a short giveaway where you can obtain your very own copy for the low, low price of $0.

To those ends, I have a few brand new digital copies, made of shiny premium electrons, ready to dispense to three lucky commenters. Does that sound good? Is that something you want? In that case, you need to know that Instant PostgreSQL Starter really is a starter in every sense of the word. The assumption here is that you’ve never (or rarely) interacted with a database, and want to whet your appetite with some basics.

As my review stated, it starts with SQL basics and moves on to several features the author personally found noteworthy and useful: everything from basic performance analysis to cryptography, full-text search, and simple backups. It casts a very wide net, and seems to target application developers who need data storage, or someone tasked with administering a small database out of necessity. It’s a tiny smorgasbord that may lead a reader to pursue greater heights.

If that sounds like something that’s up your alley, all you need to do is leave a comment in this blog entry. Three lucky winners will be bequeathed an official copy to show off to their friends.

So remember:

  • Free e-book
  • Mention what interests you about it
  • Submit a comment
  • You’re entered

As always, I hope to see more PostgreSQL publications soon, and the popularity of works like this is the first step to getting there.


