December 30, 2004
One of the most complicated and often vexed relationships in the tech business is the one between product managers and engineers. If you ask a product manager what they do, they’ll often say something along the lines of “I [lead, guide, direct, oversee] product development” or “I come up with ideas for products”. A really good one will often put it more like, “I’m a hub for input into the product from all the functional areas, including marketing, sales, and customer service — then I interface with engineers to make sure the right product gets built”. Tension occurs between PM and eng when engineers push back on a product spec or (even worse) an idea that never makes it to a written spec.
Complicating the relationship is that technical product management performs many of the tasks you’d normally think of as “general management” in other fields — driving product, supporting revenue functions, dealing with operational crises — except that they are not empowered with hire-fire authority over the engineers who implement the product nor the operations folks who keep the infrastructure of the company running. In my experience, PM and eng have always been in separate reporting hierarchies up to the highest levels — so if there are serious conflicts of opinion between the two groups, often they cannot be resolved below the SVP or CEO level.
But if you step back from how things are today, one of my big questions is whether this is the correct relationship between PM and eng in a high-innovation company. Should product management propose, and engineering dispose? I’d like to suggest that it should be the other way around: that new product ideas should be extracted from engineers and then validated by product management.
In a company with serious engineering “special sauce”, engineers should be the only ones who have sufficiently deep technical knowledge to suggest major new products — if this is not the case, you have to start wondering how special the sauce really is. Contrary to popular misconceptions, engineers also often have a deep interest in user needs and revenue opportunities. What they don’t have is the time and training to validate their ideas for market opportunity, user impact, and cost-benefit to the company. That, it seems to me, is the proper domain of product: to conduct sufficient research into other groups’ ideas to efficiently validate or disprove them. (Please remember I’m talking ONLY about high-innovation work with significant novel engineering “special sauce”. I am explicitly NOT discussing incremental improvements, me-too features, or simply moving a well-understood real-world task online. In those cases, I think it’s possible PM might in fact be closer to user and revenue needs than engineers, and therefore may be able to provide useful guidance.)
And when I say “validate or disprove”, I mean by producing hard data and putting down firm bets — like “this feature will help grow average revenue per user by 2.5% within 3 months, and these are the data to support that thesis”. And this information should be collected in some central location, like a wiki, so the entire company knows exactly what research has already been done. This sounds easy, but I’ve never seen it effectively done by any product management organization. I’ve heard Amazon does something in this direction, and Google certainly has an “eng proposes, PM disposes” model; but too many other companies stick to the model that product management’s deliverable should be novel technology product ideas.
December 30, 2004
There’s a special satisfaction in seeing a well-written and information-packed article in the mainstream business press about something you care about. So although I generally stopped reading the “blog, short for weblog” articles eons ago, let me commend David Kirkpatrick and Daniel Roth for their piece from the 1/10/2005 issue of Fortune (pointer from Jeff Barr). Not only do the authors do a brisk but thorough job of explaining how blogs might impact businesses of all sizes, but the piece is filled with hilarious quotes like “biting the karmic weenie” and MSN Spaces as “the third leg of the communications stool” (note to Microsoft PR: you might want to read that shit out loud to yourself before you give it to the media). I especially enjoyed Bill Gates saying business blogs signaled a corporate culture that’s “open, communicative” and not “afraid to be self-critical”. Not that I’ve ever been critical of an employer on my blog, but it’s good to know that Bill has the right idea.
December 28, 2004
The highlight of our customary holiday trip to Chicago this year, at least for me, was a short visit to the Chicago Historical Society. Their unfortunately fussy and badly organized website gives no sense of the essential mission of the place, which is a shame. The best exhibits are the humblest and least curated (and, sadly, worst-placed) ones, like the ledger from the garrison at Fort Dearborn or the maps of the stockyards. And they have dioramas! Who doesn’t love dioramas?
While we were standing there, quietly looking over a scale model of the original Fort Dearborn (made of twigs!), we were accosted by Abe Lincoln and Stephen Douglas, who wanted us to attend their debate. After recalling that each of their debates went on for more than 3 hours, we politely demurred.
It’s funny, because when I was training to be a historian I was rather sniffy about people who were into artifacts. In my view, which I didn’t change until very late in graduate school, artifacts were for anthropologists… history was about The Word and it was a great mistake to be taken in by the glamor of old things. I guess now that I don’t have to worry about such doctrinal points for a living, I can enjoy historical museums and the spirit of civic pride that motivates them.
December 19, 2004
It’s now of course a truism that Gmail shook up the moribund world of webmail with its newer, slicker, webbier user experience (“Huh… whodathunk anyone would want a whole thread on one page??!?!”). What might be slightly less obvious, unless you happen to be a webdev, is the larger effect Google is having on the web development world.
I dunno about other places, but around here from approximately the beginning of 2001 to the end of 2003 — three LOOONNNGGG years — there was absolutely no money available for any interesting web stuff whatsoever. More than that, there was a whinging-pissing “been there, done that, more jaded than thou” attitude going around, where people were sort of determined that they were never again going to be enthused now that the bubble had burst. You could have done the webdev equivalent of shooting firebolts out your ass while juggling live turkeys and driving a Formula One car with your pinky-toes, and people would still have sighed and said, “But why would anyone want that?”

This willful ennui was most marked towards front-end webdev, which was in crisis around that time for other reasons anyway — leading to a phenomenon we’ll call “DHTML Winter”. The nadir for me was when ScottAndrew et al’s long-awaited DHTML Bible was cancelled in press (by my own publisher!) for an alleged lack of salability. After that, is it any surprise that by 2004 there were so very few accomplished DHTML practitioners to be found? Lots of webdevs quit the business and went to law school, or basically just stopped giving a crap. It was a shame and an irony, because shortly after the darkest hours of DHTML the situation started to get radically better due to the release of one DOM-capable browser after another — but with no one interested in building anything that pushed the limits of the new browsers, webdev couldn’t develop much beyond the limits of small personal experiments.
But after Gmail hit, the whole “But why would anyone want that?” thing deflated overnight. It turns out that if you build dope shit, people often do want it. It makes competitors look old and tired. It makes your engineers happy. And it may very well provide new monetization opportunities as well.
The Goog has also been good for the careers of the few remaining survivors of DHTML Winter. Now it seems like every web property with the slightest claim to hotness has suddenly decided they need their very own DHTML bunny. Meanwhile, Google is quietly hiring — they probably have between 5 and 10 of the top front-end devs, which doesn’t sound like much until you realize there might be only a couple dozen out there with significant experience. That decreases the supply of the remaining ones, which as we all know means you’re gonna have to show them some serious love to enjoy their scarce favors.
Most importantly, Google seems to be putting some chips down on the DHTML side of the table instead of the Flash or XAML or XUL or Laszlo sides. Given how important a few technology leaders are as role models for all developers — I dunno how many arguments I’ve had where the magic words, “But Yahoo/Amazon/Google does it this way!” work their incantatory magic — this is very much a Good Thing. So I’ve got to thank Google for making it viable to spend money building newer, faster, more responsive, standards-compliant, user-centric, cross-browser interfaces. Whatever the opposite of “collateral damage” is — collateral benevolence? — Google is doing it for webdev now.
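For anyone who hasn’t peeked under Gmail’s hood, the core trick is simpler than the hype suggests: fetch data in the background with XMLHttpRequest and rewrite part of the page in place, instead of doing a full reload. Here’s a minimal, illustrative sketch of that pattern, 2004-style — this is not Gmail’s actual code, and the function names (`buildQueryString`, `createRequest`, `fetchInto`) are just ones I made up for the example.

```javascript
// Serialize {a: 1, b: "x y"} into "a=1&b=x%20y" for a GET query string.
function buildQueryString(params) {
  var parts = [];
  for (var key in params) {
    parts.push(encodeURIComponent(key) + "=" + encodeURIComponent(params[key]));
  }
  return parts.join("&");
}

// Cross-browser request object: standards browsers vs. IE6's ActiveX control.
function createRequest() {
  if (window.XMLHttpRequest) return new XMLHttpRequest();
  return new ActiveXObject("Microsoft.XMLHTTP");
}

// Fetch a URL asynchronously and swap the response into an element in place,
// without reloading the whole page.
function fetchInto(url, params, elementId) {
  var req = createRequest();
  req.open("GET", url + "?" + buildQueryString(params), true); // true = async
  req.onreadystatechange = function () {
    if (req.readyState === 4 && req.status === 200) {
      document.getElementById(elementId).innerHTML = req.responseText;
    }
  };
  req.send(null);
}
```

That’s the whole “responsive, webbier” magic in miniature: no plugin, no page flicker, just the DOM and a background request — which is exactly why it runs in any DOM-capable browser instead of requiring Flash or XAML.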
December 18, 2004
A few months ago when Tim and I went to Best Buy for some reason, we saw these brochures for their new Geek Squad service. Since we are definitely part of the Tech Support generation, we thought this would be a brilliant way to lay off some guilt quickly and easily and relatively cheaply.
Finally we got our chance. A year or two ago, I set up the in-laws (who are surprisingly wired for their age) with DSL for their new matching iMacs. But something went wrong recently, which seemed very likely to be the shared router. So I called the Geek Squad hotline (their branding is all Men in Black cum Maxwell Smart) to order up a fresh geek to visit the in-law home with a new router.
So far, pretty good but not perfect. The visit was scheduled promptly, but there was some static about the Macintosh thing — apparently most of the Agents are Windows-only specialists (surprise!). What surprised me was that I had to make a special request for them to pick out a router and take it over there to install. Wouldn’t that be a super easy revenue-maximizing opportunity for Best Buy? And very bumming is that I couldn’t pick up the tab for the service call, because they bill during the visit. In-home computer service is a brilliant gift that a lot of people would appreciate being able to send to their loved ones. They don’t even seem to have gift certificates, which seems weird. Are you listening, Best Buy?
UPDATE: The geek showed up ahead of schedule, installed the router, and made my in-laws’ day.
December 16, 2004
For various reasons I have recently been elbow-deep in wikis, and have come to loathe them as much as I find them useful. I’ve used or evaluated PHPWiki, MoinMoin, TWiki, TikiWiki, PMWiki, MediaWiki, TiddlyWiki, coWiki, and Trac — and I hate them all. They might be OK if you just want to use them out of the box… but the minute you start customizing, it’s enough to make you want to throw up.

There are a few good things to mention. MoinMoin has got to be the ugliest wiki ever made, and if there’s a way to make it prettier I certainly haven’t figured it out — but it has the best site indexes of the bunch. You can easily get a list of all the pages on the wiki, all the wiki-words, all the orphaned pages, etc. I do not understand why more wikis don’t do this, given the magnitude of the “treasure-trove without a mineshaft” problem endemic to wikis. MediaWiki gets rid of the stupid wiki-words, and also uses stylesheets correctly to separate code from presentation — try switching from the default style to the BlueDanube style, for instance. TiddlyWiki is a sweet experiment in user interface, which unfortunately will need better plumbing to actually be useful.
But the bad and the ugly far outweigh the good. For instance, PHPWiki is now my number one exemplar of the “Why OOP in PHP is fat and slow and hard to debug” theory. Try dumping out the data on any page… no, on second thought, I can’t wait that long (hint: last time I did it, I ended up with 1600+ lines of data). An average wiki page should, in my humble opinion, take no more than .01 seconds to be served… PHPWiki regularly exceeds that by more than 10X. Now try changing any little thing, and note how this results in an instant and information-free fatal error. One of the nice things about PHP is that generally when it fails you get good debugging info… but not with its object system, which simply fatal-errors out. Or check out the fact that MediaWiki actually has two different stylesheet systems, depending on whether you use PHP4 or PHP5 — more or less totally undocumented, of course. Or note PMWiki, which I couldn’t even figure out how to install.
I know some of you are thinking that I’m a moron right now… and maybe I am. But I’d challenge anyone to take any two wikis, and time themselves doing simple basic tasks: for instance, change the main graphic, the colors, the copyright notice in the footer, and the links in the navbars. Those would be sort of the minimum customization things I’d expect to do to any software package. For extra credit, limit edit permissions to only registered users from a certain email domain, add a new CSS style for links that go off-site, and figure out how to dump out all the pages into a zipfile. Then come back and tell me how usable and maintainable these things are.
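To give a sense of how small these extra-credit tasks really are: “add a new CSS style for links that go off-site” is maybe twenty lines of plain DHTML if the wiki would just let you hook into its templates. Here’s a hedged sketch of how I’d do it in any wiki that lets you add a script and a stylesheet rule — the function names (`isOffsiteLink`, `decorateLinks`) are mine, not from any of the wikis above, and the URL parsing is the quick string-hacking kind, not bulletproof.

```javascript
// Pure check: is this href an absolute URL pointing at a different host?
// Relative URLs and fragments count as on-site.
function isOffsiteLink(href, siteHost) {
  if (!/^https?:\/\//i.test(href)) return false;
  var host = href.replace(/^https?:\/\//i, "").split("/")[0].toLowerCase();
  return host !== siteHost.toLowerCase();
}

// Walk every <a> on the page and tag the external ones with a class
// the wiki's stylesheet can target. Run this from window.onload.
function decorateLinks(siteHost) {
  for (var i = 0; i < document.links.length; i++) {
    var a = document.links[i];
    if (isOffsiteLink(a.getAttribute("href") || "", siteHost)) {
      a.className += (a.className ? " " : "") + "offsite";
    }
  }
}
```

Then one rule in the wiki’s stylesheet — something like `a.offsite { padding-right: 12px; background: url(ext.gif) no-repeat right center; }` — finishes the job. (The script approach is the 2004-compatible route, since IE6 can’t do CSS attribute selectors like `a[href^="http"]`.) The point isn’t that this is hard; it’s that finding *where* to put these two snippets in most of the wikis above is the part that makes you want to throw up.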
December 12, 2004
Every year about this time, as I prepare to write a check to the EFF, I have a twinge of guilt. It is, after all, an institution that is predominantly of interest to one of the most privileged segments of our society. Instead of supporting Iraqis bombed out of their houses, or Sudanese refugees, or unemployed Americans, or the environment… I choose to funnel checks to the protector of abstract liberties for technocrats. I always feel apologetic and almost furtive about it.
In the end, I can only justify it because no one else cares. Yes, it’s self-interest on the part of the digerati… but that doesn’t make the Bill of Rights in the age of mechanical reproduction any less important. I can barely make intelligent, well-educated, First Amendment-loving non-techies even understand the issues, much less shell out buckage to fight the forces of Big Media and the police. And other interested parties, like librarians and small music publishers, don’t have massive financial resources to contribute. So I guess I’ll continue to squeamishly write my end of year checks, and follow the stories, and hope that the EFF’s finger in the dike buys us some time… but it’s sad that we have to pay money to wage a small rearguard action on behalf of the Constitution.