I don't normally mention this sort of thing but I really admire the work and the sheer hilarity of the interface for Ground Sniper's Bypass. The popup when loading the page says that it is currently broken but the funny as hell tribute to corporate web presence is still there. I'm glad to see more efforts like these being made. Hats off to routing around damage...
I saved Danny Sullivan's post on the suckiness of search interfaces in an offline reader and promptly forgot about it for a week or so. This may sound like more of my milking-a-post-from-something-I-forgot-to-do methodology but there is actually a point buried in here somewhere. The point is about interface, and by this I don't mean the more commonly and narrowly defined concept of how buttons are arranged in an application's panes and in which order they appear. What I'm thinking of has more to do with the much maligned metaphor of interaction. Neal Stephenson gave us a good deconstruction of why this is conceptually flawed in In the Beginning Was the Command Line and compared commercial operating system interfaces to the faux-thenticity that Disney employs to make us suspend disbelief: we participate in the illusion as actual participants, emotionally drawn into the display, instead of simple observers apt to notice inaccuracy. The devil is really in the details when you need to perceive just the veneer, a zeitgeist entity that offers an illusion of choice in a world of constraints determined entirely by the environment. There is no way you're going to be truly convinced you're tromping around in the jungle if you simply circulate through the props on a cart that moves along a track. Disney might force you to travel the same route through the fake scenery every time but it will also maintain the illusion that you've either made a choice or reacted to another piece of the scenery. I guess that is the complicated part about interacting with metaphors as reality.
What I started thinking about was the metaphor of searching the web and how many different areas of enquiry are folded into that term. In many ways 'search' is a pretty simplified term, close to the use of 'internet' to mean the large blue E on the desktop or just the web. Defining hugely expansive categories of both objects and sources with generic terms always steers like the Titanic into semantic icebergs that gain their massive size by alternating between the lexical and the conceptual. This is where the interface to 'search' becomes really important. If I am running a query in a database of peer-edited academic journals I really shouldn't be surprised by the returned results. I went into the text field with an expectation and the end result was within the constraints of my expectations. The academic database carries a whole lot of semantic weight with it and is usually more formally organized than, say, a Wikipedia article on the same general topic. It draws from a limited number of known quantities and makes for a whole lot less time spent thumbing through dusty indexes. You know what to expect.
The same is true of traditional search interfaces: you slap some words in a text field, press the appropriate go button, and your results are returned in a hierarchical list, sorted by invisible algorithms that you're accustomed to. If you've come up using Yahoo as your search weapon of choice then you've come to expect the first set of results to be sponsored crap and know to skip those results unless you're in the buying state of mind. It works the same for Google or whatever as well. If you've come to depend on that format and accepted its limitations (advertising, more or less) then you actually get what you're looking for. I think that is the reason why so many of the ultra-deluxe interface search engines haven't been successful. The initial presentation doesn't matter a whole lot when it amounts to complexity: a whole lot of pulleys and levers that don't have an obvious use or impact on the submission of search terms. It's more crap to navigate through if you're unfamiliar with the interface and more presentation layer between user and results. Extended features are better presented as a part of the search and not another knob that will likely be ignored. CSS-fu is probably wasted on the search interface.
This all comes back to the reason that this article sat unread in my offline reader for so many days. I consciously went through the effort to archive the article with the intent of giving it a more thorough read while riding the bus to work and promptly forgot about it. I had a version open in an actual browser and accidentally closed the tab, so in my head that article was gone and would have to be looked up again when I had a live network connection. Why didn't I realize this? A good portion is probably attributable to it being before 7 am but, again, the interface that I'm used to didn't have the content so I just assumed that I wouldn't be able to get at it. Another item to add to the infinite to do list I guess.
Yoon noticed earlier tonight that I was missing a post and when I actually looked into it I was missing two posts. Right now I'm too groggy with a dose of Nyquil to go digging any deeper than this but I restored both posts from draft copies and twiddled the date stamps in WordPress for what I hope will be the last time. Any clues why this might have happened? I'm tempted to blame one of the weblog clients that I use but I don't think they're at fault. I may post something over on the WordPress forum when I'm feeling a little less bleary as I'd love to understand what actually happened and not deal with it by posting drafts that may be missing later edits. Urgh.
Seems as if some MySQL table data was corrupted and my host had to restore from a day old backup. It figures.
Even though it is pretty much irrelevant these days it looks like the GIF patent is going to expire on 10/01. This is both good news and meh news. The good news is that you’re wandering into less sketchy territory when designing for the awfulness that IE6 represents if you need to use transparency. The PNG versus IE crappiness is probably never going to end and neither will the overuse of WYSIWYG editors that default to GIF and JPG formats. In a few short days you’ll be free to litter your own little hunk of the web with as many terrible animations and eye mauling contrasts as you feel like. It could lead to a renaissance of free web pages on places like GeoCities or something. Don’t know or care but that becomes a legal option now. The bad part is that this freedom to use the format was simply the result of a patent expiring. The black boxing of such a ubiquitous format should not have been legally ambiguous for so long. It probably doesn’t make a bit of difference.
This probably isn't the proper place for these sorts of postings but I really can't resist, as unsafe as it might be to mention at this exact moment in time. I've started sending out applications and resumes in search of a new job only five months after starting this one. I really, really hate looking for work and the availability of online job postings that sometimes contain 700 words or better of job description only bolsters that distaste. Getting there is not half the fun nor any fun at all in particular. The clinical and descriptive language of these listings makes me feel like a badly bent plug looking for a socket loose enough for me to slide into while trying to avoid the sparks, dimming lights, and other side effects of being wedged into a new void that may or may not be the void I am looking for once the dividing line of the face plate is passed. Melodramatic as fuck, I realize, but also a fair approximation of the way I'm feeling which is a combination of exhausted, dispirited, and, well, exhausted.
The important point here is that I came to a terrible realization a couple of weeks ago and it's an uncomfortable realization to make: I don't like my present job and, in terms of moving on and away from it in the future, it is a death trap instead of an environment where I can learn anything new. I do learn new things every day but they are usually pure procedure and contradictory to what I reluctantly absorbed in the weeks before. The problem is that the work simply isn't challenging in the intellectual sense. Sure, it's grueling being micromanaged for nine hours a day and having mindless and repetitious tasks piled on top of an already precarious workload but at the end of the day I don't get the same feeling that I had when working elbow deep in an unending line of broken machines. The "Wow, I kicked that mess into shape and I still have most of my fingers" feeling has been replaced with a numbness that is slowly degenerating into despair. Anyone, with a little training, could walk into this job and excel. It would probably be easier if I knew absolutely nothing and could be enthusiastic about describing the path to Tools > Internet Options > Temporary Internet Files but I'm not and no amount of pretending is going to make me feel any better about this.
So, after admitting all of that, first to myself before working it out on the keyboard, I'm pretty relieved to admit that what I'm doing is utter shit, although secure and relatively well paying shit, and that I need to get the fuck out now. The process will suck as it always does but it feels like nothing when compared to the thought of continuing this in perpetuity. If that isn't a sure sign that it is time to move on then I don't know what is. Getting myself mentally prepared for this has been oddly cathartic as filling out what feels like reams of Word documents to submit to an invisible human resources department actually felt productive for some reason and finishing up all that hoop jumping and creative arrangement of fact put me in a more positive frame of mind. I guess there is also the feeling that an invisible countdown has begun although that will slow nearly to a halt by the time anything actually starts to happen. Weird, huh?
It Won’t Ever Go Away Even After Silver Bullets Are Fired and Stakes Are Driven Through Numerous Hearts
Having far too much experience (ha! get it?! ha!) with its various shortcomings and features that are worse than the problems they attempt to fix, I found Rob Pegoraro's summary of five years of suffering through Windows XP pretty amusing in a cringing way, like watching a YouTube video of someone getting plowed in the groin with a shovel handle: it's funny because the pain is resonant with my own experience.
One of the points that never crossed my mind that Pegoraro brings up is the willingness of users to let updates slide out of fear that WGA will cripple their OEM purchased systems. Ouch. I suppose I'm too accustomed to dealing with site licensed copies of XP to have really thought about what a conundrum this represents for most single license home users. What is the preferable choice here really? You can follow a how-to available online that may or may not work, depending on how your individual machine is set up, or you can opt for the cyclical pattern of simply pulling a backup and reinstalling whenever Windows starts complaining about something being amiss. That is a tremendous time sink although it may be one that Windows users at home (especially those who tend to tinker with their systems above and beyond the usual web browsing and email sending expectations that MSFT has for home users) are accustomed to by this late date. I wonder how many people are now completely unwilling to run Windows Update simply because it now lies to you about the actual content of patches instead of just randomly breaking applications and areas of the operating system. It makes one long for the days of simple self implosion.
One point that Pegoraro brings up that produces nothing but winces and nervous glances around the room is the availability of System Restore in the default installation. It is buried a couple of menus deep but I'd prefer if it were sealed in a lead box deep under the surface of the Earth. Nothing in Windows is more accommodating to either spyware or viruses than System Restore. I guess it made a nice window decoration (I kill me) to reassure folks that they could actually extract their useful stuff or files that they needed first thing in the morning from the wreckage of a downed system. The bad part is that malware authors have also been keenly aware of the promise of System Restore since it was birthed by a combination of a pentagram chalked on some board room floor in Redmond and the force that truly drives Windows development: inertia. I disable it by default on every machine I set up.
The domination of the registry, if you can call such a Lord of the Flies disorder any kind of systemic organization, has still not ended and there is still no useful versioning system (unless you call date stamps versioning) for DLLs. The registry would be a useful development tool but trusting your asschip to something that doesn't version libraries is a risk I am nowhere near foolhardy enough to take or lazy enough to expect anyone else to take. regedit has unfortunately become a more necessary administrator tool than any of the other management consoles corralled under the actual title "Administrative Tools" and that is bad, bad juju. What I'm really curious about is how this nightmare of hackishness is going to be circumvented in Vista. I have pretty grave doubts about the reality of this promise. This relates to another point made in the article: when will Windows do something like keeping track of what installers install? Trusting this responsibility to individual developers seems like another terrible decision waiting to strike fear into my heart on yet another day and on yet another platform.
Vista really has a lot of bad precedent to overcome and I'm guessing that feature-itis will be the true focus of the release. I imagine that hoping otherwise, even if it is a calculated self deception aimed at keeping us sane through just one more problem that shouldn't be or one more reinstall that could have been avoided through proper account separation, is a false hope, and the more we collectively buy into the marketing hype that this will be the BEST WINDOWS EVAR the more slack this gives MSFT in dealing with this and the myriad of other bugs that should be release blocking.
I've noticed that since the last round of Apple updates the LCD in my MacBook Pro has been flickering on occasion. I thought that I was probably losing my mind until one of my coworkers mentioned that he was noticing the same behavior on his machine which is a slightly newer version of mine (2.0 versus 1.83). Has anyone else noticed this, either on Intel machines or older ones? Curious but it hasn't annoyed me enough to do any real searching for a possible cause of the weirdness quite yet.
There Are Two Jobs That Need Doing Well Here And Most Applications Suck At Doing Either Of Them Simultaneously
I'm going to agree heartily with Matt Dorn and say that most word processing applications are intended to do far too much. The kicker here is that they are set up to do all of these tasks with surprisingly small yields in terms of work to product ratio. The actual composition of text is secondary in the functionality of word processors to the howling wasteland of features that are piled atop the already overloaded heap in hopes of making a desktop environment. What aids and abets this abuse is that most people are familiar with a single WP interface (pretty safe to assume that the default here is the Cthulhuoid horror known in certain circles as MSOffice, where only a few scant tentacles exposed to the surface world hint at the misshapen continent of fangs and other pointy appendages that hides behind that idiotic paperclip) and don't separate the idea of editing text from the idea of formatting text with footnotes, clip art, and other gewgaws. If you come from the elder school of computer use the idea of this separation might not seem so alien. Have you ever written a letter in Quark? It is not pretty and makes the task less about finishing this thing that you eventually need to finish and more about navigating an endless stream of options and menus.
OpenOffice is the shining example of why simply cloning an interface is such a shitty idea. It is, more than anything else, a marvel of engineering as evidenced by its amazingly fast evolution and feature to feature compatibility with MSOffice. That said, do we really need a second Galactus in the computer universe sucking down entire planets worth of energy just to sustain itself? I'm guessing not but then again I am just a user who wants his application to work sanely. This means I would like not to feel like I need an independent instance of a computer dedicated to writing some text. If I were a usability expert then I'm sure the order of buttons in some print dialog could distract me enough not to notice this catastrophic abuse of resources.
Another distressing part of this text producing monoculture (at least in the corporate/business world) is that this arbitrary inflation of resources moves outward from whatever is produced under its aegis like pollution from an oil spill. Why wouldn't you want to use Word as your default editor in Outlook (that is another rant entirely and by that I mean the idea of Outlook) when the editing interface is so damned familiar? If you need to read some mail through a text based mail client, which I do daily as part of my job, you'll rapidly understand how atrocious scads of markup and other line trash look in their plain text incarnation when not paired with Mount St. Office to translate for you. This sounds like the rant of a crank and to a certain degree it is, but the one really, really important distinction that needs to be made is that most text reading software for the visually impaired depends on things intended for transmission in plain text actually reading like plain text. In that case, you're spending time and energy to do excessive formatting by default that will make the end product completely useless to the intended recipient.
I can't imagine that users would ever tolerate a return to the plain text editor. I'm not going to advocate the wider use of Emacs by non-programmers or anything as I think that is also a move in the direction of burying users under a pile of features they will never use or even know about. What I would love to see, and this of course would depend on something like being appointed Bad Application Czar by the federal government, is the separation of the steps of composition into more discrete portions on the application side.
Why not have an edit portion of an application suite that works like AbiWord, only with a feature set reduced to what might be necessary to compose plain old text, or another relatively complete yet decidedly non-monolithic text editing application (yes, definitions are being stretched here but bear with me for a few hundred more words or so), or an even more stripped down version of that very basic functionality that, when signaled, invokes the formatting portion of the application (hopefully a separate and also smaller application) that handles layout and formatting? The implementation that I have in mind is one that also saves the work (text, then formatting, and then the final product) in discrete steps that produce individual files instead of dedicating more square feet of memory to caching for an anticipated undo.
I like the idea of applications being strapped together so they're launched in an order that makes sense for doing things with text but pairing them so that each function becomes the domain of a piece of software that happens to do that particular task very well and doesn't require the loading of a small solar system into memory simply to edit a document. Does this make the task more complex? Maybe but it also makes the individual yet important steps of creating documents distinct from one another and doesn't encourage the user to attempt all of these steps simultaneously.
Another point that interests me is what distinguishes a text editor from a word processor. Matt calls attention to this ambiguous distinction in his post in the Challenges section. My mind returned to this question a couple of times during the day when I wasn't immediately occupied with other things. I think I try to use word processors more like text editors. I don't format as I'm typing. This isn't a decision that I made consciously but it developed as a reaction to the addition of many features of MSWORD that are enabled in the default setup. Have you ever really fucked up the formatting in a Word document? I've done it so badly and apparently irreversibly that I ended up saving a copy of the document as text and then opening a copy to work on. The more times that I painted myself into a corner the more often I thought about leaving the text as just text until I was ready to print the sucker out. I think that is what really distinguishes the two concepts for me: word processors are used to create documents intended for consumption as documents by other people, while text editors, in which features like syntax highlighting, indentation, and parentheses/bracket matching are more prevalent and important, are intended more for the production of things that are basically finished once they are composed, like programming. Programming is close to the ultimate case when it comes to requiring focus on the actual composition of a file.
That aforementioned intended presentation of work done with a word processor is what really makes me feel like my own predilection for a simplified flow of tasks and less specific file formats is more important than I initially thought. The weighing of the classic KDE versus Gnome desktop environments in terms of uber configurability against the reductionist view of the desktop as something that must be configured one way and whittled down by removing many common options seems pretty insignificant in comparison to the concerns you should have when sharing documents with other people. This becomes especially important when those documents are produced by software designed to output things for printing and duplication and are often viewed as files on machines configured differently than those of the creator. The model isn't obsolete but it seems pretty broken in a world where MSFT didn't squash every other operating system and piece of software out of existence. Users are eventually going to become more attuned to this more sophisticated digital divide if only by being annoyed to death by various platform adherents, the requirements of government organizations trying to rid themselves of an incompatible document format, or the simple fact that more mail servers are stripping potentially dangerous attachments. Text works for everyone and requires a lot less of its users on both sides of the equation.
This week has been oddly meat grinder-ish and I'm really far behind on really simple tasks like composing 100 word replies to emails that have been rapidly decomposing over the past two weeks. To begin, sorry if I owe you an email. I will try to just dedicate some late night time over the weekend to working through all of the stuff that requires an answer of some kind. I'm not promising anything but I occasionally need to throw stuff out in a tone approximating guilt about inactivity and apathy. The mileage of these sorts of proclamations will vary with individual use but if you're expecting a reply from me about anything short of life or death importance I suspect you're aware of this already.
For the first time ever (I think) in the four year history of Team Murder I'm going to link to an article at OSNews that I agree with for the most part. Advocating Linux as some kind of technological panacea for all that ails both corporate/business and personal computing has always seemed like a pretty stupid idea, if only because the user isn't going to change and that necessarily leaves the obligation to adapt with the distribution instead.
I think it's become pretty obvious at this point that the strongest development done on either the kernel or the hojillion pieces that make up a Linux distribution may well be done under the auspices of a corporate benefactor, but I have yet to see a commercial and end user focused distribution do very well. The ones developed commercially to begin with end up penniless and with a broken distribution where countless man hours have been wasted customizing desktop environment windows and wrapping sudo around fucking everything. Distributions that have been forked or acquired seem to fare a little better on the server side of the table because they offer actual technical support when a production machine goes belly up, whether or not that support is used. Big organizations love to have someone to yell at, preferably attached to a toll free number. Anyway, right now a lot of companies with money to throw around are backing Linux development which is great because it subsidizes important and low level stuff that hobbyist hackers just don't have the resources to tackle as efficiently. That association seems to generally work: salaries are paid, code is hacked out full time, and everyone benefits whether they directly interact with that software or not -- chances are that they do without ever knowing it.
The problem with tailoring anything towards users is that users neither know the limitations of programming languages that are passed along to any given pile of software written in that language nor exactly what they want aside from vague suggestions like 'I want it to work with everything.' Taking these sorts of complaints as anything other than general feedback is part of the reason that so many operating systems end up nightmarish and full of hacks. Part of that has something to do with the fact that 'desktop' is the most nebulous term imaginable. What exactly does it mean when you say 'aimed at the desktop user'? Everything and nothing.
All of this is, of course, immaterial as people seem more than willing to launch projects with this imprecise audience as their intended target for effort. One of the concepts important to this that Martin Girard expresses very well in his post is the intentional accessibility and visibility of the guts of a Linux distribution to its user and how important this sort of accessibility is to keeping the geek crowd interested in an operating system and in developing for it. While APIs are nice as a way to avoid unnecessary complexity when dealing with interface issues it is also preferable to see how the lower level components actually interact with each other. No sane person wants to develop exclusively on a platform that limits a programmer to a handful of methods determined, without any real developer input, behind closed doors. Some people are willing to balance between the two in order to develop commercial applications, which more often than not requires the use of tools controlled by the same company.
The reason that I find myself agreeing with most of what is said in the article is that people aren't necessarily talking about something that really exists. They're speaking about an idealized conceptualization (as incomplete and backasswards as that may be) of how an operating system should work for them which isn't implemented by any of the big players or even the more marginal entries. I think Linux development, more than any of the others that have an interest in the desktop market, will be more likely not to make stupid and hasty decisions based on the demands of users. Does that make Linux development inherently user hostile? Maybe but that also creates an environment more accommodating for people who develop software to use on the operating system and one that doesn't have the slew of exceptions to common sense rules of low level design. It makes sense to the people who build and use it which is far more than I can say for the commercial variants.
It seems like a whole lot of people are having trouble with the new version of iTunes on both Windows and OS X. I ran software updates the minute they came out and haven't seen any problems other than the mysterious quit on first start that most other people have experienced. I'm curious what the hell is going on here and I'm guessing that people minus the iPods and on the Intel architecture are having fewer problems. Just a guess, but I've yet to see any noticeable glitches. The only one that really bothered me was the kinda lame album artwork tool. I had just disabled it and then when I was showing a coworker what it looked like a good majority of the covers had mysteriously appeared.
I've also been hearing more rumor mongering about the potential for an Apple phone. Although I wouldn't spend what is sure to be a premium for another feature bloated, sexily designed, and utterly fragile cell phone I can certainly see how other people might. How many people do I know that have what seem like disposable Razr phones already? Something that could match the fashion accessory iPod? Please. I'll have to disclaim this to a certain degree because I'm continually surrounded by college students that have fewer hand-to-mouth problems than most of the fellow students I'm accustomed to being in the midst of. The alleged preview shot of the device looks pretty similar to the crap that sells like crazy now so if the lag between rumor and release isn't eons they might have a passable winner here for a first iteration.
If you've ever installed an older version of Debian (think Potato and earlier though it will pain you to do so) then Joey's bit of humor will strike home with you. The installer really is that much better and does a fair amount of the auto-detection that so many have grown accustomed to with more specialized and/or commercial distributions. I managed to get a laptop up and running with a much earlier version of this installer in a matter of thirty minutes. I used to rely on the free version of Libranet to take care of this but it is looking like those days are over. I'm probably going to wipe the install on my Acer laptop at some point in the near future and increasingly I'm thinking that Debian and I might be due for another fling.
I too am pretty goddamned tired of September 11th and not just on the day when everyone simultaneously turns into a weepy mourner and a bloodthirsty cheerleader. I'm probably also less sympathetic than the current target of ire on the topic. Although it is impossible not to empathize with the losses of people who actually lost loved ones and friends during the attack, it is equally hard not to feel a scarcely restrained fury towards those who would piss on the tragedy like some kind of territorial marking ritual and hijack it like a lobbyist maneuver to slide people closer to their side of the political spectrum. It used to be called waving the bloody shirt instead of presumed patriotic duty and now it is swallowed whole as duty to god and country, inseparable as those two things have become in the damaged American mind. Instead of becoming some kind of experience that marks a dark time in American history it has become a blanket excuse for 'fighting terrorism' which apparently comes from every direction depending on which way the money is blowing in this week.
It is also frustrating that many Americans got themselves so wrapped up in the event that they fail to remember that they did not know a single person that died in either of the successful attacks (the Pentagon isn't quite as sexy though so I forgive you for never remembering never to forget that less glamourous strike) and instead drew all of the turmoil inward with all of the emotional depth of a reaction to one of those "Hang in there!" puppies on the posters. If you're going to be devastated over the losses that others suffered you might want to consider pulling your head momentarily from the depths of your fat American ass and looking at the devastation that followed the attacks. You will be paying for the mission that wasn't accomplished for the remainder of your life and, since you're probably not a millionaire unless you had the foresight to invest in Halliburton stock, so will your children. It's fucking exhausting just to consider which I guess is why it is so appealing to simply slap more empty slogans emblazoned on magnets on cars.
So, I stopped listening. It helps me feel less doomed and less like I live in a country looking for a tar pit large enough to stagger into to make room for less ridiculous countries. Not coincidentally, I will likely delete comments. I'm guessing you weren't there and your tear-soaked utterances that would shame Hallmark into silence benefit no one. Fuck it. You're entitled to anger (rage even) and sadness but the endless stream of excuses for whatever insanity pops into your head is wearing me fresh out of patience. It seems like the handful of countries that make up the rest of the planet are sort of feeling that way as well.
Actually the host I use did some kind of hardware upgrade on the server which, as usual, fucking broke everything in the world for ten hours or so. I only lost like one post so I guess this was a smoother upgrade than I'm used to. I did manage to fuck up the order of things and their associated numbering by experimentally adding archived posts in an arrangement that made sense when I was half awake this morning but makes little to no sense at the moment.
As mentioned, there was only one post lost in the process that I did not have a copy of. It was some babble about vi search and how stuff like it related to things like surfraw. You didn't miss much.
Maybe I've spent too many hours slogging through reams of code in interpreted languages (I guess that would be 'scripting' or 'not real' to all of the purists out there) but handling memory management in a language that isn't good old fashioned C, where deallocation is just a 'release' message away instead of a free() call, just seems strange to me. I've been playing with Objective-C a fair amount lately and the count-incrementing approach seems completely alien to me.
I guess it's called a 'retain count' but it is basically the same thing, stuck in a weird limbo between the lower-level control offered by memory management in C (some would refer to the lack of it but I'm being nice here for reasons I don't really understand) and the more automated garbage collection found in higher-level languages. Objective-C attaches an invisible count to each object, bumped up and down by retain and release messages, which seems much harder to keep track of. On one hand you can send release messages to objects you no longer need to reference, but that invisible integer is really what controls whether the memory gets deallocated or not.
All of this is dandy for the dinky little snippets of stuff that I'm writing now but I'd love to find a debugging utility that keeps track of these counts. Leaking memory makes the baby jesus cry tears made of flame and this seems like the ideal setup for those sorts of fits of crying. They are second only to the flood of tears I will produce when trying to debug the allocs in a larger batch of code. Am I just being stupid here? My acquaintance with object-oriented languages is admittedly pretty tentative but it seems like there should be an easier way of keeping track of piles of arrays than keeping track of a count manually. As always, clues are much appreciated and applied liberally whenever possible.
When Land of the Dead was released there was a fair amount of talk circulating about future Romero zombie films. The man himself was a little vague when he mentioned it in the DVD/Director's Cut extras but it was definitely there and, of course, led to all sorts of crazy speculation about Land of the Dead being the start of another extended story and other iffy extensions. None of those are bad concepts but none of it really sounds like a George Romero sequence of events. So, now Diary of the Dead has actually been announced and has dates attached to it.
I feel uneasy about the movie if only because The Blair Witch Project is an obvious point of comparison since it mirrors parts of that story: people head into the woods to make a movie and scary things happen. I am curious about what the film will actually look like since the film/video footage combination might make for some interesting atmosphere and Romero is an incredibly skilled editor. Those quick cuts so common in horror movies (especially the 1980's splatter movie variety) were popularized by his early movies. I'm excited but fairly worried at the same time. The word on the internets is that it will start filming in October. Okay, so I am more excited now, though the 2008 release isn't as exciting as the fact that filming starts next month. We'll just have to see what the final result looks like.
I do not particularly understand the terror that most of my fellow tech support folks feel when they are confronted by a Macintosh. Even before I had one that I carried around with me everywhere it was pretty easy to chuck a quick Google search from my Linux desktop at work and read the answer back to people. I'm guessing that a good part of it is just reluctance to taint themselves by touching a machine that doesn't have a registry, but also an unwillingness to learn simple tasks that the operating system largely delegates to users. Adding a printer should not require a whole lot of anxiety but I am routinely consulted for things like this that are executed by pressing a button adorned with a plus symbol and following the bouncing ball afterwards. What makes this more interesting and troublesome is that so many people I've spoken to recently have acquired their first piece of Apple hardware and are baffled that so few people can answer very basic questions.
Suddenly the outright fear displayed by most people that are 'technical' at least by job title when confronted by any Linux desktop that isn't a neutered variety of KDE makes a whole lot of sense. Intuitive interfaces really don't matter in most cases as people seem to see only !Windows and freeze. Pairing this with the inability to use search engines for anything but porn/video games, or to read and comprehend simple instructions presented in numbered steps, is enough to make me feel like some kind of super genius simply by being able to read and follow simple instructions and to deviate from them slightly when they are a little out of date. I've overheard so many phone conversations that terminated with an utterance something like this: "You don't see a button on the lower left that says 'x'? Well, you're going to have to see desktop support for that then." I don't particularly empathize with this but at least it makes more sense to me now. This also makes the incredibly tedious arguments that come along with any interface seem a little less ridiculous. Again, it's comical but I guess the people I'm surrounded by aren't especially brain damaged and I should be less surprised when people ask questions that could be answered with a minute or two spent trying to learn something new. I don't find this comforting but at least there is some kind of reason behind it. No attempt will be made to properly quantify 'some' because the effort would probably kill me. This may have something to do with a lack of sleep.
Just thinking about the prospect of job hunting/changing early next year provokes a pretty phobic response from me. I've never been a fan of the interview procedure, a feeling made more unfortunate by the fact that I have to do a fair number of second interviews these days though thankfully via telephone and focused entirely on generic technical knowledge. All of this is made extra pointless as the job that I'm interviewing for really doesn't require any technical knowledge whatsoever as nearly every aspect of the job has more to do with procedural busy work and any solid foundation in troubleshooting technical problems is more likely a hindrance than a help.
Seth Godin makes a whole bunch of great points in his post that questions the point of interviews in terms of benefit to an employer as well. The questions he asks are solid: what the hell does an interview really prove other than a very basic understanding of interview protocol and the most fundamental ability to successfully interact with other people without switching into some kind of serial killer mode? Usually the first couple of weeks after a person is hired at a new job are stressful enough without introducing a bunch of theoretical situations posed by managers who likely have no idea about the actual goings on of the position. To me, that irrelevance is one of the factors that make interviews more like jumping through hoops and less like an assessment of ability or how well someone will fit in to a given work situation.
Most of his suggestions are pretty good in terms of concept and at least veer away from the lying contest that most interview situations inevitably degenerate into. When interviewing for this last job I did have 'the tour' but it was mainly to have other people pitch pretty random questions at me and most of them seemed of a less pointed 'interview' variety anyway. This sounds like it would be somewhat relevant but mainly turned into a mock social interaction and was a little more cocktail party than I would be comfortable advocating, but at the very least it was a departure from the conference room conversations that preceded it. Again though, the well equipped interview liar (and we are all interview liars, at least those of us who are hired at the end of running the gauntlet of confusion and misdirection) will breeze through this without giving the interviewer a clue as to what their real interactions will be like.
I do think that a partial inclusion in the actual process of day to day work is a way more sane measure of ability than any of the above. For me, this also means that my potential employer has some interest in hiring people who know what the hell they're doing and less in getting people on board who have a future in management. Getting thrown into the water to see if you can indeed swim is a relief when compared to all of the other wacky steps that modern management methods believe they need to throw at you, and it brings the assessment process into more sane terms for all parties involved. It seems like the more traditional interview process is more a case of old habits dying hard even when they don't yield any tangible results. This is compounded by the number of 'interview' texts out there which are the inverse mirror of the same books that create the seed questions that make up most interviews. Maybe we need to get some efficiency experts in to measure this and make the whole process come full circle, eh?
One thing that should excite all the slobs out there is how disgusting your keyboard is when the illumination below it suddenly comes on while riding the bus home a little later than you'd really like to be and a little darker than it should be on account of rain. Suddenly the backlit keyboard is a lot less cool than it used to be especially since the grime covering part of each display looks a little too much like Klingon for my tastes. Another thing is painfully added to the checklist of things to do when I'm a bit less sick.
I can't say that I was terribly surprised to hear the mostly woeful assessment of the direction and evolution of NetBSD, and this is entirely apart from the running 'BSD is dying' gag replicated in so many forums and comment sections of tech sites. I've always wondered how the hell something like NetBSD, with the goal of running on as many architectures and devices as (in)humanly possible, could possibly be organized. I know that work on new stable versions of Debian is often impacted by its commitment to so many different chip architectures but those folks are generally drawing on a much larger pool of people who take QA work (especially for stable) pretty damned seriously and will draw and quarter loudmouths who don't even out their complaint-to-lines-of-working-code ratio by providing patches, albeit some of them incomplete or just plain wrong, in attempts to rectify the problem that vexes them.
The system of organization does seem like a disaster waiting to happen as it affords far too much time in the limelight to those who complain the most and not nearly enough to those who fix broken shit. If you're not giving top priority to pounding showstopper bugs flat then you're definitely doing something wrong. Decentralizing the control of the project seems like a good first step as placing more responsibility in the hands of the people actually coding the project will relieve a good deal of the tension. Having the hands of those who don't have a clue about what they're doing meddling around in your code is incredibly frustrating and even more so when that meddling is done through organizational tomfoolery instead of line edits to bases of code.
Another thing that sounds worrisome to me (and I probably need to state right here, for those who don't know me, that I have very little insight into how NetBSD works organizationally other than the things on the mailing lists and other places that I've read over the past couple of days) is the dependence on bringing in features from the other BSD projects without solving essential problems in their own code base. If you're a NetBSD user then the lack of sane threading and/or journaling in the filesystems should scare the hell out of you. The other problem with these sorts of infrastructure shortcomings is that they make the existing base of code much less appealing for the purposes of forks and side projects. If it's internally that broken then for folks outside the specifics of the project it might be easier to build on something that does sanely implement the aforementioned features, which seem less like features now and more like something essential that is all but taken for granted in other operating systems. Proper threading is hardly optional at this point.
I think his examination of the leadership question is also worth considering as he brings up good points about how Linux is organized. The idea of limiting the "locking" of really important parts of a given project, like the kernel that all others more or less depend on (the Ur-dependency), to a dedicated and competent team of developers is great, but it should ideally work in conjunction with a looser structure for more loosely related applications and associated pieces of code. There is nothing revolutionary about this as most of the larger and more established Linux distributions already practice variations on this pretty logical theme. It is kind of amazing that a project could flounder and stagnate for so long with models like these on the rise all the time. All of that said, I really hope that people take heed of the warning as it would be sad to see one of the BSDs just disappear due to bad organization and lack of leadership. The real question, I guess, is whether something like NetBSD is sexy enough to attract new blood.