Tuesday, June 21, 2011

Lack of competence

Where I used to work, "lack of competence" was the scapegoat for all the shortcomings of the organization. It annoyed the hell out of me because it reminded me of George Carlin's critique of euphemistic language. As the great man said: "American English is packed with euphemism because Americans have trouble dealing with reality and in order to shield themselves from it they use soft language." Well, it is not just an American thing, or at least it is not confined to America.

It wasn't just the descriptive nature of the phrase that bugged me; it was the meaning and the way it was used. Lack of competence, or incompetence, was not the right term for the problems that plagued our teams. If anything, the competence level in that place was through the roof; what was lacking was experience. Experience is the most valuable and expensive asset any employee carries, and organizations should see and manage it as such.

Experience is the difference between testing with the scatter-shot method (100 test cases and hopefully we'll catch all the bugs) and knowing which four or five test cases will find bugs just by reading the project's title. Experience creates experts/gurus/rock-star programmers; it reduces design time and errors and increases quality and estimation accuracy. But its cost is hefty, because to build experience one has to go through bad design, scatter-shot testing, blind estimations and failed projects. That is why experience needs to be actively managed, encouraged and guarded, and why, in its absence, everybody blamed "lack of competence."

Hey, I have been failing for X years, so I must be experienced! Right? Wrong. Just because you have been designing the same flawed solutions for X years doesn't mean you are a good programmer; it means you are an efficient, fast, proficient liability who should be promoted to management ASAP so that you stop poisoning young designers. Not everybody is capable of learning from experience, and not every environment is conducive to building experience fast.

On the personal level, experience is built with humility, a drive for constant improvement and a thirst for learning. Of these, humility is the most difficult to hold on to once some experience is accrued, yet it is what separates mediocrity from greatness, because to lose humility is to lose the ability to learn.

So why was experience so lacking in my old organization? Were we all cocksure divas? No, quite the opposite, actually. The culprit, for me, was the environment itself: testing was slow and cumbersome, deterring experimentation; the platform was extremely complicated and difficult to grasp; the documentation was "centrally" stored in multiple hard-to-reach locations; and, even worse, a lot of the additional information (tutorials, presentations, descriptions) was simply not available, as different individuals, teams, contractors and companies either guarded their "secrets," protected their work, or were plainly too busy to share.

This would have been bad enough, but adding the fear factor to the equation turned a bad dream into a full-blown nightmare for experience building. In an economy where youth unemployment reached 40%, veiled termination threats were woven into every discussion about human error rates, creating an atmosphere of dread. In that atmosphere everybody tried to minimize their risks: there was an aversion to learning new skills, copy/paste programming was encouraged and new code was mostly shunned, estimations were based on formulas, and constant peer reviews and scatter-shot testing strategies brought code production to a halt. Not only were we producing little to no value while working our fingers to the bone, but we did not learn anything from it. Mistakes were not allowed, so they were swept under the rug.

Experience building is expensive because experience loves mistakes, but in order to learn from them we must be unafraid to assume responsibility: take ownership of the error, document it, analyse it without prejudice and share the results. With the dread of error looming over our heads, taking responsibility for our mistakes was an act of madness. In fact, it took accepting that "one day, I will quit this job" for me to spread my wings, try some crazy solutions and learn more each day than I had been learning in a month under the veil of fear. How ironic that all this experience/value was added only after the decision that made it a temporary gain for the organization itself. In other words, by deciding I would leave the company, I became a better employee.

All these thoughts are swimming in my head as I struggle to grasp the intricacies of building an Android app. I think they came to me because my schedule keeps slipping, I refactor and alter the app's architecture every week, I strain to learn new concepts and I question my own competence all the time. I was trying to figure it out until it struck me: I am not incompetent, I am just inexperienced. But I have no fear, so I'll be fine.

Till next time,
Stratos out.

Tuesday, June 14, 2011

Re-evolution

Seeing the Windows 8 vid got me thinking, both about the possibilities and about the general direction of the industry. For a long time your OS choice depended on what you wanted to do: office work, mainstream PC usage and gaming meant Windows; Apple had the more arts-and-crafts crowd plus some devs; and Linux was for hardcore computer folks who loved rewriting drivers to hook up a printer. Then Linux started becoming more user-friendly and cooler, Apple got more and more "one click to rule them all" simplistic, and Windows was... well, losing market share with Vista and regaining some with Windows 7. And then iOS hit tablets, Android emerged, BlackBerry OS went tablet and Windows Phone 7 looked like a Flash website. With the rise of the tablet and more and more crossover between home laptop usage and smartphone/tablet usage, one begins to wonder if home computing will remain the same.

Sure, sure, there will always be enthusiasts building C games on a Linux machine and web devs running a server in their basement, but the average users who only recently got a laptop and figured out the intricacies of a mouse (you know, the folks who buy software) are giving up their mouse and laptop for the tablet/smartphone combo. I guess what I am trying to get at is that there is a brave new world out there: a mobile computing evolution, with the internet coming out into the real world and interacting with it. A world of possibility, but also a world without a particular shape.

What strikes me as peculiar is that Microsoft, which is significantly behind in mobile computing market share, would so bravely pick up the gauntlet and enter the mobile arena by acknowledging its significance via the UI design of Windows 8, traditionally the flagship of their products. As the market becomes more fractured and monopolies fall, MS being brave is not only fresh but also risky business. Heaven forbid the mobile OSes start invading home computer systems and high school students copy/paste from Wikipedia using their docked smartphones; what market will remain for traditional Windows apps? What is it exactly that we do at home for entertainment that a tablet cannot accommodate? And why stop there: what exactly does a manager do at work that a docked BlackBerry cannot handle? Do we even need DSL/wire-line phones and cable TV in a 4G world? Is TV even relevant in a streaming/torrent world?

The world is changing around us and Microsoft is jumping in at the deep end. It will probably work out for them in the end, Google will either beat Oracle or work around its lawsuits, and maybe there is even room for PlayBooks and whatever tablets end up carrying Windows 8. But what is truly amazing is the world of possibilities this new round of electronic evolution will bring. A good time to be alive, and an even better time to understand what a compiler is.

Till next time,
Stratos out.

Friday, June 3, 2011

European vs American

I recently moved to Toronto, Canada, so I've had the chance to experience a small part of the North American (NA) way of life.

The most striking difference between the NA way and Europe's (EU) style, besides the unapologetic consumerism, is the spirit of "do it yourself" vs "who are you to do it yourself?" In the EU, to change the world via technology you need to get a university degree, get a master's, and work as a lowly employee for X years in a large company to slowly climb the ranks, and by the time anyone is willing to listen to your vision, it is no longer fresh or relevant. In NA you work as a coder for a couple of years, build a following on the social media of your choice, and start up! The web world moves way too fast for old executives; here the difference between "graduate" and "acquired by Google" is a good idea and hard work. Here you CAN DO, whereas in the EU you can only hope to one day, maybe, change a line of text on an outdated webpage.

Things are not 100% rosy, sure. If your idea and/or execution sucks, or maybe even if it doesn't but a competitor does it better than you, you become jobless, homeless, maybe even hopeless so fast your head will spin. Things moving fast means you get where you want faster, or crash and burn. 35 is old, 40 is too old and 50 is dinosaur old. The startup scene is also filled with pretenders: 20-year-old "social media experts," lawyers, smooth talkers and other leeches feasting upon the fresh fruit of Web 2.0. But being critical about who you hire can, as always, help. Did I mention you need a lawyer? Oh yeah, a GOOD one!

Life's good here. It is not California, for sure, but a talented innovator can do real good. With Microsoft jumping on the app bandwagon with Windows 8, every player and their cat releasing a tablet, and Web 2.0 now requiring mobile apps, coders will be in business for a VERY long time. Now excuse me, I have to go read up on HTML5.

Till next time,
Stratos out.