There were times when I was able to make a living. This post is just a small sign of that. It also reflects a time when I wasn’t so defeated. Boy, do I miss those days. I wouldn’t say that they were easy days, but they were days of less ill-fated thoughts. The problem I had (or have) was being too frivolous about “work”. I think work should be a means and not an end. Or perhaps (or in contrast?) more clearly stated: people should not have to live to work. Imagine, for the sake of humanity, outlawing “careers”. Wait. No. I mean, work to live. No. Whatever. At the least, work shouldn’t be compulsive. So I jumped from corporate job to corporate job from the first step I planted in Europe with a valid work visa. I hated every minute of it. And every minute I was supposed to work, my mind was instead full of, well, something obviously incompatible. So I wonder: if I am some kind of extreme person in this realm of success and failure, what are the masses of people out there? This is truly preposterous.
Below is a copy of an article I wrote while developing a “knowledge tool” for an enterprise software provider in Eurowasteland. The article appeared in 1999 in the magazine “Knowledge Management”, which is published in the UK. I should hide this along with all the other stuff I’ve written because it is awful. Remember, without people like me (us?), where/how would the Steve Jobses or Larry Ellisons of this world have had so much success?
Summary article written by the publisher:
“This article addresses the issue of complexity in knowledge management. We, as human beings, cannot give in to the temptation of adding fuel to the fire of chaos in the technology revolution. The driving force behind KM activities to date has been quality and productivity management. It is the author’s opinion that the driving force of KM should be a simplified form of communication using available PC and networking technologies. He proposes that the medium for facilitating KM is hypertext.”
This article was printed in the magazine “Knowledge Management”, September 1999, vol. 3, issue 1, Ark Publishing, UK.
Chaos… The Forbidden Fruit? by T Stough
“Been there, done that…”
I imagine that is what James Watt would say to me today if I could ask him what he thought of knowledge management (KM). Kevin Kelly wrote in his book “Out of Control” (1) that the information age began before the industrial age. Watt, to paraphrase Kelly, couldn’t have done what he did without the systematic collection of tacit and explicit information. This refers, of course, to the blueprints, plans, lead-pencil-and-paper designs, human know-how, etc. that would govern the making of what is considered the first steam engine.
Of course, before asking James Watt about Knowledge Management, it would have to somehow be defined. For the purpose of understanding this article though, allow me to define KM:
- Data is everywhere;
- Information is everywhere and we know where it is;
- Knowledge is making the two not just available but useful.
In short, allow me to refer to data-info-know when talking about KM. (See illustration. Not available in txt document.)
As we wade through the hype of KM and try to deal with the funds dumped on us that haven’t met expectations, perhaps the time has come to take a step back and consider what is going on. For some time now, corporate leadership has been interested in any phenomenon that will improve productivity. Quality managers, productivity managers, and the like haven’t quite made the expected progress! (Does anyone remember TQM?) Enter KM.
But the expectation problem is two-fold. Knowledge workers either promise too much or management believes the hype. Whatever the case, a lot is being spent and results to date are questionable. In considering this subject (along with trying to build a KM system), I have come up with a great scapegoat for (my) troubles: it’s the technology.
(Graphic depicting Data/Information/Knowledge = Knowledge Management. Not available for txt document.)
The good news is, the PC has been born, exiting the confines of mother technology and her unwillingness to release the potential of what has been conceived. The bad news is, we have limited insight when considering where maturing technology will allow the PC to go. Be assured, without lots of stuff like bandwidth, memory and usable interfaces, it’s fairly safe to assume that KM will be, at best, another idea that had lots of potential. (One good thing about technology is that it has a life of its own – which is why it continues to exist beyond all its currently questionable implementations.)
I would go out on a limb and even make the assumption that we’ve become so advanced in projecting information on a screen and powering up CPUs that we’ve missed the boat on utilizing it. If you consider that the technology in an F1 racing car can be scaled down to improve my tires, or the suspension of my motorcycle – they may even someday utilize the anti-skid technology of aircraft brakes in a car (;-) – there must be a way to capitalize on all the MHz and technology that’s put in our face every day. Imagine: we’re all driving around in 4-zillion-MHz desktop dumb terminals that, if you could put wheels on ’em, would blow the doors off a race car.
Let me try and return to earth. More specifically, let’s consider something that was conceived years ago and has transcended both time and technology.
Does anyone remember hypertext? I’m not talking about the “HT” in HTTP. I’m talking about the concept coined by Ted Nelson in the late ’60s – a fan of alternative uses of technology – whose concept eventually led to the invention of the World Wide Web. Even twenty years before him, the scientist Vannevar Bush said: “A record, if it is to be useful to science, must be continuously extended, it must be stored, and above all it must be consulted.” (2) Without the foresight of men like Vannevar Bush and Ted Nelson, the PC might have remained the glorified typewriter and counting device we’ve all come to know so well.
Hypertext allows data-info-know not just to be accessed from all directions but also to be created from all directions. Further, the hypertext concept moves beyond linear, Newtonian thinking, which is itself a hurdle in the data-info-know process. The best example of the hypertext concept today is the internet. The internet is to communication what automobile manufacturing was to mobility. And if you’re a believer, as I am, in what the internet allows individuals to do, then it’s hard to understand why every organization hasn’t yet implemented it unconditionally.
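(Editor’s aside: the “accessed and created from all directions” property of hypertext can be sketched in a few lines of modern code. This toy example is mine, not the article’s – the class and method names are invented for illustration. Each note keeps its outgoing links plus automatically maintained backlinks, so a connection made from one side is immediately visible from the other.)

```python
# A toy sketch of the hypertext idea: any note can link to any other,
# and the link is visible from both ends ("from all directions").

class Note:
    def __init__(self, title, body):
        self.title = title
        self.body = body
        self.links = set()      # outgoing links this note created
        self.backlinks = set()  # incoming links, maintained automatically

    def link_to(self, other):
        """Create a link; the target learns about it too, so knowledge
        can be navigated in either direction."""
        self.links.add(other)
        other.backlinks.add(self)

plans = Note("Plans", "Blueprints for the engine")
knowhow = Note("Know-how", "What the machinists actually do")
knowhow.link_to(plans)             # created from one direction...
assert knowhow in plans.backlinks  # ...but discoverable from the other
```

The point of the sketch is only that bidirectional discoverability is cheap to build – the simplicity the article claims for hypertext-based KM.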
(Graphic depicting Data/Info/Know = Knowledge Enterprise. Not available in txt document.)
I’ve always liked Sun’s marketing slogan “the network is the computer” – which is more a philosophy than a slogan. Without the network capability of PCs it is hard to imagine even attempting to manage the vast amounts of data-info-know being created. This can be seen in the results of most inter-networked organizations that are limited to the data-info part of my definition, albeit behind the facade of managed knowledge. Consider ERP and intelligent terminals enslaved within the confines of struggling information users and producers. The backbone of KM is the ability of these users and producers to make what they know available. This can only be achieved in an environment that makes the network and the PCs readily available – not a top-down, mainframe-driven process with connected terminals. And with all the technology available to us today, there is nothing out there to allow struggling users to “publish” what they know across a distributed network on the fly.
When considering ERP and the products sold in the PC environment, one can’t help but talk about Microsoft. Unfortunately for the knowledge worker, this leads to three conflicting and difficult issues: inefficiency, restrictiveness, and a high learning curve. Beyond that, when you talk about these products and their technology you are also forced to consider security and the protection of highly sensitive information – a subject that is scarier than criticizing ERP or Microsoft. This is because makers of software and solutions don’t have consistent processes to look back on, as was perhaps the case in manufacturing mass mobility. There is no way to know what is right or wrong in technology – crash a car, wreck a train, etc., and you’ll pretty much be able to find out what went wrong fast.
Another difficult issue is that every company implementing an IT solution has its most sensitive and world-moving information locked up in both legacy and modern systems. And so the makers of solutions and software (as service-oriented as they are!) offer highly complex products to meet confused and paranoid demands, and that is that. That’s a way to run a business, eh? Don’t misunderstand me. I’m not promoting free access to internal corporate information. To me the issue of security is simple: if your organization has information so sensitive that it cannot be distributed, then it shouldn’t be in a distributed and shared knowledge environment. The value of knowledge is in its distribution and accessibility. It must be obvious why so many KM efforts up to now have failed.
(Graphic depicting Knowledge Paradox. Not available for txt document.)
Perhaps the only difference between today and when Watt was making his steam engine is “time to market.” The time between conceptualizing mobility and realizing mass mobility is ages compared to the time it has taken to internetwork the (corporate) world. But the key factor here is that mobility, relatively speaking, works well. Does this mean the corporate world has to wait 50 or 100 years to get internetworking right, as was the case (if you consider quality, security and service as metrics) in making automobiles? Or is the question: is the success of mass mobility a result of time-proven processes in manufacturing? Consider the fairly smooth transfer from human manufacturing to robotic manufacturing. How easy it must have been, since the robots were only required to emulate what humans had always done. Taking technology to that level is extremely difficult because such “manufacturing” processes have yet to be created in the world of technology. In the meantime the corporate world is trying to catch up by continuing to implement ERP, or the like. In a way, selling ERP is like trying to get James Watt to buy a CAD system before there were monitors.
Computing has freed us from miscellaneous tasks and brought color and moving pictures to telephony. But where is the value of high-end, pumped-up terminals if they cannot facilitate knowledge? And if they could facilitate knowledge, what about the environment required to disseminate it? Many KM scholars think the answer lies in other complicated things, like change management. I disagree. If you approach KM within the confines of an established Newtonian organization, then change management is, in the worst case, necessary and, in the best case, hard but manageable (and always expensive). Not every organization has the resources, and so they settle for the easy compromise: buy an application, implement it, collect knowledge and hope for the best. More importantly, it has to be understood that in today’s competitive environment distributed information has more value than non-distributed information. And value, at this point in our efforts to implement KM, is key. The concept behind hypertext is the link that we in KM are seeking, because the technology used to make it work is simple.
As we approach the millennium, the buzzwords are globalization, openness, and distribution. Combine those terms with the idea that secretive information in a world of distributed networking has no value.
(1) Kelly, K. (1994), Out of Control: The New Biology of Machines, Social Systems, and the Economic World, Fourth Estate.
(2) Bush, V. (1945), “As We May Think”, The Atlantic Monthly.
Tommi Stough (aka worstwriter) was a “Knowledge Management” consultant at one time and someone who believes that, in business – all aspects of it – common sense has NO place. But don’t quote me on that.