I3 UPDATE / Entovation International News

a free monthly briefing on the knowledge agenda

No. 59 March 2002

 

Managing editor: David J. Skyrme
Publishers: David Skyrme Associates, Entovation International


MAIN FEATURE

Organizational Mapping: Knowing the Pitfalls

Xenia Stanford*

Valdis Krebs (1999) defines social capital as "who you know" and human capital as "what you know". In an organization the 'you' refers to the corporate entity. These two types of capital in any such entity can be explored using knowledge mapping tools called 'social network mapping' and 'competency mapping'.

Both are fraught with problems and pitfalls that organizations planning to undertake them must recognize. At the least, the damage can be a failed attempt and thus wasted time and other tangible costs. At the very worst, the organization can suffer increased dysfunction to the point of business collapse.

First, lest anyone be confused about what is meant by knowledge mapping, let us clarify that there are many types. The one that most often comes to mind is concept or mind mapping, which is used to capture thoughts or brainstorm ideas, showing various concepts and their relationships to one another.

Social network mapping specifically maps who, in a group or organization, shares knowledge with whom. This is the social capital of the entity being mapped. Excellent examples can be found at Valdis Krebs' site (http://www.orgnet.com) and in the references given at the end of this article. Social network maps can be used to discover gaps in connectivity among people in the organization and sources of untapped knowledge.
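Underneath the picture, such a map is simply a directed graph built from questionnaire answers. The following is a minimal sketch in Python, using entirely hypothetical names and responses, of how "who shares knowledge with whom" might be recorded before any mapping tool draws it:

    # Minimal sketch of a social network map as a directed graph.
    # Names and responses are hypothetical illustration data.
    responses = {
        "Ann":  ["Jill", "Bob"],
        "Bob":  ["Jill"],
        "Carl": ["Jill", "Ann"],
        "Jill": ["Ann"],
        "Joe":  [],   # names no one, and (below) is named by no one
    }

    # Build the directed edge list: (seeker, source) pairs.
    edges = [(seeker, source)
             for seeker, sources in responses.items()
             for source in sources]

    for seeker, source in edges:
        print(f"{seeker} -> {source}")

A real tool would lay these edges out visually; the point here is only that the raw material is nothing more exotic than a list of "who named whom" pairs.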

Competency mapping demonstrates what type of knowledge and skills are required and/or found within the human capital of the organization. An example of an individual competency map can be found at http://www.cadvision.com/xenias/competencymap.html. An organization could use these personal competency maps to build a 'yellow pages' directory, match people to jobs or positions or determine what training programs are needed to fill skill gaps.
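As a rough illustration of those uses, the sketch below (again with hypothetical names and skills) shows personal competency records being inverted into a simple 'yellow pages' lookup and checked against a required-skills list:

    # Minimal sketch of competency maps feeding a 'yellow pages' directory.
    # Names, skills and requirements are hypothetical illustration data.
    competencies = {
        "Ann":  {"taxonomy design", "interviewing"},
        "Bob":  {"statistics", "interviewing"},
        "Jill": {"project management", "statistics"},
    }

    # Invert person -> skills into skill -> people (the 'yellow pages').
    yellow_pages = {}
    for person, skills in competencies.items():
        for skill in skills:
            yellow_pages.setdefault(skill, set()).add(person)

    print(sorted(yellow_pages["statistics"]))   # who to ask about statistics

    # A simple skill-gap check against skills the organization needs.
    required = {"statistics", "facilitation"}
    print("Training needed for:", required - set(yellow_pages))   # {'facilitation'}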

Thus social network mapping and competency mapping are two common types of knowledge mapping as applied to organizations, hence 'organizational mapping'. Both can be used to great benefit in exploring where knowledge resides and how it is shared within an organization.

However, before you proceed to use these in your organization there are potential dangers of which you should be aware.

Dangers in Organizational Mapping

Personally I have seen modern applications of organizational knowledge mapping fail (give no reliable and valid results) or backfire (cause damage) because sound principles were not used.

Social network mapping as used in business organizations today has its foundation in the science of sociometry or sociometrics. The first dictionary entry of these concepts is found in the International Scientific Vocabulary, published in 1908, where the definition is as follows: "the study and measurement of interpersonal relationships in a group of people".

This social science has been turned into business applications as social network analysis using the organizational social network map as the foundation for the analysis. The basic problem begins with lack of training or knowledge of the science behind it. This causes the input and output to be flawed, sometimes dangerously so.

Mapping the results of the input is fairly easy. There are many systems that can do this: take the results from a questionnaire and plug them into the program, which then spits out a social network map showing "who chose whom" or a competency map of "who knows what". However, before an organization purchases such a system or hires a consultant offering these services, it should be aware of the potential pitfalls. Some of these are outlined below.

Pitfall No. 1: Believing the Map is the Ultimate Goal

Mapping is the easiest part of the process. The difficult parts are the audit (input) and analysis (output). These are the ones most fraught with stumbling blocks and hidden dangers.

Mapping may seem to be the output of the system. In truth the map is the middle part of the process and serves only as the beginning for analysis, the true output. It is a pitfall to view the map as the desired end result. The map is nothing but a colossal waste of time and money without the proper analysis.

However, proper analysis is impossible without asking the proper questions at the outset. This brings us to the second pitfall.

Pitfall No. 2: No Purposeful Question

Ask a stupid question and you will get a stupid answer. If you want a valuable result you must ask a question that will give you a valuable answer.

The reason the question may be stupid is that a purpose or mission for the mapping project has not been defined. An organization should not map merely for the sake of saying 'we now have an organizational map'. The map is not good in and of itself. It is only good insofar as it can bring about positive change in the organization.

The map has been described as the 'reality chart' of an organization. The traditional organization chart shows you the prescribed way in which communication is supposed to flow. The knowledge map can show you how communication actually flows; in other words, it charts what is really happening.

Knowing how communication actually flows is of no greater value than the organization chart unless you want to measure how close the flow is to what is desirable and, if it is not, to use the map to design strategies to change it.

Pitfall No. 3: Not Knowing Where You Are Going

The pitfall is not having a mission. If you don't know where you are heading, how will you know when you get there?

Similarly, when you wish to measure reality effectively, you must have some idea of the ideal and must ask questions that will show whether reality is in fact close to that ideal or far removed from it. Thus the mission must reach far beyond the map itself.

The mission must be to create and sustain a knowledge flow that is more profitable to your organization. The map then becomes a measure of how close to the ideal you already are, a benchmark against which future measures can show how much change you have been able to effect.

If you are already there, that is, if your organization is already rich beyond your wildest dreams, your mission might be to measure the current 'ideal' knowledge flow. Then in the future, when the bottom line is not so rosy, you will be able to measure against the benchmark to see where the problems are occurring and use this to try to re-create the ideal.

However, until we know such perfection, we must try to imagine what would be better.

Pitfall No. 4: Not Ensuring Both Reliability and Validity

In fact, most people are not sure what reliability and validity are, much less what difference they make when gathering data. Not only must the question have a purpose and match the mission, it must also deliver both reliable and valid results.

Reliability and validity are indications of how usable a particular measuring tool really is.

"Reliability tells us how consistently we are measuring whatever we are measuring. Validity is concerned with whether we are measuring what we say we are measuring." (Buley)

First, reliability means the results are consistent, both internally and across time. If you take a measure of any part of the whole subject at any one time, the results will be consistent. That does not mean the results will be the same, just that they consistently measure the same thing. If you wish to measure the knowledge a person has by how many people chose him as a subject matter expert, then the question must consistently measure this concept no matter how many people are asked.

To be reliable, the results must also be consistent over time - not that people's answers may not change, but that the question consistently measures the same concept no matter when the questionnaire is delivered.

For example, if my scale is consistently off by 3 kilograms, then each time I step on the scale my weight is reported as 3 kilograms higher or lower than it really is. The results are consistent - but consistently wrong.

Validity then kicks in as a measure of whether you are measuring what you are really trying to measure. If I am trying to measure my weight accurately, then the results matter. A consistently wrong answer means I am not accurately measuring my weight, and if that is my intention, the tool (the scale) is not a good one.
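To make the scale analogy concrete, here is a minimal numeric sketch (the weights and the 3-kilogram bias are invented for illustration): the readings agree closely with one another, so the tool is reliable, yet they all miss the true value, so it is not valid:

    # Minimal sketch of reliability vs. validity using the scale analogy.
    # True weight, readings and bias are hypothetical illustration values.
    true_weight = 70.0                      # what we actually want to measure (kg)
    readings = [73.0, 73.1, 72.9, 73.0]     # a scale that reads about 3 kg too high

    mean_reading = sum(readings) / len(readings)
    spread = max(readings) - min(readings)

    # Reliability: the readings are consistent with each other.
    print(f"Spread across readings: {spread:.1f} kg (small, so reliable)")

    # Validity: the readings do not reflect the quantity we claim to measure.
    print(f"Error vs. true weight: {mean_reading - true_weight:+.1f} kg (large, so not valid)")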

This is easier to see with something tangible such as weight, with a scale as the measuring tool. It is more difficult to apply in the situation of mapping organizational knowledge. If we want to show the "who you know", or social capital, the tool must measure that. When we want to measure "what you know", the tool must actually be able to calculate this.

The data cannot be reliable and valid if our measuring tool is not consistently and accurately measuring what we say we are measuring. Some think the main measuring tool is the 'system'; however, the question is really the input. Yes, it is important that the technology takes in the results and charts an accurate reflection of the data. Yet no matter how wonderful the technology is, it is nothing without valid and reliable input.

The question is really the key. If the question cannot be assessed to be reliable and valid, there is no sense in even beginning the process.

Pitfall No. 5: Not Assessing the Results Accurately

Now we will assume you have developed the questions, tested them on a sample audience, and found them to be valid and reliable and to actually reflect the mission. Furthermore, let us assume the system has accurately produced the data in some visible form, such as a map of connections.

So far, if any part of this process has been flawed, the minimum harm done is the waste of time and effort. The greatest danger now lies in the use, or potential misuse, of the data.

In practice I have seen the damage that improper analysis produces. For example, let's say the resulting map shows that knowledge does not flow from person A to person B. The conclusion might be: let us punish A for not communicating with B.

Perhaps B does not need to know what A has to tell him. In that case neither is at fault and in fact there is no problem. Perhaps B needs to know but will not listen to A. Now the fault lies with person B.

In another scenario we may have everyone choosing a person called Jill. It may look like Jill is a star in that she is at the centre of the map, with everyone either seeking her input or feeding her knowledge. We have to know which it is, and that again depends on the question asked.

Let us say we asked the question "from whom would you seek knowledge on this project?" and all roads lead to Jill. Now we would need to look at whether Jill actually has the knowledge or merely appears to have it. She may be giving out false information, and people may be acting on it with dire consequences.

In this case is she a knowledge star or a false witness?

On the other hand, perhaps Jill does dispense accurate knowledge. Then we must examine Jill's choices to see if she is constantly seeking new sources and new knowledge herself.

If the star cluster (Jill and the people choosing her) is exclusive, it could show a clique where the knowledge simply recycles among the group. Without an outside link supplying fresh knowledge, the star becomes stale and soon will fade.

Another common mistake is looking at a person, let's call him Joe, who is never chosen as a source of knowledge (subject matter specialist) and saying we must let him go because he has no knowledge.

Joe may well have knowledge, but no one seeks out and uses it. It remains hidden and untapped. The fact that no one asks Joe may be because no one feels he has knowledge worth knowing. Joe then reacts based on others' perception of him, believing he does not actually know anything of value to others. The fault falls on the company for not valuing Joe and for not providing a venue and cultural climate where he can show what he knows.
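To make this style of reading the map concrete, here is a minimal sketch, reusing the hypothetical questionnaire data from the earlier example, that simply counts how often each person is chosen. The counts flag Jill as a possible star and Joe as never chosen, but, as argued above, they cannot by themselves say whether Jill's knowledge is sound or whether Joe is merely undervalued:

    # Minimal first-pass analysis of the hypothetical map: who is chosen, how often.
    # The counts flag candidates for attention; they do not explain them.
    from collections import Counter

    responses = {
        "Ann":  ["Jill", "Bob"],
        "Bob":  ["Jill"],
        "Carl": ["Jill", "Ann"],
        "Jill": ["Ann"],
        "Joe":  [],
    }

    # How often each person is named as a knowledge source (in-degree).
    chosen = Counter(source for sources in responses.values() for source in sources)

    for person in responses:
        print(f"{person}: chosen {chosen.get(person, 0)} time(s)")

    stars    = [p for p in responses if chosen.get(p, 0) >= 3]   # e.g. Jill
    untapped = [p for p in responses if chosen.get(p, 0) == 0]   # e.g. Joe
    print("Possible stars:", stars)
    print("Never chosen:", untapped)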

Go Boldly, But With Knowledge

Knowing about these pitfalls will help you chart your mapping activities with confidence. These are not the only pitfalls. For other possible erroneous assessments of organizational knowledge mapping, see the caution and examples given by Krebs (1996): 'Snap judgements' based upon first assessment of the map often prove wrong; in-depth analysis is required first to determine whether 'problems' revealed are real and then to develop effective 'cures'.

In several of the organizations with which I have worked, the immediate reaction was to make firing, promotion and other reward or punishment decisions based on the results of social network and competency mapping.

This is dangerous ground! Tread carefully, for this is where the costs of the mapping process can go far beyond simple waste. Misinterpretation of the map can result in punishing and rewarding the wrong people. In doing so the cultural climate can turn to one of fear, knowledge hoarding, jockeying for position and cutthroat competition. The results can be as disastrous as, or even more disastrous than, those of the traditional organization, and can eventually lead to the demise of the business.

If your attempt fails due to one of these pitfalls, not only can it seriously impair the culture of the organization and thus inhibit participation in other knowledge management activities, it can also seriously damage the image of the overall KM program.

On the other hand, those who are aware of the dangers and follow proper guidance, such as that given above, can empower their organization to know itself and to identify its social and human capital, which after all is much more valuable than much of the current focus in knowledge management, such as content (see next article).

Works Cited:

Buley, Jerry. Reliability, Validity and Correlation. 2000.
(http://com.pp.asu.edu/classes/jerryb/rvc.html)

Krebs, Valdis. Managing Core Competencies of the Corporation. The Advisory Board Company. 1996.
(http://www.orgnet.com/orgnetmap.pdf)

Krebs, Valdis. Working in the Connected World - Managing Connected Assets. 1999.
(http://www.knetus.net/white/social-capital.html)

Copyright Xenia Stanford, Stanford Solutions Inc. 2002.

* Xenia Stanford is President, Stanford Solutions, and Editor-in-Chief, KnowMap: The Knowledge Management, Auditing and Mapping Magazine (www.knowmap.com)


© Copyright, 2001. David Skyrme Associates Limited and Authors - All rights reserved.

I3 UPDATE / ENTOVATION International News is a joint publication of David Skyrme Associates Limited and ENTOVATION International Limited - providers of trends analysis, strategic advice and workshops on knowledge management and knowledge innovation®

® Knowledge Innovation is a registered trademark of ENTOVATION International.


 
