The KMAfrica.com fireside chat is a space designed for you to have a conversation about anything you wish in the KM area with other members of the KMAfrica.com community. A clear African night sky and a fire are an excellent setting for a good conversation, so sit down next to the fire, get comfortable, look at the discussions and contribute your own views and questions. The fireside chat is also a forum for tabling questions and ideas for the further development of KMAfrica.com, as well as general ideas for the practical application of knowledge in any knowledge-related field.
If you are new to social media and are still exploring the area, here are 4 steps to help you get the best out of it:
With the increase in users of microblogging sites such as Twitter.com, a profusion of services has emerged to support analysis and search of this live 'stream' of data. This Tweetgrid example provides a view of how searching on particular hashtags (designated with a '#') can yield useful results and interesting connections. Importantly, it provides insight into how people in distant places are talking about your field of interest, what metaphors they are using, who they are recommending, and so on. Obviously there is a lot of junk, so, as always, you need to exercise discernment. You can update the #hashtags on the Tweetgrid and save your own personal glimpse into the mind of the collective. And if you are on Twitter, please follow our updates on @KMAfrica by typing FOLLOW KMAFRICA, and don't forget to update your Activity Stream with your Twitter account details. Further information about #hashtags is available at Hashtags.org. If you want your message hashtagged, just add the hashtags inside the message; e.g. if you want to tweet on the KMAfrica stream, include #KMAfrica in your message.
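To make the mechanics concrete, here is a minimal sketch of hashtag filtering of the kind a service like Tweetgrid performs. The sample messages and function names are purely hypothetical illustrations, not part of any real service's API:

```python
import re

def extract_hashtags(message):
    """Return the set of #hashtags (lowercased) found in a message."""
    return {tag.lower() for tag in re.findall(r"#(\w+)", message)}

def filter_by_hashtag(messages, tag):
    """Keep only messages carrying the given hashtag (case-insensitive)."""
    tag = tag.lower().lstrip("#")
    return [m for m in messages if tag in extract_hashtags(m)]

# Hypothetical sample stream for illustration.
stream = [
    "New post on indigenous knowledge systems #KMAfrica",
    "Conference notes #km #sharing",
    "Lunch was great today",
]

print(filter_by_hashtag(stream, "#KMAfrica"))
```

Because matching is case-insensitive, #KMAfrica, #kmafrica and #KMAFRICA all land in the same stream, which is why a consistent community tag works even when people type it differently.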
Transparency International, self-described as “the global civil society organisation leading the fight against corruption,” has released its 2010 league table of corrupt countries. Guess what: about three quarters of the world’s nations score in the corrupt half of the scale!
With governments committing huge sums to tackle the world’s most pressing problems, from the instability of financial markets to climate change and poverty, corruption remains an obstacle to achieving much-needed progress. The 2010 Corruption Perceptions Index shows that nearly three quarters of the 178 countries in the index score below five, on a scale from 10 (highly clean) to 0 (highly corrupt).
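The headline figure is simple arithmetic over the index: count the countries scoring below 5 on the 0–10 scale and divide by the total. The sketch below uses a small hypothetical sample (the real scores quoted later in this piece plus placeholder entries), not the full 178-country dataset:

```python
# Hypothetical CPI-style scores (country -> score on a 0-10 scale,
# 10 = highly clean). Real entries are the ones quoted in the text;
# CountryX/Y/Z are placeholders for illustration only.
scores = {
    "Denmark": 9.3, "New Zealand": 9.3, "Singapore": 9.3,
    "Finland": 9.2, "Sweden": 9.2,
    "Iraq": 1.5, "Myanmar": 1.4, "Afghanistan": 1.4, "Somalia": 1.1,
    "CountryX": 4.2, "CountryY": 3.8, "CountryZ": 2.9,
}

below_five = [c for c, s in scores.items() if s < 5.0]
share = len(below_five) / len(scores)
print(f"{len(below_five)} of {len(scores)} countries ({share:.0%}) score below 5")
```

On the real 2010 data the same calculation yields the "nearly three quarters of 178 countries" figure cited above.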
These results indicate a serious corruption problem. To address these challenges, governments need to integrate anti-corruption measures in all spheres, from their responses to the financial crisis and climate change to commitments by the international community to eradicate poverty. Transparency International advocates stricter implementation of the UN Convention against Corruption, the only global initiative that provides a framework for putting an end to corruption.
Denmark, New Zealand and Singapore are tied at the top of the list with a score of 9.3, followed closely by Finland and Sweden at 9.2. Bringing up the rear is Somalia with a score of 1.1, slightly trailing Myanmar and Afghanistan at 1.4 and Iraq at 1.5. Notable among decliners over the past year are some of the countries most affected by a financial crisis precipitated by transparency and integrity deficits.
Among those improving in the past year, the general absence of OECD states underlines the fact that all nations need to bolster their good governance mechanisms. The message is clear: across the globe, transparency and accountability are critical to restoring trust and turning back the tide of corruption. Without them, global policy solutions to many global crises are at risk. Transparency International is available at www.transparency.org.
The uptake of mobile phones on the African continent continues, with growth rates in excess of 100% over the past twelve months (source: MTN 'Yello' corporate publication, 2008). This is happening while technologies that link people across space and time are becoming ubiquitous and mobile telephony is the preferred means of telecommunication. The result is a narrowing of the technological gap between the developed and developing worlds. Rates of ownership, even among the poorest, are surprisingly high, and while estimates vary, there were already more than 100 million connected handsets in Africa in 2005.
All that is needed are bright, entrepreneurial minds to seize the space, to make the connection in what is potentially one of the biggest knowledge markets in the world. What is needed is to marry technology with great ideas. These ideas can come from looking at actual situations in which knowledge and technology have been applied to create successful, impactful change, work opportunities and sustainable systems. Could you use these ideas to stimulate new possibilities for yourself and your community?
Further information on some of the projects mentioned here can be found on:
It is important to understand that mobile phones and other ICTs are tools, and not a solution to problems. However, ICTs have an important role to play as part of wider strategies and programmes. Areas in which ICTs have been successfully employed by a variety of organisations include disaster relief, poverty alleviation, healthcare, conservation, development, and job creation – all representing fertile ground for KM innovation and entrepreneurship.
research: Steve Banhegyi
Western knowledge paradigms have a pyramid shape: at the bottom is real experience, but moving up are layers of surrogates that stand for original ideas. At the apex are ideals, or highly refined knowledge. For example, at the top is an idea called constitutional democracy, but layer by layer going down we can unravel this idea in terms of judicial institutions, legislative institutions and executive institutions, under which are communities and constituencies, right until we come to the level of individuals serving in the various organs of state.
The ideas of limited government, the separation of powers and a bill of rights make up internal formations that divide the pyramid on the inside, were we afforded the chance to view it from the inside out.
However, this method of knowledge management calls for specialization and division of skills because no single individual can ever hope to access all the necessary knowledge alone. By contrast, pre-colonial African knowledge systems were concentric in the way they functioned and were organised in the following way:
At the centre of the circle are the traditions and ancestral teachings. There is horizontal, free access to the knowledge. There is a link between the living and the dead, much in the same sense that blueprints and intellectual knowledge are accessible to future generations. The common link between the living and the dead is the tried and tested knowledge of the ancestors.
This knowledge is mediated through song, dance, daily chores, rituals, play and mentoring by elders or senior age group members. It is not surrogate knowledge that deals with higher level concepts (being refined ideas standing for massive experiential data). It is immediate, available, concrete and practical and easily assimilated into one’s personal frame of reference.
Difficult concepts such as “where do we come from as a species” are dealt with by means of myths, songs and stories.
When a boy wakes up in the African village:
This is no doubt quite a major learning experience. In Western taxonomical terms it covers a wide range of areas, from the cognitive to the affective and psychomotor domains of learning. This kind of knowledge is people-centred; it resides within the people themselves. If it were transferred into a pyramid structure it would need trained professionals, textbooks, teacher hours, exams, government supervision and research.
In Africa, the disruption of local culture has resulted in the disorientation and alienation of many people, young and old. I certainly do not advocate going back to tribal knowledge, but the question is this: aren't there certain skills, survival skills and life skills, that could be better taught using the concentric African model? Issues of drug abuse, rape, unemployment, street begging and street living may be symptoms of disempowerment, and KM models that advance the social, cooperative nature of knowledge would go a long way towards addressing these social problems.
In the concentric African KM model, elders formed the inner circle but did not follow everyone else around to police the system. There was much joy, pleasure and play in imparting knowledge. Songs, storytelling, drums and poems all worked within a concentric dispersal system where participants simply crossed barriers and learnt very useful knowledge and skills.
When dissonance occurred in the system, such as an individual committing a crime like rape, the entire village was affected, and restoring psychological balance required the culprit to sponsor the ritual cleansing of the entire village. The deterrent nature of the exercise justified the cost. Modern systems still grapple with issues of community crime, but the African model dealt decisively with such issues. When the colonialists and missionaries came to Africa, they were surprised at the level of civil order and peace. Yes, some despotic chiefs abused this system and subjected their people to abuses, which they bore with a stoicism and fatalistic tolerance that disturbed many missionaries. But retrieving the system and adapting it to the modern situation may pay huge dividends in terms of creating a crime-free society.
Africans managed to create a stable system of knowledge management. It had definite spin-offs in terms of creating peaceful and stable societies.
The African model poses serious questions: if you can't dance it, sing it and recite it, how can it be your personal knowledge? Indeed, even the mother church seems to have used the same KM model: can we speak of the early Catholic church without the icons, the liturgy, the recited prayers and the hymns? In dealing with the problems of modern city life, perhaps it is wise to look inside ourselves and model our knowledge systems on the inexpensive, vibrant and enjoyable systems found in Africa.
The African model’s functioning resembles Brownian motion. Its success rested on two important pillars: the traditions at its centre, and practical value.
Knowledge was valued for its corporate blessings and cooperative creation. Some things seem meaningless in African KM until one finds out the purpose behind the practices, for example, the prohibition on pointing at a particular place, such as a mountain. These practices taught reverence, perseverance and self-control. In all higher-level schooling systems that focus on leadership, self-control and perseverance are still the virtues most sought after. Whether they are taught on the netball field or the rugby field, the end result seems not to have eluded the humble Africans.
It is a challenge today to integrate and adapt African KMS into modern technologies. It is important to restore the spirituality of technology. Western technology is an end in itself, which is important for its distribution and development. But there is a great need to infuse social values into these technologies, so that in the end we preserve our corporate identity as human beings and ensure that they contribute towards giving humankind a more human face.
The Inaugural Summit of the AU, held in July 2002 in Durban, South Africa, endorsed the NEPAD progress report and initial action plan and encouraged member states to adopt the NEPAD Declaration on Democracy, Political, Economic, and Corporate Governance, and to accede to the APRM. After years of difficulties and Afro-pessimism, some leaders thought that it was time to act rather than wait for others to come and solve their problems. They realised that there was a need to create an atmosphere conducive to development and conditions that would encourage the private sector to invest in African economies.
The first step in the APRM process of self-assessment, agreed to by the Heads of State and Government, was to assist member countries in determining the strengths and weaknesses inherent in their states, resulting in a plan of action for each country and support from peers and institutions to assist in overcoming the different challenges facing these states. In contrast with past assessments carried out on African countries, the APRM is voluntary: countries are not obliged to accede.
The MoU details the process and outlines the obligations that need to be fulfilled. For the peer review a national commission is created, in which the media, parliament (including opposition Members of Parliament), NGOs, human rights groups, the youth, gender groups, and business participate. This commission carries out a self-assessment, evaluates the weaknesses and strengths of the country, and formulates a plan of action. The commission’s report is presented to the APRM Panel of Eminent Persons, appointed to oversee the process and ensure its credibility by acting independently. The panel reviews the report in consultation with experts appointed to verify the information in the report. A mission (comprising at least 10 persons specialising in the different areas of the APRM) meets with the different stakeholders to scrutinise the report, to find out if there is consensus on its contents, and to establish whether there is the necessary willingness to participate in the APRM. The report of the experts is then presented to the relevant government, which decides on steps to be taken to implement the recommendations. From this stage, the report is presented to the Forum of Participating Countries of the APRM, which scrutinises the report with the government concerned and determines the needs of the country, including technical support. If the government resists taking measures to rectify identified problems, the forum exerts peer pressure, through peaceful dialogue, to persuade the government to take up the issues raised and move forward. The spirit of the whole process is a peaceful and non-violent resolution to take up the challenge of NEPAD.
A review takes between six and nine months and depends on the availability of data and the resources of the particular country to conduct the process of data collection, to be reviewed and assessed by independent experts of the African Development Bank, the Economic Commission for Africa, the Association of Central Banks and civil society. All these efforts are merged to obtain the most current and accurate data. An important aspect of the process is its participatory nature: the citizens are part of it, and the country's report is made public after it has been presented to the forum, so that citizens can check the process and the progress of implementation. The political maturity of civil society will help make the process a success. Rwanda has been among the first African countries to pioneer the implementation of the APRM. Apart from revealing shortcomings such as a lack of adequate capacity in the APRM/NEPAD secretariat, inadequate fluency in the coordination structures, and constraints of time and resources, the standards of objectivity, as the ultimate test of the credibility of the whole APRM process, were challenged by the Republic of Rwanda. It was suggested that it is in the interest of everyone that the process meets stringent standards of objectivity, and that it is possible for a country to carry out an objective self-assessment; in the case of Rwanda every effort was made to make the process as objective as possible. It is also for this reason that an external review is a key component of the APRM, serving as a counterweight or verification mechanism. Rwanda suggested that some additional measures should be put in place to make the exercise more objective. Given the recent history of Rwanda, it seemed as if some external reviewers did not have adequate knowledge of the country and based their views on preconceived ideas and inaccurate information about the country found in different media.
There is a need to base reviews on clear, objective criteria or a scoring matrix. This would certainly make the exercise more predictable, empirical and scientific. It was also suggested that a minimum requirement for objectivity should be that the final report be subjected to a process of moderation before it is tabled before the Heads of State. In addition to the challenges outlined above, there were other problems, e.g. language impediments, especially in the rural areas, where the questionnaire had to be translated in order to obtain the views of the population (Republic of Rwanda 2006).
In November 2006, Sudanese billionaire Mo Ibrahim launched the Ibrahim Index for African Governance, a new ranking of sub-Saharan African nations developed in conjunction with the Kennedy School of Government at Harvard University. The rule of law and security will weigh most in the index, ahead of human development, economic development, democracy, transparency and empowerment of civil society. Professor Robert Rotberg, director of the programme on intrastate conflict and conflict resolution at the Kennedy School, under whose direction the index was developed, argues that every indicator is made possible by human and state security. The index will be used to measure and benchmark good governance in Africa on a country-by-country basis. Ibrahim insisted that the index would not duplicate the APRM. He argues that the APRM is subjective, as its outcomes are measured by focus groups, public opinion and sentiment.
Rotberg adds that the APRM is not a 'strong instrument' and that in some places the entire process has been taken over by governments. Rotberg said the index would be measurable and not based on what governments say. He disagreed with the concerns of some analysts that the programme is focused on individuals, pointing out that 'leaders make a difference and big leaders make a big difference'. Rotberg said that the Ibrahim index would have a 'diagnostic effect', prompting states to ask themselves how they can improve where there are shortcomings. He said the index would compare and rank countries. Addressing concerns that the Ibrahim Index could be seen as patronising, Rotberg argued that the index is globally applicable and not drawn up according to Western standards but standards that can be used in Africa. He said the index is meant to be neutral and context free and without political bias. (Zvomuya 2006)
Evaluation of the APRM against the conceptual framework shows that valuable KM practices are embedded in the APRM, which can assist with the performance of post-colonial states in carrying out their continental commitments and any organisation involved in trans-national business. The first aspect is that several role players are afforded the opportunity to gather information and to find synergy in insights, continuously learning from one another. Secondly, African society interacts in various ways as equal partners to add value to the gathered information with minimum pressure from the global environment. Furthermore, the countries involved are given ample opportunity to reject interaction if it is found to be exploitative, demanding or controlling while countries accept a measure of pressure if that pressure is found to be supportive of not only the national interests but in the interest of continental initiatives. The report of the review also serves as a good example of a knowledge product with holistic perspectives suitable for decisions and actions.
However, the researcher has been alerted to the potential exclusion of normative knowledge claims, especially knowledge situated on the periphery of state structures, because of language and capacity limitations. In particular, the exclusion of the traditional claims of the IKS, because of a lack of capacity to access oral data, is clearly a challenge. This specific factor detracts value from the APRM report and probably omits important knowledge that could have alerted everybody concerned to potential sources of conflict, especially in remote areas beyond effective state control. Whether alternative instruments such as the Ibrahim Index, and several others used by research institutions, will overcome these limitations and enjoy the same legitimacy and acceptability as the APRM, which was created by Africans for Africa, is still subject to evaluation.
Whatever instrument is used to obtain an objective and holistic view of Africa and its challenges, the control of Africans over the gathering, processing, dissemination and use of the knowledge is imperative. Moreover, it will have to include the measurement of progress in redressing imbalances within African society, the quest for emancipation from domination by, and convergence with, the developed world, more weight on human security than on state security, and the measurement not only of the performance of political leaders but also of the leadership of traditional and civil society.
This work is copyright Dr Dries Velthuizen, African Wisdom site, and is used with permission.
Editor's note: Ed Dante is a pseudonym for a writer who lives on the East Coast. Through a literary agent, he approached The Chronicle wanting to tell the story of how he makes a living writing papers for a custom-essay company and to describe the extent of student cheating he has observed. In the course of editing his article, The Chronicle reviewed correspondence Dante had with clients and some of the papers he had been paid to write. In the article published here, some details of the assignment he describes have been altered to protect the identity of the student.
The request came in by e-mail around 2 in the afternoon. It was from a previous customer, and she had urgent business. I quote her message here verbatim (if I had to put up with it, so should you): "You did me business ethics propsal for me I need propsal got approved pls can you will write me paper?"
I've gotten pretty good at interpreting this kind of correspondence. The client had attached a document from her professor with details about the paper. She needed the first section in a week. Seventy-five pages.
I told her no problem.
It truly was no problem. In the past year, I've written roughly 5,000 pages of scholarly literature, most on very tight deadlines. But you won't find my name on a single paper.
I've written toward a master's degree in cognitive psychology, a Ph.D. in sociology, and a handful of postgraduate credits in international diplomacy. I've worked on bachelor's degrees in hospitality, business administration, and accounting. I've written for courses in history, cinema, labor relations, pharmacology, theology, sports management, maritime security, airline services, sustainability, municipal budgeting, marketing, philosophy, ethics, Eastern religion, postmodern architecture, anthropology, literature, and public administration. I've attended three dozen online universities. I've completed 12 graduate theses of 50 pages or more. All for someone else.
You've never heard of me, but there's a good chance that you've read some of my work. I'm a hired gun, a doctor of everything, an academic mercenary. My customers are your students. I promise you that. Somebody in your classroom uses a service that you can't detect, that you can't defend against, that you may not even know exists.
I work at an online company that generates tens of thousands of dollars a month by creating original essays based on specific instructions provided by cheating students. I've worked there full time since 2004. On any day of the academic year, I am working on upward of 20 assignments.
In the midst of this great recession, business is booming. At busy times, during midterms and finals, my company's staff of roughly 50 writers is not large enough to satisfy the demands of students who will pay for our work and claim it as their own.
You would be amazed by the incompetence of your students' writing. I have seen the word "desperate" misspelled every way you can imagine. And these students truly are desperate. They couldn't write a convincing grocery list, yet they are in graduate school. They really need help. They need help learning and, separately, they need help passing their courses. But they aren't getting it.
For those of you who have ever mentored a student through the writing of a dissertation, served on a thesis-review committee, or guided a graduate student through a formal research process, I have a question: Do you ever wonder how a student who struggles to formulate complete sentences in conversation manages to produce marginally competent research? How does that student get by you?
I live well on the desperation, misery, and incompetence that your educational system has created. Granted, as a writer, I could earn more; certainly there are ways to earn less. But I never struggle to find work. And as my peers trudge through thankless office jobs that seem more intolerable with every passing month of our sustained recession, I am on pace for my best year yet. I will make roughly $66,000 this year. Not a king's ransom, but higher than what many actual educators are paid.
Of course, I know you are aware that cheating occurs. But you have no idea how deeply this kind of cheating penetrates the academic system, much less how to stop it. Last summer The New York Times reported that 61 percent of undergraduates have admitted to some form of cheating on assignments and exams. Yet there is little discussion about custom papers and how they differ from more-detectable forms of plagiarism, or about why students cheat in the first place.
It is my hope that this essay will initiate such a conversation. As for me, I'm planning to retire. I'm tired of helping you make your students look competent.
It is late in the semester when the business student contacts me, a time when I typically juggle deadlines and push out 20 to 40 pages a day. I had written a short research proposal for her a few weeks before, suggesting a project that connected a surge of unethical business practices to the patterns of trade liberalization. The proposal was approved, and now I had six days to complete the assignment. This was not quite a rush order, which we get top dollar to write. This assignment would be priced at a standard $2,000, half of which goes in my pocket.
A few hours after I had agreed to write the paper, I received the following e-mail: "sending sorces for ur to use thanx."
I did not reply immediately. One hour later, I received another message:
"did u get the sorce I send
please where you are now?
Desprit to pass spring projict"
Not only was this student going to be a constant thorn in my side, but she also communicated in haiku, each less decipherable than the one before it. I let her know that I was giving her work the utmost attention, that I had received her sources, and that I would be in touch if I had any questions. Then I put it aside.
From my experience, three demographic groups seek out my services: the English-as-second-language student; the hopelessly deficient student; and the lazy rich kid.
For the last, colleges are a perfect launching ground - they are built to reward the rich and to forgive them their laziness. Let's be honest: The successful among us are not always the best and the brightest, and certainly not the most ethical. My favorite customers are those with an unlimited supply of money and no shortage of instructions on how they would like to see their work executed. While the deficient student will generally not know how to ask for what he wants until he doesn't get it, the lazy rich student will know exactly what he wants. He is poised for a life of paying others and telling them what to do. Indeed, he is acquiring all the skills he needs to stay on top.
As for the first two types of students - the ESL and the hopelessly deficient - colleges are utterly failing them. Students who come to American universities from other countries find that their efforts to learn a new language are confounded not only by cultural difficulties but also by the pressures of grading. The focus on evaluation rather than education means that those who haven't mastered English must do so quickly or suffer the consequences. My service provides a particularly quick way to "master" English. And those who are hopelessly deficient - a euphemism, I admit - struggle with communication in general.
Two days had passed since I last heard from the business student. Overnight I had received 14 e-mails from her. She had additional instructions for the assignment, such as "but more again please make sure they are a good link betwee the leticture review and all the chapter and the benfet of my paper. finally do you think the level of this work? how match i can get it?"
I'll admit, I didn't fully understand that one.
It was followed by some clarification: "where u are can you get my messages? Please I pay a lot and dont have ao to faile I strated to get very worry."
Her messages had arrived between 2 a.m. and 6 a.m. Again I assured her I had the matter under control.
It was true. At this point, there are few academic challenges that I find intimidating. You name it, I've been paid to write about it.
Customers' orders are endlessly different yet strangely all the same. No matter what the subject, clients want to be assured that their assignment is in capable hands. It would be terrible to think that your Ivy League graduate thesis was riding on the work ethic and perspicacity of a public-university slacker. So part of my job is to be whatever my clients want me to be. I say yes when I am asked if I have a Ph.D. in sociology. I say yes when I am asked if I have professional training in industrial/organizational psychology. I say yes when asked if I have ever designed a perpetual-motion-powered time machine and documented my efforts in a peer-reviewed journal.
The subject matter, the grade level, the college, the course - these things are irrelevant to me. Prices are determined per page and are based on how long I have to complete the assignment. As long as it doesn't require me to do any math or video-documented animal husbandry, I will write anything.
I have completed countless online courses. Students provide me with passwords and user names so I can access key documents and online exams. In some instances, I have even contributed to weekly online discussions with other students in the class.
I have become a master of the admissions essay. I have written these for undergraduate, master's, and doctoral programs, some at elite universities. I can explain exactly why you're Brown material, why the Wharton M.B.A. program would benefit from your presence, how certain life experiences have prepared you for the rigors of your chosen course of study. I do not mean to be insensitive, but I can't tell you how many times I've been paid to write about somebody helping a loved one battle cancer. I've written essays that could be adapted into Meryl Streep movies.
I do a lot of work for seminary students. I like seminary students. They seem so blissfully unaware of the inherent contradiction in paying somebody to help them cheat in courses that are largely about walking in the light of God and providing an ethical model for others to follow. I have been commissioned to write many a passionate condemnation of America's moral decay as exemplified by abortion, gay marriage, or the teaching of evolution. All in all, we may presume that clerical authorities see these as a greater threat than the plagiarism committed by the future frocked.
With respect to America's nurses, fear not. Our lives are in capable hands - just hands that can't write a lick. Nursing students account for one of my company's biggest customer bases. I've written case-management plans, reports on nursing ethics, and essays on why nurse practitioners are lighting the way to the future of medicine. I've even written pharmaceutical-treatment courses, for patients who I hope were hypothetical.
I, who have no name, no opinions, and no style, have written so many papers at this point, including legal briefs, military-strategy assessments, poems, lab reports, and, yes, even papers on academic integrity, that it's hard to determine which course of study is most infested with cheating. But I'd say education is the worst. I've written papers for students in elementary-education programs, special-education majors, and ESL-training courses. I've written lesson plans for aspiring high-school teachers, and I've synthesized reports from notes that customers have taken during classroom observations. I've written essays for those studying to become school administrators, and I've completed theses for those on course to become principals. In the enormous conspiracy that is student cheating, the frontline intelligence community is infiltrated by double agents. (Future educators of America, I know who you are.)
As the deadline for the business-ethics paper approaches, I think about what's ahead of me. Whenever I take on an assignment this large, I get a certain physical sensation. My body says: Are you sure you want to do this again? You know how much it hurt the last time. You know this student will be with you for a long time. You know you will become her emergency contact, her guidance counselor and life raft. You know that for the 48 hours that you dedicate to writing this paper, you will cease all human functions but typing, you will Google until the term has lost all meaning, and you will drink enough coffee to fuel a revolution in a small Central American country.
But then there's the money, the sense that I must capitalize on opportunity, and even a bit of a thrill in seeing whether I can do it.
And I can. It's not implausible to write a 75-page paper in two days. It's just miserable. I don't need much sleep, and when I get cranking, I can churn out four or five pages an hour. First I lay out the sections of an assignment - introduction, problem statement, methodology, literature review, findings, conclusion - whatever the instructions call for. Then I start Googling.
I haven't been to a library once since I started doing this job. Amazon is quite generous about free samples. If I can find a single page from a particular text, I can cobble that into a report, deducing what I don't know from customer reviews and publisher blurbs. Google Scholar is a great source for material, providing the abstract of nearly any journal article. And of course, there's Wikipedia, which is often my first stop when dealing with unfamiliar subjects. Naturally one must verify such material elsewhere, but I've taken hundreds of crash courses this way.
After I've gathered my sources, I pull out usable quotes, cite them, and distribute them among the sections of the assignment. Over the years, I've refined ways of stretching papers. I can write a four-word sentence in 40 words. Just give me one phrase of quotable text, and I'll produce two pages of ponderous explanation. I can say in 10 pages what most normal people could say in a paragraph.
I've also got a mental library of stock academic phrases: "A close consideration of the events which occurred in ____ during the ____ demonstrate that ____ had entered into a phase of widespread cultural, social, and economic change that would define ____ for decades to come." Fill in the blanks using words provided by the professor in the assignment's instructions.
How good is the product created by this process? That depends - on the day, my mood, how many other assignments I am working on. It also depends on the customer, his or her expectations, and the degree to which the completed work exceeds his or her abilities. I don't ever edit my assignments. That way I get fewer customer requests to "dumb it down." So some of my work is great. Some of it is not so great. Most of my clients do not have the wherewithal to tell the difference, which probably means that in most cases the work is better than what the student would have produced on his or her own. I've actually had customers thank me for being clever enough to insert typos. "Nice touch," they'll say.
I've read enough academic material to know that I'm not the only bullshit artist out there. I think about how Dickens got paid per word and how, as a result, Bleak House is ... well, let's be diplomatic and say exhaustive. Dickens is a role model for me.
So how does someone become a custom-paper writer? The story of how I got into this job may be instructive. It is mostly about the tremendous disappointment that awaited me in college.
My distaste for the early hours and regimented nature of high school was tempered by the promise of the educational community ahead, with its free exchange of ideas and access to great minds. How dispiriting to find out that college was just another place where grades were grubbed, competition overshadowed personal growth, and the threat of failure was used to encourage learning.
Although my university experience did not live up to its vaunted reputation, it did lead me to where I am today. I was raised in an upper-middle-class family, but I went to college in a poor neighborhood. I fit in really well: After paying my tuition, I didn't have a cent to my name. I had nothing but a meal plan and my roommate's computer. But I was determined to write for a living, and, moreover, to spend these extremely expensive years learning how to do so. When I completed my first novel, in the summer between sophomore and junior years, I contacted the English department about creating an independent study around editing and publishing it. I was received like a mental patient. I was told, "There's nothing like that here." I was told that I could go back to my classes, sit in my lectures, and fill out Scantron tests until I graduated.
I didn't much care for my classes, though. I slept late and spent the afternoons working on my own material. Then a funny thing happened. Here I was, begging anybody in authority to take my work seriously. But my classmates did. They saw my abilities and my abundance of free time. They saw a value that the university did not.
It turned out that my lazy, Xanax-snorting, Miller-swilling classmates were thrilled to pay me to write their papers. And I was thrilled to take their money. Imagine you are crumbling under the weight of university-issued parking tickets and self-doubt when a frat boy offers you cash to write about Plato. Doing that job was a no-brainer. Word of my services spread quickly, especially through the fraternities. Soon I was receiving calls from strangers who wanted to commission my work. I was a writer!
Nearly a decade later, students, not publishers, still come from everywhere to find me.
I work hard for a living. I'm nice to people. But I understand that in simple terms, I'm the bad guy. I see where I'm vulnerable to ethical scrutiny.
But pointing the finger at me is too easy. Why does my business thrive? Why do so many students prefer to cheat rather than do their own work?
Say what you want about me, but I am not the reason your students cheat.
You know what's never happened? I've never had a client complain that he'd been expelled from school, that the originality of his work had been questioned, that some disciplinary action had been taken. As far as I know, not one of my customers has ever been caught.
With just two days to go, I was finally ready to throw myself into the business assignment. I turned off my phone, caged myself in my office, and went through the purgatory of cramming the summation of a student's alleged education into a weekend. Try it sometime. After the 20th hour on a single subject, you have an almost-out-of-body experience.
My client was thrilled with my work. She told me that she would present the chapter to her mentor and get back to me with our next steps. Two weeks passed, by which time the assignment was but a distant memory, obscured by the several hundred pages I had written since. On a Wednesday evening, I received the following e-mail:
"Thanx u so much for the chapter is going very good the porfesser likes it but wants the folloing suggestions please what do you thing?:
"'The hypothesis is interesting but I'd like to see it a bit more focused. Choose a specific connection and try to prove it.'
"What shoudwe say?"
This happens a lot. I get paid per assignment. But with longer papers, the student starts to think of me as a personal educational counselor. She paid me to write a one-page response to her professor, and then she paid me to revise her paper. I completed each of these assignments, sustaining the voice that the student had established and maintaining the front of competence from some invisible location far beneath the ivory tower.
The 75-page paper on business ethics ultimately expanded into a 160-page graduate thesis, every word of which was written by me. I can't remember the name of my client, but it's her name on my work. We collaborated for months. As with so many other topics I tackle, the connection between unethical business practices and trade liberalization became a subtext to my everyday life.
So, of course, you can imagine my excitement when I received the good news:
"thanx so much for uhelp ican going to graduate to now".
The term information processing is used in information technology, psychology and the neurosciences alike, including to describe brain processes. In psychology, it refers to the operations by which people mentally manipulate what they learn and know about the world; in information technology, it refers to how we take in, process, access and store new information.
The brain’s purpose is to integrate information about the outside world with information from inside the body. The purpose of this, as some have suggested, is to predict the future: to anticipate and engage with change in an adaptive way. Consciousness consists of monitor images of the inner and outer worlds; it can be seen as a container for the representation of all experiences.
The distinction between inside and outside origins of conscious experience is especially useful. In Kant’s terms, phenomena are events out there in the world and noumena are inner-generated conscious experiences. In the human mind, images of what is around us (the Mittelwelt) tend to be detailed and explicit in consciousness.
By contrast, monitor images from inside the body (the Eigenwelt) tend to be vague, fleeting and variable. You cannot see and understand your own internal organs without specialised training in anatomy and physiology. We are generally ignorant of internal processes and invent all manner of imaginary and irrelevant explanations for internal events. As a result, our explanations of outside phenomena are very different from our explanations of inside experiences.
The subjective images from inside the brainbodymind can be called noumena. The brain places samples of its inner workings into consciousness; self-talk and dreams are examples of normal noumena, while delusions and hallucinations are abnormal noumena. Noumena can be divided into images that resemble phenomena but are generated by the brain, and images that originate from within the body and represent body states.
Information originating from inside the body is not clearly represented in consciousness. The inner senses fall into two groups: chemical and electrical. The more recently evolved electrical kind of sensing uses signals that travel along nerve networks reaching out to every cell in the body.
Images of the outside that are detailed and explicit in consciousness are called phenomena, and representations of phenomena are said to be objective. The eye, for example, is a photon sensor that sends electrical signals to the occipital cortex and other regions of the brain that use visual input as information. The ear is a mechanical sensor that turns sound into electrical signals that are processed and experienced in and by the brain.
Consciousness is associated with awareness and vigilance. The unconscious animal will not experience the approaching predator but the conscious and vigilant animal maintains awareness of the local environment. Sensory receptors are tuned to features of the environment that may satisfy drives or signal danger.
There is much more happening in the world outside of us than we can ever comprehend; it has been suggested that the human information-processing system deals with approximately 11 million bits of information per second. As demonstrated in many court cases, a detailed, complete account of a single moment or experience could easily consume many hours of documentation. As a result, we tend to simplify and approximate what is really going on. We dissect our Mittelwelt and Umwelt into discrete events (ignoring huge amounts of information in the process) and treat these events as the sole constituents of our experience, filtering the available information, selecting data that fits our habits and rejecting anything that does not. We describe what we are used to describing and we decide and act upon recognition - literally, knowing again.
The continuous interaction of events in the brain with events in the world (noumena with phenomena) creates our experience of this world. At the interface between phenomena and noumena, energy patterns in the world are transformed into representational energy patterns in brains. The sense organs provide the information that allows brain patterns to resonate with world events; these resonances are experienced as perceptions and sensations.
Scientists refer to the continuously interacting and emerging events as a Mesh. From the point of view of this model, we are nodes or points of consciousness in the Cosmic mesh interacting with the mesh, warping it and forming an integral part of it.
Peter Drucker defines knowledge as "Information that changes something or somebody - either by becoming grounds for actions or by making an individual (or an institution) capable of different or more effective action." This definition highlights both individual and corporate aspects of knowledge. KM models focus on what kinds of information move through a system, how the information moves, and the relationship of information to processes of change within that system. A number of biomedical models have suggested that HIV can change other information at the level of the DNA of a CD4 cell. In this way, it is suggested, the virus uses corrupted immune-system cells to create replicas of itself, thus compromising the ability of the immune system to defend against opportunistic infections. Because of this, HIV itself can be framed as knowledge, in that it has the ability to change other information. The primary consequences and sequelae of HIV infection are experienced at many levels, ranging from the HIV-positive individual through the family, the community, the culture, the society, the labour market, the economy and so on.
From an information system perspective, the human body is a complex organisation of information and therefore qualifies as a complex information system. The human body satisfies the definition in that it contains information at the level of each individual cell and in the immune, nervous and endocrine systems and the communication transactions between these systems – mediated by chemical messengers called polypeptides.
An information system can comprise data, information, exformation (information rejected as unnecessary for information processing), knowledge held in stories, skills, relationships, values, beliefs, behaviours (which can be seen as a form of information exchange), symbols, metaphors, narratives, social and economic transactions, as well as policies, procedures and protocols. All these phenomena, and much more besides, constitute a complex, living, dynamic information system. The system itself is in a state of continuous change as a result of the feedback mechanisms between its mutually influencing parts.
A particle physicist who knows about Brownian motion (the random movement of particles in a solution) can provide useful know-how and input into solving problems of traffic control or the management of disease transmission. Anthropologists used to studying pre-industrial cultures can provide insights into how social and community systems could be better designed. Film producers and directors have a wealth of project-management experience that has proven useful in helping design approaches to service delivery for government.
Seeing the similarity between processes, appropriating metaphors and language from one area of endeavour and applying them in another is probably something we all do - and now there is a word for it - Homological Transfer. Homological Transfer literally means to transfer logic of similar kind between areas of expertise. It engages the study of know-how, metaphors and stories and their systematic re-utilisation in different contexts. Homological Transfer is a knowledge creation process that takes principles and know-how from a field of study and applies them in a completely different context - and so is probably more in use than many would admit, especially given that we often end up doing that for which we were not formally trained.
Questions: What principles, stories and ideas have you found useful in getting things done, and from where did you learn them?
Clearly, transfer of know-how does not always have a positive impact and there are many examples of this too. Can you think of some?
While Africa remains the least developed continent, it is quite startling to see a map showing how big it actually is. At 30.3 million km², Africa is larger than China (9.6 million km²), the US (9.4 million km²), Western Europe (4.9 million km²), India (3.2 million km²) and Argentina (2.8 million km²) combined - a total of about 29.9 million km².
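Using only the figures quoted above, the comparison is easy to verify with a few lines of arithmetic:

```python
# Areas in millions of km², using the figures quoted in the text.
areas = {
    "China": 9.6,
    "US": 9.4,
    "Western Europe": 4.9,
    "India": 3.2,
    "Argentina": 2.8,
}

africa = 30.3  # millions of km²
combined = sum(areas.values())

print(round(combined, 1))   # 29.9
print(africa > combined)    # True
```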
And while we are all used to seeing maps of the world, Africa's size relative to the rest of the world map has been understated for many years: the continent is actually longer and wider than the shape we are accustomed to. One reason for this is that these maps were first drawn in the age of the British Empire, when the emphasis was on accentuating the centre of Empire to the detriment of "the Provinces" or colonies.
The map is NOT the territory - it is a special form of model that attempts to convey what Gregory Bateson called 'news of difference'. A map might show, for example, the difference between land and sea. It might show the differences between countries, and between roads and 'not roads'. In displaying these features, the scale is never faithful: the roads aren't actually the size they appear on the map, and a great deal of data is removed to keep the map comprehensible.
“In the 1970s, Arno Peters, a German map maker, historian and journalist, developed an equal area map projection to counter the commonly used "Eurocentric" Mercator map projection. He stated, "In our epoch, relatively young nations of the world have cast off the colonial dependencies and now fight for equal rights. It seems important to me that developed nations are no longer at the center of the world, but are plotted according to their true size." He points out that on the Mercator map Europe’s 3.8 million square miles are made to appear larger than South America’s 6.9 million square miles. Peters initially wrote a controversial world history text and found that "the quest for the causes of arrogance and xenophobia has led me repeatedly back to the global map as being primarily responsible for forming people’s impression of the world." It is important to note that the Mercator projection is rarely used today except for the purpose it was originally designed for - navigation. Of course, many Mercator maps can still be found in use by graphic designers, in older classroom materials, and as inexpensive wall maps.
Cartographers have criticized the Peters map in part due to its distortion of the shapes of continents - one cartographer went so far as to describe the effect as being "the resulting land masses are somewhat reminiscent of wet, ragged, long winter underwear hung out to dry on the Arctic Circle." While each continent is reflected accurately in terms of area proportion, the overall effect of the maps is not a realistic portrayal of the earth. Cartographers argue that numerous projections developed since the Mercator projection (such as the Robinson and the Goode) succeed in achieving a more realistic image without a Europe centered focus. In fact, equal area projections had existed since 1772, but the press, the United Nations, the World Council of Churches, and the National Council of Churches heralded the Peters map as a way for the "Third World" to break away from colonial constructs.
Thus, timing was everything; many nations of the world had just achieved independence from colonial powers within the previous decade. Peters, as an accomplished journalist, knew the art of generating publicity. There ensued an on-going debate over the use of this map between cartographers on the one hand, and people who believed the map would change people’s perceptions about the Third World on the other.”
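The area exaggeration Peters objected to can be quantified. On a Mercator map, both the east-west and north-south scales stretch by a factor of 1/cos(latitude), so apparent area is inflated by the square of that factor. A minimal sketch (the latitude figures below are rough illustrations, not part of the quoted text):

```python
import math

def mercator_area_inflation(lat_deg):
    """Factor by which the Mercator projection inflates apparent area
    at a given latitude: sec² of the latitude."""
    return 1.0 / math.cos(math.radians(lat_deg)) ** 2

# Much of Europe sits around 50°N; much of South America lies near the equator.
print(round(mercator_area_inflation(50), 2))   # 2.42
print(round(mercator_area_inflation(10), 2))   # 1.03
```

At roughly European latitudes, apparent areas are more than doubled, which is enough to make Europe's 3.8 million square miles look larger than South America's 6.9 million.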
See a copy of Peters World Map
Compiled by Steve Banhegyi
Here is an interesting fact: in America, a billion is a thousand million, written thus: 1,000,000,000
However, the British traditionally defined a billion as a million million. That is 1,000,000 times 1,000,000, which would be written thus: 1,000,000,000,000 (modern British usage has largely adopted the American convention).
Once you get past eight zeroes, the British names no longer match the American names. For example, a 1 followed by 9 zeroes is a British "milliard" but an American "billion". In other words, the British numbering system diverges completely from the American system beyond the thousands and millions. Thousand and million are the same, but then the British sequence runs milliard, billion, billiard, trillion, trilliard, etc.
This is important not only for economics but also for science - when billions and trillions are bandied about, we need to be clear whether the British or the American system is meant. Is that the American billion (one thousand million) or the British billion (one million million)? The gap is enormous: two thousand trillion degrees C is 2,000,000,000,000,000 degrees C (2 × 10^15) in the American system but 2,000,000,000,000,000,000,000 degrees C (2 × 10^21) in the British system.
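The divergence between the two systems can be captured in a small lookup table, keyed by the exponent of ten (the names follow the short- and long-scale conventions described above):

```python
# Short scale (American) vs long scale (traditional British) names
# for powers of ten; the two agree up to a million and diverge after.
SHORT_SCALE = {6: "million", 9: "billion", 12: "trillion", 15: "quadrillion"}
LONG_SCALE = {6: "million", 9: "milliard", 12: "billion",
              15: "billiard", 18: "trillion", 21: "trilliard"}

def name_of(power, scale):
    """Name for 10**power in the given scale, keyed by exponent."""
    return scale.get(power, "no single-word name listed here")

print(name_of(9, SHORT_SCALE))   # billion
print(name_of(9, LONG_SCALE))    # milliard
print(name_of(12, LONG_SCALE))   # billion
```

So an American "billion" (10^9) and a traditional British "billion" (10^12) differ by a factor of one thousand.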
You can find out more about the British and American numbering systems at the Numbering Systems and Place Value website. The site also looks at other naming systems for large numbers, such as googols and googolplexes.
This overview attempts to present the key ideas needed to develop a community animation model, in language that is clear and empowering and that emphasises the application of know-how. The structure presented here draws together experiences from using Isivivane for Change and Transformation (Banhegyi 2001-2007) in an African context. Additionally, the model draws inspiration from models developed by Walsh & Ungson (1991), Collison & Parcell (1998) and Nonaka & Takeuchi (1995) in that it emphasises cultural context, group dynamics and linkages between participants. The approach stimulates a community into action, provides basic know-how useful in the design and support of a sustainable system, and guides the user through what needs to be done in order to succeed. Essentially, the model requires leadership: someone needs to take the initiative, assume permission, ask the questions and implement the following steps.
The African Renaissance is about developing Africans and Africa. It is a call for the rebirth, renewal, reinvention and repositioning of Africans and Africa in a globalising world. Furthermore, it is a call to Africans to relearn and rediscover who they are and where they stand in the global scheme of things. It is a vision bigger than the African Union, NEPAD and many other initiatives by individuals, communities, governments and multilateral organisations. It is not an effort to emulate 'world class' standards set by others, but to set world standards to be followed by others. The African Renaissance is seen as the rebirth of the continent after centuries of suppression, correcting negative images. Rebirth must come through rediscovery of Africa's past, reversing the descent into chaos. It is about planning for the future on the basis of a new knowledge framework accommodating the ideas and philosophies that created the great empires of Ghana, Monomotapa, Songhai and Mali (Gutto 2005).
The OAU Charter, adopted in Addis Ababa on 25 May 1963, stipulates that the Organization shall have the following purposes:
The AU superseded the OAU when the Constitutive Act of the African Union was adopted in Lomé, Togo, on 11 July 2000. According to the Act, the objectives of the AU are to:
One of the core objectives of the African Union (AU) is the promotion of peace, security, and stability on the Continent, as spelt out in Article 3 (f) of the AU Constitutive Act. To strengthen the AU’s capacity in the prevention, management and resolution of conflicts, Member States adopted, in July 2002, in Durban, South Africa, the Protocol Relating to the Establishment of the Peace and Security Council (PSC), which entered into force in December 2003. The Protocol, in article 2 (1), defines the PSC as 'a collective security and early-warning arrangement to facilitate timely and efficient response to conflict and crisis situations in Africa'.
This work is copyright Dr Dries Velthuizen of the African Wisdom site and is used with permission.
Virology has provided our culture with many useful insights and the term 'viral' and the viral metaphor spring up in the form of computer viruses, viral marketing, memetics and memeplexes. Human beings are by nature metaphorical beings and understand complex concepts through metaphor and analogy. In other words, we understand something in terms of something else.
However, trying to understand a virus in terms of a virus throws the thinker into ever more convoluted loops of logic. To think about HIV, what we need is a good metaphor for it - what is HIV 'like'? What organisms or systems do we know of that do the things HIV does? Could the emergent properties of billions of human beings transacting with each other be doing to our planet what HIV does to our bodies?
A particularly useful area for those working with HIV is the field of memetics and memeplexes. The term meme (pronounced like dream) was coined by the biologist Richard Dawkins in his 1976 book "The Selfish Gene". As examples of memes, he suggested “tunes, ideas, catch-phrases, clothes fashions, ways of making pots or of building arches”.
Memes are habits, skills, songs, stories, or any other kind of information that is copied from person to person. Memes, like genes, are replicators. That is, they are information that is copied with variation and selection. Because only some of the variants survive, memes (and hence human cultures) evolve. Memes are copied by imitation, teaching and other methods, and they compete for space in our memories and for the chance to be copied again and again. Large groups of memes that are copied and passed on together are called co-adapted meme complexes, or memeplexes.
The word “meme” is now found in the Oxford English Dictionary, where it is defined: “meme (mi:m), n. Biol. (shortened from mimeme ... that which is imitated, after GENE n.) An element of a culture that may be considered to be passed on by non-genetic means, esp. imitation.”
According to memetics, our minds and cultures are designed by natural selection acting on memes, just as organisms are designed by natural selection acting on genes. A central question for memetics is therefore ‘why has this meme survived?’. Some succeed because they are genuinely useful to us, while others use a variety of tricks to get themselves copied. From the point of view of the “selfish memes” all that matters is replication, regardless of the effect on either us or our genes.
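The claim that copying with variation and selection is enough for evolution can be illustrated with a toy simulation. Nothing below comes from the memetics literature; the "catchiness" function and target phrase are invented purely for this sketch:

```python
import random

random.seed(1)

ALPHABET = "abcdefghijklmnopqrstuvwxyz "
TARGET = "pass it on"  # an arbitrary 'catchy' phrase, chosen for illustration

def mutate(meme):
    """Copy a meme, occasionally with variation (one character changed)."""
    chars = list(meme)
    if random.random() < 0.3:
        chars[random.randrange(len(chars))] = random.choice(ALPHABET)
    return "".join(chars)

def catchiness(meme):
    """Toy selection pressure: how closely the meme matches the target."""
    return sum(a == b for a, b in zip(meme, TARGET))

# Start from random noise and let copying, variation and selection run.
population = ["".join(random.choice(ALPHABET) for _ in TARGET)
              for _ in range(50)]
for _ in range(300):
    population.sort(key=catchiness, reverse=True)
    survivors = population[:10]  # selection: only the catchiest persist
    population = survivors + [mutate(random.choice(survivors))
                              for _ in range(40)]  # imitation with variation

print(max(population, key=catchiness))
```

With these settings the population usually homes in on the target phrase within a few hundred generations; the point is only that blind copying plus selection is enough to answer "why has this meme survived?" without any designer.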
Some memes are almost entirely exploitative, or viral, in nature (chain letters and e-mail viruses, for example). These consist of a “copy-me” instruction backed up with threats and promises. Religions have a similar structure, which is why Dawkins refers to them as "viruses of the mind". Many religions threaten hell and damnation, promise heaven or salvation, and insist that their followers pass on their beliefs to others. This ensures the survival of the memeplex. Other viral memes include alternative therapies, new-age fads and cults, children’s games, urban legends and popular songs, all of which can spread like infections.
At the other end of the spectrum memes survive because of their value to us. The most valuable of memeplexes include all of the arts and sports, transport and communications systems, political and monetary systems, literature and science. Memetics has been used to provide new explanations of human evolution, including theories of altruism, the origins of language and consciousness, and the evolution of the human brain. The Internet can be seen as a vast realm of memes, growing rapidly by the process of memetic evolution and not under human control. The field of memetics is new and controversial, with many critics, and difficulties to be resolved.
Without an operating system, computer hardware is inanimate and about as capable as a brick. In the early days, the operating system was considered an integral part of the computer, until Bill Gates's shrewd move of separating the operating system from the hardware with MS-DOS version 1.0. From that point, the operating system became glamorous, glitzy and branded as a consumer product - and had to be paid for separately from the hardware.
I use three operating systems: Microsoft Windows XP Pro, Ubuntu 9.04 (Jaunty Jackalope) and Windows Mobile 6.0 on my HTC palmtop. I first started using Linux about four years ago. Until then I used Microsoft exclusively, apart from my experiences with some of the more exotic operating systems of the early 1980s, which included the Commodore PET (with 16 KB of RAM!), the Sinclair ZX-81, an OS for designing integrated circuits called Gaelic and even an OS called Gerbil.
In using different operating systems, I've noticed that each one causes me to interact quite differently with the hardware - an altogether different user experience. On Windows, there are particular rituals that are missing on Linux: I have to run defrags and chkdsks, update virus checkers (I pay for virus checkers and anti-spyware) and fiddle with swap files from time to time. I spend much more of my day-to-day time on Ubuntu now, and it has always been exceptionally reliable and stable. There are also thousands of software titles available for instant download and installation, from astronomy all the way to managing a zoo (I live in a house with teenagers!).
The point is that an operating system has many interesting parallels to culture: a culture inhibits certain behaviours whilst stimulating others, in the same way that certain programs can run within a particular operating system whilst others cannot. Your Apple or Linux software won't run on Windows. A culture can also put you into patterns of being and behaving of which you can become completely unconscious - where things become so commonplace and everyday that they are simply 'the way we do things around here'. They have become a paradigm.
Permaculture is a particularly useful metaphor in change management and KM and many organisations are using permaculture principles to teach design and sustainability. Permaculture concerns itself with the use of ecology as the basis for designing integrated systems of food production, housing, appropriate technology, and community development.
"Companies are actually living organisms, not machines. We keep bringing in mechanics, when what we need are gardeners." Peter Senge
Permaculture is built upon an ethic of caring for the earth and interacting with the environment in mutually beneficial ways. It seeks to design sustainable human habitats by observing and following the patterns of nature. Permaculture encourages holistic thinking, which means that we consider all aspects of a system (however insignificant they seem) and how they interrelate with each other. In addition, permaculture encourages us to think about the consequences of any intervention. In very broad summary, permaculture teaches an appreciation for design that:
According to Dunning (1997, 370) the geographical imbalance between the current 'technology revolution' and the 'population revolution' is a potential time bomb. The wealthiest 12% of the world population controls 85% of the world's stock of created assets, while the rest owns or controls only 15% of these assets. An increase of approximately 50% of the world population over the next five years will probably occur in the less wealthy part of the world. This imbalance can be addressed by Chinese and Indian economic development and by removing the threat of ideological warfare (referring to the war between Islam and the West).
According to Parente (2001) the imbalances in the knowledge economy can be seen in the context of 'new growth models', which can be exogenous or endogenous. The 'exogenous growth model', also known as the 'neo-classical model', refers to a model of economic growth within the framework of neoclassical economics. A key proposition of neoclassical growth models is that the income levels of poor countries will converge towards the income levels of rich countries. Since the 1950s, however, the opposite has been observed: the developed world appears to have grown at a faster rate than the developing world, contrary to the prediction of convergence. Formerly poor countries such as Japan do appear to have converged with rich countries, and in Japan's case actually exceeded the productivity of other countries. The model does not take into account, however, that the success of Japan rested on entrepreneurship and the strength of institutions, which served as catalysts for economic growth. Nor does it explain how or why technological progress occurs.
Parente says that these limitations have led to the development of an 'endogenous growth theory', which endogenises technological progress and knowledge accumulation. Endogenous growth theories usually rely on cycles to describe an unstable pattern of events in search of equilibrium. Importance is given to the creation of new technologies and human capital. Organisations and individuals have an incentive to be innovative in order to gain an advantage over their competitors, thus improving productivity. The 'endogenous growth theory' has proven no more successful than the exogenous growth theory in explaining the income divergence between the developing and developed worlds. The main failing of both theories is their inability to explain non-convergence, or why some countries are still much richer than others.
Nabudere (2006d) asserts that the creation of ‘new growth models’ that replicate the old models in order to rationalise state intervention and investment cannot be accepted. The idea behind the old growth model, which assumed that the production factors (land, capital and labour) operated independently in relation to economic growth, was wrong. Standard economics and ‘development theory’ were intended to inform and explain how economic growth was achieved through these three ‘production factors’. Other variables such as social capital and tacit knowledge were ignored. The old model did not take into account the existence of different forms of ‘capital’, of which finance capital and indigenous knowledge were exploited without compensation. Furthermore, the old economic models were built on a mono-disciplinary approach that placed ‘standard economics’ above the other human and social sciences, so that the co-existence of science and tacit knowledge was ignored or exploited. The recognition of other forms of knowledge is a prerequisite for the emergence of a model that places a premium not only on the stock of knowledge available to an enterprise, but even more on the capacity to learn new ideas.
Nabudere suggests the need to revamp educational policy and investment in education towards a more grass-rooted ‘learning economy’ that responds to local needs, and a culturally relevant ‘knowledge economy’ that accommodates the pressure originating from the global economy, rooted in the solutions embedded in tacit knowledge and social capital. A new investment policy in education that recognises that knowledge is necessary for production is crucial in policy formulation.
Nabudere concludes that to reconstruct states that reflect the aspirations of Africans, people must consider the developments in the global political economy and link themselves to the positive forces within the global system in order to strengthen their local activities. Learning is no longer concentrated at a single location, and scientifically and technologically related learning takes place outside the universities. Nowadays business, communities and many other non-academic settings, where groups of people from different disciplines and institutions come together, are centres of learning; the boundaries that used to exist between academic and non-academic learning are becoming blurred as the ‘excluded middle’ is increasingly included. The ‘learning economy’ is a crucial aspect of the ‘knowledge-based economy’, with the emphasis on ‘learning to learn’ in different environments, a connection between intellectual capital and social capital, a change in organisation towards functional flexibility, and training of students in how to learn. Policies must work towards a new convergence, which recognises that knowledge is necessary for production, and that other communities seek interlocking networks of economic and social relationships globally as Africa moves into a ‘learning economy’.
A recent study by the Department of Industrial Psychology, University of Stellenbosch (Du Toit, Engelbrecht and Pooven 2006) revealed that traditional African values, although in congruence with many universal ethical values, place more emphasis upon collectivism, collaboration, caring, dignity and respect. It is argued that these values should underlie a value-based leadership style to enhance team performance in modern organisations through better integration and understanding of a multi-cultural workforce and the management of diversity with a focus on teams. As traditional European and American management concepts do not always provide for the needs of a diverse society in a process of economic and social development, the approach is based on local values, more specifically the value system of Ubuntu. Practicing the social values of Ubuntu in organisations would not only preserve these values in the modern business world, but would also lead to team effectiveness. A leader who has a values-based style of leadership, and who is aware of existing value systems within the team can achieve role modelling. Mbigi and Maree (2005, 117) assert that the creative force of history is not ideology, religion or politics but the way in which people organise work and create value.
The factory system brought people out of cottage industry and feudal conditions, and mass production brought about the current mass-consumption civilisation and the mastering of technology. The fame of the USA and the ascent of Japan were the result of mastering mass production and technology. The authors suggest that Africa mastered excellent production techniques and accomplished mass customisation before anywhere else in the world. Furthermore, African clairvoyants and intellectuals must shift their attention from the politics of resistance to the politics of production. The fascination with European literature must change into a fascination with their African roots, which lie in the traditional knowledge practices of Africa. Only then will Africa have the confidence to create economic growth. If companies are to be competitive in global markets, they have to learn to harness the collective will, intelligence and energy of their people by creating enterprising communities through the canonising of Ubuntu. They have to select the best business practices and then create team practices that are in harmony with the values of Ubuntu, which means encouraging people to express themselves through the group.
According to the Council for the Development of Social Science Research in Africa (CODESRIA) some societies are marked by isolation, scientific marginalisation, extremely unstable material conditions, political repression, a devastating brain drain and lack of academic freedom. The challenge is to nurture, develop and sustain a productive, highly motivated community of social science researchers, transcending disciplinary, gender and generational barriers. A further challenge is the strengthening of the institutional basis of knowledge production by developing programmes of collaboration with other centres of social research in Africa whether they are national or (sub-) regional, university-based or independent. Moreover, in order to produce knowledge, numerous scientific activities and tools, whose value and impact are universally recognised, must be developed. These include:
This work is copyright Dr Dries Velthuizen, African Wisdom site, and is used with permission.
Many researchers – and their advisors on research method – adopt a doctrine called empiricism, which claims that researchers may only use empirical methods. This restrictive doctrine impoverishes any academic discipline where it is dominant. The main reason is that a discipline only qualifies for the status of a science after it has progressed beyond empirical generalisations to explanatory theories; but although empirical methods are useful for discovering the former, they are inherently useless for creating the latter. So the empiricist doctrine retards scientific progress. Researchers should be aware of this danger, and research methodologists should attempt to counter it.
There is a world of difference between the terms ‘empirical’ and ‘empiricism’. The term ‘empirical’ refers to a battery of very useful research methods. The term ‘empiricism’ refers to a restrictive methodological doctrine which claims that researchers may only use empirical methods. The purpose of this paper is not to disparage empirical research methods, but to warn readers that the empiricist doctrine impoverishes any discipline where it is deeply entrenched (Gower, 1997, p. 10), and to suggest some avenues of counteraction. The subsequent sections explain why the empiricist doctrine impoverishes research. The first section shows that researchers need knowledge of various kinds of research processes and knowledge products, and that this knowledge is distributed over three academic disciplines: Philosophy of Science, History of Science and Research Methodology. The next three sections examine the origin and current status of the empiricist doctrine in the Philosophy of Science, and the debilitating effect of empiricism on research process and product knowledge in the History of Science and in Research Methodology. The last section calls for counter-action in the form of meta-research aimed at identifying non-empirical research processes and knowledge products that could be mentioned in those three disciplines, especially in Research Methodology.
As the argument in those sections is lengthy, no space is left over for detailed analysis of the impact of the empiricist doctrine on the Information Systems discipline, nor on any of the other disciplines under the umbrella of Informing Science. Readers are invited to judge for themselves, from their personal experience, whether those disciplines are dominated by the empiricist doctrine, and whether that doctrine has impoverished them.
Research is a process of producing new knowledge. So it is a productive process similar to the productive processes of manufacturing cars, computers, software, etc. Some useful insights emerge by analysing the other productive processes and then comparing them with the research process.
All productive processes require productive knowledge. For example:
Productive knowledge consists of process knowledge as well as product knowledge (see Mende, 2000 for more detail). When manufacturers establish a new factory, they have to decide what the factory is to produce, and how the factory will produce it. So they need to know what kinds of manufactured products are needed, and what kinds of processes can be used to produce them. For example, when Henry Ford decided to produce motor cars, he had to know that people need cars, and that cars can be produced on a production line. Similarly, when researchers embark on a research project, they have to decide what knowledge product to produce and how to produce it. So they too need to know what kinds of knowledge products are required and what kinds of research processes can be used (Kantorovich, 1993, p. 11; Singleton, Straits & Straits, 1993, p. 18). For example, when Ohm embarked on his famous research project to find the empirical law of electric current variation with voltage, he had to be aware that people need empirical laws, and that empirical laws can be produced by means of inductive research processes. Similarly, when Darwin embarked on his famous research project on the theory of evolution, he had to be aware that people need theories, and that theories can be established by means of deductive research processes.
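Ohm's inductive route can be sketched in a few lines of code: given voltage-current readings, an empirical generalisation is induced from the data alone by a least-squares fit through the origin. The numbers below are invented purely for illustration; the point is that nothing beyond the observations enters the process.

```python
# Illustrative sketch (hypothetical data): inducing an empirical law
# (Ohm's I = V / R) from observations alone, with no prior theory.

voltages = [1.0, 2.0, 3.0, 4.0, 5.0]        # applied voltage (V)
currents = [0.21, 0.39, 0.61, 0.80, 1.01]   # measured current (A)

# Least-squares slope through the origin for the model I = k * V:
# k = sum(V*I) / sum(V*V); the induced resistance is R = 1 / k.
k = sum(v * i for v, i in zip(voltages, currents)) / sum(v * v for v in voltages)
resistance = 1 / k

print(f"induced law: I = V / {resistance:.1f}  (R in ohms)")
```

The fit summarises the data but explains nothing: why current is proportional to voltage requires a theory of conduction, which no amount of curve-fitting can supply.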
Therefore, by analogy with manufacturing management, researchers need knowledge of different types of research processes and knowledge products. Since this is knowledge about knowledge, it may be called ‘meta-knowledge’. For convenience of access, all our meta-knowledge should be concentrated in a single academic discipline. But that is not so. Instead, our meta-knowledge is scattered across three different disciplines, namely History of Science, Philosophy of Science and Research Methodology.
History of Science is an old-established discipline, which began with the ancient Greeks (Lloyd, 1973). Today, it describes many of the knowledge products that were discovered by real-life researchers, and also some of the research processes that those researchers actually used. Philosophy of Science is another old-established discipline, which also began with the ancient Greeks. Today, it mainly analyses the validity of existing knowledge products, but occasionally considers the research processes too. Unfortunately, many publications in Philosophy of Science make scant reference to History of Science, and most of them ignore Research Methodology altogether.
Research Methodology is an emerging new discipline that aims to unify many of the methodological principles found in the various sciences and proto-sciences. During the latter half of the 20th century, specialised methodological branches have emerged in many of those disciplines to focus on issues of method. For example, in the natural sciences there are textbooks on experimental techniques of physics and chemistry, and on microscope techniques in biology and geology (Furniss, Hannaford, Smith, & Tatchell, 1989; Heinrich, 1965; Sanderson, 1994). In the social proto-sciences there are textbooks on experimental psychology, sociological method, anthropological research, educational research and business research (Cassell & Symon, 1994; Christensen, 1980; Cole, 1980; Foskett, 1965; Pelto & Pelto, 1978; Zikmund, 2003). Yet certain methods are common to many of these disciplines.
“The research procedures of most academic disciplines follow the dictates of the scientific method ... In many instances, only the tools of research are different. The biologist gathers data by way of the microscope, the sociologist does likewise through a questionnaire. From there on the basic procedure of each is the same: to process the data, interpret them, and reach a conclusion based on factual evidence.” (Leedy, 1989, p. vii).
The new discipline of Research Methodology identifies and explains these common research procedures. Unfortunately, most publications in Research Methodology make scant reference to Philosophy of Science and History of Science.
The three disciplines are subject to the force of fashion (Lovelock, 1995, p. 204; Nagel, 1961, p. 115; Sperber, 1990). This force arises in any social group, including a community of scholars. A scholar in an academic discipline is subject to research fashions in the same way as anyone else is subject to clothing fashions, motorcar fashions, food fashions, etc. The next three sections focus on the empiricist fashion. It originated in the Philosophy of Science, where it is now dismissed with contempt; but it has spread to History of Science, where it is still mildly influential, and to Research Methodology, where it remains dangerously dominant.
The precursor of empiricism was a philosophical doctrine called positivism. This was a doctrine of neglect. It called for neglect of a particular class of knowledge products, namely theories, and especially those theories that involve un-observable first causes (Oldroyd, 1986, p. 169). Positivism originated in the 18th century, when Berkeley denied the reality of theoretical objects such as the Newtonian forces of mechanics (Losee, 1993, p. 168). Positivism was subsequently disseminated by two influential 19th century philosopher-scientists: the sociologist Comte, who asserted that ‘science must study only the laws of phenomena’, and the physicist Mach, who attempted to purge all theoretical terms from Mechanics (Losee, 1993, p. 170; Whewell, 1860, p. 183). Positivism was disseminated even more widely in the early 20th century by a group of philosophers called the Vienna Circle, who regarded theoretical objects as meaningless, and a theory as a mere computational device for describing and predicting phenomena (Harre, 1960, p. 46; Hollis, 1994, p. 42).
Empiricism (or inductivism) is the logical consequence of positivism for research processes. In the same way as positivism dismisses theory, the doctrine of empiricism dismisses deductive theoretical methods, and demands that researchers should restrict themselves to inductive empirical methods.
“induction ... is the method proposed by crude empiricism to distinguish scientific inquiry from non-scientific speculation” (Doyal & Harris, 1986, pp. 2-3).
The remainder of this section demonstrates that the positivist and empiricist doctrines are dangerously restrictive. These doctrines fall first to an inductive empirical argument, and then to a deductive explanatory argument.
The inductive argument is based on evidence from the History of Science, namely the many instances where empirical methods produced significantly less useful knowledge products than theoretical methods.
So, empirical methods were often less useful than theoretical methods. Now if today’s researchers were trapped into empiricism, they would be restricted to empirical methods, and would be unable to produce the more useful theories. Therefore empiricism would impoverish research.
The deductive argument against the positivist empiricist doctrines is based on two propositions of the modern Philosophy of Science:
Proposition 1 means that an academic discipline does not qualify for the status of a science until it has progressed beyond empirical generalizations to explanatory theories. For example, the branches of Physics and Astronomy now called Dynamics and Celestial Mechanics were labelled ‘natural philosophy’ at the time Galileo formulated his empirical laws of motion and Kepler formulated his empirical laws of planetary orbits. They only achieved the status of sciences after Newton devised a deductive theory to explain Galileo’s laws and Kepler’s laws (Toulmin, 1953, p. 50). The first proposition is supported by the quotes in Table 1.
Proposition 2, which asserts that empirical methods are inappropriate for creating explanatory theories, follows from the fact that empirical research involves inductive reasoning, whereas theoretical research involves deductive reasoning. Empirical methods induce generalisations from facts. Theoretical methods then explain the empirical generalisations by generating deductive arguments to the generalizations from first causes, which are usually un-observable. So inductive methods are useless for devising deductive explanatory theories.
This proposition is easy to confirm from cases in the History of Science. For instance, Newton made no observations or experiments, and analysed no data in devising the theory of Mechanics; neither did Dalton in devising the atomic theory of Chemistry, nor Darwin in devising the biological theory of evolution, nor Einstein in devising the relativity theory.
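The deductive direction can be made concrete with the standard textbook derivation (not part of the original paper): Newton's second law, together with the premise of a constant gravitational force, recovers Galileo's empirical law of free fall as a logical consequence.

```latex
% Deduction of Galileo's law of free fall from Newtonian first principles.
% F = ma is the theoretical premise; s(t) is the observable generalisation.
\begin{align*}
F &= ma, \qquad F = mg &&\Rightarrow\; a = g \quad \text{(constant acceleration)}\\
v(t) &= \int_0^t g \,\mathrm{d}\tau = gt &&\text{(velocity grows linearly with time)}\\
s(t) &= \int_0^t g\tau \,\mathrm{d}\tau = \tfrac{1}{2}gt^2 &&\text{(Galileo's empirical law)}
\end{align*}
```

The direction of inference runs from an un-observable first cause (the force) down to the observable regularity, exactly the reverse of induction from data.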
Table 2. Inadequacy of empirical methods for creating explanatory theories
The positivist and empiricist doctrines now fall to a simple reductio ad absurdum. Science is characterised by the existence of deductive explanatory theory. Yet inductive empirical methods are inappropriate for creating deductive explanatory theory.
So inductive empirical methods are unlikely to produce a science. The empiricist doctrine restricts researchers to inductive empirical methods. So it impoverishes research by inhibiting progression to scientific status. Yet the aim of this doctrine is to ensure scientific status. Therefore empiricism is absurd. So empiricism is untenable in the Philosophy of Science. Indeed, some philosophers have rejected the absurd empiricist doctrine in the past, and many others reject it today – see Table 3.
Nevertheless, empiricism still influences the other two disciplines that are concerned with research method.
Historians of Science are subject to the force of fashion – particularly by fashions in the Philosophy of Science.
“The historiography of science, more than the history of other aspects of human thought, is peculiarly subject to philosophic fashion” (Hesse, 1980, p. 3).
Many older historians have been influenced by the old empiricist philosophy (Hesse, 1980, p. 4; Hollis, 1994, p. 42). So when they decide which processes and products to study, they are likely to over-emphasise empirical processes and products, and neglect theoretical processes and products (Hesse, 1980, p. 5). Therefore, historians may have missed potentially useful research products and processes.
The authors of Research Methodology textbooks are also subject to the force of fashion. They have been influenced by two fashions, namely scientism and empiricism. Scientism was a pervasive research fashion until a few decades ago. According to this fashion, all scientists ought to emulate the ‘empirical-analytical’ method, which was supposedly used by many natural scientists, particularly physicists. “The empirical-analytical method is the only valid approach to improve human knowledge. What cannot be investigated by it, cannot be investigated scientifically at all and therefore must be banned from the domain of science as unresearchable and consequently as unpublishable, unfundable and almost as unspeakable” (Klein & Lyttinen, 1985).
Table 4. The popularity of positivist empiricism
The authors rarely state these norms explicitly, but rather let their readers infer them implicitly. They do that in three ways.
First, some authors simply define ‘research’ as a process of data collection and data analysis (e.g. Bailey, 1987, p. 11; Creswell, 1994, p. xvii; Erlandson, Harris, Skipper, & Allen, 1993, p. xvii; Leedy, 1989, p. 9; Miller, D. C., 1970, p. v; Neale & Liebert, 1986, p. 7; Riley, 1963, p. xiv; Tuckman, 1978, p. 12-14; Williamson, Karp, Dalphin & Gray, 1982, p. 4).
Second, other authors define ‘the scientific method’ as a process of data collection and data analysis (e.g. Bynner & Stribley, 1978, pp. 4-8; Heiman, 1995, pp. 9, 19; Kerlinger, 1986, pp. 10-13; Labovitz & Hagedorn, 1976, p. 23; Leedy, 1989, p. 80; Lehmann & Mehrens, 1979, p. 3; Mason & Bramble, 1978, p. 26; McMillan & Schumacher, 1997, p. 9; Neuman, 1994, pp. 8-11; Rummel, 1964, pp. 11-15; Williamson et al., 1982, pp. 6-8).
Third, others define the ‘hypothetico-deductive method’ as hypothesis deduction from theory followed by data collection and analysis, and recommend this method as the model of research in any science (e.g. Bailey, 1987, p. 39; McNeill, 1985, p. 42; Sekaran, 1992, p. 16, TerreBlanche & Durrheim, 1999, p. 4).
So the textbooks of Research Methodology either implicitly adopt the empiricist doctrine, by excluding all research methods other than the empirical methods, or explicitly adopt it, by suggesting that empiricism is necessary for an academic discipline to achieve scientific status. Examples are shown in Table 5.
There are at least three reasons why positivism and empiricism are popular among researchers and their methodological advisors. One reason is that “Today’s science teaching reflects yesterday’s philosophy of science” (Kantorovich, 1993, p. 255).
Another reason is that many research advisors know a great deal about confirming and falsifying theories, but know next to nothing about creating them (Phillips, 1985, p. 8). Furthermore, they seem to be unaware that their (physicist) role-models used not only inductive empirical methods of confirmation but also used deductive methods of discovery (Chalmers, 1982, pp. xv-xvi). So when authors write textbooks of Research Methodology, they would have no option but to emphasise the well-known empirical methods, and neglect the little-known theoretical methods.
Similarly, some authors may neglect theoretical methods because they are unaware of the deductive-explanatory role of theory (e.g. Breakwell, Hammond & Fife-Shaw, 1995, p. 5; Bynner & Stribley, 1978, pp. 4-9; Erlandson et al., 1993, pp. 16, 50; Kerlinger, 1986, p. 9; Labovitz & Hagedorn, 1976, pp. 14-18; Leedy, 1989, p. 7; Mason & Bramble, 1978, p. 53; McNeill, 1985, p. 176; Mouton & Marais, 1990, p. 143; McMillan & Schumacher, 1997, p. 8; Neuman, 1994, pp. 41-43; Sekaran, 1992, p. 20; Singleton et al., 1993, p. 23; Riley, 1963, p. 9; Terre Blanche & Durrheim, 1999, p. 404).
A third reason is that some researchers and their advisors have made a virtue of necessity. They confine themselves to empirical methods because theoretical methods have not yet been seen to succeed.
“A great many parts of physics are tied together with a strong interconnecting network of fundamental physical theory from which all other parts can be derived, so-called first principles. On the other hand we have fields ... where empiricism is the order of the day simply because there is no generally valid group of first principles from which to operate.” (Siever, 1970, pp. 23-24).
“In the early periods of developing a discipline from an applied field, initial efforts are usually directed more toward establishing empirical facts. Later, facts from separate studies can be synthesized and ultimately integrated into theories.” (McMillan & Schumacher, 1997, p. 22).
This explains why some authors, even though they are aware of deductive explanatory theory, nevertheless restrict their textbooks mainly or exclusively to empirical methods (e.g. Babbie, 1989, p. 46; Heiman, 1995, p. 17; McBurney, 1994, p. 42; Miller, D. C., 1970, p. 9; Phillips, 1985, p. 11; Rosnow & Rosenthal, 1996, p. 39; Singleton et al., 1993, p. 23; Williamson et al., 1982, p. 23).
Nevertheless, whatever the reason for conforming to the positivist-empiricist doctrine, that doctrine is absurd, and can therefore be harmful. It is likely to be harmful in at least two ways. First, if researchers are restricted to inductive empirical methods, they would be unable to produce deductive explanatory theories: so positivist empiricism would paralyse theoretical research. Second, positivist empiricism is likely to affect the methodological selection mechanism. In the many academic disciplines that are dominated by positivist empiricism, researchers will tend to reject any Research Methodology textbook that fails to conform to the dominant fashion. So the authors will be motivated to conform too. The conformist textbooks would then reinforce the dominant fashion among research students.
Several other authors have expressed additional concerns, shown in Table 6.
Research cultism. A research cult can form if a research fashion becomes self-perpetuating. Klein & Lyttinen (1985) have explained how this can happen. Suppose a specific set of research processes and knowledge products has become fashionable in a particular discipline. Then research supervisors who adopt that fashion will insist that their students use the fashionable research processes to produce the fashionable knowledge products. When these students in turn become supervisors, they too will insist that their students use the fashionable processes to produce the fashionable products. And so on.
Cults are likely to form around positivist-empiricism:
“All those who do not abide by the precepts of empiricism are thus threatened with excommunication from the bosom of science” (Doyal & Harris, 1986, p. 1).
Cults are particularly likely to form in academic disciplines that have not yet secured scientific status, which include the social ‘sciences’ and most other disciplines except the natural sciences.
“One of the livelier academic debates of recent years has concerned the scientific status of those disciplines gathered under the heading social sciences ... Academicians have disagreed about calling these disciplines sciences” (Babbie, 1989, p. 30).
In these disciplines, an additional mechanism of cult formation is present. The leaders of these disciplines aspire to scientific status, but have not studied Philosophy of Science. So they are easily trapped by the positivist or empiricist doctrines. When that happens, it affects hiring and promotions, the funding of research projects, and the publication of research papers. Accordingly, subordinate researchers are obliged either to accept those doctrines too, or else give up a research career. So a vicious circle forms, and those doctrines can dominate the entire discipline within a few decades.
Positivist-empiricist cultism would intensify the harmful effects identified earlier. These cults would be dangerously restrictive, confining researchers to a narrow range of processes and products (Bauer, 1992, p. 75; Harre, 1981; Lovelock, 1995, p. xvii; Whitley, 1984, p. 146). In particular, they would paralyse theoretical research, and that would – ironically – prevent an academic discipline from achieving scientific status. The failure to achieve scientific status may then step up the vicious circle. Academic leaders would attribute the failure to an insufficiency of empirical research, and increase their efforts to eliminate non-empirical research. As a result, empiricist cultism can retard progress for decades – as happened in Psychology sixty years ago, and as may be happening in Information Systems today.
Positivist and empiricist cultism are also likely to affect the methodological selection mechanism. For instance, if an academic discipline is dominated by an empiricist cult, then researchers would be strongly discouraged from doing any non-empirical work, and would avoid buying any Research Methodology books that do not conform to the ruling empiricist doctrine. So when methodologists write methodology textbooks, or teach methodology courses, they would be totally rejected unless they conformed to the empiricist mould. Therefore, if many academic disciplines are dominated by empiricist cults, the demand for non-empiricist Research Methodology books would shrink to extinction, and publishers would avoid such books. This would account for the flood of empiricist textbooks of Research Methodology reaching the bookshops today.
In summary, then, the emerging discipline of Research Methodology is facing some very serious problems. Methodologists may be inhibited by research fashions. They are probably inhibited by the empiricist fashion (or even cult), which is likely to paralyse theoretical research. As a result, methodologists have probably missed many useful research processes and knowledge products.
Content analysis. This suspicion was confirmed by analysing a sample of Research Methodology textbooks whose authors appear to have minimal bias towards positivist empiricism. Table 7 identifies the actual research processes mentioned in these textbooks, and lists them in empirical and non-empirical columns.
Note: The name ‘grounded theory’ does not actually refer to a deductive explanatory theory, but rather to a group of related empirical propositions. A more realistic name would be ‘empirical model’.
Table 7 has two implications. First, the non-empirical column has relatively few entries. So even though the textbooks were selected for minimal bias, each one actually has a very strong positivist/empiricist bias. Generalising from the sample, one would therefore expect that most textbooks of Research Methodology are biased towards positivist empiricism.
Second, few such books provide any coverage of several important research processes that have been mentioned by philosophers and historians of science. For instance:
Therefore methodologists have indeed omitted many useful research processes and knowledge products. Consequently:
The impoverished empiricist textbooks of Research Methodology cannot qualify as comprehensive sources of research processes and knowledge products.
Researchers who rely exclusively on those books are likely to find themselves in a methodological rut.
Academic research programs that refer students exclusively to those books may lead the next generation of researchers into a methodological rut.
The prevalence of empiricism casts serious doubt on the methodological recommendations of History of Science, and especially of Research Methodology. So there is a need for corrective action.
When people seek practical advice on how to do research, they are more likely to select textbooks entitled ‘Research Methods’ than textbooks entitled ‘Philosophy of Science’ or ‘History of Science’. They would be justified in doing so: firstly, the title ‘Research Methods’ is more obviously relevant to the work of researchers; and secondly, authors in Philosophy of Science and History of Science rarely aim to provide practical advice on how to do research. So the corrective action should mainly be aimed at the emerging discipline of Research Methodology.
Authors in this discipline could do several things.
The previous sections have shown that the empiricist doctrine is dangerous, because it impoverishes research with its absurd claim that researchers should use empirical methods exclusively. In particular:
Further work is necessary to assess the extent to which empiricism has entrenched itself in the Information Systems discipline – and still remains entrenched – as well as in the other disciplines under the umbrella of Informing Science, and to assess the extent to which it has impoverished previous research and is impoverishing current research. The present author’s personal experience suggests that empiricism has strongly influenced the Information Systems discipline, although its influence is beginning to decline. So there is hope for the future. In particular, someone may soon break out of the inductive empiricist mould and devise a deductive theory of IS that explains the many empirical findings of the previous decades.
Jens Mende has a bachelor’s degree in applied mathematics, a diploma in computer science, a master’s in management, and fifteen years’ practical experience, mainly in information system analysis, design and programming. He has spent the last twenty-five years at the University of the Witwatersrand, teaching programming, system design, IS management and report writing. He has published three dozen papers on computer education, system design and IS management, and has written another two dozen on logic, report writing, research method and evolution everywhere.
Jens passed away in 2007 and this resource is based on his original manuscript.
There is an important concept in KM which suggests that all knowledge is created by asking questions; the question is therefore a basic tool of KM. The question here is: what is the smartest question that you can ask? Here are some possible answers:
Increasingly I'm starting to see software titles for corporates that represent themselves as 'Knowledge Management Systems Software', or words combining these magic words in some order. The idea is a clear and seductive one: you can somehow take what is in the heads of your people and make it magically available to future generations via some computer-based information technology. With all the features and marketing hype around such software, it is important to remember some fundamentals:
The technologies and standards used to store electronic information have not been around for very long, and there is no guarantee that you will be able to meaningfully access knowledge stored in a piece of software in years to come, because hardware and software standards change constantly. I found this out when I was trying to retrieve my old lecture notes from Wits that I had stored on a 360KB floppy: although I still had the technology to read the disk, it was totally blank after 15 years of storage.
We are making enormous assumptions about our society and infrastructure if we assume we will always be able to access systems – sometimes on the other side of the planet – instantly. What happens if suddenly you can't access your laptop? If the internet is down? What happens if you don't have hardware or electricity? What do we have to fall back on?