Knowledge and Data Sharing

Split off from original Information management page, which covers data protection, staff handbooks etc.

Information Resources

Note: Freedom of Information (in England and Wales, and in Scotland with some differences) is a legal right to request access to all types of “recorded” information held by public bodies. Organisations providing public services under contract might be caught indirectly, in that the contracting body could impose additional requirements to enable it to meet FoI requests, but otherwise they are unlikely to be covered. See Lobbying pages for FoI links.

Education and Training for Information Work in the Voluntary Sector is a research report produced by Leeds Metropolitan University (now Leeds Beckett) in early 1999. An executive summary may still be somewhere on their website.

Aslib Journal of Information Management.

Chartered Institute of Library and Information Professionals. 7 Ridgmount St, London, WC1E 7AE, phone 020 7255 0500, email: info@cilip.org.uk. They also have an Information Literacy group which may be worth a look.

Knowledge Management

Not to be confused with information management, though the two can be closely connected. Some see it as part of de-skilling and job reductions: extracting and exploiting the knowledge individuals hold about their work in the round. But, to quote from elsewhere on VR: “How good is an office manual if it doesn’t include some element of extracting and collating knowledge otherwise locked up in an individual?”

So what is it? There are varying views about what KM involves – see quotes below. Distinctions are made between Explicit knowledge (recorded) and Tacit knowledge (personal know-how); a defined body of information as distinct from a person’s state of being in respect to that body. Data, information, knowledge, understanding and wisdom are all separate terms with different meanings, although writers (and organisations) often muddy these.

There is very little material on the web explicitly aimed at the voluntary sector. We would be delighted to hear of any.

  • Knowledge management in development matters – a site connected with a “community of international development practitioners who are interested in knowledge management and knowledge sharing issues and approaches”. See also the Knowledge Management for Development Journal.
  • Oxfam Canada’s Sharing Knowledge Handbook. This is written for those “working in villages, towns and rural areas who wish to transform their communities through information sharing”, presumably from a developing-world perspective.
  • More IM than KM: Development Informatics working papers from Institute for Development Policy and Management.

It is probably in what KM is applied to, rather than how, that the voluntary sector differs. So the following links (many quite old, but they should still work) could be useful.

  • A Delightful Dozen Principles of Knowledge Management (pdf), an excerpt from Verna Allee, is a good discussion tool.
  • Inside Knowledge magazine.

FreePint, the newsletter for information professionals, had an article on Knowledge management for development: an international organisation’s perspective, November 2005.

Fostering the Collaborative Creation of Knowledge: A White Paper from IBM Research gives some background on managing information in a holistic way (or as they say, an ecological view). We can’t find the paper on the site any longer!

But can knowledge be managed, as individuals have different ‘knowledge bases’? See The Nonsense of ‘Knowledge Management’.



Some quotes

Peter Honey quoting Prof Susan Greenfield (name dropper!)

‘information is just facts which on their own are not at all interesting. Knowledge occurs when disparate facts are linked and turned into ideas.’ (Training Journal, June 2000)

From VNU’s Knowledge Management White Paper:

“What managing knowledge as a resource means in practice actually spans a continuum from generating efficiency to fostering innovation.”

Simon Kent, of Knowledge Management Software in Computer Weekly (June 01):

“Knowledge … is information’s evolutionary descendant, transcending primitive emphases on hardware, bandwidth and Java compatibility with something much more powerful and sophisticated: individual and collective experience that can be leveraged to benefit virtually any activity.”

From US government’s KM web site:

“Essentially, knowledge management is at the intersection of culture, philosophy, and technology connecting people, communities and ideas for action.”

Knowledge Praxis quotes from Karl-Erik Sveiby’s posting to the Knowledge Management Forum, identifying two “tracks” of knowledge management:

  • Management of Information. To researchers in this track, according to Sveiby, “… knowledge = objects that can be identified and handled in information systems” [a mechanistic or object approach].
  • Management of People. For researchers and practitioners in this field, knowledge consists of “… processes, a complex set of dynamic skills, know-how, etc., that is constantly changing” [a cultural or process approach].
  • [To which they add a Systematic approach, which combines and adds to the other two.]

From Larry Prusak, director of the IBM Institute for Knowledge Management, as interviewed for ebusinessforum, Oct 00:

Key steps in instituting a knowledge-management programme: “A little strategy goes a long way. There are 4 simple steps: What knowledge do you want to work with? Where is it? What do you want to do to it? And to what end: what would you gain if you did this?” … “You could do it in a day or two.”

“Hierarchy is a distortion of knowledge … (it) is a 19th century concept.”

Designing a knowledge-management system: “You’re better off enacting one than designing one. Letting the people who work in these organisations enact it, and give them loose advice.”

From Michael Schrage, writing in Fortune magazine:

“an objective review would confirm that most firms grossly overinvest in technologies that let people see what’s going on and dramatically underinvest in delegation and true empowerment … knowledge confirms the absence of meaningful power.”

In conclusion after discussing how efficient technology networks can lead to poor data due to ‘selfish’ practice by staff, managers or customers: “business reality dictates that organizations that commit to strategic networking must invest as much effort in designing the incentives for honest disclosure as they do in designing the technical infrastructure itself.”

Open data and data sharing

See Sector Development, Statistics

  • New Philanthropy Capital has been working on data sharing across sub-sectors (not just about funding).
  • The Global Value Exchange, previously WikiVOIS, is an open source database for individuals and organisations who are trying to account for and measure the social or environmental value that their activities create: http://www.globalvalueexchange.org.
  • Data Unity is an open source web tool which lets you explore and visualise data, and then share discoveries with others: http://www.dataunity.org/.
  • Digital Impact (was Markets for Good) “is an effort by the Bill & Melinda Gates Foundation, the William & Flora Hewlett Foundation, and the progressive financial firm Liquidnet to improve the system for generating, sharing, and acting upon data and information in the social sector”.

Avoid These Ten Benchmarking Mistakes

Article written by Anne Evans, Benchmarking Link-up Australia

Benchmarking has become embedded in most organisations as part of the way they stay competitive. But there are lots of opportunities for benchmarking to go wrong. Here are some of the most common mistakes organisations make when benchmarking, and how you can avoid them.

Mistake #1. Confusing benchmarking with participating in a survey.

A survey of organisations in a similar industry to yours is not really benchmarking, whatever it may be called. Such a survey will give you some interesting numbers, but benchmarking is the process of finding out what is behind the numbers. In other words, a benchmarking survey may tell you where you rank, but it won’t help you improve your position.

Mistake #2. Thinking there are pre-existing “benchmarks” to be found.

Just because some survey or study says that a cost of $2.35 is the “benchmark” cost of a particular transaction, does not mean that you must perform that transaction for that price. The so-called “benchmark” may simply not be applicable to your markets, customers or resource levels. Insist on identifying your own benchmarking partners and finding out from them what is achievable, and then whether you can achieve a similar level of performance.

Mistake #3. Forgetting about service delivery and customer satisfaction.

Benchmarking stories abound of organisations that have become so fixated on the cost of providing their product or service that they have failed to take the customer into account. Paring down costs often results in poorer service delivery, so customers go elsewhere and ultimately you don’t have a business. Take a “balanced scorecard” approach when developing your benchmarking metrics.

Mistake #4. The process is too large and complex to be manageable.

A process is a group of tasks. A system is a group of processes. Avoid trying to benchmark a total system – it will be extremely costly, take ages, and make it difficult to remain focused. Better to select one or several processes that form part of the total system, work with those initially, and then move on to the next part of the system.

Mistake #5. Confusing benchmarking with research.

Benchmarking presupposes that you are working on an existing process that has been in operation long enough to have some data about its effectiveness and its resource costs. Commencing a new process, such as developing a new employee handbook by collecting other people’s handbooks and taking ideas from them, is research, not benchmarking.

Mistake #6. Misalignment.

Choosing a benchmarking topic that is not aligned with the overall strategy and goals of the business – or worse, cuts across some other initiative the organisation is already taking. A Lead Team at the strategic level needs to oversee the benchmarking project and make sure that it is in line with what is happening in the business as a whole.

Mistake #7. Picking a topic that is too intangible and difficult to measure.

“Employee communication” is probably the most slippery concept that exists in an organisation, but it is often cited as one of the worst problems, so many organisations try to benchmark it. Encourage your benchmarking team to select instead a part of the topic that can be observed and measured; for instance, the process of distributing memos around the organisation.

Mistake #8. Not establishing the baseline.

Going out to make benchmarking visits before you have analysed your own process thoroughly. Benchmarking assumes that you already know your own process and its level of performance thoroughly. After all, that information is what you have to offer to your benchmarking partners in exchange for the information you are seeking from them. Make sure your benchmarking team is very clear about what it wants to learn before you approach potential benchmarking partners.

Mistake #9. Not researching benchmarking partners thoroughly.

This is essential in selecting the right benchmarking partners, so you don’t waste their time or yours. There is a rule of benchmarking etiquette that says you should never ask a benchmarking partner a question that you should have been able to answer for yourself through researching the literature in the public domain.

Mistake #10. Not having a code of ethics and contract agreed with partners.

Your partners should be clear about what you are seeking to learn from them, how that information will be treated, who will have access to it and for what purposes it will be used. Ideally, this should be formally agreed. The benchmarking code of practice offered by the American Productivity and Quality Centre provides a useful model.


© Anne Evans, 1997. Reproduced with permission of the publisher. Address enquiries to Benchmarking Link-Up Australia, 76 Garton Street, North Carlton, Victoria 3054, Australia. Tel: (+61 3) 9380 5878, fax: (+61 3) 9387 4526. E-mail: benchmrk@ozemail.com.au

The original text is one of a number to be found at Benchmarking Link-Up Australia’s website – which seems to have moved, so, sorry, we can’t give you a link!

Planning, Evaluation, Quality

Outcomes management, Benchmarking, Quality Standards, Social Impact, further concepts and resources.

Introduction

This is not just something you get consultants or professionals to do! There is an increasing trend towards evaluation, quality management, strategic planning etc. Performance measurement (and reporting) is increasingly a funding requirement, and should be integrated into a performance improvement (or maintenance) strategy as far as possible.

Some people think all this gets in the way of getting on and doing the work, which can be true, but don’t you want to be sure you are actually doing useful stuff effectively, and getting the most out of your limited resources? Below we give some basics on the jargon and ideas which consultants use – while they can often bring in a wider view and feed in other organisations’ experiences, their main benefit is often forcing you to step back from the daily grind and take stock. Note that some jargon can be used differently depending on people’s background and experience – stopping to agree meanings at the start of an exercise can be worthwhile, if tedious!

You can do this yourselves if you are serious. Just make sure that the time, money, formality, expertise etc. that you use are appropriate for the size and complexity of the organisation (or unit) and the issues you wish to tackle. This page should be a starting point in deciding how to do this.

Planning Tips

Where should you be concentrating your energies? Assess what is important in your operations, and assess their performance (in your terms). Fit these into the matrix below (adapted by OUBS course B752 from Slack), and take appropriate action. For example, if you are good at activities which are of only low importance (to your goals), you may be able to release this ‘excessive’ attention and use it to take urgent action on bad performers, which will have greater impact overall. A minimal sketch of the matrix logic follows the diagram below.

[Figure: performance v importance matrix]
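
To make the quadrant logic concrete, here is a minimal sketch in Python. The 1–5 rating scales, the thresholds and the quadrant wording are our own illustrative assumptions, not part of the OUBS/Slack material:

```python
# Illustrative sketch only: the 1-5 scales, thresholds and wording are
# assumptions for illustration, not taken from the OUBS/Slack material.

def suggested_action(importance: int, performance: int) -> str:
    """Map an activity's importance and performance ratings (each 1-5)
    to a suggested priority for attention."""
    important = importance >= 3
    performing = performance >= 3
    if important and not performing:
        return "urgent action: weak performance where it matters most"
    if important and performing:
        return "maintain: keep up the good work"
    if not important and performing:
        return "possible excess: consider releasing effort for use elsewhere"
    return "low priority: monitor, but don't over-invest"

activities = {
    "newsletter production": (2, 5),   # low importance, strong performance
    "volunteer recruitment": (5, 2),   # high importance, weak performance
}
for name, (imp, perf) in activities.items():
    print(f"{name}: {suggested_action(imp, perf)}")
```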

SWOT One of the classic approaches to where you are and where you might go. Do a matrix examining the organisation’s Strengths and Weaknesses, Opportunities and Threats. Concentrate on exploiting the S and O, reducing or avoiding the W and T. STEP analysis is used alongside this – look at the various factors impacting on the organisation – Sociological, Technological, Economic, Political. Sometimes Environment and Values are added to this collection. We’ve also seen PESTLE with the L for Legal.

Mission, Goals, Objectives, Targets Terminology here can be particularly problematic – Vision, values, aims, activities could equally well be used. Basically you need some structure that goes from the long-term Broad Purpose (why) through medium-term ‘what can we achieve’ to the immediate (within next year) and specific hows which can be costed out in detail. A ‘pyramid of purpose’ is illustrated below (bottom level of targets left off as needs better drawing software!).

[Figure: mission, aims, objectives pyramid]

More on Mission on the Organisation Management page.

SMART Are your proposals (objectives, targets) Specific, Measurable, Achievable, Relevant and Timed? (We have also seen items 3 and 4 stated as Actionable and Realistic.) If they can’t be phrased in this way, maybe they are operating principles – e.g. ‘working in partnership’ is an underlying approach rather than an activity in its own right.

Three dimensions Activities, Resources and Goals/Objectives all inter-relate. While you can look at one of these areas at a time, you must then see what effect any changes or developments in one will have on the others and re-visit your original ideas if this throws up problems or further issues.

Another three dimensions

[Figure: marketing, operations, finance]

Adapted from the Open University B752 ‘competitive performance management’ summary diagram. These three functional areas have to work together in achieving organisational objectives. The precise split and labels may differ (e.g. marketing may be called fundraising or public relations and, rather than determining the mix of products and services, will be responsible for ‘selling’ the desired mix to potential funders), but the necessity for a balanced approach remains.

Monitoring and Quality

Measuring a project’s success matters not only to satisfy funders and trustees, but also to address shortcomings and make changes as part of a learning process. Look not just at inputs (resources used, such as money and volunteer time) but at outputs (e.g. level of activities), outcomes (what actually happened or changed as a result) and even impact (movement towards achieving the mission). Also consider what happened along the way – personnel, finances, unexpected spin-offs, a changing environment etc. – how this can be used for improvement (learning from failure), and its influence on priorities.
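
One way to keep these distinctions from muddying each other is simply to record them separately. A minimal sketch, with invented field names and figures purely for illustration:

```python
from dataclasses import dataclass

# Illustrative only: the field names and example entries are invented to
# show the inputs -> outputs -> outcomes -> impact chain for one project.

@dataclass
class ProjectRecord:
    inputs: dict     # resources used (money, volunteer time...)
    outputs: dict    # level of activities delivered
    outcomes: list   # what actually happened/changed as a result
    impact: str      # movement towards achieving the mission

record = ProjectRecord(
    inputs={"grant (£)": 10_000, "volunteer hours": 400},
    outputs={"advice sessions held": 120},
    outcomes=["80 clients report improved ability to manage debt"],
    impact="contributes to reducing local financial exclusion",
)

# A simple derived measure: cost per unit of output.
cost_per_session = record.inputs["grant (£)"] / record.outputs["advice sessions held"]
print(f"Cost per advice session: £{cost_per_session:.2f}")
```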

Approaches and standards

  • PQASSO – a practical quality assurance system for small organisations – is now a nationally recognised standard used by many local authorities as the recommended quality system for funded organisations, and is being used as a working model by various national voluntary organisations, e.g. NCH Action for Children have adopted it for their projects. NCVO CES (see next section) and others do training in how to use it.
  • Quality First, published by Birmingham Voluntary Service Council, is a quality system for organisations with no paid or only part-time staff. It is more of an ongoing process than a standard of attainment, looking at nine quality areas, and can form the basis for moving on to more advanced systems. It was developed by Tony Farley, original author of PQASSO. The workbook is £25 incl p&p (free to Birmingham groups) from BVSC, phone 0121 678 8808.
  • Proving and Improving: a quality and impact toolkit for social enterprise has been produced by New Economics Foundation Consulting. Three (downloadable) toolkit books plus an overview chart outline the basics of measuring organisational impact and the pros and cons of over twenty quality measures, plus practical tools and exercises.
  • The European Foundation for Quality Management has an ‘Excellence Model’ designed to be applicable to all organisations. Worth a look if you are interested in this subject. The British Quality Foundation has various publications relating to this model. See Resources: Publications below too.
  • ISO 9000 (was BS 5750), administered by the British Standards Institution, is a ‘systems’ quality standard – it doesn’t guarantee anything about what you produce, other than consistency of the process. Registering to this standard is quite complex and bureaucratic, and unlikely to be the best route forward for small to medium voluntary organisations (in our opinion).
  • See People Management for Investors in People standard. It can now cover organisations with no paid staff, but isn’t cheap to pursue.
  • Charter Mark was a government award scheme for recognising and encouraging excellence in (public) service delivery, extended to voluntary organisations receiving at least 10% public funding (as at April 2000). The following info is out of date, but may still be of help. There were 10 criteria:
    1. Set standards
    2. Be open and provide full information
    3. Consult and involve
    4. Encourage access and the promotion of choice
    5. Treat all fairly
    6. Put things right when they go wrong
    7. Use resources effectively
    8. Innovate and improve
    9. Work with other providers
    10. Provide user satisfaction

    The Chartermark website now forwards to Customer Service Excellence.

  • Sustainability: Integrated Guidelines for Management (SIGMA) may be of interest – involves BSI and others. A ‘work in progress’ rather than a finished standard.

Evaluation support

Also see Further Resources below.

Specific areas

  • Homeless Link – impact resources “a one-stop resource for homelessness agencies who are interested in taking an outcomes approach to their work”.
  • Monitoring and Evaluation News, a news service focusing on developments in monitoring and evaluation methods relevant to development programmes with social development objectives.


Social return, impact

Social Return on Investment (SROI) is a methodology for measuring the social value created by an organisation or project. It is based on 7 core principles designed to ensure the process is orientated around stakeholder involvement and the measurement of outcomes, positive or negative, experienced by stakeholders as a result of the organisation’s or project’s activities.
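
The analysis typically ends in a ratio of the present value of the valued outcomes to the value of the investment. A minimal sketch of that final arithmetic, with invented figures and an assumed discount rate:

```python
# Minimal sketch of the final SROI arithmetic. All figures and the 3.5%
# discount rate are invented for illustration; a real SROI analysis also
# involves valuing outcomes with stakeholders and adjusting for deadweight,
# attribution, drop-off etc.

def present_value(yearly_outcome_values, discount_rate=0.035):
    """Discount a series of yearly outcome values back to today."""
    return sum(value / (1 + discount_rate) ** (year + 1)
               for year, value in enumerate(yearly_outcome_values))

investment = 50_000                        # value of inputs (£)
outcome_values = [20_000, 20_000, 20_000]  # valued outcomes over 3 years (£)

ratio = present_value(outcome_values) / investment
print(f"SROI ratio: {ratio:.2f}:1")  # i.e. social value created per £1 invested
```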

Social Value UK (previously SROI Network) has various resources, as well as webinars, training courses and membership.

Measuring social impact Impact reporting and related topics have had increasing attention (as at 2014), sometimes connected with the idea of social investment. New Philanthropy Capital has been doing work on this, from both funder and charity perspectives.

New Economics Foundation has a publication Guide to Social Return on Investment.

Also see: Environmental and social impact page.

Benchmarking

Also see data sharing on Knowledge management page.

Benchmarking is a buzzword which can be taken in a number of ways. Basically the idea is to find a comparator to ‘benchmark’ your operations against – usually the aim is to find ‘best practice’, identify the gaps between you and them, and work to improve. In the voluntary sector it isn’t easy to find the partners or agree on performance indicators, and some would say the value is in looking further afield – if you want to benchmark a phone counselling service, say, why not compare with a call centre handling customer complaints?

As Benchmarking Plus (Australia) says, ‘a survey may tell you where you rank, but it won’t help you improve your position’ (but it might prompt you to ask some questions). We reproduce a page of theirs giving ten common benchmarking mistakes. Unfortunately their website no longer contains any useful resources (autumn 2011).

We have extracts from a survey investigating the need for and practicalities of a benchmarking club for UK charities (on the old site) (Open University, 1997).

Benchmarking initiatives

  • The annual Charity Finance magazine Charity Shops Survey (published July?) provides some ratings in that area to measure performance against.
  • Charity Finance Group may have details on benchmarking exercises for the finance function.
  • Eighteen major British voluntary organisations have formed an HR Benchmarking Club. People Count: Benchmarks for the Human Resources Function in Voluntary Organisations, a publication from Compass Partnership (June 00), gives information on over 165 aspects of personnel management from this club.

Further Resources

See Knowledge management page.

There are some good training courses from DSC, plus NCVO CES mentioned above.

Publications

A good introductory book is Complete Guide to Business and Strategic Planning by Alan Lawrie (DSC).

International Development Research Centre has made its publication Organizational Assessment: A Framework for Improving Performance available online. Not sector specific but does go beyond the commercial.

On the web, IT

Strategic Planning Society is probably only really useful for the big voluntary bodies (they seem to bracket the voluntary sector with the public one), but you may be able to pick up some ideas from their web site. The Voluntary Sector Special Interest Group appears to have disappeared. 17 Portland Place, London, W1N 3AF, phone 020 7636 7737, email: enquiry@sps.org.uk

Here are initial findings on useful software:

  • MS Project is the big one to plot out your implementation of a project, allocate tasks, monitor progress, slippage and the effects etc.
  • To help you brainstorm etc., look at Inspiration. A trial version can be downloaded from the web and is available for Macs and PCs (using Windows) – we haven’t checked out the current version 6. ‘…A powerful but easy-to-use visual thinking and learning tool that helps you brainstorm ideas, organize thinking, develop concepts and plan. Use its Diagram view to create concept maps, webs, diagrams, knowledge maps etc. or its Outline view to prioritize and rearrange ideas, leading to clear, concise writing.’ It looks good and our limited testing of the demo was positive. UK distributors are PMI, phone 024 7641 9089, email: inspiration@pmi.co.uk, cost £90 plus VAT (July 99).

Non-Profit Evaluation – international resources

The following, mainly American, web sites seem promising, if you have the time – we have only done basic checking on them. All are non-profit specific.

  • Center for Excellence in Nonprofits – programs in leadership development, systemic change, continuous improvement and best practices; a Silicon Valley learning community of non-profits.
  • Kellogg Foundation. Their Evaluation Handbook (in pdf format) is designed for their ‘grantees’ but undoubtedly of interest to others. A participatory, multidisciplinary process, not just about outcomes but also building capacity. In the authors’ view, “Project evaluation should not be conducted simply to prove that a project worked, but also to improve the way it works.”
  • An online evaluation tool which “takes about 10 minutes to complete if you are very familiar with your organization’s operation”, according to a posting on Digital Divide forum, is Innovation Network‘s Workstation 2.0 – this has moved, spring 05, but the site is still worth a check.