Background
Methods
Qualitative data collection
Quantitative data collection
| Characteristic | Value |
|---|---|
| Caregiver age in years, mean (SD) [range: min.–max.] | 63.9 (± 12.9) [24.0–93.0] |
| Caregiver gender (valid percentage, n = 555) | |
| Female | 75.0 % (416) |
| Male | 25.0 % (139) |
| Relationship with PwD (valid percentage, n = 559)ᵃ | |
| Spouse/partner | 50.1 % (280) |
| Child | 36.8 % (206) |
| Child-in-law | 3.8 % (21) |
| Other | 9.3 % (52) |
| Person with dementia age in years, mean (SD) [range: min.–max.] | 79.7 (± 8.4) [44.0–103.0] |
Qualitative data analysis
Mixed-methods data analysis
| Data label (cut-off scores) | KM area (based on Probst [8]) | DCN group (persons/organizations) | Material proof (+ → formalized / − → non-formalized) | Result (+ = 1 / − = 2) |
|---|---|---|---|---|
| 1.0–1.49: highly formalized knowledge management; 1.50–2.0: less formalized knowledge management | Knowledge aims/identification | Internal stakeholders | E.g., mission statements (→ +) or no formalization (→ −) | 1 or 2 |
| | Knowledge development/acquisition | Internal stakeholders | E.g., journal clubs (→ +) or no formalization (→ −) | 1 or 2 |
| | | External stakeholders | E.g., conferences (→ +) or no formalization (→ −) | 1 or 2 |
| | Knowledge distribution | Internal stakeholders | E.g., IT portals (→ +) or no formalization (→ −) | 1 or 2 |
| | | External stakeholders | E.g., informative materials (→ +) or no formalization (→ −) | 1 or 2 |
| | | Users | E.g., press work (→ +) or no formalization (→ −) | 1 or 2 |
| | Knowledge use | Internal stakeholders | E.g., guidelines (→ +) or no formalization (→ −) | 1 or 2 |
| | Knowledge evaluation | Internal stakeholders | E.g., quality circles (→ +) or no formalization (→ −) | 1 or 2 |
| | | External stakeholders | E.g., research institutes (→ +) or no formalization (→ −) | 1 or 2 |
| | | Users | E.g., feedback surveys (→ +) or no formalization (→ −) | 1 or 2 |
| | Knowledge storage | Internal stakeholders | E.g., IT libraries (→ +) or no formalization (→ −) | 1 or 2 |
| End result | | | | x/11 = 1.0–2.0 |

The 11 area results (1 or 2 each) are summed (x) and divided by 11, yielding a score between 1.0 and 2.0.
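The scoring rule in the table above can be written out directly. This is a minimal sketch, assuming the 11 area/stakeholder ratings are available as a list of 1s and 2s; the function and variable names are illustrative, not from the study:

```python
def km_score(area_results):
    """Mean of the 11 KM area results: each area is rated 1 (+, formalized)
    or 2 (-, not formalized), so the score falls between 1.0 and 2.0."""
    if len(area_results) != 11 or any(r not in (1, 2) for r in area_results):
        raise ValueError("expected 11 ratings, each 1 or 2")
    return sum(area_results) / 11

def km_label(score):
    """Cut-offs from the coding scheme: 1.0-1.49 vs. 1.50-2.0."""
    return "highly formalized" if score < 1.50 else "less formalized"

ratings = [1, 1, 2, 1, 1, 2, 1, 1, 2, 2, 1]  # illustrative ratings only
print(km_label(km_score(ratings)))  # 15/11 = 1.36 -> highly formalized
```

A network with formalized material proof in most areas thus lands below the 1.50 cut-off and is coded as having highly formalized knowledge management.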
Results
Knowledge evaluation tools and processes used in the DCNs
| Target area | Number of DCNs with formalized structures | Global DCN structures (number of notes by internal stakeholders [one count per network]) | Processes/tools (number of notes by internal stakeholders [one count per network]) |
|---|---|---|---|
| Internal DCN evaluation (internal stakeholders) | 8/13 | Working groups (7/8) | Performed by: general DCN evaluation in protocolled working groups (5/7); evaluation of mission statement in quality circles (3/7); literature-based knowledge evaluation in journal clubs (1/7) |
| | | Feedback surveys (5/8) | Performed by: network evaluation inquiry (4/5); Delphi census (1/5) |
| | | QM systems (5/8) | Used tools: quality handbooks (4/5); KTQ (PDCA) (2/5); Balanced Scorecard (1/5) |
| Extraction of user feedback | 7/13 | IT systems (7/7) | Performed by: homepage contact forms (6/7); feedback hotline listed on homepage (1/7) |
| | | Case management (7/7) | Performed by: protocolled meetings between internal stakeholders and case managers (7/7); case protocols of DCN users/external stakeholders [e.g., general practitioners] (5/7) |
| | | Feedback surveys (5/7) | Used tools: printed seminar feedback inquiries (5/5); printed general feedback inquiries (3/5); telephone inquiries (1/5) |
| | | Conferences (4/7) | Performed by: informative events with external stakeholders (3/4); feedback forums between DCNs and users (2/4) |
| Externally performed evaluation | 4/13 | External research partners (4/4) | Performed by: universities (3/4); research institutes (1/4) |
| Information storage | 13/13 | Paper-based systems (13/13) | Used tools: file folders, general (13/13); dementia network libraries for network stakeholders (2/13); dementia network libraries for network users (1/13) |
| | | IT systems (4/13) | Used tools: internal literature databases (4/4); internal IT exchange forums (2/4) |
Barriers, facilitators and attitudes of internal DCN stakeholders toward knowledge evaluation
“We already use quality and knowledge evaluation tools in many areas of our network, and we wish to extend these processes to all fields. […] so that we get feedback: What suits and what does not.” (KR:EI-1617)
“We are very excited about the success of this forum (user feedback forum; see Table 3). Everybody can equally discuss and spread new ideas. This is a fantastic basis for the further development of our network based on user wishes but also in general.” (AA:GD-151)
“We (the stakeholders) are all using quality evaluation and feedback instruments (within their companies). We all know how they work, and we do it every day. We don’t need complex tools for knowledge evaluation in this network because we are all focused on direct and flexible communication.” (TK:EI-991)
“We would like to have clear instruments for that (knowledge evaluation), but we don’t have them. […] We simply had no resources in our volunteer-based network until now.” (UK:EI-421f.)
“We regard quality as providing opportunities for our network. Knowledge evaluation processes can improve our quality, but every new process for the systematic evaluation of our DCN work costs time, which is limited.” (PK:GD-479f.)
“We have nobody to develop this in our network. We’re just learning by trial and error.” (AR:EI-100)
“Something we have tried and already given up is assessing the satisfaction of our users through static questionnaires. This heterogeneous group of people with different opinions and needs related to multiple support areas of our network could not be assessed using one single quantitatively based instrument. This approach didn’t work.” (AR:GD-549)
“We use a standardized questionnaire developed by the Alzheimer Society to evaluate the training of our users. The results are always perfect (laughing). That’s why I think it’s not selective enough. Who says that the seminar was stupid? Nobody.” (AA:GD-209)
Correlation of KM in the DCNs with family caregivers’ knowledge of dementia support services (mixed-methods analysis)
| Instrument | Label (n) | % CR* HF*¹ (n) | % CR* LF*¹ (n) | p value, 95 % CI (χ²) | % CR* total (n) | % CR* compar.*² (n) |
|---|---|---|---|---|---|---|
| D-IVA (items 20.1 + 20.2) | 20.1 No need for dementia-specific information (558)ᵃ | 6.9 (18) | 5.7 (17) | 0.681 | 6.4 (35) | 2.4 (2) |
| | 20.2 Need for dementia-specific information but no knowledge of how to obtain it (563)ᵃ | 1.9 (5) | 5.0 (15) | 0.048 | 3.6 (20) | 10.9 (9) |
| Instrument | Label | Mean CR* HF*¹ [SD] (n) | Mean CR* LF*¹ [SD] (n) | p value, 95 % CI (U test) | Mean CR* total [SD] (n) | Mean CR* compar.*² [SD] (n) |
|---|---|---|---|---|---|---|
| D-IVA (item 21) | 21. Appraisal of how difficult it is for a family caregiver of a PwD to obtain an overview of the different types of dementia information and support services | 2.43 [1.12] (245) | 2.39 [1.17] (263) | 0.580 | 2.41 [0.67] (508)ᵃ | 2.29 [0.68] (72) |
| BIZA-D (item 4.13) | 4.13 Feeling hindered in obtaining information about support services for household care | 0.89 [1.02] (242) | 1.21 [1.27] (283) | 0.024 | 1.05 [1.18] (525)ᵃ | No comparison data |
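The group comparisons above rest on χ² tests (items 20.1/20.2) and Mann–Whitney U tests (items 21 and 4.13). A minimal pure-Python sketch of the 2×2 χ² test follows; the counts fed in are hypothetical, since the per-group denominators behind the reported percentages are not given in the table:

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square (df = 1, no continuity correction) for the
    2x2 table [[a, b], [c, d]], with its two-sided p value computed
    via the identity P(chi2_1 > x) = erfc(sqrt(x / 2))."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

# Hypothetical counts (NOT the study's raw data): caregivers giving the
# critical response vs. the rest, in highly vs. less formalized networks.
chi2, p = chi2_2x2(5, 255, 15, 285)
```

With these illustrative counts the statistic is about 3.83 and the p value falls near 0.05, the same order as the significant contrast reported for item 20.2.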