The use of parenteral systemic anticancer therapy (SACT) has led to improved cancer survival. A quality assurance (QA) system of the aseptic compounding process is necessary to ensure safe and consistent production of parenteral SACT. This scoping review identifies international evidence and practice relating to QA standards in the preparation of parenteral SACT in healthcare establishments.
Standards relating to aseptic compounding in hospital pharmacies and literature exploring the aseptic compounding of parenteral SACT were included. Literature relating to the non-aseptic compounding of medicines and records specific to sterile manufacturing in industrial settings were excluded. A search of several electronic databases, trial registries, the grey literature and websites of key European hospital pharmacy groups and accreditation bodies was conducted on 16 March 2022. A narrative discussion was performed by country, and content analysis of articles was conducted.
Thirty-seven records were included. The standards reviewed covered the work environment, the preparation process and the safety of workers who are potentially exposed to hazardous chemicals. Frequent audits to ensure adherence to standards were common practice, and some standards also recommended external inspections as an opportunity for further learning. Periodic reviews are encouraged so that standards remain relevant. The national standards of the countries reviewed were based on international standards, with minor adaptations for local conditions.
The main limitation of this review is its restriction to countries with a high human development index. The review shows that using an internationally recognised standard as the basis for national standards is best practice and will keep those standards relevant into the future.
Linezolid is the first oxazolidinone antimicrobial agent developed for treating multidrug-resistant Gram-positive bacterial infections. The study aimed to investigate the risk factors for linezolid (LI)-induced thrombocytopenia (LI-TP) and to develop and validate a risk prediction model to identify elderly patients at high risk of developing LI-TP during linezolid therapy.
A retrospective cohort study was performed at Zhongshan Hospital, Fudan University, China. The study involved elderly Chinese patients aged ≥65 years administered linezolid (600 mg) twice daily between January 2015 and April 2021. We collected the patients’ clinical characteristics and demographic data from electronic medical records and compared LI-TP patients with those who did not develop thrombocytopenia (NO-TP) after linezolid treatment. The risk prediction model was developed from the regression coefficients generated by a logistic regression model.
A total of 343 inpatients were enrolled from January 2015 to August 2020 and were used as the training set. Among them, 67 (19.5%) developed LI-TP. Multivariate logistic regression analysis revealed that baseline platelet counts <150×10⁹/L (odds ratio (OR)=3.576; p<0.001), age ≥75 years (OR=2.258; p=0.009), estimated glomerular filtration rate (eGFR) <60 mL/min/1.73 m² (OR=2.553; p=0.002), duration of linezolid therapy ≥10 days (OR=3.218; p<0.001), intensive care unit (ICU) admission (OR=2.682; p=0.004) and concomitant piperacillin-tazobactam (OR=3.863; p=0.006) were independent risk factors for LI-TP in elderly patients. The LI-TP risk prediction model was established using a scoring method based on the regression coefficients and exhibited good discriminative power, with an area under the curve (AUC) of 0.795 (95% confidence interval (CI) 0.740 to 0.851) in the training set (n=343) and 0.849 (95% CI 0.760 to 0.939) in the validation set (n=90).
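As a rough illustration of how such a coefficient-based score could be applied, the sketch below sums weights derived from the odds ratios reported above (the natural logarithm of each OR, ie, the implied regression coefficient). The factor names and weighting scheme are assumptions for illustration only; the authors' actual point assignments and the cut-off defining "high risk" are not given in the abstract.

```python
import math

# ORs reported in the abstract for each independent risk factor
ODDS_RATIOS = {
    "baseline_platelets_lt_150": 3.576,
    "age_ge_75": 2.258,
    "egfr_lt_60": 2.553,
    "linezolid_ge_10_days": 3.218,
    "icu_admission": 2.682,
    "concomitant_piperacillin_tazobactam": 3.863,
}

# Illustrative weights: the natural log of each OR (the regression coefficient
# implied by the reported OR). Not the authors' published point system.
WEIGHTS = {factor: math.log(oratio) for factor, oratio in ODDS_RATIOS.items()}

def li_tp_risk_score(patient: dict) -> float:
    """Sum the weights of the risk factors present for one patient.

    `patient` maps each factor name to True/False.
    """
    return sum(WEIGHTS[factor] for factor, present in patient.items() if present)

# Example: a 78-year-old ICU patient with low baseline platelets
example = {
    "baseline_platelets_lt_150": True,
    "age_ge_75": True,
    "egfr_lt_60": False,
    "linezolid_ge_10_days": False,
    "icu_admission": True,
    "concomitant_piperacillin_tazobactam": False,
}
print(round(li_tp_risk_score(example), 2))
```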
These findings indicate that duration of linezolid therapy, age, eGFR, ICU admission, baseline platelet counts and concomitant piperacillin-tazobactam were significantly associated with LI-TP in elderly patients. A risk prediction model based on these risk factors showed good discriminative performance and may help clinicians identify patients at high risk of developing LI-TP.
Prescription errors can cause serious adverse drug events. Clinical decision support systems can prevent prescription errors; however, real-time clinical rules in obstetrics, gynaecology, and paediatric outpatient settings remain unexplored. We evaluated the effects of localised, real-time clinical rules on alert rates and acceptance rates compared with manual prescription review.
We developed real-time clinical rules that draw on hospital information systems to obtain patient characteristics and laboratory values. We conducted a retrospective cohort study to compare the alert and recommendation acceptance rates of all prescription error types before and after clinical rule implementation in obstetrics, gynaecology, and paediatrics. Clinical rules, prescription error types, and alerts were determined by a prescribing review committee comprising physicians, pharmacists, nurses, and administrators. The difference in alert and acceptance rates between the groups was analysed using relative risk.
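To make the idea of a localised, real-time clinical rule concrete, the following sketch shows two hypothetical rules of the kind described: a paediatric weight-based dose check and a renal-function check drawing on a laboratory value retrieved from the information system. The drug, thresholds and field names are illustrative assumptions; the study's actual rules were defined by its prescribing review committee and are not reproduced here.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Patient:
    age_years: float
    weight_kg: float
    egfr: float  # laboratory value retrieved from the hospital information system

@dataclass
class Prescription:
    drug: str
    dose_mg: float

def paediatric_dose_alert(patient: Patient, rx: Prescription,
                          max_mg_per_kg: float) -> Optional[str]:
    """Hypothetical rule: flag a paediatric dose exceeding a weight-based maximum."""
    if patient.age_years < 18 and rx.dose_mg > max_mg_per_kg * patient.weight_kg:
        return (f"ALERT: {rx.drug} {rx.dose_mg} mg exceeds "
                f"{max_mg_per_kg} mg/kg for a {patient.weight_kg} kg patient")
    return None

def renal_function_alert(patient: Patient, rx: Prescription,
                         min_egfr: float) -> Optional[str]:
    """Hypothetical rule: flag a drug when eGFR falls below an illustrative threshold."""
    if patient.egfr < min_egfr:
        return f"ALERT: review {rx.drug} dose, eGFR {patient.egfr} mL/min/1.73 m2"
    return None

# Example: a 6-year-old, 20 kg child prescribed 450 mg of a drug capped at 15 mg/kg
print(paediatric_dose_alert(Patient(6, 20, 110), Prescription("drug_x", 450), 15))
```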
The number of alerts increased after implementation of the clinical rules, while the number of on-duty pharmacists required for review decreased from 10 to 2. Compared with manual review, the alert rates for paediatrics and for obstetrics and gynaecology increased with the clinical rules by 3.97- and 11.26-fold, respectively, and the alert rates for drug–drug interactions (DDIs) and combined medication errors in obstetrics and gynaecology increased with the clinical rules by 26.10- and 26.54-fold, respectively. In paediatrics, the alert rate for all prescription error types was higher with the clinical rules review than with the manual review; the alert rates for DDI, dosage, and combination medication errors differed significantly between the clinical rules and the manual review. However, there was no difference in the recommendation acceptance rate between the manual review and the clinical rules.
Clinical rules can identify prescription errors that manual review cannot detect and ensure real-time review efficiency in high-volume outpatient prescription settings. The high acceptance rate and modification of prescriptions may be related to the highly customised and localised clinical rules.
Extravasation is a potential complication resulting from parenteral administration of drugs. The purpose of this study was to characterise the physicochemical properties of non-antineoplastic parenterally administered drugs and determine their potential to cause a toxic effect on tissue.
A list of drugs administered by intermittent or continuous intravenous (IV) infusion was prepared, and a database was established to collect information from the literature. Each active substance was classified according to its risk of causing tissue damage using the following criteria: (1) High risk: active substances with any of the following characteristics: osmolarity of the IV solution >500 mOsm/L; vasoconstriction; vesication; cellular toxicity; very common, common or uncommon adverse events such as phlebitis, necrosis or pain at the site of administration according to the Summary of Product Characteristics. (2) Moderate risk: active substances where the pH was <3 or >11 or where adverse events at the site of administration occurred rarely, very rarely or with unknown frequency. (3) Low risk: active substances where the osmolarity of the IV solution was <500 mOsm/L and the pH ranged between 3 and 11; these active substances did not cause vasoconstriction, were not classified as vesicant or cytotoxic, and were not associated with adverse events at the site of administration.
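The three-tier criteria above translate naturally into a simple decision rule. The sketch below is a minimal Python rendering of that classification, assuming hypothetical field names for the drug attributes; it is illustrative rather than the study's actual database logic.

```python
from dataclasses import dataclass

@dataclass
class DrugProfile:
    """Attributes used in the classification; field names are illustrative only."""
    osmolarity_mosm_l: float
    ph: float
    vasoconstrictor: bool = False
    vesicant: bool = False
    cytotoxic: bool = False
    # Frequency of local adverse events (phlebitis, necrosis, pain at the
    # administration site) from the Summary of Product Characteristics.
    local_ae_frequency: str = "none"  # e.g. "common", "rare", "unknown", "none"

def classify_extravasation_risk(d: DrugProfile) -> str:
    # High risk: any of the hazard criteria listed in the abstract
    if (d.osmolarity_mosm_l > 500 or d.vasoconstrictor or d.vesicant
            or d.cytotoxic
            or d.local_ae_frequency in {"very common", "common", "uncommon"}):
        return "high"
    # Moderate risk: extreme pH, or local adverse events reported only
    # rarely, very rarely or with unknown frequency
    if d.ph < 3 or d.ph > 11 or d.local_ae_frequency in {"rare", "very rare", "unknown"}:
        return "moderate"
    # Low risk: osmolarity <500 mOsm/L, pH 3-11 and no other hazard criteria
    return "low"

print(classify_extravasation_risk(DrugProfile(osmolarity_mosm_l=308, ph=6.5)))
```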
The risk classification list included 138 active substances, of which 86 were classified as ‘high risk’, 18 as ‘moderate risk’ and 34 as ‘low risk’.
The classification of intravenously administered drugs according to their risk profile is useful to ensure their safe use, as it can be used to implement the necessary safety measures to prevent adverse events.
Most cytostatics used in cancer treatment are dosed on body surface area (BSA). To administer an appropriate dose it is therefore necessary to know the patient’s correct body weight. However, evidence is lacking on how often, after initiation of treatment, body weight should be measured to recalculate BSA. We aimed to assess the relevance of weight measurements during chemotherapy treatment.
Over a 2-year period we analysed BSA changes in adult patients undergoing chemotherapy treatment. The frequency of, and median time to, a ≥10% BSA change was determined. We assumed that a 10% BSA change required dose adjustment and was therefore clinically relevant.
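For illustration, the following sketch computes BSA and flags a ≥10% change, the study's threshold for a clinically relevant change. The Mosteller formula is used here purely as an assumption; the abstract does not state which BSA formula was applied.

```python
import math

def bsa_mosteller(height_cm: float, weight_kg: float) -> float:
    """Body surface area (m^2) by the Mosteller formula (illustrative choice)."""
    return math.sqrt(height_cm * weight_kg / 3600)

def bsa_change_requires_reassessment(baseline_weight_kg: float,
                                     current_weight_kg: float,
                                     height_cm: float,
                                     threshold: float = 0.10) -> bool:
    """True if BSA has changed by >=10% from baseline, the study's cut-off
    for a clinically relevant change warranting dose recalculation."""
    baseline = bsa_mosteller(height_cm, baseline_weight_kg)
    current = bsa_mosteller(height_cm, current_weight_kg)
    return abs(current - baseline) / baseline >= threshold

# A roughly 20% weight loss corresponds to about a 10% BSA change under Mosteller
print(bsa_change_requires_reassessment(80, 64, 175))
```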
Using a database query, data from 2276 patients were used for descriptive statistics, life table analyses and generalised estimating equations. A BSA change of ≥10% occurred in at most 7.6% of patients, depending on the tumour type. Descriptive statistics for the indications with more than 100 patients showed that BSA changes of ≥10% occurred after 84 days. The groups with the earliest BSA changes were patients with acute leukaemia, lymphoma and pancreatic cancer.
Our observations from real-world data indicate that it is safe to omit the current requirement for monthly weight measurements. We advise that, during chemotherapy, body weight should be measured at least every 3 months in patients who have acute leukaemia, lymphoma or pancreatic cancer or who are under 20 years of age. For other patients, extending this period to a 6-monthly weight measurement should be considered.
Many patients receive benzodiazepines or Z-drugs during hospitalisation due to sleeping problems. In a pilot study, we aimed to find out whether, and to what degree, a multi-faceted intervention can reduce the use of these drugs, especially in older patients and those without a psychiatric or neurological disorder. The results of this pilot study should inform the design of a randomised controlled trial (RCT).
In a quasi-experimental design, we implemented the intervention in a German hospital with the support of the hospital director, medical and nursing staff and employee representatives. We compared prescription data for sleep-inducing drugs before and after the intervention by Fisher’s exact test and used odds ratios (ORs) with their 95% CIs as a measure of effect size.
The data from 960 patients aged ≥65 years before intervention and 1049 patients after intervention were analysed. Before intervention, 483 (50.3%) of the patients received sleep-inducing drugs at some time during their hospital stay. After the intervention, 381 (36.3%) patients received a sleep-inducing drug, resulting in an OR of 0.56 (95% CI 0.47 to 0.68) (p<0.001). The reduction was particularly pronounced in patients without a psychiatric or neurological disorder (from 45.0% to 28.8%). In particular, the consumption of benzodiazepines declined from 24.3% to 8.5% (OR 0.31; 95% CI 0.23 to 0.4) (p<0.001).
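For readers wishing to reproduce the headline comparison, the sketch below recalculates the odds ratio and a Fisher's exact p-value from the counts reported above (483 of 960 patients before vs 381 of 1049 after the intervention) using scipy. The method used by the authors to derive the 95% CI is not specified in the abstract, so only the point estimate and p-value are reproduced.

```python
from scipy.stats import fisher_exact

# 2x2 table from the abstract: rows = after / before the intervention,
# columns = received / did not receive a sleep-inducing drug
after = [381, 1049 - 381]   # 381 of 1049 patients after the intervention
before = [483, 960 - 483]   # 483 of 960 patients before the intervention

odds_ratio, p_value = fisher_exact([after, before])
print(f"OR = {odds_ratio:.2f}, p = {p_value:.2e}")  # approximately 0.56, p < 0.001
```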
A multi-faceted intervention to change the practice of the use of sleep-inducing drugs in one hospital was successful in terms of drug reduction, particularly for benzodiazepines. The intervention was effective especially for target persons—that is, those without a psychiatric or neurological disease. Awareness of the magnitude of the change and the role of important stakeholders could help researchers and hospital staff to design a large RCT, including control hospitals, to evaluate the success of a multi-faceted intervention on a scientifically sound basis.
Treatment with fluoropyrimidines poses a significant risk of serious adverse reactions for patients with dihydropyrimidine dehydrogenase (DPD) deficiency. This study analyses the correlation between DPD deficiency and plasma uracil values in patients who are candidates for a fluoropyrimidine-based regimen. It also studies the incidence of adverse events (AEs) in patients with DPD deficiency established by plasma uracil determination.
This was a retrospective observational study conducted in a tertiary-level hospital from September 2020 to April 2021. Patients included were diagnosed with gastrointestinal tumours, had a good performance status, and were started on a fluoropyrimidine-based regimen. The incidence and grade of AEs, according to the Common Terminology Criteria for Adverse Events (CTCAE), were collected and compared in patients with and without DPD deficiency.
119 patients diagnosed with gastrointestinal cancer met the inclusion criteria. In 92 (77%) patients there was no DPD deficiency according to plasma uracil thresholds. In the group without DPD deficiency, dose reductions ranged from 10% to 25% (mean 17.4%), and 43 (46%) patients experienced AEs. Patients with a DPD deficiency according to plasma uracil measurements were started on a 5-fluorouracil (5-FU) regimen with a dose reduction of 15–50% (mean 35%); in this group, 12 patients (44%) experienced AEs.
New research is needed to clarify the correlation between plasma uracil values and DPD deficiency in order to achieve an optimal balance between clinical benefit and toxicity.
The aim of this study was to investigate the prevalence and severity of potential drug–drug interactions (pDDIs) in hospitalised patients with major psychiatric disorders and to identify factors associated with their occurrence.
The research was designed as an observational, cross-sectional study conducted at the Clinic for Mental Disorders (CMD) ‘Dr. Laza Lazarevic’, Belgrade, Serbia. The Medscape, Epocrates and Lexicomp databases were used to detect potential drug interactions among inpatients. Multivariate regression analysis was used to reveal risk and protective factors associated with the number of pDDIs.
The study included 511 patients, average age 44.63±11.81 years. The average number of pDDIs per patient ranged from 5.9±4.7 (Medscape) to 8.2±5.4 (Epocrates) and 8.5±5.1 (Lexicomp). The following risk factors were identified by all three interaction checkers used: C-reactive protein, number of pharmacological subgroups, number of prescribed drugs, antibiotics, antacids, vitamins, number of associated comorbidities, route, form and dose of the drug.
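As an illustration of the analytic approach, the sketch below fits a count regression of the number of pDDIs per patient on a subset of the predictors listed above. The data are simulated and the variable names are assumptions; the abstract does not specify the regression family or the dataset structure, and a Poisson model is shown here only as one reasonable choice for a count outcome.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in data: one row per inpatient. Variable names echo the risk
# factors named in the abstract, but the values are synthetic, purely to make
# the sketch runnable.
rng = np.random.default_rng(0)
n = 511
df = pd.DataFrame({
    "n_drugs": rng.integers(1, 15, n),
    "n_comorbidities": rng.integers(0, 6, n),
    "crp": rng.gamma(2.0, 10.0, n),
    "antibiotics": rng.integers(0, 2, n),
})
df["n_pddis"] = rng.poisson(1 + 0.5 * df["n_drugs"])

# Poisson regression of the number of pDDIs on the selected predictors
model = smf.poisson("n_pddis ~ n_drugs + n_comorbidities + crp + antibiotics",
                    data=df).fit(disp=False)
print(model.summary())
```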
When making clinical decisions to reduce drug-related problems, including DDIs, several interaction databases should be consulted and their output reviewed by a multidisciplinary team including an experienced clinical pharmacist, physician and nurse.
Aggregation is one of the key critical points limiting the stability of monoclonal antibodies in solution. The present study aimed to investigate the in-use stability of a residual monoclonal antibody solution after withdrawal of most of the filling volume of PF-06439535 (bevacizumab biosimilar), addressing the physical and chemical stability with respect to aggregation and fragmentation.
The stability of residual PF-06439535 solution (25 mg/mL) after withdrawal of 80% (12.8 mL) of the filling volume with a 20G needle was monitored over a light-protected storage period of 8 days at 2–8°C and 25°C, with measurement time points at D0 (start of storage), D2, D4 and D8 (2, 4 and 8 days after the start of storage, respectively). Unopened vials stored under the same conditions served as controls. The analytical results from size exclusion chromatography, dynamic light scattering and micro-flow imaging obtained at each measurement time point up to 8 days were compared with those obtained at D0 and with those obtained for the unopened control vials.
No aggregation or ongoing fragmentation attributable to the partial withdrawal of filling volume was observed in the residual PF-06439535 solution. Moreover, no changes in the particle size distribution at D8 compared with D0 were identified upon storage at either 2–8°C or 25°C (both opened and unopened vials). The total concentration of particles ≥10 µm in all samples was <100 particles/mL. In addition, no variations in pH or visual appearance were detected in any sample over the whole study period under any storage condition.
Consequently, residual PF-06439535 solution (25 mg/mL) in opened vials may be regarded as stable when stored light-protected over a period of 8 days in the refrigerator (2–8°C) or at 25°C.
Transition from originator biological medicines to their biosimilar equivalents is now part of routine clinical practice, but there is little understanding of patient experiences, which influence adherence and overall satisfaction with care. Understanding this will help ensure future switches adequately address patients’ concerns and expectations leading to better outcomes for all stakeholders.
35 patients participating in a clinical trial including an open-label transition event from originator to biosimilar adalimumab, mimicking what would be encountered in a real-world setting, took part in semi-structured interviews exploring their experience of biosimilar transition.
Opinions expressed were often heterogeneous, but common experiences and themes were identified. Five themes were identified following thematic analysis. (1) Understanding and awareness of biosimilars: prior awareness of biosimilars and knowledge of the biosimilar concept was low, indicating a disparity between healthcare professionals and patients. (2) Motivation to undertake transition: patients accept a biosimilar transition to minimise drug expenditure. (3) Initial concerns: before undertaking biosimilar transition away from the brand they had experienced, anticipated loss of efficacy and adverse effects from the biosimilar were common concerns for patients. (4) Reassuring factors: trust in the healthcare team is critical to patient acceptance of biosimilars. Important reassurances include a point of contact, education about biosimilars and monitoring. (5) Experiences during the transition: on reflection, participants described consistent efficacy and tolerability (although 22 participants specifically mentioned injection pain) following brand transition.
The majority of patients felt comfortable with future transition to another adalimumab biosimilar. Injection experience was an important component of patient satisfaction.
While randomised controlled trials in HIV-infected patients have shown that certain dual antiretroviral therapy (DAT) regimens are non-inferior in efficacy to classical triple-drug regimens, few real-world clinical experiences have been described. The aim of the study was to investigate, in real clinical practice, DAT effectiveness, durability and risk factors for treatment discontinuation.
This was a prospective cohort study that included HIV-infected patients treated with DAT (2015–2020). DAT was considered effective when patients achieved or maintained virological suppression, assessed at 24 and 48 weeks. DAT durability was evaluated using the Kaplan-Meier method. Adherence and treatment cost were compared with those of the patients’ previous antiretroviral regimens.
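A minimal sketch of the durability analysis named above is shown below, using the lifelines library's Kaplan-Meier estimator on hypothetical per-patient durations; the study's individual-level data are not available from the abstract, so the numbers here are placeholders.

```python
import pandas as pd
from lifelines import KaplanMeierFitter

# Hypothetical per-patient data: weeks on dual therapy and whether the regimen
# was discontinued (1) or still ongoing/censored (0).
df = pd.DataFrame({
    "weeks_on_dat": [8, 10, 12, 16, 20, 30, 48, 48],
    "discontinued": [1,  1,  1,  0,  1,  1,  0,  0],
})

kmf = KaplanMeierFitter()
kmf.fit(durations=df["weeks_on_dat"], event_observed=df["discontinued"],
        label="DAT durability")
print(kmf.survival_function_)      # proportion still on DAT over time
print(kmf.median_survival_time_)   # median time on DAT in this toy dataset
```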
51 patients were included, 27.5% with HIV-1 RNA ≥50 copies/mL at baseline, treated with a wide range of dual combinations. At 48 weeks of follow-up, 83.8% and 50.0% of patients who started DAT with HIV-1 RNA <50 copies/mL and ≥50 copies/mL, respectively, were suppressed. 39 of 51 patients (76.5%) maintained DAT, with a mean treatment duration of 40.5±14.8 weeks. Full adherence was observed in 78.4% of patients, compared with 70.2% on the previous regimen. Mean daily cost was 18.6±4.3 compared with 16.1±7.9 on the previous regimen (p=0.008).
DAT effectiveness and durability were higher in patients who were virologically suppressed at baseline. DAT is a possible alternative for virologically non-suppressed patients who cannot be treated with a triple-drug regimen.
Adherence to and persistence with long-term treatment with oral anticoagulants play a significant role in preventing adverse events and mortality in patients with cardiac conditions. The aim of this study was to evaluate the adherence, persistence and switching rate at 3 years in real-life patients with non-valvular atrial fibrillation receiving treatment with first-line new oral anticoagulants.
The study assessed all patients treated with drugs with ATC codes B01AA, B01AE and B01AF dispensed in pharmacies in the Lanciano-Vasto-Chieti and Pescara Local Health Units from 1 January 2011 to 30 September 2021. Adherence was calculated as the proportion of days covered; persistence was calculated as the difference in days between the start and end of treatment; and the switching rate was calculated as the difference in days between the start of treatment and the switch.
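As a concrete reading of the adherence measure defined above, the sketch below computes the proportion of days covered from dispensation records. The handling of overlapping supplies and gaps is an assumption for illustration; the study's exact operational rules are not given in the abstract.

```python
from datetime import date
from typing import List, Tuple

def proportion_of_days_covered(dispensations: List[Tuple[date, int]],
                               start: date, end: date) -> float:
    """Proportion of days covered (PDC) between `start` and `end`.

    `dispensations` is a list of (dispense_date, days_supplied) pairs; a day
    counts as covered if at least one dispensation spans it. A minimal sketch
    of the adherence measure named in the abstract.
    """
    covered = set()
    for dispense_date, days_supplied in dispensations:
        for offset in range(days_supplied):
            day = date.fromordinal(dispense_date.toordinal() + offset)
            if start <= day <= end:
                covered.add(day)
    observation_days = (end - start).days + 1
    return len(covered) / observation_days

# Example: three 30-day supplies over a 100-day observation window
fills = [(date(2021, 1, 1), 30), (date(2021, 2, 5), 30), (date(2021, 3, 15), 30)]
print(round(proportion_of_days_covered(fills, date(2021, 1, 1), date(2021, 4, 10)), 2))
```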
A total of 4270 patients were analysed. The absolute adherence figure at 3 years was 0.85. The lowest adherence levels were found in patients treated with dabigatran with an absolute value of 0.72, while the highest levels were found in patients treated with rivaroxaban with an absolute value at 3 years of 0.88. The persistence curves at 3 years of treatment with dabigatran showed a statistically significant difference (p<0.0001) compared with those of rivaroxaban and apixaban.
The data collected over a 3-year period showed that adherence and persistence levels and switch data were optimal and comparable in patients with non-valvular atrial fibrillation receiving treatment with either rivaroxaban or apixaban. In contrast, patients treated with dabigatran had worrying adherence and persistence levels.
We conducted a scoping literature review in PubMed and Embase to identify potential assessment items. Their relevance was systematically rated and the items were consolidated into the final tool.
60 relevant items were included, grouped into eight focus topics: the committee’s institutional integration, member characteristics, performance indicators, meeting structure, formulary decision-making and characteristics, strategies to guide medication use and medication use evaluations.
In combination with a SWOT (strengths, weaknesses, opportunities and threats) analysis, the tool helped identify improvement opportunities for a pilot hospital: adapting the committee’s structure, improving formulary decision-making, implementing strategies to guide formulary medication use and strengthening the committee’s recognition within the institution.
The tool successfully identified improvement opportunities for a pharmacy and therapeutics committee (PTC) and could therefore be of interest to other hospitals.
Nirmatrelvir/ritonavir may cause a clinically relevant drug-drug interaction (DDI) with immunosuppressive drugs such as tacrolimus, which may limit the use of this antiviral in transplant patients. We aimed to describe the management of this interaction.
This was a descriptive study that included renal transplant patients treated with nirmatrelvir/ritonavir and tacrolimus. Tacrolimus was suspended the day before starting the antiviral treatment, and the decision to restart it was based on tacrolimus blood levels. The main variables studied to characterise this DDI were tacrolimus blood concentration, dose adjustment and serum creatinine.
Three patients were included. During the study, the elevation in tacrolimus levels had no repercussions on serum creatinine, which remained stable in all patients. No patient required hospitalisation or showed signs of rejection.
Our experience provides further evidence that this interaction should not be a contraindication to treatment with nirmatrelvir/ritonavir and that it can be managed with close monitoring of tacrolimus levels.