The QT interval is an electrocardiographic parameter that has gained considerable importance in the safety evaluation of drugs.
QT is measured from the beginning of the Q wave to the end of the T wave and therefore represents not only the depolarization but also, which is clinically important, the repolarization of both ventricles. Myocardial repolarization, which is mainly mediated by outward potassium currents, varies considerably with heart rate (a higher heart rate leads to a shorter QT interval and vice versa). Several formulas have been established to compensate for this variability, but the most commonly used are still among the earliest, proposed in 1920 by H. C. Bazett (QTcB) [1] and L. S. Fridericia (QTcF) [2]. Normal values for the QT interval usually range from 390 ms up to 450 ms in men and 460 ms in women [3].
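In their usual form, with QT and the preceding RR interval expressed in seconds, these two corrections read

$$\mathrm{QTc_B} = \frac{QT}{\sqrt{RR}}, \qquad \mathrm{QTc_F} = \frac{QT}{\sqrt[3]{RR}},$$

so that both reduce to the uncorrected QT at a heart rate of 60 beats per minute (RR = 1 s).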
Special importance was attached to the QT interval in 1957, when Jervell and Lange-Nielsen reported a family with prolonged QT, deafness, and episodes of syncope and cardiac death at a very young age [4]. This disease entity is now known as Long QT syndrome and is associated with an increased risk of a ventricular arrhythmia characterized by tachycardia with an undulating or twisting appearance, termed “Torsades de pointes” (TdP) tachycardia [5]. Shortly thereafter, several reports described ventricular tachycardia, especially under antiarrhythmic treatment with quinidine. These episodes were later attributed to an acquired QT prolongation, raising awareness of the risk of drug-induced effects on the QT/QTc interval [6, 7]. This risk was underscored by the first post-marketing withdrawal for this reason, that of the calcium channel blocker prenylamine in 1988, after sudden cardiac deaths caused by Torsades de pointes [8]. Thereafter, several substances, including antibiotics and antipsychotics, were removed from the market, prompting the establishment of national guidelines for drug development. These efforts culminated in the International Conference on Harmonisation guidance documents ICH S7B [9] and ICH E14 [10] in 2005, both of which remain in effect as an integral part of drug development, although they have since been supplemented by Questions and Answers documents. The list of drugs with the potential to prolong QTc is steadily growing and now exceeds 220 substances [11].
While it is evident that a number of drugs increase the risk of potentially lethal ventricular arrhythmias through QT/QTc prolongation, the overall impact in the general population remains unclear. Estimates of the incidence of TdP vary greatly, and it is likely underreported [12], because episodes of syncope or sudden cardiac death caused by TdP may not be apparent on the first recorded ECG or may not be recognized or reported correctly. (Regarding the latter, it should be noted that a dedicated code for TdP was only added to the International Classification of Diseases (ICD) in 2022 and is not present in the German modified version [13].) Incidences in the general population have been estimated over a wide range, from 4 per 100,000 per year [14] down to 2.5 and 4 per 1,000,000 per year for men and women, respectively [15]. Among patients treated with drugs known to cause QT/QTc prolongation, prevalences of drug-induced long QT syndrome of up to 6 to 8% have been reported, although the underlying studies are very heterogeneous [16]. While the reported incidence of Torsades de pointes in this group still seems low (up to around 0.3%), the occurrence of this particular arrhythmia carries a high lethality of about 10-20%. Several risk factors are important in the development of drug-induced long QT syndrome, including female sex, known heart disease and a variety of drug interactions [17].
As outlined in ICH S7B, preclinical assessment is an important pillar in the development of new drugs. Especially relevant for QT/QTc prolongation is inhibition of the cardiac potassium channel encoded by the human ether-à-go-go-related gene (hERG), which plays a significant role in the repolarization phase of the cardiac myocyte.
The clinical risk assessment of QT/QTc interval prolongation is covered by ICH E14, which recommends a dedicated ECG assessment study for novel agents, but also for known substances in new indications or new target populations. This kind of study has been termed the “thorough QT/QTc” (TQT) study. It is usually performed in healthy volunteers and requires sufficiently high exposure to the tested substance at therapeutic and supratherapeutic doses, as well as both a positive control (e.g. moxifloxacin) and a placebo group [18]. While this procedure has been shown to be effective in detecting substances with a risk of TdP induction prior to marketing, a dedicated TQT study is comparatively resource- and cost-intensive [19].
A positive signal in the TQT study (i.e. the effect of the drug on the QT/QTc interval exceeds a threshold of 10 ms) often warrants further investigation and may have a substantial effect on regulatory approval [20]. As dedicated TQT studies are often performed in late phase 2 or early phase 3 of drug development, such findings can lead to significant delays in development timelines or even termination of the development program. Therefore, alternative approaches to reliably detect the potential to prolong the QT/QTc interval in the early stages of clinical development have been under investigation.
Concentration-QTc analysis, which establishes the relationship between drug concentrations and changes in the QT/QTc interval, was noted early on as useful for regulatory review [21]. This approach also takes individual responses into account instead of averaging QT responses at each time point, which is especially valuable in the case of, for example, a variable Tmax. A 2015 revision of the Questions and Answers to ICH E14 included question 5.1, “Use of concentration response modeling of QTc data”, which highlighted possible alternatives to the dedicated TQT study [22]. It states, “Data can be acquired from first-in-human (FIH) studies, multiple ascending dose (MAD) studies, or other studies […].” It is, however, necessary that the achieved drug concentrations exceed those of the maximum therapeutic dose and reflect high clinical exposure scenarios, while high data quality must also be ensured. The requirement for a positive control can also be waived in the case of sufficiently high exposure. First-in-human studies, with doses approaching the maximum tolerated dose, often fulfill these criteria. Such data can be used to exclude clinically significant effects on the QTc interval, obviating the need for a dedicated TQT study [23].
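As an illustration of the concentration-QTc approach, a minimal sketch of a linear mixed-effects model relating baseline-adjusted QTc change to plasma concentration is shown below; the column names, units and the simple random-intercept/slope structure are assumptions made for this sketch, not a prescribed regulatory model.

```python
# Minimal sketch of a concentration-QTc analysis. Assumed data layout: one row
# per post-dose ECG with columns "subject", "conc" (plasma concentration) and
# "dqtcf" (baseline-adjusted change in QTcF, ms).
import pandas as pd
import statsmodels.formula.api as smf

def fit_conc_qtc(df: pd.DataFrame):
    """Fit dQTcF = intercept + slope * conc with a random intercept and slope
    per subject (simplified relative to full regulatory analyses)."""
    model = smf.mixedlm("dqtcf ~ conc", data=df,
                        groups=df["subject"], re_formula="~conc")
    return model.fit(method="lbfgs")

# Usage sketch: the predicted mean dQTcF at a high clinical exposure
# (hypothetical concentration of 150 units here) is the quantity whose upper
# confidence bound is compared against the 10 ms threshold of concern.
# fit = fit_conc_qtc(ecg_pk_data)
# pred = fit.fe_params["Intercept"] + fit.fe_params["conc"] * 150.0
```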
It must be said, however, that any conclusion regarding QTc liability hinges on the quality of the data. Irrespective of technical aspects, a number of pitfalls should be taken into consideration when planning the clinical risk assessment.
Correctly evaluating whether a given QT interval lies within “normal” bounds evidently relies on precise measurement of the interval itself. While this may appear trivial at first glance, there are a number of associated difficulties:
The first question is where to measure QT. All ECG parameters are prone to variation between leads. In the case of the QT interval, this phenomenon has been termed QT dispersion (the difference between the minimum and maximum QT interval duration across leads). While QT dispersion is being discussed as a potential marker of myocardial disease, differences of around +/- 50 ms between leads can simply be attributed to varied projections of the T-wave [24]. Historically, lead II has been the preferred lead and is the one from which the normal reference values are derived, although it may not show the longest QT interval. Additionally, it can prove difficult to correctly determine the end of the T-wave, for which different methods with varying results are applied in clinical practice [25]. For the commonly used tangent (usually yielding a shorter QT) and threshold methods, for example, differences can be around 10 ms [26], which should be taken into consideration when applying reference values.
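To make the lead dependence concrete, the sketch below shows how QT dispersion could be computed from per-lead measurements; the values are purely illustrative.

```python
# Purely illustrative per-lead QT measurements (ms) from one 12-lead ECG.
qt_by_lead_ms = {"I": 398, "II": 412, "III": 405, "aVR": 390, "aVL": 395,
                 "aVF": 408, "V1": 388, "V2": 402, "V3": 415, "V4": 418,
                 "V5": 410, "V6": 404}

# QT dispersion: difference between the longest and shortest QT across leads.
qt_dispersion_ms = max(qt_by_lead_ms.values()) - min(qt_by_lead_ms.values())

# Lead II, the conventional reference lead, need not carry the longest QT.
print(qt_dispersion_ms, qt_by_lead_ms["II"], max(qt_by_lead_ms.values()))
```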
This measurement can be hampered even further by the presence of a U-wave in some leads, especially if it overlaps with the T-wave, in which case it may prove impossible to determine a definitive QT interval. An accuracy of less than 25% has been reported for the correct assessment of abnormal and normal QTc by cardiologists and non-cardiologists alike [27].
Furthermore, the common heart rate correction formulas require a mostly stable rhythm, and determining QTc in the presence of arrhythmias often proves challenging. This even applies to findings considered markers of good cardiovascular health, such as sinus arrhythmia, which is regularly observed in the often young and healthy participants of early-phase clinical trials [28, 29]. Some of these challenges can be overcome simply by collecting more ECG data for comparison and by expert selection of appropriate methods of QT measurement.
These factors may differ greatly between participants but can also vary within the same individual. Regarding intraindividual changes, the QTc interval has been observed to follow a circadian pattern influenced by the autonomic nervous system [30, 31]. This variability is more pronounced in certain disease entities and even serves as a potential therapeutic target in congenital Long QT syndrome, e.g. via left cardiac sympathetic denervation [32].
Individualized QTc (QTcI), i.e. QTc derived by linear regression of multiple data points from either repeated ECGs or Holter monitoring, is a potential alternative that can account for the above-mentioned changes [33]. This approach is especially preferable when the investigated drug is assumed to have a substantial effect on heart rate [34]. It has also proven effective in the diagnosis of congenital Long QT syndrome [35]. However, the requirement for multiple data points and a more complicated regression analysis largely limits this approach to smaller cohorts or TQT studies.
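As a rough sketch of the underlying idea, assuming drug-free QT/RR pairs (e.g. from baseline Holter recordings) and one common log-linear formulation of the individual correction; the function names are illustrative:

```python
import numpy as np

def fit_individual_exponent(qt_ms: np.ndarray, rr_s: np.ndarray) -> float:
    """Estimate a subject-specific exponent beta from drug-free QT/RR pairs by
    ordinary least squares on log(QT) = log(alpha) + beta * log(RR)."""
    beta, _log_alpha = np.polyfit(np.log(rr_s), np.log(qt_ms), 1)
    return float(beta)

def qtci(qt_ms: float, rr_s: float, beta: float) -> float:
    """Individualized correction QTcI = QT / RR**beta (RR in seconds);
    beta = 0.5 recovers Bazett and beta = 1/3 recovers Fridericia."""
    return qt_ms / rr_s ** beta
```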
There are a number of promising mathematical QT models that try to address the shortcomings of the “classic” formulas without the need to determine QTcI. A non-linear, spline-based regression model by Rabkin et al., for example, aimed to identify common covariates influencing QT by leveraging data from large historical cohorts and showed good results independent of heart rate [36]. None of these approaches has, however, found widespread recognition and use in clinical studies yet.
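Purely as an illustration of what such a model can look like (this is not the published model of Rabkin et al.), a spline term for RR combined with simple covariates could be fitted as follows; all column names and the number of degrees of freedom are assumptions.

```python
# Illustrative sketch only: ordinary least squares fit of QT with a B-spline
# basis for RR plus simple covariates.
# Assumed columns in "df": qt_ms, rr_s (seconds), age (years), sex (0/1).
import pandas as pd
import statsmodels.formula.api as smf

def fit_spline_qt_model(df: pd.DataFrame):
    """QT modeled as a smooth (spline) function of RR plus age and sex."""
    return smf.ols("qt_ms ~ bs(rr_s, df=5) + age + sex", data=df).fit()
```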
Last but not least, there is always a human element to any study, and there is merit to a controlled setting with experienced staff to detect potential confounders. A seemingly endless list of additional factors can potentially impact QTc, ranging from infectious diseases with a marked inflammatory response [37] to excessive caffeine intake [38] and, of course, concomitant medication (during flu season, compounds containing chlorpheniramine, among others, are worth mentioning [39]).
In summary, there is no single fixed approach to determining the important risk of QT/QTc prolongation, but there are several assessments that can be tailored to the individual needs of the investigation.