Prevention of Schizophrenia
Overview
Several precursors of schizophrenia that become apparent during adolescence are now recognized. A wide variety of early intervention techniques have been developed that draw on the knowledge of these precursors to identify individuals at risk for the illness and to prevent the predisposition toward schizophrenia from developing into the full disorder. Most of the research that enabled the identification of these precursors and the development of these intervention techniques was performed retrospectively in adults with schizophrenia, with little specific research attention directed toward forms of schizophrenia that manifest during adolescence. In addition, prevention efforts have necessarily lagged behind studies of the risk factors, detection, and early intervention of the disease. Yet a great deal has already been learned about risk-profiling and early intervention in schizophrenia generally, and those aspects that may be useful in understanding the adolescent forms of the illness are discussed below.
Traditionally, prevention efforts have been classified into three levels: (1) primary prevention, which is practiced prior to the onset of the disease, (2) secondary prevention, which is practiced after the disease is recognized but before it has caused suffering and disability, and (3) tertiary prevention, which is practiced after suffering or disability has been experienced, in order to prevent further deterioration. This classification scheme is attractive and simple, but it does not distinguish between preventive interventions that have different epidemiological justifications and require different strategies for optimal utilization. In particular, it focuses on intended outcomes rather than on target populations or prevention strategies.
The terms universal, selective, and indicated have been adopted as a valuable way to distinguish preventive interventions. All three of these strategies refer to the target population. Universal preventive interventions are applied to whole populations and aim at reducing risk and promoting protective factors. Because obstetric complications have been linked to the subsequent onset of schizophrenia in several studies (Zornberg et al., 2000), one potentially effective universal prevention strategy would be to focus on lowering the incidence of such complications through improved prenatal, perinatal, and postnatal care.
In contrast to universal prevention strategies, selective and indicated interventions target specific subgroups for intervention. Selective interventions target those who are at elevated risk based on group-level characteristics that are not directly related to etiology. Because schizophrenia is a familial and heritable disorder (Gottesman, 1991), a selective prevention program for schizophrenia might focus on asymptomatic children with first-degree affected relatives or, more specifically, on those with particular combinations of schizophrenia-risk-specific gene variants, as they become known.
Finally, an indicated intervention involves targeting individuals who either have signs of the disorder but are currently asymptomatic, or are in an early stage of a progressive disorder. Because there are no universal signs of schizophrenia, indicated interventions for this disorder have a somewhat broad definition. Two lines of research that may lead to indicated interventions for schizophrenia include the study of individuals with prodromal signs of schizophrenia (Eaton et al., 1995) and the characterization of individuals with schizotaxia, which can be defined as the underlying predisposition to schizophrenia that may or may not be expressed as prodromal symptoms (Tsuang et al., 2002).
To develop and refine selective and indicated prevention efforts for schizophrenia, the disorder itself (as well as its precursors) must be thoroughly understood. Some of the risk factors for schizophrenia, such as birth complications and a family history of the disorder, are widely recognized. Others are just becoming known or are still being validated. Once a wide variety of schizophrenia-specific precursors has been identified and validated, these features can be used to maximize the efficiency and effectiveness of preventive efforts by narrowly specifying the characteristics of at-risk individuals, allowing only those who would benefit from intervention to be selected to receive it.
Premorbid Aspects of Schizophrenia
The etiology of schizophrenia is complex, most likely involving a range of genetic and gene–environment interactions that are well summarized as the “epigenetic puzzle” (Gottesman & Shields, 1982; Plomin et al., 1994), as discussed in Chapter 5. The schizophrenia syndrome (delusions, hallucinations, thought disorder, negative features, and cognitive dysfunction) is manifest at some stage during the lives of around 1 in 100 people. Figure 7.1 shows that the occurrence begins to take off in the early teenage and adolescent years, being rare before puberty and becoming less common in the second half of life.
However, important events may be occurring in the period leading up to illness and in the early years of development, the so-called prodromal and premorbid periods.
Prodromal and Premorbid Phases of Schizophrenia
In most cases, schizophrenia does not come totally out of the blue; there are important changes that occur before the psychotic syndrome. Fragmentary psychotic symptoms, depression, changes in behavior, attenuated general functioning, and other nonspecific features commonly occur in the weeks, months, and sometimes years before the first psychotic break. This period before the schizophrenia syndrome is established is known as the prodrome, and it represents a change that can frequently be identified by either the affected individual or his or her family members.
The prodrome is a period of considerable interest from a clinical and theoretical point of view because it may be possible to intervene early during this time and so prevent the onset of psychosis or improve its outcome. This exciting prospect of early intervention, considered in Chapter 6, is technically complex because of the nonspecific nature of some of the symptoms in the prodrome. Schizophrenia or other psychoses are by no means inevitable in a group of adolescents who show apparently prodromal features. Looking back from adolescents who have already developed schizophrenia, prodromal difficulties are easy to identify; predicting prospectively which adolescents with such difficulties will become ill is, of course, much more difficult. Much research is aiming to understand the biology underlying this period just before and around the onset of schizophrenia when important neuropsychological and structural changes may be occurring (Pantelis et al., 2003; Wood et al., 2003). There is general agreement, however, that earlier-onset cases such as these occurring in childhood or adolescence are likely to have more severe premorbid abnormalities (Nicolson & Rapoport, 1999; Nicolson et al., 2000).
There are other differences and abnormalities that occur well before the period of risk shown in Figure 7.1 begins. They are not confined to the psychological domain and show no obvious continuity with the schizophrenia syndrome. Rather than changes from the preexisting state that herald the illness during a prodrome, these differences are more a long-term part of the person, his or her personality and early development.
These differences are known as premorbid features. The distinction from the prodrome is not always clear, particularly in younger people, but it may have theoretical importance because premorbid features seem to point toward early vulnerability or predisposing factors, rather than toward events that occur as an illness is triggered or precipitated. The existence of premorbid abnormalities and differences in those who will, years later, develop schizophrenia suggests that parts of the epigenetic puzzle are put in place in very early life.
In childhood-onset cases, the distinction may be almost impossible because of the severity and insidious onset of schizophrenia before age 13 (Alaghband-Rad et al., 1995). However, the variety of premorbid neurodevelopmental impairments in childhood-onset schizophrenia is striking compared to adolescent- and adult-onset populations (Addington & Rapoport, 2009). Why look in early life for premorbid differences and causes of schizophrenia?
From its first descriptions, schizophrenia has had a longitudinal dimension. Thomas Clouston (Clouston, 1892; Murray, 1994; Murray & Jones, 1995) recognized a syndrome that he called “developmental insanity” in which developmental physical abnormalities were associated with early-onset psychotic phenomena, particularly in adolescent boys. When defining the schizophrenia syndrome more clearly, both Kraepelin (1896) and Bleuler (1908, 1911) noted that many of the people who developed the psychotic syndrome had been different from their peers long before the psychosis began. Here is a quotation from one of Bleuler’s early accounts of what has become known as schizophrenia:
It is certain that many a schizophrenia can be traced back into the early years of the patient’s life, and many manifest illnesses are simply intensifications of an already existing character…. All ten of my own school comrades who later became schizophrenics were quite different from the other boys. (Bleuler 1911/1950)
If some of the seeds of schizophrenia are sown in early life, then there ought to be other evidence. The excess of minor physical abnormalities (Green et al., 1989; Gualtieri et al., 1982; Guy et al., 1983; Lane et al., 1997; Lohr & Flynn, 1993; Sharma & Lal, 1986), and the dermatoglyphic or fingerprint abnormalities in people with schizophrenia (Bracha et al., 1992; McGrath et al., 1996) are seen as “fossilized” reminders of insults very early in life, during the first or second trimester of pregnancy, such as infections and nutritional problems (reviewed in Tarrant & Jones, 1999). These factors and some of the neuropathological data are probably best explained in terms of developmental processes having gone awry (Weinberger, 1995).
However, these processes are difficult to observe directly. Genetic high-risk studies, in which the offspring of people with schizophrenia are followed up, have shown subtle differences in neurological development between these children at special risk and those not known to be so (Erlenmeyer-Kimling et al., 1982; Fish, 1977; Fish et al., 1992; Walker & Lewine, 1990). Genetic studies such as these are discussed in Chapter 5.
What Are the Premorbid Differences Seen in Schizophrenia?
Bleuler wasn’t very precise when he mentioned that many of the people he’d known who developed schizophrenia as adults were different from other boys as children. It’s certainly interesting that he mentions boys, specifically, because tightly defined schizophrenia does seem to be more common in men than in women, and the early developmental differences are often more obvious in boys than in girls. This may be partly an artifact of some research designs, as well as an effect of differences in the wiring of male and female brains.
Many aspects of development can be seen to be slightly different in children who will later develop schizophrenia. Often these differences are subtle and would not be noticed at the time by parents or professionals. Usually, differences can be noted in characteristics that are developing rapidly according to the age of the child, things that are on the cusp of the developmental wave, and the child appears to catch up later on. Here are some examples.
Early Milestones and Motor Development
Direct evidence of neurodevelopmental differences is available (Weinberger, 1995). One source is a remarkable piece of opportunistic research by Walker and colleagues (1990, 1993). They studied “home movies” of families in which one child later developed schizophrenia. Facial expression of emotion and general motor functions were rated blind to that child’s identity among the siblings. The pre-schizophrenic children were distinguished on both counts, some with fairly gross but transitory motor differences. These may point to the basal ganglia of the brain as being involved in the underlying mechanism, reminding us that subtle motor disturbances are apparent at the beginning of schizophrenia, before any treatment (Gervin et al., 1998).
Such developmental differences have now been demonstrated in large, population-based or epidemiological samples. In the British 1946 birth cohort, a group of several thousand people born in one week in March 1946 have been studied regularly throughout their lives. Their mothers were asked about development when the children were age 2 years, before anyone knew what would happen later on. All the milestones of sitting, standing, walking, and talking were slightly though clearly delayed in those who developed schizophrenia as adults, but there was nothing that would have alarmed parents at the time. There were other indications that language acquisition was different before onset of schizophrenia. Nurses were more likely to notice a lack of speech by 2 years in the children who developed schizophrenia as adults, and school doctors noted speech delays and problems in them throughout childhood.
Developmental differences have been replicated in similar cohort studies in other domains, such as bladder control, fine motor skill, and coordination during late childhood and adolescence (Cannon et al., 1999; Crow et al., 1995). The motor and language delays were replicated and extended in a birth cohort study from Dunedin, New Zealand (Cannon et al., 2002), where over a thousand children have been followed during childhood. Those who indicated in their mid-20s that they had experienced symptoms suggestive of schizophrenia, mania, and other disorders were compared with those who said that they had never had such phenomena.
Figure 7.2 shows how a summary motor performance score was lower through most of childhood for those who experienced a schizophreniform disorder compared with the other groups; Figure 7.3 indicates that there was also a receptive language problem in those who later had hallucinations, delusions, and thought disorder.
Developmental differences before the onset of schizophrenia have been observed during the first year of life in the North Finland 1966 birth cohort. This comprises about 12,000 babies due to be born in this geographic area during 1966 (Rantakallio, 1969). Their early development was charted in the first year of life and later linked to information about who had developed schizophrenia through adolescence and into the early 30s (Isohanni et al., 2001). Figure 7.4 shows the incidence of schizophrenia in male subjects according to how quickly the little boys had learned to stand without support or “toddle” during the first year. The figure for girls was similar.
It is clear that not only was there an effect whereby the later a boy learned to toddle, the greater was his chance of developing schizophrenia in later life, but also that this effect seemed to hold true throughout the range of variation in reaching this milestone, all of which might be considered normal. If one were looking only for very late developers, then one might be more likely to find them within the pre-schizophrenia group than in those who did not develop the illness. However, this approach would completely obscure the widespread nature of this association, the meaning of which is considered later on.
There is another finding apparent from Figure 7.4. For the boys who passed the milestone early, in the 9-month and 10-month categories, the relatively few individuals who developed schizophrenia all did so in their mid-teens to mid-20s; their period of risk seemed fairly short. For those who were later developers, the period of risk was longer; these groups are still accruing cases of schizophrenia into their early 30s and beyond. It may be that the overall risk period for schizophrenia is shorter where neurodevelopment is more efficient, and longer where it is less efficient.
Bleuler’s quotation above most obviously implies differences in behavior and temperament. Studies in this area have also moved on through retrospective research methodologies to cohort designs. Sophisticated rating scales for the retrospective assessment of behavior and personality demonstrate differences prior to psychosis, with the most common being characteristics of a shy, “schizoid” habit (Ambelas, 1992; Cannon-Spoor et al., 1982; Foerster et al., 1991; Gittleman-Klein & Klein, 1969).
Robins (1966) undertook a pioneering, historical cohort study in which she followed a group of boys who had been referred to a child guidance clinic in St. Louis, Missouri. Here antisocial behavior was associated with later schizophrenia. Watt and Lubensky (1976; Watt, 1978) traced the school records of people with schizophrenia who came from a geographically defined neighborhood in Massachusetts. Girls who were to develop schizophrenia were introverted throughout kindergarten into adolescence. Boys who were to become ill were more likely to be rated as “disagreeable,” but only in the later school grades (7 to 12). This pattern has been identified (Done et al., 1994) in a British cohort using a similar set of behavioral ratings, and in the Dunedin cohort mentioned above (Cannon et al., 2002). The 1946 British birth cohort contained children’s own ratings of their behavior at age 13 years, and teachers’ ratings 2 years later. These data showed no evidence of antisocial traits in the pre-schizophrenia group but a strong association with shy, “schizoid” behaviors at both ages. The two views gave a very similar picture; the shyer someone seemed as a child, the greater the risk. Other studies do, however, remind us of the varied childhood psychiatric conditions that predate schizophrenia (Kim et al., in press).
The behavioral differences seem to persist up to the prodrome but are independent of it. Malmberg et al. (1998) studied a sample of some 50,000 men conscripted into the Swedish army at age 18 to 20 years, when they underwent a range of tests and assessments. Four behavioral variables at age 18 were particularly associated with later schizophrenia: having only one or no friends, preferring to socialize in small groups, feeling more sensitive than others, and not having a steady girlfriend. Cannon et al. (1997) also noted the same relationship.
Another twist to the story about premorbid behavioral differences comes from the recognition that some of the individual parts of the schizophrenia syndrome, such as hallucinations or delusions, can exist in otherwise well-functioning individuals in the population. However, they are indeed associated with a greater risk of occurrence of subsequent schizophrenia whether they occur in early adolescence (Poulton et al., 2000) or adulthood (Myin-Germeys et al., 2003a, 2003b).
Thus, there seems to be a consistency over childhood and adolescence, and across several types of study, regarding the presence of premorbid behavioral differences. People who will develop schizophrenia as adolescents and adults are different from their peers in terms of behavior in childhood, just as Bleuler noted a century ago; the effects may be even more widespread than he thought.
Cognitive Function and IQ
This aspect of psychological function also shows differences in the premorbid period. Aylward, Walker, and Bettes (1984) have provided a comprehensive review of intelligence in schizophrenia. They concluded that intellectual function is lower in pre-psychotic individuals than in age-matched controls. Linking the pre-psychotic deficit to outcome, they raised the question as to whether IQ may be an independent factor that can protect otherwise vulnerable individuals, or whether the deficits are part of that vulnerability.
Once again, the birth cohort studies shed light on the question. Cannon et al. (2002) showed that mean IQ test scores were consistently lower during childhood in those children who developed schizophreniform disorder (Fig. 7.5). This mean shift in premorbid IQ had also been seen in two British cohorts (Jones & Done, 1997). When the childhood IQ data from the 1946 cohort (Pidgeon, 1964, 1968) are studied in greater detail, it is clear that the lower mean premorbid IQ is not due to a subset of people with very low scores; rather, the whole distribution of those who will develop schizophrenia when they reach adolescence or adulthood is shifted down: most children seem not to be doing as well as they might have been expected to perform (Jones et al., 1994). This is a similar situation to the motor findings in the Finnish cohort. It is not that there is a group of very abnormal individuals driving the findings; the effects are seen across the normal range.
David et al. (1997; see above) replicated this result in the Swedish conscript study, although the measures were later in life, at age 18. There was no evidence of a threshold effect below or above which this relationship did not hold. Very bright individuals can develop schizophrenia, but they are less likely to than those who are less able. Put another way, any individual is more likely to develop schizophrenia than someone who is more able in terms of IQ, although the effect is small. Interest in the cognitive aspects of schizophrenia (David & Cutting, 1994; Green, 1998) suggests a parsimonious conclusion that pre-psychotic IQ deficits (and perhaps social characteristics) may be manifestations of the same abnormal cognitive processes that later result in psychosis.
What Do Premorbid Abnormalities Mean?
The range of differences in the developmental histories of people who will develop schizophrenia when they are older suggests that something to do with the causes of this syndrome is active long before the characteristic features begin (Marenco & Weinberger, 2000). There is evidence for many such early factors, including genetic effects (Fish et al., 1992; Jones & Murray, 1991), obstetric complications (Cannon et al., 2002), psychosocial stresses, famine, infections, and other toxic events during brain development (see Jones, 1999, for review).
It seems that many events that may lead to early brain development being suboptimal may increase the risk of later schizophrenia. There may be specific causes or combinations of causes, such as gene–environment interactions, that make people vulnerable to developing the schizophrenia syndrome, perhaps after later, necessary events that act as triggers. These may include normal (Weinberger, 1987, 1995) or abnormal brain development (Feinberg, 1982a, 1982b, 1997; Pogue-Geile, 1997), as well as traditional precipitants such as psychosocial stressors or drugs (see Chapter 6).
The behavioral, motor, language, and cognitive differences shown in the premorbid period may be manifestations of vulnerability or predisposition to schizophrenia; they may not be risk modifiers in themselves. These indicators seem remarkably homogeneous—in retrospect, like a final common pathway. The idea of only a subgroup of individuals having this manifest vulnerability as suggested in the seminal views of developmental aspects of schizophrenia (Murray & Lewis, 1987) is not supported by research. Most or even every person who develops the syndrome may have had a degree of developmental vulnerability, although this will not have been obvious at the time.
The early motor findings in the Finnish birth cohort (see Fig. 7.4) are consistent with the vulnerability being due to developmental processes being generally less efficient—the formation or enhancement of functional neural networks, for instance. The greater the inefficiency, the greater the risk of schizophrenia when that same inefficiency is played out in the formation of complex and integrative systems later in adolescence and adult life (see Chapter 5).
There are several candidates to explain this unifying vulnerability. These include hormonal events (Walker & Bollini, 2002) that are able to tie together motor and other system abnormalities in early life and links with psychosocial stress in models of predisposition and precipitation (Walker et al., 1999). Molecular biology and the investigation of not only the presence but also the functional activity of genes and the proteins for which they code may yield other dimensions of vulnerability. For instance, Tkachev et al. (2003) showed that the expression of genes associated with oligodendrocytes (the glial cells that provide nutritional support to nerve cells) and with myelin (the insulating sheaths these cells form around neurons) was downregulated in the frontal cortex of the brains of people who had died having suffered schizophrenia in life. This seems a very good candidate for the homogeneous vulnerability factor posited in this account of premorbid abnormalities before schizophrenia, and may be an endophenotype or hidden manifestation of the disorder. The deficient gene expression remains to be demonstrated before the onset of schizophrenia and will, itself, have its own prior causes.
As mentioned at the beginning of this section, premorbid features of schizophrenia are, in our current understanding, not yet of use in terms of prediction and early intervention. They occur in multiple domains, but many of the effects we can measure are subtle and leave individuals well within the wide range of normality. Premorbid features tell us a great deal about what we should be looking for in terms of underlying mechanisms and causes of schizophrenia, and when these may operate; they are signposts toward these mechanisms. As we learn about the processes that underpin the behavioral, cognitive, and motor differences that we can measure in the premorbid phase of schizophrenia, we may become able to identify those who are vulnerable with enough precision to be able to do something useful for them.
Developmental Precursors of Adolescent-Onset Schizophrenia
There are precursors of schizophrenia prior to the first onset of psychosis in many, but not all, adolescents who develop schizophrenia. As will be seen below, the precursors of schizophrenia can be subtle changes in basic brain functions such as motor control, attention, and memory; certain behavior problems; or attenuated schizophrenic symptoms. Identifying the developmental precursors of adolescent-onset schizophrenia has important implications both for enhancing our understanding of the underlying neurobiology of schizophrenia and for the development of preventive interventions.
Neurobiological factors present in individuals at high risk for developing a schizophrenic disorder, prior to the onset of frank psychotic symptoms, may represent potential etiological factors for schizophrenia. A number of brain systems known to be disturbed in schizophrenia, including prefrontal and medial temporal lobes (Selemon & Goldman-Rakic, 1999; Weinberger, 1986), may underlie certain neurocognitive impairments in children at risk for schizophrenia (R. Asarnow, 1983; Cannon et al., 1993). Determining how these neurobiological factors evolve when a schizophrenic disorder develops could provide important clues about how the diathesis for schizophrenia is potentiated into the overt disorder. A combination of disease-related progressions and maturational changes is hypothesized to exacerbate these dysfunctions when individuals at risk for the disorder convert to having the disorder.
Two broad classes of methods have been used to identify developmental precursors of schizophrenia. The first class of methods is prospective studies of children. A common feature of prospective methods is identifying, then characterizing, a group of children and following them up to determine which children subsequently develop a schizophrenic disorder. One important prospective method is to study children who are at increased statistical risk of developing a schizophrenic disorder. The lifetime risk for schizophrenia in the general population is less than 1%, so very large samples are required to prospectively identify the precursors of schizophrenia by following up children drawn from the general population. Given the population base rate of schizophrenia (<1%), you would need to start off with at least 2,500 children (without accounting for subjects being lost to follow-up) to identify the developmental precursors of schizophrenia in 25 individuals. High-risk studies ascertain individuals with an increased lifetime risk for schizophrenia for inclusion in prospective, longitudinal studies. This is typically accomplished by studying the children of parents with schizophrenia. The lifetime risk for schizophrenia for children of one parent with schizophrenia is approximately 10% to 12%, an approximately 10-fold increase in the risk for the disorder. High-risk studies frequently measure putative etiological factors for schizophrenia prior to the onset of the disorder. In this way, studies of children at risk for schizophrenia provide a vehicle for testing hypotheses about etiological factors in schizophrenia.
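The sample-size arithmetic above can be made concrete. The short sketch below is illustrative only (the helper function is hypothetical); it uses the figures cited in the text, a lifetime risk of about 1% in the general population and roughly 10% for children with one affected parent, to show why high-risk designs are so much more efficient than general-population follow-up:

```python
import math

def cohort_size_needed(target_cases: int, lifetime_risk: float,
                       attrition: float = 0.0) -> int:
    """Cohort size needed to observe target_cases affected individuals,
    inflating for an expected proportion lost to follow-up."""
    return math.ceil(target_cases / (lifetime_risk * (1.0 - attrition)))

# General-population cohort, ~1% lifetime risk: 2,500 children for ~25 cases
print(cohort_size_needed(25, 0.01))                   # 2500
# High-risk cohort (one parent with schizophrenia), ~10% lifetime risk
print(cohort_size_needed(25, 0.10))                   # 250
# Allowing for 20% loss to follow-up in the general-population design
print(cohort_size_needed(25, 0.01, attrition=0.20))   # 3125
```

The 10-fold difference in lifetime risk translates directly into a 10-fold smaller required sample, which is one reason high-risk studies have been such a common vehicle for prospective research on schizophrenia.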
Most (85%–90%) patients with schizophrenia do not have parents with a schizophrenic disorder. This has raised the concern that findings from “genetic high-risk” samples may not accurately describe the developmental precursors of schizophrenia in the much larger number of individuals who develop schizophrenia but do not have a schizophrenic parent. Recognition of this problem has led to an interest in complementary strategies for identifying developmental precursors of schizophrenia. Birth cohort studies are prospective studies that can provide information on precursors of schizophrenia that do not have some of the ascertainment biases inherent in high-risk studies. In contrast to studies of children at risk for schizophrenia, birth cohort studies follow up large, representative samples of entire birth cohorts. Birth cohort studies are designed to provide information about a wide range of medical, psychiatric, and social conditions, so they use very large samples, literally thousands of subjects. For example, the 1946 British birth cohort study that provided important data on developmental precursors of schizophrenia studied almost 5,400 children born in the week of March 9, 1946, then systematically followed them up to determine that 30 children developed schizophrenia, as well as a broad range of other psychiatric and medical outcomes. A great strength of birth cohort studies is the large, representative sample size. However, a limitation of birth cohort studies is that since they are not typically designed to test hypotheses about any particular disorder, they use a rather broad range of measures, which are not specifically tailored to measure potential precursors of schizophrenia.
By studying children prior to the onset of the disorder, it becomes possible to identify the precursors or antecedents of the disorder, as opposed to the consequences of the disorder—for example, the initiation of antipsychotic drug treatment. We will review some of the key findings that have emerged from three decades of studies of children at risk for schizophrenia and birth cohort studies.
A second class of methods involves the collection of information on the premorbid development of individuals, usually adults, who have been diagnosed with schizophrenia. Some of the earliest studies of this type relied on retrospective reports from informants who knew the patient as a child. This approach has obvious limitations, among them being the fact that recollections of the past may be subject to bias. The follow-back method features the ascertainment of individuals with schizophrenia and then, using different types of archival material, characterizing them prior to the onset of psychosis. Since the focus of this section is on adolescent-onset schizophrenia, we will emphasize those few studies that ascertained adolescent-onset schizophrenics.
Follow-back studies vary in the type of archival material used to describe the premorbid characteristics of individuals who develop schizophrenia. There is wide agreement (see Watt et al., 1982) about the advantages of using contemporaneous childhood records over retrospective interviews to reconstruct the premorbid histories of individuals who develop schizophrenia. The major limitation of follow-back studies is that the childhood evaluations were not guided by specific hypotheses about the age-specific manifestations of schizophrenia; as a consequence, the most informative measures may not have been collected. They also have ascertainment biases, the nature of which varies depending on how the sample of schizophrenia patients was identified.
Birth cohort and follow-back studies can show associations between childhood characteristics and the development of schizophrenia because in both types of studies individuals with schizophrenia have been identified. These associations are prospective in birth cohort studies and retrospective in follow-back studies. Because the data used to describe childhood risk factors in birth cohort and follow-back studies were not collected with the intent of testing hypotheses about schizophrenia, the measures may not be sensitive to some of the more subtle manifestations of liability to schizophrenia. In contrast, the measures included in studies of children at risk for schizophrenia were specifically designed to tap liability to schizophrenia. On the other hand, most studies of children at risk for schizophrenia, while intended to be longitudinal, were not able to follow up subjects through the age of risk to determine which high-risk subjects developed a schizophrenic disorder. Consequently, while there are extensive cross-sectional comparisons of children at risk for schizophrenia to controls, there is much less information on the long-term predictive validity of childhood risk factors identified in high-risk studies.
If follow-back studies of adolescent-onset schizophrenia patients yield results that converge with those of high-risk and birth cohort studies, this would provide reassurance about the generalizability and validity of the findings.
A Developmental Perspective on Risk Factors
There are relatively age-specific manifestations of liability to schizophrenia (see J. Asarnow, 1988; R. Asarnow, 1983; Erlenmeyer-Kimling et al., 2000; Walker, 1991, for reviews); that is, the manifestations of liability to schizophrenia differ somewhat at different ages. For example, one of the interesting findings that emerges from a review of developmental precursors of schizophrenia is that some deficits frequently observed during infancy in high-risk, birth cohort, and follow-back studies are not found at later stages of development. Another important reason to attend to (p. 168) the developmental progression of risk factors is that, from the point of view of targeting individuals for prevention, risk factors more proximal to the period when schizophrenia develops may have better diagnostic accuracy than, for example, infancy predictors.
Table 7.1 summarizes some of the major findings concerning precursors of schizophrenia at three different developmental periods: infancy, early childhood, and middle childhood/early adolescence. Table 7.1 is not an exhaustive summary of the results of high-risk, birth cohort, and follow-back studies; rather, it presents the characteristics that best differentiate high-risk children from controls or predict later development of schizophrenia that have thus far been identified in the literature. Cited below are comprehensive reviews of the results of high-risk, birth cohort, and follow-back studies.
Table 7.1 Developmental Precursors of Schizophrenia Identified by Means of Three Different Research Strategies (High-Risk, Birth Cohort, and Follow-Back Studies)

Infancy (0–2 years)

Endophenotypes:
Impaired motor and sensory functioning
High or variable sensitivity to sensory stimulation
Abnormal growth patterns
Short attention span
Delays in motor milestones
Speech problems or delays
Delayed potty training
Abnormal motor functioning

Symptoms and behaviors:
Passive, low energy, quiet, inhibited
Absence of fear of strangers
Low communicative competence in mother–child interaction, less social contact with mothers

Early childhood (2–4 years)

Endophenotypes:
Poor gross and fine motor coordination
Inconsistent, variable performance on cognitive tests
More likely to receive a diagnosis of developmental disorder

Symptoms and behaviors:
Depression and anxiety
Angry and hostile disposition
Schizoid behavior (i.e., emotionally flat, withdrawn, distractible, passive, irritable, negativistic)

Middle childhood/early adolescence (4–14 years)

Endophenotypes:
Neuromotor impairment (poor fine motor coordination, balance, sensory-perceptual signs, delayed motor development)
Attentional impairment under overload conditions
Variance/scatter on intellectual tests
Cognitive slippage (thought disturbance)
Poor balance, clumsiness
Reduced general intelligence

Symptoms and behaviors:
Passive, socially isolated
Poor social adjustment
ADD or ADD-like syndrome
Anxious/depressed
Poor affective control (emotional instability, aggressive, disruptive, hyperactive, impulsive)
Poor interpersonal relationships, withdrawn
Mixed internalizing-externalizing symptoms, fearful
Poor academic achievement
Less socially confident
“Schizoid” social development

ADD, attention-deficit disorder; CNS, central nervous system.
The format of Table 7.1 was modeled after a review by J. Asarnow (1988). The entries in Table 7.1 for studies on high-risk children come from reviews by J. Asarnow (1988), Erlenmeyer-Kimling (2000, 2001), R. Asarnow (1983), and Cornblatt and Obuchowski (1997). The entries for birth cohort studies are based on reviews by Jones, Rodgers, Murray, and Marmot (1994) and Jones and Tarrant (1999). The entries for follow-back studies of adolescent-onset schizophrenia come from Watkins, Asarnow, and Tanguay (1988) and Walker, Savoie, and Davis (1994). Watt and Saiz (1991) provided a broad review of follow-back studies of adult-onset schizophrenia.
Two types of risk characteristics are differentiated into separate columns in Table 7.1: “endophenotypes” versus clinical and behavioral features. Endophenotypes are putative reflections of the underlying schizophrenic genetic diathesis. Most of the putative endophenotypes employed in high-risk studies are neuromotor or neurocognitive functions (e.g., language, attention, and memory) believed to tap central nervous system disturbances that reflect liability to schizophrenia. In contrast, clinical and behavioral features are either non-schizophrenia psychiatric symptoms or behavior problems that, while they may reflect the underlying genetic diathesis, are much more proximal to the overt symptoms of schizophrenia. The reason for making this distinction is that these two different classes of risk characteristics have somewhat different implications as targets for prevention.
The results of high-risk studies have to be considered in the context of a major limitation: there are limited data on how well the cross-sectional differences between children at risk for schizophrenia and matched controls predict the later onset of schizophrenia. Only six studies of children at risk for schizophrenia have obtained diagnostic evaluations in adulthood or late adolescence:
1. The New York High-Risk study (Fish, 1984)
3. The Israeli High-Risk study (Ingraham et al., 1995)
4. The New York High-Risk project (Erlenmeyer-Kimling et al., 2000)
5. The Swedish High-Risk study (McNeil et al., 1993)
6. The Jerusalem Infant Development study (Hans et al., 1999).
The New York High-Risk project studied the largest number of subjects for the longest period of time, and therefore provides the most extensive data on the diagnostic accuracy of childhood and adolescent predictors of schizophrenia-related psychoses. None of these studies focused on the prediction of adolescent-onset schizophrenia. Indeed, there are very few cases of adolescent-onset schizophrenia in the entire high-risk literature. As a consequence, we are making the assumption that the factors that predict adult-onset schizophrenia are germane to the prediction of adolescent-onset schizophrenia.
During infancy in most, but not all, studies (see Walker & Emory, 1985, for review), neurological signs or neuromotor dysfunctions are found more frequently in children at risk (p. 169) (p. 170) for schizophrenia than in controls. In these studies neuromotor anomalies were assessed by observation during a pediatric neurological examination or by performance on standardized tests of infant development (e.g., the Bayley). Neurological signs and neuromotor dysfunctions are not specific to infants at risk for schizophrenia and are not rare events in the general pediatric population. Neurological abnormalities in neonates typically improve. In contrast, these abnormalities appear to persist, and may worsen over time, in children at risk for schizophrenia. Infants with neurological or neuromotor abnormalities are the high-risk infants most likely to develop schizophrenic disorders in adolescence and early adulthood (Fish, 1987; Marcus et al., 1987; Parnas et al., 1982). Neurological dysregulation in infancy predicts the development of schizophrenia spectrum disorders (Fish, 1984). Impaired performance on tasks with extensive motor demands during middle childhood also predicts the presence of schizophrenia spectrum disorders during adolescence (Hans et al., 1999).
Disturbances in early social development are found more frequently in children at risk for schizophrenia than in controls. Depending on the study, these disturbances are manifested as difficult temperament; apathy, withdrawal, or inhibition; reduced spontaneity and imitativeness; reduced social contact with mothers; and absence of fear of strangers. The absence of fear of strangers during infancy could indicate that the child does not differentiate between familiar adults to whom the child is attached (e.g., the parents) and others, and may reflect inadequately developed attachment. These disturbances are not specific to children at risk for schizophrenia; they are also associated with broad risk factors such as socioeconomic status, general maternal distress, early trauma or neglect, and poor quality of parenting. There are scant data on how well these disturbances in infant social development predict the development of schizophrenia. However, many of these findings are related to the development of social competencies (Watt & Saiz, 1991).
During early childhood (2–4 years of age) children at risk for schizophrenia are more likely than controls to show poor fine and gross motor coordination and low reactivity. Although poor fine and gross motor coordination was found in a different sample of children than the infants at risk for schizophrenia who showed a variety of neurological signs and neuromotor dysfunctions, these data suggest that the dysfunctions observed in infancy persist.
In early childhood there is an increased occurrence of internalizing symptoms (depression and anxiety), angry and hostile dispositions, and schizoid behavior (emotionally flat, socially withdrawn, passive and distractible) in children at risk for schizophrenia. Again, these characteristics are not specific to children at risk for schizophrenia, and there is no evidence that these characteristics are strongly predictive of the later development of schizophrenia.
Middle Childhood/Early Adolescence
Neuromotor impairments, including impairments in gross motor skills (Marcus et al., 1993), are found more frequently in children at risk for schizophrenia than in controls during middle childhood/early adolescence (4–14 years of age). One of the most robust cross-sectional findings during middle childhood and early adolescence is the presence of neurocognitive impairments, especially on measures with high attention demands. A subgroup of children at risk for schizophrenia show impairments on some of the same tasks on which patients with schizophrenia are impaired, including measures of sustained attention (various continuous performance tests) and secondary memory (e.g., memory for stories). For example, children at risk for schizophrenia, as well as acutely disturbed and partially remitted schizophrenia patients, perform poorly in the high attention/processing demand condition of a partial report span of apprehension task (R. Asarnow, 1983). The span of apprehension measures the rate of early visual information processing (Fig. 7.6).
(p. 171) There are some data on the predictive validity of the neurocognitive impairments identified during middle childhood and early adolescence. In the New York High-Risk project the presence of impairments on a number of attentional tasks (an “Attentional Deviance Index”) given in middle childhood predicted 58% of the subjects who developed schizophrenia-related psychoses by mid-adulthood (Erlenmeyer-Kimling et al., 2000). Attentional impairments in middle childhood were also associated with anhedonia (Freedman et al., 1998) in adolescents prior to the onset of schizophrenia and social deficits during early adulthood (Cornblatt et al., 1992; Freedman et al., 1998). Neuromotor dysfunction during childhood (assessed by the Lincoln-Oseretsky Motor Development Scale) identified 75% of the high-risk children who developed schizophrenia-related psychoses during adulthood (Erlenmeyer-Kimling et al., 2000). A verbal short-term memory factor that included a childhood Digit Span task and a complex attention task predicted 83% of the New York High-Risk project children who developed schizophrenia-related psychoses during adulthood, and showed high specificity to those psychoses (Erlenmeyer-Kimling et al., 2000). If replicated, these findings would suggest that the combination of genetic risk (being the child of a parent who has schizophrenia) and neurocognitive impairments during middle childhood might identify individuals with a greatly increased risk for developing schizophrenia. The sensitivity (correctly predicting the onset of schizophrenia-related psychoses) was higher for the verbal memory (83%) and motor skills (75%) factors than for the attentional factor (58%). Conversely, the false-positive rate (incorrectly predicting that a child would develop schizophrenia) was lower for the Attentional Deviance Index (18%) than for the memory factor (28%) and motor factor (27%).
The short-term follow-up in the Jerusalem High-Risk study provides an important link between the attentional impairments frequently observed in children at risk for developing schizophrenia during adolescence and the motor impairments found during infancy and early childhood. The children who showed impaired neuromotor performance during childhood were the subjects most likely to show impairments on a variety of measures of attention and information processing during early adolescence (Hans et al., 1999).
During middle childhood, children at risk for developing schizophrenia receive an increased (p. 172) frequency of a variety of psychiatric diagnoses, including an ADHD-like syndrome. Poor affective control, including emotional instability and impulsivity, as well as aggression and disruptive behaviors, is found more frequently in children at risk for developing schizophrenia than in controls. Early precursors of thought disorder may be reflected in the presence of cognitive slippage. Poor peer relations are one of the most frequently observed behavioral characteristics during middle childhood and early adolescence. None of these symptoms is specific to children at risk for schizophrenia; for example, poor affective control is also found in children who subsequently develop an affective disorder.
Birth Cohort Studies
A British birth cohort study of almost 5,400 people born in the week of March 9, 1946, complements the studies of individuals at risk for developing schizophrenia by virtue of being representative of the general population. Thirty cases of schizophrenia were identified among individuals between the ages of 16 and 43 in this cohort, which reflects the population base rate of the disorder. A 1966 birth cohort study in northern Finland (Isohanni et al., 2001) yielded 100 cases of DSM-III schizophrenia.
Across the major birth cohort studies, a number of developmental precursors of schizophrenia have been identified. These include the British birth cohorts of 1946 (Jones, Rodgers, Murray, & Marmot, 1994) and 1958 (Done, Crow, Johnstone, & Sacker, 1994; Jones & Done, 1997) and the northern Finland 1966 birth cohort (Isohanni et al., 2001). Neurological signs, reflected in various forms of motor dysfunction ranging from tics, twitches, poor balance and coordination, and clumsiness to poor hand skill, are consistently identified as developmental precursors in individuals who later develop schizophrenia (Done et al., 1994; Jones et al., 1994). There was an increased frequency of speech problems up to age 15 in persons who subsequently developed schizophrenia. Low educational test scores at ages 8 and 11 were also risk factors (Jones et al., 1994).
During early and late middle childhood, individuals who subsequently developed a schizophrenic disorder could be differentiated from their peers by their preference for solitary play, poor social confidence, and in general a “schizoid” social development.
In general, birth cohort studies suggest that there appears to be “consistent dose-response relationships between the presence of developmental deviance and subsequent risk” (Jones & Tarrant, 1999). The more deviant an individual is toward the “abnormal” end of a population distribution, the greater the risk of the disorder.
There is considerable overlap between the developmental precursors of affective disorders and schizophrenia (Van Os et al., 1997). For example, lower educational achievement is associated with affective disorders in general, while delayed motor and language milestones are associated with childhood onset of an affective disorder. As in schizophrenia, there is evidence of persistence of motor difficulties, with an excess of twitches and grimaces noted in adolescents.
In regard to endophenotypic characteristics, during infancy children who subsequently developed schizophrenia as adolescents were characterized by abnormal motor functioning and impaired language. Neuromotor and language impairments, and decreases in positive facial emotion, are also present in early childhood (Walker et al., 1993). During middle childhood the language impairments fade; however, the neuromotor impairments persist. In addition, during middle childhood children who subsequently develop a schizophrenic disorder are characterized by poor academic achievement, poor attention, and reduced general intelligence.
During middle childhood children who subsequently develop a schizophrenic disorder are characterized as being passive and socially isolated, with poor social adjustment. They frequently present with symptoms of attention-deficit/hyperactivity disorder (ADHD) and/or anxiety and depression.
(p. 173) A novel approach to using archival data to characterize the premorbid histories of individuals who develop schizophrenia is the use of home movies to identify infant and childhood neuromotor dysfunctions (Walker, Savoie, & Davis, 1994). Ratings were made of neuromotor functioning in children who subsequently developed schizophrenia, their healthy siblings, pre–affective disorder participants, and their healthy siblings. The pre-schizophrenia subjects showed poorer motor skills, particularly during infancy, than their healthy siblings and pre–affective disorder participants and their siblings. The abnormalities included choreoathetoid movements and posturing of the upper limbs, primarily on the left side of the body.
Consistency of Findings Across Methods
Inspection of Table 7.1 reveals a consistency across studies of children at risk for schizophrenia, birth cohort studies, and retrospective studies in the presence of motor and language problems during infancy. This consistency is particularly impressive given the considerable variation across studies in how motor functioning and language were assessed.
During early childhood (2–4 years of age) neuromotor problems are observed in all three types of studies. In birth cohort studies and retrospective studies, impaired language is noted. In high-risk studies children at risk for schizophrenia are noted as being depressed, anxious, angry, and schizoid, while in birth cohort studies they are noted as preferring solitary play.
During middle childhood (4–14 years of age) there is a persistence of neurological impairments, reflected in poor motor functioning, in high-risk, birth cohort, and follow-back studies. High-risk studies, unlike birth cohort and retrospective studies, included laboratory measures of attention and information processing. On these tasks, children at risk for schizophrenia showed attentional impairment under conditions of high processing demands. This may be related to the poor academic achievement observed in birth cohort and retrospective studies during middle childhood, as well as to the frequent diagnosis of ADHD. In contrast to the persistence of neuromotor problems, language problems tend to diminish over time, so that by middle childhood they are rarely noted across the three classes of studies. In adolescents who develop a schizophrenic disorder, language functions are relatively preserved compared to visual-spatial and motor functioning (Asarnow, Tanguay, Bott, & Freeman, 1987).
The results of this brief review suggest a developmental pathway from precursors first identified in infancy to the development of schizophrenia-related psychoses in late adolescence and early adulthood. Neurological signs or neuromotor dysfunctions are present in infancy and persist through early and middle childhood and early adolescence. Neuromotor dysfunction in early childhood predicts the presence of attentional impairments under high processing demands during early adolescence. Neuromotor dysfunctions and attentional impairments during adolescence predict the development of schizophrenia-related psychoses. Since the characterization of key points in this developmental sequence is based on only one or two studies, clearly this model needs to be tested in future research.
The developmental pathway sketched here has potentially interesting implications for our understanding of the neurobiology of schizophrenia. What brain systems are involved in the control of simple motor functions and attention? The developmental link between early neuromotor dysfunction and later attentional impairments may implicate cortical-striatal pathways that support both motor functions and attentional control mechanisms. Striatal dysfunction results in impaired sequential motor performance and chunking of action sequences. Impairments in a variety of attentional functions, including set shifting and self-monitoring, are also associated with striatal dysfunction (Saint-Cyr, 2003).
In high-risk studies the precursors of later difficulties in developing social relations can first be (p. 174) detected in infancy. In some studies, children at risk for schizophrenia have less social contact with their mothers and less fear of strangers, as well as having a difficult temperament.
Poor peer relations are one of the most frequently found behavioral characteristics during middle childhood and early adolescence. A preference for solitary play, poor social confidence, and in general a “schizoid” social development are frequent precursors of schizophrenia.
Studies of children at risk for schizophrenia, birth cohort studies, and retrospective studies all find an increased frequency of nonpsychotic symptoms, particularly internalizing symptoms and poor affective control (including emotional instability and impulsivity) during middle childhood and early adolescence. However, none of these symptoms are specific to children at risk for schizophrenia; many of these symptoms and behavioral characteristics are found in children who subsequently develop an affective disorder.
Limitations: What We Don’t Know
Neuromotor and attentional dysfunctions are putative developmental precursors of schizophrenia. They consistently appear with increased frequency in high-risk, birth cohort, and follow-back studies. In a number of high-risk studies, infancy and childhood neuromotor impairments predicted the later onset of schizophrenia-related psychosis. Attentional impairments during middle childhood and early adolescence in the New York High-Risk project predicted the development of schizophrenia-related psychosis.
The endophenotypic indices that appear to have the greatest predictive validity are neuromotor dysfunction and impaired performance on measures that tap processing under high attention demands, or measures of secondary memory. It remains unclear whether these measures tap schizophrenic-related processes specifically. A number of the measures (including continuous performance tests, partial report span of apprehension tasks, and secondary verbal memory tests) that are sensitive to subtle neurocognitive impairments in children at risk for schizophrenia in middle childhood/adolescence also detect neurocognitive impairments in children with ADHD and learning disabilities. It is unlikely that these impairments have cross-sectional diagnostic specificity.
While the ability of childhood/adolescent measures of attention to predict schizophrenia-related psychosis in the New York High-Risk project is promising, those results need to be replicated in an independent sample. Future studies will need to determine the extent to which childhood/adolescent neurocognitive measures predict schizophrenia-related psychosis conditioned on the presence of a second risk factor, having a parent who is schizophrenic. In effect, the analyses reported by the New York High-Risk project contained two risk factors that predicted schizophrenia-related psychosis—being the child of a schizophrenic parent and having attentional, verbal, short-term memory, or neuromotor impairments. These factors did not predict the onset of schizophrenia in the children of parents with an affective disorder nearly as well as they did in the children of parents with schizophrenia. As noted above, children with other, more common psychiatric diagnoses show deficits on these types of tasks. More research is needed on the diagnostic accuracy of these measures when they are used in the general pediatric population before they can be used to screen children for precursors for schizophrenia. At present, all that we know is that these measures have some promise in predicting which children who have a parent with schizophrenia are likely to develop a schizophrenic disorder themselves.
What is needed in the next generation of studies is not merely the demonstration of group mean differences between high-risk and control groups. If endophenotypic measures are to be used to select candidates for preventive intervention programs, what is required are diagnostic accuracy analyses that specify the sensitivity and specificity of tasks at various cutting scores. Cutting scores can be chosen, depending on the purpose, to optimize either sensitivity (detecting true positives) or specificity (avoiding false positives). For example, if the intervention can (p. 175) produce significant adverse events, it might be desirable to set a cutting score that minimizes false positives.
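The trade-off described above can be made concrete with a small computational sketch. The scores and outcomes below are entirely hypothetical (they are not data from any of the studies cited); the sketch simply shows how sweeping a cutting score across a neurocognitive index changes sensitivity and specificity in opposite directions.

```python
# Hypothetical attention-index scores; lower scores = more impaired performance.
# "cases" later developed a schizophrenia-related psychosis, "non_cases" did not.
cases = [12, 15, 18, 20, 22, 25]
non_cases = [20, 24, 26, 28, 30, 31, 33, 35]

def accuracy_at(cutoff, cases, non_cases):
    """Sensitivity and specificity when scores below `cutoff` are called 'at risk'."""
    tp = sum(s < cutoff for s in cases)        # impaired cases correctly flagged
    fn = len(cases) - tp                       # cases missed (false negatives)
    fp = sum(s < cutoff for s in non_cases)    # non-cases incorrectly flagged
    tn = len(non_cases) - fp                   # non-cases correctly cleared
    return tp / (tp + fn), tn / (tn + fp)

for cutoff in (18, 22, 26, 30):
    sens, spec = accuracy_at(cutoff, cases, non_cases)
    print(f"cutoff {cutoff}: sensitivity={sens:.2f}, specificity={spec:.2f}")
```

Raising the cutting score flags more of the eventual cases (higher sensitivity) but also flags more children who would never develop the disorder (lower specificity); an intervention with meaningful adverse effects would argue for the lower, more specific cutoff.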
Poor peer relations, a preference for solitary play, a “schizoid” social development, various nonpsychotic symptoms (particularly internalizing symptoms), and poor affective control occur frequently during middle childhood and early adolescence in high-risk, birth cohort, and follow-back studies. While these behavior problems and symptoms are precursors of schizophrenia, they are not diagnostically specific; many of these symptoms are associated with other psychiatric disorders. For example, poor affective control is both a symptom of and a precursor to affective disorders. Poor peer relationships are associated with the presence of both externalizing and internalizing disorders. There are relatively few data on the diagnostic accuracy (i.e., specificity and sensitivity) of symptoms and behavior problems detected in middle childhood and early adolescence as predictors of schizophrenia-related psychoses.
The behavior problems and symptoms that are putative precursors of schizophrenia are associated with psychiatric disorders (e.g., depression and ADHD) that are much more common than schizophrenia in the general population. This suggests that they will produce high false-positive rates if they are used in the general pediatric population in an attempt to identify individuals likely to develop schizophrenia.
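The base-rate problem noted above can be illustrated with a back-of-the-envelope calculation. The sensitivity and specificity figures here are assumed purely for illustration; only the roughly 1% population risk of schizophrenia reflects the epidemiological literature.

```python
# Hypothetical screen applied to the general pediatric population.
# Even an optimistically accurate screen for a rare disorder yields
# mostly false positives.
prevalence = 0.01    # ~1% population risk of schizophrenia (approximate)
sensitivity = 0.80   # assumed: screen flags 80% of future cases
specificity = 0.90   # assumed: screen clears 90% of non-cases

population = 100_000
cases = population * prevalence          # 1,000 future cases
non_cases = population - cases           # 99,000 who never develop the disorder

true_pos = cases * sensitivity           # correctly flagged
false_pos = non_cases * (1 - specificity)  # incorrectly flagged

ppv = true_pos / (true_pos + false_pos)  # positive predictive value
print(f"flagged: {true_pos + false_pos:,.0f}; truly at risk: {ppv:.1%}")
```

Under these assumptions roughly 10,700 children would be flagged, of whom fewer than 1 in 12 would actually be on a path to schizophrenia, which is why such measures cannot simply be transplanted from genetically high-risk samples to general population screening.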
Implications for Preventive Intervention
There is great interest in developing preventive interventions for schizophrenia, in part because of the belief that once the disorder emerges, a neurodegenerative process is initiated that can only be partially forestalled by currently available treatments. The neurocognitive impairments, non-schizophrenia symptoms, and behavior problems that are putative developmental precursors of schizophrenia may have important implications for the development of preventive interventions for this disorder. These precursors could be used to identify children who might benefit from preventive intervention and serve as targets of interventions.
The neurocognitive impairments that are putative developmental precursors of schizophrenia have potential utility in identifying candidates for preventive interventions. Depending on the risk profile of the intervention, cutting scores on neurocognitive indices could be constructed to either maximize sensitivity or minimize false positives. However, as noted above, before the cutting scores for putative neurocognitive precursors of schizophrenia can be applied to the general pediatric population, additional research is required to evaluate the diagnostic efficiency of these measures in populations without a genetic risk. The neurocognitive precursors of schizophrenia seem to be unlikely targets for preventive interventions. There is no evidence that mitigating attentional, memory, and neuromotor impairments forestalls the development of schizophrenia-related psychoses. Identifying neurocognitive precursors of schizophrenia does advance attempts to develop new somatic treatments for schizophrenia by helping to elucidate the dysfunctional neural networks that underlie this complex disorder.
The diagnostic accuracy of the behavior problems and symptoms that are putative precursors of schizophrenia, as identified thus far in high-risk, birth cohort, and follow-back studies, has not been carefully examined. Given the nonspecificity of these behavior problems and symptoms, it seems likely that they would yield high rates of false positives if used to identify candidates for preventive interventions for schizophrenia. It may be that clinical features more proximal to the onset of schizophrenia-related psychoses, such as prodromal signs and symptoms, have greater diagnostic accuracy in predicting which children will develop schizophrenia. A number of research groups are currently addressing this question.
The behavior problems and symptoms that are putative precursors of schizophrenia are potentially interesting targets for interventions. To the extent that poor peer relations, the presence of internalizing symptoms, and poor affective control pose difficulties for the (p. 176) child and parent, they become worthy targets of therapeutic interventions. Behavioral (e.g., social skills training) and pharmacological (mood-stabilizing drugs) interventions for these problems are based on symptomatic presentations. The nonspecificity of these problems is not particularly problematic in this case. While there is no reason to believe that enhancing social skills and controlling affective symptoms will forestall the development of schizophrenia, there is good reason to believe that they will enhance the patient’s current quality of life, and may also improve his or her adaptation after a psychotic episode. The best predictor of post-psychotic psychosocial functioning is the level of premorbid social competencies.
What Are the Precursors of Schizophrenia?
One of the best ways to develop early intervention efforts for schizophrenia is to start by identifying key features of those individuals who are or will become schizophrenic and determine how these features differ from those seen in individuals who are not ill and are not likely to ever become afflicted with the illness. Several research designs can accomplish this goal. For example, cross-sectional studies of patients and control subjects can be used to characterize each group on as many potentially meaningful variables as possible, including behavior, personality, social activity, neuropsychological abilities, brain structure and function, and genetics. One problem with this method, however, is that any differences observed between the two groups cannot necessarily be attributed a causal role in the development of disease. For example, if total brain volume were lower among a group of schizophrenic patients than it was among a group of well-matched controls, this might indicate that low brain volume is a precursor or predictor of the development of schizophrenia. However, from such a cross-sectional design, it is unclear if the brain volume deficit in the patient group actually preceded the onset of schizophrenic illness. In fact, it is possible that it did, but it is also possible that the onset of schizophrenia caused a decline in brain volume due to some degenerative process. Alternatively, other factors, such as treatment with antipsychotic medication, may have precipitated the decline in brain volume. It is further possible that the brain volumetric decline in the patient group was concurrent with the onset of illness but causally unrelated to it.
Numerous cross-sectional studies have unearthed a wealth of information regarding the ways in which schizophrenic patients are different from patients with other psychiatric illnesses and from normal control subjects. However, because of the limitations on causal inference that exist in these types of studies, their results can only guide further research; they are not powerful enough to dictate a specific pattern of behavioral, neuropsychological, or biological characteristics that would be useful for identifying individuals for targeted prevention efforts. As already mentioned, studies of individuals with prodromal signs of schizophrenia and individuals with schizotaxia provide more insight into those traits that precede the disorder than cross-sectional studies. Thus, great efforts have been made to enable identification of individuals in the earliest stages of the illness or even in the premorbid period so that they may be targeted for intervention.
Characterization of the prodromal phase of schizophrenia has revealed subtle changes in the behavior of those who are beginning to deteriorate into the early stages of the disease, and these changes are now being used to identify other clinically at-risk individuals for inclusion in early intervention programs. Some of the more pronounced changes observed during the prodrome occur in domains of thought, mood, behavior, and social functioning (Phillips et al., 2002). Specifically, difficulties in concentration and memory may emerge, as well as preoccupations with odd ideas and increased levels of suspiciousness. Mood changes may include a lack of emotionality, rapid mood changes, and inappropriate moods. Beyond simply odd or unusual behavior, the prodrome may also be characterized by changes in sleep patterns and energy levels. Social changes can be quite marked, with withdrawal and isolation as the most predominant features. These characteristics may be (p. 177) particularly informative about the disease process in schizophrenia, because they are by definition not related to the effects of medication or the degenerative effects of being ill for a prolonged period.
Perhaps the most powerful window into the premorbid changes in pre-schizophrenia individuals comes from the longitudinal study of children and adolescents who are genetically at high risk for the illness. By studying the biological children of schizophrenic parents, the clinical, behavioral, and biological features of schizotaxia can be revealed. Longitudinal studies of individuals such as these, who harbor the latent genetic liability toward schizophrenia, can be extremely informative for early intervention and prevention efforts because they can track the emergence of schizophrenia precursors before any signs of illness are apparent. Thus, any differences observed between children of schizophrenic patients and children of control subjects can be definitively attributed to factors other than the effects of antipsychotic medication, the degenerative effects of the illness, or any other factors that are subsequent to disease onset. The observed differences can be viewed as antecedents to the illness, which is as close to a causal relationship as can be ascribed in human research studies in which group membership cannot be experimentally assigned.
Studies of children of patients with schizophrenia have yielded a variety of findings of altered behavioral, neuropsychological, and biological processes. The richness and diversity of measures taken on these subjects can make profiling the premorbid genetic susceptibility to schizophrenia difficult. On the other hand, such studies have also produced some surprisingly uniform findings, which simplify our understanding of what may be the most central or universal deficits among those who are at the highest risk for schizophrenia.
Certain personality characteristics seem to reliably differentiate children of schizophrenic parents from children of control subjects (Miller et al., 2002). For example, schizotypal personality features, including social withdrawal, psychotic symptoms, socioemotional dysfunction, and odd behavior, have been shown to precede the onset of psychosis among genetically high-risk children. Deficits of social functioning are also commonly observed in this group (Dworkin et al., 1993). Specifically, children of schizophrenic patients are more likely than children of controls to have more restricted interests, significantly poorer social competence (especially in peer relationships and hobbies/interests), and greater affective flattening. Some neuropsychological deficits have also been reliably observed in these high-risk individuals (Asarnow & Goldstein, 1986; Cosway et al., 2000; Erlenmeyer-Kimling & Cornblatt, 1992; Schreiber et al., 1992). For example, several studies have replicated a pattern of impaired discrimination, sustained attention, and information processing on the visual continuous performance test among children of schizophrenic patients. These high-risk individuals also exhibit marked impairments on memory for verbal stimuli and in executive functioning, as well as neuromotor deficits such as soft neurological signs, gross and fine motor impairments, and perceptual-motor delays.
Perhaps underlying these personality, social, and neuropsychological deficits, children of schizophrenic patients have also been shown to have altered brain structure and function compared to children of control subjects (Cannon et al., 1993; Berman et al., 1992; Liddle, Spence, & Sharma, 1995; Mednick, Parnas, & Schulsinger, 1987; Reveley, Reveley, & Clifford, 1982; Seidman et al., 1997; Weinberger et al., 1981). The most commonly observed structural brain abnormality among children of schizophrenic patients is a reduced volume of the hippocampus and amygdala region. Loss of volume in the thalamus has also been observed in these children, and there has been some support for enlarged third ventricular volume and smaller overall brain volume in this group. Children of schizophrenic patients also have been found to exhibit linear increases in cortical and ventricular cerebrospinal fluid to brain ratios with increasing genetic load—that is, children with the greatest number of affected biological relatives showed the highest ratios.
(p. 178) Ultimately, these clinical, behavioral, social, and biological profiles of risk for emergent schizophrenia will be augmented by information on specific genes that increase susceptibility. Genes coding for neuregulin 1 (NRG1; Stefansson et al., 2002), nitric oxide synthase (NOS1; Shinkai et al., 2002), and dystrobrevin-binding protein 1 (DTNBP1; Straub et al., 2002) have been reported to have an association with schizophrenia, but these findings will require verification. Many other polymorphisms have shown a positive association with the disorder, but attempts to replicate these findings have often failed. For several of these widely studied polymorphisms, meta-analysis has been used to clarify the presence or absence of a true allelic association with the disorder in the presence of ambiguity. In fact, using this approach, some candidate genes, including those that code for the serotonin 2A receptor (HTR2A) and the dopamine D2 (DRD2) and D3 (DRD3) receptors, have already been shown to have a small, but reliable, association with the disorder (Dubertret et al., 1998; Glatt, Faraone, & Tsuang, 2003; Williams et al., 1997). Eventually, other gene variants, including perhaps NRG1, NOS1, and DTNBP1, will be found to be reliably associated with schizophrenia. This may make it possible to create a genetic risk profile that will be predictive of future onsets of schizophrenia, especially in combination with other known risk indicators.
Together, the various abnormal features of children of schizophrenic patients provide a “composite sketch” of the underlying premorbid susceptibility toward schizophrenia. Because the probability of developing schizophrenia among children of one or two affected parents (12% and 46%, respectively) is far greater than that probability among children of control subjects (1%), these abnormalities signal the subsequent development of schizophrenia with a relatively high degree of sensitivity and reliability. However, it is also clear that these trends are not absolute, and many children of schizophrenic patients will not exhibit these signs, nor will they ever develop schizophrenia.
Do Early Intervention and Prevention Efforts Work?
It has been recognized for some time that the duration of untreated illness in schizophrenia is correlated with the prognosis for the disease, such that those with the longest period of untreated psychosis experience the least favorable outcomes (Browne et al., 2000). It has also been discovered that outcome correlates with the duration of illness as measured from the onset of the prodrome rather than only from the onset of frank psychosis. From this line of evidence, the rationale for early intervention efforts was born. It was reasoned that if early treatment of the illness led to a more favorable outcome, early intervention even before the onset of the illness might further inhibit the progression of the illness, either delaying its onset, decreasing its severity, or both.
A fundamental question in designing early intervention protocols is, “What will be the target of the intervention?” There is no single best answer to this question, which may be why various targets are being used in current early intervention efforts. The earliest interventions might realize the greatest opportunities to divert high-risk individuals from the subsequent development of schizophrenia, but the ability to predict schizophrenia accurately might be greatest in the period closest to disease onset. For example, targeting attention problems in young children of schizophrenic parents might allow the identification of the children who are at highest risk of transitioning to psychosis and afford ample time to intervene in that process. Yet because of the restricted sensitivity and specificity of this deficit, such targeting would exclude some high-risk children from the protocol while, inevitably, including some children who would never go on to develop the illness. On the other hand, targeting the changes of the prodrome, such as the emergence of odd behaviors or increased suspiciousness, might lower false-positive and false-negative classification errors, but the ability of the intervention protocol to influence the course of the illness might be relatively (p. 179) restricted compared to earlier interventions. Thus, a balance must be maintained between the potential effectiveness of the intervention and the specificity of the intervention to the target population.
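The two classification errors described above can be illustrated with a hypothetical worked example. The cohort size and screen characteristics below are assumptions chosen for arithmetic only; the 12% figure echoes the approximate risk cited earlier for children with one affected parent.

```python
# Hypothetical worked example of the tradeoff described above: in a cohort of
# 1,000 genetically high-risk children (assuming a 12% eventual illness rate),
# a screen for attention problems with assumed 70% sensitivity and 80%
# specificity still misclassifies many children in both directions.

cohort = 1000
base_rate = 0.12      # assumed risk among children with one affected parent
sensitivity = 0.70    # hypothetical screen characteristics
specificity = 0.80

will_develop = cohort * base_rate        # ~120 children
will_not = cohort - will_develop         # ~880 children

# False negatives: at-risk children the protocol would exclude.
missed = will_develop * (1 - sensitivity)
# False positives: children included who would never develop the illness.
included_never_ill = will_not * (1 - specificity)

print(f"High-risk children excluded from the protocol: {missed:.0f}")            # 36
print(f"Included children who never develop the illness: {included_never_ill:.0f}")  # 176
```

Even with these fairly generous assumed screen characteristics, the included group is dominated by children who would never have become ill, which is the cost of intervening early rather than waiting for more predictive prodromal signs.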
Another key question in developing early intervention protocols is, “At what level should the intervention be administered?” Again, this is a question without a simple answer. Universal and selective interventions will have the greatest likelihood of reaching those individuals most in need of intervention—that is, they will have the greatest sensitivity. However, these may also be too expensive to implement successfully. Indicated interventions will be more feasible due simply to their more restricted nature, but this will prevent such protocols from reaching some individuals who may benefit from them. In fact, interventions administered at multiple levels may work better than protocols designed to intervene only at a single level.
Perhaps the area of least consensus in the design of early intervention trials is the form of the intervention. The effectiveness of various early intervention programs is currently an active area of research and, fortunately, multiple types of interventions have shown promise for keeping at least some high-risk individuals from developing schizophrenia. In fact, educational programs, as well as psychosocial and psychotherapeutic interventions, have all shown some degree of promise in either reducing the duration of untreated psychosis or postponing the onset of schizophrenia, suggesting that these methods may also be useful in decreasing the likelihood of schizophrenic illness altogether. In Norway, for example, the establishment of a comprehensive, multilevel, multitarget psychosis education and early detection network reduced the average duration of untreated psychosis in the catchment area by approximately 75% over a 5-year period (Johannessen et al., 2001).
The preventive effects of various psychotherapeutic techniques, such as individual cognitive-behavioral therapy or family-based cognitive remediation, have yet to be evaluated with great rigor, but pharmacological intervention has received a fair amount of empirical support for efficacy in preventing or delaying the transition from prodrome to psychosis. A variety of psychopharmacological compounds may have efficacy in suppressing schizophrenia, including second-generation antipsychotic drugs like risperidone, antidepressants such as the selective serotonin reuptake inhibitors, mood stabilizers such as lithium and valproate, and anxiolytics such as benzodiazepines, but few of these have so far been tested for such a role. Of these, the novel antipsychotic risperidone has shown tremendous promise in preventing the descent into schizophrenia among prodromal individuals when compared with needs-based therapy alone, even up to 6 months after discontinuation of treatment (McGorry & Killackey, 2002). Of note, risperidone has also been shown to improve neuropsychological functioning among the nonpsychotic, nonprodromal schizotaxic relatives of schizophrenic patients (Tsuang et al., 2002).
In light of these successes, it is not so troubling that consensus is difficult to reach on which form of intervention is the most appropriate; it seems that the method of intervention is not quite as important as the fact that any intervention is better than none. There are, however, a number of problems with current early intervention efforts. For example, because our screening criteria cannot definitively identify individuals who are at risk for developing psychosis, early intervention efforts are sometimes administered to individuals who do not need them or cannot benefit from them. Alternatively, because the warning signs of psychotic decompensation sometimes go unrecognized, some individuals who should have received intervention do not. Furthermore, little is known about the potential harm that may be caused by informing individuals that they are at risk for schizophrenia; presumably there may be some negative consequences of receiving this knowledge. In addition, the benefits of some of our most promising early intervention and prevention protocols (pharmacotherapies) may be offset by the potential side effects of individual compounds.
(p. 180) A careful analysis of the benefits and the risks of early intervention has led to the general consensus that intervention in the prodrome of schizophrenia is warranted. There is less agreement about the feasibility of selective and indicated intervention in the premorbid phase of schizotaxic individuals, who may or may not ultimately develop a schizophrenia-spectrum illness. Studies have shown that pharmacological intervention can improve the subclinical deficits experienced by some genetically at-risk individuals who have not developed schizophrenia; however, at such an early stage of research and with a limited understanding of schizotaxia, it is not yet clear if these benefits outweigh their associated risks when the selection of proper candidates for intervention may still be suboptimal. As the phenomenology and time course of schizotaxia become better understood, criteria for inclusion in preventive and early intervention efforts will improve, along with the efficiency of such protocols in treating only those individuals who will receive maximal benefit while sustaining little harm. A comprehensive summary of the early intervention literature is provided by Srihari and Shan (2012).