Sir

In his Millennium Essay, Frederick Seitz1 makes a compelling case for the importance of generalists — individuals with broad vision — in all cultural fields, and he laments that their ranks are thinning. I subscribe entirely to Seitz's assessment, yet I disagree with his analysis of what causes this trend and with his suggested remedy: reforming elementary and secondary education.

It has become almost a cliché in the United States and Europe to blame schools for society's increasing cultural ignorance. Yet many schools have struggled for years to maintain a diverse curriculum, despite budgetary constraints and conflicting societal demands.

Even so, a school's ability to affect students' cultural breadth is very limited. Children spend most of their time with family and friends, not at school. Parents need to provide daily evidence that they value diverse cultural pursuits: if they let their children devote an inordinate amount of time to a narrow range of activities, schools cannot be expected to produce people with differing values.

Provided parents have planted the seed of cultural and intellectual curiosity in their children, there is little that colleges and universities can do to kill it, but they can nurture it to a greater or lesser extent. Seitz considers that university teaching staff have virtually no room to manoeuvre against the strong internal and external forces that promote specialization. Chief among these, according to Seitz, are “conditions of intense competition”, which leave scholars little time to cultivate new, diverse interests.

Examples abound, however, of scholars such as Einstein, Jaynes and Mandelbrot who gained prominence and significant competitive advantages in their field precisely because they were able to borrow concepts, tools and methods from disciplines far from their own.

The growing complexity of most fields of research, which Seitz also mentions as an incentive to specialize, has to be put in historical perspective. A century and a half ago, scholars in North America and Europe were expected to be well versed in all branches of knowledge, and to remain so during their whole lives.

Admittedly, most fields of research have become much more complex since then, but the level of specialization of most scholars has increased out of all proportion. I believe that colleges and universities are responsible for this, not because of the pressures on them but because of their approach to learning.

By the second half of the nineteenth century, undergraduate education was in many respects similar to what we have now and, except in a handful of English institutions such as Oxford and Cambridge universities, did little to equip students with the skills necessary to keep learning new disciplines and updating their knowledge. On the other hand, an old tradition of “self-study” in society at large provided them with these skills2. Early graduate programmes in Germany and, for a time, at Johns Hopkins3 emulated this in their system of “self-education under guidance”.

In the early part of the twentieth century, however, universities eradicated self-directed learning from the educational landscape. They successfully promoted the idea that quality learning can occur only in the physical or, more recently, ‘virtual’ presence of a teacher. With rare exceptions4,5, colleges and universities nowadays do not prepare students to keep learning on their own after they leave college. As a result, many graduates (including many who become academics) find that their lack of such skills renders further learning inefficient and slow. The best they can do is to deviate as little as possible from the narrow trajectory on which they were placed during their initial training, or during retraining in the private sector.

To alleviate the resulting cultural and intellectual atrophy, colleges and universities must rethink their attitude towards self-directed learning, and implement imaginative ways to foster it effectively.