History

The CATC

The Canadian Association of Theoretical Chemists/L'Association canadienne des chimistes théoriciens was created on February 10, 1983. Its primary mandate is to promote and develop the practice of theoretical chemistry in Canada, especially by organizing scientific conferences focussed on this area of chemistry.

The impetus to form the Association came in 1980 from a grassroots movement at the business meeting of the seventh Canadian Symposium on Theoretical Chemistry, at which it was recommended by William G. Laidlaw, and agreed to by the almost 40 persons present, that an incorporated body would be beneficial. Over the next few years, led by Russell J. Boyd, by-laws for the corporation were drafted, and letters patent were filed in 1983. The application bore six signatories: André Bandrauk (Université de Sherbrooke), Russell J. Boyd (Dalhousie University), Sigeru Huzinaga (University of Alberta), Raymond Kapral (University of Toronto), William G. Laidlaw (University of Calgary), and Josef Paldus (University of Waterloo). This began a tradition by which Chairs of theoretical chemistry conferences served as Directors of the Association.

As the academic community grew over time, the Association continued to have a unifying influence on the theoretical chemistry community, providing a forum to meet colleagues and decide upon matters of mutual interest. Today the Association has members spanning many different disciplines, including both academia and national laboratories, reflecting the growth and reach of theoretical and computational methods.

Theoretical and Computational Chemistry in Canada

Research in theoretical and computational chemistry started to become established at Canadian universities in the mid-1950s. As a nation, Canada was behind its international counterparts in university-based research with most research carried out in government-operated laboratories. The launching of Sputnik by the Soviet Union in 1957 triggered a major re-examination of science education across North America with major resources invested not only in public school curricula, but also in research funding to universities. Initial faculty appointments in theoretical and computational chemistry at Canadian universities in this period were mainly of foreign-trained professors. In many cases, these faculty members were also involved in experimental work.

As the post-war baby boom generation approached university age in the 1960s, there was a major shift in the Canadian academic scene as many institutions expanded in size and scope. A number of new universities were created in the same period. The number of universities offering doctoral programs in the sciences increased from 2 in 1940 to over 25 in 1970. In the late 1960s, many faculty positions were available, and 35 appointments of theoretical and computational chemists were made, in sharp contrast to the 6 appointments pre-dating 1960 and the 16 appointments between 1960 and 1965.

By the 1970s, fewer faculty positions were available but Canadian computational chemists were finding job opportunities in Canadian industry in a variety of roles. Several pharmaceutical companies engaged computational chemists in research, using modelling to identify compounds with therapeutic potential. Other companies developed educational software and computerized tools for interpretation of spectra and prediction of physical and chemical properties. Because of their highly developed computational and mathematical literacy, Canadian-trained computational chemists are found in a variety of settings including finance and system management.

The NSERC University Research Fellowships program in the 1980s made it possible to retain talented researchers in anticipation of the retirement of faculty members appointed in the 1960s. Many appointments to faculty positions after 1970 were from alumni of Canadian universities. Over the years, theoretical and computational chemistry has maintained a strong presence in both academia and the national laboratories. In 2019, over 45 Canadian universities had faculty active in research in computational and theoretical chemistry.

For a detailed historical account of Theoretical and Computational Chemistry in Canada, please consult:

Boyd, R. J. The Development of Computational Chemistry in Canada, Reviews in Computational Chemistry, Volume 15 (eds K. B. Lipkowitz and D. B. Boyd), John Wiley & Sons (2000) pp 213-299.

Computational Resources

Over the years, as computational resources have become more abundant and powerful, theoretical and computational chemists have harnessed computers to solve ever more complex and detailed problems. However, it was not always this way, and the changes that have taken place even within a single lifetime have been profound.

In the beginning, the feasibility of calculations associated with theoretical chemistry was constrained by the availability of computational resources. In the 1960s, state-of-the-art computers such as those produced by IBM were operated with paper tape and punch cards. By the end of the 1970s, most Canadian universities had mainframe computers, and the computations desired by chemists competed with the entire university for computer time. In 1978, the VAX 11/780 became the first computer on the market that was feasible to acquire as a dedicated machine at the departmental or research group level. Through the 1980s most computational chemists in Canada had access to such dedicated computers.

The 1980s and early 1990s saw research groups purchasing a wide assortment of computer workstations. Products from new companies such as Sun Microsystems and Silicon Graphics began to compete with IBM, Digital, and HP, eventually capturing a significant share of the academic computing market. These systems were typically built around single-core CPUs, often with accelerator cards for floating-point operations, and ran early versions of the UNIX operating system or other proprietary operating systems. This period also saw tremendous growth in user interfaces and connectivity. The X Window System was created, and for the first time the typical user could give up a green or orange CRT terminal (usually a VT100) for a graphics monitor or X terminal. In turn, this led to advances in visualization that underpin the graphical user interfaces of today. The same era saw the birth of the global Internet. Until then, campuses had been networked internally, but connections beyond campus were slow or non-existent, usually handled by routers working over dedicated phone lines. Within the span of a few years, networks within institutions, provinces, and countries were interconnected, and all computers had to adopt a uniform IP address space. Domain names were created, and the first popular web browser, Mosaic, appeared. The GNU Project was also founded.

By the late 1990s, the computational power of personal computers had increased to the point where they were competing with workstations. Researchers soon found they could buy multiple personal computers for the price of one workstation, and the age of parallel computing began to take hold. Groups of personal computers were networked together and given shared filesystems with NFS to form the first computer clusters. Eventually, vendors started designing turnkey computer cluster solutions with specialized formats for rackmounted hardware. Cluster management software became more sophisticated, and by the 2000s clusters with hundreds of networked computers were common. These clusters allowed parallel computation to be used effectively, and this was the start of the gradual migration of codes and algorithms to the parallel computing paradigm. On the software side, the work of the previous decades on open source software had gained enough momentum that many systems were now using open source operating systems, such as Linux, incorporating GNU components and other software. This was made practical thanks to various package managers, and to companies like Red Hat that maintained working distributions of Linux and provided for seamless upgrading.

By the 1990s, it was apparent that Canada needed a strategy for high performance computing infrastructure. A study showed Canada lagging behind the rest of the world in such computing capacity. Funding spurred by the Canada Foundation for Innovation (CFI) led to a rapid expansion of such facilities in Canada. Researchers formed regional computing consortia across the country: WestGrid involved institutions in British Columbia, Alberta, Saskatchewan, and Manitoba; SciNet involved the University of Toronto; HPCVL (High Performance Computing Virtual Laboratory) involved institutions in Eastern Ontario; SHARCNET involved most of the remaining institutions in Ontario; RQCHP (Réseau Québécois de Calcul Haute Performance) involved the Université de Montréal, Concordia University, Bishop's University, and École Polytechnique; CLUMEQ (Consortium Laval UQAM McGill and Eastern Quebec), as the name suggests, involved other institutions in Québec; and ACEnet (Atlantic Computational Excellence Network) involved institutions in New Brunswick, Nova Scotia, Prince Edward Island, and Newfoundland and Labrador. Eventually, the three consortia in Ontario formed Compute Ontario, and the two in Québec became Calcul Québec.

The regional consortia provided a range of high performance computing and visualization infrastructure, meant to serve the broad needs of Canadian computational researchers in all areas of science, applied science, computing, and the arts. This was the beginning of the consolidation of computing infrastructure, reversing the trend of the previous decades toward group-based workstations. Researchers now used local computing infrastructure for code development and testing, but ran production codes and large simulations on the infrastructure managed by the regional consortia.

In 2006 the seven regional consortia joined forces to create Compute/Calcul Canada, a national computing platform that would consolidate computing resources on a national scale and make them accessible to all Canadian researchers with computational needs, regardless of discipline or institution. Thanks to substantial funding from CFI, the provinces, vendors, and institutions for both infrastructure and operations, large scale computing and storage infrastructure was installed and refreshed at consolidated sites across the country. Today, Compute Canada plays a foundational role in supporting the needs of researchers with computational demands, and is a crucial part of the research infrastructure landscape in Canada.