No industry's requirements push the boundaries of networking like those of research and education (R&E).
Just think about the Large Hadron Collider (LHC): a machine which, as the name suggests, collides particles – at speeds approaching that of light – and is best known for its efforts to recreate the moments following the Big Bang. This machine is currently the largest science project on Earth and is driving particle physics research at hundreds of universities around the globe. Although the LHC is located in Europe, the collaborating scientists are spread worldwide.
These boundaries, however, can only be pushed because the data networks that support these projects are evolving at similarly explosive rates; in reality, that evolution is born of necessity. According to the European Organisation for Nuclear Research (CERN), which is running the project, the LHC will produce around 15 million gigabytes of data annually – or enough data to fill more than 1.7 million dual layer DVDs per year.
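CERN's figure can be sanity-checked with a quick back-of-the-envelope calculation (a sketch only, assuming decimal gigabytes and the nominal 8.5GB capacity of a dual-layer DVD):

```python
# Sanity check of CERN's annual data-volume figure (decimal units assumed).
annual_data_gb = 15_000_000          # ~15 million gigabytes per year
dvd_capacity_gb = 8.5                # nominal dual-layer DVD capacity

dvds_per_year = annual_data_gb / dvd_capacity_gb
print(f"{dvds_per_year:,.0f} dual-layer DVDs per year")  # ~1.76 million
```

The result lands comfortably above the 1.7 million DVDs quoted in the article.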
In short, the sheer magnitude of data created by a project this large means dedicated networks need to be built to support it, and often those networks are far, far in advance of anything in use by businesses or consumers. In fact, R&E facilities such as the LHC are often the cradle for networks that will become the norm five or 10 years later – the testing grounds for future networks.
Because their traffic flows and bandwidth requirements differ markedly from those of commercial networks, R&E networks are often early adopters of cutting-edge technology. For example, these networks are likely to be the first to test and deploy new transmission and switching technologies, often before those technologies have been evaluated in commercial labs.
In the past, this has included leaps in optical transmission speeds – from 1 gigabit per second (1G) through 40G, and now to today’s increasingly popular 100G networks. Super high-speed broadband networks that rely on optical transmission were the mainstay of the R&E community long before projects like the National Broadband Network (NBN) were even conceived.
But why is this need for speed and cutting-edge technology so critical for R&E environments?
A single experiment in a research institution can easily consume all the available bandwidth of a 100G connection. Fields such as astronomy and particle physics produce multiple terabytes of data in the form of multimedia, documentary and other content. Emerging ultra-high-definition video requires nearly 80Gbps when transmitted in uncompressed broadcast formats. This content must then be transported within the institution and between collaborating partners, both local and global. No longer is research success measured by the work of a single scientist, institution or country. These networks allow for an unprecedented level of global collaboration that has the potential to change the human condition everywhere.
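The ~80Gbps figure is plausible for uncompressed video: bitrate is simply pixels per frame, times frames per second, times bits per pixel. One parameter set that lands near that number – an assumption for illustration, not a broadcast specification – is 8K resolution at 120 frames per second with 10-bit 4:2:2 chroma subsampling (20 bits per pixel):

```python
# Uncompressed video bitrate, using one plausible (assumed) parameter set
# that lands near the ~80Gbps figure cited in the text.
width, height = 7680, 4320   # 8K UHD frame
fps = 120                    # high-frame-rate broadcast
bits_per_pixel = 20          # 10-bit samples, 4:2:2 chroma subsampling

bitrate_gbps = width * height * fps * bits_per_pixel / 1e9
print(f"{bitrate_gbps:.1f} Gbps uncompressed")  # ~79.6 Gbps
```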
What does this mean in practice? Imagine a researcher has to move a terabyte of content from one location to another. Shipping the physical media – a hard drive, say – by traditional global transportation could take days, and in some cases weeks, by which time the data could be stale, dated or invalid. Over a 100Gbps high-speed broadband network, the same data could arrive in as little as 81 seconds.
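The arithmetic behind that claim is straightforward (a sketch assuming decimal units and a link running at pure line rate; real transfers add a little protocol overhead, which is roughly the gap between 80 and the quoted 81 seconds):

```python
# Line-rate transfer time for 1 terabyte over a 100Gbps link
# (decimal units, protocol overhead ignored).
data_bits = 1e12 * 8         # 1 terabyte expressed in bits
link_bps = 100e9             # 100 gigabits per second

seconds = data_bits / link_bps
print(f"{seconds:.0f} seconds")  # 80 seconds at pure line rate
```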
This type of high-speed transport is crucial to the research being undertaken over Janet, the national R&E network for the United Kingdom. Janet serves over 18 million users and connects UK universities, Further Education (FE) Colleges, Research Councils, Specialist Colleges and Adult and Community Learning providers, as well as the national school network. It also provides international connectivity so students and researchers have access to worldwide scholastic resources.
Janet is also part of the collaborative project to explore how particles acquire mass. The hunt for the Higgs boson, or “God particle” – which theoretically gives everything in the universe its mass, and which some experts believe has recently been discovered by Large Hadron Collider researchers – is enabled by the network. Data is transferred from CERN in Switzerland across Janet to the tier 1 data analysis facility run by the Science and Technology Facilities Council (STFC). From STFC it is then transferred to universities across the UK for further analysis.
Enter Janet6, the next generation of the network. This new incarnation will take advantage of the latest optical transmission technology, known as “coherent optical transport”, to deploy a 100G backbone able to support the exponential growth in bandwidth needed by applications like those discussed here.
A 100G network is, at present, the benchmark, yet this network is scalable to 400G and beyond. It is a dedicated network on par with, or surpassing, many national telecommunications networks in sophistication and speed, capable of transporting massive amounts of data over long distances to multiple recipients economically and efficiently.
However, the development of next-generation networks for R&E environments is about more than just capacity. It is a transformational technology that enables new types of services and applications not available today – certainly not commercially. And the best examples of these services don’t just exist at the world’s largest hadron-colliding facility in Europe. Some of the best exist right here in Australia.
The Victorian Education and Research Network (VERNet) is a case in point – a purpose-built high-speed fibre optic network connecting Victorian-based universities, research centres and scientists with communities and colleagues throughout Australia and around the world.
The network is a core piece of the infrastructure that supports the University of Melbourne, which in 2011 used the network to connect the Melbourne Brain Centre at Austin Hospital to other key brain centre locations and the University’s internal network and data centres. The Centre, which helps treat brain disorders such as Alzheimer’s and Parkinson’s disease, now enables its staff to collaborate and share critical IT services for the transmission of vital research and medical data.
VERNet also connected the Peter MacCallum Cancer Centre to the University, allowing the Centre to use the University’s storage facilities and giving it access to other major research facilities around the world at high speed.
One of the main advantages of using optical transmission to drive broadband networks is the ability to increase the speed and capacity of these networks by increasing the density of the transmitted information. Using sophisticated electronics, it is possible to double, even quadruple, the amount of information delivered in the same time over the same optical fibre, requiring only a change in electronics on either end of the transmission route. In other words, the same optical network capable of 10G capacity today could potentially carry 40G or 100G traffic tomorrow, simply by changing the transmission technology.
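The scaling described above comes down to simple multiplication: a coherent channel’s raw rate is its symbol rate, times the bits encoded per symbol by the modulation format, times the number of polarisations. The sketch below uses typical assumed values (a ~32 Gbaud symbol rate, dual polarisation) purely for illustration; actual line rates depend on FEC and framing overhead:

```python
# Hedged sketch: how coherent optics scales capacity over the same fibre
# by changing only the modulation format at the endpoints.
def channel_capacity_gbps(baud_rate_gbaud, bits_per_symbol, polarizations=2):
    """Raw line rate of one wavelength, before FEC/framing overhead."""
    return baud_rate_gbaud * bits_per_symbol * polarizations

# Same symbol rate, same fibre, progressively denser modulation:
bpsk  = channel_capacity_gbps(32, 1)   # BPSK:  64 Gbps raw
qpsk  = channel_capacity_gbps(32, 2)   # QPSK:  128 Gbps raw (~100G usable)
qam16 = channel_capacity_gbps(32, 4)   # 16QAM: 256 Gbps raw (~200G usable)
print(bpsk, qpsk, qam16)  # 64 128 256
```

Each step doubles the bits per symbol, and hence the capacity, without touching the fibre itself – which is the economic argument the paragraph above makes.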
These developments are made possible by the constant search for better, faster and more efficient methods of transporting large volumes of R&E information across the globe. Although they are driven by commercial requirements rather than scientific needs, the world’s communications providers will be able to take advantage of these innovations when the time comes, thanks to the boundary-pushing work of research and education networks. One thing is certain: the future of mainstream broadband networks is already here. You just need to know where to look for it.
Anthony McLachlan is the VP Asia-Pacific for networking solutions vendor Ciena