It was December 2011. Unbeknownst to their recipients, seemingly innocuous emails began showing up in the accounts of U.S. oil and gas pipeline controls engineers and technicians. These were not general spam; they targeted specific individuals across the country.
Long before cybersecurity, malware, and ransomware became buzzwords, these emails carried a hidden software package called a Trojan. Just like the soldiers hidden in the Trojan horse of Greek legend, it was poised to attack as soon as it was turned loose, in this case by the simple click of a mouse.
This was well before anti-virus software was everywhere, and long before even the basics of cyber-hygiene, which might have provided a defense, were in common use. Programmed to look around the machines and networks on which they were installed, the intruders went to work, infecting operating systems and networks and reporting confidential information back to their creators, entirely unknown to the machines' owners.
These background-running viruses were performing what is called command-and-control (C2), the most common tactic employed by cybercriminals today. C2 has continued to evolve into a disturbingly ingenious set of products available online to those who know where to look. C2 tactics enable cyber-criminals to explore a victim's system at their leisure. They have complete control of the victim's machine, exfiltrating data and information. They can export files, read keystrokes, and steal passwords. The victims do not know the attackers are there, and the attackers' systems log everything.
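One telltale sign defenders look for is that C2 implants typically "beacon," checking in with their operator at machine-regular intervals, while human-driven traffic is far more irregular. The sketch below is a simplified illustration of that heuristic, not a tool from the incidents described here; the timestamps and threshold are hypothetical.

```python
from statistics import mean, pstdev

def beacon_score(timestamps):
    """Coefficient of variation of the gaps between outbound connections.

    Values near 0 mean clockwork-regular traffic, a common sign of a
    C2 implant checking in; human-driven traffic scores much higher.
    """
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    if len(gaps) < 2 or mean(gaps) == 0:
        return None  # not enough data to judge
    return pstdev(gaps) / mean(gaps)

# Hypothetical connection times (seconds) from two hosts:
implant = [0, 60, 120, 181, 240, 299, 360]   # checks in every ~60 s
human = [0, 5, 9, 140, 151, 600, 1900]       # irregular browsing

print(beacon_score(implant))  # near 0: suspiciously regular
print(beacon_score(human))    # well above 1: looks human
```

Real network-monitoring products apply far more sophisticated versions of this idea, but the principle is the same: regularity in outbound traffic is worth investigating.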
Soon, the pipeline operator victims of the 2011 attack began to notice strange things. They knew something was going on with their computers and sought answers. In early 2012, the Department of Homeland Security's Industrial Control Systems Cyber Emergency Response Team (ICS-CERT) began receiving calls from pipeline operators and even law enforcement.
In a recent interview, Marty Edwards, director of ICS CERT, said, “In 2012 through a variety of reporting, ICS-CERT became aware of a series of incidents targeting the oil and natural gas pipeline sector. ICS-CERT in turn analyzed what information had been provided to us, and created a series of actionable information alerts – in order to inform industry in all sectors of the need to be vigilant and check for intrusion activity. As a result, many different companies reached out to ICS-CERT with similar intrusion activity – which underscores the importance of industry reporting to ICS-CERT…”
A disturbing aspect of this series of cyber-incursions was the extent to which it was targeted. The emails were not simply blasted to the entire mailing lists of these pipeline companies; they were sent to very specific individuals at multiple companies. That requires reconnaissance. That means they know who you are. That means organized espionage. That means big money. These were not amateurs, and they were not teenagers fooling around in a basement on a Saturday afternoon seeing what kind of trouble they could create.
And they cashed in. By the time ICS-CERT arrived on the scene, the attackers had been in the systems for months. They had exfiltrated the important information they wanted. They had complete information on the pipelines' control and SCADA systems. They had operating system information, program files, design and operations data. They had it all. And they got away with it!
With this history, and the cyber-activity that followed, in mind, we can understand why, in November 2014, National Security Agency (NSA) Director Admiral Michael Rogers told the House Intelligence Committee, “I fully expect that during my time as commander, we are going to be tasked with defending critical infrastructure within the U.S.… It’s only a matter of time of the when, not the if, that we’re going to see something dramatic … I bet it happens before 2025.”
The NSA continuously collects and processes signals intelligence and cryptologic data worldwide. Rogers' declaration was a clear shot across the bow of all critical infrastructure organizations, such as pipelines: the NSA is seeing cyber-espionage and cyber-attacks on control and SCADA systems growing at an alarming rate in the U.S. and worldwide.
We are now hearing about cyber-attacks on the news daily. When we hear about them, we automatically think about bank accounts, credit cards, medical information or any number of other possibilities. But most people do not think about the possibility, or the consequences, of a cyber-attack on an operating control or SCADA system within their own organizations. Just as importantly, they do not know what such an attack could mean.
This should immediately raise some important questions within any pipeline organization. Such as:
- Is my organization exposed to a control system cyber-attack?
- What could a cyber-attacker do in my organization?
- Who or what assets are at risk?
- Could people be hurt or killed?
- Could we be held for ransom and how much would it cost the company?
- Could they shut down my operations?
- Can they steal confidential company information?
- Can they interfere with important safety systems on my pipeline?
- Can a cyber-attacker prevent me from shutting down or shutting in my pipeline in an emergency?
- Is there anything that I can do to stop or prevent it?
OT vs. IT
What is the difference between cybersecurity for a control system and security for my office or enterprise system? In cybersecurity terms, it is known as operations technology (OT) vs. information technology (IT). OT is the technology (hardware, software, operating systems, networks, controllers and SCADA systems) employed to facilitate operational control of any operating company, most notably those systems recognized in the U.S. as critical infrastructure.
On the other hand, IT is the more traditional computer hardware, servers, software, systems and networks used in the typical office environment. On most occasions, these are the kinds of systems that we think of when we read about a cyber-incident or attack.
There are key differences between OT and IT systems. The risks and exposures are different, and so are the effects if a cyber-incursion occurs. To begin any discussion of cybersecurity, one needs to understand one fundamental of cybersecurity science: the cybersecurity triad, known as CIA (not to be confused with the agency).
CIA stands for confidentiality, integrity and availability. These three terms are used because in a cyber-attack, one or more of these aspects of a system is attacked and compromised.
“Confidentiality” refers to keeping private information private. This might be your bank account or credit card information. It could be a member’s personal information for an insurance company. It could be private financial information about your company. It might be information about family members for executives in a large corporation. It also could be detailed technical information of an equipment manufacturer about the next generation of technology being released in the upcoming year. In all these cases, knowledge has value. And in these cases, information confidentiality is the most important aspect of the security of the system.
“Integrity” means that you can believe the information being presented or recorded on the system and, as importantly, that the system will respond as needed when asked to perform a specific task. Integrity is one of the subtler players in the cybersecurity triad: it can be compromised easily, and without the knowledge of the data owner. It is, ultimately, a matter of trust.
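One standard way to make tampering detectable is to attach a keyed message authentication code (HMAC) to each piece of data: anyone who alters the data without knowing the key invalidates the tag. A minimal sketch, using a hypothetical SCADA-style reading and a made-up shared key:

```python
import hmac
import hashlib

# Hypothetical key shared only by the sender (RTU) and receiver (SCADA host).
SECRET = b"shared-key-known-only-to-both-ends"

def sign(reading: bytes) -> bytes:
    """Tag a reading so any later modification can be detected."""
    return hmac.new(SECRET, reading, hashlib.sha256).digest()

def verify(reading: bytes, tag: bytes) -> bool:
    """Constant-time check that the reading matches its tag."""
    return hmac.compare_digest(sign(reading), tag)

original = b"station-7 pressure=812 psi"
tag = sign(original)

print(verify(original, tag))                       # True: untouched
print(verify(b"station-7 pressure=412 psi", tag))  # False: tampered
```

Note this protects integrity only; it does nothing for confidentiality (the reading is still sent in the clear) or availability, which is exactly why the triad has three legs.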
“Availability” means that the system actually is available to perform as intended, and on time. For operating systems, such as control or SCADA systems, availability is the most important attribute. For example, if an operator goes to the HMI or SCADA system for a pump or compressor station 500 miles away, he must be confident the system will actually do what he commands from the screen. So attacks on availability are ultimately of highest concern to pipeline operators.
This difference in system performance requirements affects every aspect of the cybersecurity approach. Different threat vectors and risks mean different tools for defending against attack. The differences in the systems themselves, in equipment, networks, protocols and the risks associated with each, mean a much different security plan for OT than for IT systems.
As technology has developed over the decades, control systems have in many respects benefited. Over time, simple pneumatic and relay controls were replaced with electronic controllers. In the ’70s, microprocessor-based controllers emerged with more complex capabilities. Powerful I/O systems developed, allowing operators to monitor and control thousands of devices on a single system.
For pipeline operators, this complexity was increased by the vast scale of the systems. Unlike most control systems that are focused on a single plant or facility, pipeline operations were scattered across hundreds or even thousands of miles. Having a central control room necessarily meant that they had networks spread across huge areas.
Early on, many of the controller manufacturers had relatively simple, proprietary control networks. These networks allowed operations to monitor activities across large areas of pipeline operations. They monitored I/O, displayed data on control room screens, logged information on simple trend systems and could control pipeline operations. Because these systems were all proprietary, few outside people actually knew enough about the systems to even try to invade or attack them.
Early in network design, two key network types began to emerge. In the U.S., the overwhelming majority of peer-to-peer communications were achieved using a protocol called Modbus RTU, a serial communications network largely running over RS-485. In Europe and globally, Profibus was the standard, as Siemens was by far the biggest player in those markets.
In many respects, the ’90s changed things. Microsoft was improving its core operating systems and control system developers saw this as an opportunity to standardize designs around off-the-shelf technologies. Essentially, this meant not having to re-invent the wheel for all aspects of their systems. A key part of this system standardization emerged around networks, and Ethernet began to grow into the network of choice.
With the various Windows operating systems and Ethernet protocols being employed for both IT and OT systems, many aspects of the cyber-criminal’s life were greatly simplified. As a result of this standardization, and with the growing use of advanced control systems (PLCs, DCSs, RTUs, and SISs), the wide dissemination of system information created a large community of people with the skills to understand these systems. Standardized operating systems and network types created a situation perfectly suited to exploitation by cyber-attackers.
One of the biggest tools developed early on for hackers was Shodan, which is still available. Shodan is a search engine that lets users search for internet-facing internet protocol (IP) addresses. Think of an IP address as your computer’s street address on the web. Shodan enables anyone to search for web-facing control systems, returning the IP address of each device it finds, along with a description.
Then there is the Darknet. Think of this as almost the Google of cyber-criminal activity. Essentially an online mall, the Darknet provides hackers and cyber-criminals with a smorgasbord of attack tools for all kinds of systems, networks and services. Unfortunately, over time, these tools have included those needed for control and SCADA system infiltration, data exfiltration, C2 and ransomware.
On the graph, it’s easy to see why control and SCADA system hacking has become easier and more diversified, even as control systems have become more complex. With these tools and services commonly available, cyber-attackers have never had an easier time. In fact, the tools are so powerful and easy to use that attackers often need to know nothing about your system to use them. They can buy what they need off the web or, if they are not skilled enough to do it themselves, hire someone to do it for them.
Threats to Pipelines
Due to the broad geographical locations of pipeline and utility company assets, these organizations are faced with the daunting challenge of securing their assets in some of the most remote and diverse areas in the world. With assets scattered from frozen tundra to the deep desert, subsea, offshore, mountains, jungles, and cities, the continuously changing terrain and geography of pipelines provides endless attack vectors and ingress opportunities for cyber-criminals.
These vectors, combined with the ever-increasing capabilities of the attackers, means ever-greater vigilance is required. Common attack types of concern to pipeline operations include:
- Trojanized software, the most common being Havex, BlackEnergy 1, 2 and 3, and Dragonfly
- Email phishing (about 91% of all infections begin this way)
- Malware as a service (MaaS) – look this one up – it’s scary
- Ransomware – What if your control system were held ransom until you paid up?
C2 seems to be the most prevalent method of system infection. Through C2, the bad guys can have access to your system for many months before you even know they are there, if you ever know. Recent studies found C2 in place for 11-22 months before being discovered.
Author: Randy Kirkendoll, PE, GSEC, is a registered electrical engineer and certified cybersecurity analyst with 40 years’ experience (20 in pipeline controls) in industrial control systems engineering, programming, and operations. He is the chief operating officer of Redi Technologies.