By Gregory Hale
Be humble, develop a hard shell and own the problem.
Those are just some of the lessons learned after the SolarWinds attack that came to light December 12, 2020. More than 18,000 SolarWinds customers installed malicious updates delivered in three versions of its Orion monitoring and management software, with the malware spreading undetected. Through this code, the attackers gained access to the information technology systems of SolarWinds customers, which they could then use to install additional malware to spy on other companies and organizations.
“Early Saturday morning I got a call from our CEO saying FireEye just called and said we had shipped tainted software. With the information they came with, there was no question it was real,” said Tim Brown, CISO at SolarWinds, during his talk at the S4X22 cybersecurity conference in Miami South Beach last week. “This type of thing was so far beyond normal that it was just not what you expected.”
Brown offered a glimpse behind the scenes at managing a major incident in the first hours, days, weeks, and months that followed, drawing on his experience to show what other security professionals could expect, how they could prepare, and how they could organize a response.
Year Before Attack
“The whole process was interesting to say the least,” he said. “The threat actor made it into our environment starting in September 2019. They came and did a test run in November. That test run did nothing. They came back in March with 2,500 lines of code. The code stayed the same from March to June.”
In the first days after the incident, Brown said they learned as much as they could as fast as they could and were ready to communicate.
“Since we were a public company, we had to produce an 8-K, which would give as much information as we knew at the time,” he said. “Sunday we were all in the office, and by 2:30 a.m. Monday morning we were able to issue our 8-K. What did we know at that point in time? We knew the threat actor was mission-centric and very focused on what they were doing. We had information on the threat actor from FireEye and Mandiant at the time. We knew the code was pretty novel, we knew the code was well written, we knew the code size. We knew the insertion point.” They also knew the attackers had conducted a well-crafted, sophisticated campaign that compromised the build environment and then went after specific, targeted customers.
There were other details they were able to place in the document. There were still plenty of unknowns, but at least they were ready to be as transparent as possible to keep the masses informed.
Contacted the Feds
Brown also said the day after the attack they were in contact with the FBI and the Cybersecurity and Infrastructure Security Agency (CISA). They also retained several third parties to help with the investigation. One thing they were able to understand quickly, he said, was that this highly sophisticated attack was not in the source code but somewhere else in the supply chain.
Through a quick investigation, the company understood three builds of the software produced between March 2020 and June 2020 suffered from the attack.
They also had a good idea on the number of potentially affected customers, but as it turned out, it would be months before the final number was understood, Brown said.
That was the immediate triage, then came the first weeks after the attack.
“We organized our war room into multiple teams with independent leaders, who met every night,” Brown said.
There were folks from engineering, IT, business escalation and customer outreach, communications, and law enforcement.
That is when they brought in additional outside help from the Krebs Stamos Group and the KPMG forensics team.
While news of the incident may have died down among the general public after a few days, in the weeks afterward customers still needed to know what was going on.
They constantly communicated what they knew to customers. They also built a security resource center with FAQs and support materials. Not surprisingly, the biggest question from customers was “am I infected?”
Solid Reporting, Inaccurate Reporting
They found during the aftermath there was some solid research conducted along the way, but there was also some inaccurate reporting. He pointed out that highly technical and sophisticated campaigns like this one are difficult for the layperson to understand. This can create inaccurate and incomplete reporting and narratives, which are then amplified and accelerated by social media activity.
“Because we focused on communicating with our customers, employees, and partners over controlling a media narrative, the press reached for anyone they could to get quotes, including people who did not have the information and access they claimed they had,” he said.
Brown then went on to discuss some lessons learned from the first weeks after the attack.
“Initially, you will be outnumbered, out-marketed, out-communicated, no question,” he said.
Keep Your Focus
He added misinformation will be everywhere, but don’t get sidetracked. Instead, focus on the truth and communicating internally and externally.
In addition:
• No matter what you’re going through, remember what others are facing, especially your customers and employees.
• Bring in the correct partners. Add partners when necessary. Understand there are many agendas; many want to be involved and use the situation to their advantage.
• Be ready for alternative forms of communications and reduced efficiency.
• Create escalation models for each focus area.
• Don’t fight the small stuff. If you spend your time responding to every inaccuracy, you take the focus off understanding and supporting your customers.
• It’s OK not to have all the answers. Focus on sharing the facts you know to be true. Allow others to focus on attribution.
• Investigations take time. Be aggressive, but don’t miss things.
• Grow a hard shell. Focus on the job at hand.
In the months afterward, as things calmed down a bit, everyone was able to digest what happened and learn even more lessons. They changed their software build process to a three-way setup in which no single person has access to more than one build environment.
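A minimal sketch of how that kind of cross-check might work is below, assuming reproducible builds that yield bit-identical artifacts from each independent environment. The artifact names and paths are hypothetical; this illustrates the general idea of comparing independently built artifacts, not SolarWinds’s actual pipeline.

import hashlib
import sys
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_builds(artifact_paths: list[Path]) -> bool:
    """Compare the same artifact produced by independent build environments.

    If every pipeline produced a bit-identical artifact, the digests match;
    a mismatch flags a possible compromise of one environment and blocks release.
    """
    digests = {path: sha256_of(path) for path in artifact_paths}
    for path, digest in digests.items():
        print(f"{digest}  {path}")
    return len(set(digests.values())) == 1

if __name__ == "__main__":
    # Hypothetical artifact locations, one per independent build pipeline.
    artifacts = [Path(p) for p in sys.argv[1:]] or [
        Path("build-a/Orion.Core.dll"),
        Path("build-b/Orion.Core.dll"),
        Path("build-c/Orion.Core.dll"),
    ]
    if verify_builds(artifacts):
        print("OK: all build environments agree")
    else:
        print("MISMATCH: investigate before release")
        sys.exit(1)

The point of splitting the build this way is that an attacker would have to compromise multiple, separately controlled environments at once for tampered code to ship unnoticed.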
In time the onslaught lessens and things get better, he said. “It took about six or seven months for that corner to turn.”