NetworkTigers discusses firewall evolution and advances in technology.
Since the advent of the internet, firewalls have been a cybersecurity cornerstone and a network’s first line of defense. Initially basic devices that applied simple rules to filter internet traffic, today’s firewalls perform many security functions and even employ machine learning to make security recommendations based on data they collect from your network. Let’s take a look at how firewall technology has evolved.
First, what does a firewall do?
Firewall evolution has made great strides, but firewalls essentially perform the same task: filtering network traffic to prevent unauthorized access to systems and applications. A properly configured firewall not only keeps the “bad guys” from walking through the front door of a network but also prevents employees or other legitimate users from accidentally poking into areas they should stay away from.
Timeline: the creation and evolution of the firewall
1988: packet filtering systems
The first firewalls were created in 1988 by Digital Equipment Corporation (DEC). According to Palo Alto Networks, these early firewalls were “packet-filtering systems that inspected the information in the packets by looking at the destination address, its protocol, and the port number used. If the traffic did not match the packet filter’s rules, the firewall would take action by dropping the packet without a response or rejecting the packet with a notification to the sender.”
These primitive firewalls are referred to as Stateless Firewalls, as they did not examine the state of the packet in question. Firewall evolution was about to take a big step.
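The stateless rule-matching described above can be sketched in a few lines. This is an illustrative toy, not DEC's implementation; the rule fields mirror the checks the quote mentions (destination address, protocol, port), and the "drop" vs. "reject" actions mirror the two responses it describes. All names here are invented for illustration.

```python
# Minimal sketch of a stateless packet filter: each packet is judged on
# its own header fields alone, with no memory of prior packets.
from dataclasses import dataclass

@dataclass
class Packet:
    dst_addr: str
    protocol: str   # e.g. "tcp" or "udp"
    dst_port: int

@dataclass
class Rule:
    dst_addr: str
    protocol: str
    dst_port: int
    action: str     # "accept", "drop" (silent) or "reject" (notify sender)

def filter_packet(packet, rules, default="drop"):
    """Return the action of the first matching rule, else the default."""
    for rule in rules:
        if (rule.dst_addr == packet.dst_addr
                and rule.protocol == packet.protocol
                and rule.dst_port == packet.dst_port):
            return rule.action
    return default

rules = [
    Rule("10.0.0.5", "tcp", 80, "accept"),   # allow web traffic to this host
    Rule("10.0.0.5", "tcp", 23, "reject"),   # refuse telnet, tell the sender
]

print(filter_packet(Packet("10.0.0.5", "tcp", 80), rules))  # accept
print(filter_packet(Packet("10.0.0.5", "tcp", 23), rules))  # reject
print(filter_packet(Packet("10.0.0.9", "udp", 53), rules))  # drop (no match)
```

Note that nothing in `filter_packet` records whether a packet belongs to an existing conversation, which is exactly the limitation the next generation addressed.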
1989: stateful firewalls
As needs progressed and applications advanced, more robust security measures were needed.
To meet this demand, AT&T Bell Labs created stateful firewalls, also known as Circuit Level Gateways. These firewalls could record all connections and data related to active sessions and connection states.
A downside to these firewalls is their vulnerability to Denial of Service attacks, as they could be easily overwhelmed with junk connections.
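The idea of session tracking, and why junk connections could overwhelm it, can be sketched as follows. This is an invented illustration, not Bell Labs' design: the firewall keeps a bounded table of active sessions so that return traffic is recognized, and a flood of new connections fills the table and locks out legitimate sessions.

```python
# Minimal sketch of stateful inspection with a bounded connection table.
class StatefulFirewall:
    def __init__(self, max_sessions=100):
        self.table = set()              # active (client, server, port) sessions
        self.max_sessions = max_sessions

    def open_session(self, client, server, port):
        """Admit a new outbound connection if the table has room."""
        if len(self.table) >= self.max_sessions:
            return "drop"               # table full: this is the DoS weakness
        self.table.add((client, server, port))
        return "accept"

    def inspect_reply(self, server, client, port):
        """Allow inbound traffic only if it matches a recorded session."""
        if (client, server, port) in self.table:
            return "accept"
        return "drop"

fw = StatefulFirewall(max_sessions=2)
print(fw.open_session("192.168.1.2", "10.0.0.5", 443))   # accept
print(fw.inspect_reply("10.0.0.5", "192.168.1.2", 443))  # accept (known session)
print(fw.inspect_reply("10.0.0.9", "192.168.1.2", 443))  # drop (no session)
print(fw.open_session("192.168.1.3", "10.0.0.5", 443))   # accept (table now full)
print(fw.open_session("192.168.1.4", "10.0.0.5", 443))   # drop (overwhelmed)
```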
1991: application layer firewalls
DEC reclaimed the spotlight to usher in the next generation of technology in the form of the application layer firewall. Called a DEC SEAL (Secure External Access Link), this new firewall was able to examine the data moving to and from all running software. It was created specifically to protect computers from malware.
1994: the firewall toolkit (FWTK)
Security was further boosted in the mid-90s when the Firewall Toolkit (FWTK) was developed by Marcus Ranum, Wei Xu and Peter Churchyard. This application firewall would become the foundation upon which one of the first commercial firewall products, Trusted Information Systems’ Gauntlet, would be built.
Able to identify the legitimacy of File Transfer Protocol (FTP) and Hypertext Transfer Protocol (HTTP) traffic, this new technology could better sort malicious connection attempts from real ones, making it harder for threat actors to achieve successful Denial of Service attacks.
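A simplistic way to picture this application-layer checking: before forwarding a connection, a proxy-style firewall can verify that the client actually speaks well-formed HTTP. The sketch below is invented for illustration and is nothing like the real FWTK code; it only shows the kind of protocol-legitimacy test involved.

```python
# Toy application-layer check: does this opening line look like a valid
# HTTP request? A connection that fails the check is not forwarded.
VALID_METHODS = {"GET", "POST", "HEAD", "PUT", "DELETE"}

def looks_like_http(request_line: str) -> bool:
    parts = request_line.strip().split(" ")
    if len(parts) != 3:
        return False
    method, target, version = parts
    return (method in VALID_METHODS
            and target.startswith("/")
            and version.startswith("HTTP/"))

print(looks_like_http("GET /index.html HTTP/1.1"))  # True
print(looks_like_http("\x00\x01 random junk"))      # False
```

Unlike a packet filter, this test operates on application data, so garbage connections opened purely to exhaust resources are easier to spot and discard.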
2004: UTM firewalls
The International Data Corporation (IDC) created the term Unified Threat Management (UTM), which was then applied to firewalls that serve as cybersecurity Swiss Army knives. UTM firewalls provide system defenses in the form of web filtering, gateway antivirus protection, intrusion protection systems, VPNs and more. These firewalls still act as traffic filters but offer comprehensive tools for real-time network security and monitoring.
Late 2000s: next-generation firewalls (NGFWs)
Building further on UTM technology, the Next-Generation Firewall (NGFW) was developed. NGFWs combine the previously developed features with tools that include Deep Packet Inspection (DPI), sandboxing, application control, URL filtering, network profiling and more.
NGFWs allow for the support of secure, encrypted traffic to protect data from unauthorized viewing. They provide administrators with deep, granular visibility into applications and user activity and they can identify evasive maneuvers used by threat actors to sneak into networks.
NGFWs also allow for flexibility, as they are offered in both physical and virtual options.
2020: ML-Powered NGFWs
The early 2020s saw a significant firewall evolution in the form of ML-Powered NGFWs. These firewalls use machine learning to predict threats and deliver improved network protection.
Until this point, firewalls had been reactive tools that required manual updates and maintenance. They were integral security components but did not take a lead role in security, acting exclusively as static fortifications.
The integration of machine learning, however, turns tradition on its head. ML-Powered NGFWs can identify modern threats and even their variants. They can flag network behavior abnormalities and, because they scan and analyze so much network telemetry, they can make security recommendations tailored to specific systems. ML-Powered NGFWs perform these tasks continually and report in real time, even protecting users from zero-day exploits.
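The "flagging abnormalities in telemetry" idea can be illustrated with a deliberately simple baseline model. Real ML-powered NGFWs use far richer models and features; the data, threshold and function below are invented, and the point is only the shape of the technique: learn what normal looks like, then flag what falls far outside it.

```python
# Toy anomaly flagging over network telemetry (connections per minute):
# learn a baseline, then flag observations far from the baseline mean.
import statistics

def flag_anomalies(baseline, observed, threshold=3.0):
    """Flag observations more than `threshold` standard deviations
    from the mean of the baseline measurements."""
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline)
    return [x for x in observed if abs(x - mean) > threshold * stdev]

baseline = [98, 102, 100, 99, 101, 100, 97, 103]  # normal traffic levels
print(flag_anomalies(baseline, [101, 100, 950]))  # [950]
```

Because the model is derived from the network's own history rather than a fixed rule list, the same code flags different things on different networks, which is the "recommendations tailored to specific systems" property described above.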
Whereas older firewalls acted as stationary turrets, machine learning has given firewall technology the ability to patrol and present complete visibility into network devices, activity and trends.
The advent of the cloud has also had a dramatic effect on firewall technology and usage. Users no longer have to own or purchase equipment and applications that require space, monitoring and maintenance, as cloud-based firewalls exist offsite and are serviced by a third-party administrator. While moving to the cloud is often undertaken to streamline processes or save money, traditional software and hardware-based firewalls still have a leg up in some situations.
Cloud firewall pros and cons
Cloud firewalls can be quickly deployed, don’t require owning any hardware and are automatically updated and maintained. Cloud firewall users also don’t need to hire IT administrators to keep their systems streamlined and functional.
However, cloud firewall users are at the mercy of third-party providers, which may be unsettling. Additionally, cloud firewalls aren’t as customizable as in-house options and can potentially slow your network down.
Traditional firewall pros and cons
Software and hardware-based firewalls give users complete control over their security. They can be customized to suit an organization’s specific needs and allow for independence, as a third-party company is not involved in deployment. In many cases, in-house firewalls are also faster than those running in the cloud.
While the cost of purchasing hardware can be a deterrent, moving to the cloud can be more expensive once subscription fees are totaled up. Additionally, organizations can save a great deal of money by investing in refurbished firewalls purchased from reputable dealers. Traditional firewalls require maintenance, however, so an IT administrator must be on hand.
The future of firewall evolution
While the tech sector has seen devices and applications come and go, firewall progress is more relevant than ever in today’s cyber environment.
The introduction of ML-Powered NGFWs is likely the tip of the iceberg compared to future firewalls. Advances in AI will make them smarter, stealthier and better at blocking threats. As machine learning becomes deeper and more agile, tomorrow’s firewalls may more accurately predict threats before they materialize and continue to give hackers a run for their money.