As we march toward the third phase of digital transformation, collecting and analyzing data and telemetry from every point in the code-to-customer application path is critical. Businesses rely heavily on machine-learning-driven analytics that surface insights such as consumer usage patterns, inventory levels, and seasonal variations, insights that improve performance, create efficiencies, and increase competitive advantage. At the same time, machine learning is also critical to combating the advanced security threats companies face today and will continue to face in the future.
Yet there is still a security architecture debate critical to realizing the value of machine learning: packet filtering or proxy? This debate was raging almost twenty years ago when I was an intern in the security innovation group in Cisco’s Office of the CTO, and it continues today.
Early on, network packet filtering appeared to win because of a focus on speeds and feeds. Packet-filtering approaches operate on individual packets, which in the past often made them faster than their connection-oriented proxy cousins. Security solutions built on packet filtering evolved into more stateful engines and became ‘application-aware’, placing a greater focus on application and identity. Still, the core value proposition of packet-filtering-based security relied heavily on inspecting individual packets at speed.
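To make the distinction concrete, here is a minimal sketch of the stateless model: each packet is judged in isolation against a rule table, with no memory of the connection it belongs to. The rule fields and packet shape are invented for illustration and do not reflect any vendor’s API.

```python
# Stateless packet filtering sketch: every packet is matched against
# static rules on (source prefix, destination port, protocol). There is
# no connection state, so the filter cannot reason about what happened
# earlier in the same session.

RULES = [
    # (src_prefix, dst_port, protocol) -> action; "0.0.0.0" means "any source"
    (("10.0.0.", 22, "tcp"), "allow"),   # SSH from the internal subnet
    (("0.0.0.0", 23, "tcp"), "deny"),    # telnet from anywhere
]

def filter_packet(packet: dict) -> str:
    """Return 'allow' or 'deny' for a single packet, judged in isolation."""
    for (src_prefix, dst_port, proto), action in RULES:
        src_match = src_prefix == "0.0.0.0" or packet["src"].startswith(src_prefix)
        if src_match and packet["dst_port"] == dst_port and packet["proto"] == proto:
            return action
    return "deny"  # default-deny posture

decision = filter_packet({"src": "10.0.0.5", "dst_port": 22, "proto": "tcp"})
```

The speed advantage follows directly from this design: a rule lookup per packet, with no per-connection bookkeeping.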
Eventually, however, ‘application-fluent’ programmable proxies evolved to provide value that eclipsed the early advantages of packet filtering. That value is two-fold. First, proxies provide visibility into every interaction, from user to application, from the network to the application, and across logical business flows, enabling them to detect advanced attacks. Second, a programmable proxy can inject code, enrich headers, and insert trace data to dynamically instrument clients and applications. In other words, inspection is no longer enough: proxies provide the critical ability to instrument interactions with the breadth and depth of data needed to discover patterns and produce actionable security insights.
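The header-enrichment idea can be sketched in a few lines: as a request transits the proxy, the proxy stamps it with correlation metadata so every downstream hop and the telemetry pipeline can tie the full interaction together. The header names here are hypothetical, not a specific product’s convention.

```python
# Sketch of proxy-side instrumentation: enrich a proxied request's
# headers with a trace ID and a hop marker. Downstream services and the
# telemetry pipeline can then correlate events belonging to one
# end-to-end interaction.

import uuid

def enrich_request(headers: dict) -> dict:
    """Return a copy of the request headers with correlation metadata added."""
    enriched = dict(headers)
    # Propagate an existing trace ID, or mint one at the edge.
    enriched.setdefault("X-Trace-Id", uuid.uuid4().hex)
    # Append this hop to a (hypothetical) hop-trail header.
    enriched["X-Proxy-Hop"] = enriched.get("X-Proxy-Hop", "") + "edge;"
    return enriched

out = enrich_request({"Host": "app.example.com"})
```

Because the proxy sits in-line and terminates the connection, this kind of mutation is cheap and invisible to the client, which is exactly what a packet filter cannot do.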
There is clearly still a need to filter out bad traffic and focus inspection capabilities on the good traffic. This gives organizations the ability to go both broad and deep, with an architecture that aligns with a ‘zero trust’ design approach by connecting user identity with other access control policies.
Additionally, attack sophistication has moved past the rudimentary attacks that can be detected with a ‘point in time’ analysis of a connection by a single inline device such as a WAF. Detecting and mitigating advanced attacks requires correlating multiple signals or data points over time, across the entire data path.
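A toy version of that correlation logic: individually weak signals are recorded per client, and an alert fires only when enough distinct signal types accumulate within a sliding time window. The window size, threshold, and signal names are invented for illustration.

```python
# Sketch of time-windowed signal correlation: no single event is
# conclusive, but several distinct weak signals from one client within
# a short window trip an alert.

from collections import defaultdict, deque

WINDOW_SECONDS = 300    # sliding window length (illustrative)
ALERT_THRESHOLD = 3     # distinct signal types needed to alert

events: dict = defaultdict(deque)  # client_id -> deque of (timestamp, signal)

def record(client_id: str, signal: str, now: float) -> bool:
    """Record a signal; return True if the client trips the correlation rule."""
    q = events[client_id]
    q.append((now, signal))
    # Expire events that have aged out of the window.
    while q and now - q[0][0] > WINDOW_SECONDS:
        q.popleft()
    distinct = {s for _, s in q}
    return len(distinct) >= ALERT_THRESHOLD

t = 1_000_000.0
record("203.0.113.7", "failed_login", t)
record("203.0.113.7", "sqli_pattern", t + 10)
alert = record("203.0.113.7", "header_anomaly", t + 20)
```

The point is architectural: only a component that sees the whole data path over time, rather than one packet or one connection, can feed a rule (or a model) like this.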
The future of security rests on telemetry that is more than technical data points picked out of packets. It requires a holistic view of interactions, from client to application to behavior. Machine learning requires enormous amounts of data to establish and recognize patterns, which is why programmable proxies are such a critical part of an advanced security approach. The high-fidelity data produced by instrumentation enables protection against, and even prediction of, application attacks.
At the end of the day, companies are going to find it very difficult to realize the performance improvements, efficiencies, and competitive advantages that can come with machine learning if they are constantly fighting advanced attacks using half the tools in their toolbox.