The foundation of a Zero Trust architecture

April 1, 2020 | Derrick Johnson

Part 1 of a 3-part blog series. You can also read part 2 and part 3.

Organizations have invested considerable time, effort and capital in security initiatives aimed at preventing breaches and data loss. Yet even the most advanced “next generation” application-layer firewalls, filtering malicious traffic at the network perimeter, have only revealed equal if not greater threats within. To counter this internal threat, organizations have invested heavily in internal monitoring and other advanced security controls that inspect traffic at all layers of the OSI stack to identify malicious activity and either stop it before it reaches its destination or simply alert on it.

While these initiatives have helped, they rely on a connection first proving malicious, or on a trigger against a pre-established set of criteria, before any alarms sound or prevention techniques are applied. As more and more technologies and controls are thrown at the problem, networks have become a chaotic mess of watchers, gatekeepers and agents, with legitimate business traffic trying to navigate its way through it all. Yet breaches still occur at an alarming rate, leaving organizations looking for a different approach.

Zero Trust is gaining momentum as a different lens on data and network security. It casts aside complete reliance on the decades-old, and easily neglected, least-privilege/whitelisting model by eliminating trust from every packet on the network, whether it originated inside the organization or outside, and instead looks to gain confidence that each packet is legitimate. In short, rather than the traditional “trust but verify” approach, it never trusts and always verifies all traffic. Zero Trust is built on a set of foundational principles, or tenets:

  • All network flows are authenticated before being processed, and access is determined by dynamic policy. In a Zero Trust Network (ZTN), confidence in a requestor must be gained before access can be granted, and that confidence does not traverse the network. Authentication may involve an evaluation of identity attributes or other artifacts, asset state, requestor state, behavioral attributes, and more. Each transaction requiring authentication is evaluated against an ever-changing policy based on that transaction’s behavior over time.
  • All transaction flows are cataloged in order to enforce access. Understanding what you’re trying to protect is just as important as where it is going. Assets (essentially anything with an IP address, as well as data sources) must be assigned a value, and the classification of data, as well as its location, must be known if it is to be protected. Mapping and cataloging network flows to assets helps build access policies and distinguish expected from unexpected traffic patterns.
  • Security (authentication and encryption) is applied to all communications independent of location, and must be performed at the application layer, as close to the asset as possible. Communications must be secured, and access requests from systems inside the enterprise network must meet the same requirements as those from external systems. Applying application-layer security as close to the asset as possible eliminates upstream threats.
  • Comprehensive vulnerability and patch management procedures must be followed. Device security issues will persist, and a comprehensive vulnerability and patch management program keeps enterprise-owned devices in their most protected and functional state. Continuous monitoring of device and application state is required to identify and address security vulnerabilities as needed, or to act on access privileges accordingly.
  • Technology is utilized for automation in support of user/asset access and other policy decisions. A Zero Trust architecture requires automation, especially in support of dynamic policy, authorization and authentication. Automated technology must be used to obtain access, scan and assess threats, adapt to behavior changes, and continually re-evaluate confidence in communications.
  • All traffic is controlled and monitored as access is provided. Effective monitoring must be performed to improve security posture and to create, adjust and enforce policy. Working in conjunction with automation, advanced analytics of user and device behavior ensure that defenses are automatically adjusted and aligned before actual incidents occur.
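To make the tenets above concrete, here is a toy sketch of a policy decision point that combines several confidence signals (authentication, device patch state, behavioral analytics) and evaluates every request against a dynamic, per-resource threshold. The names, signals and weights are illustrative assumptions for this post, not a reference to any specific product or standard implementation.

```python
from dataclasses import dataclass

@dataclass
class Request:
    user_authenticated: bool    # tenet: flows are authenticated before processing
    device_patched: bool        # tenet: vulnerability/patch state feeds access decisions
    behavior_score: float       # 0.0-1.0, from behavioral analytics (tenet: monitoring)
    resource_sensitivity: str   # classification from the asset catalog (tenet: cataloging)

def confidence(req: Request) -> float:
    """Combine signals into a confidence score; trust is never assumed."""
    score = 0.0
    if req.user_authenticated:
        score += 0.4
    if req.device_patched:
        score += 0.3
    score += 0.3 * req.behavior_score
    return score

# Dynamic policy: the required confidence depends on what is being accessed.
THRESHOLDS = {"public": 0.5, "internal": 0.7, "restricted": 0.9}

def decide(req: Request) -> str:
    """Every request is evaluated fresh -- no implicit trust carries over."""
    return "allow" if confidence(req) >= THRESHOLDS[req.resource_sensitivity] else "deny"

print(decide(Request(True, True, 0.9, "internal")))     # strong signals: allowed
print(decide(Request(True, False, 0.2, "restricted")))  # unpatched device, odd behavior: denied
```

In a real ZTN the signals and thresholds would be continuously re-evaluated as behavior changes, rather than computed once per request as in this sketch.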

The general concept of Zero Trust, applied through the above tenets, serves as guidance in developing a Zero Trust Architecture (ZTA). A ZTA involves not only implemented and interconnected tools and advanced technologies, but also a set of operational policies and authentication requirements that enforce the Zero Trust principles. A ZTA can be implemented in various ways depending on an organization’s use case, business flows and risk profile. While each approach applies different components and technologies, such as enhanced identity, micro-segmentation and software-defined perimeters, any approach should implement all of the above tenets.
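As one illustration of the micro-segmentation approach mentioned above, a cataloged set of transaction flows can be turned directly into a default-deny enforcement rule: only flows that appear in the catalog are permitted. The segment names, ports and catalog structure below are hypothetical examples, not a prescribed design.

```python
# Toy flow catalog for micro-segmentation. Each entry is
# (source segment, destination segment, destination port).
ALLOWED_FLOWS = {
    ("web-tier", "app-tier", 8443),  # web front end calls the app API
    ("app-tier", "db-tier", 5432),   # only the app tier reaches the database
}

def flow_permitted(src_segment: str, dst_segment: str, dst_port: int) -> bool:
    """Default-deny: a flow is legal only if it was cataloged in advance."""
    return (src_segment, dst_segment, dst_port) in ALLOWED_FLOWS

print(flow_permitted("web-tier", "app-tier", 8443))  # cataloged flow
print(flow_permitted("web-tier", "db-tier", 5432))   # web must not reach the DB directly
```

The same catalog that drives enforcement also defines the baseline of expected traffic, so anything outside it can be flagged as an unexpected pattern for monitoring.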

About the Author: Derrick Johnson

Derrick Johnson is the National Practice Director for Secure Infrastructure Services within AT&T Cybersecurity Consulting, responsible for its direction and overall business performance. Derrick's practice provides strategic and tactical cybersecurity consulting services around next-generation network and cloud security architectures, zero trust networking, logical and virtual network segmentation and micro-segmentation, security operations, orchestration and automation, and firewalling, among other initiatives. Derrick is a Certified Information Systems Security Professional (CISSP) who joined the AT&T Cybersecurity Consulting team through the acquisition of the VeriSign, Inc. Global Security Consulting business, which was completed in October of 2009. Prior to working for VeriSign, Derrick was the Global Information Security Officer for Stream International, a global business process outsourcing (BPO) service provider specializing in customer relationship management services. Prior to Stream, Derrick was a Senior Associate on KPMG’s Information Risk Management team, specializing in Information Security Services. Before becoming a consultant, Derrick spent four years in systems and network engineering, including a role as a Senior Network Engineer with America Online, performing network engineering and administration for America Online’s Advanced Network Services (ANS) team. Derrick earned his BS in Computer Engineering from Syracuse University.
