With data everywhere, the endpoint becomes the problem
It’s no secret that the pandemic profoundly changed the end user computing environment.
With much of the workforce now off site and mobile, agency IT teams lack sufficient visibility into and control over their devices. Legacy tools were designed for pre-pandemic environments and cannot collect timely, complete data from remote endpoints. As more employees use VPNs and personal (BYOD) devices, the agency attack surface grows without the additional protections a closed network affords.
Security decisions must be based on accurate, real-time data. Without the ability to receive, centralize, and analyze data from every remote endpoint, agencies cannot fill their knowledge gaps; without full telemetry, they simply don't know what they don't know. Agencies must address these issues with urgency: by the end of this decade, the number of endpoints is expected to increase 1000%.
What can be done to support the new endpoint reality? Agencies must stop carrying forward legacy cyber and systems management solutions hamstrung by limited integration, speed, visibility, and control. Tackling this problem within federal agencies requires innovation and change. The May 12 Executive Order makes cybersecurity a top priority, and the American Rescue Plan provides funding through an expanded Technology Modernization Fund.
Legacy approaches to data collection and centralization have not kept pace with the evolution of workforce computing. A move to modern, distributed instrumentation, interaction, and data integration enables control at the point of data production – the endpoint.
The days of centralizing datasets before analysis and action are long gone. We have passed the tipping point: data is now distributed worldwide and impossible to collect in real time with legacy point tools and systems management solutions. There is simply too much data to manage. 5G facilitates data transport, but agency IT teams still have to rethink data instrumentation.
Previous strategies for instrumenting data at the edge are broken; they simply do not scale to enterprise-level endpoint footprints spread across complex networks and Internet-based computing. As security policies change, IT teams need to adapt instrumentation on the fly to account for those changes in real time.
The Zero Trust Factor
Given the velocity of data creation, the practice of validating an endpoint only at initial access must be retired. Agencies adopting a zero trust architecture require real-time, accurate telemetry to validate that each endpoint complies with rapidly evolving standards.
Under a zero trust strategy, IT teams authenticate and authorize connection requests every time a user wants to do something new or different. This is where the data instrumentation problem emerges. Agencies must revalidate the endpoint for every attempted transaction based on the device – where it is, who's using it, the day of the week, and whether the device is virtual, physical, or employee-owned. With a majority of staff now working remotely, these authentication and authorization processes become particularly difficult and time-consuming.
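The per-transaction revalidation described above can be sketched in a few lines of code. This is a minimal illustration, not any vendor's actual policy engine: the endpoint attributes and the individual rules (geography, BYOD restrictions, weekend admin lockout) are hypothetical examples of the kinds of checks a zero trust policy might run on every request.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Endpoint:
    device_id: str
    user: str
    location: str         # e.g. "US" or "remote-vpn" (hypothetical values)
    is_managed: bool      # False for employee-owned (BYOD) devices
    posture_current: bool # telemetry reported within the allowed freshness window

def authorize(endpoint: Endpoint, resource: str, now: datetime) -> bool:
    """Re-evaluate trust for every transaction, not just at initial access."""
    if not endpoint.posture_current:
        return False  # stale telemetry: no trust decision is possible
    if not endpoint.is_managed and resource.startswith("restricted/"):
        return False  # example rule: BYOD devices blocked from sensitive data
    if endpoint.location not in {"US", "remote-vpn"}:
        return False  # example geographic policy
    if now.weekday() >= 5 and resource.startswith("admin/"):
        return False  # example rule: no admin actions on weekends
    return True
```

The key design point is that `authorize` is called on every request with fresh telemetry as input, so a device that drifts out of compliance mid-session loses access on its next transaction rather than retaining trust granted at login.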
In a zero trust model, agencies must separate endpoints from data instrumentation and collection capabilities. A single, real-time platform that integrates endpoint management and security effectively breaks down silos to close the visibility and resilience gaps that exist between IT operations and security teams.
A unified platform approach gives organizations end-to-end visibility across users, servers, and cloud endpoints. It enables teams to identify assets, protect systems, detect threats, respond to attacks, and remediate deficiencies at scale.
Agencies must address the complexity inherent in the new end-user reality and evolve to manage data at the edge, where it is created. Modern platform solutions provide the needed speed, visibility, and control, as if all enterprise endpoints were one living, breathing database.
This allows agencies to instrument data in real time across all endpoints, without centralization, in support of zero trust models. The future of endpoint computing arrived suddenly and is here to stay. Agencies can meet the challenges it presents with thoughtful approaches to people, process, and technology.
Nate Russ leads the federal civilian business at Tanium, where his team works in close concert with systems integrators to deliver on strategic programs like CDM. His prior experience includes leadership roles at Splunk and Symantec. Nate began his career at Accenture, where he performed hands-on consulting roles in the telecommunications industry.