Hot Tech Topics – Data Pipes in 500 words +/-
What are Data Pipes?
Data pipes are cloud-based services that capture, transform, route, and deliver data. The source might be Internet of Things (IoT) devices, websites, mobile apps, Enterprise Resource Planning (ERP) solutions, Customer Relationship Management (CRM) solutions, databases, you name it. Data pipe services can be configured to collect, aggregate, combine, cleanse, and format the data, then deliver it to a specified destination in a specified way. Typical destinations include serverless applications designed for event processing, data scientists and their artificial intelligence tools, and cybersecurity teams and their tools.
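To make those stages concrete, here is a minimal sketch of a data pipe's core flow in Python. The record fields, the stage names, and the Fahrenheit-to-Celsius transform are all illustrative assumptions, not any vendor's actual API; a managed cloud service would perform these steps at scale across streaming sources.

```python
# A toy data pipe: capture -> cleanse -> transform -> deliver.
# Field names ("device_id", "reading", "unit") are hypothetical.

def capture(raw_records):
    """Ingest raw records from a source (here, an in-memory list)."""
    for record in raw_records:
        yield record

def cleanse(records):
    """Drop records missing required fields."""
    for record in records:
        if record.get("device_id") and record.get("reading") is not None:
            yield record

def transform(records):
    """Normalize units: convert Fahrenheit readings to Celsius."""
    for record in records:
        if record.get("unit") == "F":
            record = {**record,
                      "reading": round((record["reading"] - 32) * 5 / 9, 2),
                      "unit": "C"}
        yield record

def deliver(records, destination):
    """Append the finished records to a destination (here, a list)."""
    for record in records:
        destination.append(record)

raw = [
    {"device_id": "sensor-1", "reading": 212.0, "unit": "F"},
    {"device_id": None, "reading": 5.0, "unit": "C"},   # dropped by cleanse
    {"device_id": "sensor-2", "reading": 21.5, "unit": "C"},
]
sink = []
deliver(transform(cleanse(capture(raw))), sink)
print(sink)
```

Chaining the stages as generators mirrors how real pipelines stream data through each step rather than loading everything into memory at once.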
Note – Michael Conlin, the author of our Hot Tech Topics series, was recently hired as the Chief Data Officer for the Department of Defense. We wish him well in his new position.
Why are Data Pipes such a hot tech topic for government?
Many government organizations are constrained by the fact that much of their data is locked up in on-premises application silos. Additionally, long-standing budget constraints have delayed the much-needed modernization of these applications. This combination makes it a challenge to support complex new data processing workloads, like integrating new data from multiple sources such as IoT. Yet citizen expectations increase every day, driven by an increasingly dynamic American economy and American way of life. From an IT perspective, these dynamics materialize as high volumes of streaming data and data that changes quickly. Data pipes are not a cure-all, but they do represent a significant new tool for these modern workloads, improving service quality and contributing to higher public trust.
Where can I go to take advantage of Data Pipes?
Please revisit my opening sentence, where I used the phrase "cloud-based services". In these articles I use the term "cloud" as shorthand for "public cloud". While it is possible to develop a custom facsimile of a few data pipe services in an on-premises data center, consider that the challenge isn't just to create complex data processing workloads, but also to make them repeatable, highly available, and fault tolerant. For IT professionals this means managing inter-task dependencies, planning for resource availability, handling errors, scheduling, scaling, balancing workloads, managing events, managing configuration, and so on. Whole suites of services are required. Leading public cloud vendors – Azure, AWS, and Google – already incorporate these capabilities into their portfolios of data pipe services. Vanishingly few government organizations have the scale of working capital needed to keep pace with the rate of innovation routinely displayed by these public cloud leaders. When you want data pipes, go to the cloud…the public cloud.
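To give a sense of the plumbing involved, here is a minimal sketch of just one of those concerns, error handling with retries, written in plain Python. The task name and failure behavior are hypothetical; a managed cloud pipeline service would supply this, plus scheduling, scaling, and dependency management, out of the box.

```python
import time

def run_with_retries(task, max_attempts=3, backoff_seconds=0.1):
    """Run one pipeline task, retrying on failure with exponential backoff.
    This covers only a single concern; real pipelines also need scheduling,
    scaling, dependency tracking, configuration, and monitoring."""
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception:
            if attempt == max_attempts:
                raise  # give up after the final attempt
            time.sleep(backoff_seconds * 2 ** (attempt - 1))

# Hypothetical flaky extraction task: fails twice, then succeeds.
calls = {"count": 0}
def flaky_extract():
    calls["count"] += 1
    if calls["count"] < 3:
        raise ConnectionError("source unavailable")
    return "extraction complete"

print(run_with_retries(flaky_extract))
```

Multiply this by every task, dependency, and failure mode in a complex workload, and the appeal of a pre-built suite of services becomes clear.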
“Oh no. Our future is in the hands of engineers…again.”
Michael Conlin, Institute for Innovation, Innovators Circle Initiative Leader, Perspecta