Practical Reminders In Adopting Strategic Tech Trends For Digital Businesses In 2022
Gartner’s annual list of strategic technology trends for 2022 shows a mix of familiar and less familiar tech that will be useful for digital businesses. The tech research and consulting firm expects these trends to become force multipliers for businesses and catalysts for innovation in the next three to five years.
However, it is important to know more than the basics when embracing these technologies. They can provide real advantages, but they are not free from drawbacks or pitfalls. They come with challenges, and the wrong adoption approach can do more harm than good.
Cybersecurity mesh
Cybersecurity mesh is a flexible architecture designed to bring together heterogeneous and distributed security services. With it, organizations can take advantage of having all the best standalone security solutions seamlessly working together to enhance security posture. It brings security controls closer to the assets they need to secure.
Adopting this technology, however, can be trickier than expected. The mesh can be highly complex, and managing it may become too challenging, resulting in inefficiencies and slowdowns. It helps to use a cybersecurity mesh architecture (CSMA) solution that can automate processes while implementing zero-trust access, adaptive cloud security, and seamless scaling. CSMA solutions are particularly useful as organizations shift toward multi-cloud and hybrid environments.
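The zero-trust access idea mentioned above can be illustrated with a minimal sketch: every request is evaluated against a per-asset policy with a default-deny fallback, regardless of where on the network it originates. The asset names, policy fields, and request attributes below are hypothetical, not part of any particular CSMA product.

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    identity: str
    device_compliant: bool  # did the device pass posture checks?
    mfa_passed: bool        # did the user complete MFA?
    asset: str              # the resource being requested

# Hypothetical per-asset policy table; in a real mesh this would be
# a centrally managed policy service shared by all security tools.
POLICIES = {
    "payroll-db": {"require_mfa": True, "require_compliant_device": True},
    "wiki": {"require_mfa": False, "require_compliant_device": False},
}

def authorize(req: AccessRequest) -> bool:
    """Zero trust: evaluate every request against policy; deny by default."""
    policy = POLICIES.get(req.asset)
    if policy is None:
        return False  # unknown asset: default-deny
    if policy["require_mfa"] and not req.mfa_passed:
        return False
    if policy["require_compliant_device"] and not req.device_compliant:
        return False
    return True
```

The key design point is the default-deny branch: controls travel with the asset's policy entry rather than with a network perimeter.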
Data fabric
Data fabric refers to the architecture and set of data services that enable continuous and consistent capabilities across a selection of endpoints, platforms, and data sources. With the help of analytics and machine learning, data fabric can learn and actively make suggestions on where to use and change data. Gartner says it can ease the burden of complex data management by up to 70 percent.
Implementing a data fabric architecture, however, entails challenges when it comes to the deployment and configuration of services, the management of dependencies between services, integration with external systems, monitoring, and troubleshooting. CIOs or IT departments need to be mindful of these areas of concern.
When integrating with external services, it may be necessary to place middleware between the data fabric and external systems. Organizations may also choose third-party systems that are compatible with, or integrate readily into, their data fabric.
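One common shape for that middleware layer is a thin adapter that translates an external system's records into the fabric's canonical schema, so fabric services never depend on the external schema directly. The class, record type, and field names below are illustrative assumptions, not a real data fabric API.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class FabricRecord:
    """Hypothetical canonical record shape used inside the fabric."""
    source: str
    key: str
    payload: dict

class ExternalSystemAdapter:
    """Middleware sketch: wraps an external system's fetch call and
    normalizes its raw records into FabricRecord instances."""

    def __init__(self, source_name: str,
                 fetch: Callable[[], list], key_field: str):
        self.source_name = source_name
        self.fetch = fetch        # injected I/O, e.g. a REST client call
        self.key_field = key_field

    def pull(self) -> list:
        # Normalize each raw record; the original payload is preserved
        # so downstream services can still reach source-specific fields.
        return [
            FabricRecord(source=self.source_name,
                         key=str(raw[self.key_field]),
                         payload=raw)
            for raw in self.fetch()
        ]
```

Swapping in a different external system then only means writing a new `fetch` callable and key mapping, leaving the rest of the fabric untouched.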
Composable applications
Composable applications are apps built from functional parts that can be detached from the whole and recombined into new applications with more functions. They make it easier and faster to use and reuse code, which means new software reaches the market sooner. A related trend is LCNC apps: applications created with low-code/no-code development platforms, which allow even those without coding or programming know-how to build apps themselves.
Embracing composable applications, however, is not as easy as it sounds. For one, it is important to have a composable architecture, which provides the mechanisms and crucial support for composable app development and deployment. The architecture has to be able to scale storage, networks, and databases without compromising reliability and flexibility. Additionally, it is important to identify and develop the right skills for composable app development and management, because composable apps differ considerably from traditional applications.
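At its simplest, the composable idea means that each business capability is a small, independently replaceable unit with a uniform interface, and an application is just an assembly of such units. The step names and order fields in this sketch are hypothetical.

```python
from typing import Callable

# Two independently replaceable "parts" with a uniform dict-in, dict-out
# interface (hypothetical business steps for illustration).
def validate(order: dict) -> dict:
    if order.get("qty", 0) <= 0:
        raise ValueError("quantity must be positive")
    return order

def price(order: dict) -> dict:
    return {**order, "total": order["qty"] * order["unit_price"]}

def compose(*steps: Callable[[dict], dict]) -> Callable[[dict], dict]:
    """Assemble an application flow by chaining interchangeable parts."""
    def app(payload: dict) -> dict:
        for step in steps:
            payload = step(payload)
        return payload
    return app

# A new application is built by recombining existing parts:
checkout = compose(validate, price)
```

Because every part shares the same interface, a part can be swapped out (say, a new pricing rule) without touching the rest of the application.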
Privacy-enhancing computation
The prominence of privacy laws, especially in Europe and North America, makes privacy-centric technologies indispensable. For this, privacy-enhancing computation is expected to play an important role. This technology is designed to secure data processing in both trusted and untrusted environments to allow the extraction of useful data while still complying with existing laws or regulations. Gartner projects that around 50 percent of large organizations will be using this technology by 2025.
Privacy-enhancing computation can be implemented through several techniques, including zero-knowledge proofs, secure multi-party computation, homomorphic encryption, differential privacy, and trusted execution environments. No single technique works for all kinds of situations.
Organizations need to weigh the advantages and drawbacks of each option against their specific needs and circumstances. Generally, though, homomorphic encryption is considered the most secure, followed by secure multi-party computation, differential privacy, and trusted execution environments.
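Of these techniques, differential privacy is the simplest to illustrate: a query answer is perturbed with calibrated random noise so that the presence or absence of any one individual's record is statistically hidden. The sketch below implements the standard Laplace mechanism for a counting query (whose sensitivity is 1), using only the standard library; the function names are my own.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) via the inverse-CDF transform."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(values, predicate, epsilon: float) -> float:
    """Epsilon-differentially-private count.

    A counting query changes by at most 1 when one record is added or
    removed (sensitivity 1), so Laplace noise with scale 1/epsilon
    satisfies epsilon-differential privacy."""
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)
```

Smaller epsilon means more noise and stronger privacy; picking epsilon is exactly the kind of needs-and-circumstances trade-off described above.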
Hyperautomation
Hyperautomation is a systematic, business-driven approach to quickly identifying, vetting, and automating business and IT processes. It helps achieve more efficient remote operations, scaling, and changes in the business model. It entails the orchestrated use of various technologies including artificial intelligence, event-driven software architecture, robotic process automation, low-code/no-code tools, intelligent business process management, and integration-platform-as-a-service.
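The "orchestrated use" of those technologies often rests on an event-driven backbone: automated steps subscribe to business events and emit follow-up events, so tools from different vendors can be chained without direct coupling. This is a deliberately minimal sketch; the event names and chained steps are hypothetical.

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Minimal event-driven orchestration sketch: handlers subscribe to
    named events and may emit follow-up events, forming a process chain."""

    def __init__(self):
        self.handlers = defaultdict(list)
        self.log = []  # record of events, useful for process monitoring

    def on(self, event: str, handler: Callable[[dict], None]):
        self.handlers[event].append(handler)

    def emit(self, event: str, payload: dict):
        self.log.append(event)
        for handler in self.handlers[event]:
            handler(payload)

bus = EventBus()
# An RPA-style validation bot and a scheduling step, chained by events:
bus.on("invoice.received", lambda p: bus.emit("invoice.validated", p))
bus.on("invoice.validated", lambda p: bus.emit("payment.scheduled", p))
```

Adding a new automated step is then a matter of subscribing another handler, which is one reason event-driven designs appear so often in hyperautomation stacks.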
Hyperautomation is not for the inexperienced, though. It requires a top-down approach. As intelligent automation expert Wayne Butterfield explains, "it should come from leaders who can see what sort of automation is needed and where in the organization it's needed, and who can predict how the puzzle pieces will fit together to create the greatest efficiency for the business."
Additionally, it is important to bear in mind that hyperautomation cannot be implemented in an instant. It can take years to plan and implement. It should be part of an organization's long-term strategic goals initiated and supported by senior-level management.
AI engineering
AI engineering is a new discipline aimed at building systems, processes, and tools that support the effective application of AI in real-world situations. It is the antithesis of the hasty development of capabilities and building of individual tools that characterize the current AI development paradigm. It centers on helping humans achieve the desired mission outcomes given the realities of uncontrolled real-world situations. Backed by strong AI governance, AI engineering is expected to operationalize the delivery of artificial intelligence applications to solidify their business value.
Carnegie Mellon University's Software Engineering Institute says that successful AI engineering is founded on artificial intelligence that is human-centered, scalable, robust, and secure. No matter how sophisticated AI is, if it does not align with human needs and behaviors, it is unlikely to find its place in the real world.
Also, it has to be scalable to match the changing needs of organizations. Additionally, it has to address one of the biggest challenges of AI implementation: security. It has to be resilient and reliable enough to overcome cybersecurity risks and the challenges beyond controlled environments and laboratory tests.
Generative AI
Generative AI is the use of AI and machine learning to allow machines to create artificial content that can be passed off as real content. It utilizes existing multimedia content (text, audio, images, and video) to generate new content for different applications including motion pictures, location services, search engine services, security services, and healthcare.
Generative AI, however, is associated with deepfakes: the abusive use of multimedia and AI technology for deception and misinformation. This is a stigma that must be addressed competently. The technology can also have privacy implications, especially in healthcare applications. Collaborative efforts across the tech industry are necessary to combat its abuse.
The need for cautious and responsible technology use
Many innovative technologies help augment organizational capabilities and success in different fields. Successful implementation, however, is not going to be served on a silver platter to organizations that decide to embrace these technologies. They have to do their research and build their capabilities to take full advantage of these technologies and maximize their benefits.