Source: David W. Cearley, Brian Burke, Top 10 Strategic Technology Trends for 2019, October 15, 2018, Gartner Research, ID: G00374252.
Top 10 Strategic Technology Trends for 2019
Note: I provide only a brief summary of each trend. For a more detailed explanation, I recommend reading the full research by David Cearley and Brian Burke, cited above.
Source: Gartner (October 2018)
Trend No. 1: Autonomous Things
Autonomous things use AI to automate functions previously performed by humans. Their automation goes beyond the automation provided by rigid programming models, and they exploit AI to deliver advanced behaviors that interact more naturally with their surroundings and with people. Autonomous things come in many types and operate across many environments with varying levels of capability, coordination, and intelligence. Track the overall evolution of the technologies across the autonomous things framework (see Figure 2). Also, use this framework to consider the technical needs of particular use-case scenarios.
Trend No. 2: Augmented Analytics
Augmented analytics focuses on a specific area of augmented intelligence. Augmented analytics uses automated machine learning to transform how analytics content is developed, consumed and shared. It includes:
- Augmented data preparation, which uses machine learning automation to augment data profiling and data quality, harmonization, modeling, manipulation, enrichment, metadata development, and cataloging. This trend is also transforming all aspects of data management, including automating data integration and database and data lake administration.
- Augmented analytics as part of analytics and business intelligence (BI), which enables business users and citizen data scientists to automatically find, visualize and narrate relevant findings without building models or writing algorithms. These findings may include correlations, exceptions, clusters, segments, outliers, and predictions. Users explore data via autogenerated visualizations and conversational interfaces, including natural language query and natural-language-generated narration of results.
- Augmented data science and machine learning, which uses AI to automate key aspects of data science and machine learning/AI modeling, such as feature engineering, model selection (automated machine learning [autoML]), model operationalization, explanation, tuning, and management. This reduces the need for specialized skills to generate, operationalize and manage advanced analytic models.
Augmented analytics represents a third major wave for data and analytics platform capabilities (see Figure 3).
Figure 3. The Evolution of Augmented Analytics
ML = machine learning
Source: Gartner (October 2018)
Empowering the Citizen Data Scientist
There are not enough qualified data scientists to meet the demands of data science and machine learning. Citizen data science is an emerging set of capabilities and practices. It enables users whose main job is outside the field of statistics and analytics to extract predictive and prescriptive insights from data. These users are called citizen data scientists. Through 2020, the number of citizen data scientists will grow five times faster than the number of expert data scientists. Organizations can use citizen data scientists to fill the data science and machine learning talent gap caused by the shortage and high cost of data scientists.
Gartner expects that, in the next several years, citizen data science will rapidly become more prevalent as an approach for enabling and scaling data science capabilities more pervasively throughout the organization. Gartner predicts that, by 2020, more than 40% of data science tasks will be automated, resulting in increased productivity and broader use by citizen data scientists. Gartner also predicts that, by 2024, a scarcity of data scientists will no longer hinder the adoption of data science and machine learning in organizations. This growth, enabled by augmented analytics, will complement and extend existing analytics and BI and data science platforms, as well as enterprise applications. It will do so by putting insights from advanced analytics — once available only to data science specialists — into the hands of a broad range of business analysts, decision-makers and operational workers across the organization. This will drive new sources of business value.
Look for opportunities to use citizen data science to complement and collaborate with existing user-oriented modern analytics and BI, and expert data science initiatives. Implement a program for developing citizen data scientists from existing roles, such as business and data analysts, as well as application developers. Improve highly skilled data science productivity with citizen data science by defining and providing guidance for the interactions and responsibilities of both disciplines. Organizations will still need specialist data scientists to validate and operationalize models, findings and applications.
Beyond the Citizen Data Scientist
Augmented analytics will be a key feature of analytics embedded in autonomous things that interact with users — especially autonomous assistants using conversational interfaces. This emerging way of working enables business people to generate queries, explore data, and receive and act on insights in natural language (voice or text) via mobile devices and personal assistants. However, this is only the initial use of augmented analytics in autonomous things. Augmented analytics enables the embedding of an automated data scientist function in any type of autonomous thing. When the autonomous thing requires analytics to operate, it can tap into the embedded augmented analytics function to respond.
Bias is inherent in manual exploration processes, so greater use of machine learning and automated, human-augmented models will mean fewer errors from bias. It will reduce the time users spend exploring data, giving them more time to act on the most relevant insights. It will also give frontline workers access to more contextualized analytical insights and guided recommendations to improve decision making and actions.
Take Action
Adopt augmented analytics as part of a digital transformation strategy. This will enable the delivery of more advanced insights to a broader range of users — including citizen data scientists and, ultimately, operational workers and application developers — using the scarce resources of data scientists. Pilot to prove the value and build trust. Monitor the augmented analytics capabilities and roadmaps of modern analytics and BI, data science platforms, data preparation platforms, and startups as they mature. Do so particularly in terms of the upfront setup and data preparation required, the types of data and number of variables that can be analyzed, and the types and range of algorithms supported. Evaluate languages supported and the integration of augmented analytics with existing tools and AI governance processes.
Explore opportunities to use augmented analytics to complement existing analytics and BI, data science initiatives, and embedded analytic applications. Do so where automated data pattern detection could help reduce the exploration phase of analysis and improve highly skilled data science productivity.
Develop a strategy to address the impact of augmented analytics on current data and analytics capabilities, roles, responsibilities, and skills. Increase investments in data literacy. Both startups and large vendors offer augmented analytics capabilities that could disrupt vendors of BI and analytics, data science, data integration, and embedded analytic applications. By 2020, augmented analytics will be the dominant driver of data and analytic systems. And by 2020, automation of data science tasks will enable citizen data scientists to produce a higher volume of advanced analysis than specialized data scientists.
Trend No. 3: AI-Driven Development
AI-driven development explores the evolution of tools, technologies and best practices for embedding AI capabilities into applications. It also explores the use of AI to create AI-powered tools used in the development process itself (see Figure 4). This trend is evolving along three dimensions:
- The tools used to build AI-powered solutions are expanding from tools targeting data scientists (AI infrastructure, AI frameworks and AI platforms) to tools targeting the professional developer community (AI platforms and AI services).
- The tools used to build AI-powered solutions are themselves being empowered with AI-driven capabilities that assist professional developers and automate tasks related to the development of AI-enhanced solutions.
- AI-enabled tools, in particular, are evolving from assisting and automating functions related to application development (AD) to being enhanced with business-domain expertise and automating activities higher on the AD process stack (from general development to business solution design).
Trend No. 4: Digital Twins
A digital twin is a digital representation of a real-world entity or system. The implementation of a digital twin is an encapsulated software object or model that mirrors a unique physical object. Data from multiple digital twins can be aggregated for a composite view across a number of real-world entities such as a power plant or a city. The notion of a digital representation of real-world entities or systems is not new. Its heritage goes back to computer-aided design representations of physical assets or profiles of individual customers. The difference in the latest iteration of digital twins is:
- The robustness of the models with a focus on how they support specific business outcomes
- Digital twins’ link to the real world, potentially in real time for monitoring and control
- The application of advanced big data analytics and AI to drive new business opportunities
- The ability to interact with them and evaluate “what if” scenarios
Digital twins in the context of IoT projects are leading the interest in digital twins today. Well-designed digital twins of assets could significantly improve enterprise decision making. They are linked to their real-world counterparts and are used to understand the state of the thing or system, respond to changes, improve operations and add value (see Figure 5).
CAD = computer-aided design; FEA = finite element analysis; ML = machine learning
By 2020, we estimate there will be more than 20 billion connected sensors and endpoints, and digital twins will exist for potentially billions of things. Benefits will include asset optimization, competitive differentiation and improved user experience in nearly all industries. By 2021, half of large industrial companies will use digital twins, resulting in a 10% improvement in those organizations' effectiveness.
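As a toy illustration of the concept (far simpler than a production twin, and not Gartner's design), the sketch below models a pump twin: an encapsulated software object mirrors sensor state from its real-world counterpart and evaluates a "what if" scenario. The names, the linear thermal model, and the thresholds are all hypothetical:

```python
# Minimal digital-twin sketch: a software object mirroring a physical pump.
# Everything here (asset names, thermal model, thresholds) is assumed
# for illustration.
from dataclasses import dataclass, field

@dataclass
class PumpTwin:
    asset_id: str
    rpm: float = 0.0
    temperature_c: float = 20.0
    history: list = field(default_factory=list)

    def update(self, rpm: float, temperature_c: float) -> None:
        """Mirror the latest sensor readings from the real pump."""
        self.rpm = rpm
        self.temperature_c = temperature_c
        self.history.append((rpm, temperature_c))

    def what_if(self, rpm_delta: float) -> bool:
        """Evaluate a scenario without touching the real asset:
        would raising speed keep the pump below its safe temperature?
        Assumes a simple linear thermal model (hypothetical)."""
        projected = self.temperature_c + 0.01 * rpm_delta
        return projected < 80.0  # safe-operating threshold (assumed)

twin = PumpTwin("pump-07")
twin.update(rpm=1500, temperature_c=65.0)
print(twin.what_if(rpm_delta=1000))  # projected 75.0 C -> True (safe)
```

The value described above comes from keeping such objects continuously synchronized with live telemetry and aggregating many of them into a composite view, which this sketch omits.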
Trend No. 5: Empowered Edge
Edge computing describes a computing topology in which information processing, content collection and content delivery are placed closer to the sources and repositories of this information. Edge computing draws on the concepts of mesh networking and distributed processing, keeping processing local with the goal of reducing traffic and latency. As such, the notion of edge content delivery has existed for many years. The “where to process the data” pendulum has swung between highly centralized approaches (such as a mainframe or a centralized cloud service) and more decentralized approaches (such as PCs and mobile devices). Connectivity and latency challenges, bandwidth constraints, and greater functionality embedded at the edge favor distributed deployment models. The advantages of processing power and low cost of operating at hyperscale, coupled with the complexity of managing and coordinating thousands of geographically separated endpoints, favor the centralized model.
Much of the current focus on edge computing comes from the need for IoT systems to deliver disconnected or distributed capabilities into the embedded IoT world for specific industries such as manufacturing or retail. Widespread application of the topology, and explicit application and networking architectures for it, aren’t yet common. Systems and networking management platforms will need to be stretched to include edge locations and edge-function-specific technologies. Edge computing solves many pressing issues, such as high WAN costs and unacceptable latency. In the near future, the edge computing topology will be uniquely well suited to the specific needs of digital business and IT solutions.
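One way to see how local processing reduces WAN traffic: aggregate raw readings at the edge and send only summaries upstream. The sketch below is illustrative (the sampling rate, window size, and summary statistics are arbitrary assumptions) and collapses 600 per-second sensor readings into a four-field summary:

```python
# Edge-aggregation sketch: instead of shipping every raw sample over
# the WAN (the centralized model), an edge node collapses a window of
# readings into a small summary. All sizes are hypothetical.
import statistics

# One simulated temperature reading per second for 10 minutes.
raw_readings = [20.0 + 0.1 * i for i in range(600)]

def edge_summarize(readings):
    """Local (edge) processing: reduce a window to a few statistics."""
    return {
        "count": len(readings),
        "mean": statistics.mean(readings),
        "max": max(readings),
        "min": min(readings),
    }

summary = edge_summarize(raw_readings)
# A centralized topology would transmit 600 values; the edge sends 4.
print(summary["count"], len(summary))  # 600 4
```

The same trade-off drives latency: decisions made from `summary` locally avoid a round trip to a central cloud entirely.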
Trend No. 6: Immersive Experience
Through 2028, the user experience will undergo a significant shift in how users perceive the digital world and how they interact with it. Conversational platforms are changing the way in which people interact with the digital world. Virtual reality (VR), augmented reality (AR) and mixed reality (MR) are changing the way in which people perceive the digital world. This combined shift in both perception and interaction models leads to the future immersive user experience. The model will shift from one of technology-literate people to one of people-literate technology. The burden of translating intent will move from the user to the computer. The ability to communicate with users across many human senses will provide a richer environment for delivering nuanced information.
Trend No. 7: Blockchain
A blockchain is a type of distributed ledger. A distributed ledger is an expanding chronologically ordered list of cryptographically signed, irrevocable transactional records shared by all participants in a network. Each record contains a timestamp and reference links to the previous transactions. With this information, anyone with access rights can trace a transactional event, at any point in its history, belonging to any participant. Blockchain and other distributed-ledger technologies provide trust in untrusted environments, eliminating the need for a trusted central authority. Blockchain has become the common shorthand for a diverse collection of distributed ledger products, with more than 70 offerings in the market.
Blockchain provides business value by removing business friction. It does this by making the ledger independent of individual applications and participants and replicating the ledger across a distributed network to create a consensus-based authoritative record of significant events. Everyone with a particular level of permissioned access sees the same information at the same time. Integration is simplified by having a single shared blockchain model. Blockchain also enables a distributed trust architecture that allows untrusted parties to undertake commercial transactions, and create and exchange value using a diverse range of assets (see Figure 8).
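The ledger structure described above — timestamped records, each linking back to the previous record — can be sketched as a simple hash chain. This is illustrative only: a real blockchain adds digital signatures, consensus among participants, and replication across the network, all of which this sketch omits.

```python
# Minimal hash-chained ledger sketch. Each block carries a timestamp
# and the hash of the previous block, so anyone with access can walk
# the chain and detect tampering.
import hashlib
import json
import time

def make_block(data, prev_hash):
    """Create a record linked to the previous one by its hash."""
    block = {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}
    payload = {k: block[k] for k in ("timestamp", "data", "prev_hash")}
    block["hash"] = hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode()).hexdigest()
    return block

def verify(chain):
    """Check that every block's link matches its predecessor's hash."""
    for prev, cur in zip(chain, chain[1:]):
        if cur["prev_hash"] != prev["hash"]:
            return False
    return True

chain = [make_block("genesis", "0" * 64)]
chain.append(make_block({"from": "A", "to": "B", "amount": 5}, chain[-1]["hash"]))
chain.append(make_block({"from": "B", "to": "C", "amount": 2}, chain[-1]["hash"]))
print(verify(chain))          # True
chain[1]["hash"] = "f" * 64   # tamper with a record
print(verify(chain))          # False: the link from block 2 no longer matches
```

The "irrevocable" property in the definition comes from exactly this linkage: altering any historical record invalidates every link after it.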
Trend No. 8: Smart Spaces
A smart space is a physical or digital environment in which humans and technology-enabled systems interact in increasingly open, connected, coordinated and intelligent ecosystems. Multiple elements — including people, processes, services and things — come together in a smart space to create a more immersive, interactive and automated experience for a target set of personas or industry scenarios.
This trend has been coalescing for some time around elements such as smart cities, digital workplaces, smart homes, and connected factories. We believe the market is entering a period of accelerated delivery of robust smart spaces, with technology becoming an integral part of our daily lives, whether as employees, customers, consumers, community members or citizens. AI-related trends, the expansion of IoT-connected edge devices, the development of digital twins of things and organizations, and the maturing of blockchain offer increasing opportunities to drive more connected, coordinated and intelligent solutions across target environments (see Figure 9).
Individual organizations have long used technology for targeted purposes, but with the supporting systems typically closed and isolated from one another. As systems move to more dynamic coordination, organizations are applying AI and other technologies to create a much more flexible and autonomous level of coordination among systems. This is particularly so as the IoT phenomenon drives computing from isolated desktops and mobile devices into the world around us. Changes in the user experience are also changing how people interact with one another and with systems in a smart space. Smart spaces are evolving along five key dimensions:
- Openness. Openness refers to the degree of accessibility to the elements in a smart space. For example, in a closed model, individual applications or systems in a smart space would be isolated from one another with no sharing of data. Alternatively, if data were shared, it would be shared in a controlled and proprietary way. In contrast, in an open model, systems would be aware of one another, with data exposed and accessible to a wide audience through standardized mechanisms. Trends in open data formats, identifiers, and protocols, as well as the work of open-source communities, are driving this aspect of smart spaces.
- Connectedness. Connectedness refers to the depth, breadth, and robustness of the links between the elements in a smart space. Connectedness is closely related to openness. If there’s no connection, there’s no access to any of the functions or data of an application in the smart space, other than the predefined user interface. Such a system would be closed. As the mechanisms to access the attributes, data, and functions of an application increase, the degree of openness increases. Increasing the granularity of the accessible attributes, data and functions also increases connectedness. Trends such as the IoT, IoT platforms, digital twins, edge computing, APIs and API gateways, and mesh app and service architecture (MASA) all contribute to greater connectedness in a smart space.
- Coordination. Coordination refers to the depth and robustness of coordination between the elements in a smart space. Coordination is a more active aspect of smart spaces that builds on connectedness. While connectedness looks at the opportunity to connect various elements, coordination looks at the actual level of interaction and cooperation between the elements. For example, two applications operating in a smart space that shared login credentials would have a very low coordination score. However, if they also shared data and had tightly integrated process execution, they would have a much higher coordination score. Trends such as MASA, APIs, and events also factor into coordination. So does blockchain, which offers a mechanism to dramatically reduce business friction between elements in a smart space through a shared ledger and smart contracts.
- Intelligence. Intelligence refers to the use of machine learning and other AI techniques to drive automation into the smart space and deliver services to augment the activities of people in it. Intelligence can manifest itself in the form of autonomous things or augmented intelligence, including augmented analytics. An important aspect is the use of AI to:
- Help users in the smart space
- Deliver immersive experiences to enhance how users perceive and interact with other elements in the smart space
- Scope. Scope refers to the breadth of a smart space and its participants. A smart space with a very narrow scope would focus on a single team within a department of a large organization. A smart space with a broader scope might focus more across the organization but within a bounded problem space. A smart space with an even broader scope might include elements external to the organization with an ecosystem of participants. Openness, connectedness and coordination set the stage for increasing the scope of a smart space. Intelligence promotes simplified access and automated management as the scope of a smart space increases.
In the long term, smart spaces will evolve to deliver intelligent environments in which multiple entities coordinate their activities in digital ecosystems and drive targeted use cases or contextualized service experiences. In the ultimate manifestation of a smart space in the intelligent environment phase, there will be rich digital-twin models of people, processes and things across a city. Event-driven structures will replace predefined hard-coded integration points. Virtual assistants and independent agents will monitor and coordinate activities across multiple systems inside the organization or government entities and across multiple entities. Open data exchanges and the use of blockchain will reduce friction between different players in the ecosystem and the systems they use. It will be possible to add new capabilities to existing environments without upgrading the entire infrastructure.
Meanwhile, individuals operating in smart space ecosystems will use ever-expanding IoT edge devices and more immersive experiences. Technology will seem to fade into the background around them. Instead of using computers, the entire world around them will be their computer. This long-term smart space intelligent environment model won’t exist until 2028 at the earliest. However, organizations can achieve it much sooner in more narrowly defined industry scenarios with a targeted set of personas and processes. Individual elements will evolve rapidly through 2023, spawning this new phase of smart spaces.
Smart City Spaces Lead the Way
Smart cities represent the most extensive example of the shift from isolated systems to intelligent environments. The vision behind smart cities is to create transparency into urban quality of life, promoting citizen engagement and self-development. The smart city is a framework or an invitation to collaborate, not a set of technologies or business strategies. It describes the collaboration and ecosystem surrounding the development of contextualized services for users in an urban environment. Well-designed smart cities aim to achieve holistic objectives and focus on intelligent urban ecosystems.
In the first wave of smart cities, technology providers were pushing end-user organizations to invest in individual proofs of concept (POCs) for the IoT, asset management or cost savings. Now, in the second wave, there is a shift away from the massive complexity of individual POCs and the backward integration of new systems into existing systems. The focus is more on connected and coordinated systems with open and extensible capabilities.
Intelligent ecosystems are also being developed in urban revitalization zones, industry zones, and brownfield sites. In these instances, the business or user/use case focus is on targeting real estate development or new living experiences for mixed community and social networking purposes. In emerging economies, industrial parks that combine business, residential and industrial communities are being built using the intelligent urban ecosystem frameworks, with public-private partnerships leading the technology and experience development. All sectors link to social and community collaboration platforms with IT and data exchange. Greenfield smart cities and neighborhoods are increasingly gaining attention from partnerships of city and real estate developers that want to create digital and intelligent smart city projects. These greenfield smart cities will incorporate the city vision into a design and implementation roadmap to deliver a smart space ecosystem.
Trend No. 9: Digital Ethics and Privacy
Digital ethics and privacy are growing concerns for individuals, organizations and governments. Consumers are increasingly aware that their personal information is valuable, and they are demanding control over it. Organizations recognize the increasing risk of securing and managing personal data, and governments are implementing strict legislation to ensure that organizations do so.
While the private sector is increasingly bound by privacy legislation, law enforcement and security services have far fewer controls. Police services use facial recognition to identify people of interest in real time. They use automatic number plate recognition (ANPR) to track vehicles of interest. They also use data from fitness trackers to establish people’s location and heart rate at the time of a crime. They’ve even used Face ID to unlock a suspect’s phone during an investigation. With billions of endpoints collecting information, law enforcement can identify who you are, where you are, what you’re doing and even what you’re thinking.
People are increasingly concerned about how their personal information is being used by organizations in both the public and private sector, and the backlash will only increase for organizations that are not proactively addressing these concerns.
Any discussion on privacy must be grounded in the broader topic of digital ethics and the trust of your customers, constituents and employees. While privacy and security are foundational components in building trust, trust is actually about more than just these components. As defined by Oxford Dictionaries, “trust” is a firm belief in the reliability, truth or ability of someone or something. Trust is the acceptance of the truth of a statement without evidence or investigation. Ultimately an organization’s position on privacy must be driven by its broader position on ethics and trust. Shifting from privacy to ethics moves the conversation beyond “are we compliant” toward “are we doing the right thing.” The move from compliance-driven organizations to ethics-driven organizations can be described as the hierarchy of intent (see Figure 10).
Trend No. 10: Quantum Computing
A commercially available, affordable and reliable quantum computing (QC) product or service could transform an industry. Examples include pharmaceuticals, where new drug compounds could be derived quickly, and the segmentation of customers or populations in local government, airlines, retail and financial services. Gartner inquiries about QC have more than tripled each year for the past two years. Three factors are driving this interest:
- The threat of QC to cryptography
- Curiosity about the capabilities of QC and time frames for specific applications
- QC’s potential use as a competitive advantage
Quantum computing is a type of nonclassical computing that operates on the quantum state of subatomic particles (for example, electrons and ions) that represent information as elements denoted as quantum bits (qubits). A qubit can hold all possible results simultaneously (superposition) until read. Qubits can be linked with other qubits, a property known as entanglement. Quantum algorithms manipulate linked qubits in their undetermined, entangled state. The qubits resolve to a solution when read. Quantum computing is a massively parallel process that scales exponentially as you add additional qubits.
Imagine a library that includes every book ever written. A conventional computer would read each book sequentially to search for a particular phrase. Quantum computers read all the books simultaneously. The parallel execution and exponential scalability of quantum computers mean they excel at problems too complex for a traditional approach, or where a traditional algorithm would take too long to find a solution. Specific applications could include machine learning, route optimization, image analysis, biochemistry and drug discovery, materials science, and code-breaking (such as prime number factoring).
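The exponential scaling can be made concrete with a classical state-vector simulation: describing n qubits requires 2**n amplitudes, which is exactly why classical simulation of quantum systems quickly becomes intractable. The sketch below (standard library only; it uses an equal superposition for simplicity) also shows that reading the qubits collapses the superposition to one definite outcome:

```python
# Classical simulation of an n-qubit register, illustrating the 2**n
# scaling described above. This simulates quantum behavior; it does
# not, of course, gain any quantum speedup itself.
import math
import random

def equal_superposition(n_qubits):
    """State vector for n qubits in equal superposition:
    2**n amplitudes, each 1/sqrt(2**n)."""
    dim = 2 ** n_qubits
    return [1 / math.sqrt(dim)] * dim

def measure(state):
    """Reading the qubits yields a single outcome, chosen with
    probability |amplitude|**2 (the Born rule)."""
    probs = [a * a for a in state]
    return random.choices(range(len(state)), weights=probs)[0]

state = equal_superposition(10)
print(len(state))           # 1024 amplitudes for just 10 qubits
outcome = measure(state)
print(0 <= outcome < 1024)  # one definite result once read: True
```

Doubling the register to 20 qubits would require 1,048,576 amplitudes; quantum algorithms exploit this space directly, while classical simulators must store all of it.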
Industries such as automotive, financial, insurance, pharmaceuticals, military, and research organizations have the most to gain from the advancements in quantum computing. Key potential applications of quantum computing include:
- Optimization. Optimization problems are most likely the No. 1 use case for QC. QC optimization can potentially help with machine learning, AI and neural networks. The promise is that, as the technology matures through 2023, it will dramatically accelerate pattern recognition.
- Materials science. QC could be used to analyze complex atomic interactions, enabling faster discovery of new materials that will enable new economies and new discoveries. Creating new patentable materials is a major potential source of profits for early adopters in key sectors.
- Chemistry. QC could enable quantum simulation at an atomic scale, allowing for the design of new chemical processes.
- Personalized medicine. QC could be used to model molecular interactions at atomic levels to accelerate time to market for new cancer-treating drugs. It could also predict protein interactions more quickly and accurately, leading to new pharmaceutical methodologies.
- Biology. QC could be used for native quantum simulation of processes such as photosynthesis, or for modeling energy systems and interactions. It could help accelerate the development of new or improved fertilizers, helping to improve the world’s food supply.