Achieving the Promise of AI

Dec. 11, 2019
Enterprises that adopt Artificial Intelligence also assume numerous business and technical challenges. A layered technical architecture defines how data can be ingested and leveraged.

A rapidly growing number of businesses are adopting Artificial Intelligence (AI) to reduce their operational cost, improve customer experience, and/or generate new sources of revenue. According to Gartner’s 2019 CIO Survey, the number of enterprises implementing AI solutions has grown 270% in the past four years and tripled in the past year.

This rapid growth reflects AI's potential to create significant business value. It is estimated that AI augmentation alone will create $2.9 trillion of business value and 6.2 billion hours of worker productivity globally by 2021.

However, successfully adopting AI brings about a host of business and technical challenges including but not limited to:

Business challenges
— Defining an AI strategy that can deliver a demonstrably positive business outcome and determining which use cases will yield the highest return on investment;
— Ensuring explainable AI, the ability to provide transparency on AI-driven decision-making;
— Addressing ethical issues related to AI including how critical decisions are made based on insights provided by complex algorithms;
— Adopting a new organizational culture that sees AI not as a threat but as a tool to augment human thinking and enable better, faster decisions; and,
— Ensuring legal and compliance risks are being properly addressed.

Technical challenges
— Coupling AI-driven solutions to core decision support and transactional systems, and connecting advanced AI systems to traditional applications and technical infrastructure;
— Ingesting structured and unstructured, internally and externally generated data that is spread across multiple silos;
— Ensuring validity of data that will be exposed to AI-driven solutions.

Have the right business focus
AI projects succeed or fail to the extent that specific use cases with the potential to demonstrate meaningful business value are identified. Defining the business value that AI and machine learning (ML) solutions can deliver around a specific use case is paramount. Don't pursue the technology simply because it appears attractive.

While AI promises considerable economic benefits, gaining broad traction with business stakeholders is possible only if specific business outcomes can be identified upfront. Innovation in deep learning that leverages artificial neural networks continues to advance. But using the technology in areas such as facial recognition and conversational AI has become table stakes.

For example, in financial services, more sophisticated risk analysis, anti-money laundering, advanced claims management, credit-worthiness evaluation, and intelligent customer onboarding have become prime focus areas. In manufacturing, predictive supply-chain management, predictive maintenance, and smart demand forecasting are where most of the investment is going.

Robotics and IoT in the plant also are table stakes now. And in retail, predictive inventory planning, recommendation engines, and hyper-personalized customer engagement have become critical competitive opportunities. All industries have specific use-cases where AI is transforming the business. But the best opportunities are those where AI can drive significant business disruption.

Aside from having the right business focus, upskilling the workforce to work with AI is equally critical to implementing modern AI-based solutions. The ability to democratize AI transformation across the enterprise by adopting tools and capabilities to enable business users to quickly test algorithms also will be crucial to gaining traction.

The right foundation for AI
Aside from identifying the right business use case for AI, companies increasingly face the challenge of having the right technical infrastructure to support modern AI applications. The integration of traditional software and data environments with modern ML and deep-learning applications is proving to be a formidable challenge.

A good place to start is the pursuit of a modern, enterprise-wide technical architecture. In the same way that a building architecture blueprint specifies how electrical, plumbing, telecommunications, passages, staircases, and other utility and structural elements are to be built, a layered technical architecture provides the foundation that defines how data can be ingested and leveraged across traditional and AI-based solutions. This architecture provides the blueprint and foundation upon which traditional and new AI-powered applications can be built.

Information harvesting
The information harvesting layer of a modern technology architecture is where data is efficiently scanned and cataloged. Market-leading services and solutions can be leveraged to connect with any data source inside or outside the enterprise. The services within this architecture layer automatically extract, unify, and organize information, leveraging semantic technologies that enable ingestion of this data into the knowledge fabric. This is one of the most challenging aspects of the architecture and requires the adoption of advanced and modern techniques, such as scalable machine learning and natural language processing (NLP).
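To make the extraction step concrete, here is a minimal sketch of harvesting named entities from unstructured text, assuming the open-source spaCy NLP library; the sample text and output fields are illustrative only, not part of any specific product.

```python
# Minimal sketch: turn unstructured text into structured records that can
# be ingested into the knowledge fabric. Assumes spaCy and its small
# pretrained English pipeline; the sample document is invented.
import spacy

nlp = spacy.load("en_core_web_sm")  # small pretrained English pipeline

def harvest_document(text: str) -> list[dict]:
    """Extract named entities from unstructured text as structured records."""
    doc = nlp(text)
    return [
        {"text": ent.text, "type": ent.label_,
         "start": ent.start_char, "end": ent.end_char}
        for ent in doc.ents
    ]

records = harvest_document(
    "Acme Corp signed a $2M supply contract with Globex in Chicago."
)
for record in records:
    print(record)  # e.g. {'text': 'Acme Corp', 'type': 'ORG', ...}
```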

Knowledge fabric
The knowledge fabric layer is where enterprise data is converted to knowledge. The most common and efficient way of representing an enterprise's knowledge domain and artifacts in a form understood by both humans and machines is the enterprise knowledge graph (EKG). An EKG is an ideal way to relate structured and unstructured information and discover facts about your organization.

With a proper knowledge infrastructure, you can seamlessly combine highly scalable graph database technologies with complementary storage and search systems to deliver actionable insights. This empowers humans and teams to focus on data analysis, rather than data collection.
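As a small illustration of the EKG idea, the sketch below builds a tiny graph and queries it, assuming the open-source rdflib library; the namespace, entities, and relations are invented for this example.

```python
# Minimal sketch: represent enterprise facts as an RDF knowledge graph and
# "discover" a fact with a SPARQL query. Assumes rdflib; the schema is
# illustrative, not a real ontology.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.com/enterprise/")

g = Graph()
g.bind("ex", EX)

# Relate a structured record (a customer) to an unstructured artifact (a contract).
g.add((EX.AcmeCorp, RDF.type, EX.Customer))
g.add((EX.AcmeCorp, EX.hasIndustry, Literal("Manufacturing")))
g.add((EX.Contract42, RDF.type, EX.Contract))
g.add((EX.Contract42, EX.signedBy, EX.AcmeCorp))

# Which customers have signed contracts, and in which industry?
results = g.query(
    """
    SELECT ?customer ?industry WHERE {
        ?contract ex:signedBy ?customer .
        ?customer ex:hasIndustry ?industry .
    }
    """,
    initNs={"ex": EX},
)
for customer, industry in results:
    print(customer, industry)
```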

This layer also can feature a proprietary set of knowledge accelerators that include domain-specific ontologies across industry verticals and business methods.

Enterprise AI
The preceding layers prepare data to support AI algorithms, so this layer does not need to be concerned with data collection. This is where AI models and algorithms can be embedded into the very core of the architecture to create valuable insights with the potential to augment human thinking across disciplines and to innovate operations, processes, products, and more.

The Enterprise AI layer takes advantage of highly scalable ML frameworks for both "training" and "deployment" phases. In the training phase, historical data is used to train and evaluate machine-learning models. The selected models are then deployed in the same layer, to be consumed by humans or other applications in the enterprise. Multiple models can be deployed simultaneously to serve different enterprise use cases. One architecture serves all.
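A rough sketch of these two phases, assuming scikit-learn and a hypothetical claims-fraud use case; the file names and column names are placeholders, not part of the architecture itself:

```python
# Minimal sketch: train and evaluate a model on historical data (training
# phase), then persist it so it can be served alongside other models
# (deployment phase). Column and file names are hypothetical.
import joblib
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Training phase: fit and evaluate on a hypothetical extract of historical data.
history = pd.read_csv("claims_history.csv")
X, y = history.drop(columns=["is_fraud"]), history["is_fraud"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = GradientBoostingClassifier().fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

# Deployment phase: persist the selected model so it can be served
# in this layer next to models for other use cases.
joblib.dump(model, "models/claims_fraud_v1.joblib")
```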

Human and machine consumption
Ultimately, derived knowledge has to be consumed by humans or machines in an intuitive manner. The human and machine consumption layer provides easy-to-use interfaces across web, mobile, and API services to enable access to data in the knowledge fabric layer.
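A minimal sketch of the API side of this layer, assuming FastAPI; the endpoint path and the query_knowledge_fabric() helper are hypothetical stand-ins for a real graph or search query:

```python
# Minimal sketch: expose knowledge-fabric lookups to web, mobile, and
# machine clients through a simple API service. Assumes FastAPI; the
# helper below is a placeholder for a real graph or search query.
from fastapi import FastAPI

app = FastAPI(title="Knowledge Fabric API")

def query_knowledge_fabric(entity_id: str) -> dict:
    # Placeholder: in a real system this would query the graph store.
    return {"entity": entity_id, "related": ["Contract42", "Invoice7"]}

@app.get("/entities/{entity_id}")
def get_entity(entity_id: str) -> dict:
    """Return facts about an entity for human or machine consumption."""
    return query_knowledge_fabric(entity_id)
```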

Workflow orchestration and security
All of the processes described above need to be managed, that is, scheduled and monitored, using a workflow orchestration tool, which allows the enterprise to define the entire data pipeline. The pipeline may include harvesting data in batch or in real time (streaming), running training and evaluation tasks, monitoring model performance, applying AI models in batch, and feeding the results back to the knowledge fabric or into other enterprise applications.
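One possible sketch of such a pipeline, assuming Apache Airflow as the orchestration tool; the task names and placeholder callables are illustrative:

```python
# Minimal sketch: define the data pipeline as a scheduled, monitorable DAG.
# Assumes Apache Airflow 2.x; the callables are empty placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def harvest():
    """Ingest batch or streaming data from internal and external sources."""

def train():
    """Train and evaluate ML models on historical data."""

def score():
    """Apply the selected models in batch."""

def publish():
    """Feed results back to the knowledge fabric or other applications."""

with DAG(
    "ai_data_pipeline",
    start_date=datetime(2019, 12, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_harvest = PythonOperator(task_id="harvest", python_callable=harvest)
    t_train = PythonOperator(task_id="train", python_callable=train)
    t_score = PythonOperator(task_id="score", python_callable=score)
    t_publish = PythonOperator(task_id="publish", python_callable=publish)

    t_harvest >> t_train >> t_score >> t_publish
```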

Needless to say, security is also an important aspect of the architecture foundation. Controlling access to resources and to the data pipeline is a major requirement.

Takeaways
AI solutions require high-quality data that is standardized and aggregated across the enterprise. The power of AI systems to work on complex problem solving on a 24-by-7 basis means that the enterprise technical architecture must deliver a continuous flow of data upon which “smart” decisions can be made. This means continuous harvesting of data in multiple formats both within and outside of an organization’s traditional boundaries. Not having access to the right data creates the risk that complex AI algorithms will use outdated or incorrect data.

Businesses that successfully leverage the capabilities of AI must not only ensure the right focus on specific use cases that can show positive results, but also spend time defining the underlying technical architecture that enables this new generation of solutions.
Anthony DeLima is the head of Digital Transformation and Global CTO, and Sayyed Nezhadi is the chief technology architect, for NEORIS USA — a "digital accelerator" that provides tech consulting to businesses worldwide.
