"Innovation distinguishes between a leader and a follower," pronounced the iconic Steve Jobs.In 2019, Accenture conducted a global survey with 1,500 C-level executives across 16 industries, titled "Scaling to new heights of competitiveness". Majority of top managers strongly agree that leveraging artificial intelligence (AI) is necessary to achieve their growth objectives, while acknowledging that scaling AI at the enterprise level is a real challenge. Scaling AI means that diverse teams, departments, and individuals across the enterprise realise the value of AI and utilize it in their work processes to achieve efficiency and business advantages.
The volume and rate of data accumulation, especially in capital-intensive industries, increase exponentially as more devices become internet-enabled each year. This generates rising demand for data storage and computing resources among organizations. According to the Flexera 2021 State of the Cloud Report, the pandemic has accelerated cloud plans and spend. Enterprise cloud spend is significant and growing quickly compared to previous years, and more organizations are leveraging public cloud services (for example, AWS, Azure, and Google Cloud), private clouds, or both (the hybrid model).
Despite the accelerating adoption of cloud services, many enterprises are still not running their applications in the cloud. Organizations face challenges relating to response latency, data security, data management, cyber regulatory compliance, implementation cost, retaining employee knowledge, and inference at the edge.
The first critical decision concerns how to transfer data and what doing so will cost. Response latency is a constraint for a business that performs critical operations and is trying to run its assets efficiently and effectively. Every business must consider whether the benefit of achieving lower latencies outweighs the cost of acquiring the necessary network bandwidth, which in some cases is simply not available due to infrastructure constraints.
Another challenge is the increased cyberattack surface. More businesses are shifting their confidential information to the Cloud, and data breaches targeting cloud-based infrastructure increased by 50% in 2019 compared with 2018 (see the Verizon Business 2020 Data Breach Investigations Report). Moving data out of the plant increases the number of potential cyberattack vectors. Data breaches can be caused by a simple misconfiguration or by insider threats, and they are hard to avoid when part of the IT infrastructure is outsourced to a third-party business. Ensuring data security in this dynamic environment is therefore crucial for enterprises.
Digital sovereignty, the degree of control a company has over the data, hardware, and software it relies on to operate, is another challenge facing enterprises. Operational sovereignty gives customers assurance that people working for a cloud provider cannot compromise a customer's workloads. Software sovereignty ensures that the customer can control the availability of its workloads and run them without being dependent on, or locked into, a single cloud provider. Data sovereignty gives customers a mechanism to prevent the cloud provider from accessing their data, granting access only for specific purposes. The real challenge for organizations is trusting those managing their cloud services, especially when sensitive data could circulate in the hands of multiple third-party businesses.
Cyber regulatory compliance has its own complexity: compliance programs must evolve with cloud deployments, infrastructure, environments, and applications, and the various cloud services and applications in use must be configured securely. Moving data from the plant to a cloud service, especially one owned and operated by a third-party business, may violate regulatory requirements. Organizations with a multi-cloud strategy can benefit from Cloud Security Posture Management (CSPM), because it becomes difficult to verify that every cloud service and application is securely configured.
The next concern is the cost of cloud-centric implementations. According to an International Data Corporation (IDC) report, annual public cloud spending will reach $500 billion by 2023. Cloud adoption was driven by an incredibly powerful value proposition: infrastructure available immediately, at exactly the scale needed by the business, delivering both operational and economic efficiencies. However, there is growing awareness of the long-term cost implications of the Cloud, and several companies are taking the dramatic step of repatriating parts of their workloads, or adopting a hybrid approach, to contain cloud costs.
A further critical challenge is retaining the knowledge of experienced employees, a key strategic resource, before they retire or after a merger or acquisition. One solution is to automate workflows and processes at the edge. Such automation, combined with AI and Machine Learning (ML) techniques, can capture and store the critical know-how of key employees at different levels of an organization, and retain, improve, and share that knowledge with new recruits and the generations to come.
Finally, a reasonable solution is to ship the application (including the trained model) to an edge execution environment, rather than streaming process data from the plant edge into the Cloud to run inference models there. Actionable responses and insights can then be communicated quickly to the human stakeholders. This mechanism reduces the high cost, in terms of time, network bandwidth, storage capacity, loss of independence, security, and privacy, of centralized cloud storage and computing.
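To make this pattern concrete, the short Python sketch below loads a previously trained model from local storage on an edge device and scores incoming process data there, so only alerts and insights need to travel to the Cloud. ONNX Runtime is used purely as an example of a portable inference engine; the model file name, input shape, and alert threshold are hypothetical assumptions, not part of any specific vendor solution.

# Minimal sketch of edge-side inference, assuming a trained model has already been
# exported to ONNX and copied to the edge device (file name and threshold are hypothetical).
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("anomaly_model.onnx")   # model loaded locally, no cloud round trip
input_name = session.get_inputs()[0].name              # discover the model's input tensor name

def score_window(sensor_window: np.ndarray) -> float:
    """Score one window of plant sensor readings and return an anomaly score."""
    outputs = session.run(None, {input_name: sensor_window.astype(np.float32)})
    return float(outputs[0].ravel()[0])

window = np.random.rand(1, 16)          # stand-in for a 1 x 16 window from the local historian
if score_window(window) > 0.8:          # hypothetical alert threshold
    print("Alert operators; only this event, not the raw data stream, is sent to the Cloud.")

In this arrangement the raw sensor stream never leaves the plant; only model outputs, alerts, and summary statistics are forwarded, which directly addresses the latency, bandwidth, and data-exposure concerns raised above.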
In the current state of IoT, edge computing means intelligently collecting, aggregating, and analysing IoT data via services deployed close to the IoT devices (i.e., at the edge), based on the business needs of the application. The future of edge computing is complementary to cloud capabilities; the Cloud will not be replaced by the edge. The duality of the two paradigms distributes infrastructure risk between the facility, such as an offshore manufacturing plant, and its data center, and provides uninterrupted real-time actionable responses at the edge. The Cloud executes less critical tasks such as model training, retraining, sustainment, and monitoring. This hybrid combination optimizes uptime while minimizing the risk of unforeseen issues.
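One way to picture this division of labour is a simple refresh loop on the edge device: the edge keeps serving whatever model it already has, and periodically asks the cloud side whether a newer retrained version has been published. The registry URL, file names, and polling interval in the Python sketch below are purely illustrative assumptions, not a reference to any particular product.

# Illustrative sketch of the cloud/edge split: training and retraining happen in the Cloud,
# while the edge periodically pulls the latest published model and keeps inferring locally.
# The URL, file names, and hourly interval are hypothetical placeholders.
import time
import urllib.request

MODEL_REGISTRY = "https://example.com/models"   # hypothetical cloud-side model registry
LOCAL_MODEL = "anomaly_model.onnx"

def remote_version() -> str:
    """Ask the cloud registry which model version is currently published."""
    with urllib.request.urlopen(f"{MODEL_REGISTRY}/latest_version.txt") as resp:
        return resp.read().decode().strip()

def refresh_model(known_version: str) -> str:
    """Download a retrained model only when the Cloud has published a newer version."""
    latest = remote_version()
    if latest != known_version:
        urllib.request.urlretrieve(f"{MODEL_REGISTRY}/{latest}.onnx", LOCAL_MODEL)
    return latest

version = ""
while True:
    version = refresh_model(version)   # the Cloud handles retraining; the edge picks up the result
    # ... local inference on live plant data continues here, independent of cloud availability ...
    time.sleep(3600)                   # hypothetical hourly check

Because inference never depends on the link to the Cloud being up, a temporary outage degrades only model freshness, not the real-time responses at the edge.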
To achieve this intelligent edge vision, it is necessary to leverage today's edge computing technology in an optimal and scalable way to deliver high-value intellectual property (IP) in an intelligent edge solution. For example, the Aspen AIoT Hub provides access to data at scale, whether in the enterprise, the plant, or the edge, offering comprehensive AI pipeline workflows to embed AI in Aspen Models for both engineers and data scientists.
Indeed, change is mission-critical. As Albert Einstein observed, "If you always do what you always did, you will always get what you always got."
By Adi Pendyala, Senior Director, and Lawrence Ng, Vice President, APJ, Aspen Technology, Inc.