IT technology trends in 2019

Software development using AI

The market is shifting from a model in which professional analysts must partner with application developers to build most AI-enhanced solutions, to one in which a professional developer can work on their own using pre-built models delivered as a service. This gives the developer an ecosystem of AI algorithms and models, along with development tools tailored to integrating AI functionality and models into a solution. A further opportunity for professional application developers arises from applying AI to the development process itself, automating various data-analysis, application-development, and testing functions. By 2022, at least 40% of new application development projects will have AI co-developers on the team.
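For concreteness, the sketch below shows the "pre-built model as a service" pattern described above from the developer's side: the application calls a hosted model rather than training one. The endpoint URL, request payload, and response shape are hypothetical placeholders, not any particular vendor's API.

```python
# Minimal sketch: a developer consuming a pre-trained AI model offered as a service.
# The endpoint URL, payload fields, and response shape are hypothetical placeholders.
import requests

MODEL_ENDPOINT = "https://ai-models.example.com/v1/sentiment"  # hypothetical service URL

def classify_sentiment(text: str, api_key: str) -> str:
    """Send raw text to a hosted sentiment model and return its predicted label."""
    response = requests.post(
        MODEL_ENDPOINT,
        headers={"Authorization": f"Bearer {api_key}"},
        json={"text": text},
        timeout=10,
    )
    response.raise_for_status()
    # Assumed response shape: {"label": "positive", "confidence": 0.93}
    return response.json()["label"]

if __name__ == "__main__":
    print(classify_sentiment("The new release fixed every issue we reported.", api_key="demo-key"))
```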

"Such advanced AI-powered development environments, automating not only the functional but other aspects of building applications, will give rise to a new era of application development accessible to everyone, when non-professionals can use tools with AI elements to generate new solutions. In themselves, the means for non-professionals to generate applications without writing code are not new, but we expect AI-powered systems to bring a new level of flexibility," writes Curley.

Intelligent Edge Technologies

   By "boundary" (edge) refers to the terminal device used by people or embedded in the objects of the world around us.  By peripheral processing (edge-computing) is meant such a topology of the computational process, when information processing, as well as data collection and delivery, are implemented as close as possible to these devices themselves.  This model aims to make processing as local as possible to reduce the traffic of data transmitted over the network and, as a result, delay.

In the near term, the growth of edge technologies will be driven by IoT and the need to perform processing close to the devices rather than on a centralized cloud server. This does not create a new architecture, however: cloud computing and edge computing will evolve as complementary models, in which cloud services are managed as a centralized service but executed not only on centralized servers, but also on distributed servers (including on-premises) and on the edge devices themselves.

Over the next five years, specialized AI chips, along with greater computing power, storage, and other advanced capabilities, will be added to a wider range of edge devices. The extreme heterogeneity of this embedded IoT world and the long life cycles of assets such as industrial systems will create significant management challenges. In the longer term, as 5G matures, the expanding edge-computing environment will have more reliable communication back to centralized services. 5G will reduce latency, increase bandwidth, and (critically for edge computing) increase the spatial density of nodes (the number of edge devices per square kilometer).

Smart spaces

A "smart space" is a physical or digital environment in which people and technology-enabled systems interact in an open, connected, coordinated, and intelligent ecosystem. Multiple elements (people, processes, services, and "things") come together in a smart space to create more immersive, interactive, and automated experiences for a target group of people and industry scenarios.

"This trend has been coalescing for some time around areas such as smart cities, digital workplaces, smart homes, and intelligent (connected) enterprises. We believe the market is entering a period of accelerated development of full-fledged smart spaces, when technology will become an integral part of our daily lives, whether as employees, customers, consumers, members of communities, or citizens," writes Curley.

Quantum computing

Quantum computing (QC) is an alternative to classical computing; it operates on the quantum states of subatomic particles (for example, electrons or ions), which represent information as elements called qubits (quantum bits). The parallel execution and exponential scalability of quantum computers mean they excel at problems that are too complex for the traditional approach, or where traditional algorithms would take too long to find a solution. Automotive, financial services, insurance, and pharmaceutical companies, the military, and research organizations stand to benefit most from progress in quantum computing. In pharmaceuticals, for example, quantum computing could be used to model molecular interactions at the atomic level to speed up the creation of new cancer-treating drugs, or it could accelerate and more accurately predict protein interactions, leading to new pharmaceutical methods.
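The exponential scalability mentioned above can be stated concretely: a single qubit is a superposition of two basis states, and describing a register of n qubits requires 2^n complex amplitudes.

```latex
% One qubit: a superposition of the basis states |0> and |1>
\lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
\qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1

% An n-qubit register is described by 2^n complex amplitudes c_x,
% which is where the exponential scalability comes from:
\lvert \Psi \rangle = \sum_{x \in \{0,1\}^{n}} c_{x} \lvert x \rangle,
\qquad \sum_{x} \lvert c_{x} \rvert^{2} = 1
```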

"CIOs and IT service managers should begin planning for QC by deepening their understanding of the technology and how it can be applied to real business problems. Learn while this technology is still emerging. Identify the real problems where it has potential and consider its possible impact on security," writes Curley. "But do not believe the hype that it will revolutionize everything in the next few years. Most organizations should learn about and track its development through 2022 and begin using it from 2023 or 2025."

“Augmented" analytics

Augmented analytics is a specialized area of augmented intelligence that uses machine learning (ML) to automate business analytics, including extracting valuable business insights and sharing them across an organization. Augmented analytics tools will be adopted by the market and reach mainstream use as a key component of data preparation, data management, modern analytics, business process management, process mining, and data science platforms. Automated insights from augmented analytics will also be embedded in enterprise applications (for finance, HR, sales, marketing, customer service, procurement, and asset management) to optimize the decisions and actions of all employees within their areas of expertise, not only of professionals who process and analyze information. Augmented analytics automates data preparation, insight extraction, and visualization, in many situations removing the need for professional data analysts.
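As a rough sketch of what "automating insight extraction" can look like in practice (the dataset, column names, and correlation threshold below are invented for illustration), a tool in this category might profile a table and surface the strongest relationships as plain-language findings:

```python
# Minimal sketch in the spirit of augmented analytics: profile a dataset and surface
# strong numeric correlations as plain-language findings without a data analyst.
# The data, column names, and threshold are illustrative assumptions.
import pandas as pd

def auto_insights(df: pd.DataFrame, threshold: float = 0.5) -> list[str]:
    """Return plain-language statements about strongly correlated numeric columns."""
    findings = []
    corr = df.select_dtypes("number").corr()
    for i, a in enumerate(corr.columns):
        for b in corr.columns[i + 1:]:
            r = corr.loc[a, b]
            if abs(r) >= threshold:
                direction = "rises" if r > 0 else "falls"
                findings.append(f"When '{a}' increases, '{b}' typically {direction} (r = {r:.2f}).")
    return findings

if __name__ == "__main__":
    sales = pd.DataFrame({
        "ad_spend": [10, 20, 30, 40, 50],
        "site_visits": [120, 210, 310, 390, 520],
        "returns": [9, 8, 7, 7, 5],
    })
    for line in auto_insights(sales):
        print(line)
```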

"This will lead to 'data analysis for all': the emergence of tools and methods that enable people whose day-to-day work is far from statistical processing and analytics to extract predictive and prescriptive insights from [available] data," writes Curley. "Through 2020, the number of amateur data analysts will grow five times faster than the number of professional analysts. Organizations will be able to rely on these new analysts to cope with the gap in data analysis and machine learning caused by the shortage of such specialists and the high cost of their labor."
