
Why you might be thinking about real-time systems the wrong way


Organizations in many industries are increasingly interested in building real-time systems that go beyond the limited capabilities of the software systems of the past. The problems these systems need to address affect internal operations and customer experience, and also extend beyond the walls of individual organizations to change the capabilities of entire industries and even the health of the planet.

The next generation of real-time systems is being built for extremely diverse uses:

  • Health and safety: bringing a new level of public health and safety to smart buildings that automatically detect and prevent the spread of disease,
  • Retail: creating new personalized, neighborhood-level marketing experiences in physical retail environments,
  • Emergencies: detecting flooding and other emergencies and automatically triggering evacuation procedures

In all these situations there can be no compromise on responsiveness, reliability, and scalability. This requires those responsible for development to adopt a more modern way of thinking about how to architect these high-performance real-time systems.

When the Database-First Approach to Real-Time Fails – How Many Elevators Can You Really Monitor Before the System Goes Down?

In a modern megacity there may be hundreds of thousands of elevators across its buildings – all of which require constant monitoring to detect situations of concern, such as safety and security issues. The best way to tackle this type of ‘smart building’ challenge is real-time stream processing that can handle data analytics at scale and provide consistent, timely situational awareness.

Development will likely start with data from a single elevator, with analysis performed in a simple time-series database using small batch queries. But it would be wrong to assume that what works for one elevator will work for hundreds and then thousands.

The flaw in this assumption is that database queries are expected to handle the explosion of data without a major drop in performance. The approach works as expected with a small number of elevators, but the whole system fails once the volume of data (and of elevators) grows beyond the capacity of the database.

No matter how many other real-time capabilities are placed around the periphery of a traditional database, the database itself inherently limits the system at massive scale.

The solution for a highly scalable system is to perform anomaly detection in memory, and only then transfer the results to a database for historical purposes. The database is the last step, not the first, in a modern real-time system.
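To make that ordering concrete – analyze in memory first, persist afterwards – here is a minimal Python sketch. It is not from any specific product; the per-elevator rolling z-score, the sensor fields, and the SQLite table are illustrative assumptions.

```python
# Minimal sketch: analyze each elevator reading in memory first, persist afterwards.
# The sensor fields, window size, and z-score threshold are illustrative assumptions.
import sqlite3
from collections import defaultdict, deque
from statistics import mean, stdev

WINDOW = 50          # recent readings kept in memory per elevator
Z_THRESHOLD = 3.0    # rolling z-score that counts as an anomaly

windows = defaultdict(lambda: deque(maxlen=WINDOW))

# History lives in a database, but only as the *last* step of the pipeline.
history = sqlite3.connect(":memory:")
history.execute("CREATE TABLE anomalies (elevator_id TEXT, vibration REAL)")

def on_reading(elevator_id: str, vibration: float) -> None:
    """Handle one sensor reading entirely in memory."""
    window = windows[elevator_id]
    if len(window) >= 10 and stdev(window) > 0:
        z = abs(vibration - mean(window)) / stdev(window)
        if z > Z_THRESHOLD:
            # Act immediately (alert, dispatch, etc.), then record for history.
            print(f"anomaly on {elevator_id}: vibration={vibration:.2f} (z={z:.1f})")
            history.execute("INSERT INTO anomalies VALUES (?, ?)",
                            (elevator_id, vibration))
    window.append(vibration)

# Simulated stream: steady readings followed by one spike.
for v in [1.0, 1.1, 0.9, 1.0, 1.2, 1.0, 0.9, 1.1, 1.0, 1.1, 1.0, 9.5]:
    on_reading("elevator-42", v)
history.commit()
```

The design point is that the database write happens only after the in-flight analysis has already decided whether the reading matters – the query layer never sits on the hot path.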

Three types of real-time systems

With the growing interest in real-time systems comes noise, confusion, and misinformation about their different types and capabilities, and about how relevant (or not) databases are to their scalability and performance. There are three types of real-time systems, each of which is relevant to solving a different type of problem.

  1. ‘Hard’ real-time systems – hardware-based,
  2. Micro-batch real-time systems – ‘soft’ real-time systems that use more traditional data queries and batch processes,
  3. Event-driven real-time systems – ‘soft’ real-time systems that use stream or event processing.


1. The “hard” real-time system

These systems are needed to solve problems that cannot tolerate any missed ‘deadlines’, with response times measured in milliseconds. No database can deliver such performance, and in addition, all hardware and computation must be located on-site. High-precision automated robotic assembly lines require the rigor of this kind of real-time system.

2. The micro-batch real-time system

This approach is best suited to problems that require only some real-time processing, with latencies of hundreds of milliseconds (or even seconds) and modest scaling requirements. An e-commerce ordering system can be a good match for this.

Traditional data processing is performed on small amounts of data (micro-batches) on a rapid ‘duty cycle’. Serious problems arise when you try to scale up the system and reduce the latency between batches in an effort to make it behave like an event-driven real-time system.

As the number of batches increases linearly, the computational cost of continuously running queries across the growing volume of micro-batch data increases far faster – up to the square of the database size. At some point the laws of physics kick in, and it is simply not possible to get the system’s data-analysis layer to operate in a defined ‘real time’ at large volumes. Ultimately, databases will never be as fast as event processing.
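To see why, it helps to run the numbers. The toy model below is an illustration, not a benchmark: it assumes each micro-batch duty cycle re-queries everything ingested so far, so ingestion grows linearly while cumulative query work grows roughly with the square of the data.

```python
# Toy cost model for the micro-batch pattern: every duty cycle re-queries all
# of the data ingested so far. The numbers and the "cost = rows scanned" model
# are illustrative assumptions, not measurements.
ROWS_PER_BATCH = 1_000
CYCLES = 100

total_rows = 0          # data ingested (grows linearly)
rows_scanned = 0        # cumulative query work (grows ~quadratically)
for cycle in range(1, CYCLES + 1):
    total_rows += ROWS_PER_BATCH
    rows_scanned += total_rows   # each cycle's query touches the whole table

print(f"rows ingested: {total_rows:,}")      # 100,000
print(f"rows scanned:  {rows_scanned:,}")    # 5,050,000
```

One hundred cycles of linear ingestion produce fifty times as much query work as data – and the gap keeps widening as the system grows.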

3. The event-driven real-time system

This is the ‘goldilocks’ solution for applications that require very short response times, in the range of 1–10 milliseconds. Recommendation systems – in e-commerce, for example – and industrial automation are appropriate uses of this type of real-time system.

In-memory processing, not the database, is the driving force in this system. Information (from IoT sensors, embedded AI, event brokers, etc.) is processed in flight using stream analytics and can then be sent to a database for historical purposes.

As the amount of data increases, the computational work scales linearly – not superlinearly, as in the case of the micro-batch approach.
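A minimal sketch of the event-driven pattern in Python is shown below. The event broker is stood in for by an in-process queue, and the handler, threshold, and device names are illustrative assumptions – the point is simply that each event is handled once, in flight, with a constant amount of work.

```python
# Minimal sketch of event-driven processing: each event is handled once, in
# flight, so the work per event is constant and total work scales linearly with
# event volume. The broker is stood in for by an in-process queue; the handler,
# threshold, and device names are illustrative assumptions.
import queue
import threading

events = queue.Queue()     # stand-in for an event broker topic
recent_scores = {}         # in-memory state; no database reads on the hot path

def handle_event(event: dict) -> None:
    # Constant-time work per event: update in-memory state, decide, act.
    device, value = event["device"], event["value"]
    score = 0.8 * recent_scores.get(device, 0.0) + 0.2 * value
    recent_scores[device] = score
    if score > 3.0:
        print(f"act now: {device} score={score:.2f}")

def consumer() -> None:
    while True:
        event = events.get()
        if event is None:      # sentinel used to stop the worker
            break
        handle_event(event)

worker = threading.Thread(target=consumer)
worker.start()
for value in [1.0, 2.0, 8.0, 9.0, 9.5]:
    events.put({"device": "press-7", "value": value})
events.put(None)
worker.join()
```

In a production system the handler would also forward events to a database for history, but that write sits after the decision, not in front of it.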

Finding and Avoiding Performance Breakpoints and Scaling Limits in Real-Time Systems

The analysis of the three types of real-time systems shows us that systems using the traditional database storage model can never scale in real time, even if ingestion is real-time.

It takes time to execute queries, and query performance decreases as the database grows – which is exactly what happens when you try to scale the system. In our earlier elevator example, ingestion was real-time, but accessing and executing queries against the information stored in the database was NOT real-time.

The performance of that system is ultimately dictated by the worst-performing part of the entire system – the database.

When designing the next generation of real-time systems, it is essential to consider the timeframe in which different information must be accessed and understood, and the scale at which you ultimately want to grow your system.

It’s Not an Either/Or Decision – Next-Generation Real-Time Systems Will Need to Combine Approaches

There is no universal approach for all real-time systems. But it is important to start by understanding which information needs to be stored longer-term in a database for historical reporting, deeper analysis, and pattern recognition, as opposed to information that requires immediate action (on the order of milliseconds) through real-time event handling.

The best systems will be those that combine different data-processing models to take advantage of the benefits that each offers.

Featured image credit: Natã Romualdo

Mark Munro

Mark has over 35 years of experience in IT as a Software Engineer, Consultant, Technical Architect, and Solution Architect across development tools and platforms ranging from client/server 2- and 3-tier systems to SOA and event-driven platforms. Mark has worked for various technology companies including Digital Equipment Corp, Forté Software, AmberPoint, and now Vantiq. He has helped clients across multiple verticals develop, design, and architect highly complex and scalable applications and systems. Mark currently serves as Product Manager for Platforms and Accelerators, working with customers, consultants, partners, and engineering to understand and help define the product direction.


