
North East Connected


In-Memory vs. Near-Memory Computing

By Dave Stopher

Oct 16, 2021

As data-intensive applications proliferate, older technologies for storing and processing data are struggling to keep up. Demand for advanced data-processing systems is high, and new memory-centric chips promise to relieve the bandwidth bottlenecks common in current systems.

Memory-centric computing is a broad category, but its two newest branches are in-memory and near-memory computing. Both aim to overcome processing limitations by computing inside, or close to, the data store itself.

Detailed overview of in-memory computing

In-memory computing helps solve the challenges of storing data on traditional disks. RAM (random access memory) is the temporary storage a computer uses for its currently active tasks.

Data in RAM can be read roughly five thousand times faster than data on disk. The new technology uses software that treats RAM as a storage layer in its own right, much like a hard disk.
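A minimal sketch of the speed gap between the two read paths, in Python (the payload size and file name here are invented for illustration; real ratios vary widely with hardware, and the operating system's page cache can narrow the measured difference):

```python
import os
import time

PAYLOAD = os.urandom(4 * 1024 * 1024)  # 4 MiB of test data held in RAM

# Write the payload to disk so both read paths return the same bytes.
with open("payload.bin", "wb") as f:
    f.write(PAYLOAD)

def read_from_disk():
    # Goes through the filesystem: syscalls, buffering, possibly the device.
    with open("payload.bin", "rb") as f:
        return f.read()

def read_from_memory():
    # Copies straight out of RAM, no filesystem involved.
    return bytes(PAYLOAD)

start = time.perf_counter()
disk_copy = read_from_disk()
disk_time = time.perf_counter() - start

start = time.perf_counter()
mem_copy = read_from_memory()
mem_time = time.perf_counter() - start

assert disk_copy == mem_copy == PAYLOAD
print(f"disk: {disk_time:.6f}s  memory: {mem_time:.6f}s")
os.remove("payload.bin")
```

Both paths return identical data; only the route it travels differs, which is the whole premise of using RAM as the primary store.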

The software pools the RAM of several computers into one cluster: each machine stores a partition of the data and processes its partition in parallel with the others, so all machines work on the dataset continuously and simultaneously.

If a single machine's RAM is roughly five thousand times faster than a hard disk, ten or twenty machines working in parallel can push throughput tens of thousands of times beyond disk speeds.

Two components make this operation possible: the RAM storage layer and the distributed parallel-processing system.
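The partition-and-process idea can be sketched on a single machine using Python's multiprocessing as a stand-in for a cluster (the records, partition count, and word-count task are invented for the example):

```python
from multiprocessing import Pool

def process_partition(partition):
    # Each "node" works only on its own in-memory slice of the data.
    return sum(len(record.split()) for record in partition)

def split(data, parts):
    # Divide the dataset into roughly equal partitions.
    size = (len(data) + parts - 1) // parts
    return [data[i:i + size] for i in range(0, len(data), size)]

if __name__ == "__main__":
    records = ["alpha beta", "gamma", "delta epsilon zeta"] * 4
    partitions = split(records, parts=4)
    with Pool(processes=4) as pool:
        # All partitions are processed in parallel, then the partial
        # results are combined into one answer.
        partial_counts = pool.map(process_partition, partitions)
    print(sum(partial_counts))  # total word count across all partitions
```

In a real in-memory data grid the partitions live in the RAM of separate machines rather than separate processes, but the pattern is the same: store a slice, compute on the slice, merge the results.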

Near-memory computing solutions

The implementation of blockchain a decade ago helped bring a paradigm shift in data-computing technologies. It made it possible for one large system to store data in blocks chained together as a single unit across the globe; an update made in one block is automatically reflected in every participant's ledger.

Data and instructions live in computer memory, but every time data moves from one place to another there is a cost. The farther the data sits from the processor, the higher that cost. The cost rises further with a more powerful processor, which burns more energy moving data back and forth.

A solution is needed that cuts the cost of moving data, improves storage, and increases performance. Data can be stored in the cloud, on hard disks, or in RAM; the way to minimize cost is to bring the memory close to the processor so the distance is significantly shortened. This is achieved by stacking dies on an interposer that bridges the chips and the board, increasing bandwidth and I/O.

Near-memory computing lets a business store its most frequently used data closer to the processor, so moving it to the processor and back becomes cheap. The system uses multiple processors sharing one address space, making it possible to process data as a stream before it reaches the main server.

The technology can also be applied to cloud storage, where some data is moved out of the cloud and held at the edge, making it cheaper and easier to move.
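In software terms, keeping hot data next to the compute looks like a small cache in front of a slower, remote backing store. The sketch below is an analogy, not a hardware model; the store contents, cache size, and class name are invented for illustration:

```python
from collections import OrderedDict

class NearCache:
    """Keep the hottest items next to the consumer; fall back to the
    slower, remote backing store only on a miss."""

    def __init__(self, backing_store, capacity=2):
        self.backing_store = backing_store  # stands in for cloud/disk
        self.capacity = capacity
        self.cache = OrderedDict()          # stands in for near memory
        self.misses = 0

    def get(self, key):
        if key in self.cache:
            self.cache.move_to_end(key)     # mark as recently used
            return self.cache[key]
        self.misses += 1                    # costly trip to remote storage
        value = self.backing_store[key]
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict least recently used
        return value

store = {"a": 1, "b": 2, "c": 3}
cache = NearCache(store, capacity=2)
cache.get("a"); cache.get("b"); cache.get("a")  # third call served locally
print(cache.misses)  # 2: only "a" and "b" needed a remote fetch
```

The saving comes entirely from shortening the distance for repeated accesses, which is the same principle near-memory hardware applies at the chip level.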

In-memory versus near-memory technology

These two technologies offer businesses significant advantages once fully implemented.

Multiple storage systems

Traditional storage methods are limiting because businesses today generate large volumes of data daily. In-memory computing provides a timely solution: a business can pool the random access memory of several computers, and when that pooled RAM acts as the store, large volumes of data can be held and worked on far faster than with disk storage.

Reduced cost of moving data

The ordinary way of moving data comes at a cost: data is shipped to cloud storage or pulled from remote disk backups, and the process takes time and consumes energy. For a business that moves data often in its daily operations, the overall cost can be high.

Near-memory technology helps cut this cost by closing the distance gap. Computer engineers have created chips that store frequently used data near the processor, so the data is accessed faster and travels a much shorter distance.

Both in-memory and near-memory computing have significant cost-saving implications for a business. First, near-memory technology shortens the distance data travels and minimizes the energy used to move it.

In-memory computing, on the other hand, cuts the time taken to process data. Both approaches save heavily on time and energy, and when that is converted into saved cost, a business can save several thousand dollars per annum.

Exponential speed of processing data

Compared to disks, RAM can deliver data roughly five thousand times faster. In-memory technology allows the RAM of several machines to be interconnected so that each stores a large share of the data and processes it in parallel with the others.

This has enabled businesses to process data thousands of times faster. Large organisations such as commercial banks, insurance companies, and major retailers can process millions of transactions per second across the globe without overloading their systems.

Extended hardware and software lifetime

When all data is stored and processed within a single computer, the entire system becomes overworked: the hardware draws more power and the software is pushed harder. Their lifetime shortens, and companies must replace systems more often.

In-memory technology distributes data in portions away from the main computer system, reducing the energy required to process and move it. Neither the hardware nor the software is overworked, and both last longer.

Easy access to data

Data stored near-memory is easily accessed because it sits next to the processor, unlike data stored remotely. Data held in-memory is likewise accessible in real time, because several computers work on it in parallel. Together these approaches reduce processing, retrieval, and storage time, with a tangible impact on overall cost.
