Read the Latest from Gartner on the Data Tsunami and Edge Computing
Thursday, March 19, 2020
Data generation is accelerating relentlessly. An estimated 2.5 quintillion bytes of data are created every single day, and that volume will only continue to grow.
NGD Systems Raises $20 Million in Series C Funding to Accelerate the Deployment of the World’s First NVMe Computational Storage Drive
Wednesday, February 05, 2020
The latest round includes new investments from MIG Capital and Western Digital Capital Global, Ltd, and will enable artificial intelligence, machine learning and edge computing within computational storage.
Scalable Computational Storage
We listen to customer requirements and deliver solutions that provide In-Situ Processing, bringing compute capability to the storage device itself. Uniting storage and compute in this way meets the growing demands of the Edge, the datacenter and AI.
451 Research announces NGD Systems as a Q3 2019 Firestarter innovation award recipient
The 451 Firestarter award recognizes organizations for exceptional innovation and disruption in their market. Each quarter, the 451 Research analyst team nominates technology firms of any size and age, based on its insights and expert opinion on long-term trends and the competitive landscape within the industry. Read The Report
NGD Systems and UCI Collaborate to Improve Machine Learning Training
Why try to put square pegs in round holes? See how NGD Systems and UCI were able to shave the pegs of Deep Neural Network training in this technology paper. Read The Report
Bringing Intelligence to Storage
Overcoming the Data Tsunami
Data growth is creating a gap in today's compute infrastructure: the amount of information being generated outstrips the infrastructure's ability to deliver the results required. The ability to sort, search, scan and derive value from data at the source is paramount to overcoming the data tsunami.
Artificial Intelligence (AI), Machine Learning (ML) and similar workloads require a new method of delivering real-time results. That is the power of In-Situ Processing.
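The benefit of processing data at the source can be illustrated with a simplified model. This is a conceptual sketch only, not NGD's actual interface; the record size, data set and query below are invented for illustration:

```python
# Conceptual sketch (not a real computational-storage API): contrast the
# traditional model, which moves every record across the bus to the host,
# with in-situ processing, which runs the query on the drive and returns
# only the matching results.

RECORD_SIZE = 4096  # assumed bytes per stored record


def host_side_search(records, predicate):
    """Traditional model: transfer all records to the host, then filter."""
    bytes_moved = len(records) * RECORD_SIZE
    matches = [r for r in records if predicate(r)]
    return matches, bytes_moved


def in_situ_search(records, predicate):
    """Computational-storage model: filter on the device, move only hits."""
    matches = [r for r in records if predicate(r)]
    bytes_moved = len(matches) * RECORD_SIZE
    return matches, bytes_moved


if __name__ == "__main__":
    data = list(range(1_000_000))
    wanted = lambda r: r % 1000 == 0  # selective query: 0.1% of records match

    hits_host, moved_host = host_side_search(data, wanted)
    hits_drive, moved_drive = in_situ_search(data, wanted)

    assert hits_host == hits_drive  # same answer either way
    print(f"host-side transfer: {moved_host:,} bytes")
    print(f"in-situ transfer:   {moved_drive:,} bytes")
    print(f"{moved_host // moved_drive}x less data moved in-situ")
```

The more selective the query, the larger the reduction in data movement, which is why in-situ processing pays off most for search, scan and filter workloads.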
In the News
Wednesday, March 25, 2020
As more applications generate massive amounts of data -- think IoT, AI, analytics -- the time taken to transfer, analyze and process that data is becoming a significant factor, slowing storage networks and systems. Computational storage is a way to update storage architectures and strip out latency, making the storage-compute process more efficient. Learn More
Tuesday, February 25, 2020
Enterprises are said to be awash in data, and one of the problems posed by all this data is not just storing it, but processing it. Storage architectures have by and large been keeping up with the capacity problem, while the introduction of flash has also given storage a much-needed speed-up over the past decade, significantly boosting the performance of transactional workloads such as databases as well as file, block, and object storage. Learn More
Saving FAISS with In-Situ Processing
Our revolutionary Computational Storage solutions use In-Situ Processing (processing within the storage device) to accelerate modern AI and ML workloads, running as much as 500x faster than traditional architectures. Learn More
QoS, Not IOPS and GBs is Key
Current SSD metrics are no longer sufficient for today's workloads. Looking beyond IOPS and GBs to focus on execution and QoS is paramount to successfully managing data that is now more valuable than ever. Learn More
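Why averages and raw IOPS can mislead is easy to sketch with a toy latency comparison. The numbers below are illustrative, not measurements of any real drive: two drives with nearly identical mean latency can have very different tail behavior, which percentile metrics like p99 expose.

```python
# Conceptual sketch: average latency hides the tail; QoS metrics
# (percentiles such as p99) reveal it. Latency values are invented.

def percentile(samples, pct):
    """Nearest-rank percentile of a list of latency samples (ms)."""
    ordered = sorted(samples)
    rank = max(1, round(pct / 100 * len(ordered)))
    return ordered[rank - 1]


# Drive A: steady 1.0 ms latency on every request.
# Drive B: usually faster (0.5 ms), but ~1% of requests stall badly.
drive_a = [1.0] * 1000
drive_b = [0.5] * 989 + [46.0] * 11  # mean is still roughly 1.0 ms

mean_a = sum(drive_a) / len(drive_a)
mean_b = sum(drive_b) / len(drive_b)
print(f"mean latency: A={mean_a:.2f} ms, B={mean_b:.2f} ms")  # nearly identical
print(f"p99 latency:  A={percentile(drive_a, 99)} ms, "
      f"B={percentile(drive_b, 99)} ms")  # the tail tells a different story
```

By the mean alone the two drives look interchangeable, but drive B's p99 is dozens of times worse, which is exactly the kind of behavior that disrupts latency-sensitive workloads.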
Capacity Scaling Done Right
With the data tsunami coming, storing and managing all of that data becomes a challenge. Not only do you need more capacity in a single device, but having that device manage its own data is key. See how high-capacity storage combined with compute options solves scale issues. Learn More