StorONE Blog

How to Bypass the Compromises of Legacy RAID Architectures

Posted by Gal Turchinski
on July 8, 2019

Traditional storage architectures force IT professionals to sacrifice either cost or performance in order to obtain data protection services such as snapshots and erasure coding. This is no longer acceptable in a business environment that increasingly tolerates no compromise on data integrity or application performance, and that demands maximum utilization of hardware resources. In this installment, we will explore how StorONE has rearchitected its storage software to enable resiliency and accelerate drive rebuilds without impacting performance, while also increasing both flexibility and hardware cost efficiency.

Read More

Volume Level Erasure Coding to Avoid Storage Tradeoffs

Posted by Gal Turchinski
on July 2, 2019

We previously explored the tradeoffs that traditional storage snapshots impose in terms of cost and performance, and how StorONE has rewritten its snapshot algorithms so that customers are not forced to choose among snapshot protection, the required levels of performance, and staying within budget. In this blog, we will evaluate the similar problem that erasure coding presents, and how StorONE’s N+K approach enables customers to address it.
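The teaser does not spell out how N+K striping works. For context only, here is a minimal sketch of the simplest erasure-coded stripe — N data blocks protected by K = 1 XOR parity block — as a hypothetical illustration of the general principle, not StorONE’s actual N+K algorithm (general K > 1 requires Reed-Solomon-style coding):

```python
# Hypothetical N+1 erasure-coding sketch: N data blocks, one XOR parity block.
def make_parity(data_blocks):
    """Compute a parity block by XOR-ing all N data blocks together."""
    parity = bytearray(len(data_blocks[0]))
    for block in data_blocks:
        for i, b in enumerate(block):
            parity[i] ^= b
    return bytes(parity)

def rebuild(surviving_blocks, parity):
    """Recover a single lost block: XOR the parity with every survivor."""
    lost = bytearray(parity)
    for block in surviving_blocks:
        for i, b in enumerate(block):
            lost[i] ^= b
    return bytes(lost)

# A stripe of N = 3 data blocks plus K = 1 parity block.
data = [b"AAAA", b"BBBB", b"CCCC"]
parity = make_parity(data)

# Simulate losing the middle drive and rebuilding its block from the rest.
recovered = rebuild([data[0], data[2]], parity)
assert recovered == b"BBBB"
```

The tradeoff the post alludes to follows directly from this sketch: rebuilding one block requires reading every surviving block in the stripe, which is why rebuild speed and foreground performance contend for the same drives.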

Read More

StorONE Unified Virtual Appliance with Seagate Drives Ideal for I/O-Intensive Environments

Posted by Gal Naor
on June 25, 2019

NEW YORK – June 25, 2019 – StorONE today announced record speeds using Seagate SSDs and StorONE’s TRU™ S1 Software Defined Storage solution. In recent performance testing, StorONE combined its software with Seagate’s enterprise-class SSDs in a virtual appliance configuration that reached a breakthrough half a million IOPS with 24 Seagate SSDs and all enterprise-class data protection features running. The two-node, high-availability, failure-proof VMware cluster achieved this rate on 4K random reads, along with 180,000 IOPS on 4K random writes, at latencies below 0.2 milliseconds.
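As a back-of-the-envelope sanity check on the figures quoted above (illustrative arithmetic only, not part of the press release):

```python
# Rough per-drive and bandwidth arithmetic from the quoted cluster figures.
read_iops = 500_000   # cluster-wide 4K random reads
write_iops = 180_000  # cluster-wide 4K random writes
drives = 24           # Seagate SSDs in the virtual appliance

# Average random-read load each SSD would carry.
reads_per_drive = read_iops / drives
print(f"~{reads_per_drive:,.0f} random-read IOPS per SSD")

# Aggregate bandwidth implied by 4 KiB blocks at that read rate.
read_mb_s = read_iops * 4096 / 1e6
print(f"~{read_mb_s:,.0f} MB/s aggregate 4K read throughput")
```

At roughly 20,800 read IOPS per drive, the cluster is drawing a fraction of each enterprise SSD’s raw capability, which is consistent with the claim that the data protection features remained enabled throughout the test.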

Read More

Rethinking Snapshots to Accelerate Performance

Posted by Gal Turchinski
on June 17, 2019

Previously, we discussed the challenges inherent in providing the strong levels of data protection that are required today. Specifically, outdated storage architectures require application performance to be sacrificed and budgets to be exceeded in order to obtain acceptable levels of data protection and resiliency. Of the data protection capabilities, snapshots are the most CPU- and memory-intensive, and as a result are a leading culprit of performance slowdowns. Throughout this installment, we will discuss how StorONE has rewritten snapshots to address this challenge.
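The teaser does not describe either the legacy or the rewritten algorithm. For background, a toy copy-on-write (COW) snapshot illustrates why classic snapshot implementations burn CPU and memory on every overwrite — a hypothetical sketch of the traditional approach, not StorONE’s implementation:

```python
# Toy copy-on-write snapshot: bookkeeping grows with every snapshot taken.
class Volume:
    def __init__(self, nblocks):
        self.blocks = {i: b"\x00" for i in range(nblocks)}
        self.snapshots = []  # each snapshot: {block_index: preserved old data}

    def snapshot(self):
        # A new snapshot starts empty; it fills in lazily as blocks change.
        self.snapshots.append({})

    def write(self, idx, data):
        # Before overwriting, copy the old block into every snapshot that has
        # not yet preserved it -- the per-write tax that COW imposes, and it
        # scales with the number of live snapshots.
        for snap in self.snapshots:
            if idx not in snap:
                snap[idx] = self.blocks[idx]
        self.blocks[idx] = data

    def read_snapshot(self, snap_no, idx):
        # A snapshot serves its preserved copy, else the current block.
        snap = self.snapshots[snap_no]
        return snap.get(idx, self.blocks[idx])

vol = Volume(4)
vol.write(0, b"v1")
vol.snapshot()                              # snapshot 0 captures b"v1"
vol.write(0, b"v2")                         # triggers a preserving copy
assert vol.read_snapshot(0, 0) == b"v1"     # snapshot still sees old data
assert vol.blocks[0] == b"v2"               # live volume sees new data
```

The extra copy inside `write` is the overhead the post is referring to: with many snapshots and write-heavy workloads, that bookkeeping competes directly with application I/O for CPU and memory.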

Read More

Data Integrity: The Backbone of Competitive Advantage

Posted by Gal Turchinski
on June 5, 2019

Data is the foundation of business advantage in today’s economy. Analytics and artificial intelligence (AI) are helping businesses to uncover new competitive opportunities and to operate in a more efficient and streamlined fashion. At the same time, requirements for data privacy are higher than ever before, because consumers are becoming more discerning about how their information is used, and stringent data privacy regulations are emerging globally. Simply put, it is mission-critical to the business that data be available, accurate, consistent and secure.

Read More

Innovation Over Integration Yields Unprecedented Storage Efficiency

Posted by Gal Naor
on May 14, 2019

We live in an age of tremendous storage hardware innovation.

Solid-state drives (SSDs) that are capable of delivering more than 100,000 input/output operations per second (IOPS) in raw performance have hit the market. The reality, though, is that customers are not getting the full benefits of these innovations. They are only able to obtain a fraction of these levels of performance from their storage arrays, because the storage array is bogged down by wildly inefficient legacy storage software algorithms.

Read More

How to Reduce the Cost of Storage Operations

Posted by Gal Naor
on April 29, 2019

Storage managers have always been pressured to do more with less.

That pressure intensifies as the volume of data explodes, as the number of performance-hungry workloads grows, and as faster but more expensive storage technologies such as solid-state drives (SSDs) and non-volatile memory express (NVMe) drives enter the equation. Delivering the throughput, processing power, and storage capacity required by today’s workload ecosystem without breaking the bank necessitates new levels of hardware utilization that are not possible with legacy storage software.

Read More

How to Reduce the Cost of Performance Storage

Posted by Gal Naor
on April 8, 2019

The advent of solid-state drive (SSD) media and non-volatile memory express (NVMe) protocols makes storage cost optimization more important than ever before.

SSDs offer more capacity per unit than hard disk drives (HDDs), and they continue to become denser as the technology matures. Meanwhile, NVMe delivers faster performance and lower latency to this media, and new networking interfaces such as 40 Gigabit Ethernet (40GbE) and 100 Gigabit Ethernet (100GbE) increase bandwidth. All of this innovation comes at a price premium, creating the need to harness all the performance available from the storage media effectively.

Read More

Why Does IT Buy New Storage?

Posted by Gal Naor
on March 18, 2019

Storage environments have historically been siloed.

Infrastructure is dedicated specifically to production, backup, and archive use cases, and each environment or workload often receives its own dedicated storage infrastructure. This fragmentation is exacerbated as hyperconverged infrastructure, cloud storage services, and a new tier of performance-hungry workloads enter the equation. IT organizations quickly find themselves facing a significant degree of storage cost and complexity — unacceptable in light of digital business requirements for agility, simplicity, and new levels of cost efficiency.

Read More

What Are Your Storage Priorities?

Posted by Gal Naor
on March 11, 2019

The Storage Utilization Challenge

Today’s information era places a premium on storage performance and capacity while further squeezing budgets. Data is growing at an exponential rate, and businesses are turning to artificial intelligence, machine learning and analytics workloads in a meaningful way to harness this information for new advantages. Meanwhile, these business requirements necessitate faster performance from traditional workloads such as Oracle and Microsoft SQL Server as well. This demanding workload ecosystem requires unprecedented levels of utilization of storage capacity, storage memory and storage IO performance.

Read More