Evaluating Storage Products for Your Enterprise Needs

For IT departments, evaluating storage products is a critical process that can determine the entire shape of a company’s digital infrastructure. A poorly architected storage solution can substantially impair a department’s performance and lead to major outages or, in worst-case scenarios, permanent data loss. An intelligent decision, however, made with the right factors in mind, can provide an organization with a scalable shared storage solution that meets the performance and reliability service-level objectives of the proposed system design.

Especially at larger scale, maintaining IT architecture can be like keeping an old car operational: expensive and resource-intensive, particularly if you don’t have the time or money to source a better alternative. IT sysadmins working with antiquated, inefficient hardware can struggle to keep pace and support data transformation initiatives.

Choosing a suitable storage solution

When designing a greenfield solution, it’s important to first understand the high-level architecture and system design of the proposed solution, along with the possible resource bottlenecks throughout the entire stack. This enables application and storage architects to choose and design a suitable storage solution. Here are some key questions storage architects should ask to make an informed decision:

  • What is the storage solution intended for?
  • Will it require block, file or object storage access?
  • What is the typical workload like?
  • What are the IOPS, throughput, and latency requirements?
  • What is the required availability? (99.9%, 99.99%, 99.999%?)
  • Does the data need to be backed up? How frequently?
  • Does the data need to be replicated?
  • What are the disaster recovery requirements in terms of recovery time objective (RTO) and recovery point objective (RPO)?
  • What are the data retention requirements?
  • How much does the data change daily, weekly, monthly, yearly?
  • What is the expected growth in capacity per year? (The sketch after this list turns this and the availability target into concrete numbers.)
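Some of these questions translate directly into arithmetic. Below is a minimal Python sketch, assuming illustrative figures (100 TB today, 30% annual growth, a 3-year horizon), that converts an availability target into an annual downtime budget and projects future capacity; it is a planning aid, not vendor guidance.

  HOURS_PER_YEAR = 24 * 365

  def downtime_budget_hours(availability_pct: float) -> float:
      """Hours of allowed downtime per year for a given availability %."""
      return HOURS_PER_YEAR * (1 - availability_pct / 100)

  for target in (99.9, 99.99, 99.999):
      print(f"{target}% -> {downtime_budget_hours(target):.2f} hours/year of downtime budget")

  def projected_capacity_tb(current_tb: float, annual_growth_pct: float, years: int) -> float:
      """Capacity needed after `years` of compound annual growth."""
      return current_tb * (1 + annual_growth_pct / 100) ** years

  # Illustrative assumption: 100 TB today, 30% yearly growth, 3-year horizon.
  print(f"Capacity in 3 years: {projected_capacity_tb(100, 30, 3):.1f} TB")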

Enterprise Data Requirements

Understanding the block, file, and object requirement

When onboarding new applications, it’s important to understand the kind of data being stored to make an informed decision on whether to use block, file or object storage.

Block storage is the most common use case for DAS and SAN environments. In the case of DAS, an entire RAID volume or physical drive is presented to the OS as a raw, unformatted volume. In SAN environments, an entire LUN (composed of several physical drives) on the storage array is presented to the OS over a high-speed network and likewise appears as a raw, unformatted volume. The raw volume consists of smaller extents or sectors that the operating system addresses as logical blocks, and the underlying storage subsystem maps those logical blocks to specific physical blocks on the specific drive(s). Block-level storage is fast, reliable, and ideal for continuously changing data like relational databases, online transaction processing (OLTP) databases, email servers, or virtual desktop infrastructure, where high transaction throughput and low latency are requirements.
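As a minimal sketch of what “raw, unformatted volume” means in practice, the Python snippet below reads one logical block from a block device by byte offset, with no filesystem in between. The device path /dev/sdb is a placeholder assumption; raw device access is POSIX-specific and typically requires elevated privileges.

  import os

  DEVICE = "/dev/sdb"   # hypothetical raw block device (DAS drive or SAN LUN)
  BLOCK_SIZE = 4096     # logical block size in bytes

  fd = os.open(DEVICE, os.O_RDONLY)
  try:
      # Read logical block 10; the storage subsystem maps this logical
      # offset to physical blocks on the underlying drive(s).
      data = os.pread(fd, BLOCK_SIZE, 10 * BLOCK_SIZE)
      print(f"read {len(data)} bytes from logical block 10")
  finally:
      os.close(fd)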

Object storage stores data (and the metadata associated with it) in containers with unique identifiers, with no folders or subdirectories like those associated with file storage. It uses the concept of a key-value store, where each key points to a specific “value” or piece of data that is retrieved via APIs.

It is mainly used to handle large amounts of unstructured data, like emails, backup images, video surveillance footage, or IoT data, and for data management in machine learning and data analytics. Object storage is good at handling very large amounts of data and can scale as quickly as the application requires, but it is slow at data retrieval, making it inefficient for databases or high-performance computing. Examples of object storage are Amazon S3, Google Cloud Storage, and Azure Blob Storage.
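The key-value access pattern looks roughly like the Python sketch below, using Amazon S3 via the boto3 library; the bucket name is a placeholder assumption, and credentials are assumed to come from the environment.

  import boto3

  s3 = boto3.client("s3")
  BUCKET = "example-enterprise-archive"   # hypothetical bucket name

  # PUT: store a value (the object) under a unique key, with metadata.
  s3.put_object(
      Bucket=BUCKET,
      Key="surveillance/2024-06-01/cam01.mp4",
      Body=b"...video bytes...",
      Metadata={"camera": "cam01", "retention-days": "90"},
  )

  # GET: retrieve the same object by its key through the same API.
  obj = s3.get_object(Bucket=BUCKET, Key="surveillance/2024-06-01/cam01.mp4")
  payload = obj["Body"].read()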

File storage stores data in files, organized in folders and subdirectories, and is shared over a network using SMB (Windows) or NFS (Linux). It’s good for centralizing files like videos, images, or documents, but its scalability is limited as the amount of data grows. It is not the most suitable option for very large amounts of unstructured data or for continuously changing data like OLTP databases.
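By contrast with an object store’s flat key space, file storage is navigated hierarchically. A minimal sketch, assuming a share already mounted at the hypothetical path /mnt/corp-share:

  from pathlib import Path

  SHARE = Path("/mnt/corp-share")   # hypothetical SMB/NFS mount point

  # Paths encode the folder hierarchy; clients browse it like a local disk.
  for video in SHARE.glob("projects/*/media/*.mp4"):
      print(video, video.stat().st_size, "bytes")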

Successful enterprises therefore concern themselves with building high-performance computing (HPC) systems that leverage local databases and data services for transactional computation while integrating natively with cloud object stores for large amounts of unstructured data. Throughput- and IOPS-intensive transactions run on fast block and file storage in local data centers, while slower cloud object storage holds the bulk of the unstructured data.
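One common way to realize this split is a simple tiering job: keep recent, hot files on the fast local tier and move cold ones out to the object store. A minimal sketch, where the local path, bucket name, and 30-day threshold are all illustrative assumptions:

  import time
  from pathlib import Path
  import boto3

  LOCAL = Path("/data/hot")               # hypothetical fast local storage
  BUCKET = "example-enterprise-archive"   # hypothetical S3 bucket
  AGE_LIMIT = 30 * 24 * 3600              # tier out files older than 30 days

  s3 = boto3.client("s3")
  now = time.time()

  for f in LOCAL.rglob("*"):
      if f.is_file() and now - f.stat().st_mtime > AGE_LIMIT:
          s3.upload_file(str(f), BUCKET, f"archive/{f.name}")  # copy to object store
          f.unlink()                                           # free the fast tier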

Large-scale data processing requires a data storage solution based on the type of data your enterprise needs to analyze. For example, to process and analyze unstructured on-prem or cloud-based data, companies need a file data platform for a hybrid storage infrastructure, one that can provide real-time analytics and insights.

Storage Performance Testing

A central pillar of evaluating storage products is testing and validating them. The benefits of testing are many: improved application performance, storage cost optimization, and risk mitigation are all outcomes that can be verified with the right tools. That said, small or underfunded IT departments can find this hard to do, as DIY or shareware tools often lack the rigor needed to replicate a company’s real-world production environment.

Testing can be used to answer any or all of these questions (a minimal measurement sketch follows the list):

  • How much can I improve app performance by implementing new storage tech/products?
  • Can I afford the performance improvement?
  • Will new techniques reduce the cost per gigabyte without overly affecting performance?
  • How can I select the best tech/product/configuration to match my app workloads?
  • Which workloads will gain most from new architectures/products?
  • Where are the performance limits of potential new configurations?
  • How will storage media behave on reaching performance limits?
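As a minimal sketch of the kind of measurement behind these questions, the Python snippet below times 4 KiB random reads against a scratch file and reports IOPS and average latency. The file path and sizes are illustrative assumptions; serious evaluations use dedicated benchmarking tools and replay real workloads, and without direct I/O this mostly measures the OS page cache rather than the media.

  import os, random, time

  PATH = "/tmp/storage-test.bin"    # hypothetical scratch file on the device under test
  FILE_SIZE = 256 * 1024 * 1024     # 256 MiB
  BLOCK = 4096                      # 4 KiB reads
  READS = 10_000

  # Create a sparse test file of the target size.
  with open(PATH, "wb") as f:
      f.truncate(FILE_SIZE)

  fd = os.open(PATH, os.O_RDONLY)
  offsets = [random.randrange(FILE_SIZE // BLOCK) * BLOCK for _ in range(READS)]

  start = time.perf_counter()
  for off in offsets:
      os.pread(fd, BLOCK, off)
  elapsed = time.perf_counter() - start
  os.close(fd)

  print(f"IOPS: {READS / elapsed:,.0f}")
  print(f"average latency: {elapsed / READS * 1e6:.1f} µs")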

If you are choosing a scalable enterprise data storage solution, it is vital to pay attention to how your chosen storage behaves with your actual data and applications.

Support for Storage Products

A great product can be sadly undermined by a lackluster support team when an enterprise needs help managing issues during its use. Conversely, a good product can be raised to new heights by the exceptional efforts of its technical support staff. It may be worth accounting for your existing professional relationship with your enterprise’s incumbent storage vendor when weighing a potential change in your storage solution. Additionally, any service-level agreements (SLAs), such as meeting KPIs like latency, throughput, or IOPS under specific workloads, should inform your choice. If your intended vendor has a strong reputation in the industry (for example, they generally exceed industry-standard benchmarks), you can take them at their word when they advertise features such as high IOPS and throughput at acceptable latency for each platform.

Another element to keep in mind is the cost of any storage products you’re considering: not only the cost of acquisition, but also the cost of maintenance and the total cost of ownership (TCO).
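TCO comparisons come down to simple arithmetic. A minimal sketch, where all figures are illustrative assumptions rather than vendor pricing:

  def tco(acquisition: float, annual_maintenance: float,
          annual_power_and_admin: float, years: int) -> float:
      """Total cost of ownership over a planning horizon in years."""
      return acquisition + years * (annual_maintenance + annual_power_and_admin)

  # A cheaper array with pricier support can cost more over five years.
  option_a = tco(acquisition=120_000, annual_maintenance=10_000,
                 annual_power_and_admin=6_000, years=5)
  option_b = tco(acquisition=150_000, annual_maintenance=4_000,
                 annual_power_and_admin=5_000, years=5)
  print(f"Option A 5-year TCO: ${option_a:,.0f}")   # $200,000
  print(f"Option B 5-year TCO: ${option_b:,.0f}")   # $195,000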

#KingstonIsWithYou

Ask a Server SSD Expert

Planning the right solution requires an understanding of your project's storage goals. Let Kingston's experts guide you.

Ask an Expert
