5 Storage Infrastructure Management Challenges for IT Managers

Park Place Hardware Maintenance


Parker | January 25, 2018


The words used to describe the increase in global data volumes aren’t pretty. Overload. Onslaught. Even explosion. IT leaders are feeling battered as their organizations collect more data than they know what to do with.

Industry watchers predict 2018 will bring a wave of bigger, faster flash storage, hyperconverged infrastructure, and software-defined storage to help manage the influx of data headed their way.

These solutions are vital, but they overlook a core problem. For many enterprises today, data is more a cost center than a revenue generator.

The big data movement, increased video use, and other data-driven business trends are pushing organizations to develop new and more advanced storage architectures. This leaves IT managers on the back foot, needing to create hardware strategies that support business goals without breaking the bank. The resulting challenges can be overwhelming, but the right combination of sophisticated tactics and effective partnerships can help IT leaders deal with storage issues.

5 Common Storage Infrastructure Management Challenges

The first step in developing these plans is to understand the scope of the challenges affecting the storage segment of the data center. Five key storage issues facing IT managers include the following.

1. Big data

The analytics movement has become a pervasive IT trend and left many IT managers shaking their heads at the rapid rise in both structured and unstructured data within their facilities. At the same time, IT teams must handle new information being added to analytics systems continuously, plus the need to archive data once it has been used so that it remains accessible without taking up space on high-performance systems. The capacity and performance requirements of big data can be staggering, making advanced tactics critical for success.
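As a rough illustration of the archiving tactic described above, the sketch below moves files that have not been accessed in 90 days from a high-performance tier to an archive tier. The paths, threshold, and approach are assumptions for illustration; production environments would typically rely on vendor tiering or ILM tooling rather than a script like this.

```python
import os
import shutil
import time

# Hypothetical mount points; substitute your own high-performance and archive tiers.
FAST_TIER = "/mnt/fast-tier"
ARCHIVE_TIER = "/mnt/archive-tier"
AGE_THRESHOLD_DAYS = 90  # move files not accessed in the last 90 days (assumed policy)

def archive_cold_files(src_root: str, dst_root: str, max_age_days: int) -> None:
    """Move files whose last access time exceeds the threshold to the archive tier."""
    cutoff = time.time() - max_age_days * 86400
    for dirpath, _, filenames in os.walk(src_root):
        for name in filenames:
            src_path = os.path.join(dirpath, name)
            if os.path.getatime(src_path) < cutoff:
                rel_path = os.path.relpath(src_path, src_root)
                dst_path = os.path.join(dst_root, rel_path)
                os.makedirs(os.path.dirname(dst_path), exist_ok=True)
                shutil.move(src_path, dst_path)  # data stays accessible, off the fast tier

if __name__ == "__main__":
    archive_cold_files(FAST_TIER, ARCHIVE_TIER, AGE_THRESHOLD_DAYS)
```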

Enterprise network monitoring software can help identify and alleviate the pain points of these big data challenges.

2. Increased flash use

Flash technologies are rising in part because they are becoming more accessible and in part because big data is making their performance capabilities necessary. Many experts agree that although flash is expensive, its ability to accelerate data workflows in big data plans makes it capable of delivering a huge return on investment. While this potential is significant, IT teams must not only secure the capital for flash investments, they must also be ready to support archiving to make sure data is prepared for long-term storage.

While flash has the potential to be incredibly viable for long-term storage, it is currently limited by its write endurance. In general, flash cells can only handle approximately 10,000 program/erase cycles before performance and reliability decline, making flash ideal for high-performance projects but inadequate for long-term archiving and similar tasks. Enterprise storage management services are available to alleviate the pain of growing storage needs.
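To put the cycle limit in context, the back-of-the-envelope estimate below converts an endurance figure into an expected drive lifetime for a given write workload. Only the 10,000-cycle figure comes from the discussion above; the 1 TB capacity, 10 TB/day write rate, and write-amplification factor are illustrative assumptions, and real drives vary widely.

```python
# Back-of-the-envelope flash endurance estimate (illustrative numbers only).
CAPACITY_TB = 1.0           # drive capacity in terabytes (assumed)
PE_CYCLES = 10_000          # program/erase cycles per cell, per the figure above
DAILY_WRITES_TB = 10.0      # heavy analytics ingest per day (assumed workload)
WRITE_AMPLIFICATION = 2.0   # rough factor; real values depend on workload and controller

total_writes_tb = CAPACITY_TB * PE_CYCLES / WRITE_AMPLIFICATION  # usable write budget
lifetime_days = total_writes_tb / DAILY_WRITES_TB
print(f"Estimated write budget: {total_writes_tb:,.0f} TB")
print(f"Estimated lifetime: {lifetime_days / 365:,.1f} years")
```

Under these assumptions the drive wears out in well under two years, which is why write-heavy tiers need endurance planning while archive tiers, written once and read rarely, stress flash far less.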

3. Video

Many businesses are implementing video strategies as part of a larger goal of engaging with employees and customers in more meaningful ways. While video offers incredible potential from an end-user perspective, it also creates nightmares for IT managers. Storing video can be a huge challenge because the amount of data involved is considerable. Codec technologies can ease this burden by enabling video to be stored in more efficient, less data-intensive formats, but such strategies can be difficult to employ.
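The storage math behind video is straightforward to sketch. The example below compares the capacity needed for 500 hours of footage at two bitrates; both bitrates are illustrative assumptions standing in for a less efficient and a more efficient codec at comparable quality.

```python
# Rough video storage estimate (illustrative bitrates; real codecs and content vary).
HOURS_OF_VIDEO = 500
BITRATE_MBPS_ORIGINAL = 8.0    # less efficient encode (assumed)
BITRATE_MBPS_EFFICIENT = 4.0   # more efficient codec at similar quality (assumed)

def storage_tb(hours: float, bitrate_mbps: float) -> float:
    """Convert hours of video at a constant bitrate into terabytes."""
    seconds = hours * 3600
    terabits = bitrate_mbps * seconds / 1_000_000  # megabits -> terabits
    return terabits / 8                            # terabits -> terabytes

original = storage_tb(HOURS_OF_VIDEO, BITRATE_MBPS_ORIGINAL)
efficient = storage_tb(HOURS_OF_VIDEO, BITRATE_MBPS_EFFICIENT)
print(f"Original encode: {original:.2f} TB, efficient encode: {efficient:.2f} TB")
print(f"Savings: {original - efficient:.2f} TB ({(1 - efficient / original):.0%})")
```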

4. Legacy systems

The rapid rise in storage demands across the entire IT sector makes legacy systems incredibly important for many organizations. Reliable storage systems that meet specific needs or handle particular workflows especially well can be extremely valuable to IT teams. A good third-party hardware maintenance plan pays off here, because access to rapid repairs, effective contact center models, and refurbished parts positions IT managers to support legacy storage systems without undue risk. Learn how storage hardware maintenance can save you 30-40% vs. the OEM today!

5. Data protection

While capacity and performance challenges dominate the storage landscape, IT managers also have a constant need to prevent data loss and protect data from theft. Third-party hardware maintenance plans can pay off in this area as well, since vendors can help with processes such as secure hard disk disposal and data recovery from damaged components.

IT managers are facing staggering storage challenges, but the combination of advanced strategies and good partnerships can go a long way toward enabling IT leaders to keep pace with changing storage requirements.

Shifting Toward Specialized Products for Block, File, and Object Storage

The good news for the channel is that storage is hot. In the InformationWeek 2018 State of Infrastructure Study, storage beat out even cloud services as a top factor driving IT infrastructure change. Additionally, 62% said their data volumes are increasing by more than 10% annually, while few (7%) reported hosting the bulk of their storage (75% or more) in the cloud.

The takeaway is that enterprises and SMEs have a lot more data to host, and they are going to need new hardware to do it.
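For a sense of what "more than 10% annually" implies, a simple compounding calculation helps; the 500 TB starting footprint below is an assumption, while the 10% rate is the survey's lower bound.

```python
# Compound data growth (illustrative starting point; 10% is the survey's lower bound).
current_tb = 500.0
growth_rate = 0.10
years = 5

for year in range(1, years + 1):
    current_tb *= 1 + growth_rate
    print(f"Year {year}: {current_tb:,.0f} TB")
# After five years the footprint is roughly 1.6x the original,
# and that capacity has to be hosted somewhere.
```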

The real news isn’t the capacity, however; it’s the shift in expectations. Traditionally, storage was primarily about archiving, compliance, and disaster recovery. With the rise of big data analytics and machine learning, data has taken on strategic value, which businesses of all sizes are intent on squeezing for every drop of insight. This trend is pushing customers to prioritize storage performance to a degree not common in the past.

Thus we find, for example, that “unified storage” is losing some of its luster. The jack-of-all-trades approach doesn’t allow the storage software optimizations needed to excel in each use case, which is leading more vendors to offer specialized products for block, file, and object storage.
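The distinction driving that specialization is easiest to see in how each storage type is addressed. The sketch below contrasts block, file, and object access; the device path, mount point, bucket, and key are placeholders, and it assumes an S3-compatible object store with credentials already configured.

```python
import os
import boto3  # assumes an S3-compatible object store and configured credentials

# Block storage: addressed as raw sectors on a device (placeholder device path).
fd = os.open("/dev/sdb", os.O_RDONLY)
first_block = os.pread(fd, 4096, 0)  # read 4 KiB at offset 0
os.close(fd)

# File storage: addressed by path within a mounted filesystem (placeholder path).
with open("/mnt/nas/reports/q4.csv", "rb") as f:
    file_bytes = f.read()

# Object storage: addressed by bucket and key over an HTTP API (placeholder names).
s3 = boto3.client("s3")
obj = s3.get_object(Bucket="analytics-archive", Key="reports/q4.csv")
object_bytes = obj["Body"].read()
```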

Among the types of products most likely to meet customers’ needs today and tomorrow are:

  • All-Flash arrays
    No one who has installed SSDs regrets the decision, and there is good reason to believe materials shortages are abating. Costs will start to fall again, leaving few barriers to installing flash for both performance and OPEX-savings reasons. Impressive transfer rates of 10 Gb per second will become the standard. As an associated outcome, interest in hybrid arrays will decline, as there are few compelling arguments for complicating systems with HDDs.
  • Storage packed with NVMe SSDs
    These low-latency interfaces add about 10% to the purchase price of a storage system but deliver two to three times the performance at roughly half the latency, so customers are getting ready to dive in (see the cost-per-IOPS sketch after this list). That said, NVMe is also likely to show up in most product lineups, so it won’t be a significant brand differentiator.
  • Hyperconverged infrastructure (HCI)
    IDC predicts the HCI market will grow to almost $5 billion next year, up from $2 billion in 2016. The software-centric approach and almost Lego-like scaling are a good fit for many data centers. HCI with all-flash is an obvious choice.
  • Storage class memory (SCM)
    Support for SCM was included in Microsoft Windows Server 2016 and most Linux kernels, so the world is ready. But its price point compared to flash (SCM competes more closely with flash than with DRAM) may keep a lid on demand for now. However, 10x read/write speeds over flash, higher IOPS, comparable throughput, and granular data access at the bit or word level will tempt some enterprises in this direction. SCM may be more of a 2019-and-beyond concern, but the channel should begin preparing now.
  • Products with an in-cloud equivalent
    HPE Cloud Volumes and IBM Spectrum Virtualize are just two examples. Arrays increasingly come packaged with cloud equivalents for replication or failover.
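As noted in the NVMe item above, the cited price and performance figures make the trade-off easy to reason about. The sketch below works through a cost-per-IOPS comparison; the baseline price and IOPS numbers are assumptions for illustration, while the 10% premium and two-to-three-times performance multiple come from the text.

```python
# Rough NVMe price/performance comparison using the figures cited above.
BASELINE_PRICE = 100_000   # assumed cost of a comparable SAS/SATA SSD array
BASELINE_IOPS = 500_000    # assumed baseline performance (illustrative)

nvme_price = BASELINE_PRICE * 1.10                       # roughly 10% purchase-price premium
nvme_iops_low, nvme_iops_high = BASELINE_IOPS * 2, BASELINE_IOPS * 3

baseline_cost_per_iops = BASELINE_PRICE / BASELINE_IOPS
nvme_cost_per_iops = nvme_price / nvme_iops_low          # conservative end of the range
print(f"Baseline: ${baseline_cost_per_iops:.3f} per IOPS")
print(f"NVMe (conservative): ${nvme_cost_per_iops:.3f} per IOPS")
```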

Most products appealing to the enterprise, at least, will need to enable software-defined storage, which will increasingly be delivered in containers to provide flexible, programmable infrastructure. Machine learning is also emerging rapidly and will facilitate new storage management processes that optimize workload services.

On the manufacturer side, Dell EMC, HPE, IBM, and NetApp still have a place, but all-flash has opened doors for newer entrants such as Pure Storage, and HCI is propelling sales upward for the likes of Nutanix and others.

About the Author

Parker, Park Place Assistant