Experts Weigh In on the Challenges of Data Migration and How to Solve Them
Jun 7, 2023

Data migration is a necessity in today’s business landscape. Companies are contending with more data than ever before and, in turn, need more efficient environments to store this growing volume. Historically, organizations have refreshed their storage platforms every 3–5 years. Today, with more cost-effective storage options that improve sustainability and deliver better performance, storage refreshes are happening much faster. These two trends have created a huge need for efficient, secure, and fast data migration.
Everyone has seen the statistics or directly felt the impact of a troubled data migration. In fact, Gartner reported that approximately 50% of enterprises will exceed their data migration timelines and budgets.* If you haven’t lived through a broken data migration project, the common response is “Why?” or “How can it be this hard?”
The reality is that migrating databases, applications, or other mission-critical data is complex. A series of technical considerations must be addressed depending on the original source and the type of new storage. Downtime is also incredibly disruptive and can be difficult to schedule. The other serious concern is security: making sure you are not inadvertently compromising your security standards.
How do you ensure a successful data migration that meets your timeline and security standards? It starts with a purpose-built, block-level data migration solution. Migrating data at the block level is significantly faster than other methods. It is also key that your solution integrates with any block-level storage application or operating system, on-premises or in the cloud; some tools only work with one type of storage product, which can create complications. Finally, security is paramount, so don’t compromise when it comes to securing your data throughout the process.
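To make the block-level idea concrete, here is a minimal sketch in Python that copies a volume chunk by chunk while computing a checksum for later verification. Ordinary files stand in for block devices, and the chunk size and function name are illustrative, not any vendor’s implementation:

```python
import hashlib

CHUNK = 1 << 20  # 1 MiB chunks; real tools operate on raw device extents


def migrate_blocks(src_path, dst_path, chunk=CHUNK):
    """Copy a source 'volume' to a destination chunk by chunk,
    returning a SHA-256 checksum of everything written so the
    migration can be verified afterward."""
    digest = hashlib.sha256()
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        while True:
            block = src.read(chunk)
            if not block:
                break
            dst.write(block)
            digest.update(block)
    return digest.hexdigest()
```

Because the copy ignores file systems and applications entirely, the same loop works for any data sitting on the volume, which is what makes block-level migration storage-agnostic.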
With more than 4,000 data migrations completed, my list of things to look for in a data migration solution includes the ability to migrate data live without disrupting your business, limit downtime to zero or near zero, rewind the data in case a problem occurs, and automate the process reliably to save your team time.
Given the industry’s growing focus on the cloud and new storage technologies, I reached out to key experts to get their take on the struggles companies are facing when it comes to data migration. Here’s what they had to say, plus their best advice for solving this increasingly common problem.
Ask the right questions
Tom Coughlin, analyst and consultant at Coughlin Associates and Forbes contributor, recommends companies ask themselves these three questions: “First, do you know where all the material you want to migrate is located? Second, how many copies do you want and is it worthwhile to do deduplication during your migration? Third, how often should and can you migrate your material and what is the economic driver for migration?”
According to Coughlin, “strategies to deal with these issues include good metadata retention and indexing of content, comparing content when migrating, and deduplicating content based upon predetermined rules. Also, make sure that you take into account all of the costs involved in a migration, including validation of the migration and additional values from the migration, to create a sound business model for migrating.”
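Coughlin’s point about deduplicating content based upon predetermined rules can be sketched with content hashing: identical chunks are stored once, and an index of hashes rebuilds the original stream. This is a hypothetical illustration (the names and fixed-size chunking are mine); production dedup engines typically use variable-size chunking and persistent chunk stores:

```python
import hashlib


def dedupe_chunks(chunks):
    """Keep one copy of each unique chunk, keyed by its content hash,
    plus an index of hashes that preserves the original order."""
    store, index = {}, []
    for chunk in chunks:
        key = hashlib.sha256(chunk).hexdigest()
        store.setdefault(key, chunk)  # first occurrence wins
        index.append(key)
    return store, index


def rebuild(store, index):
    """Reassemble the original stream from the unique store and index."""
    return b"".join(store[key] for key in index)
```

Comparing the size of the store against the size of the original stream is one way to answer Coughlin’s question of whether deduplication during migration is worthwhile.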
Understand the changing role of data migration (and act accordingly)
Chris Evans, owner and author at Architecting IT, points out the changing role of data migration: “Increasingly, data migration is moving from a ‘one-off’ task for technology replacement to one where data is placed in the most appropriate place to balance cost, accessibility, and security. This can mean migrations to/from the public cloud, in and out of an archive, or where hardware is being repurposed/replaced.”
Evans says companies need to consider three factors:
“Risk: How do I safely move data without affecting the business?”
“Impact: How do I move data without requiring an outage/downtime (or minimize any impact)?”
“Cost: How do I cost-effectively move data within my infrastructure?”
“To build an effective migration strategy,” Evans says, “businesses need to address the factors above using tools, process, and standards. When considering tooling, we mean software solutions that move data at the block, file, and object level. We could also consider a virtual machine as an ‘object,’ although virtualization platforms generally have tools to do migration in place. Tools should be auditable, transparent to end users (where possible), and offer features like dry runs and restartability.”
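Two of the tooling features Evans names, dry runs and restartability, can be sketched together: a chunked copy that can report its plan without writing anything, and that checkpoints its progress so an interrupted migration resumes where it left off. All names here are illustrative, not from any specific tool:

```python
import json
import os


def restartable_copy(src_path, dst_path, state_path, chunk=1 << 20, dry_run=False):
    """Chunked copy that checkpoints its offset after every chunk, so an
    interrupted run resumes instead of starting over. With dry_run=True,
    it only reports what the migration would do."""
    size = os.path.getsize(src_path)
    if dry_run:
        # Report the migration plan without touching the destination.
        return {"bytes": size, "chunks": -(-size // chunk)}  # ceiling division
    offset = 0
    if os.path.exists(state_path):  # resume from the last checkpoint
        with open(state_path) as f:
            offset = json.load(f)["offset"]
    mode = "r+b" if os.path.exists(dst_path) else "wb"
    with open(src_path, "rb") as src, open(dst_path, mode) as dst:
        src.seek(offset)
        dst.seek(offset)
        while block := src.read(chunk):
            dst.write(block)
            offset += len(block)
            with open(state_path, "w") as f:
                json.dump({"offset": offset}, f)
    if os.path.exists(state_path):
        os.remove(state_path)  # a missing state file means the copy completed
```

The checkpoint file doubles as an audit trail of progress, which speaks to Evans’s requirement that tools be auditable.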

He goes on to point out that “process is extremely important and dictates the steps to pre-validate data before migration, move data efficiently, and validate that data has moved correctly and is still accessible to the end user. Enabling process is the efficient use of standards, which covers everything from naming conventions to security policy, data performance requirements, and availability requirements.”
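The pre-validate/move/validate cycle Evans describes amounts to recording what the data looked like before the move and checking it again afterward. A minimal sketch, assuming file-level data and SHA-256 checksums (the function names are mine, not from any particular tool):

```python
import hashlib
from pathlib import Path


def checksum_manifest(root):
    """Hash every file under root, keyed by relative path, so the
    same manifest can be recomputed at the destination and compared."""
    root = Path(root)
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*"))
        if p.is_file()
    }


def validate_migration(src_root, dst_root):
    """Return the relative paths that are missing or differ at the
    destination; an empty list means the migration validated cleanly."""
    src, dst = checksum_manifest(src_root), checksum_manifest(dst_root)
    return [path for path, digest in src.items() if dst.get(path) != digest]
```

Running `checksum_manifest` before the migration and `validate_migration` after gives an auditable record that the data moved correctly, in the spirit of Evans’s process step.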
“After the above points have been considered, don’t forget data protection! As data moves between infrastructure, it still needs to be protected with the same policies and SLAs requested by the business.”
Finally, Evans predicts that “in a hybrid world, data mobility and migration will become a ‘business as usual’ task, but only if IT organizations can address the three challenges above: Risk, impact, and cost.”
IoT vendors must take extra consideration
Contributor and storage influencer Chris Preimesberger calls attention to the fact that “IoT vendors often send sensory data in non-interoperable or proprietary formats that cloud services are unable to recognize. The engineering work required to reformat the payloads to use them for business is significant, time-consuming, and recurring. Physical on-premises IoT gateways are expensive to deploy and manage, introducing new management and security challenges.”
But that’s not all. According to Preimesberger, “embedded cloud-friendly chips and stacks work only for new IoT devices, leaving legacy devices behind and requiring design-in times that can take months or years. Data comes in various dimensions, too, such as temperature, humidity and pressure ratings—not just what one sees on a page. Thorny problems like these are commonplace.”
More complexity, more challenges
Tim King of Solutions Review says that “the single biggest data migration challenge for enterprises in 2023 is the increasing complexity and diversity of data sources. Sub-challenges include data security and privacy. Addressing data migration challenges requires careful planning, execution, and technology investment.”
Time is a critical factor
Randy Kerns of Evaluator Group believes “the biggest challenges for IT in migration are in the amount of time it takes for the migration based on the amount of data and the connection bandwidth, allowing for continuing operations during migration, and migration of various data types – block (usually databases) and files.”
Kerns goes on to say that “strategies employed include acceleration through parallelism with multiple connection paths, systems with the ability to move data while still allowing access and providing for a consistent switchover, and a few solutions that can migrate block storage in addition to files.”
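Kerns’s first strategy, acceleration through parallelism with multiple connection paths, can be sketched as concurrent positional I/O: several workers each copy their own chunk offsets at the same time. This toy version uses threads and POSIX `pread`/`pwrite` on local files as a stand-in for parallel paths to real storage:

```python
import os
from concurrent.futures import ThreadPoolExecutor


def parallel_copy(src_path, dst_path, chunk=1 << 20, workers=4):
    """Copy fixed-size chunks concurrently; each worker reads and
    writes at its own offset, so no worker blocks another."""
    size = os.path.getsize(src_path)
    # Pre-size the destination so positional writes land correctly.
    with open(dst_path, "wb") as f:
        f.truncate(size)
    src = os.open(src_path, os.O_RDONLY)
    dst = os.open(dst_path, os.O_WRONLY)
    try:
        def copy_chunk(offset):
            # Positional I/O (os.pread/os.pwrite) needs no shared seek
            # position, which is what makes the workers independent.
            os.pwrite(dst, os.pread(src, chunk, offset), offset)

        with ThreadPoolExecutor(max_workers=workers) as pool:
            list(pool.map(copy_chunk, range(0, size, chunk)))
    finally:
        os.close(src)
        os.close(dst)
```

On a single local disk the threads mostly contend with each other, but against networked storage with multiple paths this pattern lets the copy scale with the available aggregate bandwidth, which is the point Kerns is making.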
Every company needs the ability to migrate its data effectively and without downtime. As Chris Evans pointed out, this need will become critical as businesses increasingly use data migration to move workloads on an as-needed basis, balancing cost, accessibility, and security. Although data migration has its challenges, modern tools are making the process much easier. By heeding the advice above, businesses can begin to reap the benefits of seamless data migration.
—–
* Gartner Report 2021: Primary Storage | Gartner Report 2021: Distributed File and Object Storage
Wayne Lam