
2nd generation digital migration – if it was easy, everyone would do it!

Written by Paul Glasgow, Managing Director

Back in the day – say around 20 years ago – digitisation offered a panacea: a mechanism to rid the world of analogue and proprietary digital video tape formats and make content more easily accessible and exploitable. Using supposedly non-proprietary encoding schemes, the content became independent of the physical media, so future migrations would be easy. Robotic data libraries and controlling software automated many processes, removing the need for many staff. Carefully annotated and indexed content using new DAM systems would make assets inherently exploitable, watermarking would offer protection, and early speech-to-text processing would make for the richest set of metadata.

But was this the expected panacea? Well, not entirely. Digitisation realised lots of benefits, but it didn’t all work out as anticipated, introducing a number of unintended consequences and risks for subsequent digital migrations. Here are a few examples.


Damn vendors…

DAM vendors themselves have become the biggest point of risk, for a few good reasons. In the grand scheme of things, the broadcast media market is not that big and everyone wants something different. The result has been a proliferation of DAM systems that were originally designed against only a few use cases, which they excel at – often because the original design was for a single big client with a bespoke workflow. However, a DAM system originally designed for a digital archive library may not lend itself well to use in transmission or production, and vice versa.

Just stop for a second and think of all the DAM vendors who have ceased trading, or have been acquired by trade or a client and then disappeared. Migrating from a legacy DAM system is not likely to be trivial; it can throw up issues that look simple on the surface but turn out to be complex problems. As an example, let’s say you’ve migrated to a new DAM system, you search for some video items, and it produces different search results to the original system. Have assets become orphaned, never to be seen again?
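To make that concrete, here’s a minimal sketch – in Python, with toy asset IDs standing in for real bulk exports from each system’s API – of the kind of reconciliation check worth running after any DAM migration:

```python
def reconcile(legacy_ids: set[str], new_ids: set[str]) -> tuple[set[str], set[str]]:
    """Compare asset inventories exported from the old and new DAM."""
    orphaned = legacy_ids - new_ids     # present before, missing now
    unexpected = new_ids - legacy_ids   # present now, with no legacy source
    return orphaned, unexpected

# Toy data; in practice these sets come from each system's bulk export or API.
orphaned, unexpected = reconcile(
    legacy_ids={"A001", "A002", "A003"},
    new_ids={"A001", "A003", "B999"},
)
print(f"orphaned: {sorted(orphaned)}, unexpected: {sorted(unexpected)}")
```

A clean migration should leave both sets empty; anything in either set needs investigating before the legacy system is switched off.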

Perhaps the simple answer is: don’t buy a DAM system – build it! Ramp the development team up, capture the requirements, develop a solution, deploy it, become a hero and ramp the team back down again. This works well for several years; then new codecs come along, new OS versions, security patches, and there is no way to keep up. The original developers have long gone and the few left on board plan to retire, whilst holding the keys to the castle! Being a self-build, there is no documented API, because the original design never considered migrating away from this proprietary system.

Rise of the PAMs

A PAM is a Production Asset Management system that manages live ‘work-in-progress’ production data (unlike a DAM, which manages finished content). However, a PAM was never intended to become a permanent repository, so it doesn’t translate or migrate well to a DAM environment: its data hierarchy is production-data-centric, may have had data fields added on a per-production or per-genre basis, and is not a carefully managed and structured taxonomy. The result is often a PAM that is many years old and holds business-critical production information, yet has become obsolete – and there is no way of migrating away from it without losing valuable production information, as this can’t be represented appropriately in a DAM system.
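As a toy illustration – every field name below is invented – consider what a naive schema mapping silently drops when PAM records meet a fixed DAM taxonomy:

```python
# Invented example of a PAM record with ad-hoc, per-production fields.
pam_record = {
    "title": "Episode 12 rough cut",
    "bin_path": "/Drama/Series4/Ep12",          # production hierarchy
    "stringout_notes": "use take 3 for sc. 7",  # added for this production only
    "genre_tag_colour": "amber",                # added for this genre only
}

DAM_SCHEMA = {"title", "description", "rights", "technical"}  # fixed taxonomy

mapped = {k: v for k, v in pam_record.items() if k in DAM_SCHEMA}
dropped = {k: v for k, v in pam_record.items() if k not in DAM_SCHEMA}

# Everything in `dropped` is exactly the business-critical production context
# that a naive migration silently loses.
print(f"mapped: {mapped}")
print(f"at risk: {dropped}")
```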

Who needs standards anyway?

Standardising codecs and wrappers has been an industry ambition for decades, but the truth is that everything is a moving target, and always will be. Early digital codecs were not great quality; they were inefficient and often required proprietary chips to encode in real time. Codecs were optimised variously for acquisition, post-production, transmission or streaming, and many were proprietary to different vendors – often called ‘de facto’ standards.

There is also a problem with existing standards: different vendors can have different interpretations of a ‘standard’, or can be selective in which parts of the standard they implement. This leads to situations where an archive contains media that notionally conforms to a standard, yet is unsupported in another system that has justifiable claims to support the same standard – so the two systems cannot interoperate.
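A pragmatic first step is to survey what is actually in the archive, rather than trusting any claimed standard. Here’s a rough sketch using ffprobe (any comparable media analyser would do; the sample path is hypothetical) that reports the wrapper and codecs each file really contains:

```python
import json
import subprocess
from pathlib import Path

def probe(path: Path) -> tuple[str, list[str]]:
    """Ask ffprobe for the container (wrapper) and per-stream codecs."""
    result = subprocess.run(
        ["ffprobe", "-v", "error", "-of", "json",
         "-show_entries", "format=format_name:stream=codec_name", str(path)],
        capture_output=True, text=True, check=True,
    )
    info = json.loads(result.stdout)
    wrapper = info["format"]["format_name"]
    codecs = [s.get("codec_name", "unknown") for s in info["streams"]]
    return wrapper, codecs

# Survey a sample set before committing to any migration plan.
for f in sorted(Path("/archive/sample").glob("*.mxf")):  # hypothetical path
    wrapper, codecs = probe(f)
    print(f.name, wrapper, codecs)
```

Note that even this only tells you what one tool thinks the file is; whether a second system will accept it is a separate question – which is why sample-file testing matters.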

Of course, organisations such as the EBU and SMPTE have made valiant efforts to create standards. But it’s become increasingly difficult as change continues to accelerate, and manufacturers have differing agendas, which makes ‘true’ standardisation almost impossible.

So which codec to use in the library? The simple answer is possibly any of them! This makes contemporary migration a multi-dimensional problem, with differing aspect ratios, frame sizes and frame rates. The final twist of historical complexity is the US film telecine 3:2 pulldown, along with the 29.97 fps recorded frame rate, meaning the audio never quite matched up.
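A quick worked example of that final twist: material counted at a nominal 30 fps, but actually recorded at NTSC’s true 30000/1001 rate, drifts by several seconds over a programme:

```python
from fractions import Fraction

NTSC = Fraction(30000, 1001)     # the true "29.97" frame rate
frames = 30 * 60 * 60            # a one-hour programme counted at 30 fps

true_duration = frames / NTSC    # seconds those frames actually occupy
drift = float(true_duration) - 3600
print(f"a nominal hour runs {drift:.2f} s long at 29.97 fps")
# -> about 3.6 seconds: plenty to put separately timed audio audibly adrift
```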

With some exceptions, wrapper and metadata standardisation is still ‘out in the wild’, since different parties have different motivations and need different things. As an example, the next-generation wrapper, metadata and codec could come from a new phone or from brand-new movie-production-quality camera technology.

The exception to all the above has been the success of some of the big studios and broadcasters, who have very well-defined technical requirements for contracted content deliverables. Enabled through organisations such as the DPP, though, even this is a moving target.

Don’t forget audio  

Audio also gets in the way of the perfect standard, with its own set of issues: number of channels, surround sound, Dolby Atmos, and so on. Of course, audio will carry its own metadata, but often AAF (Advanced Authoring Format) files are also required to integrate with audio post-production tools. And what do we do with those production tools – will they still work in future?

Decline of LTO

Migrating the 1st generation behemoth robotic digital libraries – which are often LTO data tape based – has become a real issue. LTO systems were always scaled for ‘normal’ operation, with enough drives and slots, plus the robotics needed to pick and place data tapes fast enough for day-to-day demand – not for a wholesale migration.

Often the software wasn’t conventional HSM (hierarchical storage management) software either. Production video files tend to be large and could span more than one data tape cartridge, and partial file extraction was needed (and still is today) to retrieve, say, a one-minute section from a one-hour programme, accelerating transfers by avoiding moving the whole file. Storage management software could associate assets – for example, ‘Michael Jackson’ – with a single tape or group of tapes. Usually, the storage management software would have an API and some proprietary way of writing files to tape, maintaining a proprietary index and database of the library, perhaps extending to tapes external to the library.

So, let’s say we have an old robotic tape library containing 4PB of data that needs to be migrated. The first complication is that it’s most likely still in use, so perhaps only half the drives – say 6 out of 12 – are available for the migration. Those drives, which are probably coming to the end of their lives, are going to be hammered in continuous use, so drive failure rates could be higher than anticipated, slowing the migration.
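A back-of-envelope estimate shows why this matters. The drive rate and efficiency figures below are assumptions (roughly LTO-5 class hardware, with half the theoretical rate lost to mounts, seeks, spanning and retries):

```python
ARCHIVE_BYTES = 4e15      # 4 PB to migrate
DRIVES = 6                # half the library's drives, per the example above
NATIVE_RATE = 140e6       # bytes/s per drive (assumed, roughly LTO-5 class)
EFFICIENCY = 0.5          # assumed overhead for mounts, seeks and retries

seconds = ARCHIVE_BYTES / (DRIVES * NATIVE_RATE * EFFICIENCY)
print(f"~{seconds / 86400:.0f} days of continuous streaming")
# -> roughly 110 days, before allowing for drive failures or daily operations
```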

Then there is the proprietary robotic library controlling system, which made complete sense when new. Now, though, it is a huge bottleneck, since the API may be too slow to poll for data on 4PB of assets. The system may allow a tape-wise or a content-wise migration; how the content was originally written to the library should determine which approach is used. Also, the original software vendors were all acquired, so knowledge and technical support have become scarce – especially for migrations.

There is often complacency when considering migrating digital archives. If a digital archive is 20 years old, it is highly unlikely to contain homogeneous content and metadata. Let’s say a legacy production system is also going to be replaced with a modern one; it’s highly unlikely that the new system will be backwards-compatible with all of the legacy content. So, it’s important to determine the potential migration yield and how much of the process can be automated. If the library contains petabytes of data and millions of assets, the migration yield could be fundamental to the success (or not) of the project.
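Some illustrative arithmetic – the library size and remediation time are invented – shows how sharply yield drives the manual effort:

```python
ASSETS = 2_000_000        # hypothetical library size
MINUTES_PER_FIX = 15      # hypothetical manual remediation time per failure
WORK_HOURS_PER_YEAR = 1600

for auto_yield in (0.40, 0.95):
    failures = ASSETS * (1 - auto_yield)
    person_years = failures * MINUTES_PER_FIX / 60 / WORK_HOURS_PER_YEAR
    print(f"{auto_yield:.0%} automated -> {failures:,.0f} manual fixes "
          f"(~{person_years:.0f} person-years)")
```

On these invented numbers, moving from 40% to 95% automation is the difference between roughly 190 person-years of remediation and about 16 – which is why yield can make or break the project.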

So how can Marquis help?

First, we don’t make MAM, PAM or DAM systems, or sell storage systems. We believe choosing and using such systems is a free customer choice. What we do have is the migration technology and years of experience to enable and de-risk automated migrations. Our services team works with vendors, partners, service providers, SIs and clients to make and enable successful migrations.

Our metadata translation capabilities have been used by the biggest media enterprises. We’re also the only company that has successfully archived a PAM system for a major studio, so it can still be queried.

We have a vendor-specific codec interoperability library and API library that goes back 20 years, which no other vendor has. The original vendors may be long gone, but their content may still sit in the library, on a legacy system that is now end of life yet still in use. These capabilities are fundamental to automating a migration.

As an example, we recently worked on an automated multi-petabyte migration project that was suffering a 60% failure rate in media compatibility. We were able to deploy our technology, which resulted in a 95%+ automated transfer success rate. Unfortunately, we came in very late to mitigate the problem, and by this time, the project was already over-running and over budget. It didn’t have to be.

The best plan is to bring us in early, since we can analyse content and metadata and work out how best to migrate it. We can test sample files in our labs (or remotely, for infosec compliance). We can pre-determine policies on how to migrate content automatically – whether to re-wrap, transcode, scale or de-interlace, for example. We know how to integrate with legacy archive APIs and, if needed, access the database directly if the API is too slow. We can work out how to interoperate legacy content with new vendors, or even come up with a mezzanine framework for interoperability.
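As a simplified sketch of what a pre-determined policy might look like – the wrapper/codec pairs and actions here are invented for illustration, not a description of our production rules:

```python
# Invented policy table: one action per (wrapper, codec) pair found during analysis.
POLICIES = {
    ("mxf", "mpeg2video"): "rewrap",     # essence fine, wrapper refreshed
    ("mxf", "dvvideo"):    "transcode",  # legacy codec -> modern mezzanine
    ("mov", "prores"):     "copy",       # already acceptable as-is
}

def decide(wrapper: str, codec: str, interlaced: bool) -> str:
    """Pick a migration action; unknown combinations go to an operator."""
    if interlaced:
        return "transcode+deinterlace"   # de-interlacing implies a decode anyway
    return POLICIES.get((wrapper, codec), "refer-to-operator")

print(decide("mxf", "mpeg2video", interlaced=False))  # -> rewrap
print(decide("mxf", "dvvideo", interlaced=True))      # -> transcode+deinterlace
print(decide("avi", "cinepak", interlaced=False))     # -> refer-to-operator
```

The point is that the decisions are made – and tested on sample files – before any media moves, so the bulk transfer itself can run unattended.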

Our technology runs on-prem and in-cloud, so migrations – wherever they need to happen – are straightforward. We also licence our technology just for the migration period, so there are no sunk costs.

Finally, we can also scope and mitigate risk at the pre-tendering stage. Since we know what to look for, as per the examples described above, we can ensure risks are identified and fixes are pre-determined.

The outcome will always be a more successful migration project, which is much more likely to finish on time and on budget.


Our workflow tools

Marquis has a range of clever workflow tools that can be flexibly deployed for broadcast and post-production, helping facilities improve the efficiency and ease of digital workflows. Our systems deliver cost-effective integration between leading broadcast content applications, ensuring standalone or shared-storage edit platforms, media asset management and automation systems all work seamlessly together.

Project Parking – Avid project analytics and management

Workspace Tools – Advanced risk mitigation tools for Avid storage

Postflux – Adobe Premiere Pro project workflow tool

Medway – Middleware workflow integration solution

IMET – Avid Interplay metadata exporting tool

Edit Bridge – Avid Interplay integration with Adobe Premiere and After Effects

X2Pro – Delivers FCP X projects to Avid Pro Tools for audio finishing

Worx4 X – Consolidates and trims media in an FCP X project
