
Digitisation: methodologies, processing and archival practices

Time-based correction machines

We work with a range of customers at Great Bear, digitising anything from personal collections to the contents of institutional archives. Because of this, what customers need from a digitisation service can vary widely.

A key issue is how much we process the digital file, both during the transfer and in post-production. In other words, to what extent do we alter the form of the recording when it becomes a digitised artefact? While this may seem an innocuous question, whether or not to apply processing, and thereby radically transform the original recording, is a fraught and, for some people, ethical consideration.

There are times when applying processing technologies is desirable and appropriate. With the transfer of video tape, for example, we always use time-based correctors or frame synchronisers to reduce or eliminate errors during playback. Some better quality video tape machines, such as the U-matic BVU-950P, already have time-based correctors built in, which makes external processing unnecessary. As the AV Artifact Atlas explains, however, time base errors are very common with video tape:

‘When a different VTR is used to playback the same signal, there can be slight mechanical and electronic differences that prevent the tape from being read in the same way it was written. Perhaps the motors driving the tape in a playback VTR move slightly slower than they did in the camera that recorded the tape, or maybe the head of the playback VTR rotates a fraction quicker than the video head in the machine that recorded the tape. These tiny changes in timing can dramatically affect stability in a video image.’
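To make the principle concrete, here is a loose software analogy (a toy sketch of our own, not how a hardware time-based corrector is built): each line coming off the tape carries a sync pulse whose position drifts with the transport's timing, and the corrector locates that pulse and re-clocks the line against a stable reference.

```python
# Toy sketch of time-base correction. Sample counts and levels are illustrative
# assumptions; real TBCs work on the video signal itself in dedicated hardware.
import numpy as np

SAMPLES_PER_LINE = 720   # assumed samples per scan line
SYNC_LEVEL = -0.3        # sync pulse sits below black level

def line_from_tape(rng, timing_error):
    """One scan line: sync pulse plus picture content, shifted by transport jitter."""
    line = rng.uniform(0.0, 0.7, SAMPLES_PER_LINE)   # arbitrary picture content
    line[:40] = SYNC_LEVEL                           # horizontal sync at line start
    return np.roll(line, timing_error)               # mechanical/electronic drift

def time_base_correct(jittered_line):
    """Locate the sync pulse and re-clock the line to the stable reference position."""
    sync_position = int(np.argmin(jittered_line))    # deepest level marks the sync
    return np.roll(jittered_line, -sync_position)

rng = np.random.default_rng(0)
for jitter in (0, 3, 7):                             # lines arriving slightly late
    corrected = time_base_correct(line_from_tape(rng, jitter))
    print("sync now begins at sample", int(np.argmin(corrected)))   # always 0
```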

We also utilise built-in processes that are part of the machine’s circuitry, such as drop-out compensation and noise reduction. We use these, however, not to make the tape ‘look better’, but as part of a standard calibration set-up, which is necessary for the successful playback of the tape in a manner appropriate to its original operating environment.

After all, video tape machines were designed to be interchangeable. It is likely that such stabilising processing was regularly used to play back tapes on machines different from those they were recorded on. Time-based correction and frame synchronisation are therefore integral to the playback circuitry, and using such processing tools is central to how we successfully migrate tape-based collections to digital files.

Digital processing tools

Time-based correction - close up

Our visual environment has changed dramatically since the days when domestic video tape was first introduced, let alone since the heyday of VHS. The only certainty is that it will continue to change. Once it was acceptable for images to be a bit grainy and low resolution; now only the crisp clarity of a 4K Ultra HD image will do. There is perhaps an assumption that ‘clearer is better’, that being able to watch moving images in minute detail is a marker of progress. Yet should this principle be applied to the kinds of digitisation work we do at Great Bear? There are processors that can transform a questionable analogue image into a bright, high definition, colour-enriched digital copy. The Teranex processor, for example, ‘includes extremely high quality de-interlacing, up conversion, down conversion, SD and HD cross/standards conversion, automatic cadence detection and removal even with edited content, noise reduction, adjustable scaling and aspect ratio conversion.’ ‘Upgrading’ analogue images in this way does, however, come with certain ethical risks.
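For illustration only, broadly comparable processing can be scripted in software. The sketch below is our own example using FFmpeg filters chosen for the purpose; it is not a description of the Teranex, nor of our workflow. It de-interlaces, denoises and up-converts a standard-definition capture to Ultra HD:

```python
# Hypothetical 'upgrade' chain scripted around FFmpeg. Filenames and filter
# choices are assumptions made for this example only.
import subprocess

def upscale_sd_capture(src="capture_sd.mkv", dst="upscaled_uhd.mkv"):
    filters = ",".join([
        "bwdif",                          # de-interlace
        "hqdn3d",                         # spatial/temporal noise reduction
        "scale=3840:2160:flags=lanczos",  # up-convert SD to Ultra HD
    ])
    subprocess.run(
        ["ffmpeg", "-i", src, "-vf", filters, "-c:v", "libx264", "-crf", "18", dst],
        check=True,
    )

upscale_sd_capture()
```

Every stage in a chain like this makes a permanent, interpretive choice about the image, which is exactly where the ethical questions begin.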

Talking about ethics in conjunction with video or audio tape might seem a bit melodramatic, but it is at the point of intervention or non-intervention that the needs of our customers diverge the most. This is not to say that people who do want to process their tapes are unethical – far from it! We understand that for some customers it may be preferable for such processing to occur, or for other editing techniques such as noise reduction or amplification to be applied, so that audio can be heard with greater clarity.

Instead, we want to emphasise that our priority is getting the best out of the tape and our playback machines, rather than relying on the latest processing technology, which is itself at risk of obsolescence. After all, a heavily processed file will always require further processing at some unknown point in the future so that it can remain visually relevant to whatever format is commercially dominant at the time. Such transformations of the digital file, which are necessarily destructive and permanent, contribute to the further circulation of what Hito Steyerl calls ‘poor images’: ‘a rag or a rip; an AVI or a JPEG…The poor image has been uploaded, downloaded, shared, reformatted, and reedited. It transforms quality into accessibility, exhibition value into cult value, films into clips, contemplation into distraction.’

Maintaining the integrity, and as far as possible the authenticity, of the original recordings is a core part of our methodology. In this way our approach corresponds with Jisc’s mantra of ‘reproduction not optimisation’, about which they write:

‘Improving, altering or modifying media for optimisation may seem logical when presenting works to a public or maintaining perceived consistency. It should be remembered that following an often natural inclination to enhance what we perceive to be a poor level of quality is a subjective process prescribed by personal preference, technological trends and cultural influences. In many cases the intentions of a creator are likely to be unknown and this can cause difficulties in interpreting levels of quality. In these instances common sense alongside trepidation should prevail. On the one end of the spectrum unintelligible recordings may be of little use to anyone, whereas at the opposite end recordings from previous eras were not produced with modern standards of clarity in mind.’

It is important to bear in mind, however, that even if a file is subject to destructive editing, there may come a time when the metadata created about the artefact can help to illuminate its context and provenance, and therefore help it to maintain its authenticity. The debates regarding digital authenticity and archiving will of course shift as time passes and practices evolve.

In the meantime, we will continue to do what we are most skilled at: restoring, repairing and migrating magnetic tape to digital files in a manner that maintains both the integrity of the original operating environment and the recorded signal.

Posted by debra in Audio / Video Archives, Audio Tape, Video Tape, 0 comments

Early digital tape recordings on PCM/U-matic and Betamax video tape

We are now used to living in a born-digital environment, but the transition from analogue to digital technologies did not happen overnight. In the late 1970s, early digital audio recordings were made possible by a hybrid analogue/digital system, composed of the humble transport and recording mechanisms of the video tape machine and a not-so-humble PCM (pulse-code modulation) digital processor. Together they created the first two-channel stereo digital recording system.

Inside a Betamax Video Recorder

The first professional-use digital processing machine, made by SONY, was the PCM-1600. It was introduced in 1978 and used a U-matic tape machine as its transport. Later models, the PCM-1610 and PCM-1630, acted as the first standard for mastering audio CDs in the 1980s. SONY employee Toshitada Doi, whose impressive CV includes the development of the PCM adaptor, the Compact Disc and the CIRC error correction system, visited recording studios around the world in an effort to facilitate the professional adoption of PCM digital technologies. He was not, however, welcomed with open arms, as the SONY corp. website explains:

‘Studio engineers were opposed to digital technology. They criticized digital technology on the grounds that it was more expensive than analogue technology and that it did not sound as soft or musical. Some people in the recording industry actually formed a group called MAD (Musicians Against Digital), and they declared their position to the Audio Engineering Society (AES).’

Several consumer/semi-professional models were marketed by SONY in the 70s and 80s, starting with the PCM-1 (1977). In a retro-review of the PCM-F1 (1981), Dr Frederick J. Bashour explains that

‘older model VCRs often worked better than newer ones since the digital signal, as seen by the VCR, was a monochrome pattern of bars and dots; the presence of modern colour tweaking and image compensation circuits often reduced the recording system’s reliability and, if possible, were turned off.’

Why did the evolution of an emerging digital technology stand on the shoulders of what had, by 1981, become a relatively mature analogue technology? It all comes down to bandwidth. A high-quality PCM audio recording required 1-1.5 MHz of bandwidth, far greater than that of a conventional analogue audio signal (15-20 kHz). While this was beyond the scope of the analogue audio recording technology of the time, video tape recorders did have the capacity to record signals of much higher bandwidth.
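A rough back-of-the-envelope calculation (our own arithmetic, ignoring the error-correction and synchronisation overhead that real PCM adaptors added) shows why a video-bandwidth channel was needed:

```python
# Raw data rate of two-channel, 16-bit PCM audio at the CD sampling rate.
channels = 2
bits_per_sample = 16
sample_rate_hz = 44_100

raw_bit_rate = channels * bits_per_sample * sample_rate_hz
print(f"raw PCM data rate: {raw_bit_rate / 1e6:.2f} Mbit/s")   # ~1.41 Mbit/s

# An analogue audio channel only needs to carry ~15-20 kHz, whereas a video
# recorder's luminance channel can carry a few MHz -- which is why the VTR
# could host the digital audio data when audio recorders could not.
```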

If you have ever wondered where the 16-bit/44.1 kHz sampling standard for the CD came from, it is because in the early 1980s, when the CD standard was agreed, there was no practical way of storing digital sound other than a PCM converter and video recorder combination. As the Wikipedia entry for the PCM adaptor explains, ‘the sampling frequencies of 44.1 and 44.056 kHz were thus the result of a need for compatibility with the 25-frame (CCIR 625/50 countries) and 30-frame black and white (EIA 525/60 countries) video formats used for audio storage at the time.’ This sampling rate was adopted as the standard for CDs and, unlike many other things in our rapidly changing technological world, it hasn’t changed since.
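As a worked check of where that oddly specific number comes from (using the commonly cited figure of three audio samples stored per usable video line):

```python
# Why 44,100 Hz in particular: the PCM adaptor stored 3 samples per usable video
# line, so the rate had to fit neatly into both television standards.
samples_per_line = 3

# 625/50 (CCIR): 294 usable lines per field, 50 fields per second
print(samples_per_line * 294 * 50)                  # 44100

# 525/60 (EIA, black and white): 245 usable lines per field, 60 fields per second
print(samples_per_line * 245 * 60)                  # 44100

# NTSC colour actually runs at 60/1.001 fields per second, giving the related
# 44,056 Hz rate mentioned in the quotation above.
print(round(samples_per_line * 245 * 60 / 1.001))   # 44056
```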

The fusion of digital and analogue technologies did not last long: the introduction of DAT tapes in 1987 rendered the PCM digital converter/video tape system largely obsolete. DAT recorders did essentially the same job as the PCM/video combination but came in a single, significantly smaller machine. DAT machines had the added advantage of accepting multiple sampling rates (the standard 44.1 kHz, as well as 48 kHz and 32 kHz, all at 16 bits per sample, plus a special LP recording mode using 12 bits per sample at 32 kHz for extended recording time).

Problems with migrating early digital tape recordings

There will always be a risk with any kind of magnetic tape recording that there won’t be enough working tape machines to play back the material in the future. As spare parts become harder to source, machines with worn-out transport mechanisms will simply become inoperable. We are not quite at this stage yet, and at Great Bear we have plenty of working U-matic, Betamax and VHS machines, so don’t worry too much! Machine obsolescence is, however, a real threat facing tape-based archives.

Such a problem comes into sharp relief when we consider the case of digital audio recordings made on analogue video tape machines. Audio recordings ‘work’ the tape transport in a far more vigorous fashion than your average domestic video tape user. The tape may be rewound and fast-forwarded more often and, in a professional environment, may be in constant use, leading to greater wear and tear.

Those who chose to adopt digital early and made recordings on tape will have marvelled at the lovely clean recordings and the wonders of error correction technology. As a legacy format, however, tape-based digital recordings are arguably more at risk than their analogue counterparts. They are doubly compromised: by the fragility of the tape itself, and by the particular problems that befall digital recordings when things go wrong.

Example of edge damage on a video tape

‘Edge damage’ is very common in video tape and can happen when the tape transport becomes worn. Wear can alter the alignment of the transport mechanism, causing it to move up and down and crush the tape. As you can see in the photograph, the edge of this tape has become damaged.

Because this is a digital recording, the damage has led to substantial problems with the transfer: large sections of the recording simply ‘drop out.’ In instances such as these, where the tape itself has been damaged, analogue recordings on tape are far more recoverable than digital ones. Dr John W. C. Van Bogart explains that

‘even in instances of severe tape degradation, where sound or video quality is severely compromised by tape squealing or a high rate of dropouts, some portion of the original recording will still be perceptible. A digitally recorded tape will show little, if any, deterioration in quality up to the time of catastrophic failure when large sections of recorded information will be completely missing. None of the original material will be detectable in these missing sections.’
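This ‘cliff edge’ behaviour can be illustrated with a deliberately simple toy model (a three-times repetition code of our own devising, nothing like the real CIRC scheme used in PCM and CD systems): below a certain tape error rate the decoder conceals every fault, and beyond it errors suddenly flood through.

```python
# Toy illustration of graceful concealment followed by sudden failure.
# A 3x repetition code corrects any single flipped copy per bit, so low error
# rates are hidden completely -- but as the error rate climbs, the decoder is
# overwhelmed and large numbers of errors appear at once.
import random

def encode(bits):
    return [b for b in bits for _ in range(3)]           # repeat each bit 3 times

def decode(coded):
    return [1 if sum(coded[i:i + 3]) >= 2 else 0         # majority vote per triple
            for i in range(0, len(coded), 3)]

def tape(coded, error_rate, rng):
    return [b ^ 1 if rng.random() < error_rate else b for b in coded]

rng = random.Random(0)
original = [rng.randint(0, 1) for _ in range(10_000)]

for error_rate in (0.001, 0.01, 0.1, 0.3):
    recovered = decode(tape(encode(original), error_rate, rng))
    errors = sum(a != b for a, b in zip(original, recovered))
    print(f"raw tape error rate {error_rate}: {errors} uncorrected bits out of {len(original)}")
```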

This risk of catastrophic, as opposed to gradual, loss of information is what makes tape-based digital media particularly fragile and at risk. What is especially worrying about digital tape recordings is that they may not show any external signs of damage until it is too late. We therefore encourage individuals, recording studios and memory institutions to assess the condition of their digital tape collections and take prompt action if the recorded information is valuable.

***

 The story of PCM digital processors and analogue tapes gives us a fascinating window into a time when we were not quite analogue, but not quite digital either, demonstrating how technologies co-evolve using the capacities of what is available in order to create something new.

 

Posted by debra in Audio Tape, 1 comment