
Grundig C 100 and the early history of the Compact Cassette

The recent arrival of a Grundig C 100 (DC-International) cassette in the Greatbear studio has been an occasion to explore the early history of the compact cassette.


Grundig DC90 cassette

The compact cassette has gained counter-cultural kudos in recent times, and more about that later, but once upon a time the format was the new kid on the block.

The audio cassette was revolutionary for several reasons, an important one being its compact size. The compact cassette, introduced by the Dutch company Philips in 1963, could be held in the palm of your hand, while its closest neighbour in media history, the RCA Sound Tape cartridge (1958-1964), needed to be held with two.

The compact cassette also offered a more user-friendly experience for the consumer.

Whereas reel-to-reel tape had to be threaded manually through the tape transport, all the user of a compact cassette tape machine had to do was insert a tape in a machine and press play.

Format Wars

One of the less-emphasised histories of the compact cassette concerns the alternative cassette standards that vied for market domination alongside Philips' format in the early 1960s.

One alternative was the DC-International system developed by the German company Grundig, which at that time was a leading manufacturer of tape, radio and Hi-Fi systems.

In 1965 Grundig introduced its first cassette recorder, the C 100, which used the Double Cassette (DC) International system. Like the Compact-System promoted by Philips, DC-International used two reels within the cassette shell. There were, however, important differences between the two standards.

The DC-International standard used a larger cassette shell (120 × 77 × 12 mm) and recorded at a speed of 2 inches per second. The Compact-System shell was smaller (100 × 63 × 12 mm) and recorded at 1⅞ inches per second.


Grundig DC-International compared to standard compact cassette

Fervent global competition shaped audio cassette production in the mid-1960s.

Grundig’s DC-International was effectively (and rapidly) ousted from the market by Philips’ ‘open’ licensing strategy.

Eric D. Daniel and C. Denis Mee explain that

‘From the beginning Philips pursued a strategy of licensing its design as widely as possible. According to Frederik Philips, president of the firm at the time, this policy was the brainchild of Mr. Hartong, a member of the board of management. Hartong believed that Philips should allow other manufacturers access to the design, turning the compact cassette into a world product…. Despite initial plans to charge a fee, Philips eventually decided to offer the license for free to any firm willing to produce the design. Several firms adopted the compact cassette almost immediately, including many Japanese manufacturers.’ [1]

The outcome of this licensing strategy was a widespread, international adoption of Philips’ compact cassette standard.

In Billboard on 16 September 1967 it was reported: ‘Philips has scored a critical victory on the German market for its “Compact-System”, which now seems certain to have uncontested leadership. Teldec has switched from the DC-International system to the Philips system, and Grundig, the major manufacturer of the DC-International system, announced that it will also start manufacturing cassette players for the Philips system.’

Cassettes today

The portable, user-friendly compact cassette has proved to be a resilient format. Despite falling foul of the digital march of progress in the early 1990s, the past couple of years have been defined by claims that cassettes are back and (almost) cool again.

Although the Recording Industry Association of America have denied reports they are tracking cassette sales again, it is clear that ‘a small, but engaged niche audience… is steadily growing’ for tape-based releases.

Whether that audience is gorging on tapes from do-it-yourself tape labels or sampling the delights of Justin Bieber’s latest album, cassettes are a hit for low-budget music-makers and status-bearers alike.

Compact Cassette Preservation

Amid this cassette fervour, Greatbear remains busy with the old wave of cassettes.

Cassettes from the 1960s and early 1970s carry specific preservation concerns.

Loss of lubricant is a common problem. You will know your tape is suffering lubricant loss if you hear a horrible squealing sound during playback. This is known as ‘stick slip’, which describes the way the magnetic tape sticks and slips against the tape heads as it moves antagonistically through the tape transport.

This squealing poses big problems because it can intrude into the signal path and become part of the digital transfer. Tapes displaying such problems therefore require careful re-lubrication to ensure the recording can be transferred in its optimum – and squeal-free – state.

Early compact cassettes also have problems that characterise much ‘new media.’

As Eric D. Daniel et al elaborate: ‘during the compact cassette’s first few years, sound quality was mediocre, marred by background noise, wow and flutter, and a limited frequency range. While ideal for voice recording applications like dictation, the compact cassette was marginal for musical recording.’ [2]

The resurgence in compact cassette culture may lull people into a false sense that recordings stored on cassettes are not high risk and do not need to be transferred in the immediate future.

It is worth remembering, however, that although playback machines will continue to be produced in years to come, not all tape machines are of equal, archival quality.

The last professional-grade audio cassette machines were produced in the late 1990s, and even the best of that batch lag far behind the tape machine to end all tape machines – the Nakamichi Dragon, with its Automatic Azimuth Correction technology – which was discontinued in 1993.

To ensure the best quality transfers it is advisable to play back tapes using professional-grade machines. These enable greater control of problems that can arise with azimuth, wow and flutter, which often need to be checked and, if necessary, adjusted prior to playback – a process that is not possible on cheaper, domestic machines.

As ever, if you have any specific concerns or enquiries regarding your audio cassette collections, please contact us to discuss them.

Notes

[1] Eric D. Daniel et al, eds. (2009) Magnetic Recording: The First 100 Years. Piscataway: IEEE Press Marketing, 103-104.

[2] Eric D. Daniel et al, eds, Magnetic Recording, 104.

Posted by debra in audio tape

D-1, D-2 & D-3: histories of digital video tape

Enormous D-1 cassette held in hand

Large D-1 cassette dimensions: 36.5 x 20.3 x 3.2cm

D-2 tape with rulers showing size

D-2 cassette dimensions: 25.4 x 14.9 x 3cm

D-3 tape with rulers showing size

D-3 cassette size M: 21.2 x 12.4 x 2.5 cm

At Greatbear we carefully restore and transfer D-1, D-2, D-3, D-5, D-9 and Digital-S tapes to digital files at archival quality.

Early digital video tape development

Behind every tape (and every tape format) lie interesting stories, and the technological wizardry and international diplomacy that helped shape the roots of our digital audio visual world are worth looking into.

In 1976, when the green shoots of digital audio technology were emerging at industry level, the question of whether Video Tape Recorders (VTRs) could be digitised began to be explored in earnest by R&D departments at Sony, Ampex and Bosch GmbH. There was considerable scepticism among researchers about whether digital video tape technology could be developed at all, because of the wide frequency band required to transmit a digital image.

In 1977, however, as reported on the Sony website, Yoshitaka Hashimoto and his team began to intensely research digital VTRs and 'in just a year and a half, a digital image was played back on a VTR.'

Several years of product development followed, shaped, in part, by competing regional preferences. As Jim Slater argues in Modern Television Systems (1991): 'much of the initial work towards digital standardisation was concerned with trying to find ways of coping with the three very different colour subcarrier frequencies used in NTSC, SECAM and PAL systems, and a lot of time and effort was spent on this' (114).

Establishing a standard sampling frequency had real financial consequences; it could not be randomly plucked out of the air: the higher the sampling frequency, the greater the overall bit rate, and the greater the overall bit rate, the more storage space needed in digital equipment. In 1982, after several years of negotiations, a 13.5 MHz sampling frequency was agreed. European and North American bodies, the 'Japanese, the Russians, and various other broadcasting organisations supported the proposals, and the various parameters were adopted as a world standard, Recommendation 601 [a.k.a. 4:2:2 DTV] standard of the CCIR [Consultative Committee for International Radio, now International Telecommunication Union]' (Slater, 116).
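To see why the choice was financially loaded, here is a back-of-the-envelope sketch (our arithmetic, not Slater's) of the gross data rate those parameters imply, assuming 8-bit samples:

```python
# Gross Rec. 601 (4:2:2) data rate: one luminance channel at 13.5 MHz plus
# two colour-difference channels at half that rate, 8 bits per sample.
luma_hz = 13.5e6            # luminance sampling frequency
chroma_hz = luma_hz / 2     # each colour-difference channel: 6.75 MHz
bits_per_sample = 8         # assumption: 8-bit samples

total_bit_rate = (luma_hz + 2 * chroma_hz) * bits_per_sample
print(f"Gross bit rate: {total_bit_rate / 1e6:.0f} Mbit/s")  # ~216 Mbit/s
```

The D-1 format's quoted 173 Mbit/s (see below) is lower than this gross figure, broadly because only the active picture area needs to be recorded.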

Recommendation 601's 4:2:2 DTV was an international standard that would form the basis of the (almost) exclusively digital media environment we live in today. It was 'developed in a remarkably short time, considering its pioneering scope, as the worldwide television community recognised the urgent need for a solid basis for the development of an all-digital television production system', write Stanley Baron and David Wood.

Once agreed upon, product development could proceed. The first digital video tape format, the D-1, was introduced on the market in 1986. It recorded uncompressed component video and used enormous bandwidth for its time: a bit rate of 173 Mbit/s, with a maximum recording time of 94 minutes.
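To put that bandwidth in storage terms, a quick sketch from the quoted figures (again our arithmetic):

```python
# Storage implied by one full D-1 cassette at the quoted figures.
bit_rate = 173e6            # 173 Mbit/s
seconds = 94 * 60           # maximum recording time: 94 minutes

total_gb = bit_rate * seconds / 8 / 1e9
print(f"One 94-minute D-1 tape holds roughly {total_gb:.0f} GB")  # ~122 GB
```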

large cream-coloured video machine with electroluminescent display panel

BTS DCR 500 D-1 video recorder at Greatbear studio

As Slater writes: 'unfortunately these machines are very complex, difficult to manufacture, and therefore very expensive […] they also suffer from the disadvantage that being component machines, requiring luminance and colour-difference signals at input and output, they are difficult to install in a standard studio which has been built to deal with composite PAL signals. Indeed, to make full use of the D-1 format the whole studio distribution system must be replaced, at considerable expense' (125).

Being forced to effectively re-wire whole studios, and the considerable risk involved in doing this because of continual technological change, strikes a chord with the challenges UK broadcast companies face as they finally become 'tapeless' in October 2014 as part of the Digital Production Partnership's AS-11 policy.

Sequels and product development

As the story so often goes, D-1 would soon be followed by D-2. Those who did make the transition to D-1 were probably kicking themselves, and you can only speculate about the number of back injuries sustained getting the machines into the studio (from experience we can tell you they are huge and very heavy!).

It was fairly inevitable that a sequel would be developed: even though the D-1 provided uncompromising image quality, it was most certainly an unwieldy format, as its gigantic size and component wiring made apparent. In response a composite digital video format, the D-2, was developed by Ampex and introduced in 1988.

In this 1988 promotional video, you can see the D-2 in action. Amazingly for our eyes and ears today, the D-2 is presented as the ideal archival format: amazing because of its physical size (hardly inconspicuous on the storage shelf!) but also because it used composite video signal technology. Composite signals combine on one wire all the component parts which make up a video signal: chrominance (colour, i.e. red, green and blue – RGB) and luminance (the brightness or black-and-white information, including grayscale).
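To make the component/composite distinction concrete, here is a minimal sketch (our illustration, using the Rec. 601 luma weightings; not broadcast-grade code) of how a component system splits an RGB pixel into the luminance and colour-difference values that a composite system folds back onto a single wire:

```python
# Component video separates brightness from colour. This splits an RGB pixel
# (values in 0.0-1.0) into luminance (Y) and colour-difference (Cb, Cr)
# signals using the Rec. 601 luma weightings.
def rgb_to_ycbcr(r: float, g: float, b: float) -> tuple[float, float, float]:
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luminance (weighted brightness)
    cb = (b - y) / 1.772                    # blue-difference chroma
    cr = (r - y) / 1.402                    # red-difference chroma
    return y, cb, cr

print(rgb_to_ycbcr(1.0, 0.5, 0.25))
```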

While the composite video signal used lower bandwidth and was more compatible with existing analogue systems used in the broadcast industry of the time, its value as an archival format is questionable. A comparable process for the storage we use today would be to add compression to a file in order to save file space and create access copies. While this is useful in the short term it does risk compromising file authenticity and quality in the long term. The Ampex video is fun to watch however, and you get a real sense of how big the tapes were and the practical impact this would have had on the amount of time it took to produce TV programmes.

Enter the D-3

Following the D-2 is the D-3, which is the final video tape covered in this article (although there were of course also the D-5 and D-9).

The D-3 was introduced by Panasonic in 1991 in order to compete with Ampex's D-2. It has the same sampling rate as the D-2 with the main difference being the smaller shell size.

The D-3's biggest claim to fame was that it was the archival digital video tape of choice for the BBC, who migrated their analogue video tape collections to the format in the early 1990s. One can only speculate that the decision to take the archival plunge with the D-3 was a calculated risk: it appeared to be a stable-ish technology (it wasn't a first generation technology and the difference between D-2 and D-3 is negligible).

The extent of the D-3 archive is documented in a white paper published in 2008, D3 Preservation File Format, written by Philip de Nier and Phil Tudor: 'the BBC Archive has around 315,000 D-3 tapes in the archive, which hold around 362,000 programme items. The D-3 tape format has become obsolete and in 2007 the D-3 Preservation Project was started with the goal to transfer the material from the D-3 tapes onto file-based storage.'

Tom Heritage, reporting on the development of the D3 preservation project in 2013/2014, reveals that 'so far, around 100,000 D3 and 125,000 DigiBeta videotapes have been ingested representing about 15 Petabytes of content (single copy).'

It has then taken six years to migrate less than a third of the BBC's D-3 archive. Given that D-3 machines are now obsolete, it is more than questionable whether there are enough D-3 head hours left in existence to read all the information back clearly and to an archive standard. The archival headache is compounded by the fact that 'with a large proportion of the content held on LTO3 data tape [first introduced 2004, now on LTO-6], action will soon be required to migrate this to a new storage technology before these tapes become difficult to read.' With the much-publicised collapse of the BBC's Digital Media Initiative (DMI) in 2013, you'd have to have a very strong disposition to work in the BBC's audio visual archive department.
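Taking the figures quoted above at face value, a naive extrapolation (ours, ignoring any ramp-up in ingest speed) suggests how long the remainder might take:

```python
# Naive projection of the BBC D-3 migration from the quoted figures.
total_tapes = 315_000       # D-3 tapes in the archive
ingested = 100_000          # D-3 tapes ingested so far
years_elapsed = 6           # 2007 project start to 2013/2014

rate = ingested / years_elapsed
remaining_years = (total_tapes - ingested) / rate
print(f"~{remaining_years:.0f} more years at the current rate")  # ~13 years
```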

The roots of the audio visual digital world

The development of digital video tape, and the international standards which accompanied its evolution, is an interesting place to start understanding our current media environment. It is also a great place to begin examining the problems of digital archiving, particularly when file migration has become embedded within organisational data management policy and data collections are growing exponentially.

While the D-1 may look like an alien techno-species from a distant land compared with the modest, immaterial files neatly stored on hard drives that we are accustomed to, the two are related through the 4:2:2 sampling standard which revolutionised high-end digital video production and continues to shape our mediated perceptions.

Preserving early digital video formats

For more information on transferring D-1, D-2, D-3, D-5, D-5HD & D-9 / Digital S tapes to digital files, visit our digitising pages for:

D-1 (Sony) component and D-2 (Ampex) composite 19mm digital video cassettes

Composite digital D-3 and uncompressed component digital D-5 and D-5HD (Panasonic) video cassettes

D-9 / Digital S (JVC) video cassettes

Posted by debra in video tape, video technology, machines, equipment

Software Across Borders? The European Archival Records and Knowledge Preservation (E-Ark) Project

The latest big news from the digital preservation world is that the European Archival Records and Knowledge Preservation (E-Ark) project, a three-year, multinational research project, has received a £6M award from the European Commission ‘to create a revolutionary method of archiving data, addressing the problems caused by the lack of coherence and interoperability between the many different systems in use across Europe,’ report the Digital Preservation Coalition, who are partners in the project.

What is particularly interesting about the consortium E-Ark has brought together is that commercial partners will be part of a conversation that aims to establish long-term solutions for digital preservation across Europe. More often than not, commercial interests have driven the technological innovations used within digital preservation. This has made digital data difficult to manage for institutions both large and small, as the BBC’s Digital Media Initiative demonstrates, because the tools and protocols are always in flux. A lack of policy-level standards and established best practices has meant that the norm within digital information management has very much been permanent change.

Such a situation poses great risks for both digitised and born digital collections because information may have to be regularly migrated in order to remain accessible and ‘open’. As stated on the E-Ark website, ‘the practices developed within the project will reduce the risk of information loss due to unsuitable approaches to keeping and archiving of records. The project will be public facing, providing a fully operational archival service, and access to information for its users.’

Vectorscope

The E-Ark project will hopefully contribute to the creation of compatible systems that can respond to the different needs of groups working with digital information. Which is, of course, just about everybody right now: as the world economy becomes increasingly defined by information and ‘big data’, efficient and interoperable access to commercial and non-commercial archives will be an essential part of a vibrant and well functioning economic system. The need to establish data systems that can communicate and co-operate across software borders, as well as geographical ones, will become an economic necessity in years to come.

The task facing E-Ark is huge, but one crucial to implement if digital data is to survive and thrive in this brave new datalogical world of ours. As E-Ark explain: ‘Harmonisation of currently fragmented archival approaches is required to provide the economies of scale necessary for general adoption of end-to-end solutions. There is a critical need for an overarching methodology addressing business and operational issues, and technical solutions for ingest, preservation and re-use.’

Maybe 2014 will be the year when digital preservation standards start to become a reality. As we have already discussed on this blog, the US-based National Agenda for Digital Stewardship 2014 outlined the negative impact of continuous technological change and the need to create dialogue among technology makers and standards agencies. It looks like things are changing and much needed conversations are soon to take place, and we will of course reflect on developments on the Great Bear blog.

 

Posted by debra in audio tape, video tape

Digital Preservation – Establishing Standards and Challenges for 2014

2014 will no doubt present a year of new challenges for those involved in digital preservation. A key issue remains the sustainability of digitisation practices within a world yet to establish firm standards and guidelines. Creating lasting procedures capable of working across varied and international institutions would bring some much-needed stability to a profession often characterised by permanent change and innovation.

In 1969 the EIAJ-1 video tape standard was developed by the Electronic Industries Association of Japan. It was the first standardised format for industrial/non-broadcast video tape recording. Once implemented, it enabled video tapes to be played on machines made by different manufacturers and it helped to make video use cheaper and more widespread, particularly within a domestic context.

Close-up of a tape machine's 'play', 'stop' and 'rewind' buttons

The introduction of standards in the digitisation world would of course have very little impact on the widespread use of digital technologies, which are, in the West, largely ubiquitous. It would, however, make the business of digital preservation economically more efficient, simply because organisations would not be constantly adapting to change. Think of the costs involved in keeping up with rapid waves of technological transformation: updating equipment, migrating data, and ensuring file integrity and operability are maintained are just a few of the costly and time-consuming tasks this entails.
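On the file integrity point, the day-to-day practice usually amounts to fixity checking. A minimal sketch, assuming a collection sitting in an 'archive' directory (the path and the in-memory manifest are illustrative only), might look like this:

```python
# Fixity checking: hash every file once, then re-hash later and compare.
# The 'archive' directory and the in-memory manifest are illustrative.
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MB at a time
            h.update(chunk)
    return h.hexdigest()

manifest = {p: sha256_of(p) for p in Path("archive").rglob("*") if p.is_file()}

# ...months later, re-run and flag any silent changes:
for p, digest in manifest.items():
    if sha256_of(p) != digest:
        print(f"Integrity failure: {p}")
```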

Although increasingly sophisticated digital forensic technology can help to manage some of these processes, highly trained (real life!) people will still be needed to oversee any large-scale preservation project. Within such a context, resource allocation will always have to account for these processes of adaptation. It has to be asked then: could this money, time and energy be harnessed in other, more efficient ways? The costs of non-standardisation become ever more pressing when we consider the amount of digital data preserved by large institutions such as the British Library, whose digital collection is estimated to grow to 5 petabytes (5,000 terabytes) by 2020. This is not a simple case of updating your iPhone to the next model, but an extremely complex and risky venture where the stakes are high. Do we really want to jeopardise rich forms of cultural heritage in the name of technological progress?

The US-based National Digital Stewardship Alliance (NDSA) National Agenda for Digital Stewardship 2014 echoes such a sentiment. They argue that ‘the need for integration, interoperability, portability, and related standards and protocols stands out as a theme across all of these areas of infrastructure development’ (3). The executive summary also stresses the negative impact rapid technological change can create, and the need to ‘coordinate to develop comprehensive coverage on critical standards bodies, and promote systematic community monitoring of technology changes relevant to digital preservation.’ (2)

File Format Action Plans

One step on the way to more secure standards is the establishment of File Format Action Plans, a practice increasingly recommended by US institutions. The idea behind developing a file format action plan is to create a directory of the file types that are in regular use by people in their day-to-day lives and by institutions. Getting it all down on paper can help us track what may be described as the implicit user-standards of digital culture. This is the basic idea behind Parsimonious Preservation, discussed on the blog last year: that through observing trends in file use we may come to the conclusion that the best preservation policy is to leave data well alone, since in practice files don't seem to change that much, rather than risk the integrity of information via constant intervention.
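As a starting point, the survey behind such a plan can be as simple as tallying the file extensions that actually occur in a collection. A rough sketch (the 'collection' directory is an illustrative assumption):

```python
# First step towards a File Format Action Plan: tally the file extensions
# actually present in a collection. The 'collection' path is illustrative.
from collections import Counter
from pathlib import Path

counts = Counter(p.suffix.lower() or "(no extension)"
                 for p in Path("collection").rglob("*") if p.is_file())

for ext, n in counts.most_common():
    print(f"{ext:15} {n}")
```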

As Lee Nilsson, who is currently working as a National Digital Stewardship Resident at the US Library of Congress writes, ‘specific file format action plans are not very common’, and when created are often subject to constant revision. Nevertheless he argues that devising action plans can ‘be more than just an “analysis of risk.” It could contain actionable information about software and formats which could be a major resource for the busy data manager.’

Other Preservation Challenges

Analogue to digital converter close up

What are the other main challenges facing ‘digital stewards’ in 2014? In a world of exponential information growth, making decisions about what we keep and what we don’t becomes ever more pressing. When whole collections cannot be preserved, digital curators are increasingly called upon to select material deemed representative and relevant. How is it possible to know now what material needs to be preserved for posterity? What values inform our decision making?

To take an example from our work at Great Bear: we often receive tapes from artists who have achieved little or no commercial success in their life times, but whose work is often of great quality and can tell us volumes about a particular community or musical style. How does such work stand up against commercially successful recordings? Which one is more valuable? The music that millions of people bought and enjoyed or the music that no one has ever heard?

Ultimately these questions will come to occupy a central concern for digital stewards of audio data, particularly with the explosion of born-digital music cultures which have enabled communities of informal and often non-commercial music makers to proliferate. How is it possible to know in advance what material will be valuable for people 20, 50 or 100 years from now? These are very difficult, if not impossible questions for large institutions to grapple with, and take responsibility for. Which is why, as members of a digital information management society, it is necessary to empower ourselves with relevant information so we can make considered decisions about our own personal archives.

A final point to stress is that among the ‘areas of concern’ for digital preservation cited by the NDSA, moving image and recorded sound figure highly, alongside other born-digital content such as electronic records, web and social media. Magnetic tape collections remain high risk and it is highly recommended that you migrate this content to a digital format as soon as possible. While digitisation certainly creates many problems, as detailed above, magnetic tape is also threatened by physical deterioration and its own obsolescence challenges, in particular finding working machines to play tape back on. The simple truth is: if you want to access material in your tape collections, it now needs to be stored in a resilient digital format. We can help, and offer other advice relating to digital information management, so don’t hesitate to get in touch.

Posted by debra in audio tape, video tape

Big Data, Long Term Digital Information Management Strategies & the Future of (Cartridge) Tape

What is the most effective way to store and manage digital data in the long term? This is a question we have given considerable attention to on this blog. We have covered issues such as analogue obsolescence, digital sustainability and digital preservation policies. It seems that as a question it remains unanswered and up for serious debate.

We were inspired to write about this issue once again after reading an article published in the New Scientist a year ago called ‘Cassette tapes are the future of big data storage.’ The title is a little misleading, because the tape it refers to is not the domestic audio tape that has recently acquired much counter-cultural kudos, but rather archival tape cartridges that can store up to 100 TB of data. How much?! I hear you cry! And why tape, given the ubiquity of digital technology these days? Aren’t we all supposed to be ‘going tapeless’?

The reason for such an invention, the New Scientist reveals, is the ‘Square Kilometre Array (SKA), the world’s largest radio telescope, whose thousands of antennas will be strewn across the southern hemisphere. Once it’s up and running in 2024, the SKA is expected to pump out 1 petabyte (1 million gigabytes) of compressed data per day.’


Image of the SKA dishes

Researchers at Fuji and IBM have already designed a tape that can store up to 35TB, and it is hoped that a 100TB tape will be developed to cope with the astronomical ‘annual archive growth [that] would swamp an experiment that is expected to last decades’. The 100TB cartridges will be made ‘by shrinking the width of the recording tracks and using more accurate systems for positioning the read-write heads used to access them.’
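To get a feel for the scale involved, a quick sketch from the figures quoted above (our arithmetic, not the SKA's operational plan):

```python
# Cartridge throughput implied by the SKA's projected output.
daily_output_tb = 1_000     # 1 petabyte of compressed data per day
cartridge_tb = 100          # the hoped-for 100 TB cartridge

per_day = daily_output_tb / cartridge_tb
print(f"{per_day:.0f} cartridges per day, ~{per_day * 365:,.0f} per year")
# 10 per day; roughly 3,650 cartridges a year
```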

If successful, this would certainly be an advanced achievement in materials science and electronics. Smaller track width means less room for error in the read-write function, which will have to be incredibly precise on a tape storing such an extreme amount of information. Presumably smaller track width will also mean there is no space for guard bands either. Guard bands are unrecorded areas between the stripes of recorded information that are designed to prevent interference between adjacent tracks, or what is known as ‘cross-talk’. They were used on larger video cassette formats such as U-Matic and VHS, but were dispensed with on smaller formats such as Hi-8, which packed a higher density of magnetic information into a small space and used video heads with tilted gaps instead of guard bands.

The existence of SKA still doesn’t explain the pressing question: why develop new archival tape storage solutions and not hard drive storage?

Hard drives were embraced quickly because they take up less physical storage space than tape. Gone are the dusty rooms bursting with reel upon reel of bulky tape; hello stacks of infinite quick-fire data, whirring and purring all day and night. Yet when we consider the amount of energy hard drive storage requires to remain operable, the costs – both economic and ecological – dramatically increase.

The report compiled by the Clipper Group published in 2010 overwhelmingly argues for the benefits of tape over disk for the long-term archiving of data. They state that ‘disk is more than fifteen times more expensive than tape, based upon vendor-supplied list pricing, and uses 238 times more energy (costing more than all the costs for tape) for an archiving application of large binary files with a 45% annual growth rate, all over a 12-year period.’
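It is worth pausing on that 45% annual growth assumption; compounded over the report's 12-year period it is dramatic, as this quick sketch (starting from a nominal 100 TB archive) shows:

```python
# Compounding a 45% annual growth rate over the report's 12-year period,
# starting from a nominal 100 TB archive.
archive_tb = 100.0
for year in range(12):
    archive_tb *= 1.45
print(f"~{archive_tb:,.0f} TB after 12 years")  # ~8,600 TB: an 86-fold increase
```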

Those figures are probably quite staggering to read, given the amount of investment in establishing institutional architecture for tape-less digital preservation. Such an analysis of energy consumption does assume, however, that hard drives are powered on all the time, when surely many organisations transfer archives to hard drives and only check them once every 6-12 months.

Yet due to the pressures of technological obsolescence and the need to remain vigilant about file operability, coupled with the fact that digital archives are expected to be quickly accessible (in comparison with tape, which can only be played back linearly), such energy consumption seems fairly inescapable for large institutions in an increasingly voracious, 24/7 information culture. Of course, the issue of obsolescence will undoubtedly affect super-storage data tape cartridges as well. Technology does not stop innovating; it is not in the interests of the market to do so.

Perhaps more significantly, the archive world has not yet developed standards that address the needs of digital information managers. Henry Newman’s presentation at the Designing Storage Architectures 2013 conference explored the difficulty of digital data management, precisely due to the lack of established standards:

  • ‘There are some proprietary solutions available for archives that address end to end integrity;
  • There are some open standards, but none that address end to end integrity;
  • So, there are no open solutions that meet the needs of [the] archival community.’

He goes on to write that standards are ‘technically challenging’ and require ‘years of domain knowledge and detailed understanding of the technology’ to implement. Worryingly perhaps, he writes that ‘standards groups do not seem to be coordinating well from the lowest layers to the highest layers.’ By this we can conclude that the lack of streamlined conversation around the issue of digital standards means that effectively users and producers are not working in synchrony. This is making the issue of digital information management a challenging one, and will continue to be this way unless needs and interests are seen as mutual.

Other presentations at the recent annual meeting for Designing Storage Architectures for Digital Collections, which took place on September 23-24, 2013 at the Library of Congress, Washington, DC, also suggest there are limits to innovation in the realm of hard drive storage. Gary Decad, IBM, delivered a presentation on ‘The Impact of Areal Density and Millions of Square Inches of Produced Memory on Petabyte Shipments for TAPE, NAND Flash, and HDD Storage Class’.

For the lay (wo)man this basically translates as the capacity to develop computer memory stored on hard drives. We are used to living in a consumer society where new, improved gadgets appear all the time. Devices are getting smaller and we seem to be able to buy more storage space at cheaper prices. For example, it now costs under £100 to buy a 3TB hard drive, and it is becoming increasingly difficult to purchase hard drives with less than 500GB of storage space. A year ago, a 1TB hard drive was top of the range and would probably have cost you about £100.
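Put as a price per gigabyte, the drift is easy to see (a rough sketch using the figures just quoted):

```python
# Price per gigabyte, from the figures quoted above.
pence_per_gb_now = 100 * 100 / 3000    # £100 for 3 TB today
pence_per_gb_then = 100 * 100 / 1000   # £100 for 1 TB a year ago

print(f"now: {pence_per_gb_now:.1f}p/GB, then: {pence_per_gb_then:.1f}p/GB")
# roughly 3.3p/GB now versus 10p/GB a year earlier
```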

A 100TB storage unit in 2010, compared with a smaller hard drive symbolising 2020.

Does my data look big in this?

Yet the presentation from Gary Decad suggests we are reaching a plateau with this kind of storage technology – infinite memory growth and reduced costs will soon no longer be feasible. The presentation states that ‘with decreasing rates of areal density increases for storage components and with component manufacturers’ reluctance to invest in new capacity, historical decreases in the cost of storage ($/GB) will not be sustained.’

Where does that leave us now? The resilience of tape as an archival solution, the energy implications of hard drive storage, the lack of established archival standards and a foreseeable end to cheap and easy big digital data storage are all indications of the complex and confusing terrain of information management in the 21st century. Perhaps the Clipper report offers the most grounded appraisal: ‘the best solution is really a blend of disk and tape, but – for most uses – we believe that the vast majority of archived data should reside on tape.’ Yet it seems that until the day standards are established in line with the needs of digital information managers, this area will continue to generate troubling, if intriguing, conundrums.

Post published Nov 18, 2013

Posted by debra in audio tape, video tape

Archiving for the digital long term: information management and migration

As an archival process digitisation offers the promise of a dream: improved accessibility, preservation and storage.

However the digital age is not without its archival headaches. News of the BBC’s plans to abandon their Digital Media Initiative (DMI), which aimed to make the BBC media archive ‘tapeless’, clearly demonstrates this. As reported in The Guardian:

‘DMI has cost £98.4m, and was meant to bring £95.4m of benefits to the organisation by making all the corporation’s raw and edited video footage available to staff for re-editing and output. In 2007, when the project was conceived, making a single TV programme could require 70 individual video-handling processes; DMI was meant to halve that.’

The project’s failure has been explained by its size and ambition. Another telling reason was cited: the software and hardware used to deliver the project were developed for exclusive use by the BBC. In a statement, BBC Director-General Tony Hall referred to the fast development of digital technology, stating that ‘off-the-shelf [editing] tools were now available that could do the same job “that simply didn’t exist five years ago”.’

G-Tech Pro hard drive RAID array

The fate of the DMI initiative should act as a sobering lesson for institutions, organisations and individuals who have not thought about digitisation as a long, rather than short term, archival solution.

As technology continues to ‘innovate’ at a startling rate, it is hard to predict how long the current archival standards for audio and audio-visual material will last.

Being an early adopter of technology can be an attractive proposition: you are up to date with the latest ideas, flying the flag for the cutting edge. Yet new technology becomes old fast, and this potentially creates problems for accessing and managing information. The fragility of digital data comes to the fore, and the risk of investing all our archival dreams in exclusive technological formats as the BBC did, becomes far greater.

Mac OS X copy dialogue box

In order for our data to survive we need to appreciate that we are living in what media theorist Jussi Parikka calls an ‘information management society.’ Digitisation has made it patently clear that information is dynamic rather than stored safely in static objects. Migrating tape based archives to digital files is one stage in a series of transitions material can potentially make in its lifetime.

Given the evolution of media and technology in the 20th and 21st centuries, it feels safe to speculate that new technologies will emerge to supplant uncompressed WAV and AIFF files, just as AAC has now become preferred to MP3 as a compressed audio format because it achieves better sound quality at similar bit rates.

Because of this, at Greatbear we always migrate analogue and digital magnetic tape at the recommended archival standard, and provide customers with both high-quality and access copies. Furthermore, we strongly recommend that customers back up archive-quality files in at least three separate locations, because it is highly likely data will need to be migrated again in the future.
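For those applying the three-location rule programmatically, a minimal sketch along the following lines copies a master file to each backup location and verifies every copy against the source checksum (all paths here are illustrative assumptions, not a prescribed layout):

```python
# Copy an archive master to three separate locations and verify each copy
# against the source checksum. All paths are illustrative assumptions.
import hashlib
import shutil
from pathlib import Path

def sha256_of(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

master = Path("masters/interview_master.wav")
destinations = [Path("/mnt/local_raid"),
                Path("/mnt/offsite_nas"),
                Path("/mnt/cloud_sync")]

reference = sha256_of(master)
for dest in destinations:
    copy = Path(shutil.copy2(master, dest / master.name))
    if sha256_of(copy) != reference:
        print(f"Verification failed for {copy}")
```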

Posted by debra in audio tape, video tape