
Guest post: The Upright Electric Guitar

Is it a piano? Is it an electric guitar? Neither, it’s a hybrid! Keys, “action”, dampers from an upright piano, wood planks, electric guitar strings, and long pickup coils.

Watch and listen to a YouTube video of this instrument: https://youtu.be/pXIzCWyw8d4

Inception, designing and building

I first had the idea for the upright electric guitar in late 1986. At that time I had been scraping together a living for around 2 years, by hauling a 450-pound upright piano around to the shopping precincts in England, playing it as a street entertainer – and in my spare time I dreamt of having a keyboard instrument that would allow working with the sound of a “solid body” electric guitar. I especially liked the guitar sound of Angus Young from AC/DC, that of a Gibson SG. It had a lot of warmth in the tone, and whenever I heard any of their music, I kept thinking of all the things I might be able to do with that sound if it was available on a keyboard, such as developing new playing techniques. I had visions of taking rock music in new directions, touring, recording, and all the usual sorts of things an aspiring musician has on their mind.

Digital sampling was the latest development in keyboard technology back then, but I had found that samples of electric guitar did not sound authentic enough, even just in terms of their pure tone quality. Eventually all this led to one of those “eureka” moments in which it became clear that one way to get what I was after would be to take a more “physical” approach: use a set of piano keys and the “action” and “dampering” mechanism that normally comes with them, mount everything on planks of wood, swap out the piano strings for those from an electric guitar, add guitar pickups, wiring and switches, and so on – and finally, send the result of all this into a Marshall stack.

I spent much of the next 12 years working on some form of this idea, except for a brief interlude for a couple of years in the early 1990s, during which I collaborated with a firm based in Devon, Musicom Ltd, whose use of additive synthesis technology had led them to come up with the best artificially produced sounds of pipe organs that were available anywhere in the world. Musicom had also made some simple attempts to create other instrument sounds including acoustic piano, and the first time I heard one of these, in 1990, I was very impressed – it clearly had a great deal of the natural “warmth” of a real piano, warmth that was missing from any digital samples I had ever heard. After that first introduction to their technology and to the work that Musicom were doing, I put aside my idea for the physical version of the upright electric guitar for a time, and became involved with helping them with the initial analysis of electric guitar sounds.

Unfortunately, due to economic pressures, there came a point in 1992 when Musicom had to discontinue their research into other instrument sounds and focus fully on their existing lines of development and their market for the pipe organ sounds. It was at that stage that I resumed work on the upright electric guitar as a physical hybrid of an electric guitar and an upright piano.

I came to describe the overall phases of this project as “approaches”, and in this sense, all work done before I joined forces with Musicom was part of “Approach 1”, the research at Musicom was “Approach 2”, and the resumption of my original idea after that was “Approach 3”.

During the early work on Approach 1, my first design attempts at this new instrument included a tremolo or “whammy bar” to allow some form of note / chord bending. I made detailed 3-view drawings of the initial design, on large A2 sheets. These were quite complicated and looked like they might prove to be very expensive to make, and sure enough, when I showed them to a light engineering firm, they reckoned it would cost around £5,000.00 to produce to those specifications. Aside from the cost, even on paper this design looked a bit impractical – it seemed like it might never stay in tune, for one thing.

Despite the apparent design drawbacks, I was able to buy in some parts during Approach 1, and have other work done, which would eventually be usable for Approach 3. These included getting the wood to be used for the planks, designing and having the engineering done on variations of “fret” pieces for all the notes the new instrument would need above the top “open E” string on an electric guitar, and buying a Marshall valve amp with a separate 4×12 speaker cabinet.

While collaborating with Musicom on the electronic additive synthesis method of Approach 2, I kept hold of most of the work and items from Approach 1, but by then I had already lost some of the original design drawings from that period. This is a shame, as some of them were done in multiple colours, and they were practically works of art in their own right. As it turned out, the lost drawings included features that I would eventually leave out of the design that resulted from a fresh evaluation taken to begin Approach 3, and so this loss did not stop the project moving forward.

The work on Approach 3 began in 1992, and it first involved sourcing the keys and action/dampering of an upright piano. I wanted to buy something new and “off the shelf”, and eventually I found a company based in London, Herrberger Brooks, who sold me one of their “Rippen R02/80” piano actions and key sets, still boxed up as it would be if sent to any company that manufactures upright pianos.

These piano keys and action came with a large A1 blueprint drawing that included their various measurements, and this turned out to be invaluable for the design work that had to be done next. The basic idea was to make everything to do with the planks of wood – their strings, pickups, tuning mechanism, frets, “nut”, machine heads and so on – fit together with, and “onto”, the existing dimensions of the piano keys and action, and then to use a frame to suspend the planks vertically, to add a strong but relatively thin “key bed” under the keys, legs under the key bed to go down to ground level and onto a “base”, and so on.

To begin work on designing how the planks would hold the strings, how those would be tuned, where the pickup coils would go and so on, I first reduced down this big blueprint, then added further measurements of my own, to the original ones. For the simplest design, the distance between each of the piano action’s felt “hammers” and the next adjacent hammer was best kept intact, and this determined how far apart the strings would have to be, how wide the planks needed to be, and how many strings would fit onto each plank. It looked like 3 planks would be required.

While working on new drawings of the planks, I also investigated what gauge of electric guitar string should be used for each note, how far down it would be possible to go for lower notes, and things related to this. With a large number of strings likely to be included, I decided it would be a good idea to aim for a similar tension in each one, so that the stresses on the planks and other parts of the instrument would, at least in theory, be relatively uniform. Some enquiries at the University of Bristol led me to a Dr F. Gibbs, who had already retired from the Department of Physics but was still interested in the behaviour and physics of musical instruments. He assisted with the equations for calculating the tension of a string, based on its length, diameter, and the pitch of the note produced on it. Plugging all the key factors into this equation resulted in a range of electric guitar string gauges that made sense for the upright electric guitar, and for the 6 open string notes found on a normal electric guitar, the gauges resulting from my calculations were similar to the ones your average electric guitarist might choose.
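For the technically minded reader, the kind of calculation involved can be sketched as below. This is only an illustration – it assumes a plain (unwound) steel string and round example figures, and is not a reproduction of Dr Gibbs’ actual working.

```python
import math

def string_tension(length_m, diameter_m, frequency_hz, density_kg_m3=7850):
    """Tension (in newtons) of a plain steel string, from the standard relation
    f = (1 / 2L) * sqrt(T / mu), where mu is the mass per unit length."""
    mu = density_kg_m3 * math.pi * (diameter_m / 2) ** 2   # linear density, kg/m
    return 4 * length_m ** 2 * frequency_hz ** 2 * mu

# Example only: a plain 0.010" (0.254 mm) top E string at 329.6 Hz
# over a roughly Gibson-length 628 mm speaking length.
print(round(string_tension(0.628, 0.254e-3, 329.6), 1), "N")
```

Wound strings complicate this slightly, since their linear density comes from the core plus the wrap and is usually measured or taken from the maker’s tables rather than calculated from a single diameter.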

Other practicalities also determined how many more notes it would theoretically be possible to include below the bottom “open E” string on an electric guitar, for the new instrument. For the lowest note to be made available, by going all the way down to a 0.060 gauge wound string – the largest available at that time as an electric guitar string – it was possible to add several more notes below the usual open bottom E string. I considered using bass strings for notes below this, but decided not to include them and instead, to let this extra range be the lower limit on strings and notes to be used. Rather than a bass guitar tone, I wanted a consistent sort of electric guitar tone, even for these extra lower notes.

For the upper notes, everything above the open top E on a normal guitar would have a single fret at the relevant distance away from the “bridge” area for that string, and all those notes would use the same string gauge as each other.
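The position of each of these single frets follows the standard equal-temperament rule: every semitone above the open string shortens the speaking length by a factor of the twelfth root of 2. A small sketch, using an illustrative Gibson-style scale length rather than the actual plank dimensions:

```python
def length_from_bridge(open_scale_m, semitones_above_open):
    """Fret-to-bridge distance for a note n semitones above the open string,
    under equal temperament: L_n = L_0 / 2 ** (n / 12)."""
    return open_scale_m / 2 ** (semitones_above_open / 12)

# Illustrative 628 mm (roughly Gibson-scale) open string length.
for n in (1, 12, 22):   # one semitone up, one octave up, an SG's last (22nd) fret
    print(n, round(length_from_bridge(0.628, n) * 1000, 1), "mm from the bridge")
```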

The result of all the above was that the instrument would accommodate a total of 81 notes / strings, with an octave of extra notes below the usual guitar’s open bottom E string, and just under 2 octaves of extra notes above the last available fret from the top E string of a Gibson SG, that last fretted note on an SG being the “D” just under 2 octaves above the open top E note itself. For the technically minded reader, this range of notes went from “E0” to “C7”.

Having worked all this out, I made scale drawings of the 3 planks, with their strings, frets, pickup coils, and a simple fine-tuning mechanism included. It was then possible to manipulate a copy of the piano action blueprint drawing – with measurements removed, reduced in size, and reversed as needed – so it could be superimposed onto the planks’ scale drawings, to the correct relational size and so on. I did this without the aid of any computer software, partly because in those days, CAD apps were relatively expensive, and also because it was difficult to find any of this software that looked like I could learn to use it quickly. Since I had already drawn this to scale in the traditional way – using draftsman’s tools and a drawing board – it made sense to work with those drawings, so instead of CAD, I used photocopies done at a local printing shop, and reduced / reversed etc, as needed.

Key drawing of 3 planks, strings, frets, fine tuning mechanism and pickup coils, combined with upright piano action

It was only really at this point, once the image of the piano action’s schematic was married up to the scale drawings of the 3 planks, that I began to fully understand where this work was heading, in terms of design. But from then on, it was relatively easy to come up with the rest of the concepts and to draw something for them, so that work could proceed on the frame to hold up the planks, the key bed, legs, and a base at ground level.

Around this time, I came across an old retired light engineer, Reg Huddy, who had a host of engineering machines – drill presses, a lathe, milling machine, and so on – set up in his home. He liked to make small steam engines and things of that nature, and when I first went to see him, we hit it off immediately. In the end he helped me make a lot of the metal parts that were needed for the instrument, and to machine in various holes and the pickup coil routing sections on the wood planks. He was very interested in the project, and as I was not very well off, he insisted on charging minimal fees for his work. Reg also had a better idea for the fine tuning mechanism than the one I had come up with, and we went with his version as soon as he showed it to me.

If I am honest, I don’t think I would ever have finished the work on this project without all the help that Reg contributed. I would buy in raw materials if he didn’t already have them, and we turned out various parts as needed – based either on 3-view drawings I had previously come up with or, for other parts we realised would be required as the project progressed, on drawings I worked up as we went along. Reg sometimes taught me to use his engineering machinery, and although I was a bit hesitant at times, after a while I was working on these machines to a very basic standard.

I took the wood already bought for the instrument during the work on Approach 1, to Jonny Kinkead of Kinkade Guitars, and he did the cutting, gluing up and shaping to the required sizes and thicknesses for the 3 planks. The aim was to go with roughly the length of a Gibson SG neck and body, to make the planks the same thickness as an SG body, and to include an angled bit as usual at the end where an SG or any other guitar is tuned up, the “machine head” end. Jonny is an excellent craftsman and was able to do this work to a very high standard, based on measurements I provided him with.

As well as getting everything made up for putting onto the planks, the piano action itself needed various modifications. The highest notes had string lengths that were so short that the existing dampers had to be extended so they were in the correct place, as otherwise they would not have been positioned over those strings at all. Extra fine adjustments were needed for each damper, so that instead of having to physically bend the metal rod holding a given damper in place – an inexact science at the best of times – it was possible to turn a “grub screw” to accomplish the same thing, but with a much greater degree of precision. And finally, especially important for the action, the usual felt piano “hammers” were to be replaced by smaller versions made of stiff wire shaped into a triangle. For these, I tried a few design mock-ups to find the best material for the wire itself, and to get an idea of what shape to use. Eventually, once this was worked out, I made up a “jig” around which it was possible to wrap the stiff wire so as to produce a uniformly shaped “striking triangle” for each note. This jig was then used to make 81 of the new wire hammers, as similar to each other as possible. Although using the jig in this way was a really fiddly job, the results were better than I had expected, and they were good enough.

Close-up of a few hammers, dampers and strings

While this was all underway, I got in touch with an electric guitar pickup maker, Kent Armstrong of Rainbow Pickups. When the project first started, I had almost no knowledge of solid body electric guitar physics at all, and I certainly had no idea how pickup coils worked. Kent patiently explained this to me, and once he understood what I was doing, we worked out as practical a design for long humbucker coils as possible. A given coil was to go all the way across one of the 3 planks, “picking up” from around 27 strings in total – but for the rightmost plank, the upper strings were so short that there was not enough room to do this and still have both a “bridge” and a “neck” pickup, so the top octave of notes had to have these two sets of coils stacked one on top of the other, using deeper routed areas in the wood than elsewhere.

For the signal to send to the amplifier, we aimed for the same overall pickup coil resistance (Ω) as on a normal electric guitar. By using larger gauge wire and fewer windings than normal, and by wiring up the long coils from each of the 3 planks in the right way, we got fairly close to this, for both an “overall bridge” and an “overall neck” pickup. Using a 3-way switch that was also similar to what’s found on a normal electric guitar, it was then possible to have either of these 2 “overall” pickups – bridge or neck – on by itself, or both at once. Positioning these two coil sets a similar distance away from the “bridge end” of the strings as on a normal guitar produced just the sort of sound difference between the bridge and neck pickups that we intended. Because, as explained above, we had to stack bridge and neck coils on top of each other for the topmost octave of notes, those very high notes – much higher than on most electric guitars – did not sound all that different with the overall “pickup switch” position set to “bridge”, “neck”, or both at once. That was OK though, as those notes were not expected to get much use.
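The exact coil values are not recorded here, but the arithmetic behind wiring the three long coils together into one “overall” pickup is just series and parallel resistance. A brief sketch with purely illustrative figures, aiming at the rough DC resistance of a typical humbucker (somewhere around 8 kΩ):

```python
def series(*coil_resistances_ohms):
    """Total resistance of coils wired end to end."""
    return sum(coil_resistances_ohms)

def parallel(*coil_resistances_ohms):
    """Total resistance of coils wired side by side."""
    return 1 / sum(1 / r for r in coil_resistances_ohms)

# Hypothetical values for the three long coils, one per plank:
print(parallel(24_000, 24_000, 24_000))   # -> 8000.0 ohms if wired in parallel
print(series(2_700, 2_700, 2_700))        # -> 8100 ohms if wired in series
```

Either wiring can land near the target; what changes is how each individual coil has to be wound.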

Some electric guitar pickups allow the player to adjust the volume of each string using a screw or “grub screw”. For the upright electric guitar I added 2 grub screws per string for each of the bridge and neck coils, which meant there were over 300 of these to adjust. Once the coils were ready, covered in copper sheeting to screen out any unwanted interference, and mounted up onto the planks, some early adjustments to a few of these grub screws, and tests of the volumes of those notes, enabled me to work up a graph for calculating how much to adjust the height of each of the 300+ grub screws, across all 81 strings. This seemed to work quite well in the end, and there was a uniform change in volume from one end of the available notes to the other, comparable to a typical electric guitar.
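The “graph” amounted to interpolating from a handful of measured strings to all the rest. A minimal sketch of that idea, with entirely made-up calibration numbers:

```python
# Measure the screw height that gives an even volume on a few sample strings,
# then interpolate heights for every string in between. Figures are invented.
measured_strings = [1, 20, 40, 60, 81]            # strings actually tested
measured_heights_mm = [1.2, 1.5, 1.9, 2.4, 3.0]   # heights giving even volume

def interpolated_height(string_no):
    """Linear interpolation between the two nearest measured strings."""
    points = list(zip(measured_strings, measured_heights_mm))
    for (s0, h0), (s1, h1) in zip(points, points[1:]):
        if s0 <= string_no <= s1:
            return h0 + (h1 - h0) * (string_no - s0) / (s1 - s0)
    raise ValueError("string number outside the measured range")

print(round(interpolated_height(30), 2), "mm")   # e.g. string 30 -> 1.7 mm
```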

Unlike a normal electric guitar, fine tuning on this instrument was done at the “ball end” / “bridge end” of each string, not the “machine heads end” / “nut end”. The mechanism for this involved having a very strong, short piece of round rod put through the string’s “ball”, positioning one end of this rod into a fixed groove, and turning a screw using an allen key near the other end of the rod, to change the tension in the string. It did take a while to get this thing into tune, but I have always had a good ear, and over the years I had taught myself how to tune a normal piano, which is much more difficult than doing this fine tuning of the upright electric guitar instrument.

Fine tuning mechanisms for each string (in the upper right part of the photo)
Hammers, dampers, strings, pickup coils and their grub screws, and fine tuning mechanisms

A frame made of aluminium was designed to support the 3 planks vertically. They were quite heavy on their own, and much more so with all the extra metal hardware added on, so the frame had to be really strong. Triangle shapes gave it extra rigidity. To offset the string tensions, truss rods were added on the back of the 3 planks, 4 per plank at equal intervals. When hung vertically, the 3 planks each had an “upper” end where the fine tuning mechanisms were found and near where the pickup coils were embedded and the strings were struck, and a “lower” end where the usual “nut” and “machine heads” would be found. I used short aluminium bars clamping each of 2 adjacent strings together in place of a nut, and zither pins in place of machine heads. The “upper” and “lower” ends of the planks were each fastened onto their own hefty piece of angle iron, which was then nestled into the triangular aluminium support frame. The result of this design was that the planks would not budge by even a tiny amount, once everything was put together. This was over-engineering on a grand scale, making it very heavy – but to my thinking at that time, this could not be helped.

The piano keys themselves also had to have good support underneath. As well as preventing sagging in the middle keys and any other potential key slippage, the “key bed” had to be as thin as possible, as I have long legs and have always struggled with having enough room for them under the keys of any normal piano. These 2 requirements – both thin and strong – led me to have some pieces of aluminium bar heat treated for extra strength. Lengths of this reinforced aluminium bar were then added “left to right”, just under the keys themselves, having already mounted the keys’ standard wooden supports – included in what came with the piano action – onto a thin sheet of aluminium that formed the basis of the key bed for the instrument. There was enough height between the keys and the bottom of these wooden supports to allow a reasonable thickness of aluminium to be used for these left-to-right bars. For strength in the other direction of the key bed – “front to back” – 4 steel bars were added, positioned so that, as I sat at the piano keyboard, they were underneath but still out of the way. Legs made of square steel tubing were then added to the correct height to take this key bed down to a “base” platform, onto which everything was mounted. Although this key bed ended up being quite heavy in its own right, with the legs added it was as solid as a rock, so the over-engineering did at least work in that respect.

If you have ever looked inside an upright piano, you might have noticed that the “action” mechanism usually has 2 or 3 large round nuts you can unscrew, after which it is possible to lift the whole mechanism up and out of the piano and away from the keys themselves. On this instrument, I used the same general approach to do the final “marrying up” – of piano keys and action, to the 3 planks of wood suspended vertically. The existing action layout already had “forks” that are used for this, so everything on the 3 planks was designed to allow room for hefty sized bolts fastened down tightly in just the right spots, in relation to where the forks would go when the action was presented up to the planks. The bottom of a normal upright piano action fits into “cups” on the key bed, and I also used these in my design. Once the planks and the key bed were fastened down to the aluminium frame and to the base during assembly, then in much the same way as on an upright piano, the action was simply “dropped down” into the cups, then bolted through the forks and onto, in this case, the 3 planks.

It’s usually possible to do fine adjustments to the height of these cups on an upright piano, and it’s worth noting that even a tiny change to this will make any piano action behave differently. This is why it was so important to have both very precise tolerances in the design of the upright electric guitar’s overall structure, together with as much strength and rigidity as possible for the frame and other parts.

With a normal upright piano action, when you press a given key on the piano keyboard, it moves the damper for that single note away from the strings, and the damper returns when you let go of that key. In addition to this, a typical upright piano action includes a mechanism for using a “sustain pedal” with the right foot, so that when you press the pedal, the dampers are pushed away from all the strings at the same time, and when you release the pedal, the dampers are returned back onto all the strings. The upright piano action bought for this instrument did include all this, and I especially wanted to take advantage of the various dampering and sustain possibilities. Early study, drawing and calculations of forces, fulcrums and so on, eventually enabled use of a standard piano sustain foot pedal – bought off the shelf from that same firm, Herrberger Brooks – together with a hefty spring, some square hollow aluminium tube for the horizontal part of the “foot to dampers transfer” function, and a wooden dowel for the vertical part of the transfer. Adjustment had to be made to the position of the fulcrum, as the first attempt led to the foot pedal needing too much force, which made it hard to operate without my leg quickly getting tired. This was eventually fixed, and then it worked perfectly.
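The fulcrum adjustment comes down to the simple lever law: the force needed at the pedal scales with the ratio of the arm lengths on either side of the fulcrum. A rough sketch with purely illustrative numbers (the real loads and arm lengths are not recorded here):

```python
def pedal_force(damper_load_n, damper_arm_m, pedal_arm_m):
    """Balance of moments about the fulcrum:
    force_at_pedal * pedal_arm = damper_load * damper_arm."""
    return damper_load_n * damper_arm_m / pedal_arm_m

# Invented figures: say lifting all the dampers against their springs takes ~60 N.
print(pedal_force(60, 0.05, 0.20))   # short pedal arm  -> 15 N at the foot
print(pedal_force(60, 0.05, 0.30))   # longer pedal arm -> 10 N, but more pedal travel
```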

At ground level I designed a simple “base” of aluminium sheeting, with “positioners” fastened down in just the right places so that the legs of the key bed, the triangular frame holding up the 3 planks, and the legs of the piano stool to sit on, always ended up in the correct places in relation to each other. This base was also where the right foot sustain pedal and its accompanying mechanism were mounted up. To make it more transportable, the base was done in 3 sections that could fairly easily be fastened together and disassembled.

After building – further tests and possible modifications

When all the design work was finished and all the parts were made and adjusted as needed, the instrument could finally be assembled and tried out. The first time I put it together, added the wiring leads, plugged it into the Marshall stack and tuned it all up, it was a real thrill to finally be able to sit and play it. But even with plenty of distortion on the amp, it didn’t really sound right – it was immediately obvious that there was too much high frequency in the tone. It had wonderful amounts of sustain, but the price being paid for this was that the sound was some distance away from what I was really after. In short, the instrument worked, but instead of sounding like a Gibson SG – or any other electric guitar for that matter – it sounded a bit sh***y.

When I had first started working on this project, my “ear” for what kind of guitar sound I wanted was in what I would describe as an “early stage of development”. Mock-up tests done during Approach 1, before 1990, had sounded kind of right at that time. But once I was able to sit and play the finished instrument, and to hear it as it was being played, with hindsight I realised that my “acceptable” evaluation of the original mock-up was more because, at that point, I had not yet learned to identify the specific tone qualities I was after. It was only later, as the work neared completion, that my “ear” for the sound I wanted became more fully developed, as I began to better understand how a solid body electric guitar behaves, what contributes to the tone qualities you hear from a specific instrument, and so on.

I began asking some of the other people who had been involved in the project for their views on why it didn’t sound right. Two things quickly emerged from this – it was too heavy, and the strings were being struck instead of plucked.

Kent Armstrong, who made the pickups for the upright electric guitar, told me a story about how he once did a simple experiment which, in relation to my instrument, demonstrated what happens if you take the “it’s too heavy” issue to the extreme. He told me about how he had once “made an electric guitar out of a brick wall”, by fastening an electric guitar string to the wall at both ends of the string, adding a pickup coil underneath, tuning the string up, sending the result into an amp, and then plucking the string. He said that this seemed to have “infinite sustain” – the sound just went on and on. His explanation for this was that because the brick wall had so much mass, it could not absorb any of the vibration from the string, and so all of its harmonics just stayed in the string itself.

Although this was a funny and quite ludicrous example, I like this kind of thing, and the lesson was not lost on me at the time. We discussed the principles further, and Kent told me that in his opinion, a solid body electric guitar needs somewhere around 10 to 13 pounds of wood mass in order for it to properly absorb the strings’ high harmonics in the way that gives you that recognisable tone quality we would then call “an electric guitar sound”. In essence, he was saying that the high frequencies have to “come out”, and it is then the “warmer” lower harmonics remaining in the strings that make an electric guitar sound the way it does. This fitted perfectly with my own experience of the tones I liked so much, in a guitar sound I would describe as “desirable”. Also, it did seem to explain why my instrument, which had a lot more “body mass” than 10 to 13 pounds – with its much larger wood planks, a great deal of extra hardware mounted onto them, and so on – did not sound like that.

As for striking rather than plucking the strings, I felt that more trials and study would be needed on this. I had opted to use hammers to strike the strings, partly as this is much simpler to design for – the modifications needed to the off-the-shelf upright piano action were much less complicated than those that would have been required for plucking. But there was now a concern that the physics of plucking and striking might be very different from each other, and if so, there might be no way of getting around this except to pluck the strings.

I decided that in order to work out what sorts of changes would best be made to the design of this instrument to make it sound better, among other things to do as a next step, I needed first-hand experience of the differences in tone quality between various sizes of guitar body. In short, I decided to make it my business to learn as much as I could about the physics of the solid body electric guitar, and if necessary, to learn more than perhaps anyone else out there might already know. I also prepared for the possibility that a mechanism to pluck the strings might be needed.

At that time, in the mid 1990s, there had been some excellent research carried out on the behaviour of acoustic guitars, most notably by a Dr Stephen Richardson at the University of Cardiff. I got in touch with him, and he kindly sent me details on some of this work. But he admitted that the physics of the acoustic guitar – where a resonating chamber of air inside the instrument plays a key part in the kinds of sounds and tones that the instrument can make – is fundamentally different to that of a solid body electric guitar.

I trawled about some more, but no one seemed to have really studied solid body guitar physics – or if they had, nothing had been published on it. Kent Armstrong’s father Dan appeared on the scene at one point, as I was looking into all this. Dan Armstrong was the inventor of the Perspex bass guitar in the 1960s. When he, Kent and I all sat down together to have a chat about my project, it seemed to me that Dan might in fact know more than anyone else in the world, about what is going on when the strings vibrate on a solid body guitar. It was very useful to hear what he had to say on this.

I came away from all these searches for more knowledge, with further determination to improve the sound of the upright electric guitar. I kept an eye out for a cheap Gibson SG, and as luck would have it, one appeared online for just £400.00 – for any guitar enthusiasts out there, you will know that even in the 1990s, that was dirt cheap. I suspected there might be something wrong with it, but decided to take a risk and buy it anyway. It turned out to have a relatively correct SG sound, and was cheap because it had been made in the mid 1970s, at a time when Gibson were using inferior quality wood for the bodies of this model. While it clearly did not sound as good as, say, a vintage SG, it was indeed a Gibson original rather than an SG copy, and it did have a “workable” SG sound that I could compare against.

I also had a friend with a great old Gibson SG Firebrand, one that sounded wonderful. He offered to let me borrow it for making comparative sound recordings and doing other tests. I was grateful for this, and I did eventually take him up on the offer.

One thing that I was keen to do at this stage, was to look at various ways to measure – and quantify – the differences in tone quality between either of these two Gibson SGs and the upright electric guitar. I was advised to go to the Department of Mechanical Engineering at the University of Bristol, who were very helpful. Over the Easter break of 1997, they arranged for me to bring in my friend’s SG Firebrand and one of my 3 planks – with its strings all attached and working – so that one of their professors, Brian Day, could conduct “frequency sweep” tests on them. Brian had been suffering from early onset of Parkinson’s disease and so had curtailed his normal university activities, but once he heard about this project, he was very keen to get involved. Frequency sweep tests are done by exposing the “subject” instrument to an artificially created sound whose frequency is gradually increased, while measuring the effect this has on the instrument’s behaviour. Brian and his colleagues carried out the tests while a friend and I assisted. Although the results did not quite have the sorts of quantifiable measurements I was looking for, they did begin to point me in the right direction.

After this testing, someone else recommended I get in touch with a Peter Dobbins, who at that time worked at British Aerospace in Bristol and had access to spectral analysis equipment at their labs, which he had sometimes used to study the physics of the hurdy gurdy, his own personal favourite musical instrument. Peter was also very helpful, and eventually he ran spectral analysis of cassette recordings made of plucking, with a plectrum, the SG Firebrand, the completed but “toppy-sounding” upright electric guitar, and a new mock-up I had just made at that point, one that was the same length as the 3 planks, but only around 4 inches wide. This new mock-up was an attempt to see whether using around 12 or 13 much narrower planks in place of the 3 wider ones, might give a sound that was closer to what I was after.

Mock-up of possible alternative to 3 planks – would 12 or 13 of these sound better instead? Shown on its own (with a long test coil), and mounted up to the keys and action setup so that plucking tests could make use of the dampers to stop strings moving between recordings of single notes

As it turned out, the new mock-up did not sound that much different to the completed upright electric guitar itself, when the same note was plucked on each of them. It was looking like there was indeed a “range” of solid guitar body mass / weight of wood that gave the right kind of tone, and that even though the exact reasons for the behaviour of “too much” or “too little” mass might be different to each other, any amount of wood mass / weight on either side of that range, just couldn’t absorb enough of the high harmonics out of the strings. Despite the disappointing result of the new mock-up sounding fairly similar to the completed instrument, I went ahead and gave Peter the cassette recordings of it, of the completed instrument, and of my friend’s SG Firebrand, and he stayed late one evening at work and ran the spectral analysis tests on all of these.

Peter’s spectral results were just the kind of thing I had been after. He produced 3D graphs that clearly showed the various harmonics being excited when a given string was plucked, how loud each one was, and how long they went on for. This was a pictorial, quantitative representation of the difference in tone quality between my friend’s borrowed SG Firebrand, and both the completed instrument and the new mock-up. The graphs gave proper “shape” and “measure” to these differences. By this time, my “ear” for the sort of tone quality I was looking for, was so highly developed that I could distinguish between these recordings immediately, when hearing any of them. And what I could hear, was reflected precisely on these 3D graphs.

Spectral analysis graphs in 3D, of Gibson SG Firebrand “open bottom E” note plucked, and the same note plucked on the upright electric guitar. Frequency in Hz is on the x axis and time on the y axis, with time starting at the “back” and moving to the “front” on the y axis. Harmonics are left-to-right on each graph – leftmost is the “fundamental”, then 1st harmonic etc. Note how many more higher harmonics are found on the right graph of the upright electric guitar, and how they persist for a long time. I pencilled in frequencies for these various harmonics on the graph on the right, while studying it to understand what was taking place on the string.

While this was all underway, I also mocked up a few different alternative types of hammers and carried out further sound tests to see what sort of a difference you would get in tone, from using different materials for these, but always still striking the string. Even though I was more or less decided on moving to a plucking mechanism, for completeness and full understanding, I wanted to see if any significant changes might show up from using different sorts of hammers. For these experiments, I tried some very lightweight versions in plastic, the usual felt upright piano hammers, and a couple of others that were much heavier, in wood. Not only was there almost no difference whatsoever between the tone quality that each of these widely varied types of hammers seemed to produce, it also made next to no difference where, along the string, you actually struck it.

Other hammer designs tried – there was little variation in the sound each of these produced

These experiments, and some further discussions with a guitar maker who had helped out on the project, brought more clarification to my understanding of hammers vs plucking. Plucking a string seems to make its lower harmonics get moving right away, and they then start out with more volume compared to that of the higher harmonics. The plucking motion will always do this, partly because there is so much energy being transferred by the plectrum or the player’s finger – and this naturally tends to drive the lower harmonics more effectively. When you hit a string with any sort of hammer though, the effect is more like creating a sharp “shock wave” on the string, but one with much less energy. This sets off the higher harmonics more, and the lower ones just don’t get going properly.

In a nutshell, all of this testing and research confirmed the limitations of hammers, and the fact that there are indeed fundamental differences between striking and plucking an electric guitar string. Hammers were definitely “out”.

To summarise the sound characteristic of the upright electric guitar: its heavy structure, and thereby the inability of its wood planks to absorb enough high frequencies out of the strings, made it naturally produce a tone with too many high harmonics and not enough low ones – and hitting its strings with hammers instead of plucking them “reinforced” this tonal behaviour even more, and in the same direction.

The end?

By this point in the work on the project, as 1998 arrived and we got into spring and summer of that year, I had got into some financial difficulties, partly because this inventing business is expensive. Despite having built a working version of the upright electric guitar, even aside from the fact that the instrument was very heavy and took some time to assemble and take apart – making it impractical for taking on tour, for example – the unacceptable sound quality alone meant that it was not usable. Mocked-up attempts to modify the design so that there would be many planks, each quite narrow, had not improved the potential of the sound to any appreciable degree, either.

I realised that I was probably reaching the end of what I could achieve on this project, off my own back financially. To fully confirm some of the test results, and my understanding of what it is that makes a solid body electric guitar sound the way it does, I decided to perform a fairly brutal final test. To this end, I first made recordings of plucking the 6 open strings on the cheap SG I had bought online for £400.00. Then I had the “wings” of this poor instrument neatly sawn off, leaving the same 4-inch width of its body remaining, as the new mock-up had. This remaining width of 4 inches was enough that the neck was unaffected by the surgery, which reduced the overall mass of wood left on the guitar, and its shape, down to something quite similar to that of the new mock-up.

I did not really want to carry out this horrible act, but I knew that it would fully confirm all the indications regarding the principles, behaviours and sounds I had observed in the 3 planks of the completed upright electric guitar, in the new mock-up, and in other, “proper” SG guitars that, to my ear, sounded right. If, by doing nothing else except taking these lumps of wood mass away from the sides of the cheap SG, its sound went from “fairly good” to “unacceptably toppy”, it could only be due to that change in wood mass.

After carrying out this crime against guitars by chopping the “wings” off, I repeated the recordings of plucking the 6 open strings. Comparison to the “before” recordings of it, confirmed my suspicions – exactly as I had feared and expected, the “after” sound had many more high frequencies in it. In effect I had “killed” the warmth of the instrument, just by taking off those wings.

In September 1998, with no more money to spend on this invention, and now clear that the completed instrument was a kind of “design dead end”, I made the difficult decision to pull the plug on the project. I took everything apart, recycled as many of the metal parts as I could (Reg Huddy was happy to have many of these), gave the wood planks to Jonny Kinkead for him to use to make a “proper” electric guitar with as he saw fit, and then went through reams of handwritten notes, sketches and drawings from 12 years of work, keeping some key notes and drawings which I still have today, but having a big bonfire one evening at my neighbour’s place, with all the rest.

Some “video 8” film of the instrument remained, and I recently decided to finally go through all of that, and all the notes and drawings kept, and make up a YouTube video from it. This is what Greatbear Analogue & Digital Media has assisted with. I am very pleased with the results, and am grateful to them. Here is a link to that video: https://youtu.be/pXIzCWyw8d4

As for the future of the upright electric guitar, in the 20 years since ceasing work on the project, I have had a couple of ideas for how it could be redesigned to sound better and, for some of those ideas, to also be more practical.

One of these new designs involves using similar narrow 4-inch planks as on the final mockup described above, but adding the missing wood mass back onto this as “wings” sticking out the back – where they would not be in the way of string plucking etc – positioning the wings at a 90-degree angle to the usual plane of the body. This would probably be big and heavy, but it would be likely to sound a lot closer to what I have always been after.

Another design avenue might be to use 3 or 4 normal SGs and add robotic plucking and fretting mechanisms, driven by electronic sensors hooked up to another typical upright piano action and set of keys, with some programmed software to make the fast decisions needed to work out which string and fret to use on which SG guitar for each note played on the keyboard, and so on. While this would not give the same level of intimacy between the player and the instrument itself as even the original upright electric guitar had, the tone of the instrument would definitely sound more or less right, allowing for loss of “player feeling” from how humans usually pluck the strings, hold down the frets, and so on. This approach would most likely be really expensive, as quite a lot of robotics would probably be needed.
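As a toy illustration of the kind of decision logic such software would need – not a design that was ever built – here is a sketch that assigns each incoming keyboard note to a free string and fret across several guitars in standard tuning, preferring low fret positions:

```python
# Assumptions: standard-tuned SGs with 22 frets, notes given as MIDI numbers.
OPEN_STRINGS = [40, 45, 50, 55, 59, 64]   # E2 A2 D3 G3 B3 E4
NUM_FRETS = 22

def allocate(note, guitars_in_use):
    """guitars_in_use: one set of busy string indices per guitar.
    Returns (guitar_index, string_index, fret), or None if nothing is free."""
    best = None
    for g, busy in enumerate(guitars_in_use):
        for s, open_note in enumerate(OPEN_STRINGS):
            fret = note - open_note
            if 0 <= fret <= NUM_FRETS and s not in busy:
                if best is None or fret < best[2]:
                    best = (g, s, fret)
    if best:
        guitars_in_use[best[0]].add(best[1])   # mark that string as sounding
    return best

guitars = [set(), set(), set()]       # three SGs, all strings free
print(allocate(64, guitars))          # E above middle C -> (0, 5, 0): open top E
print(allocate(64, guitars))          # the same note again -> taken on the next guitar
```

A real system would also have to handle note-off events, damping, string bends and voicing choices, which is where the robotics and timing would get expensive.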

An even more distant possibility in relation to the original upright electric guitar, might be to explore additive synthesis further, the technology that the firm Musicom Ltd – with whom I collaborated during Approach 2 in the early 1990s – continue to use even today, for their pipe organ sounds. I have a few ideas on how to go about such additive synthesis exploration, but will leave them out of this text here.

As for my own involvement, I would like nothing better than to work on this project again, in some form. But these days, there are the usual bills to pay, so unless there is a wealthy patron or perhaps a sponsoring firm out there who can afford to both pay me enough salary to keep my current financial commitments, and to also bankroll the research and development that would need to be undertaken to get this invention moving again, the current situation is that it’s very unlikely I can do it myself.

Although that seems a bit of a shame, I am at least completely satisfied that, in my younger days, I had a proper go at this. It was an unforgettable experience, to say the least!

Posted by greatbear in video tape, 8 comments

The difference ten years makes: changes in magnetic tape recording and storage media

Data storage catalogue, front cover, 2004

Generational change for digital technologies is rapid and disruptive. ‘In the digital context the next generation may only be five to ten years away!’ Tom Gollins from the National Archives reminds us, and this seems like a fairly conservative estimate.

It can feel like the rate of change is continually accelerating, with new products appearing all the time. It is claimed, for example, that the phenomenon of ‘wearable tech chic’ is now upon us, with the announcement this week that Google Glass is available to buy for £1,000.

The impact of digital technologies has been felt throughout society, and this issue will be explored in a large immersive exhibition of art, design, film, music and videogames held at the Barbican July-Sept 2014. It is boldly and emphatically titled: Digital Revolution.

To bring such technological transformations back into focus with our work at Greatbear, consider this 2004 brochure that recently re-surfaced in our Studio. As an example of the rapid rate of technological change, you need look no further.

A mere ten years ago, you could still choose between several brands of audio mini disc, ADAT, DAT, DTRS, Betacam SP, Digital Betacam, super VHS, VHS-C, 8mm and mini DV.

Data storage catalogue, back cover, 2004

Storage media such as Zip disks, Jaz cartridges, Exabyte tapes and hard drives that could store between 36GB and 500GB of data were also available to purchase.

RMGI are currently the only manufacturer of professional open reel audio tape. In the 2004 catalogue, different brands of open reel analogue tape are listed at a third of 2014 retail prices, taking into account rates of inflation.

While some of the products included in the catalogue, namely CDs, DVDs and open reel tape, have maintained a degree of market resiliency due to practicality, utility or novelty, many have been swept aside in the march of technological progress that is both endemic and epidemic in the 21st century.


Posted by debra in audio tape, video tape, 1 comment

Significant properties – technical challenges for digital preservation

A consistent focus of our blog is the technical and theoretical issues that emerge in the world of digital preservation. For example, we have explored the challenges archivists face when they have to appraise collections in order to select what materials are kept, and what are thrown away. Such complex questions take on specific dimensions within the world of digital preservation.

If you work in digital preservation then the term ‘significant properties’ will no doubt be familiar to you. The concept has been viewed as a hindrance due to being shrouded by foggy terminology, as well as a distinct impossibility because of the diversity of digital objects in the world which, like their analogue counterparts, cannot be universally generalised or reduced to a series of measurable characteristics.

Cleaning an open reel-to-reel tape

In a technical sense, establishing a set of core characteristics for file formats has been important for initiatives like Archivematica, ‘a free and open-source digital preservation system that is designed to maintain standards-based, long-term access to collections of digital objects.’ Archivematica implements ‘default format policies based on an analysis of the significant characteristics of file formats.’ These systems manage digital information using an ‘agile software development methodology’ which ‘is focused on rapid, iterative release cycles, each of which improves upon the system’s architecture, requirements, tools, documentation, and development resources.’

Such a philosophy may elicit groans of frustration from information managers who may well want to leave their digital collections alone, and practise a culture of non-intervention. Yet this adaptive style of project management, which is designed to respond rapidly to change, is often contrasted with predictive development that focuses on risk assessment and the planning of long-term projects. The argument against predictive methodologies is that, as a management model, they can be unwieldy and unresponsive to change. This can have damaging financial consequences, particularly when investing in expensive, risky and large scale digital preservation projects, as the BBC’s failed DMI initiative demonstrates.

Indeed, agile software development methodology may well be an important key to the sustainability of digital preservation systems, which need to find practical ways of manoeuvring through technological innovations and the culture of perpetual upgrade. Agility in this context is synonymous with resilience, and the practical application of significant properties as a means to align file format interoperability offers a welcome anchor for a technological environment structured by persistent change.

Significant properties vs the authentic digital object

What significant properties imply, as archival concept and practice, is that desiring authenticity for the digitised and born-digital objects we create is likely to end in frustration. Simply put, preserving all the information that makes up a digital object is a hugely complex affair, and is a procedure that will require numerous and context-specific technical infrastructures.

As Trevor Owens explains: ‘you can’t just “preserve it” because the essence of what matters about “it” is something that is contextually dependent on the way of being and seeing in the world that you have decided to privilege.’ Owens uses the example of the GeoCities web archiving project to demonstrate that if you don’t have the correct, let’s say ‘authentic’, tools to interpret a digital object (in this case, a website that is only discernible on certain browsers), you simply cannot see the information accurately. Part of the signal is always missing, even if something ‘significant’ remains (the text or parts of the graphics).

It may be desirable ‘to preserve all aspects of the platform in order to get at the historicity of the media practice’, Jonathan Sterne, author of MP3: The Meaning of a Format, suggests, but in a world that constantly displaces old technological knowledge with new, settling for the preservation of significant properties may be a pragmatic rather than ideal solution.

Analogue to digital issues

To bring these issues back to the tape we work with at Great Bear, there are of course times when it is important to use the appropriate hardware to play the tapes back, and there is a certain amount of historically specific technical knowledge required to make the machines work in the first place. We often wonder what will happen to the specialised knowledge learnt by media engineers in the 70s, 80s and 90s, who operated tape machines that are now obsolete. There is the risk that when those people die, the knowledge will die with them. Of course it is possible to get hold of operating manuals, but this is by no means a guarantee that the mechanical techniques will be understood within a historical context that is increasingly tape-less and software-based. By keeping our wide selection of audio and video tape machines purring, we are sustaining a machinic-industrial folk knowledge which ultimately helps to keep our customers’ magnetic tape-based media memories alive.

Of course a certain degree of historical accuracy is required in the transfers because, very obviously, you can’t play a V2000 tape on a VHS machine, no matter how hard you try!

Yet the need to play back tapes on exactly the same machine becomes less important in instances where the original tape was recorded on a domestic reel-to-reel recorder, such as the Grundig TK series, which may not have been of the greatest quality in the first place. To get the best digital transfer it is desirable to play back tapes on a machine with higher specifications that can read the magnetic information on the tape as fully as possible. This is because you don’t want to add any more errors in the transfer process by playing the tape back on a lower quality machine – errors which would then of course become part of the digitised signal.

It is actually very difficult to remove things like wow and flutter after a tape has been digitised, so it is far better to ensure machines are calibrated appropriately before the tape is migrated, even if the tape was not originally recorded on a machine with professional specifications. What is ultimately at stake in transferring analogue tape to digital formats is the quality of the signal. Absolute authenticity is incidental here, particularly if things sound bad.

The moral of this story, if there can be one, is that with any act of transmission, the recorded signal is liable to change. These can be slight alterations or huge drop-outs and everything in-between. The agile software developers know that given the technological conditions in which current knowledge is produced and preserved, transformation is inevitable and must be responded to. Perhaps it is realistic to assume this is the norm in society today, and creating digital preservation systems that are adaptive is key to the survival of information, as well as accepting that preserving the ‘full picture’ cannot always be guaranteed.

Posted by debra in audio / video heritage, audio tape, video tape, 1 comment

2″ Quad Video Tape Transfers – new service offered

We are pleased to announce that we are now able to support the transfer of 2″ Quadruplex Video Tape (PAL, SECAM & NTSC) to digital formats.

Quadruplex Scanning Diagram

2” Quad was a popular broadcast analogue video tape format whose halcyon period ran from the late 1950s to the 1970s. The first quad video tape recorder made by AMPEX in 1956 cost a modest $45,000 (that’s $386,993.38 in today’s money).

2” Quad revolutionized TV broadcasting which previously had been reliant on film-based formats, known in the industry as ‘kinescope‘ recordings. Kinescope film required significant amounts of skilled labour as well as time to develop, and within the USA, which has six different time zones, it was difficult to transport the film in a timely fashion to ensure broadcasts were aired on schedule.

To counter these problems, broadcasters sought to develop magnetic recording methods – which had proved so successful for audio – for use in the television industry.

The first experiments directly adapted the longitudinal recording method used to record analogue audio. This however was not successful, because video recordings require more bandwidth than audio. Recording a video signal with stationary tape heads (as they are in the longitudinal method) meant that the tape had to run at a very high speed in order to accommodate sufficient bandwidth to reproduce a good quality video image. A lot of tape was used!

Ampex, who at the time owned the trademark marketing name for ‘videotape’, then developed a method where the tape heads moved quickly across the tape, rather than the other way round. On the 2” quad machine, four magnetic record/reproduce heads are mounted on a headwheel spinning transversely (width-wise) across the tape, striking the tape at a 90° angle. The recording method was not without problems because, the Toshiba Science Museum write, it ‘combined the signal segments from these four heads into a single video image’ which meant that ‘some colour distortion arose from the characteristics of the individual heads, and joints were visible between signal segments.’
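A back-of-envelope calculation shows why moving the heads rather than speeding up the tape solved the bandwidth problem. Using rough, commonly quoted quad figures – a roughly 2-inch headwheel spinning at around 240 revolutions per second against a 15 inches-per-second tape transport – the effective head-to-tape writing speed comes out at around a hundred times the linear tape speed:

```python
import math

# Approximate 2" quad figures, for illustration only.
headwheel_diameter_in = 2.0     # headwheel roughly 2 inches across
rotations_per_sec = 240         # ~14,400 rpm (NTSC machines; PAL ran slightly faster)
linear_tape_speed_ips = 15      # longitudinal tape transport speed

writing_speed_ips = math.pi * headwheel_diameter_in * rotations_per_sec
print(round(writing_speed_ips), "inches per second head-to-tape")          # ~1500 ips
print(round(writing_speed_ips / linear_tape_speed_ips), "times the transport speed")
```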

Quad scanning

The limitations of Quadruplex recording influenced the development of the helical scan method, that was invented in Japan by Dr. Kenichi Sawazaki of the Mazda Research Laboratory, Toshiba, in 1954. Helical scanning records each segment of the signal as a diagonal stripe across the tape. ‘By forming a single diagonal, long track on two-inch-wide tape, it was possible to record a video signal on one tape using one head, with no joints’, resulting in a smoother signal. Helical scanning was later widely adopted as a recording method in broadcast and domestic markets due to its simplicity, flexibility, reliability and economical use of tape.

This brief history charting the development of 2″ Quad recording technologies reveals that efficiency and cost-effectiveness, alongside media quality, were key factors driving the innovation of video tape recording in the 1950s.


Posted by debra in video tape, 2 comments

Digital Optical Technology System – ‘A non-magnetic, 100 year, green solution for data storage.’

‘A non-magnetic, 100 year, green solution for data storage.’

This is the stuff of digital information managers’ dreams. No more worrying about active data management, file obsolescence or that escalating energy bill.

Imagine how simple life would be if there was a way to store digital information that could last, without intervention, for nearly 100 years. Those precious digital archives could be stored in a warehouse that was not climate controlled, because the storage medium was resilient enough to withstand irregular temperatures.

Imagine after 100 years an archivist enters that very same warehouse to retrieve information requested by a researcher. The archivist pulls a box off the shelf and places it on the table. In their bag they have a powerful magnifying glass, which they use to read the information. Having ascertained they have the correct item, they walk out of the warehouse, taking the box with them. Later that day, instructions provided as part of the product licensing over 100 years ago are used to construct a reader that will retrieve the data. The information is recovered and, having assessed the condition of the storage medium, which seems in pretty good nick, the digital optical technology storage is taken back to the warehouse, where it sits for another 10 years until it is subject to its life-cycle review.

Does this all sound too good to be true? For anyone exposed to the constantly changing world of digital preservation, the answer would almost certainly be yes. We have already covered on this blog numerous issues that the contemporary digital information manager may face. The lack of standardisation in technical practices and the bewildering array of theories about how to manage digital data mean there is currently no ‘one size fits all’ solution to tame the archive of born-digital and digitised content, which is estimated to swell to 3,000 exabytes (three million petabytes) by 2020*. We have also covered the growing concerns about the ecological impact of digital technologies, such as e-waste and energy over-consumption. With this in mind, the news that a technology already exists that can bypass many of these problems will seem like manna from heaven. What can this technology be, and why have you never heard of it?

The technology in question is called DOTS, which stands for Digital Optical Technology System. The technology is owned and being developed by Group 47, who ‘formed in 2008 in order to secure the patents, designs, and manufacturing processes for DOTS, a proven 100-year archival technology developed by the Eastman Kodak Company.’ DOTS is refreshingly different from every other data storage solution on the market because it ‘eliminates media and energy waste from forced migration, costly power requirements, and rigid environmental control demands’. What’s more, DOTS are ‘designed to be “plug & play compatible” with the existing Linear Tape Open (LTO) tape-based archiving systems & workflow’.

In comparison with other digital information management systems, which can rely on complex software, DOTS does not use sophisticated technology to store data. John Lafferty writes that at ‘the heart of DOTS technology is an extremely stable storage medium – metal alloy sputtered onto mylar tape – that undergoes a change in reflectivity when hit by a laser. The change is irreversible and doesn’t alter over time, making it a very simple yet reliable technology.’

DOTS can survive the benign neglect all data experiences over time, and can also withstand pretty extreme neglect. During research and development, for example, DOTS was subjected to a series of accelerated environmental ageing tests which concluded that ‘there was no discernible damage to the media after the equivalent of 95.7 years.’ But the testing did not stop there. Since acquiring patents for the technology, Group 47

‘has subjected samples of DOTS media to over 72 hours of immersion each in water, benzine, isopropyl alcohol, and Clorox (™) Toilet Bowl Cleaner. In each case, there was no detectable damage to the DOTS media. However, when subjected to the citric acid of Sprite carbonated beverage, the metal had visibly deteriorated within six hours.’

Robust indeed! DOTS is also non-magnetic, chemically inert, immune to electromagnetic fields and can be stored in normal office environments or in extremes ranging from -9°C to 65°C. It ticks all the boxes really.

DOTS vs the (digital preservation) world

The only discernible benefit of the ‘open all hours’, random access digital information culture over a storage solution such as DOTS is accessibility. While it certainly is amazing how quickly and easily valuable data can be retrieved at the click of a button, accessibility perhaps should not be the priority when we are planning how best to take care of the information we create and are custodians of. The key words here are valuable data. Emerging norms in digital preservation, which emphasise the need to always be responsive to technological change, take gambles with the very digital information they seek to preserve, because there is always a risk that migration will compromise the integrity of data.

The constant management of digital data is also costly, disruptive and time-consuming. In the realm of cultural heritage, where organisations are inevitably under-resourced, making sure your digital archives are working and accessible can sap energy and morale. These issues of course affect commercial organisations too. The truth is the world is facing an information epidemic, and surely we would all rest easier if we knew our archives were safe and secure. Indeed, it seems counter-intuitive that, amid the endless flashy devices and research expertise in the world today, we have yet to establish sustainable archival solutions for digital data.

Dictionary viewed through a lens

Of course, using a technology like DOTS need not mean we abandon the culture of access enabled by file-based digital technologies. It may however mean that the digital collections available on instant recall are more carefully curated. Ultimately we have to ask if privileging the instant access of information is preferable to long-term considerations that will safeguard cultural heritage and our planetary resources.

If such a consideration errs on the side of moderation and care, technology’s role in shaping that hazy zone of expectancy known as ‘the future’ needs to shift from the ‘bigger, faster, quicker, newer’ model, to a more cautious appreciation of the long-term. Such an outlook is built-in to the DOTS technology, demonstrating that to be ‘future proof’ a technology need not only withstand environmental challenges, such as flooding or extreme temperature change, but must also be ‘innovation proof’ by being immune to the development of new technologies. As John Lafferty writes, the license bought with the product ‘would also mandate full backward compatibility to Generation Zero, achievable since readers capable of reading greater data densities should have no trouble reading lower density information.’ DOTS also do not use propriety codecs, as Chris Castaneda reports, ‘the company’s plan is to license the DOTS technology to manufacturers, who would develop and sell it as a non-proprietary system.’ Nor do they require specialist machines to be read. With breathtaking simplicity, ‘data can be recovered with a light and a lens.’

It would be wrong to assume that Group 47’s development of DOTS is not driven by commercial interests – it clearly is. DOTS does, however, seem to solve many of the real problems that currently afflict the responsible, long-term management of digital information. It will be interesting to see if the technology is adopted, and by whom. Watch this space!

* According to a 2011 Enterprise Strategy Group Archive TCO Study

Posted by debra in audio tape, video tape, 0 comments

Digital Preservation – Establishing Standards and Challenges for 2014

2014 will no doubt present a year of new challenges for those involved in digital preservation. A key issue remains the sustainability of digitisation practices in a world yet to establish firm standards and guidelines. Creating lasting procedures capable of working across varied and international institutions would bring some much-needed stability to a profession often characterised by permanent change and innovation.

In 1969 the EIAJ-1 video tape standard was developed by the Electronic Industries Association of Japan. It was the first standardised format for industrial/non-broadcast video tape recording. Once implemented, it enabled video tapes to be played on machines made by different manufacturers, and it helped to make video use cheaper and more widespread, particularly in a domestic context.

Close-up of a tape machine’s ‘play’, ‘stop’ and ‘rewind’ buttons

The introduction of standards in the digitisation world would of course have very little impact on the widespread use of digital technologies, which are, in the West, largely ubiquitous. It would however make the business of digital preservation economically more efficient, simply because organisations would not be constantly adapting to change. Think of the costs involved in keeping up with rapid waves of technological transformation: updating equipment, migrating data, and ensuring file integrity and operability are maintained are all costly and time-consuming undertakings.

Although increasingly sophisticated digital forensic technology can help to manage some of these processes, highly trained (real life!) people will still be needed to oversee any large-scale preservation project. Within such a context, resource allocation will always have to account for these processes of adaptation. It has to be asked, then: could this money, time and energy be harnessed in other, more efficient ways? The costs of non-standardisation become ever more pressing when we consider the amount of digital data preserved by large institutions such as the British Library, whose digital collection is estimated to amass up to 5 petabytes (5,000 terabytes) by 2020. This is not a simple case of updating your iPhone to the next model, but an extremely complex and risky venture where the stakes are high. Do we really want to jeopardise rich forms of cultural heritage in the name of technological progress?

The US-based National Digital Stewardship Alliance (NDSA) National Agenda for Digital Stewardship 2014 echoes such a sentiment. They argue that ‘the need for integration, interoperability, portability, and related standards and protocols stands out as a theme across all of these areas of infrastructure development’ (3). The executive summary also stresses the negative impact rapid technological change can create, and the need to ‘coordinate to develop comprehensive coverage on critical standards bodies, and promote systematic community monitoring of technology changes relevant to digital preservation.’ (2)

File Format Action Plans

One step on the way to more secure standards is the establishment of File Format Action Plans, a practice which is increasingly recommended by US institutions. The idea behind developing a file format action plan is to create a directory of the file types that are in regular use by people in their day-to-day lives and by institutions. Getting it all down on paper can help us track what may be described as the implicit user-standards of digital culture. This is the basic idea behind Parsimonious Preservation, discussed on the blog last year: that through observing trends in file use we may come to the conclusion that the best preservation policy is to leave data well alone, since in practice files don’t seem to change that much, rather than risk the integrity of information via constant intervention.
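As a rough illustration of how such a survey might begin – a minimal sketch rather than any prescribed method, with a purely hypothetical directory path – a short script can walk a collection and tally the file extensions it finds, producing a first draft of the formats an action plan would need to cover:

from collections import Counter
from pathlib import Path

def survey_formats(root: str) -> Counter:
    """Tally file extensions under a directory tree: a first step
    towards drafting a file format action plan."""
    counts = Counter()
    for path in Path(root).rglob("*"):
        if path.is_file():
            counts[path.suffix.lower() or "<no extension>"] += 1
    return counts

# Hypothetical collection root; substitute the archive's own location.
for extension, total in survey_formats("/archives/born-digital").most_common():
    print(f"{extension}: {total} files")

Counting extensions is obviously a crude proxy for proper format identification – tools that inspect file signatures go further – but it is enough to show where an institution’s day-to-day usage actually lies.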

As Lee Nilsson, who is currently working as a National Digital Stewardship Resident at the US Library of Congress, writes, ‘specific file format action plans are not very common’, and when created they are often subject to constant revision. Nevertheless, he argues that devising action plans can ‘be more than just an “analysis of risk.” It could contain actionable information about software and formats which could be a major resource for the busy data manager.’

Other Preservation Challenges

Analogue to digital converter close-up

What are the other main challenges facing ‘digital stewards’ in 2014? In a world of exponential information growth, making decisions about what we keep and what we don’t becomes ever more pressing. When whole collections cannot be preserved, digital curators are increasingly called upon to select material deemed representative and relevant. How is it possible to know now what material needs to be preserved for posterity? What values inform our decision making?

To take an example from our work at Great Bear: we often receive tapes from artists who have achieved little or no commercial success in their lifetimes, but whose work is often of great quality and can tell us volumes about a particular community or musical style. How does such work stand up against commercially successful recordings? Which is more valuable: the music that millions of people bought and enjoyed, or the music that no one has ever heard?

Ultimately these questions will come to occupy a central concern for digital stewards of audio data, particularly with the explosion of born-digital music cultures, which have enabled communities of informal and often non-commercial music makers to proliferate. How is it possible to know in advance what material will be valuable for people 20, 50 or 100 years from now? These are very difficult, if not impossible, questions for large institutions to grapple with and take responsibility for, which is why, as members of a digital information management society, we need to empower ourselves with relevant information so we can make considered decisions about our own personal archives.

A final point to stress is that among the ‘areas of concern’ for digital preservation cited by the NDSA, moving image and recorded sound figure highly, alongside other born-digital content such as electronic records, web and social media. Magnetic tape collections remain high risk, and it is highly recommended that you migrate this content to a digital format as soon as possible. While digitisation certainly creates many problems, as detailed above, magnetic tape is also threatened by physical deterioration and its own obsolescence challenges, in particular finding working machines on which to play back the tape. The simple truth is that if you want to access the material in your tape collections, it now needs to be stored in a resilient digital format. We can help, and offer other advice relating to digital information management, so don’t hesitate to get in touch.

Posted by debra in audio tape, video tape, 0 comments

Big Data, Long Term Digital Information Management Strategies & the Future of (Cartridge) Tape

What is the most effective way to store and manage digital data in the long term? This is a question we have given considerable attention to on this blog. We have covered issues such as analogue obsolescence, digital sustainability and digital preservation policies. It seems that as a question it remains unanswered and up for serious debate.

We were inspired to write about this issue once again after reading an article published in the New Scientist a year ago called ‘Cassette tapes are the future of big data storage.’ The title is a little misleading, because the tape it refers to is not the domestic audio cassette that has recently acquired much counter-cultural kudos, but rather archival tape cartridges that can store up to 100 TB of data. How much?! I hear you cry. And why tape, given the ubiquity of digital technology these days? Aren’t we all supposed to be ‘going tapeless’?

The reason for such an invention, the New Scientist reveals, is the ‘Square Kilometre Array (SKA), the world’s largest radio telescope, whose thousands of antennas will be strewn across the southern hemisphere. Once it’s up and running in 2024, the SKA is expected to pump out 1 petabyte (1 million gigabytes) of compressed data per day.’


Image of the SKA dishes

Researchers at Fuji and IBM have already designed a tape that can store up to 35TB, and it is hoped that a 100TB tape will be developed to cope with the astronomical ‘annual archive growth [that] would swamp an experiment that is expected to last decades’. The 100TB cartridges will be made ‘by shrinking the width of the recording tracks and using more accurate systems for positioning the read-write heads used to access them.’
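To get a feel for the volumes involved, here is a rough back-of-envelope calculation based only on the figures quoted above – one petabyte of compressed data per day and a hoped-for 100TB cartridge. The numbers are illustrative, not a specification:

# Back-of-envelope figures quoted in the article: 1 PB/day, 100 TB per cartridge.
PB_PER_DAY = 1
TB_PER_PB = 1000
CARTRIDGE_TB = 100

daily_tb = PB_PER_DAY * TB_PER_PB              # 1,000 TB of compressed data per day
cartridges_per_day = daily_tb / CARTRIDGE_TB   # 10 cartridges per day
cartridges_per_year = cartridges_per_day * 365

print(f"{cartridges_per_day:.0f} cartridges per day")
print(f"{cartridges_per_year:.0f} cartridges per year (~{daily_tb * 365 / 1000:.0f} PB per year)")

Even with 100TB cartridges, that works out at roughly 3,650 cartridges every year for the SKA alone – some sense of why ‘annual archive growth’ is such a concern.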

If successful, this would certainly be an advanced achievement in material science and electronics. Narrower recording tracks mean less room for error in the read-write function – this will have to be incredibly precise on a tape storing such an extreme amount of information. Presumably narrower tracks will also mean there is no space for guard bands either. Guard bands are unrecorded areas between the stripes of recorded information that are designed to prevent information interference, or what is known as ‘cross-talk’. They were used on larger domestic video tapes such as U-Matic and VHS, but were dispensed with on smaller formats such as Hi-8, which packed a higher density of magnetic information into a small space and used video heads with tilted gaps instead of guard bands.

The existence of the SKA still doesn’t answer the pressing question: why develop new archival tape storage solutions rather than hard drive storage?

Hard drives were embraced quickly because they take up less physical storage space than tape. Gone are the dusty rooms bursting with reel upon reel of bulky tape; hello stacks of infinite quick-fire data, whirring and purring all day and night. Yet when we consider the amount of energy hard drive storage requires to remain operable, the costs – both economic and ecological – dramatically increase.

A report compiled by the Clipper Group, published in 2010, overwhelmingly argues for the benefits of tape over disk for the long-term archiving of data. They state that ‘disk is more than fifteen times more expensive than tape, based upon vendor-supplied list pricing, and uses 238 times more energy (costing more than the all-in costs for tape) for an archiving application of large binary files with a 45% annual growth rate, all over a 12-year period.’

This is probably quite staggering to read, given the amount of investment in establishing institutional architecture for tape-less digital preservation. Such an analysis of energy consumption does assume, however, that hard drives are turned on all the time, when surely many organisations transfer archives to hard drives and only check them once every 6-12 months.
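To put the report’s modelled 45% annual growth rate in perspective, here is a minimal sketch of the compounding involved. The 100TB starting size is arbitrary and purely for illustration; nothing here comes from the report itself beyond the growth rate and the 12-year period:

# Compound growth of an archive at the Clipper report's modelled 45% per year.
archive_tb = 100.0   # arbitrary starting size, purely for illustration
growth_rate = 0.45
years = 12

for year in range(1, years + 1):
    archive_tb *= 1 + growth_rate

print(f"After {years} years the archive is ~{archive_tb:.0f} TB "
      f"(roughly {archive_tb / 100:.0f} times its original size)")

An archive growing at that rate ends up roughly 86 times its original size after 12 years, which is one reason the energy cost of keeping all of it on spinning disk mounts so quickly.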

Yet given the pressures of technological obsolescence and the need to remain vigilant about file operability – coupled with the expectation that digital archives should be quickly accessible, unlike tape, which can only be played back linearly – such energy consumption does seem fairly inescapable for large institutions in an increasingly voracious, 24/7 information culture. Of course the issue of obsolescence will undoubtedly affect super-storage data tape cartridges as well. Technology does not stop innovating – it is not in the interests of the market to do so.

Perhaps more significantly, the archive world has not yet developed standards that address the needs of digital information managers. Henry Newman’s presentation at the Designing Storage Architectures 2013 conference explored the difficulty of digital data management, precisely due to the lack of established standards:

  • ‘There are some proprietary solutions available for archives that address end to end integrity;
  • There are some open standards, but none that address end to end integrity;
  • So, there are no open solutions that meet the needs of [the] archival community.’

He goes on to write that standards are ‘technically challenging’ and require ‘years of domain knowledge and detailed understanding of the technology’ to implement. Worryingly perhaps, he writes that ‘standards groups do not seem to be coordinating well from the lowest layers to the highest layers.’ By this we can conclude that the lack of streamlined conversation around the issue of digital standards means that effectively users and producers are not working in synchrony. This is making the issue of digital information management a challenging one, and will continue to be this way unless needs and interests are seen as mutual.

Other presentations at the most recent annual meeting of Designing Storage Architectures for Digital Collections, which took place on September 23-24, 2013 at the Library of Congress, Washington, DC, also suggest there are limits to innovation in the realm of hard drive storage. Gary Decad of IBM delivered a presentation on ‘The Impact of Areal Density and Millions of Square Inches of Produced Memory on Petabyte Shipments for TAPE, NAND Flash, and HDD Storage Class‘.

For the lay (wo)man this basically translates as the capacity to develop computer memory stored on hard drives. We are used to living in a consumer society where new, improved gadgets appear all the time. Devices are getting smaller, and we seem to be able to buy more storage space for less money. For example, it now costs under £100 to buy a 3TB hard drive, and it is becoming increasingly difficult to purchase hard drives with less than 500GB of storage space. Only a year ago, a 1TB hard drive was top of the range and would probably have cost you about £100.

A 100TB storage unit in 2010, compared with a smaller hard drive symbolising 2020.

Does my data look big in this?

Yet the presentation from Gary Decad suggests we are reaching a plateau with this kind of storage technology – infinite memory growth and ever-falling costs will soon no longer be feasible. The presentation states that ‘with decreasing rates of areal density increases for storage components and with component manufacturers’ reluctance to invest in new capacity, historical decreases in the cost of storage ($/GB) will not be sustained.’

Where does that leave us now? The resilience of tape as an archival solution, the energy implications of hard drive storage, the lack of established archival standards and a foreseeable end to cheap and easy big digital data storage are all indications of the complex and confusing terrain of information management in the 21st century. Perhaps the Clipper report offers the most grounded appraisal: ‘the best solution is really a blend of disk and tape, but – for most uses – we believe that the vast majority of archived data should reside on tape.’ Yet it seems that, until standards are established in line with the needs of digital information managers, this area will continue to generate troubling, if intriguing, conundrums.

Post published Nov 18, 2013

Posted by debra in audio tape, video tape, 0 comments

Repairing obsolete media – remembering how to fix things

A recent news report on the BBC website about recycling and repairing ‘old’ technology resonates strongly with the work of Greatbear.

The story focused on the work of the Restart Project, a charitable organisation encouraging positive behavioural change by empowering people to use their electronics for longer. Their website states:

the time has come to move beyond the culture of incessant electronics upgrades and defeatism in the face of technical problems. We are preparing the ground for a future economy of maintenance and repair by reskilling, supporting repair entrepreneurs, and helping people of all walks of life to be more resilient.

We are all familiar with the pressure to adopt new technologies and throw away the old, but what are the consequences of living in such a disposable culture? The BBC report describes how ‘in developed nations people have lost the will to fix broken gadgets. A combination of convenience and cultural pressure leads people to buy new rather than repair.’

These tendencies have been theorised by the French philosopher of technology Bernard Stiegler as the loss of knowledge of how to live (savoir-vivre). Here people not only lose basic skills (such as how to repair a broken electronic device), but also become increasingly reliant on the market to provide for them (for example, with the latest new product when the ‘old’ one no longer works).

A lot of the work of Greatbear revolves around repairing consumer electronics from bygone eras. Our desks are awash with soldering irons, hot air rework stations, circuit boards, capacitors, automatic wire strippers and a whole host of other tools.


We have bookshelves full of operating manuals. These help us navigate the machinery in the absence of a skilled engineer who has been trained to fix an MII, U-Matic or D3 tape machine.

As providers of a digitisation service we know that maintaining obsolete machines appropriate to the transfer is the only way we can access tape-based media. But the knowledge and skills of how to do so are rapidly disappearing – unless of course they are actively remembered through practice.

The Restart Project offers a community-orientated counterpoint to the erosion of skills and knowledge tacitly promoted by the current consumer culture. Promoting values of maintenance and repair opens up the possibility for sustainable, rather than throwaway, uses of technology.

Even if the Restart Project doesn’t catch on as widely as it deserves to, Greatbear will continue to collect, maintain and repair old equipment until the very last tape head on earth is worn down.

Posted by debra in audio tape, video tape, 1 comment

Archiving for the digital long term: information management and migration

As an archival process digitisation offers the promise of a dream: improved accessibility, preservation and storage.

However the digital age is not without its archival headaches. News of the BBC’s plans to abandon their Digital Media Initiative (DMI), which aimed to make the BBC media archive ‘tapeless’, clearly demonstrates this. As reported in The Guardian:

‘DMI has cost £98.4m, and was meant to bring £95.4m of benefits to the organisation by making all the corporation’s raw and edited video footage available to staff for re-editing and output. In 2007, when the project was conceived, making a single TV programme could require 70 individual video-handling processes; DMI was meant to halve that.’

The project’s failure has been explained by its size and ambition. Another telling reason was also cited: the software and hardware used to deliver the project had been developed for exclusive use by the BBC. In a statement, BBC Director General Tony Hall referred to the fast development of digital technology, stating that ‘off-the-shelf [editing] tools were now available that could do the same job “that simply didn’t exist five years ago”.’

G-Tech Pro hard drive RAID array

The fate of the DMI initiative should act as a sobering lesson for institutions, organisations and individuals who have not thought about digitisation as a long, rather than short term, archival solution.

As technology continues to ‘innovate’ at a startling rate, it is hard to predict how long the current archival standards for audio and audio-visual material will last.

Being an early adopter of technology can be an attractive proposition: you are up to date with the latest ideas, flying the flag for the cutting edge. Yet new technology becomes old fast, and this potentially creates problems for accessing and managing information. The fragility of digital data comes to the fore, and the risk of investing all our archival dreams in exclusive technological formats, as the BBC did, becomes far greater.

Mac OS X copy dialogue box

In order for our data to survive we need to appreciate that we are living in what media theorist Jussi Parikka calls an ‘information management society.’ Digitisation has made it patently clear that information is dynamic rather than stored safely in static objects. Migrating tape based archives to digital files is one stage in a series of transitions material can potentially make in its lifetime.

Given the evolution of media and technology in the 20th and 21st centuries, it feels safe to speculate that new technologies will emerge to supplant uncompressed WAV and AIFF files, just as AAC has now become preferred to MP3 as a compressed audio format because it achieves better sound quality at similar bit rates.

Because of this, at Greatbear we always migrate analogue and digital magnetic tape at the recommended archival standard, and provide customers with both high quality and access copies. Furthermore, we strongly recommend that customers back up archive-quality files in at least three separate locations, because it is highly likely the data will need to be migrated again in the future.
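One practical way to check that those separate copies stay intact over time is to record and compare checksums. The sketch below is a minimal illustration using Python’s standard hashlib module – the file paths are hypothetical, and this is not a description of our own workflow, just one way such a fixity check could look:

import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Return the SHA-256 checksum of a file, read in 1 MB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical locations of three copies of the same archive-quality transfer.
copies = [
    Path("/archive/local/interview_1987_master.wav"),
    Path("/archive/offsite/interview_1987_master.wav"),
    Path("/archive/cloud_mirror/interview_1987_master.wav"),
]

checksums = {path: sha256_of(path) for path in copies}
if len(set(checksums.values())) == 1:
    print("All copies match:", next(iter(checksums.values())))
else:
    for path, digest in checksums.items():
        print(path, digest)

Re-running a check like this after every migration, and storing the checksums alongside the files, gives an ongoing record that the data has not silently changed.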

Posted by debra in audio tape, video tape, 0 comments

The Grain of Video Tape

Videotek VTM waveform monitor display

From U-matic to VHS, Betacam to Blu Ray, Standard Definition to High Definition, the formats we use to watch visual media are constantly evolving.

Yet have you ever paused to consider what is at stake in the changing way audio-visual media is presented to us? Is viewing High Definition film and television always a better experience than previous formats? What is lost when the old form is supplanted by the new?

At Greatbear we have the pleasure of seeing the different textures, tones and aesthetics of tape-based Standard Definition video on a daily basis. The fuzzy grain of these videos contrasts starkly with the crisp, heightened colours of High Definition digital media we are increasingly used to seeing now on television, smartphones and tablets.

This is not, however, a romantic retreat to all things analogue in the face of an unstoppable digital revolution, although scratch the surface of culture and you will find many people who are making just such a retreat.

At Greatbear we always have one foot in the past, and one foot in the future. We act as a conduit between old and new media, ensuring that data stored on older media can continue to have a life in today’s digital intensive environments.

 

 

Posted by greatbear in video tape, 0 comments