iXBT Labs - Computer Hardware in Detail






Memory of the future: two directions

The problems ahead, which will demand enormous computing power, force us to look for new technical solutions today, not only in CPU design but in other PC subsystems as well. Whatever technology is used to produce the CPU, the amount of data delivered for processing is also determined by the capabilities of the other subsystems. The capacity of modern mass-storage devices reflects this tendency: CDs store up to 700 MBytes, and the developing DVD-ROM technology up to 17 GBytes. Magnetic recording is advancing quickly as well: over the last year the typical capacity of a hard disc in desktop computers has grown to 15-20 GBytes and higher. But future computers will have to process hundreds of gigabytes and even terabytes, far more than any current CD or hard disc can accommodate. Handling such data volumes and feeding them to ultrafast processors requires completely new approaches to storage devices.

Holographic memory

Broad possibilities here are offered by an optical recording technology known as holography: it combines high recording density with very fast data access. This is achieved because the holographic image (hologram) is encoded as one large data block that is recorded in a single access, and on reading the whole block is extracted from memory at once. Lasers are used to read and write the blocks stored holographically in a light-sensitive material (LiNbO3 is the usual base material). Theoretically, thousands of such digital pages, each holding up to a million bits, can fit into a device the size of a sugar cube, and the expected data density is 1 TByte per cubic centimeter (TByte/cm3). In practice the developers expect around 10 GBytes/cm3, which is still impressive compared with the current magnetic method, which yields about several MBytes/cm2 - and that without counting the mechanism itself. At such recording density an optical layer about 1 cm thick would hold around 1 TByte of data. And since such a system has no moving parts and pages are accessed in parallel, data transfer rates of 1 GByte/s and higher can be expected.
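To get a feel for the figures quoted above, here is a back-of-envelope capacity check in Python. The bits-per-page value is from the text; the page count is an assumption for illustration ("thousands of such digital pages").

```python
# Rough capacity arithmetic for a holographic store (illustrative only).
BITS_PER_PAGE = 1_000_000   # "up to a million bits" per digital page (from the text)
PAGES = 10_000              # assumed: "thousands of such digital pages"

total_bits = BITS_PER_PAGE * PAGES
total_gbytes = total_bits / 8 / 1e9
print(f"{total_gbytes:.2f} GBytes in a sugar-cube-sized volume")
```

With these assumed numbers a sugar-cube-sized device holds on the order of a gigabyte; the terabyte-scale figures in the text require correspondingly more pages per volume.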

The exceptional possibilities of holographic memory have attracted many scientists at universities and industrial research laboratories. This interest long ago crystallized into two research programs. The first is PRISM (Photorefractive Information Storage Material), aimed at finding suitable light-sensitive materials for storing holograms and investigating their storage properties. The second is HDSS (Holographic Data Storage System). Like PRISM, it includes fundamental research, and the same companies participate in it. But while PRISM searches for suitable media in which to store holograms, HDSS is targeted at developing the hardware needed for practical holographic storage systems.

How does a holographic memory system operate? To answer this, let us consider a device assembled by a task group at the Almaden Research Center.

At the first stage, a beam from a blue-green argon laser is split into two components - a reference beam and an object beam (the latter carries the data). The object beam is defocused so that it fully illuminates the SLM (Spatial Light Modulator), an LCD panel on which a data page is displayed as a matrix of light and dark pixels (binary data).

Both beams enter the light-sensitive crystal, where they interact. The result is an interference pattern that serves as the basis of a hologram and is recorded as a set of variations of the refractive index and reflection factor inside the crystal. To read the data, the crystal is illuminated with the reference beam, which interacts with the recorded interference pattern and reproduces the stored page as a checkerboard of light and dark pixels (the hologram converts the reference wave into a copy of the object wave). This image is then transferred to a detector matrix based on a CCD (Charge-Coupled Device). During reading the reference beam must fall at the same angle at which the recording was made; the deviation from this angle must not exceed 1 degree. This is what enables high data density: by varying the angle of the reference beam or its frequency, additional pages of data can be recorded in the same crystal.
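The angle selectivity described above can be sketched as a toy model: each page is recorded at its own reference-beam angle, and readout succeeds only if the read angle matches the recording angle to within the roughly 1-degree selectivity quoted in the text. The angles and the class itself are illustrative assumptions, not a real device API.

```python
ANGLE_TOLERANCE = 1.0  # degrees of angular selectivity (figure from the text)

class AngleMultiplexedStore:
    """Toy model of angle multiplexing in a holographic crystal."""
    def __init__(self):
        self.holograms = {}            # recording angle -> data page

    def record(self, angle, page):
        # superimpose another hologram at this reference-beam angle
        self.holograms[angle] = page

    def read(self, angle):
        # the reference beam reconstructs a page only near its recording angle
        for rec_angle, page in self.holograms.items():
            if abs(rec_angle - angle) < ANGLE_TOLERANCE:
                return page
        return None                    # beam misses every hologram

store = AngleMultiplexedStore()
store.record(30.0, "page A")
store.record(32.5, "page B")
print(store.read(30.4))  # within tolerance of the first hologram
print(store.read(31.4))  # between holograms: nothing reconstructed
```

The point of the sketch is that angular selectivity is what lets many pages coexist in one crystal volume without interfering on readout.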

However, each additional hologram changes the properties of the material, and these changes must not exceed a certain limit. Beyond it the hologram images become dim, which can corrupt data on reading. This is what limits the real memory capacity of a given material. The dynamic range of the medium determines how many pages can effectively be housed in it, which is why PRISM participants are investigating the light-sensitivity limits of candidate substances.

The procedure in 3-dimensional holography of packing several data pages into the same volume is called multiplexing. Traditionally, multiplexing by the angle of incidence of the reference beam, by wavelength, and by phase is used; unfortunately, these methods require complicated optical systems and thick (several mm) carriers, which makes them unfit for commercial use, at least in data processing. But recently Bell Labs invented three new multiplexing methods - shift, aperture, and correlation multiplexing - based on changing the position of the carrier relative to the light beams. Shift and aperture multiplexing use a spherical reference beam, while correlation multiplexing uses a beam of more complicated form. Besides, since correlation and shift multiplexing involve moving mechanical elements, the access time will be about the same as in ordinary optical discs. Using correlation multiplexing, Bell Labs managed to build an experimental carrier on the same LiNbO3, this time with 226 GBytes per square inch.

Another problem standing in the way of holographic memory devices is finding a suitable material. Most holography research has been carried out with photorefractive materials (mainly the above-mentioned LiNbO3), but they are poorly suited to data recording, especially commercially: they are expensive, weakly sensitive, and have a limited dynamic range (frequency bandwidth). This is why a new class of photopolymer materials with good commercial prospects has been developed. Photopolymers are substances in which light causes irreversible changes expressed as variations of composition and density. The resulting materials have a longer lifetime (in terms of data retention) and are resistant to temperature changes; besides, they have improved optical characteristics and are suitable for WORM (write once, read many) storage.

One more problem is the complexity of the optical system itself. The semiconductor laser diodes used in traditional optical devices are not suitable for holographic memory: they are not powerful enough, they emit a widely diverging beam, and it is difficult to make a semiconductor laser that radiates in the middle of the visible spectrum. What is needed is a laser as powerful as possible that produces the most precisely parallel beam. The same goes for the SLM: until recently there were no such devices that could be used in holographic memory systems. But times change: today inexpensive solid-state lasers are available, and MEMS (micro-electromechanical systems) technology has appeared. Devices built on it consist of arrays of micromirrors around 17 microns in size, which suit the role of the SLM very well.

Since the interference pattern fills the whole medium uniformly, holographic memory gains another useful property - high reliability of the recorded information. While a defect on the surface of a magnetic disc or tape destroys important data, a defect in a holographic medium does not cause a loss of information; it only dims the hologram. Small desktop HDSS devices are expected by 2003. Since the HDSS equipment uses an acousto-optical deflector (a crystal whose properties change as a sound wave passes through it) to change the angle of incidence, the access time for adjacent data pages will be about 10 ms. A traditional optical or magnetic memory device needs special means to access data on different tracks, and that access time amounts to several milliseconds.

Holographic memory is not a completely new technology: its basic concepts were developed about 30 years ago. The only thing that has changed is the availability of the key components - prices have fallen considerably. The semiconductor laser, for example, is now commonplace. The SLM is a product of the same technology used to make LCD screens for notebooks and calculators, and the CCD detector array comes straight from a digital video camera.

So the new technology has more than enough strong points: besides the fact that information is stored and retrieved in parallel, very high data rates can be reached, and in some cases a high random-access speed. And the main advantage is the near-total absence of the mechanical components typical of current storage devices. This means not only fast data access and a lower probability of failure, but also lower power consumption, since today the hard disc is one of the most power-hungry elements of a computer. However, there are problems with the alignment of the optical components, so early devices will probably be sensitive to external mechanical effects.

Molecular memory

Another approach to creating storage devices is the molecular method. A group of researchers at the W.M. Keck Center for Molecular Electronics, headed by Professor Robert R. Birge, long ago produced a prototype memory subsystem that uses molecules to store digital bits. These are molecules of a protein called bacteriorhodopsin. It is purple, absorbs light, and is present in the membrane of a microorganism called Halobacterium halobium. This bacterium lives in salt marshes where the temperature can reach +150 F. When the oxygen content of the environment is too low for the bacterium to obtain enough energy by respiration (oxidation), it uses the protein for photosynthesis.

Bacteriorhodopsin was chosen because its photocycle (the sequence of structural changes a molecule undergoes when it reacts with light) makes the molecule an ideal storage element: a switch that flips from one state to another, like a logic gate or flip-flop. According to Birge's research, the bR-state (logical "0") and the Q-state (logical "1") of the molecule can remain stable for many years. This property (which in particular accounts for the remarkable stability of the protein) was acquired through evolution in the struggle for survival in the severe conditions of salt marshes.

Birge estimates that data recorded in a bacteriorhodopsin storage device should "live" for around five years. Another important feature of bacteriorhodopsin is that the two states have different absorption spectra. This makes it easy to determine the current state of a molecule with a laser tuned to the appropriate frequency.
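The spectral readout idea can be sketched as follows. The two states are distinguished by probing at one state's absorption peak: absorption there means "0", transparency means "1". The peak wavelengths and spectral width below are illustrative assumptions, not measured values from the article.

```python
# Assumed absorption peaks for the two protein states (illustrative values).
BR_PEAK_NM = 570   # bR-state ("0") absorbs around here
Q_PEAK_NM = 380    # Q-state ("1") absorbs around here

def absorbs(state, probe_nm, width_nm=50):
    """True if a molecule in `state` absorbs light at `probe_nm`."""
    peak = BR_PEAK_NM if state == "bR" else Q_PEAK_NM
    return abs(probe_nm - peak) < width_nm

def read_bit(state):
    # Probe at the bR peak: absorption there means the molecule is in bR ("0");
    # a transparent (non-absorbing) molecule must be in the Q-state ("1").
    return 0 if absorbs(state, BR_PEAK_NM) else 1

print(read_bit("bR"), read_bit("Q"))  # 0 1
```

The key assumption is simply that the two spectra do not overlap at the probe wavelength, which is what makes a single tuned laser sufficient for readout.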

A prototype memory system was built in which data are stored in a 3-dimensional matrix. The matrix is a cuvette (a transparent vessel) filled with polyacrylamide gel in which the protein is embedded. The cuvette is oblong, 1x1x2 inches in size. The protein, in the bR-state, is fixed in space by polymerization of the gel. The cuvette is surrounded by a battery of lasers and a detector array based on a CID (Charge Injection Device); together they perform data writing and reading.

To write data, a yellow "paging" laser is first switched on to excite the molecules of the target page. An SLM, an LCD matrix that creates a mask in the path of the beam, produces an active (excited) plane in the material inside the cuvette. This excited plane is a page of data that can house a 4096x4096-bit array. Before the protein returns to its quiescent state (in which it can remain for a long time, retaining the information), a red writing laser, positioned at a right angle to the yellow one, lights up. A second SLM displays the binary data and thus creates the corresponding mask in the path of the beam, so only certain spots (pixels) of the page are irradiated. The molecules at these spots convert to the Q-state and represent binary ones. The rest of the page returns to the initial bR-state and represents binary zeros.

To read the data, the paging laser again excites the page to be read, so that binary ones and zeros can then be distinguished by the difference in absorption spectra. 2 ms later the page is immersed in a low-intensity beam of the red laser; low intensity is necessary to keep the molecules from jumping into the Q-state. The molecules representing binary zeros absorb the red light, while those representing binary ones let the beam pass. This creates a checkerboard picture of light and dark spots on the detector matrix, which captures a page of digital information.
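The paged write/read procedure above can be reduced to a minimal simulation: a list of 2-D bit pages stands in for the protein volume, the paging laser selects a page, and the write mask (SLM pattern) flips the irradiated pixels into the Q-state ("1") while the rest stays in bR ("0"). The page size is shrunk from the 4096x4096 described in the text, and all names here are illustrative.

```python
PAGE_SIZE = 8   # reduced from 4096x4096 for illustration
N_PAGES = 4

# The cuvette: every pixel starts in the bR-state, i.e. binary zero.
cuvette = [[[0] * PAGE_SIZE for _ in range(PAGE_SIZE)] for _ in range(N_PAGES)]

def write_page(index, mask):
    """mask[r][c] == 1 means that pixel is irradiated into the Q-state."""
    cuvette[index] = [row[:] for row in mask]

def read_page(index):
    """Low-intensity red readout: bR pixels absorb (dark), Q pixels pass (light)."""
    return [row[:] for row in cuvette[index]]

# Write a checkerboard test pattern to page 2 and read it back.
mask = [[1 if (r + c) % 2 == 0 else 0 for c in range(PAGE_SIZE)]
        for r in range(PAGE_SIZE)]
write_page(2, mask)
assert read_page(2) == mask
```

The sketch captures only the addressing logic (select a plane, apply a mask, read the plane back), not the photochemistry that makes it possible.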

To erase information, a short pulse of blue light is enough to convert the molecules from the Q-state back into the bR-state. The light need not come from a laser: the whole cuvette can be erased with an ordinary ultraviolet lamp. To preserve data integrity when erasing individual pages, several adjacent pages are cached. Two additional parity bits are used in read/write operations to guard against errors. A data page can be read without corruption up to 5000 times; each page is tracked by a counter, and after 1024 reads the page is refreshed (regenerated) by a new write operation.
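The refresh bookkeeping described above can be sketched directly: each page carries a read counter, and after 1024 reads the page is rewritten. The 1024-read threshold is the article's figure; the class and its structure are an illustrative assumption.

```python
REFRESH_THRESHOLD = 1024   # reads between refreshes (figure from the text)

class ProteinPage:
    """One data page with the read-counter refresh scheme described above."""
    def __init__(self, data):
        self.data = data
        self.reads_since_refresh = 0
        self.refresh_count = 0

    def read(self):
        self.reads_since_refresh += 1
        if self.reads_since_refresh >= REFRESH_THRESHOLD:
            self.refresh()
        return self.data

    def refresh(self):
        # regenerate the page with a fresh write operation
        self.reads_since_refresh = 0
        self.refresh_count += 1

page = ProteinPage([1, 0, 1, 1])
for _ in range(2048):
    page.read()
print(page.refresh_count)  # refreshed twice over 2048 reads
```

This keeps each page well inside its ~5000-read corruption limit at all times.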

Since a molecule changes state within 1 ms, the total time for a read or write operation is around 10 ms. But, like the holographic memory system, this device accesses data in parallel during the read/write cycle, which allows counting on speeds of up to 10 Mbps. It is assumed that by combining eight bit cells into a byte with parallel access one could reach 80 Mbps, but that requires a corresponding circuit design of the memory subsystem. Some versions of the SLM devices implement page addressing, in which cheap designs steer the beam to the required page with a rotating system of galvanometer mirrors. Such an SLM provides 1 ms access but costs four times as much.
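The throughput estimate above, spelled out as arithmetic (both figures are the article's estimates, not measurements):

```python
# Parallel page access sustains ~10 Mbps per bit cell (article's estimate);
# ganging 8 bit cells into a byte-wide path multiplies that by 8.
per_cell_mbps = 10
cells_per_byte = 8
combined_mbps = per_cell_mbps * cells_per_byte
print(combined_mbps)  # 80
```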

Birge states that the system he proposes approaches semiconductor memory in operating speed until a page defect is encountered. When a defect is detected, the beam must be redirected to access those pages from the other side. Theoretically, the cuvette can accommodate 1 TByte of data; the capacity limit is mainly tied to problems with the lens system and the quality of the protein.

Will molecular memory be able to compete with traditional semiconductor memory? Its design undoubtedly has advantages. First, it is based on a protein that is produced in large volumes and at low cost. Second, the system can operate over a wider temperature range than semiconductor memory. Third, data are stored permanently: switching off the power causes no data loss. And finally, cuvettes that are quite small yet contain gigabytes of data can be placed in an archive for storing copies (like magnetic tapes); since they have no moving parts, this is more convenient than using portable hard discs or cartridges with magnetic tape.

Materials from the BYTE journal served as the major source for this article.

Copyright © Byrds Research & Publishing, Ltd., 1997–2011. All rights reserved.