IBM scientists today announced that they have created a small, low-cost chipset that could allow wireless electronic devices to transmit and receive ten times faster than today's advanced WiFi networks.
Using the IBM-pioneered chip-making technology called silicon germanium, the chipset can send and receive information in a portion of the radio spectrum that is both unlicensed and capable of carrying a much higher volume of data, a key advantage as data-intensive digital media formats, such as HDTV, become more pervasive.
Scientists refer to the portion of the radio spectrum from roughly 30 to 300 GHz as "millimeter wave frequency bands," since the wavelength of a signal in these bands is measured in millimeters. Electronics makers have long looked for ways to exploit this portion of the radio spectrum, recognizing its potential for carrying vast amounts of information. However, previous chip designs targeting this spectrum have been too large, too expensive, and too difficult to integrate with the rest of their products. Using them often required purchasing multiple separate components and access to specialized skills, a time-consuming, expensive process with very low yield.
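The band's name follows directly from the relationship λ = c/f. A quick check of the band edges (a simple free-space calculation, not a figure from the announcement):

```python
# Wavelength of an electromagnetic wave: lambda = c / f.
# Confirms that 30-300 GHz signals have millimeter-scale wavelengths.

C = 299_792_458  # speed of light in a vacuum, m/s

def wavelength_mm(freq_hz: float) -> float:
    """Return the free-space wavelength in millimeters."""
    return C / freq_hz * 1000

print(f"30 GHz  -> {wavelength_mm(30e9):.1f} mm")   # ~10 mm
print(f"60 GHz  -> {wavelength_mm(60e9):.1f} mm")   # ~5 mm
print(f"300 GHz -> {wavelength_mm(300e9):.1f} mm")  # ~1 mm
```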
IBM's novel design and use of silicon germanium technology permits a high level of integration in the chips themselves. The embedding of the antennas directly within the chipset package helps further reduce system cost since fewer components are needed. As an example, a prototype chipset module, including the receiver, the transmitter, and two antennas, would occupy the area of a dime. By integrating the chipset and antennas in commercial IC packages, companies can use existing skills and infrastructure to build this technology into their commercial products.
Some applications that might now be possible using this 60 GHz technology include wireless personal-area networks (PANs) for intra-office communications at ranges of 10 m and below. PANs are designed to support wireless gigabit Ethernet, wireless displays, wireless docking stations, synchronization of PDAs with desktops and laptops, and wireless downloading of pictures from a camera. Similarly, the technology could enable wireless broadband video distribution, in which a 60-GHz link streams an uncompressed high-definition video signal from, for example, a DVD player to a plasma display mounted on the wall.
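A back-of-the-envelope estimate shows why uncompressed HD video needs a multi-gigabit link (the resolution, frame rate, and color depth below are illustrative assumptions, not figures from IBM):

```python
# Rough bit-rate estimate for uncompressed HD video. All parameters
# here are assumed for illustration; the announcement gives none.

width, height = 1920, 1080   # assumed full-HD frame
fps = 60                     # assumed frame rate
bits_per_pixel = 24          # assumed 8-bit RGB

bits_per_second = width * height * bits_per_pixel * fps
print(f"{bits_per_second / 1e9:.2f} Gb/s")  # ~3 Gb/s
```

That is far beyond the tens of megabits per second that 802.11a/g WiFi delivered at the time, which is precisely the gap millimeter-wave links aim to close.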
Besides the 16-megabit MRAM announced together with NEC, Toshiba today announced a newly developed FeRAM — Ferroelectric Random Access Memory. The new chip takes FeRAM storage to the 64-megabit level and pushes read and write speed to 200-megabytes a second.
Fabricated with 130-nanometer CMOS process technology, the 64-megabit FeRAM is based on Toshiba's chainFeRAM architecture, which significantly reduces memory cell size. It also integrates optimized circuitry designed to reduce circuit area and squelch noise during read operations, plus ECC, a high-speed error checking and correcting circuit that assures data reliability during high-speed operation, even in severe operating conditions.
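The announcement does not describe Toshiba's ECC scheme, but the core idea behind an error checking and correcting circuit can be sketched with a classic Hamming(7,4) code, which corrects any single flipped bit in a stored word:

```python
# Minimal single-error-correcting Hamming(7,4) code, as a sketch of the
# idea behind an ECC circuit. The chip's actual scheme is not disclosed.

def encode(d):
    """d: 4 data bits -> 7-bit codeword [p1, p2, d0, p3, d1, d2, d3]."""
    d0, d1, d2, d3 = d
    p1 = d0 ^ d1 ^ d3
    p2 = d0 ^ d2 ^ d3
    p3 = d1 ^ d2 ^ d3
    return [p1, p2, d0, p3, d1, d2, d3]

def decode(c):
    """Correct at most one flipped bit, then return the 4 data bits."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3  # 1-based position of the bad bit
    if syndrome:
        c = c[:]
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

word = [1, 0, 1, 1]
cw = encode(word)
cw[4] ^= 1                 # simulate a single-bit error in storage
assert decode(cw) == word  # the error is located and corrected
```

A real memory ECC block does the same syndrome computation in combinational logic across much wider words, so correction adds little latency to a read.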
The key to the performance boost is the adoption of burst mode for high-speed data transfers, which pushes read and write speeds to 200 megabytes per second.
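The announcement gives no timing details of the burst interface, but the general benefit of burst mode can be illustrated with a toy model in which the address setup cost is paid once per burst rather than once per word (all timing numbers below are invented for illustration):

```python
# Toy model of burst-mode transfers: amortize the per-access address
# setup over many sequential data words. Timings are made up.

ADDRESS_SETUP_NS = 30   # assumed cost to present a new address
WORD_TRANSFER_NS = 5    # assumed cost to move one data word

def transfer_time_ns(words: int, burst_len: int) -> int:
    """Total time to read `words` words using bursts of `burst_len`."""
    bursts = -(-words // burst_len)  # ceiling division
    return bursts * ADDRESS_SETUP_NS + words * WORD_TRANSFER_NS

print(transfer_time_ns(64, 1))   # one address per word: 2240 ns
print(transfer_time_ns(64, 8))   # 8-word bursts:         560 ns
```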
FeRAM combines the fast operating characteristics of DRAM and SRAM with flash memory's ability to retain data while powered off, a combination that continues to attract semiconductor industry attention. Toshiba will continue its FeRAM R&D, aiming for eventual use in a wide range of applications, including high-performance mobile digital equipment and computers.
Toshiba and NEC today announced that they have developed a magnetoresistive random access memory (MRAM) that combines the highest density with the fastest read and write speeds yet achieved. The new MRAM achieves a 16-megabit density and a read and write speed of 200 megabytes per second, and operates at a low voltage of 1.8 V.
A major challenge in MRAM development to date has been accelerating read speeds: the current drive circuit used to generate the magnetic field for writing degrades read operations from the memory cells. The new MRAM has an improved circuit design that separates the current paths for reading and writing, yielding a faster read speed. It also reduces the equivalent wiring resistance by approximately 38% by forking the write current. Together, these innovations achieve a read and write speed of 200 megabytes per second and a cycle time of 34 nanoseconds, all at a low operating voltage of only 1.8 V.
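As a rough consistency check of those figures: assuming a 64-bit (8-byte) data interface, which the announcement does not specify, one transfer per 34 ns cycle gives a peak rate in the neighborhood of the quoted 200 MB/s:

```python
# Sanity check of the quoted figures. The 64-bit interface width is an
# assumption made here; only the cycle time and MB/s are from the source.

cycle_s = 34e-9        # cycle time from the announcement
bytes_per_cycle = 8    # assumed 64-bit data interface

peak_mb_s = bytes_per_cycle / cycle_s / 1e6
print(f"{peak_mb_s:.0f} MB/s peak")  # ~235 MB/s, in line with ~200 MB/s sustained
```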
Alongside its performance advances, the new MRAM also shrinks chip size. By applying the technologies described above and optimizing the overall circuit design, Toshiba and NEC achieved a chip that, at 78.7 mm², is approximately 30% smaller than an equivalent design without the new circuitry, making it the world's smallest 16-megabit MRAM.
MRAM is expected to become a leading next-generation non-volatile memory: it retains data when powered off while achieving fast random access and effectively unlimited write endurance.
Micron Technology today introduced a new, one-quarter inch, 2-megapixel image sensor for mainstream camera phones. According to the press release, Micron is the first company to incorporate Mobile Industry Processor Interface (MIPI) standards into an image sensor, allowing cameras to be more easily designed into mobile phones.
The new 2-megapixel sensor (product number MT9D112) complements Micron's existing CMOS image sensor portfolio for mobile handsets, which spans resolutions from VGA to 5 megapixels. As with all Micron image sensors, the MT9D112 is powered by Micron's proprietary DigitalClarity technology.
Key highlights of the MT9D112 include:
General customer sampling is planned for April 2006, with mass production expected in August 2006.
Source: Micron Technology