By Tom Ryan, CEO, The Fanfare Group
Constant product innovation has contributed to a thriving equipment-manufacturing industry over the last two decades. The enabler for innovation has been a steady stream of new tools and technologies that help software developers work faster and smarter. Developers have been empowered to be more productive, but QA groups have not been given the tools they need to keep up. No matter how quickly features are developed, products cannot be released before they are tested. As a result, testing is quickly becoming a barrier to innovation, threatening the pace of feature advancements.
By Joanne De Peralta, BiTMICRO
The release of Apple’s MacBook Air, Lenovo ThinkPad, and Dell Latitude notebooks — all equipped with solid state disks — has created a strong buzz in the computing industry. Debates over whether the era of SSDs has finally arrived have stimulated activity in quite a number of forums. The advantages of solid state disks over conventional hard drives have been highlighted with the stream of products being introduced to both consumer and enterprise markets.
By Ollie Smith, Telegesis
ZigBee has been a bit of a buzzword among engineers looking for wireless solutions over the past eighteen months or more. For those who have yet to encounter it, ZigBee's more accurate, if somewhat cumbersome, description is low-power mesh-networking radio technology. But despite being a prominent topic of conversation among engineers looking for a robust wireless solution across a wide cross section of markets and applications, there has been a perception that until now ZigBee has been “unfinished” and not yet ready for use in real-world environments.
Avionic, railway, and high-end automotive systems have become too complex to develop and coordinate without the assistance of a design environment that connects all of the developers through their participation in the execution of the engineering process. The more efficient solution in this case is to use a model-based design tool, and the adoption of model-based design brings several benefits.
Disk Imaging Recovery Software with Copy-On-Write Snapshot Technology Lessens the Burden of “Digital Disasters”
By Don Lewis, FarStone Technology
When mishaps, malfunctions, failures, and crashes take down a computer system, the potential for lost data and increased calls for technical support assistance can cause a “digital disaster” that affects computer system builders and end users alike. For the system builder, warranty-period obligations can lead to a high volume of tech support calls. For end users, PC configurations are always changing, as users constantly create and save new files and install or update software. Any changes that were not previously backed up are at risk of loss without a good backup and recovery system in place.
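The copy-on-write snapshots named in the title work by preserving a block's original contents the first time that block is overwritten after a snapshot is taken, so a full pre-snapshot state can be restored without copying the whole disk up front. A minimal sketch of the idea (a hypothetical illustration, not FarStone's implementation):

```python
# Minimal copy-on-write snapshot sketch (illustrative only).
# A snapshot starts empty; each block's original contents are saved
# lazily, just before the block's first overwrite.

class CowDisk:
    def __init__(self, num_blocks):
        self.blocks = [b"\x00"] * num_blocks  # live block contents
        self.snapshot = None                  # block index -> original bytes

    def take_snapshot(self):
        self.snapshot = {}  # empty: nothing copied yet

    def write(self, index, data):
        # Copy-on-write: preserve the original block once, before the
        # first overwrite since the snapshot was taken.
        if self.snapshot is not None and index not in self.snapshot:
            self.snapshot[index] = self.blocks[index]
        self.blocks[index] = data

    def restore(self):
        # Roll every modified block back to its snapshotted contents.
        for index, original in self.snapshot.items():
            self.blocks[index] = original
        self.snapshot = None


disk = CowDisk(4)
disk.write(0, b"boot")
disk.take_snapshot()
disk.write(0, b"oops")      # original b"boot" is saved first
disk.restore()
print(disk.blocks[0])       # b'boot'
```

Because only overwritten blocks are copied, taking the snapshot itself is nearly instantaneous regardless of disk size.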
By Masaki Gondo, Director of Engineering, eSOL Co., Ltd.
ARM11 MPCore is an excellent multicore processor, appropriate for both SMP and AMP. Many embedded system designs already use the AMP model, either on chip or off chip, so unforeseen design challenges are limited for software developers using the AMP model with MPCore. On the other hand, increasingly demanding applications require more and more CPU power, and the utilization of SMP is soon to become a necessity. However, SMP brings new design issues along with its advantages. The first half of this article discusses those issues and the pros and cons of AMP versus SMP, including throughput, concurrency, real-time determinism, reuse of existing software, programming model, and debug/analysis. The latter half introduces eSOL’s eT-Kernel Multi-Core Edition RTOS and eBinder tools, and shows how the issues can be brought under control with their unique blending technology.
Developers of applications with sophisticated graphics (for example, medical and industrial imaging, gaming and entertainment machines, POS/POI terminals, commercial outdoor broadcasting, public facilities, and high-end residential gateways) have long faced a dilemma: powerful graphics cards with long-term availability for embedded designs simply do not exist. Standard cards from Asus or MSI are often discontinued after just a few months, the typical lifecycle for standard computer boards intended for the consumer market. And that is just the beginning of the problems associated with using consumer graphics cards. OEMs that rely on products from the mass market incur significant expenditure over a product’s lifecycle: frequent driver updates, extremely high energy use (sometimes up to 150 W), and in some cases limited MTBF due to fan failures. Additionally, the proportions of consumer cards and their cooling designs often conflict with the embedded principles of compact dimensions, simplified cooling, and standardized form factors.
Aside from flash memory prices, the write-endurance limitation of flash memory is probably the chief remaining impediment to the widespread application of non-volatile solid-state storage in the enterprise. Flash SSD critics have long harped on this apparent “weakness,” which remains a thorn in the side of SSD manufacturers despite the advanced error-correcting codes and wear-leveling techniques utilized in their products.
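The wear-leveling techniques mentioned above can, in their simplest form, be sketched as steering each write to the least-worn physical block so that no single block exhausts its erase budget early. The following is a deliberately simplified illustration; real SSD controllers use far more elaborate schemes involving logical-to-physical mapping and hot/cold data separation:

```python
# Simplistic wear-leveling sketch (illustrative only): each logical
# write goes to the physical block with the fewest erase cycles,
# spreading wear evenly instead of burning out one block.

def pick_block(erase_counts):
    # Choose the least-worn physical block for the next write.
    return min(range(len(erase_counts)), key=lambda i: erase_counts[i])

erase_counts = [0, 0, 0, 0]
for _ in range(100):                  # 100 logical writes
    block = pick_block(erase_counts)
    erase_counts[block] += 1          # erase-before-write wears the block

print(erase_counts)  # [25, 25, 25, 25] -- wear spread evenly
```

Without leveling, 100 writes to one hot address would put all 100 erase cycles on a single block; with it, each block absorbs only 25.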
By Leilani Junghan, BiTMICRO Networks
The launch of Windows Vista, Microsoft’s latest OS, in the last quarter of this year may change the computing landscape. One of its intriguing features, called ReadyDrive, requires the use of hybrid drives. Strictly speaking, this technology centaur is half hard disk drive (HDD) and half solid state disk (SSD). Aiming to exploit the best of both worlds, a hybrid drive consists of a rotating magnetic platter for storage and a non-volatile flash memory chip for caching.
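The hybrid-drive idea described above can be sketched in a few lines: reads are served from the small flash cache when possible, and on a miss the rotating platter is read and the result cached, evicting the least-recently-used entry when the flash is full. This is a hypothetical illustration of the caching concept, not any vendor's firmware:

```python
# Hybrid-drive read-cache sketch (illustrative only): the flash half
# acts as an LRU cache in front of the slower rotating platter.

from collections import OrderedDict

class HybridDrive:
    def __init__(self, platter, cache_size):
        self.platter = platter            # sector -> data (the HDD half)
        self.cache = OrderedDict()        # the flash half, LRU-ordered
        self.cache_size = cache_size
        self.platter_reads = 0            # count of slow mechanical reads

    def read(self, sector):
        if sector in self.cache:          # flash hit: no seek needed
            self.cache.move_to_end(sector)
            return self.cache[sector]
        self.platter_reads += 1           # flash miss: go to the platter
        data = self.platter[sector]
        self.cache[sector] = data
        if len(self.cache) > self.cache_size:
            self.cache.popitem(last=False)  # evict least recently used
        return data


drive = HybridDrive({0: b"a", 1: b"b", 2: b"c"}, cache_size=2)
drive.read(0); drive.read(0); drive.read(1); drive.read(0)
print(drive.platter_reads)  # 2 -- repeated reads hit the flash cache
```

The payoff in a real hybrid drive is that flash hits avoid spinning up the platter at all, saving both latency and battery power.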
Vista’s endorsement is exciting, but it’s human nature not to trust unfamiliar disk drives. The tricky part is getting users acquainted with SSDs well enough to care about hybrid drives. The HDD half of the pair presents no problem: almost everyone is so familiar with HDDs (their form factors, RPM, and storage capacities) that they are practically household furniture. While people can readily identify HDDs, they have difficulty defining SSDs. This limited knowledge of SSDs may be attributed to the fact that they have reached the consumer market only in small doses.
By Joanne De Peralta, BiTMICRO Networks
Next to price, capacity has been among the top issues keeping solid-state flash disks merely on par with magnetic hard drives. If not for those two factors, solid-state disks would be the runaway winner, offering ruggedness, speed, and a small footprint. However, recent developments are starting to change the picture.
According to the “New Data Center” benchmark published by Nemertes Research, “Storage is growing at a rate of 22% year-on-year through 2005 and 2006 (predicted to continue through 2007), and many companies top even that growth, reporting growth rates of 100%, 150%, and in some cases 300% or more.” This underscores the point that storage capacity has become a commodity.