Friday, August 29, 2008

Profile: DriveSavers Stays True to Data-recovery Roots

DriveSavers takes data recovery seriously. So seriously, in fact, that the company recently installed a $2 million cleanroom complex in its Novato, Calif., headquarters. In this 2,000-square-foot facility, DriveSavers can unseal and open up hard drives for diagnosis and repair without dust-borne contamination assaulting the now-naked spinning platters and swiftly seeking head assemblies.
Hard drives are delicate machines-even the tiniest speck of dust can render them inoperable. Destructive dust, however, has been effectively banished from the new Class 100 cleanroom (a rating that allows no more than 100 particles of 0.5 micron or larger per cubic foot of air), allowing DriveSavers engineers to open, repair, and begin data recovery of ailing hard drives without fear of damaging them further.
As impressive as the new facility is, the value of its specialized equipment and precision tools pales in comparison to the most critical element in DriveSavers' data-recovery arsenal-its experienced and tenacious engineers.
Engineering as a healing art
Like medicine, the healing of a sick storage system is as much an art as it is a science. And like doctors, the DriveSavers engineers consult with one another, confer on courses of treatment, and take their work very seriously. They even have their own version of the Hippocratic oath's central tenet, "Do no harm." As cleanroom manager and 12-year DriveSavers veteran Ed Sit put it, "If it's not broke, don't break it more."
In DriveSavers' Class 100 cleanroom, hard drives can be operated with their cases open without fear of being contaminated by drive-destroying dust.
"Do no harm" is more than a slogan at DriveSavers; it's the basis of the company's workflow. When a sick storage system is brought into the company's hardware hospital, it's given a preliminary examination in what Sit refers to as the "triage area." After diagnosis, 95 percent of ailing drives are sent into the cleanroom for disassembly, repair, and preliminary data recovery. There, a team of seven veteran engineers transfers the drive's data onto another drive after repairing the malfunctioning drive using parts from an on-site, 20,000-drive inventory. After the sickly drive has yielded its raw data, it's no longer part of the workflow-it's laid aside, and all further recovery work is done on the data now moved to its clone.
The recovered data is then transferred from the clone to DriveSavers' immense and highly secure storage network, which includes 65TB of 24/7 online storage, with another 100TB of near-line backup. Due to the sensitivity of much of the data that DriveSavers saves-clients include major financial services, Hollywood filmmakers, and the U.S. Government-the company has deployed a Cisco Self-Defending Network architecture to keep even the most resourceful hackers from snooping.
Reassembling files
Electrically grounded coveralls, hood and mask, safety glasses, and latex gloves: the all-day uniform of a cleanroom engineer.
After the cleanroom team has completed the physical repair and data collection, the recovered raw data is reassembled into files by what DriveSavers refers to as its "logical group." Here the often-tedious detective work of file identification transforms what Mike Cobb, director of Mac/Unix engineering, describes as "just ones and zeros" into its anxious owner's customer records, financial data, or digital media.
Cobb, who will celebrate his 15th anniversary with DriveSavers next Valentine's Day ("This was my new girlfriend and we haven't broken up yet," he says), is a file-recovery fanatic. "It may take a couple of months. It may take a year. But I'm going to get it," Cobb says about reconstructing a drive's contents. When asked what percentage of file recovery was luck, he responds, "Is it luck? No. It's the knowledge that I'm not going to give up."
Luckily, most data-recovery operations take only days and not the worst-case year to which Cobb referred. But recovery is getting more difficult.
Take the sheer number of files on a typical hard drive, for example. "Here's a fun stat," Cobb says. "Back in 1994-95, the total number of files on a hard drive with the System on it was around 48,000-if that. Leopard starts with 540,000 files before you even enter your name."
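Much of that logical-group detective work amounts to what the industry calls file carving: with directory structures gone, the raw bytes of a disk image are scanned for known file signatures and candidate files are cut back out. A toy sketch using JPEG's well-known start-of-image and end-of-image markers (illustrative only; real carvers validate file structure rather than trusting markers, and the image path carries over from the hypothetical cloning sketch above):

    // carve.ts -- toy file-carving sketch: recover JPEGs from a raw disk
    // image by scanning for the format's magic bytes. Illustrative only.
    import { readFileSync, writeFileSync } from "fs";

    const SOI = Buffer.from([0xff, 0xd8, 0xff]); // JPEG start-of-image marker
    const EOI = Buffer.from([0xff, 0xd9]);       // JPEG end-of-image marker

    function carveJpegs(imagePath: string): void {
      const raw = readFileSync(imagePath); // fine for a toy; real tools stream
      let cursor = 0;
      let found = 0;

      for (;;) {
        const start = raw.indexOf(SOI, cursor);
        if (start === -1) break;
        const end = raw.indexOf(EOI, start + SOI.length);
        if (end === -1) break;
        const candidate = raw.subarray(start, end + EOI.length);
        writeFileSync(`recovered-${found}.jpg`, candidate);
        found++;
        cursor = end + EOI.length;
      }
      console.log(`${found} candidate JPEGs carved`);
    }

    carveJpegs("/images/clone.img"); // hypothetical path from the cloning step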
Today's files aren't always straightforward data buckets, either: Enterprise IT systems often encrypt sensitive files, so DriveSavers trains and certifies its engineers in common encryption technologies.
Also, hard-drive engineering has progressed over the years. While this has made life better for hard-drive owners-higher capacities, faster performance, longer drive life-steadily increasing complexities and closer tolerances have made drive repair dicier. "Back in 1994 when I started, most things were corruption, or electronic, or slight mechanical problems," Cobb recalls. "Hardly anything had to go into a clean room in that year."
Complexities are increasing
Times have changed in other ways, as well. Today the new cleanroom hosts more than simple, straightforward hard drives. Into it now come RAID arrays, NAS (network-attached storage) and SAN (storage-area network) devices, tape drives-even such solid-state devices as USB drives, Memory Sticks, and digital-camera cards. PC and storage system manufacturers are also making life more difficult for the cleanroom engineers by subtly modifying the firmware in the hard drives they install in their products-the exact same Seagate hard drive, for example, may have different firmware in a Dell system than it does in one from HP.
But Sit, DriveSavers' cleanroom manager, takes the increasing challenges in stride. In fact, he seems to relish them, and credits his team-oriented approach for keeping DriveSavers on top of the changing landscape. "Some of the guys are better at firmware, some of the guys are better at component repair-and then there's my solid-state guy, who rocks," Sit says. They're also productive: "Each guy may touch ten recoveries a day; I've got seven guys in here; 70 a day; I can do 1,400 a month."
Sit also cites the culture of continuing improvement among his team. "In earlier years different symptoms were impossible, but now we've resolved them. What was once impossible is now an easy fix." For example, he says, "I used to think platter swaps were impossible, and now my guys do them routinely."
Even when Sit thinks a drive is too far gone for repair, "These younger guys just say, 'Let me have at it.'" Like doctors, they're curing ills that were incurable just a few years ago.
A legacy of Macs
DriveSavers engineers repair clients' ailing drives by cannibalizing parts from the cleanroom complex's inventory of 20,000 healthy drives.
DriveSavers traces its history back to 1985, when co-founders Scott Guidano and Jay Hagan were working at the Mac-focused hard-drive vendor Jasmine. When Jasmine went under, Guidano and Hagan (now DriveSavers' president and CEO, respectively) took over its tech-support line, and began repairing Jasmine drives. Simple drive repairs such as replacing power supplies grew into data recovery, and soon the team was working on drives from other vendors, from PCs, Unix boxes, and more.
But Macs were central to the data-recovery effort then, and remain so today. Cobb recalls that when he was hired, Hagan told him to spend as much money as needed to get the best computers for data recovery. So he bought six Mac SE/30s-at more than $4,000 each. Today, the hallway outside the logical lab at DriveSavers is full of Mac Pro boxes-the company just added 40 new Mac Pros to the mix-and the lab itself houses a broad array of Power Macs, Mac minis, and more. Why use Macs as the main tool, even for PC and Unix jobs? "Because I love Mac," Cobb says.
Macs promise to be the backbone of DriveSavers' data-recovery efforts well into the future-and as file sizes grow, as professional content creators continue to migrate to digital media, and as enterprise-level IT departments and data centers offer more and better customer-targeted digital services, that future promises to be increasingly lucrative for DriveSavers.
The key word there is "lucrative." DriveSavers' services aren't cheap, and they aren't for the everyday user: the average price for a typical data recovery is about $1,500. Prices vary based on how quickly a client needs the data recovered, the operating system that created and managed the files, and the capacity of the affected system-if it's a complex RAID 5 or Xsan setup, the fee may run to many thousands of dollars. However, if a company's data is irreplaceable, if its business would be dealt a devastating blow by the loss of its records, or if an average (if deep-pocketed) user can't bear the loss of that digital video of a child's first steps, DriveSavers might be a lifesaver.
So back up your files-and DriveSavers can help you with that chore, as well. As Lynda Martel, DriveSavers' director of marketing, pointed out, "Every drive that goes back after recovery goes with tips on how to backup your data and protect it from data loss," despite the fact that if everyone kept current, off-site backups of all their important files, DriveSavers would have little reason to exist.
With most users dangerously cavalier about backups, however-and with the likelihood that anything that can go wrong eventually will-it's safe to say that DriveSavers' future is assured.

Oracle Technical Forum Upgrade Plagued With Problems

Oracle's technical forums have been racked with performance issues all week since the vendor upgraded the system.
Forums.oracle.com underwent a "long, long overdue" upgrade last weekend to Jive Forums 5.5, according to a blog post by Justin Kestelyn, editor in chief of Oracle Technology Network.
But apparently, some Oracle users have had to wait a long, long time to access the system, receiving error messages and experiencing slow performance overall.
The situation has users who did manage to get into the forums sounding off in colorful fashion.
"I do not care whether or not this forum has loads of funky new features -- if no one can get to the site and/or post anything, what's the point?," wrote one poster, "ATD," on Thursday.
"I would also like to register my disgust at what's been happening for several days and my sheer disappointment that Oracle, of all companies, would allow an application to go live without thorough testing or, if it was thoroughly tested in UAT, without rolling back immediately when it was obvious that there were problems in the production environment," ATD added.
Oracle teams have been scrambling to resolve the issues all week, according to Kestelyn, and uptime reached 80 percent by Wednesday, compared to 7 percent on Monday.
"Uptime is still not where it should be of course; forums.oracle.com is business-critical for a lot of folks (as well as for Oracle), and I'm glad they consider it so," he wrote.
"We made a conscious decision early in this process to stick with the upgrade; to fight through the problems instead of run from them," he added. "Regardless, I do want to apologize for the downtime you've suffered through thus far."
It is possible the company will change course platform-wise, he said. "Stability is our top priority - much more so than features. If we have to trade the latter for the former, we will."

Wednesday, August 27, 2008

Fujitsu Readies Eight-core Sparc64 Chip

Fujitsu is developing an eight-core version of its Sparc64 processor, which should give a performance boost to the Sparc Enterprise Servers that Fujitsu jointly develops with Sun Microsystems.
Fujitsu's Takumi Maruyama mentioned the chip briefly at the end of a presentation at the Hot Chips conference in Palo Alto, California, on Tuesday, but he provided few details and did not say when the processor will ship.
It will succeed the four-core Sparc64 VII processor released in servers from Fujitsu and Sun in July. The Sparc Enterprise Servers use Fujitsu's chips and Sun's Solaris 10 operating system. The companies develop the systems together but market and sell them separately.
The eight-core processor is code-named Venus and will be manufactured using a 45-nanometer process, Maruyama said, a step up from the 65-nanometer process used for the quad-core Sparc64 VII.
It will have an embedded memory controller and offer a peak throughput of 128 gigaflops (billions of floating-point operations per second), he said, adding that it is being designed for the age of "petascale computing."
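Maruyama didn't break that figure down, but peak-throughput claims follow from a simple identity; one hypothetical decomposition consistent with the quoted number (the 2GHz clock and eight flops per core per cycle here are illustrative assumptions, not Fujitsu's published specifications):

    \text{peak flops} = \text{cores} \times \text{clock} \times \frac{\text{flops}}{\text{core} \cdot \text{cycle}},
    \qquad 8 \times 2\,\text{GHz} \times 8 = 128\ \text{Gflops}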
"I hope I can tell you more about it at Hot Chips next year," Maruyama said.
The chip will likely be welcomed by Sun, which confirmed in a separate presentation that its own Rock processor won't ship until the second half of 2009, about a year later than originally planned.
Rock is a 16-core processor that Sun has billed as a dramatic step forward in chip design. It will be able to address very large amounts of memory and uses innovative data "pre-fetching" techniques to achieve high levels of parallelism.
Sun says the chip will offer lightning-fast performance for databases and other enterprise applications. However, the newness of the design may have contributed to the chip's delay, which Sun disclosed earlier this year.

Mozilla Extension Would Tap Into Typed Commands

An experimental extension to Mozilla Firefox lets people substitute simple text commands for complex Web tasks such as putting links to maps in e-mail messages.
On Tuesday, Mozilla Labs released the first version of Ubiquity, which is related to software called Enso that was developed at a small Chicago company called Humanized. Mozilla hired three executives of Humanized in January, and Aza Raskin, the former president of that company, introduced Ubiquity 0.1 in a Mozilla Labs blog entry. Raskin is now head of user experience at Mozilla Labs.
Ubiquity is designed to help ordinary people create something like mashups and to do it on a personal basis instead of in the form of a public Web page. The commands that users type in Ubiquity, such as "map" and "e-mail," find resources on the Web and can gather information from those sources in one place.
For example, someone inviting a friend to dinner could highlight the name of the restaurant, type "map," and instantly call up a Google Map showing the location of the restaurant. The user could then edit that map and place it in the body of the e-mail message. Similarly, typing "yelp" and the name of the restaurant would bring the text of reviews from Yelp.com right into the message.
In an interview, Raskin compared it to a search engine, except that Ubiquity users type in what they want to do instead of what they want to find.
Other commands that are already available include "defi," which brings up a definition for a highlighted word; "trans," which translates any highlighted text; and "twit," which takes the highlighted text and puts it up on Twitter.
It's easy to create new commands, so average users can do it without advanced Web development skills, according to Raskin.
"You don't have to wait for a developer to think of a user case. ... You can do it for yourself," Raskin said.
Users who create commands for Ubiquity can post them on the Web and allow others to subscribe to them for free.
Ubiquity may or may not be added as an extension to Firefox. Mozilla Labs is designed to be an open test environment for new ideas, with participation by anyone, in which some ideas will graduate to use in Firefox and others won't, Raskin said.

Microsoft Tweaks Anti-piracy Check for Windows XP

Microsoft has updated the software in Windows XP Professional that verifies whether a copy of Windows is genuine, making its notifications similar to those in Windows Vista and thus more persistently visible to users.
In a blog posting attributed to Alex Kochis, a Microsoft director of product marketing and management, the company said it made the changes to the Windows Genuine Advantage (WGA) notification alerts for XP Pro because it is "the product edition that is most often stolen."
Now when a version of Windows XP Pro is found to be pirated or counterfeit, the next time a user logs on to the system, the desktop background will turn black, replacing whatever custom desktop the user may have set. The black background will reappear every 60 minutes, even if the user resets it. Previously, this was not part of the WGA notification for Windows XP Pro.
Another new feature of the alert system is to put the PC into "persistent desktop notification" mode, with a banner at the bottom of the screen informing the user that the copy of Windows is not genuine. The notification is translucent and users can interact with any objects underneath it; however, it will continue to appear on the screen until a user installs a genuine copy of Windows.
Microsoft said the update to WGA also simplifies the installation of the alert system on Windows XP Pro. In addition, the company has improved its ability to detect non-genuine copies of Windows.
Users have had mixed reactions to the WGA program, which Microsoft launched two years ago as part of an aggressive program to eliminate counterfeit and pirated versions of Windows. While some think it's a good way for Microsoft to prevent use of non-genuine Windows software, others found the program irksome and an intrusion, particularly when it would peg systems they knew to be genuine as pirated or counterfeit.
At one point, the program was even thought to be acting like spyware by sending information from people's computers back to Microsoft. However, Microsoft said it only provides information about whether the copy of Windows is genuine, not any other information about the user or the PC.
Microsoft first distributed WGA only to users of Microsoft's download services who wanted to install add-on software, excluding security releases, for Windows XP. Eventually, it became an automatic part of Microsoft's update services and then was built directly into Windows Vista as the company developed that OS.

Monday, August 25, 2008

Time Working Against AMD's Asset-light Plans

The clock is ticking for Advanced Micro Devices.
As part of AMD's efforts to recover from the damaging delay of its Barcelona quad-core processor in 2007, the chip maker wants to spin off its two manufacturing plants, called fabs, under a strategy it calls "asset light." The move will turn AMD into a fabless chip maker, reliant on contract chip makers to produce all of its chips but no longer saddled with the massive capital expenditures and R&D programs required to keep pace with advances in semiconductor technology.
Spinning off the manufacturing division would also give AMD, which is burdened with US$5 billion in long-term debt, a badly needed infusion of cash. AMD executives hope this strategy will turn the company into a more formidable competitor for Intel.
There's just one problem: AMD hasn't found a buyer.
AMD has two plants, Fab 36 and Fab 38, both located in Dresden, Germany. The newer plant, Fab 36, makes chips using 300-millimeter wafers, which offer better economies of scale than the 200-millimeter wafers used in older plants; a 300-millimeter wafer has 2.25 times the area, so each pass through the line yields more than twice as many chips. The other plant, Fab 38, started out using 200-millimeter wafers and is in the process of switching to 300-millimeter wafers, a transition that should be complete early next year.
One of the reasons AMD hasn't found a buyer for these plants is that the company continues to lose money, to the tune of US$1.2 billion during the second quarter alone. That loss helped push down the value of shareholders' equity in the chip maker to $1.5 billion, down from $3 billion at the end of 2007.
During the same period, the amount of cash held by the company fell from $1.9 billion to $1.6 billion.
Coupled with expectations that AMD will continue to report net losses for the rest of this year, these financial factors make it more difficult to sell off the chip plants, said Craig Berger, an analyst at Friedman, Billings, Ramsey & Co., in a research note.
"We think the logistics of getting such a deal done are very challenging, particularly with AMD's equity value and cash position falling everyday. After all, why would a firm want to do a deal with AMD today when it can just wait a little longer and do the deal even cheaper?" he wrote.
Nevertheless, AMD executives are counting on a deal to unload these plants and shore up the company's balance sheet. The Barcelona product problems are history and AMD has been hitting milestones laid out on its roadmap with regularity this year. Most recently, the release of the company's Puma notebook platform and its latest ATI graphics cards gave the company a boost.
There's more to come in the months ahead. Later this year, AMD plans to release an improved version of its quad-core server chips, called Shanghai, and will introduce a line of processors in 2009 that combines multiple CPU cores and a graphics processor on a single piece of silicon. But given the scale of AMD's financial problems, great technology and products are not enough to restore the company's financial health in a relatively short period of time.
To that end, AMD executives want the company's fabs sold off as soon as possible.
During a July conference call with investors, Hector Ruiz, AMD's chairman and former CEO, who is overseeing the asset-light plans, told analysts he expected a deal to be completed by the end of this year. That timing is critical. During the same call, AMD CFO Robert Rivet warned he would feel "nervous" if the company's cash fell to $800 million, suggesting AMD would need to turn to the capital markets at that point.
That could happen during the first quarter of 2009, when the company is estimated to have $855 million in cash left, Berger said.
"We are assuming that AMD does not reach operating profitability in 2009 and will not successfully transition to an asset-light strategy by the end of the year," he wrote.

Intel's Future: Real Transformers and Power by Wi-Fi

The intelligence gap between man and machine will largely close by the year 2050, according to Intel Corp.'s chief technology officer, who last week reiterated that point during a keynote address at the Intel Developer Forum.
At the IDF event in San Francisco, Intel CTO Justin Rattner said that the chip maker's research labs are working on human-machine interfaces and looking to foster big changes in robotics and in the ability of computers to interact with humans. He specifically pointed to work that Intel is doing on wireless power and on developing tiny robots that can be programmed to take on the shape of anything from a cell phone to a shoe or even a human.
"The industry has taken much greater strides than anyone ever imagined 40 years ago," Rattner said. "There is speculation that we may be approaching an inflection point where the rate of technology advancements is accelerating at an exponential rate, and machines could even overtake humans in their ability to reason in the not-so-distant future."
Just last month, Rattner, who also is a senior fellow at Intel, made similar comments in an interview with Computerworld, saying that perhaps as early as 2012, the lines between human and machine intelligence will begin to blur. The intelligence gap should become awfully narrow within the next 40 years, he added, predicting that by 2050, computing will be less about launching applications and more about using systems that are inextricably woven into our daily activities.
In that same vein, Rattner talked about programmable matter during his IDF speech. He explained that Intel researchers are working to figure out how to harness millions of miniature robots, called catoms, so they could function as shape-shifting swarms.
"What if those machines had a small amount of intelligence, and they could assemble themselves into various shapes and were capable of movement or locomotion?" he said. "If you had enough of them, you could create arbitrary shapes and have the assembly of machines that could take on any form and move in arbitrary ways."
The basic idea is that the catoms, which one day should be about the size of a grain of sand, could be manipulated with electromagnetic forces to cling together in various 3D forms. Rattner said that Intel has been expanding on research work done by Seth Goldstein, an associate professor at Carnegie Mellon University.
"We're actually doing it for real," Rattner said. He added that Intel started "at the macro scale," with catoms that were "inches across." The robots had microprocessors associated with them and could attract or repel one another via electromagnetism or the use of electrostatic charges, according to Rattner. "It's programmable matter," he said.
During his speech, Rattner showed off millimeter-scale 3D catoms and said that electronics could be embedded inside the miniature robotic spheres.
Jason Campbell, a senior researcher at Intel, said in an interview that the development and use of catoms will change the way people interact with computers and other devices in significant ways.
"Think of a mobile device," Campbell said. "My cell phone is too big to fit comfortably in my pocket and too small for my fingers. It's worse if I try to watch movies or do my e-mail. But if I had 200 to 300 milliliters of catoms, I could have it take on the shape of the device that I need at that moment." For example, the catoms could be manipulated to create a larger keypad for text messaging. And when the device wasn't being used, Campbell said, he could command it "to form its smallest shape or even be a little squishy, so I can just drop it in my pocket."
Campbell envisions that each catom would have a computer processor and some form of memory. Four years ago, he thought it would take 30 to 50 years for this kind of technology to be realized. Now, though, he estimates that the time it will take is much closer to 10 years.
Both Campbell and Rattner said the biggest obstacle will be figuring out how to make the micro-bots think like a swarm. Instead of sending individual directions to each catom, one set of instructions will have to be sent to make the catoms work together, so each one takes the correct position to create the desired 3D shape. But both were optimistic that it will happen, eventually.
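Intel hasn't said what such swarm instructions would look like, but the flavor of the problem can be sketched: broadcast one goal shape to every catom and have each unit run the same local rule, such as claiming the nearest unclaimed target cell. A deliberately toy illustration (the data types and the greedy rule are invented for this sketch, and a real swarm would need a distributed, collision-aware protocol):

    // Toy swarm sketch (not Intel's algorithm): every catom receives the
    // same goal shape and runs the same rule -- claim the nearest free cell.
    type Point = { x: number; y: number };

    const dist = (a: Point, b: Point) => Math.hypot(a.x - b.x, a.y - b.y);

    // One broadcast instruction set: the target shape, as grid cells.
    function assignTargets(catoms: Point[], shape: Point[]): Map<number, Point> {
      const free = [...shape];
      const assignment = new Map<number, Point>();
      catoms.forEach((c, i) => {
        if (free.length === 0) return;
        // Greedy local rule: each catom takes its nearest unclaimed cell.
        let best = 0;
        for (let j = 1; j < free.length; j++) {
          if (dist(c, free[j]) < dist(c, free[best])) best = j;
        }
        assignment.set(i, free.splice(best, 1)[0]);
      });
      return assignment;
    }

    // Example: four scattered catoms arranging themselves into a 2x2 square.
    const swarm: Point[] = [{x:5,y:0},{x:0,y:4},{x:3,y:3},{x:1,y:1}];
    const square: Point[] = [{x:0,y:0},{x:0,y:1},{x:1,y:0},{x:1,y:1}];
    console.log(assignTargets(swarm, square));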
"Sometime over the next 40 years, this will become everyday technology," Rattner said in an interview before his speech. And could catoms actually take human form? "Sure," he said. "Why not? It's an interesting thing to speculate on."
Wireless Power
Another technology that Rattner said will change the way users deal with computers is wireless power. Imagine, he said, being able to take your laptop, cell phone or music player into a room and have them begin to charge automatically. What if it could be done in a certain area of an airport or at your office desk? No more power cords. No more need to find a place to plug in.
Working from principles proposed by MIT physicists, Intel researchers have been working on what they call a Wireless Resonant Energy Link. During his keynote, Rattner demonstrated how a 60-watt light bulb can be powered wirelessly and said that doing so requires more power than would be needed to charge a typical laptop.
"Wouldn't it be neat," he said in the interview, "if we could really cut the cord and not be burdened with all these heavy batteries, and not worry if you have the charger? If we could transmit power wirelessly, think of all the machines that would become much more efficient."
Joshua Smith, a principal engineer at Intel, said in a separate interview that the company's researchers are able to wirelessly power the light bulb at a distance of several feet, with a 70 percent efficiency rate -- meaning that 30 percent of the energy is being lost during the power transfer.
Even so, "it's a big step," said Smith. Within a few years, he envisions having laptops that recharge themselves via a wireless connection if they're within 10 feet of a base station.
"You could certainly imagine integrating it into existing computer equipment," Smith added. "You'd have power hot spots in your house or office. Where you have a wireless hot spot, you could [also have a power hotspot] and get data and power there. That's the vision."

'Gorgeous Geeks' Offer Tips for Women in IT

The volunteer organisation 'Gorgeous Geeks' maintains that work-life balance is especially crucial for women IT professionals. This was a key message at the annual "Women in Technology" event at Microsoft Tech.Ed SEA 2008 last week in Kuala Lumpur.
The forum, moderated by Ng Wan Peng, vice president of the government agency Multimedia Development Corporation (MDeC), featured four prominent women in the technology field -- Microsoft Malaysia managing director Yasmin Mahmood, Woman Entrepreneur Network (WENA) ICT Bureau's Nuraizah Shamsul Baharin, mobile service provider Digi corporate communications head Yohani Yusof, and events firm Crystal Edge Sdn Bhd founder and managing director Suriza Hing Abdullah.
Microsoft's Mahmood said that women -- who form 30 percent of Malaysia's roughly 70,000 IT professionals -- need to be "equal to or better than" their male counterparts, while simultaneously seeking the right balance in their personal and professional lives.
Gorgeous Geeks president Alecia Heng said, "An impressive 40 percent of the 3,000 attendees of Tech.Ed SEA 2008 were women and the forum was specially engineered for women who are interested in starting a career in IT."
More than 160 participants attended the session, co-hosted by MDeC and Microsoft Malaysia together with Gorgeous Geeks, a group of women volunteers from the IT industry who provide mentorship and inspiration to other women to join the industry.
Maintain a Positive Mindset
Panelists shared personal insights on leadership, entrepreneurship and the avenues of growth that could lead to rewarding and fulfilling careers for women.
On meeting the challenges women in the industry face, Nuraizah Shamsul Baharin emphasised the importance of networking groups where members share advice and experiences.
Yohani Yusof, who has been in the IT industry for 21 years, said, "Whether you're an entrepreneur or working for a corporation, there is a need to network."
On a personal level, the panelists advised the potential women leaders on the importance of work-life balance and the challenges of managing expectations in the workplace as an entrepreneur and a leader. Suriza Hing Abdullah said, "Maintaining a positive mindset is key, especially when you face obstacles."
Networking Support from MDeC
Ng highlighted that one of MDeC's initiatives, the MSC Malaysia Capability Development Programme, "was specifically designed to suit the needs of the local ICT industry to raise standards of MSC in Malaysia as the ICT hub and we strongly believe that women will benefit from the program."
The MDeC programme aids companies with financial assistance, education, and awareness efforts to maximize their potential in adopting global best practices and process-improvement frameworks.
Mahmood said, "There is a lot of richness that comes from women, and there a lot of initiatives to bring women into the workforce because women definitely help contribute diversity."

Debating the Firefox SSL Certificate

Debate is reaching a fever pitch over a new security feature in Firefox 3.0 that throws out a warning page to users when a Web site's SSL certificate is expired or has not been issued by a trusted third party.
Critics say that Firefox 3.0 is putting undue fear and confusion into everyday Web surfers, making it difficult to set exceptions for certain Web sites, and forcing Web site operators to do business with specific vendors of SSL certificates or risk the appearance that their Web sites are broken.
Browsers require SSL certificates to initiate encrypted communications and to validate the authenticity of a site. The Mozilla.com Web site, where Firefox 3.0 can be freely downloaded, defends the new feature, saying SSL certificates not issued by a validated certificate authority -- so-called self-signed certificates (SSC) -- don't provide even basic validation, and expired certificates should not be viewed as "harmless" because they open avenues for hackers.
Mozilla officials say the new feature helps curb electronic eavesdropping or so-called "man in the middle" attacks.
The certificate issue is cropping up on such major sites as the U.S. Army's, which uses certificates issued by the Department of Defense. In the Army's case, Firefox does not recognize the DOD as an authorized certificate provider. Firefox, therefore, rejects the Army site's certificate and defaults to a Web page showing a traffic-cop icon and proclaiming "secure connection failed" and that the site's certificate cannot be trusted.
The problem also has surfaced with expired SSL certificates on such sites as Google Checkout and LinkedIn. The issue also could crop up on intranet sites that use SSCs and force IT administrators to configure exceptions within the browser or other workarounds.
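Site operators caught out this way can check what the browser will conclude before users do. A small sketch using Node's built-in tls module, which reports both the certificate's validity window and why a chain was rejected (the host name is a placeholder):

    // cert-check.ts -- inspect a server's certificate the way a browser
    // would: is it within its validity window, and does it chain to a
    // trusted root?
    import * as tls from "tls";

    function checkCertificate(host: string, port = 443): void {
      const socket = tls.connect(
        // rejectUnauthorized:false lets us inspect even certs that fail
        // validation (self-signed, expired) instead of erroring out.
        { host, port, servername: host, rejectUnauthorized: false },
        () => {
          const cert = socket.getPeerCertificate();
          const expires = new Date(cert.valid_to);
          console.log(`${host}: issued by ${cert.issuer.O ?? "unknown"}`);
          console.log(`  valid ${cert.valid_from} through ${cert.valid_to}`);
          if (expires < new Date()) {
            console.log("  EXPIRED -- Firefox 3 will show the error page");
          } else if (!socket.authorized) {
            // authorizationError says why the chain was rejected, e.g.
            // self-signed or an unknown certificate authority.
            console.log(`  untrusted: ${socket.authorizationError}`);
          } else {
            console.log("  trusted and current");
          }
          socket.end();
        }
      );
      socket.on("error", (err) => console.error(`${host}: ${err.message}`));
    }

    checkCertificate("example.com"); // placeholder host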
Some are saying that Firefox 3.0 is out of line.
The Pingdom.com blog this week took Mozilla to task, saying the issue could affect tens of thousands of sites. "People most in need of a clear and explicit warning regarding SSL certificates are inexperienced users, and those are not very likely to understand the error message that Firefox 3 is displaying. A large portion will simply be scared away, thinking that the Web site is broken," according to the blog.
Developer Nat Tuck called the Firefox feature bad for the Web in a blog post he wrote July 31: "Mozilla Firefox 3 limits usable encrypted (SSL) Web sites to those who are willing to pay money to one of their approved digital-certificate vendors. This policy is bad for the Web."
Tuck concedes that SSCs provide no value for authenticating a Web site, but he says Firefox is ignoring the encryption capabilities of SSL certificates, which thwart snooping on Web traffic. He even goes so far as to suggest that perhaps open source advocates should create a derivative of the open source Firefox code that includes full SSL functions.
Mozilla.com officials say SSCs have been treated as "disconcerting" for some time by the open source browser, and that what changed in Firefox 3.0 is an attempt to make users understand the potential consequences of accepting such certificates.
The officials directed inquiries on the certificate topic to a blog penned by Mozilla developer Jonathan Nightingale, who wrote that one reason for the changes is that man-in-the-middle attacks "used to be the stuff of scary security fiction, but now they are point-and-click." Some of these attacks were highlighted at the recent Black Hat conference.
Home cable and DSL routers, and Wi-Fi access points can be compromised easily by hackers, who can reconfigure the boxes and route traffic anywhere they want, Nightingale wrote. "The only thing that will tell you whether the sites you are visiting are real is the existence of a trusted certificate, which only the legitimate site can have," he added.
Nightingale wrote that SSCs are not evil; the question is whether they can be trusted. "So we ask the user," he wrote. He also pointed out that users can create exceptions, in essence telling the browser to trust specific site certificates.
Nightingale did admit the SSL feature isn't above questioning. "I don't think the approach in Firefox 3 is perfect, I'm not sure any of us do," he wrote. And he invited input and solicited help making browsing safer: "It sure would be nice if we didn't start from the assumption that changes are motivated by greed, malice, or stupidity."

Friday, August 22, 2008

New Algorithm Offers Hope for Old Routers

A team of computer scientists has proposed a new algorithm that makes routers operate more efficiently by automatically limiting the number of network route or link-state updates they receive.
The algorithm could be important in large heterogeneous corporate networks where the oldest, slowest routers make all the others wait while they absorb updates and recalculate their path tables. The Approximate Link State (XL) algorithm suppresses the updates so only those routers that are directly affected receive them, says Professor Stefan Savage, who developed the algorithm along with three other computer scientists at the University of California at San Diego. Savage presented a paper about XL this week at SIGCOMM, the Association for Computing Machinery's Special Interest Group on Data Communication conference.
Without XL, routers typically flood the network with route updates, with every router receiving every update. In very large networks the sheer number of routers and inevitable link-state changes would episodically grind routers to a halt.
"Updates may only be relevant to very localized areas," Savage says. Using a map analogy to illustrate the point, he says that a driver on the East Coast doesn't care if Interstate 5 is flooded out in Portland, Ore.. "But metaphorically, we tell everyone that information in networking."
To deal with that problem, large networks are manually engineered to create areas -- conceptually isolated groups of routers -- that limit the number of routers any flood reaches. Routers still receive floods, but only from the routers within their areas.
Savage says XL can eliminate manual configuration of areas. Instead, each router automatically figures out which other routers it should pass updates along to, so that all destinations can still be reached and no loops occur that effectively black-hole packets.
The XL algorithm selectively withholds some updates, creating a trade-off. If a new link becomes available after a failure, the algorithm decides whether forwarding the information beyond a router's immediate neighbors will improve enough paths and by a great enough percentage to warrant passing it along.
If not, the router suppresses the update by not forwarding it. The result is updates are sent only to the immediate areas where topology has changed, making the distribution less disruptive.
The Trade-off
That benefit is balanced against the fact that employing the algorithm means each router has less precise information about what the state of the network actually is.
Each router with XL would maintain data about its neighbors' shortest path trees -- how its neighbors view the network -- and use that to determine whether to forward path updates. That would increase the amount of data routers keep, but Savage says his team thinks the additional data is very small.
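The published algorithm has more machinery, but its central test -- forward a link-state update to a neighbor only if it would improve that neighbor's routes by more than some stretch threshold -- can be sketched. The graph representation and the 10 percent threshold below are invented for illustration, not taken from the paper:

    // xl-sketch.ts -- toy version of XL's core test (not the published
    // algorithm): forward a link update to a neighbor only if recomputing
    // that neighbor's shortest paths with the update improves some route
    // by more than a stretch threshold.

    type Graph = Map<string, Map<string, number>>; // node -> (neighbor -> cost)

    function dijkstra(g: Graph, src: string): Map<string, number> {
      const dist = new Map<string, number>();
      for (const node of g.keys()) dist.set(node, Infinity);
      dist.set(src, 0);
      const unvisited = new Set(g.keys());
      while (unvisited.size > 0) {
        let u: string | null = null;
        for (const n of unvisited)
          if (u === null || dist.get(n)! < dist.get(u)!) u = n;
        unvisited.delete(u!);
        for (const [v, w] of g.get(u!)!) {
          const alt = dist.get(u!)! + w;
          if (alt < (dist.get(v) ?? Infinity)) dist.set(v, alt);
        }
      }
      return dist;
    }

    // Is telling `neighbor` that edge (a, b) now costs newCost worth a flood?
    function shouldForward(
      neighborView: Graph, neighbor: string,
      a: string, b: string, newCost: number,
      threshold = 1.1 // forward only for a >10% improvement (made-up value)
    ): boolean {
      const before = dijkstra(neighborView, neighbor);
      const updated: Graph = new Map();
      for (const [n, adj] of neighborView) updated.set(n, new Map(adj));
      updated.get(a)!.set(b, newCost);
      updated.get(b)!.set(a, newCost);
      const after = dijkstra(updated, neighbor);
      // Suppress the update unless some destination gets meaningfully closer.
      for (const [dest, d] of before)
        if (d > after.get(dest)! * threshold) return true;
      return false;
    }

A real implementation also has to guarantee soundness -- no routing loops and no unreachable destinations -- which is where most of the actual algorithm's complexity lives.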
In big networks, overall performance is limited by the slowest router. "That's the router you're waiting for so the new network configuration can converge," Savage says.
Because buying cycles for routers may vary within very large networks, older slower routers can have a big impact, he says. "Scalability may be limited by stuff you bought 10 years ago that you can't afford to replace yet," he says.
The algorithm is compatible with Intermediate System-to-Intermediate System (IS-IS) and Open Shortest Path First (OSPF) link-state routing, he says, which means the software upgrade containing the algorithm could be deployed incrementally and would interoperate with existing router protocols. The goal in these networks would be to optimize paths based on a given parameter such as latency or bandwidth, he says.
Getting XL into practical use would require router makers to incorporate it in their software, Savage says. "It would need to be embraced by vendors. If Cisco picked it up it would have impact," he says.
He has already briefed Cisco -- which helped fund his research through the Center for Network Systems.

Mozilla Names Best Firefox 3 Add-Ons

Mozilla Corp. today announced the winners of its third "Extend Firefox" contest, an annual competition that recognizes the year's best Firefox add-ons.
This year's contest, which launched in March and brought in more than 100 entries, focused on add-ons that took advantage of Mozilla's newest open-source browser, Firefox 3.0.
Mozilla unveiled the final version of Firefox 3.0 in June after a long series of alpha, beta and release candidate versions.
The company awarded three grand prizes in the "Best New Add-on" category to Pencil, a diagramming and graphics interface tool; Tagmarks, which adds tagging icons to Firefox 3.0's location bar; and HandyTag, an extension that provides relevant keywords for associating with bookmarked sites.
In the "Best Updated Add-on" category, Mozilla also pegged three winners, including Read it Later, a bookmarking substitute; TagSifter, which lets users browse bookmarks by their tags; and Bookmark Previews, an extension that adds album and thumbnail views of bookmarked sites.
It's not surprising that many of the contest winners add functionality to Firefox's bookmarking; Mozilla revamped the tool by adding an integrated search engine, single-click bookmarking and other features.
Mozilla also singled out Fire.fm in the "Best Music Add-on" category. The extension lets users access the music library on Last.fm, the U.K.-based Internet radio Web site.
The winners in the new add-on group will be given travel and accommodations to the Mozilla Developer Day of their choice, as well as new Apple Inc. MacBook Air notebooks. The three developers awarded prizes in the updated add-on category will receive a 15-in. MacBook Pro notebook, while the author of Fire.fm will be flown to London to visit the headquarters of Last.fm.
The contest also recognized six runners-up in the new add-on category.

Android Apps Might Not Feature Bluetooth

While developers have been hard at work building Android applications that can use GPS (Global Positioning System), Wi-Fi and cameras, they just discovered they likely won't be able to offer applications that use one common mobile phone feature: Bluetooth.
The most recent Android SDK (software development kit), released on Monday, says that Android 1.0 won't include a "comprehensive" Bluetooth API (application programming interface).
Developers aren't exactly sure what that means, and a Google spokeswoman said the company planned to elaborate in a blog post later on Friday.
Some developers contributing to Google's Android forum say they find it hard to believe that Android 1.0, the first version of the Linux-based mobile operating system expected to become available soon, won't support Bluetooth. "HTC would not release a smartphone in this day and age that lacked Bluetooth support," wrote a developer going by the name Jeff Craig on the forum.
HTC's Dream phone is expected to be the first on the market to run Android software.
Google may plan to build support for Bluetooth into Android so that end users can wirelessly link standard Bluetooth gear, such as ear pieces, to the phone. But a lack of APIs would mean that developers couldn't build applications that use Bluetooth.
Some developers have focused on the word "comprehensive" to surmise that a future SDK update that Google has said might come in September could include very basic Bluetooth support.
End users and developers alike have eagerly anticipated the release of Android. Google's software and Apple's iPhone software are rare new entrants in the mobile phone market.
While recent rumors suggested that Android would be released later than expected, Google has maintained that the first Android devices are on schedule to appear before the end of the year.

Vista Users Rush for SP1; XP Owners Dawdle on SP3

Windows Vista users jumped at the first service pack for Microsoft Corp.'s troubled operating system, but people running Windows XP haven't been in much of a hurry to install that OS's newest service pack, a Windows performance and metrics researcher said today.
According to Devil Mountain Software Inc., by the end of July, 86% of the machines in its community-based Exo.performance.network (Xpnet) running Vista had been upgraded to Service Pack 1 (SP1).
That was a 17-point increase over the 69% recorded at the end of April, six weeks after Microsoft released the major update.
"There was pent-up demand for Vista SP1," said Craig Barth, chief technology officer at Devil Mountain. "If users are frustrated with a platform, they're going to be more likely to go out and snag any update that purports to fix the problems."
Meanwhile, Windows XP users have apparently felt less pressure to download and install that aged operating system's Service Pack 3 (SP3), which was released in early May.
The service-pack uptake difference between Vista and XP has been dramatic. Where more than two-thirds of the network's Vista users had grabbed SP1 within six weeks, fewer than half -- just 47% -- of XP users had updated to SP3 by the end of July, more than 12 weeks after Microsoft first posted it for download.
"Windows XP users were generally happy with Service Pack 2," Barth said. "There was not a huge clamor for [Windows XP] SP3 like there was for Vista SP1, and that shows in the results. It's pretty clear that a lot of XP users are very content with SP2."
The well-publicized troubles that some users had with XP SP3 -- including endless reboots after installing the service pack on PCs equipped with processors made by Advanced Micro Devices Inc. -- may have had some impact on its uptake, Barth acknowledged.
But Microsoft's own emphasis may also have played a part. "Microsoft didn't promote XP SP3," he said. "They heavily promoted Vista SP1, and went out of their way to put a good foot forward for it. But they barely mentioned XP SP3."
Devil Mountain's Xpnet collects data from more than 3,000 PCs, 70% of which run Windows XP, Barth said. Users can join the network by downloading and installing a small utility, DMS Clarity Tracker Agent, from Devil Mountain's site.

Wednesday, August 20, 2008

OQO Shows off Handheld Computer Based on Atom

Handheld computer maker OQO showed off an Atom-based device at the Intel Developer Forum in San Francisco, a significant design win for the chip maker.
Pictures of the Atom-based OQO device published by mobile-computing blog UMPC Portal show a device that looks identical to the company's Model e2. Unveiled with other Atom-based computers, the OQO device is marked with a piece of white tape labeled "OQO MID," a reference to mobile Internet device, the term Intel uses to describe small handheld computers.
Details about the Atom-based OQO, including pricing and availability, were not available.
OQO's current Model e2 lineup uses processors from Via Technologies, a Taiwanese processor supplier that was first to see a market for low-power processors that could fit inside mobile computers and embedded applications. The OQO handhelds have won praise for their polished design and for features such as support for high-speed mobile networks and the use of solid-state drives in some versions.
The appearance of the Atom-based OQO device is an important milestone for Intel. For some time, the company's sales executives had tried, without luck, to convince OQO to switch from Via's C7 processor to one of Intel's own, according to a source familiar with the relationship between the chip maker and OQO. That changed with the release of Atom.
In many ways, Atom is Intel's answer to the C7, targeting a product niche that Via carved out for itself when Intel and rival Advanced Micro Devices were instead focused on chips that ran at ever higher clock speeds and generated increasing amounts of heat. With the release of Atom earlier this year, Intel signaled its intention to compete in this segment of the market as seriously as it does in all others and put Via's C7 squarely in its sights.
Like Via's C7 series, the Atom processor is also designed to consume little power. The chip is available in two versions, one designed for handheld computers that is paired with a single-chip chipset, and a second for laptops that uses a standard two-chip chipset.
It was not immediately clear which Atom version is used in the OQO, but the MID label suggests the computer likely uses the version with the single-chip chipset, a platform formerly called Centrino Atom. That version of the processor, the Z series, is available at several clock speeds, ranging from 800MHz to 1.86GHz.
OQO executives could not be reached for comment.

Microsoft Sends Up Trial Balloons for Windows 7

Windows Vista hasn't fared so well since its debut. Its generally low reputation among customers has led one Forrester analyst to dub Microsoft's latest OS "the New Coke of tech," while some studies have suggested that nearly a third of customers who buy a PC with Vista pre-installed may actually be downgrading those machines to XP.
Still other customers seem to wish the whole thing will just go away. They don't want to hear about Vista at all -- they'd rather hear about Windows 7, the upcoming OS from Microsoft that will be Vista's successor. And given the dismal consumer reaction to its latest attempts to market Vista, Microsoft seems willing to oblige. The sketchy early reports of Windows 7 have lately grown into a steady trickle of hints and rumors. The catch is, not all of it sounds particularly encouraging.
Perhaps because of the beatings it so often receives from the press, Microsoft seems to want you to get your Windows 7 news from the horse's mouth as much as possible. To that end, the Windows team has launched a new blog to chronicle the Windows 7 engineering efforts in detail. Senior Windows 7 product managers Jon DeVaan and Steven Sinofsky promise to "post, comment, and participate" regularly.
Among the factoids revealed in the blog so far: The workforce tasked with assembling the forthcoming OS is immense, and it's dense with middle managers. As many as 2,000 developers may be involved, according to reports. That sounds like a truly Herculean project-management undertaking -- and indeed, if the figures quoted in the Windows 7 blog are to be believed, Microsoft has staffed up with one manager for every four developers. It's enough to make one wonder how Windows 7 will avoid the implementation failures and missed deadlines that plagued Vista's launch.
The engineering blog isn't the only evidence of Microsoft's recent lip-loosening, either. Elsewhere this week we learned even more interesting information. We've known for a while now that Windows 7 is expected to build on the Vista code base, rather than reinventing any substantial portion of the Windows kernel. As it turns out, however, the next version of Windows may be even closer to the current one than we expect.
According to Microsoft spokespeople, the server version of Windows 7 will be considered a minor update, rather than a high-profile new product. In fact, it's expected to ship under the name Windows Server 2008 R2 -- a designation that suggests it will offer few features that aren't already available in the current shipping version of Microsoft's server OS.
As tantalizing as these tidbits of information may be, however, hard facts about Windows 7 remain scarce. At this stage, any talk about the forthcoming product counts as little more than free marketing. The longer we all keep talking about Windows in some form or another, the less likely we are to jump ship to Mac OS X or (heaven forbid) Linux.
According to Microsoft, however, developers can expect to get their first in-depth look at the new OS at the Professional Developers Conference (PDC) in October, and further information will be revealed at the Windows Hardware Engineering Conference (WinHEC) the following week. Until then, expect the rumor mill to remain in full force.

Move Over Quad-Cores, Intel's Ready to Ship 6-Core Chips

The quad-core chips that have sat at the top of the microprocessor heap for two years are about to be displaced by their bigger, burlier older brother - the 6-core processor.
Pat Gelsinger, senior vice president and general manager of Intel Corp.'s Digital Enterprise Group, this afternoon announced that the company is set to release its 6-core Xeon processor for expandable servers in September. Dubbed Dunnington, the Xeon processor X7460 will be built with Intel's new 45-nanometer Penryn technology, the company said.
"The big cache and six cores will give customers a nice bump in performance," Gelsinger said previously. "We're quite excited about it."
Moving beyond quad-core processors, which to date have been the high-water mark in the semiconductor industry, is a major step - one that keeps Intel well ahead of rival Advanced Micro Devices Inc., according to Dan Olds, principal analyst with the Gabriel Consulting Group.
"This is a big deal," said Olds. "It looks like, at least from the benchmarks we're seeing, that six-core chips offer more performance than quad-cores. So, yes, customers are going to want them. What we don't know is how much power the chips consume and how much heat they will dissipate, and those are key concerns. But, all in all, this is a pretty big advance in the state-of-the-art and all the major vendors are on board."
AMD, which has been getting its feet under itself after a rocky 2007, has shipped a lot of new products this year, including triple-core Phenom processors, along with quad-core Phenoms, graphics chips and chipsets.
But AMD hasn't yet launched a 45nm processor, and it isn't slated to release its upcoming 6-core Istanbul server processor until the second half of 2009 - about a year after Intel's version ships.
"This will put a bunch of pressure on AMD," said Olds. "These chips outperform anything AMD has and probably win on price/performance too. This could cut AMD's share of the server market considerably."
And with Intel ahead of AMD in terms of market share, manufacturing process and core count, it's pressure that AMD could do without right now.
"Intel doesn't have to crank up chip performance right now to thwart AMD," added Olds. "Their current products handily outperform AMD on server products. From my own research, I can tell you that x86 server customers have moved away from AMD and towards Intel. So Intel didn't need to release this stuff now to catch up to AMD or to top them, but they're doing it anyway -- just to keep the pressure up on AMD and on themselves."
While it may be a shot at AMD, are customers eagerly waiting to buy 6-core machines? Maybe not, according to Jim McGregor, an analyst at In-Stat.
"If software can't take advantage of it, what does it buy you," said McGregor, noting that a lot of software today still isn't designed to take advantage of quad-core processing. "You still have software partitioning issues and when you put more cores on a chip, you have to run it slower or increase the power consumption budget or thermal limitations."
He added that if he had a choice, he would bypass the new 6-cores and hold out to buy 8-core machines. Intel's Nehalem, which is expected to go into production in the fourth quarter of this year, is designed to scale from two cores to eight cores.
Olds, however, said server software is more likely than desktop applications to be designed to take advantage of multiple cores.
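The disagreement is essentially Amdahl's law: if only a fraction p of a workload can run in parallel, no number of cores pushes the speedup past 1/(1-p). For a hypothetical half-parallel desktop application (p = 0.5, a made-up figure), the move from four cores to six buys almost nothing, while a highly parallel server workload (say p = 0.95) still gains noticeably:

    S(n) = \frac{1}{(1-p) + p/n}
    p = 0.5:\quad S(4) = 1.6,\qquad S(6) \approx 1.71
    p = 0.95:\quad S(4) \approx 3.48,\qquad S(6) \approx 4.80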
Intel hasn't announced when it might release 6-core chips for the desktop and laptops.

Monday, August 18, 2008

Intel Plans to Support 2.3GHz, 3.5GHz WiMax Next Year

Intel plans to extend the frequency ranges supported by its WiMax chipset next year beyond the 2.5GHz profile, according to a company executive.
WiMax support currently is an option with Intel's Centrino 2 chip package for laptops, but Intel's WiMax chipset only supports the version of the technology that uses 2.5GHz spectrum. This version of WiMax is being rolled out in the U.S., where operator Clearwire plans to launch services in three cities before the end of this year.
"For 2009, we will start supporting other markets outside of the U.S., at 2.5GHz and in other spectrum profiles," said Garth Collier, general manager of WiMax at Intel Asia-Pacific.
While Collier did not specify the additional WiMax profiles Intel plans to support next year, there are only three supported by the WiMax Forum's interoperability testing: 2.3GHz, 2.5GHz and 3.5GHz. Since Intel already supports 2.5GHz, Collier's reference to support for multiple additional profiles suggests the chip maker will add support for the remaining 2.3GHz and 3.5GHz profiles to its product lineup.
Collier's comments offer the first glimpse into Intel's future WiMax product plans. Until now, the company has only said it plans to add support for 2.3GHz and 3.5GHz at a future date, without specifying when that might happen.
Asked about the official ambiguity that surrounds release dates for WiMax chipsets that support 2.3GHz and 3.5GHz, Collier suggested a 2009 release for such a product may not be set in stone.
"We haven't given out a definite date," Collier said. "It's one of those chicken and egg situations, it's dependent on the development of networks, how much coverage they have [and] what the underlying demand is."
By those metrics, the release of 2.3GHz and 3.5GHz chipsets should happen sooner rather than later. Some of the world's largest commercial WiMax networks are in South Korea, which uses the 2.3GHz profile, and Pakistan, which uses 3.5GHz spectrum.
If Intel does plan to release products that support additional WiMax profiles in 2009, such an announcement could be made at the Intel Developer Forum conference, which is being held in San Francisco this week.

Microsoft Faces Taiwan Antitrust Investigation

Taiwan's Fair Trade Commission has launched an investigation into whether Microsoft holds a monopoly position in the island's software market and, if so, whether it abuses that position, an official said Monday.
The government investigation will also look into complaints that Microsoft is limiting consumer choice by restricting the availability of Windows XP on new PCs, and into whether the pricing of Microsoft products is fair to consumers on the island.
Taiwan's investigation is unique in that no other country where Microsoft has traditionally faced regulatory issues, including the U.S., Europe and South Korea, is currently looking at the company for the same reason.
"Taiwan doesn't have its own (OS) software," said an official from the Fair Trade Commission. "Most people in Taiwan use Microsoft software and depend on it for work. Their market share should be very high," she said.
Should the world's largest software maker be found to have broken Taiwanese antitrust laws, the company could face a fine of up to NT$25 million (US$797,361) as well as be forced to change some of its business practices on the island.
"We fully intend to comply with the process and make sure they get all the information they need," said Matt Pilla, Microsoft's director of public relations in Asia.
Taiwan's investigation was launched in part due to urging by Taiwan's non-profit Consumers' Foundation.
The group last month called on Microsoft to continue selling Windows XP as an option on all new PCs, saying that discontinuing sales of the OS would violate Taiwanese antitrust laws. The Consumers' Foundation alleges that Microsoft is using its market position to try to force people in Taiwan to switch to Windows Vista.
The foundation conducted a survey on the island that found that 67 percent of consumers oppose Microsoft's decision to stop selling XP at the end of June.
The main complaint is against a lack of choice when people buy new computers. Around 56 percent of survey respondents who had bought a new computer recently were told they could not buy Windows XP and instead were forced to purchase Vista, the foundation said.
The foundation said Microsoft controls 98 percent of Taiwan's OS market share, with 75 percent of survey respondents using Windows XP on their PCs and 23 percent using Vista.
A majority of respondents to the survey, over 53 percent, said they did not think Vista is as useful as XP, while 23 percent said Vista is the better OS.
Pilla pointed out that Microsoft has extended XP's life beyond traditional norms for the company, including allowing it to be sold on certain systems meant for businesses until June 30, 2009, and on ultra-low-cost PCs through June 30, 2010.
Extending the life of an older product isn't easy, he said. By extending the dates of usage, Microsoft also has to extend the time it will support Windows XP, which now stands at April 2014.
Long after it ceases being sold, the product will still have to be updated with new hardware drivers and other software support.
In addition, most of Microsoft's software developers are already working on Vista, so the company has to reallocate resources to continue working on XP.
Taiwan's Fair Trade Commission investigation is at least the third action taken against Microsoft in recent years.
In 2004, the commission worked with Microsoft to resolve disputes around Windows Media Player after a ruling by the E.U. found Microsoft guilty of trying to destroy competition in that market.

HP Ruggedizes Ultraportable Laptops

Hewlett-Packard on Monday introduced ultraportable laptops that can easily switch between wireless 3G broadband networks and withstand harsh environmental conditions.
HP's EliteBook 2530p ultraportable and EliteBook 2730p tablet PC have been tested to meet the U.S. military's standards to withstand harsh elements like high altitude and temperatures, the company said.
The laptops, targeted at business users, have also been designed so that sensitive parts like hard drives and displays can withstand the impact of a fall, said Keith LeFebvre, vice president and general manager at HP.
Gobi wireless technology from Qualcomm allows the laptops to seamlessly switch between 3G networks, LeFebvre said. With the technology, users can decide to change 3G networks without changing the hardware. 3G is a type of mobile broadband technology offered by cellular providers that allows people to access the Internet wirelessly.
All users have to do is change the firmware, which switches the 3G radio. The chip supports the EV-DO (Evolution-Data Optimized) and HSUPA (High-Speed Uplink Packet Access) 3G technologies. In the U.S., users will be able to switch between 3G services from Verizon, Sprint and AT&T, LeFebvre said. Depending on the wireless technology, users will be able to switch between other 3G networks worldwide.
The new laptops also have a unique feature that reads business cards and converts them to text. Users place a business card into a slot carved into the front of the notebook, after which the Webcam shoots a picture of it. Software extracts the important contents from the business card -- including names, addresses and phone numbers -- and converts them to text. The data is then transferred to a file.
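HP didn't detail how that extraction works, but the parsing stage can be pictured as simple pattern matching over an OCR engine's text output. Here is a minimal, illustrative sketch in Python; the patterns, function name and sample card text are assumptions for illustration, not HP's actual code:

    import re

    PHONE = re.compile(r"\+?\d[\d\s().-]{7,}\d")    # loose phone-number pattern
    EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")  # loose e-mail pattern

    def parse_card(ocr_text):
        """Pull rough contact fields out of OCR'd business-card text."""
        lines = [ln.strip() for ln in ocr_text.splitlines() if ln.strip()]
        phone = PHONE.search(ocr_text)
        email = EMAIL.search(ocr_text)
        return {
            "name": lines[0] if lines else "",  # cards usually lead with the name
            "phone": phone.group() if phone else "",
            "email": email.group() if email else "",
        }

    # Sample input, as an OCR engine might emit it:
    print(parse_card("Jane Doe\nExample Corp\n+1 415 555 0100\njane@example.com"))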
The EliteBook 2530p comes with a 12.1-inch screen and weighs 3.19 pounds (1.45 kilograms), with prices starting at US$1,499.
The EliteBook 2730p tablet PC comes with a 13.1-inch screen, weighs 3.7 pounds and is priced starting at $1,670.
The laptops run on Intel Core 2 Duo low-voltage or ultra-low-voltage processors, support up to 8GB of RAM, and offer additional wireless networking options including Wi-Fi and Bluetooth. Storage options include hard drives or 80GB solid-state drives. The laptops come with Intel's integrated graphics technology.
The laptops run Microsoft's Windows Vista OS, with the option to downgrade to Windows XP. They will be available worldwide, according to the company.
HP is the world's top PC vendor, followed by Dell, which is closing in on the top spot. During the second quarter of 2008, HP shipped 13.32 million units worldwide for an 18.9 percent market share, followed by Dell, which held a 16.4 percent market share, according to IDC.

Cloud Computing: Hope or Evolution?

Companies may be expecting too much from cloud computing, but other technologies such as video conferencing are starting to live up to their promises, according to a study released last week.
Gartner Inc. of Stamford, Conn., has published "Hype Cycle for Emerging Technologies 2008," a report that discusses a wide range of technologies and separates the gee-whiz factor from the reality. The report includes analysis of cloud computing, social networking, green IT, microblogging and telepresence.
"You really need to know why you're adopting these technologies," said Jackie Fenn, the vice-president and Gartner Fellow who contributed to the report. "You can't just leap in every time a new things gets hyped.
She cited as an example the cloud computing paradigm, whereby companies like Google and Amazon offer to store users' data on their servers and some firms can provide IT more as a utility.
With cloud computing, Fenn said, users are expecting "very flexible access" to IT resources, with a view that "one day IT will be as simple to use as electricity and you just plug it in."
But in reality, she added, it's not that simple.
"A lot of the hype will be around the promise that has been held out for a very simple to use, flexible approach to information technology, and the reality of achieving that is going to be out of line," with expectations, she said. "Whenever you try to achieve one of these big dreams it's inevitably harder to get there than you think, and it takes longer to get there than people tend to think."
A technology that's a bit closer to reality is that of telepresence, according to the report. Gartner cites as examples HP, which makes the Halo system, plus Cisco Systems Inc. and Teliris.
In the past, many users weren't satisfied with the video quality of conferencing systems, Fenn said, adding that telepresence offers better quality but that some companies are put off by the price.
"I think the quality of the experience that these systems have brought is very attractive and it does start to introduce an element of feeling that you are at least partially there," she said. "That has been missing in previous generations of the technology."
The study also noted microblogging, whereby users post short blurbs about their current activities, has "caught on" in some communities.
For collaboration, companies are taking different approaches, depending on their needs.
Some firms, she said, want to wait for enterprise-grade collaboration tools that address issues of privacy and security. But others are more inclined to take advantage of existing Internet sites like Facebook now because they think it's valuable for their companies.
"They will go to the actual Facebook or Myspace route and say, 'Yep, we want to bring it in and we'll figure out what the problems are once we do that.' Others will say, 'No we want to wait until we have something that's industrial strength.'"
Those who are early adopters of social computing platforms are attracted to them because they allow a wider range of communication.
"You have this peripheral vision of what's happening around you, that a lot of the more point-to-point collaboration tools don't give you," she said. "They are willing to put up with the unknowns that you get when you get popular consumer technologies into the enterprise."
Another set of technologies benefiting companies now is green IT, which is valuable in more ways than one, Fenn said.
"The happy thing about green IT is that the greater good is aligned with the selfish benefit of saving money," she said.

Vendors Rally While Windows Sleeps

Dell, Intel and their partners last week announced new technologies that represent major leaps forward for mobility. The companies seem to have discovered the secret to making such bold leaps: Cut Microsoft out of the deal.
One technology involves enabling users to gain instant access to a laptop's e-mail, browser and other basic functionality -- without booting Windows at all.
The second technology enables an Internet-based message to wake a Windows PC from sleep mode. It's useful both for VoIP applications and for anyone away from their PC who wants remote access.
These new technologies are perfect metaphors for what's happening in the industry. In both cases, Windows is asleep while Microsoft's own partners give users what they really want.
Let's have a look at the new technologies.
Dell Latitude ON
Dell this week announced a new feature called Latitude ON that enables the use of e-mail, Web surfing, basic PIM functionality and document reading -- all without booting Windows. The idea is to enable basic use without having to wait for the main OS to boot, and also to extend battery life.
A more accurate name than "Latitude ON" would have been "Windows OFF."
The codename was "BlackTop," a combination of "BlackBerry" and "laptop." The original aim of the project was to give users the same basic functionality of a BlackBerry using their laptops' full-size keyboard and screen.
What Dell is really doing here is building the equivalent of a secondary ASUS Eee PC into a full-featured, full-size laptop. The Latitude ON feature uses its own low-power processor, flash storage and Linux (SUSE Linux Enterprise Desktop 10), all separate from the laptop's main CPU, hard drive and Windows OS. But unlike a subnotebook, the Latitude ON system won't let you install applications. It's essentially a "cloud computing" device that depends on the Internet for much of its functionality.
As far as I can tell, none of the applications are made by Microsoft. ON's custom Web browser is based on Firefox. E-mail, "diary" and contacts are, of course, non-Microsoft applications. But some Microsoft data types are supported in one way or another. The system, for example, includes viewers for Microsoft Office documents (as well as for Adobe PDF documents). The built-in organizer grabs the 100 most recent Outlook e-mail messages from the laptop's cache and displays them.
If you use only the Latitude ON system, battery life lasts not hours but days, according to Dell.
Latitude ON is expected to arrive in two months, and only on some of Dell's laptops.
From a Microsoft perspective, Latitude ON represents a debacle comparable to the UMPC disaster. Microsoft led a big push to drive sales of Vista-based Ultra-Mobile PCs, all of which failed catastrophically in the market, rejected by users in favor of first Linux-, then XP-based subnotebooks.
Now, it's happening again. Remember Windows Vista SideShow? The feature was part of a broader effort by Microsoft to provide basic functionality on laptops while the main Windows OS was in sleep mode. A tiny screen on the lid would display the UI. Obviously that failed, and now partner Dell is delivering roughly similar but vastly superior functionality using Linux and other non-Microsoft software.
Intel Remote Wake
Intel on Thursday introduced a new technology called Remote Wake, a chipset feature and software development kit that enable a PC to be "awakened" over the Internet while in sleep mode.
Intel worked not with software giant Microsoft but with Silicon Valley VoIP startup JaJah to build the startup's software into the Intel chipset in some PCs. The Intel-JaJah combination will let you dump your landline phone and use a PC-based VoIP phone without leaving your PC on all the time. Other VoIP applications, such as Skype, can also take advantage of Remote Wake, but they will need to be tweaked to support it and then installed by the user. Orb Networks, Cyberlink and Pando Networks are also Intel partners on Remote Wake.
Remote Wake should also be useful beyond VoIP calls for things like remote, off-peak backups and for downloading media and other files. Remote Wake also makes PCs greener, because they don't have to be left on all the time.
You can check out a demo on the Pando site.
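Intel hasn't published Remote Wake's wire protocol, but conceptually it extends the long-standing Wake-on-LAN mechanism, in which a "magic packet" -- six 0xFF bytes followed by the target's MAC address repeated 16 times -- is broadcast to rouse a sleeping machine. A minimal sketch of that classic LAN-side version in Python, with a placeholder MAC address:

    import socket

    def send_magic_packet(mac, broadcast="255.255.255.255", port=9):
        # Magic packet: 6 bytes of 0xFF, then the target MAC repeated 16 times.
        mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
        if len(mac_bytes) != 6:
            raise ValueError("expected a 6-byte MAC address")
        payload = b"\xff" * 6 + mac_bytes * 16
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
            sock.sendto(payload, (broadcast, port))  # UDP port 9 is customary

    # send_magic_packet("00:1a:2b:3c:4d:5e")  # placeholder MAC address

The hard part Remote Wake solves is doing this across the Internet, where broadcast packets don't travel -- which is why it needs chipset support and partner software rather than a simple packet blast.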
Again, from Microsoft's perspective, this is another disaster. It couldn't be more obvious that Microsoft and Intel should have partnered on this functionality 10 years ago. Microsoft has been pushing Remote Desktop and its communications software for years. But apparently it never occurred to anyone in Redmond that people might want to leave their PCs in sleep mode, then have them turn on for remote access or VoIP calls.
Based on all the information released so far, there is virtually no downside (other than marginal additional cost) to either of these new offerings. Both improve life dramatically for mobile users.
The usefulness of these technologies stands in stark contrast to Microsoft Windows' ongoing slumber. When is the last time Microsoft rolled out something that boosted mobility the way these new features do?
The "old Microsoft" would have never allowed all this. The company would have leveraged its multi-billion dollar labs to figure all this out first, then, coerced Intel, Dell and the rest of the industry into supporting it. Now, Microsoft is on the sidelines while its closest partners innovate using companies that compete with Microsoft in the software marketplace.
When will Microsoft itself wake up from "sleep mode"?

Hosted E-Mail Explosion Forecast

The number of hosted e-mail seats will grow by nearly 40% in the next four years, with small-to-midsize businesses contributing a portion of the boost, according to a new study by The Radicati Group.
The study shows there are nearly 1.6 billion hosted e-mail seats today, a number that will hit 2.2 billion by 2012 -- an average annual growth rate of about 9%. The numbers include both enterprise and consumer e-mail.
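As a quick sanity check on those figures (my arithmetic, not the study's), compounding 1.6 billion seats up to 2.2 billion over four years works out to roughly 8-9% a year, consistent with the rates cited:

    # Figures from the study: 1.6 billion seats today, 2.2 billion by 2012.
    start, end, years = 1.6, 2.2, 4
    total = end / start - 1                  # ~0.375 -- the "nearly 40%" headline
    cagr = (end / start) ** (1 / years) - 1  # ~0.083 -- roughly the 9% annual rate
    print(f"total growth: {total:.1%}, compound annual: {cagr:.1%}")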
Two sources of fuel for the growth are the evolution of the technology and IT's changing attitude toward hosting. The hype around cloud computing is also helping drive interest in online services.
"E-mail is more of a commodity, and now a lot more things are being outsourced," says Sara Radicati, CEO of Radicati. "And I think a lot of the solutions have been getting better. They are more professional, and more of the providers better understand the customer relationship and how to manage it."
Providers are adding other features as well, including archiving, compliance, security and social-networking tools to make hosted offerings more attractive, Radicati says.
The survey breaks hosting providers into three categories: hosted business e-mail that targets companies with one to 1,000 mailboxes; managed business e-mail, typically used by large enterprises; and ISP- or Web-mail services that are largely free and target consumers.
Smaller businesses are likely to go the hosted route, but large enterprises may find that hosted e-mail is a fit for regional offices, Radicati says.
"All companies are starting to analyze the costs and trade-offs of hosting," Radicati says. "The real trick is going to be which provider can reliably offer this on a worldwide basis. Companies don't want different deals with different providers in different countries and then try to tie the whole thing together."
The survey says that over the next four years, SMBs will switch from standards-based SMTP hosted mail to a more advanced collection of hosted messaging services centered on e-mail.
The hosted business-e-mail market probably will consolidate, the survey says, as Google with Google Apps and Microsoft with Online Services begin to work out the bugs and attract a critical mass of customers.
"But I think it is a totally wide-open space at the moment," Radicati says. "Certainly Google and Microsoft are two giants, and they are going into this with full force."

Friday, August 15, 2008

AMD to Make a Splash in the Server Chipset Space

Advanced Micro Devices plans to deliver a new server platform in the first half of 2009, the company announced on Friday, with the platform built around a new chipset.
The new chipset will be geared toward servers, with multiple sockets to plug in additional server chips. The chipset could improve the way chips in multiple sockets and components like graphics cards communicate with each other. The improved performance comes through new virtualization capabilities and support for HyperTransport 3.0 bus technology, according to the company.
AMD's upcoming Shanghai server chips -- due in the fourth quarter of this year -- will work with the new chipset, said Phil Hughes, a company spokesman. Shanghai chips will also work with Nvidia and Broadcom chipsets.
This could be a significant announcement for AMD as the company hasn't had a server platform that included a chipset since the very start of Opteron around 2003, said Dean McCarron, principal analyst at Mercury Research. Most of the AMD servers today contain either Nvidia or Broadcom chipsets, he said.
The new server chipset won't necessarily boost AMD's processor business, but it matters because server vendors do not accept desktop chipsets, he said. AMD has a strong presence in the desktop chipset business, and the server chipset could open a new market for the company, McCarron said.
Though the input-output performance on the chipset will improve with the new features, it is not designed to boost processor performance, McCarron said. That will solely depend on how Shanghai chips perform.
Shanghai chips will be the first manufactured by AMD using the 45-nanometer manufacturing process. Desktop offerings manufactured using the 45-nm process technology will closely follow, said Randy Allen, senior vice president of computing solutions at AMD, during a conference call on Friday.
AMD is in talks with server vendors to finalize its Shanghai-based server shipment schedules. Based on how those discussions go, AMD hopes the servers will be in the market by the end of the year, Allen said.
AMD is roughly a year behind Intel in shipping chips based on the 45-nm technology, but efforts are underway to close the gap, Allen said.
The server chipset announcement may be an attempt by AMD to pre-empt Intel's server announcements at the Intel Developer Forum show next week in San Francisco. Last year, AMD announced the Phenom triple-core chip prior to the start of IDF.
AMD executives spent a lot of time on the conference call berating Intel for copying its past innovations, including quad-core processors and other chip technology.
"It is often said that imitation is the sincerest form of flattery, and we certainly see that in play here. But at another level it is somewhat annoying when you see the over-the-top rhetoric they utilize, and at some level you get mad," Allen said.
AMD has historically announced products prior to IDF to show that it is still alive and should not be forgotten, Mercury Research's McCarron said.

Stay-at-home Mobile Accessories

Most laptop gear is designed to enhance your mobile life, but what about all those hours you spend using your portable at home? Today's Mobile Mac takes a look at a couple of laptop accessories for your humble abode.
Belkin Laptop Hideaway
I don't know about you, but in our home, laptops are often used away from a desk--for example, on the couch, sitting in bed, or at the kitchen table. Belkin's magazine-rack-looking Laptop Hideaway offers a convenient and attractive solution for carrying your laptop and its accessories around the house--and for storing them when not in use. The rigid shell is covered in good-looking, snag-resistant fabric, with the inside lined with softer material over thick padding. One side of the Hideaway holds a 15.4-inch (or smaller) laptop; the other is split into two smaller pockets: one for your power adapter and other small accessories, the other for larger accessories or books and magazines. The handle on top makes for easy room-to-room toting.
The Hideaway is surprisingly sturdy and stays upright even when fully loaded. It also looks good, although an all-brown design, instead of brown and blue, might have been a better match for the furnishings in many homes.
Kangaroom Bamboo Laptop Stand
Unlike many of the laptop stands we've reviewed, Kangaroom's Bamboo Laptop Stand isn't designed to lift your notebook up to an ideal working height (or at least that's not its primary purpose). Instead, the ecologically friendly platform--it's made of sustainable bamboo--aims to keep your desk free of wires and gadgets. On the right-hand side of the stand are two "cradles," each 3.2 inches long, 1.4 inches wide, and 1.5 inches deep, with an opening at the bottom for a charging/docking cable; you pull the cable for your iPod, iPhone, or other phone or portable player through the opening, plug it into the device, and then rest the device in the cradle. (Unfortunately, there are no clips or grooves inside the cradles to keep your cables from falling underneath the stand.)
To further reduce cable clutter, the back of the Stand is open for hiding a surge protector--not included, although Kangaroom sells an appropriately sized model for US$13--underneath. You then plug all your power adapters into this power strip to keep them, and their cables, hidden beneath the Stand. There's also a large opening in the middle of the Stand to provide ventilation for your laptop.
The Bamboo Laptop Stand is quite solid; its wood is attractively finished; and its overall width--20 inches--isn't much wider than a 15-inch MacBook Pro sitting next to an iPod dock. (The Stand is 12 inches deep.) However, the Stand's low height, just 3 inches in the rear, has a couple of consequences. First, it doesn't quite raise your laptop's screen enough on a typical desk. Second, and more important given the Stand's advertised benefits, this low height doesn't provide enough room for larger power adapters, such as Apple's laptop adapters, to fit underneath when plugged into a power strip. I also wish the stand provided features for keeping your laptop cables--USB, power, network, and the like--organized and for preventing them from falling behind your desk when unplugged.
(Kangaroom claims you can use the Stand on your lap, but I didn't find that to be the case, given the Stand's open-bottom design.)

Anti-Georgia Spammers Building New Botnet

Hackers targeting Georgia in the midst of its conflict with Russia have started sending out a new batch of malicious spam messages, apparently with the aim of building a new botnet of remote-controlled computers.
The poorly worded messages started going out early Friday morning, and now make up close to five percent of the spam traffic measured by the University of Alabama at Birmingham's Spam Data Mine, according to Gary Warner, a director of computer research and forensics at the university. That's about a third of the volume of the CNN- and MSNBC-related spam that has been flooding inboxes this week, but it's still significant, he said.
With subject lines like "Mikheil Saakashvili gay scandal! New of this week!" the messages try to trick victims into clicking a link to a fake BBC story about the president of Georgia. When a victim clicks on the link, however, he is taken to a malicious Web server that then tries to infect his computer.
Disturbingly, the attack code used by this Web server is not blocked by most antivirus products, Warner said. In tests, his team found that only four out of the 36 antivirus products featured in the Virus Total malware testing service spotted the code.
So far, Warner's team has tracked the messages back to 44 spam-sending computers, none of which has previously been associated with junk e-mail. Interestingly, six of these computers are located in Russia, which is rarely a direct source of spam, and one of them lies within the Russian Ministry of Education.
Although the spammers seem to be setting up a botnet, the ultimate use of this network remains unclear. Warner speculated that it could be used to launch further cyber-attacks against Georgian government computers.
Symantec has identified the malicious software as a variant of the Trojan.Blusod program, said Kevin Haley, director of product management with Symantec Security Response. In the past, spammers have used this program to install fake antivirus software on victims' computers, which then falsely identifies problems and offers to clean them up for a fee, he said.
Warner disputed Symantec's analysis, noting that Symantec itself was not detecting the Trojan program, according to Virus Total. "This is new malware," he said.
The question of whether Georgia and Russia are engaging in state-sponsored cyber-warfare has been a matter of some debate, following the eruption of hostilities between the two countries on Aug. 7.
On Monday, Georgia moved its Ministry of Foreign Affairs Web site to Google's Blogspot, claiming that a Russian cyberattack had knocked its server offline.
Security experts say that while the recent Georgian cyber-attacks are more intense than those launched a year ago against Estonia, there is no evidence that either event was actually state-sponsored cyber-warfare.
Some have likened those events to a "cyber brawl," with nationalistic Russian hackers launching spontaneous computer attacks against neighboring Estonia.
"I think it's almost exactly what we saw back in Estonia," Warner said of the recent events in Georgia. "I really doubt this is any action by the Russian government."

iMac Performance Evolves Through the Years

No one touted the iMac as a computing powerhouse when it first began shipping 10 years ago today. But at some point in the ensuing decade, the iMac evolved into a viable pro system for many users, blurring the line between professional and consumer desktops.
Just how much has it evolved? In honor of the iMac's 10th anniversary, we decided to use Macworld Lab's collection of older iMacs to find out.
Macworld readers are always asking the Lab to compare current Mac models to vintage systems. The problem, of course, is that a G3-based iMac can't run all of the applications included in Speedmark 5, our test for benchmarking Macs. Still, we thought we'd mark the occasion by trying to quantify the progress in performance made by the iMac over this last decade.
What we tested
To that end, we assembled (and reassembled) five iMacs from the past, with systems from the G3, G4, G5 and Intel eras all represented. We upgraded the RAM as best we could and loaded the latest version of OS X that would run on each machine. We picked 12 different tests that could be run on all of our systems, as well as our Photoshop suite, though even CS2 wouldn't run on our earliest G3 iMac.
We wanted to include the original 233MHz Bondi Blue iMac G3, but unfortunately, it didn't pull through its RAM upgrade surgery. (The original iMac is a bear to upgrade.) The closest we could get to the original model was a 333MHz Grape iMac G3. Originally, this fruity iMac shipped with just 32MB of RAM. We were able to install two 256MB DIMMs, but the iMac only recognized half of each stick, so instead of 512MB, we ended up testing the system with 256MB and OS X 10.3.9 Panther -- the latest version of the OS we were able to install.
Next we picked the iMac DV SE, running a 400MHz G3 processor. We were able to outfit this system with 1GB of RAM and OS X 10.4.11 Tiger. This was the first iMac to feature FireWire, making file transfers a whole lot faster. It's also the Mac that both my children and my in-laws use on a daily basis, so I have a personal interest in how it performs.
We also tested an iMac G4, with an adjustable 15-inch LCD display; we were able to run Tiger with 1GB of RAM on this system as well. Our final models included a 2.1GHz iMac G5 with built-in iSight camera and the current entry-level iMac, a 2.4GHz Intel Core 2 Duo model. Both of these final two machines were tested with the latest version of Leopard (OS X 10.5.4) and 1GB of RAM.
Testing notes
Before we start comparing the performance of all of the models, let me point out a couple of issues that arose during testing. First, in order to get applications that would run on all of the systems, we needed to use primarily older, pre-Intel applications that put the current shipping iMac at a disadvantage, since it had to run many apps like iMovie, iPhoto, Microsoft Office, and Photoshop using Apple's Rosetta translation technology; as you may remember from the Intel transition, running programs via Rosetta can dramatically slow down performance. For that reason, the G5 iMac was able to keep up with, and in some cases top, the Intel iMac. Had we been able to run Intel-native versions, those results would have been very different.
Second, we gave the 333MHz Grape iMac a bit of a break, as its hard drive was not large enough to hold all of the test files and applications. So, unlike with the other iMacs--which had all apps and documents installed for the duration of the testing--we swapped applications and documents as necessary to fit onto the iMac's less-than-roomy 4GB hard drive.
The results
Overall, the results show marked improvement at each step down the iMac family tree, with big performance jumps between the G3, G4, and G5 processors. I think the results speak for themselves, but I'll call out a couple of the more interesting ones. For instance, compressing a 512MB file took 7 minutes 28 seconds on the Grape iMac, 5 minutes 45 seconds on the iMac DV, 3 minutes 15 seconds on the G4 iMac, 1 minute 3 seconds on the G5, and just 44 seconds on the Intel iMac: you could run that same test 10 times on the Intel in the time it took the Grape to finish the task just once.
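To put those compression numbers side by side, here's a quick conversion of the quoted times into seconds and into multiples of the Intel iMac's time (no new test data, just arithmetic on the results above):

    # 512MB compression times from the tests, converted to seconds.
    times = {
        "Grape iMac G3": 7 * 60 + 28,  # 7:28
        "iMac DV": 5 * 60 + 45,        # 5:45
        "iMac G4": 3 * 60 + 15,        # 3:15
        "iMac G5": 1 * 60 + 3,         # 1:03
        "Intel iMac": 44,              # 0:44
    }
    for model, secs in times.items():
        print(f"{model}: {secs}s ({secs / times['Intel iMac']:.1f}x the Intel's time)")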
The biggest difference between the G3s and the Intel iMac was the Camino test. Downloading Web pages over a closed network took a whopping 40 minutes on the Grape iMac, 20 minutes on the G3 iMac DV, nearly 12 minutes on the G4 iMac, about 2 minutes on the G5 iMac, and 55 seconds on the Intel 2.4GHz iMac.
As mentioned above, the Intel iMac is running many of the applications under Rosetta, so results from Word 2004, Photoshop CS2 and the older iLife applications don't compare very favorably to the results of the G5 iMac. In our Universal Binary application tests, we found the Intel Core 2 Duo iMac to be 45 percent faster than the G5 iMac in Photoshop CS3 tests and 42 percent faster in our iTunes MP3 encode tests.
And while those results are dramatic, I think the fact that the original iMac can't even run 11 of the 17 tests making up the current version of Speedmark is just as striking.
It's one thing to say that performance has progressed over the past decade; it's quite another to see the extent of that progress. And maybe it will finally convince my father-in-law that it's time to upgrade.

Thursday, August 14, 2008

Microsoft, Google, Yahoo Sued for Sex Selection Ads in India

Microsoft, Google and Yahoo were issued notices by India's Supreme Court on Wednesday, following a complaint that they were promoting techniques and products for the selection of an unborn child's sex through advertising and links on their search engines.
There is a deliberate attempt by these companies to target Indian users with advertisements that claim to help in the selection of a child's sex, said Sabu Mathew George, the petitioner in the case, in a telephone interview on Thursday.
The three companies were unavailable for comment, despite repeated phone calls to Yahoo in Bangalore, Google in Hyderabad and Microsoft in Delhi.
The advertisement of products and techniques to aid in the selection of an unborn child's sex is an offense under India's "The Pre-conception and Pre-natal Diagnostic Techniques (Prohibition of Sex Selection) Act".
In India, at least 900,000 unborn girls die each year through feticide, said George, a social activist associated with organizations fighting for the rights of young girls in India.
Because activists were able to effectively stop sex-selection advertising in the print medium, Indian and foreign advertisers have moved to the Internet, George said. Unlike the print medium, Internet search engines allow for very targeted advertising, he added.
"These companies are making money by breaking Indian laws," George said.
The country's Ministry of Health and Family Welfare and Ministry of Communications and IT have also been made respondents in this case, as they did not take any action against the three companies, although the offenses were brought to their notice, George said.
In India, search engines, video-sharing sites and social networking sites, including Google's Orkut and YouTube, have been sued for objectionable content or copyright violations.
Google has in the past objected to provisions in India's Information Technology Act 2000 that make intermediaries like ISPs (Internet service providers), website hosting companies, search engines, e-mail services and social networks liable for their users' content.
Section 79 of the Act holds network service providers liable unless they can prove that the offense or contravention was committed without their knowledge or that they had exercised all due diligence to prevent the commission of such offense or contravention.
"We don't hold the telephone company liable when two callers use the phone lines to plan a crime," Rishi Jaitly, a policy analyst at Google India said in a Google blog post in October.
"For the same reasons, it's a fundamental principle of the Internet that you don't blame the neutral intermediaries for the actions of their customers," Jaitly added.

Intel Drops Centrino Atom Brand After Five Months

Intel has dropped the Centrino Atom brand after just five months, opting instead to use just the Atom brand across this part of its product line.
"Basically, we are simplifying and coalescing our efforts around 'Atom' as the single brand for Internet devices," said Nick Jacobs, a company spokesman in Singapore.
Centrino Atom was the brand name given to a chip package formerly codenamed Menlow, which includes an Atom processor and a single-chip chipset. The package was designed for small, handheld computers that Intel calls Mobile Internet Devices, or MIDs. But that segment of the market has been slow to take off, with only a trickle of devices hitting the market since Intel launched Centrino Atom at the Intel Developer Forum conference earlier this year.
The Centrino Atom brand was mildly confusing to some observers. Intel's Centrino brand is closely associated with laptops, but Atom-based laptops -- sometimes called netbooks -- were not allowed to use the Centrino Atom brand since these devices used a different version of the Atom processor and a traditional two-chip chipset.
Hardware makers have been notified of the branding change, and MIDs will now be branded with stickers that say Atom, instead of Centrino Atom. The change comes as Intel prepares to roll out the Core brand for its upcoming Nehalem processor line, eventually replacing the Core 2 brand used with Intel's current top-end chips.

AOL Phisher Gets Seven-Year Sentence

A West Haven, Connecticut, man has been sentenced to seven years in prison for masterminding a phishing scheme that targeted AOL users over a four-year period.
Michael Dolan, 24, was sentenced Wednesday in Connecticut federal court. The seven-year sentence was the maximum he could have received, said Assistant U.S. Attorney Edward Chang via e-mail. Dolan was also sentenced to three years of supervised release and ordered to pay a US$200 special assessment, Chang added.
Last year Dolan pleaded guilty to fraud and aggravated identity theft charges.
Federal prosecutors had argued that he masterminded a scam in which he and five other men harvested thousands of AOL e-mail addresses and then infected victims' PCs with malicious software that would prevent them from logging on to AOL without entering their credit card numbers, bank account numbers and other personal information. The scam ran between 2002 and 2006, prosecutors said.
All of the defendants have pleaded guilty. Another defendant, Keith Riedel, is set to be sentenced Thursday.
Victims would receive fake e-mail greeting cards that would silently infect their computers with the log-on software, according to a grand jury indictment. They were also spammed with phony e-mail messages that claimed to have come from AOL's billing department.
"Due to a central server meltdown, your credit card information was lost," one such e-mail read. "In order to enjoy your AOL experience and keep your account active, you must enter your credit card information within 24 hours."
Some of the fake greeting cards claimed to come from Web sites such as Hallmark.com or BlueMountain.com. Proceeds from the crime were used to purchase gaming consoles, laptop computers and gift cards, the indictment states.
In court filings, Dolan's lawyer, Harold Pickerstein, had asked for a lighter sentence, saying that his client suffered from "severe mental illness" and had made poor decisions following his father's suicide. He argued that there were probably fewer than 50 victims of the scam, and that victim losses were closer to US$43,000 -- far less than the government argued. Pickerstein declined to comment further on the matter on Wednesday.
Chang painted a far different picture of the man, saying in a sentencing memorandum that Dolan had attempted to bribe a codefendant, threatened to kill someone he thought was a government informant, and suborned perjury from his girlfriend. "Michael Dolan is a born leader -- a leader of criminals," he wrote.
Dolan had previously admitted that the scam had netted more than $400,000 from 250 or more victims, Chang argued in the memorandum.
Before the AOL phishing charges, Dolan had been sentenced to two years of probation after pleading guilty to accessing a protected computer without authorization. He was later given nine months of jail time for violating his probation terms.

Is America Ready for Its First BlackBerry President?

The next U.S. president could shape cybersecurity, government research initiatives, intellectual property laws, and wired and wireless communications services in ways that affect both enterprise IT executives and average citizens. Yet some experts say he could handle all this without having Twittered, texted or even used a PC, although his familiarity with information technologies might strongly affect his policies.
Much has been made in the media of a seemingly wide gulf in tech smarts between Republican John McCain, who has described letting his wife handle his computer tasks, and Democrat Barack Obama, who has appeared on video using a BlackBerry while walking down the street.
President George W. Bush has been ridiculed for talking about rumors on "the Internets," but only one other president has even been in office in the age of the Web, laptops and ubiquitous cell phones. The expectation that a president "gets it" regarding IT is a fairly new one. Yet whatever the result of the November election, cutting-edge technology seems to be entering the realm of elected officials for good.
About half the members of the U.S. Senate use BlackBerry devices, estimates Louis Libin, CEO of unified communications vendor PhoneFusion, who has set up the networks for the past five Democratic and Republican nominating conventions. Mitt Romney, former Massachusetts governor and McCain rival, gave every member of his campaign team a BlackBerry and carried one himself, according to David Palmer, a senior consultant and technical architect at Molecular, a digital marketing company that handled Romney's Web site. On Monday, Obama announced via the microblogging site Twitter that he would alert his supporters via text message immediately after he chose his running mate for vice president.
Several observers in the IT industry and scholars who study the intersection of technology and policy say it will be important for future presidents to grasp IT issues.
"To the extent that we think the Internet economy ... is a major source of value-creating activity in the U.S. economy and a major source of social experience for lots of Americans, it seems to me really critical that the president understand what that's about," said Steven Weber, a professor of political science at UC Berkeley.
Some believe IT issues are so central that they should guide the choice of a chief executive.
"The president should come from a technology background and understand technology at a high level," said Avi Silberschatz, chairman of the Computer Science Department at Yale University. "It would be good if the person at the helm of the country had an appreciation of the changes taking place as we speak." One critical issue is spending on research and development, which has fallen in the past 10 years, leaving the U.S. trailing other countries, he said.
However, others say firsthand knowledge of something like social networking isn't necessary for a president to grasp its importance. Presidents have set policies on many technologies in the past without understanding them, Berkeley's Weber said.
"Is it more complex than a nuclear power plant? Hardly," Weber said. And a president who needed a rundown on Facebook could quickly summon founder Mark Zuckerberg to the Oval Office to explain it, he said.
Intimate experience with a technology might make a candidate appear more tech-savvy and better equipped to deal with IT policy issues. But in fact, a user's knowledge is only one way of understanding a technology, said Jason Hong, an assistant professor of computer science at Carnegie Mellon University. Hong believes he'll never use social networking the way today's undergraduates do, but his studies have taught him things the average user wouldn't know, he said.
It's similar to the way the U.S. trusts its presidents -- few of whom have led armies in battle -- to be commander in chief, said Robert Holleyman, president and CEO of the Business Software Alliance, which lobbies for the commercial software industry.
In any case, once the president takes office, he or she lacks the time not only to learn new technologies but to use them, PhoneFusion's Libin said.
"Anything to do with technology becomes a distraction at some level," Libin said. "I don't think we're going to find him on Facebook or chatrooms."
There are also built-in constraints to a leader's use of technology, Libin said. He predicted that a BlackBerry user who was elected president might continue using the device, but not in the same way. To minimize distractions, very few people, such as the president's spouse and closest advisers, would be able to send e-mail to the device. The BlackBerry Enterprise Server that enterprises normally use in delivering e-mail to the handsets might have to be augmented with more security. Also for security reasons, the device's coverage might sometimes be cut off by Secret Service jamming, Libin added. But the president might find it useful for keeping track of a frequently changing daily schedule, he said.
However, even though the president could make technology policy without having to rely on firsthand knowledge, that kind of perspective could help to shape his or her policies, some observers said.
People who don't use technology tend to see its benefits and overlook its dangers, said Tom Kellermann, vice president of security awareness at security testing software vendor Core Security and a member of the Commission on Cyber Security for the 44th Presidency. When confronted by a technology problem, novices often see more technology as the solution.
"A mechanical problem is not always solved by a mechanical widget" is a lesson that can only be learned by using technology, Kellermann said. In reality, the country's IT and communications infrastructure has security problems that require strategic thinking and proactive policy in addition to technological fixes, he said.
"His desire to deal with the problem ... will be motivated by his experience with the box," Kellermann said.
The benefit of a truly tech-savvy president to IT managers would be a long-overdue appreciation for the critical role they can play in keeping the country running and secure, Kellermann said. "The fact that they maintain and secure the IT infrastructure ... is ten times more important than anything you do," he said, comparing the importance of the role of IT managers to most other jobs.
Over-reliance on advisers can also be dangerous, said Harry Lewis, a longtime computer science professor at Harvard who teaches a course to help nontechnical people understand the digital world. An ideal president would have at least kept up with technology trends, such as social networking, by spending time with knowledgeable people over the years, Lewis said.
"It's not so much (the) use of it, but there's a level of familiarity. There's a sense of what it means for this amount of information about people's interconnections to be out there," Lewis said.
"If the president himself is not tech-savvy, there will be someone who will be, in effect, the person calling the shots on these issues of where technology meets public policy. And that, I think, would be pretty scary," Lewis said.
In that scenario, IT leaders would best spend their time trying to influence the choice of that adviser, he said.
"That person is going to be in a position to put things over on the president, basically, and therefore to put things over on the country," Lewis said.