Tuesday, July 29, 2008

SMIC Swimming in Red Ink, Hopes to Break Even in Q4

Semiconductor Manufacturing International (SMIC), China's largest chip maker, racked up another loss during the second quarter and restated its first-quarter results, nearly doubling the loss it earlier reported for that period.
SMIC reported that second-quarter revenue fell 8.5 percent year-over-year, to US$342.9 million. Meanwhile, the company's net loss widened to $45.6 million from $2.1 million a year earlier.
Company officials blamed the losses on the company's transition away from memory chips toward more profitable logic chips. "We are still transitioning from majority DRAM to pure logic in our Beijing facility," said Richard Chang, SMIC's chairman and CEO, in a conference call with analysts.
SMIC stopped producing DRAM during the second quarter, although the company has inventory of the chips that it plans to sell off. Most of that inventory should be sold off by the fourth quarter, Chang said.
SMIC hopes to break even in the fourth quarter, he said.
SMIC also restated its first-quarter results, taking a $105.8 million charge to write down the value of long-lived assets related to the company's move away from memory-chip manufacturing. The additional charge against SMIC's first-quarter earnings means the company's loss for that period increased to $224.9 million from $119.7 million previously.
In its first-quarter conference call with analysts, SMIC announced that a third party had been hired to evaluate whether an impairment charge would be required for these assets, and warned that any charge, if required, would be taken against its first-quarter earnings. The company did not estimate the size of the potential charge at that time.
Restatements of previous earnings results are rare, as generally accepted accounting principles (GAAP) usually require these types of charges to be recorded in the same period in which the company decides how much to write down. But GAAP is a framework, not a set of hard-and-fast rules, which means many accounting decisions depend on the judgment of a company's executives.
"We announced plans to exit the DRAM business in Q1, so we also have to take the impairment charge in Q1," said Theresa Teng, SMIC's head of finance and investor relations, explaining why the company opted to restate its first-quarter results.
The company does not expect to take further impairment charges this year.

China Telecom to Pay US$6.41 Billion for CDMA Business

China Telecom finalized a deal on Monday to buy China Unicom's CDMA business, another big step towards consolidation among Chinese telecom companies.
China Telecom is paying 43.8 billion yuan (US$6.41 billion) for the CDMA (Code Division Multiple Access) network, which had 43.17 million subscribers as of June 30. It will also take on 29.3 percent of China Unicom's total employees, the two companies said in a statement.
China Unicom is merging with China Netcom and plans to use the proceeds from the sale to expand its GSM (Global System for Mobile Communications) network, which had 127.6 million subscribers at the end of June. Unicom said it will also begin preparing to offer 3G services, but gave no timetable for when those services would begin.
The latest round of consolidation was announced in late May and will create three major carriers: China Netcom, China Mobile and China Telecom, each offering fixed-line, mobile and other services. Under the plan, China Unicom -- which was originally created as the state-run competitor to former monopoly service provider China Telecom -- will be folded into China Netcom, while China Mobile is acquiring China Tietong for its fixed-line network and China Telecom is taking on China Unicom's CDMA network.
Once the consolidation is complete, China's new telecom regulator, the Ministry of Industry and Information Technology, will issue 3G licenses. China Mobile is already publicizing its 3G service, which will use the domestically-developed TD-SCDMA (Time Division Synchronous Code Division Multiple Access) standard, and is providing 3G services to about 18,000 users during the Beijing Olympics, which begin August 8.

Toshiba to End Direct-to-handset Satellite Broadcasts

Toshiba will end its direct-to-handset satellite broadcasting service in Japan in March next year after several years of losses caused by poor consumer acceptance.
The service, operated by Toshiba subsidiary Mobile Broadcasting Co. (MBCO), drew headlines when it first went on the air in October 2004 as the world's first direct-to-handset satellite broadcasting service.
A dedicated satellite broadcasts a signal in the 2.6GHz S-band that is strong enough to be received by an antenna built into a portable terminal, so no dish is required. The signal can be received anywhere within view of the satellite, and city areas obscured from it by tall buildings are covered by gap-filler transmitters.
MBCO currently delivers seven video channels and 40 audio channels, and while the service has been a technical success, it has failed badly in the marketplace. When it launched, Toshiba hoped to attract 1.5 million users within the first three years of service, but there are only around 100,000 subscribers today, almost four years on.
The service was hobbled at the start by the need to buy a dedicated terminal. In contrast, TU Media, which operates a similar service in South Korea using the same satellite as MBCO, signed up 200,000 subscribers in less than three months, thanks in part to the service being integrated into several cell phone handsets. More recently, MBCO has faced tough competition from digital terrestrial TV, which delivers Japan's major broadcast networks to cell phone handsets at no cost.
Closing the service will cost Toshiba around ¥25 billion (US$233 million) and the full impact on its business forecast for the current fiscal year is under review, it said.

Saturday, July 26, 2008

Gateway to Stop Selling PCs Through Web Site

Gateway on Friday said it would stop selling PCs through its Web site, instead focusing on selling PCs through third-party stores and other online retailers.
Gateway has been selling PCs through partner retail stores since 2004, but it is now cutting direct online sales to consumers in order to cut costs and align its business model with parent company Acer, the company said. Acer last year acquired Gateway for US$710 million in an effort to boost its consumer presence in North America.
The change has resulted in some staff cuts, said Lisa Emard, a Gateway spokeswoman. "These reductions have been happening in small waves as the company has methodically evaluated each department and function," she said.
The transition away from direct sales is happening over the upcoming weekend, Emard said. Consumers will be able to purchase products off Gateway's Web site until Saturday evening.
The change could help Gateway better compete with rivals Hewlett-Packard and Dell, said David Daoud, research manager at IDC. And since the indirect model has worked well for Acer, the company may be hoping that the same approach can boost Gateway sales.
Daoud suggested that the decision to kill off Gateway's online sales could mean that Acer may replace some Gateway brands with its own.
Some of Gateway's brands, like eMachines, which has a strong consumer presence, may conflict with Acer's offerings, Daoud said. By ending Gateway's online sales -- and potentially, in the future, retiring some Gateway brands -- Acer may simply be consolidating its brands, which could raise awareness of the Acer name.
However, Emard said that Acer is focusing different brands on different sectors, and all of Gateway's brands will continue to be offered.
Acer offers four brands worldwide -- Acer, eMachines, Gateway and Packard Bell -- with Gateway products currently available through retailers in Japan, China, Mexico, Canada and the U.S., Emard said.
"While there is indeed some crossover today, you're going to see Acer moving upstream with its product line and offering more high-performance products featuring advanced technology," Emard said.
Since the acquisition, Gateway has helped Acer boost its U.S. presence, where it was running neck and neck with Apple as the third-largest PC vendor. The combined company sold 1.3 million units, good for a 7.8 percent market share and a 49.9 percent increase over last year's third quarter.

Google Counts More Than 1 Trillion Unique Web URLs

In a discovery that would probably send the Dr. Evil character of the "Austin Powers" movies into cardiac arrest, Google recently detected more than a trillion unique URLs on the Web.
This milestone awed Google search engineers, who are seeing the Web growing by several billion individual pages every day, company officials wrote in a blog post Friday.
In addition to announcing this finding, Google took the opportunity to promote the scope and magnitude of its index.
"We don't index every one of those trillion pages -- many of them are similar to each other, or represent auto-generated content ... that isn't very useful to searchers. But we're proud to have the most comprehensive index of any search engine, and our goal always has been to index all the world's data," wrote Jesse Alpert and Nissan Hajaj, software engineers in Google's Web Search Infrastructure Team.
It had been a while since Google had made public pronouncements about the size of its index, a topic that routinely generated controversy and counterclaims among the major search engine players years ago.
Those days of index-size envy ended when it became clear that most people rarely scan more than two pages of Web results. In other words, what matters is delivering 10 or 20 really relevant Web links, or, even better, a direct factual answer, because few people will wade through 5,000 results to find the desired information.
It will be interesting to see if this announcement from Google, posted on its main official blog, will trigger a round of reactions from rivals like Yahoo, Microsoft and Ask.com.
In the meantime, Google also disclosed interesting information about how and with what frequency it analyzes these links.
"Today, Google downloads the web continuously, collecting updated page information and re-processing the entire web-link graph several times per day. This graph of one trillion URLs is similar to a map made up of one trillion intersections. So multiple times every day, we do the computational equivalent of fully exploring every intersection of every road in the United States. Except it'd be a map about 50,000 times as big as the U.S., with 50,000 times as many roads and intersections," the officials wrote.

Microsoft Bolsters Ruby Efforts

Microsoft on Thursday is set to delve deeper into Ruby programming, with plans to ship Ruby libraries and participate in a testing project for the language.
The libraries are akin to any other software libraries: prewritten code that helps developers build applications.
The company at the O'Reilly Open Source Convention (OSCON) also will announce intentions to participate in the RubySpec project, which features a standard test suite used to define a compliant Ruby implementation.
In a prepared statement, Microsoft's John Lam, program manager for the company's Dynamic Language Runtime team, stressed the company's Ruby backing.
"All of these [OSCON] announcements underscore our commitment to listening to customer feedback and ensuring that we are true to Ruby as a language while still bringing the full benefits of .Net programming to the Ruby user base," Lam said.
IronRuby, a version of Ruby for Microsoft's .Net platform, is in development at the company, which as of Wednesday morning had not yet announced a release date for the 1.0 version.
Also at OSCON, Microsoft will unveil IronRuby-Contrib, a Microsoft Public License (Ms-PL) open source project for collaborative development of code supporting IronRuby or the underlying platform, but not part of the IronRuby distribution. An example of such a project would be the Ruby on Rails plug-in built to make it easier for Rails developers to add Microsoft's Silverlight rich Internet application technology to their applications, a Microsoft representative said.
Under Ms-PL, licensees can change source code and share it with others. They also can charge a licensing fee for modified work. Microsoft uses this license most commonly for developer tools, applications, and components.
Though often criticized by open-source advocates, Microsoft has nonetheless established a presence at OSCON this week, including its sponsorship of the Participate08 session, which focused on boosting dialogue about open source and other collaborative communities.
On Friday at OSCON, Sam Ramji, Microsoft director of platform strategy, is scheduled to present on "Open Source Heroes." His brief talk will cover Microsoft community participation and ways in which Microsoft plans to contribute during the next 10 years of open-source development, according to the conference program.

Infineon Lays off 3,000, Reorganizes Divisions

German semiconductor company Infineon on Friday said it would lay off 3,000 employees as part of a cost-reduction program to bring the company back to profitability.
The company blamed the layoffs on "adverse foreign exchange rate development and the requirements of the reorganization of the company," saying that headcount reduction was "inevitable."
As part of the cost-reduction program, called IFX 10, Infineon is also looking to cut manufacturing costs and reorganize divisions.
"Within five quarters, we expect to realize at least €200 million (US$313.6 million) in annualized savings that should pave the way for continued profitability," said Peter Bauer, Infineon's CEO, in a statement.
Infineon will remove unprofitable product families and reorganize its business into five divisions: automotive, chipcard and security, industrial and multimarket, wireline communications, and wireless solutions, the company said.
The reorganization was originally announced in late May. Infineon's CEO at the time, Wolfgang Ziebart, resigned from his post, citing differences of opinion on the "future strategic orientation" of the company.
The layoffs were announced alongside Infineon's earnings for the third quarter of fiscal 2008. The company reported quarterly revenue of €1.03 billion, up 2 percent year-over-year, and a net loss of €592 million. The loss included €411 million in charges related to Qimonda AG, a memory company that Infineon spun off in 2006.
Qimonda this week reported a net loss of €401 million in the quarter ending June 30, blaming it partly on a decline in average selling prices of chips. Memory makers have been feeling the pinch since late last year, posting losses amid competitive pricing and an oversupply of chips.

Microsoft: Stodgy or Innovative? It's All About Perception

When many people think of Microsoft, they think of a stodgy old corporation churning out boring PC software.
But is that image accurate?
Some analysts say no, and at Thursday's annual Microsoft analyst get-together they urged executives to do more to improve the company's image and to let the wider world know that it is developing great new products and services.
At the meeting, Craig Mundie, chief research and strategy officer, showed off a futuristic application for Surface, Microsoft's multitouch tabletop computer. He virtually entered an art gallery on a downtown Seattle street, browsing through items that he could pick up and spin around to look at them from all directions.
In another demonstration, he took a photograph of a street and his handheld computer identified it in real time and began displaying information about shops on the street, including information about table availability in a restaurant.
After the demo, one analyst commented to Mundie that the technology looked great but that the rest of the world doesn't get to see such demonstrations, and he urged Mundie to spread the word so that people will perceive Microsoft as the innovative company that it is, rather than as a legacy software vendor.
Mundie pledged to do just that. "That is a commitment I can make to you and to shareholders," he said. For years, he and Microsoft founder Bill Gates spent a lot of time on the road talking about Gates' vision of the future, he said. "Over the last few years, both of us got out of the habit of going out and talking about it. I think we share your observation that we haven't done a great job in recent years communicating about the tremendous things this company does."
As Mundie and others begin talking more about innovations in the pipeline, however, the company runs the risk of being accused of marketing "vaporware," a criticism it has faced in the past. In fact, Microsoft has been accused of announcing its work on technologies very early as a way to discourage other companies from developing competing products.
But Microsoft needs to address the perception problem, which runs deep and could hurt sales of future products if left unfixed. Executives showed just how real the problem is by playing a brief video from a recent customer study conducted by the company. Microsoft chose participants who still use Windows XP and who said they weren't interested in upgrading to Vista because of its bad reputation. Microsoft offered to show them the next version of the operating system to see if they might be interested in it when it comes out.
The people loved the future version and said they'd definitely upgrade. Then they learned that the software they loved was actually Vista, not some future version of the operating system.
Perhaps with that video containing the user comments in mind, another analyst at the meeting asked Microsoft executives how the company expects to be able to sell Windows 7, the next version of the operating system, when people have such a poor perception of Vista. Executives didn't have a great reply, beyond assuring the audience that the problems that plagued Vista at its initial launch are now fixed.
Vista initially had serious compatibility problems but SP1 largely fixed the problems, so with Windows 7, Microsoft "takes that issue effectively off the table," said Bill Veghte, senior vice president of the online services and Windows business group. Starting later this year, his team plans to spend a lot of time spreading the word about Windows 7 and explaining that it won't encounter the same issues that Vista faced, he said.
The perception problem stretches into the online services market, where Microsoft has struggled to attract users. Another analyst at the meeting asked executives if they planned to make changes to the company's online branding and offer a single place where end-users could discover that some of Microsoft's online tools are better than the competition. Currently, Microsoft offers a host of online services, including maps, blogs, e-mail and instant messaging. But the services are difficult to find, sometimes available under different brands including Live and MSN.
CEO Steve Ballmer assured the crowd of analysts that the company is working on streamlining its online brand and developing a single page where people can find all available Microsoft online services. The page will predominantly feature a search bar, since that's an opportunity for revenue, but it will also display content tailored for each user, he said.

Wednesday, July 23, 2008

San Francisco's Mayor Gets Back Keys to the Network

San Francisco Mayor Gavin Newsom met with jailed IT administrator Terry Childs on Monday, persuading him to hand over the administrative passwords to the city's multimillion-dollar wide area network.
Childs made headlines last week when he was arrested and charged with four counts of computer tampering after he refused to hand over passwords to the Cisco Systems switches and routers on the city's FiberWAN network, which carries about 60 percent of the municipal government's network traffic. Childs, who managed the network before his arrest, has been locked up in the county jail since July 13.
On Monday afternoon, he handed the passwords over to Mayor Newsom, who was "the only person he felt he could trust," according to a declaration filed in court by his attorney, Erin Crane. Newsom is ultimately responsible for the Department of Telecommunications and Information Services (DTIS), where Childs had worked for the past five years.
Mayor Newsom secured the passwords without first telling DTIS about his meeting with Childs, according to DTIS chief administrative officer Ron Vinson, who added, "We're very happy the mayor embarked on his clandestine mission."
The department now has full administrative control of the network, he said in an interview Tuesday night.
It's likely that Childs had a lot to tell the mayor when the two met.
Childs' attorney has asked the judge to reduce Childs' US$5 million bail, describing her client as a man who felt himself surrounded by incompetents and supervised by a manager he believed was undermining his work.
"None of the persons who requested the password information from Mr. Childs ... were qualified to have it," she said in a court filing.
Childs intends to disprove the charges against him but also "expose the utter mismanagement, negligence and corruption at DTIS, which, if left unchecked, will in fact place the City of San Francisco in danger," his motion reads.
Vinson dismissed the allegations. "In Terry Childs' mind, obviously he thinks the network is his, but it's not. It's the taxpayers'," he said. "The reason he's been sitting in jail is because he denied the department and others access to the system."
The court filings help explain just how this happened.
According to an affidavit from James Ramsey, an inspector with the San Francisco Police Department, he and other investigators discovered dial-up and DSL (digital subscriber line) modems that would allow an unauthorized connection to the FiberWAN. He also found that Childs had configured several of the Cisco devices with a command that would erase critical configuration data in the event that anyone tried to restore administrative access to the devices, something Ramsey saw as dangerous because no backup configuration files could be found.
This command, called "no service password-recovery," is often used by engineers to add an extra level of security to networks, said Mike Chase, regional director of engineering with FusionStorm, an IT services provider that supports Cisco products.
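For readers unfamiliar with the setting Chase describes, a minimal sketch of how an engineer might apply it follows. The hostname and prompts are illustrative, but "no service password-recovery" is the actual Cisco IOS global configuration command; once set, attempting password recovery from ROMMON wipes the startup configuration, which is why the reported absence of backups made the situation dangerous.

```
Router# configure terminal
Router(config)# no service password-recovery
Router(config)# end
Router# copy running-config startup-config
```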
But without access to either Childs' passwords or backup configuration files, administrators would essentially have to reconfigure the entire network, an error-prone and time-consuming undertaking, Chase said. "It's basically like playing 3D chess," he said. "In that situation, you're stuck interviewing everybody at every site, getting anecdotal stories of who's connected to what. And then you're guaranteed to miss something."
Without the passwords, the network would still continue to run, but it would be impossible to reconfigure the equipment. The only way to restore these devices to a manageable state would be to knock them offline and then reconfigure them, something that would take weeks or months to complete, disrupt service and cost the city "hundreds of thousands, if not millions of dollars," Ramsey claims.
Crane argues that these devices were installed with management's permission and were critical to the smooth functioning of the network: they would page Childs when the system went down and allow him to remotely access the network from his personal computer in an emergency.
In interviews, current and former DTIS staffers described Childs as a well-respected co-worker who may have gone too far under the pressure of working in a department that had been demoralized and drastically cut as the city moved forward with plans to decentralize IT operations.
About 200 of the department's 350 IT positions had been cut since 2000, mostly to be relocated to other divisions within city government, said Richard Isen, IT chapter president with Childs' union, the International Federation of Professional and Technical Engineers, Local 21.
Despite his conflict with some in the department, Childs has a lot of support there, Isen said. "There is a lot of sympathy, only because there is a basic feeling that management misunderstands what we actually do and doesn't appreciate the complexity of the work."

NAND Flash Memory Downturn to Continue

A global glut of NAND flash memory chips, which store songs, photos and other data in gadgets from iPods to digital cameras, will continue for at least the next few months because companies have been slow to rein in production, according to DRAMeXchange Technology.
The market researcher, which is based in the heartland of the global memory spot market in Taipei, predicts the NAND flash supply will grow 149 percent this year despite worsening prices for the chips. The problem is that chip makers such as Samsung Electronics, Hynix Semiconductor and SanDisk's partner, Toshiba, have not moved fast enough to cut production.
The good news for users is that companies will be able to offer more NAND flash storage capacity for a lower price, or offer better deals on existing products such as flash memory cards and MP3 players. Low NAND flash prices could also spur companies to lower prices on hot products such as SSDs (solid state drives) in hopes of growing the market for the drives.
Prices of NAND flash memory dropped 20 percent on average in the month of June, DRAMeXchange said, and an upturn for the market may not be in the offing until as late as September.
The NAND flash market has been so bad that flash-storage pioneer SanDisk on Monday reported a surprise loss of US$68 million for the second quarter. The company blamed the supply glut for its problems, pointing out that it sold a record amount of flash -- 120 percent more than in the same period last year -- but that prices are down 55 percent over the same span.
SanDisk also said NAND flash prices may worsen in the third quarter. The company's Nasdaq-listed stock fell US$4.31, or 24 percent, to end Tuesday at $13.62 as a result of its earnings news.
To counter the deteriorating market, SanDisk will delay the start of production at a new joint venture chip factory until April 2009 and put plans for another factory on hold until market conditions improve.
Credit Suisse analyst John Pitzer said in a report that SanDisk's plans to delay new production lines are a positive for the NAND flash industry and that rivals are likely to follow. SanDisk and partner Toshiba account for around a third of the global NAND flash supply, he noted.

Sun Moves to Indirect Sales for Most US Customers

Sun Microsystems is moving to an indirect sales model in the U.S. for all but about 300 of its largest customers, a step designed to help boost its flagging revenue.
The change means customers who aren't among Sun's biggest U.S. accounts from a revenue perspective will be switched to one of its reseller partners in the coming months, said Tom Wagner, vice president of Sun's North America partner sales organization, in an interview on Tuesday.
"Effectively we're going to go 100 percent 'channel' below the top 300 or so accounts," he said. That means Sun will depend wholly on its partners to generate leads, architect systems, close deals and provide much of the support and services for those customers.
The move will likely be welcomed by Sun's 600 or so channel partners in the U.S. because they will no longer be competing with Sun for business. Sun believes it will give them more motivation to attack areas of the market where Sun's "share of the wallet" is low today, and allow Sun to scale its sales efforts to target those accounts, Wagner said.
It is less clear how the move will be received by customers. "At the end of the day it'll be a 'wait and see' in terms of the customer reaction," Wagner said.
"We have a portfolio of partners who play pretty high up in the value stack and who we believe can provide quality technical support and system engineering resources," he said. But he acknowledged that some customers may have "very specific demands about how we handle their accounts."
"We'll have to deal with that when it comes to it," he said.
The so-called Partner First initiative is limited to the U.S. today and Sun didn't announce any plans to extend it overseas. Companies will sometimes try a new strategy in one region and roll it out worldwide if it's successful.
Sun does about two-thirds of its business through channel partners today and the proportion outside the top 300 accounts is roughly the same, Wagner said. "We're turning over what we believe is a fairly significant amount of our existing business" to the channel, he said.
The plan was announced internally on July 11 and relayed to Sun's partners through a conference call last week, Wagner said. The goal is to complete the transition by the end of this quarter or early next, which means by September or October.
The change comes at a time when Sun is struggling to grow its business as fast as competitors. Last week it announced that revenue for the June quarter will probably be lower than what it reported a year ago, although the preliminary figures were roughly in line with analyst estimates. It will report its full results on Aug. 1.
Sun is also restructuring and announced in May that it would lay off 7 percent of its workforce, or about 2,500 staff. Wagner said he couldn't comment on whether the new sales plan is related to the layoffs, but one industry analyst said that's likely to be the case.
"This will help to streamline their operations. It will result in lower headcount," said Dan Olds, principal analyst at Gabriel Consulting Group, in Beaverton, Oregon.
Sun may also end up handing over a larger proportion of its professional services revenue to the channel, Olds said.
"The challenge will be ensuring that they get the shelf space with these partners, and that they invest enough to make sure they're well represented in the field," he said.
Sun will make "targeted investments" in partners or recruit new ones as necessary, Wagner said. The company is also changing the way it supports its partners: in the past, managers were allocated to a particular region and had little incentive to help partners grow their businesses elsewhere in the country. Now, Sun's managers have an incentive to help the partners they manage nationwide.
Wagner wouldn't be pinned down on which areas Sun hopes to get more business from. The company is strongest today in the telecommunications, financial and federal government sectors, and is pursuing a bigger share of the healthcare and education markets, as well as that for mid-market customers.
"We believe we have a value proposition for just about anyone out there," Wagner said.

With DNS Flaw Now Public, Attack Code Imminent

One day after a security company accidentally posted details of a serious flaw in the Internet's Domain Name System (DNS), hackers are saying that software that exploits this flaw is sure to pop up soon.
Several hackers are almost certainly already developing attack code for the bug, and it will most likely crop up within the next few days, said Dave Aitel, chief technology officer at security vendor Immunity. His company will eventually develop sample code for its Canvas security testing software too, a task he expects to take about a day, given the simplicity of the attack. "It's not that hard," he said. "You're not looking at a DNA-cracking effort."
The author of one widely used hacking tool said he expected to have an exploit by the end of the day Tuesday. In a telephone interview, HD Moore, author of the Metasploit penetration testing software, agreed with Aitel that the attack code was not going to be difficult to write.
The flaw, a variation on what's known as a cache poisoning attack, was announced on July 8 by IOActive researcher Dan Kaminsky, who planned to disclose full details of the bug during an Aug. 6 presentation at the Black Hat conference.
That plan was thwarted Monday, when someone at Matasano accidentally posted details of the flaw ahead of schedule. Matasano quickly removed the post and apologized for its mistake, but it was too late. Details of the flaw soon spread around the Internet.
And that's bad news, according to Paul Vixie, president of the Internet Systems Consortium, the dominant maker of DNS software. Vixie, like others who were briefed on Kaminsky's bug, did not confirm that it had been disclosed by Matasano. But if it had been, "it's a big deal," he said in an e-mail message.
The attack can be used to redirect victims to malicious servers on the Internet by targeting the DNS servers that serve as signposts for all of the Internet's traffic. By tricking an Internet service provider's (ISP's) servers into accepting bad information, attackers could redirect that company's customers to malicious Web sites without their knowledge.
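The race behind this kind of attack is easy to sketch. In classic blind cache poisoning, the attacker floods a resolver with forged responses, hoping one of them carries the 16-bit transaction ID the resolver is waiting for. The toy simulation below (illustrative only; real resolvers also match on source address, port and question) shows why any single race rarely succeeds, and hence why an attack that can retry indefinitely is so much more dangerous:

```python
import random

class ToyResolver:
    """Toy resolver: accepts the first response whose 16-bit transaction
    ID matches the outstanding query's ID, then caches the answer.
    Pre-patch resolvers relied on little more than this ID check."""

    def __init__(self):
        self.cache = {}

    def query(self, name, responses):
        txid = random.randrange(65536)      # random 16-bit transaction ID
        for resp_txid, ip in responses:     # forged responses race the real one
            if resp_txid == txid:           # first matching ID is believed
                self.cache[name] = ip
                return ip
        return None                         # no forgery matched this time

def spoof_attempt(resolver, name, n_forged, evil_ip="203.0.113.66"):
    """One race: the attacker fires n_forged responses with random IDs."""
    forged = [(random.randrange(65536), evil_ip) for _ in range(n_forged)]
    return resolver.query(name, forged) == evil_ip

random.seed(1)
resolver = ToyResolver()
races = 10_000
wins = sum(spoof_attempt(resolver, "www.example.com", 100) for _ in range(races))
print(f"poisoned {wins} of {races} races with 100 forged packets each")
```

With 100 guesses against 65,536 possible IDs, each race succeeds well under 1 percent of the time; the danger lies in how many races an attacker can force.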
Although a software fix is now available for most users of DNS software, it can take time for these updates to work their way through the testing process and actually get installed on the network.
"Most people have not patched yet," Vixie said. "That's a gigantic problem for the world."
Just how big of a problem is a matter of some debate.
Neal Krawetz, owner of computer security consultancy Hacker Factor Solutions, took a look at DNS servers run by major ISPs earlier this week and found that more than half of them were still vulnerable to the attack.
"I find it dumbfounding that the largest ISPs ... are still identified as vulnerable," he wrote in a blog posting. "When the [hackers] learn of the exploit, they will go playing. They are certain to start with the lowest hanging fruit -- large companies that are vulnerable and support a huge number of users."
He expects that users will see attacks within weeks, starting first with test attacks, and possibly even a widespread domain hijacking. "Finally will be the phishers, malware writers and organized attackers," he wrote in a Tuesday e-mail interview. "I really expect these to be very focused attacks."
Most ISPs will have probably applied the patch by the time any attacks start to surface, and that will protect the vast majority of home users, said Russ Cooper, a senior information security analyst with Verizon Business. And business users who use secure DNS-proxying software will also be "pretty much protected" from the attack at their firewall, Cooper said.
"If anyone actually tries to exploit this, the actual number of victims will end up being extremely small," he predicted.
HD Moore said he didn't exactly see things that way. Because the flaw affects nearly all of the DNS software being used on the Internet, he said that there could be lots of problems ahead.
"This is a bug we'll be worrying about a year from now," he said.

Yahoo's Profit Down in Q2

Yahoo reported a modest revenue increase and a considerable drop in profit for its second quarter, along the way missing Wall Street's expectations in both categories, results that are unlikely to please its nervous shareholders.
Although Yahoo managed to defuse Carl Icahn's proxy fight this week, a rare victory in its months-long, tumultuous sparring match with shareholders and suitor Microsoft, its results for the quarter ended June 30, 2008, will probably do little to dispel doubts over its ability to survive as an independent company.
"We believe it is more efficient for Yahoo to be acquired. Scale is a competitive advantage. As a result, a combined Yahoo and Microsoft makes a great deal of sense," financial analyst Clayton Moran from Stanford Group Company said in an e-mail interview after the results were released.
Asked whether he sees Yahoo as being on the right track or not, Moran, who has a "Hold" recommendation on the stock and a 12-month target of $24 per share, said: "Yahoo is struggling with no clear solution to reignite growth."
Yahoo had revenue of US$1.798 billion, a 6 percent increase from 2007's second quarter, the company announced Tuesday. Deducting the commissions it pays to its ad network publishers, Yahoo had revenue of $1.346 billion, up 8 percent but short of the $1.374 billion consensus expectation from financial analysts polled by Thomson Financial.
Net income fell to $131 million, or $0.09 per share, from $161 million, or $0.11 per share, in 2007's second quarter.
On a pro forma basis, which excludes one-time items, net income was $139 million, or $0.10 per share, a penny short of analysts' consensus expectation. Yahoo had pro forma net income of $163 million, or $0.12 per share, in 2007's second quarter.
Still, Yahoo's top executives repeatedly said, during a conference call to discuss the results, that they were pleased with Yahoo's performance considering the challenges it has faced, including adverse economic conditions and the distractions of the Microsoft acquisition bid and the strident controversies it has generated.
"We're executing and delivering against the strategy we laid out, even under extraordinary conditions," said CEO Jerry Yang.
CFO Blake Jorgensen said the conversion of joint broadband deals with AT&T and Rogers Communications to a revenue-sharing format, in late 2007 and early 2008, has hurt Yahoo's revenue growth this year.
Yahoo also said it saw economic conditions affect advertising revenue, especially in categories such as finance, travel and retail.
Yahoo, which has been struggling on the financial and technology fronts for the past two years, has been embroiled in a corporate soap opera since Microsoft announced a bid to acquire the company in February.
That bid collapsed in May, leading to accusations from shareholders, including Icahn, that Yahoo's managers and board had purposely sabotaged the negotiations in order to protect their own financial interests, violating their fiduciary duty to shareholders.
Yahoo's management and board have denied the accusations, which have led to shareholder lawsuits, saying they negotiated in good faith and that ultimately it was Microsoft's decision to walk away. In the meantime, Yahoo has seen a steady parade of high-profile executives leave the company in recent months.
Yahoo this week managed to reach an agreement with Icahn, who had proposed an alternate slate of director candidates for the Aug. 1 shareholder meeting in order to unseat the entire board. By expanding the board and granting Icahn three seats, Yahoo convinced the billionaire investor to call off the plan. Icahn had indicated previously that his intention was to unseat Yang as Yahoo CEO and attempt to lure Microsoft back to the negotiating table, a possibility that now seems remote.
The proxy-contest settlement "eliminates the distractions and allows us to move forward," Yang said.
An attempt by Microsoft to acquire Yahoo's search advertising business also fell through, as Yahoo instead opted for an alternate deal to outsource part of that business to rival Google.
The deal with Google raised eyebrows, since Google's dominance in search advertising is a big reason why Yahoo has struggled financially. Search advertising makes up about 40 percent of all online ad spending, and Google has a stranglehold on that segment of the market.
By comparison to Yahoo, Google last week reported second-quarter revenue of US$5.37 billion, up 39 percent over the same quarter last year. Almost all of Google's revenue comes from search advertising. It earned $4.63 per share.
The Yahoo/Google search ad outsourcing deal is being reviewed by U.S. regulators and hasn't been implemented yet.
Yahoo has said the deal with Google will give it a revenue boost while allowing Yahoo to continue honing its search advertising business, a key component of a broad advertising strategy that also includes display ads, an area where Yahoo traditionally has been strong.
President Sue Decker said Yahoo is focusing on innovating in search technology, as opposed to trying to replicate the current models, because the company believes the search experience can be greatly improved.
For the third quarter, Yahoo expects revenue in the range of $1.78 billion to $1.98 billion, and for the full year between $7.35 billion and $7.85 billion. For the full-year forecast, Yahoo raised its minimum outlook from $7.20 billion and dropped its maximum outlook from $8 billion. That full-year forecast excludes the impact of certain items, such as a round of layoffs in the first quarter and costs associated with the Microsoft acquisition bid.
Yang said during the call that his management team and the board are focused on increasing shareholder value and are open to any alternative that advances that goal.
Judging by Yahoo's stock performance lately, it has its work cut out for it. Yahoo's stock closed at $21.40 on Tuesday, down 1.25 percent. During the time of Microsoft's bid, Yahoo's stock once closed at nearly $30. Microsoft's last offer for Yahoo was for $33 per share, but Yahoo wanted $37 per share, at which point Microsoft walked away in early May.

Brocade Deal to Help Drive Data-center Transition

Brocade Communications Systems' planned US$3 billion acquisition of Foundry Networks is a major strategic move in a brewing war over the future of data-center connectivity, industry analysts said Tuesday.
The deal, expected to close in the fourth quarter, would combine a maker of Fibre Channel SAN (storage area network) switches for data centers and a specialist in enterprise Ethernet LANs, two technologies that are headed toward a merger themselves.
The future of data centers lies with Ethernet, because it's relatively inexpensive, keeps scaling up to higher speeds and is ubiquitous throughout the rest of enterprise networks, analysts say. Virtualization and data-center consolidation are helping to drive the need for Ethernet's growing speeds. The idea is to create a "unified fabric" that spans both the data center at the enterprise's core and the LAN where client systems are located. But there are two main ways to bring Ethernet to data centers with the features needed there.
Both Brocade and Cisco are pushing FCoE (Fibre Channel over Ethernet), a standard being developed in the INCITS T11 committee and expected later this year, which would combine characteristics of both systems. By mapping Fibre Channel traffic over Ethernet networks, it will let enterprises take advantage of Ethernet speeds of 10G bps (bits per second) and up while keeping the latency, security and traffic-management benefits of Fibre Channel. FCoE will also smooth the migration to Ethernet by letting the two technologies coexist in a single switch, so existing SANs can stay.
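As a rough illustration of the mapping the analysts describe, FCoE carries an unmodified Fibre Channel frame as the payload of an ordinary Ethernet frame, identified by FCoE's own EtherType. The sketch below is deliberately simplified and not wire-accurate; the real FCoE header adds version and reserved fields, and the start/end-of-frame delimiter byte values used here are illustrative stand-ins for the encodings defined in the FC-BB-5 specification:

```python
import struct

FCOE_ETHERTYPE = 0x8906  # EtherType assigned to FCoE

def encapsulate_fc_frame(dst_mac: bytes, src_mac: bytes, fc_frame: bytes) -> bytes:
    """Wrap an unmodified Fibre Channel frame in an Ethernet frame.
    Layout is simplified: real FCoE adds version/reserved fields and
    uses encoded SOF/EOF ordered sets (see FC-BB-5)."""
    eth_header = struct.pack("!6s6sH", dst_mac, src_mac, FCOE_ETHERTYPE)
    sof, eof = 0x2E, 0x41   # illustrative start/end-of-frame delimiter bytes
    return eth_header + bytes([sof]) + fc_frame + bytes([eof])

frame = encapsulate_fc_frame(
    dst_mac=b"\x0e\xfc\x00\x01\x02\x03",   # hypothetical fabric-assigned MAC
    src_mac=b"\x0e\xfc\x00\x0a\x0b\x0c",
    fc_frame=b"\x00" * 36,                 # placeholder FC frame bytes
)
print(f"{len(frame)}-byte Ethernet frame, EtherType 0x{frame[12]:02x}{frame[13]:02x}")
```

One practical consequence of this design: a full-sized Fibre Channel frame exceeds the standard 1,500-byte Ethernet payload, so FCoE deployments generally require switches that support larger, "baby jumbo" frames.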
The alternative is iSCSI (Internet Small Computer System Interface), which some smaller enterprises have adopted because it can be used with conventional Ethernet switches and without in-house Fibre Channel expertise, said Bob Laliberte of Enterprise Strategy Group. Its main proponents have been storage vendors, he said.
Although it will take years for current Fibre Channel SANs to be replaced, one of the two is likely to win out, analysts said.
"There's a major religious war between FCoE and iSCSI," said Burton Group analyst Dave Passmore. They represent completely different technical approaches to combining Ethernet and storage transport protocols. "Reasonable people will disagree," he said.
Like Fibre Channel, FCoE does not use TCP/IP (Transmission Control Protocol/Internet Protocol), the basic communication protocols of the Internet; instead it relies on other mechanisms, such as lossless Ethernet flow control, to provide the reliability TCP would otherwise supply. Of the two approaches, only FCoE requires expensive, specialized switches, Passmore said, but it's more attractive to many organizations because it allows for a smoother transition from existing architectures.
Enterprises could eventually lose out by choosing the technology that loses, but FCoE and iSCSI will probably coexist for years, Passmore said.
A unified fabric could save users money as well as complexity, Passmore said. For example, instead of having one network connection to the LAN and another to the SAN that it taps into for data, a blade server could have just one set of connections.
"That would greatly simplify the user's network infrastructure and require fewer switches," Passmore said.
Security is the main potential concern about having a common type of network across data centers and LANs, he said. Having two completely different networks as is traditionally done has built-in security benefits. But costs and benefits always have to be balanced in adopting new technologies, he said.
Brocade's purchase of Foundry will create a second powerful vendor of FCoE, said Yankee Group analyst Zeus Kerravala. So far, Cisco has been the only company with both the vision and the technology to create a unified fabric, he said. Brocade had the vision and now is gaining the Ethernet goods, Kerravala said.
"If the concept of unified fabric really does come true, there are really only two vendors," Kerravala said.

Tuesday, July 22, 2008

Printer Ink: How Do You Define 'Empty'?

Steve Bass finds 20 percent of the ink he paid for left in supposedly empty cartridges, but Brother has a logical (if not legal) explanation.
"I'm out of ink. Feed me." That was what my Brother 640CW multifunction printer demanded recently. I checked and there was still enough fluid in its cartridge for goodness knows how many more pages.
I examined all three allegedly empty cartridges--cyan, yellow, and magenta. From top to bottom, they measured 1 1/8 inches. There was still roughly 1/4 inch of fluid at the bottom of each one. That's about a fifth of the cartridge's capacity, so my loss in ink was roughly $2.25 per cartridge. That's not exactly big bucks, but enough to make me feel like I was being scammed. (Oh, right, what printer manufacturer would do that, eh?)
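For what it's worth, the back-of-the-envelope math holds up. Treating the height of the remaining fluid as a fraction of the cartridge's usable volume (a simplification, since cartridges aren't uniform inside), a quick calculation reproduces those figures:

```python
cartridge_height_in = 1.125   # 1 1/8 inches, measured top to bottom
ink_left_in = 0.25            # roughly 1/4 inch of fluid still in each cartridge

fraction_left = ink_left_in / cartridge_height_in    # ~22%, "about a fifth"
loss_per_cartridge = 2.25                            # dollars, as stated
implied_price = loss_per_cartridge / fraction_left   # what a full cartridge costs

print(f"{fraction_left:.0%} of the ink left; implied cartridge price ${implied_price:.2f}")
```

That works out to about 22 percent of the ink stranded in a roughly $10 cartridge.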
I was fuming.
Brother Says: Oh, That's Normal
I used my pull and fired a note off to Brother's PR person. My question was simple: Is there a mechanical reason to leave fluid in the cartridge?
Brother's rep had a logical answer, of course. Here it is, verbatim--make sure to slip on a pair of hip boots so you don't get splattered with anything.
"First, we would like to assure you that Brother stands behind our product and the information disclosure that we provide to the consumer. It is always our policy to provide such information to consumers to help them understand both the product and the conditions under which the product operates.
"To address your specific question regarding ink volume, the rated yield for each cartridge follows the industry standard of that period which was based on 5% page coverage. So regardless of what small ink volume you may see remaining in an ink cartridge when it needs to be replaced, we guarantee that the ink volume that was provided and 'used' meets this industry standard calculation. Any additional ink volume left in a cartridge at that time was not put into the rated yield calculation that is guaranteed by Brother.
"Importantly, there is a technical and performance reason for why the small amount of ink is remaining in a cartridge that is identified as 'empty.' As mentioned in the User Manual, 'even though the machine informs you that an ink cartridge is empty, there will be a small amount of ink remaining in the ink cartridge. It is necessary to keep some ink in the ink cartridge to prevent air from drying out and damaging the print head assembly.' By doing so, the machine is protected and consistent print quality is ensured to satisfy the consumer. In effect, remaining ink should not be viewed as waste, but as Brother's affirmative action to provide ongoing high quality output and performance of the machine."
Horsepucky, says I. Granted, the printer may need a small amount of ink to keep the printer heads from drying out, but the volume left in the cartridge isn't what I'd call small. And I'm not interested in the industry standard of 5 percent coverage. What I know is that even with minimal printing, the Brother needs a new cartridge way too often--and I want every last drop of ink.
Inkjet Cartridges? It's a Hot Topic
I'm not the only one incensed about the ink issue. Here's what a few of my blog readers had to say:
"It's environmentally unfriendly. The more frequently we're required to change our ink cartridges unnecessarily, the more landfill waste. Granted many people recycle their used cartridges, but just as many throw them in the garbage."
--cwashizawa
"Change the name in your rant from Brother to Canon and it's exactly the same story. My brand new Canon was telling me the color cartridge was dangerously low for months before I actually got a printout with some missing color."
--rherman
"I've been in the supplies industry for 30+ years and 7 years ago developed my own Web site (OfficeSupplyOutfitters.com) to sell aftermarket and compatible replacement alternatives.... Why? Because inkjet and toner cartridges were appallingly high priced. If that weren't enough, the printer manufacturers are now using new technology to get you to buy more than you need.... Now some of the printer manufacturers are using chips on their cartridges to prevent aftermarket suppliers from being able to remanufacture their cartridges!"
--rookiecando
"Just have to add my 2 cents to this, in addition to my raging fury with HP for installing mini-ink cartridges in new printers that will print a test page and then force you to buy full-sized ink cartridges right out of the gate. The HP Officejet Pro K850... forces me to change practically full cartridges because it says they have 'expired.' This machine takes 4 'high-yield' tanks of ink at about $80 to replace."
--Mary E.
For more of the same, read "Inkjet Printer Ink: Reader Rants and Hacks" and browse the reader comments on "Study: Over Half of Inkjet Printer Ink is Thrown Away."
Save Yourself Some Cash
Want to thumb your nose at the big printer companies? Before you run out and buy third-party cartridges, read "Cheap Ink: Will It Cost You?" But not to worry, there are reputable companies out there--read "Where and How to Buy Cheap Ink" for some recommendations on buying third-party ink and saving money on big-name supplies.
After much due diligence, I found two spots with decent prices and good service. The first is Abacus, where I bought a bunch of Brother cartridges. If you use the secret URL, you'll get a better price. I also use LDProducts to buy my Epson cartridges. They gave me a 5-percent discount code good through December 2008: INKRET77.
We've got more money-saving tips in a video aptly titled "How to Save Money on Printing," and I covered the topic last year in "Save Money on Inkjet Printer Ink."
This Week's Roundup of Time Wasters
Steve Bass for president! Despite it all, I'm going to do it. Head for News3Online and watch The Steve Bass Phenomenon for details.
Board Dots is easy (ha!). Just fill in each of the blocks by drawing a path horizontally and vertically through each square. I did splendidly with level one. After that I decided to go back to writing because that's much easier. [Thanks, Jerame]
It's a long video, as long as some of the Dodger games I've been to. But if you follow baseball, you're going to love this.
In last week's Time Wasters I mentioned a site with great photos. That must have hit the sweet spot because I got tons of correspondence about it. So here are two more I suspect you'll enjoy. First, photojournalist Mary Shwalm's work. (I love "Zebra Tipping" and "The Brave Goose.") The other is Judith Wolfe, also a photojournalist, who pops up a new photo collage each week. Here are two favs: NYC Panorama and Coney Island.
Steve Bass writes PC World's monthly "Hassle-Free PC" column and is the author of "PC Annoyances, 2nd Edition: How to Fix the Most Annoying Things About Your Personal Computer," available from O'Reilly. He also writes PC World's daily Tips & Tweaks blog. Sign up to have Steve's newsletter e-mailed to you each week. Comments or questions? Send Steve e-mail.

Blu-ray Disc Rapidly Gaining Popularity in Japan

Shipments of Blu-ray Disc-based video recorders and players are increasing fast in Japan as the market rallies around the format following its victory over rival HD DVD.
Shipments of recorders and players based on Blu-ray Disc hit 122,000 in June, marking the first time that monthly shipments have broken into six figures, according to data published Tuesday by the Japan Electronics and Information Technology Industries Association (JEITA). The data is gathered from member companies, which include all the major consumer electronics manufacturers in Japan.
That figure is a healthy jump on the 82,000 units shipped in May and is likely due to anticipated demand for the devices going into July, when millions of Japanese workers receive a mid-year bonus, and August, when the Olympics are held in Beijing. Both events typically provide a boost to the consumer electronics sector.
The sector was also boosted by the July 4 launch of a new system called "Dubbing 10" that allows consumers to make copies of TV shows they have recorded. In the past consumers were able to make one digital recording of a TV show but not make subsequent copies of that recording. The new system, which required new firmware or updated machines, allows up to 9 additional copies to be made and its arrival had some consumers holding back on purchases.
Because of the widespread availability of high-definition digital TV, Japanese electronics makers are pushing Blu-ray Disc recorders that, in many cases, are combined with hard-disk drive recording capability.
A quick check of comparison shopping Web site Kakaku.com shows the cheapest Blu-ray Disc machine, Sharp's BD-AV1, can be found for ¥44,800 (US$420). The machine, which doesn't include HDD recording, is typically priced at between ¥55,000 and ¥65,000 at many retailers.
The cheapest machine with HDD recording that is widely available is Sony's BDZ-T50, which packs a 250G-byte drive that can accommodate about 50 hours of HDTV. The recorder, which was first released in November 2007, costs as little as ¥71,180. That's about half the original list price of ¥140,000.
However, buyers need to be wary of purchasing older machines that, in some cases, don't support the latest version of the Blu-ray Disc format. The Sharp BD-AV1, for example, won't record to the newer 2-layer Blu-ray Disc media although it does offer playback. That means owners are limited to single-layer 25G-byte discs that hold about 3 hours worth of HDTV.
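A quick sanity check on those capacity figures shows what the quoted recording times imply about the average bitrate each machine assumes for HDTV (using decimal gigabytes, as storage makers advertise):

```python
def implied_bitrate_mbps(capacity_gb: float, hours: float) -> float:
    """Average bitrate implied by a stated capacity (decimal gigabytes)
    and a quoted recording time."""
    return capacity_gb * 1e9 * 8 / (hours * 3600) / 1e6

print(f"25 GB disc,  3 h: {implied_bitrate_mbps(25, 3):5.1f} Mbps")
print(f"250 GB HDD, 50 h: {implied_bitrate_mbps(250, 50):5.1f} Mbps")
```

The single-layer disc figure works out to roughly 18.5M bps, broadly in line with Japanese digital HD broadcast rates, while the hard-disk figure assumes a somewhat more conservative average.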
No major vendor has released a playback-only Blu-ray Disc machine in the Japanese market.
Japan domestic shipments of next-generation optical disc recorders/players:

    Month        Shipments
    Jan 2008        35,000
    Feb 2008        58,000
    March 2008      77,000
    April 2008      81,000
    May 2008        82,000
    June 2008      122,000

Source: Japan Electronics and Information Technology Industries Association

Japanese Browser Maker Taking on IE, Firefox

A Japanese software company is stepping up international promotion of its Web browser in the hope of carving out a 5 percent share over the next few years of a market dominated by Internet Explorer and Firefox.
The Sleipnir browser is well-known among Japanese geeks, many of whom value the high level of customization that the browser allows. At the center of this customization is the ability to select either the Trident or Gecko layout engines for each Web site visited. Trident was developed by Microsoft and is used in Internet Explorer while Gecko is used in Mozilla's Firefox.
As any user who has changed Web browsers knows, some sites look different or offer different functionality depending on the browser in use. By clicking a small button in the bottom left of the browser and switching between Trident and Gecko users can choose the best one for the particular site.
Fenrir, which is based in Osaka, began development of the browser in 2005 and has offered an English version alongside its main Japanese version for some time. It decided to step up promotion overseas after noticing rising demand for the browser from international users, said Yasuhiro Miki, director of the overseas marketing division at Fenrir.
"We'd like to focus on advanced users," he said.
In the next couple of years, Fenrir hopes to dramatically grow its user base from the current roughly 100,000 users to around 17 million, said Miki. That corresponds to about 5 percent of the English-speaking Web user base, he said.
In Japan the browser has a 9 percent market share, according to Fenrir. No independent data to verify that claim is available but a recent survey of 3,003 computer programmers published by Nikkei ITpro put Sleipnir's share at 6 percent among that group.
Initially the focus is on the English-speaking market but Fenrir has plans to look at other language versions including Spanish and French.

Groups Urge FCC to Keep the Internet Open

The U.S. Federal Communications Commission needs to take steps to keep the Internet free of interference from broadband providers, such as the slowing of peer-to-peer traffic and the tracking of subscribers' Web habits, several witnesses told the FCC at a hearing Monday.
The FCC should take fast action against broadband providers that block access to legal online applications, especially those who don't notify their subscribers, said Marge Krueger, administrator of the Communications Workers of America (CWA) for the district covering Pennsylvania and Delaware.
Krueger didn't name providers that have slowed access to applications, but Comcast has been in the news in recent months for slowing access to the BitTorrent peer-to-peer application. A Comcast representative didn't testify at Monday's hearing at Carnegie Mellon University in Pittsburgh, but the company has repeatedly said it slows BitTorrent traffic at limited times of peak traffic.
Another witness complained that some broadband providers are using deep-packet inspection techniques to track subscribers' Internet use, in an effort to deliver targeted advertising. NebuAd, a California company, has worked with several broadband providers to provide this targeted ad service, but several privacy groups and U.S. lawmakers have objected to the tracking.
Deep-packet inspection can be a useful tool for network management, said David Farber, a computer science and public policy professor at Carnegie Mellon. "What's almost obscene is the fact that people are using it to gather information about what I'm sending on the network and selling that information to other people," Farber said. "That is completely obscene and should be stopped."
Several members of the public also called on the FCC to enforce so-called network neutrality rules that would prohibit broadband providers from blocking or slowing Web content from competitors. Small video producers and other online businesses will not be able to compete without net neutrality rules, said one Carnegie Mellon student.
But Robert Quinn, senior vice president for federal regulatory policy at AT&T, asked the FCC to look carefully before regulating how broadband providers can manage their networks. While the FCC has the power to enforce net neutrality rules, broadband providers need to be able to manage their networks as more and more subscribers begin to use high-bandwidth applications such as video, he said.
AT&T spent about US$17.5 billion in 2007 on expanding networks and other capital improvements, Quinn said. The broadband provider expects bandwidth demand to increase by more than 400 percent in the next three years, he said.
"With the kind of growth we are seeing in bandwidth demand today, we cannot simply stay ahead of the bandwidth curve by building bigger and better pipes," Quinn added. "The money to build them just doesn't exist. Network operators must be able to manage those networks to squeeze out every last ounce of efficiency that we can, in order to keep the cost to the end-user customer as affordable as we can possibly make it."
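To put Quinn's projection in perspective: growth of "more than 400 percent" over three years means at least a fivefold total increase, which compounds to a striking annual rate:

```python
def implied_annual_growth(total_multiple: float, years: int) -> float:
    """Compound annual growth rate implied by a total multiple over `years`."""
    return total_multiple ** (1 / years) - 1

# "more than 400 percent" growth in three years = at least a 5x multiple
print(f"implied growth: at least {implied_annual_growth(5.0, 3):.0%} per year")
```

That is roughly 71 percent a year, sustained for three years, which is the scale of demand Quinn says network upgrades alone cannot absorb.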
The CWA's Krueger and several other witnesses called on the U.S. to create a comprehensive broadband policy that would help providers roll out broadband to rural areas and increase speeds. Average U.S. broadband speeds are slower than in several other industrialized nations, putting U.S. consumers and businesses at a disadvantage, she said.
But Scott Wallsten, vice president for research at the conservative Technology Policy Institute, suggested that many reports showing the U.S. falling behind other nations in broadband are misleading, particularly studies by the Organisation for Economic Cooperation and Development (OECD) showing the U.S. 15th in the world in per capita broadband adoption. The U.S. has a larger household size than many other OECD members, and households typically get one broadband connection to share, he said.
The OECD also undercounts business broadband connections, he said.
While better information about broadband availability is needed, the U.S. is not facing a broadband crisis that cries for major new policies, Wallsten said.
FCC member Michael Copps said he found it hard to believe that people were still arguing against a comprehensive broadband policy. All major infrastructure built in the U.S., from the railroads to the telephone network to the interstate highway system, required major investments by the federal government, he said.
"I am unaware of any infrastructure built in the history of this country that has not been accomplished through a public sector/private sector partnership," Copps said. "We're sitting here saying, 'Should there be a [national] strategy?' We've never done that before."

Details of Major Internet Flaw Posted by Accident

A computer security company on Monday inadvertently published details of a major flaw in the Internet's Domain Name System (DNS) several weeks before they were due to be disclosed.
The flaw was discovered several months ago by IOActive researcher Dan Kaminsky, who worked through the early part of this year with Internet software vendors such as Microsoft, Cisco and the Internet Systems Consortium to patch the issue.
The companies released a fix for the bug two weeks ago and encouraged corporate users and Internet service providers to patch their DNS systems as soon as possible. Although the problem could affect some home users, it is not considered to be a major issue for consumers, according to Kaminsky.
At the time he announced the flaw, Kaminsky asked members of the security research community to hold off on public speculation about its precise nature in order to give users time to patch their systems. Kaminsky had planned to disclose details of the flaw during a presentation at the Black Hat security conference set for Aug. 6.
Some researchers took the request as a personal challenge to find the flaw before Kaminsky's talk. Others complained at being kept in the dark about the technical details of his finding.
On Monday, Zynamics.com CEO Thomas Dullien, who uses the hacker name Halvar Flake, took a guess at the bug, admitting that he knew very little about DNS.
His findings were quickly confirmed by Matasano Security, a vendor that had been briefed on the issue.
"The cat is out of the bag. Yes, Halvar Flake figured out the flaw Dan Kaminsky will announce at Black Hat," Matasano said in a blog posting that was removed within five minutes of its 1:30 p.m. Eastern publication. Copies of the post were soon circulating on the Internet, one of which was viewed by IDG News Service.
Matasano's post discusses the technical details of the bug, saying that by using a fast Internet connection, an attacker could launch what's known as a DNS cache poisoning attack against a Domain Name server and succeed, for example, in redirecting traffic to malicious Web sites within about 10 seconds.
Matasano researcher Thomas Ptacek declined to comment on whether Flake had actually figured out the flaw, but in a telephone interview he said the item had been "accidentally posted too soon." Ptacek was one of the few security researchers who had been given a detailed briefing on the bug and had agreed not to comment on it before details were made public.
Matasano's post inadvertently confirmed that Flake had described the flaw correctly, Ptacek admitted.
Late Monday, Ptacek apologized to Kaminsky on his company blog. "We regret that it ran," he wrote. "We removed it from the blog as soon as we saw it. Unfortunately, it takes only seconds for Internet publications to spread."
Kaminsky's attack takes advantage of several known DNS bugs, combining them in a novel way, said Cricket Liu, vice president of architecture at DNS appliance vendor Infoblox, after viewing the Matasano post.
The bug has to do with the way DNS clients and servers obtain information from other DNS servers on the Internet. When the DNS software does not know the numerical IP (Internet Protocol) address of a computer, it asks another DNS server for this information. With cache poisoning, the attacker tricks the DNS software into believing that legitimate domains, such as idg.com, map to malicious IP addresses.
In Kaminsky's attack, a cache-poisoning attempt also includes what is known as "Additional Resource Record" data. By adding this data, the attack becomes much more powerful, security experts say. "The combination of them is pretty bad," Liu said.
An attacker could launch such an attack against an Internet service provider's domain name servers and redirect its users to malicious servers. By poisoning the domain name record for www.citibank.com, for example, the attackers could send the ISP's users to a malicious phishing server every time they tried to visit the banking site with their Web browser.
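The poisoning mechanics described above can be sketched as a toy simulation. This is illustrative only, not the actual exploit: the resolver class, the tiny transaction-ID space, and the hostnames below are all hypothetical, and real resolvers involve far more machinery (UDP source ports, bailiwick checks, timing races).

```python
import random

class ToyResolver:
    """Toy model of a caching DNS resolver (illustrative only)."""
    def __init__(self, txid_space):
        self.txid_space = txid_space  # size of the transaction-ID space
        self.cache = {}               # hostname -> IP address

    def query(self, hostname, replies):
        """Issue a query and accept the first reply with a matching ID."""
        txid = random.randrange(self.txid_space)
        for reply in replies:         # spoofed replies race the real one
            if reply["txid"] == txid:
                self.cache[hostname] = reply["answer"]
                # Additional-record data is cached too -- this is what makes
                # the combined attack so much more powerful.
                for name, ip in reply.get("additional", {}).items():
                    self.cache[name] = ip
                return
        # No matching reply: a real resolver would retry (omitted here).

def attack(resolver, guesses):
    """Spoof many replies, each guessing a different transaction ID."""
    spoofed = [{"txid": g,
                "answer": "10.0.0.66",  # attacker-controlled host
                "additional": {"www.bank.example": "10.0.0.66"}}
               for g in range(guesses)]
    # Query a throwaway name; the additional record poisons the real one.
    resolver.query("random123.bank.example", spoofed)

weak = ToyResolver(txid_space=100)  # tiny ID space for demonstration
attack(weak, guesses=100)           # enough guesses to cover every ID
print(weak.cache.get("www.bank.example"))  # prints 10.0.0.66 -- poisoned
```

Enlarging the transaction-ID space, and randomizing the UDP source port as the vendors' patches do, is what makes each spoofed guess unlikely to match.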
Kaminsky declined to confirm that Flake had discovered his issue, but in a posting to his Web site Monday he wrote "13>0," apparently a comment that the 13 days administrators have had to patch his flaw before its public disclosure is better than nothing.
"Patch. Today. Now. Yes, stay late," he wrote.
He has posted a test on his Web site that anyone can run to find out whether their network's DNS software is patched.

Monday, July 21, 2008

Dispelled Intel Rumor May Disappoint Gamers

Intel has announced prices for some of its latest, most powerful chips, and they may disappoint digital gaming enthusiasts: the processors are more expensive than reports had speculated.
The latest Intel processors for gamers, the Core 2 Extreme quad-core desktop chips running at 3.2GHz, will cost US$1,499 each in lots of 1,000, with a slightly different version priced at $1,399. Another version of the chip that runs at 3.0GHz will cost $999 in the same quantities, according to Intel's latest price list.
Gamer blogs and some news reports had speculated the new 3.2GHz Core 2 Extreme Quad-Core might come out priced at around $999, far less than Intel actually announced.
Each of the new chips comes with four calculating engines on board for more realistic 3D images and ultra-fast gaming speeds, according to Intel.
Several gaming PCs have already been designed around the chip family, including Velocity Micro's Raptor Signature Edition for around $5,995, which will ship next month, and Gateway's FX541XG for around $2,707.99, which starts shipping this week.
The Core 2 Extreme Quad-Core chips had been code-named Bloomfield.
One big surprise on Intel's latest pricing list, which is dated July 15, is the lack of price declines on desktop PC processors. The list shows no price declines on any processors, yet in desktop PC microprocessors there have only been rare occasions when prices don't fall. One reason prices go down is that the company needs to clear out older-technology chips as it creates newer products. Another is that Intel continually advances its chip-making technology to gain greater efficiencies, which it can pass on to customers as lower prices.
A price war with rival Advanced Micro Devices (AMD) over the past few years has helped push microprocessor prices generally lower. Microprocessors made for laptop PCs have held up better in price because the laptop PC market has been growing at a breakneck pace. Desktop microprocessor prices have nearly always fallen due to lackluster demand and the price war.
But recently, desktop chip prices have stabilized.
Converge, a U.S. company that does some of its work in the microprocessor spot market, has noted a "dramatic resurgence of shortages in the desktop market after a sustained period of relative calm" in the third quarter.
Intel and market researcher Gartner both noted that corporations have been buying more desktop PCs recently, and Intel also said demand has been strong in emerging markets.

Five Storage Strategies to Save Money

Storage costs eat up at least 11% of IT hardware budgets, but there are plenty of ways to save money without sacrificing performance or security. In a new report called "Five Key Storage Strategies for a Down Economy," Forrester analyst Andrew Reichman provides a road map for smart purchasing and maximizing the resources you already have. Here's a summary.
1. Play hardball with vendors. The storage market is highly competitive, but vendors also know that the cost of switching can be prohibitive. This means your current vendor might have become complacent, particularly if you have been loyal for many years, expanding capacity without competitive bids, Reichman writes.
But as the economy gets worse, "storage vendors will be trying even harder to win new deals and protect their existing accounts from competitors trying to do the same thing," Reichman writes. "Use this situation to your advantage by introducing a fresh sense of competition among the vendors you work with." By undertaking a request for proposals bidding process, you can win discounts from your current vendor or discover a new, less-expensive vendor you weren't aware of.
2. Avoid new purchases by reclaiming what you have. Wasted storage, not surprisingly, is a waste of money. Storage is often allocated but never used, for many reasons. "Some applications and operating systems don't lend themselves to gradual storage expansion over time; they require a large up-front allocation that may or may not be consumed eventually," Reichman writes. "This tendency for over-allocation combined with limited ability to effectively forecast data growth in most organizations leads to a significant gap in the amount of capacity that is allocated versus actually used."
Reclaiming wasted storage will often require application downtime, making careful planning necessary. Using storage virtualization is one way to migrate without disruption.
Other examples of wasted storage include servers that have been taken offline without their associated storage being returned to the free pool, and storage that's "'mapped but not masked,' meaning it has been allocated within the storage array but not recognized by a server."
3. Audit backup and replication configurations to cut waste. As important as disaster recovery is, the technologies that enable it sometimes lead to waste. "In a typical storage environment, there can often be as many as 10 copies of the same data -- several days of full backups, a couple of snapshots, and a fully replicated copy at the alternative site," Reichman notes. "Most backup systems have inadequate reporting capabilities, so it's difficult for storage administrators to associate applications to their backup jobs and their retention schedules."
An audit of backup policies and storage configurations can "eliminate unnecessary backup jobs, snapshots, clones and replication, and can return unused disk or tape media to the free pool to reduce future expenditure." Another strategy is to review replication levels to make sure the right amount of storage is being allocated to each application. This work can be tedious but can also be done internally and for little cost.
4. Rethink storage network decisions. When you need high performance and availability, Fibre Channel isn't the only option. Alternatives that can sometimes lower costs while meeting performance needs include iSCSI, the Network File System (NFS) protocol, and direct-attached storage, Reichman writes.
Oracle and VMware are throwing "their hat into the NFS ring," he says, noting that more applications are supporting NFS as a way to connect servers to storage. Direct-attached storage is also a good alternative when the benefits of centralized networked storage are limited, such as when each storage device is dedicated to one application.
"While these options might not make sense for every application or every environment, cost-conscious firms should take a good, hard look at their storage network decisions and give some consideration to [these] approaches," Reichman writes.
5. Use a tiering methodology that delivers results simply. Every cost-control strategy requires an up-front investment of time or money, but for some the ROI happens quickly. Tiering, on the other hand, has to be viewed as a long-term strategy because you won't realize savings right away. For many users, the complexity of tiered storage has outweighed cost savings, but that doesn't mean it can't be effective.
"If tiering means buying a wholly separate platform in addition to the tier one infrastructure, it can take years to realize any benefit," Reichman notes. "By shifting investments you would already have made to lower tiers instead, you can realize cost avoidance."
Because of the down economy, more businesses are putting data on tier two storage right from the beginning, and only promoting it to tier one if the performance is unacceptable, he writes. "Buying cheaper, dense disks in the systems you already own makes sense for tiering without the added cost of a separate platform," Reichman writes. "Remember to keep it simple and consistent -- having too many tiers and options makes it hard to manage the environment, which can negatively impact cost savings."

Watch for Security in the Clouds

Security applications delivered as cloud-based services will more than triple by 2013, according to Gartner.
The firm said 20 percent of revenue from messaging security tools, such as antimalware and antispam services for email and instant messaging, currently comes through the cloud delivery model. That figure will jump to 60 percent by 2013.
Popular on-demand enterprise applications, such as those provided by Salesforce.com, are allowing mobile workers to bypass the corporate network to access business data. Gartner said this will force security teams to put controls between mobile workers and cloud-based services.
"Although perimeter security controls will be required to protect the remaining data center functions and the large portions of enterprise populations that are not mobile, new approaches will be needed to secure cloud-based IT services," John Pescatore, vice president and analyst at Gartner, said in a statement.
"One answer will be cloud-enabled security 'proxies' whereby all access to approved cloud-based IT services will be required to flow through cloud-based security services that enforce authentication, data loss prevention, intrusion prevention, network access control, vulnerability management and so on," he said.
Gartner defines cloud computing as a type of computing where IT-related capabilities are provided as a service using Internet technologies to multiple external customers. This delivery model is moving closer to widespread acceptance, according to Gartner, because it allows enterprises to gain security services such as protection from distributed denial-of-service (DDoS) attacks without huge capital investments.
But, Pescatore warned the use of cloud computing will make organizations more vulnerable to some security risks.
"Inexpensive cloud-based processing will make it easier and cheaper to break encryption keys or find vulnerabilities in software, and financially motivated criminals will certainly seek to take advantage of that," he said. "Enterprises will need to prioritize the adoption of encryption technologies that provide easy movement to longer keys."

Tech Job Cuts Forecast for 2009

CIOs plan sharp reductions in contract staff, professional services, and hardware -- and almost no investment in cloud computing
IT staff jobs are at increasing risk -- both for contractors and in-house workers -- according to a survey of top CIOs by Goldman, Sachs & Co. released last week. Global services companies will also feel the pinch because of the slowing economy.
A second survey showed that basic PC and network hardware, as well as professional services providers, would bear the largest proportion of spending cuts. It also showed that CIOs planned to emphasize economizing measures over investments in new technologies, with cloud computing emerging as the last item on their priority lists, despite the hype around it.
IT contractors to bear the brunt of cuts
"Demand for discretionary IT projects dropped to its lowest point" in the 41-study history of the Goldman Sachs staffing survey, which asked 100 managers with strategic decision-making authority (mainly CIOs at multinational Fortune 1000 companies) about their IT staffing plans for 2009.
The Goldman Sachs report states that "in a cost-constrained IT budget scenario, CIOs will most likely look to cut their resources first from lower-value augmented [contract] IT staff." The company also describes its survey as "an early warning flag" for service providers' 2009 bookings of new projects.
These intended cutbacks are a change from last fall. When the managers were asked in October which IT service delivery resources they would cut for application-related development or maintenance work, none said in-house staff. As the economy declined, however, 8 percent of respondents to a February survey said in-house IT programming staff would be cut. In April, 15 percent of respondents said in-house staff would be cut. That dropped to 11 percent in the June survey (the most recent), which was released last week.
But contract staff fare much worse, with 48 percent of the respondents saying that such staff would be cut. And 30 percent of the responders said on-site third-party service provider staff would also be cut for application-related development or maintenance work. Twelve percent of the managers said they would cut staff from offshore third-party service providers.
Consultants, hardware targets of spending cuts
The second survey by Goldman Sachs probed 2009 spending plans based on type of IT projects. This survey also showed cuts are in the offing. "ROI is the name of the game. CIOs have emphasized to us that they are buying on a need versus want basis, are often downsizing deals to fit with current budget constraints, and are searching for solutions with a high and fast ROI," the survey authors wrote.
The spending survey indicated CIOs see the "greatest potential for cost reduction in IT in the area of networking equipment." A full 47 percent of the responders said the most likely area where spending would be slowed would be on purchases of personal computer systems, servers, and storage.
Spending cuts won't be limited to equipment: 42 percent of the CIOs indicated that "they are reluctant to spend money on third-party professional services." This is in keeping with the decline in interest for discretionary IT projects and could indicate more of a reliance on in-house IT staff.
Cloud computing may get buzz, but it won't get spend
The CIOs surveyed indicated that server virtualization and server consolidation are their No. 1 and No. 2 priorities. Following these two are cost cutting, application integration, and datacenter consolidation. At the bottom of the list of IT priorities are grid computing, open source software, content management, and cloud computing (called on-demand/utility computing in the survey) -- less than 2 percent of the respondents said cloud computing was a priority.
Charles King, a principal analyst with Pund-IT, said that deployments of hot-button technologies such as cloud computing may slow down. "The message here is CIOs are looking primarily to tested, well-understood technologies that can result in savings or increased business efficiencies whose support can be argued from a financial point of view," he said.
One reason for the low priorities of grid computing, open source software, and cloud computing may be that CIOs and business executives don't understand their value. "They require a technical understanding to get to their importance. I don't think C-level executives and managers have that understanding," King said.

Green Isn't Great, Study Says

The notion of green computing is an unhelpful one, making it harder for companies to implement carbon reduction policies. That's according to a new report from the Carbon Disclosure Project, which questioned 11 leading enterprises about their environmental policies.
The term 'green' came in for heavy criticism: The report said that the word was an "employee or consumer-friendly way of introducing climate-change topics," but was too vague for general use, lacking "the specific definitions needed to manage carbon and/or other greenhouse gas (GHG) emissions." Marieke Beckmann, responsible for communication and corporate partnership at CDP, agreed that the term was misleading. "Green shouldn't be used," she said.
The report, which was produced in conjunction with IBM, sets out a variety of measures by which companies could set guidelines to reduce carbon emissions, including: setting definitions, appointing a carbon information manager, more detailed electricity billing, league-tables of departmental carbon use, greater use of videoconferencing and IM, more mobile working and a reduction in business travel.
The need to set definitions is a thorny one, as different companies offer different power ratings for servers. Beckmann agreed, saying that measuring carbon use and emissions was difficult. "There's no one standard that's applicable across all companies, across all sectors," she said. "In fact, there's no standard that can be applied across one sector," she added.
While accepting that it was stating the obvious to say that a company would need an accurate definition of carbon emissions before it could reduce them, she said that many organizations didn't have one. "Sometimes, stating the obvious is a good thing," she said.
Companies that took part in the survey included HBOS, IBM, Lloyds TSB, Tesco and Unilever.

Google Gets 70% of U.S. Searches

Nearly 70 percent of U.S. searches in June were done on Google Inc.'s search engine, according to Web measurement figures from market research company Hitwise Pty. Ltd.
In the four weeks ending June 28, Google accounted for 69.17% of all U.S. searches, up from 63.92% in the same period a year ago, a relative gain of about 8%, according to Hitwise.
But Google's gain led to lower numbers for other search companies. Yahoo Inc. received 19.62% of U.S. searches in June, down from 21.31% last year, while MSN Search received 5.46%, down from 9.85% in June 2007.
Ask.com, on the other hand, had better news, collecting 4.17% of searches, up from 3.42% a year ago, according to Hitwise.
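The share figures above mix two kinds of change worth keeping straight: the gain in percentage points and the relative growth rate. A quick check of Google's numbers:

```python
# Google's share of U.S. searches (Hitwise): June 2007 vs. June 2008.
prev_share = 63.92
curr_share = 69.17

point_change = curr_share - prev_share             # in percentage points
relative_change = point_change / prev_share * 100  # relative growth, in percent

print(round(point_change, 2))     # 5.25 percentage points
print(round(relative_change, 1))  # about 8.2 percent relative growth
```

So the "8%" figure is relative growth, not the raw difference between the two shares.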
"Google just continues to grab market share," said Hitwise spokesman Matt Tatham. "There's just no ceiling for them." And when searches on Google go up, searches on the other search engines go down, he said.
The jump in searches at Ask.com was probably because it redesigned its home page and added some new functions to its search that may have drawn people in, Tatham said.
"Search engines continue to be the primary way Internet users navigate to key industry categories," Hitwise said. Year over year, the travel, news and media, entertainment, business and finance, sports, online video and social networking categories had double digit growth in traffic coming from search engines, the company said.

Friday, July 18, 2008

Yahoo Uses Home Page to Lobby Against Icahn

Yahoo has added a button on its main Web site linking to a page that lobbies shareholders to vote against Carl Icahn's plans for the company, stepping up its rhetoric in the days leading up to its annual meeting.
The page, reached via a button that reads "Your Yahoo! Your Vote," clearly tries to discredit Icahn, who has proposed a new board of directors for Yahoo in hopes of facilitating a deal with Microsoft.
Featured on the page is a chart listing companies that Yahoo says Icahn has been involved with and noting how the stock price of each of the companies has changed since his involvement, dating back to 2004. According to Yahoo, of the 15 companies on the list, all but three have seen their stock prices decline. The companies include struggling concerns such as Motorola and Blockbuster.
In bold type at the very top of the page, Yahoo quotes Icahn from a Wall Street Journal article as saying, "It's hard to understand these technology companies," in an apparent attempt to portray Icahn as unable to make an informed plan regarding a technology company like Yahoo.
The site links to Yahoo press releases, letters to shareholders and information about how shareholders can cast their votes.
Yahoo plans to hold its annual meeting on Aug. 1. Its entire board of directors is up for re-election at that time, as is a new board proposed by Icahn. The activist investor has said that if his board is elected, Microsoft has agreed to consider a transaction that would include buying Yahoo's search business.
On his blog, Icahn has not reacted to the new Yahoo Web page, but to date he has been quite clear about his intentions and how he feels about Yahoo's current leadership. Earlier this week, he issued an open letter to shareholders that said: "Our company is on a precipice and our board seems ready to take the risk of seeing it topple." That letter was written in response to an earlier letter Yahoo sent to shareholders that was critical of Icahn's plans.
Large institutional shareholders are beginning to publicly take sides in the dispute. Legg Mason, which owns 4.4 percent of Yahoo's stock, said on Friday that it plans to back Yahoo's slate of candidates for the board.

AMD Takes on Intel With Its Own Low-power Chip

Advanced Micro Devices is developing a low-power processor for mobile devices and sub-notebooks, the company confirmed Friday, quashing months of speculation that it had abandoned the project.
The chip will compete with Intel's Atom processor and potentially supplant AMD's low-power Geode x86 system-on-chip, which is included in One Laptop Per Child's XO laptop. Based on the x86 system-on-chip design acquired from National Semiconductor in 2003, Geode is also offered in thin clients and embedded equipment.
AMD declined to comment on release dates for the chip.
Plans to develop a low-power chip, code-named Bobcat, were first revealed by AMD last year. At the time, AMD officials described the chip as "designed for maximum energy efficiency and performance-per-watt for next-generation mobile devices, scaling as low as 1 watt."
The company has been quiet about plans for Bobcat ever since, leading to speculation among industry observers that it had abandoned the project as it tries to recover from consecutive quarterly losses and restructuring.
Further details about the new mobile chip are expected to be revealed in November at the company's analyst conference, said AMD's new CEO Dirk Meyer, during a conference call on Thursday to discuss the company's financial results.
"Clearly, when you talk about smaller form-factor notebooks and inexpensive notebooks that is a market segment we are interested in," Meyer said.
AMD could be a late entrant to the market of low-power chips for mobile devices rife with competition. Intel released Atom processors earlier this year, building the x86 architecture into low-power chips that are now being used in low-cost sub-notebooks and mobile Internet devices. Via also introduced the Isaiah processor for mobile devices and sub-notebooks. In June, Nvidia announced the Tegra system-on-chip for cell phones with an integrated graphics processor.
Apple is also taking a stab at the mobile chip market, using the recent acquisition of PA Semi to develop system-on-chips for the iPhone.
Intel is already working on an Atom successor code-named Moorestown, due for release in 2009. The platform includes a system-on-chip code-named Lincroft, which is based on a 45-nanometer Silverthorne core, and puts a graphics, video and memory controller on a single chip.

Going Up: Slow Progress on 'Space Elevator'

Disney World, Epcot, Universal Studios and ... Space Orlando. In the future, Florida could be the site of a simulated "elevator" that allows people to check out life on a space station, virtually.
That's one dream of Bradley Edwards, president of Black Line Ascension and one of the leading proponents of space elevators. The center, which would be a combined entertainment and research facility, could help solve one of the many critical issues plaguing the space elevator concept: a lack of funding.
At the first space elevator conference in four years, this time in Redmond, Washington, on Microsoft's campus, Edwards announced that he is investigating the feasibility of a combined entertainment and research center, to be called Space Orlando, designed to help fund the building of a space elevator. The cluster of buildings would comprise 2 million square feet (929,030 square meters) and a 10-story-high structure that visitors could enter as if they were walking into a terminal for a real space elevator. They'd buy a ticket, enter the climber vehicle and feel like they're ascending into space, thanks to virtual reality technologies.
They'd step off the climber into space -- or really, a massive room lined with plasma screens displaying what it would look like to be on a space station, looking out into the solar system.
The entertainment facility would also be a working research center. "Wrapped into it are real research labs with glass walls, unfortunately for the researchers," Edwards joked. Visitors would be able to observe the technology the researchers are working on, such as a habitat for people in space.
Edwards estimates the facility would cost US$500 million to $1 billion to build and would attract 8 million visitors a year. Their entrance tickets would help fund the research and development of a space elevator. As Edwards envisions it, a real space elevator, as opposed to a simulated version, would consist of a very long "ribbon" made of carbon nanotubes stretching from a platform on Earth into geosynchronous altitude, around 22,000 miles (35,406 kilometers) above Earth's surface. Lightweight cars would attach to the ribbon and ride up into space. Travel time to the geosynchronous altitude: eight days, moving at 120 miles per hour.
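The trip-time figure quoted above holds up: 22,000 miles at 120 miles per hour works out to just under eight days.

```python
# Sanity check of the climber travel time to geosynchronous altitude.
distance_miles = 22_000   # ribbon length quoted by Edwards
speed_mph = 120           # climber speed quoted by Edwards

hours = distance_miles / speed_mph
days = hours / 24
print(round(hours, 1), round(days, 2))  # 183.3 hours, about 7.64 days
```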
The center could be a relatively easy way to fund research, he said. "Applying for NASA grants is a bit more of a challenge for getting funding," he said. Only about $570,000 in total funding has been dedicated to the space elevator concept to date, he said. "Nobody's getting paid for this," he said.
There are a number of other hurdles, in addition to the funding issue. Technically, scientists are still working out how to piece together carbon nanotube strands at the length required.
One conference speaker pointed out a bigger problem that has yet to be solved. "When you have an object that extends from the surface of Earth to geosynchronous altitude, every satellite currently in orbit, every piece of debris and every satellite in the future will crash into the elevator," said Ivan Bekey, a former NASA scientist currently with Bekey Designs. "Every one, with no exception."
There are about 6,000 satellites in orbit today, he said, many of which are no longer in use. When a satellite hits the space elevator, it would "vaporize it," he said.
So far, none of the potential solutions for avoiding such a collision are viable, he said. The dead satellites essentially can't alter their orbits to avoid the elevator and it would be too costly to require live satellites to move out of the way.
Some proponents say that the elevator could be tethered to a platform in the ocean that could be moved so that the elevator could avoid approaching satellites. That plan opens issues around the oscillations that would travel up and down the elevator each time the platform is moved. Some research into the matter has been done but there's still some uncertainty, particularly around how big the oscillations would be, Edwards said.
The idea of a space elevator grew in science-fiction novels around the 1960s but didn't become a potential reality until the discovery of carbon nanotubes in 1991, Edwards said. A space elevator is of interest to scientists because it could enable a much cheaper method for transporting items to and from space. The ability to move objects easily into space could spawn "the full commercialization of space," including manufacturing, tourism, solar-energy generation and research and development, Edwards said.
NASA has a space elevator on its road map for around the year 2200, Edwards said. But it's possible that a space elevator could come first from a non-U.S. country. Japan currently has a space elevator on its road map for 2030, Edwards said.
Speakers at the conference recognize that the whole concept of the space elevator strikes many people as unbelievable, but they argue that the technology required to build such an elevator is available or at least plausible.

Bugs & Fixes: Fixing iPhone 2.0 Sync Problems

It's been a tough week for Apple. First, there were the activation hassles during the iPhone 3G launch day. Next up were the numerous complaints about poor 3G network signal strength and batteries that lose their charge too quickly. (Expect these to be resolved via future updates to the iPhone software.)
And then there's MobileMe. It had so many launch problems that Apple felt compelled to offer an apology and give all current subscribers a free 30-day extension.
Amidst all of this, it's hard to choose exactly which one or two issues to highlight here. Ultimately, I decided to focus on two lesser-known fixes for iPhone problems--fixes that, nonetheless, can resolve a wide range of sync symptoms.
Fix iPhone sync problems by reinstalling iTunes
After switching over to an iPhone 3G, I discovered that syncing my iPhone's Contacts list had gone astray. In particular, syncing was now inexplicably unidirectional--only from the Mac to the iPhone. Any changes I made on the iPhone itself were eradicated after a sync. Among other things, this meant that ringtones I had assigned to contacts (which cannot be linked except on the iPhone itself) would not survive a sync. Very annoying.
I went through a laundry list of potential fixes, none of which had any effect. What finally succeeded (and a tip of the hat to Apple tech support for guidance here) was to completely remove iTunes from my Mac and reinstall it.
To do so on your Mac, drag iTunes to the Trash, restart the Mac and then empty the Trash. Next, go to the iTunes download page to download and install the latest version of the software. After I did this, I was pleasantly surprised to find that my sync problem was gone.
A related, not-well-known but often helpful fix is to launch iSync, select Preferences from the iSync menu, and click the Reset Sync History button (as noted in this support document). I might have tried this before I set about reinstalling iTunes--but the button was grayed out and unselectable on my Mac. Interestingly, reinstalling iTunes fixed this glitch as well. While a sync reset should leave all your data intact, I recommend playing it safe and backing up your Address Book and iCal data before proceeding.
Fix MobileMe-to-iPhone sync problems by deleting the MobileMe account
After setting up MobileMe, I noticed that none of my calendar changes on the MobileMe Web site were being pushed to my iPhone. I tried numerous fixes, as suggested in this Apple article. However, I had to go all the way down to the tenth and final suggestion before finding the one that worked: deleting and re-adding my MobileMe account on the iPhone (via Settings -> Mail, Contacts, Calendars). Especially if you had previously set up a .Mac account on your iPhone that was converted to MobileMe, I recommend starting with this step rather than saving it for last. It costs very little in time and hassle--and is a cure-all for a variety of MobileMe-to-iPhone symptoms.

Wednesday, July 16, 2008

Chairman of Chip Maker UMC Resigns

The chairman and CEO of United Microelectronics (UMC), the second-largest contract chip maker in the world, announced his resignation on Wednesday, with the company immediately naming his replacements.
Jackson Hu took over as chairman of UMC in early 2006 after his boss stepped down in a spat with the Taiwan government. Robert Tsao, founder and former chairman of UMC, left his job amid allegations of illegally investing in China. Last year, he and another UMC executive were exonerated when a Taiwan court ruled there was not enough evidence to convict them in the case.
Hu will take on a new role as a senior advisor to UMC.
In his place, UMC appointed a more youthful, professional management team aimed at revitalizing the chip maker, according to a statement.
UMC's chief financial officer, Stan Hung, will take over as chairman, while Sun Shih-wei, the company's chief operating officer, will become CEO.
Tsao was indicted in 2006 for allegedly investing in and transferring chip technology to Chinese chip maker He Jian Technology. He admitted to advising He Jian during its start-up phase, but maintained that all help was within legal bounds.
Taiwan carefully controls chip investments in China, fearing it could lead to job losses on the island or that its technology could be used to bolster Chinese military might. The two separated in 1949 amid civil war, and Beijing has long threatened the use of force to take the island if it moves towards formal independence.

Symbian CEO Says Collaboration With Google Is Possible

Broader collaboration between Symbian and Google at either the application or operating system level is possible in the future, Symbian's CEO said Wednesday.
"We have a good relationship with Google," Nigel Clifford said at a Tokyo news conference."In fact Symbian was one of the first mobile platforms to put their applications such as Google search and maps," he said as he showed his mobile phone.
Google is about to compete head-to-head with Symbian in the cell phone space with the launch of its Android platform and Symbian is reorganizing to meet that challenge.
Last month Nokia, which holds a major stake in Symbian, said it plans to acquire all the shares in Symbian and turn it over to the Symbian Foundation, a new group backed by several companies in the mobile phone business. As part of the move the three platforms that run on Symbian -- S60, UIQ and MOAP -- will be unified into a single open mobile platform.
"By making our software open, we're inviting more developers to play and learn on the Symbian platform, "he said."Anyone can join and that is the fundamental idea behind the foundation."
Clifford said that while the change may be viewed as a reaction to growing competition from open-source mobile software, Symbian is doing it to make life easier for application developers and phone manufacturers.
He also questioned why Google is pursuing Android when Symbian has already accomplished Android's stated goal: a platform that is proven in the market, fully open, and royalty-free.
Google, however, is not the only competitor from the computer world that is trying to take a slice of the mobile phone market. Apple's successful entry into the market through its iPhone could also be a threat.
"Not all PC developers can transition to developing for mobile phones -- screen sizes are different plus there are limitations in power and memory", said Clifford. He added that crossing over into the specialized mobile phone market is not as easy as it looks.
Despite all the competition, Clifford is certain that Symbian, with its new open and integrated platform, will win this battle. Its close ties to handset manufacturers, network operators, and hardware makers are a competitive advantage, one the company hopes will keep it in the top spot.
"The mobile phone software market is a complicated world and with the Symbian Foundation, we will win this," he said.