Friday, February 29, 2008

Green Computing Finds its Place at Cebit

The Climate Savers Computing Initiative will play a major role at the German trade show, opening March 4.

Cebit is taking on a green tinge this year, with the Climate Savers Computing Initiative playing a central role at the trade show, which opens March 4 in Hanover, Germany.

The climate initiative aims to reduce IT's carbon dioxide emissions from computer operations by 50 percent between 2007 and 2010. The group, led by PC manufacturers Dell, Hewlett-Packard and Lenovo, among others, will present energy-efficient IT products in a special "green village," and a central information point in Hall 9 will point visitors to other companies with environmentally friendly products. Showgoers can also take away a green IT guide produced with the help of IDG's Computerwoche magazine.

Climate Savers will hold a news conference Monday evening to laud the environmental efforts of some companies -- while those featured in a Greenpeace event the next morning can expect the opposite treatment: the campaign group in recent months has focused on uncovering IT manufacturers' use of pollutants.

The environmental interest of some of the "green" products highlighted by show organizers is a little obscure: a solar-powered flashlight and a banknote sorter figure on the list.

Other products won't save the earth, but will at least allow us to document, or measure, how much damage we're doing to it. For those who want to keep tabs on how much of the earth they've seen, the latest locating devices will also be on hand. In addition to GPS (Global Positioning System), some add GSM (Global System for Mobile Communications) functions for transmitting data, offering a way to keep track of loved ones, according to one vendor. Or maybe unloved ones, too.

Hot specs for today's trackers include strong magnets to keep the unit on a vehicle and a tough form factor so the device can endure extreme weather. Many devices are also very small to stay hidden from view.

The new emphasis on saving energy and reducing emissions is just one of the changes at this year's show, which runs from Tuesday through the following Sunday. Previous shows have run Thursday through Wednesday. The new schedule will make life simpler for professional IT users, the organizers said.

This year 5,845 exhibitors from 77 countries are attending, a little down on last year's 6,153 exhibitors from 79 countries. The strong euro has discouraged some overseas exhibitors, organizers said, although that hasn't bothered the Chinese: After Germany, China is now the most-represented country with 500 exhibitors, overtaking Taiwan.

France is also strongly represented this year: It is this year's featured country. One of the opening speeches will come from French president Nicolas Sarkozy.

Another famous straight-talker is Steve Ballmer, who will be speaking on the theme "innovation for people and the environment" at a Microsoft event on the eve of the show.

Changes to Office Open XML Draft Standard Passed

The OOXML document format will implement changes approved at this week's ECMA meeting, but problems lie ahead in the format's quest to become a standard.

About four-fifths of the proposed changes to a draft standard for the OOXML document format were waved through, undiscussed, at the conclusion of a weeklong meeting in Geneva.

If the specification for the Office Open XML file format is adopted as a standard in its current form, "there are likely to be hundreds of defects," said the head of the U.S. delegation at the meeting, Frank Farance.

OOXML, the default document format in Microsoft Office 2007, has already fallen at one hurdle on the route to becoming an international standard. Members of Joint Technical Committee 1 of the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC) rejected it in a vote last September. National standards bodies participating in the vote made 3,500 comments suggesting improvements to the draft.

At this week's ballot resolution meeting (BRM) in Geneva, the sponsor of the draft standard, industry consortium ECMA International, presented 1,100 recommendations for changes to the draft.

However, delegates only had time to discuss and modify around 20 percent of those, said Farance, an industry consultant with expertise in standards issues.

"Virtually every comment we processed did not survive unedited," he said.

The 80 percent of comments that were not discussed during the meeting were put to a "default vote," resulting in the automatic adoption of ECMA's recommendations without modification by delegates, he said.

Farance questioned why the meeting's business had to be rushed.

"I see no particular rationale for why we were limited in time. I don't know how you can deal with 6,000 pages with 3,500 comments in a week. It's like trying to run a two-minute mile," he said.

Andy Updegrove, a Boston lawyer who works with industry consortia on technical standards, described the meeting process as unsuccessful.

"Hopefully, the national bodies will not compound this error by approving a clearly unfinished specification during the voting period ahead," he said.

Although not a delegate to the BRM, Updegrove had spent the week in Geneva at the meeting venue. He said he had heard from people within the meeting that only six countries had voted in favor of adopting the undiscussed recommendations.

Representatives for ISO and IEC could not be reached as the meeting ended.

Now that the ballot resolution meeting is over, the 87 national standards bodies that voted in last September's ballot have 30 days to vote on the revised draft. That ballot concludes March 29.

Pinnacle Systems VideoSpin Video Editing Software

VideoSpin is capable free video editing software, but watch out for the ads.

Pinnacle Studio is far from my favorite video editing application, but Pinnacle's VideoSpin, a new, free editor based on Studio, does a pretty darn good job compared with most free video tools. Though it doesn't match the sophistication of paid-for applications, it does let you perform basic edits and output movies for use online or--if you're willing to sacrifice the cost of a cheap lunch--on your iPod or on a DVD.

VideoSpin has some strict limitations. You can't import from DV camcorders, since the application works only with files on your hard drive. You can't edit audio, except for levels. And since the software has no video effects, you can't brighten dark movies. Unless you purchase add-ons, you can export only to AVI, Flash, Real, or MPEG-1 format; you can't exchange files with Studio, either. And the interface size is fixed, so you can't view it full-screen.

You also must put up with advertisements--for Pinnacle products, and even for products sold on Amazon.com--rotating in the top-middle of the interface. I didn't find them intrusive, but like Studio, VideoSpin has a few spots where you must watch where you click. If you select 'More Transitions' from a drop-down menu, the application will open your Web browser to an ad for upgrades. That's more annoying.

The application provides a surprising number of video transitions, and though it has merely a handful of title templates, you can customize them to your heart's desire. It has several cheesy sound effects, and you can import audio files for use as a soundtrack. When you're done, you can use VideoSpin to upload movies directly to YouTube or Yahoo Video easily; the process takes just a couple of simple steps. Alternatively you can output in DivX, MPEG-2, or MPEG-4 format to your hard drive, albeit only for 15 days; after that you can purchase the set of codecs for $15 or pay $5 each for individual codecs.

Perhaps because VideoSpin has been stripped of so many features, I have fewer complaints about its stability than I have had with Studio. It did crash every time the Windows Vista laptop I was using went into hibernation, but when I restarted the application, it reopened my project quickly.

VideoSpin is head and shoulders better than any other free video application I've seen; Adobe's Premiere Express online editor is practically worthless in comparison. I'd still recommend that most people buy a full video editor, but VideoSpin is a good option if all you want to do is dress up clips before uploading them to an online service.

Google Previews Google Health

Google shed more light on the health care service it is developing with several screenshots of what it will look like.

Google shed more light on the health care service it is developing, showing off a couple of screenshots of what it will look like.

Google has been talking about its health initiative for some time now, slowly revealing more aspects of the project. Last week it announced a pilot of the service with the Cleveland Clinic, but was short on details.

On Thursday, Google said that Google Health aims to offer users a central place to store their medical records. They will be able to import and share records from multiple institutions, provided the organizations already allow customers to digitally access their records.

A user's profile lists important information such as conditions, medications, test results, allergies and past operations. It also lists current doctors with their contact information.

Through the Cleveland Clinic pilot, Google has already discovered that the service is particularly useful to people who may live part of the year in Ohio and part of the year in Florida, said Marissa Mayer, vice president of search and user products for Google, in a blog post that also contained screenshots.

Those people have historically carried paper health records back and forth between the locations. Now they can import their data from each medical facility and share it electronically with the other facility.

Privacy and Security Issues

Mayer stressed the privacy and security that Google will offer around customers' health data. Unless users give explicit permission, Google won't share or sell their data, she said. It has developed its privacy policy in collaboration with the Google Health Advisory Council, a group of medical professionals that offers feedback to Google on its health care product ideas and development.

Google is working on a directory of third-party services that will be accessible from Google Health. For now that simply allows users to import records into their profiles. In the future, Mayer wrote, it will let users schedule appointments and refill prescriptions online.

Despite Mayer's blog post and a speech on Thursday about Google Health by CEO Eric Schmidt at the Healthcare Information and Management Systems Society conference in Orlando, the service still isn't available beyond the Cleveland Clinic pilot. It should become publicly available in the "coming months," Mayer wrote.

In September, the lead for the Google Health project, Adam Bosworth, left the company. At the time, Google said that Mayer would run the project until a permanent replacement was found.

Bosworth was blogging about issues related to health care and how online tools might help as far back as 2006. The Cleveland Clinic pilot, which will be available to between 1,500 and 10,000 participants, is the first tangible offering of a Google Health service.

Google isn't alone among companies tackling the problem of organizing health care information. Archrival Microsoft last year launched an online health care service, HealthVault, to allow users to store and share health records online. Users can also feed data from devices like diabetes meters and heart rate monitors into their HealthVault accounts.

Both services are limited to institutions that have customer-accessible electronic records and to people interested in using them. Between 1 percent and 3 percent of U.S. residents have used e-health records, according to Lynne Dunbrack, program director at Health Industry Insights, a market research firm.

Thursday, February 28, 2008

Flash Attack Could Take Over Your Router

Security researchers release code that shows how a pair of widely used technologies could be misused to take control of a victim's Web browsing experience.

Security researchers have released code showing how a pair of widely used technologies could be misused to take control of a victim's Web browsing experience.

The code, published over the weekend by researchers Adrian Pastor and Petko Petkov, exploits features in two technologies: The Universal Plug and Play (UPnP) protocol, which is used by many operating systems to make it easier for them to work with devices on a network; and Adobe Systems' Flash multimedia software.

By tricking a victim into viewing a malicious Flash file, an attacker could use UPnP to change the primary DNS (Domain Name System) server used by the router to find other computers on the Internet. This would give the attacker a virtually undetectable way to redirect the victim to fake Web sites. For example, a victim with a compromised router could be taken to the attacker's Web server, even if he typed Citibank.com directly into the Web browser navigation bar.

"The most malicious of all malicious things is to change the primary DNS server," the researchers wrote. "That will effectively turn the router and the network it controls into a zombie which the attacker can take advantage of whenever they feel like it."

Because so many routers support UPnP, the researchers believe that "ninety nine percent of home routers are vulnerable to this attack."

In fact, many other types of UPnP devices, such as printers, digital entertainment systems and cameras are also potentially at risk, they added in a Frequently Asked Questions Web page explaining their research.
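UPnP's weakness here is structural: a control request is plain, unauthenticated SOAP over HTTP, so anything that can get the router's LAN interface to accept an HTTP POST -- including a crafted Flash file running in a victim's browser -- can issue one. The sketch below is purely illustrative; the control path and the SetDNSServer action name are assumptions for the example, since the actual actions exposed vary by router vendor.

```python
# A UPnP control request is just SOAP over HTTP, with no credentials.
# Action name and endpoint below are illustrative; real routers vary.
SOAP_BODY = """<?xml version="1.0"?>
<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/"
            s:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
  <s:Body>
    <u:SetDNSServer xmlns:u="urn:schemas-upnp-org:service:WANIPConnection:1">
      <NewDNSServer>203.0.113.66</NewDNSServer>
    </u:SetDNSServer>
  </s:Body>
</s:Envelope>"""

headers = {
    "Content-Type": 'text/xml; charset="utf-8"',
    "SOAPAction": '"urn:schemas-upnp-org:service:WANIPConnection:1#SetDNSServer"',
}

# Note what is absent: no password, no token, no user confirmation.
# UPnP assumes anything on the local network is trusted, which is
# exactly the assumption the Flash trick violates.
print("SOAPAction" in headers)
```

The point of the sketch is the missing authentication step, not the exact XML: once a request like this reaches the router, it is treated as coming from a trusted LAN device.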

Cross-Platform Attack

The attack is particularly worrisome because it is cross-platform -- any operating system that supports Flash is susceptible -- and because it is based on features of UPnP and Flash, not bugs that could be easily fixed by Adobe or the router vendors.

Users could avoid this attack by turning UPnP off on their routers, where it is normally enabled by default, but this would cause a variety of popular applications, such as IM (instant-message) software, games and Skype, to break and require manual configuration on the router.

Adobe could make changes to Flash to mitigate the problem, but attackers could most likely also launch this attack using another technique, known as DNS pinning, said Aviv Raff, a researcher who has also blogged about the attack.

"This is a critical issue," he said in an IM interview. "People should turn off UPnP in their devices, and vendors should put UPnP disabled by default in the devices they deliver."

Although this could make life difficult for nontechnical users, Raff believes it would be worth the effort. "It's better than having your traffic owned by malicious people," he said.

However, another security expert said that turning off UPnP would be overkill, considering that online criminals have not even begun using this attack. "Look... if you get hit by a meteor, it's devastating," said Roger Thompson, chief research officer with Grisoft, via IM. "But no one goes around building meteor shelters."

Wednesday, February 27, 2008

What Is Oracle?

Based in Redwood Shores, California, Oracle Corporation is the largest software company whose primary business is database products. Historically, Oracle has targeted high-end workstations and minicomputers as the server platforms for its database systems. Its relational database was the first to support the SQL language, which has since become the industry standard.

Along with Sun Microsystems, Oracle has been one of the leading champions of network computers.

Tuesday, February 26, 2008

Data Saver

Michael Kogon, 38, founder and CEO of IT consulting and interactive marketing firm Definition 6, has a simple philosophy when it comes to sensitive data: "If you don't need it, don't store it."

Common sense is a big component of data security. "You can solve 99 percent of your problems with 1 percent of the effort by doing simple things," says Ira Winkler, security expert and author of Spies Among Us. For digital data, a strong bundle of security software, including anti-virus, anti-spyware, firewalls and encryption, is a must.

While digital data gets a lot of attention, paper data can be especially vulnerable. "We have both personal shredders and a company shredding area," says Kogon, who projects 2008 sales of $11 million for his Atlanta company. "We have a third party handle our document disposal." Winkler reminds entrepreneurs not to overlook things like notes written on scraps of paper or passwords taped to monitors.

And the same security policies that are used in the main office should be enforced with telecommuters, says Winkler. To protect off-site data, Definition 6 gives workers remote access to computers that are monitored by the central office and keeps comprehensive logs of who accesses what.

Laptops are another source of concern for mobile entrepreneurs. Biometrics, password protection and encryption can all be used to stymie laptop thieves. Still, don't let your laptop out of your sight, and be aware of where you're using it. But the best policy is to prevent sensitive information from ever making its way onto a laptop. "Our consultants are trained to leave business information that is confidential on the encrypted servers and to not download it locally," says Kogon. Most important, train employees to handle sensitive information properly. It's not just about protecting data; it's about protecting your business and your customers.

Monday, February 25, 2008

Wireless Broadband Test Continues

The White Spaces Coalition disputes reports that a prototype device failed in FCC testing.


A wireless broadband device tested by the U.S. Federal Communications Commission for interference with television and wireless microphone signals has not failed, as a broadcasting group claimed last week, members of the White Spaces Coalition say.

The National Association of Broadcasters (NAB) on Feb. 11 said a so-called prototype device submitted by Microsoft lost power during tests being run by the FCC. The power failure comes after another white spaces device malfunctioned in tests run by the FCC last year.

But Ed Thomas, a tech advisor to the White Spaces Coalition and a former chief of the FCC's Office of Engineering and Technology, said that while the device's power supply failed after many hours of continuous testing, the failure did not cause the device to interfere with television signals.

Thomas, during a press briefing, said the NAB was engaged in "rhetoric" designed to complicate the FCC's device testing. "Let this be based on science, not politics," Thomas said of the ongoing testing at the FCC. "Let the facts prevail."

The White Spaces Coalition, including Microsoft, Philips, Dell and Google, is asking the FCC to allow wireless devices to operate in the so-called white spaces of the television spectrum, space allocated for television signals but vacant. The coalition wants the white spaces opened up to give consumers more wireless broadband options, and the white spaces devices would be targeted at longer-range broadband than traditional Wi-Fi.

If the FCC approves the devices this year, commercial white spaces wireless devices could be available as soon as late 2009.

The FCC's in-house testing of four devices will continue for a couple more weeks, then the agency will conduct field tests for up to eight weeks. A second white spaces device has experienced no power failure problems, Thomas said.

But television broadcasters have opposed the coalition, saying it's likely that the wireless devices will interfere with TV signals. The NAB has suggested the FCC should focus instead on a successful transition of TV stations to digital broadcasts, required by February 2009.

White spaces devices are "not ready for prime time," said Dennis Wharton, the NAB's executive vice president.

Wharton responded to Thomas' assertion that the Microsoft device did not interfere with TV signals.

"The devices they've tested haven't performed the way they were expected to perform," Wharton added. "That, in our view, constitutes a failure."

Friday, February 22, 2008

Document Format Battle Takes Shape

A European meeting next week focuses on revising the specifications for Microsoft's Office Open XML (OOXML), which the company hopes will become an ISO standard.

Microsoft faces a tough battle starting Monday at a meeting in Geneva that will influence how widely the company's latest document format will be used in the future.

Representatives of national standards bodies worldwide will attend the ballot resolution meeting (BRM) held by the International Organization for Standardization (ISO). They'll be focused on revising the specifications for Microsoft's Office Open XML (OOXML), which the company hopes will become an ISO standard.

Although OOXML has already been approved by an industry standards body, Ecma International, the ISO designation is key, since governments look to the ISO when choosing technical standards.

OOXML failed to become an ISO standard during a vote last September, but it has another chance if enough countries can agree on the revisions. Those countries will then have one month to vote on the new specification after the BRM.

Opposition

But Microsoft faces stiff opposition from companies and industry groups behind OpenDocument Format (ODF), which was approved by the ISO in 2006 as a standard. Those opponents contend that having more than one document standard makes software purchasing decisions harder for organizations.

In fact, those opponents are staging their own conference in the same venue in Geneva as the ISO meeting.

OpenForum Europe, an organization supporting ODF and open standards, has invited prominent OOXML critics and advocates of open standards to speak. They include Vint Cerf, vice president and chief Internet evangelist at Google and Hakon Wium Lie, chief technology officer of Opera, the Oslo-based browser developer.

The timing and venue choice weren't a coincidence, said Graham Taylor, chief executive of OpenForum Europe. The organization has also scheduled its sessions not to conflict with the BRM, so delegates can attend.

The shrewd timing is clearly aimed at sinking OOXML, which critics say is an overly complex standard and favors Microsoft in intricate, technical ways, even though the specification is open.

"We think there are a much wider set of issues that need to be considered by the national bodies when they come to make their vote," Taylor said.

More Than One Standard

Microsoft believes there is room for more than one standard. "We do not fundamentally believe that you have a uniform single view of technology ... in order to have interoperability," said Jason Matusow, senior director of interoperability, on Wednesday during a company event with journalists in London.

Microsoft also cites several projects under way to create translators to convert files from OOXML to ODF, and vice versa. However, Microsoft argues that the features of OOXML, a version of which is now used in Office 2007, are richer than ODF's.

The meeting of the two sides at one venue has led some to speculate about heightened tension around what's already been an acrimonious debate. But Taylor said Microsoft representatives will attend OpenForum Europe sessions, and that there won't be any "heckling."

Taylor said he has assured the BRM conveners there will be no trouble. Press and observers can attend OpenForum Europe sessions, but the BRM is open only to official delegates from the 87 countries participating.

After the BRM is over, countries will look at the revisions to OOXML and then cast a vote. To become an ISO standard, a specification must win the support of two-thirds of national standards bodies that participated in work on the proposal, known as P-members. It also must receive the support of three-quarters of all voting members.

During the September vote, OOXML failed, winning support from only 53 percent of voting P-members, below the 67 percent needed. Among all voting members, OOXML received 74 percent, 1 percent shy of the mark.
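The two thresholds can be checked mechanically. This is a hypothetical helper, not official ISO tooling; it just encodes the approval rule as described above:

```python
def iso_ballot_passes(p_member_yes: float, all_member_yes: float) -> bool:
    """JTC 1 approval rule as described above: at least two-thirds of
    voting P-members and three-quarters of all voting members must be
    in favor. Arguments are 'yes' fractions between 0.0 and 1.0."""
    return p_member_yes >= 2 / 3 and all_member_yes >= 3 / 4

# The September result reported here: 53% of P-members, 74% overall.
print(iso_ballot_passes(0.53, 0.74))  # False -- both thresholds missed
```

Note that both conditions must hold simultaneously, which is why the 74 percent overall figure, just 1 percent short, was not enough on its own.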

This time around, countries are allowed to change their votes, adding another element of uncertainty around OOXML's fate. If the format is not approved, it means Microsoft might be forced to rethink its strategy around document formats if it wants government IT contracts.

OOXML Here for a While Either Way

Either way, the sheer dominance of Microsoft's Office suite means some version of OOXML will be used for years to come. The company said its partners are already using it in their own applications, but ODF supporters counter that no vendor has come close to fully implementing the 6,000-page specification.

One of Microsoft's partners is Fractal Edge, a U.K. company that makes software that builds visual representations of complex financial data, which it calls "fractal maps." But displaying the fractal maps in older Excel versions required sending an additional configuration file for the map to be compatible with Microsoft's binary file format, said Gervase Clifton-Bligh, vice president of product strategy.

The company has written an add-in for Excel 2007 to display the maps. OOXML container files can easily hold additional elements such as graphics -- or map configuration files.

Whether OOXML becomes a standard won't make a huge difference to the company's business, since all of its customers use Excel, Clifton-Bligh said. But if other companies store their data in Open XML -- even if they are using a different spreadsheet program -- it would be easier to move their data into Excel, he said.

"We won't make an add-in for every spreadsheet," Clifton-Bligh said.

The British Library isn't taking a stand on whether OOXML should become an ISO standard or not, said Richard Boulderstone, director of e-Strategy.

The library is facing the long-term problem of how to continue to make its digital collection available. Universal agreement and implementation of a standard is most helpful, Boulderstone said. Also important is how a standard is built into products.

"You can create any kind of standard but there's always going to be different implementations," he said, adding that those characteristics can affect how a document is archived and viewed in the future.

Thursday, February 21, 2008

Microsoft Scrambles to Quash 'Friendly' Worm Story

Researchers' suggestions of using wormlike software to distribute patches draws security concerns.

Microsoft is moving to counter some scathing comments regarding a security paper authored by researchers at its Cambridge, England, facility.

The paper, "Sampling Strategies for Epidemic-Style Information Dissemination," looks at how worms sometimes inefficiently spread their code.

The research explores how a more efficient method could, for example, be used for distributing patches or other software. The advantage would be that patches could be distributed from PC to PC, rather than from a central server.

That method would reduce the load on a server, and patches would be distributed faster. But the patches would have the same qualities as a computer worm, a generally malicious file.
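The efficiency argument is easy to see in a toy model: a central server must push the patch to every one of its N clients itself, while epidemic-style spread lets each patched machine become a distributor, so coverage can roughly double each round. The sketch below is purely illustrative and is not the model from the Microsoft paper:

```python
def rounds_to_patch(n_hosts: int) -> int:
    """Idealized epidemic spread: every patched host patches exactly
    one new peer per round, so coverage doubles each round."""
    patched, rounds = 1, 0
    while patched < n_hosts:
        patched *= 2   # each patched host recruits one more
        rounds += 1
    return rounds

# A million hosts are covered in about 20 doubling rounds, versus
# a million individual pushes from a single central server.
print(rounds_to_patch(1_000_000))  # 20
```

Real epidemic dissemination is messier -- hosts contact peers at random and often re-contact already-patched ones, which is the sampling inefficiency the paper examines -- but the exponential-versus-linear contrast is the core of the server-load argument.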

Poor Reception

Since a story about the paper appeared last week in the New Scientist magazine, the paper has been roundly assailed.

"This is a stupid idea," wrote Bruce Schneier, a security expert, author and CTO of Mountain View, Calif.-based enterprise security vendor BT Counterpane, on Tuesday, before quoting a passage from the New Scientist story on his blog.

Schneier wrote that the idea of so-called "benevolent worms" comes up every few years.

However, a worm is designed to run without the consent of a user, which doesn't make it a good method of software distribution, Schneier wrote. The worm patching technique could also make the patches hard to uninstall or interrupt during installation, he wrote.

Worms designed to distribute software patches could also be hacked to distribute malicious software, wrote Randy Abrams, director of education for security vendor Eset, in his regular e-mail commentary.

Forced patching is also troublesome since some patches may not be compatible with critical software, Abrams wrote.

"Breaking into computers is a bad idea," Abrams said.

Microsoft Clarifies Concept

A Microsoft spokesman said on Monday that the New Scientist story is not inaccurate. However, the writer of the story "sexed" up the research paper a bit, particularly with the headline that used the phrase "friendly worms," the spokesman said.

In response to the criticism, Microsoft said it doesn't intend to develop patch worms.

"This was not the primary scenario targeted for this research," according to a statement.

The company also said it will continue to let customers decide how and when they apply security updates.

One of the paper's authors, Milan Vojnovic, said in a statement that there were no plans to incorporate the ideas into Microsoft's products. Efforts to reach Vojnovic for comment were unsuccessful.

Wednesday, February 20, 2008

What Is a Computer System?

A computer system is a complete, working computer. It includes not only the computer itself, but also any software and peripheral devices that are necessary to make it function. Every computer system, for example, requires an operating system.

Tuesday, February 19, 2008

Health Care Reform Must Include IT Issues, Group Says

Congress urged to require standards in health care databases so private firms can exchange information as needed.

The U.S. Congress needs to pass health-care IT legislation before private companies develop multiple systems that don't talk to each other, two advocacy groups say.

Members of the Health IT Now Coalition and the Information Technology Industry Council (ITI) urged Congress to move ahead with health IT legislation such as the Promoting Health Information Technology Act. The bill would establish a public/private group to recommend health IT standards and certification and would budget US$163 million a year for health-care providers to adopt health IT products, such as electronic health records.

Health technologies can help improve health-care quality, reduce costs and encourage changes in treatment, said former U.S. Representative Nancy Johnson, co-chairwoman of the Health IT Now Coalition.

Health IT is "going to produce radical change," Johnson said at a news conference. "It's going to radically improve the quality of health care that Americans receive."

With health-care costs continuing to climb, moving to an electronic system that reduces paper and medical errors is the best hope to extend health care to U.S. residents who are uninsured, she added. "It is the only way that we guarantee to Americans of every age that our health-care system will continue to deliver the state-of-the-art medicine for which it has been known worldwide," Johnson said.

The Promoting Health Information Technology Act has stalled in the House of Representatives and a similar piece of legislation, the Wired for Health Care Quality Act, has stalled in the Senate.

Privacy Concerns

Some groups, including Patient Privacy Rights, have raised concerns that the legislation doesn't adequately address patient privacy issues. "The Senate Wired Act has no privacy protections or language ensuring patient control of health records," the group said on its Web site. "It must not pass unless patients have the right to keep their health records private."

Privacy and security must be major components of a health IT bill, Johnson said at the news conference. But she and Rhett Dawson, ITI's president and CEO, said Congress should pass a health IT bill before vendors develop multiple systems that don't interoperate. "The public interest is in interoperability," Johnson said.

A health IT bill would be a major accomplishment that lawmakers could show to voters before the November elections, Dawson said. "We believe the time to act is now," he added.

In addition to the news conference, a group of IT vendors, including Lexmark, NetApp and NCR, demonstrated health technologies at a congressional office building last week.

EMC's RSA division demonstrated secure Web sign-on technologies that patients and health-care providers can use. RSA's Secure Web Access for Patient and Provider Portals allows administrators to set access rights based on who's signing in, said Seth Geftic, RSA product marketing manager. For example, a doctor could have access to more patient records than a nurse or an insurance provider, he said.
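Access rights of the kind Geftic describes can be sketched as a simple mapping from roles to the record fields each role may view. This is a hedged illustration of role-based access in general, not RSA's actual product API; the roles, fields, and record below are all hypothetical.

```python
# Hypothetical role-to-fields mapping: each role sees only its subset
# of a patient record. Not RSA's real interface; for illustration only.
ROLE_FIELDS = {
    "doctor": {"name", "history", "medications", "lab_results"},
    "nurse": {"name", "medications"},
    "insurer": {"name"},
}

def visible_fields(record: dict, role: str) -> dict:
    """Return only the fields of `record` that `role` may view."""
    allowed = ROLE_FIELDS.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}

record = {"name": "Jane Doe", "history": "...",
          "medications": ["aspirin"], "lab_results": "..."}
assert set(visible_fields(record, "doctor")) == set(record)
assert set(visible_fields(record, "nurse")) == {"name", "medications"}
assert set(visible_fields(record, "insurer")) == {"name"}
```

An unknown role simply sees nothing, which is a safe default for this kind of filter.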

In addition, RSA provides authentication technologies that can be used to recognize users when they first sign in. For example, RSA Adaptive Authentication can use publicly available information, such as height, car loans and old addresses, to authenticate a user entering a health IT Web site for the first time, Geftic said.

"You don't even have to enter the last four digits of your Social Security number," added Shannon Kellogg, director of information security policy in EMC's office of government relations.
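The knowledge-based step Geftic and Kellogg describe can be sketched as challenge questions drawn from data the site already holds about the user. This is a generic illustration; the facts, questions, and function names below are invented, not RSA Adaptive Authentication's real behavior.

```python
import secrets

# Hypothetical store of facts the site already knows about each user.
KNOWN_FACTS = {
    "jane": {"prior street": "Elm St",
             "car loan lender": "Acme Bank",
             "height": "5'6\""},
}

def challenge(user):
    """Pick a random question (and its expected answer) for `user`."""
    facts = KNOWN_FACTS[user]
    question = secrets.choice(sorted(facts))
    return question, facts[question]

def verify(user, question, answer):
    """Check the supplied answer against the stored fact."""
    return KNOWN_FACTS[user].get(question, "").lower() == answer.lower()

question, expected = challenge("jane")
assert verify("jane", question, expected)
assert not verify("jane", "prior street", "Oak St")
```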

Monday, February 18, 2008

SanDisk Boosts Mobile Storage

A new 16GB embedded flash drive designed for use in mobile devices is scheduled for release this summer.


SanDisk has announced a 16GB embedded flash drive aimed at satisfying the need for increased storage from the latest generation of mobile devices.

Unlike microSD and miniSD cards, the iNAND flash drive is an embedded device. The 12mm x 16mm iNAND 16GB device is expected to be available for sampling to mobile handset vendors in the second quarter of this year.

It doubles the storage capacity of SanDisk's previous 8GB iNAND card within the same standard JEDEC package. SanDisk says it is able to do this thanks to recent advances in multi-level cell (MLC) NAND flash technology.

The iNAND 16GB drive made its debut at the Mobile World Congress in Barcelona last week. Product marketing director David Guidry said there that he expects mass production to begin sometime in the third quarter.

A 32GB iNAND is also slated for introduction in the second half of 2008, although this looks more likely towards the end of the year. The iNAND devices continue to use the SD interface.

"We see a lot of interest from OEMs for high capacity storage," said Guidry, highlighting the fact that this card is aimed at the new generation of mobile handsets, which are combining more and more storage intensive features such as music players, digital cameras, video recording, and GPS capabilities.

Sony Ericsson is one handset maker that currently uses the iNAND drive, and according to Guidry, "a few others are in the pipeline."

"Featuring a standard package interface, the 16GB iNAND EFD is designed to be quickly integrated into various handset designs for both storage capacity scalability and a smooth migration to future products and functionalities, such as system boot," said Dan Inbar, general manager of SanDisk's mobile handset vendors division in a statement.

SanDisk unveiled a 4GB iNAND flash device back in September 2005 at the CTIA Wireless IT & Entertainment Expo.

Looking forward, Guidry said that it could be feasible to expect a 64GB iNAND device during 2009. "Look at our history, we double roughly every twelve months, although we cannot guarantee we will continue at that rate," he said. "But so far we have been fairly consistent with that time frame."

Saturday, February 16, 2008

Amazon's S3 Down for Several Hours

Amazon's data storage service was down for several hours on Friday morning, leaving businesses that rely on the service offline.

As of around 9 a.m. PST, the issue had been resolved, according to an Amazon employee posting on a user group forum. "This morning's issue has been resolved and the system is continuing to recover," wrote Kathrin, the Amazon employee, on the forum.

She said that the company plans to post technical information about what exactly happened, but that the priority is to make sure the system is stable.

What is S3?

Companies use Amazon's Simple Storage Service, known as S3, to store and quickly retrieve large amounts of data, often to run Web sites and services.

A press spokesman said that one of three geographic locations for the service was unreachable for about two hours, but that it was operating at 99 percent of normal performance before 7 a.m. on the West Coast. "We've been communicating with our customers all morning via our support forums and will be providing additional information as soon as we have it," said Drew Herdener in a statement.

Many customers appeared not to have gotten that communication. They complained on the forum about a lack of information from Amazon about the outage and when it would be fixed. One suggested that Amazon could have at least posted a message on the front page of the Web services site, so that customers would be aware that the problem wasn't on their end.

Others wrote about the problems that the outage was causing their businesses. "It's becoming very embarrassing for us here," one wrote. "We desperately need an update... it's a huge hit on our reputation."

Many of the users said that the service was down for around three hours.

Gustavo, a user in Brazil, said that his company hosts more than 30,000 images from a large television station in Brazil. "Now we are having several problems because of this S3 issue," he wrote. "My company chose to work with Amazon because of its reliability."

Late last year, Amazon introduced a new service level agreement for S3 that guaranteed 99.9 percent uptime each month. If the service slips below that level, the company promised to provide service credits to certain users.
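For scale, a 99.9 percent monthly uptime guarantee allows surprisingly little downtime, as a quick back-of-the-envelope calculation shows (assuming a 30-day month):

```python
# Allowed downtime under a 99.9% monthly uptime SLA, for a 30-day month.
minutes_per_month = 30 * 24 * 60              # 43,200 minutes
allowed_downtime = minutes_per_month * (1 - 0.999)

print(round(allowed_downtime, 1))             # → 43.2 (minutes)

# The roughly three-hour outage many S3 users reported far exceeds it.
outage = 3 * 60
print(outage > allowed_downtime)              # → True
```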

Friday, February 15, 2008

Inside Net Neutrality: Find an Honest ISP

Having ISP choices is a good thing, particularly if yours is engaging in filtering or traffic shaping or in any way throttling back your bandwidth.

If you've been online long enough, you probably remember the days when your local Internet service provider was just some guy down the block with fancy equipment, and he was the only choice in town. But those days are largely gone for most of us and, unless you live in a rural area, you probably have numerous options when it comes to Internet connectivity.

And having choices is a good thing, particularly if your ISP is engaging in filtering or traffic shaping or in any way throttling back your bandwidth. If your ISP is guilty of any of the practices outlined in part one of our series on network neutrality, there's no reason to keep doing business with that service.

The practice of slowing down certain types of traffic goes against the principle of network neutrality, the idea that all Internet traffic should be treated equally. While network neutrality has plenty of advocates, the best way to support it is to vote with your wallet by supporting ISPs that don't practice data discrimination.

So how do you find an ISP that isn't throttling traffic? One easy solution is to look to the Web site of the popular BitTorrent client Azureus. The Azureus Wiki page maintains a list of ISPs that in some way limit BitTorrent or encrypted traffic.

One way to check is to run a speed test--there are numerous such tests available online at sites like Speakeasy and BroadbandReports.com.

With a baseline in place, try downloading a torrent. If the torrent traffic is notably slower, it could indicate throttling. Another check is to experiment with encryption, as discussed in part two of our series: if your speeds improve when you encrypt your traffic, that too suggests your ISP is shaping traffic.

Keep in mind, however, that other variables, such as the number of peers and seeds and their connection rates, can influence your download speeds. Neither of the methods outlined above can tell you with 100-percent certainty that your provider is shaping your traffic.
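As a rough illustration only, the two checks can be combined into a single heuristic. The speeds and the 50 percent threshold below are invented for the example; as the caveats above note, no such test is conclusive.

```python
# Rough heuristic combining the two checks. Speeds are in KB/s and the
# 50% threshold is an assumption for illustration -- peers, seeds, and
# ordinary congestion can produce the same symptoms.
def throttling_suspected(baseline_kbps, torrent_kbps,
                         encrypted_torrent_kbps, ratio=0.5):
    """Flag possible shaping when plain BitTorrent runs far below the
    speed-test baseline AND encryption restores most of the speed."""
    slow_torrent = torrent_kbps < baseline_kbps * ratio
    encryption_helps = encrypted_torrent_kbps > torrent_kbps * (1 / ratio)
    return slow_torrent and encryption_helps

# A 600 KB/s line where plain torrents crawl at 40 KB/s but encrypted
# torrents reach 450 KB/s is consistent with traffic shaping.
print(throttling_suspected(600, 40, 450))    # → True
print(throttling_suspected(600, 500, 520))   # → False
```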

Hopefully, however, traffic shaping may soon be relegated to the list of bad Internet ideas of yore, along with hourly rates and Internet taxes. The FCC could nix the practice, which does interfere with legitimate businesses that use applications like BitTorrent to distribute large files such as software and video.

Just this week, Rep. Ed Markey (D-Mass.), chairman of the House Energy and Commerce Committee's subcommittee on telecommunications and the Internet, introduced a network neutrality bill that would require the FCC to determine whether Internet service providers are traffic shaping, and whether it is legal for them to charge extra for access.

But until the government acts, the best way to make sure you aren't getting taken is to encrypt your traffic, or run it through an SSH tunnel or VPN, and to make sure your ISP is giving you the service you've paid for.

Thursday, February 14, 2008

Storm Worm Reappears as Malicious Valentine

Security vendor McAfee warns of a Valentine come-on that downloads a virus instead of a greeting.


A Valentine's themed outbreak of the Storm worm has been detected.

Malicious e-mail messages circulating around the world contain a link directing users to a website where they can supposedly download a Valentine's card; in fact, the site infects their PCs with the Storm bug. The campaign mirrors the fake Christmas and New Year messages seen in previous months.

According to Greg Day, security analyst at McAfee, the virus will try to steal personal information from your PC, bring down its security defenses and use your PC to send out millions of junk emails.

"There are about 10 million PCs worldwide infected with the Storm worm. These threats have suddenly spiked from 0 percent of all spam emails to 1.5 percent and they are continuing to rise as we draw closer to Valentine's Day and more people are fooled into downloading the malicious file," he commented.

"With all the hype that surrounds Valentine's Day, it was only to be expected that they (the people behind Storm) would use a similar tactic and exploit people's eagerness to receive Valentine's cards," added Diego d'Ambra of email-security company SoftScan.

Wednesday, February 13, 2008

When is a server not a server?

When it's a Storage Server.

Ask people what a storage server is, and you can expect to hear a variety of answers. Some will say it is a regular server with added features, a few describe it as a stripped-down box dedicated to a specialized function, and still others believe the term refers only to a network attached storage (NAS) box. This article will attempt to define a storage server, differentiate it from a regular server, and give examples of storage servers on the market today.

Not Your Average Server
The typical server is configured to perform multiple functions. It operates as a file, print, application, Web, or miscellaneous server. As such, it must have fast chips, plenty of RAM, and plenty of internal disk space to cope with whatever end users decide to do with it. Not so with a storage server. It is designed for a specific purpose, and thus configured differently. It may come with a little extra storage or a great deal more. A general-purpose server typically has five or fewer disks inside. A storage server, on the other hand, has at least six, and usually 12 to 24.

Storage servers are normally individual units. Sometimes they are built into a 4U rackmount. Alternatively, they can consist of two boxes: a storage unit and a server located nearby. Both boxes can then be placed side-by-side in a rack. The Sun StorEdge 3120 storage unit and SunFire X4100 server, for example, can be combined into a storage server and placed in a rack.

Apart from extra disks, what else is different about storage servers? In many cases, they come with a host of specialized services. These can include storage management software, extra hardware for higher resilience, a range of RAID configurations, and extra network connections so that more user desktops can be connected.

Just a NAS Box?
Interestingly, some vendors define storage servers purely in terms of NAS. A NAS appliance (also known as a NAS filer) generally has a slimmed-down OS and file system, and only processes I/O requests by the main file sharing protocols. The big advantage of the NAS architecture is that it enables storage to be rapidly added by plugging the appliance into a network hub or switch.

HP has five ProLiant models available as general-purpose servers or as storage servers/NAS filers; each has the same basic hardware configuration. If licensed as a storage server, the user may not run general-purpose applications on that server. If the same ProLiant server is being used as a regular server, however, applications can be run on it. To sweeten the deal, HP prices its storage servers a little lower than their general-purpose siblings. In addition, HP's NAS-based storage servers have extra functionality built into the operating system: storage-specific management tools, quota features, storage reporting capabilities, and a Web-based user interface that makes it easier to configure file and print services. These features are not available on its general-purpose servers.

So is NAS really just a storage server? The answer varies depending on whom you ask, but there appears to be very little difference between them. NAS, it turns out, isn't really storage networking. Actual network-attached storage would be storage attached to a storage-area network (SAN). NAS, on the other hand, is just a specialized server attached to a local-area network. All it does is make its files available to the users and applications on that network, much the same as a storage server.

From nowhere in the mid-1990s, the NAS market will exceed $2 billion by 2008, Gartner projects, with an annual growth rate of 9 percent. And those numbers don't take into account a new NAS flavor called the NAS gateway. These gateways act as a file-serving portal into a SAN: disk arrays in a Fibre Channel SAN have a storage server on the perimeter acting as a NAS gateway. This is one way to marry up NAS and SAN assets. There are basically two flavors of storage servers: NAS appliances that contain the disk storage themselves, and NAS gateways (HP's ProLiant DL585 storage server is one example of a NAS gateway).

What's Missing
While some vendors use the same box as a plain vanilla server, others use a scaled-down version that is adequate for file serving. A storage server is considered to be an optimized appliance designed to feed information, via a network, to a user or an application. As such, it is not typically compute heavy, but it has been designed from the ground up to provide specific I/O capabilities along with data protection capabilities. A regular server has to be generic; it doesn't know what kind of load it will face, and gaming, for example, is much different from running a database. A storage server, such as a NAS box, is a contained appliance that does one thing really well, like file serving. A regular server will typically have more processing power, more RAM, and a more generic I/O structure and file system. Lacking these features, most storage servers deliver about 50 percent of the performance of a regular server for the same general-purpose function.

This trend toward specialized computing elements is far from new. TCP/IP routing, for example, was a function that every operating system ran, until Cisco came out with a dedicated box that did it far better than a general-purpose server could. A storage server is a specialized server or appliance in the same sense. Using a vanilla server for file serving can lead to problems: administering a general-purpose server is more complex, and someone might be tempted to use the machine for multiple functions. Dedicated storage servers, therefore, have become the norm.

Not surprisingly, Microsoft introduced Windows Storage Server 2003 to distinguish it from general servers running the Windows 200x operating system. Windows Storage Server 2003 is a dedicated file and print server based on Windows Server 2003 and tailored to networked storage. It supports file serving and backup and replication of stored data. It can also be used to consolidate multiple file servers into a single box.

Windows Storage Server 2003 includes advanced availability features, such as point-in-time data copies, replication and server clustering. It is available in pre-configured NAS appliances from vendors such as HP and Dell, in capacities ranging from a few hundred gigabytes to several terabytes. IDC reports that NAS appliances running Windows now account for about half of all appliances on the market.

Storage Servers vs. Disk Arrays
Just as there is some confusion between ordinary servers and storage servers, there is also sometimes a misunderstanding between storage servers and disk arrays. Exactly where does one end and the other begin? A storage server can have as many as 24 disks, enough to qualify as an array. Disk arrays, however, can have hundreds of disks. So where do you draw the line?

A storage server is usually stand-alone and not connected to other servers. Multiple servers, however, typically connect to a disk array. Disk arrays, too, often connect to a server that could be styled a storage server: the storage server is the intelligence that sits in front of the array. In this arrangement, the server can manage several tiers of storage and can even arrange the replication of data from one tier to another. In short, a storage server serves the storage and the disk array is the storage: the storage server speaks files to users and applications over Ethernet, whereas the disk array is a low-level block device that speaks only to an operating system.
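The file-versus-block distinction can be illustrated in a few lines of Python: a file server names data by path and lets a file system find the bytes, while a block device names data only by offset and length. The file name here is arbitrary, and `os.pread` requires a POSIX system.

```python
import os

# File-level access (what a storage server / NAS presents): name a
# file, and the server's file system locates the bytes.
with open("report.txt", "w") as f:
    f.write("quarterly numbers")
with open("report.txt") as f:
    print(f.read())                    # → quarterly numbers

# Block-level access (what a disk array presents): name an offset and
# length on a device; there is no notion of files at this layer. Here
# we read the same bytes by offset through a raw file descriptor.
fd = os.open("report.txt", os.O_RDONLY)
print(os.pread(fd, 7, 10).decode())    # 7 bytes at offset 10 → numbers
os.close(fd)
```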

Tuesday, February 12, 2008

The Differences Between Hubs, Switches and Routers

Some technicians have a tendency to use the terms routers, hubs and switches interchangeably. One minute they're talking about a switch. Two minutes later they're discussing router settings. Throughout all of this, though, they're still looking at only the one box. Ever wonder what the difference is among these boxes? The functions of the three devices are all quite different from one another, even if at times they are all integrated into a single device. Which one do you use when? Let's take a look...

Hub, Switches, and Routers: Getting Started with Definitions

Hub
A common connection point for devices in a network. Hubs are commonly used to connect segments of a LAN. A hub contains multiple ports. When a packet arrives at one port, it is copied to the other ports so that all segments of the LAN can see all packets.

Switch
In networks, a device that filters and forwards packets between LAN segments. Switches operate at the data link layer (layer 2) and sometimes the network layer (layer 3) of the OSI Reference Model and therefore support any packet protocol. LANs that use switches to join segments are called switched LANs or, in the case of Ethernet networks, switched Ethernet LANs.

Router
A device that forwards data packets along networks. A router is connected to at least two networks, commonly two LANs or WANs, or a LAN and its ISP's network. Routers are located at gateways, the places where two or more networks connect. Routers use headers and forwarding tables to determine the best path for forwarding the packets, and they use protocols such as ICMP to communicate with each other and configure the best route between any two hosts.

The Differences Between These Devices on the Network
Today most routers have become something of a Swiss Army knife, combining the features and functionality of a router and switch/hub into a single unit. So conversations regarding these devices can be a bit misleading — especially to someone new to computer networking.

The functions of a router, hub and a switch are all quite different from one another, even if at times they are all integrated into a single device. Let's start with the hub and the switch since these two devices have similar roles on the network. Each serves as a central connection for all of your network equipment and handles a data type known as frames. Frames carry your data. When a frame is received, it is amplified and then transmitted on to the port of the destination PC. The big difference between these two devices is in the method in which frames are being delivered.

In a hub, a frame is passed along or "broadcast" to every one of its ports. It doesn't matter that the frame is only destined for one port. The hub has no way of distinguishing which port a frame should be sent to. Passing it along to every port ensures that it will reach its intended destination. This places a lot of traffic on the network and can lead to poor network response times.

Additionally, a 10/100Mbps hub must share its bandwidth with each and every one of its ports. So when only one PC is transmitting, it will have access to the maximum available bandwidth. If, however, multiple PCs are transmitting, that bandwidth must be divided among all of those systems, which degrades performance.

A switch, however, keeps a record of the MAC addresses of all the devices connected to it. With this information, a switch can identify which system is sitting on which port. So when a frame is received, it knows exactly which port to send it to, without significantly increasing network response times. And, unlike a hub, a 10/100Mbps switch will allocate the full 10/100Mbps to each of its ports. So regardless of the number of PCs transmitting, users will always have access to the maximum amount of bandwidth. It's for these reasons that a switch is considered a much better choice than a hub.
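The difference in delivery behavior can be sketched in a toy simulation: a hub repeats a frame out of every port except the one it arrived on, while a switch consults its MAC table and sends only to the destination's port. Port numbers and MAC addresses are invented; the sketch also mimics the way a real switch floods frames for destinations it hasn't learned yet.

```python
# Toy models of frame delivery. Each function returns {port: frame}.
def hub_deliver(ports, in_port, frame):
    """A hub repeats the frame out of every port but the ingress one."""
    return {p: frame for p in ports if p != in_port}

def switch_deliver(mac_table, in_port, frame):
    """A switch forwards only to the learned port for the destination,
    flooding (like a hub) when the destination MAC is unknown."""
    dst_port = mac_table.get(frame["dst"])
    if dst_port is None:
        return {p: frame for p in mac_table.values() if p != in_port}
    return {dst_port: frame}

frame = {"dst": "aa:bb:cc:dd:ee:02", "payload": "hello"}
print(sorted(hub_deliver([1, 2, 3, 4], 1, frame)))   # → [2, 3, 4]

mac_table = {"aa:bb:cc:dd:ee:02": 2, "aa:bb:cc:dd:ee:03": 3}
print(sorted(switch_deliver(mac_table, 1, frame)))   # → [2]
```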

Routers are completely different devices. Where a hub or switch is concerned with transmitting frames, a router's job, as its name implies, is to route packets to other networks until that packet ultimately reaches its destination. One of the key features of a packet is that it not only contains data, but the destination address of where it's going.

A router is typically connected to at least two networks, commonly two Local Area Networks (LANs) or Wide Area Networks (WANs), or a LAN and its ISP's network; for example, your PC or workgroup and EarthLink. Routers are located at gateways, the places where two or more networks connect. Using headers and forwarding tables, routers determine the best path for forwarding the packets. Routers use protocols such as ICMP to communicate with each other and configure the best route between any two hosts.
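A forwarding-table lookup of the kind described above can be sketched with Python's standard `ipaddress` module: the router picks the most specific (longest-prefix) route that matches the destination address. The networks and interface names here are illustrative.

```python
import ipaddress

# A tiny forwarding table: network prefix → outgoing interface.
routes = {
    ipaddress.ip_network("0.0.0.0/0"): "isp_uplink",      # default route
    ipaddress.ip_network("192.168.0.0/16"): "lan_switch",
    ipaddress.ip_network("192.168.10.0/24"): "workgroup",
}

def next_hop(dst):
    """Longest-prefix match: the most specific matching route wins."""
    addr = ipaddress.ip_address(dst)
    matches = [net for net in routes if addr in net]
    best = max(matches, key=lambda net: net.prefixlen)
    return routes[best]

print(next_hop("192.168.10.7"))   # → workgroup
print(next_hop("192.168.99.1"))   # → lan_switch
print(next_hop("8.8.8.8"))        # → isp_uplink
```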

Today, a wide variety of services are integrated into most broadband routers. A router will typically include a four- to eight-port Ethernet switch (or hub) and a Network Address Translator (NAT). In addition, they usually include a Dynamic Host Configuration Protocol (DHCP) server, a Domain Name System (DNS) proxy server and a hardware firewall to protect the LAN from malicious intrusion from the Internet.

All routers have a WAN Port that connects to a DSL or cable modem for broadband Internet service and the integrated switch allows users to easily create a LAN. This allows all the PCs on the LAN to have access to the Internet and Windows file and printer sharing services.

Some routers have a single WAN port and a single LAN port and are designed to connect an existing LAN hub or switch to a WAN. Ethernet switches and hubs can be connected to a router with multiple PC ports to expand a LAN. Depending on the capabilities (kinds of available ports) of the router and the switches or hubs, the connection between the router and the switches/hubs may require either straight-through or crossover cables. Some routers even have USB ports, and, more commonly, wireless access points built into them.

Some of the more high-end or business class routers will also incorporate a serial port that can be connected to an external dial-up modem, which is useful as a backup in the event that the primary broadband connection goes down, as well as a built in LAN printer server and printer port.

Besides the inherent protection provided by NAT, many routers will also have a built-in, configurable, hardware-based firewall. Firewall capabilities can range from the very basic to the quite sophisticated. Among the capabilities found on leading routers are those that permit configuring TCP/UDP ports for games, chat services, and the like, on the LAN behind the firewall.

So, in short, a hub glues together an Ethernet network segment, a switch can connect multiple Ethernet segments more efficiently and a router can do those functions plus route TCP/IP packets between multiple LANs and/or WANs; and much more of course.

Bluetooth to Work With Wi-Fi

A future version of Bluetooth will allow connections to hop on Wi-Fi networks when additional bandwidth is required.

BARCELONA, Spain--A future version of Bluetooth will be able to increase throughput for sending videos, music, and other high-bandwidth applications by using a nearby Wi-Fi network, the group in charge of Bluetooth development says.

Ander Edlund, European marketing director for the Bluetooth Special Interest Group, said the technology enabling Bluetooth connections to hop on neighboring Wi-Fi networks is about a year from being ready. "We're on [Bluetooth version] 2.1 now, so it might be in version 3," Edlund said on the eve of Mobile World Congress, which opened its four-day run Monday.

The technology that will enable Bluetooth to use Wi-Fi is called "Alternate MAC/PHY."

Bluetooth was designed as a way to wirelessly connect devices--a PC and a printer, for example--that doesn't consume as much power as Wi-Fi, but also doesn't move data as quickly or as far.

Bluetooth developers are already working to make their technology compliant with another wireless broadband standard, the WiMedia Alliance's Certified Wireless USB ultra-wideband technology.

However, Certified Wireless USB products have only recently begun to appear (see our review of IOGear's Wireless USB Hub/Adapter), so Bluetooth is turning to a more ubiquitous high-speed wireless technology to help with applications where Bluetooth's own bandwidth (a theoretical maximum of 3 megabits per second) is insufficient. The Bluetooth SIG stressed, though, that it is continuing to work on pairing Bluetooth with Certified Wireless USB.

Alternate MAC/PHY would let a Bluetooth connection use an 802.11x network only when the additional bandwidth was needed; for tasks that can make do with standard Bluetooth speeds, the Bluetooth connection would stop using the Wi-Fi network.
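The selection logic amounts to a bandwidth threshold, which can be sketched in a few lines. The 3Mbps figure comes from the article; the task names and function are invented for illustration.

```python
# Sketch of the Alternate MAC/PHY idea: stay on Bluetooth's own radio
# for low-bandwidth tasks, and borrow a Wi-Fi network only when a task
# needs more than Bluetooth's ~3Mbps theoretical maximum.
BLUETOOTH_MAX_MBPS = 3.0

def pick_radio(required_mbps):
    """Choose the radio for a task based on its bandwidth needs."""
    return "wifi" if required_mbps > BLUETOOTH_MAX_MBPS else "bluetooth"

print(pick_radio(0.1))    # keyboard traffic → bluetooth
print(pick_radio(12.0))   # video transfer  → wifi
```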

Monday, February 11, 2008

First Google Android Phone to Debut This Week?

The Mobile World Congress in Barcelona this week may see the first mobile phone to incorporate Google's open-source Android platform.


The first mobile phone to incorporate Google's open-source Android platform will debut at this week's Mobile World Congress in Barcelona, a source close to British chipmaker ARM Holdings has told Reuters.

According to the source, ARM, a U.K.-based company that makes embedded and graphics processors, as well as processors for mobile devices such as cell phones and PDAs, will introduce a prototype for a telephone that uses Android as its operating platform as soon as today.

Innovation for Cell Phones Wanted

Google and its partners in the Open Handset Alliance unveiled their open source Android platform last November with the goal of spurring third-party developers to create new and innovative applications for mobile devices.

ARM's rumored Android phone is notable because the company is not a member of the Open Handset Alliance, the 34-member multinational group that's dedicated to promoting Android. Other OHA members, such as T-Mobile and Sprint Nextel, have already announced that they plan on releasing Android-based handsets later this year.

Although neither Google nor ARM would comment on the rumored Android phone, ARM did post an announcement on its Web site saying it planned to "showcase visually stunning mobile Internet devices" at the Mobile World Congress this week that will deliver a "full web in your pocket" experience.

The company also says it will be giving demonstrations of other ARM-powered phones such as the LG-KS20 and the Samsung BlackJack II.

Saturday, February 9, 2008

The Difference Between Adware & Spyware

As technology advances and more people come to rely on the Internet for information, leisure, and business, keeping your computer free of advertising can seem a daunting task. Not fitting neatly into either the virus or spam category, spyware and adware are growing concerns for Internet users. At times these programs may invade your privacy and contain malicious code; at the very least, they can be a nuisance on any computer connected to the Internet.

Adware
Adware is considered a legitimate alternative offered to consumers who do not wish to pay for software. Programs, games or utilities can be designed and distributed as freeware. Sometimes freeware blocks features and functions of the software until you pay to register it. Today we have a growing number of software developers who offer their goods as "sponsored" freeware until you pay to register. Generally most or all features of the freeware are enabled, but you will be viewing sponsored advertisements while the software is being used. The advertisements usually run in a small section of the software interface or as a pop-up ad box on your desktop. When you stop running the software, the ads should disappear. This allows consumers to try the software before they buy, and you always have the option of disabling the ads by purchasing a registration key.

In many cases, adware is a legitimate revenue source for companies who offer their software free to users. A perfect example of this would be the popular e-mail program, Eudora. You can choose to purchase Eudora or run the software in sponsored mode. In sponsored mode Eudora will display an ad window in the program and up to three sponsored toolbar links. Eudora adware is not malicious; it reportedly doesn't track your habits or provide information about you to a third party. This type of adware is simply serving up random paid ads within the program. When you quit the program the ads will stop running on your system.

Spyware
Unfortunately, some freeware applications that contain adware do track your surfing habits in order to serve ads targeted at you. When adware becomes intrusive like this, it moves into the spyware category, and it becomes something you should avoid for privacy and security reasons. Due to its invasive nature, spyware has given adware a bad name; many people do not know the difference between the two, or use the terms interchangeably.

Spyware is considered a malicious program and is similar to a Trojan Horse in that users unwittingly install the product when they install something else. A common way to become a victim of spyware is to download certain peer-to-peer file swapping products that are available today.

Spyware works like adware but is usually a separate program that is installed unknowingly when you install another freeware type program or application. Once installed, the spyware monitors user activity on the Internet and transmits that information in the background to someone else. Spyware can also gather information about e-mail addresses and even passwords and credit card numbers.

Because spyware exists as an independent executable program, it can monitor your keystrokes, scan files on the hard drive, snoop on other applications such as chat programs or word processors, install other spyware programs, read cookies, and change the default home page in the Web browser, all while relaying this information back to the spyware author, who will either use it for advertising and marketing purposes or sell it to another party.

Licensing agreements that accompany software downloads sometimes warn the user that a spyware program will be installed along with the requested software, but users do not always read these agreements completely because the notice of a spyware installation is often couched in obtuse, hard-to-read legal disclaimers.
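To make the idea concrete, here is a deliberately naive sketch, not any real product, of how one might flag a license agreement for language that often accompanies bundled data collection. The phrase list and the sample text are invented for illustration; a real scanner would need far more sophistication.

```python
# Invented phrase list: wording that often signals bundled data collection.
SUSPECT_PHRASES = [
    "third party", "third parties", "collect information",
    "usage data", "advertising partners", "may transmit",
]

def flag_eula(text: str) -> list:
    """Return the suspect phrases found in a license agreement's text."""
    lowered = text.lower()
    return [phrase for phrase in SUSPECT_PHRASES if phrase in lowered]

sample = ("By installing, you agree that the Software may transmit "
          "usage data to our advertising partners.")
print(flag_eula(sample))  # → ['usage data', 'advertising partners', 'may transmit']
```

Even a crude check like this illustrates the point: the telltale wording is usually present in the agreement, just buried where few users will read it.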

Friday, February 8, 2008

Netgear unveils femtocell residential gateway

February 7, 2008 (TechWorld.com) Netgear Inc. has unveiled its Femtocell Voice Gateway, which it will publicly demonstrate for the first time at the Mobile World Congress in Barcelona next week.

Netgear claims that the DVG834GH, based on 3G femtocell technology from Ubiquisys Ltd., is the world's first single-box femtocell product that includes a residential gateway with integrated ADSL2+ modem, router, 10/100 wired LAN switch, 802.11g wireless access point, voice over IP, and SPI double firewall.

The DVG834GH is a 3G access point that plugs into an ADSL line and lets users share their broadband Internet connection with all of their mobile devices and networked computers, both wired and wireless. It gives mobile operators a way to deliver both wired and wireless broadband connectivity to converged home networks.

Femtocells are mobile access points that connect to a mobile operator's network using residential DSL or cable broadband connections. They have been developed to work with a range of different standards, including CDMA, GSM and UMTS.

Femtocells have the potential to make Wi-Fi networks redundant because they use less power and have a longer range. The downside is that such technology will be available only from mobile network operators because they own the licenses covering the frequencies that 3G operates in.

From a consumer point of view, Netgear's DVG834GH must be as painless to install and use as an existing cordless telephone or wireless router, meaning buyers can take it out of the box, plug it in and use it immediately. Consumers will expect to have the system up and running in a matter of minutes. Netgear's Smart Wizard Install Assistant should ease installation and management by automatically detecting and configuring the gateway for ISP connections.

The DVG834GH's double firewall (NAT + SPI) should help protect the network against intruders and malicious attacks, with logs and alerts of break-in attempts, while VPN pass-through should allow safe connections to business networks. The gateway is also suitable for VoIP because it supports SIP and several popular codecs. Built to industry-standard specifications including TR-069 remote management, the gateway supports the future addition of advanced features such as IGMP multimedia support.
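For readers unfamiliar with SIP, the signaling a SIP-capable gateway must handle is plain CRLF-delimited text. The sketch below assembles a minimal RFC 3261 OPTIONS request, the message endpoints use to probe each other's capabilities; all hosts, tags and branch values here are invented placeholders, and a real stack would generate them properly and actually send the message over UDP or TCP.

```python
def build_sip_options(target_host: str, from_user: str,
                      call_id: str, branch: str) -> str:
    """Assemble a minimal RFC 3261 OPTIONS request as CRLF-delimited text."""
    lines = [
        f"OPTIONS sip:{target_host} SIP/2.0",
        # The branch parameter must start with the RFC 3261 magic cookie.
        f"Via: SIP/2.0/UDP {target_host};branch=z9hG4bK{branch}",
        "Max-Forwards: 70",
        f"From: <sip:{from_user}@{target_host}>;tag=1234",
        f"To: <sip:{target_host}>",
        f"Call-ID: {call_id}",
        "CSeq: 1 OPTIONS",
        "Content-Length: 0",
    ]
    # A blank line terminates the header section of the request.
    return "\r\n".join(lines) + "\r\n\r\n"

msg = build_sip_options("192.0.2.10", "demo-user", "abc123@192.0.2.1", "776asd")
print(msg)
```

Parsing and responding to text messages of this shape is the core of the VoIP support the gateway advertises.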

"The proliferation of advanced multimedia applications and fixed mobile convergence now requires the existence of reliable home networks with fast Internet access, which can also support applications via mobile phones and other handheld devices," said David James, director of service provider products at Netgear.

Evaluation units of the DVG834GH are available now for operator testing, with commercial availability expected during the first half of 2008.

Ubiquisys femtocells are currently in trials with 10 mobile operators. Femtocells will be locked to an operator, at least at first, but the goal is to get the cost as low as possible and to get away from subsidies. There could eventually be SIM-free femtos on the shelves.

Thursday, February 7, 2008

Apple Fixes Critical QuickTime Bug

Apple fixes a critical bug in its QuickTime media player software that had been worrying security experts for nearly a month.


Apple has released a security fix for its QuickTime media player software, fixing a critical bug that had been worrying security experts for nearly a month.

The update, released Wednesday, fixes a vulnerability in the Real Time Streaming Protocol (RTSP) used by QuickTime to handle streaming media. It also fixes a previously reported incompatibility between QuickTime 7.4 and Adobe Premiere and After Effects, according to an Apple spokesman.

On Jan. 10, researcher Luigi Auriemma disclosed the flaw by posting proof-of-concept attack code that could be used to run unauthorized software on a victim's computer. For the attack to work, the criminal would have to first trick the user into viewing a maliciously encoded QuickTime media file.

With the attack code available, security researchers had been hoping that Apple would address the flaw. Wednesday's QuickTime 7.4.1 update is for both the Mac OS X and Windows operating systems.

It is Apple's fifth QuickTime update since October. The company has been forced to issue the flurry of patches as security researchers have taken a closer look at media player flaws during the past year. In December, Apple patched a separate RTSP vulnerability, which online criminals had already started to use in their attacks.

"In the past few months, QuickTime has been a prevalent target for security researchers," said Andrew Storms, director of security operations with nCircle Network Security, via instant message. "Internet media applications on the desktop have been a rich target for attackers and this trend is sure to continue as most users aren't yet accustomed to attacks arriving in the form of a viral video."

Wednesday, February 6, 2008

Why Users Hate Vista

Hands-on users of the new OS are proving to be the most resistant.


You rarely hear about a new OS causing people to panic. But IT consultant Scott Pam says that's exactly what his small-business clients are doing when they install Windows Vista on new PCs and run smack into compatibility or usability roadblocks.

Pam's clients are not alone: Since InfoWorld launched its petition drive on Jan. 14 to ask Microsoft to continue selling new XP licenses indefinitely alongside its Vista licenses, more than 75,000 people have signed on. And hundreds of people have commented -- many with ferocious, sometimes unprintable passion. "Right now I have a laptop with crap Vista and I'm going to downgrade to XP because Vista sucks," reads one such comment.

Where does all the vitriol come from?

IT managers and analysts suggest a range of reasons, some based on irrational fears and others based on rational reactions to disruptive changes.

Emotional Effects

"When we first deployed Vista, people told us it sucks, that it's not as good as XP," recalled Sumeeth Evans, IT director at Collegiate Housing Services, an 80-person college facilities management firm.

A month later, he surveyed the staff to see if their views had changed, and they had: "They said it was very good, that they were getting used to it. We asked what was different, and they said they originally didn't like Vista because it was a change. That's human nature."

Microsoft's overly aggressive schedule for replacing XP with Vista has exacerbated resistance to change, said Michael Silver, a research vice president at Gartner. The company had originally planned to discontinue XP sales on Dec. 31, 2007, just 11 months after Vista was made available to consumers and 14 months after it was made available to enterprises. The date for new license sales to end is now June 30.

In practice, XP's consumer availability ended for many users even sooner -- just six months after Vista's release -- since storefront retailers Best Buy and Circuit City and most computer manufacturers' Web sites stopped selling XP-equipped computers in July 2007. Typically, Microsoft has given customers two years to make such a transition, Silver noted.

Burton Group executive strategist Ken Anderson suggested that the strong emotional identification with XP represents a fundamental shift in how people, including IT staff, now think of operating systems. They have become a familiar extension of what we do and how we work, and thus not something we want to change often. "When technology becomes part of you, you don't want people to mess with it," he said.

Anderson likened the reaction to XP's impending demise to what happened in the 1980s when Coca-Cola replaced its classic Coke formula with New Coke, causing massive protests by customers who had no reason to change what they drank. The protests forced the company to bring back what we now call Coke Classic. "XP has come to the point of being Coke Classic," he said, with Vista playing the role of New Coke.

The Further the Better

The Englewood (N.J.) Hospital Medical Center switched to Vista shortly after its enterprise release, since it had been in Microsoft's early adopter program. Most users -- mainly nurses and other medical staff -- didn't really notice the upgrade and had few complaints, noted Gary Wilhelm, the business and systems financial manager (a combination of CTO and CFO) at the 2,500-employee facility. That's because they don't really use the OS, but instead work directly in familiar applications that load when they sign in using their ID.

Capacitor manufacturer Kemet saw a similar ho-hum reaction from most of its staff, said Jeff Padgett, the global infrastructure manager, and for the same reason: Users have little direct interaction with the OS. But the staff did push back on Office 2007, whose ribbon interface is a departure from previous versions. They rebelled to the degree that Padgett has delayed Office 2007 deployment and may not install it at all.

Back at the Englewood hospital, Wilhelm did hear anti-Vista grumbling from people in the administration department, who work more closely with the OS itself for file management and so on. And at Kemet, another group of hands-on users complained about the switch to Vista, noted Padgett: "The people who suffered the most were engineers and IT people."

The phenomenon of hands-on users being the most resistant explains why so many small-business users and consultants have reacted so strongly against Vista, noted Gartner's Silver.

Conversely, those enamored of the latest technology tend to be Vista enthusiasts, said David Fritzke, IT director at the YMCA Milwaukee, which has been adding Vista to its workforce as it buys new computers. "Some users bought Vista for home and then wanted it more quickly at work than we had initially planned to deploy it," he said. Fritzke also found that younger users adapted to Vista more easily.

In Search of ROI

Users' personal reactions, positive or negative, ultimately impact the bottom line and help drive the business decision of whether to roll out Vista across an organization.

It's all about basic cost-benefit analysis, said Gartner's Silver. In most businesses, Vista offers few compelling advantages for users while introducing challenges; the cost of change is too high for the perceived benefit. For example, users often complain about Vista's constant nagging about possible system threats, about applications that no longer run, or about files that appear to be "lost" because they've been moved to new places by the OS, Silver said.

"It's really hard to convince someone to go to a product that's not quite as stable or as capable as what they're already using," Silver noted -- and so they get frustrated and angry. While IT managers and analysts appreciate some under-the-hood changes in Vista, these improvements don't have an immediate, obvious benefit for users. "Vista's benefits are not about the users," concurred Collegiate Housing Services' Evans.

Upgrades from Microsoft's past have also colored expectations, Silver said. Users tend to remember the straightforward transition from Windows 2000 to XP, even though technically it was a "minor" upgrade, he said. (Silver also noted that until XP Service Pack 2, XP had its own share of compatibility and security flaws that annoyed users, something that most forgot with SP2's release.)

And while the path from Windows 95 and 98 to Windows XP was rockier, the benefits were clear enough at each stage for most customers to make the upgrade investment gladly, Silver said.

Some users have decided to skip Vista altogether and instead wait for Windows 7, whose release date has been reported as anywhere between 2009 and 2011.

"Why shoot yourself in the foot twice? Windows 7 will be out next year; I'll wait till then," said one InfoWorld reader. If Windows 7 arrives sooner rather than later -- or if a miraculous Vista service pack addresses all the major objections in one swoop -- then the uproar over upgrading to Vista will quickly fade into the hazy past of other Windows upgrade snafus.