Wednesday, April 8, 2009

Conficker Worm Is Much Ado About Nothing

The Conficker Worm is like the Paris Hilton of computer security: Famous solely for being famous. Neither has actually ever done anything of note. But, at least Paris has a sense of humor about her celebrity. Conficker just wastes people's time.
Your time and mine, for example. You're reading this because someone--not me--convinced you that Conficker matters. I am writing this because IBM has convinced me that Conficker is a wash. If it turns out differently, I'll owe the worm an apology. Paris can fend for herself.
I may host a daily call-in radio program, but I am not a conspiracy nut. Still, don't you sometimes wonder who is responsible for "threats" that develop such a high profile? I am not saying the industry that protects us against these threats might somehow be in cahoots with the people who create them. No, I am not saying that.
Conficker has once again reminded us that our systems are vulnerable and we need to invest $$$ in protection. Or has it already backfired?
Maybe Conficker will prove that what we already have works pretty well. Maybe Microsoft did a good job dealing with this threat and the anti-malware vendors likewise. Maybe Conficker will send the message that what we are doing is just fine, thank you. Spend more money to counter threats like this? Why?
Watching the news coverage as 12:01am local time on April 1 marches around the globe reminds me of the last time we did this. You remember the Y2K bug, don't you?
Back then, the world's mainframes were supposed to croak as 1999 rolled into 2000. Like today, I watched--only back then I was sitting in an emergency operations center--as countries around the globe rang in the New Year with their vital infrastructure intact.
Last time, we were saved from a very real problem by a lot of recoding, necessary to work around the time/date problem. This time, we are saved from a not very significant problem by a Microsoft patch that everyone should already have had, as well as a wide variety of tools capable of clearing Conficker from our systems.
As I write this, Conficker seems to be passing more or less harmlessly by. The clock is actually working in our favor. IBM estimated that Asia has the largest collection of infected systems. North America has about a third as many as Asia. Europe has more than we do.
If Asia and Europe survive Conficker, we don't have much to worry about. Conficker will pass from our consciousness and I won't owe the worm an apology.
If only Paris Hilton were so easy to protect ourselves against.

Conficker's Zero Hour Arrives Without Event -- Yet

An expected activation of the Conficker.c worm at midnight on April 1 passed without incident, despite sensationalized fears that the Internet itself might be affected, but security researchers said users aren't out of the woods yet.
"These guys have no designs, I think, on taking down the infrastructure, because that would separate them from their victims," said Paul Ferguson, a threat researcher at antivirus vendor Trend Micro, describing the technology and design of Conficker.c as "pretty much state of the art."
"They want to keep the infrastructure up and in place to make it much harder for good guys to counter and mitigate what they've orchestrated," he said.
The Worm Stirs
Conficker.c was programmed to establish a link from infected host computers to command-and-control servers at midnight GMT on April 1. To reach these control servers, Conficker.c generates a list of 50,000 domain names and then selects 500 domain names to contact. That process has started, researchers said.
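The generate-many, contact-few scheme described above is what researchers call a domain generation algorithm (DGA). The sketch below is not Conficker's actual algorithm--the real one and its constants were reverse-engineered by researchers--but a toy illustration of the general technique: every infected machine derives the same date-seeded domain list, so the controllers need only pre-register a handful of those names to issue commands.

```python
import hashlib
import random
from datetime import date

TLDS = ["com", "net", "org", "info", "biz"]

def generate_domains(today: date, count: int = 50000) -> list[str]:
    """Generate a deterministic list of pseudo-random domain names.

    Every copy of the worm running on the same date produces the same
    list, so the controllers can pre-register a few of these domains
    and wait for infected hosts to call in.
    """
    domains = []
    for i in range(count):
        seed = f"{today.isoformat()}-{i}".encode()
        digest = hashlib.md5(seed).hexdigest()
        # Map the hash to a short alphabetic label plus a TLD.
        name = "".join(chr(ord("a") + int(c, 16) % 26) for c in digest[:10])
        domains.append(f"{name}.{TLDS[int(digest[10], 16) % len(TLDS)]}")
    return domains

def pick_contacts(domains: list[str], n: int = 500) -> list[str]:
    # Each host contacts only a random subset of the full list,
    # which spreads the lookups and complicates blocking.
    return random.sample(domains, n)
```

Because defenders must block or register all 50,000 candidates while attackers need control of only one, the economics favor the attacker, which is why the Conficker Working Group's domain-blocking effort was such a large undertaking.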
Exactly how many computers are infected with Conficker.c is not yet known, but the estimated number of systems infected by all variants of the Conficker worm exceeds 10 million, making this one of the largest botnets ever seen.
While infected computers have started reaching out to command servers as expected, nothing untoward has happened.
"We have observed that Conficker is reaching out, but so far none of the servers they are trying to reach are serving any new malware or any new commands," said Toralv Dirro, a security strategist at McAfee Avert Labs, in Germany.
This may just mean the people who control Conficker are biding their time, waiting for researchers and IT managers to relax their guard and assume the worst is over.
"It would be pretty stupid for the guys running Conficker to use the first possible opportunity, when everybody is very excited about it and looking at it very carefully," Dirro said. "If something was going to happen, it would probably happen in a couple of days."
Detections, Inoculations Increase
Time is not on Conficker's side. The worm can be easily detected and removed by users. For example, if a PC is unable to reach Web sites such as McAfee.com, Microsoft.com, or Trendmicro.com, that is an indication that the computer may be infected.
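That self-check can be automated. The sketch below is a hypothetical illustration, not a real Conficker detector: it flags a machine as suspicious only when the security vendors' sites fail to resolve while an unrelated control site still works, since a total DNS failure would just mean the network is down.

```python
import socket

# Conficker variants blocked lookups of security vendors' sites, so
# failing to resolve these names while other sites work is a red flag.
SECURITY_SITES = ["mcafee.com", "microsoft.com", "trendmicro.com"]
CONTROL_SITES = ["example.com"]  # unrelated site the worm would not block

def looks_infected(resolve=socket.gethostbyname) -> bool:
    """Heuristic check: security sites unreachable, control site fine.

    The resolver is injectable so the logic can be tested without
    touching the network.
    """
    def reachable(host: str) -> bool:
        try:
            resolve(host)
            return True
        except OSError:
            return False

    return (all(not reachable(h) for h in SECURITY_SITES)
            and any(reachable(h) for h in CONTROL_SITES))
```

A positive result is only a hint to run a proper removal tool; plenty of corporate firewalls block outbound DNS in ways that could trip a naive version of this check.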
In addition, IT managers can easily spot traffic coming from odd domain names and block access to the computers on their company networks. "The longer criminals wait, the less infected hosts they've got," Dirro said.
Additional help comes from a loose coalition of security vendors and others called the Conficker Working Group, which has banded together to block access to domains that Conficker is trying to communicate with. But it's not immediately clear whether those efforts, which have been successful at blocking earlier versions of the worm, will be effective against the activation of Conficker.c.
"We can't really say how successful the attempts at blocking them or not routing them are," Dirro said. "That's something we'll see when the first domain actually starts serving malware, if at least one starts doing that."
Despite the uneventful passing of the activation deadline, the threat presented by Conficker remains real.
"These guys are very sophisticated, very professional, very determined and very measured in how they implement and make changes to things," Ferguson said, adding that Conficker.c is better defended and more survivable than previous versions of the worm. "This activation on April 1 was probably just arbitrary and picked to cause hysteria."
At some point, the people behind Conficker.c could try to generate revenue from the botnet they've created or they could have other intentions.
"The big mystery is that there's this big loaded gun out there, this network of millions of machines that's under the control of persons unknown," Ferguson said. "They've given no indication of what their motives are other than toying with people."

HP Confirms Considering Android in Netbooks

Hewlett-Packard confirmed Tuesday that it is testing Google's Android operating system as a possible alternative to Windows in some of its netbook computers.
Analysts said the move would allow HP to develop a low-cost netbook optimized for wireless networks that provides access to Web-based services such as Google Docs, but others questioned whether the Google software is ready for such a task.
"Right now Android is barely finished for phones," said Avi Greengart, an analyst at Current Analysis. While it works well enough for T-Mobile's G1 smartphone, the software was released only last year and "the UI still feels half-finished," he said.
HP stressed that it was still only testing Android, an OS based on the open-source Linux kernel. It has assigned engineers to the task but has made no decision yet whether to offer Android in products, said HP spokeswoman Marlene Somsak. The news was first reported earlier Tuesday by the Wall Street Journal.
"We want to assess the capability it will have for the computing and communications industry," Somsak said. "We remain open to considering various OS options."
Netbooks are small, low-cost computers that are designed primarily for browsing the Web and doing basic computing tasks. The category has proved popular -- about 10 million netbooks shipped in 2008 and the number is expected to double this year, according to IDC.
Android was designed for mobile phones but has been seen by others besides HP as a potential OS for netbooks. Some enthusiasts have been testing Android on netbooks such as Asustek's Eee PC, and chip makers such as Qualcomm and Freescale hope to bring Android to netbooks running on their Arm-based chips.
HP may have in mind a netbook optimized for use with Web-based services such as the Google Docs hosted applications suite and Google's online storage service, said Roger Kay, president of Endpoint Technologies Associates.
The fact that netbooks are designed to provide quick access to online services, often over wireless networks, makes them in some ways like oversized smartphones.
There are also no license fees for Android, which could allow hardware makers to offer lower-priced computers than those running Windows. However, consumers have been willing to pay extra in the past for netbooks running Windows, analysts noted.
HP already offers some PCs with a choice of Linux or Windows, and introducing another OS choice would come with some risk, said David Daoud, a research manager at IDC. Some end-users don't like Linux because they are unfamiliar with it, he said.
"We've seen a number of netbooks returned as a result of the Linux OS. Consumers are used to the Microsoft Windows world," Daoud said. Linux adoption remains weak on client computers, especially in mature markets like the U.S. and Western Europe, he noted.
Still, there may be an upside for Android if HP were to make it work in netbooks. HP's heft as the world's largest PC maker would widen Android's use, Daoud said. It could see success in emerging markets like India and China, where Linux adoption is growing.
But HP would need to deliver a consumer-friendly product that makes Linux easier to use in PCs, Daoud said.

Friday, March 27, 2009

Will New Tracker Tools for Your Cell Phone Give You Away?

Cell phone apps like Loopt and the new Google Latitude allow you to track your friends' physical locations, and be tracked in return. That can be a huge boon for meeting up on a Friday night, and a real nightmare for privacy if proper safeguards aren't in place.
I checked out both applications. For starters, neither will share your location with anyone until you explicitly agree to such sharing with each individual friend. So you can install either one and see how it looks without divulging where you are.
Also, after inviting a friend to share his or her location, or being invited to do so yourself, you can go back and change the setting to stop sharing your location with a particular friend and continue sharing with others, or stop sharing with anyone.
But what happens if you set up either app to share with friends, and forget about it? Or what if someone else puts it on your phone, without your knowledge, to track you?
In what's usually seen as a limitation, the iPhone doesn't allow running programs in the background--so Loopt can't update your location unless you open the app (Google Latitude, when it becomes available for the iPhone, should work similarly).
But most other cell phone platforms allow background processes to run silently--a potential problem. Within a few days of installing Loopt, however, you'll get an SMS notice so you'll know it's there. Loopt CEO Sam Altman also says that if you don't use Loopt for a while it will automatically stop sharing your location, likely within a week of nonuse. Google Latitude will display a pop-up notification on all phones save Android-based devices (whose users will receive an e-mail, Google says), but it won't automatically shut off.
Google does let you limit sharing to only your city-level location, and in both apps you can enter a (possibly false) location for yourself.
Both Google and Loopt say they do not store historical locations, only your last location. That's important in case someone (the government, say, or a civil litigant) seeks that data. Loopt says it will share that info only under a wiretap order. Google hasn't said it will do the same, but it does have a record of fighting government requests for its users' information.
My conclusions? Some things could be improved: First, you should be able to share your location only for a set amount of time, say, the next 2 hours, or from 6 to 9 p.m. on Fridays. Loopt says that ability will come in a future release, but Google isn't planning to announce anything along those lines.
Next, I think Google should have an auto-shutoff after a certain amount of time, in case you become forgetful. And it should explicitly declare it won't share your information without a wiretap order.
Of the two, you might try Loopt (ideally on an iPhone), since it has auto-off and will also come out with time-based controls.
But here's the kicker: As Kevin Bankston of the Electronic Frontier Foundation points out, the safeguards in place are only company policy, not a legal requirement. And policies can change.

IP Issues Could Be Slowing IBM-Sun Talks, Experts Say

If IBM is in the due diligence phase of acquisition talks with Sun Microsystems, as news reports suggest, then it has an awful lot to be diligent about.
In a merger of this scale, IBM would need to take a hard look not only at Sun's finances but also at any antitrust issues that may arise, as well as potential conflicts related to intellectual property. Those could include compatibility of software licenses and patent agreements with third parties.
"In a deal of this size, there are typically lots of moving parts," said Randall Bowen, an attorney at Grad, Logan and Klewans in Falls Church, Virginia. "Think of a kaleidoscope, where you turn it and everything comes together to form a nice symmetrical shape. Either that happens and everything falls into place, or else it shatters."
The Wall Street Journal reported last Friday that IBM was scouring Sun's business contracts for potential conflicts in a prelude to a possible merger, a process it said was expected to take "a number of days." With another week over and no word about a deal from the companies, some observers are starting to wonder if there's a holdup.
"It's impossible to know what it is they're looking at, but the fact that it's taking this long gives one pause to wonder whether there's just such a volume of contracts to look at that it's occupying all this time, or whether they've found some issues that they're busily chasing down," said Steven Frank, a partner with the law firm Goodwin Procter.
To be sure, the due diligence process for a merger this size could take months to complete. But companies often do a cursory review of the business they hope to acquire in order to announce a preliminary merger agreement. They then take several months before the deal is finalized to pore over the details.
If they do plan to merge, Sun and IBM may simply be haggling over price. But if the due diligence is holding them up, the thorny area of intellectual property could create some sticking points, said Frank, who spoke about IT industry mergers in general and not specifically this one.
Both companies have vast product portfolios governed by a mix of open-source and commercial licenses. They also have numerous patent and cross-licensing deals with third parties, including a byzantine agreement that Sun forged with Microsoft in 2004 that ended a lawsuit between them over the Java software technology.
Sun may be licensing a technology from a third party that is vital to one of its products, for example, and such agreements sometimes have clauses stipulating that the license can't be transferred if the licensee is acquired. IBM would need to approach the third party to extend the license, or decide whether to go ahead with the merger even if it has to find another way to build the product.
That's the issue Intel raised about Advanced Micro Devices' sale of its manufacturing operations to an Abu Dhabi investment group. Intel accused AMD of violating a cross-patent agreement on x86 processors that could not be transferred to a third party, and the companies are in talks with a mediator to resolve the dispute.
Conflicting software licenses can also be a problem. Dozens of Sun's products, including OpenSolaris, NetBeans and its GlassFish Web software, use its Common Development and Distribution License, which is based on the open-source Mozilla Public License. Its MySQL database is offered under the GPL or a Sun commercial license, while still other products use different licenses.
Depending on what IBM has planned for Sun's technologies, the mix of licenses could be a challenge, said Randall Colson, a partner at Haynes and Boone. For example, some industry analysts speculate that IBM wants to merge the best of Solaris into IBM's AIX Unix, which is offered under an IBM commercial license. If Sun has merged a third party's open-source code into Solaris, IBM may find barriers to merging Solaris with its proprietary AIX software.
Perhaps most complex for IBM would be the intricate deal that Sun entered into with Microsoft, which ended a long-standing lawsuit between them over Microsoft's alleged attempts to undermine Java.
The deal netted Sun almost $2 billion from Microsoft, including payments of $700 million for Sun to drop its Java lawsuit, and a further $900 million for a patent-sharing agreement that could be extended for as long as 10 years. IBM, whose software business depends heavily on Java, would need to pull those agreements apart to ensure nothing could interfere with its business or expose it to legal risk from Microsoft.
With reports of the due diligence work only a week old, it would be premature to assume that any talks under way have run into trouble, Bowen said. But the longer they take, the more uncertainty it creates for the customers and investors.
"It's fair to say that with every day that passes, it makes it seem a little less likely that this deal is going to happen," he said.

Fears of a Conficker Meltdown Greatly Exaggerated

Worries that the notorious Conficker worm will somehow rise up and devastate the Internet on April 1 are misplaced, security experts said Friday.
Conficker is thought to have infected more than 10 million PCs worldwide, and researchers estimate that several million of these machines remain infected. If the criminals who created it wanted to, they could use this network to launch a very powerful distributed denial-of-service (DDoS) attack against other computers on the Internet.
April 1 is the day that the worm is set to change the way it updates itself, moving to a system that is much harder to combat, but most security experts say that this will have little effect on most computer users' lives.
Nevertheless, many people are worried, according to Richard Howard, director of iDefense Security Intelligence. "We have been walking customers down from the ledge all day," he said. Often, the problem has been that company executives have read reports of some April 1st incident and then proceed to "get their IT and security staffs spun up," Howard said in an e-mail interview.
That hype will probably intensify when the U.S. TV newsmagazine 60 Minutes airs a report Sunday on Conficker, entitled "The Internet is Infected."
Conficker "could be triggered, maybe on April 1st ... but no one knows whether on April 1st they'll just issue an instruction that says 'Just continue sitting there' or whether it will start stealing our money or creating a spam attack," CBS reporter Lesley Stahl said in a preview interview ahead of the show. "The truth is, nobody knows what it's doing there."
April 1 is what Conficker researchers are calling a trigger date, when the worm will switch the way it looks for software updates. The worm has already had several such trigger dates, including Jan. 1, none of which had any direct impact on IT operations, according to Phil Porras, a program director with SRI International who has studied the worm.
"Technically, we will see a new capability, but it complements a capability that already exists," Porras said. Conficker is currently using peer-to-peer file sharing to download updates, he added.
The worm, which has been spreading since October of last year, uses a special algorithm to determine what Internet domains it will use to download instructions.
Security researchers had tried to clamp down on Conficker by blocking criminals from accessing the 250 Internet domains that Conficker was using each day to look for instructions, but starting April 1, the algorithm will generate 50,000 random domains per day -- far too many for researchers to register or block.
Gradually, the Conficker network will get updated, but this will take time, and nothing dramatic is expected to happen on April 1, according to Porras, Howard, and researchers at Secureworks and Panda Security.
"There is no clear evidence that the Conficker botnet will do anything dramatic," said Andre DiMino, cofounder of The Shadowserver Foundation, a volunteer security group. "It will change its domain usage to the larger pool and may attempt to drop another variant, but so far, that's about it."
"Regular users just need to be sure they are patched and be extra diligent about possible new methods of infection."

Friday, March 13, 2009

Microsoft Disputes Attempt to Reinstate Class in Vista Suit

Microsoft is disputing an attempt to reinstate class-action status in an ongoing lawsuit over its Windows Vista Capable sticker program, a case that threatens to drag on and reflects the difficulties Microsoft has encountered with its disappointing Windows Vista OS.
In court papers filed in a U.S. District Court in Seattle this week, Microsoft asked the court not to reconsider applying class-action status to the suit because people knew exactly which version of Vista they would receive through a coupon program called Express Upgrade Guarantee. The program allowed customers to buy PCs with Windows XP installed on them but then upgrade to Vista when the OS was released.
Microsoft also said that the plaintiffs took too long to ask for a narrowing of the class, even based on "theories known to them for more than a year," according to court papers.
The TechFlash blog Thursday posted a link to a PDF of Microsoft's most recent filing in the case, first brought against the vendor by plaintiff Dianne L. Kelley in April 2007.
Late last month, attorneys in the case asked the court to re-establish class by narrowing the scope of who could participate in the suit. This came a week after the judge in the case granted Microsoft's motion to dismiss the suit's class-action status but allowed it to go forward with six plaintiffs.
Plaintiffs now want the judge to allow the suit to apply to anyone who purchased Windows Vista Capable PCs in Microsoft's Express Upgrade Guarantee program. The Express Upgrade Guarantee program provided coupons to people who purchased Windows Vista Capable PCs so they could upgrade to the appropriate version of Vista either for free or for little cost once the OS was made available.
The overarching claim in the suit is that Microsoft's Windows Vista Capable sticker program, which theoretically let customers know which PCs were capable of running Vista before the OS was made generally available, was an example of deceptive business practices and violated consumer protection laws.
Microsoft's hardware partners began shipping PCs with the "Windows Vista Capable" logo in April 2006. However, the designation was potentially confusing, because a PC with the label was only guaranteed to run the least expensive, most basic version of Vista.
The case is scheduled to go to trial April 13; however, in last month's filing plaintiffs asked that the judge push back the trial date in case class is reinstated to give others time to join the suit. The judge has yet to respond to that filing.

Foreign Web Attacks Change Security Paradigm

Traditional security systems may be ineffective and become obsolete in warding off Web attacks launched by countries, according to Val Smith, founder of Attack Research. New attack trends include blog spam and SQL injections from Russia and China, Smith said during his talk at the Source Boston Security Showcase on Friday.
"Client-side attacks are where the paradigm is going," Smith said. "Monolithic security systems no longer work."
Hackers use Web browsers as exploitation tools to spread malware and collect sensitive information. Smith used examples from clients of his company, which analyzes and researches computer attacks, to demonstrate the threat posed by blog spam and SQL attacks.
Attackers targeted high-traffic sites with blog spam and posted comments on blogs, he said. The comments looked odd and tended to have non-English phrases placed in large blocks of text with random words hyperlinked, he said. Clicking on such links took users to sites that seemed like blogs but were pages loaded with malware, Smith said.
A Chinese bank owned the domains for each malware site, but the IP (Internet Protocol) addresses traced to Germany. Studying the links revealed that each one contained words in Russian or Romanian, said Smith. By placing an international spin on their nefarious activities, the hackers hoped to confuse anyone investigating their work, he said.
"How are you going to track these back to the bad guys?" he said, noting that tracking is complicated by language barriers, working with foreign law enforcement organizations and dealing with countries "that just may not want to talk to us."
While the goals of blog spam attacks remain unclear, Smith said financial incentives serve as motivation. Adware installed after a user visits an infected site nets a hacker money, as does clicking on an advertisement on the page. Other hackers are looking to expand their botnets, or networks of compromised machines used for malevolent purposes.
Smith's investigation traced the attacks to a home DSL account in Russia. The international nature of the incident made prosecution unlikely, he said.
The SQL injection attack Smith discussed originated in China and attempted to steal information on the businesses that visited the Web site of the company, which was Smith's client.
Hackers first launched a SQL injection and uploaded a back door that allowed them to take control of the system.
Additional SQL injections failed, so the hackers searched the system for another exploit. They found a library application that allows images to be uploaded. The hackers uploaded a GIF file with a line of code embedded in the image; the system read the GIF tag, accepted the photo, and automatically executed the code.
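The attack worked because the upload handler trusted the file's GIF header and nothing else. The sketch below is a hedged illustration of a first-line defense against that pattern: verify the magic bytes, then refuse files that also contain code markers. The marker list is a heuristic invented for this example; in practice, re-encoding the image server-side is the stronger defense, since it discards any payload hidden after a valid header.

```python
# A file can begin with a perfectly valid GIF header and still carry
# executable code appended after (or inside) the image data.
GIF_MAGICS = (b"GIF87a", b"GIF89a")
SUSPECT_MARKERS = (b"<?php", b"<%", b"<script")

def safe_gif_upload(data: bytes) -> bool:
    """Reject uploads that merely *start* like a GIF but carry code.

    Checking the magic number alone is exactly the mistake the
    attacked application made; this adds a scan for common script
    markers as a second, heuristic layer.
    """
    if not data.startswith(GIF_MAGICS):
        return False
    return not any(marker in data for marker in SUSPECT_MARKERS)
```

Whether the payload ever runs depends on the server then treating the upload as executable (for instance, serving it through a script interpreter), which is why storing uploads outside any executable path matters as much as validating them.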
Hackers "targeted an app that is custom-written, in-house, and launched a specific attack against that app," Smith said.
Hackers eventually placed "iFrame" HTML code on every page of the company's Web site. The iFrames redirected the victim's browser to a server that infects the computer using a tool called "MPack." This tool profiled a victim's OS and browser and launched attacks based on that information.
The result is that victims are getting hit with multiple attacks, said Smith.
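Injected iframes of the kind described above can be spotted mechanically. The following is a minimal sketch using Python's standard-library HTML parser; the domain names are placeholders, and real compromises often obfuscate the injection (zero-size frames, JavaScript-built tags), so this catches only the straightforward case.

```python
from html.parser import HTMLParser

class IframeFinder(HTMLParser):
    """Collect the src attribute of every <iframe> in a page."""

    def __init__(self):
        super().__init__()
        self.sources = []

    def handle_starttag(self, tag, attrs):
        if tag == "iframe":
            src = dict(attrs).get("src")
            if src:
                self.sources.append(src)

def find_foreign_iframes(html: str, own_domain: str) -> list[str]:
    """Return iframe sources that point somewhere other than our site.

    A compromise like the MPack case typically injects a frame whose
    src points at an attacker-controlled server, so any iframe that
    references an unexpected domain deserves a closer look.
    """
    finder = IframeFinder()
    finder.feed(html)
    return [s for s in finder.sources if own_domain not in s]
```

Site operators can run a check like this across their own pages on a schedule, treating any new offsite iframe as a sign the site itself may have been compromised.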
Today, SQL injection attacks are the top threat to Web security, said Ryan Barnett, director of application security at Breach Security, in an interview separate from the conference.
Last year, cybercriminals began unleashing massive Web attacks that have compromised more than 500,000 Web sites, according to the security vendor.
"They started off in January and went through essentially the whole year," said Barnett. Previously, crafting a SQL injection attack took time, but last year attackers created worm code that could automatically seek out and break into hundreds of thousands of sites very quickly.
Now, instead of stealing data from the hacked Web sites, the bad guys are increasingly turning around and planting malicious scripts that attack the site's visitors. "Now the site is becoming a malware depot," he said.
(Bob McMillan in San Francisco contributed to this report.)

Friday, February 20, 2009

Brocade: Recession Dampens Data-center Trials

The ailing economy is leading some enterprises to put off transforming their data-center networks with emerging technologies such as FCoE (Fibre Channel over Ethernet), Brocade Communications' CTO said Thursday.
IT managers are delaying transitions to converged networks that use a single protocol across both the storage and server areas of a data center, CTO Dave Stevens said in an interview after the company announced a steep increase in revenue for its first fiscal quarter, which ended Jan. 24.
It was the first quarter since Fibre Channel storage network pioneer Brocade acquired Foundry Networks, an Ethernet LAN vendor. FCoE and Converged Enhanced Ethernet (CEE) are two emerging standards designed to combine the strengths of Fibre Channel and Ethernet.
"People are pushing back on trialing converged infrastructure right now," Stevens said. That reflects a greater selectiveness in pursuing IT projects as enterprises move into a mode of buying just what they need, he said.
However, growing network traffic and collections of data, along with requirements to keep data for longer periods, are forcing enterprises to upgrade their networks, he said. In doing so, they are saving money by consolidating ports in fewer platforms, such as large Ethernet switches that can accommodate as many connections as 10 smaller boxes, Stevens said.
"The FCoE stuff and the CEE stuff seem to be pushing out a little bit, and there seems to be more emphasis on the Ethernet side and the Fibre Channel side to implement high-density switching systems in both of those environments," he said.
Brocade reported revenue of US$431.6 million for the quarter, up 8 percent from the previous quarter and 24 percent from a year earlier. That figure included about one month of revenue from Foundry, which was folded into the company in late December, but it fell short of the Thomson Reuters consensus forecast of US$441.7 million.
The company posted a loss of US$26 million, or $0.07 per share, because of one-time items mostly associated with the Foundry deal, according to Stevens. Excluding those items, Brocade earned US$63.6 million, or $0.15 per share, exceeding the Thomson Reuters consensus forecast of $0.13 per share.
Brocade reported the integration of Foundry is ahead of schedule and that "the vast majority" of Foundry employees have remained on board. Brocade has been reorganized to focus on three market segments: data-center infrastructure, campus networks, and service-provider infrastructure, Stevens said. Engineers from both companies are working together on the next generation of technology, such as FCoE gear, but the traditional Fibre Channel and Ethernet product lines will remain and be updated for the foreseeable future, he said.
The biggest challenge in integrating the businesses has been allocating engineers and funding among the Ethernet, Fibre Channel and converged-infrastructure categories, Stevens said.
For fiscal 2009, Brocade predicted IT spending would continue to be held down by economic conditions but start to pick up in the fiscal fourth quarter and the next fiscal year. It forecast annual revenue of $1.9 billion to $2 billion, up from about $1.5 billion in fiscal 2008. But the company sees revenue rising only slightly in the following fiscal year, giving a revenue range for planning purposes of $2.1 billion to $2.2 billion for fiscal 2010.
In after-hours trading late Thursday, Brocade's shares on the Nasdaq (BRCD) were down $0.10 at $3.28.

How Will the $7.2 Billion Allotted for Broadband Stimulus Be Spent?

Though a number of details are vague, many people in tech and telecom circles hope that the $7.2 billion allotment for broadband in the newly enacted federal economic stimulus package marks the beginning of a nationwide broadband strategy.
In the American Recovery and Reinvestment Act of 2009, recently enacted by Congress, many details regarding the allocation of funds for high-tech projects remain blurry. Nevertheless, the nation's tech community appears to be encouraged by the $7.2 billion provision for broadband in the nearly $789 billion economic stimulus package signed into law by President Barack Obama earlier this week. Many observers believe that the allocation is a clear first step toward establishing a nationwide broadband strategy.
Officially known as "Title VI--Broadband Technology Opportunities Program," the $7.2 billion in broadband stimulus money accounts for less than 1 percent (and only five pages) of the entire package. Its purpose is to spur broadband growth in underserved areas of the country.
What the Law Says
The bureaucracy to allocate the money has not been set up yet, and no one can be absolutely sure exactly how the broadband program will work. Still, some definite elements have emerged.
First, two entities will issue grants under Title VI: the National Telecommunications & Information Administration (NTIA), and the United States Department of Agriculture (USDA) Rural Utilities Service. Tech companies, telecommunications service providers, and other ISPs large and small will compete for the grant money through a bidding process managed by the two organizations.
But confusion exists even on this point. "There's no clear way to know which government entity they should apply to," says Derek Turner, research director of Free Press, a Washington media-reform think tank.
Urban vs. Rural Broadband
The debate has begun in earnest over how much of the money should go to developing and extending rural broadband service and how much to improving quality and choice in existing urban broadband service. The division of the $7.2 billion between the two agencies provides some clue: The NTIA will be responsible for about $4.7 billion of the money, while USDA will dispense about $2.5 billion of it.
Language in the new law explicitly mentions expanding broadband to rural areas: "The purposes of the program are to (1) provide access to broadband service to consumers residing in unserved areas of the United States; (2) provide improved access to broadband service to consumers residing in underserved areas of the United States."
The law does not define any of those terms, however, nor does it identify the mechanism for issuing funds. Rather, it simply states that "the grant program [will be created] as expeditiously as practicable" and that "if approved, provide the greatest broadband speed possible to the greatest population of users in the area."
The USDA's Rural Utilities Service has run a rural broadband program since 2002 to help small towns obtain broadband access; but that program, operating with a much smaller budget than the one it will administer under the stimulus act, has achieved only limited success.
We also know something about the timing of the allocations. The new bill states that "all awards are [to be] made before the end of fiscal year 2010."
Many Unknowns in Allocation Plan
While the Obama Administration would like to dole out this money as quickly as possible, many industry experts say that several months--and perhaps a year or more--will pass before any tangible services are up and running. Furthermore, many of the program's details have yet to be determined.
According to Bart Forbes, spokesperson for the National Telecommunications & Information Administration (NTIA), the White House's technology policy arm, and one of the main distributors of the new infusion of broadband money, no bureaucratic process is in place yet to move the funds to their needed destinations. "There's no procedure; there's no staff; there's no program," Forbes says. "The key players have not been put into place."
Forbes adds that the NTIA has no permanent head at the moment--and hasn't had one since November 2007. Moreover, the Department of Commerce, of which the NTIA is a component agency, has no secretary either.
Despite these ambiguities, many industry analysts seem hopeful about the broadband initiative's prospects for success. "There's lots of potential for waste, fraud, and abuse [in the new law], but our country is in trouble right now," Turner says. "I'm cautiously optimistic."
How Will It Work?
Once the NTIA and the USDA create a system for distributing stimulus grants, they will work with the various states to outline the states' needs. The resulting proposals could come in the form of wired or wireless projects--the language of the law doesn't specify any particular speed or technology.
Meanwhile, tech companies, nonprofits, and ISPs will submit grant proposals and the Washington, D.C., entities will broker the final arrangements for funding approved proposals.
Each grant must adhere to principles of openness, including generally recognized provisions of Net neutrality, which require an "open access basis."
To counter potential fraud and waste, the law also mandates a "fully searchable database, accessible on the Internet at no cost to the public, that contains at least a list of each entity that has applied for a grant under this section, a description of each application, the status of each such application, the name of each entity receiving funds made available pursuant to this section, the purpose for which such entity is receiving such funds, each quarterly report submitted by the entity pursuant to this section, and such other information sufficient to allow the public to understand and monitor grants awarded under the program."
Will It Create Jobs?
Industry watchers say that the new law is crucial if some 20 million Americans are to obtain the broadband Internet access they need.
Craig Settles, president of Successful.com and a longtime telecom industry observer, notes that public discussion of the broadband provision and of the larger stimulus package tends to focus on their similarity to New Deal-era public spending on infrastructure projects; but he says that the parallel is inexact.
"Broadband is as vital as roads and highways, but it isn't as much in the building of the infrastructure as in the job creation that comes out of the more physical, like dams and roads and so forth--those old-school infrastructure projects generate a lot of work," Settles says. "Where you're going to have the greatest impact [with the new projects] is after the network is done. It will draw new businesses to the communities; it will enable the businesses that are there to expand their markets."
What's Next?
In coming weeks, the person appointed as Secretary of Commerce by President Obama will appoint an assistant secretary--and that person will bear primary responsibility for overseeing execution of the provisions of Title VI.
"Over the next 60 days, the Department of Commerce and Department of Agriculture are going to write the [Request for Proposal] that puts the teeth into this bill, and the stipulations that the money gets appropriated to where it's needed and that it's open so it's not just the incumbents that are sucking up the money," Settles says.
Many other industry observers--including Harold Feld, a telecommunications consultant--say that the Obama Administration's attention to broadband indicates its commitment to making technology policy a high priority.
"So far, the Obama people who are going to be running this have shown that they have a drive and an appreciation for what broadband can do to transform people's lives," Feld says. "[Obama] has made a relatively minor part of the stimulus bill something that he talks about in every one of his speeches."

Conficker Worm Gets an Evil Twin

The criminals behind the widespread Conficker worm have released a new version of the malware that could signal a major shift in the way the worm operates.
The new variant, dubbed Conficker B++, was spotted three days ago by SRI International researchers, who published details of the new code on Thursday. To the untrained eye, the new variant looks almost identical to the previous version of the worm, Conficker B. But the B++ variant uses new techniques to download software, giving its creators more flexibility in what they can do with infected machines.
Conficker-infected machines could be used for nasty stuff -- sending spam, logging keystrokes, or launching denial of service (DoS) attacks -- but an ad hoc group calling itself the Conficker Cabal has largely prevented this from happening. They've kept Conficker under control by cracking the algorithm the software uses to find one of thousands of rendezvous points on the Internet where it can look for new code. These rendezvous points use unique domain names, such as pwulrrog.org, that the Conficker Cabal has worked hard to register and keep out of the hands of the criminals.
The new B++ variant uses the same algorithm to look for rendezvous points, but it also gives the creators two new techniques that skip them altogether. That means that the Cabal's most successful technique could be bypassed.
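Conficker's actual generation routine was recovered only by reverse engineering and is not reproduced here. The sketch below is a generic, hypothetical illustration (an MD5 hash over a date seed and an invented TLD list) of how a date-seeded algorithm lets both the worm and its trackers derive the same daily rendezvous names:

```python
import hashlib
from datetime import date

def generate_domains(day: date, count: int = 5, tlds=(".org", ".net", ".info")):
    """Hypothetical date-seeded domain generation: any party that knows
    the algorithm can compute the same rendezvous names for a given day."""
    domains = []
    for i in range(count):
        seed = f"{day.isoformat()}-{i}".encode()
        digest = hashlib.md5(seed).hexdigest()
        # Map hex digits to letters to get a plausible-looking hostname
        name = "".join(chr(ord("a") + int(c, 16) % 26) for c in digest[:8])
        domains.append(name + tlds[i % len(tlds)])
    return domains

# The same date always yields the same list -- which is what lets
# defenders pre-register the names before the malware looks them up.
print(generate_domains(date(2009, 2, 20)))
```

Because the output is deterministic, the Cabal's registration strategy works only as long as the worm keeps using rendezvous domains at all, which is why B++'s alternative download paths matter.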
Conficker underwent a major rewrite in December, when the B variant was released. But this latest B++ version includes more subtle changes, according to Phil Porras, a program director with SRI. "This is a more surgical set of changes that they've made," he said.
To put things in perspective: There were 297 subroutines in Conficker B; 39 new routines were added in B++ and three existing subroutines were modified, SRI wrote in a report on the new variant. B++ suggests "the malware authors may be seeking new ways to obviate the need for Internet rendezvous points altogether," the report states.
Porras could not say how long Conficker B++ has been in circulation, but it first appeared on Feb. 6, according to a researcher using the pseudonym Jart Armin, who works on the Hostexploit.com Web site, which has tracked Conficker.
Though he does not know whether B++ was created in response to the Cabal's work, "it does make the botnet more robust and it does mitigate some of the Cabal's work," Support Intelligence CEO Rick Wesson said in an e-mail interview.
Also known as Downadup, Conficker spreads using a variety of techniques. It exploits a dangerous Windows bug to attack computers on a local area network, and it can also spread via USB devices such as cameras or storage devices. All variants of Conficker have now infected about 10.5 million computers, according to SRI.

Scientists Claim Big Leap in Nanoscale Storage

Nanotechnology researchers say they have achieved a breakthrough that could fit the contents of 250 DVDs on a coin-sized surface and might also have implications for displays and solar cells.
The scientists, from the University of California at Berkeley and the University of Massachusetts Amherst, discovered a way to make certain kinds of molecules line up in perfect arrays over relatively large areas. The results of their work will appear Friday in the journal Science, according to a UC Berkeley press release. One of the researchers said the technology might be commercialized in less than 10 years, if industry is motivated.
More densely packed molecules could mean more data packed into a given space, higher-definition screens and more efficient photovoltaic cells, according to scientists Thomas Russell and Ting Xu. This could transform the microelectronics and storage industries, they said. Russell is director of the Materials Research Science and Engineering Center at Amherst and a visiting professor at Berkeley, and Xu is a Berkeley assistant professor in Chemistry and Materials Sciences and Engineering.
Russell and Xu discovered a new way to create block copolymers, or chemically dissimilar polymer chains that join together by themselves. Polymer chains can join up in a precise pattern equidistant from each other, but research over the past 10 years has found that the patterns break up as scientists try to make the pattern cover a larger area.
Russell and Xu used commercially available, man-made sapphire crystals to guide the polymer chains into precise patterns. Heating the crystals to between 1,300 and 1,500 degrees Celsius (2,372 to 2,732 degrees Fahrenheit) creates a pattern of sawtooth ridges that they used to guide the assembly of the block copolymers. With this technique, the only limit to the size of an array of block copolymers is the size of the sapphire, Xu said.
Once a sapphire is heated up and the pattern is created, the template could be reused. Both the crystals and the polymer chains could be obtained commercially, Xu said.
"Every ingredient we use here is nothing special," Xu said.
The scientists said they achieved a storage density of 10Tb (1.25TB) per square inch, which is 15 times the density of past solutions, with no defects. With this density, the data stored on 250 DVDs could fit on a surface the size of a U.S. quarter, which is 24.26 millimeters in diameter, the researchers said. It might also be possible to achieve a high-definition picture with 3-nanometer pixels, potentially as large as a stadium JumboTron, Xu said. Another possibility is more dense photovoltaic cells that capture the sun's energy more efficiently.
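The claim is easy to sanity-check. Assuming the reported 10 terabits per square inch, a 24.26mm U.S. quarter, and 4.7GB single-layer DVDs, a rough back-of-the-envelope calculation lands near 190 DVDs -- the same order of magnitude as the researchers' 250-DVD figure, which presumably assumes a somewhat larger effective area or denser media:

```python
import math

TBIT_PER_SQIN = 10            # reported areal density, terabits per square inch
QUARTER_DIAMETER_MM = 24.26   # U.S. quarter
DVD_GB = 4.7                  # single-layer DVD capacity

radius_in = (QUARTER_DIAMETER_MM / 2) / 25.4
area_sqin = math.pi * radius_in ** 2             # about 0.72 square inches

capacity_gb = TBIT_PER_SQIN * area_sqin * 1000 / 8   # terabits -> gigabytes
dvds = capacity_gb / DVD_GB

print(f"{area_sqin:.2f} sq in -> {capacity_gb:.0f} GB -> ~{dvds:.0f} DVDs")
```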
Russell and Xu's approach differs from how other researchers have been trying to increase storage density. Most have been using optical lithography, which sends light through a mask onto a photosensitive surface. That process creates a pattern to guide the copolymers into assembling.
The new technology could create chip features just 3nm across, far outstripping current microprocessor manufacturing techniques, which at their best create features about 45nm across. Photolithography is running into basic barriers to achieving greater density, and the new approach uses less environmentally harmful chemicals, Xu said. But actually applying the technique to CPUs would pose some challenges, such as the need to create random patterns on a CPU, Xu said.
Among other things, such a leap ahead in storage density could alter either the amount of content that a person could carry with them or the quality of media delivered on discs, said Nathan Brookwood, principal analyst at Insight64. For example, it might allow movies to turn into holograms, he said.
"Just when we think we're so technically sophisticated in what we can do, along comes somebody with a notion like this, which has the potential to fundamentally change economics in so many different areas," Brookwood said.
Ultra-high-definition displays have less practical potential, according to IDC analyst Tom Mainelli. The image and video standards of today, including those used in HDTV, couldn't take advantage of a display with 3nm pixels, he said. And when it comes to monitors, price is king.
"You could see how there would be a value to that level of precision (in an area like medical imaging) ... but are we talking about a [US]$10,000 display?" Mainelli said.
Insight64's Brookwood said the technology, for which Berkeley and Amherst have applied for a patent, harks back to the fundamental breakthroughs that created the IT industry.
"It's this kind of basic materials research that has created the opportunities that have made Silicon Valley and American manufacturing great," Brookwood said. "The last few years (in the U.S.), there have been fewer and fewer people working on this level of basic stuff," he said.

Monday, February 16, 2009

Google, Nvidia Bringing Android to Tegra Chips

Nvidia on Monday said it is working with Google to build support for Linux applications on smartphones with its upcoming Tegra mobile chips.
The company has allied with Google and the Open Handset Alliance to support the open-source Android software stack, which is increasingly being adopted by smartphone makers including Samsung and HTC.
Primarily known as a graphics card vendor, Nvidia said Tegra chips would bring advanced graphics capabilities to smartphones while drawing less power.
The support for the Android platform is an attempt to drive up Tegra's adoption among smartphone makers. Nvidia is displaying an Android-based phone with a Tegra chip at the GSMA Mobile World Congress being held in Barcelona from Monday to Thursday.
Tegra-based phones will combine advanced graphics, better battery life and always-on Internet access, Nvidia said in a press release. Smartphone makers can now use the Android platform to build Web 2.0 and Internet-based applications for Tegra-based smartphones, the company said.
Tegra chips put an Arm-based processor core, a GeForce graphics core and other components on a single chip. The product lineup includes the Tegra 600 running at 700MHz and Tegra 650 running at 800MHz. It also includes Tegra APX 2500 and APX 2600.
The systems-on-chips will start shipping in mid-2009 for handheld devices like smartphones and mobile Internet devices. Nvidia couldn't immediately name companies that may ship smartphones with the chips. However, an analyst last week speculated that Microsoft would launch a smartphone with Tegra's APX 2600 chip at MWC.
Beyond open-source support, Tegra chips also support Windows-based applications. At last year's MWC, Nvidia announced Tegra would support Windows Mobile and enable 3D user interfaces and high-definition video on smartphones.
Nvidia also wants to help bring about mobile Internet devices (MIDs) for US$100 with Tegra chips. Mobile Internet devices are handheld communication and Internet devices that fall somewhere between a sub-notebook and a smartphone.
A $99 Tegra-based MID is expected to be announced by Nvidia at MWC. The MID includes full high-definition 1080p video playback and full Wi-Fi and 3G mobile broadband connectivity capabilities. The always-on device can go "days" between battery charges, a company spokesman said.
Other than saying similar MIDs would ship in the second half, the company provided no further details about the product.

Adobe to Show off New Flash for Smartphones

At the Mobile World Congress on Monday, Adobe plans to show off progress on its Flash Player 10 for smartphones and deliver a new software development kit that should make reading documents on small screens easier.
While Adobe has demonstrated Flash Player 10 on the Android G1, at MWC it will also show it running on Nokia S60 and Windows Mobile phones. While Flash Player 10 won't display absolutely everything developed for the Web, even on high-end smartphones, it will come closer than its predecessors, said Anup Muraka, director of technology strategy and partner development in Adobe's platform business unit.
Muraka couldn't add any more details about the possibility of Flash in either form on iPhones, a question that many of the phone's users have wondered about. "I can reiterate what our CEO recently said, that we'll continue our development efforts. There's a fair bit of work to be done, and we're looking forward to completing that and coordinating with Apple to try to make it available," he said.
Adobe also plans to announce that it has released a new Adobe Reader Mobile SDK, which will replace Reader LE 2.5, the current mobile PDF reader. Licensees will use the new SDK to enable the display of PDF documents in their own readers. Reader LE 2.5 is less flexible, requiring licensees to use an included reader.
The new SDK will fit text to the screen rather than display documents in their full size. "In the existing reader, you have to zoom in and pan around," Muraka said.
Sony is already using the technology in its Reader Digital Book, and e-book readers from Bookeen and iRex Technologies as well as Lexcycle, the maker of the iPhone Stanza book reader, plan to use it.
For developers, Adobe introduced new technology that will automatically detect if users buying their applications have Flash Lite, and if they don't, offer to install it. "A developer no longer has to be dependent on whether a consumer has the latest device or software," said Muraka. The distributable player is now available as a beta.
Adobe will also use Mobile World Congress to push its Open Screen Project, an industry initiative that aims to make it easier for content providers to offer a consistent experience to users across devices including TVs, computers and phones. Nokia and Adobe announced that they plan to award US$10 million to developers who build applications that are based on Adobe Flash and will run on Nokia phones plus other kinds of devices. Developers will submit concepts for their applications, and a group of companies including Adobe and Nokia will review them and decide to award them seed money.

Lenovo Uses BlackBerry to Sync Laptop E-mail

PC maker Lenovo on Monday is expected to announce a partnership with Research In Motion that will make it easier for laptops to synchronize e-mail with servers via BlackBerry smartphones.
Lenovo is providing a hardware and software bundle that allows ThinkPad laptops to sync e-mail with a server using Research In Motion's BlackBerry phone as an intermediary. This is part of Lenovo's new Constant Connect program, which the company plans to announce at the GSMA Mobile World Congress in Barcelona.
Through Constant Connect, synchronizing e-mail with servers is a two-step process. First, ThinkPad laptops transfer e-mail back and forth with a BlackBerry using Bluetooth wireless technology. The smartphone then synchronizes laptop e-mail with a server using a mobile-phone network.
This could be useful in certain places like airports where users have to pay for Wi-Fi connections to sync e-mail with servers. This technology does not use Wi-Fi networks, said Rich Cheston, distinguished engineer and executive director at Lenovo. Users may also prefer to see their e-mail on a laptop with a bigger screen and full keyboard rather than on a BlackBerry, Cheston said.
The hardware comes in the form of a PCI Express card with its own radio and storage that plugs into a laptop. A user doesn't need to start the laptop, as the card syncs with a RIM device using its own battery power. The real-time syncing can provide quicker access to e-mail where wireless connectivity is spotty, Cheston said.
Users will also have the ability to sort and get alerts when specific e-mails arrive, Cheston said. "Let's suppose I want to be notified when my wife sends me an e-mail. I could have the card start blinking when the e-mail comes," Cheston said.
The bundle costs US$150 and will be available in the second quarter in the U.S., with worldwide availability scheduled for the second half of this year. It works with BlackBerry devices running BlackBerry OS 4.2 or later.
On laptops, it works with Windows XP and Windows Vista and supports Outlook or other POP (Post Office Protocol) e-mail clients like Gmail, Cheston said. The company plans to add Lotus Notes support in the second half.
The technology works only with ThinkPad laptops based on Intel's Montevina technology, which the company started shipping in the middle of last year. The package syncs only e-mail for now; Lenovo plans to add calendar- and contact-synchronizing capabilities later this year, Cheston said.

Wednesday, February 11, 2009

Dell Hurls out Another Adamo Teaser

Dell has launched a new teaser for its highly anticipated Adamo ultraportable laptop, now inviting users to sign up to follow the launch as it unfolds.
"Prepare to fall in love," says Dell's Adamo laptop Web site, which then invites users to sign up to see how the "love story" unfolds. Though what the e-mail may contain is unclear, the invites may provide an early look at the laptop.
Dell officially unveiled Adamo at the Consumer Electronics Show in Las Vegas, calling it an ultrathin laptop representing the best of the company's craftsmanship, performance and design. An on-stage model held the light laptop with a few fingers, revealing an ultraslim design with a premium finish.
Dell did not reveal technical specifications, but said the laptop would ship later in the first quarter.
Adamo won't be just a laptop but a whole new brand of luxury products, Michael Tatelman, vice president of global consumer sales and marketing at Dell, said at CES. The word Adamo means "to fall in love with," Tatelman said at the time.
Speculation around Adamo heated up late last year when news media and observers said Adamo was Dell's response to Apple's MacBook Air.
Dell also leaked accessories for an ultraportable laptop called Adamo Thirteen, which pointed to a laptop with a 13-inch screen, in line with Dell's branding conventions. The company already offers the Inspiron Mini 9, which has a 9-inch screen, and Inspiron Mini 12, which has a 12-inch screen.

Internet Explorer 8 Offers Improved Privacy and Security

Internet Explorer has recently been losing market share to upstarts like Mozilla's Firefox, Apple's Safari, and Google's Chrome beta, but Microsoft hopes to reverse the tide with Internet Explorer 8, which is due out this summer. My conclusion after a close examination of the four browsers: As matters stand, IE 8 seems likely to be the easiest to deploy and maintain over a large or small network.
In addition, IE 8's capabilities will either match or exceed those of the other browsers. Here's a comparative look at some of the key features to be included in IE 8, and a discussion of why companies may be better off using IE 8 than one of the other browsers.
Easy to Deploy
IE 8 appears to be especially well suited for companies that want to adopt a browser across a large network. In particular, Microsoft has equipped IE 8 with built-in deployment features, based on the company's existing deployment and update platforms. In contrast, Mozilla relies on third-party Firefox client customization add-ons such as FrontMotion Firefox MSI, CCK Wizard, or FirefoxADM; and Safari and Chrome don't as yet offer network-wide client customization deployment options at all.
Microsoft has been hyping IE 8's ability to switch automatically to IE 7 compatibility mode when necessary. But that's because IE versions 7 and earlier often didn't follow Web standards, and this failure to conform forced Web developers to code their pages differently in order to render on IE. Once deployed across a network, IE 8 won't break corporate intranets: Internal or intranet Web sites will automatically default to IE 7 compatibility so that businesses won't have to rewrite their inward-facing corporate pages. Similarly, Web surfing or external browsing in IE 8 will default to the new "standards mode" as well. Since Firefox, Chrome, and Safari have more or less conformed to Web standards over the years, they don't require this compatibility mode.
Taking a page from Google Chrome, IE 8 will offer built-in tab crash protection. In the event of a page fault, only the affected tab and not the entire browser will crash. The current versions of Firefox and Safari lack this isolation feature. Firefox will, however, restore the entire browser session after a browser crash; a similar feature in Safari called 'Reopen All Windows from Last Session' lets you restore previous browser windows whether or not the session ended with a crash.
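The crash isolation described above comes from running each tab in a separate operating-system process. The following toy Python sketch is not how any real browser is built (the "renderer" stand-in and URLs are invented), but it shows the principle: a fault kills only the child process, and the parent loop, playing the role of the browser, keeps running:

```python
import subprocess
import sys

def render_in_subprocess(url: str) -> str:
    """Run a stand-in 'renderer' for one tab in its own OS process."""
    code = (
        "import sys\n"
        f"url = {url!r}\n"
        "if 'crashy' in url:\n"
        "    sys.exit(139)   # simulate a page fault in this tab only\n"
        "print('rendered', url)"
    )
    proc = subprocess.run([sys.executable, "-c", code],
                          capture_output=True, text=True)
    return proc.stdout.strip() if proc.returncode == 0 else "tab crashed"

def browse(urls):
    # The 'browser' survives even when one tab's process dies.
    return {url: render_in_subprocess(url) for url in urls}

print(browse(["http://a.example", "http://crashy.example", "http://b.example"]))
```

The trade-off, which browser vendors weigh explicitly, is higher memory use per tab in exchange for containing a fault to a single page.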
Better Productivity
Though Microsoft took its time before embracing tabbed browsing, IE 8 is set to make significant strides in this area. As links on a page open new tabs, color-coded related tabs appear alongside the original. Chrome, Firefox, and Safari do not offer this capability. On the other hand, Chrome, Safari, and Firefox 3.1 can pull a tab out of the browser and create a new, stand-alone browser session; IE 8 won't be able to do this. IE 8 will offer some nice features within a tab, though: When you open a new tab, the browser will give you the option to reopen a closed tab or to restore your previous browsing session, among other choices.
Also unique to IE 8 will be "accelerators"--shortcuts to services that open within a given Web page. Instead of cutting and pasting to another tab, you may simply highlight the text and click the blue Accelerator icon to open blog, e-mail, map, search, and even translation services on the page you're currently viewing. This page-within-a-page feature is unavailable as yet from Firefox (without add-ons), Chrome, or Safari.
Web Slices, another unique feature, is designed to monitor a specific section of a Web page--a weather radar image, say, or an eBay auction--without requiring you to revisit the page. You'll simply select the page element and drag it to your toolbar to view as needed. Companies may be able to use Web Slices for intranet messaging and access to company services.
Mozilla dubbed its address bar in Firefox 3 the 'Awesome Bar' because it displays URL suggestions drawn from browser history and bookmarks. IE 8 will have its own awesome bar, with the unique ability to delete these suggestions--something Firefox doesn't offer. Deleting suggestions may help prevent over-the-shoulder snooping and assuage privacy concerns regarding a shared computer.
Private Browsing
If you share a computer with others, you may prefer that sites you visit not be added to your browser's history, or that any new cookies created be deleted when your browsing session ends. Safari was the first browser to offer Private Browsing. Chrome has answered with Incognito, and Firefox plans to add some form of private browsing to its Firefox 3.1 release.
With IE 8, Microsoft will introduce InPrivate Browsing. Both IE 8 (when it is released) and Chrome (now) display visual indicators--icons in the upper-left corner--to signal when you're in a private session. Safari offers no visual cues, and Firefox hasn't said what UI changes it plans to make. With private browsing, all client-side evidence of your surfing session should disappear when the session ends, though records of your visits will remain on external Web servers.
The private browsing feature appears to provide secrecy, but both Apple and Microsoft maintain a cache that includes private browsing sessions. Is that a contradiction? No. Apple uses a DNS cache so that the Safari browser doesn't have to continually request DNS information for frequently accessed sites. IE 8 will save information about your InPrivate sessions for sites that may be collecting information about your visits. Both Apple and Microsoft say that you can delete these caches through configuration options, however.
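A DNS cache of the kind described above is conceptually just a hostname-to-address map with a per-entry time to live. This toy Python class (hypothetical, not Safari's implementation) illustrates both the lookup-avoidance benefit and the user-facing "delete the cache" option both vendors point to:

```python
import time

class DNSCache:
    """Toy hostname -> address cache with per-entry expiry (TTL)."""

    def __init__(self):
        self._entries = {}   # host -> (address, expires_at)

    def put(self, host, address, ttl_seconds):
        self._entries[host] = (address, time.monotonic() + ttl_seconds)

    def get(self, host):
        entry = self._entries.get(host)
        if entry is None:
            return None
        address, expires_at = entry
        if time.monotonic() >= expires_at:   # stale: force a fresh lookup
            del self._entries[host]
            return None
        return address

    def clear(self):
        # The configuration option both vendors describe amounts to this.
        self._entries.clear()

cache = DNSCache()
cache.put("www.example.com", "93.184.216.34", ttl_seconds=60)
print(cache.get("www.example.com"))
```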
Better Security
Perhaps the most vexing aspect of past versions of Internet Explorer has been the browser's poor security. Here, too, Microsoft has made significant gains on the competition, starting with its 'Trustworthy Computing' line-by-line code inspection. Both IE 8 (running in Protected Mode) and Chrome will run at low integrity, meaning that they can't launch applications without the user's express permission. And both browsers are designed to use 'Data Execution Prevention' and 'Address Space Layout Randomization' to protect against remotely executing malware. Neither Firefox nor Safari offers similar protection.
All of the new browsers support Extended Validation SSL, a way of further establishing trust in a site you are visiting. Only Safari doesn't change its address bar to green to signal the extra security. And all four browsers include antiphishing protection, though Safari 3.2 stops there and doesn't yet offer antimalware protection.
Cross-Site Scripting and Other Demons
Cross-site scripting (aka "XSS") attacks occur when a malicious Web site uses JavaScript to read or write data on another Web site. Unlike the three competing browsers, IE 8 will offer built-in XSS protection. Firefox recommends that users install NoScript, a third-party add-on. So far, Chrome and Safari don't offer XSS-specific protection.
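Browser-side filters like IE 8's are only a backstop; the primary XSS defense remains escaping untrusted input on the server before embedding it in a page. A minimal Python illustration, using a hypothetical `render_comment` helper:

```python
import html

def render_comment(user_input: str) -> str:
    """Escape user-supplied text before embedding it in HTML, so a
    script tag is displayed as text instead of executed -- the
    server-side defense that browser XSS filters merely back up."""
    return f"<p>{html.escape(user_input)}</p>"

print(render_comment('<script>alert("xss")</script>'))
# -> <p>&lt;script&gt;alert(&quot;xss&quot;)&lt;/script&gt;</p>
```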
"Clickjacking," a term coined by security researchers Jeremiah Grossman of WhiteHat Security and Robert Hansen of SecTheory, refers to a less common but sinister practice: Bad guys trick a user into clicking a concealed link and performing unknown actions, such as activating a peripheral device like a Webcam or deleting data from a Webmail site. Since the attack uses a common coding procedure, Microsoft says that the best way to defeat it is for developers to add a special tag--X-FRAME-OPTIONS--that IE 8 will use to filter clickjacking attempts. Firefox recommends using the No Script add-on to ward off clickjacking attempts. Chrome and Safari do not offer specific protection against clickjacking.
In light of its robust new features and the ease with which it can be deployed, IE 8 appears poised to be the most network-ready browser of the bunch. Organizations currently running Internet Explorer should definitely upgrade to IE 8 when Microsoft releases it, and those that have migrated away from Internet Explorer should evaluate the productivity and security benefits they stand to gain by returning.

Friday, February 6, 2009

Removal of OLPC Donation Program Rattles Observers

One Laptop Per Child's removal of a program that enables small-scale XO laptop deployments has rattled observers, who are concerned that the nonprofit is changing its focus to large-scale deployments.
A program called "Give a School," which let donors fund small-scale deployments of 100 or more laptops, has been removed from the participation page of the nonprofit's Web site. The nonprofit now offers options to donate laptops directly or to make corporate purchases.
Designed for use by children in developing countries, the XO laptop has been praised for its innovative hardware features and environmentally friendly design. The Give a School program was defined as a "special program that allows donors to choose the country where the laptops go."
The change was first noted by Morgan Collett, who blogged about it. OLPC's changed focus could affect XO laptop deployments in South Africa that were purchased through the program, he wrote.
"This is a blow to future small deployments in South Africa, as we have over 600 XOs deployed in South Africa through this program with more that were planned," Collett wrote. A nonprofit organization was being set up to raise funds and coordinate deployments, but that will be to "no effect" unless laptops from other vendors are used, Collett wrote.
OLPC's grassroots focus through small deployments generates excitement for the nonprofit's larger efforts, wrote Wayan Vota, an OLPC observer, in a blog entry on OLPC News.
"It's pilots that give us guidance for national rollouts. It's the OLPC movements in South Africa, Oceania, and South Asia that are giving OLPC its real successes. And to discount, or outright abandon them, is foolish," Vota wrote.
OLPC officials did not comment on removal of the "Give a School" program, but President and Chief Operating Officer Chuck Kane said OLPC is committed to small-scale deployments. "Would you like to purchase 1,000 computers and change the world?" Kane asked in an e-mail.
He did not provide further comment.
Rumors of OLPC's change of focus started after an e-mail, said to come from OLPC, explained that the nonprofit had removed the Give a School program to refocus its efforts on larger deployments.
The authenticity of the e-mail could not be verified, but OLPC has indeed removed the program, which has led to concern among observers.

E-Books Take Center Stage

New Amazon Kindle rumors and Google's e-book announcement help fuel e-reading furor.
It’s been over a year since the Amazon Kindle e-book reader was introduced. And the electronic-ink-based device--which in many ways has transformed the e-book category--has spent much of that time in high demand: the Kindle was on backorder and sold out during the holidays, and today it remains on backorder at Amazon's site by three to five weeks.
Rumor has it that the second-generation Kindle will be introduced at an Amazon event in New York on Monday. Last fall, images purported to be the Kindle 2 surfaced on The Boy Genius Report.
The first-generation Kindle cost $359--when you could buy it. “The Kindle has spurred much interest in the e-book category, not only because of its wireless capabilities, but also because it extends the footprint of Amazon nearly anywhere," notes Ross Rubin, NPD Group director of industry analysis. "It's been one of the first wirelessly connected consumer electronics products to offer fast connectivity at no end-user cost to the consumer.”
That connectivity--an integrated 3G cellular radio and Kindle’s free, Whispernet EvDO wireless connection provided in partnership with Sprint--allows immediate access to the Kindle store for on-demand e-book purchases. Plus, you can use Whispernet to subscribe to and receive blogs and RSS feeds, as well as to browse basic Web sites (text pages, not graphics-heavy sites, so it's handy for quick news and weather checks, or for Wikipedia lookups).
A second-generation Kindle has the opportunity to correct some of the design flaws of the first-gen model--it was too bulky, and handled PDFs and other document files less than gracefully--while making the device more competitive and appealing, given new competition.
Sony, for example, has added backlighting and a touch screen on its slim second-generation Sony Reader Digital Book PRS-700BC. Meanwhile, Google announced that the 1.5 million public-domain books in its Google Book Search will be accessible via mobile handsets such as the Apple iPhone 3G and the T-Mobile G1. And Amazon has countered by saying that it is working on making Kindle e-book titles accessible on cell phones as well.
Cell phones could be the ultimate mobile e-book reader, by virtue of their portability and ubiquitous nature. “There's a relatively small market for a dedicated device for reading best-sellers, and we're seeing more development on e-book initiatives for the iPhone, with offerings such as Shortcovers and Zinio for the iPhone,” says Rubin. Add in the Google Book Search and Amazon mobile Kindle initiatives, and cell phones could become the next big platform for e-books, beyond the dedicated electronic-ink screens.
Rubin says that one area Amazon could potentially mine is that of electronic textbooks. “There's a tremendous opportunity for the first e-book provider that can tap into the textbook market,” he says. “At the appropriate price, that could transform these devices from frequent-flyer folios into a staple in the homes of students.”

Groups Push for Broadband Stimulus

The U.S. Congress should keep money for broadband deployment in a huge economic stimulus package, despite some calls to trim it out of the bill, representatives of three groups said Friday.
As the U.S. Senate debated cuts to a US$890 billion stimulus package, representatives of the Information Technology and Innovation Foundation (ITIF), the Communications Workers of America, and Connected Nation called on the Senate to keep funding for building broadband networks in rural and other underserved areas.
The Senate version of the economic stimulus package originally included $9 billion for broadband deployment, about $3 billion more than a House of Representatives' stimulus bill that passed Jan. 29. Late Friday, senators continued to debate their own stimulus bill, with several lawmakers calling for significant cuts in the spending package. One proposal would cut the broadband spending by $1.5 billion.
Money for broadband is important to help rural and some urban areas realize the economic and social benefits of broadband, said Raquel Noriega, director of strategic partnerships at Connected Nation, a nonprofit group focused on helping communities expand broadband deployment.
"I cannot see a better way" to stimulate the economy, she said during an ITIF forum on broadband stimulus in the U.S. Capitol.
Some groups have questioned whether there's a need for broadband deployment money in the bill. On Jan. 21, New York Times tech columnist Saul Hansell suggested that broadband providers would reach most of the nation without a large amount of stimulus money. Using new cable modem technology, the U.S. should be able to surpass other nations' broadband speeds, he wrote.
"As I look at it, the noise about a broadband gap is hooey," Hansell wrote. "With new cable modem technology becoming available, 19 out of 20 American homes eventually will be able to have Internet service that is faster than any available now anywhere in the world."
Hansell suggested the stimulus package should focus on unserved areas instead of spending tens of billions of dollars to increase speeds in areas already served by broadband providers, as some groups have called for. "It is hardly clear that the country would get an adequate return from subsidizing what is essentially duplicate capacity," he wrote.
Berin Szoka, a fellow at conservative think tank the Progress and Freedom Foundation, also questioned how the government will be able to gauge the effectiveness of any stimulus money for broadband. He suggested the broadband stimulus is "corporate welfare" in a Jan. 20 blog post.
"How would one actually evaluate the efficacy of any proposed government intervention?" Szoka wrote. "As difficult as it is to predict the unintended consequences of intervention, it's even more difficult to do so in high-tech sectors of the economy, where the rate of change is particularly rapid."
But stimulus money will be needed to reach that last 5 percent to 10 percent of U.S. residents who don't have access to broadband, said Robert Atkinson, ITIF's president. Many of those people are in rural areas where broadband providers have been reluctant to provide service because of the cost per customer, he said.
"You can't make the money back at $35 a month," he said. "The numbers don't work."
Atkinson called on Congress to approve a mixture of tax cuts and grants to help broadband providers expand service. While a grant program would take time to set up, tax cuts would encourage providers to expand their networks almost immediately, he said.
He also called on Congress to get rid of open access and net neutrality requirements, as well as speed requirements, in the House version of the stimulus package. The House bill would require the U.S. Federal Communications Commission to define open access rules, and those rules could potentially include requirements for broadband providers to share their networks with competitors, he said.
Proponents of the open access rules say they're needed to keep broadband providers from blocking or slowing access to some Web content. But Atkinson said those requirements could drive away broadband providers from accepting stimulus money. With the requirements in place, "you'll see very little take-up of the grants," he said.

Friday, January 30, 2009

UAC Fix in Windows 7 Creates Security Hole, Blogger Says

A change that Microsoft made in Windows 7 to improve its controversial User Account Control security feature has left the new OS less secure, according to a blogger who follows Microsoft closely.
Microsoft made the change to UAC, a feature that was introduced with Windows Vista, to make it more user-friendly in Windows 7. But the change has allowed for "a simple but ingenious override" that disables UAC without any action on the part of the user, according to the I Started Something blog written by longtime Microsoft watcher Long Zheng.
Microsoft added UAC to Vista in an effort to improve its security and give people who are the primary users of a PC more control over its applications and settings. UAC prevents users without administrative privileges from making unauthorized changes to a system. But because of how it was set up in Vista, UAC sometimes prevents even authorized users from being able to access applications and features they should normally have access to.
It does this through a series of screen prompts that ask the user to verify privileges, and it may require them to type in a password to perform a task. This can interrupt people's workflow, even during some mundane tasks, unless they are set as Local Administrator. The UAC prompts became so problematic that Apple even spoofed them in a television commercial, and Microsoft vowed to improve the feature in Windows 7.
Windows 7 is still in beta and not expected to ship until late this year or early next. Microsoft released the beta earlier this month and outlined the changes to UAC on the Engineering Windows 7 blog.
The changes revise the UAC's default setting, and that is where the security risk lies, according to Zheng.
As he explained in his post, UAC's default setting in Windows 7 is to "Notify me only when programs try to make changes to my computer" and "Don't notify me when I make changes to Windows settings."
UAC distinguishes between a third-party program and a Windows setting with a security certification, and control-panel items are signed with this certificate so they don't issue prompts if a user changes system settings, he wrote.
However, in Windows 7, changing UAC is considered a "change to Windows settings," according to Zheng. This, coupled with the new default UAC security level, means a user will not be prompted if changes are made to UAC, including if it was disabled.
With a few keyboard shortcuts and some code, Zheng said he can disable UAC remotely without the end-user knowing.
"With the help of my developer side-kick Rafael Rivera, we came up with a fully functional proof-of-concept in VBScript (would be just as easy in C++ EXE) to do that -- emulate a few keyboard inputs -- without prompting UAC," he wrote. "You can download and try it out for yourself here, but bear in mind it actually does disable UAC."
Zheng also posted what he said is a workaround for the problem on his blog.
Microsoft said on Friday through its public relations firm that it was looking into the problem and did not have an immediate comment.

Second Life Profitable Despite Interface Woes

In exclusive interviews with The Industry Standard, Linden Lab's two top executives have confirmed that the company is still profitable and that Second Life is continuing to grow its user base and expand its enterprise services. However, Linden Lab founder and chair Philip Rosedale and CEO Mark Kingdon admitted that the in-world experience still takes too long for new users to master, an issue that will require significant amounts of technological work to rectify.
The two executives spoke to the Standard at the company's headquarters in San Francisco earlier this month (see Interview with Linden Lab CEO Mark Kingdon and Interview with Second Life creator Philip Rosedale for transcripts).
Kingdon acknowledged an "incredible hype phase" that had introduced lots of people to the potential of virtual worlds, but had also put the spotlight on many negative aspects. He said that the company was in a "comfortable place" in terms of growth in active users, usage hours, and Second Life uptime.
Rosedale said that Second Life had moved beyond an emerging application for technology-savvy users. "There is a lot more diversity in use, demographics and behavior in Second Life today than there was, say, at the end of 2003," he said.
Kingdon echoed this assessment. "I think the world has gotten its head around the fact that virtual worlds are here to stay," Kingdon said. "There is a very compelling set of activities that virtual worlds are incredibly powerful for. They erase geographies, they allow for a type of interaction that you can't get in the real world and they bring with them really interesting economic and business opportunities for users."
Kingdon pointed to several localization projects for countries in Europe, Asia, and South America, and cited in-world training and remote meetings as compelling activities for companies. Both he and Rosedale portrayed Second Life as a competitor to enterprise video conferencing, which they believe is unable to match Second Life's ability to make people feel comfortable interacting with other remote users.
As for competing virtual worlds, Kingdon said he and his team tried to keep abreast of trends, but declined to name any current competitors.
Discussing Google's closure of Lively last year, Kingdon said it was a "natural" outcome, considering Google's focus and the state of the economy. "I don't think that Lively's departure is an invalidation of the market. I think, it's just recognition that, yeah, there is promise [and] a lot of hard work," Kingdon explained. "Google made the right decision and said, 'We need to kind of stick to our knitting in this economic downturn, in this climate, and focus our resources on some of our core properties,' which is quite natural."
Rosedale said There.com had some "unique" aspects and had effectively targeted certain vertical markets, but called it "substantially less interesting" in terms of the content that users can create. "The demographic is tighter, narrower, less diverse," he stated.
Linden Lab's CEO said that despite the recession, the company remained profitable. "We have not felt the same in-world economic turmoil that the real world has faced," Kingdon said, noting that Second Life was an affordable entertainment alternative to activities such as going to a movie. "Dollar for dollar, it's high-value entertainment for the casual user," he said.
On the enterprise side, he and Rosedale described uptime improvements and new products, including a hosted service and a behind-the-firewall service nicknamed "Nebraska" aimed at companies with stronger security needs.
However, the Linden executives said that a lot of work remained to be done in terms of making the service easier to use. Rosedale singled out search, the user interface and new user orientation as needing major improvements. "We need to collapse the orientation experience on learning the interface down to a 30-minute timeframe," he declared. "We're not there yet."
Rosedale went on to describe the current interface as "overwhelming."
He said, "the basic UI of the software also needs to change. It has too many pixels," referring to the buttons, numbers, and other data presented to users on the screen. "They're all kind of demanding your attention -- your [Linden] dollar balance, your inventory window, all the buttons on the bottom bar, chat and text that are visible in the window, that's asking something of you, blue pop-ups that are coming up."
Rosedale said that while the work required to make the interface less complex was significant, it would have a huge impact on the adoption rate of virtual worlds. Currently, only 15% of the people who tried out Second Life continued to use the virtual world. "I'd like to triple that number," he stated.
Nevertheless, progress has been made in terms of making the technology more appealing to new users. Kingdon described how the old Second Life registration process -- a seven-page form which he likened to a mortgage application -- had been streamlined. "We've very substantially shortened the registration flow," he said. "We shortened it in July to one page and saw a very substantial increase in registration completions."
Kingdon added that the company has also implemented better email management techniques to increase activations, and was also paying attention to SEO, in order to help users get to helpful Second Life resources via Web search engines.

The Web 2.0 'Conversation' Is Really a Shouting Match

Web 2.0 wonks like to gush about how the Internet these days is all about "joining the conversation." Lately, though, it's been more like a shouting match.
Today's example: The fall of Michael Arrington.
As Rodrigues & Urlocker (the Captain & Tennille of InfoWorld bloggers) have also noted, Michael Arrington is hanging up his keds at TechCrunch -- at least through the month of February, if not longer. The reason? He's sick of all the haters.
In his most recent blog post, Arrington said death threats he'd received last summer, coupled with being spat upon at a conference a few days ago, were key factors in his decision. Hey, nobody should have to endure stuff like that just for expressing an opinion, whether you agree with it or not.
But in true Arrington form, he couldn't just leave it at that. In an interview with the Wall Street Journal, he implicitly blamed popular blog sites Valleywag (now part of Gawker) and AllThingsD -- led by poppa bear Walt Mossberg of the Wall Street Journal and BoomTown's Kara Swisher -- for daring to question his ethics and (thus) inciting the haters. Quoth Mr. TechCrunch:
"Whoever is the top blog will get attacked by everyone else and that'll just be the way it is," Mr. Arrington said. "We really need to think about, the community of bloggers, if we're going to continue to slay our own for competitive reasons."
Apply your own cliche here -- glass houses, stones, heat, kitchen, pot, kettle, black, etc. Just about any of them work. And who exactly anointed TechCrunch "top blog"? I must have missed that press release.
Still, you have to give TechCrunch its due. In a few short years, it's grown from one guy spouting his opinions on startups to one of the most popular (and feared) news sites on the Net. It has broken some real stories -- like Google's acquisition of YouTube -- before the mainstream media had even heard of "viral video." Its reach is impressive. Even the Washington Post deigned to syndicate TechCrunch.
On the other hand, there's still too much of one guy spouting his opinions masquerading as real journalism for my taste. The site seems willing to publish any rumor, which means it's wrong a lot of the time. It's become a running joke that every week or so TechCrunch will post a story saying Google is going to acquire some company, and when it doesn't happen, post a second story saying they walked away from the deal. Rightly or wrongly, questions about Arrington's relationship to the companies he writes about continue to dog him.
(Note: This blog is sometimes guilty of much of the above. But then, I make no claims to be a news source. We're all snark all the time here in Cringeville.)
My real point is, Arrington's right: It has gotten nastier out there. Maybe it's always been this way, and the flame wars that used to be confined to alt.geek.whatever on Usenet have now exploded across the Net.
I see it here in the comments to this blog. All I need to do is pick the right topic -- anti- or pro-Microsoft, Apple, Linux; Scientology vs "Anonymous"; science vs faith; and anything that touches on politics -- and the anonymous posters come out with guns blazing. It's like pushing the flame button; it's automatic.
So far, the worst thing that's happened is people unsubscribing from the e-mail newsletter -- no death threats or spittle yet. But it seems like it's only a matter of time before that kind of thing starts happening to more of us.
Has the Net gotten nastier? Can "love keep us together"? E-mail me direct: cringe@infoworld.com.

AMD Set to Release DDR3-Capable Processors

Advanced Micro Devices will soon introduce processors that are capable of supporting DDR3 memory, earlier than the company had anticipated.
The company in the next few weeks will launch new processors targeted at desktops that will include DDR3-capable memory controllers, said John Taylor, an AMD spokesman.
Taylor declined to comment on specific processors being launched, though a leaked road map suggests the launch of new Phenom II and triple-core processors.
The support for DDR3 memory comes earlier than anticipated. Late last year the company said it aimed to add DDR3-capable Phenom II processors by the middle of 2009, but could push that up depending on factors including pricing of the memory.
Compared to current DDR2-capable processors, the new DDR3-capable chips will move data between memory and the CPU faster, which translates into better PC performance. To run DDR3-capable processors, the company will introduce the AM3 socket for motherboards.
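The speed difference follows directly from the transfer rates: theoretical peak bandwidth is the transfer rate times the bus width. As a rough back-of-the-envelope calculation (assuming the standard 64-bit DIMM bus; the function name is illustrative):

```python
def peak_bandwidth_gbs(transfers_per_sec_millions: float,
                       bus_width_bits: int = 64) -> float:
    """Theoretical peak bandwidth of one memory channel in GB/s:
    transfer rate (MT/s) x bus width in bytes, scaled to GB."""
    return transfers_per_sec_millions * (bus_width_bits // 8) / 1000

ddr2_800 = peak_bandwidth_gbs(800)    # DDR2-800: 6.4 GB/s per channel
ddr3_1333 = peak_bandwidth_gbs(1333)  # DDR3-1333: ~10.7 GB/s per channel
print(ddr2_800, ddr3_1333)
```

By this measure, a DDR3-1333 channel offers roughly two-thirds more raw bandwidth than DDR2-800, which is where the promised performance headroom comes from.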
"The people who want the latest and greatest will want to use DDR3 memory," Taylor said.
AMD's decision to switch to DDR3 memory is to make CPUs faster so it can effectively compete with Intel in the high-end PC and server markets, said Dean McCarron, president of Mercury Research, a market analysis firm.
"When we make changes in PC architecture, it is because it's either faster or cheaper," said McCarron. For AMD, the decision was technical rather than financial, but the enhanced competitiveness could yield a financial benefit to AMD in the long run, McCarron said.
Intel's Core i7 processor for gaming systems, launched in November, already supports DDR3 memory. Intel is also adding DDR3 support to chips for portable products like laptops.
However, given AMD's inherent price advantage over Intel's products, its price-sensitive buyers may initially balk at the high prices of DDR3 memory modules, McCarron said. As of early January, a 1GB DDR3 memory module running at 1333MHz was priced at $35, versus $12 to $14 for a comparable 1GB DDR2 module.
"This is completely normal for technology. As the volume ramps [DDR3 memory prices] will come down," McCarron said.
Motherboard companies like Asus have already announced AM3-compatible motherboards, setting the stage for AMD to launch its new DDR3-capable processors, which could include new Phenom II processors. The new CPUs will include a memory controller capable of both DDR2 and DDR3, allowing them to work with older DDR2 motherboards.
AMD earlier this year launched new quad-core Phenom II processors, which the company called its "highest-performing" CPUs to date. Aimed at high-end desktop PCs, the chips ran at speeds of up to 3GHz and included 8MB of cache.
However, the Phenom II chips are capable of even faster clock speeds under certain circumstances. For example, the processors have been overclocked to run at speeds of up to 6.5GHz on liquid-cooled systems and up to 4GHz on air-cooled systems.
AMD remains on track to transition to DDR3 memory support for servers with the Maranello platform in 2010, Taylor said. The Maranello platform includes the six-core Sao Paulo and 12-core Magny-Cours chips.

Friday, January 23, 2009

Windows 7 Security Features Get Tough

Two years after Windows Vista debuted, many companies have yet to upgrade. And in many instances their reluctance to migrate to Vista stemmed from concern about security.
Microsoft has responded with its latest operating system, Windows 7, currently in public beta and expected to ship later this year. Windows 7 adds new security features, expands popular ones, and enhances familiar ones. Here's a look at a dozen or so security improvements that we expect will convince even the most recalcitrant corporate clients to upgrade.
Improved Migration Tools
Microsoft says that Windows 7 will be faster and easier to roll out across an enterprise than previous OS migrations were. Much of the credit for the anticipated improvement goes to new tools such as Dynamic Driver Provisioning, Multicast Multiple Stream Transfer, and Virtual Desktop Infrastructure.
With Dynamic Driver Provisioning, drivers are stored centrally, separate from images. IT professionals can arrange for installation by individual BIOS sets or by the Plug and Play IDs of a PC's hardware. Microsoft says that reducing the number of unnecessary drivers installed will help avoid potential conflicts and will accelerate installation. With Windows 7, as with Windows Vista, IT professionals can update system images offline, and even maintain a library of images that includes different drivers, packages, features, and software updates.
Rolling out any particular image across the entire network--or even installing individual images on desktops--is faster in Windows 7, thanks to the new Multicast Multiple Stream Transfer feature. Instead of individually connecting to each client, deployment servers "broadcast" the images across the network to multiple clients simultaneously.
Virtual Desktop Infrastructure (VDI), another desktop deployment model, allows users to access their desktops remotely, thereby centralizing data, applications, and operating systems. VDI supports Windows Aero, Windows Media Player 11 video, multiple-monitor configurations, and microphone support for voice over IP (VoIP) and speech recognition. New Easy Print technology permits VDI users to print to local printers. But use of VDI requires a special license from Microsoft, and doesn't offer the full functionality of an installed operating system.
Protecting Corporate Assets
Once the OS is installed, organizations may protect their assets with authentication for log-in. Windows Vista included drivers for fingerprint scanners, and Windows 7 makes such devices easier for IT professionals and end-users to set up, configure, and manage. Windows 7 extends the smart card support offered in Windows Vista by automatically installing the drivers required to support smart cards and smart card readers, without administrative permission.
IT professionals may further protect the contents of their Windows 7 volumes with BitLocker, Microsoft's whole-disk encryption system. Windows Vista users have to repartition their hard drive to create the required hidden boot partition, but Windows 7 creates that partition automatically when BitLocker is enabled. In Windows Vista, IT professionals must use a unique recovery key for each protected volume. But Windows 7 extends the Data Recovery Agent (DRA) to include all encrypted volumes; as a result, only one encryption key is needed on any BitLocker-encrypted Windows machine.
BitLocker To Go is a new feature that lets users share BitLocker-protected files with users running Windows Vista and Windows XP. The BitLocker To Go desktop reader provides simple, read-only access to the protected files on non-BitLocker-protected systems. To unlock the protected files, the user must provide the appropriate password (or smart-card credentials).
Application Control
Windows 7 also introduces AppLocker , an enhancement to Group Policy settings that lets organizations specify which versions of which applications users have permission to run. For example, a rule might allow users to install Adobe Acrobat Reader version 9.0 or later, but it might block them from installing legacy versions without specific authorization. AppLocker contains a rule-generation wizard to make the process of creating policies much easier, and it includes automatic rule making for building a custom white list.
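The version-aware rules described above can be sketched in a few lines. This is a simplified model of an AppLocker-style publisher rule, not Microsoft's implementation (the data structure and function names are invented for illustration):

```python
def version_tuple(version: str):
    """Turn "9.1" into (9, 1) so versions compare numerically, not as strings."""
    return tuple(int(part) for part in version.split("."))

def is_allowed(publisher: str, product: str, version: str, rules) -> bool:
    """Return True if some rule permits this publisher/product at the given
    version or later -- the 'version 9.0 or later' policy from the example."""
    for rule in rules:
        if (rule["publisher"] == publisher
                and rule["product"] == product
                and version_tuple(version) >= version_tuple(rule["min_version"])):
            return True
    return False  # default-deny: anything unmatched is blocked

rules = [{"publisher": "Adobe", "product": "Acrobat Reader", "min_version": "9.0"}]
print(is_allowed("Adobe", "Acrobat Reader", "9.1", rules))  # True: new enough
print(is_allowed("Adobe", "Acrobat Reader", "8.1", rules))  # False: legacy version
```

The default-deny stance at the end mirrors the white-list model AppLocker's rule wizard builds: anything not explicitly matched by a rule is refused.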
System Restore, first introduced in Windows Me, gets a much-needed update in Windows 7. First, System Restore displays a list of the specific files that will be removed or added at each restore point. Second, restore points are now available in backups, giving IT professionals and others a wider range of restore options over a longer period of time.
The Action Center is a new, integrated Control Panel feature that gives Windows 7 users a central spot for locating tasks and common notifications under a single icon. The Action Center includes alerts and configuration settings for several existing features, including the Security Center; Problem Reports and Solutions; Windows Defender; Windows Update; Diagnostics; Network Access Protection; Backup and Restore; Recovery; and User Account Control. Popup alerts are gone in Windows 7, replaced by a new task tray icon (a flag with an X) that provides streamlined access to the problem directly or to the Action Center for more information.
Perhaps the most famous and most annoying form of Windows Vista notification comes from the User Account Control (UAC) feature, which flashes administrative warnings whenever you need to configure a system setting. In Vista the choices are stark: Endure the messages, or turn off UAC. In Windows 7, you have additional options. A slider bar configures the appropriate notification level for your computer, and by default UAC will notify you only when programs try to make changes to your PC.
Better Performance
Windows Defender, Microsoft's antispyware product, gains a much-needed performance enhancement in Windows 7. But Microsoft has removed the Software Explorer tool, asserting that the utility doesn't affect spyware detection or removal. That might be true, but Software Explorer allowed you to see what programs and processes were running, including ones that you might not know about or want. Perhaps Microsoft will reverse this decision by the final build.
Another new feature of Windows 7 is the Windows Filtering Platform (WFP), a group of APIs and system services that allow third-party vendors to tap further into Windows' native firewall resources, thereby improving system performance. Microsoft stresses that WFP is a development platform and not a firewall in itself, but WFP does address a few of Windows Vista's firewall problems.
In Vista, Microsoft introduced the concept of profiles for different types of network connections: home, work, public, and domain. This, however, hamstrung corporate IT departments whenever a remote user accessed the corporate VPN, because the firewall profile was already set to "home" or "public," and corporate network settings could not be applied on top of it. Windows 7, and WFP in particular, permits multiple active firewall policies, so IT professionals can maintain a single set of rules both for remote clients and for clients that are physically connected to their networks. Windows 7 also supports Domain Name System Security Extensions (DNSSEC), newly established protocols that give organizations greater confidence that DNS records are not being spoofed.
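DNSSEC validation is handled inside the resolver, but the wire-level handshake is simple enough to sketch: a security-aware client asks for DNSSEC records by setting the DO ("DNSSEC OK") bit in an EDNS0 OPT pseudo-record appended to its query. The following hand-rolled Python illustration is a minimal sketch of that packet layout per RFC 6891; the function name and transaction ID are illustrative and not part of any Windows API.

```python
import struct

def build_dnssec_query(qname):
    """Build a minimal DNS A-record query with the EDNS0 DO bit set,
    which signals that the client wants DNSSEC records returned."""
    # Header: ID, flags (RD set), 1 question, 0 answers, 0 authority, 1 additional
    header = struct.pack(">HHHHHH", 0x1234, 0x0100, 1, 0, 0, 1)
    # Question section: length-prefixed labels, then QTYPE=A (1), QCLASS=IN (1)
    labels = b"".join(bytes([len(p)]) + p.encode("ascii") for p in qname.split("."))
    question = labels + b"\x00" + struct.pack(">HH", 1, 1)
    # EDNS0 OPT pseudo-record: root name, TYPE=41 (OPT),
    # CLASS = advertised UDP payload size (4096); the TTL field carries flags,
    # and 0x8000 in its low 16 bits is the DO ("DNSSEC OK") bit; RDLENGTH=0
    opt = b"\x00" + struct.pack(">HHIH", 41, 4096, 0x00008000, 0)
    return header + question + opt

packet = build_dnssec_query("example.com")
```

A real resolver would send these bytes over UDP port 53 and then check the AD (authenticated data) flag in the response header to learn whether the records validated.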
Features for Mobile Users
Windows 7 has two enhancements designed for mobile users. With DirectAccess, mobile workers can connect to their corporate network any time they have Internet access--without needing a VPN. DirectAccess updates Group Policy settings and distributes software updates whenever the mobile computer has Internet connectivity, whether the user is logged on to a corporate network or not. This ensures that mobile users stay up-to-date with company policies. And with BranchCache, a copy of data accessed from an intranet Web site or from a file server is cached locally within the branch office. Remote users can use BranchCache to access shared data rather than using a connection back to headquarters.
Windows 7 also makes enhancements to event auditing. Regulatory and business requirements are easier to fulfill through management of audit configurations, monitoring of changes made by specific people or groups, and more-granular reporting. For example, Windows 7 reports why someone was granted or denied access to specific information.

Study: Spam Is Getting More Malicious

Spam, especially junk e-mails with malicious links or attachments, continues to be a huge IT headache. Spammers are also getting more creative in their attempts to find victims, utilizing popular sites such as Facebook and Twitter, according to a report from UK-based security firm Sophos this week.
The consultancy published its latest spam trend report and said the new figures show that spam is still causing problems for computer users. In October 2008, Sophos research found, one in every 256 e-mails contained a dangerous attachment. In November, that figure improved to one in 384. December saw a huge decline: Just one in every 2,000 e-mails contained such an attachment. Graham Cluley, senior technology consultant at Sophos, said the drop-off may be related to the shutdown of McColo Corp., a Web-hosting firm that security experts believe was responsible for three-quarters of the world's spam.
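Those "one in N" figures are easier to compare as percentages; a quick back-of-the-envelope conversion (in Python, purely restating the Sophos numbers) looks like this:

```python
# Sophos figures: one in N e-mails carried a dangerous attachment
rates = {"October": 256, "November": 384, "December": 2000}
for month, n in rates.items():
    print(f"{month}: 1 in {n} = {100 / n:.2f}%")
# October works out to about 0.39%, November 0.26%, December 0.05%
```

In other words, the rate of booby-trapped e-mail fell by roughly a factor of eight between October and December.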
"It's hard to say exactly what can be causing this," said Cluley. "Certainly that is possible."
Numbers for January have not been assessed yet, and Cluley said it is too early to determine whether the drop-off in spam levels has continued or whether spam is back at the levels seen in earlier months. What is clear, said Cluley, is that more spam is malicious in nature now, often designed to infect users' computers via sophisticated malware attachments or links to malicious or infected Web sites in order to steal sensitive information. Cluley also said social networking venues, such as Facebook and Twitter, are now the hot targets for spammers.
"Spammers really took to using sites like Facebook and Twitter as a vehicle for their spam antics during the last three months of 2008," he said. "Cybercriminals have cottoned onto the fact that social networking users can be more easily fooled into clicking on a link that appears to have come from a trusted Facebook friend, than if it arrived as an unsolicited email in their inbox. The notorious Nigerian 419 scammers have even evolved, masquerading as Facebook friends in order to trick unwary users into parting with valuable sensitive and financial information. Ultimately, while users are still falling for these scams, the fraudsters will continue. And while the authorities are making great progress, everyone must take steps to ensure they don't fall victim."
Death to Spam?
The report also referenced a 2004 prediction by Bill Gates that spam would be a thing of the past within two years.
"The rumors of spam's death have been greatly exaggerated over the years; the threat remains alive and kicking despite increased legal action against spammers, the occasional takedown of Internet companies which assist the cybercriminals, and constantly improving anti-spam software," said Cluley. "Many IT professionals cast doubt on Bill Gates' assertion back in 2004, deeming the timeframe of his pledge to be unrealistic. Although the latest stats show that the proportion of spam relayed per country may have decreased year-on-year, spammers have turned to more creative, not to mention devious, methods to ensure their messages reach as many unsuspecting computer users as possible."
And the Spam King Crown Goes to...
Between October and December 2008, the United States was responsible for most of the world's spam, according to Sophos. China was in the second spot and Russia was third. Sophos officials pointed to Canada, Japan and France as countries that have made progress in spam prevention. All three, considered "serial offenders" five years ago, are no longer present in the list of spam reprobates.
"Although there's no denying that some countries have significantly reduced their contribution to the spam epidemic over the past five years, the United States still holds the crown," said Cluley. "Though its spam contribution has significantly decreased since Bill Gates' proclamation, falling from almost half of all spam relayed at the end of 2004, to 21.3 percent by the end of 2007, and now resting at 19.8 percent, this shows there's certainly no quick fix."