Friday, January 30, 2009

UAC Fix in Windows 7 Creates Security Hole, Blogger Says

A change that Microsoft made in Windows 7 to improve its controversial User Account Control security feature has left the new OS less secure, according to a blogger who follows Microsoft closely.
Microsoft made the change to UAC, a feature that was introduced with Windows Vista, to make it more user-friendly in Windows 7. But the change has allowed for "a simple but ingenious override" that disables UAC without any action on the part of the user, according to the I Started Something blog written by longtime Microsoft watcher Long Zheng.
Microsoft added UAC to Vista in an effort to improve its security and give people who are the primary users of a PC more control over its applications and settings. UAC prevents users without administrative privileges from making unauthorized changes to a system. But because of how it was implemented in Vista, UAC sometimes prevents even authorized users from accessing applications and features they should normally be able to use.
It does this through a series of screen prompts that ask the user to verify privileges, and it may require them to type in a password to perform a task. This can interrupt people's workflow, even during some mundane tasks, unless they are set as Local Administrator. The UAC prompts became so problematic that Apple even spoofed them in a television commercial, and Microsoft vowed to improve the feature in Windows 7.
Windows 7 is still in beta and not expected to ship until late this year or early next. Microsoft released the beta earlier this month and outlined the changes to UAC on the Engineering Windows 7 blog.
The changes revise UAC's default setting, and that is where the security risk lies, according to Zheng.
As he explained in his post, UAC's default setting in Windows 7 is to "Notify me only when programs try to make changes to my computer" and "Don't notify me when I make changes to Windows settings."
UAC distinguishes between a third-party program and a Windows setting with a security certification, and control-panel items are signed with this certificate so they don't issue prompts if a user changes system settings, he wrote.
However, in Windows 7, changing UAC is itself considered a "change to Windows settings," according to Zheng. This, coupled with the new default UAC security level, means a user will not be prompted if changes are made to UAC -- including if it is disabled entirely.
With a few keyboard shortcuts and some code, Zheng said he can disable UAC remotely without the end-user knowing.
"With the help of my developer side-kick Rafael Rivera, we came up with a fully functional proof-of-concept in VBScript (would be just as easy in C++ EXE) to do that -- emulate a few keyboard inputs -- without prompting UAC," he wrote. "You can download and try it out for yourself here, but bear in mind it actually does disable UAC."
Zheng also posted what he said is a workaround for the problem on his blog.
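For readers who want to confirm how their own machine is configured while Microsoft investigates, the notification level behind the UAC slider is stored in a handful of well-documented registry values. The short Python sketch below simply reads those values and prints them; it is illustrative only, is not Zheng's proof-of-concept, and assumes the standard policy key used by Vista and the Windows 7 beta.
    # Minimal sketch: read the registry values behind the UAC slider to confirm
    # your own notification level. Illustrative only -- run with Python 3 on
    # Windows; the key path below is the standard policy location.
    import winreg

    UAC_KEY = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System"
    VALUES = ("EnableLUA", "ConsentPromptBehaviorAdmin", "PromptOnSecureDesktop")

    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, UAC_KEY) as key:
        for name in VALUES:
            try:
                value, _ = winreg.QueryValueEx(key, name)
                print("{} = {}".format(name, value))
            except OSError:
                print("{} is not set".format(name))

    # EnableLUA = 0 means UAC is switched off outright, and
    # ConsentPromptBehaviorAdmin = 0 means administrators are elevated without
    # any prompt -- the state a machine is left in if UAC has been silently
    # disabled.
On an untouched install you would expect EnableLUA to be 1 and ConsentPromptBehaviorAdmin to be non-zero; anything else is worth a closer look.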
Microsoft said on Friday through its public relations firm that it was looking into the problem and did not have an immediate comment.

Second Life Profitable Despite Interface Woes

In exclusive interviews with The Industry Standard, Linden Lab's two top executives have confirmed that the company is still profitable and Second Life is continuing to grow users and expand its enterprise services. However, Linden Lab founder and chair Philip Rosedale and CEO Mark Kingdon admitted that the in-world experience still takes too long for new users to master, an issue that will require significant amounts of technological work to rectify.
The two executives spoke to the Standard at the company's headquarters in San Francisco earlier this month (see Interview with Linden Lab CEO Mark Kingdon and Interview with Second Life creator Philip Rosedale for transcripts).
Kingdon acknowledged an "incredible hype phase" that had introduced lots of people to the potential of virtual worlds, but had also put the spotlight on many negative aspects. He said that the company was in a "comfortable place" in terms of growth in active users, usage hours, and Second Life uptime.
Rosedale said that Second Life had moved beyond an emerging application for technology-savvy users. "There is a lot more diversity in use, demographics and behavior in Second Life today than there was, say, at the end of 2003," he said.
Kingdon echoed this assessment. "I think the world has gotten its head around the fact that virtual worlds are here to stay," Kingdon said. "There is a very compelling set of activities that virtual worlds are incredibly powerful for. They erase geographies, they allow for a type of interaction that you can't get in the real world and they bring with them really interesting economic and business opportunities for users."
Kingdon pointed to several localization projects for countries in Europe, Asia, and South America, and cited in-world training and remote meetings as compelling activities for companies. Both he and Rosedale portrayed Second Life as a competitor to enterprise video conferencing, which they believe is unable to match Second Life's ability to make people feel comfortable interacting with other remote users.
As for competing virtual worlds, Kingdon said he and his team tried to keep abreast of trends, but declined to name any current competitors.
Discussing Google's closure of Lively last year, Kingdon said it was a "natural" outcome, considering Google's focus and the state of the economy. "I don't think that Lively's departure is an invalidation of the market. I think, it's just recognition that, yeah, there is promise [and] a lot of hard work," Kingdon explained. "Google made the right decision and said, 'We need to kind of stick to our knitting in this economic downturn, in this climate, and focus our resources on some of our core properties,' which is quite natural."
Rosedale said There.com had some "unique" aspects and had effectively targeted certain vertical markets, but called it "substantially less interesting" in terms of the content that users can create. "The demographic is tighter, narrower, less diverse," he stated.
Linden Lab's CEO said that despite the recession, the company remained profitable. "We have not felt the same in-world economic turmoil that the real world has faced," Kingdon said, noting that Second Life was an affordable entertainment alternative to activities such as going to a movie. "Dollar for dollar, it's high-value entertainment for the casual user," he said.
On the enterprise side, he and Rosedale described uptime improvements and new products, including a hosted service and a behind-the-firewall service nicknamed "Nebraska" aimed at companies with stronger security needs.
However, the Linden executives said that a lot of work remained to be done in terms of making the service easier to use. Rosedale singled out search, the user interface and new user orientation as needing major improvements. "We need to collapse the orientation experience on learning the interface down to a 30-minute timeframe," he declared. "We're not there yet."
Rosedale went on to describe the current interface as "overwhelming."
He said, "the basic UI of the software also needs to change. It has too many pixels," referring to the buttons, numbers, and other data presented to users on the screen. "They're all kind of demanding your attention -- your [Linden] dollar balance, your inventory window, all the buttons on the bottom bar, chat and text that are visible in the window, that's asking something of you, blue pop-ups that are coming up."
Rosedale said that while the work required to make the interface less complex was significant, it would have a huge impact on the adoption rate of virtual worlds. Currently, only 15% of the people who try out Second Life continue to use the virtual world. "I'd like to triple that number," he stated.
Nevertheless, progress has been made in terms of making the technology more appealing to new users. Kingdon described how the old Second Life registration process -- a seven-page form which he likened to a mortgage application -- had been streamlined. "We've very substantially shortened the registration flow," he said. "We shortened it in July to one page and saw a very substantial increase in registration completions."
Kingdon added that the company had also implemented better e-mail management techniques to increase activations and was paying attention to SEO, in order to help users get to helpful Second Life resources via Web search engines.

The Web 2.0 'Conversation' Is Really a Shouting Match

Web 2.0 wonks like to gush about how the Internet these days is all about "joining the conversation." Lately, though, it's been more like a shouting match.
Today's example: The fall of Michael Arrington.
As Rodrigues & Urlocker (the Captain & Tennille of InfoWorld bloggers) have also noted, Michael Arrington is hanging up his keds at TechCrunch -- at least through the month of February, if not longer. The reason? He's sick of all the haters.
In his most recent blog post, Arrington said death threats he'd received last summer, coupled with being spat upon at a conference a few days ago, were key factors in his decision. Hey, nobody should have to endure stuff like that just for expressing their opinions, whether you agree with them or not.
But in true Arrington form, he couldn't just leave it at that. In an interview with the Wall Street Journal, he implicitly blamed popular blog sites Valleywag (now part of Gawker) and AllThingsD -- led by poppa bear Walt Mossberg of the Wall Street Journal and BoomTown's Kara Swisher -- for daring to question his ethics and (thus) inciting the haters. Quoth Mr. TechCrunch:
"Whoever is the top blog will get attacked by everyone else and that'll just be the way it is," Mr. Arrington said. "We really need to think about, the community of bloggers, if we're going to continue to slay our own for competitive reasons."
Apply your own cliche here -- glass houses, stones, heat, kitchen, pot, kettle, black, etc. Just about any of them work. And who exactly anointed TechCrunch "top blog"? I must have missed that press release.
Still, you have to give TechCrunch its due. In a few short years, it's grown from one guy spouting his opinions on startups to one of the most popular (and feared) news sites on the Net. It has broken some real stories -- like Google's acquisition of YouTube -- before the mainstream media had even heard of "viral video." Its reach is impressive. Even the Washington Post deigned to syndicate TechCrunch.
On the other hand, there's still too much of one guy spouting his opinions masquerading as real journalism for my taste. The site seems willing to publish any rumor, which means it's wrong a lot of the time. It's become a running joke that every week or so TechCrunch will post a story saying Google is going to acquire some company, and when it doesn't happen, post a second story saying they walked away from the deal. Rightly or wrongly, questions about Arrington's relationship to the companies he writes about continue to dog him.
(Note: This blog is sometimes guilty of much of the above. But then, I make no claims to be a news source. We're all snark all the time here in Cringeville.)
My real point is, Arrington's right: It has gotten nastier out there. Maybe it's always been this way, and the flame wars that used to be confined to alt.geek.whatever on Usenet have now exploded across the Net.
I see it here in the comments to this blog. All I need to do is pick the right topic -- anti- or pro-Microsoft, Apple, Linux; Scientology vs "Anonymous"; science vs faith; and anything that touches on politics -- and the anonymous posters come out with guns blazing. It's like pushing the flame button; it's automatic.
So far, the worst thing that's happened is people unsubscribing from the e-mail newsletter -- no death threats or spittle yet. But it seems like it's only a matter of time before that kind of thing starts happening to more of us.
Has the Net gotten nastier? Can "love keep us together"? E-mail me direct: cringe@infoworld.com.

AMD Set to Release DDR3-Capable Processors

Advanced Micro Devices will soon introduce processors that are capable of supporting DDR3 memory, earlier than the company had anticipated.
The company in the next few weeks will launch new processors targeted at desktops that will include DDR3-capable memory controllers, said John Taylor, an AMD spokesman.
Taylor declined to comment on specific processors being launched, though a leaked road map suggests the launch of new Phenom II and triple-core processors.
The support for DDR3 memory comes earlier than anticipated. Late last year the company said it aimed to add DDR3-capable Phenom II processors by the middle of 2009, but could push that up depending on factors including pricing of the memory.
Compared to current DDR2-capable processors, the new DDR3-capable chips will allow information from the memory to be communicated to a CPU faster, which translates to better PC performance. To run DDR3-capable processors, the company will introduce the AM3 socket for motherboards.
"The people who want the latest and greatest will want to use DDR3 memory," Taylor said.
AMD is switching to DDR3 memory to make its CPUs faster so that it can compete effectively with Intel in the high-end PC and server markets, said Dean McCarron, president of Mercury Research, a market analysis firm.
"When we make changes in PC architecture, it is because it's either faster or cheaper," said McCarron. For AMD, the decision was technical rather than financial, but the enhanced competitiveness could yield a financial benefit to AMD in the long run, McCarron said.
Intel's Core i7 processor for gaming systems, launched in November, already supports DDR3 memory. Intel is also adding DDR3 support to chips for portable products like laptops.
However, because AMD's products typically hold a price advantage over Intel's, the company's price-sensitive buyers may initially balk at the high prices of DDR3 memory modules, McCarron said. As of early January, a 1GB DDR3 memory module running at 1333MHz was priced at $35, versus $12 to $14 for a 1GB DDR2 module.
"This is completely normal for technology. As the volume ramps [DDR3 memory prices] will come down," McCarron said.
Motherboard companies like Asus have already announced AM3-compatible motherboards, setting the stage for AMD to launch its new DDR3-capable processors, which could include new Phenom II chips. The new CPUs will include a memory controller that supports both DDR2 and DDR3, allowing them to work with older motherboards that use DDR2 memory.
AMD earlier this year launched new quad-core Phenom II processors, which the company called its "highest-performing" CPUs to date. Aimed at high-end desktop PCs, the chips ran at speeds of up to 3GHz and included 8MB of cache.
However, the Phenom II chips are capable of even faster clock speeds under certain circumstances. For example, the processors have been overclocked to run at speeds of up to 6.5GHz on liquid-cooled systems and up to 4GHz on air-cooled systems.
AMD remains on track to transition to DDR3 memory support for servers with the Maranello platform in 2010, Taylor said. The Maranello platform includes the six-core Sao Paulo and 12-core Magny-Cours chips.

Friday, January 23, 2009

Windows 7 Security Features Get Tough

Two years after Windows Vista debuted, many companies have yet to upgrade. And in many instances their reluctance to migrate to Vista stemmed from concern about security.
Microsoft has responded with its latest operating system, Windows 7, currently in public beta and expected to ship later this year. In Windows 7, new security features have been added, popular features expanded, and familiar features enhanced. Here's a look at a dozen or so security improvements that we expect will convince even the most recalcitrant corporate clients to upgrade.
Improved Migration Tools
Microsoft says that Windows 7 will be faster and easier to roll out across an enterprise than previous OS migrations were. Much of the credit for the anticipated improvement goes to new tools such as Dynamic Driver Provisioning, Multicast Multiple Stream Transfer, and Virtual Desktop Infrastructure.
With Dynamic Driver Provisioning, drivers are stored centrally, separate from images. IT professionals can arrange for installation by individual BIOS sets or by the Plug and Play IDs of a PC's hardware. Microsoft says that reducing the number of unnecessary drivers installed will help avoid potential conflicts and will accelerate installation. With Windows 7, as with Windows Vista, IT professionals can update system images offline, and even maintain a library of images that includes different drivers, packages, features, and software updates.
Rolling out any particular image across the entire network--or even installing individual images on desktops--is faster in Windows 7, thanks to the new Multicast Multiple Stream Transfer feature. Instead of individually connecting to each client, deployment servers "broadcast" the images across the network to multiple clients simultaneously.
Virtual Desktop Infrastructure (VDI), another desktop deployment model, allows users to access their desktops remotely, thereby centralizing data, applications, and operating systems. VDI supports Windows Aero, Windows Media Player 11 video, multiple-monitor configurations, and microphone support for voice over IP (VoIP) and speech recognition. New Easy Print technology permits VDI users to print to local printers. But use of VDI requires a special license from Microsoft, and doesn't offer the full functionality of an installed operating system.
Protecting Corporate Assets
Once the OS is installed, organizations may protect their assets with authentication for log-in. Windows Vista included drivers for fingerprint scanners, and Windows 7 makes such devices easier for IT professionals and end-users to set up, configure, and manage. Windows 7 extends the smart card support offered in Windows Vista by automatically installing the drivers required to support smart cards and smart card readers, without administrative permission.
IT professionals may further protect the contents of their Windows 7 volumes with BitLocker, Microsoft's whole-disk encryption system. Windows Vista users have to repartition their hard drive to create the required hidden boot partition, but Windows 7 creates that partition automatically when BitLocker is enabled. In Windows Vista, IT professionals must use a unique recovery key for each protected volume. But Windows 7 extends the Data Recovery Agent (DRA) to include all encrypted volumes; as a result, only one encryption key is needed on any BitLocker-encrypted Windows machine.
BitLocker To Go is a new feature that lets users share BitLocker-protected files with users running Windows Vista and Windows XP. The BitLocker To Go desktop reader provides simple, read-only access to the protected files on non-BitLocker-protected systems. To unlock the protected files, the user must provide the appropriate password (or smart-card credentials).
Application Control
Windows 7 also introduces AppLocker, an enhancement to Group Policy settings that lets organizations specify which versions of which applications users have permission to run. For example, a rule might allow users to install Adobe Acrobat Reader version 9.0 or later, but it might block them from installing legacy versions without specific authorization. AppLocker contains a rule-generation wizard to make the process of creating policies much easier, and it includes automatic rule making for building a custom white list.
System Restore, first introduced in Windows ME, gets a much-needed update in Windows 7. First, System Restore displays a list of the specific files that will be removed or added at each restore point. Second, restore points are now available in backups, giving IT professionals and others a wider range of options over a longer period of time.
The Action Center is a new, integrated Control Panel feature that gives Windows 7 users a central spot for locating tasks and common notifications under a single icon. The Action Center includes alerts and configuration settings for several existing features, including the Security Center; Problem Reports and Solutions; Windows Defender; Windows Update; Diagnostics; Network Access Protection; Backup and Restore; Recovery; and User Account Control. Pop-up alerts are gone in Windows 7, replaced by a new task tray icon (a flag with an X) that provides streamlined access either to the problem directly or to the Action Center for more information.
Perhaps the most famous and most annoying form of Windows Vista notification comes from the User Account Control (UAC) feature, which flashes administrative warnings whenever you need to configure a system setting. In Vista the choices are stark: Endure the messages, or turn off UAC. In Windows 7, you have additional options. A slider bar configures the appropriate notification level for your computer, and by default UAC will notify you only when programs try to make changes to your PC.
Better Performance
Windows Defender, Microsoft's antispyware product, gains a much-needed performance enhancement in Windows 7. But Microsoft has removed the Software Explorer tool, asserting that the utility doesn't affect spyware detection or removal. That might be true, but Software Explorer let you see which programs and processes were running, including ones that you might not know about or want. Perhaps Microsoft will reverse this decision by the final build.
Another new feature of Windows 7 is the Windows Filtering Platform (WFP), a group of APIs and system services that allow third-party vendors to tap further into Windows' native firewall resources, thereby improving system performance. Microsoft stresses that WFP is a development platform and not a firewall in itself, but WFP does address a few of Windows Vista's firewall problems.
In Vista, Microsoft introduced the concept of profiles for different types of network connections -- home, work, public, and domain. This, however, hampered corporate IT professionals whenever a remote user accessed the corporate VPN, because the firewall was already set as either "home" or "public," and corporate network settings could not be applied afterward. Windows 7, and WFP in particular, permits multiple active firewall policies, so IT professionals can maintain a single set of rules for remote clients and for clients that are physically connected to their networks. Windows 7 also supports Domain Name System Security Extensions (DNSSEC), newly established protocols that give organizations greater confidence that DNS records are not being spoofed.
Features for Mobile Users
Windows 7 has two enhancements designed for mobile users. With DirectAccess, mobile workers can connect to their corporate network any time they have Internet access--without needing a VPN. DirectAccess updates Group Policy settings and distributes software updates whenever the mobile computer has Internet connectivity, whether the user is logged on to a corporate network or not. This ensures that mobile users stay up-to-date with company policies. And with BranchCache, a copy of data accessed from an intranet Web site or from a file server is cached locally within the branch office. Remote users can use BranchCache to access shared data rather than using a connection back to headquarters.
Windows 7 also makes enhancements to event auditing. Regulatory and business requirements are easier to fulfill through management of audit configurations, monitoring of changes made by specific people or groups, and more-granular reporting. For example, Windows 7 reports why someone was granted or denied access to specific information.

Study: Spam Is Getting More Malicious

Spam, especially junk e-mails with malicious links or attachments, continues to be a huge IT headache. Spammers are also getting more creative in their attempts to find victims, utilizing popular sites such as Facebook and Twitter, according to a report from UK-based security firm Sophos this week.
The security firm published its latest spam trend report and said new figures reveal that spam is still causing problems for computer users. In the fourth quarter of 2008, Sophos found that one in every 256 e-mails contained a dangerous attachment in October. In November, that figure improved to one in 384. December saw a huge decline: just one in every 2,000 e-mails contained a dangerous attachment. Graham Cluley, senior technology consultant at Sophos, said it is possible the drop-off may be related to the shutdown of McColo Corp., a Web-hosting firm that security experts believe was responsible for three-quarters of the world's spam.
"It's hard to say exactly what can be causing this," said Cluley. "Certainly that is possible."
Numbers for January have not been assessed yet and Cluley said it is too early to determine if the drop off in spam levels has continued, or if spam is now back at levels seen in earlier months. What is clear, said Cluley, is that more spam is malicious in nature now and often designed to infect users' computers via sophisticated malware attachments or a link to malicious or infected websites, in order to steal sensitive information. Cluley also said social networking venues, such as Facebook and Twitter, are now the hot targets for spammers.
"Spammers really took to using sites like Facebook and Twitter as a vehicle for their spam antics during the last three months of 2008," he said. "Cybercriminals have cottoned onto the fact that social networking users can be more easily fooled into clicking on a link that appears to have come from a trusted Facebook friend, than if it arrived as an unsolicited email in their inbox. The notorious Nigerian 419 scammers have even evolved, masquerading as Facebook friends in order to trick unwary users into parting with valuable sensitive and financial information. Ultimately, while users are still falling for these scams, the fraudsters will continue. And while the authorities are making great progress, everyone must take steps to ensure they don't fall victim."
Death to Spam?
The report also referenced a 2004 prediction by Bill Gates that spam would be a thing of the past within two years.
"The rumors of spam's death have been greatly exaggerated over the years the threat remains alive and kicking despite increased legal action against spammers, the occasional takedown of Internet companies which assist the cybercriminals, and constantly improving anti-spam software," said Cluley. "Many IT professionals cast doubt on Bill Gates' assertion back in 2004, deeming the timeframe of his pledge to be unrealistic. Although the latest stats show that the proportion of spam relayed per country may have decreased year-on-year, spammers have turned to more creative, not to mention devious, methods to ensure their messages reach as many unsuspecting computer users as possible."
And the Spam King Crown Goes to...
Between October and December 2008, the United States was responsible for most of the world's spam, according to Sophos. China was in the second spot and Russia was third. Sophos officials pointed to Canada, Japan and France as countries that have made progress in spam prevention. All three, considered "serial offenders" five years ago, are no longer present in the list of spam reprobates.
"Although there's no denying that some countries have significantly reduced their contribution to the spam epidemic over the past five years, the United States still holds the crown," said Cluley. "Though its spam contribution has significantly decreased since Bill Gates' proclamation, falling from almost half of all spam relayed at the end of 2004, to 21.3 percent by the end of 2007, and now resting at 19.8 percent, this shows there's certainly no quick fix."

Mac BitTorrent Users Warned of Trojan

Mac users ill-advised enough to search for pirated copies of Apple's iWork 09 software could find themselves on the wrong end of an unpleasant and crafty new Trojan.
According to Mac security software company Intego, which has put out the alert, OSX.Trojan.iServices.A lets users install a fully working copy of iWork 09 as normal, but only at the price of letting malware burrow into OS X using a rogue install add-on.
As with a lot of PC Trojans, the immediate purpose of the software is simply to compromise the OS X system comprehensively enough to allow further malware to be downloaded from a remote host -- another way of saying that users could be opening themselves up to more or less anything the Trojan's authors fancy putting on the system.
According to Intego, this is no theoretical infection that will affect only a handful of people: the installer has been downloaded at least 20,000 times via the BitTorrent file-sharing system in recent days.
Apple's iWork software, which would normally set the user back around £70 ($79), is an all-purpose program that includes document, spreadsheet and presentation features. The latest version, which is being used as a lure on BitTorrent distribution sites, was released only weeks ago and will be sought after by users of pirated software.
The company stands to benefit from the alert, of course -- it is currently virtually the only Mac-only security software outfit. The big-name anti-virus products sold for Macs tend to be spin-offs from much more lucrative PC protection programs.
"Intego VirusBarrier X4 and X5 with virus definitions dated January 22, 2009 or later protect against this Trojan horse," the company said in a release.
The number of Trojans affecting Mac users is on a modest upward curve, helped -- the company said during a recent and separate Mac Trojan outbreak -- by the tendency of some Apple users to see their computers as above Windows-like security woes. The number of companies that actively track malware targeting Apple users is also proportionately smaller than for Windows PCs.

RIAA Seeks to Block Streaming Video of Piracy Case

A federal appeals court in Boston has agreed to hear a motion filed by the Recording Industry Association of America (RIAA) that seeks to prevent courtroom proceedings in a music piracy case from being streamed live on the Internet.
In an order issued Wednesday, a three-judge panel at the First Circuit Court of Appeals noted the public interest in the piracy case and the "substantial and novel questions" raised by the prospect of live streaming. The judges called on both the RIAA and the defense to submit legal arguments on the streaming issue in an expedited fashion. The court will then decide if any oral arguments are needed to "facilitate the decisional process," the order noted.
The RIAA is asking the appeals court to overturn a Jan. 14 ruling by U.S. District Judge Nancy Gertner that authorized live video streaming of an upcoming hearing in the piracy case, which involves several music labels and a 25-year-old Boston University doctoral student named Joel Tenenbaum.
Tenenbaum is accused by the RIAA of illegally downloading and distributing hundreds of songs over a peer-to-peer network, although the trade group's lawsuit only lists seven of the songs. The lawsuit was filed in August 2007, but it shot into the public limelight last fall, when Harvard University law professor Charles Nesson began representing Tenenbaum in the case.
In October, Nesson filed a counterclaim challenging both the constitutionality of the federal Digital Theft Deterrence and Copyright Damages Improvement Act and the RIAA's attempted use of that law against Tenenbaum. Nesson also challenged the appropriateness of the massive fines - ranging from $750 to $150,000 per willful infringement - that are available to the music labels under the statute.
Gertner's live-streaming ruling related to a hearing on the counterclaim that originally was scheduled for Thursday in U.S. District Court in Boston. However, the hearing was postponed until Feb. 24 after the RIAA said that it planned to appeal the judge's ruling, which authorized the Courtroom View Network to send a live video of the proceedings to Harvard University's Berkman Center for Internet & Society. The center, in turn, would make the video stream available to the public on its Web site.
Nesson asked Gertner to allow the hearing to be streamed, claiming it would allow a broader Internet audience "to see what's at stake and just how out of proportion the [RIAA's] response is to the supposed infraction."
But in its appeal of the streaming order, the RIAA called Gertner's decision "wrong on its face" and "troubling in its application." The trade group said the fact that the video stream would be distributed by the Berkman Center, which Nesson co-founded, was unfair and prejudicial to the music labels. Such an arrangement "undermines basic principles of fairness and is flatly inconsistent with the public interest," the RIAA said in its motion to the appeals court.
The trade group also took issue with the whole idea of streaming a court hearing on the Internet. "Unlike a trial transcript, the broadcast of a court proceeding through the Internet will take on a life of its own in that forum," the RIAA said. It added that although Gertner's ruling mandates gavel to gavel coverage of the hearing, the feed could easily be edited, "spliced together" and re-broadcast in such a way that comments and legal arguments would be "taken out of context."
Gertner also erred in authorizing the broadcast because it was in violation of the district court's own local rules, which forbid such broadcasts, the RIAA claimed. At a minimum, the trade group said, Gertner should have consulted with the full panel of judges at the district court before authorizing the video transmission.
In a statement, Nesson questioned the RIAA's appeal and said the Berkman Center was "working hard" to ensure that it didn't remain the exclusive distributor of the courtroom video feed. "We welcome the RIAA's help in finding additional Web sites through which the proceedings can be viewed," he said.

Friday, January 16, 2009

A World Without Steve

The news that Steve Jobs is stepping down as Apple CEO, however temporarily, gives an opportunity to think about what life would be like without Jobs.
No Steve in that Los Altos garage in 1976, no Apple computer. Wozniak alone couldn't have pulled it off.
If Jobs had not come back to rescue Apple in 1997, I sincerely believe there wouldn't be an Apple today. Its hardware operations would have been acquired and slowly subsumed by someone like Sun. The Mac OS would live on in cheap desktops, if it survived at all. More likely, though, is that it would have been squashed.
No iPods. We'd be forced to use whatever music player Creative or Microsoft came up with -- and it wouldn't be nearly as nimble as the Zune, which does its best to ape the iPod.
No iTunes. The record labels would eventually come up with some onerous, kludgy, unworkable system for buying music online (or they'd hire Microsoft to do it) that would drive even more people onto P2P networks. (Think Sony Connect.) And the TV and movie studios still wouldn't be close to distributing their wares over the wires.
No iPhone, certainly, and no iPhone wannabes flooding the market like there are now. It would be BlackBerrys, Windows Mobile, or bust.
I like to give Jobs as hard a time as the next guy -- OK, maybe the next 700 guys -- but I'm not ready for him to go. Despite the bluster, the arrogance, the Rushmore-sized ego, and the awful fawning that follows him from his army of spittle-licking lackeys, Jobs is a one-of-a-kind corporate leader who will be sorely missed. He gets what people want in a way no other tech CEOs do.
So I'm rooting for him to make yet another triumphant return in six months. Because with Gates gone, Ellison out of the public eye, and Jobs out of commission, who would be left for me to make fun of?
What do you think? Can Apple soldier on without Steve? E-mail me direct: cringe@infoworld.com.

Microsoft Sued Over Unified Communications Deal

Microsoft has been sued by a small Wisconsin business for allegedly misrepresenting the capabilities of its Live Communications Server product, selling the company more licenses than it needed and not providing a refund or other products to solve its original problem.
Imagineering International filed its lawsuit in December in the Fond du Lac County circuit court in Wisconsin, accusing Microsoft of breach of contract and breach of warranties, among other offenses.
Imagineering claims Microsoft failed to resolve problems the company had with deploying an enterprise version of Live Communications Server, then did not replace the product with a revamped version, Office Communications Server (OCS), as Microsoft had promised.
Microsoft also never provided Imagineering with a refund for the products and licenses it purchased, after requiring Imagineering to destroy its licenses and the software as a condition of receiving a credit toward OCS, said Jeff MacMillan, president and CEO of Imagineering.
Imagineering, a 23-person IT consulting firm and reseller, had been a Microsoft partner for about 10 years at the time it purchased the products and licenses, he said. The company has since terminated its partnership with Microsoft.
Rather than responding in the same court, Microsoft filed papers Wednesday with the U.S. District Court for the Eastern District of Wisconsin in Milwaukee to move the case from the county court to the federal court, citing Imagineering's request for damages that exceed US$50,000 as one reason.
Cases heard in federal courts also tend to take longer to be resolved, and plaintiffs can lose some of their claims in summary judgment, said Michael Kuborn, an attorney representing Imagineering from the Curtis Law Office in Oshkosh, Wisconsin.
A lawyer representing Microsoft did not respond to a phone call requesting comment. A Microsoft spokesperson said via e-mail Friday that Microsoft is reviewing the allegations and will make its response in court.
Imagineering alleges in its complaint that on Oct. 7, 2005, it purchased Microsoft's LCS software, 1,500 Client Access Licenses and 1,500 External Connector Licenses for a total of $70,776. At the time LCS was Microsoft's software for providing a unified communications system, which links a company's voicemail, telephone system, e-mail and other employee communications services on the same software infrastructure.
MacMillan said Friday that Microsoft representatives had informed him that LCS had the capabilities his company needed to create a unified communications platform out of its disparate systems for telephony, voicemail, fax and e-mail. Microsoft also said it would provide remote desktop capability, which was key to Imagineering's deployment, he said.
Imagineering purchased the product and licenses mainly for an in-house deployment, but if that proved successful, the company planned to sell a similar offering to customers, MacMillan said.
The number of licenses his company needed to purchase seemed high for a company with only 23 employees. However, Microsoft employees brokering the deal said Imagineering would need licenses not only for its own employees using the new system but also for any customers who wanted to access it.
After Imagineering secured the product from Microsoft, it had trouble with the deployment, and so in October 2005 it contacted Microsoft technical support, MacMillan said. "They determined we were given bad presale information and that the product would not work the way we had been told it would," he said.
Microsoft also informed Imagineering that it did not need licenses for its customers and had indeed purchased too many, he said.
The companies worked together to come up with a solution, which MacMillan said was to give Imagineering a credit equal to what it paid Microsoft to purchase the follow-up version of LCS, OCS, from Microsoft distributor TechData once that product was available. TechData also would provide Imagineering with the licenses it would need for its deployment, he said.
Microsoft released OCS in late 2007. Around that time, MacMillan said he contacted TechData about acquiring the product and the licenses, per the company's agreement with Microsoft. TechData informed him that it had no record of such a deal, he said.
MacMillan said he contacted Microsoft and again worked with it to try to resolve the situation. In February 2008 Microsoft informed Imagineering that it would give it "no more than $27,000" in credit to purchase additional hardware it would need to deploy OCS -- a more complex product than LCS -- as well as the license to deploy it, according to court documents and MacMillan. Imagineering was given seven days to accept or reject the offer, according to court documents.
At that point, MacMillan said, he was frustrated and disappointed at how Microsoft handled the situation.
"They've welched on every deal they put into place ... and then said, 'You paid $70,000, we'll give you $20,000, that will have to be good enough, you can accept it or you can reject it,'" he said. "We had to reject that."
Imagineering still has not successfully implemented a unified communications platform, MacMillan said, and does not have the money to do so. "The $70,000 we spent on this was basically what we had for the project," he said. "It's actually an extraordinary amount of money to us."
Imagineering is seeking a refund from Microsoft for the original amount it paid the company, punitive damages and attorney fees.

Conficker Worm Attack Getting Worse: Here's How to Protect Yourself

Millions of Windows computers have been infected by a new computer worm dubbed "Conficker." The situation is "not getting better," but rather is "getting worse," according to security software vendor F-Secure.
Read how you can protect your PC here.
In a blog post, F-Secure security researchers report that the number of machines infected by the Downadup worm has skyrocketed from roughly 2.4 million to over 8.9 million in the last four days alone.
Downadup is a malicious worm that "uses computer or network resources to make complete copies of itself," according to F-Secure. And it may also include code or other malware that damages both a computer and network. The worm also goes by the names "Kido" and "Conficker." Details on how it operates and how to remove it are here.
Once executed, Downadup disables a number of system services, including Windows Automatic Update, Windows Security Center, Windows Defender, and Windows Error Reporting. The worm then connects to a malicious server, where it downloads additional malware to install on the infected computer. Computerworld provides a more detailed report on Downadup's potential dangers.
Since Downadup uses random extension names to avoid detection, Windows users should make sure their security software is set to scan all files, rather than checking on specific extensions, F-Secure recommends.
The alarmingly high number of Downadup infections led Microsoft last Tuesday to enable its anti-malware utility, the Malicious Software Removal Tool (MSRT), to detect the worm. So it's important that Windows users, if they haven't already, download the latest Microsoft security patch that went out earlier this week.

Protecting Against the Rampant Conficker Worm

Businesses worldwide are under attack from a highly infectious computer worm that has infected almost 9 million PCs, according to antivirus company F-Secure.
That number has more than tripled over the last four days alone, says F-Secure, leaping from 2.4 million to 8.9 million infected PCs. Once a machine is infected, the worm can download and install additional malware from attacker-controlled Web sites, according to the company. Since that could mean anything from a password stealer to remote control software, a Conficker-infected PC is essentially under the complete control of the attackers.
According to the Internet Storm Center, which tracks virus infections and Internet attacks, Conficker can spread in three ways.
First, it attacks a vulnerability in the Microsoft Server service. Computers without the October patch can be remotely attacked and taken over.
Second, Conficker can attempt to guess or 'brute force' Administrator passwords used by local networks and spread through network shares.
And third, the worm infects removable devices and network shares with an autorun file that executes as soon as a USB drive or other infected device is connected to a victim PC.
Conficker and other worms are typically of most concern to businesses that don't regularly update the desktops and servers in their networks. Once one computer in a network is infected, it often has ready access to other vulnerable computers in that network and can spread rapidly.
Home computers, on the other hand, are usually protected by a firewall and are less at risk. However, a home network can suffer as well. For example, a laptop might pick up the worm from a company network and launch attacks at home.
The most critical and obvious protection is to make sure the Microsoft patch is applied. Network administrators can also use a blocklist provided by F-Secure to try to stop the worm's attempts to connect to Web sites.
And finally, you can disable Autorun so that a PC won't suffer an automatic attack from an infected USB drive or other removable media when it's connected. The Internet Storm Center links to one method for doing so at http://nick.brown.free.fr/blog/2007/10/memory-stick-worms.html, but the instructions involve changing the Windows registry and should only be attempted by administrators or tech experts. Comments under those instructions also list other potential methods for disabling Autorun.
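For administrators comfortable with registry edits, here is a minimal sketch of the two defenses described above: checking that the October patch (MS08-067, shipped as KB958644) is installed, and disabling Autorun machine-wide. The NoDriveTypeAutoRun value used here is a commonly cited alternative to the method linked above, not the Internet Storm Center's exact recipe; treat the script as illustrative and test it before deploying it.
    # Illustrative sketch: check for the MS08-067 patch (KB958644) and disable
    # Autorun for all drive types by setting NoDriveTypeAutoRun to 0xFF.
    # Run as an administrator on Windows with Python 3.
    import subprocess
    import winreg

    # 1. Look for the October patch among the installed hotfixes.
    hotfixes = subprocess.check_output(
        ["wmic", "qfe", "get", "HotFixID"], universal_newlines=True)
    if "KB958644" in hotfixes:
        print("MS08-067 patch (KB958644) appears to be installed.")
    else:
        print("WARNING: KB958644 not found -- apply the October patch first.")

    # 2. Disable Autorun for every drive type (0xFF), machine-wide.
    EXPLORER_POLICY = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\Explorer"
    key = winreg.CreateKeyEx(
        winreg.HKEY_LOCAL_MACHINE, EXPLORER_POLICY, 0, winreg.KEY_SET_VALUE)
    winreg.SetValueEx(key, "NoDriveTypeAutoRun", 0, winreg.REG_DWORD, 0xFF)
    winreg.CloseKey(key)
    print("NoDriveTypeAutoRun set to 0xFF (Autorun disabled for all drives).")
The registry change may not take effect until users log off and back on, and home users who would rather not touch the registry are still covered by the first step: apply the patch and keep security software up to date.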

Steve Jobs and Apple: A Reality Check

Since Apple announced that Steve Jobs would be taking medical leave until June, I've seen rampant speculation on Steve Jobs' health, and I've seen some wonder if Apple would be able to survive without Jobs. It's all quite understandable; after all, Steve Jobs is by far the biggest figure in the company. But after reading a post by Brian Lam (warning: Lam uses lots of profanity) on Gizmodo where he becomes completely unglued when responding to criticism he and Gizmodo received regarding their coverage of Jobs' health, I really think it's time for us in tech circles to take a step back and get a reality check of the situation.
Apple is not a one-man company.
Ever since Jobs took the reins of Apple in 1997 he has surrounded himself with a team of executives who reflect his vision for the company. Apple's chief designer Jonathan Ive is recognized as one of the top product designers around. Apple's Chief Operating Officer Tim Cook is a master of logistics. The point is, Apple has plenty of talent at its disposal to weather the storm and continue innovating.
There is only one Steve Jobs. There isn't a person on Earth who could actually replace him. That said, in Jobs' absence, Apple should be fine. There's enough of a brain trust there to carry the company. When Jobs does decide to retire (for health reasons or otherwise), a "What would Steve do?" approach won't cut it, of course; Apple would be a different company without Jobs guiding it, but it wouldn't die.
In fact, since its inception, Apple has been a company that fosters innovation. Apple innovated without Jobs, but what it lacked was direction. Jobs provided direction and focus to a company that had none in 1997. As long as Apple maintains its zeal for taking risks and pushing the envelope, and can stay focused, it should be fine with or without Jobs.
Speculating on Jobs's health is a waste of time and energy.
Let's all be honest: At this point, the only ones who likely know about the true nature of Jobs's condition are Jobs and his family, his doctors, and Apple's board and executives. Everything else at this point is essentially hearsay. Fellow journalists may have their insider sources, but with sources contradicting each other, I'd say there isn't a whole lot of stock we can put into them right now.
Jobs is a human being.
I think it's important to keep this in mind: Jobs is a person just like you and me. And while there should be a reasonable expectation for the CEO of a large corporation like Apple to be forthright about any matters -- health or otherwise -- that would prevent him or her from effectively running the company, beyond that, Jobs's health should not be the subject of public discussion and speculation that it has become. All we know -- and all we need to know as of right now -- is this: Jobs has some health issues to deal with, he had to take a leave of absence, and he hopes to return to Apple in June. If anything changes, the public should know, but until then, let's give Steve Jobs our best wishes that he'll be back sooner rather than later, and let's let him be.

Advocates Disagree on Broadband Stimulus

The U.S. government should provide money for broadband deployment in an upcoming stimulus package, several groups said Friday, but the broadband advocates couldn't agree on how the money should be spent.
A couple of speakers at a broadband stimulus forum Friday called for the government to give grants to broadband providers to roll out service to unserved or underserved areas. Another speaker called for tax credits, saying a grant program would take months to set up.
Another speaker suggested that none of the broadband money in the US$825 billion stimulus package being pushed by President-elect Barack Obama should go to large incumbent telecom and cable companies that now provide a huge majority of the broadband connections in the U.S.
"I'm fundamentally opposed to taxpayer dollars going to absentee-owned networks, unless there's no hope of a local, community-based network emerging," said Wally Bowen, executive director of the Mountain Area Information Network, a nonprofit broadband provider in western North Carolina. "Local networks are going to create local jobs; they're not going to outsource their tech support to India."
Bowen and other advocates of government broadband spending spoke at a New America Foundation event the day after the U.S. House of Representatives Appropriations Committee recommended $6 billion in broadband deployment spending as part of the larger economic stimulus package. The House version of the stimulus package bill would include $2.8 billion for the Rural Utilities Service (RUS) of the U.S. Department of Agriculture to give out as grants and loans to broadband providers.
In addition, the House bill would give another $2.8 billion to the U.S. National Telecommunications and Information Administration (NTIA) for broadband deployment grants, with $1 billion of that money going to wireless broadband projects. An additional $350 million would go toward a national program to map areas that don't have broadband.
About 25 percent of the NTIA broadband grants would go to areas without broadband, and 75 percent to areas with limited broadband options, according to the bill. The money going to unserved areas would focus on providing basic broadband service of more than 5Mbps of downstream speed for wired broadband or basic wireless broadband.
In order to qualify for the 75 percent of the money going to underserved areas, a wired broadband provider would have to deploy service offering 45Mbps downstream speed, and a wireless broadband provider would have to provide 3Mbps of downstream speed.
Both the NTIA and RUS money would require providers to abide by net neutrality rules, which would prohibit them from blocking or slowing any type of legal Web content.
But the speed and net neutrality rules could limit the number of broadband providers that apply for the grants and loans, said Rob Atkinson, president of the Information Technology and Innovation Foundation (ITIF), a Washington, D.C., think tank. Broadband providers such as AT&T and Qwest aren't currently set up to deliver 45Mbps, he said at the New America Foundation event.
"The more public-interest requirements you put on these networks, the less investment you will get," Atkinson added. "I would not be surprised, by the end of 2009, we could see very little investment that will come out of the stimulus bill as currently structured."
Atkinson also called for tax credits to be part of the package, in addition to grants. Grant programs could take months to set up, while tax credits could kick in immediately, he said. ITIF's broadband proposal would not disqualify large broadband providers from getting stimulus money.
But other panelists said tax credits are difficult to track, and it would be difficult for government auditors to guarantee that tax breaks go directly to broadband deployment in new areas. In addition, speed requirements will be necessary to ensure that the U.S. doesn't have to pay for a new broadband deployment in a few years, said Benjamin Lennett, a senior program associate at the Wireless Future Program at the New America Foundation.
Rural areas, which will use broadband for things such as telemedicine and distance learning, may need higher speeds than many urban and suburban users, he added.
"Do we want to give rural areas ... inferior networks? Do we want to give them second-best networks?" Lennett said. "You're going to need fast speeds and lots of capacity. There's no way we can give a band-aid here and a band-aid there to get them to 5 or 1 megabit. That's just not going to cut it, two, three, five, 10 years down the road."
But Derek Turner, research director at media reform group Free Press, said the net neutrality requirements are even more important than speed requirements. "We don't want to be giving federal dollars to fund networks that are closed and discriminatory," he said.
No large broadband providers were represented at the forum, and some panelists didn't hold back criticism. Like Bowen, Mark Cooper, research director of the Consumer Federation of America, called for community-based broadband projects, instead of money going to the large broadband providers. Broadband stimulus money should go toward wireless broadband projects based in the communities they serve, he said.
"The cozy duopoly of telecos and cable companies has failed to deliver," he said.

Wednesday, January 14, 2009

Privacy Groups File Mobile Marketing Complaint With FTC

Two privacy groups on Tuesday asked the U.S. Federal Trade Commission to regulate how mobile marketers can use consumers' personal information, saying many people don't know when their information is being collected from cell phones and how it's being used.
The mobile industry responded that it already offers enough consumer protection through self-regulation, and one analyst said it is doing a good enough job that government intervention isn't necessary.
In their filing, the Center for Digital Democracy and the U.S. Public Interest Research Group asked the FTC to expand an existing inquiry into online interactive marketing to include mobile marketing.
They said the FTC should identify practices that compromise privacy and consumer welfare; examine opt-in procedures to make sure consumers are aware of what data they are giving up and how it will be used; investigate marketing tactics that target children and "multicultural communities," and create policies to halt abusive practices.
The filing acknowledges the industry's effort to police itself but says it does not go far enough. "Current self-regulatory privacy and marketing policies in the mobile arena are inadequate," the groups said. They also criticized the mobile advertising industry for creating its regulations without meaningful participation from consumers or consumer protection agencies like the FTC.
One of the major concerns is that mobile-phone customers don't know what they're agreeing to when they allow mobile operators to provide them targeted advertising, said Jeffrey Chester, executive director of the Center for Digital Democracy. Customers may not know that their personal data is being "retained, put in a profile and potentially shared" with other companies, Chester said.
The filing cites three mobile marketers -- Admob, Bango and Marchex -- that the privacy groups say collect information from mobile users without adequate notice.
While many companies ask customers for their approval to be part of marketing campaigns, those customers may not take the time to scroll through all the information on "tiny little screens," Chester said. "They don't know what they're getting into."
Mobile marketers and some other experts, however, argued that self-regulation is working. The industry has done a good job of protecting user privacy and probably doesn't need government regulations, said Greg Sterling, an analyst with Sterling Market Intelligence. Mobile operators have been cautious, "some argue overly cautious," about protecting consumer privacy, he said.
All of the operators in the U.S. are part of the Mobile Marketing Association, said Mike Wehrs, president and CEO of the group, which sets best practices for mobile marketers and investigates complaints from consumers. He defended the group's track record and said some of the allegations in the FTC filing are untrue. For example, consumers are welcome to offer input on how mobile marketing works, and anyone is invited to the MMA's open meetings, he said.
In addition, the MMA voluntarily briefs the FTC every six months, keeping it informed about changes in the industry and how the MMA is keeping pace, he said. If a government agency starts setting mobile advertising policy, it probably would struggle to make changes as quickly as an independent group like the MMA, he argued.
Wehrs and Sterling both contended that the advertising industry would not want to abuse the trust of consumers because their businesses depend on them. "This is a case where consumers' interests and the interests of the marketing company are relatively aligned, in the sense that if you get a lot of spam, people will be angry and the marketing company will be unsuccessful," Sterling said.
Google, whose mobile practices are referenced repeatedly in the FTC filing, said it is "keenly aware" of its responsibility to protect user privacy. "Whether it's for a desktop or for a mobile platform or device, we design products that give users meaningful choices about how they use our services and what information they provide to us, and let users know when products may collect personally identifiable information. ... We want to work with industry on developing best practices on privacy and we welcome all efforts to do that," the company said in a statement.
Still, given that the U.S. may be leaning toward a more regulatory environment, in the shadow of a financial crisis that many blame on a lack of regulation, the FTC will likely consider launching an investigation, Sterling said. He expects the FTC to at least hold hearings about mobile marketing, and potentially call for new guidelines. But he doesn't expect the FTC to create any sort of stringent regulatory system.

Bartz Wants 'Breathing Room' for Yahoo

Yahoo has been pushed around by outsiders and needs to chart its own course, incoming CEO Carol Bartz said Tuesday, though she hasn't yet determined what that course should be.
"More than anything, let's give this company some friggin' breathing room," Bartz told reporters during a conference call. "It's been too crazy, everybody on the outside deciding what Yahoo should do, shouldn't do, what's best for them. That's gonna stop."
Bartz took over Yahoo on Tuesday after a year of turmoil in which the company spent much of its time immersed in management upheavals and negotiations with Microsoft and Google. The company's management, especially cofounder and outgoing CEO Jerry Yang, has come under fire for letting opportunities pass by before its stock price plunged along with the wider market in the second half of the year.
Bartz, a former CEO of Autodesk, declined to tell reporters what direction Yahoo should take, saying she needs to first talk with employees, customers and investors to find out more about the company. She wouldn't say how long that would take.
"Let's not put ourselves on some crazy timeline, let's let this process evolve," she said.
Bartz did say Yahoo should focus on being the best company in all its markets, and that it should create new markets, both geographic and vertical.
"Yahoo has unfortunately been battered in the last year, and (this) caused it to look internally and be protective, and that's nonsense for such a great company and such a great franchise," Bartz said. The company should now "get outward-looking and kick some butt," she said.
The Yahoo board approached her about the job in December, Bartz said, and she was excited to jump in.
"I wouldn't have taken the job if I didn't believe there's a huge opportunity here," she said. "I just see this company as a company with enormous assets that, frankly, could use a little management."
Roy Bostock, chairman of Yahoo's board, thanked Yang as well as outgoing President Sue Decker, saying Decker would help to make a smooth transition and Yang would remain "actively involved" with the company. Bartz said she expected to take advantage of Yang's insights. "No one knows more about Yahoo than Jerry," she said.
Bartz may have more to say about the company's direction when she speaks on Yahoo's conference call on its fourth-quarter financial results, set to be announced Jan. 27.

Yahoo Taps Bartz as CEO, Decker Walks

Yahoo said on Tuesday it has chosen former Autodesk CEO Carol Bartz as its next CEO to replace Jerry Yang, who announced his intention to step down in November.
The company also announced that President Sue Decker, who had been a candidate for the CEO position, has resigned and will leave the company after a transitional period. Decker worked at Yahoo for eight-and-a-half years and was a close supporter of Yang.
In a statement, Yahoo Chairman Roy Bostock said Bartz has the right mix of technology and business savvy to lead Yahoo, as well as a strong leadership style and a proven track record of driving growth, shareholder value and operational excellence.
"She is admired in the Valley as well as on Wall Street for her deep management expertise, strong customer orientation, excellent people skills, and firm understanding of the challenges facing our industry," Bostock said.
In the same statement, Bartz praised Yahoo for its assets, technology, staff and accomplishments. "There is no denying that Yahoo has faced enormous challenges over the last year, but I believe there is now an extraordinary opportunity to create value for our shareholders and new possibilities for our customers, partners and employees. We will seize that opportunity," she said.
Yang also praised Bartz, calling her "the ideal person" to drive Yahoo forward. "I believe Yahoo's best years are still ahead of it," he said in the statement.
Gartner analyst Allen Weiner called Bartz "a very solid pick" who should be an "easy sell" to investors, partners, employees and advertising customers. "She can bring the adult supervision the company has lacked," Weiner said.
He downplayed Bartz's lack of experience in the Web 2.0 world, saying plenty of people at Yahoo have that type of knowledge. Bartz brings a steady hand that will give Yahoo operational direction and strategic focus, just as Eric Schmidt has done at Google, Weiner said.
"She's the person who can come in and put the peanut butter back in the jar," he said, referring to a controversial memo a Yahoo executive penned in late 2006, in which he likened Yahoo's lack of focus to a spread of peanut butter: amorphous and shallow.
Industry analyst Greg Sterling from Sterling Market Intelligence said his initial reaction to Bartz is "cautious." Bartz seems like a solid, competitive manager and a safe choice for Yahoo, which could have gone for a flashier pick from the Web 2.0 ranks.
The question is whether she can lead Yahoo out of its yearslong crisis, given her lack of experience with the Internet and online advertising markets, Sterling said.
Neither analyst believes Bartz was brought in to broker a sale of Yahoo, nor do they expect her to implement major changes in technology strategy.
Bartz will take up her new job immediately. She also gets a seat on the Yahoo board.
Bartz was Autodesk's executive board chairman. She previously served as its chairman, president and CEO for 14 years, stepping down in April 2006.
While at the helm, Autodesk diversified its product line and saw its revenue rise from US$285 million to $1.5 billion, according to Autodesk's corporate Web site.
Before joining Autodesk, Bartz worked at Sun Microsystems, where she was vice president of worldwide field operations and an executive officer, and at Digital and 3M.
President George Bush appointed her to his Council of Advisors on Science and Technology, and she is on the boards of Intel, Cisco Systems, NetApp, and the Foundation for the National Medals of Science and Technology.
Her awards include being named as one of the 50 most powerful women in business by Fortune Magazine in 2005 and one of the world's 30 most respected CEOs by Barron's in 2005.
Bartz, who according to The Wall Street Journal is 60 years old, will have her hands full as Yahoo CEO. The company has been in a technology and financial slump for several years. Multiple corporate shake-ups and reorganizations have failed to trigger a turnaround.
Yahoo has lagged behind large rivals like Google and small startups, unable to capitalize as much as it should have on many of the hottest Internet opportunities of recent years, like online video, search advertising, social networking and blogging.
There was much hope among industry observers when Yang, a Yahoo cofounder, took over as CEO from Terry Semel in mid-2007, but he failed to deliver on his goals to make Yahoo the preferred starting point for users, the preferred marketing vehicle for online advertisers and the preferred Web application platform for external developers.
His tenure included an unsolicited acquisition attempt by Microsoft, whose failure critics blamed on Yang and the Yahoo board. Later, a deal to let Yahoo run Google search ads collapsed after it became clear the U.S. government planned to challenge it due to antitrust concerns. The deal would have given Yahoo's revenue a significant boost.
Yang's tenure as CEO also featured two big rounds of layoffs, an embarrassing exodus of high-profile managers, disappointing financial results, a tanking stock price, free-falling employee morale and little or no advances in key areas, like search usage and search advertising.
He also oversaw the launch of several ambitious technology projects designed to improve Yahoo's services for end-users, developers, publishers and advertisers. Some have been delivered and others are in progress, like Yahoo Open Strategy (YOS), a project to rearchitect the company's sites and services to tap into the popularity of social networking.
With YOS, Yahoo pledges to open all its sites, online services and Web applications to outside developers, and give users a "social profile" dashboard to unify and manage their Yahoo services.
Yang plans to remain on the Yahoo board and retain his Chief Yahoo title.

Lenovo Puts Atom Chip in Fan-less Desktop

Lenovo launched its first nettop on Tuesday, reaching out to budget buyers with an inexpensive, fan-less desktop designed for surfing the Web and other basic computing tasks.
The Lenovo H200 uses Intel's low-power Atom CPU to help the system consume less power than traditional desktops. Like netbooks -- the laptop equivalent of nettops -- it's designed for basic tasks like Web surfing and word processing, not to deliver the full multimedia experience of typical PCs.
The fan-less design of the H200 makes it Lenovo's quietest desktop, the company said. It runs the single-core Atom 230 processor with 512KB of cache, can have up to 2GB of RAM and comes with Windows Vista Home Basic. It will also include up to 160GB of hard drive storage, a DVD-R/W drive and Intel integrated graphics controllers.
The H200 is priced at US$399 with a 19-inch monitor and is available now through retailers and on Lenovo's Web site in the U.S. and some parts of Asia, including China.
Some users may wonder why they need a nettop, when netbooks that provide greater mobility and support similar applications can be had for $300 to $500. Kristy Fair, a Lenovo spokeswoman, said some users value a larger screen over the mobility a netbook affords.
"It's an affordable option for people looking for a package that has everything they need -- the desktop and the monitor," she said.
The company has also introduced more powerful desktops. The IdeaCentre K220 and K230 will be able to run Intel's quad-core processors and come with 64-bit editions of Windows Vista for faster computing. The K220 and K230 will support up to 4GB and 8GB of memory, respectively, and optional Blu-ray drives will be available.
The K220 and K230 will be priced at $449 and $499, respectively, and will be available later this month.

Dude, You're Getting a Dell Smart Phone!

Okay, we're a bit premature here. But that might happen if one analyst's prognostications prove true. According to an Electronista report, Shaw Wu of Kaufman Bros. is predicting that Dell is getting closer to launching its first smart phone, although no date has been set for the rollout. While Dell has played down the rumors, industry watchers have noted that the PC maker recently hired a former Motorola mobile phone exec to helm its global consumer products group.
Rumors of Dell's smart phone intentions have been swirling for many months, in fact. Reports from early last year had Dell working with Foxconn Group to develop handsets based on the Windows Mobile OS, and Dell vowed to roll out an iPod competitor. However, those projects may have been put on hold, as Dell announced last fall that it wouldn't have an MP3 player ready for the holiday season.
Given the stunning success of Apple's iPhone, the positive buzz generated by Palm's Pre and its clever webOS at last week's Consumer Electronics Show, and a growing number of Google Android devices, Dell knows it had better act -- and fast.
But whose mobile OS should Dell use? Apple and RIM have designed their own in-house software -- and done an excellent job of it -- but building operating systems isn't exactly Dell's forte. Windows Mobile is the odds-on favorite, or Dell may have to go out and purchase a software maker. Whatever it does, Dell will have to launch one impressive handset to stand out in a very competitive market. Whether it can remains to be seen.

Diving Headfirst into the Windows 7 Beta

Well, I did it. I made the move. Got off the fence. Took the plunge. Or, as my English language-challenged friend at State Bank of Mauritius would put it, I "did the necessary."
After thoroughly backing up my Vista x64 production laptop -- including redundant, manual copies of my data files, plus a full system image via Complete PC Backup -- I paved over the disk and installed the official Windows 7 Beta (64-bit edition).
[ For more help sorting through the early Win 7 benchmarks, check out InfoWorld's special report. ]
I've actually been sitting on the beta bits since before the Christmas break. However, I didn't want to commit totally until I could snag a working Product ID code or two from the Microsoft servers. Now, with my fully configured and activated Windows 7 environment in place, it's time to survey the landscape...and start complaining.
For starters, when will Microsoft get its act together and deliver ISO image-mounting support in Windows? It seems like anytime the OS is updated (we early Vista users were in this same boat three years ago), all of the best third-party mounting tools get broken.
Daemon Tools? Won't even install.
Virtual Clone Drive? Buggy and unstable (check their forums).
Fortunately, I was able to use VCD just long enough to get my major work titles (SQL Server 2008, Visual Studio 2008, Office 2007, and their respective service packs) installed from my MSDN ISO images. However, after the umpteenth random lock-up, I finally pulled Slysoft's normally well-behaved mounting utility from the system. I may give Alcohol 52% a try, but at this point I'm not optimistic.
And, frankly, I shouldn't have to be. This is the kind of base-level functionality that's supposed to be baked into the OS. Microsoft did it with VHD images. It can do it with ISO images as well. Heaven knows these folks distribute enough software in this format. MSDN and TechNet users pay good money for this stuff. The least Microsoft can do is provide a simple mechanism for mounting the images at runtime.
Note to Microsoft: Put a couple of those supersmart engineers on this, pronto! And while you're at it, have them back-port the solution to Vista and XP. Think of it as a kind of penance for putting us all through this same ISO-mounting hell again.
Also, what's the deal with Skype? Version 3.8 is still completely hosed under Windows 7. It won't connect to the peer-to-peer network and crashes mere seconds after loading. I'm now forced to use the butt-ugly Skype 4 beta. Truly a painful experience!
Finally, who screwed up Internet Explorer's networking? If I have more than one NIC enabled on the system (for example, one of the VMware Workstation virtual adapters or even Microsoft's own Loopback driver), IE takes forever to load. And once it is loaded, opening new tabs is painfully slow. If I then disable the extra adapters (so only my one primary wireless connection remains), the problem disappears. So, it's clearly related to those multiple NIC drivers.
Other nits:
• Chrome (both 1.x and the newer 2.x pre-beta) didn't work at first under the 64-bit edition. I had to add the "--in-process-plugins" parameter to get it to load properly (see the sketch after this list). Given my issues with IE, mentioned above, the inability to run Chrome properly was nearly a deal-breaker for me. Bullet dodged.
• The Intel 5300-series Wi-Fi connection on my Dell Precision M6400 test bed refused to wake up after a suspend/resume cycle. I had to disable power management for the device in order to regain my sanity. Of course, this means the system's battery life takes a hit, but it's better than having my laptop go "deaf" every time I put it to sleep.
• My tray icons keep disappearing! No matter how many times I tell Windows to leave them visible, it inevitably starts hiding one or more of them for no apparent reason. I've had to resort to disabling the notification icon hide function altogether and leaving them all visible, all the time. Very annoying.
• Those snazzy new Task Bar thumbnails don't always show the updated window contents. For example, when I check to see the status of a download in Free Download Manager, I often find that what the thumbnail shows and what's really in the status window (time remaining, bytes downloaded, etc.) are entirely different.
• What's with the hide-and-seek game for legacy features? The Add Hardware Wizard, which I use to install the aforementioned Loopback adapter, is now missing from Device Manager. You have to launch it from the Start Menu search box (filename is hdwwiz.exe). Also, the graphic equalizer in Windows Media Player is now buried in an unmarked, drop-down menu button thingy in the Now Playing view. Had to hit up Google to figure that one out.
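For anyone who'd rather script the two workarounds above than dig for them by hand, here is a minimal Python sketch. The Chrome install path is an assumption for a typical per-user install and may need adjusting; only the "--in-process-plugins" switch and hdwwiz.exe come from the nits themselves, everything else is illustrative.

import subprocess
from pathlib import Path

# Assumed per-user install location for Chrome 1.x/2.x on Windows; adjust as needed.
CHROME = Path.home() / "AppData/Local/Google/Chrome/Application/chrome.exe"

def launch_chrome_with_plugin_workaround():
    # Start Chrome with the --in-process-plugins switch -- the parameter that,
    # per the nit above, got Chrome loading properly under the 64-bit edition.
    subprocess.Popen([str(CHROME), "--in-process-plugins"])

def launch_add_hardware_wizard():
    # Open the legacy Add Hardware Wizard (hdwwiz.exe) directly, since it no
    # longer shows up inside Device Manager; run from an elevated prompt if
    # Windows refuses to start it.
    subprocess.Popen(["hdwwiz.exe"])

if __name__ == "__main__":
    launch_chrome_with_plugin_workaround()
    # launch_add_hardware_wizard()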
On the plus side, Windows 7 definitely feels crisper than Vista. The UI is highly responsive, though I wonder how much of this is just smoke and mirrors (e.g., the Windows animation speed for minimizing/maximizing seems to be set higher under Windows 7) since benchmarks show that the system is actually performing about the same. Regardless, it's a fresh departure from my uneven Vista experience. And I'm really digging the new task bar, especially the various Aero "peek" functions. Definitely one of the more poorly understood features of the new Windows.
Bottom Line: Bugs or no, I think I'll stick around for a while -- if for no other reason than I really like that new task bar. Once you fully embrace it -- and this means using the default configuration with the big, grouped icons and no text labels -- you'll find that it's quite a bit more efficient to use than the old task bar.
I'll talk more about this next week. In the meantime, good luck downloading those beta bits!

Friday, January 9, 2009

Intel-backed Enterprise 2.0 Suite Is Discontinued

An Intel-backed suite of Enterprise 2.0 software announced with much fanfare a bit over two years ago is being put out to pasture.
SuiteTwo, announced in November 2006 at an O'Reilly Media Web 2.0 show, is no longer being sold, and its maintenance period for existing customers will close at the end of this year.
"We're going through the end-of-life process for SuiteTwo," said Dominic Sartorio, senior director of product management at SpikeSource, Intel's lead partner in the effort.
When it was announced, SuiteTwo was seen as concrete proof that CIOs, IT directors and business managers had begun seriously considering the use of Web 2.0 technology in their workplaces.
In a bundle integrated and maintained by SpikeSource, SuiteTwo included blog publishing software from Six Apart, RSS content syndication software from NewsGator and SimpleFeed, and wiki software from Socialtext.
Backing the project was Intel Capital, Intel's venture capital arm, while Intel's Software and Solutions Group would hawk it through its large OEM (original equipment manufacturer) and reseller channels. Tech Data was later brought in to help with fulfillment. SuiteTwo could be bought as packaged software, hosted software or pre-loaded into a hardware appliance.
SpikeSource has notified the roughly 80 companies that use SuiteTwo about the phaseout and will provide migration assistance to them, Sartorio said. "We're going to be there to help customers, advise them, to do what's right for their deployments and users," he said.
One customer that will seek migration assistance from SpikeSource is Clinical Trial Semantics Inc. CTSi uses SuiteTwo as a key part of its project to build a Web-based system to help cancer patients discuss with their doctors appropriate clinical trials they can consider participating in as part of their treatment.
"We probably over-invested in a platform that clearly didn't have the user base," said CTSi CEO Etienne Taylor.
So far, SpikeSource has been a very responsive vendor to CTSi during the year or so that the nonprofit has been using SuiteTwo.
"Their professional services department has helped us way above and beyond the economics of having us as a customer," Taylor said. "They've been wonderful."
However, one thing SpikeSource didn't do was alert CTSi about its plans to phase out SuiteTwo. "They forgot us. Nothing personal, I'm sure," Taylor said with a chuckle.
For now, SuiteTwo has served its purpose very well in the project's first stage of development, and CTSi, which is working with the American Cancer Society, can see viable migration options, Taylor said. "It's kind of OK. A lot will depend on what we do next with SpikeSource."
The concept behind SuiteTwo was right, said Forrester Research analyst Oliver Young. Companies are adopting blogs, wikis, enterprise RSS and other Web 2.0 technologies to improve collaboration and communication among their employees, partners and customers. "The market has moved in that direction pretty aggressively," he said.
"The problem with SuiteTwo wasn't the idea. The problem was the execution. They were trying to cobble together products from five or six independent companies, and it never looked like anything more than a bunch of applications that were duck-taped together," Young said.
Consequently, after its initial splash, SuiteTwo didn't get nearly as much attention from potential customers as its capabilities would have otherwise merited, and it became a sideshow for the partner vendors involved as well, Young said.
"SuiteTwo had a lot of great ideas [behind it] but there were shortcomings in the implementation and go-to-market strategies," said Brian Kellner, NewsGator's vice president of product management.
SimpleFeed's CEO Mark Carlson concurs. "All of our involvement pretty much stopped three or four months after the initial SuiteTwo announcement," he said.
While SuiteTwo failed to gain traction, vendor partners like NewsGator and Socialtext noticed that demand for a suite like that was real and expanded their own offerings beyond their niche areas to offer more comprehensive collaboration and communication functionality.
"We started seeing that the social side of our solutions had a lot of value to offer and we started going down that path," Kellner said.
For example, in expanding beyond their original niches, Socialtext and NewsGator have replicated not only the SuiteTwo components but also newer ones, like workplace social networking, activity notification feeds and Twitter-like microblogging status updates.
"It was difficult for example in the SuiteTwo architecture to get a notification that someone had posted a new blog post, or that someone had updated a wiki page, or to pass information back and forth between the various solutions. It was a lot of work," Kellner said.
There are no hard feelings between the SuiteTwo partners and SpikeSource, whose migration plans for SuiteTwo include steering customers toward the partners. Since SuiteTwo is a superset of partners' software products, migration should be straightforward, he said.
Forrester's Young agrees. "The underlying products in SuiteTwo and their vendors are still here and innovating. It shouldn't be hard for a SuiteTwo customer to go to these vendors and put the thing back together," Young said.
Young's advice is for SuiteTwo customers to identify which component is delivering the most value for them and approach that vendor first.
SpikeSource pulled the plug on SuiteTwo in part because it decided its interests lay not in focusing on any particular software market segment, such as Enterprise 2.0 products, CRM (customer relationship management) or content management, but in sticking to its strengths: assisting ISVs with services like code testing, software maintenance and development.
As such, SpikeSource is focusing on its new Solutions Factory, launched in April 2008 and described as an automated platform for assembling, testing, packaging, certifying and updating software from ISVs. Along with the Solutions Factory launch, SpikeSource also announced that it had closed a new round of funding led by Intel Capital. Intel also uses SpikeSource's Solutions Factory services in its Intel Software Partner Program, Sartorio said.
Ironically, Intel still seems interested in Enterprise 2.0, judging by a demo of a workplace social-networking system that its CEO, Paul Otellini, gave in November at the Web 2.0 Summit in San Francisco, two years after SuiteTwo's introduction.
The demoed system included Web-based enterprise collaboration tools for social networking, blogging, wikis, online meetings and syndicated feeds. A company like Intel, with 86,000 employees worldwide, could put such a system to good use, letting staffers collaborate better, obtain training and education, and find the data they need to do their jobs, he said.
Such a social-networking system for the workplace, which would require strong security and control features for IT departments, doesn't exist, he said. "I don't see any company really addressing this," Otellini said.
Maybe Otellini never paid close attention to SuiteTwo, which could have very well become such a system. Intel declined to comment about SuiteTwo.
Actually, if he moves quickly, Otellini might still be able to place an order for SuiteTwo: At press time, SpikeSource hadn't yet updated the SuiteTwo Web site to indicate that the product is being discontinued, and its ordering page remained online as well.
Asked for further information about the enterprise social-networking system Otellini had demoed, an Intel spokesman said in an e-mail that it had been "a mock-up done for the sole purpose of his keynote, with no plans to productize."
At his company's Web 2.0 Summit show floor booth in November, an amused Ross Mayfield, CEO of Socialtext, remarked: "We've had a lot of people rushing to our booth as a result of the Intel presentation."

Monday, January 5, 2009

Lenovo Brings Wii Functionality to PCs

Taking a page from Nintendo's Wii gaming console, Lenovo on Monday announced an all-in-one PC with a remote control that doubles as a motion-based gaming controller.
Like the iMac, the all-in-one IdeaCentre A600 combines a monitor and CPU in a thin system. It will be on display at the Consumer Electronics Show from January 8 to 11 in Las Vegas.
Its wireless remote control is similar to Nintendo's Wii Remote, which allows users to interact with a video game by waving or pointing the game controller. Using motion-sensing technology, the Wii Remote becomes a racket when swinging during a tennis game, or a weapon when playing a fighting game.
Lenovo's gadget mimics the Wii's approach.
"We have an example of a bowling game [where] you can wave the remote and that actually controls your game," said Ninis Samuel, director of marketing strategy and programs.
The company is bundling some motion-based games with the PC to use with the remote-based gaming controller. Titles of the games weren't immediately available.
Lenovo is trying to capitalize on the trend of entertainment options merging into the PC. Few PCs can play motion-based games, which could make this motion-based controller a pioneer.
In addition to controlling TV functions and video recordings on the PC, the remote control can also be used as an air mouse that moves the mouse pointer when waved. It has some advantages over a conventional mouse -- it can function without being on a surface and be used at a distance -- when sitting on a couch, for example.
If the air mouse wasn't enough, the remote also works as a VOIP (voice over Internet protocol) handset. "If you have telephony software on your PC like Windows Live or Skype, you can use your remote to make those phone calls because it essentially can act as a phone," Samuel said.
The IdeaCentre A600 starts at a price of US$999. The desktop has a 21.5-inch screen that supports 1920 by 1280-pixel resolution for high-definition video playback. It runs on Intel Pentium Dual Core or Core 2 Duo mobile processors, supports up to 4GB of RAM and up to 1TB of storage. It includes Wi-Fi wireless networking and runs on the Windows Vista OS.
Options include the remote control, a Blu-ray DVD player, a TV tuner and an ATI graphics card from Advanced Micro Devices. The desktop will be available worldwide by the beginning of March.
The desktop is part of a new portfolio of entertainment PCs that Lenovo plans to show at CES. The company is also rolling out a new laptop line, the IdeaPad Y series, which is targeted at mainstream users looking to create and view multimedia content. Lenovo has added features intended to make watching movies an easier and more enjoyable experience.
For example, the laptops have the "OneKey" feature, in which pressing one button "optimizes" the experience of watching movies by enhancing the sound and visuals, according to the company.
Another feature is an ambient light sensor that adjusts screen brightness based on the user's surroundings. "[It] uses a sensor on the actual lid of the laptop that senses whether or not you are in a darker or lighter room. Then it adjusts the brightness and the graphics to your environment," Samuel said. The feature is available only in the IdeaPad Y650 laptop, which has a 16-inch screen.
The IdeaPad Y series laptops come with screens ranging from 14 to 16 inches, run on Intel Core 2 Duo processors and include Windows Vista. The weight of the laptops ranges from 4.6 pounds (2.09 kilograms) to 6 pounds. The laptops will become available worldwide by the beginning of March, Lenovo said. Pricing was not immediately available.

How to Succeed in Tech in a Downturn

The economy is in trouble -- everywhere. Even outsourced providers are nervous. Already under stress, IT staffers see their jobs getting more and more difficult as they must do more with less, all while wondering if they'll keep their jobs at all.
That's why you need a plan for your tech career. The worst thing you can do is give up or panic. Although tech jobs are under increasing pressure, the reality is that the technology jobs market overall is still doing better than the market for other types of jobs. That doesn't mean you're immune from layoffs, stagnant salaries, or increasing workloads, but it does mean you have more options than many other workers -- if you're willing to be flexible.
[ InfoWorld has put together a special package of stories to help IT workers through the current tough times. Among the highlights:
• Slideshow: Where IT jobs are headed
• Special report: 2009 IT career survival guide
• Special report: Where the tech jobs are overseas (and how to get one)
• Special report: Tech workers under fire
• Special report: IT and the financial crisis
• Get sage advice on IT careers and management from Bob Lewis in InfoWorld's Advice Line blog and newsletter. ]
First, the bad news on tech jobs. There's plenty of data to support the fears that many tech workers have about their job security and ability to make ends meet. For example, more than 50,000 tech workers lost their jobs before the financial meltdown hit, and more jobs are in danger.
That trend translates to income pain for even the survivors. According to the 2008 salary survey by our sister publication Computerworld, bonuses for IT workers rose only 0.2 percent from 2007 levels. At a time when 3 to 4 percent salary raises are failing to keep up with inflation rates that are rising above 5 percent, those essentially flat bonuses are making tough times even more challenging for IT professionals.
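To see what those percentages add up to, here is a quick back-of-the-envelope sketch. The 3 to 4 percent raises and 5 percent inflation are the figures cited above; the formula is just standard real-change arithmetic, not anything from the survey itself.

def real_change(nominal_raise, inflation):
    # Inflation-adjusted change in purchasing power.
    return (1 + nominal_raise) / (1 + inflation) - 1

# Figures cited above: 3 to 4 percent raises against inflation above 5 percent.
for raise_pct in (0.03, 0.04):
    print(f"{raise_pct:.0%} raise vs. 5% inflation -> {real_change(raise_pct, 0.05):+.2%} real")

# Output:
#   3% raise vs. 5% inflation -> -1.90% real
#   4% raise vs. 5% inflation -> -0.95% real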
And stress levels are up. That same Computerworld survey shows that only 14 percent of respondents did not feel more stressed than a year earlier. Shrinking budgets are one reason. "Companies are in the mind-set of not spending in the next 3 months and increasing only 1 or 2 percent in the next 12 months. That's quite a change from last year when it was between 7 and 8 percent," notes Steve Minton, vice president of worldwide IT markets at IDC.
Having desirable tech skills is key
The United States and Europe appear to be especially hard hit, though the downturn is being felt worldwide. Still, tech workers might consider moving to China, Canada, or other stronger markets where the demand for IT skills -- and the opportunities to develop new ones -- remains good. A move abroad may also give you more than technical skills: It can make you more appealing to companies that have global teams, an increasing reality everywhere.
To remain competitive, IT workers need a combination of the 30 essential basic skills -- including, according to one survey, strong ethics and morals -- and abilities in emerging recession-proof areas where demand remains high, such as security, VoIP, and wireless. And don't forget about not-so-hot areas that are critical to companies' abilities to keep running: Cobol skills can be great job insurance, for example. Also, look to certain skills that have been hot for a while and, thus, tend to be neglected, such as open source, .Net, and Java.
Certifications also can help, especially management ones. But beware: Certifications are not equally valuable. Some are simply expected -- and may be necessary to even be considered for a job -- while others are superfluous. That's especially true for technical certifications; outside of security and networking, they're not proving that valuable. Those that tend to give you an edge involve management and business-specific training -- skills that business managers more easily understand than technical ones.

Going Green to Save the Green

Could saving the Earth -- and your company's bottom line -- be as simple as using fresh air to cool your data center?
It's not quite that simple, but it can be one step toward those goals, because companies that use natural air to cool their facilities are seeing big benefits on both the environmental and financial fronts. In fact, IT leaders, analysts and environmental advocates say there are plenty of opportunities for tech organizations to create more Earth-friendly operations that cut energy needs and slash a company's carbon footprint while saving money, too.
But many organizations still aren't capitalizing on such initiatives -- even the ones that are relatively easy and inexpensive to implement.
IT executives who responded to Computerworld's annual Forecast survey seem to echo that reluctance. More than two-fifths (42%) said their IT departments have no plans to launch projects in the next 12 months to reduce energy consumption or carbon emissions, and nearly three quarters reported no plans to create committees to oversee energy-saving initiatives.
Yet experts say organizations that ignore green computing now are going to have to catch up if they want to stay competitive. "The green issue is not going to go away. There's too much at stake," says Rakesh Kumar, an analyst at Gartner Inc.
That's not to say IT leaders don't have their reasons for staying away from green computing. Kumar says some of them think it's a fad. Christopher Mines, an analyst at Forrester Research Inc., says others believe global warming is a crock and that there's no need to act on the issue, or they see green as merely increasing expenses.
Many others are nervous about reworking established systems and processes. "The last thing these people want to do is take a screwdriver to IT processes that work and start re-engineering them to make them more efficient," Mines says.
Early Adopters
Increasingly, however, IT leaders and other executives are putting aside such concerns and pushing for green IT initiatives.
When IDC surveyed 300 CEOs for its September 2008 "U.S. Green IT Survey," 44% of the respondents said that IT will play a very important role in their organizations' efforts to reduce their environmental impact. Compare that to the 2007 survey, in which only 14% of CEOs said they felt that way.
The 2008 survey also showed that energy costs were the most pressing reason for the adoption of green IT.
"We don't see many or indeed any companies that are hesitant to explore green IT projects," IDC analyst Vernon Turner wrote in an e-mail on this topic. "In fact, the scary thing is where to start, and it may be that reason why there is somewhat a feeling of lost souls. There has been a lot of marketing by the IT vendor community around green, and I think that CEOs and CIOs are 'green-washed' by it."
To be sure, developing enterprisewide green policies is a major undertaking. On the other hand, IT departments can implement some green IT initiatives without reworking entire policies, processes and procedures -- and without spending a lot of cash.
Moreover, they can sell management on these projects based not just on the initiatives' environmental merits but on their financial rewards as well.
"A lot of stuff is going to give you a short-term payback," Kumar says. He says that given today's economy, CIOs should focus on green initiatives that will have paybacks well within 18 months. Projects with such quick ROI range from reducing energy demands by enabling more telecommuting and teleconferencing to consolidating data centers, he says.
"These, in our opinion, equal green IT," Kumar says.
With so much attention focused on reducing energy demand, IT organizations can easily sell initiatives that reduce power consumption -- a quick way to save money and become green, says Katharine Kaplan, product manager at Energy Star for Consumer Electronics and IT at the U.S. Environmental Protection Agency.
"Power management is probably one of the easiest, low-cost ways to get big, big savings," Kaplan says, pointing out that using power management features on desktop PCs can save $50 per computer per year. Enabling power management tools on monitors can save another $12 to $90 annually per monitor.
Becky Blalock, senior vice president and CIO at Southern Co., an Atlanta-based energy company, says her organization is implementing power management technology to ensure that its 26,000 desktops are asleep at night and during other times of inactivity. Although the numbers aren't in yet, Blalock says she expects high savings throughout the organization.
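Since Southern Co.'s numbers aren't in yet, here is a rough sketch of the scale involved, using the EPA per-unit figures Kaplan cites above and assuming, purely for illustration, one monitor per desktop.

# EPA per-unit figures cited by Kaplan above (dollars per year).
SAVINGS_PER_PC = 50
SAVINGS_PER_MONITOR_LOW, SAVINGS_PER_MONITOR_HIGH = 12, 90

desktops = 26_000  # Southern Co.'s fleet, per the article

pc_savings = desktops * SAVINGS_PER_PC
monitor_low = desktops * SAVINGS_PER_MONITOR_LOW
monitor_high = desktops * SAVINGS_PER_MONITOR_HIGH

print(f"Desktop power management: about ${pc_savings:,} a year")
print(f"Monitor power management: ${monitor_low:,} to ${monitor_high:,} more a year")
# -> Desktop power management: about $1,300,000 a year
# -> Monitor power management: $312,000 to $2,340,000 more a year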
Managing desktops is just the start, says Henry Wong, senior staff technologist in the eco-technology program office at Intel Corp. He points out that better asset management is another simple step that can cut energy demand and costs. Just examine your operations to identify and turn off any device that isn't used or needed.
Mark O'Gara, vice president of infrastructure management at Highmark Inc., a health insurance company in Pittsburgh, says he's examining the need for any device that draws power -- any fax machine, printer or copier -- and figuring how to reduce its energy demands by either using power management tools or getting rid of the device. He says he's working with the company's facilities department to get baseline readings so he'll be able to measure progress.
"You can start to see what energy we use, find opportunities to reduce power costs and find ways to reduce it through capital improvements," O'Gara says.
Another quick way to introduce green benefits that have financial paybacks is through refresh initiatives and procurement policies, says Michelle Erickson, initiative director of the sustainable IT program in global operations and technology at Citigroup Inc. in New York. For example, Citi is looking at implementing thin clients, which, because they have lower power needs, save money and reduce the company's carbon footprint.
Erickson also recommends setting procurement policies that specify that new equipment must be Energy Star-compliant, thereby ensuring that the company is getting more energy-efficient computers. And with new Energy Star standards rolling out in 2009, the policy will apply to servers too.
Similar strategies can be employed in the data center, Wong says. Look at the machines you have, and consolidate where you can to maximize the use of each server -- but make sure that you can still meet the needs of your business units.
"We did this at Intel and had a $3 million cost avoidance," Wong says. The dollar savings came from not having to build a new physical structure and pay for that new building's ongoing maintenance. As for the green benefits, there's less demand for power and new equipment.
"You can see another building that doesn't have to exist anymore. And it's the HVAC system, the people, the maintenance area -- it's not just IT. That's a really big to-do," Wong adds.
But even organizations that aren't ready for those kinds of projects can simply start by controlling the temperature, Wong says. Although it will be necessary to monitor the humidity when doing so, most companies can raise the temperature at least a few degrees and start lowering their air conditioning demands. And don't forget about using that natural air for cooling.
It might not be the biggest step, but it's a start.