Tuesday, September 30, 2008

Clickjacking Vulnerability to Be Revealed Next Month

After shelving plans earlier this month, at Adobe Systems' request, to detail a browser clickjacking vulnerability that is indirectly related to the company's products, a security researcher now plans to present the flaw next month.
Jeremiah Grossman, chief technology officer at WhiteHat Security, will discuss the vulnerability at the Hack In The Box (HITB) conference in Kuala Lumpur, Malaysia. "We have no ETA on Adobe fixes, but we're hopeful that it'll be weeks and not months. Whether or not they 'patch,' it will not change the content of my keynote speech," he wrote in an e-mail.
Grossman was scheduled to detail the clickjacking flaw with Robert Hansen, CEO of SecTheory, at the Open Web Application Security Project conference in New York, but they pulled the presentation at Adobe's request. The hackers said no pressure was put on them, but Adobe wanted time to study and address the vulnerability before it was made public. "This is not an evil 'the man is trying to keep us hackers down' situation," Hansen wrote on his blog at the time.
Clickjacking is an attack in which a user clicks a button in a browser, thinking the button will perform a specific function, such as submitting a news story to Digg, but instead an attacker hijacks the click and uses it for another purpose. The vulnerability is "obviously scary enough for Adobe to call it a critical issue and ask for more time, even though they were only indirectly affected," Grossman wrote in an e-mail.
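To make the idea concrete, the sketch below shows the general overlay technique the term describes: a decoy button on the attacker's page with an invisible frame of the target site layered on top, so the victim's click lands on the hidden button instead. This is only an illustration of the publicly described concept, not the researchers' undisclosed proof of concept; the target URL and pixel offsets are placeholders.

```typescript
// Minimal illustration of the general clickjacking technique described above.
// Not the researchers' proof of concept -- just a sketch of layering an
// invisible frame over a decoy button so a click intended for the decoy
// lands on the hidden page instead.

function buildClickjackDemo(targetUrl: string): void {
  // A visible decoy the victim thinks they are clicking.
  const decoy = document.createElement("button");
  decoy.textContent = "Claim your free prize";
  decoy.style.cssText = "position:absolute; top:100px; left:100px;";
  document.body.appendChild(decoy);

  // The real target (e.g. a vote or submit button on another site) loaded in
  // an iframe, positioned so its button sits exactly over the decoy, made
  // invisible with opacity but kept on top so it receives the click.
  const frame = document.createElement("iframe");
  frame.src = targetUrl; // page whose button is being hijacked (placeholder)
  frame.style.cssText =
    "position:absolute; top:60px; left:40px;" + // offsets chosen so the target button aligns with the decoy
    "width:400px; height:300px;" +
    "opacity:0; z-index:10; border:none;";      // invisible but still clickable
  document.body.appendChild(frame);
}

// Hypothetical usage; the URL is a placeholder, not a real target.
buildClickjackDemo("https://example.com/vote");
```

Frame-busting scripts, which try to prevent a page from rendering inside another site's frame at all, were the most common defense against this kind of overlay at the time.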
Over the weekend, Grossman and Hansen planned to inform Adobe of their intent to proceed with the presentation and make the proof-of-concept code they developed available.
"We gave Adobe time out of courtesy because they asked and we have a good working relationship with them. They are using the time productively, but we could not agree to another delay," Grossman wrote. "Our belief is clickjacking as an issue is not a problem in their software, but with browsers in general. It would not be fair to the others that it does impact to be without the information they need."
HITB will be held in Kuala Lumpur from Oct. 27-30.

Look Ma, No Keys!

Cell phones and cars are unusual driving partners, but a new phone from Sharp may actually become a driver's best friend.
The phone is the first with a built-in electronic car key. It's compatible with Nissan's "intelligent key" function, which works with 950,000 Nissan vehicles across various model lines, and was developed by the auto maker with Sharp and Japanese cellular carrier NTT DoCoMo.
The phone is equipped with the same features as a normal Nissan intelligent key.
For example, the driver can lock or unlock the car from as far as one meter away with the press of a button. The phone can also be used to start or stop the engine, provided it is inside the vehicle: an electromagnetic sensor detects the phone's presence, allowing the driver to start the engine by simply pushing the ignition button. The same proximity sensors also let the driver open compartments such as the trunk just by carrying the phone.
The flip phone is based on Sharp's SH906i model, the only difference being the addition of two buttons in the phone's upper right corner that are used to lock/unlock the vehicle. NTT DoCoMo will provide the mobile service along with an additional security function that allows subscribers to disable the cell phone's "intelligent key" function in case they misplace the phone.
In Japan, cellphones go beyond the usual call, text and photo functions. Features intended to make daily life more convenient are commonplace and phones can be used as train passes or even as a wallet for shopping.
"A lot of people have been requesting integrating the car key into the cell phone," says Keiji Ohhira, a Nissan engineer who was demonstrating the system at the Ceatec electronics show that began Tuesday in Japan. "We took their concern into consideration and contacted Sharp."
At the moment the companies have only a prototype of the phone, but they plan to commercialize the handset in 2009.

Wednesday, September 24, 2008

Microsoft: We're Not Afraid of the Cloud

Microsoft has been busy this year, rolling out Windows Server 2008 and SQL Server 2008 in a push to expand its presence in the corporate data center. To be successful, the company must overcome an economic environment that appears increasingly difficult as well as tough competition from rivals Oracle and VMware, among others.
Rob Kelly, Microsoft's corporate vice president of infrastructure server marketing, sat down with IDG News Service to discuss the adoption rate of Windows Server 2008, Microsoft's plans for cloud-based services, and the recent declaration by Paul Maritz, an ex-Microsoft executive and current president and CEO of VMware, that the traditional server operating system "has all but disappeared."
What follows is an edited transcript of that interview:
IDGNS: We haven't heard much from Microsoft about user adoption of Windows Server 2008 and SQL Server 2008. Can you give us an update on how things are going? For example, are you finding many customers switching from Oracle?
Rob Kelly: At the end of the day, our job is to build software that customers choose to run their businesses on. My job is not out there to beat Oracle, to take Oracle's business. I just want to do a better job of satisfying the customer and, ultimately, they'll choose me for that reason.
The products have been exceedingly well received. We've never seen faster adoption of an OS than with Windows Server 2008, for lots of reasons: the quality of it, the workload focus that we have, some of the efficiencies that are built into it around things like power management, that sort of thing.
IDGNS: When you say this is the fastest adoption you've ever seen, can you quantify that for me?
Kelly: We're taking more share than we've ever taken of the x86 server world, whether you look at IDC or Gartner. We will probably make it to that magical million licenses per quarter run rate very quickly, and that's a phenomenal rate of adoption.
The other thing that's happening on Windows Server 2008 is our mix of premium SKUs (stock-keeping units), driven almost exclusively by virtualization and the rights around virtualization that we've put in those SKUs. Just think of it this way: We have three main SKUs for Windows Server 2008. If you want to run one virtual machine, you choose Standard. If you want to run four virtual machines or fewer, you run Enterprise Edition. If you want unlimited virtual machines, you run Datacenter Edition.
Our premium SKU mix, on a global basis, has more than doubled over the last two years since we made the licensing change, and over the last six months it's been unbelievably fast.
IDGNS: Last week we saw Paul Maritz, the CEO of VMware and a guy who knows Microsoft pretty well, proclaim that the server operating system is obsolete and his company will build a virtual data center OS to manage applications. How should we interpret his comments? Is there substance to his claim?
Kelly: The death of Windows has been foretold many times. It's in their best interests to position it that way.
Secondly, as I told some internal folks the other day, I know Paul and he's a great guy. I don't envy him where he is right now, because he just entered into a hornet's nest. Not so much because of Microsoft; we treat virtualization as an enabler. It's a feature of the operating system, as it's always been in mainframes and as it is in Linux distributions. What he's just done is enter a space that's crowded by his partners, and I'll give you the one that's canonically challenging for him: Hewlett-Packard.
HP has spent, literally, billions of dollars buying companies that would allow them to be the governor of the data center. They bought Opsware, they bought EDS, they bought Neoware. They bought a whole bunch of interesting assets for themselves. VMware came out and said, "We want to displace them." That's a very tough spot to be in.
As they try to move out of what's quickly becoming a commoditized world, where all these very high margins go away, VMware has stomped into a space where all of their partners are scratching their heads. The buzz on the show floor last week was, "Oh, that's our stuff. That's what we do. Now they want to do this?" It's a fascinating thing, actually. I don't envy Paul.
IDGNS: The news coming out of Wall Street has been pretty grim lately. How does Microsoft view the economic situation? Do you foresee a point where a worsening economy hurts user adoption of Windows Server 2008?
Kelly: At the core of what we're trying to do is help customers deliver a whole new set of experiences at the lowest cost possible. The first job is help them strip out cost, because that's what they live with every day. We also have to help them deliver new capabilities, because that's what drives their business. Everything is about helping customers move from a high cost, maintenance-focused IT environment to a much more dynamic, business-responsive IT architecture.
When times get tough, customers are constantly on the path to do more with less. Software really delivers the best bang for the buck. Customers tell us when their IT department is strapped, they start asking whether this next machine, this next application or maintenance check is the right way to spend their IT dollars. Or should they move to the economic platform, the one that's actually driven the innovation over the last 10 or 15 years and get on that wave now?
IDGNS: But is the investment required for companies to replace their existing systems with new ones going to be less than the marginal cost of maintaining those existing systems until the economic environment improves?
Kelly: It depends on the customer. A great advantage of Windows Server 2008 is our virtualization technology. It used to cost you $3,000 to buy the Enterprise Edition license and then you had to pay another $3,000 for each virtual server running on top of that, up to four. Now, because of the licensing rights change we allow you to run four virtual machines on top of that for no additional cost. Now, for $3,000, you can run five machines, the base plus four virtual machines.
If you're a Windows customer, for a very marginal outlay of price you can consolidate your physical environment into a virtual environment, which reduces your total footprint of servers. You get a huge power savings and your management costs go down.
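As a rough illustration of the licensing math Kelly describes above, the sketch below compares the old and new models using only the $3,000 figure and the four-VM rule quoted in the interview; it is a back-of-the-envelope aid, not an official Microsoft price calculator.

```typescript
// Back-of-the-envelope comparison of the Windows Server 2008 Enterprise Edition
// licensing math described in the interview. The $3,000 figure and the
// "base plus four virtual machines" rule come from Kelly's quotes above;
// everything else is illustrative.

const ENTERPRISE_LICENSE_USD = 3000;

// Old model: one license for the physical host, plus one per virtual server.
function oldModelCost(vms: number): number {
  return ENTERPRISE_LICENSE_USD * (1 + vms);
}

// New model: a single Enterprise license covers the host plus up to four VMs.
function newModelCost(vms: number): number {
  if (vms > 4) {
    // Beyond four VMs the interview points to Datacenter Edition instead.
    throw new Error("Use Datacenter Edition for unlimited virtual machines");
  }
  return ENTERPRISE_LICENSE_USD;
}

console.log(oldModelCost(4)); // 15000 -- host plus four VMs under the old rules
console.log(newModelCost(4)); //  3000 -- the same workload after the licensing change
```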
IDGNS: Oracle and Amazon announced a deal that makes Oracle 11g and other products available as part of Amazon's Elastic Compute Cloud service. Is Microsoft headed down the same path?
Kelly: We're telling everybody to hold their horses, wait until the PDC. At the Professional Developers Conference in late October we will be more concrete about what our plans are, what our offers are in the cloud. But you could probably guess to a large degree. We are a platform company and we are going to offer platform elements in the cloud.
From a developer's standpoint, it will be coherent across the on-premises experience and the cloud-based experience, so you can leverage the learning you already have. If you're a customer, it's a consumption model. I want to consume an application, but do I want to consume it on the premises or in the cloud? We'll make it coherent.
Without giving away a whole bunch of interesting secrets and all that stuff we're holding for PDC, there is nothing that is frightening to us about the cloud. It's just a different delivery vehicle. There's nothing frightening to us about other vendors getting into the space. We believe in the platform and we believe that because we have a platform that customers have chosen with their feet, and their dollars, in the on-premises world, we will be able to deliver that same value proposition in a cloud-based world.

Rozwat Talks About Cloud, but Holds Back on Fusion

Chuck Rozwat, Oracle's head of product development, largely deflected questions about hotly anticipated technologies such as Fusion Applications during a Q&A session with reporters at the OpenWorld conference in San Francisco on Tuesday, but did reveal more details of the vendor's plans around cloud computing.
Intel and Oracle announced Tuesday that they are collaborating to "accelerate enterprise readiness of cloud computing." The partnership will focus on three areas: software performance and power efficiency; improved virtual machine security; and the promotion of standards for porting virtual machine images and provisioning cloud services.
Oracle will likely seek a similar collaboration with Intel's rival, Advanced Micro Devices, Rozwat said.
"When we do announcements with one particular vendor, unless there's some big exclusive tag associated with it, it typically is something we might do with other vendors as well," Rozwat said.
That announcement followed Monday's news that Oracle will make its 11g database, Fusion Middleware and Enterprise Manager products available on Amazon's Elastic Compute Cloud (EC2).
The Amazon deal is also by no means exclusive, Rozwat indicated: "We will be making subsequent announcements for other cloud computing environments."
But Rozwat was less forthcoming regarding other topics, particularly Fusion Applications, which have been dogged by questions of delays.
Oracle has unveiled a handful of Fusion Applications so far, centering on CRM (customer relationship management). Rozwat also classified a recently released enterprise performance management product as a Fusion application.
On Sunday, another company executive said during a panel discussion that Oracle hopes to get the initial Fusion Applications product suite into the hands of early adopters starting in 2009.
But Rozwat refused to reveal any other concrete information about Fusion Applications, and characterized his reticence as standard company practice.
"Across all our products, we talk about general directions but we try to avoid giving very specific features or specific dates," he said. "If you look at all our presentations this week, you really won't see anything too far into the future that starts giving feature lists or product names. ... We're treating Fusion applications no different than any other product in that regard."
Neither would the executive pull the curtain back on Oracle 11g R2, for which Oracle is recruiting beta testers this week. Oracle did deliver a patch set for 11g Release 1 this week, but is not "quite ready to talk about 11g R2 and exactly when that will be available," Rozwat said.
When it does ship, 11g R2 will make it easier to employ and manage grid computing and Oracle's Real Application Clusters (RAC) database-clustering technology, and will include better diagnostic and monitoring tools, Rozwat said.

No Charges as Grand Jury Investigates Palin Hack

A federal grand jury investigation into the compromise of vice presidential candidate Sarah Palin's Yahoo account has apparently concluded its first day of meetings without an indictment.
Local press has reported that University of Tennessee student David Kernell is being investigated as a suspect in the crime, which occurred last week. Late Tuesday, U.S. Department of Justice spokeswoman Laura Sweeney said that no charges had been filed in the case.
The Chattanooga Times Free Press reported that federal agents and three students had reported Tuesday morning to the courthouse, where a federal grand jury was investigating the case. The grand jury stopped for the day around lunch time without handing down an indictment, the newspaper said.
Sweeney declined to comment on the grand jury or on whether Kernell was a suspect in the matter. Grand jury proceedings are secret until an indictment is delivered to the court.
Although Kernell has not been officially named as a suspect, he was fingered by bloggers who linked him to the online name "rubico," used by the hacker who claimed to have accessed Palin's Yahoo account last week. Information from that hack, including personal messages to and from Palin, was published on the Wikileaks Web site.
Rubico claimed to have accessed the account by using Yahoo's password reset feature and answering security questions with publicly available information. He guessed correctly that Palin had met her husband at Wasilla High.
Kernell, 20, is the son of Mike Kernell, a Democratic state representative from Memphis. David Kernell's Knoxville, Tennessee, apartment was searched over the weekend by U.S. Federal Bureau of Investigation agents, according to a local report.
Sweeney confirmed that there had been "investigatory activity late Saturday and early Sunday morning in Knoxville related to the government's inquiry."
Last week the McCain-Palin campaign called the hack "a shocking invasion of the Governor's privacy and a violation of law." Palin is the governor of Alaska.

EC Proposes Text Messaging Cap on Mobiles Abroad

Having slashed the price consumers pay for calling on their mobile phones while abroad, the European Commission showed on Tuesday how it plans to do the same thing to the price for sending texts and downloading material from the Internet while abroad.
EU citizens last year sent 2.5 billion text messages, generating €800 million (US$1.2 billion) in revenue for the mobile phone operators.
Consumers pay, on average, €0.29 for sending a text message from their mobile phones while outside their home country. The Commission said this is excessive and proposed a law that will cap the so-called roaming fee at €0.11 per message.
The proposed law, which must be approved by the European Parliament and national governments, also calls for mobile phones to alert users when a large amount of data is being downloaded onto them. This is to avoid what the Commission calls "bill shock." It cites an example: A person was charged €40,000 for downloading a TV program onto his mobile phone while abroad.
It also suggests a maximum fee that mobile-phone operators can charge each other (wholesale roaming cost) for carrying data across borders. The Commission proposed the round figure of €1 per gigabyte of data.
The mobile phone industry hit back Tuesday, claiming that prices overall are falling by 13 percent each year. Price setting by regulators "is not healthy," said David Pringle, spokesman for the GSM Association, a trade group representing mobile operators in Europe.
He added that it is too early to update the previous roaming law because operators still haven't assessed the impact of the changes imposed last year.
"We don't know what the knock-on effects of last year's price caps will be on, say, competition," he said. "As smaller mobile operators are hit relatively harder than the largest ones, the price caps could stunt competition," he said.
The proposed new law is an update of the 2007 roaming regulation. In addition to tackling excessive prices for text messages and data transfers, it will also force operators to reduce further the price of calling from abroad.
Last year voice roaming costs were capped at €0.46 for calling and €0.22 for receiving calls. The Commission said Tuesday it wants to reduce those caps to €0.24 and €0.10, respectively.
And it wants roaming costs charged per second rather than per minute. Consumers pay an average of 24 percent too much for calls abroad because call lengths are rounded up to the next whole minute.
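To see where a figure like that comes from, the sketch below totals a handful of invented call lengths under per-minute and per-second billing, assuming the proposed €0.24-per-minute outbound cap mentioned above; the call durations are made up purely for illustration.

```typescript
// Illustration of how rounding every call up to the next whole minute inflates
// a roaming bill. The 0.24 euro/minute rate is the proposed outbound cap quoted
// above; the call durations are invented for this example.

const RATE_PER_MINUTE_EUR = 0.24;

// Billing as practiced: each call rounded up to a whole minute.
function billedPerMinute(seconds: number): number {
  return Math.ceil(seconds / 60) * RATE_PER_MINUTE_EUR;
}

// Billing as the Commission wants it: charge for the exact duration.
function billedPerSecond(seconds: number): number {
  return (seconds / 60) * RATE_PER_MINUTE_EUR;
}

const callLengthsSeconds = [25, 70, 130, 200, 310]; // hypothetical calls

const perMinuteTotal = callLengthsSeconds.reduce((sum, s) => sum + billedPerMinute(s), 0);
const perSecondTotal = callLengthsSeconds.reduce((sum, s) => sum + billedPerSecond(s), 0);

console.log(perMinuteTotal.toFixed(2)); // "3.84" -- 16 billed minutes
console.log(perSecondTotal.toFixed(2)); // "2.94" -- 12.25 actual minutes
// The roughly 30 percent gap in this made-up example is the kind of rounding
// overcharge that per-second billing would eliminate.
```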
"Using your mobile phone abroad in the EU should not cost unjustifiably more than at home, whether for making calls, sending texts or surfing the Web," said Viviane Reding, the commissioner in charge of telecom.
"If Europe wants to deliver concrete results for its 500 million consumers, then practices whereby operators charge for a service which they do not deliver should not be acceptable," said consumer affairs commissioner Meglana Kuneva, referring to the practice of charging per minute instead of per second.
Fighting for lower mobile-phone costs has proved to be the Commission's most popular policy initiative in many years, winning plaudits from even the most ardent euroskeptical newspapers in the U.K.
Commission president Jose Manuel Barroso urged national governments and the European Parliament to work fast so that the new roaming regulation can come into force before people go away on holiday next summer.
"If we get this done quickly we will see tremendous growth in SMS and data services," he said.
The mobile phone industry accused the Commission of pursuing a "short-term political agenda" rather than looking at the long-term interests of the EU, said Pringle.

Tuesday, September 23, 2008

Adobe Unveils CS4 Suite

Adobe on Tuesday will unveil Creative Suite 4 (CS4), the latest update to its suite of high-end applications for print, Web, and video professionals. Available in October, the suite consists of 13 products, 14 integrated technologies, and seven services.
One of the biggest changes users will notice right away is the new tabbed interface for applications in CS4. Adobe has given all of its applications the ability to use tabs, so you can have multiple tabs open at the same time, but still be in one window.
Files and objects can be shared across tabs by dragging them onto another tab and holding them there. Of course, if you prefer the multiple-window interface of the old Adobe products, you can still use that too.
Major standalone products released as part of CS4 include Photoshop, Photoshop Extended, InDesign, Illustrator, Flash Professional, Dreamweaver, After Effects, and Adobe Premiere.
If there is a consistent theme in Creative Suite 4, it's integration. Adobe integrated its Flash multimedia application across the CS4 applications. The company also focused on integrating common functions in other apps, too, as part of an effort to make pro users more efficient.
"One of the things we are really committed to is helping customers express their ideas," Chad Siegel, group product manager for Adobe Creative Solutions, said. "We aren't just about providing the tools, we are about making a platform to help them express themselves in engaging ways."
Trying to come up with solutions to challenges for one product the size of Photoshop is a daunting task for any company, but coming up with solutions for eight or 10 products can seem almost impossible.
Siegel said Adobe spent a lot of time with its customers, watching what they were doing on a daily basis. Isolating the workflow issues of a variety of pro users helped the company narrow the focus of where CS4 needed to go.
According to Siegel, Adobe focused on four key challenges across the suite of applications--effective collaboration among designers, developers, specialists within workflows, creative pros and clients; support for mobile devices; the embrace of new media; and tracking, managing, automating and deriving value from assets.
To allow collaboration Adobe now offers a set of online services on Acrobat.com to allow designers and clients to create and share documents. Groups can also use Adobe ConnectNow to meet live over the Web and share a computer screen. This gives designers the opportunity to get immediate feedback and make any necessary changes to a design without sending documents to clients and waiting for them to be returned.
Focusing on the mobile platform, applications across the suite are now more tightly integrated with Adobe Flash, which allows designers to use the SWF format for mobile devices.
Being able to use new media was a major factor for many of the companies Adobe has spoken with since the last Creative Suite release. When building the new video tools, for example, Adobe had to consider that while broadcast revenue is an understandable business model, businesses are now looking beyond it. To that end, Adobe worked to help them understand their assets and how those assets could be used.
CS4 includes an expanded version of Dynamic Link in the Production Premium suite that enables users to move content between After Effects CS4, Adobe Premiere Pro CS4, Soundbooth, and Encore, so updates can be seen instantly without rendering.
Other new tools include Photoshop's Content-Aware Scaling, which automatically recomposes an image as it is resized, preserving vital areas as it adapts to new dimensions.
Adobe made some big changes to sharing and collaboration in CS4. Adobe ConnectNow can be accessed from InDesign CS4, Illustrator CS4, Photoshop CS4 and Photoshop Extended CS4, Flash CS4 Professional, Dreamweaver CS4, Fireworks CS4, and Acrobat 9 Pro, allowing real-time collaboration with two colleagues or clients.
Adobe Kuler, a Web-hosted application for generating color themes, is now accessible from within InDesign CS4, Illustrator CS4, Photoshop CS4, Photoshop Extended CS4, Flash CS4 and Fireworks CS4, and can be shared by designers. Kuler is available for owners of the individual applications or the suites.
Adobe Creative Suite 4 will be available in October. Adobe Creative Suite 4 Design Premium will cost $1799 and includes InDesign, Photoshop Extended, Illustrator, Acrobat, Dreamweaver, Flash Professional and Fireworks; Adobe Creative Suite 4 Web Premium will be $1699 and includes Dreamweaver, Flash Professional, Fireworks, Photoshop Extended, Illustrator, Soundbooth, Acrobat and Contribute; and Adobe Creative Suite 4 Production Premium is $1699 and includes After Effects, Premiere Pro, Photoshop Extended, Illustrator, Flash Professional, Soundbooth, OnLocation and Encore.
Adobe Creative Suite 4 Master Collection costs $2499 and includes InDesign, Photoshop Extended, Illustrator, Acrobat, Dreamweaver, Flash Professional, Fireworks, Contribute, After Effects, Premiere Pro, Soundbooth, OnLocation and Encore.
Adobe will offer tiered upgrade pricing from previous versions depending on the applications you own. Details of the upgrades are available from Adobe's Web site.

End Office and Windows Annoyances

Deal with e-mail hassles, Office file-format woes, and lost passwords--and see what happens when a wolf wants to play.
What do Outlook, Office, and Windows have in common--other than they're from Microsoft? They can sure be annoying. This week I've got a handful of tips for dealing with some of the hassle--like weird attachments, messy e-mail quotes, confounding file formats, and lost Windows passwords.
Eliminate Annoying Winmail.dat Files
A friend asked me about a weird e-mail attachment he constantly receives from one of his friends. "The file's unreadable," he said. "I can't find a program that can view, decode, or convert the thing." He said the file's always the same--winmail.dat.
My friend is using the ancient e-mail tool Eudora and is receiving e-mail from someone using Outlook.
Quick aside: Don't even consider switching to Eudora; it's no longer being sold or even upgraded. There's a new version in the works, but don't hold your breath.
The winmail.dat file is generated by Outlook (or Microsoft Exchange) and is loaded with Rich Text Format code--italics, bold, and font info. So the file appears as an attachment, but it's useless to my buddy.
If you're in the same boat--or the Outlook user--it's easy enough to turn off the option to send Rich Text Formatted e-mail messages. In Outlook 2003, choose Tools, Options, then select the Mail Format tab, click Internet Format, and choose Convert to Plain Text Format from the menu. Click OK, then OK again to save the change.
It may also make sense to stop using Word as the e-mail editor. Do that by selecting Tools, Options, and clicking the Mail Format tab, then deselecting "Use Microsoft Office Word 2003 to edit e-mail messages."
Fix Outlook's Messy E-Mail Quotes
Did you ever notice how well Outlook can mess up e-mail quotes? You know, the way it mangles line breaks and makes your messages look like hell?
For the cost of a download, you can make your Outlook e-mails look sharp and, more important, easy to read. The trick is Outlook-QuoteFix, an add-in that takes the tedium out of fixing Outlook's quoting style.
Outlook-QuoteFix works perfectly in Windows XP and with Outlook 98 through 2003; there's a macro that makes it compatible with Outlook 2007.
Using Office 2003? Download These Converters
I know the hassle: Someone sends you an Office 2007 Word or Excel document. You can't open it because you're old-fashioned (or a cheapskate, like me) and decided to stick with Office 2003.
The fix is a quick download away. Microsoft has a compatibility pack that updates your old version of Office 2000, Office XP, or Office 2003 to let you open, edit, and save Office 2007 file formats (such as .docx).
If you have friends who are older than I am and still have ancient Wordstar, Lotus AmiPro (gosh, remember that program?), or even dBASE II files, there's an official converter pack for you, too.
Reset Your Windows Admin or User Password
You ever forget your login password? Probably not, because it's 1234. (LOL--just kidding. I know you use strong passwords.) Nearly two years ago, I told you how to find it (see "Find Old E-Mail Messages Quickly" and scroll to "I forgot the password"). Unfortunately, that fix would have cost you $70, about a fill-up for your car.
I have terrific news. PC Login Now is a freebie that gives you a fairly straightforward way to reset a Windows administrator or user password. The strategy is to download the ISO, burn it onto a CD, and boot from it. The wizard talks you through the steps. The tool works in Windows Server 2008, 2003, Vista, and XP.
One more thing before you leave: If you need to find the product key for any Office product, read how to do it in "Five Smart Fixes for Dumb PC Annoyances."
This Week's Roundup of Time Wasters
You ever get annoyed when people sling around acronyms and you haven't a clue? Get a clue with Acronym Finder, a handy site that looks up the pesky shorthands, such as WTF, SOL, HIPAA, and, well, you get the idea.
Want to see what you looked like back in 1970? Upload an image to Yearbook Yourself and let the site work its magic. Careful, though: there's music, and if you click on one of the pink diagonal lines, you'll end up at a shopping site.
In this U.S. Geological Survey video taken in Glacier National Park, a solitary wolf tries playing with a momma bear and her two cubs. The bear, needless to say, ain't in the mood for this kind of nonsense.
There's a ton of water coming out of a drain along a highway and it appears as if a sewer cover popped onto the road. Some poor schnook stopped, got out of his car, and, well, watch the video. (I hope the guy kept his mouth closed...)

Preview: Adobe Creative Suite 4 Emphasizes Workflow, Web

Simplified workflow system, increased Flash integration, and community expansion are a few of the new features in Adobe's new software suite.
Adobe Systems today announced the next generation of Adobe Creative Suite products in its biggest software release to date. Among the highlights of Adobe Creative Suite 4 are a simplified workflow system, new tools for integrating Flash animations into projects, and expanded support for community features.
Creative Suite 4 remains a cornucopia of individual applications, most of which are sold independently, too. But with CS4, Adobe tightens the links among the applications--evidence that the company recognizes designers' need to switch seamlessly between programs without leaving a project. Adobe addresses this need with a newly simplified workflow, which will permit users to design across media more efficiently and to complete common tasks more easily.
For example, an expanded version of Dynamic Link in CS4 Production Premium will enable designers to move content between After Effects, Premiere Pro, Soundbooth, and Encore, and to see their updates instantly without rendering.
Illustrator CS4 now lets designers work from multiple artboards at once. Designers can pull up artboards from within InDesign, and can then drag and drop elements from those boards into page layouts. This tool has promise: It can be especially helpful for coordinating illustration and text for books or pamphlets, for example.
InDesign CS4 also includes a Live Preflight tool to help designers catch production errors, and a customizable Links panel for placing files.
The new Content-Aware Scaling tool in Photoshop automatically recomposes an image as it is resized, preserving vital areas as the image conforms to new dimensions. Photoshop CS4 also will have 3D capabilities, which will enable you to take a 2D image and wrap it around a 3D wireframe, such as a sphere.
With this latest iteration, Adobe promises a more intuitive process for creating what it calls "life-like" animations in Flash CS4. The new version has the ability to apply tweens to objects rather than to keyframes, giving designers greater control over animation attributes. And the Bones tool helps create more realistic animations between linked objects. These new Flash features allow designers to directly produce animations.
A big addition to Creative Suite 4 is its collaboration tools. Adobe ConnectNow, accessible from most applications in the suite, allows up to three people to collaborate in real time. This feature should be useful to designers who want hands-on input from their clients. In addition, designers can share color harmonies via Adobe Kuler, have technical questions answered in Adobe Community Help, and access a media and tutorial library through Resource Central.
Adobe is offering prospective users a choice of six suites or full-version upgrades of 13 stand-alone applications, including Photoshop, Photoshop Extended, InDesign, Illustrator, Flash Professional, Dreamweaver, After Effects, and Premiere Pro.
The estimated street price of Adobe Creative Suite 4 Design Premium is $1799. Other suite versions that will be available are Adobe Creative Suite 4 Web Premium ($1699); Adobe Creative Suite 4 Production Premium ($1699); and Adobe Creative Suite 4 Master Collection ($2499). Adobe will offer tiered upgrade pricing from previous versions.
Last May, customers got a preview of the next-generation versions of Dreamweaver, Fireworks, and Soundbooth, when Adobe released beta versions for download on its Adobe Labs site. CS4 will reach the market in mid-October 2008.

Oracle Mum on 11g Release 2, 11g Express Edition

Oracle has filled the schedule of its OpenWorld conference with sessions hyping the various features in its 11g database, which was launched in July 2007. But the company doesn't plan to deliver new details of the anticipated 11g Release 2 (R2), or a ship date for 11g Express Edition (XE), the free version popular among developers.
The vendor is doing some recruiting this week for the 11g R2 beta test, but otherwise has no plans to make any announcements, said Andrew Mendelsohn, senior vice president of server technologies, in an interview following a keynote address Monday.
"We're going to do an 11g XE but it will be sometime after the 11g R2 time frame," added Mendelsohn, who oversees the development of Oracle's database. He declined to provide firm dates for either release.
Instead, Oracle seems ready to make news at OpenWorld around complementary technologies that add performance to the core database. Particular attention and speculation centers on CEO Larry Ellison's keynote address scheduled for Wednesday, which is titled "Extreme. Performance."
During his own talk Monday, Mendelsohn urged showgoers to watch that speech.
As for Mendelsohn's remarks, they seemed geared more toward getting customers to adopt the initial release of 11g, focusing on topics such as its new features and the possible upgrade paths.
Oracle has not released hard adoption numbers for 11g -- which according to some estimates is seeing slow uptake -- but has said the software has been downloaded more than 450,000 times, and that adoption is "on pace" with 10g.
In addition, on Monday the company issued a press release stating that customers from "across all industries" have upgraded. Named customers in the release include Eli Lilly and Novartis.
Meanwhile, 11g R2 does appear to be moving toward completion.
Ian Abramson, a Toronto-based data warehousing consultant and president of the Independent Oracle Users Group, said his organization is scheduled to be briefed on the 11g R2 beta program under a nondisclosure agreement Tuesday.
"I think we'll understand their timelines a lot better [after the briefing]," Abramson said. "We already have a number of great volunteers who are ready to [join the beta program]."

Accused of Tolerating Scammers, an ISP Goes Dark

The lifeline linking notorious service provider Intercage to the rest of the Internet has been severed.
Intercage, which has also done business under the name Atrivo, was knocked offline late Saturday night when the last upstream provider connecting it to the Internet's backbone, Pacific Internet Exchange, terminated Intercage's service.
Intercage president Emil Kacperski said Pacific did not tell him why his company had been knocked offline, but he believes it was in response to pressure from Spamhaus, a volunteer-run antispam group, which has been highly critical of Intercage's business practices. A spokesman for Pacific could not immediately comment on why the company terminated Intercage's service.
Spamhaus placed Pacific on its Spamhaus Block List on Sept. 12, after it began peering with Intercage, said Spamhaus CIO Richard Cox.
The Spamhaus list of untrusted Internet addresses is used to filter unsolicited e-mail from about 1.5 billion e-mail boxes, so being added to the list would almost certainly have caught Pacific's attention. "Obviously they were feeling the displeasure of the rest of the Internet," Cox said.
According to security researchers, there was a lot to be unhappy about.
Last month, a team of cybercrime experts published a white paper on Intercage, slamming the San Francisco company as a "major hub of cyber crime." The researchers found that 78 percent of the domains and mail servers on Intercage's network were hostile.
Intercage's Kacperski had ignored complaints about illegal activity on the company's network for the past five years and only recently began responding to problems, said Matt Jonkman, an independent researcher who contributed to the white paper. "His network was used for very clearly hostile criminal activity," he said. "I'm not aware of any legitimate customers."
In recent weeks other upstream providers terminated Intercage's service, but Pacific had stepped in at the last minute to keep the company online.
Kacperski said his company had been making efforts to remove bad operators from its network and be more responsive to complaints, but that it was not enough to keep Pacific from ultimately dropping Intercage.
Spamhaus reports more than 350 cybercrime hosting incidents on the Intercage network over the past three years.
After years of complaints, the Internet community did something that law enforcement had been unable to do: knock Intercage offline, according to Paul Ferguson, an advanced threats researcher with security vendor Trend Micro.
"This was just a situation ... that apparently went on for too long without being properly addressed," Ferguson said via e-mail. "The community seems to take upon itself the necessary actions to purge these sorts of issues when all other efforts fail."
Kacperski said Monday he was looking for a new service provider, but that he had no idea how long it would take him to get back online.
"I've got to basically start all over," he said.

Bills Take Aim at Online Sale of Stolen Goods

Three bills focusing on the sale of stolen goods online would impose new obligations on Web auction sites while not putting enough responsibility on brick-and-mortar stores to protect themselves against shoplifting, two e-commerce representatives said Monday.
The National Retail Federation (NRF), in pushing for the legislation, is trying to blame the Internet for shoplifting, when the problem existed before the Internet, said Steve DelBianco, executive director for NetChoice, an e-commerce trade group.
"That's like saying the back seats of cars cause teenage sex," DelBianco said at a hearing of the U.S. House of Representatives Judiciary Committee's Subcommittee on Crime, Terrorism, and Homeland Security.
But Joseph LaRocca, the NRF's vice president of loss prevention, told lawmakers that the Internet encourages shoplifting. Organized retail crime rings are able to steal products, such as prescription medicine and baby formula, from brick-and-mortar stores and anonymously sell them online, often without regard for the health and safety of consumers, he said.
Some estimates suggest that shoplifting costs U.S. retailers US$30 billion a year, and the sale of stolen goods through online marketplaces is a growing problem, LaRocca said.
"People have quickly learned that the Internet presents a low-risk way to sell stolen goods," he added. "More disturbing, however, is that the Internet seems to be contributing to the creation of a brand-new retail thief -- people who have never stolen before but are lured by the convenience and anonymity of the Internet."
The subcommittee hearing Monday focused on three bills, the E-fencing Enforcement Act, the Organized Retail Crime Act and the Combating Organized Retail Crime Act.
DelBianco and eBay senior regulatory counsel Edward Torpoco focused much of their criticism on the E-fencing Enforcement Act, which would require e-commerce marketplace sites to keep records of high-volume sellers and take down listings for goods when given evidence by retailers that the goods are stolen.
They also raised concerns about the Organized Retail Crime Act, which focuses partly on creating penalties for organized retail crime by defining what it is. The bill would also require online marketplaces to "expeditiously investigate" reports of stolen goods and to maintain records of high-volume sellers. Both bills would allow retailers to file civil lawsuits against the operators of online marketplaces that offer stolen goods for sale.
The bills would force online sites to take down products when retailers demand it, without any involvement of law enforcement, DelBianco said. The bills would give retailers their own law enforcement arms, he said.
The three bills are unlikely to pass this year because Congress winds up its work for the year within weeks. However, the House hearing could help build momentum for similar legislation to be introduced in 2009.
EBay has about 2,000 employees who investigate reports of stolen goods, and it has offered to work more closely with retailers in shoplifting investigations, but several major retailers have declined that offer, Torpoco said.
Retailers are concerned about working with eBay on a shoplifting program designed by the online auction site, but would welcome cooperation on their terms, said Frank Muscato, a retail crime investigator with Walgreens. "By giving them the information, we're giving the case away," Muscato said. "We have no control in what they do with it. We're not going to reach out and give them that information without some kind of guarantee."
Torpoco said he's "stunned" that retailers would refuse to work with eBay. The online auction site will help retailers get reluctant law enforcement officials involved, he said.
"If you've got evidence, send it to eBay; we'll do the right thing," he said. "I'm certainly surprised to hear that a retailer would not join with eBay's efforts to prosecute an individual out of concern over losing control. This issue is serious enough that we ought to put aside such irrational fears."
One lawmaker questioned whether retailers were willing to spend money for theft controls on low-cost, often-shoplifted products such as razors and baby formula. There's no real way to track and identify the rightful owner of small items after they've been shoplifted, witnesses said.
"My question is, how do you say eBay ought to do more, when eBay turns around and says you guys ought to do more?" said Representative Daniel Lungren, a California Republican. "What I'm hearing is, 'It is an acceptable level of loss that we take because it would be too expensive for us to go further.'"

Google Extends Book Search to E-tail Partners

Seeking greater visibility for its Book Search engine, Google is making it easier for retailers, libraries and publishers to provide that service's capabilities on their own Web sites.
Via free code snippets and APIs (application programming interfaces), Google is allowing Web sites to display Book Search previews, most of which will allow potential buyers to browse up to 20 percent of a book's contents, Google announced on Monday.
Some retailers are already taking advantage of the program, including Books-A-Million, Blackwell Bookshop in the U.K., A1Books in India, Libreria Norma in Colombia, Van Stockum in the Netherlands and Livraria Cultura in Brazil.
Others planning to incorporate the Book Search functionality include Borders, Buy.com and Powell's Books. Several libraries, publishers and social book sites are also on board. Although aimed primarily at libraries, book retailers and publishers, the tools will be available to anyone who publishes a Web site.
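For a sense of what such an integration might look like, here is a hedged sketch of how a retailer's page could check preview availability for a title and add a preview link. It assumes the general shape of Google's public book-viewability endpoint of the period (a JSONP request keyed by ISBN); the field names, the GBS_callback name, and the ISBN are illustrative assumptions, not an exact transcription of Google's documentation.

```typescript
// Hedged sketch: ask Google whether a Book Search preview exists for an ISBN
// and, if so, add a "Preview" link to the page. The endpoint shape and the
// response fields are assumptions based on Google's public viewability API
// of the time and may differ from the production interface.

interface ViewabilityInfo {
  bib_key: string;
  preview: "noview" | "partial" | "full"; // assumed availability values
  preview_url?: string;                    // where the preview can be opened
  thumbnail_url?: string;
}

// JSONP callback invoked by the script tag below; the name is our choice.
(window as any).GBS_callback = (results: Record<string, ViewabilityInfo>) => {
  for (const key of Object.keys(results)) {
    const info = results[key];
    if (info.preview !== "noview" && info.preview_url) {
      // e.g. show a "Preview this book" link next to the product listing.
      const link = document.createElement("a");
      link.href = info.preview_url;
      link.textContent = `Preview ${key}`;
      document.body.appendChild(link);
    }
  }
};

function requestViewability(isbn: string): void {
  const script = document.createElement("script");
  script.src =
    "https://books.google.com/books?jscmd=viewapi" +
    `&bibkeys=ISBN:${isbn}&callback=GBS_callback`;
  document.body.appendChild(script);
}

requestViewability("0123456789"); // placeholder ISBN
```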
Google has been scanning millions of books and making their contents available for searching through contractual agreements with publishers. It is these books that will provide most of the previews appearing on third-party sites.
Google also does wholesale scanning of some big library collections, including books that are still in copyright as well as books in the public domain. Some of those public-domain books will also appear in the external site previews.
It's not surprising to see Google stay away from surfacing results obtained by scanning in-copyright books from library collections, since the practice has landed Google in court for alleged copyright infringement. When scanning library books, Google doesn't always seek the approval of copyright owners.
"What you'll see in the [Book Search] preview function are primarily books that are under contract with publishers because that's the nature of this program," said Tom Turvey, director of Google Book Search partnerships, in an interview.
Amazon.com isn't participating because the Book Search program provides functionality that Amazon.com already offers via its own Search Inside This Book capability, Turvey said.
"It's the retailers that don't want to invest millions of dollars to scan and host the books and so on that are the primary beneficiaries, as well as publishers," he said.
Amazon.com didn't immediately reply to a request for comment, but it's a safe bet that it's not happy with Google's move, which will allow Amazon.com competitors to rival its inside-the-book search.
Google is providing the Book Search tools and functionality free, and isn't generating any money from book sales commissions nor advertising, Turvey said. The main benefits for Google are broader exposure of Book Search and a strengthening of its book publisher partnerships, he said.
For book lovers, the benefit is having access at non-Google sites to the Book Search functionality, including being able to browse some of the book and search inside it.
Google provided more limited Book Search functionality via a previous API, but the new one offers external Web sites more and richer functionality and is simpler to implement, according to Turvey.
"Formerly we had publishers that put the [Book Search] preview on their site on a one-off basis. This goes back 12 to 18 months. Now, since we've automated the whole preview process with the APIs, that creates a real industrial strength vehicle," Turvey said. "We can add features and functionalities to the API and it can affect everyone simultaneously."
One feature in Book Search that won't carry over to external sites is the list of suggested retail sites where a particular title can be bought. The last thing a retailer wants to do is provide links to competing stores, he noted.

Monday, September 22, 2008

Some 'Cyberloafing' is OK, Study Says

Employees feel that 'cyberloafing'--the non-work-related use of their workplace computer--is acceptable and helps them work better.
This is according to a study by Associate Professor Vivien K.G. Lim and Don J.Q. Chen of the NUS Business School at the National University of Singapore. A total of 191 completed surveys were collected, yielding a response rate of 32 percent. Men made up 34 percent of the respondents.
The study, 'Cyberloafing at the workplace: Gain or drain on work?', found that, on average, employees in Singapore spend about 51 minutes per workday on cyberloafing--roughly four and a quarter hours over a five-day week. That compares with the 10 hours per employee per week found by earlier studies, such as the US WebSense.com study.
Personal e-mailing, instant messaging, and visiting news websites were the commonly cited cyberloafing activities, noted the NUS researchers.
In general, respondents to the survey felt that some form of cyberloafing at work was acceptable. They also perceived cyberloafing to have a positive impact on work.
"Interestingly, findings suggested that browsing activities have a positive impact on employees' work engagement while emailing activities have a negative impact," the authors noted.
Gender Divide in Attitudes
The survey findings showed that men were more likely to cyberloaf than women.
"Men and women also differed significantly in the amount of time they spent on cyberloafing at the workplace," the authors said. "Men reported spending slightly more than an hour (61 minutes) a day on cyberloafing at work, while women reported that they spent about 46 minutes."
But there was more agreement between the two genders on the acceptability of cyberloafing. When asked whether they felt it was appropriate to use their workplace Internet access for personal purposes during working hours, about 97 percent of men and 85 percent of women reported that it was acceptable for employees to cyberloaf at the workplace.
How Much Cyberloafing Is OK?
One of the questions in the survey was how much cyberloafing at the workplace was acceptable. Respondents felt that cyberloafing at work was permissible insofar as it did not exceed 1 hour and 15 minutes per day.
According to the survey results, about 75 percent of respondents agreed with the statement that 'cyberloafing helps make work more interesting', and 57 percent reported that engaging in cyberloafing helps them deal with practical and personal issues. In addition, 52 percent of respondents agreed with the statement that 'cyberloafing makes them a better and more interesting worker' and 49 percent indicated that cyberloafing helps them deal with problems they encounter at work.
Based on the findings of the study, the authors have this piece of advice for companies: "Browsing activities allow for some relief at work and may motivate employees to perform better. Thus, in designing workplace Internet policies, companies should allow employees to use the company's internet access for non-work related online activities that have a positive effect on work."

Data Security Gives IT Professionals Insomnia

Fraud is a fact of corporate life today, as the latest Kroll Global Fraud report notes, somewhat ominously, in its opening pages.
The average company's losses to fraud have increased by 22 percent since last year, and the average business lost US$8.2 million to fraud during the past three years (last year's figure was $7.6 million).
Those sobering statistics are from a recent survey of 890 senior executives worldwide, commissioned by risk consultancy Kroll.
So what's keeping executives up at nights, besides the slumping economy and financial crises?
The survey found that information theft, loss or attack is the type of fraud that most worried the respondents, with 25 percent feeling highly vulnerable and 47 percent feeling moderately so. That data shows why: The fastest growing types of fraud are information theft (27 percent; up from 22 percent last year) and regulatory and compliance breaches (25 percent; up from 19 percent). (See "My Company Has Had a Data Breach. What Do I Do?" for tips on how to handle it quickly and effectively.)
What's interesting, however, is that while senior management may say they have deep concerns about fraud, they also may have some blinders on--and they wind up underestimating the exposure their businesses actually face today.
"The survey data suggests that those who know more about technology and how it is used day to day in a company have a greater concern," notes the report.
In fact, employees working below the C-suite who are closer to an organization's technology efforts and systems are over one and a half times more likely than those at the corporate level to see their companies as highly vulnerable (31 percent versus 19 percent), according to the report.
Further bolstering IT's view into possible threats, the survey found that chief technology officers have "opinions closer to those of less senior employees than to those of their C-suite colleagues," states the report. Twenty-five percent of CTOs see their businesses as highly vulnerable, whereas only 18 percent of their C-suite peers do.
"If senior executives are not worried about their vulnerability to information theft, they should check whether their sense of safety is based on a thorough understanding of the security deployed by the company, or ignorance of the full extent of threat," notes the survey report. "In this case, too little knowledge could be a dangerous thing."

Profits, Not Conscience, Drive Green Efforts

Business requirements, not corporate social responsibility goals, are driving green IT developments in the U.K., according to new research.
Organizations with a green IT agenda state that reducing energy consumption (53%) and extending the life cycle of their IT assets (41%) are the most important desired effects of their green IT practices, according to a survey carried out by Forrester among more than 100 delegates at the Computerworld UK-sponsored Green IT 08 conference.
While strategic drivers for green IT lagged behind the more pragmatic, cost-saving agenda, Forrester said "strategic motivations are likely to carry more clout into the future."
The report noted, for example, that "a healthy 34% of respondents are pursuing green IT to 'stay ahead of forthcoming regulations'."
Other factors, such as meeting customer, supplier and employee expectations, and reaching corporate sustainability goals will also become more significant, the analyst group suggested.
Green IT initiatives are currently focused on the data center and desktop infrastructure, the survey showed, with 50% of respondents already implementing a green data center strategy and another 40% exploring the practice.
"The data center is often a first target on the green IT hit list since it is a tightly controlled environment, managed solely by IT -- unlike the distributed desktop arena (which is) strongly influenced by non-IT end users," said Forrester.
Server virtualization, storage consolidation, and green procurement were the main initiatives among delegates to Green IT 08, organized by Tech Touchstone. The research also highlighted the drive for energy audits, with 72% of respondents carrying out energy audits and 66% pursuing centralized power management.
Forrester urged IT organizations to pursue data center efficiencies alongside moves to control desktop power consumption.
"The data center often receives much of the green IT spotlight and current project work, (but) significant environmental and financial value is being left on the table by overlooking the desktop environment. In fact, desktops and related peripherals may consume 40% to 50% of IT's total energy draw," it noted.
The analyst also highlighted growing interest in thin clients, which it said are at least 25% more energy-efficient than their "thicker" PC peers.
"Beyond energy savings, thin-client architectures offer a host of benefits, from improved management, security, and compliance, to enhanced disaster recovery and 'anytime, anywhere' access," the report said.
Forrester also urged IT departments to drive green initiatives beyond the IT infrastructure altogether. "IT infrastructure investments -- such as video conferencing to reduce travel, or enabling double-sided printing to reduce paper waste -- can enable a culture of sustainable business behavior.
"While the benefits of these initiatives will not reduce IT's electricity consumption or extend the life cycle of IT assets, they can have significant environmental and business value across the organization as a whole," it noted.

Did Microsoft Learn its Lessons with Vista?

Why make the same mistakes all over again, when there are so many new mistakes to make?
That thought came to mind a few weeks ago when Microsoft spent some time talking about Windows 7 (the current name for whatever comes after Vista) at a technology conference. From what I saw, it would seem that the company learned little from the Longhorn/Vista launch and is setting out down the same road.
Some 20 months after Microsoft launched Vista, it's still struggling to win the hearts and minds of business and consumer users. The folks in Redmond are happy to tell you how many licenses have shipped, but go elsewhere and you can hear plenty of large enterprise customers -- including iconic Microsoft partner Intel -- publicly discussing how they plan on skipping Vista and sticking with Windows XP.
Given the lukewarm reception that Vista has received, you had to expect Microsoft to try to change the topic. But trotting out the next big operating system so early was not the wisest way to do that. The message this seems to send: "OK, we know you hate Vista, but just hold on because something better will be coming along." The takeaway for corporate IT around the world: There's no need to upgrade to Vista since we can just sit tight with XP and wait for what comes after it.
And Microsoft has put itself in a position where that's the best scenario. Given what was shown of Windows 7, quite a few observers might conclude that now's the time to cut the cord and move to an alternative like Mac OS or Linux. This couldn't have been Microsoft's intention, but you have to wonder what its executives are thinking.
Why, for example, show off features of Windows 7 so early and totally out of context with the rest of the experience that the operating system is going to deliver? Microsoft did the same thing with Longhorn, and I've already discussed how well that turned out. It was a mistake then, and it's a mistake now. This premature display is just a bad idea, since its representatives had to go to great lengths to tell everyone that these aren't final features and might not even make it into the final product. Don't they remember how they ended up cutting many Vista features they had publicly touted, such as the object-based file system promised since the Cairo release?
Of course, making a big deal over Windows 7's Multi-Touch capabilities is a whole new mistake. The first problem is that touch is now closely identified with another well-known operating system, so it makes it appear that Microsoft is once again chasing Apple. (It's not in this case, but that's not the point. I saw Microsoft's Multi-Touch efforts long before Apple showed its Touch technology in public.) Multi-Touch is a technology Microsoft can be proud of, but not many people are going to see that when it's introduced in this way. Save it for the Surface interface and phones, devices that are optimized for the Multi-Touch experience. On a laptop, as shown at this event, it's clunky. Apple, realizing this, has limited its Touch capability to the track pads on laptops. For now, for a platform still designed for a mouse and keyboard, that's a better approach.
If Microsoft keeps this up, the reception that Vista has received will end up as only the first chapter in a once-mighty company's fall.
Michael Gartenberg is vice president of Mobile Strategy at Jupitermedia. His weblog and RSS feed are at mobiledevicestoday.com. Contact him at mgartenberg@optonline.net.

Company Offers VMware Data Recovery Service

Kroll Ontrack, a provider of data recovery and legal technologies products and services, announced it is the first company able to perform data recovery services on VMware systems.
The company said that folders and files on all types of virtual systems can now be recovered with its newly developed proprietary technology, giving businesses a viable resource when data loss occurs in these environments.
With the growing popularity of virtualization, Kroll Ontrack said it was now critical for businesses to cover data loss in their disaster recovery plans to ensure quick recovery and avoid a costly halt to business activity.
"The number of virtual environment jobs has increased ten fold over last year," said Adrian Briscoe, general manager, Asia Pacific, Kroll Ontrack. "When virtual infrastructure data loss occurs, identifying the cause and recovering the data is very complex. This requires deep expertise and extensive knowledge of VMware environments."
Virtualization: optimization
Virtualization is a software technology that is transforming the IT landscape and the computing experience. While most of today's computing hardware was designed to only run one operating system at a time, virtualization helps businesses overcome this limitation by allowing a single system to run several operating systems simultaneously, increasing its usage and flexibility.
With virtualization, users of VMware software can save time, money and resources while optimizing and rationalizing their IT infrastructure. These advantages and savings explain the growing rate of virtualization in business environments.
While there are space-saving, environmental and cost benefits to virtualization and server consolidation, centralizing all data in one place can leave businesses vulnerable to substantial data loss. The rise of virtualized environments, paired with instances of physical or operating system failure during configuration, has led to a significant spike in virtual environment recovery requests.
Disaster recovery plans
Earlier this year, Kroll Ontrack opened its Hong Kong office equipped with a full-service recovery laboratory and cleanroom to provide local customers with expedient and cost-effective data recoveries.

Wednesday, September 17, 2008

IBM to Work With Taiwan Researcher on Racetrack Memory, Cell

IBM and a key Taiwanese research group agreed Wednesday to further develop racetrack memory chips and to find more product areas for Cell processors.
Racetrack memory chips are a new technology developed by IBM to replace today's hard disk drives (HDDs) and NAND flash memory to store songs, photos and other data in products ranging from iPods and iPhones to PCs.
A joint development team led by Stuart Parkin of IBM, who came up with the idea for racetrack memory, and the vice president of Taiwan's Industrial Technology Research Institute (ITRI), Ian Chan, will study new materials and structures that can be used to further develop the memory chips.
IBM's desire to work with the Taiwanese group shows how serious the company is about developing and selling products related to racetrack memory and Cell technology. Taiwanese companies are responsible for the design and manufacture of much of the world's IT hardware, including for companies such as Hewlett-Packard and Dell.
Working with ITRI will help improve the chances racetrack memory chips and Cell processors will be used in new products designed in Taiwan.
IBM has long been lauded for its research efforts but some analysts have criticized the company for not always being able to turn costly research into profitable products. Chip development is especially tricky. Companies often develop powerful chips that ultimately end up in the IT graveyard because nobody can figure out a cost-effective way to mass-produce them.
"Finding cost efficient ways to manufacture IT products is one of Taiwan's strengths," said Lee Chih-kung, executive vice president of ITRI.
ITRI is a publicly funded Taiwanese research center that has a long history of working with Taiwan's IT industry, including developing new industries.
The founder and chairman of Taiwan Semiconductor Manufacturing (TSMC), Morris Chang, formerly ran ITRI and even started his company on ITRI's grounds.
As part of the development effort around racetrack, researchers from IBM will travel to Taiwan a few times each year, officials said. At least one ITRI researcher will spend three years at IBM in the U.S. working on related projects.
IBM has said it may take up to four years to develop working racetrack memory chips that can be commercially manufactured.
The company says racetrack memory runs faster than current storage memory, costs less, uses less power, gives off less heat, can store 100 times more data, and won't wear out no matter how many times data is erased and rewritten. One weakness of NAND flash, for example, is that some varieties can only be written to 10,000 times.
IBM's development work with Sony and Toshiba on the Cell Broadband Engine, a processing chip with nine independent cores on board, is an example of collaborative success for the U.S. giant. The chip and related technology are now found in a range of Japanese products, including Sony's PlayStation 3 game consoles.
ITRI plans to open a Cell product development center at its campus in Hsinchu, the Silicon Valley of Taiwan, so companies from around the island can find ways to use the technology in new products.

No Virtual Bridge From Xeon to AMD, Intel Says

VMware customers are getting a bit more freedom in the way they can transfer virtual machines from one Intel-based server to another, but they shouldn't hold their breath waiting for a bridge between Intel and AMD-based systems, an Intel executive said Tuesday.
With its line of Xeon 7400 processors released this week, Intel is enabling customers using VMware's vMotion technology to move virtual machines between two servers even when they are based on different families of Intel chips.
vMotion is VMware's technology for moving running virtual machines onto a different physical server. It's used by some customers for load balancing or for building fault tolerance into applications.
Before the 7400 series, also known as Dunnington, the two servers had to use the same family of Intel chips for vMotion to work, said Doug Fisher, vice president with Intel's Software Solutions group, at the VMworld conference in Las Vegas. With the 7400 and future chip families, that restriction is lifted.
VMware CEO Paul Maritz mentioned the development in his speech at the start of VMworld Tuesday. "Now you'll be able to buy hardware essentially independent of your vMotion strategy," he said.
The compatibility goes back only to the previous processor family, the 7300 "Tigerton" series, and will extend to the next generation, known as Nehalem. "We'll always give at least three generations of compatibility," Fisher said.
Intel made a big deal about the news, but AMD said its Opteron processors have had a similar capability for years. AMD doesn't change the microarchitecture of its processors as frequently as Intel, so compatibility between different Opteron lines is not an issue, said Margaret Lewis, AMD director of commercial solutions.
Customers looking to move virtual workloads between AMD- and Intel-based servers are out of luck, however, at least for the foreseeable future, according to Fisher.
"It's not going to happen," he said on the sidelines after his speech. The companies' chip architectures, while both x86, are too different and change too frequently to be made compatible. "We'd have to slow the pace of innovation to make it happen," he said.
Lewis suggested it was only Intel, not AMD, that changes its architecture frequently. "We'd need to sit down with Intel and VMware and discuss how to make it happen, and we would welcome that discussion," she said.
AMD would stand to gain the most from such compatibility, since it would give companies one less reason to buy Intel-based servers.
Dunnington is a six-core processor with a larger, 16M byte Level 3 cache to boost performance. VMware CTO Steve Herrod said VMware will keep its per-socket pricing the same for Dunnington, "so customers can get more virtual machines per processor" without paying more in licenses.
It was one of several ways Fisher said Intel is working with silicon to usher in a "second wave" of virtualization. The first wave was using the technology for server consolidation and building virtual environments for software testing, and the second is to use it for load balancing, high availability and disaster recovery.
Citing IDC figures, he said that in 2007 about 12 percent of all servers in production were using virtualization, up from 8 percent in 2006 and 4 percent the year before. Virtualized servers run at 52 percent capacity on average, he said, compared to 10 percent to 15 percent for non-virtualized systems.
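For a rough sense of what those utilization figures imply, the short calculation below divides the quoted averages to estimate how many lightly loaded standalone workloads a single virtualized host could absorb. This is purely illustrative arithmetic, not a figure from IDC or Intel.

    # Rough consolidation ratio implied by the utilization figures cited above:
    # a virtualized host averaging 52% capacity versus standalone servers
    # running at 10% to 15%. Illustrative arithmetic only.

    VIRTUALIZED_UTIL = 0.52

    for standalone_util in (0.10, 0.15):
        ratio = VIRTUALIZED_UTIL / standalone_util
        print(f"{standalone_util:.0%} standalone -> roughly {ratio:.1f} workloads per virtualized host")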
VMworld continues through Thursday.

Regulators Would Reject Samsung Deal for SanDisk

Samsung's US$5.85 billion offer for flash memory chip developer SanDisk will probably be rejected by government regulators fearful that such a tie-up would harm competition, analysts said Tuesday.
With such an acquisition, Samsung would likely gain control of the majority of the global supply of NAND flash memory chips and could squelch potential rivals, said Jim Handy, memory chip analyst at researcher Objective Analysis.
Apple and other major buyers of NAND flash memory would likely find their price negotiating power "severely constrained" if Samsung and SanDisk combine, he added.
Cheng Ming-kai, chip analyst at CLSA Asia-Pacific Markets, said a Samsung/SanDisk alliance would likely be viewed as uncompetitive by the U.S. Justice Department, based on measures the regulator uses to determine market competitiveness.
iPod and iPhone lovers could feel the brunt of any increase in NAND flash memory prices caused by the acquisition because the chips are at the heart of those devices as well as other digital music players and digital cameras, storing songs and other data.
For example, the 16G byte iPod Nano, at US$199, costs $50 more than the 8G byte Nano ($149), according to Apple's Web site, and the only difference between the two devices is the amount of NAND flash memory inside. Similarly, the 16G byte version of Apple's iPhone 3G costs $299 while the 8G byte version is $199, and the main difference is the amount of NAND flash memory.
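As a quick illustration of what those price gaps imply, the sketch below computes the retail premium per extra gigabyte of flash from the Apple list prices quoted above. It is back-of-the-envelope only, since retail pricing reflects far more than the cost of the memory chips themselves.

    # Implied retail premium per extra gigabyte of NAND flash, using only the
    # Apple list prices quoted above. Illustrative only.

    devices = {
        "iPod Nano": {"8GB": 149, "16GB": 199},
        "iPhone 3G": {"8GB": 199, "16GB": 299},
    }

    for name, prices in devices.items():
        premium = prices["16GB"] - prices["8GB"]   # price gap for 8 extra GB
        print(f"{name}: ${premium} for 8 extra GB, or ${premium / 8:.2f} per GB")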
Last year, Samsung and SanDisk together supplied nearly 50 percent of the world's NAND flash memory chips, Handy said, measured in either dollars or gigabytes.
"Objective Analysis is very doubtful that the government would allow such an acquisition to proceed, even in today's dire market," Handy said.
Samsung, already the world's largest producer of NAND flash memory chips, could also increase its production at the expense of SanDisk's current manufacturing partner, Toshiba.
Toshiba and SanDisk have co-invested in NAND flash production lines in Japan and share chip output for their products. It's unclear what would happen to Toshiba in a Samsung deal for SanDisk, but Handy speculates that the Japanese company may be pushed aside as Samsung produces all the chips needed for a combined Samsung/SanDisk on its own.
Toshiba representatives declined to immediately comment on Samsung's offer for SanDisk.
CLSA's Cheng said Toshiba will likely see how Samsung's offer for SanDisk unfolds before making any comments.
SanDisk has already rejected the offer as too low.
Samsung first approached SanDisk about a deal in May, and indicated it might be willing to pay a "significant premium to the SanDisk $28.75 per share closing price on May 22, 2008," SanDisk said in a statement.
The $26 per share offer Samsung made on Tuesday is lower than the May indication and 55 percent below SanDisk's 52-week stock market high. SanDisk shares ended regular trading Tuesday at US$15.04 on the NASDAQ, up 4.4 percent on talk of an offer from Samsung and a possible rival bid from Toshiba.
SanDisk shares soared after Samsung made its bid public, rising $7.89, or 52.5 percent, in after-hours trading to $22.93 per share.

Friday, September 12, 2008

Get Ready for Mobile Social Networks

Mobile social networking is a small part of the way people use their cell phones, but industry officials expect that use will grow, and not just for teenagers who want to text their friends or send short video clips.
Workers will also adopt mobile social networking, analysts and network providers say, following the way social networking sites such as Facebook have begun to spread within work groups that rely on desktop computers. These experts also expect affinity groups, such as doctors, engineers, lawyers or even baseball fans, to link up over wireless devices.
Mobile social networking makes sense because mobile devices are personal and they are taken everywhere, offering the potential for transmission of quick ideas or images. Mobile social networks will (and some already do) put video, GPS, text, voice and collaboration into the palm of a user's hand.
For example, a business traveler at a conference in an unfamiliar city could be walking past an appealing restaurant. Using mapping and location technologies, the traveler could almost instantly send a quick note to 10 friends in her work group to "meet here in 15 minutes for a meal." Or the hungry traveler could record video of herself standing in front of the restaurant and send the video clip along with the message so the work group friends would know what kind of restaurant to expect.
The future of mobile social networks became a major topic of discussion in seminars and forums at the CTIA trade show this week. Device manufacturers, network operators and social network providers debated how the services will be paid for and by whom, and what steps must be taken to protect user privacy and safety.
Mobile social networks have not been widely adopted in the U.S., where between 5% and 10% of mobile users are participating, said Karsten Weide, an IDC analyst who spoke on a panel about the trend. But Weide said the number of users could easily double in a year, given the amount of interest in the concept among so many industry players. Adding to the reasons for optimism, prominent vendors, including Verizon Wireless and Nokia Corp., announced a variety of tools at CTIA to help users aggregate social networks into a single interface.
Still, there are limitations, Weide said, including the difficulty of using a cell phone or smartphone interface to find friends in a social network, to attach information and to send messages. "Even the iPhone interface, as good as it is, isn't ideal for so much navigating," Weide said in an interview.
Perhaps the biggest concern is how social networking sites such as Facebook, Twitter and MySpace will raise revenue while making their applications work on all kinds of cell phones across a variety of networks.
Weide said he had talked to executives at two major social networking companies, which he would not identify, who expressed concerns over how they would raise revenue and how much revenue wireless carriers would want to share. In addition to sharing revenues, social network providers have to figure out how much of a customer's personal information to share with carriers, and vice versa.
Another panel discussion included five industry officials who cast doubt on whether mobile social networks can successfully be supported with revenues from advertising seen by end users. If advertising doesn't support the concept, then carriers and social network providers will probably have to rely on subscription fees, the panel members said. Questions still remain over how much a user would be willing to pay for a subscription, since that fee might be on top of the cost of a user's unlimited monthly data plan.
Hitting the Right Note with Consumers
"Advertising is challenged" as a way to support mobile social networking, said Paul Rehrig, vice president of business development for Warner Music Group. However, he said that many companies, including those selling music, are likely to have more success selling certain content to mobile device users than has been sold to desktop computer users.
With music, there is great potential for sharing tunes and reviews through mobile social networks, said Mari Joller, director of new services for Virgin Mobile. She sees strong prospects for mobile social networking overall, given the way young wireless users are already flocking to text messaging and relying on social networks from their desktops.
For that group, "we've found that people are terrified of being out of touch," which means a mobile social connection will be ideal, Joller said.
Still, Joller and others said carriers and social networking sites will have to be vigilant to protect privacy and safety, especially of minors who could be the subject of stalkers or who don't want to communicate their location to a former friend. Controls can be built in and should be, she said.
"We have to have controls that say 'Are you sure you want to share your location,' " Joller said. "We have real life examples of somebody stalking" through social networks. "It's going to take education with growth in the market ... With the youth demographic, there could be some scary situations on our hands."
Weide said mobile social networks "will be a non-starter" if all the parties don't put in plenty of privacy and security controls upfront, especially for young users.
Mobile social networking could lead to many uses that are only now being imagined, analysts said.
Benjamin Mosse, director of mobile products at the Associated Press, said groups of friends could share news stories and offer comments within their group, instead of posting comments directly to a story. And down the road, he said the AP itself could use GPS technology to track what stories are being read in certain cities, zip codes or buildings.
"I would like to know what the editors inside of AP are reading," he said. The AP last week announced it is making content from its 700 news organization members available on Research in Motion Ltd. BlackBerry devices, and announced this summer it is offering the same service via iPhone.

DataSentinel -- a Backup Service With Issues

To be truly effective, a backup application must let you easily choose what you back up, simplify recovery and not slow down your work. So I was looking forward to evaluating DataSentinel, a combination of hardware, software and storage service.
The hardware -- a 512MB thumb drive -- comes preloaded with the backup software. The software loads automatically (and prompts you for a password) when you insert the thumb drive in any USB slot. The files you select are stored on DataSentinel's servers, for which you pay a fee, beginning at $5 a month for the first 5GB.
The idea behind DataSentinel is a good one. On the plus side, installation is simple. You insert the thumb drive in a USB port and the program asks you to verify your user ID and password. During installation, your password generates a 136-character "Personal Encryption Code" that you must keep safe. Should you lose your thumb drive, the code is required to recover your data, though the hard copy's black characters against a green background didn't produce a clear printout on my HP LaserJet monochrome printer.
After you've installed the drive, the next step is to select the folders and/or files you want stored on DataSentinel's servers by checking boxes in a tree view (see Figure 1). You can also check boxes to exclude files of a particular type (for example, audio files such as MP3 and WAV files), but you have no control over the file extensions for each type. You can also exclude files larger than a user-specified size.
Once you've made your file/folder selections (which you can change at any time), the program goes to work, backing up your files in the background. This is where it's clear how DataSentinel differs from other backup options -- when it works, that is. To prevent your system from grinding to a halt (a problem with most backup software), DataSentinel backs up at approximately 1MB to 5MB per minute, a speed that's slow enough not to interfere with foreground operations.
When I ran DataSentinel, backups ran at 3MB to 4MB per minute -- a speed that varied depending on the ISP used. A progress screen (see Figure 2) shows the status of the backup, but I found that if I'd selected a large number of files, it often closed before it could calculate the percent complete. At other times, it reported the backup was 157% complete.
There's a downside to the slow speed. Depending on how many files and folders you initially select, your first backup may take many, many hours to complete. Furthermore, if you make minor changes to large files (a graphics-rich PowerPoint presentation, for example) and want to shut down your system right away, you won't have time to protect your files. Fortunately, any file not backed up before you turn off your computer will be synched to the server when you next boot up and reinsert the DataSentinel device. (From insertion of the thumb drive until the software opens the file view takes about 90 seconds.)
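To put those transfer rates in perspective, here is a quick back-of-the-envelope calculation; the 20GB backup set is a hypothetical figure chosen only to illustrate the math, not one taken from my testing.

    # Rough estimate of how long a throttled initial backup takes at the
    # 1MB-5MB per minute range DataSentinel targets (3MB-4MB per minute in my
    # tests). The 20GB backup-set size is a hypothetical example.

    BACKUP_SET_MB = 20 * 1024   # assumed 20GB initial selection

    for rate_mb_per_min in (1, 3, 5):
        hours = BACKUP_SET_MB / rate_mb_per_min / 60
        print(f"At {rate_mb_per_min}MB/min: about {hours:,.0f} hours ({hours / 24:.1f} days)")

Even at the top of that range, a 20GB first pass would take nearly three days of continuous uptime, which is why the initial selection deserves some thought.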
Once the initial files I selected were backed up, DataSentinel's performance varied. For example, as I worked on this review, I saved the 75K Word file frequently to a folder on my hard drive I'd marked for automatic backup. Each time I saved the file, DataSentinel recognized the update within 3 seconds and backed it up in less than 2 seconds, so I was never more than 5 seconds away from a backup copy on the DataSentinel server. That's peace of mind. DataSentinel took roughly 15 seconds to spring into action when I saved a new file to the same folder for the first time. Unfortunately, a 300MB video file was never successfully backed up. The file remained on my hard drive, of course; it was just never completely copied to the service. The 10MB that did make it through to DataSentinel's servers stayed there, and the file list shows a size of 10MB for the file -- but unless you knew it was a 300MB file, you would have no way of telling that the copy on the server was incomplete. In other cases, when an individual file hadn't been backed up at all, the "size" column remained at 0, which is logical.
If you need to save a file immediately, DataSentinel provides a "Private" folder. Drag a file to this folder and the file is copied as quickly as possible -- sometimes. This option worked fine when saving several 6MB audio files, but a 350MB video file never made it to the Private folder, and the status bar inexplicably disappeared after just 3% of the job was complete. I had to stop and restart the backup service so DataSentinel could "wake up" and do a bit more of the backup each time. I gave up after repeating that process 10 times, with the file transfer still incomplete.
Awkward Data Recovery
If you delete a file from a folder on your hard drive, the file is kept on DataSentinel's server. To permanently delete an archived file from the server, you must click on the Pencil icon in the file list; DataSentinel displays these deleted files with a light-yellow background (see Figure 3). However, I found that right-clicking and choosing the delete command didn't work consistently. For example, I tried to delete an empty folder but was never successful, and using the command often froze the software. Because the service is priced on the number of gigabytes you store, successful deletion is important -- you may want to regularly review what's on the server so you don't pay for storing obsolete files.
To restore a file to your hard drive, you're supposed to double-click the file to open it in the associated application, then save the file to your hard drive. That's a simple procedure, but it's limited to data files that are associated with a program. For other files (configuration files or drivers, for example), there's no documented procedure, though I found that selecting the files, copying them and pasting them to my hard drive worked. Unfortunately, there's no drag-and-drop support to easily restore files -- a major drawback.
One feature I particularly like is the ability to plug the DataSentinel thumb drive into any PC and back up files for that system. With this feature, you can back up a file from one computer and restore it to another.
Quirks and Caveats
While day-to-day backup performance of files in selected folders worked well, the program's interface took time to understand. Clicking on an icon of an elephant to select files wasn't immediately intuitive, for example, and the "Synchronization" screen that tells you how far along DataSentinel has progressed often closed before it had calculated the percent complete; there was no way to "pause" the application so the window didn't close. Furthermore, even after I de-selected all files and subfolders in a folder (so nothing from the folder would be copied to the DataSentinel server), the program insisted on marking the folder as one containing files to be backed up -- which was simply wrong. It took several hours until I felt comfortable with how the program worked, but even then I kept running into problems. The main window was sometimes slow to respond to mouse clicks or file selections. The program froze more than once, and on more than one test machine, adding to my frustration.
When you open a file on the server directly (by double-clicking on it) and save it from an application, DataSentinel automatically assigns a version number to the file and keeps 10 versions of the file; you can't turn versioning off nor set the number of versions you want to keep, so directly editing large files can quickly grow your storage allotment. No versioning is provided for files automatically backed up from your hard drive.
The program had some other, albeit minor, quirks. For example, to end the program completely you must pull out the thumb drive -- there's no "File/Exit" command.
Perhaps future users will have an easier time once the company's promised documentation is complete, its performance improves and backup of large files can be relied on. Until then, I cannot recommend this hardware/software/service combo.

SAP Certs Boosting Pay, but Others Still Falling

IT professionals looking to cash in on hard-earned certifications may be surprised to learn that many IT skills continue to lose value as others experience increases in pay.
Foote Partners, a research firm focusing on pay and compensation for hundreds of certified and noncertified IT skills, reported this week that while pay for SAP skills rose 25% to 30% over the past 12 months, pay for other certified skills declined during the same period. Each quarter, Foote Partners tracks some 331 IT skills and the pay premiums earned by 22,000 IT professionals in the United States and Canada.
David Foote, co-founder, CEO and chief research officer, states in the report that "by now everybody has heard that demand for certifications by IT managers has softened considerably. In fact, our IT skills and Certifications Pay Index has found eight straight quarters of consistently decreasing pay for the 165 certifications we survey."
For instance, the IT certifications with some of the largest market value declines over the past 12 months included Microsoft Certified Professional+Internet (MCP+I), with a 40% decrease. Pay for IBM Certified Advanced Application Developer -- Lotus Notes/Domino, Novell Certified Internet Professional (CIP) and Novell Certified Novell Engineer (CNE) each shrank by 25%. LAN Server Engineer (LSE) certifications dropped more than 33% in terms of pay, while Oracle Forms Developer Certified Professional (OCP) certifications earned more than 18% less. And IT professionals with Cisco Certified Design Associate (CCDA) certifications saw a 14% decrease in pay.
Among the noncertified skills Foote Partners tracks, Microsoft Exchange skills pay saw a 25% decrease over the last 12 months. IT professionals with WebSphere skills experienced a more than 16% drop in pay for those skills, while Linux professionals earned about 14% less in the past 12 months. Pay for expertise in HP-UX dropped 12.5%, and RFID talent received about 10% less over the previous 12 months.
Aside from SAP skills, which have seen pay increases of as much as 57% in some cases over the past 12 months, Foote noted several skills that have yet to experience pay declines. "The exception has been a selection of security, networking, systems and database certifications, plus a few in the architecture and project management areas that are showing solid pay growth numbers," he stated in the report.
For instance, noncertified wireless network management skills pay increased by more than 33% over the past 12 months and noncertified network security management pay grew by more than 36%. Pay for certified GIAC Security Experts (GSE) grew more than 36% and Certified Information Security Manager (CISM) pay increased by 20%. Cisco Certified Network Professionals (CCNP) and Microsoft Certified Systems Administrator: Messaging (MCSA:Messaging) pay each increased by more than 14%.
Separately, Goldman Sachs this week released its quarterly IT Spending Survey, which found IT budgets being cut further and staffing among the areas targeted. Some 15% of the 100 managers surveyed identified internal staffing as the area of the IT organization with the greatest potential for cost reduction, compared with 9% in June, 12% in April and 9% in February.
"We see a greater focus on potential internal staffing cuts as an incremental negative for the environment," the research firm wrote in its report.

Get Started With Virtual Machines

It's great to have multiple computers. On the first of them, you can install a database or crunch spreadsheets. On another, you can simply browse the Web, listen to music, and check your e-mail. Yet another can have a supercharged configuration for playing games. Sure, you could have all of your programs on the same, single computer, but some applications--such as games--can't run concurrently with other programs.
Many businesses have a different problem: They need to use applications that will run only on a specific operating system, be it Windows Vista, XP, or 2000, or maybe Mac OS X or Linux. So they maintain computers running different OSs around the office.

Using multiple PCs has several drawbacks, of course, such as the amount of money they cost and the power and space they consume. You can always install a second operating system to dual-boot with your existing OS, but dual-booting requires repartitioning the hard disk, as well as shutting down one operating system and its applications in order to use the other.
Thanks to today's faster processors and capacious memory and disk space, however, you can get many of the benefits of using multiple, separate computers by adding virtualization software to a single PC. Such software lets you install multiple OSs into virtual hard disks that are really just files on your main hard drive. You can then launch and run multiple guest operating systems simultaneously. The software redirects access to key hardware devices on the host system, including network adapters and optical drives. Although the virtualized operating systems themselves are not always free, several excellent virtualization utilities don't cost a cent.
VMware's VMware Server and Sun Microsystems' VirtualBox come in multiple versions that run under either Linux or Windows, allowing you to host either Linux or Windows guests. Microsoft's Virtual PC 2007, not surprisingly, runs only under Windows -- but it does permit you to run Linux as a guest operating system.
Other virtualization tools abound (notably, the Mac-based VMware Fusion and the multiplatform Parallels Desktop), but for this story I'm going to focus on three excellent, free virtualization apps as I guide you through the process of installing and configuring multiple OSs on your Windows PC. You can run just about any OS (except for Apple's, which is restricted to Mac hardware) in your virtual machines, and you can run the virtualization software on many different host operating systems. Virtualization is all about expanding your options.
Though you'll see the best performance with fast processors, multiple gigabytes of memory, and large virtual hard disks, some virtualization tasks (including all of those described here) can run on systems with 512MB of memory or less. If virtualization doesn't meet your needs for whatever reason, or if you just aren't satisfied with a particular virtualization app, simply uninstall the software and delete the virtual hard-disk file to return your PC to its previous state.
Scenario 1: Host Another Version of Windows Under Windows, Using Virtual PC 2007
If Windows XP floats your boat but you've sailed onward to Vista, an occasion might arise when you need to reverse course and return to the earlier Windows version. The ability to do so is especially handy if you're having trouble getting a favorite XP application to behave in Vista. It's also useful if you want to run even older versions of the OS, such as Windows 2000.
Microsoft's Virtual PC 2007 is the perfect fit for this virtualization scenario. According to Microsoft, the main host operating system versions supported are Windows Vista, Server, and XP Professional (32-bit and 64-bit versions), though I installed the app successfully in Windows XP Home Edition with just a brief warning that it was not supported.
When you launch Virtual PC 2007 for the first time, it displays a console listing your Virtual Machine components (which will be empty at first) and a New Virtual Machine Wizard. Click Next to start the Wizard, and click Next again, thereby choosing the default option to create a virtual machine. The wizard then asks you to confirm the amount of memory and disk space to dedicate to the virtual machine. I accepted the default memory size, but adjusted the amount of space dedicated to the virtual machine's file-based virtual hard disk downward to avoid eating up all of the free space on my host hard disk.
How much memory you devote to your virtual machine will determine how quickly it performs, but remember that any RAM you give to the virtual PC will come at the expense of its host system. If you have 2GB or more memory on your host PC, consider giving a virtual XP machine 512MB, which will ensure reasonably fast performance.
Click the final Next, then Finish, and your virtual machine will appear in the Virtual PC Console. Insert an operating system installation CD and double-click the virtual-machine icon in the Virtual PC console to start the boot process. You may need to select the CD/DVD boot drive in the Virtual PC's CD menu, choose Action, and then press Ctrl-Alt-Del to make the virtual machine boot from its installation CD or DVD. After that, the installation process should proceed just as it would on a non-virtual PC. You can install as many different virtual OSs as your hard-disk space will allow.
Virtual PC 2007 is pretty simple to use, because it has only a few options. To boot up an installed virtualized OS, select it in the Virtual PC Console and click Start. To save the current state of the OS (in order to exit Virtual PC or shut down the computer), click on Close, choose Save state from the list of options, and click OK. Clicking the mouse within the virtualized OS window once allows it to "capture," or recognize, the mouse pointer. To release the pointer for use in the host OS, press the right-hand Alt key and drag the mouse out of the Virtual PC window. Choose Action, Full Screen Mode to view the OS in a full screen, and press the right-hand Alt plus Enter to escape to the host OS.
Once your virtual machine is up and running, click Action, Install or Update Virtual Machine Additions in the Virtual PC menu. This will install a variety of tools that allow you to copy and paste text between the virtual machine and the host PC, as well as to send documents back and forth via a shared folder on the host PC.
Scenario 2: Host Linux Under Windows, Using VirtualBox
Sun Microsystems should rename its free, open-source virtualization utility VersatileBox. Not only does it run on Windows, Linux, Mac, and Sun's OpenSolaris, but it also supports an even wider array of guest OSs, including just about any version of Windows.
Let's say you want to start learning to use some Linux tools and applications without dual-booting or repartitioning. Download the Windows version of VirtualBox from Sun's site, and install it. Launch it and click on New to create a new virtual machine.
Click on Next to start the installation process, enter a name for your virtual machine in the dialog box that follows, select an operating system type from the OS Type list, and click Next. Click Next again to accept the default base memory size for your virtual machine (or adjust the slider up or down), and then click Next. Because Linux runs well with minimal system resources, you really needn't allot more than 128MB of RAM to the virtual machine in most cases.
Click New to create and size a virtual hard-disk file; I recommend choosing the dynamically sized option, which will allow the virtual hard-drive size to expand as you fill it with data. Click Finish to create the virtual disk, and then select Next and Finish to complete the virtual-machine creation process.
Now, select your new virtual machine in the Sun xVM VirtualBox console, insert the Linux boot media, and click on Start to begin the boot and installation process. Once everything is installed, launch the virtual OS by selecting it in the Sun xVM VirtualBox window and clicking Start.
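If you would rather script the setup than click through the wizard, VirtualBox also ships a command-line tool, VBoxManage, that can perform roughly the same steps. The sketch below drives it from Python; the VM name, disk file and sizes are illustrative assumptions, and the exact subcommands and flags vary between VirtualBox releases, so check VBoxManage --help on your installed version before relying on it.

    # Minimal sketch of creating a Linux VM with VirtualBox's VBoxManage CLI,
    # driven from Python. VM name, disk path and sizes are hypothetical, and
    # VBoxManage syntax differs between releases -- verify against your version.
    import subprocess

    VM_NAME = "linux-sandbox"        # hypothetical VM name
    DISK = "linux-sandbox.vdi"       # hypothetical virtual-disk file

    def vbox(*args):
        """Run a VBoxManage command, echoing it first, and stop on any error."""
        cmd = ["VBoxManage", *args]
        print(" ".join(cmd))
        subprocess.run(cmd, check=True)

    # Create and register the VM, then give it the modest 128MB of RAM
    # suggested above for a lightweight Linux guest.
    vbox("createvm", "--name", VM_NAME, "--register")
    vbox("modifyvm", VM_NAME, "--memory", "128")

    # Create a dynamically expanding virtual disk (size in MB) and attach it
    # to a storage controller on the new VM.
    vbox("createhd", "--filename", DISK, "--size", "8192")
    vbox("storagectl", VM_NAME, "--name", "SATA", "--add", "sata")
    vbox("storageattach", VM_NAME, "--storagectl", "SATA",
         "--port", "0", "--device", "0", "--type", "hdd", "--medium", DISK)

You would still boot the VM from your Linux installation media afterward, exactly as described above; scripting just makes it easier to recreate or clone the same configuration later.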
Like Virtual PC 2007, VirtualBox runs your guest OS in a window, and grabs the mouse pointer automatically once you click inside it. To release the pointer to the host OS, press the right-hand Ctrl key. To enter full-screen mode, choose Machine, Fullscreen Mode, or press the right-hand Ctrl key and the F key at the same time. To escape full-screen mode, press that key combo again.
Scenario 3: Using Ready-to-Run VMware Appliances With VMware Player
So far we've made the process of installing and configuring a guest operating system using a virtualization utility look easy. But getting a more complicated desktop or server OS working in a virtual machine can be more frustrating, particularly if you're not yet familiar with all of the OS's configuration options. What if you could just plug one in, boom, and see what it looks like?
VMware makes one of the leading free virtualization programs, VMware Server 2.0. For a brief tutorial on how to install VMware Server under Windows, see "Run a 'Guest' OS on Your PC" in the article "12 Great Do-It-Yourself PC Projects."
You don't have to go through all those steps to take an operating system or server application out for a spin, however. VMware's Player allows you to run preinstalled and preconfigured OSs and other software "appliances" as if they were movies or PowerPoint presentations. Player doesn't allow you to make changes to the virtual machine's configuration, but using it is a great way to assess a particular application's features quickly.
VMware hosts hundreds of these applications on its Virtual Appliance Marketplace, ranging in size from a few megabytes (for some of the smaller Linux distributions) to several gigabytes. Though Linux distributions abound--a VMware appliance is a great way to sample a prerelease version of your favorite distro, for example, without giving it free rein over your PC--the majority of appliances are open-source server-based applications, including network backup utilities, content management systems, network security and traffic analyzers, mail servers and spam filters, firewalls, PBX and VOIP servers, and SAN and NAS servers.
To run an appliance, first download and install VMware Player, and then download the virtual appliance you'd like to run (I chose a recent version of the OpenBSD Unix OS). VMware Player's interface offers options for both downloading and launching appliances. Exiting the player saves the appliance's state by default, but there are very few other options. Press Ctrl-G to allow the appliance to capture keyboard and mouse input, and Ctrl-Alt to release input to the host OS.