Why do some people prefer Windows XP and Mac OS X over Windows Vista? After all, Vista is pretty and sleek and much more advanced than XP and, in many areas, Mac OS X. Why is there so much love for Xbox, but none for Windows Mobile?
Why do BlackBerry users love their BlackBerrys, but the public is lukewarm about Palm devices?
Why is the Amazon Kindle, an unsophisticated, clunky, poorly designed gadget, so popular with owners?
Why do people love plain, ugly Gmail?
The answer to these questions is a mystery to most of the companies that make PCs, gadgets and consumer electronics devices, and to most software makers. The industry spends billions on usability testing and user interface design. Unfortunately, that money is mostly wasted.
The problem is that there are too many technologists in technology. The technology is only half the equation. The other half is the human, that irrational, impulsive, impatient, power-hungry gratification machine.
When you ask someone what they really want, they won't tell you the truth because they're not aware of the truth.
Both users and product designers talk about user interface (UI) consistency, usability and simplicity, and about system attributes like performance and stability. What's missing is that these attributes are means to an end. The real issue is always the user's psychological feeling of being in control. And control comes in many ways:
Consistency: Designers focus on UI "consistency," but why? Consistency gives predictability, which gives users a feeling that they know what will happen when they do something -- even for the first time. It's a feeling of mastery, of control.
Usability: One of the errors software and hardware designers make is to base their UI decisions on the assumption that the user is an idiot who needs to be protected from himself. Give this moron too much rope and he'll hang himself, the reasoning goes. But instead of taking the Microsoft route -- burying and hiding controls and features, which protects newbies from their own mistakes but frustrates the hell out of experienced users -- it's better to offer a bullet-proof "undo." Give the user control, let them make their own mistakes, then undo the damage if they mess something up (a minimal sketch of this idea follows this list).
Simplicity: Simplicity is complex. And there are many ways to achieve it. One way is to insist on top-to-bottom, inside-and-outside simplicity. Extreme examples include the original Palm Pilot organizer, Gmail and RSS feeds. And then there's the illusion of simplicity, which is the Microsoft route. In trying to be the operating system vendor for all people and all tasks, Microsoft Windows and Windows Mobile are extraordinarily complex pieces of software engineering. To "simplify," the company hides features, buries controls and groups features into categories to create the appearance of fewer options, without actually reducing options. (From all accounts, it appears that Windows 7 will offer more of the same.) Both extremes result in something you could call "simplicity." But one version thrills users by putting them in control. The other frustrates them by taking away control.
Performance: Everyone hates slow PCs. It's not the waiting. It's the fact that the PC has wrenched control from the user during the time that the hourglass is displayed. That three seconds of staring at the hourglass is three seconds when you feel utterly powerless. Fast computers are good because they keep the user in control.
Stability: Designers focus on system "stability," but not because they worry about time wasted, though that's how users tend to talk about a lack of stability. Like the performance issue, instability is about the theft of system control from the user. People waste all kinds of time on all kinds of things, and usually don't mind doing it. What enrages people is when somebody else forces the wasted time on them. Blue Screens of Death are akin to running into an unexpected traffic jam or having somebody take away the TV remote control. You're forced to put your objectives on hold, and left feeling powerless.
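To make the "bullet-proof undo" point above concrete, here is a minimal sketch in Python of the classic command-pattern undo stack. The command names and data are hypothetical, purely for illustration: every destructive action is recorded with enough information to reverse it, so the user is free to act and recover.

# Minimal sketch of a command-pattern undo stack (hypothetical example,
# not taken from any particular product).

class Command:
    """A reversible user action."""
    def do(self):
        raise NotImplementedError
    def undo(self):
        raise NotImplementedError

class RenameFile(Command):
    def __init__(self, store, old_name, new_name):
        self.store, self.old_name, self.new_name = store, old_name, new_name
    def do(self):
        self.store[self.new_name] = self.store.pop(self.old_name)
    def undo(self):
        self.store[self.old_name] = self.store.pop(self.new_name)

class UndoStack:
    def __init__(self):
        self._done = []
    def execute(self, command):
        command.do()
        self._done.append(command)   # remember how to reverse the action
    def undo(self):
        if self._done:
            self._done.pop().undo()  # reverse the most recent action

# Usage: let the user act freely, then recover from mistakes.
files = {"draft.txt": "hello"}
stack = UndoStack()
stack.execute(RenameFile(files, "draft.txt", "final.txt"))
stack.undo()                         # "draft.txt" is back; no harm done

The point of the pattern is exactly the one made above: the user keeps control because no action is irreversible.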
One reason for the industry-wide pandemic of frustrating products is that the whole culture of usability testing doesn't emphasize user feelings of control. Microsoft does usability tests, for example, but its tests are flawed. Typically, it sits random people in front of a PC in a usability lab. Victims are directed to do various tasks and asked what they're doing and thinking as they try to complete those tasks. All of this is monitored, and everything is recorded. Microsoft usability testing tends to focus on enabling users to "accomplish goals." Microsoft categorizes these goals according to its educated preconceptions about what people are trying to do, based on their jobs or user category -- are you a student, a middle manager or a designer, for instance. So Microsoft focuses on results.
My view is that how the user feels during the process is more important than anything else. Here's the problem. In these scenarios, users are using somebody else's PC. They expect and assume that the software is in control. There is no psychological feeling of "ownership" over the equipment or the software or the work or anything. So the most important element -- the sense of control people feel when doing their own work on their own PCs in their own homes -- is missing entirely from the tests.
During usability tests, users are asked constantly about the software. And that's the wrong question. When real people are doing real work, they're focused on their own desires and objectives and are frustrated or not frustrated based on the degree to which they're given what they want.
My advice to Microsoft is to add one more test: a "Who's In Control?" test. After performing a task, ask the user to rank the experience on a scale with "me in control" at one end and "software in control" at the other. Try each method for completing a given task, and choose the one that earns the highest "me in control" score. And they need a home version for ongoing testing in the "real world."
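As a rough sketch of how such scoring could work (purely illustrative; the 1-to-7 scale, design names and ratings are my assumptions, not Microsoft's method), the per-user "Who's In Control?" ratings for each design alternative would simply be averaged and the highest-scoring design picked:

# Hypothetical "Who's In Control?" scoring: 1 = software in control,
# 7 = me in control. Each design alternative gets ratings from test users.
from statistics import mean

ratings = {
    "wizard-driven flow":    [2, 3, 2, 4, 3],
    "direct manipulation":   [6, 5, 7, 6, 6],
    "hidden advanced menu":  [3, 4, 3, 3, 2],
}

scores = {design: mean(vals) for design, vals in ratings.items()}
winner = max(scores, key=scores.get)
print(f"Highest 'me in control' score: {winner} ({scores[winner]:.1f}/7)")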
We've all experienced the full range of emotions while using gadgets, PCs, phones and software. At one end of the spectrum is a kind of thrilling joy, where something "just works." At the other end, there is a consuming rage. The amount of time your emotional state spends at one end of the spectrum rather than the other is the one and only thing that determines how much you "love" the product.
All the factors involved in using a PC -- consistency, usability, simplicity, stability, performance and even the successful completion of tasks -- come down to control. Give me control, and I will love your product. It's as simple as that.
Monday, January 5, 2009
Tech Jobs May Increase Despite Economic Trends
The outlook for IT jobs in 2009 may not be as bad as some observers suggest. While some indicators and surveys are showing some declines in tech jobs, none predict a precipitous drop. In fact, a federal economic stimulus package may even add IT positions.
"IT jobs are relatively safe in the aftermath of the economic meltdown compared to jobs in general," said David Foote of Vero Beach Fla.-based Foote Partners LLC , which analyzes IT wages and hiring data.
While 853,000 U.S. jobs in all industries were lost in October and November, 9,000 were gained in the U.S. Bureau of Labor Statistics categories of "Computer Systems Design and Related Services" and "Management and Technical Consulting Services," said Foote.
The IT job market is stable, said Foote, "because a lot has happened to show businesses that IT is really our edge."
Robert J. McGovern, CEO of JobFox Inc., a career site in McLean, Va., is bullish, believing that hundreds of thousands of tech jobs will be created by the federal stimulus of hundreds of billions of dollars expected early next year from President-elect Barack Obama's administration and Congress.
Major chunks of that federal money may be used to build infrastructure such as roads and bridges, and for expansions of broadband, especially in rural areas, he said. IT pros should focus on showing employers how their skills can be adapted to projects in these areas.
For example, construction companies and engineering firms will likely seek multiple IT skills, including computer-aided design and telecommunications. Companies focused on alternative energy and health care modernization will likely need IT pros who specialize in bioinformatics, information security and software development. "Target your job search in those directions," McGovern suggested.
Regulatory compliance may also be a source of new jobs, he said. The Obama administration is expected to quickly expand regulatory controls, especially in the financial services industry. The industry's response could be similar to its actions after Congress passed the Sarbanes-Oxley Act of 2002. That law, enacted after a series of securities scandals such as the spectacular failures of WorldCom and Enron, drove demand for integration and Web skills.
Some of the hottest areas for jobs over the next two years, according to Foote, will be business analysis, financial and human resources applications, program management and application development.
McGovern believes a stimulus measure could have a fairly rapid impact on new hiring. "What employers need more than anything is confidence," said McGovern. "They have the open positions, but they are reluctant to fill them."
"IT jobs are relatively safe in the aftermath of the economic meltdown compared to jobs in general," said David Foote of Vero Beach Fla.-based Foote Partners LLC , which analyzes IT wages and hiring data.
While 853,000 U.S. jobs in all industries were lost in October and November, 9,000 were gained in the U.S. Bureau of Labor Statistics categories of "Computer Systems Design and Related Services" and "Management and Technical Consulting Services," said Foote.
The IT job market is stable, said Foote, "because a lot has happened to show businesses that IT is really our edge."
Robert J. McGovern, CEO of JobFox Inc., a career site in McLean, Va., is bullish in the belief that hundreds of thousands of tech jobs will be created from the federal stimulus of hundreds of billions of dollars that's expected early next year from President-elect Barack Obama 's administration and Congress.
Major chunks of that federal money may be used to build infrastructure such as roads and bridges, and for expansions of broadband, especially in rural areas, he said. IT pros should focus on showing employers how their skills can be adapted to projects in these areas.
For example, construction companies and engineering firms will likely seek multiple IT skills, including computer-aided design and telecommunications. Companies focused on alternative energy and health care modernization will likely need IT pros who specialize in bioinformatics, information security and software development. "Target your job search in those directions," McGovern suggested.
Regulatory compliance may also be a source of new jobs, he said. The Obama administration is expected to quickly expand regulatory controls, especially in the financial services industry. The industry's response could be similar to its actions after Congress passed the Sarbanes-Oxley Act of 2002. That law, enacted after a series of securities scandals such as the spectacular failures of WorldCom and Enron, drove demand for integration and Web skills.
Some of the hottest areas for jobs over the next two years, according to Foote, will be business analysis, financial and human resources applications, program management and application development.
McGovern believes a stimulus measure could have a fairly rapid impact on new hiring. "What employers need more than anything is confidence," said McGovern. "They have the open positions, but they are reluctant to fill them."
Monday, December 22, 2008
No-Name Power Supplies Can Prove Painful
Power is the talk of the town in the PC industry, and for good reason. Nobody wants their desktop computer to double their electricity bill. But as much as we throw various power tweaks, myths, and all other kinds of electronic hocus-pocus back-and-forth, it's important to sit back for a moment and think about the core of your system: the power supply. It's the most critical part of your system, and it certainly won't last forever. But this is not a place where you're going to want to scrimp when it comes time to build a new machine or replace the aging power supply in your current rig.
Tech Report recently ran a pretty comprehensive batch of reviews of seven power supplies. While that normally doesn't sound like the kind of article that would draw many eyeballs, given the specificity of the topic, it's worth your while to check out. Spoiler alert: be careful what you buy when it comes to PSUs. As tempting as it might be to save your pennies on the power supply so you can afford that next tier of processor in your low-budget build, you're only going to hurt yourself in the long run.
Generics may be fine at the grocery store, but Tech Report found that generic power supplies tend to lack proper cabling for all the accessory devices you'll want to plug into the PSU. Worse, their warranties can be shorter than those of name-brand PSUs--just like their cabling. Tech Report puts it best:
"Generic PSUs may not always be time bombs waiting to take your system down with them, but based on what we've seen, they're not worth the trouble and are poor values, anyway."
So what's a name-brand power supply? Well, if you go by the article, companies like Antec, Corsair, Enermax, and OCZ offer reliable, well-built PSUs--at least, more so than generic brands like Coolmax or SolyTech. But this isn't the kind of decision you should make on brand familiarity alone. If nothing else, remember the "too good to be true" adage: the further a power supply's price falls below the competitive average, the greater the likelihood that you're being hoodwinked. That's not to say that the best power supplies are super-expensive, but generic PSUs can look the most tempting precisely because of their absurdly low prices.
Your best bet is to treat your power supply purchase like any other computer part: do your research! While power supply reviews can be harder to come by than reviews of, say, a new processor or video card, they exist. Don't just skim Newegg--find sites that run PSUs through a rigorous testing environment. And if all else fails, at least pick up something with the longest warranty you can find. Then, when your New Power Supply Of Choice blows up, at least you'll have a wide blanket of coverage.
Tech Centers Go Green Despite Cuts
The number of firms accelerating their green IT initiatives is double that of those scaling back such projects, according to a survey by Forrester Research.
The report, called 'A slowing economy won't slow down corporate green IT initiatives', found that 10 percent of firms are increasing their green IT expenditure, and 38 percent of firms are maintaining the level of expenditure. But five percent are cutting green spending, and 47 percent expressed uncertainty over the future.
Some 67 percent said the main motivation for pursuing a green agenda was reducing energy bills, up from 55 percent in the same survey one year ago. Regulation prompted 16 percent of businesses to implement greener systems, and nearly a third of companies said they wanted to align IT with business-wide green plans.
Some 52 percent of businesses had a green IT action plan, up from 40 percent last year. One thousand firms were interviewed in October for the survey, of which a third were in Europe, the Middle East and Africa.
Technology buyers were also more aware of manufacturers' green credentials, the survey said. Six in 10 businesses considered factors such as recycling and greener manufacturing when buying technology.
Christopher Mines, the author of the report, said: "Green IT is not a fad or a bubble. ... The slow-but-steady increase in awareness and activity bodes well in our view for continued growth in demand for greener IT products and services."
Friday, December 19, 2008
Three Deals Symbolized Storage Trends in 2008
The storage story of 2008 was growth: An accelerating explosion of information, much of it in the form of video, led IT administrators to try to make better use of their capacity and staff.
Overall demand for storage capacity is growing by about 60 percent per year, according to IDC. Another research company, Enterprise Strategy Group, pegs the annual growth rate of data between 30 percent and 60 percent.
"Organizations are having a hard time getting their arms around all that data," said ESG analyst Lauren Whitehouse. Economic woes are making it even harder, with frozen or scaled-back budgets, while the downturn isn't expected to significantly slow data growth next year.
Stuck in that bind, organizations don't want to have to roll out a gigabyte of capacity in their own data centers for every new gigabyte that's created, analysts said.
"What we'll see more of in companies is a focus on efficiency," IDC analyst Rick Villars said. They're seeking to increase the utilization of their storage capacity as well as other IT resources.
A big part of that effort is virtualization of storage, which often goes hand in hand with server virtualization and became a mainstream technology in 2008, according to analyst John Webster of Illuminata. Storage vendors are offering more virtualization products and seeing more demand for them, he said. A virtualization capability such as thin provisioning, which lets administrators assign storage capacity to a new application without having to figure out how much it ultimately will need, helps make better use of resources, Webster said.
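To make the thin-provisioning idea concrete, here is a toy sketch of the assumed behavior (not any particular vendor's implementation): logical volumes are created at whatever size the application asks for, but physical blocks are drawn from the shared pool only when data is actually written.

# Toy model of thin provisioning: capacity is promised up front,
# physical blocks are consumed only on write.

class ThinPool:
    def __init__(self, physical_blocks):
        self.free = physical_blocks          # real capacity in the array
        self.volumes = {}                    # name -> {"promised", "used"}

    def create_volume(self, name, promised_blocks):
        # Promise the full size without reserving physical blocks.
        self.volumes[name] = {"promised": promised_blocks, "used": 0}

    def write(self, name, blocks):
        vol = self.volumes[name]
        if vol["used"] + blocks > vol["promised"]:
            raise ValueError("write exceeds the volume's logical size")
        if blocks > self.free:
            raise RuntimeError("pool exhausted: time to add physical disks")
        vol["used"] += blocks                # consume physical capacity lazily
        self.free -= blocks

pool = ThinPool(physical_blocks=1000)
pool.create_volume("new-app", promised_blocks=5000)   # over-provisioned on purpose
pool.write("new-app", 120)                            # only 120 real blocks consumed

The trade-off, as the analysts note, is that the administrator no longer has to guess capacity up front, but someone still has to watch the shared pool so it doesn't run dry.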
But in addition to the trend toward disconnecting logical from physical resources, there were a handful of acquisitions this year that signaled other trends in the storage world.
1. Brocade-Foundry
On Dec. 19, Brocade Communications and Foundry Networks completed a deal they had announced in July, having since navigated the roughest waters the financial and credit markets have seen in a generation. The merger, now valued at $2.6 billion, is intended to address a coming convergence of SAN (storage area network) and LAN technology.
SAN builders have long relied on Fibre Channel, a specialized networking technology designed not to drop packets. But in most cases, the rest of the enterprise network is based on Ethernet, which is cheaper than Fibre Channel and now available at higher speeds. Maintaining both requires more adapters on storage equipment and adds to an IT department's workload. The two types of networks are headed toward gradual consolidation under the FCoE (Fibre Channel over Ethernet) standard, which is intended to make Ethernet reliable enough for storage networks. Then Ethernet can be the network of choice across data centers and keep getting faster.
Brocade wasn't the only company thinking this way. Cisco, which will be the main competitive target of the merged company, bought out Nuova Systems in April and simultaneously announced a line of routing switches designed to connect the whole data center. The flagship Nexus 7000, which Cisco has positioned as one of its most important products ever, is built to scale to 15Tbps (terabits per second) and runs a virtualized version of IOS (Internetwork Operating System) called NX-OS. Like the combination of Brocade and Foundry, the Nexus line is likely to help enterprises virtualize their storage and computing resources and eventually streamline networking and management.
EMC and NetApp also introduced FCoE products this year. But the protocol is not expected to be in widespread use until 2010.
2. IBM-Diligent
In April, IBM acquired Diligent Technologies, which specializes in data de-duplication for large enterprise storage systems. The company didn't reveal how much the acquisition cost, but it was a key move in a market that could grow to US$1 billion in annual revenue by 2009, according to research company The 451 Group.
De-duplication systems find identical chunks of data in a storage system, treat them as redundant and eliminate the extras. So if there are several nearly identical copies of a document, the system keeps one full copy plus only the bits that are unique to each of the other copies.
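A minimal sketch of the underlying mechanism (fixed-size blocks and SHA-256 fingerprints are assumptions for illustration; real products vary widely): each block is hashed, and a block whose fingerprint has already been seen is stored only once, with later files simply referencing it.

# Simplified block-level de-duplication: identical blocks are stored once
# and later writes just reference the existing copy.
import hashlib

BLOCK_SIZE = 4096
store = {}        # fingerprint -> block bytes

def dedup_write(data):
    """Return the list of fingerprints that reconstructs `data`."""
    recipe = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        fp = hashlib.sha256(block).hexdigest()
        if fp not in store:           # new block: keep one physical copy
            store[fp] = block
        recipe.append(fp)             # known block: just reference it
    return recipe

def dedup_read(recipe):
    return b"".join(store[fp] for fp in recipe)

doc_a = b"quarterly report " * 1000
doc_b = doc_a + b"one small edit"     # nearly identical copy
recipe_a, recipe_b = dedup_write(doc_a), dedup_write(doc_b)
assert dedup_read(doc_b) == doc_b     # data is intact; shared blocks stored once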
The Diligent deal was an early move in a year full of de-duplication activity. In June, Hewlett-Packard introduced a suite of de-duplication systems for small and medium-sized businesses and added some features to its HP StorageWorks backup line. And in November, EMC, Quantum and Dell said they would use a common software architecture for data de-duplication products. Dell will enter the de-duplication business next year. It is already a major reseller of EMC gear, under a partnership that in December was extended until 2013.
Data de-duplication can reduce the amount of storage capacity an enterprise requires by as much as two thirds, said ESG's Whitehouse. It has been available before, but this year companies started to integrate it with storage arrays or sell it in appliances, bringing the technology closer to a turnkey solution, she said. They also established data de-duplication as a technology customers could trust, at least for archived material.
"If you eliminate a block of data that somehow negates the value of that data when you recover it ... that's a really scary prospect for some companies," Whitehouse said.
So far, most enterprises are only using it for secondary storage, or the archived information that's backed up for safekeeping, she said. The next step will be to embrace de-duplication for primary storage, the data that applications are using in real time. Users will start to trust the technology enough for that next year, she said. In July, NetApp enhanced its V-Series storage virtualization products so they can perform de-duplication on primary storage systems from third parties such as EMC, Hitachi and HP.
3. EMC-Pi
In late February, enterprise storage giant EMC bought Pi, a provider of software and online services for consumers to keep track of personal information stored locally or online. The deal, which followed the company's 2007 buyout of online backup provider Mozy, was one sign of growing interest in cloud storage.
Handing off personal or corporate data to a third party's hard drives and accessing it via the Internet can be a less expensive alternative to provisioning all that capacity in your data center or home network. It may be used in conjunction with cloud-based applications, but also just for archiving or disaster recovery, Illuminata's Webster said. In many cases, the cloud-storage service can be set up as a target when data is being backed up. The information can be sent to the cloud only or to the cloud and a dedicated tape backup system simultaneously, he said.
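A hedged sketch of what "cloud as a backup target" can look like in practice (the target classes and the store call below are stand-ins, not any vendor's API): the backup job simply fans each backup set out to whichever targets are configured, cloud, tape, or both.

# Illustrative backup fan-out: send each backup set to the cloud target,
# a local tape target, or both. The targets here are placeholders.

class CloudTarget:
    def store(self, name, data):
        print(f"uploading {name} ({len(data)} bytes) to cloud storage")

class TapeTarget:
    def store(self, name, data):
        print(f"writing {name} ({len(data)} bytes) to the tape library")

def run_backup(name, data, targets):
    for target in targets:
        target.store(name, data)      # same payload, multiple destinations

run_backup("finance-db-2008-12-19", b"\x00" * 1024,
           targets=[CloudTarget(), TapeTarget()])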
With the economy weakening, cloud storage will be big next year, Webster believes. Paying for additional capacity on a monthly basis moves that expense out of the IT department's capital budget and into its operational budget, which tends to be easier to fund when times are tough, he said. It's also relatively quick because nothing needs to be purchased or installed, he added.
A related option, managed services, may also take off in the coming year, Webster said. While keeping their own storage systems in-house, enterprises can pay a vendor such as Brocade or IBM to manage it for them remotely. The vendor can monitor alerts through an appliance at the customer's site and respond if needed. If IT staff needs to be cut back, this may be one way to maintain service levels to the rest of the company, Webster said.
NSA Patents a Way to Spot Network Snoops
The U.S. National Security Agency has patented a technique for figuring out whether someone is tampering with network communication.
The NSA's software does this by measuring the amount of time the network takes to send different types of data from one computer to another and raising a red flag if something takes too long, according to the patent filing.
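The patent itself isn't reproduced here, but the general idea of timing-based tamper detection can be sketched as follows (a simplified illustration with assumed thresholds, not the NSA's actual method, which compares different data types across network layers): time comparable transfers on a known-clean path, build a baseline, and flag anything that takes suspiciously long.

# Simplified illustration of timing-based tamper detection: compare an
# observed transfer time against a clean baseline and flag big deviations.
import time
from statistics import mean, stdev

def timed(transfer_fn, payload):
    """Measure one transfer; used to collect baseline samples."""
    start = time.perf_counter()
    transfer_fn(payload)
    return time.perf_counter() - start

def is_suspicious(baseline_samples, observed, sigmas=3.0):
    """True if `observed` is far slower than the baseline suggests."""
    baseline, spread = mean(baseline_samples), stdev(baseline_samples)
    return observed > baseline + sigmas * spread

# Baseline timings collected (via timed()) when the path is known to be clean.
baseline_samples = [0.0101, 0.0098, 0.0104, 0.0099, 0.0102]
print(is_suspicious(baseline_samples, observed=0.0185))   # True: investigate

As Kaminsky's comment below suggests, the hard part isn't the measurement, it's telling an interception apart from ordinary network slowness.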
Other researchers have looked into this problem in the past and proposed a technique called distance bounding, but the NSA patent takes a different tack, comparing different types of data travelling across the network. "The neat thing about this particular patent is that they look at the differences between the network layers," said Tadayoshi Kohno, an assistant professor of computer science at the University of Washington.
The technique could be used for purposes such as detecting a fake phishing Web site that was intercepting data between users and their legitimate banking sites, he said. "This whole problem space has a lot of potential, [although] I don't know if this is going to be the final solution that people end up using."
IOActive security researcher Dan Kaminsky was less impressed. "Think of it as -- 'if your network gets a little slower, maybe a bad guy has physically inserted a device that is intercepting and retransmitting packets,' " he said via e-mail. "Sure, that's possible. Or perhaps you're routing through a slower path for one of a billion reasons."
Some might think of the secretive NSA, which collects and analyzes foreign communications, as an unlikely source for such research, but the agency also helps the federal government protect its own communications.
The NSA did not answer questions concerning the patent, except to say, via e-mail, that it does make some of its technology available through its Domestic Technology Transfer Program.
The patent, granted Tuesday, was filed with the U.S. Patent and Trademark Office in 2005. It was first reported Thursday on the Cryptome Web site.
Losing the Data Management Race
Storage is bigger and faster than ever, with 1.5TB drives shipping and 8Gbps Fibre Channel, 10Gbps iSCSI and InfiniBand becoming affordable. The data to fill those disks and pipes is growing faster than ever, with archiving for e-discovery and legislative requirements growing all the time, plus audio and video data for surveillance, teleconference archives, video blog posts, Webcasts, and simply more business processes being digitized. By contrast, a unified approach to protecting and managing that data is not much further along than it was ten years ago, when 10TB was a large amount of data even for big enterprises.
Now that petabytes are becoming commonplace, the problem is much more urgent. If indexing software to build metadata about all the files stored across an enterprise requires a cluster of servers to run, and it still takes days to complete an index, the utility of that metadata is limited. We keep getting hints of potential solutions to this sort of problem, such as Microsoft's promise of a new file system (Windows Future Storage) based on a relational database -- originally promised as part of Windows Server 2008 but now pushed out indefinitely.
Don't blame Microsoft for failing to pull the rabbit out of the hat; it's a difficult problem to solve. To automatically classify data and index it requires a high degree of artificial intelligence. Indexing engines that can run across a LAN and index data on multiple disparate systems are extremely processor and bandwidth intensive.
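For a sense of scale, a file-metadata index at its simplest looks something like the sketch below (a single-machine toy; the path and fields are illustrative). Even this trivial version has to walk and stat every file, which is why doing it across petabytes, many platforms and a LAN is so expensive.

# Toy file-metadata indexer: walk a tree and record basic attributes.
# Real enterprise indexers do this across many systems and add content
# classification, which is where the cost explodes.
import os, time

def build_index(root):
    index = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue                       # skip unreadable files
            index.append({
                "path": path,
                "size": st.st_size,
                "modified": time.ctime(st.st_mtime),
                "extension": os.path.splitext(name)[1].lower(),
            })
    return index

index = build_index("/data/shared")            # illustrative path
print(f"indexed {len(index)} files")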
While some of today's data management applications do a good job, they tend to be isolated silos, tied to a specific vendor's storage or to an application running on a specific platform. An enterprise-wide, multi-platform data management system that can handle all aspects of data management, including indexing, metadata creation, virtualization, migration, data tiering, replication, and so forth does not yet exist.
For such a data management system to become a reality, three key pieces must come together: widely adopted standards for data management, which should come from SNIA, the Storage Networking Industry Association; methods for automatically classifying and finding data, which should come from the file system; and cooperation between storage and OS vendors to facilitate single-console management of data across multiple storage platforms, operating systems and networks.
Will these pieces fall into place before we're swimming in exabytes? It depends mostly on you. Ask your vendors for these features, and keep asking. Nearly all storage and operating system vendors are members of SNIA. The infrastructure is there to create the standards necessary, but it has taken much longer to make any progress than one might hope.
For more IT analysis and commentary on emerging technologies, visit InfoWorld.com. Story copyright © 2007 InfoWorld Media Group. All rights reserved.