Monday, December 22, 2008

No-Name Power Supplies Can Prove Painful

Power is the talk of the town in the PC industry, and for good reason. Nobody wants their desktop computer to double their electricity bill. But as much as we throw various power tweaks, myths, and all other kinds of electronic hocus-pocus back-and-forth, it's important to sit back for a moment and think about the core of your system: the power supply. It's the most critical part of your system, and it certainly won't last forever. But this is not a place where you're going to want to scrimp when it comes time to build a new machine or replace the aging power supply in your current rig.
Tech Report recently ran a pretty comprehensive batch of reviews covering seven power supplies. While that normally doesn't sound like the kind of article that would draw many eyeballs, given the specificity of the topic, it's worth your while to check out. Spoiler alert: Be careful what you purchase with PSUs. As tempting as it might be to save your pennies on the power supply so you can afford the next tier of processor in your low-budget build, you're only going to hurt yourself in the long run.
Unlike generic goods at the grocery store, generic power supplies are no bargain: Tech Report found that they tend to lack proper cabling for all the accessory devices you'll want to plug into the PSU. Worse, their warranties can be shorter than those of name-brand PSUs--just like their cabling. Tech Report puts it best:
"Generic PSUs may not always be time bombs waiting to take your system down with them, but based on what we've seen, they're not worth the trouble and are poor values, anyway."
So what's a name-brand power supply? Well, if you go by the article, companies like Antec, Corsair, Enermax, and OCZ offer reliable, dependable PSUs--at least, more so than generic brands like Coolmax or SolyTech. But this isn't the kind of decision you should be making based on brand familiarity alone. If you must, consider the "too good to be true" adage: the further a power supply's price falls below the competitive average, the greater the likelihood that you're being hoodwinked. That's not to say that the best power supplies are super-expensive, but generic PSUs can appear the most tempting precisely because of their absurdly low prices.
Your best bet is to treat your power supply purchase like you would any other computer part: Do your research! While power supply reviews can be harder to come by than reviews of, say, a new processor or video card, they exist. Don't stop at Newegg user ratings--find sites that run PSUs through a rigorous testing regimen. And if all else fails, at least pick up something with the longest warranty you can find. Then, when your New Power Supply Of Choice blows up, at least you'll have a wide blanket of coverage.

Tech Centers Go Green Despite Cuts

The number of firms accelerating their green IT initiatives is double that of those scaling back such projects, according to a survey by Forrester Research.
The report, called 'A slowing economy won't slow down corporate green IT initiatives', found that 10 percent of firms are increasing their green IT expenditure, and 38 percent of firms are maintaining the level of expenditure. But five percent are cutting green spending, and 47 percent expressed uncertainty over the future.
Some 67 percent said the main motivation for pursuing a green agenda was reducing energy bills, up from 55 percent in the same survey one year ago. Regulation prompted 16 percent of businesses to implement greener systems, and nearly a third of companies said they wanted to align IT with business-wide green plans.
Some 52 percent of businesses had a green IT action plan, up from 40 percent last year. One thousand firms were interviewed in October for the survey, of which a third were in Europe, the Middle East and Africa.
Technology buyers were also more aware of manufacturers' green credentials, the survey said. Six in 10 businesses considered factors such as recycling and greener manufacturing when buying technology.
Christopher Mines, the author of the report, said: "Green IT is not a fad or a bubble. ... The slow-but-steady increase in awareness and activity bodes well in our view for continued growth in demand for greener IT products and services."

Friday, December 19, 2008

Three Deals Symbolized Storage Trends in 2008

The storage story of 2008 was growth: An accelerating explosion of information, much of it in the form of video, led IT administrators to try to make better use of their capacity and staff.
Overall demand for storage capacity is growing by about 60 percent per year, according to IDC. Another research company, Enterprise Strategy Group, pegs the annual growth rate of data between 30 percent and 60 percent.
"Organizations are having a hard time getting their arms around all that data," said ESG analyst Lauren Whitehouse. Economic woes are making it even harder, with frozen or scaled-back budgets, while the downturn isn't expected to significantly slow data growth next year.
Stuck in that bind, organizations don't want to have to roll out a gigabyte of capacity in their own data centers for every new gigabyte that's created, analysts said.
"What we'll see more of in companies is a focus on efficiency," IDC analyst Rick Villars said. They're seeking to increase the utilization of their storage capacity as well as other IT resources.
A big part of that effort is virtualization of storage, which often goes hand in hand with server virtualization and became a mainstream technology in 2008, according to analyst John Webster of Illuminata. Storage vendors are offering more virtualization products and seeing more demand for them, he said. A virtualization capability such as thin provisioning, which lets administrators assign storage capacity to a new application without having to figure out how much it ultimately will need, helps make better use of resources, Webster said.
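To make thin provisioning concrete, here is a minimal sketch of the idea (a toy illustration of my own, not any vendor's implementation): the volume promises a large logical size up front but consumes physical blocks only when data is actually written.

```python
class ThinVolume:
    """Toy thinly provisioned volume: logical capacity is promised up
    front; physical blocks are allocated only on first write."""

    BLOCK = 4096  # illustrative block size

    def __init__(self, logical_size):
        self.logical_size = logical_size
        self.allocated = {}  # block index -> backing buffer

    def write(self, offset, data):
        if offset + len(data) > self.logical_size:
            raise ValueError("write past end of logical volume")
        for i, byte in enumerate(data):
            block, pos = divmod(offset + i, self.BLOCK)
            # Allocate the physical block lazily, on first touch.
            buf = self.allocated.setdefault(block, bytearray(self.BLOCK))
            buf[pos] = byte

    def physical_bytes(self):
        # Actual capacity consumed, regardless of the promised size.
        return len(self.allocated) * self.BLOCK
```

A 10GB volume that has only seen a five-byte write consumes just one 4KB block of real storage; physical utilization rises only as applications actually fill the space they were promised.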
But in addition to the trend toward disconnecting logical from physical resources, a handful of acquisitions this year signaled other trends in the storage world.
1. Brocade-Foundry
On Dec. 19, Brocade Communications and Foundry Networks completed a deal they had announced in July, having since navigated the roughest waters the financial and credit markets have seen in a generation. The merger, now valued at $2.6 billion, is intended to address the coming convergence of SAN (storage area network) and LAN technology.
SAN builders have long relied on Fibre Channel, a specialized networking technology designed not to drop packets. But in most cases, the rest of the enterprise network is based on Ethernet, which is cheaper than Fibre Channel and now available at higher speeds. Maintaining both requires more adapters on storage equipment and adds to an IT department's workload. The two types of networks are headed toward gradual consolidation under the FCoE (Fibre Channel over Ethernet) standard, which is intended to make Ethernet reliable enough for storage networks. Then, Ethernet can be the network of choice across data centers and keep getting faster.
Brocade wasn't the only company thinking this way. Cisco, which will be the main competitive target of the merged company, bought out Nuova Systems in April and simultaneously announced a line of routing switches designed to connect the whole data center. The flagship Nexus 7000, which Cisco has positioned as one of its most important products ever, is built to scale to 15Tbps (terabits per second) and has a virtualized version of IOS (Internetwork Operating System) called NX-OS. Like the combination of Brocade and Foundry, the Nexus line is likely to help enterprises virtualize their storage and computing resources and eventually streamline networking and management.
EMC and NetApp also introduced FCoE products this year. But the protocol is not expected to be in widespread use until 2010.
2. IBM-Diligent
In April, IBM acquired Diligent Technologies, which specializes in data de-duplication for large enterprise storage systems. The company didn't reveal how much the acquisition cost, but it was a key move in a market that could grow to US$1 billion in annual revenue by 2009, according to research company The 451 Group.
De-duplication systems find identical chunks of data in a storage system, treat them as redundant, and eliminate the extras. So if there are several nearly identical copies of a document, the system keeps just one full copy, plus the unique differences from each of the other copies.
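The core mechanism can be sketched in a few lines. This is a toy content-hash store of my own devising, not a description of Diligent's engine (real products typically use variable-size chunking and far more robust indexing): each file is split into blocks, and any block whose hash has been seen before is stored only once.

```python
import hashlib

BLOCK_SIZE = 4096  # fixed-size blocks, for illustration only

class DedupStore:
    """Toy de-duplicating store: identical blocks are kept only once."""

    def __init__(self):
        self.blocks = {}  # digest -> unique block bytes
        self.files = {}   # file name -> ordered list of digests

    def put(self, name, data):
        digests = []
        for i in range(0, len(data), BLOCK_SIZE):
            block = data[i:i + BLOCK_SIZE]
            digest = hashlib.sha256(block).hexdigest()
            self.blocks.setdefault(digest, block)  # store each unique block once
            digests.append(digest)
        self.files[name] = digests

    def get(self, name):
        # Reassemble the original file from its block digests.
        return b"".join(self.blocks[d] for d in self.files[name])

    def stored_bytes(self):
        return sum(len(b) for b in self.blocks.values())
```

Two near-identical copies of a document share almost all of their blocks, so the store ends up holding little more than one copy plus the differences.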
The Diligent deal was an early move in a year full of de-duplication activity. In June, Hewlett-Packard introduced a suite of de-duplication systems for small and medium-sized businesses and added some features to its HP StorageWorks backup line. And in November, EMC, Quantum and Dell said they would use a common software architecture for data de-duplication products. Dell will enter the de-duplication business next year. It is already a major reseller of EMC gear, under a partnership that in December was extended until 2013.
Data de-duplication can reduce the amount of storage capacity an enterprise requires by as much as two thirds, said ESG's Whitehouse. It has been available before, but this year companies started to integrate it with storage arrays or sell it in appliances, bringing the technology closer to a turnkey solution, she said. They also established data de-duplication as a technology customers could trust, at least for archived material.
"If you eliminate a block of data that somehow negates the value of that data when you recover it ... that's a really scary prospect for some companies," Whitehouse said.
So far, most enterprises are only using it for secondary storage, or the archived information that's backed up for safekeeping, she said. The next step will be to embrace de-duplication for primary storage, the data that applications are using in real time. Users will start to trust the technology enough for that next year, she said. In July, NetApp enhanced its V-Series storage virtualization products so they can perform de-duplication on primary storage systems from third parties such as EMC, Hitachi and HP.
3. EMC-Pi
In late February, enterprise storage giant EMC bought Pi, a provider of software and online services for consumers to keep track of personal information stored locally or online. The deal, which followed the company's 2007 buyout of online backup provider Mozy, was one sign of growing interest in cloud storage.
Handing off personal or corporate data to a third party's hard drives and accessing it via the Internet can be a less expensive alternative to provisioning all that capacity in your data center or home network. It may be used in conjunction with cloud-based applications, but also just for archiving or disaster recovery, Illuminata's Webster said. In many cases, the cloud-storage service can be set up as a target when data is being backed up. The information can be sent to the cloud only or to the cloud and a dedicated tape backup system simultaneously, he said.
With the economy weakening, cloud storage will be big next year, Webster believes. Paying for additional capacity on a monthly basis moves that expense out of the IT department's capital budget and into its operational budget, which tends to be easier to fund when times are tough, he said. It's also relatively quick because nothing needs to be purchased or installed, he added.
A related option, managed services, may also take off in the coming year, Webster said. While keeping their own storage systems in-house, enterprises can pay a vendor such as Brocade or IBM to manage it for them remotely. The vendor can monitor alerts through an appliance at the customer's site and respond if needed. If IT staff needs to be cut back, this may be one way to maintain service levels to the rest of the company, Webster said.

NSA Patents a Way to Spot Network Snoops

The U.S. National Security Agency has patented a technique for figuring out whether someone is tampering with network communication.
The NSA's software does this by measuring the amount of time the network takes to send different types of data from one computer to another and raising a red flag if something takes too long, according to the patent filing.
Other researchers have looked into this problem in the past and proposed a technique called distance bounding, but the NSA patent takes a different tack, comparing different types of data travelling across the network. "The neat thing about this particular patent is that they look at the differences between the network layers," said Tadayoshi Kohno, an assistant professor of computer science at the University of Washington.
The technique could be used for purposes such as detecting a fake phishing Web site that was intercepting data between users and their legitimate banking sites, he said. "This whole problem space has a lot of potential, [although] I don't know if this is going to be the final solution that people end up using."
IOActive security researcher Dan Kaminsky was less impressed. "Think of it as -- 'if your network gets a little slower, maybe a bad guy has physically inserted a device that is intercepting and retransmitting packets,' " he said via e-mail. "Sure, that's possible. Or perhaps you're routing through a slower path for one of a billion reasons."
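For flavor, here is a toy single-series version of the idea (my own illustration; the patent, per Kohno, compares timings across protocol layers rather than watching one stream): raise a red flag when observed round-trip times drift well above an established baseline. Kaminsky's objection applies in full, since a slower path has countless benign explanations.

```python
import statistics

def tamper_suspected(baseline_rtts, observed_rtts, threshold=3.0):
    """Flag transfers whose mean round-trip time exceeds the baseline
    mean by more than `threshold` standard deviations.

    Toy sketch only: network latency varies for many benign reasons,
    so a single-series check like this is prone to false positives.
    """
    mean = statistics.mean(baseline_rtts)
    stdev = statistics.stdev(baseline_rtts)
    return statistics.mean(observed_rtts) > mean + threshold * stdev
```

An interception device that retransmits packets would add latency and trip the check, but so would congestion or a rerouted path, which is exactly the weakness Kaminsky points out.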
Some might think of the secretive NSA, which collects and analyzes foreign communications, as an unlikely source for such research, but the agency also helps the federal government protect its own communications.
The NSA did not answer questions concerning the patent, except to say, via e-mail, that it does make some of its technology available through its Domestic Technology Transfer Program.
The patent, granted Tuesday, was filed with the U.S. Patent and Trademark Office in 2005. It was first reported Thursday on the Cryptome Web site.

Losing the Data Management Race

Storage is bigger and faster than ever: 1.5TB drives are shipping, and 8Gbps Fibre Channel, 10Gbps iSCSI, and InfiniBand are all becoming affordable. The data to fill those disks and pipes is growing faster than ever, with archiving for e-discovery and legislative requirements expanding all the time, plus audio and video data for surveillance, teleconference archives, video blog posts, Webcasts, and simply more business processes being digitized. By contrast, a unified approach to protecting and managing that data is not really much further along than it was ten years ago, when 10TB was a large amount of data for even big enterprises.
Now that petabytes are becoming commonplace, the problem is much more urgent. If the indexing software that builds metadata about all the files stored across an enterprise requires a cluster of servers to run, and it still takes days to complete an index, the utility of that metadata is limited. We keep getting hints of potential solutions to this sort of problem, such as Microsoft's promised WinFS (Windows Future Storage) file system based on a relational database -- originally promised as part of Windows Server 2008 but now pushed out indefinitely.
Don't blame Microsoft for failing to pull the rabbit out of the hat; it's a difficult problem to solve. To automatically classify data and index it requires a high degree of artificial intelligence. Indexing engines that can run across a LAN and index data on multiple disparate systems are extremely processor and bandwidth intensive.
While some of today's data management applications do a good job, they tend to be isolated silos, tied to a specific vendor's storage or to an application running on a specific platform. An enterprise-wide, multi-platform data management system that can handle all aspects of data management, including indexing, metadata creation, virtualization, migration, data tiering, replication, and so forth, does not yet exist.
For such a data management system to become a reality, three key pieces must come together: widely adopted standards for data management, which should come from SNIA, the Storage Networking Industry Association; methods for automatically classifying and finding data, which should come from the file system; and cooperation between storage and OS vendors to facilitate single-console management of data across multiple data storage platforms, operating systems, and networks.
Will these pieces fall into place before we're swimming in exabytes? It depends mostly on you. Ask your vendors for these features, and keep asking. Nearly all storage and operating system vendors are members of SNIA. The infrastructure is there to create the standards necessary, but it has taken much longer to make any progress than one might hope.
For more IT analysis and commentary on emerging technologies, visit InfoWorld.com. Story copyright © 2007 InfoWorld Media Group. All rights reserved.

Outsourcing Won't Be a Cure-All in 2009

Enterprise IT executives looking to cut expenses in 2009 will consider outsourcing, but industry watchers argue moving from fixed to variable costs could also result in unreliable services, unpredictable outcomes and financial woes.
"Throughout the history of outsourcing, we've seen during the tough times that a lot of these decisions are extremely tactical, short-term oriented and present consequences downstream," says Ben Pring, research vice president at Gartner. "An uncertain economy is not a slam dunk for outsourcers, but some very bad deals can be put in place. It's the ugly truth of outsourcing."
Outsourcing in a tight economy can represent a classic case of "You get what you pay for" to enterprise IT executives, analysts say. For instance, companies looking to squeeze costs out of a contract with a service provider can suffer degraded service levels without too much concern from the outsourcer, Pring explains. Passing responsibility for infrastructure, applications or staffing to a third party -- regardless of the economy -- leaves the customer vulnerable to the whims of the provider and can make outsourcing a risky proposition in tough financial times.
In the short term the deal may help a company's bottom line, but long term, enterprise companies need high-quality services to better compete.
"Oftentimes, you are going to be disappointed with the level of cost reduction you can achieve in an outsourcing deal, and if that is all you are focused on, inevitably, it will produce a bad deal," Pring says. "Historically customers get lousy quality of service when trying to squeeze an outsourcer. . . . Really the service provider has no incentive to invest in better services or staff for a smaller contract."
The trend toward shorter-term contracts and smaller deals will continue in 2009, according to global sourcing advisory firm TPI, and despite the rush to reduce expenses, contract negotiation cycles have lengthened -- which could benefit both customer and provider.
"One of the big concerns for TPI is that in the angst and rush to judgment, decisions could be made without fully evaluating the risk," says Mike Slavin, partner and managing director for CIO Services North America at TPI. "Once a firm has made the decision to outsource, it is a pretty hard decision to come back from in the short term."
Because economic uncertainty for 2009 has reached a fever pitch, enterprise IT executives need to be smarter than ever about signing services contracts. By no means should companies abandon outsourcing as an attractive option, but when looking to strike a deal with a service provider, enterprise IT executives need to take a step back from the need to reduce expenses immediately and think about IT needs a year or more from now, analysts say.
"It's so tumultuous that there is no clear line of sight in terms of when the end of the upset will come. There is still too much we don't know," says Christine Ferrusi Ross, vice president and research director at Forrester Research. "A lot of clients coming up on renewals are being extremely cautious and taking their time to weigh risks and get the right deal for them."
Analysts advise enterprise IT decision makers to research service providers' financials, product road maps, deal flow and turnover. Outsourcers are not safe from the current economic conditions and may not survive the storm any better than others. "There has to be a big focus on vendor risk and vendor viability. The deal may sound great now, but if the vendor goes south in six months, where does that leave you?" Ross says.
Another key factor to investigate is service levels. Talk to providers' current clients and determine if they have begun to slip on service levels or started to reveal gaps in service, analysts say.
"A lot of these outsourcers are not immune to the meltdown that is occurring; many of the bigger ones have been working with financial services institutions, and inevitably that will hurt even the biggest service providers," Gartner's Pring says.

Watchdog Group Asks Google to Create Personal Data "Opt-Out"

The nonprofit group Consumer Watchdog asked on Friday that Google give users of its search engine the ability to "opt out" of leaving personal data, such as IP addresses, on Google's servers.
"Many people don't understand that the kind of unnoticed conversations that are going on between them and [Google's servers]," said John Simpson, policy advocate at Consumer Watchdog. "Some of that can provide a useful, helpful service to the user, but people need to know what they're providing and made informed judgments about whether they want to or not."
Search vendors have of late moved in varying degrees to assuage user privacy concerns, albeit not to the degree groups like Consumer Watchdog would prefer.
Yahoo recently said it will anonymize most personal data it collects after three months; in September, Google said it would keep such information for nine months, halving its previous policy of 18 months. Microsoft, which now retains the data for 18 months, said recently it would drop that to six months if other search vendors agree.
Consumer Watchdog is targeting Google because the company is such a dominant player in Internet search, Simpson said. "It's really the opportunity for them to become the gold standard for privacy on the Internet. If they can be made to see the light, then others will fall into line too."
Simpson said he recently became aware that Ask.com provides a service called "AskEraser," which allows site users to scrub their personal information from the company's systems.
"That's what prompted us to say [to Google], if these guys can do it, why can't you?" Simpson said.
Consumer Watchdog is also asking Google chairman and CEO Eric Schmidt for an in-person meeting to discuss their request.
"I'm very optimistic that he's sincere about listening to people's concerns," Simpson said. "I fully expect we'll have a meeting at some point."
Google did not immediately respond to a request for comment Friday.

Monday, December 15, 2008

Boost Your Corporate Blog

Forrester Research analyst Jeremiah Owyang has devised a "health check" to establish whether your corporate blog has the right stuff -- the only problem being that you don't have a corporate blog. Probably. No, almost certainly.
Owyang's checklist is worth reading if you do, however, and it vicariously points out what's missing from most corporate blogs -- that is, trust. But the elephant in the room is the absence of industry leaders prepared to discuss in open forum what's really going on inside their heads.
The PR folks at Sun Microsystems did a tremendous job in persuading CEO Jonathan Schwartz to start a blog that helped make the firm appear less dot-com relic and more relevant computing company, at least for a while. Schwartz lived up to the promise that Robert Scoble and Shel Israel outlined in their book Naked Conversations -- that blogging can bring firms closer to customers, removing the fangs and even (gasp!) humanizing the business by providing direct links to the people behind the decisions.
For a while, corporate blogging -- or 'clogging', as the web's relentless search for neologisms dubbed it -- was all the rage, but it seems to me that its fling was short-lived. The business blogging community now lacks big names and depends on the egotistical and the occasional. The groundswell was just that, and the big names almost never joined in, apart from a select, and perhaps self-serving, few from within the tech sector.
I learned this as part of researching a commission for a forthcoming piece about CIO bloggers (and, by the way, there are too few of those also, although enough to make a feature).
It's not like Twitter or other micro-blah services have replaced the cloggers. It's just that firms seem to have decided that blogging is not worth the candle when it comes to describing what's going on in the boardroom and the executive decision-making salon.
Call me an old-media dinosaur, but I'm struggling to find useful naked conversations out there when it comes to blue-chip entities. I love blogging for its access to specialist areas, but its effect on the visibility of top people at name-brand companies is negligible. I call that a shame, but if I'm being unfair and you know blogs that are business-orientated, written by the bylined author and useful, do let me know and I'll take it all back.

Can Open Source Help the Economy?

In the last major economic downturn, Linux established itself as a widely accepted enterprise operating system, benefiting a lively ecosystem of vendors such as Red Hat and Novell. The return of tough economic times puts the open source alternative again front and center, this time with the focus on databases and higher-level software applications.
I believe we've entered another era for open-source companies of all stripes. IT decision makers need to fight the financial crisis and they need a more efficient solution for critical enterprise system and IT needs.
As IT costs grow and the economic crisis puts pressure on global IT budgets, open source becomes irresistibly attractive to developers and IT decision makers who are being asked to do more with a whole lot less. Meanwhile, proprietary vendors react by increasing license fees by 15 percent to 45 percent, continuing to lock in their customers, and taking away independence and flexibility across the enterprise technology infrastructure.
That's why open-source solutions are more attractive than ever.
During the last economic downturn in 2001-2002, open-source usage and adoption was on an upward curve. Red Hat, for example, began winning large customer accounts that are now the backbone of their customer base. CIOs and CTOs were on the lookout for innovative ways to save costs both from a technology and people perspective, and open source was a great solution. Just like it is today.
Red Hat began to see the fruits of their labor in late 2002; the company grew revenue 14 percent for the year, and that growth improved to 38 percent and 58 percent in 2003 and 2004, respectively. Given the timing of subscription revenues and long sales cycles, it is not hard to conclude that during the 2001-2002 economic downturn, large corporations made the decision to switch to open-source technologies. It also explains why Novell paid $200 million for Suse Linux in late 2003, which, at the time, was roughly 20 times its revenues.
Just as in the last downturn, every IT decision maker today is faced with increased license and spiraling support costs for complex proprietary solutions. But business demands critical new capabilities at lower costs. Now is the time for IT leaders to make a loud and clear choice: accept the extraordinary expense and "lock-in" of proprietary vendors, or take advantage of open-source's cost-effectiveness and freedom.
With freedom also comes faster innovation.
My experience at the New York Stock Exchange (NYSE) was that we could innovate more rapidly with open source through rapid technical collaboration and by eliminating the long legal and contractual delays of the proprietary software model.
An open-source user who has worked with both Linux and Ingres agrees. Alan Nidiffer, VP and CIO at C&K Market, a West Coast grocery chain, recently explained why open source offers more innovation and faster development times. According to Nidiffer, the set release schedules of traditional software companies slow down innovation, whereas open-source improvements come at any time. He notes that innovative features can even come from software developers outside of his company; they have fresh ideas on how to continuously improve the applications.
Open source now provides a complete stack of enterprise-grade software backed by excellent support, and rivals the technological strength of traditional proprietary vendor offerings. Successful implementations for mission-critical workloads continue to dismiss concerns about support, security and reliability. As more customers share their open source success stories, first-time open source users now have powerful innovators and role models to follow.
For example, I met recently with a customer who processes more than $100 billion a day in fund transfers for 160 banks using an open source stack that includes the Ingres database and Red Hat's JBoss application server. Meanwhile, customers from diverse industries spanning airlines (Lufthansa) and manufacturing (PPG Industries) to grocery store chains (C&K Market) and government agencies (National Center for Missing & Exploited Children) are embracing open-source solutions to manage millions of pieces of information critical to run and maintain their businesses, and even save lives. These companies trust open source to do the jobs that must get done, and appreciate the accompanying cost savings, innovation and freedom.
These pioneering companies are choosing to forgo proprietary solutions and adopt open-source offerings from companies such as Alfresco, JasperSoft and Ingres, ranging from core database systems to emerging stacks that support enterprise content management, document management, transaction processing and business intelligence.
Savings matter.
According to a recent report from Forrester Research, companies can save as much as 25 percent of their database costs by switching to open-source databases and have the potential of an additional 25 percent savings in hardware costs by using open source on commodity servers. C&K Markets' Nidiffer says his company has saved nearly 20 percent using the Ingres open-source database.
In the coming months and years, I predict that many companies will experience open source benefits, including greater innovation and much lower costs, while delivering the enterprise performance that companies require.
More than 10,000 of our worldwide customers, in sectors ranging from financial services through manufacturing and distribution to the public sector, prove this equation each day.
It is a new time for open source to show its true value as the alternative choice to companies that will be hit hard by the financial crisis.

Web 2.0 Tactics for Successful Job-Hunting

With unemployment at a 14-year high and 240,000 workers laid off in October alone, many Americans are scrambling to update their resumes and turning to job boards and networking sites. Some are panicking as they try to devise new ways to get in front of employers. But even in trying times like these, prospective employees shouldn't completely reinvent their job-seeking styles.
Indeed, much of the tried-and-true career advice we've all heard is still relevant in your next job search. To outshine your competitors and win the gig in today's economy, here's a secret to success: Don't abandon the steadfast career tips passed down through the generations, but rather, refine them -- with a keen eye for the value in Web 2.0 tools like social networking.
Whether you're one of many IT professionals out of work or among the few making career leaps despite rocky economic times, consider these five ways to express your candidacy with flair.
Self-assess to stand out. Assess your core strengths, as well as qualities that will set you apart from the competition. Then strategize ways to emphasize these qualities in your resume, cover letter and the interviewing process.
For example, tailor the experience, skills and education sections of your resume to the position you're applying for. Use keywords from the job posting, employer's Web site and any related articles on the company.
Also, be specific with numbers. List how many employees you've managed, systems you've administered or applications you've developed.
Letters of reference can also set you apart, if they are particularly compelling and give an accurate description and concrete examples of your talent, work ethic or past successes. Try reaching out to references who have expertise that relates to the field or company you're applying for. Your prospective employer might take confidence in knowing you were trained or mentored by others with similar goals, interests and objectives.
Keep your skills a step ahead. To stand out and stay on the cutting edge, demonstrate fluency in state-of-the-art technical and functional skills as well as the standard competencies a specific role demands.
If you're a Visual Basic programmer, for example, don't settle for expertise in ASP.Net, VB.Net and SQL. Enhance that with experience in data warehouses, OLAP analysis tools and Business Objects reporting. For a Java application developer, Fatwire CMS experience is a plus.
Stay current in your field by reading trade publications, news articles, blogs and market research reports from firms such as Gartner. And attend a conference in your niche to build your knowledge and to network. You might find someone who can connect you to a valuable contact, or even meet your next employer.
It's also important to have the necessary management skills, such as leadership capability, project management knowledge and mentoring experience. According to a March 2007 report from Forrester Research, 55% of the 280 IT decision-makers polled cited project management expertise as a missing skill among techies.
Remember, It's Who You Know
Network out of the box. As tried-and-true career advice suggests, networking is key to landing a gig. Monster.com and HotJobs.com are great starting points, but many positions are never publicly posted.
So don't be embarrassed to spread the word to your friends, former colleagues and contacts. Inform them that you're currently in the market for a new job, and don't be shy about asking them to put you in touch with any of their relevant contacts. It's likely you'll be able to return the favor some day.
But don't stop in the physical world. Social networking sites such as Facebook and LinkedIn, as well as microblogging tools like Twitter, make it easy to initiate relationships with new contacts. Remember, however, that this is just a launching pad. It's up to you to develop and grow these relationships.
When you invite someone to join your social network, include a personalized note. Mention mutual friends or contacts, if any, or call attention to any specific interests that you share.
Think of activities in your past that make building bridges easy, and start with those organizations. Colleges, high schools, former employers and home towns often provide the links that allow an initial conversation to build into a potential referral.
Team up with a staffing firm. Staffing professionals closely follow the job market every day and have extensive knowledge about trends in your field and the most updated inventory of available positions. They know what skills and experience top employers are looking for and can guide you through the entire recruitment process, from searching for a job to negotiating salary and other benefits once the position is yours.
With a trusted and diverse network of hiring managers at their fingertips, staffing firms have the resources to help find suitable placements for job seekers looking for short- or long-term assignments. They have extensive information about employers' backgrounds, enabling them to match a position and company to a job seeker's skills and priorities.
Such firms are increasingly rolling out online resources for job seekers to tap into. At Yoh, for example, our online career database provides over 400 listings of contingent and direct hire jobs, sorted by position and location. In addition, our online career resources provide tips on subjects ranging from preparing for a telephone interview to writing a proper letter of resignation.
Upgrade your online image. Networking online begins with tools such as LinkedIn and Twitter, but there's much more to consider -- such as your digital footprint. Today, it's easy to run a Google search and instantly be connected to your various online posts, personal blog and Flickr page of photos that trace back to your college days.
In fact, according to a 2008 CareerBuilder survey, 22% of employers check candidates' Facebook profiles before hiring them, up from 11% two years ago. What's more, one-third of hiring managers rejected candidates based on what they found, CareerBuilder says.
To ensure you're not one of the candidates who get rejected prior to an interview, be mindful of what you post online, and continually consider what others might assume or perceive from what you share. You can also protect your online persona by implementing privacy restrictions, such as displaying a limited public profile on Facebook or using invite-only photo-sharing on Flickr.
Also make an effort to connect with others who share your interests. For example, join online networks that directly relate to your personal interests or career field. For IT professionals, this might include SNetBase.com, 9Rules.com or Fark.com.
To take it a step further, visit the blogs of leading experts in your field, and post your comments or reactions. These comments are likely to turn up in a Google search if a prospective employer looks for your name. Your opinions in the comments reflect your knowledge and insights and can increase your credibility and marketability. Moreover, you'll initiate relationships with these bloggers and open yourself up to learn more and grow.
Better yet, publish your own blog, where you can provide expert insight and analysis of your field. This will surely give you a voice among your peers, and you'll be on your way to developing a community of like-minded individuals in the blogosphere.
Negotiate wisely. "What's your salary range?" That can be a loaded question, and there are do's and don'ts when answering it. A number too low can cost you thousands of dollars, but a figure too high might take you out of the running.
Do have a clear range in mind. This involves doing your homework. Since salaries are usually tied to the competitive market, it's important to know the average pay of others in a similar field, position, experience and geographic location. Fortunately, there are a number of reliable sites to keep your salary expectations in line with market offerings.
At Glassdoor.com, visitors can view salaries of more than 14,000 companies for free. The site features candid company reviews, including detailed pros and cons of everything from work hours to culture, as well as opinions about senior management and tips on how to rise through the ranks. All posts are anonymous, but here's the catch: It's a "give-to-get" model, so be willing to share insights about your company in order to receive those about others.
PayScale.com also provides the inside scoop for job seekers, employees and employers. After visitors complete a personal profile and multipage survey, the site delivers a free PayScale Summary Report tailored to the visitor's requested company. Like at Glassdoor, profiles are anonymous and forthright.
So continue to follow that tried-and-true career advice you've relied on for years. But to leverage it for success in today's market, use social networking sites and the wealth of information on the Internet to your advantage.
You can expand your opportunities, differentiate yourself in the talent pool and tap into new resources to more effectively network, apply for positions, interview and, of course, close the deal.

Monday, December 8, 2008

SugarCRM Adds Hooks to Cloud Data Services

Commercial open-source CRM (customer relationship management) vendor SugarCRM said Monday it will give customers the ability to plug in feeds from third-party data sources like the business social-networking site LinkedIn.
The new "Cloud Connectors" feature is part of the vendor's new SugarCRM 5.2 release, which will be available worldwide this month.
While users could obviously tap such third-party services separately, SugarCRM created the new integration capability because it keeps users in a CRM context and makes the process more convenient and efficient, said Martin Schneider, director of product marketing.
If you're logging into third-party sites "while you're on the phone with someone, you're going to be hemming and hawing and you're not going to have it at your fingertips," he said. "The idea is to drive adoption and keep people in one space, but also give them unfettered access to bringing content into the CRM system."
Pop-up windows, which SugarCRM is calling "Cloud Views," will display relevant information, such as which of a user's LinkedIn connections work at a certain company. Users can also import this information into SugarCRM.
The Cloud Connectors are made possible by a new data services framework that Schneider characterized as "a toolkit for developers or really astute users to bring in any type of data source."
Other new features in SugarCRM 5.2 include:
-- "Social Feeds," which provide the types of alerts and status details common to social-networking sites, applied to a CRM milieu. Salespeople could set up alerts to tell team members about newly closed deals, for example. "It's kind of like using an internal Twitter inside Sugar," Schneider said.
-- More granular administration capabilities that can give various department leaders authority to manage specific parts of the system, without the need for IT involvement. For example, a company's director of marketing might be granted admin rights over the campaigns module, Schneider said.
-- "Portal Dashlets," which can display information from third-party Web sites inside SugarCRM.
One industry observer said the Cloud Connectors feature is similar to past efforts by CRM vendors like Salesforce, but is nonetheless "a great idea."
"It seems to me that offering the ability to connect CRM and social apps like LinkedIn makes sense," said Denis Pombriant, managing principal of Beagle Research Group in Stoughton, Massachusetts. "It's essential to the next iteration of CRM that we have good embedded social-networking tools."
451 Group analyst China Martens said via e-mail that SugarCRM has been talking for some time about adding more collaborative features to its software, but "had a couple of other major calls to make before turning its attention to social CRM, notably, the revamping of its on-demand architecture with Sugar 5.0 and then the improvement of its mobile support in Sugar 5.1."
Martens speculated on whether SugarCRM will expand on the Cloud Connectors idea, using information from sources like LinkedIn and Hoover's "to prepopulate its CRM apps from the get-go, so a customer in a given industry can sign up for their own version of Sugar and download it already complete with predefined prospect contact details."
Pricing for on-demand versions of SugarCRM starts at US$40 per user per month, while costs for the on-premises edition begin at $275 per user per year.

Salesforce Links Force.com to Google App Engine

Salesforce.com is set to announce Monday that it is connecting its Force.com development platform with Google's App Engine.
The news, which Salesforce CEO Marc Benioff is expected to discuss during a company event in New York, follows Salesforce's recent announcement of a similar arrangement with Amazon Web Services' Elastic Compute Cloud (EC2) and Simple Storage Service (S3).
Google's App Engine, which is still in preview mode, is aimed at developers who want to quickly and easily build scalable Web applications, while AWS is positioned as a more generalized, flexible infrastructure platform for serving all types of programs.
Meanwhile, Force.com provides a database, Java-like programming language, integration and workflow capabilities, and user-interface design tools for creating business applications that run on Salesforce's cloud infrastructure.
The results from this hook up remain to be seen, said Adam Gross, vice president of developer marketing at Salesforce.
"Obviously, it's up to the imagination of developers what they create," Gross said.
In a statement, Google said the integration "will foster the creation of new Web applications and further demonstrate the power of the Web as a platform."
One industry observer is expecting big things to happen over time as cloud platforms merge in this manner.
"We're really talking about the invention of apps that don't exist right now, that will exist at the intersection of CRM [customer relationship management] or more broadly, business applications and front-office applications," said Denis Pombriant, managing principal of Beagle Research in Stoughton, Massachusetts. "Or front-office applications and social-networking applications. This opens up a door, or maybe a couple of doors, to really new innovation."
Monday's announcement is the latest stage in Salesforce and Google's relationship -- which has also resulted in an integration between Salesforce and Google Apps -- and could prompt another round of speculation that the search engine giant will eventually buy Salesforce.
But Pombriant hopes such a deal doesn't materialize.
"I think it's highly important that the two companies remain separate, and that they continue pursuing their own unique paths toward platform integration," he said. "I don't think you can have a cloud computing era happen if all the clouds are owned by the same company."

HP Updates Desktop Virtualization Software

Hewlett-Packard hopes to widen the use of its desktop virtualization products with new software that will improve video playback and allow the use of USB peripherals such as webcams, the company announced Monday.
HP is also rebranding its desktop virtualization suite as the HP Virtual Client Essentials, and adding Linux support for its broker software, called Session Allocation Manager, which runs only on Windows today, HP said.
Most of the updates concern HP's Virtual Desktop Infrastructure suite, which allows a company to run multiple images of a desktop OS in virtual containers on a server, instead of having to manage a separate OS on each employee's PC.
Virtualized desktops are catching on at some businesses, but companies need to provide workers with an experience similar to what they'd expect from a standard desktop PC, and that hasn't always been the case with multimedia content, said industry analyst Roger Kay, president of Endpoint Technologies Associates.
HP said it has solved that problem by developing an enhanced version of Microsoft's Remote Desktop Protocol, which transfers presentation data between thin clients and Windows applications running on a virtualized server.
The existing RDP works fine for relaying basic on-screen data, such as keyboard strokes and mouse movements, but it's not good at carrying rich content such as a training video or webcast, said Manoj Malhotra, product marketing manager for HP's Client Virtualization group.
"The server gets overloaded when it tries to decode a video stream for a large number of users, and some employees end up having a poor experience," he said.
HP's enhanced RDP shifts the burden of decoding video away from the server and onto the thin clients, he said. That will allow companies to stream video to a large number of employees without a deterioration in performance, he said. The new protocol also lets them plug in a wide range of USB peripherals, which don't work well with the existing RDP, according to HP.
The enhanced protocol will be preinstalled starting in January on HP's thin clients running Windows XP Embedded, and on its Linux thin clients later in the first quarter, Malhotra said.
HP said the enhanced RDP is aimed at basic productivity workers. The company also has its own RGS (Remote Graphics Software) protocol, which it positions for applications that use higher end graphics, such as CAD programs, or that multiple users access at once.
While the enhanced RDP will be free, HP charges for its RGS protocol. But on Monday it said it has cut the price of RGS to US$35 per seat, from "between $99 and a few hundred dollars" per seat, Malhotra said. It's also allowing customers to use RGS on non-HP servers, which previously was not permitted, he said.
He predicted that the RGS protocol will become more widely used, but HP still expects the enhanced RDP to be used for about 75 percent of virtual desktop deployments.

Businesses Ban Surfing, Study Shows

Nearly two thirds of UK businesses ban access to inappropriate websites and monitor employees' Internet activity, according to the Chartered Management Institute (CMI).
Research by the CMI, in association with Ordnance Survey, also revealed that 18 percent of businesses impose a curfew on when employees can access the web. However, 72 percent of employees questioned said they relied on the web for "professional development," while 59 percent also claimed it was a "useful research tool."
The CMI also highlighted that many UK organizations are slow to adopt Web 2.0 technology, with only 39 percent of UK businesses revealing they are happy for employees to use web-based applications such as Google Docs.
Jan Hutchinson, director of HR and corporate services at Ordnance Survey, said: "The low level adoption of new technology is in tandem with employers' belief that internet usage is a 'time waster.' It's something that must be looked at because the longer this situation is allowed to remain unchallenged, the greater the likelihood UK employers will fall behind their international competitors."

Vista Customer Satisfaction Climbs, Microsoft Claims

Microsoft Asia Pacific has responded to researchers' claims that enterprise chief information officers "have not warmed to the Vista operating system over the past year."
The northern hemisphere researchers said that the CIOs they recently surveyed voted 11 to one against plans to implement Vista. The researchers concluded that "CIOs and other heads of IT still believe there is little business value in migrating from XP to Vista, especially in the current economic climate."
Richard Francis, General Manager, Windows Client Group, Microsoft Asia Pacific, said that, overall, sales of Windows Vista licenses have passed 180 million since launch, and at least 100 million Windows Vista users have actively hit Windows Update.
He said that, in the first half of the calendar year 2008, Windows Vista had 23 per cent fewer vulnerabilities than Windows XP, and 21 per cent fewer high-severity vulnerabilities.
More than 77,000 components and devices are now supported by Windows Vista SP1, triple the number Microsoft supported at launch, said Francis. According to the Windows Vista Compatibility Center, Microsoft now lists more than 2,000 printers, 220 scanners, 135 webcams, 485 digital cameras, and 180 media players as compatible with Windows Vista.
Positive Asia Pacific feedback
"We have received positive feedback from our enterprise and SMB customers in APAC who have experienced the business benefits of Windows Vista," said Francis. "The history of the operating system lifecycle clearly suggests an increase in deployment 18 months into the launch or close to release of SP1. This is the right time and we are seeing customers starting to deploy, especially with Windows Vista SP1.
"Windows Vista satisfaction is actually increasing over time -- research tells us customer satisfaction levels increase among those who bought Windows Vista during just the last six months. This data also shows that favourability increases as people become more familiar with it. We are also on track to a faster rate of deployment in the enterprise compared to past releases including Windows XP and Windows 2000 during the same timeframe.
"As a trusted partner to our customers, we are delivering the technologies they need to optimise the performance of their IT investments during these challenging economic times," he said. "The combination of the Microsoft Desktop Optimization Pack (MDOP) and Windows Vista gives customers the tools to support more flexible work environments for their users, while making it easier and more efficient to manage and maintain their networks.
Francis said that Borneo Motors in Singapore, for example, now saved half an hour per machine using Windows Vista's secure remote deployment, and had reduced desktop support time by 25 per cent with improved Remote Assistance and the Event Viewer in Windows Vista.
Francis said that other Windows Vista customers in the region included Gleneagles and Pernec in Malaysia, Satyam and ONGC in India, Kiwi Bank in New Zealand, and the Australian Customs Service, to name a few.

Friday, December 5, 2008

Networking Glitch Knocks Yahoo Offline for Some

A networking problem made Yahoo's Web site unreachable for many users on Wednesday.
The problem, first observed at around 11:40 a.m. Pacific Time, appears to have primarily affected users in the eastern United States and Canada who were trying to reach the www.yahoo.com domain.
Network engineers on the NANOG (North American Network Operators Group) discussion list reported that when they tried to reach www.yahoo.com, they were sent on the Internet's version of a wild goose chase. DNS (Domain Name System) servers redirected traffic to another Yahoo domain, www.wa1.b.yahoo.com, which was not associated with an IP address. In other words, computers trying to find Yahoo's Web site were sent nowhere.
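The dead-end described above, an alias (CNAME) record pointing at a name that has no address (A) record, can be sketched with a toy resolver. The record data below is hypothetical and only mirrors the outage as reported; it is not Yahoo's actual DNS configuration:

```python
# Toy DNS resolution: follow CNAME aliases until an A record
# (an IP address) is found, or fail if the chain dead-ends.
RECORDS = {
    # Hypothetical records mirroring the reported outage:
    # www.yahoo.com is aliased to a name with no A record at all.
    "www.yahoo.com": ("CNAME", "www.wa1.b.yahoo.com"),
    # A healthy name, for contrast (address is illustrative).
    "example.com": ("A", "93.184.216.34"),
}

def resolve(name, max_hops=8):
    """Return an IP address for name, following CNAME aliases."""
    for _ in range(max_hops):
        record = RECORDS.get(name)
        if record is None:
            # The alias chain led to a name with no records:
            # browsers are effectively "sent nowhere."
            raise LookupError(f"{name}: no address record (dead-end alias)")
        rtype, value = record
        if rtype == "A":
            return value
        name = value  # CNAME: keep following the alias chain
    raise LookupError("too many CNAME hops")
```

Here `resolve("example.com")` succeeds, while `resolve("www.yahoo.com")` raises a `LookupError`, which is roughly the failure browsers saw during the outage.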
The problem appeared to have been resolved by about 1 p.m. Pacific Time.
Yahoo did not have much to say about the outage. The company confirmed in a statement that it had "a disruption in service earlier today that affected users in some geographic areas."
One NANOG poster, apparently a Yahoo employee named Matthew Petach, reported that the problem was triggered by a Juniper T1600 router that "went kablooie."
"This is primarily affecting traffic coming through Ashburn, Virginia," Petach wrote at 1:16 p.m. Pacific Time. "We're aware of the issue and have put workarounds in place; you should be back up and functional for the moment, though not in an optimal state."
The kind of router failure described by Petach could easily have accounted for the problems reported on NANOG Wednesday, if Yahoo's DNS nameservers were behind that faulty router, said Cricket Liu, vice president of architecture with DNS appliance vendor Infoblox.
Yahoo appeared to have fixed the problem by moving some of its DNS services to nameservers run by Akamai, a company that helps sites deliver Web content, Liu said.
"It sounds like maybe they changed ... to Akamai to save their network bacon," he said.

Consolidation May Hit Worldwide PC Market, IDC Says

Tumbling demand could affect PC makers next year, leading to industry consolidation, IDC said on Wednesday.
Competition among PC makers could intensify as consumers and enterprises tighten budgets during the economic downturn, creating a stagnant market for PC makers, said Richard Shim, personal computing research manager at IDC. That could lead to fewer opportunities for PC makers to sell their products.
The PC market is already pretty mature globally, so the lower-than-expected shipments and falling prices could create consolidation in the PC industry, either through acquisitions or by forcing competitors out, Shim said.
In mature markets like the U.S. and Europe, smaller PC makers may be forced out by larger competitors, Shim said. However, in emerging markets the smaller PC makers are ripe for acquisition, as larger PC makers are always trying to expand their customer base.
PC shipments worldwide are expected to grow by only 3.8 percent in 2009, a dramatic drop from the 13.7 percent growth the firm predicted earlier this year. Growth of PC shipments for 2010 has been lowered to 10.9 percent.
While Dell reported slow growth in PCs shipped for the quarter ended Oct. 31, companies like Hewlett-Packard and Apple have defied the economic downturn, reporting consistent growth in shipments. HP reported a 19 percent rise in unit shipments year-over-year for its most recent financial quarter, while Apple saw 21 percent growth in Mac shipments for the quarter ending Sept. 27.
Apple has a good chance of recording solid growth through the economic downturn compared to other PC makers, Shim said. Historically Apple has outpaced the industry, as it has a loyal customer base willing to pay higher prices for PCs, Shim said.
Consumers will show more preference for laptops over the next few years, with shipments outpacing those of desktops, IDC said. Laptop shipments are expected to grow from 168 million next year to 285.7 million in 2012, compared to desktops, which will grow from 145.8 million in 2009 to 156.6 million in 2012. IDC has not included handhelds like PDAs in the numbers.
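As a quick sanity check on the figures above, the implied compound annual growth rates over the 2009-2012 span work out to roughly 19 percent a year for laptops versus about 2 percent for desktops:

```python
def cagr(start, end, years):
    """Compound annual growth rate between two shipment figures."""
    return (end / start) ** (1 / years) - 1

# IDC shipment projections, in millions of units, 2009 -> 2012
laptop_growth = cagr(168.0, 285.7, 3)    # roughly 0.19, i.e. ~19% per year
desktop_growth = cagr(145.8, 156.6, 3)   # roughly 0.02, i.e. ~2% per year
```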
The growth in laptop shipments will be driven partly by larger shipments of netbooks, or mini-laptops, which are small, inexpensive laptops with screens of up to 12 inches, Shim said. Netbooks are shipping in larger volumes in emerging markets because of their lower prices. However, consumers in the U.S. haven't figured out how to effectively use netbooks, as 20 percent of buyers return them, Shim said.
After years of double-digit growth, developed countries will see slower PC shipment growth because of the economic downturn. Shipments in the U.S. will decline by 3 percent in 2009 and continue to grow slowly in the coming years, while countries like Japan and Canada will see low single-digit growth. Growth in Western European countries is expected to continue at 6 percent in 2009, driven by increased laptop shipments, but it will be a giant drop from the 20 percent growth it is expected to record in 2008.
PC shipments in emerging Asia-Pacific countries will outpace mature markets, increasing by around 7 percent in 2009 and jumping up by around 18 percent to 20 percent in the following years, IDC said.
After being the fastest-growing markets over the past few years, emerging markets in Latin America and Central Europe will see PC shipments drop through the third quarter of 2009 due to falling prices and currency fluctuations. PC shipments are expected to increase at a slower pace in certain Middle East and African countries compared to recent years.

Facebook, Google Launch Data Portability Programs to All

Google and Facebook separately announced the general availability of their respective data portability programs on Thursday.
Google Friend Connect and Facebook Connect are generally designed to extend social-networking capabilities broadly across the Web.
In the real world, this means making it possible for people to use their previously created Google and Facebook accounts to sign in to other Web sites that accept them. That way, people don't have to create an account for every Web site that requires one, reducing the number of log-in details they need to remember.
MySpace's Data Availability Initiative has a similar mission.
These programs also aim to let people take the content they have entered into Google, Facebook and MySpace, like profile information, photos, notes, contact lists, comments and status updates, and port it elsewhere.
In its announcement on Thursday, Google said Friend Connect is now available to any Web site publisher and that the social features available can be added by copying and pasting snippets of code, so advanced technical knowledge isn't necessary.
To access Friend Connect features on a Web site, people can log in using not only their account information from Google but also from Yahoo, AOL and the industry standard OpenID, Google said.
Meanwhile, Facebook urged its users to contact their favorite Web sites and encourage them to implement Facebook Connect, which is already running on places like Citysearch, CNN's The Forum and CBS' The Insider.
"Obviously our launch partners don't cover all the websites you use on a daily basis, so if you want to see this list grow, get in touch with your favorite websites, developers, and services, and tell them you want to connect. With your help, we can all share more information across the web," wrote Facebook founder and CEO Mark Zuckerberg in a blog posting.
Still, the grand vision of widespread and seamless data portability is far from complete, as these and other initiatives are fairly recent, and important technology and privacy issues remain unsolved.
For example, days after the initial announcements of their data portability programs in May, Google and Facebook promptly locked horns and have been unable to work out their differences. Facebook blocked Google's Friend Connect service from accessing Facebook members' data, saying the Google program violates its terms of services because it redistributes Facebook user information to developers without users' knowledge.

Monday, December 1, 2008

LinkedIn Revamps its Search Tool

The LinkedIn business social network last week rolled out an overhauled search platform that it says will let users more easily find who they are looking for on the site.
The new search engine uses what Esteban Kozak, a senior product manager at LinkedIn, called a personalized relevance algorithm to pick out the most relevant users in the 31 million-member LinkedIn community.
In a blog post, Kozak noted that the searcher's network is a key factor in ranking search results. Therefore, every matching search result is evaluated based on who is executing the search. "The end result is a personalized relevance algorithm that places the professionals that are most likely to be of interest at the top of the first search results page. We synthesized over a thousand pieces of feedback and analyzed data from over a billion search queries" in creating the engine, he said.
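LinkedIn has not published the algorithm itself, but the core idea, that the same keyword matches are ordered differently depending on who is searching, can be sketched as a network-aware re-ranking. All names, scores and boost weights below are hypothetical, for illustration only:

```python
# Minimal sketch of "personalized relevance": boost results that sit
# inside the searcher's own network. Weights are invented; LinkedIn's
# actual ranking algorithm is not public.
def rank_results(matches, searcher_network):
    """Order name matches, favoring people in the searcher's network.

    matches: list of (name, text_score) pairs from a keyword match.
    searcher_network: dict mapping name -> connection degree (1 or 2).
    """
    def score(match):
        name, text_score = match
        degree = searcher_network.get(name)
        boost = {1: 2.0, 2: 1.5}.get(degree, 1.0)  # hypothetical weights
        return text_score * boost
    return sorted(matches, key=score, reverse=True)

# Hypothetical search for "Alice": raw text scores alone would rank
# Alice Smith first, but the searcher's connections are boosted.
matches = [("Alice Smith", 0.9), ("Alice Jones", 0.8), ("Alice Wu", 0.7)]
network = {"Alice Wu": 1, "Alice Jones": 2}
```

With these invented weights, `rank_results(matches, network)` puts the first-degree connection Alice Wu at the top despite her weaker text match, which is the behavior Kozak describes.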
New features include an "In Common" field, which lets users see which connections and groups they share with the people listed in search results. In addition, users can customize their views of results, adding or removing fields based on their own needs.
"We also saw in the data that many of you use search to get to your connections quickly," Kozak noted. "In order to make it more efficient, we developed a type-ahead widget that recommends connections as you type from any people search box."
Jason Kincaid, a blogger at TechCrunch, noted that the company expects the revamped search engine to streamline the features most often used by LinkedIn's community of business users.
"The engine also streamlines advanced search by presenting options in a more accessible menu (some of the features were previously available, but buried so that most users never found them)," he added. " Most of the new features revolve around people-search ... and while there isn't anything particularly exciting from the user's perspective the changes make the engine significantly more convenient (and will hopefully help the recently unemployed get back on their feet that much faster)."
One of the search engine's most powerful new features, he added, is persistent search, which allows users to set up alerts notifying them when a company hires an executive, or when a potential job candidate may be available for hire. "Before now there have been a few ways to create similar notifications but this is the first time that LinkedIn has integrated this functionality," Kincaid noted.

Does E-Mail Really Make You Happy?

While a survey of Canadian chief information officers showed the preferred mode of communication among IT staff is e-mail, this may not be the best way to gauge employee satisfaction, according to an executive with Robert Half Technology International.
The study by the Toronto-based IT staffing firm, based on interviews with more than 270 CIOs from across Canada, found that 49 per cent of respondents preferred using e-mail to communicate with each other at work.
"I find it sad that it's 49 per cent," said Sandra Lavoy, Ottawa-based regional vice-president with Robert Half Technology. The issue with e-mail, said Lavoy, is that it doesn't let the reader gauge the sender's tone as easily as a phone call or face-to-face conversation would.
E-mail is, however, a good mode of communication for quick exchanges and for keeping written records of certain decisions, such as setting a project deadline or arranging a meeting time, said Lavoy. E-mail also works well for following up on previous conversations, she added.
The study also found that 34 per cent of respondents preferred in-person conversations, while six per cent preferred phone calls.
Lavoy thinks that CIOs probably rank e-mail higher than face-to-face and phone conversations simply because they often rely on mobile devices like Research in Motion Ltd.'s BlackBerry to stay in touch with the office when they're on the road. Besides, she said, it's common now for employees to travel frequently and have to communicate across different time zones, or have flexible work arrangements that allow them to work from home or work an irregular work day.
"But the problem in this volatile market that we're in is it's really important for us as employers to make sure we have these face-to-face discussions," said Lavoy. "The No. 1 reason why employees leave is lack of recognition; that could be something we're not addressing."
Given that the use of mobile devices like BlackBerrys is common across all organizations, the study did not find any variation in attitudes between CIOs from different-sized companies.
Respondents also reported their preference for instant messaging to be on par with phone calls.
Lavoy said that, given the state of the economy, businesses would benefit from collaboration channels like videoconferencing and conference calls, which help "if you're going to have a discussion with multiple people and you want to get some resolution quickly."
However, Lavoy acknowledged that even videoconferences and conference calls lack that human touch because often there is a large number of attendees on the call with just one moderator to direct the proceedings.
Despite these findings, however, respondents still found that the prevalence of new technologies like handheld devices, instant messaging and text messaging in the workplace enabled better connectivity overall. Thirty-one per cent of interviewees said they felt much more connected and 26 per cent reported being somewhat more connected.

Malware is Getting Smarter, CA Warns

Online attacks will be dominated by smarter malware and bots targeting Web users ranging from gamers and social network users to the elderly and unsuspecting parents.
This is according to IT management software company CA, maker of the CA Internet Security Suite, which was recently updated to the Plus 2009 version.
"Families should feel safe and secure when they are online," said Brian Grayek, vice president of product management for CA. "However, there are more online threats than ever before. While it's important for parents to practise general PC safety practices like not placing a PC in a child's room and monitoring social networking profiles, parental control software provides an added layer of protection and additional peace of mind."
"Each element of the suite, which includes a personal firewall, anti-virus, anti-spyware, anti-spam and anti-phishing software, has been enhanced to provide even stronger protection against a wide range of emerging online threats," said the software company.
The software is in a single console and is easy to use and install, allowing users to monitor the security status of all of the licensed PCs on their home networks, said CA. Other features of the suite include integrated parental controls that help protect children from inappropriate Web content and enable parents to monitor Internet activity. Also, the suite helps users back up and restore their important data and PC settings or transfer them to a new PC.
Grayek continued: "Historically, high-performance users, like gamers, turn off security features because it either slows down their PCs or they get pop-ups that interfere with their experience. This is a huge risk to their PC security. We've focused on developing a product that runs quietly in the background for uninterrupted gaming and movie watching, while keeping the PC secure."
CA Internet Security Suite Plus 2009 is currently available from CA for US$79.99 (MSRP), with a one-year subscription for up to five PCs in the household. More information is available at http://shop.ca.com/.

Adobe Rules in Web Video

Eighty-one per cent of worldwide online videos are viewed using Adobe Flash technology, making it the number one format for video on the Web. This is according to the independent research firm comScore.
Adobe Flash Player software is already installed on 98 per cent of Internet-connected desktops and a growing number of mobile devices. The company has just released two key components of the Adobe Flash Platform.
New offerings
These are Adobe Flash Media Interactive Server 3.5 and Adobe Flash Media Streaming Server 3.5. The Adobe Flash Platform delivers interactive content, applications, and video on the Web.
The new servers include new media delivery options, such as dynamic streaming, enhanced H.264 video and High Efficiency AAC (HE-AAC) audio support, and the ability to pause and seek within a live stream.
These innovations improve the quality of video delivered over the Web and offer richer interactive experiences for users, the company said.
The new versions advance the company's leadership in rich media and open significant new opportunities for content owners delivering interactive and social media applications, Adobe claims.

Friday, November 28, 2008

Botnets Can Trample Most Anti-Virus Programs

A new analysis of botnets has come up with a possible reason for their prodigious ability to infect PCs -- many anti-virus programs are nearly useless at blocking the binaries used to spread them.
According to FireEye chief scientist Stuart Staniford, detection rates are so poor that, on average, only around 40 percent of security software can detect binaries during the period of greatest infectivity and danger, namely the first few days after a particular variant starts being used by botnet builders.
In a detailed blog post, he describes how he uploaded a sample of 217 binaries, culled from FireEye appliances on customer premises between September and November, to the independent VirusTotal test website. VirusTotal runs 36 anti-virus programs -- a representative sample of the security programs used by businesses and individuals -- and gives researchers statistics on how many malware binaries have already been uploaded to the site by other researchers, when they were uploaded and how many were detected by each program.
Roughly half of the binaries picked up by FireEye were unknown to VirusTotal, a result indicative of the core problem of detecting botnet malware -- speed.
Because malware often uses 'polymorphism' -- programs are constantly changed very slightly to evade binary pattern detection -- the problem of detecting and blocking malware quickly is huge. According to Staniford, this makes it important that anti-virus programs can spot malware in the first week of its use.
"The sample is likely to get discarded by the bad guys pretty soon after that," he notes.
During the first three days after initial detection by FireEye, only four in ten anti-virus programs could spot the offending code, which suggests that many bots would evade security software on real PCs if attacks happened during this same period.
"The conclusion is that AV works better and better on old stuff -- by the time something has been out for a couple of months, and is still in use, it's likely that 70-80 percent of products will detect it," says Staniford.
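The weakness Staniford describes -- exact-signature matching that only catches samples once they are old news -- is easy to see with a toy sketch. The following is illustrative only (hypothetical payloads and a deliberately naive hash-based scanner, not any real AV engine): a single appended byte is enough to defeat exact-match detection.

```python
import hashlib

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Toy "signature database": hashes of known-bad binaries (hypothetical payloads).
known_bad = {sha256(b"EVIL_PAYLOAD_v1")}

def hash_scan(binary: bytes) -> bool:
    """Exact-match detection, the weakest form of signature scanning."""
    return sha256(binary) in known_bad

original = b"EVIL_PAYLOAD_v1"
variant = b"EVIL_PAYLOAD_v1\x00"   # one appended byte: a trivial repack

print(hash_scan(original))  # True  -- the known sample is caught
print(hash_scan(variant))   # False -- the "polymorphic" variant slips through
```

Real polymorphic malware goes much further than appending a byte, and real scanners use heuristics and behavioural analysis rather than bare hashes, but the arms race works on the same principle: every variant resets the detection clock.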
FireEye's appliances can be seen as an 'early warning' system because of the way they use behavioural analysis to spot malware in real time, in some cases days or weeks before a program has been formally identified and documented by security companies. By the time it has been spotted and a signature rolled out to anti-virus databases, however, it might already be too late.
Equally, many prominent security vendors will use similar techniques to spot malware as quickly as possible, making it surprising that so many anti-virus programs failed to spot FireEye's sample binaries. The reason might simply be the vast number of samples that appear in any given period.
What nobody doubts is the importance of botnets to the spread of malware and spam, as evidenced by the recent takedown of US hosting company McColo, which had been accused of hosting botnet controllers. In the hours after the company's demise, spam levels were reported to have plummeted.

Analyst: Mobile Data Will Continue Boom in '09

There will be more than 36 million laptops connected to mobile data networks in Western Europe in 2009, compared to the 26 million estimated for the end of this year, according to market research company CCS Insight.
"It's a little bit more growth than what we have seen this year. Overall, next year you will see a push by all the carriers, but not just mobile carriers, but also from alternative providers as well," said Paolo Pescatore, analyst at CCS Insight.
For example, in Sweden, cable operator Com Hem is collaborating with 3 to offer subscribers mobile as well as cable broadband. The goal is for them to become one-stop shops for broadband, according to Pescatore.
Another big trend during 2009 will be packaging of mobile broadband in various new ways.
"Many mobile operators also have fixed-line assets, so they are very much in a position to package multiple access technologies and compete quite aggressively," said Pescatore.
The mobile operators will also start to package mobile phones and laptop connectivity.
"We are already seeing that today: whereby 3 here in the U.K are saying that if you take out a contract with us we'll also throw in mobile broadband at a 50 percent discount," said Pescatore.
Mobile broadband will weather the current economic storm, with growth and data-integration plans continuing, according to Pescatore.
For the mobile data operators both browsing on mobile devices and laptop connectivity will become an even more important source of revenue. "We have already seen this year how much of an impact that it's making on total revenue, and this will continue next year given the fact that voice connections are very much saturated, and there aren't many users to connect," said Pescatore.
Laptops will generate a majority of the traffic on a per user basis, but mobile phones will also be an important traffic generator. There are many more phones than laptops in circulation and data browsing from phones will continue to increase for a number of reasons. The industry has only skimmed the surface of what can be done with social networking on the phone, according to Pescatore.
"Social networking will continue to whet consumer appetite for data moving forward and we'll see a lot more collaboration between carriers and social networking sites, as well as social networking sites and device manufacturers," said Pescatore.
Mobile data growth is still a bit of a tricky proposition for operators because more customers mean more revenue but also a larger strain on networks.
So mobile broadband providers hope users will sign up for a mobile broadband deal in the same way they take out membership to a gym: a subscription makes them feel good, even if they take advantage of it less than they intended, according to Pescatore.

Hot Jobs: Software Implementation Analyst

Job description: The software implementation analyst ensures that deployments of new applications or upgrades are planned and carried out correctly. They act as a bridge between the software developer and the IT infrastructure team that handles installation and maintenance, says Carlo Carbetta, vice president of operations development at CIO Partners, an executive search firm. They determine whether the applications interoperate with existing systems and plan for customization or integration work. This person may be involved in testing, creating documentation and dealing with end users.
Why you need one: When a company buys packaged software, it has to adapt it to its operations and processes. "If you don't have your own staffer involved in that implementation to make sure that the vendors understand your needs, you run the risk of the implementation moving off your business plan," says Eugene Farago, an account executive with the IT and metals division of Hudson, a recruitment and talent management firm. The software implementation analyst also acts as an agent of change for the company, steering it through an often risky but necessary process, he says. This means addressing user concerns while keeping the implementation on track. Finally, with IT environments becoming more heterogeneous, the need for someone with detailed knowledge of a company's business and technology increases, experts say.
Desired skills: Candidates should have computer, technical, engineering or science degrees, and certifications in areas such as project management and the software development lifecycle. Experience with and knowledge of a company's business and technology operations are key. "This is a midcareer-plus position," Carbetta says.
How to find them: Software implementation analysts move around a lot and many do contract work, so they network a lot. Try business-oriented social networking sites like LinkedIn.
What to look for: A potential hire should be meticulous, process-oriented, methodical and cool under pressure. They should be able to build relationships across both business and IT.
Elimination round: Ask candidates which software platforms they are most familiar with and their experience deploying them, including the environment size. Good candidates will discuss their interaction with the infrastructure team regarding things such as hardware provisioning and bandwidth requirements.
Salary range: $65,000 to $125,000
Growing your own: Groom internal candidates by rotating them through the business to gain expertise in a variety of areas. "They need to understand the bigger picture. A typical issue with implementation consultants is that they were previously a developer only and they get stuck in the details," Farago says. Putting a prospect on a process-improvement team is also a good idea. This lets you see how quickly someone identifies problems and comes up with solutions, and whether they are a good facilitator.

VoIP: Worth the Effort?

If product placement for film and TV is as effective as companies hope, Cisco must be rolling in it. If you've seen any scene in the last few years involving some form of office -- no matter whether it's in a high-rise commercial building, a hospital or a morgue -- there's a significant chance you'll see a Cisco-branded, IP-based phone sitting on a desk. If fiction mirrored reality, Voice over IP (VoIP) would already be ubiquitous.
Unfortunately for Cisco, that isn't exactly the case. In fact, I have yet to see an entire corporation that has made the switch to VoIP. Sure, there are the odd cases here and there -- small businesses trying to cut costs wherever possible, for example -- but unless you happen to work for a VoIP company, chances are you are on the same traditional PSTN line as everyone else.
So is VoIP really worth the hassle? Sure, there are the cost benefits if it is set up and organised properly, but the idea of adding another layer of technology on top of your standard office network is mind-boggling. Load VoIP's bandwidth demands on top of your standard, snail-slow office Internet connection and there's a good chance your employees will find yet another way to remain unproductive, as IT staff struggle to rebuild the IP network in time for that all-important conference call.
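To put a rough number on that bandwidth load, here is a back-of-envelope sketch using the common G.711 codec (the uplink figure is a hypothetical example, and real deployments vary with codec, packetization interval and link-layer overhead):

```python
# Back-of-envelope VoIP bandwidth check (illustrative numbers).
# G.711 carries 64 kbit/s of audio in 20 ms packets; per-packet
# RTP+UDP+IP+Ethernet headers add roughly 58 bytes of overhead.

PACKETS_PER_SEC = 50    # 20 ms packetization interval
PAYLOAD_BYTES = 160     # 64 kbit/s * 20 ms
OVERHEAD_BYTES = 58     # RTP(12) + UDP(8) + IP(20) + Ethernet(18)

def kbps_per_call() -> float:
    """One direction of one G.711 call, headers included."""
    return (PAYLOAD_BYTES + OVERHEAD_BYTES) * PACKETS_PER_SEC * 8 / 1000

def max_calls(uplink_kbps: float) -> int:
    """How many simultaneous calls a given uplink can carry."""
    return int(uplink_kbps // kbps_per_call())

print(round(kbps_per_call(), 1))  # 87.2 kbit/s per call
print(max_calls(256))             # a 256 kbit/s DSL uplink: 2 calls
```

On a thin DSL uplink already shared with everything else the office does, even a handful of concurrent calls eats a meaningful slice of the pipe, which is exactly why an unplanned rollout hurts.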
For homes, the situation isn't too different. Sure, there are naked DSL plans, Analog Telephone Adaptors and dual-mode VoIP phones, but add VoIP onto your average broadband plan -- already cluttered with BitTorrent and Facebook -- and don't be surprised if it's nightmares ahoy. Not to mention that most naked DSL plans include uploads in the bandwidth quota, making the trip to dial-up town all that much shorter. If you happen to be the resident techie in your household and you feel confident to take the cons with the pros, go for it. For Joe the Plumber, though, VoIP is still a pipe dream.

Monday, October 27, 2008

Sony, IT Stocks Hit Again as Tokyo Market Slides

Shares in Japan's major electronics companies slid again on Monday, as the benchmark Nikkei 225 index hit its lowest level in 26 years and the yen strengthened against the dollar.
An emergency joint statement issued by the Group of Seven major industrialized nations expressing strong concern over the recent sharp rise in the yen's value did little to halt the currency's climb. It was trading at ¥92.85 to the U.S. dollar at 4 p.m. Monday, up ¥2.29 since Friday.
The strong yen makes Japanese goods more expensive overseas and also reduces the value of profits made overseas when they are brought back to Japan, hitting major exporters like Sony. The currency's climb in recent weeks was partly behind the revision Sony made to its financial outlook last week. Sony had assumed an average exchange rate of ¥105 to the U.S. dollar when it originally issued its financial outlook, but the revised outlook assumed ¥100 to the U.S. dollar, which is already far from the currency's current value.
Shares in Sony, which lost 12 percent of their value on Friday, were again hit hard and closed down 8 percent at ¥1,821 (US$19.59).
On Monday Canon revised its financial outlook downwards because of the strong yen. The company said the exchange rate and a slowing global economy hit demand for its products and as a result net profit in the July to September period was down 21 percent against the same period last year. It now expects to record a full year net profit of ¥375 billion, down from its previous target of ¥500 billion.
The results and earnings forecast downgrade were issued after the Tokyo market closed and won't be reflected in the company's stock price until Tuesday at the earliest. In Monday trading Canon shares were down 11 percent.
Other electronics companies saw steeper drops.
Shares in computer memory chip maker Elpida slid 16 percent to an all-time low after investment banks cut their outlook for the shares. Pioneer saw its shares decline 15 percent also after a cut in the share outlook, Mitsubishi Electric dropped 14 percent and Sanyo Electric was off 12 percent.
Shares in computer game maker Nintendo, which also relies on overseas markets for a lot of its sales, were down 11 percent, NEC shares fell 10 percent and single digit declines were recorded by Fujitsu, Panasonic and Toshiba.
This week sees quarterly earnings for the July to September period due from many electronics companies. Panasonic and Ricoh are due to report after the market close on Tuesday. Toshiba, Fujitsu and Sony will report on Wednesday. Thursday will see Sharp, Nintendo, NEC, Hitachi and Kyocera. Cell phone carrier NTT DoCoMo is due to report results on Friday.

IBM Set to Discuss 'Information Agenda'

Attendees of IBM's Information on Demand conference this week in Las Vegas will be bombarded by a rash of product and services announcements and a lot of discussion about how to create an "Information Agenda."
IBM launched the IOD strategy, which pulls together a wide range of data management, storage and analysis technologies, a few years ago. Since then, IBM has made a string of acquisitions to support IOD, including the large BI (business intelligence) vendor Cognos.
Announcements at this year's conference are expected to include:
-- "Foundation Services," a one-day workshop followed by 12 weeks of follow-up consulting, meant to help customers create an "Information Agenda." IBM coined the phrase in September when it announced a set of tools, services and industry-specific data models for helping companies use information "as a strategic asset across their businesses."
-- The C3000 and C4000 editions of the InfoSphere Balanced Warehouse, which are data-warehousing appliances aimed at small and medium-size businesses, now include Cognos 8 BI.
-- Seven new performance management and financial offerings based on Cognos technology. Among them are Clinical Resource Planning, which pharmaceutical companies can use for modeling and forecasting, and Earned Value Management, which federal agencies can use to monitor capital spending.
IBM is also expected to discuss news around MDM (master data management), ECM (enterprise content management) and a range of releases due before the end of the year from its Optim product line, which it acquired through the purchase of Princeton Softech in 2007. Optim products focus on data archiving, classification, data privacy and test data management.
About 7,000 attendees are expected at this year's conference, compared to roughly 6,000 last year, according to IBM.
IBM's IOD strategy is broadly relevant simply because so many companies "have bet the business on a large swath of IBM solutions," said Forrester Research analyst James Kobielus. In a weak economy, customers may consider consolidating their data management technology "down to fewer, but more strategic and comprehensive, vendors, such as IBM," he added.
As for the BI portion of its arsenal, IBM could be in a better position to innovate in coming years than its rivals Oracle and SAP, according to Forrester analyst Boris Evelson.
Oracle still has a good deal of work to integrate products from its Siebel and Hyperion acquisitions, while SAP, which recently bought BI vendor Business Objects, has "some tough decisions to make on how to help their customers migrate from Netweaver BI to the new product line," Evelson said.
Meanwhile, the IBM-Cognos merger saw few product overlaps and Cognos "already took the time a few years ago to streamline and upgrade the platform," he said.

Monday, October 20, 2008

Intel's Moorestown Platform to Get 3.5G Support

Intel's upcoming Moorestown chip platform will include optional support for high-speed cellular data services when it hits the market in 2009 or 2010, Intel said Monday.
Moorestown will be based on Lincroft, a system-on-chip that includes an Atom processor core and a memory controller hub, and a chipset called Langwell. Designed for small, handheld computers that Intel calls Mobile Internet Devices, Moorestown will offer optional support for both WiMax and HSPA (High Speed Packet Access) cellular networks.
Intel is heavily pushing WiMax, which it sees as the best option for future wireless broadband services. But WiMax availability is very limited and it will take time for networks to enter commercial operation and expand their coverage areas. The addition of HSPA support to Moorestown hints that Intel recognizes that WiMax may not be extensively deployed as quickly as it would like, and users will want an alternative way of connecting wirelessly outside of Wi-Fi hotspots.
This isn't the first time Intel has flirted with offering 3G (third generation telephony) support to computers. In 2007, the company shelved an agreement with Nokia to provide 3G modules for Centrino laptops, saying customer interest in the technology was lukewarm.
That appears to be changing. At the Intel Developer Forum in San Francisco during August, Belgium's Option showed off HSPA modules it developed for MIDs based on Intel's Atom. On Monday, Intel announced that Option and telecom equipment maker Ericsson will make low-power HSPA modules that will be offered as an option with Moorestown.
Intel is making its own WiMax module for Moorestown. The module, code named Evans Peak, made an appearance at the Ceatec show in Japan during late September.

Tech Economic Woes Don't Rival Dot-Com Bust

Current economic uncertainty will impact IT budgets in 2009, according to Gartner, but the industry won't experience the extreme cuts it suffered in 2001 as a result of the dot-com bust.
Gartner analysts presenting at Symposium/ITxpo 2008 Monday in Orlando said the research firm is reducing its forecast of global IT spending growth for 2009 from 5.8% to 2.3%. In the United States, the firm expects existing 2008 budget plans not to change significantly and spending in 2009 to remain flat.
"In the worst case scenario, our research indicates an IT spending increase of 2.3% in 2009, down from our earlier projection of 5.8%," said Peter Sondergaard, senior vice president at Gartner and global head of research, in a press release. "Developed economies, especially the United States and Western Europe, will be the worst affected, but emerging regions will not be immune. Europe will experience negative growth in 2009; the United States and Japan will be flat."
While the financial events of the past few weeks will impact 2009, Gartner said it doesn't expect the fallout to be as significant as the recession of 2001. Due in part to the "dramatic reductions" made in response to the dot-com bust, Sondergaard said the IT industry is better prepared to respond to today's economic woes. According to Gartner, IT budgets were "slashed from mid-double-digit growth to low single-digit growth" during and after the 2001 recession.
Also IT has been able to shift its position from a back-office cost center, Gartner suggested, to an active partner in the business. For instance, IT is now "embedded in running all aspects of the business" and often employs "multi-year IT programs aligned with the business," which are more difficult to cut in the short term. Gartner also pointed out that IT spending decreases "lag the economy by at least two quarters."
"What [CEOs] want now most of all is agile leadership. Leadership that can guide us through simultaneous cost control and expansion at the same time," Sondergaard said.

Mini Laptops Bolster PC Sales, Gartner Says

With the economy in turmoil, a lot of people who are looking to buy PCs are increasingly turning to cheap, low-power mini laptops.
And that single move is bolstering what otherwise would be a soft PC industry, according to industry analysts at Gartner Inc. With a strong push from the new slew of mini notebooks hitting the market, worldwide PC shipments reached 80.6 million units in the third quarter this year, marking a 15% increase from the third quarter of 2007.
"The mini-notebook segment experienced strong growth in the global PC [market], led by robust growth in the Europe, Middle East and Africa region," said Mika Kitagawa, a principal analyst with Gartner, in a statement. "In the North America market, the economic crunch created more interest in the sub [US]$500 segment ... At the same time, the global PC market finally felt the impact from the global economic downturn. The U.S. professional market experienced the biggest hit from the economic crunch. The U.S. home market saw definite softness in PC sales after a few quarters of strong growth."
A lot of PC makers are diving into the mini or ultra-portable laptop market.
In August, Lenovo took a run at the fledgling netbook market with a new ultra portable laptop. Scheduled to be available this month, the IdeaPad S10 has a starting price of $399.
Mini laptops, increasingly known as netbooks, are relatively inexpensive, small form-factor notebooks designed for basic applications, such as Web surfing, e-mailing and word processing. They're designed to use less power than traditional PCs and laptops and aren't robust enough for serious power users or gamers.
Intel Corp. announced earlier this year that it was betting heavily on the new market. The chip maker began shipping Atom processors for mobile Internet devices, which are small, almost pocket-size machines, in April. Intel spokesman Chris Tulley said at the time that the company expects sales of netbook and "net-top" devices to outpace growth of traditional laptops and desktops.
Early in June, Acer Inc. dove into the mini-laptop market with the Aspire One netbook, which is designed to use Intel's Atom N270 chip. Acer's netbook runs either the Linpus Linux Lite operating system or Windows XP Home.
That move into the netbook market worked out well for Acer, according to Gartner's report. The analyst firm reported that both Acer and ASUS "had a strong focus and acted quickly in the mini-notebook segment." Because of it, both PC makers saw strong third quarter growth.
Gartner reported that Acer, which has scrambled into third place in the worldwide PC shipment market, saw 47.3% year-over-year growth in the third quarter. That's compared to 8.1% for fourth-place Lenovo, 15.1% for market leader Hewlett-Packard Co. and 11.6% for second-place Dell Inc.
Gartner reports that HP was hurt by its slow entry into the netbook market. Dell, which maintained its top position in the U.S. market, was hit by the general weakness in both the enterprise and home markets.
According to the analyst firm, for the U.S.-only market, HP comes in behind Dell, while Apple takes the third spot. Acer is in fourth place and Toshiba rounds out the top five.
PC shipments in the U.S. market grew 4.6% in the third quarter of 2008 compared to the same time last year. Gartner also reported that mini-notebook shipments accounted for about 5% of U.S. mobile PC shipments and added one to two percentage points of year-over-year growth.

Altor Ships Firewall for Virtual Systems

Altor Networks is announcing the availability of its firewall designed for virtual environments that overcomes some shortcomings of traditional firewalls that have been adapted to run on virtual machines.
Altor VF addresses blind spots that other firewalls face in virtual environments: products sitting outside the physical server on which virtual machines run have no visibility into traffic among those virtual machines and can take no action on it.
In addition, as virtual machines migrate between physical servers to meet demand -- a process known as live migration -- they can wind up on hosts alongside applications they were never intended to be exposed to. Live migration can also help propagate infections by expanding the reach of corrupted machines.
Altor VF migrates a virtual firewall and the rules that pertain to a particular virtual machine when it undergoes live migration. Other firewall vendors such as Check Point and Stonesoft offer virtual versions of their firewalls, but they don't address firewall rules for virtual machines that migrate.
Altor includes a tool to define where to place firewalls among virtual machines, automating a multi-step process. It also controls virtual machine sprawl by enabling default settings that can, for example, lock down virtual machines for which no one claims ownership.
The firewalls can also impose security policies on traffic.
Altor VF integrates with Juniper intrusion-detection system (IDS) gear, sharing its logs so the IDS can assess traffic among virtual machines. Altor says it has a similar relationship with ArcSight's security event management platform and Mazu's network behavior analysis products.
Altor VF costs US$2,000 per physical server with discounts for volume purchases.

Wednesday, October 15, 2008

Exchanging E-mails With a Pirate

The Pirate Bay (TPB), one of the world's biggest torrent tracker sites, found itself embroiled in controversy last month, when a link to a torrent containing photographs of a grisly child murder in Sweden appeared on the site.
A torrent is a small file that describes another file, such as a movie, distributed using the BitTorrent peer-to-peer protocol. The torrent itself doesn't contain the movie; it holds metadata -- file names, sizes, piece checksums, and a tracker address -- that acts as a marker of sorts, pointing computers toward the actual file.
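To make that "marker" idea concrete, here is a minimal decoder for bencoding, the encoding .torrent files use, run against a tiny hand-made stand-in for a torrent. The tracker URL and file name below are invented for the example; the thing to notice is that the decoded metadata names the payload and its size but contains none of the payload's actual bytes.

```python
# Minimal bencode decoder (illustrative sketch of the .torrent format).
def bdecode(data: bytes, i: int = 0):
    """Decode one bencoded value starting at index i; return (value, next_index)."""
    c = data[i:i+1]
    if c == b"i":                          # integer: i<digits>e
        end = data.index(b"e", i)
        return int(data[i+1:end]), end + 1
    if c == b"l":                          # list: l<items>e
        i, items = i + 1, []
        while data[i:i+1] != b"e":
            item, i = bdecode(data, i)
            items.append(item)
        return items, i + 1
    if c == b"d":                          # dictionary: d<key><value>...e
        i, result = i + 1, {}
        while data[i:i+1] != b"e":
            key, i = bdecode(data, i)
            val, i = bdecode(data, i)
            result[key] = val
        return result, i + 1
    colon = data.index(b":", i)            # byte string: <length>:<bytes>
    length = int(data[i:colon])
    start = colon + 1
    return data[start:start + length], start + length

# A hand-made stand-in for a .torrent: it names a ~700MB file and a
# tracker, yet is itself only a few dozen bytes long.
sample = (b"d8:announce22:http://tracker.example"
          b"4:infod4:name9:movie.avi6:lengthi734003200eee")
meta, _ = bdecode(sample)
print(meta[b"announce"])         # where clients find peers
print(meta[b"info"][b"name"])    # the payload's file name
print(meta[b"info"][b"length"])  # the payload's size in bytes
```

Real .torrent files carry more fields (notably per-piece SHA-1 checksums), but the structure is the same: pure metadata, which is why hosting a torrent is legally and technically distinct from hosting the file it describes.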
The torrent of the photographs, which were released by a Swedish court presiding over the case, was not posted online by TPB or its founders, but the site nevertheless found itself at the center of a discussion on the limits of free speech on the Internet, and to what extent Web sites should be held responsible for content posted by users.
The controversy was different from that normally faced by TPB, which has made enemies of the music and movie industries, as well as the U.S. government, over allegations its activities violate copyright law -- charges the site denies, citing differences between U.S. and Swedish law.
TPB's view on the pictures was that anger over their release should be directed towards the court that made them public, rather than TPB. The site refused calls to take down the torrent, citing its general commitment to not censor or remove any files posted to the site, regardless of the circumstances.
The controversy came to a head when Peter "brokep" Sunde, one of the founders of TPB, was invited to appear on a Swedish television show for an interview, under an agreement that the father of the murdered children would not be present. According to Sunde, the television station broke the agreement and surprised him by inviting the father to participate in the show with him.
That experience led TPB to declare an end to all contact with the press. "All future interviews are to be considered impossible. We have no longer any interest in participating in traditional media since it's apparent that they are not trustworthy," TPB announced on its blog on Sept. 12.
Sunde and Fredrik "TiAMO" Neij, another TPB founder, will speak at the upcoming Hack In The Box (HITB) security conference in Kuala Lumpur, Malaysia, later this month. Their keynote presentation is called "How to dismantle a billion dollar industry -- as a hobby."
Despite the announced ban on press contact, Sunde agreed to an e-mail interview ahead of the presentation. What follows is an edited version of that exchange.
IDGNS: What can we expect to see in your presentation at HITB? Why did you decide to present at the conference this year? How did that happen?
Peter Sunde: The presentation will probably be a mixture of a tech presentation, some pirate humor and a story about the power of Internet. We usually hold seminars for politicians, so it's going to be very much more interesting doing it in front of people that understand the technology. We will talk about how and why we do what we do! We got in contact with some guys from Hack In The Box who are really good at what they do and they invited us to come over. Going to Asia is never a boring thing so we went for it!
IDGNS: The recent situation in Sweden involving the pictures from the police case and the interview on Swedish television was obviously an emotional experience. The Pirate Bay has said it believes in free speech without restrictions. At a personal level, did this experience cause you to reconsider your stance on this issue?
Sunde: No, we are very sure of what we do. One of the most impressive things for me about TPB when looking back is our consistency towards our goals and ideals. We've always been true to them, even when the winds have been blowing against us rather than with us. And in the end, that's what makes us what we are -- we're honest and have a good ideology behind us. Compare us to our opponents and see what you get -- hint, it's not honesty and ideals.
IDGNS: In an ideal world, do you think copyright should exist? If it should, what do you think is the ideal way to structure copyright laws? What restrictions should be put on consumers, and what rights should copyright owners have?
Sunde: The copyright issue is quite complex -- more complex than just writing an e-mail. But I do see things that can work in a copyright, but for commercial aspects. It's very important to not infringe on personal life due to copyright. Creative Commons and other licenses are a better way than today's copyright laws. However, I do feel that Creative Commons is not reaching far enough.
IDGNS: What do you think is the current state of copyright law and Internet censorship, globally? Are we moving forwards? Backwards? What forces are driving these changes?
Sunde: I think that the people are definitely moving forward. The media industry is fighting, lobbying and bribing their way through the system, which is a really bad thing, both for us and them. In the end, it will show that they are only in this for money and nothing else. What a surprise! It's not good for business.
IDGNS: Can you tell me something about the recent move in Italy to block access to TPB's site? What really happened leading up to when the judge overturned the decision by a lower court to block the site, and what was the ultimate impact from your perspective?
Sunde: IFPI [International Federation of the Phonographic Industry] in Italy -- called FIMA, I think -- decided to sue us personally in a country where we do not live or have any connection. That in itself is not a valid thing in Europe, but the judge however decided to let them do it and to let them win. It was quite crazy. We found some really good lawyers afterwards that helped us with the case and we won it quite easily in the higher level of court.
FIMA had a major setback by that, when even the European Union had rules saying that an EU country is not allowed to block access to a system in another country like that. For some stupid reason they refuse to listen to the judge and the laws (the typical IFPI approach) and have now decided to appeal to the supreme court. It's no chance for them to win but they are losing face if they don't appeal. The interesting part is that we have never done anything illegal, not according to Swedish nor European Union laws. Our opponents have broken hundreds of laws in order to get to us.
IDGNS: What is the latest on The Pirate Bay's other projects, like BayWords and the streaming-video service?
Sunde: Oh yes, we have some projects coming out. A problem is that we're only two to three people in the gang and some are more active than others, so the projects tend to take some time to finish. But we have two very exciting projects that we're working on and we hope to maybe talk more about them at Hack In The Box.