Monday, October 27, 2008

Sony, IT Stocks Hit Again as Tokyo Market Slides

Shares in Japan's major electronics companies slid again on Monday, as the benchmark Nikkei 225 index hit its lowest level in 26 years and the yen strengthened against the dollar.
An emergency joint statement issued by the Group of Seven major industrialized nations, expressing strong concern over the recent sharp rise in the yen's value, did little to halt the currency's climb. The yen was trading at ¥92.85 to the U.S. dollar at 4 p.m. on Monday, up ¥2.29 against the dollar since Friday.
The strong yen makes Japanese goods more expensive overseas and reduces the value of profits earned abroad when they are brought back to Japan, hitting major exporters such as Sony. The currency's climb in recent weeks was partly behind the revision Sony made to its financial outlook last week. Sony had assumed an average exchange rate of ¥105 to the U.S. dollar when it originally issued its outlook; the revised outlook assumes ¥100 to the U.S. dollar, which is already far from the currency's current value.
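The math behind the revision is straightforward. A quick sketch, using a made-up profit figure rather than Sony's actual numbers, shows how much a stronger yen shrinks repatriated earnings:

```python
def repatriated_profit(profit_usd, yen_per_dollar):
    """Convert an overseas profit earned in dollars back into yen."""
    return profit_usd * yen_per_dollar

# Hypothetical $1 million overseas profit under different exchange rates:
planned = repatriated_profit(1_000_000, 105)    # Sony's original assumption
revised = repatriated_profit(1_000_000, 100)    # the revised assumption
monday  = repatriated_profit(1_000_000, 92.85)  # Monday's rate
# At ¥92.85, the same dollar profit yields roughly 11.6% fewer yen than
# at the originally assumed ¥105.
```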
Shares in Sony, which lost 12 percent of their value on Friday, were again hit hard and closed down 8 percent at ¥1,821 (US$19.59).
On Monday Canon revised its financial outlook downwards because of the strong yen. The company said the exchange rate and a slowing global economy hit demand for its products and as a result net profit in the July to September period was down 21 percent against the same period last year. It now expects to record a full year net profit of ¥375 billion, down from its previous target of ¥500 billion.
The results and earnings forecast downgrade were issued after the Tokyo market closed and won't be reflected in the company's stock price until Tuesday at the earliest. In Monday trading Canon shares were down 11 percent.
Other electronics companies saw steeper drops.
Shares in computer memory chip maker Elpida slid 16 percent to an all-time low after investment banks cut their outlook for the shares. Pioneer's shares declined 15 percent, also following a cut in analysts' outlook, while Mitsubishi Electric dropped 14 percent and Sanyo Electric was off 12 percent.
Shares in computer game maker Nintendo, which also relies on overseas markets for a lot of its sales, were down 11 percent, NEC shares fell 10 percent and single digit declines were recorded by Fujitsu, Panasonic and Toshiba.
This week sees quarterly earnings for the July to September period due from many electronics companies. Panasonic and Ricoh are due to report after the market close on Tuesday. Toshiba, Fujitsu and Sony will report on Wednesday. Thursday will see Sharp, Nintendo, NEC, Hitachi and Kyocera. Cell phone carrier NTT DoCoMo is due to report results on Friday.

IBM Set to Discuss 'Information Agenda'

Attendees of IBM's Information on Demand conference this week in Las Vegas will be greeted by a slew of product and services announcements and a lot of discussion about how to create an "Information Agenda."
IBM launched the IOD strategy, which pulls together a wide range of data management, storage and analysis technologies, a few years ago. Since then, IBM has made a string of acquisitions to support IOD, including the large BI (business intelligence) vendor Cognos.
Announcements at this year's conference are expected to include:
-- "Foundation Services," consisting of a one-day workshop followed by 12 weeks of follow-up consulting, meant to help customers create an "Information Agenda." IBM coined the phrase in September when it announced a set of tools, services and industry-specific data models for helping companies use information "as a strategic asset across their businesses."
-- The C3000 and C4000 editions of the InfoSphere Balanced Warehouse, which are data-warehousing appliances aimed at small and medium-size businesses, now include Cognos 8 BI.
-- Seven new performance management and financial offerings based on Cognos technology. Among them are Clinical Resource Planning, for pharmaceuticals to perform modeling and forecasting, and Earned Value Management, which federal agencies can use to monitor capital spending.
IBM is also expected to discuss news around MDM (master data management), ECM (enterprise content management) and a range of releases due before the end of the year from its Optim product line, which it acquired through the purchase of Princeton Softech in 2007. Optim products focus on data archiving, classification, data privacy and test data management.
About 7,000 attendees are expected at this year's conference, compared to roughly 6,000 last year, according to IBM.
IBM's IOD strategy is broadly relevant simply because so many companies "have bet the business on a large swath of IBM solutions," said Forrester Research analyst James Kobielus. In a weak economy, customers may consider consolidating their data management technology "down to fewer, but more strategic and comprehensive, vendors, such as IBM," he added.
As for the BI portion of its arsenal, IBM could be in a better position to innovate in the coming years than its rivals Oracle and SAP, according to Forrester analyst Boris Evelson.
Oracle still has a good deal of work to integrate products from its Siebel and Hyperion acquisitions, while SAP, which recently bought BI vendor Business Objects, has "some tough decisions to make on how to help their customers migrate from Netweaver BI to the new product line," Evelson said.
Meanwhile, the IBM-Cognos merger saw few product overlaps and Cognos "already took the time a few years ago to streamline and upgrade the platform," he said.

Monday, October 20, 2008

Intel's Moorestown Platform to Get 3.5G Support

Intel's upcoming Moorestown chip platform will include optional support for high-speed cellular data services when it hits the market in 2009 or 2010, Intel said Monday.
Moorestown will be based on Lincroft, a system-on-chip that includes an Atom processor core and a memory controller hub, and a chipset called Langwell. Designed for small, handheld computers that Intel calls Mobile Internet Devices, Moorestown will offer optional support for both WiMax and HSPA (High Speed Packet Access) cellular networks.
Intel is heavily pushing WiMax, which it sees as the best option for future wireless broadband services. But WiMax availability is very limited and it will take time for networks to enter commercial operation and expand their coverage areas. The addition of HSPA support to Moorestown hints that Intel recognizes that WiMax may not be extensively deployed as quickly as it would like, and users will want an alternative way of connecting wirelessly outside of Wi-Fi hotspots.
This isn't the first time Intel has flirted with offering 3G (third-generation) cellular support in computers. In 2007, the company shelved an agreement with Nokia to provide 3G modules for Centrino laptops, saying customer interest in the technology was lukewarm.
That appears to be changing. At the Intel Developer Forum in San Francisco in August, Belgium's Option showed off HSPA modules it developed for MIDs based on Intel's Atom. On Monday, Intel announced that Option and telecom equipment maker Ericsson will make low-power HSPA modules that will be offered as an option with Moorestown.
Intel is making its own WiMax module for Moorestown. The module, code-named Evans Peak, made an appearance at the Ceatec show in Japan in late September.

Tech Economic Woes Don't Rival Dot-Com Bust

Current economic uncertainty will impact IT budgets in 2009, according to Gartner, but the industry won't experience the extreme cuts it suffered in 2001 as a result of the dot-com bust.
Gartner analysts presenting at Symposium/ITxpo 2008 Monday in Orlando said the research firm is reducing its forecast of global IT spending growth for 2009 from 5.8% to 2.3%. In the United States, the firm expects existing 2008 budget plans not to change significantly and forecasts spending in 2009 to remain flat.
"In the worst case scenario, our research indicates an IT spending increase of 2.3% in 2009, down from our earlier projection of 5.8%," said Peter Sondergaard, senior vice president at Gartner and global head of research, in a press release. "Developed economies, especially the United States and Western Europe, will be the worst affected, but emerging regions will not be immune. Europe will experience negative growth in 2009; the United States and Japan will be flat."
While the financial events of the past few weeks will impact 2009, Gartner said it doesn't expect the fallout to be as significant as the recession of 2001. Due in part to the "dramatic reductions" made in response to the dot-com bust, Sondergaard said the IT industry is better prepared to respond to today's economic woes. According to Gartner, IT budgets were "slashed from mid-double-digit growth to low single-digit growth" during and after the 2001 recession.
IT has also been able to shift its position from a back-office cost center, Gartner suggested, to an active partner in the business. For instance, IT is now "embedded in running all aspects of the business" and often employs "multi-year IT programs aligned with the business," which are more difficult to cut in the short term. Gartner also pointed out that IT spending decreases "lag the economy by at least two quarters."
"What [CEOs] want now most of all is agile leadership. Leadership that can guide us through simultaneous cost control and expansion at the same time," Sondergaard said.

Mini Laptops Bolster PC Sales, Gartner Says

With the economy in turmoil, a lot of people who are looking to buy PCs are increasingly turning to cheap, low-power mini laptops.
And that single move is bolstering what otherwise would be a soft PC industry, according to industry analysts at Gartner Inc. With a strong push from the new slew of mini notebooks hitting the market, worldwide PC shipments reached 80.6 million units in the third quarter this year, marking a 15% increase from the third quarter of 2007.
"The mini-notebook segment experienced strong growth in the global PC [market], led by robust growth in the Europe, Middle East and Africa region," said Mika Kitagawa, a principal analyst with Gartner, in a statement. "In the North America market, the economic crunch created more interest in the sub [US]$500 segment ... At the same time, the global PC market finally felt the impact from the global economic downturn. The U.S. professional market experienced the biggest hit from the economic crunch. The U.S. home market saw definite softness in PC sales after a few quarters of strong growth."
A lot of PC makers are diving into the mini or ultra-portable laptop market.
In August, Lenovo took a run at the fledgling netbook market with a new ultra portable laptop. Scheduled to be available this month, the IdeaPad S10 has a starting price of $399.
Mini laptops, increasingly known as netbooks, are relatively inexpensive, small form-factor notebooks designed for basic applications, such as Web surfing, e-mailing and word processing. They're designed to use less power than traditional PCs and laptops and aren't robust enough for serious power users or gamers.
Intel Corp. announced earlier this year that it was betting heavily on the new market. The chip maker began shipping Atom processors for mobile Internet devices, which are small, almost pocket-size machines, in April. Intel spokesman Chris Tulley said at the time that the company expects sales of netbook and "net-top" devices to outpace growth of traditional laptops and desktops.
Early in June, Acer Inc. dove into the mini-laptop market with the Aspire One netbook, which is designed to use Intel's Atom N270 chip. Acer's netbook runs either the Linpus Linux Lite operating system or Windows XP Home.
That move into the netbook market worked out well for Acer, according to Gartner's report. The analyst firm reported that both Acer and ASUS "had a strong focus and acted quickly in the mini-notebook segment." As a result, both PC makers saw strong third-quarter growth.
Gartner reported that Acer, which has scrambled into third place in the worldwide PC shipment market, saw 47.3% year-over-year growth in the third quarter. That's compared to 8.1% for fourth-place Lenovo, 15.1% for market leader Hewlett-Packard Co. and 11.6% for second-place Dell Inc.
Gartner reports that HP was hurt by its slow entry into the netbook market. Dell, which maintained its top position in the U.S. market, was hit by the general weakness in both the enterprise and home markets.
According to the analyst firm, for the U.S.-only market, HP comes in behind Dell, while Apple takes the third spot. Acer is in fourth place and Toshiba rounds out the top five.
PC shipments in the U.S. market grew 4.6% in the third quarter of 2008 compared to the same time last year. Gartner also reported that mini-notebook shipments accounted for about 5% of U.S. mobile PC shipments and added one to two percentage points of year-over-year growth.

Altor Ships Firewall for Virtual Systems

Altor Networks is announcing the availability of its firewall for virtual environments, which overcomes some shortcomings of traditional firewalls that have been adapted to run on virtual machines.
Altor VF addresses blind spots that exist with other firewalls deployed in virtual environments: products running outside the physical server that hosts the virtual machines have no visibility into traffic among those machines and can take no action on it.
In addition, as virtual machines move between physical servers to meet demand -- a process known as live migration -- they can wind up on hosts alongside other applications they were never intended to be exposed to. Live migration can also help propagate infections by expanding the reach of compromised machines.
Altor VF migrates a virtual firewall and the rules that pertain to a particular virtual machine when it undergoes live migration. Other firewall vendors such as Check Point and Stonesoft offer virtual versions of their firewalls, but they don't address firewall rules for virtual machines that migrate.
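Conceptually, the approach amounts to keying firewall policy to a virtual machine's identity rather than to the physical host, so the rules follow the VM wherever it lands. A minimal, hypothetical sketch of that idea (not Altor's actual implementation):

```python
class VMFirewall:
    """Toy model: rules are attached to VM identities, not to hosts."""

    def __init__(self):
        self.rules = {}       # vm_id -> list of (action, src, dst, port)
        self.placement = {}   # vm_id -> current physical host

    def add_rule(self, vm_id, action, src, dst, port):
        # '*' in any field matches anything.
        self.rules.setdefault(vm_id, []).append((action, src, dst, port))

    def migrate(self, vm_id, new_host):
        # Live migration: the VM moves, and its policy travels with it,
        # because nothing here is keyed to the source host.
        self.placement[vm_id] = new_host

    def check(self, vm_id, src, dst, port):
        for action, r_src, r_dst, r_port in self.rules.get(vm_id, []):
            if (r_src in ('*', src) and r_dst in ('*', dst)
                    and r_port in ('*', port)):
                return action
        # Default-deny also locks down VMs nobody has claimed or configured.
        return 'deny'
```

Because policy lookup ignores the host entirely, a migrated VM is checked against exactly the same rules on its destination server, which is the property the article describes.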
Altor includes a tool to define where to place firewalls among virtual machines, automating a multi-step process. It also controls virtual machine sprawl by enabling default settings that can, for example, lock down virtual machines for which no one claims ownership.
The firewalls can also impose security policies on traffic.
Altor VF integrates with Juniper intrusion-detection system (IDS) gear, sharing its logs so the IDS can assess traffic among virtual machines. Altor says it has a similar relationship with ArcSight's security event management platform and Mazu's network behavior analysis products.
Altor VF costs US$2,000 per physical server with discounts for volume purchases.

Wednesday, October 15, 2008

Exchanging E-mails With a Pirate

The Pirate Bay (TPB), one of the world's biggest torrent tracker sites, found itself embroiled in controversy last month, when a link to a torrent containing photographs of a grisly child murder in Sweden appeared on the site.
A torrent is a small file that contains information about another file, such as a movie, distributed using the BitTorrent peer-to-peer protocol. The torrent itself doesn't contain the movie, but acts as a marker of sorts, pointing computers to the actual file.
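For readers curious about the format: a .torrent file is a small dictionary serialized with BitTorrent's "bencoding," holding the tracker URL and metadata about the real file. A minimal decoder covering the four bencoded types (a sketch, not a full BitTorrent client) looks like this:

```python
def bdecode(data, i=0):
    """Decode one bencoded value starting at index i; return (value, next_index)."""
    c = data[i:i+1]
    if c == b'i':                      # integer: i<digits>e
        end = data.index(b'e', i)
        return int(data[i+1:end]), end + 1
    if c == b'l':                      # list: l<items>e
        i += 1
        items = []
        while data[i:i+1] != b'e':
            item, i = bdecode(data, i)
            items.append(item)
        return items, i + 1
    if c == b'd':                      # dictionary: d<key><value>...e
        i += 1
        d = {}
        while data[i:i+1] != b'e':
            key, i = bdecode(data, i)
            val, i = bdecode(data, i)
            d[key] = val
        return d, i + 1
    # byte string: <length>:<bytes>
    colon = data.index(b':', i)
    length = int(data[i:colon])
    start = colon + 1
    return data[start:start + length], start + length
```

Decoding a tiny torrent-like blob yields a dictionary with an "announce" tracker URL and an "info" section describing the file; the content itself is never inside the torrent.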
The torrent of the photographs, which were released by a Swedish court presiding over the case, was not posted online by TPB or its founders, but the site nevertheless found itself at the center of a discussion on the limits of free speech on the Internet, and to what extent Web sites should be held responsible for content posted by users.
The controversy was different from that normally faced by TPB, which has made enemies of the music and movie industries, as well as the U.S. government, over allegations its activities violate copyright law -- charges the site denies, citing differences between U.S. and Swedish law.
TPB's view on the pictures was that anger over their release should be directed towards the court that made them public, rather than TPB. The site refused calls to take down the torrent, citing its general commitment to not censor or remove any files posted to the site, regardless of the circumstances.
The controversy came to a head when Peter "brokep" Sunde, one of the founders of TPB, was invited to appear on a Swedish television show for an interview, under an agreement that the father of the murdered children would not be present. According to Sunde, the television station broke the agreement and surprised him by inviting the father to participate in the show with him.
That experience led TPB to declare an end to all contact with the press. "All future interviews are to be considered impossible. We have no longer any interest in participating in traditional media since it's apparent that they are not trustworthy," TPB announced on its blog on Sept. 12.
Sunde and Fredrik "TiAMO" Neij, another TPB founder, will speak at the upcoming Hack In The Box (HITB) security conference in Kuala Lumpur, Malaysia, later this month. Their keynote presentation is called "How to dismantle a billion dollar industry -- as a hobby."
Despite the announced ban on press contact, Sunde agreed to an e-mail interview ahead of the presentation. What follows is an edited version of that exchange.
IDGNS: What can we expect to see in your presentation at HITB? Why did you decide to present at the conference this year? How did that happen?
Peter Sunde: The presentation will probably be a mixture of a tech presentation, some pirate humor and a story about the power of Internet. We usually hold seminars for politicians, so it's going to be very much more interesting doing it in front of people that understand the technology. We will talk about how and why we do what we do! We got in contact with some guys from Hack In The Box who are really good at what they do and they invited us to come over. Going to Asia is never a boring thing so we went for it!
IDGNS: The recent situation in Sweden involving the pictures from the police case and the interview on Swedish television was obviously an emotional experience. The Pirate Bay has said it believes in free speech without restrictions. At a personal level, did this experience cause you to reconsider your stance on this issue?
Sunde: No, we are very sure of what we do. One of the most impressive things for me about TPB when looking back is our consistency towards our goals and ideals. We've always been true to them, even when the winds have been blowing against us rather than with us. And in the end, that's what makes us what we are -- we're honest and have a good ideology behind us. Compare us to our opponents and see what you get -- hint, it's not honesty and ideals.
IDGNS: In an ideal world, do you think copyright should exist? If it should, what do you think is the ideal way to structure copyright laws? What restrictions should be put on consumers, and what rights should copyright owners have?
Sunde: The copyright issue is quite complex -- more complex than just writing an e-mail. But I do see things that can work in a copyright, but for commercial aspects. It's very important to not infringe on personal life due to copyright. Creative Commons and other licenses are a better way than today's copyright laws. However, I do feel that Creative Commons is not reaching far enough.
IDGNS: What do you think is the current state of copyright law and Internet censorship, globally? Are we moving forwards? Backwards? What forces are driving these changes?
Sunde: I think that the people are definitely moving forward. The media industry is fighting, lobbying and bribing their way through the system, which is a really bad thing, both for us and them. In the end, it will show that they are only in this for money and nothing else. What a surprise! It's not good for business.
IDGNS: Can you tell me something about the recent move in Italy to block access to TPB's site? What really happened leading up to when the judge overturned the decision by a lower court to block the site, and what was the ultimate impact from your perspective?
Sunde: IFPI [International Federation of the Phonographic Industry] in Italy -- called FIMA, I think -- decided to sue us personally in a country where we do not live or have any connection. That in itself is not a valid thing in Europe, but the judge however decided to let them do it and to let them win. It was quite crazy. We found some really good lawyers afterwards that helped us with the case and we won it quite easily in the higher level of court.
FIMA had a major setback by that, when even the European Union had rules saying that an EU country is not allowed to block access to a system in another country like that. For some stupid reason they refuse to listen to the judge and the laws (the typical IFPI approach) and have now decided to appeal to the supreme court. It's no chance for them to win but they are losing face if they don't appeal. The interesting part is that we have never done anything illegal, not according to Swedish nor European Union laws. Our opponents have broken hundreds of laws in order to get to us.
IDGNS: What is the latest on The Pirate Bay's other projects, like BayWords and the streaming-video service?
Sunde: Oh yes, we have some projects coming out. A problem is that we're only two to three people in the gang and some are more active than others, so the projects tend to take some time to finish. But we have two very exciting projects that we're working on and we hope to maybe talk more about them at Hack In The Box.

NetSuite Debuts 'SRP' Product Push

On-demand ERP (enterprise resource planning) vendor NetSuite announced a new product push Tuesday, stemming from its June acquisition of OpenAir, maker of software for project-oriented organizations.
NetSuite has been working to integrate its product and OpenAir's, and is now marketing the combination under the moniker of NetSuite SRP (services resource planning).
The products will work in tandem, with functionality like sales automation and core financials coming from the NetSuite side, and matters such as project task management and timesheets handled by OpenAir, which is also on-demand software.
NetSuite demonstrated the integration work at a company event in Boston Tuesday. An official showed how an employee could create a customer record in NetSuite and push it into OpenAir. Then the representative showed how to assign a consultant to the project and subsequently push information on the hours the person worked back into NetSuite for accounting.
"Sure, you could program all this stuff yourself for products that you use, but then you have to maintain that," said NetSuite CEO Zach Nelson, emphasizing one oft-cited, purported advantage of SaaS (software as a service) products like NetSuite, which sees vendors handle upgrades and integrations.
NetSuite didn't provide pricing information.
Michael Fauscette, an IDC analyst who spoke at NetSuite's event, said the project-based ERP market will grow to $1.5 billion by 2010.
But since many project-based companies actually bought ERP systems geared more for product manufacturing and then customized them, the real size of the project-based market could be much larger, he said.
In other news Tuesday, NetSuite announced a revamped OpenAir product lineup that includes offerings for small, medium and large businesses.
The entry-level Team Edition includes integrations with QuickBooks and NetSuite and, for a fee, Salesforce. The mid-level Professional Edition adds more powerful features, such as a configurable billing rules engine. The high-end Enterprise Edition provides the Professional Edition's features plus additional functionality, like an API (application programming interface) for connecting with other applications, as well as 24-7 customer support.
NetSuite's announcements closely follow Oracle's recent news that it plans to buy Primavera, a vendor Forrester Research analyst Ray Wang termed "the granddaddy of project management." Oracle CEO Larry Ellison is a major investor in NetSuite.

Washington Uses Google Apps to Power New Intranet

When it came time for Washington, D.C., to create a new intranet for city employees, spending US$4 million on a site based on proprietary portal software just didn't seem like a good idea to CTO Vivek Kundra. But using Google Apps did, he said in an interview Tuesday.
With its Web-based Google Apps suite, Google is currently trying to fashion itself into a worthy competitor to Microsoft Office in enterprise accounts. Washington, D.C., is an example of where the company is making some inroads, thanks to the thinking of 34-year-old Kundra, who believes technology that is open source or based on open standards -- or both -- is the future for the enterprise.
Google Apps is not replacing Microsoft Office entirely for the 38,000 municipal employees in Washington, D.C., as some published reports have implied. However, Kundra said he is seeing more and more government employees "migrating to using Google Docs instead of Microsoft Office," and Google's online applications have advantages in ease of use and in the ability to build new sites for the city's intranet quickly and easily.
The use of Google Apps to power the city's new intranet has its roots in a decision Kundra made soon after he took his job in March 2007. He looked at current IT projects and decided to eliminate one: a plan to build an intranet, costing millions of dollars, based on proprietary portal software from Plumtree.
Washington, D.C., has been piloting the intranet with employees since June 2007. The application, which uses Gmail as its e-mail service and Google Apps for documents and spreadsheets, went live earlier this year and is currently in regular use, Kundra said.
Kundra decided to go with Google Apps as the basis for the new intranet not only because it was less expensive -- the city is paying Google about $475,000 a year in licensing fees -- but also because new applications and interfaces can be assembled quickly on Google's open platform.
"When we looked at integration and deployment costs, what we decided to do [was use Google because] it was at a lower cost and was a faster way of achieving the same goal," he said.
Take for example a new site the city created for its Fall 2008 Job Fair. On it, Kundra has his managers outlining in YouTube videos the positions for which they're hiring, an example of how easy it is on Google Apps to allow for "voice, video and data integration," he said.
Though Google Apps is not entirely replacing Office, it is eliminating the use of Microsoft's suite for certain jobs, Kundra said.
The city's budget planning and procurement process, the way it conducts internal surveys and, as shown by the Job Fair example, the way it goes about posting jobs and hiring employees are now done via Google Apps, he said. Previously, these tasks were largely based on a paper trail of Microsoft Word documents, Kundra said.
Google isn't the only company gunning for Microsoft, which still has the lion's share of the business market for productivity applications. IBM also has a free office productivity suite, Symphony, that it has built into its Lotus collaboration suite. There also is OpenOffice.org, the freely available, open-source productivity suite, the third version of which was released to the Web Monday.

Intel Earnings Up, Future Uncertain

Despite the ongoing financial crisis that is weighing down some tech companies, Intel earnings were up in the third quarter, narrowly beating analyst expectations.
Net income for the quarter was US$2 billion, or $0.35 per share, up from $1.8 billion, or $0.30 per share, in the same period last year. Analysts polled by Thomson Financial had expected $0.34 per share.
Intel's net revenue was $10.2 billion. Sales in both its microprocessor and chipset units drove revenue for the quarter, which ended Sept. 27.
The third quarter was the first full period in which Intel sold its Atom microprocessors and chipsets for low-cost PCs, and sales of the chips brought down the average selling price of microprocessors for Intel. Overall, the average selling price was lower sequentially, but excluding Atom shipments, the average stayed flat, Intel said.
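The mix effect Intel describes is simple weighted-average arithmetic: adding a high volume of cheap chips drags the blended average selling price down even when every other price holds steady. A sketch with invented numbers (not Intel's actual unit figures):

```python
def average_selling_price(units_and_prices):
    """Weighted-average selling price over (units, price) pairs."""
    total_units = sum(u for u, _ in units_and_prices)
    total_revenue = sum(u * p for u, p in units_and_prices)
    return total_revenue / total_units

# Hypothetical figures chosen only to illustrate the mix effect:
without_atom = average_selling_price([(100, 150.0)])
with_atom = average_selling_price([(100, 150.0), (30, 45.0)])
# Adding cheap Atom-class units pulls the blended ASP down even though
# the price of the other processors is unchanged -- which is why the
# ex-Atom average can stay flat while the overall average falls.
```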
The results included a number of charges, such as an impairment charge for the Numonyx investment and a restructuring charge.
Intel warned that it is difficult to know how the current economic environment may affect its business in the coming months. The chip giant plans to release a mid-quarter business update on Dec. 4. But for now, it is expecting revenue for the fourth quarter to be between $10.1 billion and $10.9 billion. The fourth quarter will include a charge related to the recent decision by Intel and Micron to close their joint production of NAND flash memory from an Idaho facility.
Intel's results come just a week after competitor AMD said it plans to split into two companies, one to design chips and one to make them, in an effort to compete more effectively with Intel. AMD, which plans to release its third-quarter earnings report on Thursday, has reported losses in the past seven quarters.

US Court Lifts Ban on Qualcomm Phone Imports

A U.S. appeals court overturned a ruling that prevented Qualcomm's handset-making customers from importing products to the U.S. because they allegedly contain patented technology developed by Broadcom.
The appeals court said the U.S. International Trade Commission had overstepped its bounds when it issued the ban against handset makers who were not named in Broadcom's complaint to the ITC. The appeals court also said the ITC misapplied the legal standard for "induced infringement."
At the same time, the U.S. Court of Appeals for the Federal Circuit agreed with the ITC's finding that the Broadcom patent in question is valid, something Qualcomm had disputed. The appeals court posted the 31-page ruling to its Web site.
Qualcomm said the court had "disapproved Broadcom's tactic of attacking the wireless industry, including handset manufacturers and wireless operators, without providing them with the opportunity to defend themselves in the action."
But Broadcom also applauded part of the ruling and appeared set to appeal.
"We are pleased that the Court affirmed our patent's validity, the infringement by Qualcomm's customers and the validity of the ITC's claim construction. In light of that, we believe that Qualcomm's continued use of our patented technology would certainly meet the new standard of intent and be found to infringe. We look forward to addressing this issue upon remand to the ITC," Broadcom said.
The ITC's ruling, issued in June last year, applied to new phones developed subsequent to the ruling, and followed an earlier ITC finding that Qualcomm had infringed on Broadcom's patent. The companies that joined Qualcomm in the appeal included Kyocera Wireless, Motorola, LG Electronics and Palm, as well as wireless carriers AT&T Mobility and Sprint Nextel.
The patent in question, U.S. patent number 6,714,983, describes a technology that helps save battery life when a mobile phone can't find a wireless signal. It can be viewed by searching at the U.S. Patent and Trademark Office Web site.

Saturday, October 11, 2008

Google in Curious Alliance With Click-fraud Detection Firm

In a development that would have seemed impossible two years ago, Google is cooperating publicly with Click Forensics, a click-fraud detection company with which it has had a rocky relationship.
Click Forensics said Thursday that Google has agreed to accept the electronically generated click-quality reports produced by the Click Forensics FACTr service. That means the process of documenting click-fraud instances and submitting reports to Google will be significantly automated and simplified for advertisers that use the FACTr service.
Google and Click Forensics make for strange bedfellows. The companies have sparred over the issue of click fraud, and the rhetoric has often approached ugly territory.
Google has accused Click Forensics of being inept in its methodology and misleading in its results in order to make the problem seem bigger than it is. Meanwhile, Click Forensics has charged that Google has purposefully trivialized click fraud and mischaracterized it as a minor problem.
Starring in the skirmishes have been Click Forensics President and Founder Tom Cuthbert and Google's expert on click fraud, Shuman Ghosemajumder.
Click fraud happens when someone clicks on an ad with malicious intent. For example, a competitor may click on a rival's pay-per-click ads in order to drive up their ad spending. Or a publisher may click on pay-per-click ads on its site to trigger more commissions.
Google generates almost all of its revenue from the type of online advertising that is most vulnerable to click fraud -- pay-per-click ads that appear along with relevant search results or in Web pages of relevant content.
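Click Forensics' actual methodology is proprietary, but the basic idea of flagging anomalous click patterns can be illustrated with a deliberately naive sketch that flags any source IP producing too many clicks inside a short time window. All names, data, and thresholds below are invented for illustration:

```python
from collections import defaultdict

# Hypothetical click records: (timestamp_seconds, source_ip)
clicks = [
    (0, "203.0.113.5"), (2, "203.0.113.5"), (3, "198.51.100.7"),
    (5, "203.0.113.5"), (9, "203.0.113.5"), (40, "198.51.100.7"),
]

def flag_suspicious(clicks, window=30, threshold=3):
    """Flag IPs with more than `threshold` clicks in any `window`-second span."""
    by_ip = defaultdict(list)
    for ts, ip in clicks:
        by_ip[ip].append(ts)
    suspicious = set()
    for ip, times in by_ip.items():
        times.sort()
        lo = 0
        for hi in range(len(times)):
            # Slide the window start forward until it spans <= `window` seconds.
            while times[hi] - times[lo] > window:
                lo += 1
            if hi - lo + 1 > threshold:
                suspicious.add(ip)
    return suspicious

print(flag_suspicious(clicks))  # {'203.0.113.5'}
```

Production systems weigh many more signals (user agents, conversion behavior, botnet fingerprints), which is precisely why naive counting alone over- and under-reports.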
Google declined to comment for this article, but Click Forensics CEO Paul Pellman said his company welcomes Google's cooperation in the FACTr (Fully Automated Click Tracking Reconciliation) service.
"From our standpoint, this is the first opportunity in which we've been able to implement something specific with Google, which is great," Pellman said.
Joseph Cowan, senior search strategist at Outrider, a search-engine marketing agency, said it would have been unheard of not long ago for Google to let itself be identified as a Click Forensics collaborator.
"Two years ago, it was a very adversarial relationship," said Cowan, whose company helps advertisers manage campaigns on Google and other search ad networks.
Outrider, which has been in business for 13 years, started offering Click Forensics click-quality services to its clients about two years ago. It realized at the time that click fraud went beyond scammers clicking on pay-per-click ads for malicious purposes, such as inflating their commissions or hurting competitors, he said.
Outrider views click fraud as a broader problem that includes what it calls "unwanted clicks" that aren't maliciously generated but that nonetheless offer advertisers little or no value. For example, a company that only sells in the U.S. gets no benefit from clicks on its ads by people who live abroad, he said.
Gaining this insight, backed up by hard data from Click Forensics, lets Outrider further optimize its clients' campaigns, Cowan said. And the more confident a company is about the effectiveness of its search ad campaigns, the more it will invest in them, which is good for all parties involved, Google included.
"By buying into this, Google is simply accepting the fact that it's good to have a third party review [its ad campaigns] so that someone not connected to their company is also saying 'yes, it's working,'" Cowan said.
Click Forensics' Pellman said his company doesn't need cooperation from Google or any other search ad providers in order to collect their data and track clicks on clients' campaigns.
However, the FACTr service, which focuses on generating automated reports based on the collected data right from the Click Forensics interface, does benefit from cooperation from the search ad providers.
"Customers can now [electronically] submit a detailed evidence report in the format in which Google wants it, which is different from the format in which Yahoo wants it. And Google will accept that report and respond back," Pellman said.
In addition to Google, Click Forensics also announced on Thursday that Miva and LookSmart are now also supporting FACTr.
Click Forensics, which reports on click-fraud incidence every quarter, recently said that the overall industry average for click fraud was 16.2 percent in the second quarter of this year. "We continue to see click fraud as a big challenge for advertisers," Pellman said.
Fraudsters are getting more sophisticated and trying to make their scams harder to detect and track, lately resorting to using botnets to perpetrate click fraud, he said. "Click fraud is a big, consistent problem and it's not going away," Pellman said.

PCI App Security: Who's Guarding the Data Bank?

While Willie Sutton never really said it, the truth is that people rob banks because that is where the money is. Today's criminals don't walk into banks with loaded guns and getaway drivers. Rather, they connect from a remote location using a browser, armed with hacking tools and spyware.
Where criminals of old targeted the teller behind the counter, today's attackers target banking and e-commerce applications. So although the targeted infrastructure has changed, not much else has changed from a threat perspective since Willie Sutton robbed banks. Ask hackers where the money is, and they will tell you: behind and within poorly written and poorly protected banking and e-commerce software applications.
The list of threats targeting banking and payment applications, and their calamitous consequences, is seemingly endless. Identity theft, data leakage, phishing, SQL injection, worms, application Denial of Service (DoS) attacks, and botnets just scratch the surface, but these are the threats critical applications have to be secured against today. The big problem is that the number of threats, as well as the number of applications that need to be secured, is increasing on a regular basis.
PCI and Application Security
To date, the industry has given short shrift to application security, and we have all paid for it with continuing data breaches. Consider this: Microsoft finally got serious about application security in 2002 with its Trustworthy Computing (TWC) initiative. TWC was an outcome of devastating attacks against Microsoft operating systems by worms such as Code Red and Nimda.
TWC was announced in an all-employee email from Microsoft head Bill Gates, who redirected all software development activities at Microsoft to include a full security review. Even with that directive, it still took years for Microsoft's code to start becoming secure. How many merchants in the PCI space have their founders tell everyone to code securely and halt all development until it is done? The point was, and is, that application security needs to be taken seriously, and that means investing the time, effort, and resources to do it right.
Application security is at the heart of the Payment Card Industry (PCI) security standards and requirements. In the last few years, data breaches have resulted in hundreds of millions of data records being compromised. In most of these cases, the firewalls worked, the encryption worked, the logging worked, but the application contained security holes which obviated much of the security. It's like barring the front doors to the bank and leaving a back window open.
So why has PCI started focusing on web and payment applications? For the very reason that these applications are the most obvious entry point for attackers to gain access to back-end databases containing huge amounts of credit card data.
Within the PCI Data Security Standard (DSS), requirement 6.6, which became mandatory on June 30, 2008, mandates validated security for web-based applications. It requires organizations that process credit card transactions to address the security of web applications, either via manual or automated source code reviews or vulnerability scans, or via the installation of a web application firewall between the client and the application.
PCI DSS Requirement 6.6
While the applications security requirements in PCI DSS section 6.6 comprise a mere 44 words, don't think that application security compliance is either unimportant or a piece of cake. The specifics of requirement 6.6 are:
Ensure that all web-facing applications are protected against known attacks by applying either of the following methods:
-- Having all custom application code reviewed for common vulnerabilities by an organization that specializes in application security
-- Installing an application layer firewall in front of web-facing applications
First off, just what is this thing called an application layer firewall? Also termed a web application firewall, it is a network device placed in front of a web application to protect against application attacks. An application layer firewall can view and digest all application traffic, and it can specifically filter session, presentation, and application layer traffic (OSI model) in real time. This gives it the advantage of protecting the applications and all associated sensitive data from illegitimate access and unauthorized usage.
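As a rough illustration only (real web application firewalls use far richer rule sets, parsing, and learning engines), the core idea of signature-based request filtering can be sketched in a few lines. The rule patterns below are simplified toy examples, not production signatures:

```python
import re

# Toy rule set: patterns loosely modeled on common attack signatures.
RULES = [
    ("sql_injection", re.compile(r"(?i)(\bunion\b.+\bselect\b|'\s*or\s+'1'\s*=\s*'1)")),
    ("xss", re.compile(r"(?i)<script\b")),
    ("path_traversal", re.compile(r"\.\./")),
]

def inspect(params: dict) -> list:
    """Return the names of rules matched by any request parameter value."""
    hits = []
    for name, pattern in RULES:
        if any(pattern.search(v) for v in params.values()):
            hits.append(name)
    return hits

print(inspect({"user": "alice", "q": "' OR '1'='1"}))     # ['sql_injection']
print(inspect({"comment": "<script>alert(1)</script>"}))  # ['xss']
print(inspect({"user": "alice"}))                         # []
```

A real deployment sits in-line with the HTTP stream and must also handle encoding tricks, fragmentation, and false positives, which is why requirement 6.6 treats a WAF as an alternative to, not a substitute for, secure code.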
The security threats mitigated by an application layer firewall are very real. To give you a feel for things, and to truly address business risk, note that the range of software security risks is significant. They can be divided into two distinct types: coding vulnerabilities and design flaws/policy violations. One leading software application security firm views the hierarchy as:
Coding vulnerabilities:
-- Buffer overflows
-- Format string vulnerabilities
-- Race conditions
-- Resource leaks
-- Input/output validation and encoding errors
o SQL injection
o Cross-site scripting
o Operating system command injection
Design flaws and policy violations:
-- Cryptography
-- Network communication vulnerabilities
-- Application configuration vulnerabilities
-- Access control
-- Database and file system use
-- Dynamic code
-- Access control and authentication errors
-- Error handling and logging vulnerabilities
o Insecure error handling
o Insecure or inadequate logging
-- Native code loading
-- Data storage vulnerability
-- Insecure components
o Malicious code
o Unsafe native methods
o Unsupported methods
o Custom cookies/hidden fields
While this is one of a number of possible attempts at threat codification, the message should be clear: software security is a multifaceted effort that takes a directed and formalized approach.
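To make one of the listed coding vulnerabilities concrete, SQL injection and its standard mitigation, parameterized queries, can be sketched as follows. This is a minimal illustration using Python's built-in sqlite3 module; the table and column names are invented for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (card_holder TEXT, pan_last4 TEXT)")
conn.execute("INSERT INTO accounts VALUES ('Alice', '4242')")

user_input = "Alice' OR '1'='1"  # a classic injection attempt

# Vulnerable pattern: string concatenation lets the input rewrite the query.
# query = "SELECT * FROM accounts WHERE card_holder = '%s'" % user_input

# Safe pattern: a parameterized query treats the input purely as data.
rows = conn.execute(
    "SELECT * FROM accounts WHERE card_holder = ?", (user_input,)
).fetchall()
print(rows)  # [] -- the injection string matches no real card holder
```

Had the concatenated form been executed, the `OR '1'='1` clause would have returned every row in the table, which is exactly the kind of flaw a source code review under requirement 6.6 is meant to catch.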
Insecure Banking Applications
Resting 50 feet below sea level, on the solid bedrock of Wall Street, the Federal Reserve gold vault contains hundreds of billions of dollars worth of gold. Beyond the extravagant layers of security implemented around, and within, this facility, the reality is that gold is quite heavy, bulky, and difficult to move. Even if attackers got in, they would be hard-pressed to get out with a significant amount of gold.
While getting gold out is difficult, data is light and very fluid. Transferring a gigabyte of data today is almost trivial. The data contained in today's banking applications has the value of gold, yet is light enough to access and move with ease. These applications may hold anywhere from tens of thousands to hundreds of millions of records, connecting to databases that serve as repositories for sensitive personal data. The hacker's booty is contained within these applications, and the currency can take many forms, but usually consists of customer account information and other personal identifiers that the modern-day Willie Sutton can use for ill-gotten gains. Getting the currency out is merely a matter of transferring strings of ones and zeros from one computer to another, hardly like hauling heavy gold bars.
In the case of PCI Data Security Standards (DSS) the 'money' or currency, aka sensitive information that most needs to be protected, includes:
-- primary account numbers (PAN)
-- cardholder name
-- various service codes
-- expiration date
-- other items that are allowed to be digitally stored if suitably protected
-- magnetic card stripe data including security codes and PINs
In fact, based on merchant compromises, Visa has found that the storage of prohibited data (full track, CVV2, PIN blocks, etc.) was the prime cause of most cardholder data breaches. Often, the data stored in these databases should never have been stored there in the first place once the requested credit card transaction was authorized. But time and time again there are instances of proprietary and commercial off-the-shelf (COTS) applications being compromised because they were not developed to comply with PCI DSS and secure coding requirements. In some instances the reported code flaws violated plain common-sense application development practices, such as those codified by OWASP.
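As a concrete illustration of protecting PAN data, PCI DSS requirement 3.3 caps what may be displayed at the first six and last four digits of the card number. A minimal masking helper (the function name is our own) might look like:

```python
def mask_pan(pan: str) -> str:
    """Mask a primary account number for display, keeping only the
    first six and last four digits (the PCI DSS requirement 3.3 maximum)."""
    digits = pan.replace(" ", "").replace("-", "")
    if len(digits) < 13 or not digits.isdigit():
        raise ValueError("not a plausible PAN")
    return digits[:6] + "*" * (len(digits) - 10) + digits[-4:]

print(mask_pan("4111 1111 1111 1111"))  # 411111******1111
```

Note that masking for display is distinct from storage: stored PANs must be rendered unreadable (truncation, strong cryptography, or hashing), and prohibited data such as full track and CVV2 may not be stored at all.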
As security professionals and PCI Qualified Security Assessors (QSAs), based on our own experience and insight into the market, the authors are surprised that in 2008 there are still banking applications deployed without a formal security-based SDLC and security code review. Compounding this, far too few organizations have effectively trained their developers in secure development practices. It is almost criminal that in late 2008, developers creating payment processing applications don't even know the PCI DSS requirements for the proper processing and storage of sensitive information.
An additional layer of protection should come from quality assurance personnel testing these applications, covering all aspects of how sensitive information is handled throughout the information life cycle and how the resulting transactions are recorded. Yet many application test teams continue to focus their efforts on testing only the functionality, scalability, and load requirements of application capabilities.
To their credit, some application testers may actually scan the various components with a generic vulnerability scanner, but they lack the skill set to properly interpret the scanner output. They may also be relying on how the application is deployed, or on other external controls, to provide the necessary security. Those assumptions might have been tolerable long ago on systems not connected to the Internet, but they are vastly inadequate when the application is running on open, publicly accessible networks.
Fortunately for those who rely on such commerce, which is everyone with a credit card, the PCI Security Standards Council (PCI SSC), the major card brands, card issuers, and Qualified Security Assessors across the industry are working in concert to permanently change things for the better. The posse, if you will, has been formed, and if the necessary fundamental changes can be made, the modern day Willie Sutton and his boys' days are numbered. Payment application security may ultimately become ubiquitous and considered just as important as any other system function.
PCI Application Security Requirements
Currently, custom in-house or outsourced applications and software that are not commercially resold must comply with all relevant requirements of PCI DSS. All custom applications must be developed, deployed, supported and refreshed according to these requirements.
They include the following:
1. Applications must be developed as a part of a well-defined Software Development Life Cycle (SDLC) with security principles incorporated into the development process.
2. Applications should reside on hardened operating systems, with unnecessary functionality removed or subject to discrete, well-defined limitations.
3. Applications should never store sensitive authentication data (card magnetic stripe, security codes, PINs, etc.)
4. Applications should not interfere with cyber-security controls such as antivirus, firewalls, cryptographic protections, secure authentication schemes, IDS/IPS, etc.
5. Web based applications must be developed in accordance with the Open Web Application Security Project (OWASP) guidelines for secure coding.
6. Web-facing (i.e., Internet-facing) applications must be protected either with a source code review by an authorized entity or by application firewalls.
7. Applications should be tested for security vulnerabilities in addition to functionality testing by someone other than the authors of the actual code.
Application Security Action Items
Some business and IT leaders may be just starting to consider the security implications of their banking or commerce applications. There may be lingering uncertainty about what to do first. The following five steps are a good place to start:
1. Update POS Applications. Visa maintains a list of Payment Application Best Practices compliant POS applications. Ensure that you are running a compliant version of POS.
2. Identify Poorly Coded Web Apps. Perform a code review for known coding flaws. Then follow up with a vulnerability scan and an application-layer penetration test to ensure application code is PCI compliant and secure.
3. Perform Quarterly Vulnerability Scans. As detailed in DSS section 11.2, run internal and external network vulnerability scans at least quarterly and after any significant change in the network (such as new system component installations, changes in network topology, firewall rule modifications, product upgrades).
4. Perform Annual Penetration Testing. Both internal and external (public facing) applications that process "sensitive" data should be penetration tested at least annually and whenever they undergo significant revision.
5. Create Formal SDLC Processes. Microsoft understood this via Trustworthy Computing. Make sure you formalize a Software Development Life Cycle that incorporates security analysis throughout that life cycle.
Note that these five steps will keep your development teams busy for a while. And make sure you have a good project manager to keep all of the tasks and teams in sync.
Visa PABP Replaced With PCI PA-DSS
COTS payment processing applications that are sold or leased to the public have more stringent requirements for application security compliance. These requirements were originally developed, implemented and enforced by Visa and were known as the Payment Application Best Practices (PABP) standard.
Over the years these requirements served the industry well and have helped to protect Visa credit card commerce wherever compliant applications have been implemented. Unfortunately, however, the PABP was focused primarily upon applications processing Visa payments, and the enhanced security benefits could not be shared across all payment card brands. It became obvious that a broader, more encompassing application security standard was in order; this is where PCI Payment Application Data Security Standard (PA-DSS) came into play.
In November 2007, the PCI Security Standards Council (SSC) announced that the PABP would be superseded by the PCI Payment Application Data Security Standard (PA-DSS). In doing so the PCI SSC became the sole entity to maintain these new card-brand-independent requirements and oversee compliance with the new security standard. Payment applications previously certified as compliant with the most current versions of the PABP specification will have their certification grandfathered for a limited time, with a grace period before they must be recertified under the new PA-DSS.
Newly developed commerce applications, which are sold to the public, will have to be tested and found compliant with PA-DSS requirements starting in October 2008. The two standards are similar, and indeed a majority of PA-DSS content is based upon the previously well-defined PABP requirements. There are some distinct differences between the two, however, including a very stringent requirement for the PA-DSS QSA to validate the environment used for all application security testing.
In addition, the PA-DSS Implementation Guide (similar to PABP's Best Practices Implementation Guide) has detailed references on how to securely implement the payment application and related systems in a specific supported, compliant configuration. It also clearly states that any deviations from specific supported configurations may indeed jeopardize PCI DSS compliance for merchants and businesses who implement the chosen COTS payment application.
Additional Visa Mandates
Beginning in January 2008, Visa raised the bar on application security when it announced a series of new mandates. Ultimately, these mandates are designed to eliminate the use of payment applications deemed vulnerable from Visa's payment processing networks. To quote from the announcement, "These mandates require acquirers to ensure their merchants and agents do not use payment applications known to retain prohibited data."
The initial Visa mandates will be focused primarily on new payment applications to be connected to the Visa payment processing system this year. As the other additional mandates are phased in over time, however, their overall objective is to force the eventual de-commissioning of all known vulnerable payment processing systems from Visa networks by July 2010.
In addition, Visa will be publishing a list of current known vulnerable applications and providing that information to acquirers. By doing so, Visa can ensure that acquirers will hold their merchants and agents accountable for using only non-vulnerable payment processing systems.
Conclusion
Web applications have become the backbone of banking and e-commerce. POS and payment processing applications leveraging web and web-like technologies are being deployed as the next generation alternative to similar legacy systems. They connect end-users, customers, merchants, agents, and partners and process sensitive data including personal and financial information which is of the highest value. They do so anywhere, everywhere, anytime, and in real time. The need for significantly enhanced application security becomes paramount, and as a result the importance of PCI DSS and PA-DSS application security requirements become even more focused.
While application security presents some of the most challenging, and possibly the most costly, barriers to compliance with PCI DSS, requirement 6.6 is far too important to ignore, no matter how difficult or how costly. Your organization's future depends on securing web applications, and the cost of a breach will eclipse the cost of doing the right thing by protecting the applications and sensitive data in the first place.
Ben Rothke CISSP, QSA (ben.rothke@bt.com) is a Security Consultant with BT Professional Services and the author of Computer Security: 20 Things Every Employee Should Know (McGraw-Hill Professional Education). David Mundhenk CISSP, PCI-DSS & PA-DSS QSA, QPASP (stratamund@sbcglobal.net) is a Security Consultant with a major professional services firm.

When in Doubt, Consider the Customer

I have an interesting question for you: How many people are there between your customer and your CEO? Hint: As in golf, lower scores win. And as many of us find ourselves facing bleak numbers in the short term, it's worth thinking about what makes happy customers.
The problem with many businesses is that they have too many customer service firewalls in place. There is no accountability. There are no common complaint-routing protocols, and escalation procedures for getting things resolved are spotty at best. Many businesses think nothing of spending hundreds or even thousands of dollars on customer acquisition, but when it comes to spending on customer retention, there isn't any calculation on what it costs to reduce churn.
[ Check out The Gripe Line, InfoWorld's ongoing effort to expose and resolve reader problems with customer service. ]
Why is this the case? Aren't happy customers the best references for a business? And these days, unhappy customers have powerful tools at their disposal to carpet-bomb the Internet to tell their tales of woe.
How many of you have seen the Dell-laptop-on-fire video or heard the recording of the caller who tried in vain to cancel his AOL account for the better part of 15 minutes? And while there are some ways to defend against a rabid blogger, there is no better mechanism than to provide solid customer service to begin with.
I was reminded of the sad state of affairs with customer service when I tried to order new service from AT&T and DirecTV for a new home. First I tried to order via their Web sites. No go. Then I tried the phone number -- which on AT&T's Web site was outdated. DirecTV makes you hunt down the number. When I finally found the right numbers, I got the wrong department because I was calling from an IP phone that had a 310 area code, yet I live in St. Louis (I guess it is my fault for not having the "right" phone number). The service rep just terminated the call rather than bothering to help me with my DSL service. When dealing with the phone company, data is still a four-letter word.
Is something wrong with this picture? Contrast this with a local retailer who sent my wife a check for $2 because he overcharged her shipping expenses. Or the guy at my local garage, who told me I owed nothing because he could easily fix my car's problem and didn't feel right about charging me anything. When was the last time your mechanic did that? Both of these folks earned my undying loyalty and respect and made us customers for life. What was the cost of that acquisition? Not a heck of a lot.
Right now many of us are being forced to revise our budgets or make contingency plans based on bleak forecasts. As part of that painful process, consider how many layers you've put in place and whether that structure actually makes your customers happy. A little thought could yield some pretty impressive improvements to your bottom line.

Whoop de Doop for De-Dupe

De-duplication started out as a way to do backups without having to store mostly the same stuff over and over again. Companies like Data Domain, Diligent Technologies, and NetApp provided de-dupe of virtual tape libraries and direct-to-disk backup targets, providing full backups that stored only the changes since the previous backup. The result: You could reap the same space savings you get with incremental backups but without the necessity for multiple restores to re-create an entire volume.
Now these same companies are advertising de-duplication of near-line storage, and even online storage in NetApp’s case, while other vendors are using de-duplication to reduce WAN traffic, shrink the size of databases, or compress e-mail archives. Yes, de-duping is going gangbusters. Heck, we might even dream of the day when you might never need more than one copy of any file throughout the entire enterprise. Assuming it’s possible, is that something you’d want?
Currently, all storage de-duplication requires a gateway between the server and the storage. Methods of de-duplication vary widely. Some solutions function at the file level, some at the block level, and some work with units of storage even smaller than blocks, variously referred to as segments or chunklets. Processing for de-duplication can occur either "in-line" (i.e., before the data is written to storage) or "post process" (meaning after the data is initially written).
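The block-level approach described above can be sketched in a few lines: split data into fixed-size blocks, key each block by a content hash, and store repeated blocks only once. Real products typically use variable-size chunking, collision handling, and far more efficient on-disk structures, so this is an illustration only:

```python
import hashlib

def dedupe(data: bytes, block_size: int = 8):
    """Store fixed-size blocks keyed by content hash; the 'recipe' is
    the ordered list of hashes needed to reconstruct the original data."""
    store, recipe = {}, []
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # identical blocks stored once
        recipe.append(digest)
    return store, recipe

def restore(store, recipe):
    """Reassemble the original data from the stored blocks and recipe."""
    return b"".join(store[d] for d in recipe)

data = b"AAAAAAAA" * 3 + b"BBBBBBBB"  # three identical blocks + one unique
store, recipe = dedupe(data)
print(len(recipe), len(store))        # 4 blocks referenced, 2 stored
```

The sketch also shows why the gateway becomes a choke point: every read must resolve the recipe through the shared block store, so live-data workloads pay the lookup cost on every access.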
There are applications where de-duplication is extremely effective, and ones where it isn’t. If data is largely the same, such as multiple backups of the same volume or boot images for virtual servers, de-duplication can provide enormous reductions in the storage space required. However, dynamic data, such as transactional databases or swap files, will show very little reduction in size and may also be sensitive to the latency introduced by de-duplication processing. In the case of databases, though, de-duplication can in fact improve I/O performance and speed up some queries (see "Oracle Database 11g Advanced Compression testbed, methodology, and results").
But the biggest issue with de-duplication is that it creates a choke point: All data to be compressed must be saved and retrieved through the de-duplication gateway. This isn't much of an issue with backups or even near-line archives. But for applications where access to the data becomes critical, or usage is heavy, the gateway becomes a hot spot, requiring redundant gateways, dual-path SAN infrastructure, and redundant storage. Given the investment necessary to support live data, where even short interruptions to access would cause major problems, it is typically cheaper to live with multiple copies.
There’s no question that de-duplication can provide great benefits in specialized applications, including backups, e-mail archives, and other cases where data is largely repetitive, such as VMware boot images. However, a fully de-duplicated enterprise, even if feasible, would require a massive and expensive infrastructure. Given that disk capacity continues to grow in leaps and bounds, scaling out de-duplication will be difficult to justify. It’s cheaper to keep buying more local storage than to put all the eggs in one basket.

Forrester: Discontent Persists Over SAP Maintenance Hike

A new Forrester Research study that polled more than 200 SAP customers found widespread discontent over the vendor's recent decision to shift customers to a pricier Enterprise Support offering, and also provides tips on how customers can mitigate the increased cost.
SAP announced in July that Enterprise Support would replace its basic and premium support options. Enterprise Support costs 22 percent of a customer's license fees, compared to 17 percent for basic support. The additional costs will be phased in over the next few years, and new charges won't begin until Jan. 1.
But Forrester clients voiced a number of common gripes.
Eighty-five percent of the clients interviewed described minimal utilization of the Basic Support offering. "The average customer claims to connect with SAP fewer than six times a year -- the equivalent of buying a comprehensive but expensive insurance policy and rarely utilizing it," the report states.
Customers also complained to Forrester about the time it takes SAP to meet requests for new features.
"Customers believe that the maintenance dollars paid to SAP should go to filling in key functionality gaps in the software. However, there are a plethora of examples where key functionality requested two to four years ago by multiple customers in the same or different industries were not delivered in SAP R/3 4.7, let alone available in SAP ERP 6.0," the report states.
Clients "want to know how much of their support dollars really go back into reinvestment versus profit margins," it adds.
SAP has cited a number of reasons for its decision, such as greater complexity in customer environments, and argues that Enterprise Support provides a higher level of benefits for customers -- points the Forrester report does not dispute.
To mitigate the increased cost of maintenance, customers should seek steeper discounts on licensing deals, according to Forrester.
Another tactic would be to create a long-term "SAP containment strategy," which could include taking a look at other vendors. "Many SAP clients with whom we spoke have begun the process of evaluating Oracle Siebel, Salesforce.com and others for customer relationship management as well as Siperian, Initiate Systems and IBM for master data management."
Customers also should consider third-party options for support. While one company, Rimini Street, has announced plans to provide such support, it has not yet begun doing so, and even when it does it will focus only on SAP's R/3 products.
The Forrester report's results stand in contrast to SAP's past contention that while customers may not like to pay more money, they understand the value of the new service.
An SAP spokesman was given a copy of the report on Friday, but did not immediately provide comment on its findings.

Thursday, October 9, 2008

Microsoft Joins Study to Gauge Impact of Genetic Testing

Microsoft is co-sponsoring a study to see if people who undergo genetic testing to identify their risk for developing certain diseases actually change their behavior to mitigate that risk.
San Diego-based research lab Scripps Translational Science Institute (STSI), the study's main sponsor, will offer genetic scans to up to 10,000 employees, family members and friends of Scripps Health that provide a detailed analysis of their risk for more than 20 health conditions. Scripps Health is a US$2 billion nonprofit community health system also based in San Diego.
The conditions -- including diabetes, obesity, heart attack and some forms of cancer -- are ones that can be changed or prevented by people's lifestyle choices. Scripps will then track changes in the participants' behaviors over 20 years to see if people who learn they are at risk for certain diseases or conditions will actually take preventative measures to avoid them, Microsoft said.
Microsoft is contributing its HealthVault service to the study. HealthVault is Microsoft's online repository for storing patient information and allowing it to be shared, at the discretion of patients, with health care providers and other parties they trust with it, such as their insurance companies. Using HealthVault, study participants can store, track and manage health and lifestyle information over the course of the study, Microsoft said.
Genetic testing service provider Navigenics of Redwood Shores, California, and Affymetrix of Santa Clara, California, which provides hardware and software to do genetic testing, also are co-sponsoring the study and contributing technology to it.
Affymetrix will scan the genomes of participants, while Navigenics will interpret the scan results and offer guidance to study participants to help them prevent future health conditions, or at least lessen the negative impact of them.
The ultimate goal is to help researchers better understand ways to prevent, diagnose and treat diseases, Microsoft said.
HealthVault aims to bridge the gap between enterprise companies, such as health-insurance providers, and patients through an online system that allows them to share information securely over the Web. Competitor Google also is developing a similar offering called Google Health.

Nokia, Nuance Aim Voice Features at Developers

A partnership between Nokia and speech-recognition software vendor Nuance will make the software company's capabilities available to third-party developers as well as to the phone maker itself.
Nokia is already a major customer of Nuance, which provides both speech and predictive text capabilities on a number of Nokia's handsets. The deal announced Wednesday could bring more Nuance technology to Nokia phones but will also produce open protocols that developers can use to build such features into applications for the handsets.
Nuance is a major supplier of speech-recognition software for both PCs and mobile devices, and last year bought the maker of the widely used T9 software for completing words that cell-phone users are trying to type in e-mail or text messages. It has gone beyond basic voice-enabled functions such as dialing and now offers additional capabilities such as text or e-mail dictation and searching for content on a phone or products in an online mobile store. Nokia is the world's largest mobile-phone maker.
Under the new relationship, Nuance will provide Nokia with some of those more advanced features, said Michael Thompson, vice president and general manager of Nuance Mobile Speech. But the implications for third-party applications could be even more significant.
Full details haven't been worked out, but the idea will be to let developers using Nokia's Series 60 and Series 40 software platforms take advantage of Nuance's speech-recognition and other functions in their applications, Thompson said. Rather than having to approach Nuance separately, the developers would get access to those capabilities through the Nokia development platforms. The companies said they would provide developers with open programming interfaces, language models and development tools.
The applications they build could appear on a wide range of phones, from low-end devices to smartphones, and Nuance's technology spans functions that run both on devices and on servers, Thompson said. Server-based speech recognition, accessed over a high-speed mobile network, can draw on much greater processing power while consuming less of the phone's memory and battery than on-device processing.
Nokia supports a large community of application developers, especially for its Series 60 smartphone platform, and is in the process of acquiring full ownership of the Symbian OS, which forms the basis of Series 60, so it can make the OS available as open source through an entity called the Symbian Foundation. That move is part of a wider trend in the mobile industry, accelerated by the success of Apple's iPhone App Store, of creating and tapping into large developer communities to make devices compelling.
The deal is not exclusive, and the companies didn't reveal any financial details. The primary royalty arrangement would be between Nokia and Nuance, Thompson said.
Making the voice tools available to third-party developers might lead to a flowering of new voice-enabled mobile applications, including games, said analyst Jack Gold of J. Gold Associates. But there's a danger of those third parties implementing the technology poorly, he added.
"Command and control is one thing," Gold said. "Understanding random speech patterns is very hard to do."

Microsoft, Others Seek to Get Paid for Sales to WaMu

Microsoft got in line with several other organizations that are taking steps to get paid for products and services provided to Washington Mutual, the largest bank to fail in U.S. history.
On Tuesday, Microsoft filed a document with the Delaware bankruptcy court handling the WaMu case asking to be sent copies of all proceedings in the case.
"Microsoft filed a notice of appearance because we have existing contracts for software licenses and consulting services with Washington Mutual and we want to make sure those contracts are properly administered through the bankruptcy process," said David Bowermaster, a Microsoft spokesman, in an e-mailed statement.
Microsoft would not describe the size or duration of its contracts with WaMu. The bank is a beta tester and early user of its products, and last year a WaMu executive joined Microsoft Chairman Bill Gates on stage at a launch event for Windows Vista, Office 2007 and Exchange 2007.
Microsoft isn't alone in its efforts. On Tuesday Siemens filed a motion asking that JP Morgan, which is taking over WaMu, either reject or assume an IT services contract it has with the bank, worth US$5 million to $6 million each month. Siemens said it had supplied about $10 million worth of IT services to WaMu that have not been paid for. Siemens has about 400 employees and contractors providing services to the bank, it said.
And Tata Consultancy Services, the Indian outsourcing and IT services company, filed a motion similar to Microsoft's asking to be kept apprised of proceedings.
Government regulators seized WaMu in late September and let other companies bid to take over the bank. JP Morgan had the winning bid. The economic crisis in the U.S., driven in part by a housing slump, hit WaMu particularly hard because it was one of the country's biggest providers of home mortgages, including risky loans.
Amid the troubles in the financial sector, many analysts say that the surviving financial organizations will still need to rely heavily on technology, so IT vendors may not be hit too hard by banking failures. While the wider economic meltdown is sure to slow IT spending overall, many analysts still say that they expect most of the big vendors to be able to weather the storm.

Wednesday, October 8, 2008

Yahoo Revamps Calendar Service

Yahoo has developed a new online calendar that the company said offers significant improvement over the current product because it makes it easier to share items and has a more interactive interface.
Yahoo will begin to offer the new version of Yahoo Calendar on Wednesday in beta in the U.S., Brazil, India, Taiwan and the U.K. General availability is expected in the coming months.
Although Yahoo Calendar and Yahoo Mail are tightly integrated, only about 8.1 million people use the former, compared with about 278 million for the latter, said John Kremer, Yahoo Mail vice president.
The situation is similar among other major providers of Webmail and online calendar services, said Matt Cain, a Gartner analyst. Only about 3 to 4 percent of consumers also use their preferred Webmail service's companion online calendar, he said.
There are various reasons for this, including continued use of paper-based calendars at home and of desktop calendar software like Outlook and Lotus Notes in the workplace, Cain said.
However, Gartner sees favorable conditions for significantly boosting the use of online calendars among consumers, from around 4 percent in 2008 to 25 percent in 2012, Cain said.
The factors that will drive up usage include increasing industry adoption of open standards that make online calendar services interoperable, he said. This is coupled with an increasing realization among consumers of the benefits of subscribing to public calendars and sharing their online calendars with friends and family. In addition, younger users are growing up with online calendars, he said.
"There's no question that online calendars will emerge as a very important part of portal collaboration software offerings for Yahoo, Microsoft, Google and AOL," Cain said.
"For anyone with a big e-mail population, there's no doubt that the next big battleground will be at the calendar level," Cain said, adding that online calendars will become prime real estate for advertising.
Yahoo is hoping that the new Yahoo Calendar will prompt more Yahoo Mail subscribers to use it. Yahoo Calendar hasn't gotten a facelift of this magnitude in about 10 years, Kremer said.
Among the features Yahoo is highlighting in the new calendar service is its compatibility with competing products from providers like Mozilla, Apple, Microsoft, AOL and Google.
This compatibility, which will allow Yahoo users to share calendar data with users of those other services, is possible because Yahoo Calendar is built on open standards like iCalendar (iCal) and CalDAV, Kremer said.
Yahoo hopes that open calendar standards are adopted broadly not only among online service providers but also among makers of business desktop and mobile calendar software, so that Yahoo Calendar will become interoperable with those products, Kremer said.
The new Yahoo Calendar allows items to be dragged and dropped into the calendar and color-coded, and users will be able to call up views as detailed as a single event or as broad as a full month. The new Yahoo Calendar also has a "to do" feature for listing pending tasks and the capability to set up reminder alerts that can be delivered via e-mail, instant messaging or SMS.
Future versions of the new Yahoo Calendar will also sync up bi-directionally with Microsoft Outlook's calendar and let users access Yahoo Calendar when they're not connected to the Internet via integration with Zimbra Desktop. The new Yahoo Calendar is based on technology from Zimbra, which Yahoo acquired last year.
In the future, Yahoo also plans to add further integration with other of its services, such as Yahoo Maps, so that users can call up a map from within the calendar, Kremer said.
Users will also be able to subscribe to receive events from public calendars and event services like Yahoo's own Upcoming.org.
The new Yahoo Calendar will also give third-party developers the chance to build applications and extensions for it via APIs and its standards-based architecture, Kremer said.
Yahoo has set up a Web site for those interested in getting more information about the new Yahoo Calendar.

Wikia Search Debuts App Platform to Sharpen Query Answers

Wikia Search will roll out on Wednesday a platform that developers can use to create applications for this open-source search engine.
Called Wikia Intelligent Search Extensions (WISE), the platform is aimed at letting individuals and organizations create applications that sharpen the search engine's ability to answer queries.
"As we look at what's going on in search, we've realized there's a whole bunch of rules-based mechanisms you can use to map a search query to the exact correct result," said Wikia Inc. founder Jimmy Wales.
Already organizations including Thomson Reuters, The Washington Post, Digg and Twitter have created extensions to allow Wikia Search to provide more relevant results for specific queries.
Wikia Search already allows anyone to participate in building its index by manually adding, deleting and rating Web pages, as well as editing a search result URL by modifying its headline and description. Contributions are reflected immediately and don't go through an approval process.
With WISE, Wikia Search wants to automate this participation. "We'll get a whole new level of user-built search. Instead of having to individually edit one result at a time, which is useful in some contexts, here users can create applications for whole categories or whole rules of searches," Wales said.
The Washington Post's application, for example, delivers articles from the newspaper to Wikia Search results.
Also taking advantage of the WISE platform to create direct links between their site content and Wikia Search are weather information provider AccuWeather.com, job listings site Indeed.com and travel site Kayak.com.
Wikia Search will provide documentation and a sandbox for building and testing the applications, which will be reviewed by the company before going live on the site.
With WISE, Wikia Search wants to increase the frequency with which it can answer a query right from the results page, saving users from having to scan Web site links and click around to find the desired information.

MySQL Cofounder David Axmark Leaving Sun

David Axmark, a cofounder and former lead engineer for MySQL, has resigned from Sun Microsystems a few weeks after another cofounder said he may also leave the company.
"I have thought about my role at Sun and decided that I am better off in smaller organisations," Axmark wrote in his resignation letter, according to a blog post Tuesday from Kaj Arno, head of MySQL community relations.
"I HATE all the rules that I need to follow, and I also HATE breaking them. It would be far better for me to 'retire' from employment and work with MySQL and Sun on a less formal basis," Axmark wrote. His last day with Sun will be Nov. 10, Arno said via instant message from Germany.
Axmark filled several important roles at MySQL over the years, including head of engineering, head of internal IT and head of community relations. It's because of Axmark that MySQL is open-source software, according to Arno. Another cofounder, Michael "Monty" Widenius, had planned it to be closed source, he said.
His departure will be a setback for Sun, which acquired MySQL for US$1 billion in January and hopes to attract new developers to the database.
Early last month Arno confirmed that Widenius was also thinking of resigning from Sun. He is still with the company, but his future there is still "hard to predict," Arno said Tuesday.
Axmark is an "important figurehead" at MySQL and someone who recruited many of its top engineers, Arno said. But his role lately has been primarily speaking with the press and liaising with the open-source community, according to Arno. Axmark will continue to do consulting and speaking engagements for Sun, he said.
"Emotionally, it's a sad moment to see a cofounder leave -- but the day-to-day impact is low," Arno said via instant message.
Marten Mickos, MySQL's former CEO, continues to lead the database group within Sun, and Jeffrey Pugh is still head of engineering, Arno said.
Axmark couldn't be reached for comment Tuesday. Arno said the things he disliked about being part of a large company were "mundane things" like having to turn in expense reports, order travel and change his e-mail to "@sun.com."

Facebook Starts Offering Live Search

Facebook has begun allowing users to search the Web from within Facebook, using Microsoft's Live search service.
The arrangement, first announced in July, offers a revenue opportunity for both companies. But they'll need to convince people to use the search feature, and it's uncertain whether they will in its current form.
"I'm not sure the experience they offer is optimal," said Greg Sterling, an analyst at Sterling Market Intelligence.
Still, the deal is a coup for Microsoft, which has been struggling to boost its search business. It might be the biggest search agreement for Microsoft in terms of potential users, Sterling said. Microsoft owns a stake in Facebook and has an existing exclusive agreement with the site for banner advertisements.
Starting Tuesday, Facebook users began to notice that when they start typing in the search bar on Facebook, a drop-down menu appears and they can choose to search Facebook or search the Web. If they choose the Web, results appear on a new Facebook page, with advertisements on the right of the screen.
At the top of the results list is a link to do an "Advanced search on Live.com," which launches a new window with results displayed at Live.com. Those results are sometimes slightly different from the ones on the Facebook page and include the ability to expand a search to include images, news, maps and videos.
The results are different because Facebook uses certain filters and doesn't display sponsored results, said Matt Hicks, a Facebook spokesman.
Another potential drawback, Sterling said, is that beyond requiring users to do an advanced search to find image or map results, the service presents results in a format that looks different from other leading search providers'. Each item appears in a box that lists the Web site, a description of it and a link. "The presentation of results is a little strange, it's a little unfamiliar for search results," he said.
However, the companies may view this as the first try, with plans to add more capabilities.
"As we evaluate user feedback and results we'll explore additional ways to integrate Live Search more deeply into the Facebook experience," wrote Angus Norton, senior director of Live Search product management at Microsoft, in a blog post. He called today's service a first step.
In the future, Facebook should look for ways that search results and other content on the site can be more relevant to people in the context of their visit to Facebook, said Jeremiah Owyang, an analyst with Forrester Research. "This is a long-term play," he said.
Facebook was one of the only major social-networking sites that didn't offer search, Sterling said. "That was kind of a major issue and they've rectified that," he said.

Brocade Gets Loan to Fund Foundry Buyout

Brocade Communications Systems on Tuesday secured a US$1.1 billion loan to fund its acquisition of Foundry Networks despite a tightening of credit markets amid the Wall Street meltdown.
The loan came through even as the Dow Jones Industrial Average fell more than 500 points (5.11 percent), and many technology stocks dropped even more steeply. Among those hardest hit were Sun Microsystems, Nortel Networks, Amazon.com and Qwest Communications International, all down more than 10 percent. Apple shares went down 9.25 percent while Google suffered less damage, falling 6.79 percent. As they did in the bloodbath last Monday, most major tech stocks, including Microsoft and Cisco Systems, outpaced the Dow in their journey down on Tuesday.
Brocade's planned acquisition of Foundry, announced in July, will expand the storage-area-networking pioneer into the Ethernet LAN business for an end-to-end set of offerings. Brocade said in July it would exchange a combination of stock and cash for each Foundry share, partially funding the deal with about $1.5 billion of debt financing from Bank of America and Morgan Stanley.
Such arrangements are common, but by the time Brocade held an analyst meeting in mid-September to lay out its plans for Foundry, Wall Street investment banks were already teetering and stocks were on their way down. Financial analysts at the event repeatedly asked Brocade CEO Michael Klayko about the financing of the deal, and he said each time that he was confident it would come through.
Brocade said Tuesday it got a $1.1 billion term loan facility and a $125 million revolving credit facility. Bank of America N.A. led the funding, joined by Banc of America Securities, Morgan Stanley Senior Funding and other institutions. Brocade also said it expects to raise as much as $400 million in additional financing. Brocade stuck by its forecast that the acquisition would be completed by year's end.
News of the loan, released after the U.S. trading day, perked up both companies' shares. After falling $0.39 to $4.43 at the closing bell, Brocade shares (Nasdaq: BRCD) were up $0.36 in after-hours trading late Tuesday. Foundry (Nasdaq: FDRY), which dropped $0.91 to $16.26 during the day, was up $1.28 after hours.
Also on Tuesday, VoIP (voice over Internet protocol) service provider Vonage Holdings announced new terms for its proposed debt financing with Silver Point Finance. It announced $215 million in private debt financing with Silver Point in July, but has restructured that financing, which now totals $220.3 million. The company expects that deal to close next month. Its stock (VG, on the New York Stock Exchange) had fallen $0.10 to $0.85 in the day's trading but was up almost $0.03 after hours. Vonage has settled a series of patent lawsuits but is still struggling as bigger players, such as cable operators, offer competing VoIP plans packaged with other services.

Wednesday, October 1, 2008

Is That Keyboard Toxic?

Warning: Your keyboard could be a danger to you and the environment.
Sound preposterous? Then consider this: Some keyboards contain nanosilver, which, because of its antimicrobial properties, is increasingly being incorporated into everyday items even though studies have questioned its health and environmental safety.
Studies are raising concerns about the proliferation of nanotechnology, which can be found in numerous products, from IT components to cosmetics.
"The biggest issue around nanotechnology is that we don't know [all of its risks]. We're putting things on the market that haven't been fully tested," says Sheila Davis, executive director of the Silicon Valley Toxics Coalition (SVTC), a San Jose-based advocacy group.
Nanotechnology refers to work done on the nanoscale; 1 nanometer equals a billionth of a meter, or about 1/100,000 the thickness of a sheet of paper.
Use of this technology can save resources and energy. Moreover, nanomaterials offer potential benefits that could revolutionize our world. For example, they could be used to track tumors or clean up contaminated water and soil.
But scientific studies have also found potential health and environmental problems with nanomaterials.
"The nanotech boom is generating an unprecedented number of new processes and materials that pose unknown potential environmental and health hazards," the SVTC stated in its April 2008 report on nanotechnology and its risks.
And research published in the May issue of Nature Nanotechnology suggests that carbon nanotubes, which researchers are using to build next-generation circuits, could be as harmful as asbestos.
"We have to consider with new physical properties that there's likely to be a new toxicology profile and do more testing before people are exposed," says Jennifer Sass, a senior scientist at the Natural Resources Defense Council in New York who specializes in toxicology.
Sass' comments bring us back to that keyboard. Does yours contain nanosilver, which studies suggest may damage human cells as well as disrupt the nitrogen balance in freshwater ecosystems? Most likely, you don't know. And you probably can't easily find out because manufacturers aren't required to note that products contain nanomaterials.
The good news, however, is that a number of factors limit the potential dangers posed by nanotechnology. One of the most significant is that humans evolved in the presence of nanoparticles, says R. Stanley Williams, director of information and the quantum systems laboratory at HP Labs in Palo Alto, Calif.
"There is certainly reason to be careful," he says. "But our environment is filled with nanoparticles. We just didn't know it until we had tools that could see them."
Even so, industry is taking steps to minimize exposure. Leading manufacturers follow protocols to contain manufactured nanoparticles, says Mihail Roco, senior adviser for nanotechnology at the National Science Foundation.
For example, workers at Intel Corp. wear protective gear, use HEPP (high-efficiency pleated polypropylene) filters and work under hoods, where air pressure pulls wayward particles away from them and into filters, says Todd Brady, Intel's corporate environmental manager.
Technology users also have a measure of protection against exposure thanks to the very nature of nanotechnology. Nanoparticles are bound with other materials to make final products, and studies show that nanomaterials stay bound and therefore won't harm humans or the environment.
"These are so tightly locked down that there's no way the nanoparticles can get out," Williams says. "You can beat on them with a hammer, and they still won't get out."

Broadcom, Skyhook Join for Better Location Data

Broadcom will enhance its mobile device chipsets and location-based services (LBS) infrastructure through a partnership with Skyhook Wireless, the company that powers the iPhone's location feature using Wi-Fi access points.
In addition to being a major vendor of communications chips, Broadcom also operates an international infrastructure for location-based services that uses information from GPS (Global Positioning System) to determine where a mobile device is. Skyhook's technology can use signals from Wi-Fi hot spots, as well as cellular base stations and GPS, to do the same thing.
Broadcom will integrate Skyhook's capabilities into its chipsets and its LBS infrastructure, the companies said Tuesday. This will give makers of mobile phones, personal navigation devices and other products a single, integrated hybrid positioning system that can take advantage of GPS, Wi-Fi and cellular base stations, they said.
GPS, which relies on satellites, has a wide coverage area but doesn't work well indoors or in urban settings where there are buildings blocking signals from satellites. Skyhook's Wi-Fi Positioning System can analyze signals from nearby private and public Wi-Fi access points and compare them against a database of geographical points to determine location.
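The lookup-and-compare approach described above can be sketched in a few lines. This toy example estimates a device's position as a signal-strength-weighted average of the known coordinates of nearby access points; the database, BSSIDs and coordinates are all invented, and Skyhook's actual algorithms are proprietary and far more sophisticated.

```python
# Toy Wi-Fi positioning: match scanned access points (by BSSID) against a
# reference database of known coordinates, then average those coordinates,
# weighting closer (stronger-signal) APs more heavily.

AP_DB = {  # hypothetical BSSID -> (latitude, longitude) survey database
    "00:11:22:33:44:55": (37.7749, -122.4194),
    "66:77:88:99:aa:bb": (37.7751, -122.4189),
    "cc:dd:ee:ff:00:11": (37.7746, -122.4200),
}

def estimate_position(scan):
    """scan: list of (bssid, rssi_dbm) pairs. Returns (lat, lon) or None."""
    total_w = lat = lon = 0.0
    for bssid, rssi in scan:
        if bssid not in AP_DB:
            continue  # AP not in the survey database: ignore it
        # Convert dBm to linear power so stronger signals dominate the average.
        w = 10 ** (rssi / 10.0)
        ap_lat, ap_lon = AP_DB[bssid]
        lat += w * ap_lat
        lon += w * ap_lon
        total_w += w
    if total_w == 0:
        return None  # no known APs seen; fall back to GPS or cell towers
    return (lat / total_w, lon / total_w)

# A scan seeing one strong and one weak known AP:
pos = estimate_position([("00:11:22:33:44:55", -40), ("66:77:88:99:aa:bb", -70)])
```

A hybrid system like the one Broadcom and Skyhook describe would blend an estimate like this with GPS and cell-tower fixes, falling back to whichever sources are available indoors or in urban canyons.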
LBS is becoming one of the brightest spots in mobile data as consumers take advantage of the ability to find businesses and even friends nearby, according to industry executives. Skyhook Founder Ted Morgan told the Mobilize conference in mid-September that the company's service had received "billions" of location requests from users and that this volume had spiked up in the previous three months. The GPS-equipped iPhone 3G went on sale in July. The location-enabled version of Google Maps for Mobile, which has been available on the iPhone since earlier this year, gets twice as much use as the previous version, according to Steve Lee, a Google project manager who spoke at the same conference.
Broadcom's LBS system formed the basis of the first nationwide E911 emergency location system in the U.S. in 2002 and today serves more than 20 million active customers in more than 170 countries, according to the company.

WindowShade X Revisited

Over the many years we've been writing about great low-cost software, one of the most popular products--with both readers and Gems writers--has been Unsanity's WindowShade X. This "haxie," as Unsanity calls its system-enhancement utilities, brings back one of the favorite features of Mac OS 9: windowshade-style window minimizing. With WindowShade X installed, double-clicking the title bar of a window no longer minimizes the window to the Dock; instead, the entire window "rolls up"--complete with audio effect--into the title bar, which remains in place.
This is a great way to keep windows visible and accessible without blocking your view of other onscreen items. It's also a handy way to quickly view something behind a window: double-click for a better view, and then double-click again to restore the window. Although Exposé, introduced in Mac OS X 10.3 (Panther), reduced the utility of this windowshade feature somewhat, it still has its advantages.
Unfortunately, Mac OS X 10.5 (Leopard) broke WindowShade X, forcing fans to muddle through without it. Granted, most people got by just fine. But I started "windowshading" when the feature debuted in 1997 as part of Mac OS 8, and I used WindowShade X in OS X for years; by the time Leopard was released, I'd been using this functionality for over a decade! That's some serious muscle memory to overcome, and, in fact, as recently as a couple weeks ago I still found myself wanting to "roll up" windows.
Why only until a couple weeks ago? Because that's when Unsanity finally released WindowShade X 4.2, the first official release that works with Leopard. Like previous versions, version 4.2 offers a standard windowshade mode, as well as three other "minimize" features: transparency, which makes a window translucent so you can see what's behind it; minimize-in-place, which shrinks a window down to the size of a large icon (that you can move around); and hiding, which hides the application to which the window belongs.
But you don't have to choose just one of these modes; via the WindowShade X preferences pane, you can assign a different action to each mode, as well as change the action required to get the standard minimize-to-Dock behavior. (You can choose from among several different actions.) For example, on my Macs, double-clicking a window's title bar windowshades it, while control-double-clicking makes the window translucent. You're also supposed to be able to create your own actions and keyboard shortcuts; unfortunately, I haven't gotten this feature to work properly.
You can also customize many of these effects. For example, you can choose the translucency of "transparent" windows; choose the size and behavior of minimized-in-place windows; and set up application-specific preferences so, for example, double-clicking the menu bar of a window does one thing in the Finder, another in your favorite Web browser, and another in Photoshop.
In addition to adding Leopard compatibility, WindowShade X 4.2 works better with iTunes and fixes a number of bugs. It also removes a feature of older versions of WindowShade, custom shadow settings, that never worked reliably.
Besides the problem I noted above about custom actions, WindowShade X's minimize-in-place option doesn't currently work with Mac OS X's Spaces feature. I'd also like to be able to adjust the volume of the "swoosh" sound you hear when minimizing a window. But perhaps the biggest issue for some OS X users is how WindowShade X works its magic: via Unsanity's Application Enhancer, now at a Leopard-compatible version 2.5. What is Application Enhancer? As Unsanity explains it:
It is a combination of a Framework and a system daemon. Application Enhancer performs its task by loading plugins (Application Enhancer modules) containing executable code into the running applications. Once loaded, the APE module performs the needed modifications (such as redefining the minimize window action, or customizing the standard Apple menu) on the launched application memory space, never touching any files on disk, utilizing set of functions defined in the Application Enhancer framework. To help the APE modules to be loaded into newly launched applications, the Application Enhancer daemon (aped) is used.
In other words, it's a system hack that affects all running applications (although you can manually exclude certain applications from being modified by Application Enhancer and, thus, Application Enhancer-based system utilities). Some Mac users refuse to use such hacks on their systems because of concerns about instability and other potential issues; if you're one of these people, WindowShade X isn't for you. That said, I ran WindowShade X for years on my pre-Leopard Macs without problems, and I've been using version 4.2 for the past couple weeks without incident.
(One related note: Unsanity's instructions say you need only log out and then back in after installing WindowShade X; however, I had to actually restart my Mac for WindowShade X to take effect.)
Finally, a note about WindowShade X's price. Unsanity provides version 4.2 free to anyone who's ever purchased an older version. However, the company requests that those who've been using it for many years consider paying a US$7 "voluntary upgrade fee." As the name implies, you aren't required to pay it, but if you've been using WindowShade X across multiple versions of OS X, it's a way of saying "Thanks for all the free upgrades; here's some cash to help with future development."
WindowShade X 4.2 requires Mac OS X 10.4 or later.

YouTube Improves Video Usage Analytics

Delivering on a pledge made earlier this year, Google has again improved usage metrics for YouTube videos, making it possible for account holders to measure the popularity of different parts of a clip.
This new feature, available to anyone who has uploaded videos to YouTube, is called Hot Spots; it displays a graph showing how viewership of a clip rises and falls from moment to moment.
The feature could come in handy for videographers who want to re-edit their clips based on where viewership peaks and dips.
YouTube, primarily used by millions of individuals to upload personal or copied clips, is also employed by organizations to market their products and services.
More recently, Google added a video upload and sharing service based on YouTube to its Google Apps Premier hosted suite of collaboration and communication applications, which is designed for workplace use.
In early March, a YouTube official told attendees at the eRetailer Summit in Miami that YouTube would soon roll out tools to measure videos' viewership data in more detail.
As promised, a few weeks later Google unveiled YouTube Insight, a tool for YouTube account holders interested in monitoring usage statistics for their clips, such as the geographic locations of viewers.
Hot Spots is part of the YouTube Insight service. "We determine 'hot' and 'cold' spots by comparing your video's abandonment rate at that moment to other videos on YouTube of the same length, and incorporating data about rewinds and fast-forwards," wrote YouTube product managers Tracy Chan and Nick Jakobi on Tuesday in an official blog posting.
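YouTube hasn't published the actual Hot Spots algorithm, but the description above — comparing a clip's abandonment rate at each moment against other videos of the same length — can be sketched in a few lines. This is purely an illustrative assumption: it supposes you have per-second audience-retention curves (the fraction of viewers still watching) for your video and an averaged benchmark curve for same-length videos, and it scores each moment by the difference.

```python
# Illustrative sketch only: Google has not published the Hot Spots
# formula. We assume per-second "retention" curves (fraction of the
# audience still watching at each second) and score each moment by
# how far the clip's retention sits above or below a benchmark
# averaged over other videos of the same length.

def hot_spot_scores(video_retention, benchmark_retention):
    """Return a per-second score: positive = 'hot', negative = 'cold'."""
    if len(video_retention) != len(benchmark_retention):
        raise ValueError("curves must cover the same duration")
    return [v - b for v, b in zip(video_retention, benchmark_retention)]

# A five-second clip that holds viewers better than average early on,
# then loses them faster than average toward the end.
video = [1.00, 0.95, 0.90, 0.50, 0.30]
typical = [1.00, 0.85, 0.70, 0.60, 0.55]
print(hot_spot_scores(video, typical))
```

A real implementation would also fold in rewind and fast-forward events, which the blog post mentions but does not detail, so this difference-of-curves view is only the simplest reading of the description.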