Friday 3 February 2012

ICT Esperanto and competition among standards

It has been a little while since the IP Finance weblog has hosted a piece from regular guest contributor and ICT patents and standards expert Keith Mallinson (WiseHarbor) -- but we are pleased to welcome him back for his first post for 2012:
ICT Esperanto and Competition among Standards

Open and competitive ICT markets produce many standards, but not all will flourish or even survive. With free choice, customers and their users often overwhelmingly plump for one standard over others in pursuit of highest performance (e.g., from HSPA in 3G cellular) or widest interoperability (e.g., with SMS for mobile messaging).

Significantly different levels of compliance and interoperability among vendors, all notionally supplying to the same standard, may also cause customers to favour one vendor over all others. If product performance and interoperability are inferior to those of established standards from leading vendors, newly introduced open standards or alternative vendors will fail with customers. Increasing the choice of standards or suppliers is useless if implementations do not work properly. In many cases that requires full interoperability with large bases of existing users.

However, coexistence of competing standards, including proprietary extensions to these, is also vital to facilitate innovation, satisfy diverse requirements and enable the emergence of new leading standards. Strong user preferences for certain standards justifiably reward those who have had the foresight, taken the risks and made the investments to develop and promote them, and to build a user base for their fully-featured and standards-compliant products.

Evolution beats creation with rich and widespread use

The need for high levels of performance and interoperability in ICT can be illustrated with an analogy in human language. According to Wikipedia, the goal with Esperanto’s creation was “an easy-to-learn and politically neutral language that transcends nationality and would foster peace and international understanding between people with different regional and/or national languages”. However, using Esperanto has never become more than a niche activity. It is spoken by less than 0.1% of the world’s population, versus more than 10% with English as a first or second language. Evolved languages have richer vocabularies, much more extensive literature and large numbers of existing mother-tongue and second-language users. English, Spanish, French, German and other languages have remained preeminent across regions, nations and within particular international domains in business, art, ICT and engineering. Customer preference and supplier leadership for books and courses from indigenous organizations, for example Collins for English as a foreign language and the Goethe Institute for German, are a natural consequence of how market supply for language education develops.

Improving interoperability among different suppliers’ kit to eliminate so-called vendor lock-in is a common requirement of centralized procurement authorities, but it is not the only, let alone the most important, need in selecting or mandating standards. Standards get displaced when alternatives provide distinctly better functionality or end-user interoperability. For example, the introduction of the GSM standard in mobile communications from 1992 was a major technological step forward from the various analogue technologies deployed in European nations, and no other digital standard was allowed to contend. GSM increased network efficiency in the use of scarce spectrum allocations, introduced a succession of new capabilities including text messaging and data communications, and enabled much wider geographic interoperability for users with international roaming. It also created a fresh start for European manufacturers who had been impeded by disparate national analogue standards with correspondingly fragmented handset and network equipment markets. The openness of the GSM standard also provided greater choice and created an expectation of less customer dependency on particular vendors. The needs of consumers, operators and equipment vendors were so much better satisfied with a new standard that it was quite possible to ignore the complications of backward compatibility and start afresh in European cellular with GSM.

Backward compatibility and dual-mode working

In order to preserve interoperability while adding new functionality and improved efficiency in ICT, it has sometimes been necessary to create standards with backward compatibility to older standards or to provide dual-mode working for many years. In the UK, two of only three TV channels were broadcast in monochrome with 405-line VHF transmissions until 1970, when colour was introduced with the addition of 625-line UHF transmissions. Although the colour transmissions were backward compatible with 625-line UHF monochrome receivers, a simulcast was required to serve old, single-standard TV sets until the 405-line VHF network was eventually closed down between 1982 and 1985. In the US, cellular systems including incompatible and competing digital CDMA and TDMA technologies, introduced from the mid-1990s, were designed to incorporate existing analogue capabilities to ensure national roaming. The requirement for cellular licensees to maintain capabilities for analogue roaming throughout their networks was not relinquished until 2008. Whereas 3G and 4G technologies (including WCDMA, HSPA, LTE and LTE Advanced) surpass the performance of 2G technologies (including GSM, GPRS and EDGE) with respect to spectral efficiency, network speeds and latency, 2G technologies will most likely remain embedded in our phones for at least another decade so we can continue to roam widely worldwide. Similarly, HSPA will be included in multimode devices to maximise access to mobile broadband coverage where LTE is not present or uses incompatible frequencies. Most Blu-ray players sold for decades to come will retain the ability to play our old libraries of regular DVDs.

In fact, because software-defined capabilities running on cheap processing and memory make the incremental cost of retaining old standards low, manufacturers find it increasingly attractive to keep old standards and incorporate multiple standards in their products. For example, as I draft this article, Microsoft Word offers me the option to “Save As” from among 18 different formats, including .odt (ODF OpenDocument text format), .html, .pdf and .docx, as well as .doc. Apple’s iPhone 4S incorporates three distinctly different cellular standards: GSM, CDMA EV-DO and HSPA (plus several other closely-related and compatible standards in each case) across five frequency bands, plus WiFi and Bluetooth. Customer and end-user choice is maximized without excessive incremental costs. Crowds can now choose for themselves which standards they prefer to actually use.
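To make the “software-defined” point concrete, here is a minimal sketch of how one document can be emitted in several of those formats programmatically. It assumes LibreOffice is installed with its soffice command on the PATH, and uses a hypothetical report.docx as input; Word’s own “Save As” menu exposes the same multi-format capability interactively.

```python
import subprocess
from pathlib import Path

# Hypothetical input file; any .docx in the working directory would do.
source = Path("report.docx")

# A few of the formats a modern office suite can emit from one document.
targets = ["odt", "doc", "html", "pdf"]

for fmt in targets:
    # LibreOffice's headless mode converts documents without opening a GUI.
    subprocess.run(
        ["soffice", "--headless", "--convert-to", fmt, str(source)],
        check=True,
    )
    print(f"Wrote {source.with_suffix('.' + fmt).name}")
```

The marginal cost of each additional format is little more than another entry in a list, which is exactly why vendors find it attractive to keep old standards alive alongside new ones.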

Forbidden fruit is most succulent

Prohibiting use of popular standards and products in favour of “open” alternatives can significantly harm end users because, unsurprisingly, the former generally work best. Functionality, quality and interoperability among users must take precedence over ability to switch or mix suppliers. Document formatting standards provide a good example. It is a domain where standards selection has become a most prominent issue. While it is widely and correctly recognized there are problems with interoperability across different formats, e.g., going from ODF to OOXML, it is commonly and incorrectly assumed that all different vendor implementations of a particular document format will fully interoperate and faithfully reproduce identical documents after editing and saving.

Research by Rajiv Shah and Jay P. Kesan shows that what are supposedly the most open document standards do not in themselves ensure the highest, or even satisfactory, levels of interoperability in many cases when documents are transferred, edited and saved among different word-processing programs. On the contrary, compatibility among different vendors’ implementations of the same open document formats can be quite poor. In contrast, the leading proprietary standard has the greatest functionality, and this is best preserved when documents are exchanged, edited and saved only among users of the same word-processing program or of different programs from the same vendor. This research included the three most popular word-processor document formats: ODF, generally regarded as the most open format; OOXML; and .doc, which is seen as the most proprietary or closed. Given that open standards do not ensure interoperability among different vendors, there is no guarantee of the vendor choice and resulting price competition that authorities such as governments expect from procurement policies that insist on what are commonly regarded as the most open standards.
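A crude way to see the kind of round-trip degradation Shah and Kesan measured is to compare a document’s content before and after it has been opened, edited and re-saved by a different program. The sketch below is only illustrative: it assumes the python-docx package and two hypothetical files, an original and a copy re-saved elsewhere, and it compares paragraph text only; the features most at risk (tracked changes, footnotes, embedded graphics) would need far deeper inspection, which is rather the point.

```python
from docx import Document  # pip install python-docx (assumed available)

def paragraph_texts(path):
    """Return the plain paragraph text of a .docx file."""
    return [p.text for p in Document(path).paragraphs]

# Hypothetical files: the original, and a copy re-saved by a different editor.
before = paragraph_texts("original.docx")
after = paragraph_texts("roundtrip.docx")

if before == after:
    print("Paragraph text survived the round trip.")
else:
    # Report the first point of divergence as a rough fidelity check.
    for i, (a, b) in enumerate(zip(before, after)):
        if a != b:
            print(f"Paragraph {i} differs:\n  before: {a!r}\n  after:  {b!r}")
            break
    else:
        print(f"Paragraph count changed: {len(before)} vs {len(after)}")
```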

Interoperability case study

Will anything less than 100% standards compliance and interoperability ever be good enough? Whereas that goal is unachievable (particularly given that most standards must regularly be updated with various changes and must interoperate with other standards serving different purposes, such as presentations and spreadsheets as well as word-processing), it can be highly desirable to get as close as possible to that ideal. Personal experience illustrates how demanding conditions can be, with the risk of embarrassment or worse from seemingly slight incompatibilities or data loss. From time to time I am retained as an expert witness in litigation on temporary case teams with contributions from up to dozens of different firms (e.g., with many case co-defendants) including lawyers, economists and industry clients. Drafting and editing expert reports and other documents involves the “master” being passed around, with changes to text, graphics, footnotes and redline “track changes” implemented by many different people, before being finalised and hurriedly submitted in advance of a fixed deadline. According to Shah and Kesan, as referenced above, these are precisely the types of document features that tend not to be preserved when documents are modified and re-saved in different vendors’ applications, let alone when transferring from one standard format to another. Several years ago, I was satisfied with my checks of a certain finalised word-processing document, but was subsequently horrified to discover that in a chart, created in a presentation program and faithfully reproduced in the word-processing document, the background shading had moved in front of a graph line when the document was converted into .pdf format immediately prior to submission. This obscured the key turning point that was the entire purpose of my chart. At the other extreme, many users may seek only basic functionality. They might, quite reasonably, prefer to trade off functionality and interoperability in order to pay the lowest price possible or obtain the document program for free.

Winner may take all – but not for ever

Standards can arise from a variety of origins and for various reasons. Communications standards including fax for document images, SMS for mobile messaging, SMTP for email, and TCP/IP and HTML on the Internet took hold rapidly and most extensively in their respective domains because users desperately needed widely adopted standards for interoperability that the world then lacked. These characteristics were lacking, for example, in the closed environments with proprietary email systems used internally by corporations. The .doc and other office suite standards remain entrenched because they already provide the highest levels of functionality and interoperability with 95% of users. Usage also includes significant legacies of user-customized templates, including un-standardised macro programming, for particular business purposes such as order entry and monthly financial reporting.

The major fall in fax usage since the advent of interoperable and widely adopted Internet-based email a decade ago (though most of us retain fax capability and still list fax numbers on our business cards), and the decline in SMS in recent years, show that even the most popular and seemingly enduring standards can eventually take a tumble with new technologies and alternative standards.

Mobile communications has flourished due to, not in spite of, extensive competition among standards. Multivendor mobile technology supply has not significantly constrained functionality and interoperability because new mobile standards were developed from inception to achieve these. The U.S. has thrived and now leads the world in network deployment of HSPA and LTE technologies, with the most rapid adoption of the most advanced smartphones. There has been competition among four different 2G technologies, among several 3G technologies including CDMA (including EV-DO) and WCDMA (including HSPA), and most recently between WiMAX and LTE. The latter two are standardized by rivals, the IEEE and the 3rd Generation Partnership Project (3GPP) respectively, while CDMA is standardized by another group called 3GPP2. This in turn has spurred operator competition in the U.S. and also accelerated technology developments worldwide. The 3G successor to GSM’s radio layer, which is called UMTS or WCDMA, has far more in common with CDMA than it does with GSM, and pioneering work in CDMA was of great benefit to WCDMA.

Vodafone’s former CEO, Arun Sarin, issued a call to arms for cellular operators to back LTE against WiMAX at the GSM Association’s 2007 Mobile World Congress. Later that year, Vodafone and its CDMA technology-based partner Verizon Wireless announced they would both pursue LTE as their common next-generation technology. A keynote presentation by Verizon Wireless CTO Dick Lynch at the 2009 Barcelona show announced the LTE vendor line-up and most ambitious launch dates. The acceleration and strength of commitment to LTE, precipitated by the WiMAX challenge, has ensured that WiMAX will be kept to a minor position versus LTE.

Current consolidation of cellular standards development in 3GPP has not eliminated competition, and it does not preclude significant challenges from standards groups such as the IEEE in the future. 3GPP cellular standards have become increasingly strong versus 3GPP2 and IEEE wireless and mobile standards, but there is internal competition within 3GPP, and rival standards bodies will continue to present competitive challenges with innovations in rapidly growing and changing markets. Notwithstanding the rise of LTE, based on OFDMA protocols, CDMA-based technologies such as HSPA are also continuously being improved to closely rival the capabilities of LTE. Some 3GPP contributors have distinct technical and commercial preferences for one standard over the other.

IP financing and just rewards

When selecting standards, customers and end-users in particular want the highest performance, most exhaustive compliance and widest user interoperability. It is no surprise customers and end-users tend to make the same selections. Apple’s popular iPhone, with its App Store, iOS and deep integration of software with silicon, is a very closed and proprietary system; but this provides superlative performance end-to-end. Microsoft is a leading beneficiary in word-processing applications, with its dominant .doc standard and its contribution to OOXML, because its implementations provide the richest functionality and most compliant interoperability among the widest base of users. Whereas there is no consensus on what qualifies, and what does not, as an open standard, the likes of GSM, HSPA, EV-DO, LTE and WiFi are as open as anything on offer in their respective domains. Major contributors, for example Ericsson and Intel respectively, have been principal beneficiaries. Their gains are in upstream licensing income, in downstream product markets or in both. Either way, the returns are for taking significant risks and making investments in developing standards, products and markets full of standards-compliant users.
