Testing for World-Readiness


What is world-readiness testing, and why is it worth reading (and writing) about? The answer to the second question is that we can now answer the first.

Many documents have been written on how to test software for use in the international marketplace, and on how the Software Quality Assurance (QA) process should be organized for software that will work in countries other than the United States. Most of them mention that there is no ideal solution for international software development. All of them provide a solution for their author’s specific perceived “real-world” situation, but none of them give a solution for the ultimate “ideal” case.

So, what is this “ideal” situation? It is a software product that is a single “World-Ready” binary: one that runs on every localized version of a specific OS and presents its data and user interface according to local standards, without any custom modification needing to be performed to its source code before the software is introduced into new foreign markets.

Until now, creating “World-Ready” software has been hardly achievable, because no platform provided enough universal support to constitute a convenient, functional international development platform. And since “World-Readiness” was not part of the basic design, creating guidelines to test these functions was next to impossible. In the past, such QA guidelines for the “ideal” situation were basically only good for academic discussions.

All of this has changed with the introduction of Windows® 2000 and the new international functionality it provides to software developers to create “World-Ready” products. But remember that even if your software is developed for Windows® 2000, it may still not be completely world-ready. Software design and development can easily break world-readiness because:

1. An old, non-world-ready approach is used.

2. Some components must interact with applications built in the pre-Unicode era.

That is why the QA process, especially testing, must be prepared to check for this new functionality.

The rest of this article describes setting up the QA process for “World-Ready” software developed for the Windows® 2000 operating system.

World Readiness and QA

If you deal with software quality assurance for a living, or just plan the internationalization testing of your software, a single world-ready binary brings much-needed relief to your overtaxed resources by making the QA process simpler. And although a single code base, unified data-processing algorithms, and the absence of localization-caused functionality problems are definitely attractive, a world-readiness development goal brings new challenges to testing.

Ensuring world-readiness is different from testing localized program versions, and it is broader than just functionality testing. It also includes realizing the implications of globalization, plus verifying that those implicit requirements are met throughout all the design and development steps.

What should the testing process verify in order to be able to sign off on the world-readiness of a product? Definitions of world-readiness have existed for a long time (mostly as development guidelines), and testing should verify that development has followed them. For the sake of completeness (as far as the author can remember), these guidelines are listed below:

Globalization of the code:

Code should make no assumption about the default language of the system (known as the System Locale). To achieve this, use Unicode for text encoding and call appropriate conversion routines when the application interfaces with non-Unicode systems.

Code should make no assumption about the local conventions of the locale of the system (known as User Locale). To accomplish this, call NLS API functions when locale-sensitive operations like sorting or date/number/time formatting are required.
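The sorting guideline above can be sketched quickly. This Python snippet uses the `locale` module as a stand-in for the Win32 NLS APIs (CompareStringW and friends); the `de_DE.UTF-8` locale name is an assumption about the test machine:

```python
import locale

# Code-point comparison is not a locale-aware sort; it merely orders by
# Unicode value. Python's locale module stands in for the Win32 NLS API.
words = ["Zebra", "Ärger"]

# Naive: 'Ä' (U+00C4) sorts after 'Z' (U+005A), wrong for German readers.
assert sorted(words) == ["Zebra", "Ärger"]

# World-ready: key on the active locale's collation rules instead.
try:
    locale.setlocale(locale.LC_COLLATE, "de_DE.UTF-8")
    assert sorted(words, key=locale.strxfrm) == ["Ärger", "Zebra"]
except locale.Error:
    pass  # German locale not installed; the code-point pitfall still stands
```

The same reasoning applies to date, number, and currency formatting: ask the locale at run time instead of baking one convention into the code.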

Localizability of the application:

The application’s code and the language of its user interface are not related. Whatever is read/written in I/O operations or displayed on screen should be designed such that it can be changed without modification of the code or even re-compilation of the program. The Multi-lingual User Interface (MUI) implemented in Windows® 2000 adds a new twist to this definition. It can now be extended with the ability to display multiple language versions of the UI at the same time in the same application!

With the use of the above World-Ready approach comes the disappearance of the software development concept called “enabling”. Enabling referred to applications that had an English UI combined with the ability to work with some particular language and locale setting. When done correctly, all world-ready applications are “enabled” to handle all the languages that are supported by the underlying platform.

Even though global functionality is available in a single world-ready binary, localized products are still in demand in local markets, and thus localized products still need to be built and tested. The advantage that software developers get from developing a single world-ready binary is that localization becomes much easier and requires fewer resources. This occurs because:

1. Localization teams are less likely to find localizability problems in multiple language versions that may require multiple fixes and branches of source code.

2. The localization process becomes mostly a UI translation. Thus the localized product testing becomes less technical and more of a grammar and spell-checking job.

With any new software development concept, moving to a world-ready binary will require necessary changes in the QA process. One of these changes is that the QA process must ensure that world-readiness is reflected as a core goal in all levels of software design documents. Another change needed in the QA process is the modification of the test procedure. The following section describes this modification.

Testing for World-Readiness

The goal of globalization testing is to detect potential problems in software globalization. It makes sure that the code can handle all international support without breaking functionality in ways that would cause either data loss or display problems. Globalization testing checks proper functionality of the product with any locale setting, using every type of international input possible.

Proper functionality of the product assumes both a stable component that works according to design specification regardless of international environment settings or locales, and the correct representation of locale-sensitive data such as the proper handling of National Language Support (NLS) information and NLS-aware user interface.

The following steps must be part of your globalization-testing plan:

Step 1: Decide the priority of each component

To make globalization testing more effective, all tested components must be assigned a testing priority. Components that should receive top priority:

Are designed to be used on both Windows® 2000 and Windows 95/98/ME, or interface with applications written for Windows 9x or earlier platforms.

Support text data in the ANSI (American National Standards Institute) format or call ANSI APIs. Network components and components with console output are assumed to be in this area.

Extensively handle strings (e.g.: with many edit controls).

Use files for data storage or data exchange (Windows metafiles, Jet Database files, Group Policy Engine, security configuration tools, Web-based tools, etc.).

Have had many globalization problems in the past.

Step 2: Select a test platform

So which OS should you use for your international testing platform?  The first choice is the US build of Windows® 2000 with a language group installed (e.g.: US Windows® 2000 + East Asian language group).  This combination gives you complete international support for the language selected via the System locale without imposing requirements on the testers’ language skills.

Even if you target a broader range of operating systems, Windows® 2000 should be your primary test platform. No other operating system gives you the same flexibility with local settings and native support for the broadest range of languages and locales. Windows 9x for example, does not even give you the ability to change the System locale.

You may also use other platforms that differ from pure US Windows® 2000:

MUI (Multilanguage User Interface) Windows® 2000 – especially useful if your code implements a multilingual UI that has to adjust to the UI settings of the OS. This approach is a more easily implemented alternative to installing multiple localized versions of the OS.

Localized build of the target OS – German or Japanese are good choices. Remember it might be harder to work with them if you do not know the operating system’s UI language. This approach does not have any significant advantages over the solutions above.

Most globalization problems can be found by testing in an environment where East-Asian-languages support is active or the OEM code page differs from the ANSI code page for a given locale. For example, the following System locales can be selected in US Windows® 2000 to test for potential globalization problems:

Japanese

German

A combination of both (one selected for the System locale and another for the User locale) whenever possible to cover multilingual support

You can achieve the most complete coverage if you install all the language groups, rotate the System and User locales and run "globalized" tests as described later in this article.

To perform globalization testing, you need to ensure that multiple language groups are installed and that the System locale is not set to English-US. As mentioned, most globalization issues can be covered by executing test cases in both Japanese and German environments and a combination of both.

Step 3: Create the test environment

Essentially, then, the steps to create a world-ready test environment are:

1. On a US build of Windows® 2000, install Japanese (or any other East Asian) language support. German language support is installed by default on the US build of Windows® 2000.

2. Set the System and User locales on the test machine to something other than English-US (Japanese or German).

3. Set up a distributed network with a mixed environment of US Windows® 2000 systems, with some set to the Japanese system locale and others set to the German system locale.

Testing with Japanese as the system default locale verifies double-byte character set (DBCS) handling in ANSI (non-Unicode) code. Testing with German as the system default locale ensures that ANSI and OEM code pages are handled properly when text conversion is required. Having a distributed mixed network environment verifies that data can be successfully passed between different locales.
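The mixed-network check boils down to an encoding round trip. This Python sketch simulates two machines with different ANSI code pages (cp932 for the Japanese system, cp1252 for the German/Western one) to show why only Unicode on the wire survives every locale pairing:

```python
# Two "machines" in the mixed network; Python's codecs stand in for the
# per-machine ANSI conversions.
payload = "テスト"                              # data written on the Japanese system

# ANSI on the wire: receiver decodes with its own code page -> mojibake.
ansi_wire = payload.encode("cp932")
received = ansi_wire.decode("cp1252", errors="replace")
assert received != payload                      # data did not survive the hop

# Unicode on the wire (UTF-8 here): survives any pairing of locales.
unicode_wire = payload.encode("utf-8")
assert unicode_wire.decode("utf-8") == payload
```

A world-ready application therefore exchanges Unicode between machines, converting to a code page only at well-defined, explicitly tagged boundaries.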

Step 4: Execute tests

After the environment has been set for globalization testing, your regular test cases must be run with special attention paid to potential globalization problems:

Put greater importance on test cases that deal with the input/output of strings, directly or indirectly.

Test data needs to contain mixed characters from East Asian languages, German, complex-script characters (Arabic, Hebrew, Thai), and optionally English. In very few cases, such as NetBIOS names, there are limitations, such as accepting only characters that match the system locale.

It might be hard to manually enter all of these test inputs if you do not know the languages in which you are preparing your test data. A simple Unicode text generator may be very helpful at this step.
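A generator of the kind mentioned above can be sketched in a few lines of Python; the script pools and the function name are illustrative, not from the original article:

```python
import random

# Small illustrative character pools per script (not exhaustive).
POOLS = {
    "japanese": "日本語テスト",
    "german":   "äöüßÄÖÜ",
    "arabic":   "اختبار",
    "hebrew":   "בדיקה",
    "thai":     "ทดสอบ",
    "english":  "abcXYZ",
}

def make_test_string(length=12, seed=None):
    """Build a string guaranteed to mix every script at least once."""
    rng = random.Random(seed)
    chars = [rng.choice(pool) for pool in POOLS.values()]   # one char per script
    everything = "".join(POOLS.values())
    chars += [rng.choice(everything) for _ in range(length - len(chars))]
    rng.shuffle(chars)
    return "".join(chars)

sample = make_test_string(seed=42)
assert len(sample) == 12
assert any(c in POOLS["japanese"] for c in sample)   # every script represented
```

Feeding such strings into every text field, file name, and network message quickly flushes out code paths that silently assume a single code page.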

Step 5: Recognize the problems

The most serious globalization problem is functionality loss, either immediately (when a System locale is changed) or later when accessing input data (non-US character input).

Some functionality problems can be detected as display problems:

Question marks (????) appearing instead of displayed text indicate problems in Unicode-to-ANSI conversion.

Random high-ANSI characters (¼, «, †, ‰, ‡, ¶, etc.) appearing instead of readable text indicate that ANSI code is using the wrong code page.

The appearance of boxes, vertical bars, or tildes (default glyphs) [□, |, ~] indicates that the selected font cannot display some of the characters.
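The first symptom above is easy to reproduce. This Python sketch stands in for a lossy Unicode-to-ANSI conversion (on Win32 this would be WideCharToMultiByte with a default replacement character):

```python
# Pushing Unicode text through a conversion to a code page that cannot
# represent it produces the "????" symptom described above.
text = "日本語"                    # Japanese input a user might type
shown = text.encode("cp1252", errors="replace").decode("cp1252")
assert shown == "???"             # each unrepresentable character became '?'
```

Seeing question marks therefore points at a conversion step, not at a font problem: the characters were already destroyed before they reached the screen.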

It might be hard to find problems in display or print results that require shaping, layout, or script knowledge. This test is language-specific and often cannot be executed without language expertise. On the other hand, your test may be limited to code inspection. If the text output is formed and displayed using standard text-handling mechanisms (like Uniscribe for complex scripts), you may consider this area safe.

Another area of potential problems is code failing to follow local conventions as defined by the current User locale. Make sure that locale-sensitive data (numbers, dates, times, currency, calendars) in your application is displayed according to the current settings in the Regional Options of your computer.

The Regional options panel does not cover all locale-specific functionality. For example, you cannot see the current sort order there. Thus it is important to have a test plan covering all aspects of functionality related to locale before you start your test. You may want to use National Language Support documentation as a starting point for this plan. In it, you should find exactly what locale information can (and should) be retrieved dynamically, and then apply those requirements to your project.

Localizability test

Localizability testing verifies that the user interface of the program being tested can be easily translated to any target language without re-engineering or making code modifications. This implies that localization of the program is required for this test to be complete. However, this test can be done without true localization of the product if the following steps are performed:

Step 1: Run a pseudo-localized version of a program

Pseudo-localization may be the most effective way of finding localizability bugs. Localizability bugs are exposed by translation of the program’s UI, and pseudo-localization gives you a translation without the cost of an actual localization. The most effective way to perform pseudo-localization is to modify the program’s resources automatically, doing essentially what localizers do when they translate a program’s UI:

Replace English text with text containing non-English characters. (It’s advisable to keep the text readable. For example, make your translation algorithm replace English characters with non-English symbols of a similar shape: a -> å, c -> ç, n -> ñ, etc.)

Add extra characters to your resource strings. In many cases translated text is longer than the English original (This string: +++ This string: ---).

Stretch your dialogs. This is usually done by localizers when the string length grows due to localization.

Mark the beginning and the end of each resource string. With those markers, you’ll easily see when text is built at run time – a potential source of localizability bugs.

Make your substitution multilingual Unicode (resource strings are stored as Unicode anyway). This will make it easier to find places where the program uses ANSI functions to process or display text.
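The substitution rules above can be sketched as a minimal pseudo-localizer. This Python version (the names, the tilde padding, and the bracket markers are illustrative choices) applies the look-alike mapping, growth padding, and markers in one pass:

```python
# Look-alike mapping in the a -> å style described above (illustrative subset).
SUBST = str.maketrans({"a": "å", "c": "ç", "e": "é", "n": "ñ", "o": "ö", "u": "ü"})

def pseudo_localize(s, pad=0.3):
    """Substitute look-alike characters, simulate translation growth with
    trailing padding, and wrap the string in begin/end markers."""
    grown = s.translate(SUBST) + "~" * max(1, int(len(s) * pad))
    return "[" + grown + "]"

out = pseudo_localize("Open file")
assert out.startswith("[") and out.endswith("]")   # markers reveal built strings
assert "é" in out and "ñ" in out                   # non-English text forces Unicode paths
assert len(out) > len("Open file")                 # growth stresses dialog layout
```

Running such a transform over all string resources before a test pass turns every hard-coded or concatenated string into a visible anomaly on screen.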

Once you pseudo-localize your program, test its functionality. A pseudo-localized application should function no differently than its original US version.

An area often forgotten in localizability testing is the mirroring test. If you want to distribute your software to markets where the text and user interface of programs display from right to left, you will want to know how your application looks when it is mirrored. This test may be implemented as part of the pseudo-localization of your product. This way your text stays in English and is displayed as usual – only the window layout and text alignment get mirrored.

Note: One way to implement mirroring testing is by adding the WS_EX_LAYOUTRTL style to dialog resources, and by adding a call to SetProcessDefaultLayout(LAYOUT_RTL) before the main window of the application is created.

Step 2: Perform code review

Be sure the code meets the following requirements:

All resources are written in standard Windows resources format and no hard-coded strings are present in the source code.

Pointer arithmetic is not used for string-length calculations, access to string elements, or string manipulations.

Strings are not built at run time by stripping or concatenation.  String substitution is done via language-aware APIs such as FormatMessage.

Resources do not make assumptions about string buffer length.

Icons and bitmaps do not contain text.

No assumptions exist on drive and folder names or registry keys.
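The run-time string-substitution point in the checklist above can be illustrated with named placeholders, which play the same role as FormatMessage’s numbered inserts (%1, %2) on Win32; the template strings here are hypothetical examples:

```python
count, folder = 3, "Reports"

# Fragile: concatenation hard-codes the English word order in the code itself,
# so a translator cannot reorder the sentence without a code change.
english_only = str(count) + " files copied to " + folder
assert english_only == "3 files copied to Reports"

# Localizable: the whole sentence is one resource string with named
# placeholders; the German translation reorders them freely.
template_en = "{count} files copied to {folder}"
template_de = "{count} Dateien nach {folder} kopiert"
assert template_en.format(count=count, folder=folder) == "3 files copied to Reports"
assert template_de.format(count=count, folder=folder) == "3 Dateien nach Reports kopiert"
```

Code review should flag any string assembled with `+` or stripping, since those constructions survive pseudo-localization only by accident.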

Step 3: Perform UI and documentation review

Make sure the terminology used in the UI and support documentation is clear, consistent and unambiguous.  Localizers find it hard to do their jobs when the UI and the documentation refer to the same features, but use different words, or when the text is overloaded with technical slang.

Localization testing

As mentioned before, localization testing is not strictly part of world-ready software development. Localization translates the product UI and sometimes changes some initial settings to make it better suited for a particular local market. This definitely reduces the “world-readiness” of the application. On the other hand, localization is part of internationalization – so it will be covered here.

Localization testing checks the quality of translation of the build into a particular target language. This test is based on the results of globalization testing where the functional support for that particular locale is verified. Localization testing can be executed only on the localized version of a product – no pseudo-localized products are tested for localization quality!

The test effort during localization testing focuses on:

Areas affected by localization such as UI and content.

Culture-specific, language-specific, and country-specific areas.

In addition, localization testing should include:

Basic functionality tests.

Setup and upgrade tests that are run in the localized environment.

Application and hardware compatibility tests that are planned according to the product’s target market.

Any language version of Windows® 2000 can be selected as a platform for the test. Just make sure that the target language support is installed.

The localization testing of the user interface and linguistics should cover items such as:

Validation of all application resources.

Verification of linguistic accuracy and resource attributes.

Typographical errors.

Consistency checking of printed documentation, online help, messages, interface resources, command-key sequences, and so on.

Confirmation of adherence to system, input and display environment standards.

User interface usability.

Assessment of cultural appropriateness.

Checking for politically sensitive content.

When shipping a localized product, make sure the documentation (manuals, on-line help, context help, etc.) you provide is also localized accordingly. Items to check for:

Verify the quality of the translation.

Verify the completeness of the translation.

Check that the terminology is used consistently in all documents and application UI. If you have shipped localized versions of your product before, make sure that the translation is consistent with the earlier released versions.

Test tools and world-ready software

Test tools have great importance in today’s testing. They cannot completely substitute for manual testing, but many test areas gain tremendous benefits from the introduction of test automation. It is only natural to believe that it is good practice to use automated test tools on localized products, or to use them to test the degree of a product’s globalization.

This task, however, may turn out to be much more complex than just running your old trusty test tools on your new product in a globalized environment. Test tools that check the functionality of an application’s UI may be broken by the translation of the tested application. Even when not affected by translation, test results may be incorrect if the tool verifies, say, dates and assumes that the date format is fixed – which it is not in a globalized application. The inability to use international test data can also make a test tool unusable.
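The fixed-date-format trap can be sketched as follows. Python’s `locale` module stands in for the Win32 NLS calls a real tool would use; note that `nl_langinfo` is only available on Unix-like systems, so this is a portability-hedged illustration:

```python
import datetime
import locale

# A test oracle that hard-codes "MM/DD/YYYY" breaks the moment the User
# locale changes. A globalized tool asks the active locale for its own
# short-date format at run time and builds the expected string from that.
locale.setlocale(locale.LC_TIME, "C")      # portable baseline for this sketch

d = datetime.date(2024, 5, 22)
fragile_oracle = "05/22/2024"              # frozen assumption about the format
fmt = locale.nl_langinfo(locale.D_FMT)     # the locale's own date pattern
robust_oracle = d.strftime(fmt)            # tracks whatever locale is active

assert "22" in robust_oracle               # the day appears in any locale
```

The same principle applies to number formats, sort orders, and calendar systems: the tool’s expected values must be computed from the environment under test, never baked in.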

All these facts mean that the development of test tools must follow the same rules as the development of globalized software. Test tools must be globalized (adjust dynamically to locale settings and be able to process international text data) and localizable (allow for easy translation of UI elements). The rules for writing global test tools do not strictly belong to the scope of this article and will be omitted here. These rules are common for any software – test, office, or development tools – look for them in other articles on http://www.microsoft.com/globaldev

Conclusion

This white paper has explained that world-readiness testing is now feasible because Windows® 2000 has eased the work of creating world-ready programs. It defined world-ready testing as the development of a test plan that checks an application’s:

Globalization

Localizability

Localization for each targeted locale.

It also gave time-tested guidelines for each of these areas, explained how to modify your testing tools, and showed how Windows® 2000 is an excellent test environment for world-readiness testing.

In closing, as you develop your world-readiness testing plans, please keep these concepts in mind. Also, if you have any questions, submit them to Dr. International.

About the Author

Rostislav Shabalin is a Software Engineer in the Windows International Division in charge of globalization-and-localizability testing evangelism.

Note: This article comes from: www.microsoft.com
