Silvio Scozzari, SDL

LOC TALK: Why Linguistic Testing is Critical for Global Success

In this third installment of our Loc Talk blog series, SDL Localization Manager Silvio Scozzari talks with linguistic expert and long-time SDL employee Jennifer West about localization testing. Jennifer leads the SDL Test Lab, which tests the functionality and linguistic standards of customer products before they are launched to global audiences.
Silvio: Thank you for taking the time to talk to me today, Jennifer, about SDL’s testing services. Could you start by telling our readers about your team and SDL’s test lab facilities?

Jennifer: Yes, absolutely, it’s a pleasure to speak with you again, Silvio. I’m based at SDL’s Colorado Test Lab, which is the original test lab, formed 20 years ago. Since then it has expanded in people, in scope and in the products that we test. Our lab is one of four worldwide: we have one in Bangkok, one in India for functional testing and Indic languages, and one that will be based in Madrid. Our labs work very closely together and may even share projects.

All of our Colorado test leads are bilingual, which is helpful because there’s an immediate understanding of linguistic or potential linguistic issues. Within my team we have native speakers of Italian, Nepalese, Spanish and German, and I speak French, so we have quite a variety. All our testers must be native speakers, not merely speakers of another language. They lived and were educated in their home countries before moving to the US. Testers go through an interview process and are tested on their knowledge of their language. They also have to be able to type in their language, not just speak it, because they have to report defects. We check that testers keep up to date with their language. As you know, languages and terminology evolve, so it’s important for them to maintain their languages.


Silvio: On a high level, what kinds of projects come through your door? What could a client typically ask the SDL Test Lab to test for them?

Jennifer: It’s quite vast. We cover anything from apps to web pages to consumer products, such as small devices that you may have around the house or carry with you. We have tested games, software, financial applications and insurance websites - places where people enter data. Aside from a large American multinational technology company that specializes in internet-related services, almost half of our testing is done for the medical field and medical devices.

As long as they’re not unwieldy, our clients send us their devices so we can perform physical tests on them. The advantage is that, in addition to linguistic testing, we can make sure a device functions correctly once it’s been translated. For example, we have come across situations where the print button no longer launched a print job once the interface had been translated.

We also have automotive testing - we actually had a car in our parking lot recently where we performed voice control tests. We test user experience as well as the accuracy, context and the consistency of the language and the functionality.


Silvio: So, the client actually brought a car to your facility. Were you testing the infotainment system?

Jennifer: It’s infotainment, but it’s also whether the car can open the boot (or trunk, if you’re a US reader. There’s my localization!), and whether it will do so in all 14 languages. So we will have 14 different testers with the proper accents making sure the car does exactly what it’s supposed to do.


Silvio: That’s really interesting! As you and I know, localization can be, at times, an afterthought. With linguistic testing typically coming at the end of the localization process, once the UI has been translated, some customers may think, “Why bother testing at all? It’s just going to increase the cost of my localization project, or potentially delay my launch if problems or defects are found, because then they need to be fixed and regression-tested.” So, is linguistic testing really necessary? What exactly are you checking for when you’re testing?

Jennifer: Is it necessary? Well I’d like to paraphrase, “if you think education is expensive, try ignorance.” And if you think testing is expensive, try launching a product that has defects and consider the impact that’ll have on your customers and your brand credibility, not to mention the confusion and potential safety hazard that could be caused as a result. 

Imagine on a medical device, a word is translated one way on one screen, but the same word is translated differently on the next screen. That could be very confusing for the nurse, the doctor or the person using the device. We look for consistency and context. Is it the right word in that context? A simple example: We had a device that used the word “waste” a lot, but it’s sometimes a noun, sometimes a verb, and the translator would not necessarily know at what point it had to be a noun or a verb. These are the kinds of things we check during testing.

We also check for internationalization issues. Imagine phone numbers and address formats in different countries, as well as decimals, dates and paper sizes for a report. The paper size may be different in your country or your clients’ countries. So when they print out the report, is everything going to fit properly? Are you going to see the header and the footer? We check for truncations, meaning whether something is cut off. Some user products have very small screens and we often get text expansion with localization. So, in English you have 30 characters but once you translate it, you’re up to 45 or 50 characters and you may have issues. Sometimes we find code bleeding, so in the UI we see some of the software code and not the actual translation. 
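Text expansion is easy to reason about with a quick check. The sketch below is purely illustrative — the strings and the 30-character field width are made-up values, not from any client project — but it shows how a message that fits comfortably in English can overflow a fixed-width field once translated:

```python
# Hypothetical UI strings: German and French versions of a short English
# message often run 30-50% longer than the source text.
FIELD_WIDTH = 30  # assumed character budget for a small device screen

strings = {
    "en": "Settings saved successfully",                      # 27 chars - fits
    "de": "Einstellungen wurden erfolgreich gespeichert",     # overflows
    "fr": "Les paramètres ont été enregistrés avec succès",   # overflows
}

def find_truncations(strings, width):
    """Return the locales whose text would be cut off at `width` characters."""
    return [locale for locale, text in strings.items() if len(text) > width]

print(find_truncations(strings, FIELD_WIDTH))  # ['de', 'fr']
```

A check like this can flag likely truncations in bulk, but it only approximates the problem — proportional fonts and small screens mean a tester still has to look at the rendered UI.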

We also find hard-coded text — perhaps the developer didn’t realize that a particular string was going to be translated. Often in pop-up or error messages the title is hard coded but needs to be translated. We would find that. We also see concatenations, where the developer splits a sentence into fragments across multiple strings in the code. The sequence works in English, but the minute you translate it and the word order changes, it doesn’t work.
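The concatenation problem is easy to see in code. This is a hypothetical sketch, not a client example: the first pattern hard-wires English word order into the program, while the second gives translators a complete sentence with a placeholder they can position freely:

```python
# Fragile: the sentence is assembled from fragments, so the word order is
# fixed in code and translators only ever see the pieces in isolation.
count = 3
english_only = "You have " + str(count) + " new messages"

# Localization-friendly: one complete string per locale with a named
# placeholder, so each translation controls its own word order.
templates = {
    "en": "You have {count} new messages",
    "de": "Sie haben {count} neue Nachrichten",
}
print(templates["de"].format(count=count))  # Sie haben 3 neue Nachrichten
```

Real localization frameworks (gettext, ICU MessageFormat and the like) build on the same principle: translators must receive whole sentences with placeholders, never fragments.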


Silvio: It sounds like testing is an important last line of defense before releasing a product into the global marketplace. It sometimes astounds me how much investment organizations put into their English products, only to seem to forget about the localization — and that’s where problems like the issues you’ve mentioned arise. When we hear the word testing, what do functional, internationalization and linguistic testing each look for? How would you summarize them for our readers?

Jennifer: Functional testing ensures that the software works the way it should and there are no unexpected issues or blocking points. When the client runs the software, imagine they click on a link and it goes nowhere — those sorts of things. We can check this without native speakers, so the cost can be lower because we don’t need specialists in each language. 

We like to do internationalization testing before translation because we can proactively point out potential issues, such as phone number, date and address format. Unfortunately, people don’t plan that far ahead sometimes, or maybe it’s a company that was not originally planning on localization, but sees an opportunity to expand. 

Then, of course, there’s the linguistic testing, with the sorts of things that I mentioned earlier. We make sure that the product responds correctly, and we may also test voice-over, where we check that the subtitles and the audio correspond.


Silvio: Going back to internationalization, it’s interesting that the test lab offers that sort of consultancy service, which some organizations, particularly smaller ones that are not as mature in localization, would really find useful. What percentage of the work you do is linguistic testing? Do you find that most clients tend to carry out the functional testing themselves and ask SDL for support with only their linguistic testing?

Jennifer: Correct, they often do the functional testing in the source language themselves, but as we’re running through the software and using it like an end user, that’s where we see things that they may have not seen. We had an excellent example on a financial product that we were testing. The end user would enter data and the database had an American format date. So you can imagine sitting in France, or anywhere in Europe, and you enter a date when you want something to be done. You enter the date in European format and the database records it in American format. That can cause all sorts of issues and very unhappy customers.
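The hazard Jennifer describes comes down to ambiguous date strings. A minimal sketch using Python’s standard datetime module (not the client’s actual stack) shows how the same input yields two different dates depending on the assumed convention:

```python
from datetime import datetime

# "03/04/2021" is March 4 under the US convention (MM/DD/YYYY) but
# April 3 under the convention used in most of Europe (DD/MM/YYYY).
raw = "03/04/2021"

us_reading = datetime.strptime(raw, "%m/%d/%Y")  # March 4, 2021
eu_reading = datetime.strptime(raw, "%d/%m/%Y")  # April 3, 2021

# If the user enters the date one way and the database assumes the other,
# the stored value is silently off by weeks - exactly the defect above.
print(us_reading.date(), eu_reading.date())  # 2021-03-04 2021-04-03
```

The usual fix is to store dates in an unambiguous form (such as ISO 8601) and apply locale-specific formats only at the UI layer — precisely the kind of issue internationalization testing tries to surface before launch.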


Silvio: Indeed! I’m sure it can be surprising for a first-time client when they see the defects that you find in their software. It is probably quite alarming for them, and goes back to my earlier point that your service really does provide that vital final line of defense before a product is released.

Jennifer: Absolutely. We had a case recently where we told the client right at the start that testing was going to take a lot more time. We were pointing out all the issues in the software that were going to create a lot of defects, but they did ask us to continue and highlight them so they could take corrective action.


[Image: SDL's CEO visits the SDL Test Lab]

Silvio: I can imagine that you test lots of different applications and platforms. Obviously, you can’t be an expert on every piece of technology that is provided to you for testing. During the initial test environment set-up and the actual testing itself, how do you ensure that the linguistic testers navigate and cover all of the software, web pages or devices that need to be tested? How do you support the testers through this process?

Jennifer: There is a certain amount of familiarization that has to happen with the test lead, who will be running the project. In some cases, the test lead will visit the client and sit with the developers to learn about their product. When we started on a large Life Sciences project that ran in different iterations for about four years, I was the French tester and I spent three weeks with the client to learn about the product. 

Next we create a test plan: What are we going to test? How do we go about it? Where do we need to go within the product to test? Then we discuss this with the client to ensure that we have covered everything. The test plan is the scope of the testing and the different activities required. It’s high level, but we go through what we need to test, what the client wants us to do and where we need to go in each part of the software. Testing can be a small part of the product or the entire product. 

We create the test plan or the client may send it. For example, the client may have used it when they were doing their functional testing and we might be able to adapt it for linguistic testing. Once everybody agrees on the plan, we create detailed test cases – go to this page, click on this, you will see this, go there. We make sure that everything is detailed so the tester knows exactly where to go, what to do and what to expect because sometimes things happen that were unexpected.


Silvio: I’m sure that when you’re testing and these linguistic testers are in the test lab, the testers are talking to each other. For example, “Have you seen this?” or “I’ve come across this”, which is a significant advantage of having testers in one location on any given project, right?

Jennifer: Absolutely. Occasionally, if only 15 minutes are needed for the testing, we may test using screenshots. We’re not going to ask them to drive an hour for 15 minutes of work. On a project that lasts for a while, we ask the testers to come into the test lab since there’s a big, big advantage of having them all together on-site. When they are sitting next to each other, they do communicate and even help each other out. You know, “I’m not getting the… What am I missing?” sort of things, and of course, the test lead is right there to support them as and when required.


Silvio: You’ve mentioned the role of the Test Lead. What actually is their role during a testing project?

Jennifer: The Test Lead makes sure that everything runs smoothly. They answer questions and can demonstrate how to do something, so they’re very active in the actual testing. Then there are all the post-testing reports that we send to the client. Project management is also highly involved, so they are aware of everything.

We send reports on a daily basis – what happened today, how far we got. We track the hours that we’re using. How far are we in the testing with the hours that we’ve used? If we see we’ve already used 80% of the time and we’re only 20% through the test cases, we have a problem that we need to understand very quickly. We wouldn’t wait until we were 80% through to raise a red flag. Maybe we’re having serious issues with connectivity, because sometimes we connect to the client’s location and their software.


Silvio: That’s an interesting point because there’s the testing part itself, but then there’s the post-testing work. I would imagine the days of logging defects in a spreadsheet are long gone and you’re using more sophisticated software to manage testing defects. You mentioned that you generate reports. Can you explain how you manage and track defect logging, defect fixing and regression testing?

Jennifer: We use a software application called Jira, which a lot of our clients also use, so they’re familiar with it. We can set up a project in SDL’s own Jira instance for any client who would like to use it. We give them access to that project and they only see their project; they don’t see what else we may be working on. In Jira we describe the issue. If it’s a typical linguistic defect, we enter the English source, the incorrect text that we saw and the recommended change, with a screenshot to highlight where the issue is.

That will typically go to our translators if we did the translation, but sometimes we test things that were translated by another company. The translators review defects at the end of our day, since they’re usually in Europe or South America, depending on the language. They work on the issues overnight and we get their comments back the next day. They can agree with the recommendation and make the modification in the language files, they may recommend a different change, or they may disagree and keep it the way it is for a specific reason — in which case we like to know why.


Silvio: So there is communication between the tester, who is not involved in the initial translation, and the translator who originally translated the software interface?

Jennifer: Yes, and all of this happens in Jira. There is a history of all the communication and all the actions taken, so we can audit everything. If we test for the same client again later, we can see if we’ve already logged an issue that somehow didn’t get corrected — a functional issue, for example — so we have all this information per project, per client.


Silvio: That’s fantastic. So you’re able to generate dashboards with how many defects you’ve found, how many defects have been fixed and provide the client a snapshot of their project’s status?

Jennifer: Absolutely. We have a dashboard by language and error type, with pie charts and tables. Any information that the client feels is useful, we can provide directly in Jira. They can log in and see live data as we work — new bugs being logged and existing ones being updated. Then, if they want us to, we can export all this information and send it to them in Excel, Word or whatever format they need.

Since a lot of clients have their own Jira, we can synchronize the data from our Jira to their Jira, or vice versa. This is a really nice feature that we’ve been using recently. Granting us access to their Jira can be very time-consuming since they have to go through their security and IT. We can simplify that by using our Jira and managing the access for the translators, the testers and the test leads. Then, at the end of the day, we can synchronize our defects into their Jira. The client makes comments on any of those defects and only the ones where they made changes are synchronized back to us. 


Silvio: I wanted to touch on security. No doubt some customers are very security-conscious, particularly those who release their code to you for testing or entrust you with prototypes before they reach the market. What reassurances do you provide to those security-conscious customers who need you to meet stringent security requirements?

Jennifer: We have all the necessary certifications, but the physical security is also really important to our clients. We are behind locked doors or badge-controlled doors, so only the people who have a reason to be in the test lab can come in. Other Colorado employees are not permitted in the test lab and can’t see in. This is what we call level one, which is our first main level as you come into the test lab. 

Inside the lab we have separate rooms, each with keypad-entry doors. This is where we test any prototype or pre-release software because we don’t want people seeing what’s on the screens. All rooms have cameras that do not point to the computers but point to the entrances and the exits, so we can see who’s coming and going. Some of the doors are exit only and cannot be used as entrances. We can also double-check badge readers to confirm the actual physical presence of people. We have secure cabinets and a secure room to store the devices when they’re not being used. Customers can also ask to see our logs. Everybody who works in the test lab is under an NDA, of course, and all new testers go through two days of orientation where we talk about security.


Silvio: Has what you’re testing evolved over the years? For example, 10 or 15 years ago it was more traditional software applications, and now you’ve mentioned that an automotive customer provided a vehicle so you could test its infotainment system. Are you seeing continued evolution in what you’re asked to test?

Jennifer: Absolutely. And the range and sheer number of devices that we test has grown.

We had a project where we were testing voice recognition on a small consumer product, with testers indoors, outdoors and with background noise. We even had to swear at it, because that’s what an end user would do — we weren’t doing things like that 10 years ago. Testing used to mean sitting behind a desk looking at software; now it’s more fun. We once tested a drone, so we had to get some experience flying it.


Silvio: We’ve worked together for a number of years now on localization testing for SDL software solutions and I’ve certainly appreciated the support and collaboration from you and your team to ensure that SDL is practicing what we preach from a testing perspective. In closing and for those readers who are new to SDL and may be interested in our testing services, what would you say are the key differentiators provided by the SDL Test Lab?

Jennifer: The test lab in Colorado has been around for 20 years. In that time, we have seen all sorts of devices, products and platforms. We have a lot of experience doing testing in Europe for some consumer products. We have used European internet service providers with devices that were made in the US to make sure everything works correctly once it’s shipped overseas. 

We can also spoof from our test lab in some cases, so we can pretend we’re in Switzerland or France and perform the testing. This is something that was actually developed in the test lab. Some people have been trying to find out how we do that, but we are very cautious about sharing the knowledge and the experience that we’ve gained. And, for those who use SDL for translations, we can work hand-in-hand with our project management and in-house translators.

It’s all part of raising awareness of our capabilities further upstream — when we’re just starting to talk to a client, we tell them what we can do. They may not see an immediate need, but at least they are aware of our services and may choose to talk to us when they’re ready.


Silvio: This has been an enlightening and fascinating interview, so thank you very much for your time, Jennifer.

Jennifer: My pleasure. And if anybody has any questions or wants to know more about testing, they can contact me. If you’re ever in Colorado, we would love to see you and talk about testing!

Want to learn more about SDL’s testing services? More information is available on our web page or Contact Us.