OpenAI provider for Trados
By Trados AppStore Team
Free
Description
The OpenAI provider for Trados integrates OpenAI's advanced language models with Trados Studio, the leading CAT tool, enhancing your translations with smart automation and precision. Compatible with both OpenAI and Azure OpenAI, it allows you to:
- Boost Efficiency: Automate batch tasks such as analysis and pre-translation, saving time and boosting productivity.
- Enhance Translation Quality: Customize prompts and incorporate terminology for more accurate translations, with alternative translations supported in Trados Copilot - AI Assistant
- Simplify Processes: Install and access OpenAI directly within Trados Studio, streamlining your setup and usage.
- Flexible Customization: Easily manage connections and prompts, tailoring the AI to your specific needs.
- Improve Accuracy: Leverage terminology-aware translation suggestions for more contextually relevant outputs.
Upgrade your translation capabilities with Trados Studio and OpenAI, and experience the future of intelligent, efficient, and high-quality translations.
Technical details
1.1.6.1 - Trados Studio 2024
- Include Fuzzy Matches from Terminology Results: Previously, only 100% terminology matches were included in the prompts communicated to the AI technology. With this change, the feature now retrieves the minimum match threshold from the project settings and uses it to include fuzzy terminology matches in the prompt. This means that terms matching at or above the minimum specified value will be considered, allowing for greater flexibility and accuracy in terminology usage.
1.1.5.0 - Trados Studio 2024
- Improved language mapping: When mapping against Trados Studio CultureCodes, the application now infers the Multiterm language from `CultureInfo.TwoLetterISOLanguageName`.
1.1.4.0 - Trados Studio 2024
- Improved language mapping when associating a termbase with the project; resolved a bug reported in CRQ-39127
1.1.3.3 - Trados Studio 2024
- Resolved a bug that resulted in an exception while attempting to load the settings dialog when no document is open in the Editor
- Resolved a bug when attempting to run Geometry.Parse on a value in a format that doesn't align with the regional settings of the computer.
1.1.2.0 - Trados Studio 2024
- Support for Trados Studio 2024
- Updated Tell Me
- Added the new gpt-4o model
- Rebranding
- Azure OpenAI endpoint support
- Implemented terminology awareness
- Prevented translations from being assigned the 'Translated' status after being applied
- Removed predefined shortcuts
- Enabled the "Assign Match value" option
- Introduced safer threading, returning the result itself rather than a pointer to it
- Blocked the asynchronous method properly while still managing exceptions effectively
- Implemented timeout handling to ensure timeout exceptions are handled gracefully
- Implemented retry logic for transient failures to improve resilience
- Included support for specific languages with the Pre-translate batch task
- Resolved a bug in saving language key indexes during pre-translate from specialized language settings
- Resolved a bug in removing prompts from the provider settings
Overview
Unlock the power of Large Language Models (LLMs) with OpenAI and Trados Studio. Enhance your productivity through seamless integration with the OpenAI provider for Trados. This powerful combination automates complex tasks, boosts efficiency, and delivers precise, contextually aware translations.
Supported Providers:
- OpenAI
- Azure OpenAI
Supported Ways of Working with OpenAI Provider for Trados
For initial translation and revisions, you can use OpenAI in these supported workflows:
- Interactively: On a per-segment basis
- Batch Automation: Automate translation tasks for increased efficiency
Getting Started
1. Minimum Requirements
- Trados Studio 2024
- Subscription with OpenAI
2. Installation
Install the OpenAI provider for Trados directly within Trados Studio via the Integrated AppStore or from the RWS AppStore.
3. Access
Once installed, OpenAI can be accessed in various ways:
- Trados Studio > Tell Me
- Default: `File` > `Options` > `Language Pairs` > `All Language Pairs` > `Translation Memories and Automated Translation` > "Use" dropdown list
- Selected Project: `Project Settings` > `Language Pairs` > `All Language Pairs` > `Translation Memories and Automated Translation` > "Use" dropdown list
Configuration
There are four areas to set up as part of the configuration.
1. Connections
Authenticate your connection based on your subscription. Configuration fields include (the request sketch at the end of this section shows how the key fields map onto an API call):
- Name: User-defined for easy reference
- Provider: Choose from OpenAI or Azure
- Endpoint:
OpenAI default: `https://api.openai.com/v1/chat/completions`
Azure: Uses a private IP address for a secure connection, based on your custom configuration
- API Key:
OpenAI: Generate or obtain from your OpenAI profile
Azure: Get support from your Azure Administrator
- Model: Availability depends on your subscription
- Model Type:
Chat Completions: Generates responses in a conversational context
Completions: Generates text based on a given prompt
- Temperature: Controls output randomness
Lower temperature: More predictable and repetitive output
Higher temperature: More creative and unexpected output
Authentication Validation: Test your configuration using the Test Connection feature.
Managing Connections: Each successful connection is listed for future reference and can be edited or deleted.
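As a rough illustration only, the sketch below shows how the Endpoint, API Key, Model and Temperature fields correspond to a standard OpenAI Chat Completions request. The add-in makes these calls for you internally; the Python snippet, the example values and the sample prompt are assumptions for illustration, not the add-in's actual code.

```python
# Minimal sketch (not the add-in's internal code) of how the Connection fields
# map onto a Chat Completions request.
import requests

ENDPOINT = "https://api.openai.com/v1/chat/completions"  # OpenAI default endpoint
API_KEY = "sk-..."          # generate this in your OpenAI profile
MODEL = "gpt-4o"            # availability depends on your subscription
TEMPERATURE = 0.2           # lower = more predictable, higher = more creative

response = requests.post(
    ENDPOINT,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": MODEL,
        "temperature": TEMPERATURE,
        "messages": [
            # Sample instruction and segment; the add-in builds the real prompt from your settings.
            {"role": "system", "content": "Translate the following text from English to German."},
            {"role": "user", "content": "The quick brown fox jumps over the lazy dog."},
        ],
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

Broadly speaking, the Test Connection feature succeeds when a request of this kind returns a valid response for the chosen model.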
2. Prompts
Out-of-the-box predefined prompts include:
- Default
- Translation
- Formal Translation
- Multiple Translation
- Reduced Length
- No 3rd Person Singular Pronouns
- Refine Translations
Managing Prompts: You can manage prompts by editing or deleting them. Any updates will be listed within Trados Copilot - AI Assistant.
To add prompts, specify the following (see the sketch after this list):
- Prompt Name: User-defined for easy reference
- Prompt Text: An instruction given to generate a specific response
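To illustrate how a Prompt Name and Prompt Text work together, here is a hypothetical sketch of a custom prompt being combined with a source segment before it is sent to the model. The message structure and placeholder names are assumptions for illustration; the add-in composes the actual request for you.

```python
# Hypothetical illustration of "Prompt Name" and "Prompt Text" in use.
prompt_name = "Formal Translation"
prompt_text = (
    "Translate the following segment from {source_language} to {target_language}. "
    "Use a formal register and keep all placeholders unchanged."
)

def build_messages(source_segment: str, source_language: str, target_language: str) -> list[dict]:
    """Combine the prompt text with one segment into a chat message list."""
    instruction = prompt_text.format(
        source_language=source_language, target_language=target_language
    )
    return [
        {"role": "system", "content": instruction},
        {"role": "user", "content": source_segment},
    ]

messages = build_messages("Please sign the attached form.", "English", "German")
```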
3. Search Options
You can enable various features that impact behavior and output:
- Ignore translated segments: Only generate output for segments that are Not Translated or Draft
- Ignore locked segments: Skip segments with a locked status
- Enable Terminology-aware Translation Suggestions: Incorporate terminology context for more accurate translations (see the sketch after this list)
- Include target segment with the translation request: Use existing translations as context
- Include tags in the Translation Suggestions: Preserve tag markup from source and target content
- Use In-Processing Caching: Select this option to reuse translations already obtained from the AI provider. The In-process cache will be reset when you start Trados Studio.
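The sketch below illustrates the idea behind the terminology-aware option (see also the 1.1.6.1 change note): recognised terms at or above the project's minimum match threshold are added to the prompt as constraints. The class and function names are hypothetical; they restate the documented behaviour, not the add-in's implementation.

```python
# Assumption-based sketch of terminology-aware prompting.
from dataclasses import dataclass

@dataclass
class TermMatch:
    source_term: str
    target_term: str
    match_value: int  # percentage reported by the terminology provider

def terminology_instruction(matches: list[TermMatch], minimum_match_value: int) -> str:
    """Build an extra instruction listing terms at or above the minimum match value."""
    usable = [m for m in matches if m.match_value >= minimum_match_value]
    if not usable:
        return ""
    pairs = "; ".join(f'"{m.source_term}" -> "{m.target_term}"' for m in usable)
    return f"Use the following terminology where applicable: {pairs}."

matches = [TermMatch("invoice", "Rechnung", 100), TermMatch("purchase order", "Bestellung", 80)]
print(terminology_instruction(matches, minimum_match_value=75))
```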
4. Search Results
This only applies to the Batch Pre-translate task. You can leverage OpenAI's translations either as part of your initial translation or as part of your revision phase. During revision, you may want to automate a batch replace to update some of your existing translations, using refined prompts for enhanced quality and accuracy.
In summary, you can either:
- Supplement TM: OpenAI can complete translations that were not translated using your Translation Memory (TM), working alongside your TM for a fully translated file.
- Prioritize OpenAI: OpenAI can replace existing translations, even those from your TM that you may have validated previously, with its own output for improved accuracy.
Batch Pre-translation is managed by:
- Status setting for "Ignore translated segments"
- Values set for "Assign Match value" compared against the "Minimum Match Value" set within Batch Pre-translate
Examples:
Example 1: Batch Pre-translate Task with Minimum Match value set to 75%, OpenAI Assigned Match Value to 100%, and "Ignore translated segments" enabled.
Result: Confirmed segments pre-translated from TM stay unchanged. Remaining segments in draft/unconfirmed state get overwritten with OpenAI translations.
Example 2: Batch Pre-translate Task with Minimum Match value set to 75%, OpenAI Assigned Match Value to 100%, and "Ignore translated segments" disabled.
Result: All segments are overwritten with OpenAI translations because "Ignore translated segments" was disabled.
Example 3: Batch Pre-translate Task with Minimum Match value set to 75%, OpenAI Assigned Match Value to 99%, and "Ignore translated segments" enabled. The SDLXLIFF also contains unconfirmed segments with a 100% TM match.
Result: Confirmed segments pre-translated from TM stay unchanged. Segments in a draft/unconfirmed state with a match value of 99% or less are overwritten with OpenAI translations. Unconfirmed segments with a 100% TM match are not overwritten, because the 100% TM match is higher than the OpenAI Assigned Match Value of 99%.
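The decision rules in the three examples above can be summarised in a short sketch. This is an assumption-based restatement of the documented behaviour, not the add-in's actual implementation, and the parameter names are illustrative.

```python
# Sketch of the overwrite decision described in Examples 1-3.
def should_apply_openai(
    segment_confirmed: bool,
    existing_match_value: int,   # match value of the current translation, 0 if empty
    assigned_match_value: int,   # "Assign Match value" configured for the OpenAI provider
    minimum_match_value: int,    # "Minimum Match Value" of the Batch Pre-translate task
    ignore_translated: bool,     # "Ignore translated segments" search option
) -> bool:
    """Return True if the OpenAI translation should overwrite the segment."""
    if ignore_translated and segment_confirmed:
        return False  # Examples 1 and 3: confirmed segments stay unchanged
    if assigned_match_value < minimum_match_value:
        return False  # the OpenAI result would fall below the task's threshold
    # Only overwrite when the assigned value matches or beats the existing match value
    return assigned_match_value >= existing_match_value

# Example 3: unconfirmed segment with a 100% TM match, OpenAI assigned 99% -> kept from TM
print(should_apply_openai(False, 100, 99, 75, True))    # False
# Example 3: unconfirmed segment with a 90% fuzzy match -> overwritten by OpenAI
print(should_apply_openai(False, 90, 99, 75, True))     # True
```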
Working with OpenAI in Trados Studio
You can engage with OpenAI in Trados Studio in the following ways:
1. Translation Provider
Use OpenAI to retrieve and apply translations through the Editor's Translation Results Window. Automate tasks like Analysis and Pre-translation to boost productivity.
Batch Pre-translate Workflow Support:
- Incremental
First, review translations obtained from initial Batch Pre-Translate tasks.
Then, run a second batch task to apply OpenAI for refined translations.
- New Batch Task: Pre-translate Files with OpenAI for Trados
To use some OpenAI features effectively, translations must be applied to SDLXLIFF files. Therefore, to support a fully automated approach in which OpenAI is leveraged even when other providers are listed at the same time, we recommend creating a custom batch task:
- Initially use Pre-Translate to leverage translations from the other listed providers.
- Within the same process, add Pre-translate Files with OpenAI to refine these translations based on your preferred configuration.
2. Trados Copilot - AI Assistant
AI Assistant is a great companion when it comes to working with your LLM interactively.
It offers extended features such as:
- Alternative custom suggestions
- Specific prompts and settings
- Highlighting translated terms
- Visual tracking of changes
How much will this cost me?
To estimate token consumption with OpenAI's API integration in Trados Studio, you can break the testing and estimation process down as follows:
1. Understanding Token Consumption for Full Document vs. Segment-based Prompts
- Full Document: Test token consumption by translating a few typical full documents you use in Trados Studio 2024 via OpenAI. Keep track of the document’s word count and token usage. This provides a base estimate of average tokens per word or per document.
- Segment-based Prompts: Run translations for a standard sample set of segments (e.g., 20-30) and calculate the average token usage per segment. OpenAI's token consumption tends to vary depending on segment length, so capturing various segment lengths helps in building an accurate estimate.
2. Calculating Average Token Consumption per Word/Prompt
- For OpenAI’s GPT models, on average, 1 token corresponds to approximately 4 characters in English, which roughly translates to 0.75 words. However, keep in mind that languages with complex structures (e.g., German) might consume more tokens due to longer words.
- Based on OpenAI's averages, you can use the following formula (a worked sketch follows this section):
- Approximate tokens per word = Total tokens / Total word count of the translated document.
3. Practical Testing in Trados Studio 2024's AI Prompting Framework
- Leverage Trados Studio’s new flexible AI prompting options for both document and segment-based translations.
- Track token usage across a sample set of documents and segments, calculating the average for typical work cases.
4. Estimating Impact on Existing Token Consumption
- Once you have an average tokens-per-word figure, apply it to your average daily/weekly/monthly word count to estimate expected consumption. This will allow you to extrapolate your annual token needs, giving insight into the potential cost impact for using OpenAI.
5. General Reference Values for Planning
- You may find it practical to use 1,000 tokens as equivalent to around 750 words in English as a rough benchmark. For more accuracy, though, it’s best to tailor your estimate to your specific content and translation volume.
By carrying out some initial trials with typical documents and segments, you’ll be able to fine-tune estimates and better understand the potential cost implications of incorporating OpenAI into your workflows.
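Putting the rules of thumb above into a small sketch can make the projection concrete. The tokens-per-word ratio, per-segment prompt overhead and price are placeholder assumptions; substitute your own measured token counts and the pricing for your chosen model.

```python
# Rough cost projection using the benchmarks above (1 token ~ 4 English characters ~ 0.75 words).
def estimate_monthly_cost(
    words_per_month: int,
    tokens_per_word: float = 1.33,          # ~1,000 tokens per 750 English words
    prompt_overhead_per_segment: int = 60,  # assumed tokens for instructions/terminology per segment
    segments_per_month: int = 0,
    price_per_1k_tokens: float = 0.01,      # placeholder rate, not an actual OpenAI price
) -> float:
    """Rough monthly cost estimate; input and output tokens are not separated here."""
    content_tokens = words_per_month * tokens_per_word
    overhead_tokens = segments_per_month * prompt_overhead_per_segment
    total_tokens = content_tokens + overhead_tokens
    return total_tokens / 1000 * price_per_1k_tokens

# e.g. 100,000 words across 8,000 segments per month
print(f"~${estimate_monthly_cost(100_000, segments_per_month=8_000):.2f} per month")
```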
Frequent Questions and Answers
Q: What can I do if I am having issues connecting?
A: Check that you have a paid OpenAI API account; you can confirm this in your OpenAI account. Also check that the API key you created has been added to the Model connection settings. For further assistance, review the log files at `C:\Users\[username]\AppData\Roaming\Trados AppStore\OpenAI provider for Trados\Logs`.
Q: How can I tell if a segment was translated using OpenAI?
A: When you have an SDLXLIFF file open in the Editor View, you will see a blue NMT indicator in the middle Context View with details confirming the provider name. You can identify it via the Provider Name being set to "OpenAI provider for Trados."
Q: What is the default status of my segments if I use OpenAI?
A: Segments will be in draft state and will need confirmation.
Q: How can I set segments to confirmed?
A: While there is no built-in feature for this, you can use the Advanced Display Filter against all segments that have the origin set as "Neural Machine Translation." Once filtered, you can either batch select the segments to change status or use the Segment Status Switcher available via the Integrated RWS AppStore.
Q: Which terminology providers are supported for terminology-aware translation?
A: All terminology providers are supported, provided they return recognised terms in the Editor View.
Q: I have multiple terminology providers and recognised terms listed. Which one will be used for terminology awareness?
A: When multiple terms are recognised, the AI selects which term to apply at its own discretion.
Q: What do I need to do for my terms to be used for OpenAI translation?
A: Regardless of your selected prompt, Terminology-aware Translation Suggestions needs to be enabled. We also recommend keeping your prompt text up to date, referencing any dependencies on terminology so the intended behaviour stays clear.
Q: Can I have more than one instance of OpenAI listed as a provider?
A: Yes, you can. For output comparison, it’s best to list your primary OpenAI instance under Project Settings and review the initial OpenAI translation via the Translation Results Window. For alternative translations and comparisons, add your second instance within Trados Copilot - AI Assistant.
Q: Does OpenAI support track changes?
A: Yes, this feature is supported within Trados Copilot - AI Assistant.
Q: What if I want to update translations as part of a review?
A: You can apply the revised changes either as part of a Batch Pre-translate task or interactively within Trados Copilot - AI Assistant. Once you define the type of change, use an updated prompt and ensure "Include target segment with the translation request" is enabled.
Q: Why is the output the same even after changing settings or prompts?
A: The output might be cached. Disable "Use In-Processing Caching" in the search options to see the changes.
Q: Are logs available?
A: Yes, logs are located at: `C:\Users\[username]\AppData\Roaming\Trados AppStore\OpenAI provider for Trados\Logs`.
Q: Where are my connection details stored?
A: Connection details are stored at: `C:\Users\[username]\AppData\Roaming\Trados AppStore\OpenAI provider for Trados\Settings\Connections.xml`.
Q: Where are my prompts stored and can I share them?
A: Prompts can be shared by sharing the relevant file located at: `C:\Users\[username]\AppData\Roaming\Trados AppStore\OpenAI provider for Trados\Settings\ProviderSettings.xml`.