QA issues in Memsource

The QA (Quality Assurance) requirement in Memsource is, ironically, problematic. It prevented me from submitting a completed task because of one “issue” it found with my translation. In this case, QA was incorrect; I know that I translated the word correctly, but QA refused to let me click “Complete.” It seems that there should be some way to override the QA once the translator has reviewed the issues that QA finds. Otherwise, we’re permitting MT software to have the final say in what is a correct or appropriate translation.

Has anyone else had this issue?!

Hi, Liana (@lianakd)!

It seems that the project manager has chosen to disable the “Can be ignored” option in the QA settings. Try to explain your reason in the project section.

You can ignore multiple entries at once, too.

Happy translating (and ignoring the issue)! :writing_hand:

Unfortunately, clicking “ignore” didn’t work even for single entries.

Have you contacted the project manager?

Well, let’s wait for other people to help out. I think the “ignore” function is disabled:

“By default, all QA warnings can be set to ignore by Linguists/Vendors. For some QA checks, the Project Managers can disable the Can be ignored option in the QA settings so Linguists and Vendors will not be able to ignore these warnings when setting a job to Complete.”

Also, when I tried adding a comment just now, QA started listing the comment itself as yet another “issue.” Then I tried deleting the comment, and QA still lists an “unresolved comment” as an issue. Lol!

Ah. Yeah, the “ignore” function must be disabled, because the “Complete” button won’t even turn blue. And yup, I left a comment with the project manager tagged AND sent an email. No response to the comment, and I got an automated “out of the office” reply to my email. Honestly, I’m kind of annoyed :frowning:

Thank you for your response, though!!

We’re here to help. Thank you for your patience.

Give the project manager some time. It’s the weekend. A mythological creature that translators sometimes catch. (OK, I’m a little tired, so the sentence will be fragmented. :rofl:)

Have a nice rest! :sleeping:

It’s frustrating and time-consuming.
Human translation is supposed to make machine translation better. If the machine keeps insisting that your work is not as good as the machine’s, not good enough, what are we doing here?
I thought it was an issue with just the English–Arabic language pair.

Hi Liana and Najah,

Thank you for sharing your concerns with us :dizzy:

As you may know, we are currently keeping the QA check as an additional step before submitting tasks, as it can sometimes be a helpful reminder to correct a few issues. However, we are well aware that a number of QA issues can be false positives, flagging errors that don’t really exist :confused:

For this reason, if you come across a similar case and find that the QA issues cannot be resolved and prevent you from marking the tasks complete, please don’t hesitate to reach out, either here on the forum or at , so we can force the completion of the tasks from our end :white_check_mark: