I'm actually not sure what will become of tools like DeepL. Whatever edge they may have from dataset tuning and other tricks under the hood is likely to be superseded by a better architecture, which in turn requires a ton of capital to train. By the time they come up with a GPT-4 equivalent, we will be using GPT-5.