You can absolutely sync your vault without a paid subscription. Simply save it within your OneDrive or Google Drive folder. Alternatively, you could use Syncthing if you prefer a self-hosted solution.
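For the cloud-folder route, one common pattern is to move the vault into the synced folder and leave a symlink at the old location, so apps keep working with the original path. A minimal sketch (the paths here are just examples; adjust for your own setup):

```shell
# Move the vault into the cloud-synced folder, then symlink it back
# so anything referencing the old location still resolves.
mv ~/Documents/MyVault ~/OneDrive/MyVault
ln -s ~/OneDrive/MyVault ~/Documents/MyVault
```

With Syncthing you'd skip the symlink and simply add the vault directory as a shared folder on each device.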
However, these rights should be guaranteed for a company operating in the USA and strictly adhering to US law. Of course, if the law is (arbitrarily) changed to make this illegal because of the Chinese government's stake, then the company could be forced to shut down, but that would be inconsistent with the Constitution.
> However, these rights should be guaranteed to a company operating in the USA and strictly adhering to US law.
ByteDance is a Chinese company with its headquarters in China. The so-called TikTok ban is a demand that ByteDance sell off its controlling position over TikTok; otherwise TikTok can no longer operate in the US.
The fact that China is spinning this issue as a TikTok ban is telling.
If they want to do that, of course they can. (And indeed, Chinese car companies are already treated differently in US law to such an extent that they aren't in the US market at all.)
You didn't say why that would be inconsistent with the Constitution, you merely asserted that it is. But it isn't.
Our government gets to decide the terms under which businesses operate in this country. Always has and always will. This is not a constitutional question.
This approach is valuable because it abstracts away certain complexities, allowing users to focus on the code itself. I found it especially beneficial for users who are unwilling to learn functional languages or to parallelize code in imperative languages. HPC specialists may not be the current target audience, and code generation can always improve over time; based on the dev comments, I trust that it will.
This is TensorFlow-based, but I also have another, PyTorch-based implementation that is already public (inside our other repo, i6_experiments). It's currently not so easy to set up, but I'm working on a simpler pipeline in PyTorch.
We don't have the models online yet, but we can upload them later. However, I'm not sure how useful they would be outside of research: they are specific to those research tasks (Librispeech, Tedlium) and probably wouldn't perform too well on other data.
Thank you for your work. I have been having trouble achieving the desired level of formality in the generated text. When I ask for slightly formal content, the result tends to be too formal; yet when I ask the model to reduce the formality or use a semi-formal tone, the text becomes too informal. This will let me exercise finer control over the style of the model's output instead of constantly battling with it.
This comment does nothing more than repeat the information in the title, in a very verbose style. I don't understand the trend of GPT-generated comments that has lately been taking over HN and Twitter threads. They are insipid, add absolutely nothing of value to the discussion, and are generally a waste of time to read.
It's about minimising friction in our tasks and reducing any unnecessary obstacles. Even seemingly minor actions that only take a few seconds can build up over time and generate a sense of frustration, especially when they become frequent.
Personally, I have found that dealing with this kind of friction can erode my overall productivity, as I unconsciously shy away from these tedious tasks that involve manual repetition, no matter how small. That's why many of us choose to invest a little time in coming up with these small automations that free us from the clutches of monotonous and repetitive tasks, allowing us to focus on the core of our work.
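A concrete, if trivial, example of the kind of micro-automation I mean (the function name is just an illustration, not from any particular tool): turning the repeated two-step "make a directory, then cd into it" into one command.

```shell
# Create a directory (including parents) and change into it in one step.
# Saves a few seconds each time, which adds up across a workday.
mkcd() { mkdir -p "$1" && cd "$1"; }
```

Drop it in your shell rc file once, and that tiny bit of friction is gone for good.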