Three times now I've argued strenuously against using MongoDB and ElastiCache as the primary data store in an app, as they will inevitably turn into relational databases.
They did.
For Mongo, moving to Postgres went great, but for complicated reasons we're stuck with ElastiCache forever on our main product.
The only reason to ever use a non-relational db is for scalability and performance reasons. Joins and transactions are hard to do correctly and efficiently in a distributed system. So “NoSQL” solutions can be a good fit if your data is too big to fit on a single host and you can get by without joins and transactions.
(This is a massive oversimplification, but still a useful rule of thumb.)
And most companies vastly overestimate their data, believing it to be "big" when it could have been handled trivially by server-grade hardware decades ago.
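To make the joins-and-transactions point concrete, here's a minimal sketch using Python's built-in sqlite3 (the schema is made up, purely for illustration). Both guarantees come for free in a relational store; in a document or key-value store you'd hand-roll both in application code:

```python
import sqlite3

# Illustrative schema only: a relational DB gives us joins and
# transactions out of the box.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users  (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY,
                         user_id INTEGER REFERENCES users(id),
                         total REAL);
""")

# A transaction: either both inserts land or neither does.
with conn:
    conn.execute("INSERT INTO users VALUES (1, 'alice')")
    conn.execute("INSERT INTO orders VALUES (1, 1, 9.99)")

# A join: one declarative query instead of hand-rolled lookup code.
rows = conn.execute(
    "SELECT u.name, o.total FROM users u JOIN orders o ON o.user_id = u.id"
).fetchall()
print(rows)  # [('alice', 9.99)]
```

Reimplementing that atomicity and those lookups in application code is exactly what "turning into a relational database" looks like.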
> The only reason to ever use a non-relational db is for scalability and performance reasons.
And, most importantly, lack of market availability. Nobody is going to sell you a relational database nowadays, and rolling your own is... daring. Postgres was the last serious kick at a relational database, but even it eventually gave in to becoming tablational when it abandoned QUEL and adopted SQL.
But tablational databases are arguably better for most use cases, not to mention easier to understand for those who don't come from a math background. There is good reason tablational databases are the default choice.
Damn, this Reddit thread is 3 months old? T1 here as well, and I struggle pretty bad. Having been T1 for 20 years or more, I just can't click every article my friends and family send me promising progress for diabetics or potential cures. It's just not worth getting my hopes up, even when it's a reputable outlet making some extraordinary claims. This sounds really promising, but yeah. It's also depressing: it's kinda too late to save me even if this comes very soon, which I doubt it will. However, this so-called "smart insulin" sounds to me much more like the shit produced by non-diabetics' pancreases. There's just no way the non-diabetic body is making a hormone that doesn't fully kick in for 90 minutes; that just wouldn't be as resilient and effective as what I witness in the people around me. It's insane how they can, for example, eat a tub of ice cream on a whim and not be blasted into the 400s, or go wild exercising at length on an empty stomach and not have an emergency low.
There's already software that can definitely mitigate the problems you've outlined. I've been using AndroidAPS together with Lyumjev insulin, an insulin pump and a Dexcom system for several years now. Yes, I can go running on an empty stomach, and yes, I can have a nice dinner without being in the high 400s... My glucose hasn't really been above 200 for months, and the last time it was, the cause was a leaking tube in the pump. My A1c has been between 5.5 and 5.9% for many years now. There's no need for an ambulance to come and rescue me due to hypoglycemia.
If you're in any way technical, you should look into the artificial pancreas solutions out there.
I've run autotune against my Nightscout data a few times to get insights into my profile. So yes, you should install Nightscout somewhere and run a CGM app such as xDrip, which can transfer your Libre 2 data to your Nightscout database.
So yes, if you're interested in autotune, a Nightscout instance is required for now.
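Once data is flowing in, reading it back out is straightforward. A rough sketch in Python, assuming a hypothetical instance URL and read token (Nightscout serves CGM entries from its `/api/v1/entries.json` REST endpoint):

```python
import requests

NS_URL = "https://your-nightscout.example.com"  # hypothetical instance URL
TOKEN = "your-read-token"                       # hypothetical read token

# Fetch roughly the last 24 hours of 5-minute CGM readings.
resp = requests.get(
    f"{NS_URL}/api/v1/entries.json",
    params={"count": 288, "token": TOKEN},
    timeout=10,
)
resp.raise_for_status()

for entry in resp.json():
    # 'sgv' is the sensor glucose value in mg/dL; 'dateString' is its timestamp.
    print(entry.get("dateString"), entry.get("sgv"))
```

Autotune consumes this same history (plus your profile and treatments) to suggest adjustments, which is why Nightscout is the prerequisite.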
I participated in studies where they administered insulin and glucose intravenously. It is wild how they can reliably drop my blood sugar from high to low within a few minutes. Subcutaneously this takes me hours to do in a stable way.
Not just veins: inhalable insulin like Afrezza is also really quick. Unfortunately it only appears to be available in the US (and maybe Canada?), not Europe/Asia, from what I last remember.
Inhalable insulin (which is also very fast-acting) IIRC only allows a dose of 2 units. If your sugar is 400 mg/dl (22ish mmol/l), one or two doses wouldn't put you into a coma if you knew your sensitivity. I'm pretty sure I've seen T1s talking about using it that way.
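For reference, the mg/dl to mmol/l conversion used above is just a division by ~18 (glucose's molar mass is ~180.16 g/mol, and dL to L shifts the decimal). A tiny sketch, purely illustrative:

```python
# Glucose unit conversion: mg/dL <-> mmol/L. Illustration only, not dosing advice.
MGDL_PER_MMOLL = 18.016

def mgdl_to_mmoll(mgdl: float) -> float:
    return mgdl / MGDL_PER_MMOLL

def mmoll_to_mgdl(mmoll: float) -> float:
    return mmoll * MGDL_PER_MMOLL

print(round(mgdl_to_mmoll(400), 1))  # 22.2 -- the "22ish" in the comment
```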
> Average wages in America’s poorest state, Mississippi, are higher than the averages in Britain, Canada and Germany.
This doesn't sound right.
Googling around, the median income (a much better metric than the average) is about the same in Mississippi and Germany.
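To illustrate why the median is the better metric here, a quick sketch with made-up numbers: a handful of very high earners drags the mean up while the median barely moves.

```python
from statistics import mean, median

# Hypothetical incomes: six ordinary earners plus one outlier.
incomes = [28_000, 32_000, 35_000, 41_000, 47_000, 55_000, 1_500_000]

print(f"mean:   {mean(incomes):>9,.0f}")    # ~248,286 -- inflated by the outlier
print(f"median: {median(incomes):>9,.0f}")  # 41,000 -- the 'typical' earner
```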
A huge difference: Britain, Canada and Germany don't have to pay for health care and education. I'm not sure how to factor that in, but something tells me if you do you'll find our Canadian and European friends are doing drastically better than those in Jackson, MS.
> but something tells me if you do you'll find our Canadian and European friends are doing drastically better than those in Jackson, MS.
This will show a lot of selection bias: any Europeans with American friends who are on this forum are unlikely to be representative of the median German or Brit.
I have a cousin who is a mid-level manager at a steel plant in southern Illinois near Missouri, and he lives like a petty king. I've known FedEx drivers with lake houses in low-cost-of-living areas. Middle-income people in low-cost-of-living states often live quite well in the US.
>Britain, Canada and Germany don't have to pay for health care and education
That is not true: British taxpayers have to pay for the health care and education of Brits.
The point is that when you add the money the government spends on you to the money you spend on yourself, America has significantly more money per person to go around.
What you should have written is that although America spends more money on health care per person than Britain does, Americans have worse health, so the money is spent less efficiently than in Britain.
(It is also true that America spends more on "criminal justice", e.g., prisons, but has more crime, so the criminal-justice money must be being spent less efficiently.)
> but something tells me if you do you'll find our Canadian [...] friends are doing drastically better than those in Jackson, MS.
Perhaps true, but also keep in mind that Canada and the US were on a more equal footing until the last five years or so, when we started to see a significant divergence emerge. You can ride on the past for a while. Assuming the trend continues, it may simply be too early to see Canada turn into Mississippi.
I have no idea what you mean by "somewhat existing." Public health expenditure in the US is more per capita than any other country's public + private spending combined.
This does not mean the US is healthier (after all, the absurdly high spending is partly there to cover health issues). But money per se is not remotely the problem, and public health expenditure very loudly exists.
> but its very unlikely they'll earn more than an American electrician.
What's he going to put his money toward, if not being healthier and smarter?
Otherwise he's just fatter, dumber, and full of plastic kitsch manufactured overseas, much of which is now floating around his bloodstream in microscopic form.
Median income can also be misleading. If the government is taking on lots of debt to pay for things like health care or education, that's not necessarily sustainable economics.
GDP per capita is overall the best metric when thinking on a long-term economic scale.
As to how the people in Jackson, MS are doing - a lot of this comes down to what communities are doing with that wealth. There are lots of young professionals having decent lives, even in Mississippi - but a lot of that wealth disappears into cars and big private properties and other unequally distributed things.
I lived in Ontario for ~18 years. While national healthcare is fantastic (and the US should totally have it), Canadians are getting squeezed, and have been for the last 5-6 years.
When I lived in Ontario 20 years ago, Ontarians were getting squeezed and complaining then too.
There used to be something like 3 MRI machines across the entire province, a province containing a third of the country's population. Wait times for screening procedures could be months. My ex's entire family drove down to New York to get their dental care, paying out of pocket, when it mattered. (UB's College of Dentistry became the 7th best dental school in the nation & 11th in the world largely on the back of this trade.)
It's better now, but not better _than what can be bought_ in the United States. Does the US need to improve? Certainly.
> I look at a "sort of" competitor in our industry, and their rate of feature development is ridiculously slow by comparison.
Seconded. At my org, Ruby/Rails is our competitive advantage.
Our 5-6 competitors are all Java/.NET shops -- we deliver fixes and new features dramatically faster and deploy them with ease. It gets noticed.
The main downside is that Rails doesn't scale well with complexity, so regular refactors are necessary (ours is a high-volume big-data app).
For performance and cost, it took some doing, but by strategically moving business logic to higher-performance tech we've managed to get to a great spot there too.
While certainly a little polarising, the adage of "optimise later" really comes into play here. We make a lightweight industry-specific system that has all the generic stuff (e.g. time/tasks) plus some special sauce that makes us stand out. One of our clients said: "Mate, the rate you guys are adding features - the ERP company we went with hasn't done 1/10th of what you've done in the last year. Can you implement full multi-level bills of materials?"
I said "no, can't do that - that's huge".
A weekend of experimentation later (thanks to one very kick-butt gem that has an amazing way of managing trees): "Actually, we may just be able to - it's going to be a long road though - setting expectations..."
A period of time later, we've now implemented a full BOM planning/tracking system, tested well up to and beyond the number of items these types of companies expect.
To the point they cancelled their renewal with a big/established ERP vendor and went with us instead.
There's definitely some instances where speed to market trumps everything else. We can go optimise queries later on.
What's been fascinating is that I left my day job 2 years ago (to the month), and we've gone from unheard of to market leader in our space. There is absolutely no way that could have been done with any other stack without spending 5x the money - and I'd argue that with a bigger developer team to co-ordinate, it almost wouldn't matter what money you threw at it; it gets exponentially harder to get everyone on the same page for delivery.
At my old workplace I couldn't convince the .NET team for love nor money to look outside the box.
Because Ruby and especially RoR is a strict downgrade in every dimension compared to .NET. Now, these developers could have been using the old .NET Framework, in which case we usually pretend they don't exist, because it's like the stubborn Python 2 of an ecosystem that is otherwise the most productive back-end platform (with really, really good performance).
> Because Ruby and especially RoR is a strict downgrade in every dimension compared to .NET.
And that line of thinking is damaging in the tech world. There is no one ring to rule them all. Each tool, language, framework has a place - everything has tradeoffs.
I've worked with enough .NET world to know what the tradeoffs would have been.
For instance, take the critical library that enabled this piece of functionality: the nearest thing in the .NET world looks to have 1 star on GitHub and was last updated 4 years ago. The Ruby gem has 1.8k stars, tonnes of commits, and is up to date.
So if you were to implement that in .NET you would be (a) hitting the books, (b) re-writing that library, or (c) doing a hugely inefficient/naive implementation that wouldn't scale.
In RoR land - drop in the gem and be prototyping that afternoon.
Sure, if I was VC-backed and had money to burn, it might be a bit more interesting to use .NET. If we were targeting more enterprise-oriented clients and potentially wanted an on-prem option, then yeah - .NET. Wanting more access to developers in our country - .NET, that's the predominant market here. Etc etc.
Ruby is subjectively a worse language than C#, it has worse tooling (as testified by Ruby developers), worse expressiveness and, most of all, much worse performance.
If there isn't an SDK for something in .NET but there is in Ruby, it speaks to the poor quality of the engineering culture at that company (as it is likely they offer a Java SDK at the same time). And even in that case, you can just generate a client from an OpenAPI manifest. Or a gRPC client from .proto (even better). Or, if it's an algorithm, you're much better positioned with all the low-level tools C# offers to write something performant with a reasonable amount of effort.
And even if that doesn't work, you can just call C bindings by using one of the many existing binding generators, or just `[LibraryImport]` the calls directly with little effort; after all, C# is a proper C-interop language, with C structs and pointers.
But most importantly, you get robust results and a consistently well-performing application with no scalability issues, a good build system, and very little risk of accidentally breaking the codebase.
And you don't need to make a developer-productivity tradeoff to achieve that. I don't know what it will take for the bad reputation to finally go away, but surely we're long past the point where it's deserved.
I will admit that the .NET ecosystem has gotten way better since becoming friendlier towards Linux users/environments, and I used to code a lot of C#.
I love both Ruby and C# - I don't think one is better than the other. Ruby also has good interop with C. The performance of Ruby is good enough for most use cases. Scaling a Ruby app via multi-process scaling has worked perfectly fine, without major issues, for most use cases. The open-source third-party libraries in the Rails world have a certain "feel" to them... as if they were written by devs that care a LOT about (excuse the trendy term) DevEx, which is often missing in the C# world.
Call it bias due to familiarity, but Rails continues to be the most productive option for me to bootstrap a web app/startup. Often I think most of the dissonance is simply due to a difference in philosophy: devs don't like Rails because it's not "their way", not because their way is inherently better.
Pick the tools you like to use, are most productive with, and that are most appropriate for the application - that's it. No need to nitpick on language semantics, tooling, or performance if those aren't even major considerations for what you're building.
From the preface, it looks like a bunch of the reference material has been cut from the book in favor of C++ documentation on the internet:
"The third edition of Programming: Principles and Practice Using C++ is about half the size of the second edition. Students having to carry the book will appreciate the lighter weight. The reason for the reduced size is simply that more information about C++ and its standard library is available on the Web."
To be fair, someone should print the online reference material to get the real comparison... it's not like we have fewer details in the standard library -- footnotes about which version(s) support what, and the related compatibility semantics, could fill a book on their own.
If only 20 internal users are ever going to see it, jQuery is as light as can be, can be loaded right from a CDN, needs no transpilation of any kind, and handles basically everything you need.
> Approximately 80% of the income of traditional capitalist conglomerates go to salaries and wages, according to Varoufakis, while Big Tech’s workers, in contrast, collect “less than 1% of their firms’ revenues”.
This doesn't sound right to me, but if it is Varoufakis has a point.
Our contract, such as it is, with traditional capitalists is that they employ many, produce some sort of value, and keep as much profit as they can manage after labor and expenses.
If the Amazons and Googles of the world can generate their enormous revenues (and profits) without dispensing much of it to labor, essentially just collecting large rents in perpetuity, both traditional capitalists and workers are in serious trouble. That is a big shift.
Google's revenue in 2022 was $280 billion. 1% of that would be $2.8 billion, and Google had 190,000 employees at the time. I'm pretty sure the average wage of those employees is not $14,736, so Varoufakis would appear to be off by at least a factor of 10.
On the other hand, if Google did pay out 80% of revenues, the average Google paycheck would be a cool $1,178,947.
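A quick sketch to sanity-check that back-of-the-envelope arithmetic, using only the figures from the comment above:

```python
revenue = 280e9      # Google's 2022 revenue, per the comment
employees = 190_000  # headcount at the time, per the comment

# Implied average pay if labor really got 1% of revenue:
print(f"{0.01 * revenue / employees:,.0f}")  # ~14,737

# Implied average pay at the 80% 'traditional capitalist' labor share:
print(f"{0.80 * revenue / employees:,.0f}")  # ~1,178,947
```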
I helped him process and visualize the original batch of parking ticket data waaaay back in 2016.
I can't believe he's still on this in 2025. We need more junkyard dogs like him fighting for what's right.