zhdc1's comments

Shortage -> Glut


Contrarian here. I've fallen in love with Firefox's AI Chatbot sidebar. It's extremely helpful to have Gemini immediately available to help with translating and summarizing text.


Reintroduction of phonics has been pushed - hard - by academia.


Eventually, and it was massively controversial within academia. There were studies that showed it worked, but studies are positivist, and for many education academics, positivist is an insult. That's why it took literally generations and a political war for phonics to soak into academia at large, long after the science was uncontroversial.


> Construction in the USA is driven by capitalism. From my own observations, a big part of why we build less in recent times is the real estate market crash in 2008. We're still feeling the effects.

An efficient market would see an increase in supply to meet demand. This is exactly what happened in Minneapolis, Raleigh, and many cities in Texas, which made it comparatively easier to get construction permits (in some cases specifically for multi-family housing).


> We have too much space relative to our population

If you're arguing that there's an abundance of space, this is true in many countries (and was certainly true prior to the Federal-Aid Highway Act or Levittown).

> We have too much space relative to our population and our cultural focus on individualism mean that people will always prefer single family homes

Why? There are plenty of locales in the US where this very much isn't the case.

> Those cant exist without cars and people with cars need to be able to commute.

If we're simply talking about the average daily commute for the average person, why? There are still plenty of cities in the US that have effective public transportation.


> There are plenty of locales in the US where this very much isn't the case.

No, there aren't. Even NYC is full of SFHs.


That’s completely by design.

Heavy subsidies to get the industry up, which are then retracted slowly to force competition.

It’s a well-tried model at this point (a century or more of use, across a large number of countries).


Not going to happen.

MB had a large stake in Tesla. When they realized that Tesla could out-innovate them (Model S vs the B 250e), they sold out.

This is still an economy where ‘digitalization’ is a desirable buzzword.


>they sold out

Why would they sell out?


MB provided a cash infusion during the GFC in exchange for 10% of Tesla. Both companies saw it as a strategic partnership.

Tesla planned on sticking to luxury vehicles and selling electric powertrains to companies, like MB, that would handle everything else. MB, as far as I’m aware, thought that Tesla would prioritize this more than they did.

Tesla helped develop the MB B 250e, MB’s first BEV. At the same time, Tesla developed and launched the Model S, which was far more expensive but a complete game changer.

Who knows what happened between this and when MB sold their stake in Tesla, but it’s easy to imagine that both companies became less enthusiastic about their partnership over time.


> I expect the EU will introduce hefty tariffs on Chinese EVs when the local automakers will lobby for it to protect local jobs, but hardcore protectionism long term only ensure your domestic industry lags behind technologically and falls behind globally.

Not sure what the EU’s strategy is here. The impact of Chinese EVs is going to be the same as the impact electrification would have had without them in the first place.

Think about it this way. The EU automotive supply chain was perfectly healthy when US-owned manufacturers had a large chunk of market share. Throw on the tariffs, force local production, and everything is good.

Chinese manufacturers will eventually do the same thing. Tesla and Geely have already done it. There are plenty of recent examples of traditional manufacturers doing it as well (Suzuki comes to mind).

The EU automotive industry is still in bad shape. The level of production and employment across the entire supply chain simply isn’t needed for BEVs.


I’ve transitioned a lot of my work over to Julia, but R is still the most intuitive language I’ve used for scripting out data collection, cleaning, aggregation, and analysis tasks.

The ecosystem is simply better. The folks who maintain CRAN do a fantastic job. I can’t remember the last time a library incompatibility led to a show stopper. This is a weekly occurrence in Python.


> I can’t remember the last time a library incompatibility led to a show stopper.

Oh, it’s very common unless you basically only use < 5 packages that are completely stable and no longer actively developed: packages break backwards compatibility all the time, in small and in big ways, and version pinning in R categorically does not work as well as in Python, despite all the issues with the latter. People joke about the complex packaging ecosystem in Python but at least there is such a thing. R has no equivalent. In Python, if you have a versioned lockfile, anybody can redeploy your code unless a system dependency broke. In R, even with an ‘renv’ lockfile, installing the correct package versions is a crapshoot, and will frequently fail. Don’t get me wrong, ‘renv’ has made things much better (and ‘rig’ and PPM also help in small but important ways). But it’s still dire. At work we are facing these issues every other week on some code base.
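To make the comparison concrete, here is a minimal sketch of the two snapshot/restore workflows (the pip commands are shown as comments; the R calls are standard renv functions):

    # Python: freeze exact versions, then reinstall them elsewhere
    #   pip freeze > requirements.txt
    #   pip install -r requirements.txt
    #
    # R with renv, the rough equivalent:
    renv::init()      # set up a project-local library
    renv::snapshot()  # record the installed versions in renv.lock
    renv::restore()   # on another machine, reinstall the recorded versions

In principle the renv cycle gives the same guarantee; in practice it is the restore step that falls over, as described above.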


I'd love to hear more about this because from my perspective renv does seem to solve 95% of the challenges that folks face in practice. I wonder what makes your situation different? What are we missing in renv?


Oh, I totally agree that ‘renv’ probably solves 95% of problems. But those pesky 5%…

I think that most problems are ultimately caused by the fact that R packages cannot really declare versioned dependencies (most packages only declare a `>=` dependency, even though they could also give upper bounds [1]; and that is woefully insufficient), and installing a package’s dependencies will (almost?) always install the latest versions, which may be incompatible with other packages. But at any rate ‘renv’ currently seems to ignore upper bounds: e.g. if I specify `Imports: dplyr (>= 0.8), dplyr (< 1.0)` it will blithely install v1.1.3.
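Spelled out, the kind of bounded declaration meant above would look like this in a package’s DESCRIPTION file (the package name and bounds are purely illustrative):

    Package: mytool
    Version: 0.1.0
    Imports:
        dplyr (>= 0.8),
        dplyr (< 1.0)

As noted, the upper bound currently appears to be ignored when the dependencies are resolved.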

The single one thing that causes most issues for us at work is a binary package compilation issue: the `configure` file for ‘httpuv’ clashes with our environment configuration, which is based on Gentoo Prefix and environment modules. Even though the `configure` file doesn’t hard-code any paths, it consistently finds the wrong paths for some system dependencies (including autotools). According to the system administrators of our compute cluster this is a bug in ‘httpuv’ (I don’t understand the details, and the configuration files look superficially correct to me, but I haven’t tried debugging them in detail, due to their complexity). But even if it were fixed, the issue would obviously persist for ‘renv’ projects requiring old versions.

(We are in the process of introducing a shared ‘renv’ package cache; once that’s done, the particular issue with ‘httpuv’ will be alleviated, since we can manually add precompiled versions of ‘httpuv’, built using our workaround, to that cache.)

Another issue is that ‘renv’ attempts to infer dependencies rather than having the user declare them explicitly (a la pyproject.toml dependencies), and this is inherently error-prone. I know this behaviour can be changed via `settings$snapshot.type("explicit")` but I think some of the issues we’re having are exacerbated by this default, since `renv::status()` doesn’t show which ones are direct and which are transitive dependencies.
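For anyone hitting the same default, this is the switch I mean (assuming a reasonably current renv); with it, direct dependencies have to be declared in the project’s own DESCRIPTION file:

    # Run once inside the project: snapshot what DESCRIPTION declares
    # instead of what renv infers from scanning the code.
    renv::settings$snapshot.type("explicit")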

Lastly, we’ve had to deactivate ‘renv’ sandboxing since our default library is rather beefy and resides on NFS, and initialising the sandbox makes loading ‘renv’ projects prohibitively slow — every R start takes well over a minute. Of course this is really a configuration issue: as far as I am concerned, the default R library should only include base and recommended packages. But in my experience it is incredibly common for shared compute environments to push lots of packages into the default library. :-(
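For reference, deactivating the sandbox is just a configuration flag, assuming I remember the option name correctly:

    # In the project or user .Rprofile; can also be set via the
    # environment variable RENV_CONFIG_SANDBOX_ENABLED=FALSE
    options(renv.config.sandbox.enabled = FALSE)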

---

[1] R-exts: “A package or ‘R’ can appear more than once in the ‘Depends’ field, for example to give upper and lower bounds on acceptable versions.”


Agree with this. I am pretty agnostic on the pandas vs. R debate (I prefer base R to the tidyverse, and I like pandas, but I realize I am old and probably not in the majority, based on comments online). But many of the "R adherent" teams I talk to are not deploying software in varying environments so much as running reporting shops that do ad-hoc analytics.

For those who want to use both R and Python, I have notes on using conda for R environments: https://andrewpwheeler.com/2022/04/08/managing-r-environment....


Can you not just build your own code as a package and specify exact dependencies?

It's a bit of faff but that seems like it should work (but maybe I'm missing something).


I basically don’t use anything outside of tidyverse or base R because of the package dependency issues.


At my old job we snapshotted CRAN and pinned versions of package dependencies _against_ CRAN.


We now provide snapshotted CRAN binaries (for many platforms) at https://packagemanager.posit.co.
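For example, a project can point its repos at a date-frozen snapshot (the date below is only an example; the setup page lists the exact URL format for each platform):

    # Install from a snapshot of CRAN as of a given date
    options(repos = c(CRAN = "https://packagemanager.posit.co/cran/2024-01-15"))
    install.packages("dplyr")  # resolves against that snapshot, not live CRAN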


Can you explain how your paid tier works?

Is it 1.95€ a day unlimited worldwide, or are there charges on top of that?


> Starting at €1,95/day

Emphasis mine.


I'm not affiliated, just randomly stumbled upon it.

In the app, it seems to scale between

1 day - 3.50 Euro

30 days - 59.40 Euro (1.98/day)

The pricing doesn't seem great, but also not terrible. So it could work out for them, due to convenience.


For Japan at least, a 30-day SIM is like 6-10 EUR, so that's not a great deal.


Wonder how this differs from something like Airalo or bnesim then, aside from the free tier.


It does say, “Pay a fair local rate.” So, I'm going to assume it is €1,95/day + local rate (not roaming rate).


Per day sounds attractive if you’re somewhere for 3 days and don’t want to pay for a month


A cheap 30-day SIM with 2GB of data can be had for £5-7 in the UK, so it sounds pretty terrible to me.


What is the definition of "fair"?


I believe it is going to be: if the local rate is X, we just charge you that and nothing more added on top.

