I think this is great to think about; however, many of the same lessons may still apply, and can and should be applied now in a forward-looking way:
From the article:
> Whereas the Netherlands clearly differentiates roads and streets — as do Germany, Spain, and France — the US is known for having “stroads,” roads where cars reach high speeds yet must also avoid drivers entering from adjacent businesses and homes. The majority of fatal crashes in American cities happen on these “stroads,” and impact pedestrians and cyclists in particular.
I think this will be _more_ important with autonomous driving. We've developed a built environment where car through traffic and destinations are co-mingled, which leaves very little room for people to actually experience their destinations when they get out of their vehicles.
Perhaps I'm wrong, but my expectation is that the problem of "stroads" will only become more apparent as focus shifts from getting from point A to point B toward where a person is actually trying to go, which is my current long-term expectation for the impact of autonomous vehicles.
There are a bunch of things that still are language or language architecture specific.
How do you manage web servers and starting/stopping processes?
Traditionally, PHP/Ruby/Python have had a process-per-request model, but even then, how these processes are started and what memory is shared between requests differs.
Node.js, when deployed on a server, allowed one process to serve many requests through the use of the event loop, but this has changed with the use of AWS Lambda (one process per request), which has opened up more efficient approaches (e.g. Cloudflare Workers).
How do you manage database connections or other shared resources?
PHP and Laravel expect certain resources (e.g. db connections) to be long-running. How do you deal with scaling your servers up and down in this world?
Laravel has its own ORM with its own APIs. Can these APIs be improved to allow things to be more easily scaled?
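To make the process-model point concrete, here is a minimal sketch in plain JavaScript (not any framework's real API; every name here is made up for illustration). In PHP's classic process-per-request world, module-level state like this counter and connection pool would be rebuilt from scratch on every request; in a long-lived Node-style process, they persist across requests:

```javascript
// Module-level state: lives as long as the process does.
// Under process-per-request, both of these would reset for every request.
let requestsServed = 0;
const dbPool = { connections: 0, max: 5 };

function getConnection() {
  // In a process-per-request world, every request pays this setup cost;
  // in a long-lived process, the pool is built up once and then reused.
  if (dbPool.connections < dbPool.max) dbPool.connections++;
  return { id: dbPool.connections };
}

function handleRequest(path) {
  requestsServed++; // only meaningful if the process survives between requests
  const conn = getConnection();
  return { path, servedByConnection: conn.id, totalServed: requestsServed };
}

// Three requests handled by the same process share the pool and counter.
const responses = ["/a", "/b", "/c"].map(handleRequest);
console.log(responses.map(r => r.totalServed)); // [1, 2, 3]
```

This is the crux of the Laravel-on-serverless question: code written assuming the second model (persistent pools, warm caches) behaves very differently when each invocation may get a fresh process.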
Vercel
I think Vercel has shown that a tight integration between React and infrastructure can provide a lot of value to many types of teams. I would expect the same types of benefits could exist for Laravel/PHP!
In my opinion the greatest tragedy of php is that most libraries and frameworks are written under the assumption that the whole environment will be thrown away and the whole process killed after the current request is served and thus that it's perfectly fine to litter the address space with garbage (and php's garbage collection is nothing fancy, really).
PHP people should really start assuming the process executing their code will stay around (and that restarting processes is really an anti-pattern).
Portland's trams don't move anywhere close to the 35mph the OP mentioned. They are quite capacity-constrained due to needing to navigate the short blocks and many intersections of downtown Portland. Dedicated travel corridors where these trams could move at closer to 35mph would make trips _through_ downtown competitive, which they currently often are not.
Plain hybrids currently do sell better than PHEVs or EVs.
I'm not sure in what sense a standard hybrid is the "best of both worlds" compared to a PHEV - a PHEV allows for cheaper fuel (grid electricity) when it's available. That said, the extra cost comes from batteries larger than a standard hybrid's. As batteries come down in price and size, I'm not sure why people would want a standard hybrid over a PHEV.
The current salary cap for any programmer (or any non-executive-level role) who is a federal employee is $191,900. I'm glad the hiring requirements are being updated, but I'm concerned that making the lower bands easier to join without competitively hiring more senior people will make the federal workforce less effective.
I've worked both and, anecdotally, I didn't notice a real difference in efficiency. Public sector was plodding and methodical; private sector was faster but thrashed around wasting energy on the latest Top Priority project. Lots more tech debt in the private sector too. Speed isn't always efficient.
Methodical to their own ends without having to produce anything that even one member of the public wants, compared to thrashing around wasting energy on the latest Top Priority project that still has to actually make someone want to buy it. There is no comparison. Being methodical is not a value in and of itself, a serial killer can be methodical, that does not make them better than a plumber who is not particularly methodical.
The public sector is often inefficient and insulated from market forces but that doesn't imply that what they produce doesn't meet the needs or desires of citizens. And I've done a lot of work in the private sector that would never in a million years have an impact on the bottom line.
Inefficiencies are everywhere, belying the simplistic stereotype that the public sector is unusually inefficient.
This news feels in stark contrast to Apple's announcements this week. The price of tech hardware is not keeping up with the target 2-3% inflation. Will higher pricing for software make up for lost (in real terms) revenue from hardware?
As someone who has lived in the Southwest, it can't be overstated how important the issue of water is.
One thing to keep in mind is that most estimates place human consumption of water at below 20% - a ton of the basin's water goes to agriculture. To be clear, I think this makes sense - with added water, regions in the basin can be some of the most productive agricultural regions in the country.
The big problem is policy has not adapted to scarcity. There are real tradeoffs when we have 30% less water than forecast and it's not clear who should suffer them.
I think there is often a misconception that this area is somehow "too hot" to live in. Since the advent of air conditioning, we have moved past this. Generally speaking similarly sized homes in Boston will consume more energy for HVAC than Phoenix will simply because heating homes in cold winters is often more energy intensive than cooling in the summer.
Conservation has to start with agriculture since that's the vast majority of usage. The simplest and most effective step would be to stop subsidizing that water usage so heavily. Last I looked the average farmer paid about 1/10 the price per gallon as residents, but it varies a lot and some pay less than 1/100. That leads to exactly the behavior you would expect: completely unsustainable high water usage crops being grown in large amounts.
If you don't mind the presentation style, https://www.youtube.com/watch?v=XusyNT_k-1c is great and presents an even direr situation: because water rights are "use it or lose it", we are actively encouraging water misuse, beyond just "it's cheap enough to misuse it".
> Generally speaking similarly sized homes in Boston will consume more energy for HVAC than Phoenix will simply because heating homes in cold winters is often more energy intensive than cooling in the summer.
This is true, and I definitely agree that the majority of the work to match consumption with water availability lies in the hands of agriculture.
With that said, it's important to recognize that the CO basin states (AZ, WY, UT) have some of the highest per-capita domestic water use figures in the nation - far above the national average.
Not sure about the other two, but it might be the ubiquitous swimming pools in AZ. The evaporation in an AZ pool during the summer is dramatic. You need to have a pool water leveler on 24/7 or leave the garden hose trickling constantly.
While somewhat common, I wouldn't use the term ubiquitous... When I grew up in Casa Grande, I didn't know anyone who had a pool, and most of my friends and I would ride our bicycles several miles to the public pool. My grandmother's neighborhood in Phoenix had two houses in a couple blocks that had pools. The street I live on today has two houses (of a couple dozen) that have pools.
There are more wealthy neighborhoods where it's closer to 1 in 4, but again wouldn't call that ubiquitous at all.
That said, I think some of the farming use is excessive; farms should lean into regenerative agriculture over the more wasteful use of chemical fertilizers, which over time contributes to desertification and soil erosion.
In my very middle-class suburban neighborhood in Gilbert, you are definitely an outlier if you don't have a pool. If I look at an aerial Google Maps view of my street and the streets on both sides of mine, 18 houses out of 91 don't appear to have pools. A few of them have so much tree cover that I can't tell if there is a pool or not, so I counted those as a no.
Define middle class here... The median income for 2022 in Arizona is $38k/yr, your neighborhood is most likely well within the top 10% of income earners in the state. Most people don't get that.
No offense, but Casa Grande was, until recently, a modest farm town with modest incomes. Ag labor just doesn't pay that well and pools are expensive.
In contrast, here's a random middle class neighborhood in central Phoenix[0], the fifth most populous city in the US of A. You'll notice some of the streets have a pool in every single backyard. When you zoom in on the higher income neighborhoods, like in Scottsdale and PV, it's rare to see a backyard that doesn't have a pool.
MOST people don't grow up in higher income neighborhoods. While it may seem odd to you, who probably makes well north of $100k/yr on your salary alone, let alone a spouse/partner... Most houses, in most of the Phoenix area don't have pools.
Yes, and we should be investing heavily into technologies and techniques that maximize the efficiency of the water that is consumed by agriculture. The government should probably subsidize the expense of the conversion. We should also get rid of any "use it or lose next year's ration" rules that are in place which cause some farmers to literally just run water out of their pipes to ensure they're recorded as having "used their allocation" and therefore "still require that much next year".
Using a normal common law water rights system is literally prohibited by some state constitutions (e.g. AZ article 17). It would take a movement on the order of civil rights to fix water rights.
Humans should have access to a consistent generous ration for hygiene, drinking water, and moderate home gardening. I think it's reasonable to cut people off (during a major drought with rationing) when they start focusing on trying to maintain large lawns, golf courses, swimming pools, etc.
I always find these comments interesting. Having worked at both Facebook and Google, I never quite felt this way about Google's monorepo. Facebook had many of the features you listed, and quite performantly if not more so. Compared with working at Facebook, where there are no OWNERS files and no readability requirements, I found abstraction boundaries to be much cleaner at FB. At Google, I found there was a ton of cruft in the monorepo that was too challenging / too much work for any one person to address.
OWNERS files rarely get in the way - you can always send a code change to an OWNER. They are also good for finding points of contact quickly, for files where the history is in the far past and changes haven't been made recently.
Readability really does help new engineers get up to speed on the style guide, and learn of common libraries they might not have known before. It can be annoying - hell, I'll have to get on the Go queue soon - but that's ok.
I have heard similar things from other Googlers, and I think there might be two factors behind why I see it this way:
- I worked on Google Assistant, which was responsible for integrating many services. This meant I had to work with other people's code far more regularly than many at Google.
- I moved from FB to Google - I'm not really sure how many people have had this experience. I think many of my colleagues at Google found it surprising how many of the things they thought were unique to Google actually also existed at FB.
At the end of the day, any of these processes have pros and cons, but the cruft of having APIs that are a couple of steps harder to evolve (needing Readability/OWNERS sign-off for everything you touch) makes things slightly less cohesive and a trickier place to have a "good" codebase.
When I worked at FB, I would frequently rebase my code on Monday and find that, for example, the React framework authors or another smaller infra team had improved an API and changed *every* callsite in the codebase to match. This type of iteration was possible in certain situations at Google but was just much less common than at FB.
> I think many of my colleagues at google found it surprising how many of the things they thought were unique to google actually also existed at FB.
Google workers are groomed to believe Google is the best, and hence they are too. A corollary of that, then, is that nobody else has it that good, when in fact, others sometimes have it better.
I also made the move from FB to G and echo everything said above. Googlers have a massive superiority complex. In reality, it's naiveté.
My 2 cents: OWNERS is fairly useful, if only as a form of automating code reviewer selection. Readability is a massive drag on org-wide productivity. I have had diffs/CLs take MONTHS to be approved by every Tom, Dick, and Harry whose claws were added to my code and who made me re-design whole project approaches, when they were only there to check if my newlines were in the right spot for that language. I thought about quitting.
People really underestimate how much productivity drain there is in having a bad code review culture. One of the worst things about working at Amazon was that any feedback on a merge request, no matter how small, required you to request a re-review.
It's not culture (organic), it's systems (planned): if you don't, on day zero, agree what code reviews should cover (and what not), then code reviews are a pissing contest first, and a useful tool second.
I've noticed a lot of people understand neither the limitations of code reviews nor which issues they can and should solve. Writing good critique (of anything, not just code) is hard; we don't train people to do it, and usually don't even regard it as something that needs training and understanding.
Going from FB to $REDACTED to Oculus was a pretty wild ride, there were a lot of different cultures, though I think generally speaking the best qualities filtered through.
The system still grooms Googlers to think they're better than others, though. Until that root cause is fixed (which would come at a huge cost to Google, so no surprise it never will be), nothing will change.
In Google Cloud at least, we're quite aware we're not the market leader, so there's a pervasive humbleness. We're proud of certain technical achievements (e.g. Spanner), but the world can catch up quickly (e.g. CockroachDB, FoundationDB, CosmosDB). This might be a departure from the feeling in older divisions of the company, haha.
Huh? Facebook has a lot of that infra because ex-Googlers built it there. It takes an insane amount of delusion to notice something common between a father and a son and say that the dad inherited it.
This isn't true at all for OWNERS files. If you try developing a small feature on google search, it will require plumbing data through at least four to five layers and there is a different set of OWNERS for each layer. You'll spend at least 3 days waiting for code reviews to go through for something as simple as adding a new field.
I agree that it could be worse! Facebook also has significant (if not more) time spent on review, yet I found adding features to News Feed a heck of a lot easier than adding features that interacted with Google search. Generally a lot of this had to do with the number of people who needed to be involved to ensure the change was safe, which always felt higher at Google.
I'm only an outside observer in this conversation but could it be that the review process (or the lack thereof) and the ease with which you can add new features has had an impact on the quality of the software?
The thing is, in my experience as a user Facebook (the product, not the former company) is absolutely riddled with bugs. I have largely stopped using it because I used to constantly run into severe UI/UX issues (text input no longer working, scrolling doing weird things, abysmal performance, …), loading errors (comments & posts disappearing and reappearing), etc. Looking at the overall application (and e.g. the quality of the news feed output), it's also quite clear that many people with many different ideas have worked on it over time.
In contrast, Google search still works reasonably well overall 25 years later.
There are pretty different uptime and stability requirements for a social product and web search (or other Google products like Gmail). When news feed is broken life moves on, when those products break many people can't get any work done at all.
One of Google's major cultural challenges is imposing the move slow and carefully culture on everything though.
I have the same background: I find the code quality at G to be quite a lot higher (and test pass-rate, and bug report-rate lower) than News Feed, which was a total shit-show of anything-goes. I still hold trauma from being oncall for Feed. 70 bugs added to my queue per day.
The flip side is of course that I could complete 4 rounds of QuickExperiment and Deltoid to get Product Market Fit, in the time it takes to get to dogfooding for any feature in Google.
Huh, also having worked at both, I had exactly the opposite experience. Google's tools looked ugly but just worked. At Meta there were actually multiple repos you might have to touch, and tools worked unreliably across them. OWNERS files made sure there was less abandoned code, and parent owners would be found by gwsqueue bots to sign off on big changes across large parts of the repo just by reading these files.
Same, and another vote for Meta. Meta made the language fit their use case. Go into bootcamp, change the search bar text to 'this is a search bar!', press F5, and see the change (just don't ship that change ;D). It's incredibly smooth and easy.
Google's a mess. There's always a migration to the latest microservices stack that has been taking years and will take many more years to come.
Like, Meta just changed the damn language they work in to fit their needs and moved on. Google rewrites everything to fit the language. The former method is better in a large codebase. Meta is way easier to get shit done in, to the point that Google was left in the dust the last time they competed with Meta.
I think what you're saying is true for www, but not fbcode, and the latter starts to look a lot like google3. I agree, though, Meta's www codebase has the best developer experience in the industry.
I think we now know about a lot more externalities of this kind of logging than we did generations ago. For example, much of the hydrological basin that provides drinking water to the Seattle metro area was aggressively logged 100 years ago, which negatively impacts the basin's hydrology for the purpose of collecting drinking water.
I think extra knowledge about the environment often leads to indecision, which is certainly its own drawback, but these choices are not without tradeoffs that should be acknowledged.
I'll admit I'm not up to speed on the effect of Seattle's aggressive logging on the hydrological cycle. From the little time I've spent there, I managed to glean one small fact: by the 1950s, the Seattle city sewage system was dumping up to 50 million gallons of sewage into Puget Sound (per day).
I'm only a software engineer and have no understanding of hydrological cycles, but I suspect the evaporation and precipitation of that alone would have an impact on the nearby watersheds used for drinking water.
Everything has externalities though. Don't get me wrong I'm all for a forest going unlogged, but we will replace those resources with something else.
We still use lumber; if it isn't locally harvested, we buy it from another part of the world, outsourcing those externalities and adding all the extra costs of shipping, labor overhead for the various middlemen, customs, etc.
My point isn't that we're screwed and should just chop down forests because we're damned if we do and damned if we don't. But saving one forest won't fix anything by itself, and could very well make things worse unless we pair it with simply reducing the amount of resources we consume. Paying someone else to own the externalities will never help.
> and the size of the forests are mindbogglingly vast.
The size of the forests isn't really relevant; compared to lumber demand, they're mostly insignificant. Humans have never had a problem wiping out local forests.
What matters is how much wood a forest can produce per year, not how much has accumulated over the course of the past.
There is plenty of timber in northern California and southern Oregon; these regions are actually temperate rain forests that are harvested sustainably and aren't old-growth forests. Every 30-50 years (depending on species: redwood or Doug fir), the same tracts of forest can be logged again and again.
Once you get further north the taigas are colder and slower growing and may take 200 years or longer to grow back.
The best thing for the USA is to use these resources, which supports jobs in logging, wood processing, and transportation, and carries lower transportation costs and import fees. I'm not against importing timber into the very northern parts of the US, but there is no reason to ONLY use Canadian timber.
The boreal forest aka Taiga is quite a bit more at risk than the forests of Oregon. There may be an argument to be made that the wrong kind of firs have grown in the south of Oregon (they're much more susceptible to fire from heat) and that logging and replacement with the right type of firs could be a win economically and environmentally. Somebody with specific knowledge would need to fact check that idea though.
You're still going to miss externalities when you limit the factors you consider. Sure, lumber sellers in Canada could do well and the US could avoid cutting down their own trees, but can we really assume that deforestation in Canada would be without consequence?
Assuming that stripping resources from other parts of the world is how we got into this ecological mess in the first place.
Washington has absolutely tons of other lumber forest that isn't part of the Cedar River watershed, and it is still harvested today. We don't need to be logging in our drinking water watershed.
While I agree that we shouldn't be logging in Washington, is the answer really that we should just pay someone else to log in their watershed instead?
We need to not be logging, period. There's a huge difference in selectively felling trees locally and commercial logging. The problem is that we have collectively grown so accustomed to immediate gratification and the appearance of unlimited resources that we've completely disconnected from how the world really works.
If we really want to fix anything meaningful it's going to take people realizing that cheap energy from coal and oil, combined with paying someone else to deal with the immediate ecological damages caused, aren't sustainable approaches to living here.
I did not say we shouldn't be logging in Washington -- just not in drinking water reservoirs (which we don't). DNR managed logging / the Campbell Global Snoqualmie tree farm seem like mostly a success.
Do you know how much logging they actually do there? I haven't kept up with that project at all and can't find any recent data.
I know when they were first proposing the project the state was going to limit them to a couple hundred acre clear cuts and Campbell had their own limit at less than that. Unless that number increased dramatically, I'd say the project is a success mainly because they just aren't logging a meaningful amount of timber at all.
Someone actually just clear cut a few hundred acres down the street from me before the locals ran the investor out of town. It's terrible to see it cleared, and it's basically just a massive, open, festering wound now, but at the end of the day a few hundred acres of timber is a drop in the bucket relative to what we actually consume.
Trees effectively behave as a buffer for the water cycle. Under conditions of high rainfall, they absorb a lot of water from the ground. Under conditions of low rainfall, they nonetheless continue to release moisture via evapotranspiration, which promotes the production of rain in dry conditions. This means that forested locations are more resistant to both floods and droughts.
Tree roots also greatly reduce erosion, which means that rainfall is more likely to end up in a few well-established waterways and less likely to be spread across numerous tiny rivulets; it also means that waterways are less likely to change shape over time. Large, stable waterways are much easier to collect water from (e.g., via dammed reservoirs) than small or unstable waterways.
However, I'm not sure how much of a factor erosion is here. I suspect it will only directly impact small communities that rely on minor waterways for drinking water. (Mind you, when small communities connect up to city water because their existing water supply becomes unreliable, that can affect the city's water supply.)
Sure, but this region is still heavily forested -- just with second-growth forest (trees ~100 years old and not older). Is there a significant difference between old growth and second growth for this purpose?
From the recreational perspective (summer hiking / back-country skiing), the forests are a night and day difference. Having a map of what has been logged and what has not is often the difference between forests that are easy to travel through and ones that are harder. A forest that was clear cut will have trees that are much denser and more tightly packed together but tend to be smaller in diameter. These are very hard to travel through (hiking or skiing) compared with the old-growth forests.
I don't know a ton about forest ecology but my sense is that trees that do the best are a function of what's already there and that it takes much longer than 100 years for the pre-clear cut conditions to return.
The Cedar River Watershed is not open to recreational use (it's a protected pristine watershed that supplies drinking water to Seattle and surrounding suburbs). The difference for that use is not relevant here.
A forest with larger trees and more extensive root systems will have a stronger effect than a forest with smaller trees with less extensive root systems.
Different tree species can also be more or less effective at functioning as a buffer. "Thirsty" trees typically do a better job of taking up water when it's wet and continuing to release water when it's dry. (Unfortunately, many new tree plantings favor more drought-resistant trees because they are easier to grow in clearcut fields, which are drier than forests.)
So, because I don't know much about this region specifically, these are my two questions:
Has there been a change in tree biomass in the region?
Has there been a change in the tree species makeup of the region?
They are most likely referring to the Cedar River Watershed, which supplies 70% of Seattle's water. While the city spent the 20th century buying up all the land so that it is now a protected wilderness area, plenty of logging happened during that time and less than a fifth of the old-growth forest remains. You can read all about it here:
Specifically, here is the forest management plan, which goes into great detail about the current conditions, their effects on the water cycle, and the long term objectives:
I spent a few years in my 20s volunteering for ecological restoration projects in the watershed. We dug up old logging roads, removed invasive species (Japanese knotweed, ugh!), deconstructed landscaping left over from abandoned small towns, did erosion control along creeks in logged areas, restored riverside habitats, and planted lots of native trees and shrubs. I am not in touch with the organization anymore, but I'll always feel some pride in the work we did and a sense of connection to the place.
Yes, I assumed they were talking about the Cedar River Watershed (Chester Morse lake basin).
> Specifically, here is the forest management plan, which goes into great detail about the current conditions, their effects on the water cycle, and the long term objectives:
This is a 130 page document. The first few pages mentioning old-growth forest are mostly discussing habitat for fauna. Is there a more specific part of the document discussing hydrological impact?
> I spent a few years in my 20s volunteering for ecological restoration projects in the watershed. We dug up old logging roads, removed invasive species (Japanese knotweed, ugh!), deconstructed landscaping left over from abandoned small towns, did erosion control along creeks in logged areas, restored riverside habitats, and planted lots of native trees and shrubs. I am not in touch with the organization anymore, but I'll always feel some pride in the work we did and a sense of connection to the place.
Very cool! My only connection to this is that my mom worked for SPU in drinking water.
What size should a parking spot be? Who should pay for a larger one?
Over the last several years, it feels like large vehicles have become more common while most built infrastructure remains the same. I notice this at the city-owned parking lot a couple of blocks from my house - the whole thing becomes very difficult to navigate given the length and width of some modern pickup trucks/SUVs. It sounds like it's not a big enough issue in this parking lot yet (perhaps because it's only one large vehicle), but it would become an issue if everyone were driving a Cybertruck there.
Should parking spots in parking lots be made bigger? This would probably mean fewer spots / more expensive ones. Should cars be split into "standard" and "XL" classes? XL spaces, or lots that support XL cars, could be priced accordingly. People purchasing vehicles would then have a better sense of "oh, this car won't be allowed in many parking spots."
I disagree with this approach. Forced compliance by the “stick” approach I think never wins against a compelling “carrot” approach. Some people are reflexively defiant to what they see as arbitrary punishments for personal choices. Incentivizing smaller and more efficient cars is better. This is an electric truck, it’s still big, but would the net positives of someone buying an electric truck be better than a gas one regardless of size?
Not really. It’s not just the massively inefficient ICE drive train that makes large trucks dangerous and a poor fit for sharing roads with smaller cars or bikes.
It’s also the ponderous size the lack of visibility and sheer weight.
Cybertruck has all the rest of the problems. It’s in fact bigger and heavier than most trucks.
On a related note, many economists argue public free parking should not exist. People should have to pay the real cost of providing it, which includes maintenance, but also the opportunity cost of not using that land for something else. Therefore larger spaces should cost more.
Even "standard" cars have become bigger in past decades. In many countries the norms of width and length of parking spots have been updated. So it is not just pickups, but pretty much all cars outside very few models like maybe Smart.