Working my way up to calculus for the first time in my life, at nearly 40 years old. I've always hated math :(
When I was a kid, they always told me math would be super useful, especially if I liked computers. Well, 20+ years of a dev career later, I still have never used anything more than basic arithmetic and rudimentary algebra (to calculate responsive component sizes). But with web dev jobs going the way of the horse-drawn wagon, I figured it was time for a career change. Hoping to get into (civil/environmental) engineering instead, but I guess that field actually does use math, lol. We'll see how it goes...
In the meantime, also taking singing classes at the community college, and enjoying THAT way more. We performed at a nursing home a few weeks ago, and that brought SO much joy to the audience there, even though we're just a bunch of amateurs. It's just such a different reception than anything I've ever seen as a dev. Tech rarely inspires such joy.
If I could start all over again, I wish I would've pursued music over computer stuff. Much harder life though!
It sounds like you might enjoy my book on MACH+CALC, which comes with prerequisites included (a summary of the essential parts of high school math required to understand calculus).
Last but not least, I highly recommend you check out the computer algebra system SymPy, which provides functionality for doing all kinds of basic algebra (`solve`, `simplify`, `expand`, etc.). You can use it to invent practice problems for yourself, check answers, etc. It also has useful functions for calculus, which you will need when you get to that. Here is a little tutorial on it:
https://minireference.com/static/tutorials/sympy_tutorial.pd...
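To give a flavor of what that looks like in practice (the specific equations below are my own examples, not from the tutorial):

```python
from sympy import symbols, solve, simplify, expand, factor

x = symbols("x")

# Check an answer: solve a quadratic symbolically
print(solve(x**2 - 5*x + 6, x))                 # [2, 3]

# Verify that two expressions are really the same
print(simplify((x**2 - 1)/(x - 1) - (x + 1)))   # 0

# Expand and factor, to practice algebra in both directions
print(expand((x + 2)*(x - 3)))                  # x**2 - x - 6
print(factor(x**2 - x - 6))
```

Generating a random quadratic, solving it yourself on paper, and then checking with `solve` is a cheap way to create an endless supply of practice problems.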
Always just saw it as a handful of operations and a handful of generalized objects (sets, variables, etc.) that represent data. So knowing those parts is all there is. The contrived compositions used in school to prove ourselves to some teacher just seem like patronizing, helicopter-parenting-type vibes.
Went at most of my math classes with that in mind and didn’t really worry about relating it to past things
I don’t really buy into a notion like “more abstract” or “less abstract.” Everything is its own abstraction.
I dunno. That philosophy has worked for me so far shrug
> Does math have many prerequisites? [...] handful of operations and a handful of generalized objects (sets, variables, etc)
This handful of MATH is precisely what I was referring to as the prerequisites: notation, numbers, sets, equations (and the general procedure for solving them), functions, and a bit of geometry. I agree that with these concepts in place, you can tackle A LOT of topics, possibly filling in other knowledge in a just-in-time manner. This is what I find very fascinating (and a big part of my startup's mission): the fact that (re)learning a handful of math topics opens so many doors, that currently a vast number of adults are keeping firmly shut because they believe they are not "math people."
* Sorry for the confusion: there is a typo in my comment, I meant to write that my book is on MECH+CALC, and the prerequisites are the handful of MATH topics you listed.
Have you heard about Stokes' theorem? Or Fourier transforms? Those aren't particularly high-level math concepts, but they have years' worth of math prerequisites. Without those prerequisites there is no way to understand such things properly, and there are things that are way deeper, with more prerequisites than those.
And those aren't some arbitrary equations, they are core concepts that are very useful in so many situations. There is a reason those are taught to most engineers.
I highly recommend the online book "Calculus at Moravian University"; it does a pretty good job of building a fundamental understanding of calculus from the basics up to multiple integration and vector calculus. I also used Joplin to take notes while reading; it's a notebook app where you can also use LaTeX. (Admittedly, many people won't like reading and taking notes on the phone, but I like it since it gives me a chance to learn even when I can't easily use books and a laptop.)
I'm taking a class at the local community college. It's OK, just some class notes and an e-textbook that I never bother to read. The pedagogy is much as I remember it, the teacher going through the problems on a whiteboard and expecting everyone to rote-memorize things.
But mostly, I'm learning from ChatGPT. You can enter (or take a picture of) any problem and ask it to break it down step by step and it does that very well, and explains it better than most resources I've found. There are some OK YouTube or KhanAcademy videos too, but overall I prefer ChatGPT for its higher signal to noise ratio.
At home I'll usually ask ChatGPT to explain the first one of a problem type, then try to do it again on my own and double-check it against the posted answers. For subsequent problems, I'll do it myself first (pen and paper or iPad), check it for correctness, and then ask ChatGPT for a breakdown if I screw up. I can usually tell it the mistake I made (e.g., "how come I got X in step Y?") and it can often correctly guess and explain where I went wrong.
Some examples (keep in mind that I'm still working up to calc, so still in pre-calc right now!):
Overall, I find this method of learning math (by rote memorization and parroting) very unsatisfying, and I'm unable to retain most of it in long-term memory. A few days after I learn anything I already forget how to do it. I ended up with an A in the class mostly just cramming the night before + morning of, using ChatGPT and class notes to refresh myself before the tests.
But IMO it's a terrible way to learn (at least for me) and part of why I hate math. I never really learn the whys and wherefores of anything, it's just a bunch of magic shortcuts and black-box algorithms that I have to memorize and re-use without any actual understanding. It's the educational equivalent of solving every problem with someone else's function/library :( I have no idea why anything works the way it does, only that I must remember it and re-use it exactly.
If anyone has a better approach to learning and retaining this stuff, I'm all ears!
Memorization can be a useful part of mathematics, but that's more true of higher level courses where you might expend some effort in memorizing definitions to have them at hand to use in proofs.
But most of the time memorization is a side-effect of just solving lots of problems, which is the most effective way to learn mathematics in the same way that writing programs is the most effective way to learn programming.
I'd highly recommend Cal Newport's books on studying [0] and his older blog posts on effective study habits [1]. Barbara Oakley's Learning How to Learn course on Coursera is also excellent [2].
In your current context, I'd probably just focus on time management and consistently grinding problems in areas you don't find easy with things like Schaum's Problem books and Outlines. You could use software like Anki to schedule review of problems and concepts that you've solved or understood.
If you aren't constrained by time, you could also use the Art of Problem Solving books to rebuild your math foundation at a much deeper level [3].
There's a popular math book called Infinite Powers by Steven Strogatz which is about the story of calculus. It's really good and may help bring the subject to life as opposed to rote learning problems.
I’m in the exact same boat as solardev (non-traditional student, taking precalculus, relying on LLMs, getting good grades but not having any true understanding) and I came here to recommend this same book.
It’s a great supplement because it gives you a chance to understand the ‘why’ of things and not just the ‘how’. The writing style is neither too dry nor too watered-down. It feels like the piece that was missing from K-12 education.
On a side note, I’ve found the LLMs to be terrible at math, but insanely good at writing LaTeX. I’ve been using GitHub Copilot to speed up the rewriting of my class notes and I’m just gobsmacked at how accurately it can print out the steps to some calculation after feeding in the original problem.
Pre-calc mathematics can be a bit boring because it consists mostly of a lot of technicalities that aren't themselves super interesting. A lot of those technicalities are connected in ways that unfortunately become only apparent in much higher-level maths. So I can understand that it's a bit of a pain, but I also think that it gets better afterwards (at least on the conceptual level).
I'm a bit skeptical about the use of ChatGPT because it can be very off about maths. If it's something it has seen exactly like this in its training materials, it will get it right (e.g. it will spit out the correct quadratic formula). But if you ask it to solve a specific problem that it hasn't seen before with those exact parameters, it might be bogus.
It could still help, but proceed with caution. For example, there's one part where you ask it why it multiplied by 4. Your question is wrong, because it multiplied by -4, but ChatGPT, always eager to please, doesn't correct you on that; instead it says:
"Multiplying by 4 is equivalent to dividing by −0.25 because 1/-0.25 = -4" - which is self-contradictory.
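The contradiction is easy to check with a couple of lines (the value 8 is an arbitrary toy number):

```python
# Dividing by -0.25 is the same as multiplying by -4, not by 4:
x = 8.0
print(x / -0.25)  # -32.0, i.e. x * -4
print(x * 4)      # 32.0, a different number entirely
```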
There are also other services you could use for this, such as symbolab (which is rule-based), but I think it needs a subscription to see all the steps.
I would recommend actually reading the textbook. Or, if it's a boring textbook, try out other textbooks. People find different kinds of explanations intuitive / different styles of exposition engaging, so you can experiment. I think it would make it easier for you to retain the material because you would learn some of the why.
In mathematics, it's often a better strategy to understand something and know why it's true than to just memorise the result.
The LLMs completely changed education for me. Whether it's math or music theory or programming, I'd rank ChatGPT amongst the best teachers I've ever known. (There are amazing real-human ones too, but it's really hit and miss!). For $20/mo it's totally worth it, way cheaper than tutoring or buying more textbooks. But for infrequent use, probably the free plan is enough...?
I tried MIT OCW but found the videos too long and tedious... it's hard for me to just sit still and listen to videos like that for hours =/ I chose an in-person math class on purpose just to have that real community feel and a live teacher, but YMMV... probably many would prefer the online or async versions instead.
Hi! I'm probably not your target audience here, unless you're specifically developing a product that targets math education for midlife washouts, lol. But if you want some rando dude's opinion, sure, I'd be happy to jump into a call with ya. Just let me know and I'll find a slot on your Calendly.
But I think you'd get better feedback from someone who's actually in your target audience/user demographic :) I'm just an annoying grumpy gramps who hates math.
I'm actually building a tool in this space. Would you be open to jump on a 15 mins call with me and help shape the product? https://calendly.com/vel-yan/15min
There is a book, “How to Ace Calculus: The Streetwise Guide,” that sounds a little gimmicky but is actually a bit entertaining to read. Looks like it can be had on ThriftBooks or eBay for about $5. It helped me understand some of the concepts at a deeper level than math textbooks did. The problem with math textbooks is that they're mostly written by mathematicians, who seem to think differently than the rest of us, so when they try to explain things it's in a mathematician's way.
IMO taking calculus later in life is probably easier for most (study of 1) people, assuming you have seen some of it before. I took calculus at 19 years old and passed, but I did not have any idea what it was really used for. Taking it again during grad school (mid 30s), calculus made way more sense. Ideas would click, mostly not CS-related but more in the general world view (think washers for nuts and bolts – area under a curve). I will be the first to admit that making front ends and tuning a DB will almost never need any calculus (but it does happen), and I admit I did horribly in high school but turned it around in college (I was behind when I got there the first time, for sure). How much do I remember right this second? Not much. But I do feel it's worth doing, if only as a rite of passage to say we deserve that 6-figure salary and maybe not feel too bad about calling ourselves engineers.
Good luck with the singing lessons and new job, but if you're married (or have children) do NOT quit that day job just yet! If not, well hell, go for it my friend!!!
Lol, I hope I get to feel that way in hindsight! Right now it's just the struggle and the grind... math feels like a bad Asian MMO where you just run around crunching numbers to make other numbers go up slowly. And if you die/miss an assignment, you have to go back a few levels and do it all over again. Sigh.
> that 6-figure salary and maybe not feel too bad about calling ourselves engineers
Hah! I've only ever briefly made a 6-figure salary, and I quit that job because of its bureaucracy. I'm just some rando dev and I would NEVER call myself an engineer (even though my job title sometimes says that). It's an insult to all the REAL ones who actually survived the math, lol.
(Frontend) web dev has always seemed to me more similar to graphic design than anything I would call "engineering": it's pretty-looking shiny stuff backed by spaghetti code, not anything I'd ever trust in life-and-death scenarios like dams or bridges.
Anyway, that's not the point :)
> well hell go for it my friend!!!
Hah. This is the upside of a minimal DINK life. We don't have much money (at all), but we have fun!
ChatGPT is proficient enough at writing web-dev CRUD app code, which leads some to believe it'll make that particular skill set obsolete and collapse the job market and pay for it, much like the skills of a horse-drawn wagon driver or a buggy-whip manufacturer.
How to communicate with people and get through negotiations and all of those tiny little conversations that happen day to day in our personal and professional lives with more success.
Started with Difficult Conversations last year and it was a total game changer. It has been instrumental in my professional and personal life. If I were to share two key points with anyone, they'd be: remember to listen, and remember that you, too, have always contributed to the problem.
Working through Getting to Yes by the same group of folks now and it is just as great. A bit more high level but I plan to dive back into more specific areas afterwards and read through the just recently updated version of Difficult Conversations.
Want to hit Supercommunicators and Crucial Conversations later. I've decided that so many things break down, big and small, because of these seemingly small but ultimately important conversations. Always be soft on the people and hard on the problem.
Relearning statistics with a Bayesian approach. My undergrad education was in social science research methods and I spent 4 years learning the strict frequentist approach of orthodox statistics. It made sense for highly-controlled experiments and simple dice games. But it broke down horribly when faced with any complexity and I never understood why. 25 years later, it's time to fill in the gaps. My reading list:
- E. T. Jaynes, "Probability Theory: The Logic of Science" provides the fundamental theory.
- Richard McElreath, "Statistical Rethinking" provides a practical application of the theory in R.
- Aubrey Clayton's "Bernoulli's Fallacy" is a non-technical book that provides historical context to the frequentist vs. Bayesian debate.
I'm fairly convinced now that Bayesian approaches have more mathematical rigor than the crusty old heuristics of traditional statistics. But in terms of user experience, doing Bayesian calculations still requires more effort on model design and more compute power. It's flexible to a fault, without a well-defined workflow. There is a strong temptation to follow the easy path: shove your data into a black box and publish if p<0.05. It's going to take a generation of (re)training and improvements to statistical software before Bayesian methods are widespread.
+1 for the Statistical Rethinking book by McElreath. I recently took the 2024 edition of the self-paced course https://github.com/rmcelreath/stat_rethinking_2024 and the lectures on youtube are amazing. He's not just teaching stats, but how to do science!
Another, more basic, book on Bayesian stats is: https://allendowney.github.io/ThinkBayes2/ The author uses grid approximation for everything, so there is no need to get into Stan or another framework.
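To illustrate the grid-approximation idea with a toy example (a coin-flip model of my own, not from the book), a full Bayesian update fits in a few lines of plain Python:

```python
# Grid-approximation Bayesian update for a coin's bias p,
# after observing 7 heads in 10 flips (toy numbers).
n_grid = 1001
grid = [i / (n_grid - 1) for i in range(n_grid)]

prior = [1.0] * n_grid                                  # uniform prior over p
heads, tails = 7, 3
likelihood = [p**heads * (1 - p)**tails for p in grid]  # binomial kernel

unnorm = [pr * lk for pr, lk in zip(prior, likelihood)]
total = sum(unnorm)
posterior = [u / total for u in unnorm]                 # normalize to sum to 1

post_mean = sum(p * w for p, w in zip(grid, posterior))
print(round(post_mean, 3))  # close to the exact Beta(8, 4) posterior mean, 2/3
```

The same pattern (grid, prior, likelihood, normalize) carries through most of the book's early chapters before any sampler is needed.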
Myself, I'm still trying to (re)learn frequentist stuff properly (will post a separate comment about that), but the deeper I go the more convinced I am that it is total crap, and my desire to convert to the church of Bayes increases...
I'm learning how to build an MMORPG. Not a complete game, but a project with all the architecture in place to expand on (think Asheron's Call quality).
I started programming because I wanted to work on Guild Wars 1 back in the day, but that didn't work out and I ended up as a web developer (although with some gamedev experience). I've always thought you can't make an MMO alone and so never tried.
Recently, a combination of health problems that scared me quite a bit, and seeing other people on YouTube tackle these kinds of projects, has motivated me to fulfill my childhood dream of learning the tech behind MMO games.
My goal is still to work on an MMO professionally one day, but if that never works out, at least I worked on an MMO. My own.
Apropos of the other HN article about the elderly mathematician who credits his success to studying the simple things until he understands them really, really well, I'm practicing drawing boxes in any/every orientation in 3D space. This includes drawing two boxes connected by a common edge: imagine a box with a lid the same size as the box itself, slightly opened.
For me this is profoundly difficult to visualize. I've taken to learning the basics of Blender just so I can create these various boxes accurately to use as reference material. It's been slow going but the progress is tangible and the process is fun.
Technical Drawing was a core required course for first and second year Engineering students (Electrical, Civil, Mechanical, Electronic) in Commonwealth countries in the 1980s and for decades before.
Descriptive Geometry was a major component of Tech Drawing; everybody in the pre-CAD age (and still, I'd say) needed a good grounding in blueprints, schematics, and "constructive drawing" good enough to take measurements from during building.
Wow, I've been perusing books on drawing/sketching and scouring videos and blog posts for years, and none of them mentioned "descriptive geometry". Perhaps they did and I just missed it, but seems unlikely given how much I retread materials.
I started (re)learning this subject in preparation for a new book, thinking all I had to do was review what I studied in my university days, and summarize the essential ideas, but it turns out statistics is A LOT more complicated than that. It's like a black hole that you can never get out of. There is lots of historical baggage, strong opinions, unjustified rules of thumb, etc. I've been in it for 5+ years now! As a physicist, I want to understand how things work under the hood, so I can explain to readers the underlying mechanisms and not just give formulas without explanations, and this has been very hard to do. The whole thing is well summarized by this quote from Richard McElreath "Statistics is a horrible thing."
There is hope though, in recent years teaching frequentist statistics is moving toward simulation based methods, a.k.a. the modern statistics curriculum, which makes a lot of sense. Here is a blog post about that: https://minireference.com/blog/fixing-the-statistics-curricu...
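For a flavor of the simulation-based style: a bootstrap confidence interval replaces the classic formula with brute-force resampling. A minimal sketch (the data values are made up for illustration):

```python
import random
import statistics

random.seed(42)

# Toy sample: instead of a formula-based CI, resample the data itself
data = [12.1, 9.8, 11.4, 10.2, 13.0, 9.5, 10.8, 11.9, 10.1, 12.4]

boot_means = []
for _ in range(10_000):
    resample = random.choices(data, k=len(data))   # sample with replacement
    boot_means.append(statistics.mean(resample))

boot_means.sort()
lo, hi = boot_means[249], boot_means[9749]         # central 95% of bootstrap means
print(f"95% CI for the mean: ({lo:.2f}, {hi:.2f})")
```

The procedure is the explanation: no t-tables, no asymptotic arguments, just "what would the statistic look like if we could re-run the sampling?"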
I'm a senior Android developer, and I'm learning the very basics of Jetpack Compose, and the basics of Dagger for dependency injection. All of the (very large) codebases I've worked on so far have functioned perfectly well without them, and I've always found Dagger, and now Compose, to be completely unintuitive and unnecessarily complex. But this is the way the industry has shifted, so I'm forcing myself to learn these patterns, and see if I can learn to like them.
Also learning to get better on the piano, specifically improvising.
I'm learning LLMs at the moment, RAG in particular. [1]
I ended up building InkChatGPT as my learning project, and it was huge fun. It is an AI agent that can help you learn from multiple documents, and you can chat with it (think Chat PDF GPT).
I use LangChain as the LLM framework to simplify the backend, and Streamlit for the front-end UI and deployment. I use the OpenAI `gpt-3.5-turbo` model, and HuggingFace embeddings (the `all-MiniLM-L6-v2` model) to generate embeddings for the document chunks.
To be honest, coming from a mobile development background, learning about ML and reading about LLM models, prompt tuning, and various techniques really opened my mind, but the information is vast and knowing how to start is difficult.
Thank you for your compliment, I was just wandering and tinkering all weekend. Recently I've been updating the app UI and learning more about how to improve the app.
I plan to pick up more on LLM topics and will start experimenting with different models.
More physics and mathematics. I am currently on Lagrangian mechanics and the calculus of variations. I'm also learning Jupyter and SymPy for visualizations of the same.
In high school, only Newtonian mechanics was taught, whereas in engineering college they introduced quantum mechanics with the Lagrangian/Hamiltonian formulation, skipping the classical mechanics treatment of those two.
Self-learning via books. I've bought a bunch. Am currently going through Susskind's Theoretical Minimum Classical Mechanics [1], also looking through No Nonsense Classical Mechanics [2].
AI chatbots sometimes come to the rescue when I am stuck. Not with mathematics, though.
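SymPy can actually do the variational bookkeeping too. A small sketch using the standard harmonic-oscillator example (my own, not from Susskind's book), via `sympy.calculus.euler.euler_equations`:

```python
from sympy import symbols, Function, diff, simplify, Derivative
from sympy.calculus.euler import euler_equations

t, m, k = symbols("t m k", positive=True)
x = Function("x")

# Lagrangian of a 1-D harmonic oscillator: L = T - V
L = m / 2 * diff(x(t), t) ** 2 - k / 2 * x(t) ** 2

# The Euler-Lagrange equation should reduce to m*x'' = -k*x (Hooke's law)
eom = euler_equations(L, [x(t)], t)[0]
print(eom)
```

Handy for checking the sign errors that inevitably creep in when you grind through d/dt(∂L/∂q̇) by hand.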
Dart/Flutter currently, for cross-platform application development. I'm not averse to learning native languages or the various web frameworks; I just wanted to have one codebase across multiple platforms. I discounted the web frameworks because, as a novice, I didn't know where, or with what, to begin; I didn't want to make the wrong choice.
Other than that I'm learning PowerShell too, mostly forced to for work, but it's actually not a bad language.
As someone newly diagnosed with ADHD (and likely/suggested, but not formally-diagnosed ASD), I'm digging into both to understand how my experience differs from others. This includes things like strengths, weaknesses, the usual stuff, but I'm interested in the common ground between "neurodiverse" and "neurotypical" brains (talking in general terms here) as well as how both "sides" can better work together. Use strengths to fill in the weaknesses, that sort of thing. Eventually I may start writing some of this down more formally, but as of right now, I'm just a guy trying to understand his own brain.
Nix for developer environment and building containers.
I’m wondering if it’s worth it to introduce to the rest of the company. We’re pretty comfortable building/“maintaining” ~400 container images, and it’s relatively fast (~3-5 min build time if no packages are changed). But there are a lot of shared dependencies between all these container images, and using Nix to get actually reproducible AND minimal container images, with a shared cache of Linux and language-specific packages (dotnet, node, python and R), would bring in a ton of efficiency, as well as a very consistent development environment. However, I won’t force all the developers to learn Nix, so the complexity should optimally be absorbed into some higher level of abstraction, like an internal CLI tool.
I’m aware that the caching of dependencies can be improved, as can the creation of more minimal container images, but it’s tricky with R and Python in particular, so I figured why not just go balls-deep on Nix, which actually solves these issues, albeit at a cost of complexity.
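For reference, a minimal sketch of what one of those images could look like via nixpkgs' `dockerTools` (the image name and contents here are hypothetical):

```nix
# Hypothetical sketch: a layered, reproducible container image.
# Shared dependencies (here R and Python) land in their own layers,
# so they are cached and reused across many images.
{ pkgs ? import <nixpkgs> { } }:

pkgs.dockerTools.buildLayeredImage {
  name = "internal/analytics-service";  # hypothetical name
  tag = "latest";
  contents = [ pkgs.R pkgs.python3 ];
  config.Cmd = [ "${pkgs.python3}/bin/python" "main.py" ];
}
```

`buildLayeredImage` puts store paths into separate layers, which is what gives you the shared cache across images that a plain Dockerfile build doesn't.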
Hey!
I love Nix, and I've been using it as my daily driver for more than 1 year.
There are a lot of people putting a lot of energy into documenting and explaining, but the current recommendation is still _suffer_.
For Docker, you could start here in this HN thread [1]; for NixOS and flakes there is a video series and a git repo [2] I used at the beginning, which I liked.
I wanted something a bit more 'complete', so if you will you can read through my nix repo [3].
I have built Poetry and Go applications (that I'll push to nixpkgs at some point) and GCP images for VMs, so if you need a reference for that, just ask :)
I'm looking forward to making a few sculptures that we can also use in our garden, as well as some Frankenbikes, though that might be a longer-term project - I don't want to spend money on it, so I'm hoping to source all of my parts from the trash - so far, I'm up to four frames, and my marriage is still intact!
Be sure to check out Atomiczombie.com, if you have not already. Definitely THE resource for frankenbike/recumbents. I've built two recumbents. One directly from their plans (the Meridian) and another one of my own design, a trike based on the Meridian. Bike hacking is a lot of fun!
Analog synthesis. Using MiRack on my iPad mostly. Yes, that is a digital emulation of analog circuits I know but it is a great learning environment. Trying to get my head around feedback patches and self sustaining/generating patches. Saw a few videos of a guy using a Serge system to explain ideas with Cybernetics and it has really piqued my interest.
I currently own a Lyra-8 synth and it is teaching me… something about FM synthesis and feedback. I also have a Vanilla synth from STG/Musonics on order, and I hope to be up to speed on the basics before I get it in June. I will at some point probably get a Serge system, because it seems the most amenable to analog explorations at their most basic.
There's certainly a lot of interesting things happening with Rust but I'm one or two problems away from deciding that Rust isn't a viable replacement for the situations where I use C.
For me learning Rust isn't about "OMG this is my #forever-language", it's more about presenting me with a new way of thinking about and through problems. I code 99% in Python at my day job and when I made the effort/struggle to learn Clojure a few years ago it fundamentally altered how I think about problems. This is the same result I'm looking for from Rust. Basically just adding more tools to the mental toolbox.
It will not be as big a change in thinking as Clojure, I am quite sure, but the type system usage is where the change will come from. That, and perhaps the borrow checker.
I suspect if you're looking at C++ as a viable language for the problems you're solving, you're solving quite different problems from me, so take this with a grain of salt.
What I wanted from Rust was basically a strongly-typed C and more modern tooling. I'm open to more checking than that, depending on what cost it has.
What I'm getting is strong typing and more modern tooling--cargo, for example, is excellent. But the borrow checker is extremely invasive, and `unsafe` doesn't actually make it less invasive. There are a number of cases where I still haven't figured out how to get the compiler to stop complaining without adding a `.clone()`. Borrow semantics are... pretty good, as the checker is basically enforcing a pattern I use frequently anyway, but the exceptions to that pattern are critical, and working around the borrow checker causes more problems than it solves. It's possible I just don't know how to make "the Rust way" workable, but if that's the case I am having a hard time finding out what "the Rust way" is for the things I'm trying to do.
That said, I can see how Rust is a great tool for what it was originally intended for (writing a browser). That's just not what I'm doing.
Also, unsafe is not meant to ease the borrow checker pains, that's like using void* everywhere in C because you don't know how to type a function pointer. Unsafe is meant for places where rust simply doesn't know better, like reading memory mapped registers.
I have given it a very cursory look, but ultimately it doesn't fit my needs for reasons unrelated to the features of the language. Specifically, I'm releasing my C work under the GPL and would not like to write off the possibility of integrating into GNU. Zig doesn't have a GPL implementation (Rust has one which is committed to reaching maturity).
> Also, unsafe is not meant to ease the borrow checker pains, that's like using void* everywhere in C because you don't know how to type a function pointer. Unsafe is meant for places where rust simply doesn't know better, like reading memory mapped registers.
Eh, one could argue that some of these "Rust simply doesn't know better" situations are often borrow checker pains. But in general, I'd agree that bypassing the borrow checker doesn't seem to be the point of `unsafe` in my limited experience.
But the bigger point I'm trying to make is that there doesn't appear to be any solution to some of the borrow checker issues I've run into. There are things you can do in C, which are strongly related to the reasons I'm using these low-level languages, that appear to be impossible in Rust.
Keep in mind the caveat that I'm new to Rust, so there may be some solution I'm just not aware of.
Yeah, I think I understand the sort of things you are talking about. Often we have a "proof" that our code is safe in C. Maybe you are sending a pointer to a thread and then want to read from it at the end of your main thread. _You_ know it is safe because you made an informal contract about when that other thread stops using that pointer, but to Rust, informal is not enough. The hard part is knowing how to formalize all of those contracts. Send, Sync, 'static: all of those are a real pain to understand and know when to use correctly, but when you do, you are formalizing those contracts. Now you don't just think your contract is upheld, it is _proven_ by the compiler.
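A minimal sketch of one such formalized contract, using `std::thread::scope` (stable since Rust 1.63): the scope's lifetime is what turns the C-style informal "I know the thread is done before I read this" into something the compiler checks.

```rust
use std::thread;

fn main() {
    let data = vec![1, 2, 3, 4];

    // thread::scope proves to the compiler that the borrow of `data`
    // cannot outlive the spawned thread: the scope does not return
    // until every thread spawned inside it has joined.
    let sum: i32 = thread::scope(|s| {
        let handle = s.spawn(|| data.iter().sum::<i32>());
        handle.join().unwrap()
    });

    // `data` is still usable here -- the scope guarantees the thread ended.
    println!("sum = {}, len = {}", sum, data.len());
}
```

Before scoped threads, the same pattern forced either an `Arc` or `'static` bounds; the scope API is the formalization of the "informal contract" for exactly this case.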
This is a bit dated, but if you are interested in general electronics (rather than digital electronics), you can try https://repairfaq.org/.
Digital electronics is in many ways much easier, but the analog part tends to sneak in everywhere, especially when you find your digital stuff isn't working. I think it's hard to design anything new, even digitally, without at least a working knowledge of how Ohm's law, LRC circuits, transistors, diodes, op-amps, etc. work.
I don't think it was this course, but I recall checking out an impressively thick book meant for US Navy technicians. The advantage there is the assumption (unlike college textbooks) that you are intelligent but not college-educated, and that you need to know how to work on nearly anything electronic. Again, the focus is mostly the analog side. https://archive.org/details/NEETSModule01/mode/2up
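Since the analog basics like Ohm's law and LRC behavior come up above: a tiny numeric sketch of the classic RC charging curve is an easy first hands-on computation (component values made up for illustration):

```python
import math

# Toy values: a 1 kOhm resistor charging a 100 uF capacitor from 5 V
R = 1_000        # ohms
C = 100e-6       # farads
V_in = 5.0       # volts

tau = R * C      # RC time constant: 0.1 s here

def v_cap(t):
    """Capacitor voltage while charging: V(t) = V_in * (1 - e^(-t/RC))."""
    return V_in * (1 - math.exp(-t / tau))

# After one time constant the cap reaches ~63.2% of the supply voltage
print(round(v_cap(tau) / V_in, 3))  # 0.632
```

Plotting `v_cap` over a few time constants is a nice sanity check against what a scope shows on a real breadboarded RC circuit.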
Just the github repo at the moment, specifically the "How to make the device" page[0]. I had to search on youtube to learn what a breadboard does. I watched this one[1].
Do you (or anyone else) have any suggestions for hands-on resources? I have Practical Electronics for Inventors, but you don't build anything until Chapter 7 (Page 551). I learn better when I'm making things.
Jazz Guitar, after spending years listening to jazz, decided to start down the path of learning to play it in my thirties. I already played some guitar, but over the past year, I've learned so many things about theory, chord construction, inversions, phrasing, time feel, and have been doing a concerted effort in improving my aural skills to lift other players' music by ear and transcribe their playing.
I did find that the steps I took to get to intermediate all helped immensely: learning triad shapes all across the neck, learning all the notes on the neck, learning the basics of theory for chord construction, and learning how to build chords on the guitar neck. Then focusing on time and keeping to the forms of songs.
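The chord-construction part is mechanical enough to sketch in a few lines. This is just an illustration of stacking thirds in semitones (sharps-only note spelling, which is a simplification; real chord spelling cares about enharmonics):

```python
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def triad(root, quality="major"):
    """Build a triad by stacking thirds: major = root + 4 + 3 semitones,
    minor = root + 3 + 4 semitones (both end 7 semitones above the root)."""
    i = NOTES.index(root)
    third = 4 if quality == "major" else 3
    return [NOTES[i % 12], NOTES[(i + third) % 12], NOTES[(i + 7) % 12]]

print(triad("C"))           # ['C', 'E', 'G']
print(triad("A", "minor"))  # ['A', 'C', 'E']
```

Seventh chords extend the same pattern with one more stacked third, which is why learning the construction rules beats memorizing shapes.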
Like any completely new area, I'm exploring my options, connections, and hacks that will help me skip the muddy parts quicker. Mostly clueless and anxious atm, but it's been only a couple of months since the idea became an intent. The whole last year I was doing nothing, recovering from burnout.
The wide world of ESP32 and bringing an IoT product to market. Espressif, the manufacturer of the ESP32, is a leader in the embedded category, offering a robust SDK and extensive documentation, yet there are many footguns, and their documentation can be misleading, creating a lot of friction in the process of productionizing a device. There's so much involved beyond firmware development.
Hardware production is wonderful/terrible and constant learning. A lot of people get tripped up not understanding that the manufacturing process is a product equal to your actual product.
Hit me up if I can help. I've brought a half-dozen devices to market; right now I'm bringing a modular data logger to production and blogging about it at https://blog.supermechanical.com
Surprised nobody has mentioned reinforcement learning here.
Bought three books (in their traditional Chinese editions), whose original titles are:
* Reinforcement Learning 2nd, Richard S. Sutton & Andrew G. Barto
* Deep Reinforcement Learning in Action, Alexander Zai & Brandon Brown
* AlphaZero 深層学習・強化学習・探索 人工知能プログラミング実践入門 (roughly: "AlphaZero: Deep Learning, Reinforcement Learning, and Search, a practical introduction to AI programming"), Hidekazu Furukawa (布留川英一)
None of them teaches you how to apply RL libraries. The first is a textbook and says nothing about how to use frameworks at all. The last two are more practice oriented, but their examples are too trivial compared to a full board game, even one whose rule set is simple for humans.
Since my goal is eventually to conquer a boardgame with an RL agent that is trained at home (hopefully), I would say that the 3rd book is the most helpful one.
But so far my progress has been stuck for a while, because obviously I can only keep trying hyperparameters and network architectures to find which ones are best for the game. I kind of "went back" to supervised learning practice, in which I generated a lot of random play records and then let the NN model at least learn some patterns from them. Still trying...
Do you have any recommendations for learning resources for marketing/advertising/SEO? I want to get into that, but it's a very noisy domain with lots of "experts" selling their knowledge, so it's difficult to know what is good or bad.
I find Gnome 46 to be very promising (proper font rendering on hidpi displays as well as an actual tray, finally) so I'm learning GTK4 + libadwaita app development.
1. How to administer a secure network for a property with multiple vacation rentals. I bought a bunch of MikroTik components, so I’ll be learning RouterOS and how to set up VLANs.
2. Ray tracing in a weekend but in Rust.
3. How to return to the job market after 4 years of off and on freelance and being a caregiver for sick parent.
And also reading the book on Modern Manufacturing Techniques by Groover.
I work as a Software Engineer but somehow I haven't had the itch to write any personal software or work on side projects for some time now. Looking to expand my toolset and get my creative side going again in a new space.
Are you aware of python first libraries like build123d and cadquery?
build123d has an improved API over cadquery, excellent docs and no dependency on conda. Best tool for modern parametric python CAD.
I am trying to learn Erlang. The idea is to write a CRUD app, a multi-user PWA, completely without a framework. For the front end: HTML, CSS, and JS with reefjs for reactivity (these are things I already know).
The idea is to experiment and stress-test the idea of write once, run forever to the max. See how far I can get. Any resources for Erlang will be much appreciated.
Currently using Exercism and Joe Armstrong's Programming Erlang: Software for a Concurrent World, 2nd edition. I guess beginner topics are covered. I'm more interested in advanced Erlang topics, for which resources seem hard to come by, specifically around security, since I'd want to know how to go about securing the backend in Erlang.
I do intend to use cowboy. I guess I should have been more clear. There are definitely parts which I don't want to reinvent for obvious reasons.
But yeah, product logic and security without using frameworks like Chicago Boss, Nova, or Nitrogen is what I meant when I said no framework. Replacing any crypto parts or critical libraries like cowboy is not the intent.
yeah, you can get pretty far with it. i remember doing a similar experiment where I used just cowboy and plug without a framework (i.e. phoenix) when playing with elixir.
That in Europe you must NEVER be a contractor for those companies living off public tenders, whether they be companies winning contracts for wastewater plants or companies producing "research"
What are you using to review/relearn basic math? I'm asking because this is the space I'm working in, so I wonder what "products" people are hiring to do this "job."
Tbh I don't have any specific path for how to do it right now, as I started doing this just recently. Currently it's something like: "oh, I don't remember how to do this, I'll check it on wikipedia/yt/<some random page found in google>". But this way I can't relearn things that I've simply forgotten exist.
I'm building a tool that helps with self-directed learning. Would you be keen to jump on a 15 mins call with me and help shape the product? https://calendly.com/vel-yan/15min
I'm learning PCB design, and embedded Rust to go along with it. Building a flight computer for an RC plane (I've been working on laying out and programming the computer for almost a year now -- haven't even started the airframe.)
It's been very fun. I keep saying that I'm going to do a blog series about it, once I get it in an MVP stage.
The state of embedded Rust is also progressing very quickly underneath me, which is nice. Writing drivers is much easier now than it was at the beginning of the project.
Sick! I’m an embedded software engineer, but I want to grow my PCB design skills as well. Can you suggest some resources on this topic that you’re finding most valuable?
I'm in embedded also; I started this journey at work doing board bring-up. I wrote drivers mostly, not doing any direct electrical work, but after reading the schematics I thought, "oh wow, so that's how they do schematics professionally," followed by "wait, I can do that!" Before that, I had just done a few small things with Arduino.
The YouTube channel Phil's Lab has a really good series [1] on PCB design focused on STM32-series chips; that is where I started.
I also got a lot of good information in this thread [2]
And I got feedback on one of my designs on the PrintedCircuitBoard reddit, which is very active and full of amateur and pro-am PCB designers getting feedback. You can post a schematic or layout and people will leave high quality critiques.
For vendors, I have had 2 runs made with JLCPCB (with each run having design flaws of mine; some bad, some not too bad; my board is fairly complex/ambitious, so that's part of it). I have heard PCBWay is similarly cheap and good. No fab quality issues, but I've kept traces, spacing, and components on the larger side.
I'm not good at PCB design but am having a blast with it.
Real Analysis. I know Baby Rudin is probably the gold standard, but Jay Cummings' Proofs and Real Analysis books have been great for people like me delving deeper into mathematics.
Swift and iOS development! I've been lucky enough to spend a lot of my career being able to get my hands dirty with a bunch of different projects and at a lot of layers of the stack, but I never got deep into app development.
Now I'm diving in and scratching that itch! It's also been great because I've been able to start looking beyond that to making things for the entire apple ecosystem. It's also just been so good to dive into something without any work pressure there!
Happy to! So my main resource is Hacking With Swift (https://www.hackingwithswift.com), specifically the 100 Days of SwiftUI course. It takes you through Swift and SwiftUI. I've paired this with the official Swift site (https://www.swift.org) so I can dig into the language more, and Apple's documentation where appropriate to get used to the tools.
In terms of finding it, it was a bit of a shot in the dark. I did some poking around and this popped up the most, specifically because I was looking for iOS specific materials. I'm sure if you want to make cross platform apps there's probably a whole host of great resources!
Happy to speak on that! I mostly wanted to dig into something I hadn't done before, with a whole new set of tools to learn. In this case, I had been super interested lately in the Apple ecosystem/platform. I'd spent some of my career in .NET/Windows land and got to see a bunch of stuff there and how it all worked, so I figured why not try out what Apple has to offer.
I also felt like since I have no real intent to try and turn what I'm learning into something that makes money, I could go a little crazy/niche and dig in.
All this being said, if someone asked me about making an app as something they want to release/make money off/turn into a company, I'd fully be pointing them at tooling like React Native.
Piano. I bought a secondhand digital piano, a Roland FP-30, and I use an old iPad with the app Piano Marvel, connected over Bluetooth to get the MIDI output into the app so it can grade me (whether I play the correct note at the correct time). I also use a simple audio mixer to combine the sound from the piano and the app into a single pair of headphones.
I tested pretty much every piano app, and even if it is not the slickest or most efficient, it has the most progressive learning plan of all.
Rust, CUDA and JAX - I'm guessing one or more of these will be in increasing demand with a relatively high barrier of entry compared to other software tech
I have found learning JAX to be rewarding. Of course this depends on the work you do - but if you want to occasionally code out problems which involve some kind of an optimization of a differentiable quantity (without you having to work out complex gradients), JAX is perfect.
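For anyone wondering what that buys you: below is a plain-Python sketch of the loop JAX automates, with the gradient worked out by hand. With JAX you would replace `grad_f` with `jax.grad(f)` and never derive it yourself, which is the whole appeal once the quantity being optimized is complicated.

```python
def f(x):
    return (x - 3.0) ** 2  # toy differentiable quantity to minimize

def grad_f(x):
    return 2.0 * (x - 3.0)  # derived by hand; jax.grad(f) would produce this for you

def minimize(x, lr=0.1, steps=100):
    """Plain gradient descent on f, starting from x."""
    for _ in range(steps):
        x -= lr * grad_f(x)
    return x

print(round(minimize(0.0), 4))  # 3.0, the minimizer of f
```

The function and constants here are made up for illustration; the point is only that the hand-written `grad_f` is the part JAX's autodiff removes.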
Honestly not really, but what I work on is bottlenecked by API calls and by not actually DDoSing our APIs.
The other day I wanted to create a 10GB file with a Perl one-liner, and it was taking over 10 minutes. Doing some napkin math, I figured that should only take ~4s at 1 character per clock cycle, so something was wrong (it was flushing every single character). I tried versions that flushed less and got it down to 7s by printing chunks of 1MB. That's ultimately the goal of the class: get a rough idea of what should be possible.
What I really like about that class, though, is it really makes you realize even assembly is a high-level language now. I had heard that, but it didn't really make sense to me; the course basically has you writing assembly versions that look like they should be doing the same thing (e.g. the same thing but with loop unrolling), and explains what's happening in the background that makes one version twice as fast as the other.
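The flush-per-character trap is easy to reproduce in any language. A small Python sketch of the chunked-write fix (sizes scaled down to 10 MB; timing left out since it varies by machine, but the per-write overhead is what dominates at chunk_size=1):

```python
import os
import tempfile

def make_file(path, total_bytes, chunk_size):
    """Write `total_bytes` of filler using `chunk_size`-byte writes.
    chunk_size=1 mimics the pathological flush-per-character case;
    bigger chunks amortize the per-write (and per-flush) overhead."""
    chunk = b"x" * chunk_size
    with open(path, "wb") as f:
        remaining = total_bytes
        while remaining > 0:
            written = f.write(chunk[:remaining])
            remaining -= written
    return os.path.getsize(path)

path = os.path.join(tempfile.mkdtemp(), "big.bin")
print(make_file(path, 10 * 1024 * 1024, 1024 * 1024))  # 10485760 (10 MB in 1 MB chunks)
```

Wrapping the two extremes in `time.perf_counter()` makes the orders-of-magnitude gap visible immediately.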
I’ve been learning Chinese for a few years. I’d recommend taking the time to learn the radicals as well, as it will help you get the feeling for what the character might mean, to the point that eventually through context and the radicals used it becomes useful to understand what it means even when you don’t actually know the character.
It also makes it easier to learn characters: Yesterday I learned the character 祂 while reading about Buddhism. It means “it”, but when referring to a god, which I could infer from the context, but also from the fact that 他 means “he” (with a person radical 人), and 她 means “her” (with a female radical 女), and I’m familiar with the radical for god/temple which is 礻.
In Chinese this also helps with "guessing" how a character should be pronounced, to a useful extent, but I don't know if that also applies to Japanese.
Serious leetcoding. I don't have a CS background, and after two years of being in DevOps while also interested in SWE, I find sitting down and actually grinding through LeetCode is a good brain exercise. I also find it to be a marathon instead of a short sprint, and I feel like I'm getting smarter every day!
It's a fun journey -- I hope you enjoy it! I never liked math, until I read Abrash's "Zen of Graphics Programming", and learned that graphics is chock-full of math :D
I read Understanding the 4 Rules of Simple Design by Corey Haines yesterday evening.
It's a short book, so you can get through it in one sitting. The advice is mostly about OOP design, and how to shuffle things around and invent arbitrary constraints in order to make the code more testable. I found the advice fairly… underwhelming.
I decided to implement Conway's Game of Life myself (though not with OOP) as the author asserted the advice transcends implementation languages and paradigms.
My implementation isn't amazing either, but I'm not convinced by the advice in the book.
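FWIW, the no-OOP version can be tiny: live cells as a set of coordinates and one pure function per generation. This is my sketch, not a claim about what the book recommends:

```python
from collections import Counter

def step(live):
    """One Game of Life generation; `live` is a set of (x, y) live cells."""
    # Count how many live neighbors every candidate cell has.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Birth on exactly 3 neighbors; survival on 2 or 3.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

blinker = {(0, 1), (1, 1), (2, 1)}  # horizontal bar
print(sorted(step(blinker)))        # [(1, 0), (1, 1), (1, 2)], the vertical bar
```

Pure functions like this are trivially testable (feed in a pattern, assert on the next generation), which arguably gets you the book's stated goal without any of the object shuffling.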
I’ve been building small web UI applications. I’m a backend developer who has been curious about people claiming that web components are all you need for UI. So I’ve been finding out how far I can get with Rollup and only the TypeScript plugin.
Go (the game). I wrote a clone of AlphaGo in Go (the programming language) 8 years ago. Along the way I learned to play Go.
I've been using a combination of my own AI, Leela Zero, and KataGo to teach myself Go. For 8 years I languished at the same level of play. Then I met a real human teacher who taught me Go at the end of 2023, and since then my game has improved. I beat my own AI (which was intentionally trained to be impoverished in skill) for the first time in January.
Learning Go is teaching me all sorts of new ideas in pedagogy and putting a damper on any enthusiasm for involving LLMs in education.
I'm trying to start making games so I am learning graphics programming at the moment, and some physics. Currently work primarily with automation and embedded so it's quite different.
Learning how to grow my indiehacker side gigs. Building is easy, getting a few users also works. But growing/maintaining revenue is really really hard for me :(
I’m learning psychology. But I don’t want to be a psychologist who works with mentally ill people.
I basically want to study and be an expert in social engineering if I’m being completely blunt. That’s my whole motivation for studying psychology.
And I think I should just clarify it’s not because I want to use social engineering knowledge to go do terrible things - I just find it literally fascinating.
I'm a threat intel analyst and recently started self-learning how to improve my storytelling skills, which are generally lacking in cybersecurity. I've refined my writing skills over the years, starting with Medium, then LinkedIn newsletters, and now Substack: https://shorturl.at/jHKLQ
The ins and outs of the Deno runtime. Since I want to build my first text/video course about it.
On one hand, it could be a cool side project where I can help people get their hands dirty with Deno and follow my passion for teaching interesting topics. On the other hand, I want to push a project of my own over the finish line and maybe make a dollar or two with it.
That is actually still to be decided. I started out thinking about putting it up on Udemy, but in the meantime I've been thinking I might just put it in a .zip and distribute it via Gumroad. Or even use their video hosting platform.
I have always worked as a developer and have been building products for the past 6 years. But as a founder, you need to know GTM and growth for your product!
Initially, we used to think that we could outsource, but we were so wrong! :D
Hence, now learning it on our own.
Happy to take help and suggestions, or to chat if you are up for a discussion around growth, GTM, and marketing.
I'm currently trying to learn music production and composition to put the singing classes I'm taking to use.
It feels a bit more daunting than when I was first learning how to program, since there appear to be far more tutorials, but half of them are only there to promote a course, and it's hard to distinguish good learning material from bad.
Deeper study of the hardware and software that drive deep learning, for the simple reason that I find Python a horrible experience and I am wondering why there are no real alternatives in modern deep learning. Also home DIY, forest building, and Portuguese.
It’s very personal and I don’t need anyone to agree with it. I have 20+ years of Python programming behind me and while I like coding, I really don’t like Python or the Python dev experience. I guess I want to find out if the grass is greener or not at all.
I've been building a game in Compose Multiplatform and have been learning how to integrate new UI features across WASM, Android, iOS, MacOS, Windows. It's great new tech with a lot of promise even though it's still early.
how to be a better fiction writer. i wrote a novel and i'm revising it for the fourth time. i accidentally ripped off 'as i lay dying' by faulkner: i started writing my story about a funeral, told in first-person present with multiple narrators, and realized that faulkner had already done that. while writing it, i avoided reading him because i didn't want to rip him off more. but now i'm re-reading the book very closely, learning how he did it, and taking some stylistic tricks. that book is genius, perhaps the best novel i've ever read in my life. avoiding it was a bad idea.
This also got me interested in Creative Nonfiction and I’m reading about that now, although the Snowflake Method book is something else — a fiction story that explains a nonfiction subject.
with the free tier, he answers questions from students and that's valuable alone. with the paid tier, you get to learn the mechanics of a short story from one of america's best living short story writers.
Curious what resources / apps you're using for learning. I'm building a tool that helps with self-directed learning. Would you be keen to jump on a 15 mins call and help shape the product? https://calendly.com/vel-yan/15min
Jetpack Compose and SwiftUI. I hope this will be the last evolution of mobile development; I’m getting older (over 14 years of mobile development experience) and my brain is tired from all the shifts in mobile development.
Learning how to weld at PS:1 was fun.... but the safety film on what NOT to weld (due to the chances of dying, especially CHROME) pretty much has me sticking to hot rolled steel only.
- How to build a fence, and trying to wrap my head around convolutions and learn some stats in the process (there's a pretty good 3b1b intro on it, but I'm not sure where to go next to actually learn it).
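On the convolution side, the definition 3b1b animates is small enough to code up and poke at. A plain-Python sketch of "full" discrete convolution (the same thing `numpy.convolve` computes by default):

```python
def convolve(signal, kernel):
    """Discrete 'full' convolution: each output sample is a
    kernel-weighted sum of neighboring input samples."""
    n = len(signal) + len(kernel) - 1
    out = [0] * n
    for i, s in enumerate(signal):
        for j, k in enumerate(kernel):
            out[i + j] += s * k
    return out

print(convolve([1, 2, 3], [1, 1]))           # [1, 3, 5, 3]
print(convolve([0, 0, 1, 0, 0], [1, 2, 1]))  # [0, 0, 1, 2, 1, 0, 0]: a spike gets blurred
```

The second example is the stats connection: convolving with a bump-shaped kernel smears a distribution out, which is exactly what happens to the distribution of a sum of random variables.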
Learning a couple of jazz standards on guitar. Building a RAG model over bespoke documents. And Streamlit, React, or some other web tools so that I can make better interactive demos.
Currently, I am focused on Mandarin Chinese, LLMs. I would like re-learn math up through undergrad discrete math too and develop an understanding of the Linux kernel.
Curious - do you feel lack of math mastery hinders your machine learning knowledge? I ask because despite pretty advanced math in several postgrad programs with good grades, I feel like my true math skills are shit.
I took an excellent course (professional development for teachers) from The Coding School on machine learning. I got 4 CEUs from Stanford and it was a great course.
Finally, after many years of procrastination, learning to track my time correctly. Paradoxically, I am clocking in more hours while spending less real time at work.
Longtime data engineer - web apps in vanilla HTML/CSS/JS with Node - and some fun with ncurses in C++. Copilot is helping me out a bit but I’m having fun!
Hey, you're like my evil twin! Longtime web dev here who always wanted to learn more data engineering.
HTML/CSS/JS is fun and creative to some degree, but kinda abstract. Data engineers get to work with real-world phenomena! What prompted the interest in web stuff?
I never did proper "data engineering" per se, but we did have casual run-ins with geodatasets and GIS, and used them to build a home solar calculator that takes a ZIP code, estimates that area's available yearly sunlight, utility rates, electricity consumption, etc., and combines them into an estimated size for a home solar system. It was like a primitive version of Google Sunroof (https://sunroof.withgoogle.com/), which is much, much better.
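For flavor, this is the kind of back-of-the-envelope combination such a calculator boils down to. The formula is the standard rule-of-thumb for array sizing; the constants are made up for illustration and aren't from that project:

```python
def estimate_system_kw(annual_kwh, peak_sun_hours, derate=0.8):
    """Rough solar array sizing: kW of panels = yearly consumption
    divided by yearly peak-sun-hours, discounted for system losses
    (inverter efficiency, wiring, soiling, etc.)."""
    return annual_kwh / (peak_sun_hours * 365 * derate)

# e.g. a home using 10,000 kWh/yr in an area averaging 5 peak sun hours/day
print(round(estimate_system_kw(10_000, 5), 2))  # 6.85 (kW)
```

The real work, of course, was joining the geodatasets so that a ZIP code produced sensible values for those inputs.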
I too love the intersection of data and visualization/usability, turning obscure spreadsheets into useful public interfaces :) The geospatial world is just one part of that. It used to be cool, but I guess nowadays it'd probably be more about turning data into LLM-digestible training sets so that they can more directly analyze questions and answer them in plain language.
Do you think traditional DE is still worth learning, with Skynet on the horizon? What's a good way to get started?
Wow - that sunroof thing is cool, I bet that was a fun area to work on.
There’s a lot to be said for data engineering when done at scale - systems design, devops, cloud engineering etc.
I’m not sure that’s going away yet…
Building the individual units (reliable ETL pipelines) might be on the chopping block in the near future; I’m not convinced there are 100 competing ways to build these.
Still - best way to get started is probably to get stuck in with running a data pipeline using a locally deployed Airflow instance. Read some data from some api and write it to a local database deployment (postgres?).
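Before the Airflow layer, the shape of such a pipeline fits in a few lines. In this sketch sqlite3 stands in for Postgres and a stub stands in for the API call (all names here are illustrative); in Airflow each function would become a task:

```python
import json
import sqlite3

def extract():
    """Stand-in for an API call (e.g. urllib against a JSON endpoint)."""
    return json.loads('[{"city": "Oslo", "temp_c": 4}, {"city": "Lisbon", "temp_c": 18}]')

def transform(rows):
    """Example transform: convert Celsius readings to Fahrenheit."""
    return [(r["city"], r["temp_c"] * 9 / 5 + 32) for r in rows]

def load(rows, conn):
    """Write the transformed rows to the warehouse (sqlite here)."""
    conn.execute("CREATE TABLE IF NOT EXISTS temps (city TEXT, temp_f REAL)")
    conn.executemany("INSERT INTO temps VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
print(conn.execute("SELECT COUNT(*) FROM temps").fetchone()[0])  # 2
```

What Airflow adds on top of this is scheduling, retries, and dependency tracking between the tasks, which is where the real learning is.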
May I ask the same of web application development?
> that sunroof thing is cool, I bet that was a fun area to work on
Yeah, it was (and is) really cool! To be clear, I didn't work on that, just something similar but much more primitive and with much simpler datasets. Google did a really good job there.
> locally deployed Airflow instance
I'm pretty comfortable with basic ETL stuff, but never used Airflow. Will have to look into that, thanks!
> May I ask the same of web application development?
I can give you my opinion, but that's all it is. I'm not a FAANGer and primarily work with small businesses and nonprofits, so my perspective may be biased and incomplete. I should also note that I'm kinda an AI optimist, meaning I have a much more positive view of both its competence and its threat than many people. But ultimately, I'm a nobody, just a rando on the internet, so take all this with a big grain of salt :)
In the time I've been doing web stuff (20-30 years, depending), I've seen the industry move towards higher and higher levels of abstraction. In the old days (the 90s and early 2000s) it was a lot of hacked-together HTML + backend logic, then it gradually moved towards the clientside (frontend) with things like ActiveX, Flash, and eventually CSS and JS.
JS eventually won out and you can build really amazing apps almost entirely in the frontend now (like Figma, Photopea, OpenSolar, Felt, and other frontend-heavy, UI-driven things like those).
But those are what I'd consider proper "apps". There are also many websites that are just, well, sites and not what I'd consider apps, things like your average news or blog site, or maybe basic ecommerce stuff. That's where I see the most abstraction/consolidation into a few big frameworks, like Shopify or Wordpress + WooCommerce, or Wix/SquareSpace/Weebly for simpler sites, or headless CMSes + Jamstacks for more complex sites (disclaimer: I currently work for a headless CMS company).

I think these sorts of sites are the most at risk of automation, which really has been happening for decades already. What used to take weeks of setup and then tons of ongoing maintenance (both on the frontend and backend) is basically just a one-click deploy these days, with little need for actual coding anymore. And even complex, bespoke UIs are increasingly being AI-driven... the company that makes Next.js (a big frontend framework) is also using AI to try to replace their own customers (us frontend devs), lol: https://v0.dev/
I think there will always be a few humans needed in the loop, but probably fewer and fewer over time, and usually at the "architect" level, where you design overall systems and write some minor glue code to tie it all together, but don't need to dive too deeply into the nitty-gritty anymore. I'm especially scared for junior coders, because even today, Copilot and ChatGPT are already way better than most of them (and often, better than myself too). One moderately experienced coder with a few AIs can easily replace what used to take a team, and do so with much less overhead (no need for Agile crap, three layers of mid-management, everlasting meetings, etc.). I think it's going to get a lot leaner, which means more productivity per remaining dev, but fewer dev openings overall.
But that's only if we assume that current web devs stay the course and don't upskill/adopt more and more AI practices. Just like fewer and fewer of us work in PHP or ASP or Ruby on Rails these days, I would assume that many would sidestep and just incorporate more AI and GPT into their workflows. Maybe there are some opportunities there for lean and mean startups that can do a lot more with fewer employees than before. But still, AI/ML is a fundamentally different enough skill set (like actual CS + math + modeling stuff, not just gluing together UI code) that a lot of us are going to flunk out and join the breadlines.
Even at this early stage of AI, I'm fairly confident that my career as a small-biz frontend dev is a dead end, with maybe 4-5 years left if I'm really lucky. It's not just because of AI doing a better job, necessarily, but the simple hype around AI means a lot of the money that used to be in Web is now pivoting towards AI. The bubble's burst, and sure, there'll always be a few dev jobs here and there, just like there are still newspaper or graphic design jobs here and there, but I'm fairly certain its heyday is over. Just my 2¢ =/
That said, I don't think it ever hurts to learn HTML + CSS at least. Those are relatively simple, declarative markup languages that are really more similar to Markdown than to programming. JS (and especially React, etc.) is where it gets tricky, but even that ecosystem is finally somewhat mature/stable, such that it's a pretty easy time to get started, having missed the craziness of the late 2010s and early 2020s when it was going through very rapid iterations. Today it seems to have stabilized around React + Next.js as the go-to framework (by popularity), and the documentation and examples have gotten better. I wouldn't quit your day job to go learn any of this stuff -- like I don't think it's a good time to dive headfirst into becoming a web dev, with it becoming more and more yesterday's game instead of the future -- but as a side project? Sure. At the very least, it enables better UI design and visualizations, and that's always fun! Even with AI as the companion, there's a lot more creativity (and human psychology) there than gluing together pipelines and APIs (of course, that's probably just my bias as a frontend person creeping through).
Alternatively, it's also possible to get really good at some particular niche in the stack (like Canvas graphics or WebAssembly or WebGL) and find a super-specialist position at a bigger company. That would probably be a safer bet than my generalist background.
I dunno... sorry, I don't mean to be a downer, it's just that tech has always been fast-moving and cutthroat, and it's probably going to become even more ruthless in the future. Once AI starts self-iterating, it'll become exponential and I don't think humans will be able to keep up. Right now it seems like the biggest barrier is simply our hardware manufacturing capacity, but that's going to ramp the hell up soon. But anyway... we can only play the cards we're dealt, as we're dealt them. No need to fret about things out of my control, and Skynet is very much out of my control, lol.
As an old man (nearly 40), I have limited time and neuroplasticity left. It's harder for me to just pivot to the new shiny every few years. But if you're younger, I don't think it's ever wrong to explore and try new things and see what sticks (and what pays)!
Sorry, that's just the bigger-picture overview/rant. If you have any questions about more specific technologies/stacks, I'd be happy to share thoughts on those too.
Thanks for all the web dev advice and tech callouts. Some real neat examples there which I knew nothing about.
Regarding niches, I think as a data engineer, becoming fluent in an interactive front-end data-viz library might be the slam dunk that I'm probably looking for.
Having looked at D3, I am quite impressed! Do you have any experience or knowledge of this lib?
Finnish, mostly. I moved to Finland a few years ago and have been making slow but consistent progress along the vectors I care about ever since. I just wish language learning didn't take so much time.
These in turn led me to devour a book on the inner workings of SQLite, and web dev, because I needed some way to scrape Tatoeba without losing my data every time. Eventually I got good enough to start reading the "clear Finnish" news, but then I realized YLE.fi didn't seem to have an easy way for me to scrape all previous news articles, so I built https://hiandrewquinn.github.io/selkouutiset-archive/ as an excuse to get a little deeper into Hugo, and also to learn some stuff about Git modules, systemd timers, doing things on a Raspberry Pi, doing things in GCP...
... And finally today I made the first lurching prototype of a flashcard generator for that news archive, at https://github.com/Selkouutiset-Archive/selkokortti . I guess I just keep stringing the tools and interests I have together to make bigger and bigger things. Maybe that's all a career/vocation really is at the end of the day.
I've also been learning a lot about QEMU and virtualization. That's mostly for work. I make software that runs on trains.
romhacking. I want to make patches to old console games from the late 80s and early 90s. I've never really worked with binary data before, so I have had to learn some new things.
Thank you for sharing this idea! Some time ago I was trying to learn Danish, and I found that audio description plus subtitles on netflix helps a lot with learning a new language.
I will try your suggestion with German. Thank you!
All in Danish. It was a form of multi-modal learning. I was pairing the text to the sound and the visual input. I frequently needed to pause and translate the subtitles — often word by word — but I think the combination of the three helped quite a bit as it helped pair the sound with the image without translating to an intermediary language that I understood.
When I "blitz", I'm looking for an immersive, memorable experience in my target language. Learning is merely a by-product of that.
That said, as someone with an A1 ability according to CEFR (I started learning Polish 5 months ago), my ability to follow along with a native speaking narrator, even at 2x speed, has improved exponentially. My exposure to Polish vocabulary has also exploded, and I'm familiar with many Polish words in their various grammatical forms. However, I haven't committed to long term memory as many Polish words as I'd like to. Many meanings are still fuzzy, like a word at the tip of the tongue. I think I just need to see these familiar but fuzzy words in more diverse contexts before they are nailed down. That means more blitzing.