> The aesthetics of the machine have not been neglected. The CPU is attractively housed in a cylindrical cabinet. The chassis are arranged two per each of the twelve wedge-shaped columns. At the base are the twelve power supplies. The power supply cabinets, which extend outward from the base, are vinyl padded to provide seating for computer personnel.
They needed someplace to put the power supplies anyway. The seat cushions just added a touch of whimsy. Could you imagine IBM putting seats around one of their computers?
A top 5 will never cover the field. Here's my top 10:
* Brookshear and Brylow - Computer Science - An Overview
* Forta - Teach yourself SQL in 10 minutes
* Stallings - Computer Organization and Architecture
* Stallings - Operating Systems - Internals and Design Principles
* CLRS
* Kurose, Ross - Computer Networking - A Top Down Approach
* Sipser - Introduction to The Theory of Computation
* Stallings, Brown - Computer Security - Principles and Practice
* Aumasson - Serious Cryptography
* Russell, Norvig - Artificial Intelligence - A Modern Approach
And even this fails to cover programming languages. Python is the lingua franca of the field. Most previously recommended books are getting outdated, but perhaps Matthes' Python Crash Course, 3rd edition.
When I need a refresher on the basics of Python, I refer to Python Distilled, and when I want a deep dive, I turn to Fluent Python. Reading these books makes me feel like I'm sitting next to an experienced, witty colleague.
I agree that five books won't ever cover every discipline within Computer Science. Just providing an introductory book, a university-level textbook, and an expert/graduate-level reference for each discipline turns into a long list.
I'd make the argument that TCP/IP Illustrated, Volume 1 covers the details of TCP/IP in a very "packets and fields" oriented way, while Volume 2 goes deep into the "data structures and implementation" side. That makes it a very good supplemental reference, but a less-than-ideal introductory textbook on computer networking.
Kurose's book really does take the top-down approach, from high-level networking concepts through the application layer to the transport layer and downward. It provides just enough of the necessary details (here's a datagram with fields A and B) rather than a comprehensive list of all of them (here's every field, every field size, and every field option).
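To make the "packets and fields" view concrete, here's a minimal sketch (my own, not from either book) that parses the fixed 8-byte UDP header with Python's struct module; the function name and sample values are just illustration:

```python
import struct

# The fixed UDP header: four 16-bit fields in network byte order.
# This is the "here's a datagram with fields A and B" level of detail.
def parse_udp_header(datagram: bytes) -> dict:
    src_port, dst_port, length, checksum = struct.unpack("!HHHH", datagram[:8])
    return {"src_port": src_port, "dst_port": dst_port,
            "length": length, "checksum": checksum}

# Example: a hand-built header (src=5353, dst=5353, length=8, checksum=0).
print(parse_udp_header(struct.pack("!HHHH", 5353, 5353, 8, 0)))
```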
Concrete Abstractions first, and then SICP, both in Scheme. If you work through these, you will already understand most of the foundations of CS; learning another language will be a piece of cake.
Given unlimited time, read all of them and learn all the languages; it will all help make you a better programmer in your preferred language. With limited time (as a normal being), start with the top 100 books, any of them. The next will come easier than the first...
I have an M.Sc. in Comp.Sci. Flicking through books like these, I find all the chapter titles resonate with courses, exams, and problems we solved. It also makes me realise I have probably forgotten more than I like to think.
On the other hand, bashing my head against graph theory and logic has made me a much better programmer. Similarly, the hours spent in Van Roy and Haridi's fairly abstract and technically language-agnostic "Concepts, Techniques, and Models of Computer Programming" primed me to learn a lot of languages fast, because I had the primitives mastered.
I have no particular book recommendation, but this book seems more about numbers than relations -- maybe my PDF search is broken, but both 'type theory' and 'category theory' return zero results. I would recommend also looking into those if you are interested in the mathematics of computer science.
No, there's no such agreed-upon thing; everyone has their own idea. But if I were to recommend something today, I'd say go the self-discovery route and find your ideal books for algorithms/algorithm analysis & data structures, automata theory, programming languages, operating systems & machine learning.
It's more useful to practice programming through projects. Then once you feel you're missing the knowledge for a particular problem you're trying to solve, read up about that one.
Projects are essential, but I've found there is a huge problem with your advice: you have no clue about the possible solution surface.
My advice to learners has been "try to learn as much about a topic as someone who has taken the subject in college and forgotten about it".
For example, consider calculus: someone who took calc 20 years ago and hasn't used it since will probably have forgotten exactly how to compute most derivatives and integrals. But if someone mentions an optimization problem ("we need to know when this curve peaks") or asks something that involves finding the area under a curve, alarm bells should start ringing. They'll know this can be done, and will likely go grab a calc book to refresh.
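As a rough sketch of that "grab a calc book" moment translated into code (the curve here is made up; SciPy just does the calculus numerically):

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.integrate import quad

# A made-up "curve": f(x) = x * exp(-x), which peaks at x = 1.
f = lambda x: x * np.exp(-x)

# "When does this curve peak?" -> maximize f by minimizing -f.
peak = minimize_scalar(lambda x: -f(x), bounds=(0, 10), method="bounded")
print(f"peak at x = {peak.x:.3f}")  # ~1.000

# "What's the area under the curve?" -> numerical integration.
area, _ = quad(f, 0, 10)
print(f"area on [0, 10] = {area:.4f}")  # 1 - 11*exp(-10), ~0.9995
```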
Another example I run across all the time, which is the opposite scenario: survival analysis. I have been on countless teams where somebody needs to understand something like churn, or the impact of offering a discount that hasn't expired yet, etc. These are all classic survival analysis problems, yet most people don't even know this field of study exists! Because of this, I've seen so many cases where people complain that "we'll have to wait months or years to see if these changes impact customer lifetime!" (Note: if anyone out there is doing churn or LTV analysis and isn't familiar with survival analysis, you are most certainly approaching it incorrectly.)
I've seen a lot of people get frustrated with self-study because they try to learn the material too well. If you aren't going to be using survival analysis soon, it's not really worth remembering all the details of how to implement a Kaplan-Meier curve. But if you have even a vague sense of what problem it solves, then when you encounter that problem in a project, you know where to go back to. You'll typically walk away with a much stronger sense of the subject than if you had studied it harder in the first place.
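For anyone who wants that vague sense without all the details, here's a minimal churn-flavored sketch, assuming the third-party lifelines library and entirely made-up data:

```python
from lifelines import KaplanMeierFitter  # pip install lifelines

# Made-up churn data: months each customer has been observed, and whether
# they actually churned (1) or are still active, i.e. censored (0).
months  = [2, 5, 5, 8, 12, 12, 14, 20, 24, 24]
churned = [1, 1, 0, 1,  1,  0,  0,  1,  0,  0]

kmf = KaplanMeierFitter()
kmf.fit(durations=months, event_observed=churned)

print(kmf.median_survival_time_)      # median time-to-churn estimate
print(kmf.survival_function_.head())  # P(still a customer) over time
```

The censoring flags are exactly the answer to the complaint above: you don't have to wait months or years for every customer to churn before estimating lifetime.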
Computer science is to programming what physics is to engineering. They are not the same thing. You can do some programming without knowing computer science, but a general knowledge of computer science will give you a much more solid foundation, and for some aspects it is indispensable.
That's a little like saying that if you want to learn mechanical engineering, you should fix things around your home and then do research when you get stumped.
Building a bunch of software projects probably isn’t a very efficient way of learning computer science. You might figure out things like big-O or A* on your own, but a more academic approach is going to take you further, faster.
It's well established that practical project work is what works best at producing tangible results, and most institutions that aim to produce the best programmers focus on that.
I can understand that this is not the approach preferred by academic types, who form a strong community on Hacker News.
Most people are more motivated to understand the theory because it helps them solve a practical problem, rather than theory for the sake of theory.
I thought this thread was about computer science. Working on a programming project is related to computer science in the same way that welding together a shelf is related to mechanical engineering.
Being "handy" around the house (or even more advanced tinkering) and a mechanical engineering degree--maybe especially from a good school--are absolutely not the same thing.
That seems more like a necessary precondition than a path to learning computer science. It's like how you will probably need to learn penmanship and multiplication tables before you get into real mathematics, but that isn't really mathematics.
You're describing the impact the Steam Deck is having without SteamOS being available to easily install on a custom-built machine. The tipping point is going to come this year, when people building new machines have the option to install Windows or SteamOS. A lot of people are going to pick SteamOS.
Sure, they'll gain more of the gaming-enthusiast segment this way, and it will be a tipping point for those users. I just hope there are ways beyond the gaming sphere to create converts, though; enthusiast gaming is still a smaller segment than people realise, and it will take a long time if this is only something people consider for new builds, especially at today's hardware prices! I wish I could run SteamOS reliably myself, but I still get issues with my old NVIDIA Pascal card and it causes crashes in many games, so I can't commit until I buy new hardware, I don't think.
This is a gross oversimplification. It can be part of the explanation, but it's not the whole one, and not even the most important part.
It mostly boils down to filmmaker choices:
1. Conscious and purposeful, like choosing "immersion" over "clarity". Yeah, nothing says "immersion" like being forced to turn subtitles on...
2. Not purposeful. Don't attribute to malice what can be explained by incompetence... Bad downmixing (from Atmos to lesser formats like 2.0; see the sketch below). Even when they do the downmix, they are not using the technology ordinary consumers have. The most glaring example is the way the text/titles/credits size on screen has been shrinking to the point of being hard to read. Heck, I often have difficulty with the text size on my Full HD TV, just because the editing was done on some fancy 4K+ display standing 1 m from the editor. Imagine how garbage it looks on 720p or ordinary 480!
For a recent example, check the size (and the font) of the movie title in the Alien Isolation movie and compare it to movies made in the '80s and '90s. It's ridiculous!
There are many good YouTube videos that explain the problem in more detail.
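To illustrate the downmix point in (2): in the common ITU-R BS.775-style 5.1-to-stereo fold-down, the center channel (which carries most dialogue) is scaled by about 1/sqrt(2) (roughly -3 dB) while left/right pass through at full level, so dialogue ends up relatively quieter than the effects. A rough NumPy sketch with made-up signals:

```python
import numpy as np

# Made-up one-second channel signals at 48 kHz; dialogue lives in the center.
sr = 48_000
t = np.linspace(0, 1, sr, endpoint=False)
left = right = 0.8 * np.sin(2 * np.pi * 80 * t)   # loud low-frequency effects
center       = 0.5 * np.sin(2 * np.pi * 300 * t)  # dialogue
ls = rs      = 0.3 * np.sin(2 * np.pi * 120 * t)  # surrounds

# ITU-style fold-down: center and surrounds scaled by 1/sqrt(2) (~ -3 dB).
g = 1 / np.sqrt(2)
lo = left + g * center + g * ls
ro = right + g * center + g * rs

# The dialogue-to-effects level ratio drops after the fold-down.
print("center vs. left before:", 0.5 / 0.8)        # 0.625
print("center vs. left after :", (g * 0.5) / 0.8)  # ~0.442
```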