Hacker News

Have you read a lot of biology papers? Are they all about operating microscopes?


We're not talking about biology. There is no such thing as computers without programming.


Honestly your comments/behavior in this thread read as borderline sealioning. I’m not accusing you of it, just saying that it is so absurd it almost comes off that way.

Computer Science is the study of how we design algorithms to process data. We do computer science using grammars that allow us to describe abstract operations on data. We categorize the different types of algorithmic solutions to problems. We study the limits of efficiency and prove things about the various classes of algorithmic problems and their solutions. We generally work with discrete structures and type systems (like our beloved lambda calculus, origin of the Y combinator).

You can design algorithms without ever compiling a single piece of code just like you can add numbers without ever using a calculator. In computer science we talk about abstract syntax trees, higher order functions, context free and regular grammars, finite state automata, logic, and numbers. You can even get meta and modify your own algorithm’s data as part of the algorithm itself. All this happens independent of a particular instance of a physical machine.
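For instance, a higher-order function is exactly this kind of machine-independent description. A minimal sketch in Rust (the names here are illustrative, not from any particular curriculum):

```rust
// A higher-order function: it takes another function as input and
// describes an abstract transformation, independent of any machine.
fn apply_twice<T>(f: impl Fn(T) -> T, x: T) -> T {
    f(f(x))
}

fn main() {
    // The same abstract description works for any type and operation.
    assert_eq!(apply_twice(|n: i32| n * 2, 3), 12);
    assert_eq!(apply_twice(|s: String| s + "!", String::from("hi")), "hi!!");
}
```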

Programming means “issue instructions to a given instance of a machine so that it behaves a certain way”. We take all our theory and apply it to an electrical device that has a physical processor and fixed memory. We program a microchip by writing chains of instructions to its memory. We measure performance in cycles per second. When programming we talk about machine instructions, loadable images, calling conventions, binary interfaces, program counter, alignment, and words.

TL;DR: just go read the Wikipedia page on Computer Science, it’s quite clear.

https://en.wikipedia.org/wiki/Computer_science



Try to prove your type system is sound without the academic math.


What is an unsound type system? Why and how is it relevant for programming?


I can't give you a mathematical definition of it, but my intuitive one is that an unsound type system is a collection of kludges. It's relevant for programming because the language winds up with all kinds of ugly edge cases where things just don't line up right.

With a sound type system, you can do things like compose a new type out of other types, and it will work consistently (not produce weird edge behaviors).

For example, C's `void` type introduces anomalous behavior. A function returning `void` is not composable, e.g.:

    void foo();
    void v = foo();
would work in a sound type system, but does not work in C. You'll see it in the compiler implementation because it's a special case that appears over and over.
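For contrast, here's a hedged sketch in Rust rather than C: when "returns nothing" is modeled as a first-class unit type, the result composes like any other value.

```rust
// In Rust, a function with no declared return type returns the unit
// type `()`, which is an ordinary value like any other.
fn foo() {}

fn main() {
    let v: () = foo();      // binding the result is legal
    let pair = (foo(), 1);  // and unit composes into larger types
    assert_eq!(v, ());
    assert_eq!(pair.1, 1);
}
```

The compiler needs no special case here, which is the composability the comment above is pointing at.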


Isn’t that why some other languages explicitly draw a distinction between a function and a procedure? I think ‘void’ was introduced in C to do the same thing, implicitly (because omitting the return type does not achieve that).


Yes, but Wirth (Pascal's inventor) was an academic, which is why Pascal has a separate notion of functions and procedures. `void` is a hackish special case in the type system, and not just for function returns. It's unsound.


A type system is an aspect of programming. I would hope a CS course covers type systems.


Proving something about a type system requires zero expertise in programming, debugging, git, Linux, C++, etc. And expertise in programming, debugging, etc., is of zero use in attempting to prove something about the type system.

As has been pointed out to me several times by CS academics, I could stand taking a course in type theory. I have a good intuitive sense about it (likely from my math background), but have no idea how to prove something about it.


A type system is an aspect of a logical system. You can have typed Boolean algebras, though I don't think anyone programs in those.

A CS course load certainly doesn't cover soundness of a language (and certainly not in the required selection of courses). Even my theory-focused course load only covered automata up to proofs of regularity, etc. That is, sort of, step zero in soundness-proving methodologies, but it's only step zero.


The purpose of a type system is to ensure the correctness of programs. That's what they are, that's why they were invented, that's why they were studied. That CS involves math doesn't change the fact that it's about programming.

The whole reason we want to prove a type system sound is so we can prove certain things about the programs that use that type system.


That's one use, yes, but it isn't the only use. And it's certainly not why they were invented.

Typed lambda calculus was formalized before the first programmable computers, and its relation to programming wasn't clarified for another 20+ years (and real type systems didn't start to appear in programming languages for another decade after that, afaik).


You need to learn some history. Or at least cite your claims.

We use type systems to help abstract patterns of data into logical constructs that can be reasoned about. They are logic systems with grammar that describes relationships between axioms and constructs.

Seriously, go read a textbook and then we can pick this discussion up. Wikipedia has a good overview.

https://en.m.wikipedia.org/wiki/Type_theory


Ah, yes. Type theory originates before programming. Type theory only has relevance to computers insofar as it’s used to prove things about programs. Type theory in the context of computing is about programming.

We don’t call set theory and category theory “computer science” unless it’s about programming computers.


No, I mean that's an application, yes, but I'd still call Alonzo Church a computer scientist and his work theoretical computer science.

Much as people who study computability are computer scientists, even though none of the asymptotic improvements to matrix multiplication since 1990 are even remotely relevant to any kind of real-life program.


This is a narrow view on types; for example, types also help the compiler to produce optimized code. In general, they assist reasoning about and understanding the program.


> There is no such thing as computers without programming.

But there is computer science not applied to computers: operations research, for instance, is basically a branch of CS. It is not about programming, and it has applications in business, logistics, etc.


Do you consider an ASIC a computer? How about a sufficiently large number of NAND gates, arranged a certain way? They can perform computation but are not programmable after the fact.

My computer science degree covered a lot of topics that didn't require a computer, for example relational algebra, discrete mathematics, and introductory formal logic. Of course, the practical usages of these are often best done through a computer and programming, but it's not a requirement.

In reality, all computer science degrees I'm familiar with make an attempt to expose students to relevant programming languages; it's simply practical, and students demand it. But the details of coding style, how to use git, or similar topics may not be relevant in a computer science degree. That stuff isn't so hard to learn given clear guidelines and some baseline amount of intelligence, which is one thing a degree tries to validate.


An ASIC is a computer which runs a fixed program. Determining what the ASIC is supposed to do, and implementing it, is programming. People who design ASICs are programmers.

A computer is a machine which runs programs (it computes). That's literally the practical and theoretical basis for computing. Instead of running the program in our heads, we have designed machines to do it for us.

We can calculate SHA256 with a paper and pencil. But we created machines (computers) to compute for us, according to the instructions we give them (the program, the algorithm).


Is calculating SHA256 using pencil and paper programming? Or computing?


> There is no such thing as computers without programming.

Not sure what you mean by this, because there's no such thing as programming without computers either. (Actually, the first computers were invented before programmable computers were.)


Computing is running a program. That's what computing is. An algorithm is a program. A computer is a machine that performs the algorithm, so we don't have to. The Turing machine computes the answer by running the instructions (the program, the algorithm).
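One way to picture that definition, as a sketch (the instruction set here is made up): a "machine" is just something that steps through a program's instructions so we don't have to.

```rust
// A toy machine: it runs a program (a list of instructions) over a state.
enum Instr {
    Add(i64),
    Mul(i64),
}

fn run(program: &[Instr], start: i64) -> i64 {
    program.iter().fold(start, |state, instr| match instr {
        Instr::Add(n) => state + n,
        Instr::Mul(n) => state * n,
    })
}

fn main() {
    let program = [Instr::Add(2), Instr::Mul(3)];
    assert_eq!(run(&program, 1), 9); // (1 + 2) * 3
}
```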


On what computer does the algorithm:

    1. take two slices of bread,
    2. apply peanut butter to one side of one slice, 
    3. apply jelly to one side of the other slice, 
    4. place the slices together with the coated sides touching, and
    5. enjoy.
run?


Sure. What would the study of human behavior or algorithms that aren’t computable with computers be called? Surely not “computer science”…


Humans are biological machines that can compute, no?

Or, you really don't think I can design a logical grammar to formally express the creation of a PB&J?
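A toy version of that grammar, sketched as Rust types (every name here is made up for illustration): the type system only admits sandwiches built the way the procedure describes.

```rust
// The "grammar": a sandwich is exactly two coated slices, nothing else.
#[derive(Debug, PartialEq)]
enum Spread {
    PeanutButter,
    Jelly,
}

struct CoatedSlice {
    spread: Spread,
}

struct Sandwich {
    top: CoatedSlice,
    bottom: CoatedSlice,
}

fn make_pbj() -> Sandwich {
    // Steps 1-4: coat one slice with each spread and put them together;
    // the coated sides face each other by construction.
    Sandwich {
        top: CoatedSlice { spread: Spread::PeanutButter },
        bottom: CoatedSlice { spread: Spread::Jelly },
    }
}

fn main() {
    let s = make_pbj();
    assert_eq!(s.top.spread, Spread::PeanutButter);
    assert_eq!(s.bottom.spread, Spread::Jelly);
}
```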


more to the point, the universe is computational in nature, and humans are, at best, subroutines with a shitty ABI


Obligatory PB & Jelly Exact Instructions Challenge video: https://youtu.be/cDA3_5982h8

The answer is: it runs on dad, poorly :)



