
Sure, but I think he ends up missing the mark. Ultimately his talk boils down to "let's revolutionize programming!" But as I said that ends up being fairly trite.

As for testing, integration, building, and refactoring, I think it's hugely mistaken to view them as "symptoms of a problem". They are tools. And they aren't just tools used to grapple with explosions of complexity; they are tools that very much help us keep complexity in check. To use an analogy, it's not as though these development tools are like a hugely powerful locomotive that can drag whatever sorry piece of crap codebase you have out into the world regardless of its faults. Instead, they are tools that enable and encourage building better codebases: more componentized, more reliable, more understandable, etc.

Continuous integration techniques combined with robust unit and integration testing encourage developers to reduce their dependencies and the complexity of their code as much as possible. They also help facilitate refactoring, which makes reduction of complexity easier. And they actively discourage fragility, either at a component level or at a product/service level.

Without these tools there is a break in the feedback loop. Coders would just do whatever the fuck they wanted and try to smush it all together and then they'd spend the second 90% of development time (having already spent the first) stomping on everything until it builds and runs and sort of works. With more modern development processes coders feel the pain of fragile code because it means their tests fail. They feel the pain of spaghetti dependencies because they break the build too often. And they feel that pain much closer to the point of the act that caused it, so that they can fix it and learn their lesson at a much lower cost and hopefully without as much disruption of the work of others.

With any luck these tools will be even better in the future and will make it even easier to produce high quality code closer to the level of irreducible complexity of the project than is possible today.

These aren't the only ways that programming will change for the better but they are examples which I think it's easy for people to write off as not core to the process of programming.



Hrmmm ... you seem to still be tremendously missing the overarching purpose of this speech. Obviously to delve proficiently into every aspect of programming through the ages and why things are the way they are now (e.g. what 'won' out and why) would require a Course and not a talk. You seem to think that the points brought up in this talk diminish the idea that things now DO work. However, what was stressed was avoiding the trap of discontinuing the pursuit of thinking outside the box.

I am also questioning how much of the video you actually paid attention to (note: I am not questioning how much you watched). I say this because your critique is focused on the topics that he covered in the earlier parts of the video and then (LOL) you quickly criticize him for talking about concurrency (in your previous comment)... I clearly remember him talking about programming on massively parallel architectures without the need for sequential logic control via multiplexing using threads and locks. I imagine, though, it is possible you did not critique this point because it is obvious (to everyone) that this is the ultimate direction of computing (coinciding with the end of Moore's law as well).

Ahhh now that's interesting, we are entering an era where there could possibly be a legitimate use in trying/conceiving new methods of programming? Who would have thought?

Maybe you just realized that you would have looked extremely foolish spending time on critiquing that point? IDK … excuse my ignorance.

Also, you constantly argue FOR basic management techniques and methods (as if that counters Bret's arguments) … but you fail to realize that spatial structuring of programs would be a visual management technique in itself, which could THEN have tools developed along with it that would be isomorphic to modern integration and testing management. But I won't bother delving into that subject as I am much more ignorant on this and, more importantly … I would hate to upset you, Master.

Oh and btw (before accusations fly) I am not a Hero worshiper … this is the first time I have ever even heard of Bret Victor. Please don’t gasp too loud.


I definitely get what the OP was trying to say. Bret presented something that sounds a lot like the future even though it definitely isn't. Some of the listed alternatives, like direct data manipulation, visual languages, or non-text languages, have MAJOR deficiencies and stumbling blocks that prevented them from achieving dominance. Though in some cases it basically does boil down to which is cheaper or more familiar.


I think the title of Bret's presentation was meant to be ironic. I think he meant something like this.

If you want to see the future of computing just look at all the things in computing's past that we've "forgotten" or "written off." Maybe we should look at some of those ideas we've dismissed, those ideas that we've decided "have MAJOR deficiencies and stumbling blocks", and write them back in?

The times have changed. Our devices are faster, denser, and cheaper now. Maybe let's go revisit the past and see what we wrote off because our devices then were too slow, too sparse, or too expensive. We shouldn't be so arrogant as to think that we can see clearer or farther than the people who came before.

That's a theme I see in many of Bret's talks. I spend my days thinking about programming education and I can relate. The state of the art in programming education today is not even close to the ideas described in Seymour Papert's Mindstorms, which he wrote in 1980.

LOGO had its failings but at least it was visionary. What are MOOCs doing to push the state of the art, really? Not that it's their job to push the state of the art -- but somebody should be!

This is consistent with other things he's written. For example, read A Brief Rant on the Future of Interaction Design (http://worrydream.com/ABriefRantOnTheFutureOfInteractionDesi...). Not only does he use the same word in his title ("future"), but he makes similar points and relates the future to the past in a similar way.

"And yes, the fruits of this research are still crude, rudimentary, and sometimes kind of dubious. But look —

In 1968 — three years before the invention of the microprocessor — Alan Kay stumbled across Don Bitzer's early flat-panel display. Its resolution was 16 pixels by 16 pixels — an impressive improvement over their earlier 4 pixel by 4 pixel display.

Alan saw those 256 glowing orange squares, and he went home, and he picked up a pen, and he drew a picture of a goddamn iPad.

[picture of a device sketch that looks essentially identical to an iPad]

And then he chased that carrot through decades of groundbreaking research, much of which is responsible for the hardware and software that you're currently reading this with.

That's the kind of ambitious, long-range vision I'm talking about. Pictures Under Glass is old news. Let's start using our hands."


Okay. Single fingers are an amazing input device because of dexterity. Flat phones are amazing because they fit in my pockets. Text search is amazing because with 26 symbols I can query a very significant portion of world knowledge (I can't search for, say, a painting that looks like a Van Gogh by some other painter, so there are limits, obviously).

Maybe it is just a tone thing. Alan Kay did something notable - he drew the iPad; he didn't run around saying "somebody should invent something based on this thing I saw".

Flat works, and so do fingers. If you are going to denigrate design based on that, well, let's see the alternative that is superior. I'd love to live through a Xerox kind of revolution again.


Go watch these three talks of his: Inventing on Principle, Dynamic Drawings, and this keynote on YouTube (http://m.youtube.com/watch?v=-QJytzcd7Wo&desktop_uri=%2Fwatc...)

Next, read the following essays of his: Explorable Explanations, Learnable Programming, and Up and Down the Ladder of Abstraction

Do you still think he's all talk?

Also, I can't tell if you were implying otherwise, but Alan Kay did a few other notable things like Smalltalk and OOP.


I've seen some of his stuff. I am reacting to a talk where all he says is "this is wrong". I've written about some of that stuff in other posts here, so I won't duplicate it. He by and large argues to throw math away, and shows toy examples where he scrubs a hard coded constant to change program behavior. Almost nothing I do depends on something so tiny that I could scrub to alter my algorithms.

Alan Kay is awesome. He did change things for the better; I'm sorry if you thought I meant otherwise. His iPad sketch was of something that had immediately obvious value. A scrubbing calculator? Not so much.


Hmm, didn't you completely miss his look, the projector, etc.? He wasn't pretending to stand in 2013 and talk about the future of programming. He went back in time and talked about the four major trends that existed back then.

The future in that talk means "today"


No. I'm saying there is a reason those things haven't become reality. They have much greater hidden cost than presented. It is the equivalent of someone dressing in the 20th-century robes of Edison and crying over the cruel fate that befell DC. Much like DC, these ideas might see a comeback, but only because the context has changed. Not being aware of history is one blunder, but failing to see why those things weren't realized is another.


I get it, I really do. And I'm very sympathetic to Victor's goals. I just don't buy it, I think he's mistaken about the most important factors to unlock innovation in programming.

His central conceit is that various revolutionary computing concepts which first surfaced in the early days of programming (the 1960s and '70s) have since been abandoned in favor of boring workaday tools of much more limited potential. And further, that new, revolutionary concepts in programming haven't received attention because programmers have become too narrow-minded. And that is very simply a fundamentally untrue characterization of reality.

Sure, let's look at concurrency, one of his examples. He bemoans the parallelization model of sequential programming with threads and locks as being excessively complex and inherently self-limited. And he's absolutely correct, it's a horrible method of parallelism. But it's not as though people aren't aware of that, or as though people haven't been actively developing alternate, highly innovative ways to tackle the problem every year since the 1970s. Look at Haskell, OCaml, vector computing, CUDA/GPU coding, or node.js. Or Scala, Erlang, or Rust, all three of which implement the touted revolutionary "actor model" that Victor brandishes.
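
To make the "actor model" concrete, here is a minimal, illustrative sketch in plain Python (threads plus a queue as the mailbox). The class and message names are made up; real runtimes like Erlang/OTP or Akka add supervision, distribution, and scheduling on top of the same message-passing idea:

    # Toy actor: private state plus a mailbox; user code takes no locks.
    import threading
    import queue

    class CounterActor:
        def __init__(self):
            self.mailbox = queue.Queue()
            self._count = 0
            threading.Thread(target=self._run, daemon=True).start()

        def _run(self):
            # Messages are handled one at a time, so _count is never
            # touched by two threads at once.
            while True:
                msg, reply_to = self.mailbox.get()
                if msg == "increment":
                    self._count += 1
                elif msg == "get":
                    reply_to.put(self._count)
                elif msg == "stop":
                    return

        def send(self, msg, reply_to=None):
            self.mailbox.put((msg, reply_to))

    actor = CounterActor()
    for _ in range(5):
        actor.send("increment")
    reply = queue.Queue()
    actor.send("get", reply)
    print(reply.get())  # prints 5
    actor.send("stop")

The point is simply that this style has not been ignored; it has been built into languages and libraries and used in industry where it is warranted.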

Or look at direct data manipulation as "programming". This hasn't been ignored; it's been actively worked on in every way imaginable. CASE programming received a lot of attention, and still does. Various workflow-based programming models have received just as much attention. What about Flash? HyperCard? Etc. And there are many niche uses where direct data manipulation has proven to be highly useful. But again and again it's proven to be basically incompatible with general-purpose programming, likely because of a fundamental impedance mismatch. A total of billions of dollars in investment has gone into these technologies; it's counterfactual to put forward the notion that we are blind to alternatives or that we haven't tried.

Or look at his example of the Smalltalk browser. How can any modern coder look at that and not laugh? Any modern IDE like Eclipse or Visual Studio can present to the developer exactly that interface.

Again and again it looks like Victor is either blindly ignorant of the practice of programming in the real world or simply adhering to the "no true Scotsman" fallacy: imagining that the ideas he brings up haven't "truly" been tried, not seriously and honestly, that they've just been toyed with and abandoned. Except that in some cases, such as the actor model, they have not just been tried, they've been developed into robust solutions and they are made use of in industry when and if they are warranted. It's hilarious that we're even having this discussion on a forum written in Arc, of all things.

To circle back to the particular examples I gave of alternative important advances in programming (focusing on development velocity and reliability), I find it amusing and ironic that some folks so easily dismiss these ideas because they are so seemingly mundane. But they are mundane in precisely the ways that structured programming was mundane when it was in its infancy. It was easy to write off structured programming as nothing more than clerical work preparatory to actual programming, but now we know that not to be true. It's also quite easy to write off testing and integration, as examples, as extraneous supporting work that falls outside "real programming". However, I believe that when the tooling of programming advances to more intimately embrace these things we'll see an unprecedented explosion in programming innovation and productivity, to a degree where people used to relying on such tools will look on our programming today as just as primitive as folks using pre-structured programming appear to us today.

Certainly a lot of programmers today have their heads down, because they're concentrated on the work immediately in front of them. But the idea that programming as a whole is trapped inside some sort of "box" which it is incapable of contemplating the outside of is utterly wrong with numerous examples of substantial and fundamental innovation happening all the time.

I think Victor is annoyed that the perfect ideal of those concepts he mentions hasn't magically achieved reification without accumulating the necessary complexity and cruft that comes with translating abstract ideas into practical realities. And I think he's annoyed that fundamentally flawed and imperfect ideas, such as the x86 architecture, continue to survive and be eminently practical solutions decade after decade after decade.

It turns out that the real world doesn't give a crap about our aesthetic sensibilities, sometimes the best solution isn't always elegant. To people who refuse to poke their head out of the elegance box the world will always seem as though it turned its back on perfection.


> I get it, I really do.

It's always a red flag when people have to say that. Many experts don't profess to understand even the things they've spent a long time studying.

Ironically, Bret Victor mentioned, "The most dangerous thought that you can have as a creative person is to think that you know what you're doing..."

The points you mention are bewildering, since in my universe, most "technologists" ironically hate change. And learning new things. They seem to treat potentially better ways of doing things like a particularly offensive veggie: they rant at length rather than even simply taste the damn thing, and at best hide behind "Well, it'd be great to try these new things, but we have a deadline now!" Knowing that managers fall for this line each time, due to the pattern-matching they're trained in.

(Of course, when they fail to meet these deadlines due to program complexity, they do not reconsider their assumptions. Their excuses are every bit as incremental as their approach to tech. The books they read — if they read at all — tell them to do X, so by god X should work, unless we simply didn't do enough X.)

It's not just that they reject concrete new technologies; they even fight learning about them well enough to apply the vague lessons to their own solutions.

Fortunately, HN provides a good illustration of Bret Victor's point: "There can be a lot of resistance to new ways of working that require to kind of unlearn what you've already learned, and think in new ways. And there can even be outright hostility." In real life, I've actually seen people shout and nearly come to blows while resisting learning a new thing.


You haven't addressed any of inclinedPlane's criticisms of Bret's talk. Rather, your entire comment seems to be variations on "There are people who irrationally dislike new technology."


Well, I don't agree with your premise, that I haven't addressed any of their criticisms.

A main theme underlying their complaint is that there's "numerous examples of substantial and fundamental innovation happening all the time."

But Bret Victor clearly knows this. Obviously, he does not think every-single-person-in-the-world has failed to pursue other computational models. The question is, how does the mainstream programming culture react to them? With hostility? Aggressive ignorance? Is it politically hard for you to use these ideas at work, even when they appear to provide natural solutions?

Do we live in a programming culture where people choose the technologies they do, after an open-minded survey of different models? Does someone critique the complectedness of the actor model, when explaining why they decided to use PHP or Python? Do they justify the von Neumann paradigm, using the Connection Machine as a negative case study?

There are other shaky points on these HN threads. For instance, inferring that visual programming languages were debunked, based on a few instances. (Particularly when the poster doesn't, say, evaluate what was wrong with the instances they have in mind, nor wonder if they really have exhausted the space of potential visual languages.)


@cali: I completely agree with your points. @InclinedPlane is missing the main argument.

Here is my take. TLDR: Computing needs an existential crisis before the current programming zeitgeist is replaced. Until then, we need to encourage as many people as possible to live on the bleeding edge of "programming" epistemology.

Long Version: For better or for worse, humans are pragmatic. Fundamentally, we don't change our behavior until there is a fire at our front door. In this same sense, I don't think we are going to rewrite the book on what it means to "program" until we reach an existential peril. Intel demonstrated this by switching to multicore processors after realizing Moore's law could not continue via a simple increase in clock speed.

You can't take one of Bret's talks as his entire critique. This talk is part of a body of work in which he points out and demonstrates our lack of imagination. Bret himself points out another seemingly irrelevant historical anecdote to explain his work: Arabic numerals. From Bret himself:

"Have you ever tried multiplying roman numerals? It’s incredibly, ridiculously difficult. That’s why, before the 14th century, everyone thought that multiplication was an incredibly difficult concept, and only for the mathematical elite. Then arabic numerals came along, with their nice place values, and we discovered that even seven-year-olds can handle multiplication just fine. There was nothing difficult about the concept of multiplication—the problem was that numbers, at the time, had a bad user interface."

Interestingly enough, the "bad user interface" wasn't enough to dethrone Roman numerals until the Renaissance. The PRAGMATIC reason we abandoned Roman numerals was the increase in trading across the Mediterranean.

Personally, I believe that Bret is providing the foundation for the next level of abstraction that computing will experience. That's a big deal. Godspeed.


Perhaps. But I think he is a visual thinker (his website is littered with phrases like "the programmer needs to see...."). And that is a powerful component of thinking, to be sure. But think about math. Plots and charts are sometimes extremely useful, and we can throw them up and interact with them in real time with tools like Mathcad. It's great. But it only goes so far. I have to do math (filtering, calculus, signal processing) most every day at work. I have some Python scripts to visualize some stuff, but by and large I work symbolically because that is the abstraction that gives me the most leverage. Sure, I can take a continuous function that is plotted and visually see the integral and derivative, and that can be a very useful thing. OTOH, if I want to design a filter, I need to design it with criteria in mind, solve equations, and so on, not put an equation in a tool like Mathcad and tweak coefficients and terms until it looks right. Visual processing falls down for something like that.
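
To make "design it with criteria in mind" concrete, here's a minimal sketch (assuming SciPy is available; the spec numbers are made up) where the filter follows from stated passband/stopband requirements rather than from visually tweaking coefficients:

    # Hypothetical spec: pass below 50 Hz, reject above 80 Hz, 1 kHz sampling.
    from scipy import signal

    fs = 1000.0            # sample rate in Hz (assumed)
    wp = 50 / (fs / 2)     # passband edge, normalized to Nyquist
    ws = 80 / (fs / 2)     # stopband edge, normalized to Nyquist
    gpass, gstop = 1, 40   # max passband ripple / min stopband attenuation (dB)

    # Solve for the minimum Butterworth order that meets the spec, then design it.
    order, wn = signal.buttord(wp, ws, gpass, gstop)
    b, a = signal.butter(order, wn, btype='low')
    print(order, b, a)     # coefficients fall out of the criteria

A plot of the resulting frequency response is still a useful check afterward, but it verifies the symbolic design rather than replacing it.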

Others have posted about the new IDEs that they are trying to create. Great! Bring them to us. If they work, we will use them. But I fundamentally disagree with the premise that visual is just flat out better. Absolutely, have the conversation, and push the boundaries. But to claim that people that say "you know, symbolic math actually works better in most cases" are resisting change (you didn't say that so much as others) is silly. We are just stating facts.

Take your arabic numbers example. Roman numerals are what, essentially VISUAL!! III is 3. It's a horrible way to do arithmetic. Or imagine a 'visual calculator', where you try to multiply 3*7 by stacking blocks or something. Just the kind of thing I might use to teach a third grader, but never, ever, something I am going to use to balance my checkbook or compute loads on bridge trusses. I'm imagining sliders to change the x and y, and the blocks rearranging themselves. Great teaching tool. A terrible way to do math because it is a very striking, but also very weak abstraction.

Take bridge trusses. Imagine a visual program that shows loads in colors - high forces are red, perhaps. A great tool, obviously (we have such things, btw). But to design a bridge that way? Never. There is no intellectual scaffolding there (pun intended). I can make arbitrary configurations, look at how colors change and such, but engineering is multidimensional. What do the materials cost? How hard are they to get and transport? How many people will be needed to bolt this strut? How do the materials work in compression vs tension? What are the effects of weather and age? What are the resonances? It's a huge optimization problem that I'm not going to solve visually (though, again, visuals will often help me conceptualize a specific element). That I am not thinking or working purely visually is not evidence that I am not being "creative" - I'm just choosing the correct abstraction for the job. Sometimes that is visual, sometimes not.

So, okay, the claim is that perhaps visual will/should be the next major abstraction in programming. I am skeptical, for all the reasons above - my current non-visual tools provide me a better abstraction in so many cases. Prove me wrong, and I will happily use your tool. But please don't claim these things haven't been thought of, or that we are being reactionary by pointing out the reasons we choose symbolic and textual abstractions over visual ones when we have the choice (I admit sometimes the choice isn't there).


Bret has previously given a talk[1] that addresses this point. He discusses the importance of using symbolic, visual, and interactive methods to understand and design systems. [2] He specifically shows an example of digital filter design that uses all three. [3]

Programming is very focused on symbolic reasoning right now, so it makes sense for him to focus on visual and interactive representations, and interactive models are often intertwined with visual representation. Because of this, his focus on a balanced approach to programming seems like constant harping on visualization. I think he is trying to get the feedback loop between creator and creation as tight as possible, using all available means to represent that system.

The prototypes of his I have seen that are direct programming tend not to look like LabVIEW; instead, they are augmented IDEs that have visual representations of processing that are linked to the symbolic representations that were used to create them. [4] This way you can manipulate the output and see how the system changes, see how the linkages in the system relate, and change the symbols to get different output. It is a tool for making systems represented by symbols, but interacting with the system can come through a visual or symbolic representation.

[1] http://worrydream.com/MediaForThinkingTheUnthinkable/note.ht...

[2] http://vimeo.com/67076984#t=12m51s

[3] http://vimeo.com/67076984#t=16m55s

[4] http://vimeo.com/67076984#t=25m0s


Part of Bret's theory of learning (which I agree with) is that when "illustrating" or "explaining" an idea it is important to use multiple simultaneous representations, not solely symbolic and not solely visual. This increases the "surface area" of comprehension so that a learner is much more likely to find something in this constellation of representations that relates to their prior understanding. In fact, that comprehension might only come out of seeing the constellation. No representation alone would have sufficed.

Further, you then want to build a feedback loop by allowing direct manipulation of any of the varied representations and have the other representations change accordingly. This not only lets you see the same idea from multiple perspectives -- visual, symbolic, etc. -- but lets the learner see the ideas in motion.

This is where the "real time" stuff comes in, and also why he gets annoyed when people see it as the point of his work. It's not; it's just a technology to accelerate the learning process. It's a very compelling technology, but it's not the foundation of his work. This is like reducing Galileo to a really good telescope engineer -- not that Bret Victor is Galileo.

I think he emphasizes the visual only because it's so underdeveloped relative to symbolic. He thinks we need better metaphors, not just better symbols or syntax. He's not an advocate of working "purely visually." It's the relationship between the representations that matters. You want to create a world where you can freely use the right metaphor for the job, so to speak.

That's his mission. It's the mission of every constructivist interested in using computers for education. Bret is really good at pushing the state of the art which is why folks like me get really excited about him! :D

You might not think Bret's talks are about education or learning, but virtually every one is. A huge theme of his work is this question: "If people learn via a continual feedback loop with their environment -- in programming we sometimes call this 'debugging' -- then what are our programming environments teaching us? Are they good teachers? Are they (unknowingly) teaching us bad lessons? Can we make them better teachers?"



