If you run across a great HN comment (or comment tree), please tell us at hn@ycombinator.com so we can add it here.

It's pretty interesting to me to see these patterns of who gets credit and who doesn't. My ex-wife got off active duty and joined the National Guard to go to college nearly 20 years ago, before she got back in as an officer through ROTC. While she was in the Guard, she deployed with a civil affairs unit to Djibouti, where we have always maintained a permanent presence to secure shipping lanes for oil. She was their supply sergeant; she identified a problem with the local water supply and got them all the equipment they needed to fix it, and in the process probably legitimately saved at least thousands of lives. She got a Bronze Star for it, but certainly no magazine will ever have heard of her.

It's even more interesting that she got branch-detailed to the field artillery when she got back in, and I'm pretty sure she became the first woman ever to command a combat unit in the Army when she filled in as XO of a forward line battery after it lost its permanent commander. But you'll never see her in a history book or read about her on Wikipedia, because it was still illegal at the time for women to serve in combat units at all, and officially she was on the books as part of the brigade headquarters company, only filling in because they were severely understaffed.

I try to keep her in mind when I feel like I'm being slighted at work and not getting sufficient recognition for accomplishments, as I earn a salary triple what she gets now as a Lieutenant Colonel.


I visited Kai in the castle he bought after moving back to Germany and renamed the 'Byteburg'. I think this was around 2001.

Kai was hosting pitch nights for startup ideas for young people; but really anyone was welcome.

Apart from Kai there was his buddy, Uwe Maurer, and two 'staff' guys who were kinda running things in the castle: upkeep, cooking/food and beverages.

Kai was just all over everyone, running around with a little tray serving nibbles and making sure everyone always had a fresh beer.

A kind, humble and deeply interesting person.

There was chit-chat and board games (mostly strategy stuff like Go & co) before the night deteriorated and we went to the castle's cellar for pool and foosball; until the early morning hours.

An untold story, relayed to me firsthand that night, is how KPT got so popular.

No one knew what a Photoshop plugin for making fancy procedural patterns etc. was useful for. Certainly there was the crowd of people doing flyers for techno clubs/parties, but that was a tiny minority.

Sales were meh. The story goes that Uwe hired a bunch of students that phone-bombed all major US department stores and chains that were selling software at the time.

They pretended they were all studying graphic design and needed KPT "for their assignments".

After that sales started rolling in.

This was relayed to me as more or less the "founding myth" of what later became MetaCreations.


The company that I work for uses a dynamic pricing SaaS. This SaaS basically scrapes the ecommerce websites of 10-15 competitors, including the website of the company I work for, and provides an Excel sheet with every product and the price at which each company sells it. Additionally, if you have a contract with this company, you can also provide it your product stock information on a daily basis, which tells the SaaS how quickly or slowly products are selling.

It provides two services, as far as I could tell (I'm not a pricing expert); a rough sketch of the logic follows the list:

1. It gives you a "dumb" daily recommendation per product to either lower or raise the price so that your price is on par with the competitor who is selling it for the highest price.

2. It also gives you a "smart" daily recommendation per product to either lower or raise the price so that your price is on par with the competitor who is making the most sales.
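
Purely as illustration, here's a minimal Python sketch of those two flavors; the field names, data shapes, and numbers are made up, not the SaaS's actual model:

    # Hypothetical sketch only: the SaaS's real data model and logic aren't public.
    # Both helpers take our current price plus scraped competitor data and return
    # ("raise" | "lower" | "hold", target_price).

    def dumb_recommendation(our_price, competitor_prices):
        # Match the highest price any competitor currently charges.
        target = max(competitor_prices.values())
        action = "raise" if target > our_price else "lower" if target < our_price else "hold"
        return action, target

    def smart_recommendation(our_price, competitor_prices, competitor_sales):
        # Match the price of whichever competitor moves the most units.
        leader = max(competitor_sales, key=competitor_sales.get)
        target = competitor_prices[leader]
        action = "raise" if target > our_price else "lower" if target < our_price else "hold"
        return action, target

    # Made-up numbers for one product:
    prices = {"comp_a": 9.99, "comp_b": 11.49, "comp_c": 10.25}
    sales  = {"comp_a": 420,  "comp_b": 75,    "comp_c": 310}
    print(dumb_recommendation(10.25, prices))           # ('raise', 11.49)
    print(smart_recommendation(10.25, prices, sales))   # ('lower', 9.99)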

Now, I see no reason why all 10-15 competitors couldn't be subscribed to this SaaS, and therefore all of them could be pricing their products practically the same. It's hard for me to come to terms with, because on one hand it is "pricing it right" and on the other it is algorithmic collusion.

The only moral issue I see here is that some products are "essential" (think food, toiletries) and therefore the customer will buy these products at prices that they are not comfortable with.


I wasn't involved in this study, but I wrote the study that estimated the magnitude of this earthquake[0]. In case anyone is interested, usually the magnitudes of 'paleoearthquakes' (historic/prehistoric earthquakes discovered by finding evidence of old ground deformation) are estimated by relating the measured offset of the earth's surface or a rock/dirt layer across the fault line to the earthquake magnitude through empirical 'scaling relationships'; larger offsets are of course indicative of larger earthquakes. These are simply functions relating a measurable attribute of the earthquake to its magnitude. In the study I did, we combined the measurements of the offsets of a number of paleoearthquakes with estimates of the map length of the fault lines involved and used length-magnitude scaling relations to further refine the final magnitudes. There are some corrections for sampling bias that are included in there and it's all nice and Bayesian if anyone wants to nerd out on the stats.
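
In case the 'scaling relationship' idea sounds abstract: it's essentially a regression of magnitude on something you can measure in the field. A toy Python sketch of the general form, with placeholder coefficients rather than the actual regressions and bias corrections used in the paper:

    import math

    # Illustrative only: generic log-linear scaling of the form M = a + b * log10(X),
    # where X is average surface offset (m) or rupture length (km). The coefficients
    # here are placeholders, not the paper's regressions or Bayesian machinery.
    def magnitude_from_offset(avg_offset_m, a=6.9, b=0.8):
        return a + b * math.log10(avg_offset_m)

    def magnitude_from_length(rupture_length_km, a=5.1, b=1.2):
        return a + b * math.log10(rupture_length_km)

    # e.g. a 5 m average offset on a ~70 km rupture:
    print(round(magnitude_from_offset(5.0), 2))   # ~7.46
    print(round(magnitude_from_length(70.0), 2))  # ~7.31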

When we did the study, it was speculated that two of the paleoearthquakes, one on the Seattle Fault and one on another fault on the Olympic Peninsula, could actually have occurred as a single event, but there wasn't much evidence to support this; we consider the magnitude of that scenario in a paragraph at the top of page 1149 but not in the rest of the paper. The recent study (TFA) makes it highly likely that they were part of the same earthquake, though they could still be separate earthquakes spaced a few minutes to a few months apart (think of the 7.8 and 7.7 earthquakes in Turkiye this spring, separated by a few hours).

A bit of context about the earthquakes in the Seattle region as well as Cascadia and other areas:

- The earthquakes in the Puget Lowlands and vicinity are relatively infrequent; there are about 15 known earthquakes over the past 17,000 years, and many of them are relatively small (M 6-7). However, they are spatiotemporally clustered[1]: there was a big cluster around 900 AD, and things have been mostly quiescent since then. It can also be shown from the geologic data that at the measurement sites ('paleoseismic trenches'), there haven't been any earthquakes on many of the faults since 17,000 years ago (when the Puget ice sheet retreated), although the Seattle fault has had a number of earthquakes before.

- The big Cascadia subduction zone events are more frequent (perhaps every 500 years?) and larger, but they may not all be M 9 events, unlike what has been discussed in the famous New Yorker article. That article is based largely on the research of Chris Goldfinger, a scientist at Oregon State University, whose views are credible but on the high side of credible, in the eyes of many other scientists in the region. Many of the earthquakes suggested by the geologic data could be smaller earthquakes (M 7.5-8.5) which won't cause as much ground shaking over such a wide region.

- Earthquakes cause seismic waves at the fault surface, and these attenuate as they travel through the earth towards the surface. The initial magnitude of the waves as the earthquake occurs can be different for subduction zone earthquakes than for shallow earthquakes in the crust, and the attenuation is different for these as well. But importantly, not only are subduction zone earthquakes far offshore, but much of the seismic energy is released deeper in the earth as well, which means more attenuation of ground shaking by the time the waves make it to Seattle.

- A Cascadia earthquake will cause widespread but perhaps moderate damage across the PNW, but a strong Seattle fault earthquake will absolutely destroy central Seattle, particularly Pioneer Square and Sodo. The fault comes ashore at Alki Point, for reference. However, areas farther away (Edmonds, Tacoma, etc.) will not see nearly as much damage.

- SF and LA both have higher seismic hazard than Seattle[2], considering all earthquake sources, the frequency and magnitudes of earthquakes from the sources, and the seismic ground motions emanating from all of these earthquakes to a site within any of the cities, according to the most recent USGS national seismic hazard model. (See Figure 12 for hazard curves for major US cities).

[0]: https://rocksandwater.net/pdfs/styron_sherrod_bssa_puget_eq_...

[1]: https://pubs.geoscienceworld.org/gsa/geosphere/article/10/4/...

[2]: https://journals.sagepub.com/doi/10.1177/8755293019878199


The article says "it has happened" and cites a footnote mentioning the Galileo antenna deployment. I was part of the team that built the Galileo Jupiter Atmospheric Probe and followed most aspects of the whole mission closely, but I have never before heard of this being the cause of the antenna deployment problem. I just read the cited paper (from 1994) and it does seem to be conclusive. If websites like HN had been operating in 1994, this news probably would have reached me almost three decades ago.

https://en.wikipedia.org/wiki/Galileo_(spacecraft)#Galileo_e...


This is so amazing!

Exactly 36 years ago, I was 7 years old and I remember going into an electronics shop that caught my eye.

I was literally always interested in electronics. I finally summoned up the courage and went in. Inside this shop, which to me looked like a candy shop, was a very friendly man. I ended up convincing my parents to buy me my first electronics kit from him, an audio amp.

I soldered the kit and placed it in a fancy metal box together with a heavy transformer, making the thing pretty beefy, all of which was purchased from him. I remember bringing it to the shop to show it off. He was very happy with my work.

My parents ended up buying more and more kits, and for many years afterwards I remember just randomly dropping into the shop whenever I passed by. He was always himself: friendly, helpful, busy helping someone or repairing something, smoking, always smoking.

Not too long ago I passed by there again and saw that the shop had closed. It was a sad moment.

David, believe it or not, is literally the repairman mentioned in the story. He must be at the very least 70 by now. I ended up spending half an hour reading some of the stuff he wrote in the guide, just for a glimpse of the good old times.

I hope he's still alive! I'm sure there are many other kids and teens he positively touched.

Great memories, never to be forgotten.


Ha! This was probably the first serious problem I ever tackled with an open source contribution!

The year was 2002, the 2.4 Linux kernel had just been released and I was making money on the side building monitoring software for a few thousand (mostly Solaris) hosts owned by a large German car manufacturer. Everything was built in parallel ksh code, “deployed” to Solaris 8 on Sun E10Ks, and mostly kicked off by cron. Keeping total script runtime down to avoid process buildup and delay was critical. The biggest offender: long timeouts for host/port combinations that would sporadically not be available.

Eventually, I grabbed W. Richard Stevens' UNIX network programming book and created tcping [0]. FreeBSD, NetBSD, and a series of Linux distros picked it up at the time, and it was a steady decline from there… good times!

[0]: https://github.com/mkirchner/tcping
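
For anyone curious what the tool boils down to, here's a rough Python sketch of the same idea, i.e. a TCP connect attempt bounded by a hard timeout (the original is C, and this is only an approximation of its behavior):

    #!/usr/bin/env python3
    # Rough equivalent of the tcping idea: try a TCP connect with a hard timeout
    # instead of waiting out the OS default, and report open/closed quickly.
    import socket
    import sys

    def tcping(host, port, timeout=3.0):
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True          # port is open and accepting connections
        except OSError:
            return False             # refused, unreachable, or timed out

    if __name__ == "__main__":
        host, port = sys.argv[1], int(sys.argv[2])
        print(f"{host}:{port} is {'open' if tcping(host, port) else 'closed/unreachable'}")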

edit: grammar


It's not the software per se, which is generally fit for purpose but not amazing, but the traditions and economics underpinning how libraries maintain their bibliographic metadata.

Libraries sharing metadata for their catalogs has a long history, dating back to at least 1902 when the Library of Congress started selling catalog cards for use by other libraries. In the 1960s, the Library of Congress embarked on various projects to computerize their catalog, leading to the creation of the MARC format as a common metadata format for exchanging bibliographic records. (And there is a straight line between how card catalogs were put together and much of how library metadata is conceptualized, although that's been (slowly) changing.)

One problem is that bibliographic metadata from the Library of Congress is mostly generated in-house, and LoC does not catalog everything; not even close. In the late 1960s, OCLC, the organization behind Worldcat, was started to operate a union catalog. The idea is that libraries could download bibliographic records needed for their own catalogs ("copy cataloging") and contribute new records for the unique stuff they cataloged ("original cataloging"). Under the aegis of OCLC as a non-profit organization, it was a pretty good deal for libraries, and over time led to additional services such as brokering interlibrary loan requests. After all, since Worldcat had a good idea of the holdings of libraries in North America (and over time, a good chunk of Europe and other areas), it was straightforward to set up an exchange for ILL requests.

Tie this to a general trend over the past couple of decades of libraries decreasing the funding and staffing for maintaining their local catalogs, and the need to share the creation and maintenance of library metadata has only gotten more important.

However, OCLC has had a long history of trying to control access and use of the metadata in WorldCat, to the point of earning a general perception in many library quarters of trying to monopolize it. To give a taste, Aaron Swartz tangled with them back in the day. [1] One irony, among many, is that the majority of metadata in Worldcat has its origins in the efforts by publicly-funded libraries and as such shouldn't have been enclosed in the first place. OCLC also has a focus on growing itself, to the point where it does far more than run Worldcat. Its various ventures have earned itself a reputation for charging high prices to libraries, to the point where it can be too expensive for smaller libraries to participate in Worldcat. (Fortunately for them, there are various alternative ways of getting MARC records for free or very cheap, but nobody has a database more comprehensive than Worldcat.)

That said, OCLC does do quite a bit itself to improve the overall quality of Worldcat and to try to push libraries past the 1960s-era MARC format. But one of the ironies of the scraping is that it's not going to be immediately helpful to the libraries that are unable to afford to participate in Worldcat. This is because the scrape didn't capture (and quite possibly never could have captured) the data in MARC format, which is what most library catalog software uses. While MARC records could be cross-walked from the JSON, they will undoubtedly omit some data elements found in the original MARC.
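
To make the crosswalk point concrete, here's a toy Python sketch mapping a hypothetical scraped-JSON record onto a few MARC tags; real crosswalks and real MARC records carry far more fields, indicators, and subfields than a scrape is likely to preserve:

    # Toy crosswalk from a hypothetical scraped-JSON record to a few MARC tags,
    # just to show the shape of the problem: the JSON carries a handful of
    # elements, while a full MARC record has many more fields, indicators, and
    # subfields with cataloging-specific meaning that won't survive the trip.
    def json_to_marcish(rec):
        fields = {}
        if rec.get("creator"):
            fields["100"] = {"a": rec["creator"]}       # main entry, personal name
        if rec.get("title"):
            fields["245"] = {"a": rec["title"]}         # title statement
        if rec.get("publisher") or rec.get("date"):
            fields["264"] = {"b": rec.get("publisher", ""),
                             "c": rec.get("date", "")}  # publication statement
        if rec.get("isbn"):
            fields["020"] = {"a": rec["isbn"]}          # ISBN
        return fields

    print(json_to_marcish({"title": "An Example Title", "creator": "Doe, Jane",
                           "publisher": "Somewhere Press", "date": "1999",
                           "isbn": "9780000000000"}))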

[1] http://www.aaronsw.com/weblog/oclcreply


Decades later I discover I was his first boss

Interesting read. I wrote the original compiler back in 2002/2003, but a lot changed by the time it was open sourced (including the confusing name -- I just called it a javascript compiler).

One detail this story gets wrong though is the claim that, "The Gmail team found that runtime JavaScript performance was almost irrelevant compared to download times." Runtime performance was actually way more important than download size and we put a lot of effort into making the JS fast (keep in mind that IE6 was the _best_ browser at the time). One of the key functions of the js compiler was inlining and dead-code removal so that we could keep the code readable without introducing any extra overhead.


When my son was younger - maybe 9 or 10 or so, we were on a plane and he was using his phone and I looked over his shoulder and realized he was on the internet... but I hadn't paid for an internet plan. I said, "son, how are you using the internet?" He said, "oh, a kid at school showed me - if you go here" (he opened up the wifi settings where the DHCP assigned IP address is) "and start changing the numbers, eventually the internet will work." Apparently, at the time, on American Airlines, when somebody bought and paid for an internet plan, it gave them an IP address and authorized it to use the internet... if somebody else guessed your IP address (which was pretty easy, it was a 192.168 address) and spoofed it, they could take over your internet connection with no further authorization.

I had to tell him not to do that, but I was kind of proud of him for having the temerity to go for it.


I remember chatting with some Nvidia rep at CES 2008. He showed me how cuda could be used to accelerate video upscale and encoding. I was 19 at the time and just a hobbyist. I thought that was the coolest thing in the world.

(And yes I "snuck" in to CES using a fake business card to get my badge)


An early story of Ballard's, "The Voices of Time", blew my young mind as a teenager and single-handedly projected me toward a future career in distant places, a journey I could scarcely have imagined from my life on a remote Midwest farm. It was immensely gratifying, decades later, to have an opportunity to thank the author personally after a lecture in London.

Just in case there are any "future me's" out there, here's a link to an unpredictable future adventure.

https://readerslibrary.org/wp-content/uploads/The-Voices-of-...


Oh, how I wish I had your scripts (and insights!) when I was analyzing Unix logs in 1986, looking for the footprints of an intruder...

Nice work! Always fun to see something I wrote long ago reverse engineered. The packet format was indeed inspired by ESP over UDP, and I named it XSP. After system link shipped with the original launch of the console, I also worked on Xbox Live networking, including the client/server interactions and the design and implementation of the front-end Security Gateways that all Xboxes would talk to, first to authenticate themselves to the service, and then to maintain a heartbeat connection to the service (to keep NAT ports open during idle time), and to facilitate NAT traversal.

Telcos used to monitor their copper outside plant for moisture. This was called Automatic Line Insulation Testing in the Bell System. The ALIT system ran in the hours before dawn. It would connect to each idle line, and apply, for tens of milliseconds, about 400 volts limited to very low current between the two wires, and between each wire and ground, measuring the leakage current. This would detect moisture in the cable. This was dealt with by hooking up a tank of dry nitrogen to the cable to dry it out.

Here's a 1960s vintage Automatic Electric line insulation test system at work in a step-by-step central office. [1] Here's the manual for automatic line insulation testing in a 5ESS switch.[2] 5ESS is still the major AT&T switch for copper analog phone lines. After that, it's all packet switching.

For fiber, of course, moisture doesn't affect the signal.

This led to an urban legend: "bell tap". While Western Electric phones were designed to not react to the ALIT test signal, many cheap phones would emit some sound from the "ringer" when the 400V pulses came through, some time before dawn.

[1] https://www.youtube.com/watch?v=Wt1GGdDa5jQ

[2] https://www.manualslib.com/manual/2755956/Lucent-Technologie...


Yep! I worked there from 2007-2012, and at the time we used Photoshop, which on the surface sounds counter-intuitive (being a primarily raster-based program.) We would take screenshots of product (or get screenshots from HI if we weren't allowed access), and would redraw everything in Photoshop at a minimum of 288ppi (4x) using vector shapes and type layers, lots of nested smart objects, and layer effects (back in the iOS <6 days everything had gloss and shadows and reflections.) Content teams pop in retouched photography, photos of employees as avatars, names of employees to go with those avatars, and text to tell a story for that launch. Our 4x vector screens (sometimes well over 2GB PSBs) would then be shared with international teams for translation, and translations would come back to HQ for a final approval. We would also sometimes scale these screens as high as 32x for output to MacWorld / WWDC hero banners which was fun. You better believe we put our names in all these screens (mine appears once in the linked article!)

Enron...sigh.

I went to college near Houston (Texas A&M) and interviewed with Enron in early-mid November, 2001 prior to December graduation. Given the timing, I believe it may have been the very last interview loop they did.

For a soon-to-graduate Comp Sci student, Enron put on an amazing presentation. The interview was onsite at Enron headquarters. They took the candidates to dinner one evening, put us up in a nice hotel, and spent all day the next day grilling us in panel interviews, having us work individual problems, and challenging groups of interviewees with team problems. It was calculated, thorough, and to this day it's the most comprehensive and satisfying interview experience I've ever had. Driving home, I knew they had successfully sifted our group to identify who the best candidates were.

The job market was also really tough in late 2001, with the dotcom bust and 9/11 making it challenging to find good opportunities. If memory serves, some of the other options I was considering at the time were to write PowerBuilder for an old-school oil company and maintain Fortran for a small, decades-old engineering firm. Enron, in comparison, seemed like a godsend.

Enron was opulent. Enron paid well. The software people seemed genuinely interested in what they were doing. Everyone seemed smart and ambitious. We were shown one room that was an open workspace and each desk had a pair of flat panel monitors (this was three years before I bought the first flat panel for my home PC and years before I had two screens at home). Enron was as sexy to a software professional as any workplace in the world in its time.

I knew there was some drama surrounding the company, but I was generally oblivious to the details. I was overwhelmed with a 16-hour course load, a student job slinging C++ for a local consultancy, and a job search. While onsite, a few better-informed candidates asked some of the employees about the brewing scandals. The Enron folks shrugged it off. They believed in the company and believed in the work they were doing.

I had a great interview and left really excited. A week later I got a call from their recruiter and was told they liked me and an offer was incoming. I was stoked. Then, silence. Maybe the Thanksgiving holidays were the reason for the delay, I thought. In another week or two, the Enron collapse was headline news. Every single Enron employee I had interacted with was out of a job, and the prospect of thousands of Enron employees hitting the market simultaneously made my job search even tougher.

Enron provided me and my fellow interviewees some SWAG while we were there. I had a hat, a clear plastic mug, and a few other items. I sold them for a dollar at a garage sale a few years later.


Wow. This post gave me emotional whiplash.

I opened the collection of links, which is quite good if a bit old. But then I had a subconscious mental itch, and thought, wait... where had I heard the name mrelusive before? That sounds _really_ familiar.

And then I remembered - oh, right, mrelusive, JP-what's-his-name. I've read a huge amount of his code. When I was working on Quake4 as a game programmer and technical designer, he was writing a truly prodigious amount of code in Doom 3 that we kept getting in code updates that I was downstream of.

And he was obviously a terrifically smart guy, that was clear.

But I had cut my teeth on Carmack's style of game code while working in earlier engines. Carmack's style of game code did, and still does, heavily resonate with my personal sensibilities as a game maker. I'm not sure if that particular style of code was influenced by id's time working with Objective-C and NeXTStep in their earlier editors, but I've long suspected it might have been - writing this comment reminds me I'd been meaning to explore that history.

Anyway, idTech4's actual game (non-rendering) code was much less influenced by Carmack, and was written in a distinctly MFC-style of C++, with a giant, brittle, scope-bleeding inheritance hierarchy. And my experience with it was pretty vexed compared to earlier engines. I ultimately left the team for a bunch of different reasons a while before Quake4 shipped, and it's the AAA game I had the least impact on by a wide margin.

I was thinking about all this as I was poking over the website, toying with the idea of writing something longer about the general topics. Might make a good HN comment, I thought...

But then I noticed that everything on his site was frozen in amber sometime around 2015... which made me uneasy. And sure enough, J.M.P. van Waveren died of cancer back in 2017 at age 39. He was a month younger than me.

I didn't really know him except through his code and forwards from other team members who were interacting with id more directly at the time. But what an incredible loss.


First, I have been a big fan of your articles since even before I joined IPinfo, where we provide an IP geolocation data service.

Our geolocation methodology expands on the methodology you described. We utilize some of the publicly available datasets that you are using. However, the core geolocation data comes from our ping-based operation.

We ping an IP address from multiple servers across the world and identify its location through a process called multilateration. Pinging an IP address from one server gives us one dimension of location information, meaning that, based on certain parameters, the IP address could be anywhere within a certain radius on the globe. Then, as we ping that IP from our other servers, the location estimate becomes more precise. After enough pings, we have very precise IP location information that almost reaches zip-code-level precision with a high degree of accuracy. Currently, we have more than 600 probe servers across the world, and the network is expanding.
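
For a feel of how multilateration narrows things down, here's a rough sketch of the constraint-intersection idea; the probe locations, RTTs, and the roughly 100 km per millisecond of RTT rule of thumb are illustrative assumptions, and the production pipeline is certainly more sophisticated:

    # Sketch of RTT-based multilateration as constraint intersection. Assumption:
    # packets travel at roughly 2/3 the speed of light in fiber, so an RTT of
    # t ms bounds the one-way distance at about t/2 * 200 km/ms = t * 100 km.
    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2):
        # Great-circle distance between two points on Earth, in km.
        dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
        a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
        return 2 * 6371 * asin(sqrt(a))

    def feasible(candidate, probes):
        # A candidate location survives only if it lies inside every probe's
        # RTT-derived radius; more probes shrink the feasible region.
        lat, lon = candidate
        return all(haversine_km(lat, lon, plat, plon) <= rtt_ms * 100
                   for plat, plon, rtt_ms in probes)

    # Probes as (lat, lon, measured RTT in ms) -- made-up numbers.
    probes = [(40.71, -74.01, 12),    # New York
              (51.51, -0.13, 75),     # London
              (35.68, 139.69, 170)]   # Tokyo
    for name, cand in {"Chicago": (41.88, -87.63), "Frankfurt": (50.11, 8.68)}.items():
        print(name, feasible(cand, probes))   # Chicago True, Frankfurt False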

The publicly available information that you are referring to is sometimes not very reliable in providing IP location data as:

- They are often stale and not frequently updated.

- They are not precise enough to be generally useful.

- They provide location context at a large IP range level or even at an organization-level scale.

And last but not least, there is no verification process with these public datasets. With IPv4 trade and VPN services becoming more and more popular, we have seen evidence that in some instances inaccurate information is being injected into these datasets. We are happy and grateful to anyone who submits IP location corrections to us, but we do verify these correction submissions for that reason.

From my experience with our probe network, I can definitely say that it is far easier and cheaper to buy a server in New York than in any country in the middle of Africa. Location of an IP address greatly influences the value it can provide.

We have a free IP to Country ASN database that you can use in your project if you like.

https://ipinfo.io/developers/ip-to-country-asn-database


I did my PhD on energy harvesting (specifically focusing on hostile environments with high temperature or high radiation) around 15 years ago, and harvesting from stray EM radiation was the holy grail for room-temperature stuff where vibrations or heat gradients couldn't be found.

If you're willing to sacrifice always-on connectivity and have a node report in on an infrequent basis, then I always figured EM harvesting would be the way to go for most applications, since even a tiny amount of energy can build up over time to become a useful amount.

I knew I'd gone deep into this world when I started thinking that microwatts was a large amount of power!
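
To put numbers on the "tiny amounts build up" point, a back-of-envelope sketch; the harvested power and the per-report energy cost are assumed ballparks, and storage leakage and sleep current are ignored:

    # Back-of-envelope budget for the "trickle in, burst out" pattern: harvest a
    # few microwatts continuously, bank it, and spend it on occasional reports.
    # The 100 uJ per report is an assumed ballpark, not a measured figure.
    harvest_uW = 5            # continuous harvested power, microwatts
    report_uJ = 100           # assumed energy cost of one radio report, microjoules

    seconds_between_reports = report_uJ / harvest_uW      # energy in = energy out
    reports_per_day = 86400 * harvest_uW / report_uJ
    print(f"one report every {seconds_between_reports:.0f} s, "
          f"~{reports_per_day:.0f} reports/day")          # every 20 s, ~4320/day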


As a 7-year-old, I met two of them at IFIP68 in Edinburgh. I don't have much recall of Minsky, but McCarthy was nice. We went out to Turnhouse airport and he flew a light plane around. At dinner he commented that we did the washing up by hand, and when he offered to send us a dishwasher my mother (of course) said "no no John, don't be silly", but she said to me in the kitchen later on: "if he ever offers again, accept"

We also ate at a Chinese restaurant across town, and he was surprised when my brothers and sisters and I said bird's-nest soup was disgusting.

Susy McCarthy stayed with us for a while. She was a typical Californian teenager, eating her cereal with the wrong hand holding the spoon and with her feet up on the chairs, and we younger kids were told in no uncertain terms not to emulate her. She refused to go to school and there were exasperating phone calls about what to do. In the end she went somewhere else. John's wife died climbing Mt Everest. It was really sad. I think it was the first all-women's climbing team to try the route (or even Everest's pinnacle).

Minsky and McCarthy were quite happy to respond to emails in the 80s and 90s. I asked Minsky about writing with Harry Harrison and he said he was frustrated at the ending of the book they did ("The Turing Option"), and McCarthy talked about his ideas of long-term information ontologies in a notation or language he called "Elephant".

McCarthy was a notable USENET curmudgeon and would be oppositional to any anti-nuclear, anti-oil, anti-plastic agenda he saw. He also said he cried when he learned the bombs had been dropped, and that the invasion of Japan was off, because he knew he wouldn't die on a beach in an opposed landing. (It is possible I have this mixed up with memories of what Paul Fussell wrote, but I am pretty sure John said this)

John played "Alice's Restaurant" on the guitar; he had a very nasal, whiny voice.


> the backup system applied the same logic to the flight plan with the same result

Oops. In software, the backup system should use different logic. When I worked at Boeing on the 757 stab trim system, there were two avionics computers attached to the wires to activate the trim. The attachment was through a comparator, that would shut off the authority of both boxes if they didn't agree.

The boxes were designed with:

1. different algorithms

2. different programming languages

3. different CPUs

4. code written by different teams with a firewall between them

The idea was that bugs from one box would not cause the other to fail in the same way.
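
To make the comparator idea concrete, here's a conceptual Python sketch; it is not Boeing's implementation (the source describes a hardware comparator between the boxes and the trim wires), and the two channels are hypothetical stand-ins for the independently developed boxes:

    # Conceptual sketch of the comparator idea, not Boeing's implementation.
    # The two channels are hypothetical stand-ins for independently developed
    # boxes that should reach the same answer by different routes.
    def channel_a(airspeed_kts, column_force_lbf):
        # hypothetical algorithm / team / CPU #1
        return 0.8 * column_force_lbf / max(airspeed_kts, 1.0)

    def channel_b(airspeed_kts, column_force_lbf):
        # hypothetical algorithm / team / CPU #2, written independently
        return column_force_lbf * (1.0 / max(airspeed_kts, 1.0)) * 0.8

    def comparator(cmd_a, cmd_b, tolerance=1e-3):
        # Grant trim authority only while both channels agree.
        if abs(cmd_a - cmd_b) <= tolerance:
            return cmd_a
        return None   # disagreement: remove authority from both, flag the fault

    cmd = comparator(channel_a(250.0, 12.0), channel_b(250.0, 12.0))
    print("trim command:", cmd if cmd is not None else "AUTHORITY REMOVED")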


I worked in a lab studying malaria vaccinology. There are a bunch of difficulties with trying to develop a vaccine, which I think are well covered here. I also learned a bit about the history of malaria research.

1. It is a pain in the ass to study malaria. You need to have an insectary to grow mosquitoes (through their multiple life stages, of course) and have them feed on infected mice just to passage the parasite. This isn't like viruses, where you just throw them into Vero cells, or bacteria that will grow in LB overnight. Parasites require dedication. This kind of operation costs a university hundreds of thousands of dollars to get up and running, and there are not too many places in the US that have robust malaria research because of it. UW and the Boston area are two that I know with good malaria research centers.

2. The lifecycle of malaria is very difficult to make a vaccine against. This is described in the article. Essentially, you go from mosquito -> skin parasite (few hr) -> liver parasite (7d also no symptoms) -> blood parasite -> mosquito. Also, the prevailing idea is that the amplification during the liver stage through red blood cell stage is so great that once the blood stage is established, it's game over. You are going to get sick as a dog. So you have a few options: target the sporozoite in the skin and blood within a couple hours, or target the liver stage where the parasite is essentially dormant and is nearly impossible to find. (My research was finding antigens in the liver stage, there are very few and they don't produce very good immune responses with standard vaccination techniques). You have to remember that for the immune system to work, you need to see the signs of the pathogen, then give yourself 5 days at the absolute minimum to expand your T cells to eradicate the pathogen. And oftentimes we're talking less than 10 infected cells in the entire liver, which is a huge organ with famously tortuous circulation. So good luck on the liver stage.

All this is ignoring the fact that the mouse model of malaria goes through the liver stage in just 4 days, which doesn't allow expansion of T cells for killing the bug. So even if we found the perfect antigen to vaccinate against, we don't have good models in mice to actually evaluate how effective a vaccine would be, because of the differing biology of the model. There are some people who reconstitute human livers in mice so they can use the 7-day parasite, but those mice are pretty messed up and very expensive. And handling mosquitoes that carry human malaria is much more annoying than the mouse malaria. Still a very interesting and compelling model for research.

3. Interest in protein based vaccinations. There was a study long ago that used irradiated parasites that protected people from subsequent infections. The problem is that it took a LOT of parasites, and 5 doses of the vaccine. This strategy is similar to the inactivated virus kind of vaccinations that we all likely have received. But these parasites needed to be injected IV, 5 doses, and the parasites need to be kept at -80C until injection time. That might work for a vaccine delivered in a metropolitan area, but good luck finding -80 freezers in an African village. Around this time, researchers got interested in protein based vaccines, like the pertussis vaccine. So people went looking for a protein that was highly immunogenic, and they landed on CSP, which the article describes nicely. However, this again is mostly expressed on the skin parasite so you have just hours to recognize that protein and kill the parasite. Much less than the ideal 5-7 days.

4. Financial incentives are of course a problem. Though many would argue that financial incentives are weak for vaccine development in general: vaccines prevent the disease from occurring, so you are eliminating your own market if they work well.

So where does this leave us? If we keep thinking of vaccines as we typically do, we are going to just create marginally better versions of RTS,S, which isn't great. In my opinion, the most likely way to vaccinate against malaria is transmission blocking vaccines, which would eliminate only the blood stage parasites that need to be picked up by a feeding mosquito to allow replication in the mosquito foregut. But this kind of vaccine wouldn't prevent the individual from getting sick. It would take a replication cycle of a fully vaccinated population to take it out, which is a very unappealing proposition.


A slight tangent - I played small college football as a lineman and every once in a while we'd be in a televised game. Those games were almost a different sport because they'd stop play during commercial breaks, so it was as if 10+ random time-outs were sprinkled through the game.

While not an exact comparison, playing line during a possession is like having a series of 5 to 10-second intervals where you're either sumo wrestling or sprinting, with 10-40 seconds to recover between each. So during the off-season, linemen are trying to reach a balance of strength, speed, and cardio to meet their team's play style. Because most of our games were non-televised (and therefore we didn't get the additional breaks) our linemen conditioned more heavily for cardio/recovery. So when we'd play a televised game we'd basically feel like our cardio was "underutilized". I'd love to see an analysis of whether our team ran more line-intensive plays during these games because we had the additional recovery time.

It's weird for me to see these huge guys in D1 college and the NFL playing line, because I'm pretty sure they'd cramp up/get gassed during a drive in a non-televised game.


I recently attended the World Athletics Championships in Budapest, supporting my fiancée in the 5000m. I'm not super athletic myself, but one of the host country's notable figures is Ernö Rubik, inventor of the Rubik's Cube.

They had a social media thing going around where you could tag your country for a chance to get featured. From a 14 second PB in my youth, it was still pretty trivial to get somewhere in the 25-30 second range on the janky stock cubes they distributed to all the athletes. It was probably the most I did for Team Canada during that trip.


As a diesel engine mechanic by trade, some of the stuff these professional drivers endure in these temperatures is just unreal to me. Performing a pre-drive check in -60°F weather is insane.

We got an old logging Peterbilt from the Yukon in our shop once. Definitely driven, definitely well maintained. Popping the hood, there was a big orange sticker near the radiator warning us "DO NOT FILL UNDER 60C." It stumped us for a bit until we found out that truckers in the north sometimes never run engine coolant because it may freeze up. Pretty surreal.


Many years ago, I worked on a product that provided a bunch of old emulated games. We properly licensed them, but many of the license holders no longer had the original ROMs, ripping the game data off of some of the really old consoles was quite difficult, and the only ROMs available publicly were cracked copies with demos added. That was the birth of the demoscene, which was awesome, but bad for us trying to legitimately provide these games. So we ended up cheating. We used the cracked versions of these games, but we always loaded the games from a state just past when the demo would play, making them look normal and legit. Thank God all those early cracks put their demos only at the start and not, like, between levels 1 and 2, or we'd've been screwed.

Fun (unrelated) oscilloscope story happened to me recently. I was having some problems with fingers of my hand going numb so the orthopedic surgeon sent me for a procedure she said would be "unfortunately pretty unpleasant" called a "nerve conduction study" with an eminent London professor to try to diagnose the problem.

When the day came the dude showed up with a laptop attached to a small box and a number of leads. I said to him[1] "hey there, I'm an engineer, would you mind telling me how this study works" and he (delighted) explained to me that although his title was a professor of neurology he was actually a biomedical engineer and the procedure was hooking me up to an oscilloscope and sending currents down the various nerves of my arm to see whether (or not) they were detectable on the oscilloscope and therefore being conducted properly down the nerves. So I spent 30mins or so having various electric shocks applied to my arm and looking at the pictures on the (very fancy) oscilloscope while shooting the breeze about medical imaging and machine learning with this professor.

The procedure was mildly unpleasant but super-interesting as a result.

[1] Sort of as a coping strategy because I expected it to be painful.


Andre LaMothe, my counterpart writer for Waite Group Press. I wrote “The Black Art of Windows Game Programming” in 1994. Mitch (Waite) loved those “Black Art” titles.

Edits for spelling and year, which I’m still not sure about.

