
Curious how this scales. Just tried this with the test dataset and it was probably the slickest deduplication experience I’ve had


Appreciate the kind words! Linear scaling in terms of speed and cost. We haven't yet optimized the prompts & choice of model to minimize token usage, so I'd recommend emailing us for advice if you want to run this on a large dataset


Is there any way to even use this today? I’ve been waiting for the server and file system components to be open sourced for a couple years now


I used it for the last year or two at a company where all of our source control was with github. It has bindings to interact with github (not sure about what else) that allowed me to use it locally without anyone else having to change their workflows/install anything. Granted, I was mostly using it for stacked pr management among a few other things and not really fully taking advantage of all of it.


Yes, it uses Git as the default backend, so it's more or less just a different interface to a Git repository. Everyone today uses it this way.

The server-side components have been open source, but not fully usable due to fb-only code. That's changing and you can in theory build a working OSS server now that works on mysql/s3, but it isn't supported yet.


This seems weirdly hostile. He laid out a bunch of points but you’re grabbing on to this one to make it seem like he’s using classic corporate-speak. Do you find it so unrealistic that the CEO of Sourcegraph has heard from devs that their managers asked them to try to clone or investigate the product before buying? That seems pretty likely


Investigating Sourcegraph's source code as part of procurement is not only plausible, but useful work that a software engineer should be happy to do.

Claiming it's a good thing to make such evaluations impossible is therefore more bullshit than the other stated reasons for going closed source.



It's both hostile and, worse, boring. I know it sucks to be intrinsically less interesting than someone you disagree with passionately, but it is the case here that the CEO of the company explaining their policy shift is much more interesting than your rebuttals, which seem superficial and rote by comparison.

Someday somebody is going to be intrinsically more interesting about, like, supporting DNSSEC than me (maybe Geoff Huston will sign on and start commenting), and I'm going to want to claw my eyes out. I have empathy for where you're coming from. But can you please stop trying to shout this person down?


If we ignore the final sentence of his reason, then you might have a point. But given his reason ends with:

> This honestly was just a waste of everyone's time.

It's pretty clear that the benefit to Sourcegraph (i.e. not wasting time negotiating with companies acting in bad faith) was a large part of the rationale.

Besides, if you had ever tried using the OSS version of Sourcegraph, you would realise that OSS Sourcegraph is a shadow of its enterprise version. Trust me, Sourcegraph didn't lose any sales to people running OSS Sourcegraph, and anyone who's willing to rip out the licensing system so they can use the enterprise features without paying obviously isn't going to become a paying customer either.


people can do things for more than one reason


What was the breakthrough and what have we learned? I'm interested in reading more about this


Metagenomic sequencing: The field exploded after technologies and techniques were developed for using next-gen sequencing to characterize entire populations/communities of living things, first with 16S rRNA sequences, then with full genomes. The cost to do this has also gone down many, many, many orders of magnitude in the last decade or two (just search "sequencing cost graph" on google).


There have been way too many papers. Here [1] is a link to Google Scholar for 'brain gut', which turns up a zillion results. Basically, your gut biome and brain are heavily linked. As a bonus, here's [2] a search for 'glyphosate gut microbiota.' Advances in biotech/consumables and dramatic unforeseen consequences seem completely inseparable.

[1] - https://scholar.google.com/scholar?hl=en&as_sdt=0%2C5&q=brai...

[2] - https://scholar.google.com/scholar?hl=en&as_sdt=0%2C5&q=glyp...


The most cited study on the brain-gut axis going by Google Scholar results was a review paper published in 2011.

My best hypothesis is that this review paper inspired the community to investigate the weird relationship.

https://www.frontiersin.org/articles/10.3389/fphys.2011.0009...


There have been thousands of breakthroughs in the field. The pace of research and progress is incredible.


No information was exchanged in this conversation


They are not wrong; I can attest that there have been a ton of breakthroughs and new developments. I don't know what else you're looking for here. Are you expecting them to list every single breakthrough for you?


Literally 1 would do :)



Warning: Method is recursive on all execution paths.


Very interesting, thanks!


This post is a bad analysis.

> While some will praise Satya Nadella and hero-worship Sam Altman, breaking OpenAI into two parts will slow down momentum for LLMs and research while handing even more power to the Cloud and Azure in its future

Except that neither Microsoft nor Sam is responsible for breaking OpenAI apart; it was the non-profit board. Now Sam and co will have access to the Microsoft war chest, funding for more compute, chip design, and datasets, so if anything they'll probably move even faster than before.

> Microsoft taking Sam Atman and his followers in, is like shutting down your best investment just for a short-term benefit. These stories don’t usually end well for big corporations.

Again this makes little sense, and it's quite obvious how this makes long-term sense for Microsoft. Also, they did not initiate this move; it came from OAI's board.

> It’s the job of Venture Capitalist to praise Microsoft, Satya Nadella, and Sam Altman to vilify OpenAI’s board in all of this.

No it isn’t

> Microsoft eating OpenAI and poaching their talent, is the worst possible scenario for the startup that was just beginning to get momentum.

No, firing the beloved CEO of the fastest-growing tech startup in a decade and ignoring warnings from 80% of employees that they would quit is the worst scenario for a startup, regardless of what Microsoft does.

This whole post seems like really bad pattern matching by someone who is anti-capitalist and tries to frame every business scenario they see through that lens.


>Except that neither Microsoft nor Sam is responsible for breaking OpenAI apart.

How do you know?

We don't know the reasons for Altman's dismissal in the first place.

Maybe MS was involved and the board acted in panic mode to prevent a hostile takeover.

>ignoring warnings from 80% of employees

The warning came after the firing.


Can you be more specific about what was "solved" in the 90s? The platforms apps run on now, the technologies, and the expected capabilities of those apps have all drastically changed. Not all UI development has changed just for the sake of being shiny.

I see little reason why building a UI today cannot be a superset of what was solved in the 90s, so I'm curious to know what that solved subset looks like to you.


Just start with the fact that all desktop programs had a menu bar where you could find every feature - and at least on Windows - also a shortcut for that feature.

This was broken with the Ribbon and the hamburger menus that every application seems to have switched to for no other reason, it seems, than to copy Chrome.

To be fair, the Ribbon is somewhat usable again, but in the first version I have no idea how people were supposed to find the open and save functions :-)

Other problems:

Tooltips are gone. Yes, I can see they are hard to get right on mobile, but why remove them on desktop at the same time as the help files and the menus?

The result is that even power users like me have to hunt through internet forums to figure out how to use simple features.

Back in the nineties I could also insert a hyphen-if-needed (I have no idea what it is called, but the idea is that in languages like Norwegian and German, where we create new words by smashing other words together, it makes sense to put in invisible hyphens that activate whenever the word processor needs to break the word and disappear when the word is at the start or in the middle of a line and doesn't have to be split).

Back in the nineties I was a kid on a farm. Today I am a 40+ year-old consultant who knows all these things used to be possible, but the old shortcuts are gone, and I cannot even figure out whether these features exist anymore: the menus are gone, tooltips are gone, and what documentation does exist is autotranslated into something so ridiculously bad that I can hardly believe it. (In one recent example I found Microsoft had consistently translated the word for "sharing" (sharing a link) as the word for "stock" (the ones you trade).)


IMO ribbon menus, when implemented correctly, are actually better than a menu bar. They give icons and the benefit of a GUI to old text menus.

Hamburger menus I disagree with but sort of understand the logic of - they're basically like making 'fullscreen mode' the default mode, and then the hamburger menu button just sort of temporarily toggles that off. It makes perfect sense on mobile (I don't think that's what you're talking about though), and on the desktop it can make sense in a web browser when you have, essentially, 3 sets of chrome - you have the desktop window, the browser's chrome, and then the website's chrome all before you get to the website's content.


A related detail: even things like icon design have gone in a strange direction. In the interest of simplicity they've gone from recognizable to rather amorphous blobs. The print button has gone from a clearly recognizable image of a printer, detailed enough that you could probably even guess the model number, to a rounded square with another rounded square sticking out of the top. Many of these newer icons are just too abstract and too similar to one another to be recognizable, IMO, and I think the user experience suffers.


And since saving to a floppy is not a thing anymore...


What’s a printer? Does it make cheap iPads?


You list a bunch of unrelated things that have absolutely nothing to do with the topic: UIs back in the day had less latency by not caring about accessibility, internationalization, etc. - but I'm quite sure they were way worse at properly handling blocking operations: you surely know the Solitaire effect.


At the very least, you knew that when the UI locked up, it was actually doing something. These days, previously-blocking operations may be run async, but the result is that, every now and then, the UI will miss the signal that the async job completed (or failed). A UI that regularly desyncs from reality is a relatively new problem.


> At the very least

I don't care if the food is bad because it is too salty or because it is overcooked; I still won't eat it.


You'll reconsider when your only alternative is food that's cooked perfectly, but also bacteriologically contaminated.


There were all kinds of standards and guidelines that let people recognize what a program was doing and how to operate it. Nowadays, UIs are mostly defective, trying their best to hinder effective usage and hide/strip functionality. There has of course been progress too, but something was lost that makes applications today much less intuitive and harder to use.


I don't understand what specifically is of concern here. Do you have exact examples?

Some platforms publish a list of recommended guidelines which are effectively a standard. For example here's one from Apple about when and how to use charts in an application: https://developer.apple.com/design/human-interface-guideline...


Some of it may be found in older HIG sources sure: https://en.m.wikipedia.org/wiki/Human_interface_guidelines

Also, they were called GUI standards or UI standards at the time.

The modern equivalent, called UX, doesn't reflect the same conglomeration of standards and conventions, though. So I'm not talking about the newer stuff.

I'm no expert on it, and it required specialized expertise. It's been abandoned for mobile interfaces and the modern UX stuff, which often optimizes for design over functionality.

If you've never used old software, it's hard to explain. The old Apple or Microsoft GUI standards would cover the basics, but you'd also need to study the applications themselves and how they presented their GUI.


While I broadly agree with the UX/HIG/design guideline issues that are common in modern software... literally none of them have anything to do with the technicals of how quickly the UI reacts to actions and renders the next state. You can have responsive UI in all of them. And all the modern ones also say "jank is bad, don't do that".


Back in the day™ there were widget libraries as part of the OS, which followed the guidelines set by the OS vendor. This gave a foundation for somewhat similar behavior and standardisation across applications, which is what gives usability.

Nowadays many applications are web apps, built without such frameworks and with less UI research, and even where frameworks are used they are often built with a somewhat mobile-first approach.


One made up example that attempts to embody the complaint:

If I visited a site dedicated to hamburgers today, I would not be surprised if the "Log Out" button was presented as an image of a hot dog. It would be a mystery to me what that hot dog did until I clicked on it.

Compare this to 90's UI, where it would pretty much be unheard of to do something like that. It would have been a joke, or a novelty. These days that sort of ambiguous interface isn't presented as a joke or a novelty - it's the real product.


For example, Apple's human interface guidelines mandated that you have to give the user instant feedback (I think they even talked about how many milliseconds of delay are tolerable). A correctly programmed application in OS 9 would give instant feedback on each menu item selected, button pressed, and so on.

They later gave this up, along with almost everything else in their very reasonable, research-based guidelines, when they switched to OS X in a hurry and multithreaded everything. The early Mail application was a disaster, for example. Generally, people started complaining a lot about the spinning beach ball of death.

In contrast, modern UX guidelines are mostly about web design, how to make web pages look fancy. They also recommend instant feedback, but many libraries and application designs don't really support it.
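
To make that concrete, here is a rough sketch of the pattern those older guidelines pushed for: acknowledge the input immediately, push the blocking work onto a worker thread, and let the UI loop poll for the result so it never freezes. This is my own illustration, not anything from Apple's guidelines; Python/Tkinter is just a convenient stand-in, and the widget names and delays are made up.

    import queue
    import threading
    import time
    import tkinter as tk

    def slow_task(results):
        time.sleep(3)  # stand-in for a blocking operation (network, disk, ...)
        results.put("Done")

    def on_click():
        button.config(state=tk.DISABLED)
        status.config(text="Working...")  # instant feedback, before any work starts
        threading.Thread(target=slow_task, args=(results,), daemon=True).start()
        root.after(100, poll)

    def poll():
        try:
            status.config(text=results.get_nowait())
            button.config(state=tk.NORMAL)
        except queue.Empty:
            root.after(100, poll)  # keep checking without blocking the UI loop

    root = tk.Tk()
    results = queue.Queue()
    button = tk.Button(root, text="Do the slow thing", command=on_click)
    status = tk.Label(root, text="Idle")
    button.pack()
    status.pack()
    root.mainloop()

The point is only that the acknowledgement ("Working...") happens before any slow work begins, which is what the old guidelines meant by instant feedback.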


I test negative on the Celiac blood test (got tested three times), but I am absolutely allergic to gluten. Three years ago I randomly became allergic and was severely ill for months - losing up to 30 lbs in the space of a month - before isolating the problem to gluten. I have also on multiple occasions accidentally eaten gluten without realizing it and immediately felt the symptoms. None of the tests I've taken show gluten as an allergen or problem in my body, but cutting it out removes all my symptoms. Our tests are not nearly good enough, and I suspect many people deal with chronic inflammation and other health issues due to foods they can't eat and don't even realize it.


I'm curious, what effects and allergies did you get?


https://olivergilan.com

Just a personal blog with some essays that I post occasionally to get myself in the habit of writing. My next post will be about living with chronic illness as a SWE and how to get through it


I like how clean and distraction-free the design is.


Companies have been gathering data for over a decade now and storing it in massive data warehouses with often no plan for how to effectively use it. I would not be surprised if turning those warehouses into a source for various LLM applications becomes the dominant use case for the industry very quickly. I expect we'll see some of the big incumbents like Snowflake roll out first-class vectorization of existing warehouses in the coming months.
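
As a toy illustration of what "vectorizing" warehouse data for an LLM application could look like (my sketch, not any particular vendor's feature; it assumes the sentence-transformers library for embeddings and a plain numpy dot product for retrieval, and the rows and query are made up):

    import numpy as np
    from sentence_transformers import SentenceTransformer

    # Pretend these strings were pulled out of a warehouse table.
    rows = [
        "order 1001: returned, reason 'damaged in shipping'",
        "order 1002: delivered, customer left a 5-star review",
        "order 1003: refund issued after support escalation",
    ]

    model = SentenceTransformer("all-MiniLM-L6-v2")
    embeddings = model.encode(rows, normalize_embeddings=True)

    def search(query, k=2):
        # Normalized vectors, so the dot product is cosine similarity.
        q = model.encode([query], normalize_embeddings=True)[0]
        scores = embeddings @ q
        return [rows[i] for i in np.argsort(scores)[::-1][:k]]

    print(search("which orders had problems?"))

A real warehouse integration would presumably push the embedding and indexing down into the warehouse itself, but the retrieval idea is the same.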


Inflation has been remarkably low since 1980 though… and that graph doesn't mention prices. Theoretically, if more is being produced, then you don't need wages to keep up in order to afford the same luxuries. In fact, all the stats I've seen show that we can and do afford a lot more these days.


Meanwhile the headline article is saying the exact opposite.

We're not talking about luxuries, we're talking about essentials. A nice iPhone is useless if you can't afford to feed your family or pay your energy bills - which is quite literally the situation many families in the UK now find themselves in.

