Or just set your browser's JavaScript to a temporary whitelist only, and rarely enable it, while using HTTP/1.1. 99.99% of surveillance is mediated via JavaScript execution, if only because it's the easiest way. Not a lot of people are using actual logs anymore, since logs don't allow one to be really invasive.
The trust-on-first-use, self-signed TLS culture of Gemini is great though, and should be applied more widely by individuals in their behavior on the normal web.
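In code, that TOFU discipline is only a few lines. A rough sketch in Python, assuming an in-memory known_hosts dict as a stand-in for the persistent fingerprint store a real client would keep on disk:

    import hashlib, socket, ssl

    known_hosts: dict[str, str] = {}  # host -> pinned cert fingerprint

    def tofu_connect(host: str, port: int = 1965) -> ssl.SSLSocket:
        ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE  # no CA check; we pin instead
        tls = ctx.wrap_socket(socket.create_connection((host, port)),
                              server_hostname=host)
        fp = hashlib.sha256(tls.getpeercert(binary_form=True)).hexdigest()
        pinned = known_hosts.get(host)
        if pinned is None:
            known_hosts[host] = fp  # first use: trust and remember
        elif pinned != fp:
            tls.close()
            raise ssl.SSLError(f"cert for {host} changed; possible MITM")
        return tls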
Before HTTP became popular, there was another protocol for accessing remote documents on the internet, called Gopher. Gemini is a modern protocol in the spirit of Gopher that adds things like mandatory TLS.
Gemini exists separately from HTTP and thus "the world-wide web".
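For reference, the entire Gemini exchange is one URL plus CRLF sent over TLS, answered by one status line and a body. A minimal sketch in Python (the URL is just a well-known Gemini host, and certificate checks are skipped purely to keep the example short; a real client would pin certificates TOFU-style as mentioned elsewhere in the thread):

    import socket, ssl

    def gemini_fetch(url: str) -> str:
        host = url.split("/")[2]
        ctx = ssl.create_default_context()
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE  # demo only; pin in real use
        with socket.create_connection((host, 1965)) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                tls.sendall((url + "\r\n").encode())  # the whole request
                data = b""
                while chunk := tls.recv(4096):
                    data += chunk
        header, _, body = data.partition(b"\r\n")  # e.g. "20 text/gemini"
        return body.decode("utf-8", "replace")

    print(gemini_fetch("gemini://geminiprotocol.net/"))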
Says the article whose splash image is AI-generated slop, which reads like an intrusive advertisement, and whose source code contains links to scripts from www.googletagmanager.com.
I've read a lot about Gemini and nothing really excites me the same way the web excited me in the late 90s.
To me it mostly seems different for the sake of being different, not better in any way (except the claim that it undoes the badness of the "modern web", whatever that really entails).
I still write my websites like I did in the 90s, just with HTML 5 and CSS, and JS where needed.
Maybe the better path would have been to base something on HTTP, with a defined subdomain (remember when we saw www. at the start of 90% of URLs?) and some sort of trust model.
I just finished moving my website to Gemini, and I find it far easier to maintain. There is a lot of content out on Gemini without the annoyances of the modern web.
No it's not. That is entirely a choice. My website is just HTML files in directories alongside other files. The HTML is handwritten and just <p> and <ul> and the like. I use a normal text editor.
Gemini is not simpler than HTTP/1.1 and writing HTML. Gemini is like forcing yourself to write haiku because the form constrains you, rather than simply choosing to limit yourself.
In what way is formatting more complex with HTTP? Assuming the MIME mappings are set up correctly, the HTTP server sends out the files with the correct content type, and with that you can edit your text files with ed for all I care and it will work. The fact that you can do more complex things with HTTP does not mean you should. The aforementioned directory full of files is just as valid as it was when Tim Berners-Lee fired up the first site on his NeXT cube.
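To make that concrete, here is roughly the entire "deployment stack" such a site needs, sketched with nothing but the Python standard library. SimpleHTTPRequestHandler looks up Content-Type from the file extension via the mimetypes module, which is exactly the MIME mapping being described:

    # Serve the current directory of handwritten files over HTTP/1.1.
    from http.server import HTTPServer, SimpleHTTPRequestHandler

    class Handler(SimpleHTTPRequestHandler):
        protocol_version = "HTTP/1.1"  # enables keep-alive

    HTTPServer(("", 8000), Handler).serve_forever()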
What special tools? Since when is vim a special tool? What special file format? Special file formats exist but none are required. I don't see an answer to the question here.
The deployment procedure is too complex. Have we learned nothing in the past 30 years?
One of the major shortcomings of the WWW is that publishing content requires technical skills. If publishing were as simple as consuming, and we had the equivalent of the web browser's UX for it, the centralization we got and all its associated problems would've been minimized, if not avoided altogether.
Maintaining the P2P distributed nature of the internet is crucial in protocols of this type. Otherwise once (if) it ever gains traction, the alternative will have the same problems. Gemini doesn't address these issues directly, other than by being very niche and deliberately brutalist. This means it was never designed to be popular, so this article will not appeal to anyone who doesn't already have an interest in it. Let alone entice anyone to escape the current web.
Yes, the current web is deeply broken beyond repair. Using it is a hostile and frustrating experience. I'm not sure what the solution to these problems is, but it's certainly not Gemini.
Back in the mid-90s, most ISPs gave users space to host a website. There were products like Dreamweaver and (although I hate mentioning it) Microsoft FrontPage that provided WYSIWYG editing of HTML and used FTP to upload the pages to the server (although using FTP was usually an implementation detail; I think FrontPage had its own way of uploading). Easy.
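The upload step those tools automated was trivially scriptable, too. A sketch of roughly what they did under the hood (the host, login, and paths are placeholders for whatever your ISP handed you):

    from ftplib import FTP

    with FTP("ftp.example-isp.net") as ftp:
        ftp.login("username", "password")
        ftp.cwd("public_html")              # the ISP-provided web space
        with open("index.html", "rb") as f:
            ftp.storbinary("STOR index.html", f)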
> Yes, the current web is deeply broken beyond repair.
Which web are you referring to? The document web from the 90s to early 2000s, or the app-centric web we have now? If TBL's vision of the web had emerged, you wouldn't need a separate tool to author web pages; the browser could do it, and use PUT to store new pages on a web site. But alas, we don't have browsers capable of editing (but we can always download a gig of JavaScript to do it! Yeah! That'll work!)
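For what it's worth, the plumbing for that vision has been in HTTP all along; PUT is enough to save a page back to the server. A sketch, assuming a (sadly hypothetical) server configured to accept authenticated PUTs, with example.com and the path as placeholders:

    import http.client

    page = b"<h1>My page</h1><p>Edited right in the client.</p>"
    conn = http.client.HTTPConnection("example.com")
    conn.request("PUT", "/pages/hello.html", body=page,
                 headers={"Content-Type": "text/html"})
    print(conn.getresponse().status)  # expect 201 Created or 204 No Content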
> Back in the mid-90s, most ISPs gave users space to host a website. There were products like Dreamweaver [...]
Those tools and services were still too limiting and complicated. Why would I need to use a separate protocol to upload my content to someone else's machine which will then serve it for me? My domain name and server configuration would still be controlled by someone else. These solutions were made in response to something the web was missing, but they were far from ideal, and would later become the cause of the major problems we deal with today.
The limited upload bandwidth and network infrastructure are often brought up as reasons in this discussion, but ISPs only offered what users demanded, which in turn was driven by what was technically possible. Had there been demand for symmetric connections and home servers, we would've standardized on that much earlier.
> Which web are you referring to? The document web from the 90s to early 2000s, or the now app-centric web we have now?
The current web TFA criticizes. But the design of the early document web is what allowed it to get to this point.
> If TBL's view of the web emerged, you wouldn't need a separate tool to author web pages, the browser could do it, and use PUT to store new pages on a web site.
I don't think that vision was ever developed in detail. The original proposal[1] mentions a phase 2 of the project where "authorship becomes universal", but that was never implemented AFAIA. The entire concept is very vaguely described.
I imagine that the complexity of that phase far exceeded phase 1, which pushed it back, and eventually sites like GeoCities and Tripod appeared to fill the void. By that time the web had gained momentum, and any web-native solution would've been difficult to launch.
I think this was a major misstep which led to hyper centralization, provided a huge opportunity for advertisers, and the rest is history.
TBL is only recently trying to fix this with his Solid project, but it's too little, too late. Over the years we've also had several attempts at something similar, like Opera's Unite, but none of them gained traction. And now we have the fediverse, and all kinds of decentralized protocols, but they're all focused on the technology rather than the user experience, so I'm not hopeful that they will succeed.
I don't think Dreamweaver was "too complicated" [1] and I recall seeing tons of sites in the 90s, some managed by hand, some by program (whether custom-written or commercial). Yes, those might have been early adopters of technology, but I recall seeing web sites from as many non-technical people as technical (and I'm not talking about GeoCities either).
Hindsight is always 20/20. I don't think centralization via GeoCities and Tripod was the major misstep. Yes, possibly a misstep, but there were several along the way (JavaScript and ActiveX are two that come to mind).
[1] Just how stupid do we need to assume a user is? A person who has never seen a phone would still need training on how to use it, and that's not a hard thing to use at all.
The WYSIWYG tools were part of the problem. The entire process of publishing was too complicated.
Think of it when compared to a web browser for consuming content. The user doesn't need to know anything about DNS, HTTP, HTML, or any of the underlying technologies. They type an address (which is already a hurdle for many people, and one we mostly managed to collectively overcome) or click on a link, and content appears on their screen. It all just happens behind the scenes.
In contrast, even with the web builders of the 90s, they needed to know how to set up an FTP client, what "uploading" was, how all this maps to web "pages" and URLs, etc. If they wanted to make their websites interactive, they needed to at least be familiar with this newfangled JavaScript, CGI, ActiveX, or whatever the tech du jour was. Eventually they needed to be familiar with databases and storage. Today this is a complicated mess reserved for web developers. Modern website builders do a drastically better job at simplifying all this, but they're third parties you give full control over your content and data. If you want to avoid this and self-publish, the barrier to entry is significantly higher.
The core of the problem is that the publishing equivalent of the web browser simply doesn't exist. The web would've never taken off if the web browser was this complicated. Some early adopters managed to make publishing work, but most of them were tech geeks who later became web developers. This process was never approachable to the tech illiterate.
The reason I think this was the primary misstep that got us to where we are today is the companies that stepped in to fill the publishing void. At first they were early "giants" like GeoCities and Lycos, but later they became the MySpaces, Facebooks and Twitters. The entire industry of social media simply wouldn't exist if people had been enabled to self-publish their content instead of entrusting it to a third party. It's possible that web search and discovery would've been a built-in feature of this distributed web, so giants like AltaVista, Yahoo and Google would've been less powerful or unnecessary. Without this industry forming, advertisers would have had a smaller entry point, and we might have been spared the early annoyance of banner ads and popups, and its current evolution into user tracking, targeting, data brokering, privacy violations, propaganda machinery, and all the related evils of adtech.
The WWW would've been a wildly different experience, and possibly closer to what TBL originally envisioned. It's a damn shame we never got that. Now it's too late not just for the web, but for any related technology to achieve similar adoption. This is why the solution needs to place the UX first, and be vastly easier to use and more powerful than the current web, if it ever wants to attract users beyond just tech geeks. I haven't seen a solution that does this yet. Opera's Unite was probably the closest we ever got.