I see that your website has a link to http://stani.sh/pratyeka, which uses the same username as "Pratyeka", who created this 529-byte stub for bitcoin on 8 March 2009:
Bitcoin is an open source peer-to-peer electronic cash system developed by Satoshi Nakamoto that's completely decentralized, with no central server or trusted parties. Users hold the crypto keys to their own money and transact directly with each other, with the help of the network to check for double-spending.
But the stub did not meet Wikipedia's standards, as it lacked references to third-party sources. It was marked for deletion around July 10, 2010 (https://bitcointalk.org/index.php?topic=342.0), officially deleted around July 31, 2010 (https://bitcointalk.org/index.php?topic=652.0), and only restored after the bitcoin community rewrote it fully, in a manner deemed sufficient by the Wikipedia community.
Yes! Creation in the context of Wikipedia clearly means creation. FYI, your research missed the fact that I was part of the argument for undeleting it / saving it from deletion twice, IIRC.
Yes, so what? So were many others. But you don't see everyone who created a Wikipedia article making ego-boosting comments on HN whenever their article is mentioned. Nor do you see everyone involved in nurturing bitcoin commenting on HN whenever bitcoin is mentioned.
Honestly, the reason I did the research was that I was hoping you were one of the cool people I used to chat with in the early days of the bitcoin forums. I was curious to read your early comments and eager to ask you about your experience and how you first came across bitcoin. But all I found by searching for your username on bitcointalk was someone else's post, "Paying homage to Pratyeka of Wikipedia for Bitcoin's inclusion" (https://bitcointalk.org/index.php?topic=99437.0), featuring this bravado quote from your Wikipedia profile's list of "particular contributions that you're proud of":
"I argued bitterly for the restoration of the Bitcoin article from the evil hands of the deletionists, and have since been vindicated in my judgement as Bitcoin's rise to prominence has continued."
Your quote sounds like it is from a Middle Earth fantasy epic, and is just asking to be signed with something like:
"-First Knight Pratyeka 'The Clairvoyant' of Wikipedia, The Creator of the Bitcoin Article, and The Protector of the defenseless Bitcoin community from the Void of Obscurity, The Hero entrusted with the Power of Wiki-Influence who rose during The Challenging Times and singlehandedly defeated the onslaught of The Deletionists."
I'm exaggerating deliberately, to give you an idea of the impression your comment and wiki profile impart on others. I know many of us like to speak like that on the internet in good humor, but after I combined your wiki profile quote with your HN comment, the picture of you in my head was of another internet personality with a little too much ego in need of dampening. I'm imagining that in real life, when you overhear strangers in public talking about bitcoin, you approach them to tell them that you created the bitcoin article.
Your comment feels just like the cliché first-comment phenomenon typical of sites like YouTube. It offers nothing of substance or insight itself. Yes, everyone knows about the butterfly effect and how little things make a difference. But by that logic, every little bitcoin transaction, every little bitcoin bug report, every little bitcoin commit, every time someone explained bitcoin to their friends, everybody who made a bitcoin-for-______ service, and every little anonymous wiki edit deserves some bit of credit for bitcoin's rise. And the blunt reality is that the reason the Wikipedia article was reinstated is that these little interactions gradually increased the market price, causing more news sites to take notice, which meant more third-party sources, precisely what the article needed to survive. That was due to everyone's little contributions. And regarding your initial "creation": had you not made the initial stub, someone else would have made it a couple of weeks later.
The good feeling of advancing human knowledge and understanding by creating and editing Wikipedia articles should be the reward in itself. Posting on internet forums that you were "first" just seems like you're seeking attention or praise. Even worse, as another commenter, kiro, said earlier, your comment sounds like you are taking credit for the whole Wikipedia Bitcoin article. Regardless of the truth that factually and technically you did indeed "create" the initial stub, made a few edits, and argued to keep the article up, a humbler comment that offered insight wouldn't have provoked such a strong negative reaction.
Regardless, I am appreciative of Wikipedians like you for contributing and defending against The Deletionists.
Yes. I stopped contributing to the article because it turned into this sort of thing, and it had achieved my immediate purpose of making it exist and stay existing. I suggest you also learn to manage your time, as your posts, while impressive and reasonable, do not represent a healthy degree of interest to display in a passing piece of minutiae. Sarcasm can be hard to pick up, but at least you got the RPG/fantasy reference. :)
It is not minutiae when someone phrases their posts in a manner that may imply they're taking credit for other people's work.
I didn't mind spending a few minutes, because I respect all the Wikipedians who created and maintained the bitcoin article.
I wanted to help you understand, but now I see it is futile. I would hope that the Wikipedia personality whose name is an abbreviation of the enlightened Pratyekabuddha ("a lone buddha", "a buddha on their own", or "a private buddha" [1]) would understand, but apparently I am talking to the personality "contingencies".
I'm sticking with self-hosted fossil-scm. Very simple, lightweight, written in C, but it has a full wiki, an issue tracker, and code trees.
The one thing self-hosting lacks is the social network. What is really needed for self-hosted sites is a federated network (maybe with something like pump.io and OpenID) to allow coders from other self-hosted or SaaS hosts to seamlessly log in, contribute, and display their profile, and to help mitigate spam.
I think it is fair for GitLab to pitch their service, considering that the Google Code Blogspot post specifically mentioned GitHub 8 times and Bitbucket 3 times, while GitLab offers similar functionality. It would have been fairer for the blog either to not endorse anything specifically or to post a link to https://en.wikipedia.org/wiki/Comparison_of_source_code_soft...
> I think it is fair for GitLab to pitch their service, considering that the Google Code Blogspot post specifically mentioned GitHub 8 times and Bitbucket 3 times, while GitLab offers similar functionality.
The only thing similar between gitlab and github is that they do something with version control. Gitlab is not a place for open source projects.
The project with the most stars on gitlab is gitlab itself, with 221 stars right now. Even the smallest nodejs project achieves that popularity on github.
There are many other hosting services like gitlab; what would make gitlab so different that it would deserve a pitch there?
"There are many other hosting services like gitlab, what would make gitlab different so that it would deserve a pitch there?"
Which is why I said the blog post would have been fairer to just post the Wikipedia comparison link. I never said that GitLab "deserves" a pitch on the blog. I did say it was fair for GitLab to pitch their services, as the CEO has done here, but that is different from "deserving" a pitch.
It is hard to objectively determine which hosting service is better, but people can subjectively decide for themselves based on their individual preferences, which the Wikipedia article helps with by clearly explaining the differences. Everyone who uses Google Code knows about GitHub anyway, but might not be aware that there are other services.
We saw Gitorious, an entirely open-source service, fail due to lack of revenue. And we've seen Google Code, an entirely proprietary solution, fail. Those concerned about the longevity of their hosting setup may want a provider that both has a steady source of income to help guarantee longevity and is based significantly on open-source software (whether on philosophical principle, in that open-source development should use open-source infrastructure, or for pragmatic reasons, such as always being able to fork the hosting service's code). GitLab fills this niche nicely, in that the community edition is fully open-source code, while the enterprise edition (which their commercial service GitLab.com is based on) uses proprietary extensions.
"The projects with the most stars on gitlab s gitlab itself with 221 stars right now. Even the smallest nodejs project achieves that popularity on github."
The lack of stars on GitLab projects may simply be due to the first-mover advantage of GitHub, but it does not necessarily represent any fundamental deficiency in the hosting service.
> It is hard to objectively determine which hosting service is better, but people can subjectively decide for themselves based on their individual preferences, which the Wikipedia article helps with by clearly explaining the differences.
It's quite easy to tell, actually: responsiveness of the UI, feature set, availability, size of the community.
* Gitlab is measurably slower (it takes about 5 seconds to load the commit page of a project, compared to <1 second for github)
* Gitlab lacks many features that github has (many filetypes cannot be previewed, lack of integrations, a generally inferior issue tracker, no search, and much more)
* Availability: github rarely goes down. Right now it tracks at 100% availability over the last month.
* Size of community: there is really no discussion here.
Note: I'm not talking about commercial hosting, but about a place for Open Source projects. There is currently absolutely no objective reason to put a project on gitlab.
Well if we're using "Responsiveness of the UI" as the metric, then I would argue that http://fossil-scm.org/ beats both GitLab and GitHub. Fossil is easily self-hosted (just run one small executable file). And it is really fast, because it is written in C with SQLite and is simple and minimalistic. It is not git-based, but is a simpler DVCS. Dynamically generated pages on my home computer take less than .001 ms to display. It has all the features most small developers need, with a built-in lightweight wiki, issue tracker, and code tree. Your self-hosted website is available even if you don't have an internet connection, or you can use a free hosting service like http://chiselapp.com/. Of course it loses in terms of size of community. But popularity does not determine quality.
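To give an idea of how little self-hosting involves, something like this is all it takes (the repository name and port here are just examples):

    fossil init project.fossil                # create a new repository
    fossil server project.fossil --port 8080  # serve the web UI, wiki, and tickets

The same single binary is the version control tool, the web server, and the wiki/ticket UI.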
You will likely lose arguments on public forums if you make statements like "absolutely no objective reason to ...", because someone just needs one reason to disprove them. Here goes: GitLab has a functioning interface for managing git projects and lets anyone self-host the community edition. Therefore there is an objective reason to put a project on GitLab. QED.
> Well if we're using "Responsiveness of the UI" as the metric, then I would argue that http://fossil-scm.org/ beats both GitLab and GitHub.
Which is why I really don't think (and that was my original point) that a Google announcement of shutting down Google Code should act as some sort of advertisement for $code-hosting-site/project. Github and Bitbucket deserve the mention because those projects are well established.
> "Github and Bitbucket deserve the mention because those projects are well established."
According to the Wikipedia articles, GitHub and Bitbucket were established in 2008, and GitLab in September 2011, making it ~3.5 years old and about half their age. The precise meaning of "well established" is vague, and while you could say that GitHub and Bitbucket are "more established" than GitLab, I would say GitLab is at least "sufficiently established" (i.e., sufficient to host projects with 99%+ uptime). Quick searches reveal that GitHub is known to go down, with a major DDoS in November 2011 and a 2-hour outage in March 2014; Bitbucket was down sometime on 27 April 2014; and GitLab.com went offline for a full 8 hours in July 2014. (Someone who cares further can do a more precise comparison of uptime.) But they were all fixed quickly, and all are still up, functioning, learning, and improving. While your threshold for considering an internet service "well established" may differ from another person's, your threshold is not necessarily more valid and cannot be evaluated without more specific criteria.
If the argument is that "well established" services deserve mention, then services established before GitHub and Bitbucket that are still running reliably should be mentioned as well. But you have specifically said it's OK for GitHub and Bitbucket to be mentioned, but not others.
> "I really don't think (and that was my original point) that a Google announcement of shutting down Google Code should act as some sort of advertisement for $code-hosting-site/project."
It should not. And as I pointed out clearly in my original comment, which I will repeat for emphasis as it has been the core of my whole argument:
"google code blogspot post specifically mentioned both GitHub 8 times and Bitbucket 3 times"
While it was appropriate (and arguably a duty, as benefactor) for Google to post a link to https://code.google.com/p/support-tools/ containing their export tools for GitHub and Bitbucket and the SourceForge import, that reference takes only one sentence and doesn't require the blog post itself to specifically mention any services. Considering that a shutdown announcement is a serious matter, it should be kept brief and limited to information relevant to the shutdown. All those specific references could have been omitted and the announcement would still make sense. By specifically mentioning certain services multiple times, the writer of the blog post opened the door to queries about mentioning alternative services. Had he written in a neutral manner (either by only posting the Wikipedia link or by not mentioning any services), it would have been inappropriate for GitLab to ask to be mentioned.
Again, your mileage may vary (caching), and I do agree that the commits page of GitHub feels faster most of the time.
GitLab doesn't have preview support for as many file types as GitHub, but it has many features GitHub doesn't, such as protected branches and git-annex support (versioning large binaries with git).
We understand if people place open source projects on GitHub; they have way more registered users. But some people choose GitLab, and their numbers are growing.
> GitLab is faster on some pages. The commit page probably is slower, but not by that much:
For the initial HTTP request, maybe, and only if you're in the US. Gitlab is painfully slow when loaded from a European network connection once you factor in the time it takes to fetch all resources. Cached or uncached.
> GitLab doesn't have preview support for as many file types as GitHub, but it has many features GitHub doesn't, such as protected branches and git-annex support (versioning large binaries with git).
Neither of which is important for Open Source projects.
> We understand if people place open source projects on GitHub; they have way more registered users. But some people choose GitLab, and their numbers are growing.
I think Gitlab is a reasonable website to use for commercial hosting; I just don't see it for Open Source software.
Right now GitLab is hosted in Germany (AWS Frankfurt) and I was testing from the US (Mountain View). But I sometimes see the same delay you mention. We'll move it to the US east coast over the next couple of months.
I think protected branches are really nice for open source projects too, although I agree that most contributions will come from forks. Right now no open source projects use git-annex, but that might change now that it is becoming easier to use; it is probably really nice if you have an open source game with huge digital assets.
On GitLab I could not search for top repositories by programming language. For a beginner like me, that is very important. I can go to GitHub and search for top repositories in the language I am learning. I usually find top projects, try to figure out how they work and how an experienced dev does specific things (compared to a noob like me), and I ask myself how I can "copy" those traits and improve myself.
On GitLab, if I stumble upon any repo I can't figure out which language it uses. On GitHub it is very clear. It even shows the percentage of each programming language used.
Open source helps so much with learning programming for a noob like me, and GitLab in no way does that like GitHub does. That's the one reason I don't use GitLab much, though I have an account. And it's one more reason why GitHub shines over GitLab for open source projects.
Thanks for these valid concerns. We would certainly like to improve the discovery functionality in GitLab. Right now https://gitlab.com/explore is pretty limited, but the community is actively working on this. This month we introduced a commit calendar, and I'm sure it will improve further over the coming months.
What reasons do you have to believe this statement? I'm curious.
> The project with the most stars on gitlab is gitlab itself, with 221 stars right now. Even the smallest nodejs project achieves that popularity on github.
Really? I'm surprised. I started coding a forum platform and I only amassed 3 stars. On my non-furry personas, I've capped out at about 13.
I don't think popularity of github projects is distributed fairly enough to use it as a refutation of the quality GitLab has to offer. It's a bigger bandwagon, sure, but that's all.
While a big bandwagon can be attractive (it is for me), I don't think it's fair to disqualify a competitor based on this metric alone.
> What reasons do you have to believe this statement? I'm curious.
Did you use it? Gitlab has barely any open source projects, let alone developers, on it. Say as much as you want, but for many open source projects, community is a big deal.
Currently there is absolutely no reason to use Gitlab. Neither does it do things better than Github or Bitbucket in any way, nor does it have a larger community. It's "just" another github clone.
> Really? I'm surprised. I started coding a forum platform and I only amassed 3 stars. On my non-furry personas, I've capped out at about 13.
> Since you used the words "absolutely no reason", here's a counterpoint: as part of an anti-censorship hydra effort.
I don't see how this is relevant at all. First of all, github did not remove that content; it IP-blocked it. Secondly, what do you think gitlab would do if they were hit by this?
Do you think it's better for github to go down for Russia entirely? Imagine that were your proposal for what gitlab should do. Then it would reinforce even more strongly that gitlab is not a place to go for an Open Source project.
Why should your project suffer and become unavailable because some other project violated Russian law?
Bitbucket is also mentioned even though they don't have a lot of public projects.
GitLab is open source, has a great interface and many features (protected branches, git-annex support), and we offer unlimited (private) repositories on GitLab.com.
We did not move anybody; people imported their repos themselves. Another project that moved is F-Droid (https://gitlab.com/u/fdroid), and that was before the acquisition.
Yes, we certainly strongly suggest it to people. I just wanted to make sure everyone understands that the move is initiated by people themselves; we didn't move anyone.
We use GitLab Mirrors[1] to keep running syncs of our public repos on GitHub mirrored into our self-hosted GitLab server. Works really well and I highly recommend it.
Interestingly, I wasn't able to run vi but could run vim. I reported it to the github issue tracker.
This seems very neat. Definitely much less overhead than VirtualBox. Plus, it can write to the same filesystem as Windows without Windows admin privileges (which can be done with Cygwin, but not with VirtualBox). Once it supports x86-64 binaries, threading, and users, this could be very useful.
I was wondering if maybe they preconverted the videos into a sequence of tilemaps and a set of approximate tiles. I notice they are only using the same 5 colors, which under-utilizes the capabilities of the NES, which should be able to display 13 colors at the same time. (Each individual tile uses one palette of 4 colors, there can be a max of 4 palettes in use at a time, and one background color is shared by all palettes.) To more fully utilize the NES they could also use a different set of 4 palettes for every frame, which would allow a much broader range of colors.
Hypothetically, the NES could play streaming video. To work around memory limitations, a custom ROM cartridge could be used that grabs a new set of video frames (consisting of tilemaps + tile patterns) via a network controller. The 6502 core would not need to decode video; it could simply swap in chunks of these tilemaps + tile patterns, which the picture processing unit displays.
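For concreteness, here is roughly how I imagine the preconversion step, as a minimal Python sketch (the function name and the assumption that frames arrive already quantized to 2-bit palette indices are mine, purely illustrative):

    # Sketch: split a palette-quantized frame into 8x8 tiles,
    # deduplicate the tiles, and emit a tilemap of pattern indices.
    # Assumes frame dimensions are multiples of 8.
    TILE = 8  # NES background tiles are 8x8 pixels

    def frame_to_tiles(frame):
        """frame: list of rows of 2-bit color indices (0-3).
        Returns (patterns, tilemap)."""
        height, width = len(frame), len(frame[0])
        patterns, seen, tilemap = [], {}, []
        for ty in range(0, height, TILE):
            row = []
            for tx in range(0, width, TILE):
                tile = tuple(tuple(frame[ty + y][tx + x] for x in range(TILE))
                             for y in range(TILE))
                if tile not in seen:  # new pattern: add it to the pattern table
                    seen[tile] = len(patterns)
                    patterns.append(tile)
                row.append(seen[tile])  # tilemap entry points at its pattern
            tilemap.append(row)
        return patterns, tilemap

Each frame would then ship as a (hopefully small) set of new patterns plus a tilemap of indices, which is the kind of data the PPU consumes natively.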
This is exactly what we did. The video frames were converted to tilesets and stored in the ROM image. For playback, the memory mapper (MMC3) is used to swap between the frames without relying too much on the CPU. Luckily Guy figured out the MMC3; otherwise we'd have had only enough space for a few frames.
Correct about the colors as well. It would have been nice to utilize them better, but we didn't have enough time to do that properly.
Thanks! Alex had been studying how to get better compression by repacking the tiles based on a scheme I still haven't quite figured out, but we were becoming CPU-to-PPU transfer bound in what we could do during vblank. The MMC3 solution let us basically DMA our way around the problems and push a much larger screen update far more often. The original version ripped through playback so fast that we had to slow it down to get a decent number of frames into the available 32 CHR ROM banks (it's a mapper 4 NES ROM).
The homebrew community was a fantastic resource! Shoutouts to nesdev, Shiru, and the countless articles we googled!
You can get even more than 13 simultaneous colours by writing to the palette mid-scanline. With this method it is possible to display all 410 available colours in one frame.
Neat. I was also wondering if it might be possible to switch between two colors at a certain ratio of frames, faster than the CRT phosphors can fully switch colors (or at least faster than the eye can detect the swapping), so that it looks like a fractional blend of the two colors based on that ratio. I remember doing this trick on the TI-85 calculator to get any degree of greyscale even though the LCD was only black and white.
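Something like this toy Python model of the idea (the 3:1 example and the linear averaging are my own simplifications; a real CRT adds gamma and phosphor decay):

    # Toy model of temporal dithering: alternate two colors across
    # frames so the eye averages them into an in-between shade.
    def perceived(color_a, color_b, a_frames, b_frames):
        """Linear average of two RGB colors over one a:b frame cycle."""
        total = a_frames + b_frames
        return tuple((a * a_frames + b * b_frames) / total
                     for a, b in zip(color_a, color_b))

    # Show black for 3 frames to every 1 frame of white -> roughly 25% grey.
    print(perceived((0, 0, 0), (255, 255, 255), 3, 1))  # (63.75, 63.75, 63.75)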
You absolutely have to split Jade documents and use includes/blocks though; any decent-sized page, even a static one, becomes unmanageable very quickly.
Certain elements are, or can be made to be, "inline", which gets them treated the same as characters on a line. Thus those elements, such as images, will have white space between them just as words in a paragraph do. If you butt those elements' tags right up against each other, with no white space between them, then there will be no white space on the displayed page.
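For example (illustrative markup):

    <!-- the whitespace between the tags renders as a visible gap -->
    <img src="a.png"> <img src="b.png">

    <!-- tags butted together: no gap between the images -->
    <img src="a.png"><img src="b.png">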
As mentioned earlier, FPGAs do not fit this 3-part distinction.
In addition, CPU microcode doesn't fit this distinction, as the microcode is software (not hardware) that is hidden behind the ISA's interface.
Also, I can think of computers that have only hardware and data input (without any programs or instruction set): basically number crunchers that run a fixed program (e.g., DSPs or non-general GPUs that perform a specific function), where only the data input can change.
You bring up a good point by acknowledging "The Interface". But I would reclassify into a different set of parts, something along the lines of "Interface" and "Implementation". I/O can be programs and/or data that interface with the implementation, which can be hardware and/or software, or another whole computer. (This is a potentially recursive classification, so it can include virtual machines.)
Clever: they insert an additional chip between the CPU and the WiFi module whose sole purpose is to load the WiFi firmware, thus making it non-updatable, "effectively circuitry", to bypass the silly FSF rules.
Using the same logic, what if someone ran Netflix's proprietary DRM decryption code on a separate co-processor that interfaced with the main CPU only via the W3C's Encrypted Media Extensions? Assuming this code is never updated and runs isolated from the CPU, then according to the FSF's silly rules about updatability, they would be OK with this "effective circuitry" (ignoring, of course, their concerns about DRM itself). But if that same code were run inside Firefox's sandbox for EME plugins, it would be considered dangerous non-free software.
And for the entire history, you are only responsible for 0.19% of edits to that page, according to https://tools.wmflabs.org/usersearch/usersearch.py?name=Prat...
Every little Wikipedia stub and edit is important, but sorry, I was put off by your entrance claiming the mantle of "Article creator here".