Hacker News | devsquid's comments

It's cool they're using DP for some analytics, but it's not quite the holy grail Apple and its fans have been selling it as: any analytics campaign using DP will eventually either average out to pure noise or end up being non-anonymous.

Here's a great interview with the MS researcher who invented the technique: http://www.sciencefriday.com/segments/crowdsourcing-data-whi...

One of the quotes I always liked from it is "any overly accurate estimates of too many statistics is blatantly non-private"
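To illustrate that quote with a toy sketch (hypothetical Python, not Apple's actual mechanism, and all the numbers here are made up): each noisy release of a statistic is private on its own, but let the same statistic be queried enough times and averaging recovers the true value almost exactly.

```python
import math
import random

def laplace_noise(scale):
    """Sample from Laplace(0, scale) via the inverse CDF."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

random.seed(42)

true_count = 1000      # the sensitive statistic being protected
epsilon = 0.1          # per-release privacy budget
scale = 1.0 / epsilon  # Laplace scale for a count query (sensitivity 1)

# One release is noisy enough to hide any individual's contribution...
one_release = true_count + laplace_noise(scale)

# ...but release the "same" statistic 10,000 times and the noise averages out.
releases = [true_count + laplace_noise(scale) for _ in range(10_000)]
estimate = sum(releases) / len(releases)

print(round(estimate))  # lands very close to 1000: the privacy has evaporated
```

The only way out is a finite privacy budget: stop answering once too many overly accurate statistics have been released, which is exactly what the quote is getting at.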


Oh please, you're going to tell me the sexual abuse employees faced was "from entrenched interests"? Get real...


There are lots of reasons to use Go. Mine is that the language is stupid simple.

There's nothing fancy in the syntax; you can pick up a project you put down months ago and easily start working on it again.


Yeah, but the Swift compiler is extremely buggy. Plus it's gotten buggier with each release.


That sounds like you are making engineering decisions by superstition. Unless you have some source of data to back it up, please don't spread your subjective misinformation.


I'm from the Bay Area. We created this problem. Each of the two bubbles I have lived through did this.


The fact that my nation is as wealthy as it is but does not want to invest in the health of its people is insanity to me. Health insurance is essentially a bet that someone's health costs will be less than their monthly premium. Why would my country not take that bet? It seems like something that would be way more effective if it were orchestrated at a national level. (ノಥ,_」ಥ)ノ彡┻━┻


Considering every single TV spot is selling us things that make us fat or die sooner, I would not want to take the other side of the bet against people getting sick.

Americans, we are all about freedom. Freedom to kill ourselves and eat twinkies. But we expect help when we are sick.

One side or the other has to budge a little. Do we give up being able to get the same insurance despite unhealthy habits? Do we give up helping the poor and the sick?


This is why we need fact-based research to understand these costs, and a direct tax on those items to recoup the costs from those who engage in those risks.

Go skiing? Well, there's a higher tax on the ticket because you have an X% higher risk of breaking a leg. Twinkies right now count as grocery food, so they're actually taxed less, when they should probably be taxed like booze or cigarettes (in effect a sin tax). Of course, sin taxes get abused, with the money going to the general fund rather than toward mitigating the undesirable behavior.


Then what happens to diet soda? Likely not good for you but no calories. Tax free or same tax as normal cola?

I think for most things we are too stupid to really calculate the externalities of products.


I feel your sentiment. But I always thought getting browsers to adopt Dart was a crazy and impossible idea. I was actually glad when I heard they had abandoned that plan.

Flutter confuses me. It's annoying that you have to use their DSL for layouts rather than HTML, and what about native widgets like WebViews or VideoPlayers?

With them drastically speeding up the Dart2JS compiler and implementing a strongly typed mode, I am quite pleased with Dart's direction.

I do still fear for its future. It would be really sad to see Dart die. The last web app I made, I was able to make the entire thing in under 90 KB, including HTML, CSS, SVGs, and JS. Dart allows me to write desktop-like applications for the web without any extra overhead.


> I feel your sentiment. But I always thought getting browsers to adopt Dart was a crazy and impossible idea. I was actually glad when I heard they had abandoned that plan.

I agree; a saner approach than adding one particular language at a time to browsers is to give browsers support/hooks which arbitrary language implementors can target, without having to perform gymnastics like compiling to JS.

It looks like WebAssembly is at least a first step in that direction, so Dart and others can use that as and when it matures, without creating a legacy of language-specific issues for future generations of Web implementors and archivists to deal with.


> But I always thought getting browsers to adopt Dart was a crazy and impossible idea.

I don't know — browsers are already planning to adopt a non-JavaScript language (WebAssembly); it's entirely possible that maybe, just maybe they might start to support more than just the late-90s mistake which is JavaScript.


As someone who enjoys developing native apps more than web ones, I imagine WebAssembly will eventually be the revenge of plugins.

How long will it take for Canvas/WebGL + WebAssembly frameworks to appear?


Eventually someone's going to create an XML-based layout language for constructing views in the Canvas; we could even call it HTML ;)

I'm not a web developer either and I look forward to WASM and its potential. But I really like HTML, I've used lots of layout languages before and I think HTML is the best I've used. The only thing I find HTML to be obtuse about is complex animations.


If I remember right, it's just the Calculator app. Lol...


The dock and launchd were rewritten in Swift for Sierra.

There is a WWDC session talking about it.


Well, that doesn't bode well ;) Sierra's dock is sooo buggy. I have to restart my machine numerous times a day because the dock stops working, literally doesn't work lol. I'm sure there's a way to restart just the dock app, but eh, I've been quite busy. Sierra has been one of the buggiest versions of macOS I can remember using.

I would be surprised if they actually did, though. I know someone did some static analysis of the apps for Mac and iOS and found Swift was barely used at all.


The dock is a separate process, as is Finder; you can just kill them.

I have to stomp on coreaudiod a few times a month because my USB DAC stops responding.


The first thing I thought was: does this mean Tesla's going to start writing its software in Swift? If so they're fucked :P (BTW I write non-critical consumer software in Swift as my day job.)


Swift has its problems but keeps getting better. I only get to use it for my at-home projects, since at work the portion of the codebase that is for iOS is Objective-C with no plans to switch any time soon. Old Obj-C too, and started by people used to programming in VB on MS platforms. So count your blessings.


I find Swift to be getting worse each year. Sure, I'm blessed that I don't have to write applications in assembly, but I have grown jaded towards Swift. If I weren't an iOS developer, I would happily not use the language. I have been slowly positioning myself away from doing iOS development. I feel like a massive corporation like Apple could provide better tools for writing apps for their walled garden than what they are providing me.


What would you recommend instead?

C#? C++?

C++ has become incredibly great recently after languishing in the C++2003 period for too long.


SPARK ADA of course.


ARC is a whole debate, but the one thing it is not is simple, and I would argue it's more error prone than a traditional GC. I've used it for most of my career and I have seen what sloppy/unaware coding can do to it.

Lattner is a well-known expert on compilers. Having used Swift since its inception, I would call into question the reliability of the Swift LLVM compiler. In its current state (3.0.2) it's absolutely terrible and does not back up the sentiment: "But fast-moving Silicon Valley needs a fundamental shift in quality standards when it comes to safety-critical software, and if you look closely, Lattner has been leading this charge at the language and compiler level".


I'm not claiming Swift 3.0.2 is the most reliable language ever. All systems require time to mature and it is very young historically speaking. I am saying Lattner's design principles should realize better results for safety-critical software in the long run. Compiler bugs can be fixed; programmer error is tied to design. It might be Swift, it might be something more pared down, but he's a good guy to have in charge of an autonomous driving software platform.


Well I'm sure he's a very experienced manager for technical projects and also brings a large amount of experience with compilers to the table.

His online presence always has that Apple Arrogance™ to it. That's coming from someone who was born and raised an Apple fan.


I'm not sure I would say Swift is the kind of robust that is needed for safety critical software, but it is a nice step forward for application code. It mostly forces you to deal with things safely while still allowing Objective-C dynamic behaviors when you need to get around the restrictions (often to interact with other Obj-C code).

So, yes I can see why one would call Lattner at the forefront of making more reliable compilers and pushing shifts in quality standards in fast-moving Silicon Valley. It is an awesome achievement to create a new language that improves both readability and safety and even more awesome to get it mainstreamed so quickly.

There are a few people I would like to trade places with. Lattner is one of them; Musk is another. They both fulfill different parts of my long-held dreams, so I consider them both to be quite awesome. It's cool that they'll be working together too, I guess.


Sure Lattner and Musk are interesting people, but I find the level of hero worship in the tech industry to be sickening.

I have used compilers for a few new languages (Rust, Go, Dart, Kotlin, Swift), and Swift is the only one I've had any issues with; it also seems to be the only one to have adopted the "move fast and break things" philosophy of Silicon Valley. I dunno, I just don't see the argument.


Lattner is best known for starting the whole LLVM project. Swift is just a small side project in comparison, it just got adopted by Apple for some reason.

LLVM is one of the most influential pieces of software of the past decade. Hero worship isn't good but credit where credit is due.


Are you sure no one else was involved in starting LLVM? PhD supervisor, fellow students, etc. Just saying...


Oh yea LLVM and Clang are incredible pieces of software! He is truly a master.


Clang was mostly Steve Naroff.


You're confusing language development with autonomous vehicle development. Think of the long term goal. It's desirable to move fast and fix things with language development in the near term, to achieve a more perfect design and accelerate its maturation at the temporary cost of more volatility. After this process achieves a high level of maturity, said design principles may offer a safer, more reliable programming system that would be better suited to safety-critical applications.

Additionally I'm sure we can all agree there is no substitute for maturation through time and usage in the field. Which frankly is an argument for more popular languages over obscure ones. None of the ones you mentioned are ready for safety-critical system development (including Swift 3), but which one is most likely to achieve widespread adoption and field testing in the long run?


No, I'm not confusing them. I'm responding directly to the comment that Chris Lattner represents a more measured approach to software development than is traditional in the tech industry.

I don't think Swift stands to gain widespread traction outside of Apple-oriented app development. Aside from a lack of stability, Apple is too well known for boxing its competitors out. I've used and loved their products my entire life and I know how annoying it is to go against Apple's grain.


>I don't think Swift stands to gain widespread traction outside of Apple-oriented app development.

It already is though, there are several Linux web frameworks etc. It's open source and community run so I'm not sure how they're planning to box out competitors from it.


There are some web frameworks that are in development. That does not mean Swift has gained any traction. Also, having toyed around with one, the experience was not great.

When writing a server, I would take Go over Swift any day. It outperforms Swift, uses less memory, it's simpler, oh, and it uses a "traditional" GC.


>uses less memory

That is very much _not_ the case according to the testing I have done recently.

Swift uses a lot less memory than Go unless the program uses only trivial amounts of memory in the first place. Using interfaces in Go data structures makes the difference even more pronounced.

On top of that, all runtimes that use a tracing GC require tons of spare memory at all times unless programs are very carefully written to avoid memory allocation.

That said, Swift has a few very weak spots when it comes to memory. Most notably the String type, which is terrible on all counts, but that is a whole different story.


> On top of that, all runtimes that use a tracing GC require tons of spare memory at all times unless programs are very carefully written to avoid memory allocation.

Only if said language doesn't allow for stack or static global allocation.

Quite a few languages do allow it.


Doesn't that effectively amount to manual memory management? What particular languages are you referring to?


> Doesn't that effectively amount to manual memory management?

Not really, example in Active Oberon:

    TYPE
      point = RECORD x, y : INTEGER; END;

    VAR
      staticPoint : point; (* On the stack or global *)
      gcPoint     : POINTER TO point; (* GC pointer *)
      noGCPoint   : POINTER(UNTRACED) TO point; (* pointer not traced by the GC *)

> What particular languages are you referring to?

Mesa/Cedar, Oberon, Oberon-2, Active Oberon, Component Pascal, Modula-2+, Modula-3, D, Oberon-07, Eiffel, BETA.

There are probably a few other ones.


>noGCPoint : POINTER(UNTRACED) TO point;

That's fine for one point. How about N points where N varies at runtime?

If I allocate memory dynamically outside the GC's remit, I'm going to have to release that memory somehow.


Depends on the specifics of the language.

In Active Oberon's case, those pointers are still safe. They can only point to valid memory regions; think of them as weak pointers that can also point to data on the stack or in global memory.

This is in safe code.

If the package imports SYSTEM, it becomes an unsafe package, and then, just like e.g. Rust's unsafe, the rules are bent a bit and usage of SYSTEM.NEW()/SYSTEM.DISPOSE() is allowed.

Just like any safe systems programming language, it is up to the programmer to ensure this pointer doesn't escape the unsafe package.


I still don't get how you can say that this memory is not under the GC's control but it's "not really" manual memory management either. Is it reference counting then? How does that memory get released?


It doesn't get released, unless you are doing manual memory management inside an unsafe package.

In a safe package it can only point to existing data, there isn't anything to release.

If the pointee is something that lives on the heap, it is similar to weak references. Points to GC data, but doesn't count as yet another GC root.

If the pointee is on the stack or in global memory (the data segment in C), then there is also nothing to release. Global memory only goes away when the program dies; the stack gets released on return. Memory allocated by the compiler for VAR declarations is static.

Usually the idea is that you use untraced pointers to navigate statically allocated data structures; they are not to be exposed across modules.


It would be great if you have any benchmarks to share. From what's available on the internet, Swift does seem to use more memory than Go.


What sources have you found on the internet?

My own code is unfortunately a bit messy and entangled with unrelated stuff. If I find the time I'm going to clean it up.


ARC is a tradeoff between manual and automatic memory management. Requiring a little more care from the programmer is intentional, not a disadvantage as you picture it; it is the price for not having, you know, a GC. A GC is less error prone not for free but at the price of eating CPU and memory, which in the world of mobile devices equals less battery life, so it is quite desirable for iPhone and MacBook software not to have one.


ARC is automatic memory management.

"The Garbage Collection Handbook", chapter 5

http://gchandbook.org/


Yup but people love to argue over the small shit. Chris Lattner even refers to ARC as a form of GC.


Different approaches to memory management differ in extent of how much of programmer's job they automate.

Garbage collectors are fully automatic and rarely if ever require you to mind anything; automatic RC does almost everything but requires the programmer to analyze and annotate some things as 'weak'; manual RC requires a lot more programmer effort while still technically being "automatic"; and manual memory management means the programmer does everything.

Automatic/manual is a scale, not a boolean yes/no, and the point is that ARC lies on it a bit closer to manual than garbage collectors.


The thing is, ARC is a garbage collection algorithm.

There isn't any such thing as ARC vs GC; that is layman knowledge and just wrong from a CS point of view.


ARC is not an algorithm; it's a language-level feature that generates retain/release calls automatically so that the programmer does not have to. Unlike GC systems, where the resulting program does run an algorithm (and wastes CPU on it), with ARC the generated program is no different than if retain/release were written manually, and it runs no extra code.


That is an implementation detail of how a reference counting algorithm can be implemented.


If you're going to say that, then hand-coded memory allocation is just an implementation detail for a garbage collection algorithm.

True in some sense, but mostly useless. Come on.


The right comparison for ARC is with manual memory management -- not GC.


Depends on what your point is. Both ARC and GC are approaches to limit the complexity and difficulty of memory management. As such, I think it's very reasonable to compare them, because they're different approaches to the same underlying problem.

FWIW, as someone who was a Java programmer for over a decade before learning Objective-C right after ARC came on the scene, I greatly prefer ARC over garbage collection. I find the things you have to remember to think about with both ARC and GC (e.g. circular references and unintentionally strongly reachable references) to be about the same cognitive load, but the deterministic, predictable behavior of ARC means you won't have to debug random GC hangs that only happen in prod under heavy load, with the subsequent fiddling with a million GC options to get performance to be acceptable.
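For what it's worth, the circular-reference trap is easy to demonstrate in CPython, which also uses reference counting under the hood (plus a tracing cycle collector that ARC deliberately omits). This is a toy sketch, not Swift, but the fix is the same idea as ARC's `weak`:

```python
import gc
import weakref

gc.disable()  # isolate pure reference counting from the cycle collector

class Node:
    def __init__(self):
        self.other = None  # strong reference by default, as in ARC

# Build a strong reference cycle: a -> b -> a
a, b = Node(), Node()
a.other = b
b.other = a

probe = weakref.ref(a)  # observes a's lifetime without keeping it alive
del a, b

# Pure reference counting never frees the cycle on its own...
assert probe() is not None

# ...it takes a tracing cycle collector (or an ARC-style weak link) to break it.
gc.collect()
assert probe() is None
gc.enable()
```

In Swift you'd break the cycle up front by declaring one side `weak var other: Node?`; here, storing `b.other = weakref.ref(a)` instead would let `a` die as soon as its last strong reference was dropped.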


ARC is a GC implementation algorithm, you probably mean tracing GC algorithm.

"The Garbage Collection Handbook", chapter 5

http://gchandbook.org/


I think it is time for you to buy more books.


I have a very good collection of CS books and papers about programming languages and compiler design...


I'm actually excited about server-side Swift for exactly this reason. It's early days though.


Not really. They are both valid comparisons, since ARC is much easier to work with than manual management and can offer more predictable performance than GC. That said, it's also slower than manual management and can be trickier to work with than GC.


ARC is a GC implementation algorithm, you probably mean tracing GC algorithm.

"The Garbage Collection Handbook", chapter 5

http://gchandbook.org/


Just to point out that the Swift LLVM compiler is written in C++. So it being unreliable doesn't necessarily say much about the reliability or goals of Swift as a language.


It looks like you haven't used C++ in a while.

C++14 is a whole other world, and can be written with most (if not all) the safety guarantees you would expect from Swift or Rust.

I had to use it for a project and was very surprised about this too...


You're lucky if you get to use C++14 in the real world. My last C++ job was maintaining 2 million lines of legacy MFC code.


Even there one can write quite safe C++ code with the help of CArray, CString, CComPtr, ...

I used MFC like that in the late 90's/early 2000.

The problem is the teammates who write Win32/C-like code instead of MFC/C++ code.


I wasn't trying to suggest that C++ was the cause of the compiler issues, just that Swift wasn't.


Can you explain why you call into question the reliability of the Swift LLVM compiler?


It would be apparent if you were a regular user of the compiler. Random crashes while compiling well-formed code and performance problems with larger files/projects are quite common.

Still, I would say it's mostly usable now. It used to be a lot worse.


You find it better than before? I find Swift 3.x drastically worse than Swift 2.


Do you use Swift at all?

-edit- I meant it as a serious question. But the person who responded to me sums up the issues.


This is downvoted presumably for lack of information, but it's pretty much true.

The Swift compiler segfaults very frequently. I do find this amusing in that it's the compiler for a theoretically largely-memory-safe language (yes the compiler is written in C++, it's still funny). The syntax highlighter in Xcode, which is driven by the same stuff, also crashes, which breaks autocompletion and even indentation. Using Xcode, you just have to get used to it. It frequently reports the wrong error message - just something that isn't even close to related. Sometimes object files just become 0 bytes and you either need to clean (and experience the Swift compiler's blazing performance again) or go and modify that file so that the incremental compiler will pick it up.

I've found most of these to be triggered by using a lot of closures and possibly type inference. Shaking out the incorrect errors or segfaults is... not fun.


The most annoying one is the incremental compiler is broken under Xcode 8, leading to full recompiles every time a line of code is modified.

https://forums.developer.apple.com/thread/62737?start=0&tsta...


Oh? I meant it as a serious question. I wasn't sure what parts I should include, but thanks for essentially covering most of it.

I should mention I also find the community to be sorta toxic. They are so focused on Swift being the one language to rule them all and they use terms like "Swifty".


All language communities do that.


I get your point; programming language communities all have a certain level of fanboyism. However, I find Swift's community to be particularly abhorrent. I stopped participating when people started to ask if certain code is "Swifty" and would judge the merit of something on whether it's "Swifty". It also got tiring how militant they were about other programming languages, especially Java.



Yes I can agree with these.

I wonder if we'll get refactoring in Xcode for C++ now that Lattner has gone. I wonder why they never added it.

