Hacker News | new | past | comments | ask | show | jobs | submit | marwatk's comments

> although I wonder if a FOIA request would actually compel them to give it to you

I believe most of it is open source: https://github.com/18F/identity-idp


Such frequent interventions mean you need very reliable cell service. In times of heavy congestion (concerts, etc.) that doesn't seem feasible.


Hardware tokens (YubiKeys, etc.) are signed by their vendor. They support attestation, which allows a site to disallow vendors not on a whitelist. Some banks (Vanguard was/is one) actually enforce this, preventing all but a handful of hardware keys from working with their 2FA.


From https://www.chromium.org/security-keys/:

> Chrome’s users have an interest in ensuring a healthy and interoperable ecosystem of Security Keys. To this end, public websites that restrict the set of allowed Security Keys should do so based on articulable, technical considerations. They should regularly update their set of trusted attestation roots that meet their policies (for example, from the FIDO Metadata Service) to ensure that new Security Keys that meet their requirements will function.

> ...

> If Chrome becomes aware that websites are not meeting these expectations, we may withdraw Security Key privileges from those sites. If Chrome becomes aware that manufacturers are not maintaining their attestation metadata we may choose to disable attestation for those devices in order to ensure a healthy ecosystem.

e.g., it is acceptable for a bank or other public-facing site to say "we'll only accept authenticators which have been L2 certified via this independent program that maintains an up-to-date list".

It is not acceptable to say "we'll only accept this one vendor's products, and maybe another vendor after a 2 year audit if we feel a business need".

And Chrome has stated publicly that they will remove some or all of the WebAuthn API from your domain if you do so.


So Google will be the arbiter? What could go wrong...


This principle will last until there is some major Chinese key provider that adheres to all the standards. Then, we'll go the way of TikTok with "risks because of control by the Party".


> Regardless, the fact that companies raise their prices in response to shortages is not only defensible but desirable.

Gouging is good. I'm sure they'll be lowered again when supply stabilizes. Any time now.


Red Hat spends a lot of time back-porting security updates to older releases (e.g., RHEL 7). Stream, I believe, only ever tracks the latest RHEL version.

The whole point of RHEL is the long term support (the back-porting), which is what they're going to stop publishing.


CentOS 7 is supported and will be EOL at the same time as RHEL 7.

Stream does have major versions so you can continue to use CentOS Stream 8 and get backports. You only lose anything if you're tied to some minor version of EL for some reason.


I love these things:

https://www.aliexpress.us/item/3256804116114245.html

There are a few suppliers, but the 4x Intel NICs open up lots of possibilities. They're very low power, but still fast enough to handle a lot of traffic.

I run VMware ESXi on mine, with OpenWrt as my router on two ports and a general-purpose server in another VM.


I have a couple of N5105 boxes I got off of Amazon on sale, but I haven't put them into use, yet.

I've never bought off of AliExpress. I'm in the U.S. Do people take precautions when ordering from them? I'm not so worried about the products themselves; I'm worried about the use and security of my financial data.

Worried as in, I just don't know whether they are safe stewards. No experience, no one to ask. Except maybe you guys. Give them my credit card #? Generate a one time #? PayPal?

Sorry to go a bit OT, but this is yet again where I've been tempted to order from them.


I buy from AliExpress often. Most recently a batch of anodized metal business cards, before that a bunch of home automation sensors. I could get the same sensor from Amazon for a lot more money; I did get one there as a test piece before buying 16 of them.

My experience has always been a good one. Just be prepared to wait a while for delivery; the business cards took about three weeks.


Just use a 1 time card and reinstall the OS.


Why stop at reinstalling the OS? You also need to throw away the PC in the dumpster of another neighborhood or city and at least use an angle grinder on the NIC and SSD. Don't forget to wipe off the fingerprints beforehand.


No, that's excessive. I always just unsolder and replace every component, but I thought that went without saying.


What about the firmware? Stuff like Intel ME.


Even if we fixed all of the above, changing jobs would still require uprooting your family. In an era where you worked for one company for life it made sense. Today I don't think it's feasible.


Perhaps but perhaps not:

1. There can still be many companies in an area with lots of close by housing such that you’re not tied to a single job.

2. I moved cross-country several times while my kids were young and it didn’t bother them at all.

3. I’m mindful of staying settled for a while now that my kids are getting older and forming real friendships. But this is like a 10-year window of time, not an entire career.


Yep, this is why LIDAR is so helpful. It takes the guess out of "is the surface in front of me flat?" in a way vision can't without AGI. Is that a painting of a box on the ground or an actual box?


I've read the assertions section a few times and I still don't understand the argument. How is:

  if got == nil {
    t.Errorf("blog post was nil, want not-nil")
  }
Better than

  assert.NotNil(t, got, "blog post")
? They seem to suggest that you lose context, but their "Good" examples are similarly devoid of context.


The main reason I dislike the assert libraries is that you lose the ability to format things nicely; being able to see what's going on quickly is pretty nice when tests fail. Sometimes you want %s, sometimes %q, sometimes maybe a diff or formatted as something else. Testify "solves" this by just dumping a lot of stuff to your terminal, which is a crude brute-force solution. Plus with table-driven test you're saving what, 2 or 3 lines of code?

For example for a simple string comparison it's 12 lines:

   --- FAIL: TestA (0.00s)
       --- FAIL: TestA/some_message (0.00s)
           main_test.go:11:
                   Error Trace:    /tmp/go/main_test.go:11
                   Error:          Not equal:
                                   expected: "as\"d"
                                   actual  : "a\"sf"
   
                                   Diff:
                                   --- Expected
                                   +++ Actual
                                   @@ -1 +1 @@
                                   -as"d
                                   +a"sf
                   Test:           TestA/some_message
 
vs. 2 lines with a "normal" test:

   --- FAIL: TestA (0.00s)
       --- FAIL: TestA/some_message (0.00s)
           main_test.go:11:
               have: "as\"d"
               want: "a\"sf"
With your NotNil() example it's 4 lines, which seems about 3 lines more than needed.

This kind of stuff really adds up if you have maybe 3 or 4 test failures.


NotNil is fine, but as a library author you need to implement every comparison operator for every type. You also have to implement each assertion twice: once for t.Error and once for t.Fatal. We have a homegrown assertion library in our codebase at work, and every time I write a test I have to settle for an inaccurate assertion or add two more methods to the library. I grew up on Go at Google, where I would not have been allowed to check in assertion helpers, and I think they made the right call there.

I've seen a lot of gore in code that uses assertion libraries like assert.Equals(t, int64(math.Round(got)), int64(42)). Consider the error message in that case when got is NaN or Inf.

It is so easy to write your own assertions. I can type them in my sleep and in seconds:

    if got, want := f(), 42; got != want { 
        t.Errorf("f:\n  got: %v\n want: %v", got, want)
    }

    if got, want := g(), 42.123; math.Abs(got - want) > 0.0001 {
        t.Fatalf("g:\n  got: %v\n want: %v", got, want)
    }
Why implement a NotEquals function when != is built into the language?

(The most popular assertion library also inverts the order of got and want, breaking the stylistic convention. It's so bad. Beware of libraries from people that wrote them as their first Go project after switching to Go from Java. There are a lot of them out there, and they are popular, and they are bad.)

Finally, cmp.Diff is the way to go for complex comparisons. Most use of assertion libraries can be replaced by cmp.Diff. I wouldn't use it for simple primitive type equality, but for complex data structures, it's great. And very configurable, so you never have to "settle" for too much strictness or looseness.


Drawing inspiration from:

> Complex assertion functions often do not provide useful failure messages and context that exists within the test function.

I think the best compromise is to avoid combining individual logical units into one assertion. For example:

Bad:

  assert.True(t, myStr == "expected" && myInt == 5)
Good:

  assert.Equal(t, "expected", myStr)
  assert.Equal(t, 5, myInt)
Real world example: at my workplace we have some code that tests HTTP request formation for correctness (right URL, body, headers, etc). Replacing big all-or-nothing booleans with individual assertions on each property of the request provides much more useful test failure messages.

As with any published "best practices" like this, have an open mind but don't just cargo cult whatever Google does. Best to be selective about what does and doesn't work for your situation.


Coming from Swift development I first missed having a collection of assertion functions for testing, but I've come around to the go testing patterns. I do think test assertion libraries usually result in less useful messages, and you end up having to implement a new function for every type of comparison under the sun.

(One thing I absolutely abhor is those assertion DSLs similar to rspec)


I don't think they were able to "become HBO faster than HBO can become us"[1] and now they're paying the price. They have effectively zero moat with their own IP and now they're competing with everyone else for the same creative talent without an ability to separate the wheat from the chaff resulting in quality all over the map.

1: https://www.gq.com/story/netflix-founder-reed-hastings-house...

