Forcing standardization and interoperability is obviously good for interop, but it's bad for companies trying to innovate, because it ties their hands. The moment Apple ships a v1, they have to ship an API, and then they have to support that API and can't change it. When it's private, they can figure it out.
Apple already spends years in R&D before releasing anything. Many of their R&D devices never see market. Requiring them to share an API they've actually shipped to paying customers is not a significant additional hurdle. We know how to version APIs now. They can still make improvements to public APIs without hurting anyone.
Which is why the DMA only applies to huge, dominant companies (the complete list: Alphabet, Amazon, Apple, ByteDance, Meta, and Microsoft), and even there it does not apply to all technologies, only to those where standardization is important to enable competition. It's much more important to have at least some competition than to let dominant companies monopolise entire markets through 'innovation' with private APIs.
They extend it in some ways, but I'm not sure whether they do in this particular way. They do sound kind of terrible, but I always assumed that was due to the microphones being way back by your ears. I'm not sure, though.
You know… I'm grateful that they moved quickly and decisively to bridge businesses through that time. We'd all be worse off had this not happened. I'm willing to accept a certain amount of waste in exchange for speed.
Mm. It’s certainly good to work at the other end of the funnel (thank you!), but it also won't help address the pattern matching that people do in hiring.
It’s an incredibly natural thing for people to hire people like themselves, or people who match their image of what a top-notch software dev looks like. It requires active effort to counteract this. One can definitely argue about the efficacy of DEI approaches, but I disagree that JUST increasing the strength of applicants will address the issue.
Yes it will! That pattern matching is based on prior experience, and if the entire makeup of candidates changes, that'll cause people to pattern match differently. If old prejudices are taking a while to die out, it won't be long until someone smart realizes there's whole groups of qualified candidates who aren't getting the same offers as others and hires them.
> it won't be long until someone smart realizes there's whole groups of qualified candidates who aren't getting the same offers as others and hires them
There's an argument to be made that this is exactly what pipeline-level DEI programs are!
If the goal is to prevent people from being biased, why not anonymize candidate packets? Zoom interviews can also be anonymized easily. If equally strong, or stronger, candidates are being passed over, anonymization should solve this.
Rather than working to anonymize candidates, every DEI policy I've witnessed sought to incentivize increasing the representation of specific demographics: bonuses for hitting thresholds of X% of one gender or Y% of one race, or even outright reserving headcount on the basis of race and gender. This is likely because the target levels of representation are considerably higher than those groups' representation in the workforce. At Dropbox the target was 33% women in software developer roles. Hard to do when ~20% of software developers are women.
Anonymization is probably an under-tried idea. Various orchestras switched to blind auditions and significantly increased the number of women they hired.
They can cheat non-anonymous interviews too. An alternative is to have candidates interview in person at an office, but have the grading and hiring panel see only anonymized recordings of the interview.
I don't think they're advocating against using defer in C? They're saying you can backport the functionality if needed, or if you want to start using it now.
They're recommending changes to the proposal, though, such as requiring a trailing semicolon after the closing brace. That also changes the syntactic category of the defer statement, though it's not clear to me what that actually affects.
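For what it's worth, the kind of backport being discussed can be approximated today on GCC and Clang with the cleanup attribute. This is only a rough sketch of the scope-exit mechanism, not the proposed C2y defer syntax, and the DEFER macro and function names here are invented for illustration:

    /* Sketch of a defer-like backport using the GCC/Clang cleanup
       attribute. Not the proposed C2y defer syntax; DEFER and its
       helpers are made-up names for this example. */
    #include <stdio.h>

    #define DEFER_CAT2(a, b) a##b
    #define DEFER_CAT(a, b)  DEFER_CAT2(a, b)
    /* Declares a dummy variable whose cleanup function runs when the
       enclosing scope exits. */
    #define DEFER(cleanup_fn) \
        __attribute__((cleanup(cleanup_fn))) int DEFER_CAT(defer_guard_, __LINE__) = 0

    static void say_goodbye(int *unused) {
        (void)unused;
        puts("deferred: runs at scope exit");
    }

    int main(void) {
        DEFER(say_goodbye);   /* registered first, runs when main's scope exits */
        puts("body: runs first");
        return 0;
    }

Real backport macros typically go further (capturing an arbitrary statement rather than a single cleanup function), but the underlying scope-exit hook is the same.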
As is relevant to the Ninth Circuit’s opinion, the AADC creates two sets of requirements. First, the AADC requires businesses to complete a data protection impact assessment (DPIA) before a business offers to the public any new online services, products, or features likely to be accessed by children. The DPIA must “identify the purpose of the online service, product, or feature, how it uses children’s personal information, and the risks of material detriment to children that arise from the data management practices of the business.” The DPIA also must address, to the extent applicable, eight factors, including “whether the design of the online product, service, or feature could harm children, including by exposing children to harmful, or potentially harmful, content on the online product, service, or feature.” Businesses must document the risks and “create a timed plan to mitigate or eliminate” the risks before the online product, service or feature is accessed by children.
Second, the AADC contains a list of prescriptive requirements in sections 1798.99.31(a)(5)-(10) and (b). Specifically, sections (a)(5)-(10) require businesses to:
Estimate the age of child users or apply the privacy and data protections afforded to children to all users.
Configure all default privacy settings provided to children to the settings that offer a high level of privacy, unless the business can demonstrate a compelling reason that the different setting is in the best interests of children.
Provide privacy information and other documents such as terms of service in language suited to the age of children likely to access the product, service, or feature.
Provide an “obvious signal” to a child if they are being monitored or tracked by a parent, guardian or any other consumer.
Enforce published terms and other documents.
Provide prominent tools to allow children or, if applicable, their parents or guardians, to exercise their privacy rights.
Section (b) then provides that businesses cannot:
Use children’s personal information in a way that the “business knows, or has reason to know, is materially detrimental to the physical health, mental health, or well-being of a child.”
Profile a child unless certain criteria are met.
Collect, sell, share, or retain any personal information that is not necessary to provide an online service, product, or feature, unless the business can demonstrate a compelling reason that doing so is in the best interests of children likely to access the product, service or feature.
Collect, sell, or share a child’s precise geolocation information by default unless strictly necessary to provide the requested service.
Use dark patterns to lead or encourage children to provide personal information beyond what is reasonably expected.
Use personal information collected to estimate age or age range for any other purpose or retain that personal information for longer than necessary to estimate age.
I stand with the parent commenter: maybe Techdirt could actually hyperlink to this kind of information, instead of hyperlinking to three other articles that pretend to answer the question but don't.
I found the law's text via hyperlinks. It wasn't directly linked in the article, nor was it anything the article itself linked to, but it was linked from an article that this article linked to.
Given that it's part of an ongoing series of articles, it's not surprising that the law itself isn't directly linked (the author clearly expects that you've already been following this saga to some degree).
I basically only swipe back. This aligns web pages with iOS nav stacks.