Hacker News | throwawaywindev's comments

That’s why you use COM or WinRT with strong contracts.


Doesn't help you when what you need is MSVCRT.


Agreed, my first thought was why don’t they remove the Like button too?


Think about the worst of the worst ideologies.

Previously, if you were linked one of their videos, it might have 1000 upvotes and 100,000 downvotes, and it would be obviously terrible. Now you just see the 1000 upvotes, and it looks like the video has substantial support.

Google can and does remove videos it simply dislikes by invoking nebulous policies against 'hate' or 'meanness' or similar. Removing the dislike count, then, by and large helps push the unpopular views that Google itself approves of.

Does removing likes let them accomplish that? They can still remove videos that trigger them, but without a like count they can't push their ideology with the appearance of support.

I believe this is the actual reason they will show upvotes but not downvotes, and as a prediction I expect them to also remove total view counts, since those allow estimating viewer sentiment from likes per view (or to make them useless for estimation, for example by replacing an exact count with big buckets like "thousands" or "millions").
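As a rough sketch of the kind of estimate I mean (the numbers are purely hypothetical, TypeScript just for illustration):

    // Likes-per-view as a crude sentiment proxy once dislikes are hidden.
    // All numbers here are made up for illustration.
    const likes = 1_000;
    const views = 100_000;
    const likesPerView = likes / views; // 0.01, i.e. only 1% of viewers liked it
    // A video with genuine support tends to sit well above a ratio like this,
    // which is why replacing exact view counts with buckets ("thousands",
    // "millions") would make even this rough estimate useless.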


They teach that stuff in business school.

https://www.youtube.com/watch?v=cFLjudWTuGQ


Time to sell every possible 4K bitmap as an individual “generative art” NFT.


http://www.milliondollarhomepage.com/

Is there a million dollar NFT homepage yet?

Edit: There is. https://themilliondollarnft.net/


Has to be for their mixed-reality product, rumored to launch next year, which they’ll want native apps developed for. There are only so many ProRes video editors out there.


Yeah, I originally ordered an M1 Max model after the presentation, then cancelled it when I realized that for what I’d use the GPU for (gaming and 3D development), an RTX 3080 laptop would be a much better choice. I also don’t care about performance/watt as much, since I use my iPad for non-work stuff.

But the technology nerd in me still wants to buy one for completely irrational reasons.


Me too. I really want an M1* Mac, but I also realize that I just want to run Linux, so it's kind of pointless right now (I'm not a Linux kernel-level developer, so yeah).


C++ compilers probably will.


I believe they compared it to a ~100W mobile RTX 3080, not a desktop one. And the mobile part can go up to ~160W on gaming laptops like the Legion 7, which have better cooling than the MSI one they compared against.

They have a huge advantage in performance/watt but not in raw performance. And I wonder how much of that advantage is architecture vs. manufacturing process node.


I am very confused by these claims about the M1's GPU performance. I build a WebXR app at work that runs at 120hz on the Quest 2, at 90hz on my Pixel 5, and at 90hz on my Windows 10 desktop with an RTX 2080, driving the Samsung Odyssey+ and a 4K display at the same time. And these are just the native refresh rates; you can't run any faster with the way VR rendering is done in the browser. But on my M1 Mac Mini, I get 20hz on a single 4K screen.

My app doesn't do a lot. It displays high-resolution photospheres, performs some teleconferencing, and renders spatialized audio. And like I said, it screams on Snapdragon 865-class hardware.
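For context, the browser side of this is just the standard WebXR frame loop; a minimal sketch in TypeScript (assuming the @types/webxr typings, with drawScene as a hypothetical stand-in for the app's real renderer):

    // Minimal WebXR session + frame loop. The callback is driven by the
    // headset's native refresh rate (120hz on Quest 2, 90hz on typical PC
    // HMDs), so the app can never render faster than that, only drop below it.
    function drawScene(time: number, frame: XRFrame): void {
      // Hypothetical stand-in: photospheres, remote participants, and
      // spatialized audio poses would be rendered/updated here.
    }

    async function startSession(): Promise<void> {
      const session = await navigator.xr!.requestSession("immersive-vr");
      const onFrame = (time: number, frame: XRFrame): void => {
        drawScene(time, frame);
        session.requestAnimationFrame(onFrame); // schedule the next native-rate frame
      };
      session.requestAnimationFrame(onFrame);
    }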


What sort of WebXR app? Game or productivity app?


Productivity. It's a social VR experience for teaching foreign languages. It's part of our existing class structure, so there isn't really much to do if you aren't scheduled to meet with a teacher.


The MSI laptop in question lets the GPU use up to 165W. See, e.g., AnandTech's review of that MSI laptop, which measured 290W at the wall while gaming: https://www.anandtech.com/show/16928/the-msi-ge76-raider-rev... (IIRC, it originally shipped with a 155W limit for the GPU, but that got bumped up by a firmware update.)


The performance right now is interesting, but the performance trajectory as they evolve their GPUs over the coming generations will be even more interesting to follow.

Who knows, maybe they'll evolve solutions that will challenge desktop GPUs, as they have done with the CPUs.


A "100W mobile RTX 3080" is basically not using the GPU at all. At that power draw, you can't do anything meaningful. So I guess the takeaway is "if you starve a dedicated GPU, then the M1 Max gets within 90%!"


The M1 Max is drool worthy but Mac gaming still sucks. Can’t really justify it given that I don’t do video editing or machine learning work.


Video game consoles have been using integrated graphics for at least 15 years now, since the PlayStation 3 and Xbox 360.


You are mistaken. On both the PS3 and the Xbox 360, the CPU and GPU are separate chips made by different vendors (CPU by IBM and GPU by Nvidia in the case of the PS3; CPU by IBM and GPU by ATI for the Xbox 360). In the PS4/Xbox One generation, however, they both use a single die with unified memory for everything, so their GPUs could be called integrated.


For the 360, from 2010 production onward (when they introduced the 45nm shrink), the CPU and GPU were merged into a single chip.


When they did that, they had to deliberately hamstring the SoC to ensure it didn’t outperform the earlier models. From a consistency-of-experience perspective I understand why, but it makes me somewhat sad that the system never truly got the performance uplift that would have come from such a move. That said, there were significant efficiency gains from it, if I recall.


Yup. Prior to that they were absolutely different dies, just on the same package.


If you mean including the PS3 and X360, those two consoles had discrete GPUs. The move to AMD APUs came with the Xbox One and PS4 generation.


Longer than that, since integrated graphics used to mean integrated onto the north bridge and its main memory controller. nForce integrated chipsets with GPUs in fact got their start from the machinations of the original Xbox switching to Intel from AMD at the last second.


In that case, it's more like discrete graphics with integrated CPU :)


Yeah, and vendors like Bungie are forced to cap their framerates at 30fps (Destiny 2).


They capped PC as well.


If they did, it definitely wasn't at 30. I was getting 90+ on my budget rig.

But no, I don't think they did.


Destiny 2 is capped on PC? The cutscenes are, but the actual game is not.


It used to have a bug that would randomly cap the FPS at 30. Only toggling vsync off and on again would fix it. I have no idea whether that has been fixed.

