
> Microsoft really hit it out of the park yesterday.

Did we watch the same event? Microsoft introduced a $3,000 desktop PC in an era when nobody uses desktops anymore. It introduced a minor update to the Surface Book that starts at $2,300 with a dual-core CPU, only 8GB of RAM, and last-gen graphics hardware.

For the same price as the new Surface Book i7, I can get an MBP 15" with a bigger screen, twice the RAM, and a quad-core CPU, and it's Microsoft that hit it out of the park?!



> Microsoft introduced a 28" desktop PC

No. Microsoft introduced an entirely new kind of tool for digital artists. I'm not one myself, but the Surface Studio makes me wish I were. And the people I know who are, are over the moon about it. They can't wait to get their hands on one.

Yes, it's Microsoft that hit it out of the park, and made Apple look amazingly weak by comparison. Today's event wouldn't have been particularly impressive even on its own. By comparison with yesterday's, it's just embarrassing.


I'm a digital artist myself. I'm more of a goal-based guy, so I use both Windows and Mac hardware and software. I know that I am probably the target market for both the MacBook Pro and the Surface Studio, but I have to say, I'm not impressed. And I'd argue that the only people who are would have to be Microsoft or Apple fans. Certainly not tech-based artists.

First, this is a business for me. And I suspect it's a business for anyone that Apple or Microsoft expects to sell this stuff to. And having to pay $3000 for outdated hardware is a tough pill to swallow. The hardware in the Macbook "Pro" is simply inexcusable. (REALLY???? 16GB RAM MAX!?!?!?) But the hardware in the Surface Studio is equally galling. They give you an NVidia what? 965 for $3000??? And the BEST I could even POSSIBLY get would be a 980??? And for that magnanimous gesture on their part I would be obliged to pay a MINIMUM of $4100??? For LAST GENERATION graphics cards???? On a machine purportedly about graphics???? (A 1080 would run both faster AND cooler using LESS power Microsoft. And at this point in time, for a business investment 64GB of RAM should really be offered by anyone NOT trying to screw you. What's the deal Microsoft???)

I think to objective observers who were waiting for these presentations... both proved SEVERELY underwhelming considering the pent-up expectations. I mean... OK... if you put a gun to my head, I'll probably buy the Surface Studio. But don't expect me to pretend that I don't know that both Microsoft and Apple are robbing me.

Sorry for the rant.


I think you (and a few others here) are making the mistake of equating digital art with a need for high-end graphics cards. 99% of what most digital artists consider to be "digital art" does not require state-of-the-art GPUs. Graphic design work, retouching, painting, etc. barely even scratch the surface of what the last generation of graphics cards could handle.

Let's call high-end graphics cards what they really are: gaming-console graphics hardware stuffed into PCs. Apart from a relatively small number of professionals who work with 3D rendering, high-end graphics cards are simply a waste of money (unless of course you're buying the machine for gaming, which doesn't really fit the "digital artist" target of these machines).

Microsoft and Apple may still be pricing these products too high ("robbing you" as you say), but that's a separate discussion.


"...I can't help but notice that you (and others here) are making the mistake of equating digital art with a need for high-end graphics cards. 99% of what most digital artists consider to be "digital art" does not require state-of-the-art GPUs. Graphic design work, retouching, painting, etc. barely even scratch the surface of what the last generation of graphics cards could handle..."

I can't help but notice that you're conflating the 99% of "digital art" that can be handled by mobile graphics cards with the 1% of "digital art" that anyone's actually going to PAY someone to produce. As I said, this is a business. If I could make the living I currently make by retouching photos, then I'd gladly pay $3000 for an underpowered machine...

but I can't.

because no one is going to pay us that kind of money to retouch photos, are they?

Look, these machines, to a business, are INVESTMENTS. You invest for the FUTURE, not to take advantage of the past. A 1070 or 1080 is not too much to ask considering Pascal's MANIFEST superiority in efficiency. Additionally, I was being kind: I think MS should OFFER 64GB in the Surface Studio, but a 128GB option would really be necessary to future-proof this thing.

I'll tell you what MS and Apple are going for here... it's a money grab. And they are setting themselves up to come back to guys like me every 18 to 24 months for another mandatory money grab instead of just giving us a 60-month machine from the outset.

Offer me a 60-month Surface Studio at the $3000 price point and I'd update every workstation here. But that's not what they're offering, is it?


What kind of art are you talking about? I've done much graphic design (yes, paid) and used Photoshop and Illustrator for nearly a decade (though I now use Affinity, as of last year), and I have never been in a situation where I was running out of resources.

Up until this year, I have never owned a machine with more than 8GB of RAM, and never a video card with more than 1GB of RAM until now either.

Are you perhaps talking about video editing or CGI rendering?


"...Are you perhaps talking about video editing or CGI rendering?.."

Yes. Video editing and CGI. The workflow normally requires the use of MASSIVE memory-mapped files. Additionally, a given artist may do several "lo-res" renders before shipping off to the farm. ("Lo-res" being a relative term, because in most cases we're talking about 2560x1440 or 1080p minimum.) I really don't see how your tech guys are able to get that to work for you on cards that have 1GB of memory. But if they can, then you'd better NEVER let those guys go, because they are worth their weight in gold. (Unless you don't do any CGI or video editing???)


The two CGI and Animation shops I've worked at have had most of the artists on Linux boxes...


Yes, this is the case in almost all VFX and animation studios. Pretty much everything is done on Linux, usually CentOS. That being said, 8GB is a joke. 32GB is the minimum, and 64GB is what new machines would be ordered with for almost all artist workstations.

Source: I was a system engineer for VFX and animation for a number of years.


Ah I see. Yeah, I don't do any video or CGI.


Our artists at work were drowning with 8 GB of RAM. Upgrading them to 16 GB made a substantial difference for them.

Then again, with Chrome taking up >4GB of RAM, they probably could have gotten half that performance just by closing their browser or using Safari.


> Then again, with Chrome taking up >4GB of RAM, they probably could have gotten half that performance

Can confirm. Got up to 15GB of RAM usage today, before closing 30+ tabs to get back down to 7GB. Maybe it's ridiculous, but given that the information I need from those tabs is 100% text, it's infuriating.

Oh, and also sitting awfully comfortably in that 7GB area is Dropbox, which is doing who knows what with all of its RAM...I do like the Linux command line tool though.


I had the same issue, but at home, where my workflow is different. My solution was to use the extension "The Great Suspender", which suspends tabs that haven't been used in a while. Might help you too.


Firefox is even worse. I caught it using 37GB virt the other day on Linux and a pretty significant chunk of RAM too. I don't even have 37GB of physical memory.


"Then again, with Chrome taking up >4GB of RAM, they probably could have gotten half that performance just by closing their browser or using Safari." This is so true. I've recently stopped using Chrome and switched to Safari. Now my 5-year old MacBook Pro runs like butter.


Browsers and the web are frankly getting ridiculous. The modern web has a lot going for it, but when moving the web forward people seem to completely forget about the efficiency argument. I wonder how much of the world's power we waste casually like that.


[flagged]


[flagged]


Please review the HN guidelines, which ask you both to comment civilly and to resist complaining about downvotes.

https://news.ycombinator.com/newsguidelines.html


As a developer and a digital artist I am often running multiple VMs, my various development tools and 2-5 Adobe products on top of my 2 browsers and other misc programs. Agreed that for any single piece of digital design software 16GB should be enough (even, to a certain extent, After Effects). But if I could have a high-quality laptop that is Unix-y AND runs the whole Adobe suite plus my development tools (whilst not having to close all my research notes in Firefox)... gee whizz would I throw my money at that.

I'm probably an outlier though, so I can understand why it doesn't exist.


I don't think you're an outlier at all. 32GB is the absolute minimum I need to get my work done without disk thrashing. If I spent more time on video than audio I'd want 64GB.

That's completely realistic for pro and semi-pro users.

The cost of making a laptop that could take 32GB or even 64GB of RAM must be tiny. No one would care if Apple only supplied 16GB but left a couple of slots free, or even if the stock overpriced 16GB could be swapped for 32GB.

Why cut corners and make "pro" machines that can't be used for professional work?

It's counterproductive. It may increase margins in the short term, but over time it alienates users and erodes the appeal of the brand.

It seems "Think Different" has become "Don't Expect Much."


To be fair, I think most people's interaction with "graphics arts" is Lightroom, which is a pretty darn slow piece of software. Not the complicated stuff, where they reused code from Photoshop, but the simple stuff like opening the file import dialog.

I think people are imagining that a better computer will improve its performance, but it won't. There's a sleep statement in there or something. (Their blog post says something like "a cloud industry-leading innovation in picking-your-camera-out-of-a-list user experience story flow." Really? Just suck the files off my memory card in the background as soon as you detect one.)


Most people I know working as professional artists/designers are working with 2D assets in software like photoshop. Even the few working in print don't need crazy GPUs. I know some people who deal with video and 3d but most of them farm out to servers to do the heavy lifting. I'm not sure what you do but in my experience this is more than enough for most digital artists and designers.

I don't know if you saw this posted elsewhere in the thread but this is a pretty solid endorsement: https://www.penny-arcade.com/news/post/2016/10/26/the-surfac...


Considering the extremely long time between hardware refreshes on the MacBook Pro, I entirely share your sentiment and am astonished Apple didn't wait for a suitable Kaby Lake part. People have been waiting years; what's another six months? If you're going to charge me top-end prices, bloody give me top-end hardware. It isn't simply top end by definition just because it has a Microsoft or Apple logo plastered on the side, sorry.


I don't care as much if the hardware is "top end", but I do require that it lasts; my Surface 3 keyboard didn't last a year.


This is 100% where I'm at. I'd buy a Surface Monitor right now, because I could be assured to use it on a...well, I almost said "a real computer", and I feel a little bad for that, but I also sorta don't. I need forward compatibility with my hardware, and my workstation already blows the doors off the little box they've saddled that great monitor with.


The stats for the Cintiq Companion 2 are in the same ballpark and, from personal use, the thing has a hard time providing a smooth UI when editing very large Photoshop, Illustrator, or Unity projects.

I think you're right that artists largely don't need what high end gaming PCs might require to play the newest steam titles at full graphics settings, but for what Microsoft is asking here, they do at least need enough GPU/CPU strength to have a very responsive UI when editing very large and complex graphical projects at 2560 x 1600 and beyond.

My worry about the Surface is simply that the Minority Report-style advertisement isn't really possible with a real-world PSD or 3ds Max scene.


> gaming console graphics hardware

High-end graphics cards are usually much better than a gaming console: they offer much better frame rates and a lot more resolution, and at the same time a single card can cost more than a complete gaming console.

So, yes, they are for gaming 99.9% of the time. But please don't compare them with consoles.


> Let's call high-end graphics cards for what they really are: gaming console graphics hardware stuffed into PCs.

Look, I'm sorry - I'm with you for most of it, but you have this stick held by completely the wrong end. Gaming consoles are closer to APUs than discrete high-end GPUs.


This person is a (self-identified) digital artist; if their claims are true, I'm sure they know what digital art does and doesn't require.


Your rant about GPUs on the Studio for 3k is heavily misguided. What you are paying for with that 3k is mostly the new screen and its capability. The closest rival to that screen is a Wacom Cintiq, which is very near the same price point without all the hardware MS has added. Are you aware of the new stuff that is packed into that one screen? That is the question you need to answer for yourself to understand that price point.

As for having "outdated" cards, that is more a function of the production timeline. I am sure the Studio has been in the works for a while and those were probably the best in class cards around the time.


Pro tip: When you have to explain to people why they shouldn't be disappointed by something they've thought about, you've almost always already lost the argument.

And the video card is outdated, period, no air-quotes. There is simply no argument there. Making an excuse for it is nice and all, but it simply is a case of charging a premium for old tech.


He is directly rebutting one person's opinion, not coming up with a talking point after consulting with a survey of people's reactions to the new announcements. Who's to say it's not the parent that was mistaken?

Using the Surface as a tablet for drawing has been a huge use-case for visual artists from day 1. Just see this positive review of the original Surface Pro by Gabe from Penny Arcade: https://www.penny-arcade.com/news/post/2013/02/22/the-ms-sur...


And his follow-up post yesterday on the Studio https://www.penny-arcade.com/news/post/2016/10/26/the-surfac...


That's a pretty good endorsement.


I know several artists who bought the Surface for exactly this reason and found it disappointing. The hardware in the Surface Pro is pretty mediocre unless you spend a lot of money on it (at which point a Cintiq Companion becomes an intriguing idea if you don't know what Wacom's customer support is like).

Most of those artists ended up buying iPad Pros to replace them with.


"Making an excuse for it is nice and all, but it simply is a case of charging a premium for old tech"

This is a weird complaint to me. So what if the display hardware is dated if it is perfectly up to the task it's presented with? Maxing out the specs on a machine for no other reason than to max out the specs seems to me to be nothing but a waste.


The GP was the one making the complaint. I have no interest in the device. But if it is important to people, then it is important to people.

And even if it is perfectly suited to the rest of the hardware and any additional RAM/performance would be wasted[1], that's not a good excuse to overcharge for old commodity hardware.

[1] I would be extremely surprised if the rest of the system were designed around the video card. Actually, designed twice, since there are two video card options.


Depends if you want it to be great for years to come as well.


Another pro tip: when marketing to professionals, they will get it over time. If you're a real professional, you will follow the news, see the reviews and tests, and after a while you will understand new technology that was not properly marketed in the first place.


This "digital art graphic card" is starting to sound like "for my Big Data project I need these nonstandard tools and big compute pellets" when all they have is moderate data queries and a batch-processing workflow.


> or LAST GENERATION graphics cards

Not even just that, it's the last-gen mobile card, which means you have to spend 4.2k just to have 4GB of VRAM on something that, at the end of the day, is really a desktop. The 980 desktop version is ~$600 with the same 4GB of VRAM.


That's not what you are paying for.

You are paying for a digital drafting table.

As mentioned in another thread, purchasing the LCD panel itself a la carte would cost more than the entire unit.


Except that you are paying for it. If it costs money (which it does) and you pay (which you do) then you are very much paying for it. And that's the problem. Not only are the 9-series GPUs a generation behind, the shift to the Pascal architecture delivers previous-gen performance for literally half the price (see GTX 970 vs. 1060).

Maybe the thing you want to pay for is the screen, but there's no avoiding the outdated GPU tax that you have to pay to get it.


Or even an RX480. It could have had 8GB of fast VRAM for less than $300.


You don't even need the desktop 980 to compete with the laptop 980M. The 1060 beats it by a significant margin.


Yeah, but my point was that if they were too far along with old tech, there were still drastically better pre-existing options for a product that isn't really mobile.


I share your surprise that Microsoft didn't use the 1080. I would really love to hear the back story on that: was it getting it to fit? Was it the power envelope? Was it driver support? Etc. I expect that if they pre-sell a million Studios or more we'll see an updated unit ("Studio Pro" anyone?) that is a 64GB/1080 machine, but time will tell.


AFAIK:

980M - 100W / 1060 ~ 80W / 1070 ~ 120W / 1080 ~ 150W

If they started out with a max TDP of 100W for the cooling system, upgrading it to 150W in that small space sounds like too much of a change to meet their deadline?

Clearly this doesn't excuse skipping the re-engineering, but it might explain why they went this route.


I can't imagine that it was power envelope. Pascal really made progress on power efficiency, which is what makes it even more surprising.


You're surprised that they didn't use a top of the line card when most of their target audience will never even make the thing sweat? I have a 1080 and use it to play games at 4k while driving two additional 1080p monitors. It kicks ass and I push it much harder than most digital artists ever will.


Well, there are two reasons I was surprised they didn't use the 1080. One was that the sort of lag-free drawing/manipulation they're demoing is surprisingly hard on a GPU. When I'm turning a 3D part around in "quality" render mode in TurboCAD and modifying it on the fly, it gives the GPU a fairly good workout. Unlike games, it's pretty much forced to recompute lighting fairly often.

The other was that they have already crossed over the "luxury" threshold from "just another PC" to "special hotness" sort of territory. So unless Nvidia is giving them a smoking deal on "old" GPUs, it seems that from a marketing perspective you would want to check off all the boxes. It's reported that it would use less power and generate less heat, and Nvidia says it is 20% faster. That seems like it would be a "no brainer". But I clearly don't know.

That is why it surprised me when I read that the high-end Studio only had the GTX 980 in it.


> when most of their target audience will never even make the thing sweat

Really? First thing I thought when I saw it was how cool it would be for 3D sculpting. There are more uses for high end GPUs than gaming.


Of course there are people who will, but there are more who won't and think that e.g. Photoshop requires a 1070 or better.


I think the idea that it's a digital drafting table that can also be used as a desktop computer is the right way to think of it (as many have alluded to in this thread).

I wonder why MS didn't stick with AMD (like the Xbox One) if they weren't going to use a 1080 (or a 1070, which might make more sense).

As someone that used to enjoy drawing/doodling but never made the leap from mouse to digital pen, sketching and doodling on the Surface 4 was eye-opening. Any number of programs paired with the pen/screen feel a lot like ink drawing, without the mess and with multi-level undo.

The combination of a decent pen and great screen makes for a very satisfying and immediate drawing experience.


While I really like the look of the Surface Studio, I was shocked at the GPU choice and the fact that it comes with a hybrid drive. I mean, I do think it is still a good machine, even for the price, but come on!


Oh, heck. Didn't notice that, yet. That's a bit rough!


What's wrong with a hybrid drive?


They are not as fast as an SSD.


In some use cases they are, since they contain an SSD as a large cache. They are also a lot cheaper than an SSD, so it's nice to get some of the benefits without all the cost.


For me, solid state > moving parts. Faster, and they run cooler.

Edit: AFAIK the only moving parts in the Surface Studio are the fans. Replacing a fan is an annoyance, but at least it won't lead to data loss!


I don't understand how the bump in graphics card hardware matters to your drawing workflow.


Photoshop and much other graphics software has been using GPU acceleration for 5+ years now.


And almost none of those apps are going to max out an Iris Pro, let alone a 980M.


They don't need to "max out" a card, because PS is not a game.

What counts is the time it takes to perform an operation like a Camera RAW import or a blur. If that goes down by a significant percentage, productivity goes up by a significant amount.
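To put rough numbers on that (purely illustrative figures, not measurements of any particular card or workload):

    # Back-of-envelope: time saved per day if acceleration shaves a couple of
    # seconds off each of a few hundred operations (all numbers hypothetical)
    ops_per_day   = 300
    seconds_saved = 2.0
    work_days     = 230

    daily_min  = ops_per_day * seconds_saved / 60
    yearly_hrs = daily_min * work_days / 60
    print(f"~{daily_min:.0f} minutes/day, ~{yearly_hrs:.0f} hours/year")

A couple of seconds per operation doesn't sound like much, but it compounds into real billable time over a year.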

For drawing, good acceleration is the difference between a distracting experience that's laggy and unusable, and a smooth experience that gets out of the way of the work.

Besides, there are people with an interest in art and machine learning. Good luck with a 980M if that's your area of interest.


Not to the level people in this thread think they do.


I am also puzzled by this... Maybe knowing they have more horsepower than they need empowers them. Or maybe they are doing a lot of high poly 3D work or something.


Driving that 13.5-million-pixel (4500x3000) screen takes some graphics power today; tomorrow it will be even more resource-hungry.


Maybe the potential to use the $3000 computer for something else than drawing?


The architects at Foster + Partners designing the Apple campus use Revit, and, being architects, they want the coolest tech. The Surface Studio hits that sweet spot better than anything from Apple at the moment.


I'm struggling to believe you're a digital artist if you're complaining about the Surface Studio. A Cintiq monitor, with no computer, costs nearly what the Surface Studio does, and gives you an inferior display to boot.

The machine is not "purportedly about graphics", it's about 2D artists, and they made that clear in pretty much every presentation I've seen on the device.


OK, graphics wouldn't really be that big of a problem if companies were trying to maximize it. Instead, they're trying to minimize the footprint of the actual computer without sacrificing productivity capabilities. Luckily, this provides an incentive for the eGPU market, which will hopefully give us compact devices with modular, powerful computing specs.


>> And the BEST I could even POSSIBLY get would be a 980???

Not only that: it's a mobile processor, the 980M, so it would lag some distance behind a desktop 980. That is cutting serious corners for a desktop at that price point.


You go on and on about specs. Nobody cares about specs. Just look at the MacBook landing page; it's full of meaningless percentage numbers.

70% faster; they could just as well write "new recipe".


Actual graphics and design pros (as opposed to Apple 'Pros') absolutely care about specs. Come hang out in VR developer land for a while, where interest in specs extends to system architecture choices.


Isn't something like the Mac Pro aimed more at you? It's packed with graphics hardware. Although admittedly they haven't updated that for several years.


This. It's aimed at a very specific group of users and they love it -- e.g. https://www.penny-arcade.com/news/post/2016/10/26/the-surfac...


For me, the most revealing piece of information from that Penny Arcade article was how much the 27" Cintiq costs ($2500). Makes the Surface Studio seem very affordable for what it offers.


The bonus for the Cintiq is that the 12" one I bought over 8 years ago is still working great!


I use a Mac with an original 30" Cinema display, along with a giant Wacom tablet.

The Surface Studio may seem impressive, but as far as workflow goes, the Mac with tablet setup really is perfection.

I've never had the desire to look down on the tablet to draw, as the pointer always moves synchronously on screen.

In fact, I feel my hand would get in the way of drawing..

Right now it seems like the Apple products are refined perfection, while Microsoft is still trying to figure out its place.


In the last two years all of the artists at our studio (which does animation and interactive work) have switched to Cintiqs. We were really apprehensive to make the jump, but they all seem extremely happy, and some have even reported that the pain they experienced after extensive tasks has gone away. I'd be really interested to see how they like the Surface Studio compared to a Cintiq.


Different strokes, I imagine. My fiancee is a graphic artist who had a Wacom tablet and a Cinema Display as her setup for a long time, but she got the 27" Cintiq a few months ago and LOVES it. I had similar skepticism that it'd really be that big of a step up, especially given what she was spending on it, but she says it's a noticeable improvement in her productivity.


Whether or not it improves her productivity, if she loves working with it, that's worth 100 productivity points.

"The enjoyment of one's tools is an essential ingredient of successful work." --Donald Knuth

It is worth spending the money, if you have it, to buy the tools you will enjoy working with.


And are you not a bit disappointed that the products you describe, Apple's desktop products, appear to have been totally ignored by Apple? No new iMac, Mac Pro or Mac mini hardware was announced; it seems like it's not a focus for them at all.


I think their focus has been on mobility for some time now. Very few people actually need the added horsepower a desktop platform can offer due to the relaxed power and space budget. I've been working from laptops exclusively since 2003 or so.


lol, why would I be disappointed? Just use a laptop and dock it to a monitor when needed.

Why would anyone need a desktop in this day and age? Moore's law means the things that desktops used to do 5 years ago, laptops do now.

The compute requirements for photo editing, for example, haven't changed in 5 years. But the processors and systems (IO, displays, etc..) keep getting faster. It's only natural that laptops would cannibalize desktops. And phones/tablets will cannibalize laptops in a few years.


Can I suggest you learn something about a domain before commenting on it?

Photo editing can be hugely resource-intensive. Camera RAW files have been getting larger over time, and all but the simplest edits demand an SSD, a fast processor, and at least 32GB of memory. 128GB isn't unusual for commercial work.

You can get by with less, but sooner or later - usually sooner - you run out of RAM and your machine starts thrashing.
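For a rough sense of scale, here's a back-of-envelope sketch; every number is illustrative rather than taken from any particular camera or app:

    # Rough working-set estimate for a layered 16-bit edit (illustrative numbers)
    megapixels      = 50          # a current high-resolution body
    bytes_per_pixel = 4 * 2       # RGBA at 16 bits per channel once demosaiced
    layers          = 8
    history_states  = 20          # worst case, each history state keeps a full copy

    per_layer_mb   = megapixels * 1e6 * bytes_per_pixel / 2**20
    working_set_gb = per_layer_mb * (layers + history_states) / 1024
    print(f"~{per_layer_mb:.0f} MB per layer, ~{working_set_gb:.0f} GB worst-case working set")

Add the OS, a browser and whatever else is open, and 16GB disappears quickly.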

Many Photoshop features are hardware-accelerated, so having a good graphics card makes a significant difference to editing speed.

3D animation definitely needs a good graphics card. So does video editing.

Audio doesn't, but it needs as much raw processor power as possible. I know DJs/producers who are running dual 12-core Xeon systems for their mixes, and adding PCI DSP cards on top.

Surface Studio looks very nice, but it fails to tick most of these boxes. The reality is there's a significant market for creative power users willing to pay good money for multicore server-grade towers for their work.

Neither Apple nor MS are paying attention to this market. The Mac Pro is an underpowered curiosity now, and Surface Studio has - sadly - been hobbled by greed and penny pinching.


MS definitely is not paying attention to the high-end workstation market. They are paying attention to the trends of interaction and the new interfaces that technology is allowing us. The product here is the new styles of content creation, and the accelerated pace of current content creation, via the form factor and the accessory knob thing. If you have needs that demand extreme resources, you can probably afford remote rendering or processing using the myriad wonderful networking technologies that have advanced so much.

It can't be an effective interaction device and a server-level resource at the same time. Anyone who is enough of an enthusiast to require a dual Xeon workstation is clearly not who MS is targeting with a single product. Leave the multicore server-grade towers to HP, Dell, Lenovo, etc. because there's nothing to innovate there - you just throw hardware, bought at market price, into a box. What is MS supposed to innovate on there if all you care about is specs/$ and ignore the design and use-case?


This is a very niche market. As laptops get more and more capable, it'll be an even narrower niche.

I love to have just as much power I can. I'd love to have a multi-CPU POWER9 loaded with a couple terabytes of RAM and a couple flash cards attached to a fast bus, but I wouldn't know what to do with it.


Yeah, none of what you said matters to actual photography pros or graphic designers.

Camera raw files haven't been increasing in size faster than Moore's law.

They can get by fine with a laptop. The Mac Pro is overpowered for them.


> The Mac Pro is overpowered for them.

I'm not sure if you are trolling?

Moore's law has slowed down btw and is predicted to end in 4-6 years (2020 - 2022) as it would be physically impossible to shrink transistors any further /unless/ there is a change from the current silicon CMOS technology.

In fact, manufacturers are no longer even targeting the doubling of transistors! Instead they are focusing on speeding them up.


I'm not sure if you're being facetious but for those of us who enjoy playing video games a desktop is the best choice. If you don't care about portability, why not get something that is by far more powerful, cheaper, customizable, and future-proof?


You're in the wrong discussion if you think gaming has any relevance to this line of laptops. People that own Macbook Pros don't play games seriously. Gaming is strictly for Windows devices, as it's largely an afterthought in the Mac world.


> Why would anyone need a desktop in this day and age?

This is what I was responding to.


Yes, but the context of the thread is graphics design.


Gaming on Mac is rapidly approaching parity with Windows, ironically helped a lot by Linux and Valve


Are there many more studios outside of Blizzard that are developing for OS X? I hadn't realized there were more devs working with it. Every other game I know of that has Mac support is usually running in some WINE equivalent, which isn't great.

So we have Blizzard and the Source engine; who am I missing?


Ahaha ahaha... No, no it's not. I can't imagine you play any 'serious' games. Hell, even WoW, a game which is well supported by the developer on OS X, plays like crap on my $2600 MBP.


Going by my recently played list on Steam

http://steamcommunity.com/profiles/76561197998185123/games/?...

The Elder Scrolls V: Skyrim Special Edition: Win

Team Fortress 2: Win/Mac/Lin

Factorio: Win/Mac/Lin

Rocket League: Win/Mac/Lin

Particle Fleet: Emergence: Win/Mac/Lin

SHENZHEN I/O: Win/Mac/Linux

Oil Rush: Win/Mac/Linux

Castle Crashers: Win/Mac

Tropico 5: Win/Mac/Linux

Literally every game I've played recently except ONE supports Mac

Obviously I'm not a "serious" gamer :)


Well, I should have worded that better; I shouldn't have been so dismissive. By 'serious' I meant 'resource intensive'. I don't know all of those games, but the games I do know are definitely not very taxing on a GPU (aside from Skyrim, but what settings are you playing it at? Does that seem worth it for the price tag?)

I don't think you'd argue that a mac is a good gaming rig from a performance / $ perspective (or any other perspective really).


CS GO worked good enough on Mac and Linux last time I checked


Define 'good enough'. These aren't inexpensive machines. In the context of gaming, $2600 is far too much to pay for a laptop that gets ~30fps at medium settings.


...you're joking, right? What a ridiculously narrow view.


I never used the Cintiq, but I've been using Wacom tablets for a good 15 years. I owned & used the Surface Pro 2 & 3, and the iPad Pro with pencil.

Both have their place: pen on screen or pen off. Working with your pen on a screen, especially a smaller screen, means you lose a good chunk of your visible space. Overall opinion: it is great for artists that two major companies have made a significant effort to get pen-on-display right.


How does the iPad Pro with the pencil compare to the Surface Pro? I played with the Surface Pro 4 and had a lot of issues with the palm rejection.


Try drawing/animating in Harmony with it and tell me how it goes. Drawing directly on screen has its purposes.


sent simultaneously from 12 mac devices.


I am basically convinced those are paid ads.

"Overwatch was fast and fun on medium settings. Civ VI is what I’ve played the most on the Studio and it looks and runs fantastic."

"When I first saw the device months ago in that secret room at MS, they asked me what I thought. I said, “Well I have no idea if anyone else will want it, but you have made my dream computer.”

Nothing about that says "I went out and paid my own money to buy this."


So you must've somehow missed the "Just to be clear, I don’t get paid to do any of this stuff but I enjoy doing it and I like to think I’m helping make the Surface better for artists." while reading the article, right?


Advertising is more than just paying actors to pretend to like something. News articles rephrasing press releases are often also ads. Video game reviews usually don't involve simply handing money to someone. That does not make them unbiased.

People spreading viral videos are often unpaid, even if they are a critical part of a marketing campaign. Hand someone some free hardware and they don't think "is this worth X?", just "is X cool?"


I suggest familiarizing yourself with the history of Gabe/PA and the Surface. Gabe initiated the relationship by writing a post out of the blue about its potential as a drawing tool, as well as his complaints about it. In response, Microsoft invited him to a lab to try prototype Surface hardware/software and made a lot of changes based on his feedback. So they've basically made a machine designed to be liked by him, and they continue to get his feedback to ensure that doesn't change.

So, it's not surprising or questionable that his reviews are positive.


I actually read his first post about the Surface when it came out. The point is MS is continuing the relationship for good press, and by becoming part of the process they have become biased.

Also, pay is not limited to just cash.


If the consideration is that they made the product work for his use case, then that's not corruption in any meaningful sense of the word. That is exactly the behavior you want to reward companies for.


If they were not followed by large numbers of people, then you might have a point.


IIRC, he returned his last trial-Surface device and bought one with his own cash.

Based on the article (he's been lent this device for the last week or so), I imagine the same thing may well happen if he's happy with it - which it sounds like he may be.


Pretty sure the guys who've been making a very successful business out of Penny Arcade for the last couple of decades don't need to take backroom deals from Microsoft in order to make ends meet.


rc fail? You're so far off; I don't even know where to begin.


Apple delivered meat and potatoes for ordinary users (albeit with a weird garnish), while Microsoft delivered a molecular gastronomy dessert that a tiny niche has any use for. I don't see how that is to Microsoft's credit, much less "embarrassing" to Apple.


MacBook Pro (Retina, 15-inch, Late 2013)

They haven't replaced my laptop with anything new. They just released a newer model, similar to every yearly company announcement ever. Nobody cares about version 7 with nothing new; they care about version 1 of something innovative.


Wacom releases innovative products all the time, but I don't think that makes Apple's regularly-scheduled update "embarrassing."

Microsoft just blew Wacom out of the water in a niche product market. Great. What does that have to do with new MBPs?


>Wacom releases innovative products all the time,

Hahahahahahahahaha

Wacom hasn't released an innovative product in 10 years, and much like Texas Instruments and their $130 monopoly on school calculators, everyone is still paying the same $2500 to Wacom that they did decades ago for Cintiq.

The lack of competition for Wacom is criminal and has allowed them to coast on last decade's product line for a very long time.

Hell, the most innovative thing Wacom has done in 10 years is rebrand Graphire into Bamboo. Ooooooh!

I imagine the day the Apple Pencil was announced was a very grim day in whatever is left of the Wacom R&D department.


> Wacom releases innovative products all the time

Does Wacom have an entire suite of comparable products, or do they cater specifically to a niche market? Was their announcement yesterday? Life has context.

> What does that have to do with new MBPs?

I'll go ahead and quote the comment you originally responded to: "This event was by far the most disappointing Mac event in the history". Apple had an event where they announced not much more than that we are now in late 2016. Nobody is saying the MBP itself is bad; the event is comparatively lacking.


We agree, then, that Microsoft has just out-innovated Apple. When would you say is the last time that happened? Has it ever?


Let's see:

* I think you can reasonably make a case that Windows 95/98/NT were better than MacOS 7/8/9. (You could probably make a reasonable case for the opposite; I certainly wouldn't argue the point).

* The Zune was better than the iPod. Unfortunately for MS not better than the iPhone a few months later. The "Metro" design language it introduced was quite a bit better than Apple's at the time (and started the "flat design" trend that even Apple would end up aping, years later).

* Various areas where MS beat Apple by default, at least circa the 90s-00s, owing to Apple making no or a half-hearted effort. Office suites, gaming consoles, web browsers, the server, 3D graphics, yadda yadda.

* C# is a far better language than Objective-C, and Visual Studio is a far better IDE than Xcode.

* Edge is arguably a better browser than Safari. MS have also made some great tooling for webdev recently, VS Code and TypeScript, while we haven't heard much from Apple.

* The Surface tablets have been arguably better than recent iPads, by virtue of running a full OS.

After that I've got nothing.


> * C# is a far better language than Objective-C, and Visual Studio is a far better IDE than Xcode.

C# is a nicer language, and VS a better IDE when it comes to language integration, but the platform APIs are far more productive on OS X than they are on Windows, and the GUI toolkit was good from the get-go, whereas MS never commits to a toolkit fully (MFC, WinForms, WPF, now UWP, which is like a restricted, not fully compatible WPF).

Why do you think there are so many great, lightweight alternatives to Photoshop with non-destructive image editing on OS X, like Affinity Photo, Acorn and Pixelmator, and none on Windows? Why is it that Apple can dogfood and write everything in their modern platform APIs, even rewriting their file browser, Finder, in them, while Windows out of the box comes with exactly zero .NET apps? Recently Windows 10 brought some .NET stuff out of the box on the desktop, but they're all toy apps no one would want to use, and certainly no equivalent to Apple Mail, Photos, iMovie, GarageBand etc. The UWP mail client is pitiful.

The C#/.NET platform in general, on the desktop, as far as commercial desktop apps sold to consumers go, is a dead wasteland. Whatever few .NET apps I've seen as an end user I tend to associate with "slow, heavy, not featureful", particularly WPF apps. .NET's greatest success is the same as Java's: on servers, or on the desktop for in-house business software where the well-being of the end user is not a priority and no one cares if the app feels sluggish or has terrible UI.

The lack of dogfooding has been a common complaint among windows developers for a long time, for example :

http://functionalflow.co.uk/blog/2011/09/27/microsoft-please...

> I understand deadlines and priorities, and I know that probably Microsoft just had to ship something at some point, but it really seems that there was a big lack of dogfooding in the WPF case.

> There’s a striking example of this: what was the number one complaint that developers had about WPF since 2006?… Blurry text and images. And when did Microsoft fix it?… Only in 2010, when they started using WPF for Visual Studio.

> Another issue that has been bugging me since I started using WPF was the airspace limitation. It seems that it’s finally going to be fixed in 4.5. Why do I think it’s being solved now? Because they probably needed some native WinRT component to play nice with WPF…

Microsoft still doesn't really use .NET outside of dev tools and server apps. UWP apps are just toy apps. UWP OneNote isn't even close to the desktop OneNote. And so on. MS themselves don't really produce high-quality desktop apps with C#. If they won't, then who will? Compare to Apple and how everything they make makes heavy use of their platform APIs such as Cocoa, Core Image, Core Animation etc. How could .NET not be a barren wasteland for desktop apps?

Apple always had the better developer platform, dogfooded and thus battle-tested, and now they're also getting a nicer programming language to work with, with Swift. Their IDE is still no Visual Studio, but AppCode from JetBrains fixes that.


>...the platform APIs are far more productive on OS X than they are on Windows and the GUI toolkit was good from the get go...

Hence why I made no mention of APIs.


When they released the Surface Book.


The Tablet PC. Cue the usual "Apple did it best". Yes, but they did not innovate on the format.


Well, you could argue that Apple hasn't done a tablet PC. They've done a tablet-sized phone. Of course, where you draw the line between those is probably different for different people.


They did, just not in terms of the underlying concept. The sheer size and weight of the iPad was innovative. Doing away with a physical keyboard altogether was innovative. Using a capacitive touchscreen for a far more pleasant user experience was innovative. In short, many small innovations rather than one big one.



When did Clarke and Kubrick work for Microsoft?


What's interesting is that it's a double volte-face: Apple produced a gimmicky consumer-oriented iteration whilst Microsoft gave us something genuinely innovative and gorgeous-looking, but aimed at a high-end, professional market with the money to spend. It follows on from Apple giving us desktop users a transparency abomination straight from the Windows Vista era. What is happening?


To continue the food analogy, it's as if Apple reheated some leftovers: the kind that taste better the next day, but when it comes down to it, it's the same food.


Some of you may remember when, many many years ago, Microsoft showed off an interactive coffee table: the Surface. And now, finally, with the Studio they have brought to reality a device that will be nirvana for designers, factory planners, circuit designers and anybody else who can actually use an interactive surface (pun intended). I'm a product manager for enterprise software and I'm bubbling with ideas for what such an eminently usable large-screen touch device brings to the market. After decades of heaping scorn on MS for the stuff I had to work with that hasn't improved (talking about you, Word, PowerPoint, Excel, Visual Studio), they have post-Ballmer become an entirely different company, and I'm pleased about that. Competition is good for innovation, and the Studio is truly innovative in its engineering and the things it makes possible.


> Microsoft introduced an entirely new kind of tool for digital artists.

It isn't an entirely new tool; it is an HDR, higher-res, larger version of a Cintiq, with a built-in slow PC with last-generation mobile graphics. The hinge and stand were definitely improved industrial design.

Those screen improvements alone are probably enough for lots of professionals to switch from cintiq, even if they lose support for professional-tier desktop hardware, but what about the pen? As far as I can tell the pen doesn't even have tilt detection and from videos you have to use the dial/puck to simulate rotating or tilting the pen.

In spite of that, it was definitely more innovative than anything Apple showed.


> No. Microsoft introduced an entirely new kind of tool for digital artists. I'm not one myself, but the Surface Studio makes me wish I were.

I'd bet that one could make a revolutionary stats program with the new form factor/interface possibility.


> a revolutionary stats program ...

SURE! Except, of course, you're not going to be allowed to code it in Python, R, or Octave. Instead, it will be Microsoft(TM) Visual(TM) R#(TM) with the new NetDNA/DCOM/.NET/CloudyMcCloudface Framework (TM)!! (now, with 40% more Cortana!(TM))


Well, no, it's a regular desktop that runs any application, including those written in Python, R or Octave.


And it's even sillier when you consider that Microsoft's invested time/money in both Python [1] and R [2].

1: https://github.com/Microsoft/PTVS

2: https://powerbi.microsoft.com/en-us/documentation/powerbi-de...


Python runs fine on my surface. What are you talking about?


I purchased a Surface Pro 3 for drawing. So disappointed. I don't want to try again.

If they are serious about art creation, they should only use Wacom technology and nobody else's.


> Microsoft introduced an entirely new kind of tool for digital artists

Actually I believe they introduced a competitor to the Wacom Cintiq, which comes in a variety of sizes and does not require you to use a desktop or Windows 10.

Calling it 'an entirely new kind of tool' is really overstating it.


> but the Surface Studio makes me wish I were

Have you used Surface before? Asking because when I picked one up about a month ago, I had an awful time with it. Makes me think their announcement won't really do anything but look interesting. I could be missing something, though.


What was your problem with it? I've had the Pro 4 for about a month and have honestly loved it.


>> when I picked one up about a month ago, I had an awful time with it

Did you get to play with it at a store or a friend's house, or actually take it home and use it for some time? Everyone I've met that has a Surface (Book, 3, 4, etc.) loves it.


Digital artists? I could program on that machine and be happy :). The only thing I would love is to have the Surface Studio but with OS X.


> Microsoft introduced an entirely new kind of tool for digital artists.

Not really; you could buy a Wacom Cintiq since about forever. I'm glad they're giving Wacom some competition though.


I think the biggest announcement yesterday was Microsoft's cheap VR.


Microsoft announced a laptop that gets 6 more hours of battery life than the MBP, while Apple used a chipset a full year behind Microsoft's.

For the desktop, if you think people don't use desktops you're out of touch with reality. Microsoft introduced the Surface Studio primarily targeting what Apple has started to abandon, creatives & creative professionals.

A 27" Cintiq tablet costs $2800 on Amazon, and that's without the stand, which is an additional $600. This is what video editors, audio engineers and graphic artists use in their studios. The Surface Studio replaces all of that -and- it replaces the computer itself. What are you missing here?


> Microsoft announced a laptop that gets 6 more hours of battery life than the MBP, while Apple used a chipset a full year behind Microsoft's.

Which means Microsoft has introduced a dual-core low-voltage part significantly less powerful than the one in the MBP. Let's not kid ourselves here: Apple was not going to use 15W LV KL in either MBP, and the 45W parts are still 4~6 months from release.


They do use the 15W Skylake in the non-touchbar MBP. Same claimed battery life, though.


> They do use the 15W Skylake in the non-touchbar MBP.

True. I wasn't talking about the red-headed stepchild, but you've got a point here. The hold-up there might have been that there's no Iris KL (incidentally, I'm pissed that they've apparently gone with the non-Iris Pro on the 15", what the fuck).


Well, not only is there no Iris KL, there's no quad-core KL yet either. A dual-core 15W CPU in a pro machine seems... inadequate. Also, all the 15" models have dedicated graphics, so an Iris Pro would seem like just a waste of TDP to me.


The IGP on the CPU is considerably more power-efficient; the dedicated graphics can be pretty much switched off for anything non-gaming or GPGPU-related.


Yup. I'm just saying having (better, more power-hungry) integrated graphics seems like a waste when you already have a dedicated GPU for when you want more performance.


Dude, I use a desktop. A Mac mini, in fact, which has been struggling for the latter half of the past 6 years.

I'm seriously considering getting a PC instead, even though I've written http://taoofmac.com for 14 years.

(I do happen to work for Microsoft, by the way, but would very much prefer to keep my home setup on a separate tech stack for the sake of keeping an open view. I'm now ogling the Skull Canyon NUC and wondering how well it will run Elementary OS)


On your blog, you wrote: "As far as I’m concerned, Apple is completely out of touch with my segment (call it UNIX-centric pros, if you will), so I’m going to seriously rethink my options over the next couple of weeks."

I would guess that just here on HN there are tens of thousands of us who could describe ourselves as "Unix-centric pros", who want powerful unixy client machines that match our unixy servers and who also want all the client-side stuff everyone needs (excellent video, audio, wifi, battery mgt., etc., drivers perfectly matched to hardware and working out of the box) plus a full range of apps. We can get the unixy client from Linux, BSD, or Mac. We can get the reliable, everything works consumer client stuff from Windows or Mac. The only overlap is Mac.

Unfortunately for us, this power user unix workstation stuff is just a historical accident for Apple, not a corporate objective. They're trying to get out of the computer business so they can focus on fashionable, high-margin, designer "lifestyle" products and services for the hip ones who consume stuff, not the old drudges in the back room who make stuff.

Microsoft holds their Windows legacy so dear that they'll slap more layers on the papier-mache balloon that is Windows but never put unix at its core, and all the nifty client-side Linux systems out there seem to have the motto "before you complain, realize that how much of it ends up working depends on how smart you are".

I would wish for Google to develop a line of powerful Linux-based workstations, but they would just spy on me until Google canceled the project altogether.

I wish Apple would create a wholly-owned subsidiary called "Apple Computer, Inc." that would have access to Apple's manufacturing expertise but would build a line of computers for power users who produce stuff--computers that weren't "thinner".


The Windows Subsystem for Linux, it's legit. I tried it yesterday; it is no joke. It is full-on bash in Windows, running real Linux binaries, not in a VM.

Look into it; Microsoft released it earlier this year specifically for Unix developers.
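One quick way to convince yourself it's the real thing, assuming you've installed Python inside the Linux environment (an illustrative check, nothing official):

    # Run from inside the WSL bash shell; the same script run from a native
    # Windows Python reports 'Windows' instead.
    import os
    import platform

    print(platform.system())     # 'Linux'
    print(platform.release())    # whatever kernel version string WSL presents
    print(os.uname().machine)    # os.uname() only exists on POSIX systems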


I can't see the point, I'd rather just run Linux in a VM.


> Unfortunately for us, this power user unix workstation stuff is just a historical accident for Apple, not a corporate objective.

Many who went Apple when they showed OS X to the world never really understood Apple or, for that matter, NeXT culture.

The POSIX layer never mattered; what mattered were the Objective-C frameworks on top and a workflow based on Xerox PARC ideas that Steve Jobs brought to Apple and NeXT.

The UNIX compatibility was there because NeXT was competing against Sun and Irix workstations, so it needed to be compatible to a certain extent.

Later, Apple promoted the same compatibility because, after the Copland failure, they needed to sell computers or close shop, so UNIX with a nice GUI was a way of doing it.

For those of us that develop native applications, it never really mattered.


> Microsoft holds their Windows legacy so dear that they'll slap more layers on the papier-mache balloon that is Windows but never put unix at its core

But they just did, with Bash on Windows.


Bash on Windows isn't at all the same as Windows on Linux.

I do appreciate that they are trying to do as much as they think they can without throwing away the thing that they believe is the most valuable corporate asset they have: the mountain of stuff built over the years for Windows. They have carefully preserved this asset by making huge efforts to ensure than anything that worked on Windows in the past will keep working forever.

The thing is, what I wish they would do for me (which might not be what they need to do for themselves) is to take the equivalent of my Linux server and wrap a Windows API around it, the way a Mac is like a Linux server inside with a Mac API wrapped around it for client software usefulness. I'd be just as happy with a Windows GUI/API as a Mac GUI/API, but what I want is for it to resemble my servers under the GUI, so I can leverage whatever unixy skills and tools I may have.

Having a simulation of a unix shell on top of an NT core makes Windows more useful than it would otherwise be, but it's not the same as a Windows shell on a Linux kernel.


> the way a Mac is like a Linux server inside with a Mac API wrapped around it for client software usefulness

No, Darwin is Mach with BSD welded to it; it is not a pure Unix design. Likewise, WSL is the NT kernel with the Linux syscall interface on top, via the Pico Process interface. The NT kernel was designed for portability; in fact, POSIX compatibility was a design goal from the early days, in contrast to Mach where the idea of mashing together Mach and BSD was never considered.

In many ways, WSL is a much cleaner design than Darwin is.


I did exactly that. After using the Mac forever (started on 10.1), I got kinda fed up and made the jump to an Intel NUC with Linux. I tried Elementary OS and didn't find it to my liking. But I really really dig Fedora, specifically Gnome 3. The breadth of choice in Linux desktop environments at the moment is neat and not something I expected when I made the leap to PC.


When I was using Fedora, I was really happy with the look and feel of Gnome 3.


Gnome 3 is great - it steals the best ideas from macOS and Windows.


I would love to hear more about your experiences using the Intel NUC, since I'm leaning exactly this way for my development machine.


Sure thing. First, some details. I switched to the NUC three or four months ago as my main machine. Mine is the NUC6i5SYH model. I installed 32GB of RAM and filled both drive slots. Windows 10 is installed on one drive, Fedora 24 is installed on the other.

As long as you have no plans to play games, it's a wonderful machine for development. It's really fast for its small size, noticeably faster than the Macs I'd been using before. I spend most of my time with the NUC doing Ruby development in Fedora, with some Clojure and Node.js on the side. I've done some office type stuff with LibreOffice. I've dabbled in OpenSCAD while working on a 3D print. The NUC is more than enough power for all of that. I haven't played with Windows 10 too much, but it's also snappy when I need to boot into it.

It's a very quiet machine. On full blast, you might get a blowing sound similar to a stressed MBP. But usually in normal usage it's no more than a low hum; I can't even hear it unless I listen for it. You can throttle down the processors in the BIOS to make it nearly (totally?) silent if you like.

The most annoying thing about the NUC was installing the BIOS updates before getting started. The NUC6i5SYH had a lot of problems at launch that needed to be patched. Once I got beyond that, everything was smooth. The NUC is basically a laptop board shoved into a small box instead of a laptop shell. If the things you want to do are possible on a laptop, they'll be possible on the NUC, no problem.


Thanks for the response, it seems that your use case is very similar to mine (Ruby, Node.js, etc.). I actually decided to go with the i3 model with 16GB of RAM, since I don't think that I will do anything extremely demanding or graphically intense and it's super affordable to boot. I'll run elementary OS or Fedora on it, hopefully it will all work out.


I wonder how Visual Studio runs on it - I am a gamer but I'm very tempted to go NUC for development and stay with the Xbox One for gaming until I can afford a proper gaming rebuild.


I still use a 2010 MacBook Pro (i7, 8GB RAM, 256GB SSD) to run my IDE and all the "normal" apps (Skype, Slack, Chrome), and a NUC (NUC5i7RYH) on Ubuntu to build and run the stack.

I mostly access the NUC via SSH (and rsync my codebase between the two computers). The NUC is also connected to a monitor and I use the MacBook keyboard and mouse to drive both computers with Symless Synergy.
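
If anyone wants to copy this setup, the sync step is nothing fancy; here's a rough sketch (the hostname and paths are made up, and the Synergy part is configured separately):

    import os
    import subprocess

    # Hypothetical hostname and paths -- adjust to your own machines.
    NUC_HOST = "nuc.local"
    LOCAL_SRC = os.path.expanduser("~/code/myproject/")   # trailing slash: sync contents
    REMOTE_DST = NUC_HOST + ":code/myproject/"            # relative to the remote $HOME

    def push_codebase():
        """Mirror the local working copy to the NUC over SSH using rsync."""
        subprocess.run(
            ["rsync", "-az", "--delete", "--exclude", ".git/", LOCAL_SRC, REMOTE_DST],
            check=True,
        )

    if __name__ == "__main__":
        push_codebase()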


I commend your platform agnosticism. A sane way to approach paradigm shifts in computing, imho. Which I personally extend to Android/iOS.


If you are fine with dual core, get a NUC5i5 with 16GB RAM - Yosemite and higher run absolutely flawlessly there. Skylake NUCs are still a bit of a mess, unfortunately.


> Microsoft introduced a 28" desktop PC in an era when nobody uses desktops anymore.

"a 28" desktop PC" is a bit of an understatement. It's a huge low-latency touch display which works like a drafting table.

> nobody uses desktops anymore

There's a desktop on most desks in our office. A Macbook is certainly an alternative option but not if you're looking for a large touchscreen display.

> For the same price as the new Surface Book i7, I can get an MBP 15" with bigger screen and quad-core CPU

If there were really a performance problem, I think a lot more people would be upset with what was announced. Last gen means more stability and likely cooler-running hardware. Also, does the 15" MBP have a detachable/touch screen? Does it come with Windows? These are real considerations for potential Surface adopters.

The Macbook is simply a laptop. There's no problem with that, but Surface is selling more than that, so a direct comparison on cost and performance isn't really fair. Or are you suggesting Apple is the only company capable of putting newer hardware into a computer?


> Microsoft introduced a 28" desktop PC in an era when nobody uses desktops anymore.

The fraction of desktops vs. laptops shipped (~40%-~60%) has been holding steady for years. https://www.statista.com/statistics/272595/global-shipments-...


How many of those desktops are used in corporate environments that would never spring for a machine that starts at $2,999?


I don't know? I'm interested in correcting your hyperbolic statement, not in "which computer will sell better to IT departments" predictions.


This isn't a computer for Excel/Word/Facebook.

The target market for this is people who buy a $1,800 wacom cintiq and pair it with a $1,000 computer.

The price is reasonable for the people it targets.


Slight correction, though your point stands. You would need to buy the $2,800, 27" Cintiq[1] and a new computer to compete against the Surface Studio offering. This makes the Surface Studio a more compelling offering on the surface, at least to me.

[1]https://us-store.wacom.com/Catalog/Pen-Displays/Cintiq/Cinti...


Penny Arcade's Mike Krahulik has been using one for a week and compared it to his Cintiq:

> Tycho asked me to compare it to my Cintiq, and I told him that drawing on the Cintiq now felt like drawing on a piece of dirty plexiglass hovering over a CRT monitor from 1997.

A bit hyperbolic, but you get the idea. https://www.penny-arcade.com/news/post/2016/10/26/the-surfac...


Honest Question: How does the actual computing hardware on the entry level Surface Studio compare to a $1000 computer?


Not an exhaustive search: http://store.hp.com/us/en/pdp/business-solutions/hp-z240-tow...

So for $1,100 you are getting half as much RAM, half as much storage, and a lesser GPU.

I'm sure you could spec out something similar to the internals for a few hundred less with a little bit of work.


Comparing it to an Apple computer maybe?


It's kind of amusing watching Apple fans replay "spec" and "niche" arguments they used to pooh-pooh. Kind of like watching them do a smooth about-face on the superiority of PowerPC processors back in the day.


I've been running a custom desktop PC at work for years. I tend to want the best possible hardware and getting a custom workstation is the cheapest way of getting it. Very happy with my machine and I've never had any need for a laptop in the office.


It's just trendy to complain about everything Apple does, although most likely 90% of the complainers will be getting these new MacBooks anyway.

If they just push the specs up and increase battery life... "Apple isn't innovating anymore", "So what's new?", "It looks just like my old MBP", etc.

Now that they actually bring something innovative, you get this kind of complaint.

Whether people liked this event or not, I find it hard to believe anyone seriously thought yesterday's MS event was any better. Overpriced useless last gen crap for insane price.


> Overpriced useless last gen crap for insane price.

That perfectly fits my opinion of Apple. ;)

I jumped off the Apple bandwagon years ago, but an important point here is: Apple events used to be exciting. They would announce something interesting and worthwhile at every event, which is why so many people even started watching live streams.

The iPhone is still a really great phone, but iOS has been playing catchup with Android for years. And apart from mobile Apple hasn't done much that's innovative or interesting for quite some time.

That's why opinions are so harsh. People have come to expect more.


If that's your opinion of Apple I wonder what you think of others then.

Apple products usually have top-notch, latest specs and are the most powerful products out there, both in laptops and in smartphones. So definitely not last-gen crap.

Overpriced? Expensive maybe, but probably one of the least overpriced actually. Unlike most manufacturers, who just throw Windows or Android on top of their generic hardware, Apple designs a lot of HW and SW, and their integration, in-house, and they actually have costs to cover. (I heard Silicon Valley engineers aren't exactly free.)

I don't know what is "much that's innovative". I guess it's subjective. If we look at the last few years: 3D Touch, TouchID. That's not so bad.

If we look at the whole history of Android...eh...well...hmmm...Samsung Edge...maybe?

I agree that Apple isn't innovating as well as they used to back in the days of Jobs, but nobody else was really innovating back then either, and they sure as hell don't innovate now, unless you see putting unnecessarily high specs on a smartphone as an innovation.


The hardware usually isn't amazing and is definitely subpar considering the price, except for the screens, which are always top notch.

That Apple products have BIG margins can clearly be seen in their profits.

People are willing to pay the premium for the ego boost and great design (the MacBook is still the nicest-looking laptop imo, and iPhones always look great).

But it's not because they use the best, super expensive hardware.


>top notch specs

You mean low-spec RAM? Slow SSDs? Obviously soldered in, lel...

A GPU that is apparently on par with the last generation of GPUs, which is severely slower than the 1000-series Nvidias? 2GB of VRAM, lel again.

A CPU that is nothing to gawk at? (2.6~2.8 GHz nominal)

Crappy sound, crappy cooling, zero moisture resistance, all for ~2800 bucks

Meanwhile, the Lenovo I've got in order to work with Autodesk Inventor has the same CPU, a 1TB 7200rpm HDD + 128GB SSD (replaceable), up to 32GB RAM (I've got 16GB in 2 slots out of 4), a 960M with 4GB VRAM, an SD card reader that is amazing to have, bitching sound that kinda surprised me, and a full-size keyboard, all for a whopping ~$1,000. (And I'm fairly certain it will not die from humid air; I'm guilty of sometimes using it with water dripping off my fingers, even.)

Sure, it only has a 1080 screen and I don't care about battery life (since I use it as a mobile workstation that is plugged in 99% of the time), and it obviously comes with Windows 10 (THE HORROR).

The point is: stop spreading bullshit about how amazing Apple hardware is. For the price you are paying it is a joke.


Since you're clearly from /g/, I'll reply.

I used to be Thinkpad hardcore too. I switched to Mac when my x200/x201 hybrid that I built died (fell off the top of a server rack) and work bought me an MBPr. I'm pretty firm in the Apple camp now. I used to be a hater and holy shit was I wrong.

Now that my workflow is mostly photography and 4k video editing I couldn't imagine being back on a Thinkpad with Arch/CentOS. Windows is dead to me and has been for years, I'd go back to Linux FAR before Windows, but the workflow on OSX is just so damn sweet that I'm not even thinking about it.

There's really no company that comes even close to Apple for my use case. That being said, I wouldn't pay $2,800 either. Let the suckers buy it; I buy lightly used for half the price. The new screen is really, really tempting, though.


> it only has a 1080 screen and I don't care about battery life

Then you are not in Apple's market, period. For a lot of mobile users, wicked specs are pointless if you are tied to an outlet.


I think we would be all much more receptive to their innovation if it didn't also include the removal of important features like the Escape key, USB type A, HDMI, and headphone jacks.

If they replaced function keys with the touch bar and left everything else alone (save for faster components), I don't think half as many people would be complaining.


I just don't care about the Touch Bar. It's fine I guess; I'm not religious about the F keys. But only USB-Cs? No USB-A? Not even MagSafe?? That's the last nail in the coffin for me, I'm afraid. It's supposed to be a Pro machine, after all.


It makes sense: people who don't use Apple products don't care about what's in the Mac, so they don't complain.


It used to be trendy to be a fanboy about everything Apple does. So maybe there's something to it after all.


I sit at an Apple desktop (macbook pro with 2 apple monitors) for most of my day so I can use my giant screen to design interfaces and I'm not cramped into a laptop's tiny 13 inch screen. The last office of about 5,000 devs and designers that I worked at had desktops with monitors. No one worked on laptops except a few people here and there.

I would lose HOURS of productivity per week if I were not on a dual screen set up. It's just better for what I do.

That being said, if I need to go mobile, which I normally don't, I use my MacBook. I think I've disconnected my MacBook from my monitors probably 2 times in the last 2 years.


> I sit at an Apple desktop (macbook pro with 2 apple monitors)

You're kind of proving his point... even the people who don't need a portable computer get a portable computer instead of a desktop one nowadays. (By the way, I don't agree that desktop computers aren't used anymore: I spend most of my day using a desktop computer. And a desktop telephone, another supposedly disappearing technology.)


    > I would lose HOURS of productivity per week if I were
    > not on a dual screen set up.
I agree that desktops have their place, but it's not because of dual screens... This latest MBP supports 2 at 5K (or, presumably, 4K if not buying Apple screens...)


2 at 5K... from LG. LG is making a 5K display. Apple got out of the display business.


Almost everyone I know has a desktop. In fact, the only people I know who don't are my parents and people who only need a laptop for email. I happily paid $1,200 for my graphics card, and while that is too much in general, there is definitely a demand for high-end computing and gaming. Sure, this doesn't really apply to me, but if I worked in a creative position this machine would be a godsend.


I work in an open-plan office of about 40 people; a mix of business and tech staff. I'm the only one with a desktop. The 'killer feature' of the laptop here is being able to take your computer into meetings for various reasons.

Mind you, if you totalled up all the staff hours wasted this year with people wandering around searching and asking others for various dongles, we probably could have afforded another couple of MBPs.


I switched to "higher end" beefy laptops. At ~14 pounds it's like a portable desktop type deal :)

Still cheaper than a MacBook Pro, btw, which I also have but mostly use for iOS stuff only.


Professionals use desktops.

You cannot seriously expect anyone to edit 4K video on a laptop with 16 GB of RAM, no matter how hard Apple pushes their ridiculous onstage demos.

You cannot seriously expect serious developers to run multiple VMs, for example, in 16 GB of RAM on a laptop.

The list goes on and on.


For 11 months when I was backpacking, I did all my development on a laptop with 16GB of ram:

http://penguindreams.org/blog/msi-ws60-running-linux/

Scala development, and some DevOps stuff that required spinning up VMs. I mean, not a lot; like 6 to 8.

At work, I use a 16GB RAM laptop for Scala development and no VMs (we do everything in Docker and deploy to Mesos/Marathon).

I do have a desktop at home now with 32GB of ram, but it honestly feels like overkill and I may scale down. There is a lot of dev work that does require a pretty beefy workstation, but I've been doing HD video editing, Photoshop/Lightroom and Scala/Java work on laptops for years.


You're literally wasting money and your brain by forcing yourself to work on downsized hardware.

Studies are consistent here.

Time you spend waiting? Distracts you from your tasks, overall focus drops -- you're dumber.

Less screen real-estate? More working memory dedicated to what's not on screen -- you're dumber.


I run several Docker containers with Java-based microservices on a Win10 Dell XPS 13 laptop while debugging a PhoneGap client app at the same time. I don't see any performance problems with that. The moment when laptops became good enough for cloud development happened 2 or 3 years ago.


Hm, I don't know -- have you used more powerful stuff? I have a 2x10 core desktop with 64G RAM and a relatively powerful Dell Latitude workstation laptop -- and the latter is significantly slower. (I'm also running Java microservices inside containers, mostly.)


No, I haven't used such beasts. I have a desktop with 8 cores/32 GB RAM, but I don't spend much time on it. Top hardware wouldn't add much to my personal performance, so I won't spend money on building such a workstation.


There are a lot of different "developers" in the world. Some require server farms. Some would be just as productive on a 486.


Oh yes I do. Traveling consultant, need a laptop. Run VMs, Docker, many tabs open. Really need 16G.


16 GB is nuts...


> in an era when nobody uses desktops anymore.

A lot of people have desktops, myself included. I'm not really sure where you got such an impression.


>> A lot of people have desktops, myself included. I'm not really sure where you got such an impression.

Not just that... If nobody uses desktops, why would Apple waste so much engineering and design resources on something like the iMac if there's no market?


Desktop shipments are dropping by millions of units each year.

https://www.statista.com/statistics/269044/worldwide-desktop...


So they're still selling plenty? That's not exactly "nobody" then, is it?

Most people don't buy pickup trucks, but 1) people who buy them need them, and 2) it's a profitable market segment.


>> For the same price as the new Surface Book i7, I can get an MBP 15" with bigger screen and quad-core CPU, and it's Microsoft that hit it out of the park?!

While they are alike in many respects, the Surface Book is an orange in comparison to the MacBook Pro in three important ways:

1 - It's a convertible

2 - It's got a touch screen

3 - It's got a pen digitizer


The Surface Studio isn't really just a desktop PC - the whole swivel stand and touch-screen thing makes it a really interesting proposition for people who do a lot of creative and design work. No, it's not mass market, but it is a lot more interesting than anything we saw today.


I was so disappointed with both events.

Overpriced Microsoft devices, although impressive. I am more worried about the silence from MS on MFC given the world's investment in it. We can't all just rewrite our apps in UWP overnight. The foundation classes were just that - a foundation, and we can't move the giant building built on top.

For the Mac, goodbye MagSafe and SD cards and optical audio in (or any audio line in, actually). What's the point of a super thin device if you have to carry it in a thick bag due to the bulkiness of the 24 adapters to make the device useful or connect it to anything? It beggars belief.

"We've made the thinnest car ever! Steering wheel is an optional extra".

I think we are reaching the pinnacle of computer hardware and software development. We may have even passed it.

In short, I'll stick with my current MacBook Pro for as long as I can. That new giant touchpad looks to make typing tricky.


They have been pretty vocal about MFC at a few BUILD events during Q&A.

It is in maintenance mode; only bug fixes are planned, plus little improvements like the ones they made in VS 2015.

https://msdn.microsoft.com/en-us/library/mt270148.aspx

For C++ on Windows the future is XAML; for legacy MFC applications there is Project Centennial.


Do you have a link to a video with a time index so I can watch that? It'll be helpful. Even better would be something in writing.

Thanks in advance.


I will need to search for it.

What I have bookmarked is the Windows Forms one, where they state exactly that.

https://www.infoq.com/news/2014/04/WPF-QA

Maybe this is already a worthwhile read for you; it also mentions C++.


I see the mention of Windows Forms being explicitly deprecated in that link but no mention of MFC itself. Thanks anyway.


Here you have the statement that there are no plans to support MFC on UWP.

https://blogs.msdn.microsoft.com/vcblog/2013/08/20/atl-and-m...

"MFC can't be used for for Windows Store apps because it uses too many APIs that are banned for store usage. In order to allow MFC usage in your scenario we would probably have to break MFC into multiple DLLs, which would be a prohibitive amount of work."


Thanks for the link, really appreciate it. Pat Brenner has since left Microsoft so we can only wonder who is in charge of the MFC stuff nowadays.


I think the main difference isn't "who announced something that I'd prefer buying tomorrow", but "who announced something that shows a ton of potential growth areas". The MBP refresh announced today seems OK. It's where the MBP is right now. The Microsoft announcement yesterday showed where computers should go. It's a different approach, and kind of subtle, but I see much larger potential in the Surface Studio's future vision for desktop/portable PCs than in Apple's.


> The Microsoft announcement yesterday showed where computers should go.

I don't think that the Surface Studio is where computers should go. It's not relevant for 95% of users.


>> desktop PC in an era when nobody uses desktops anymore

Gee, I guess that confirms I'm nobody.


I'm pretty sure our office of 300 people only has desktops, so saying that no one uses desktops is a bit much, no? Also, all of our artists (~80) have the 24" Cintiq tablets, which can now be replaced by the Surface Studio. It's a fantastic device, if you have a use case for it.


Most illustrators I know are using a combination of an iMac and Wacom tablet. If it is an agency, they'll possibly be using a Wacom Cintiq, which approaches $2,800. The cost of a Surface Studio is not out of the ordinary. https://us-store.wacom.com/Catalog/Pen-Displays/Cintiq/Cinti...

The sentiment on Designer News was positive, and most of the people over there are UI designers and front-end engineers. https://www.designernews.co/stories/75964-microsoft-surface-...


> Microsoft introduced a $3,000 desktop PC in an era when nobody uses desktops anymore.

Loads of people still use desktops. If you were to say not many people/companies need that kind of machine then you would be right.


For comparison, my 27" Wacom Cintiq cost me $2500, which is just a screen with a stylus and the Studio blows it out of the water in every measurable way.


I'll take my desktops for doing real work. I don't know of any laptop that can drive four- or five-screen setups with about 4' x 3' of display area.

The laptop is fine for fooling around, or "working" from home on the couch, but I'd lose my mind trying to do anything on that small a screen, with that few pixels and a cramped, irregular keyboard with no travel. Plus trackpads. Fuck trackpads.


We have about 150+ engineers with Macs - and precisely zero of them asked for a desktop versus a MacBook Pro. Consider the possibility that your statement, "The laptop is fine for fooling around" is a fairly personal experience that isn't necessarily representative of the broad population.


Of course it was a personal opinion. Few are lucky enough to have as nice a setup as I do.

That people get anything done at all on these little midget laptops is, quite frankly, amazing to me.


> That people get anything done at all on these little midget laptops is, quite frankly, amazing to me.

Simply lacking screen real estate reduces available working memory that would be used to focus on a coding task.

Given how this empirically impacts work quality, I'm pretty amazed how younger tech companies have pushed employees in this direction.

It's not all that different from the open office trend.


As with all matters of psychology, YMMV. But it's definitely not for everyone.

http://www.pcworld.com/article/251521/when_two_monitors_aren...

>"People dramatically overestimate their ability to manage their environment," Meyer says, adding that in some ways, using multiple monitors to keep all sorts of data visible is analogous to using a cell phone while driving.


Well, sure, but isn't it rather obvious that using resources to engage in distracting non-work activities is detrimental to work efficiency?

I've got 6880x1440 worth of curved display on my desk, and obviously, watching Netflix, browsing Facebook, or otherwise distracting myself with one of them would be a net productivity loss.


Studies show that it's true for the broad population, too. More screen real estate and faster processing directly and significantly impacts focus and working memory.

Perhaps your employees aren't perfectly rational actors whose incentive is to maximize the utility of their tools while at the office?

For example, there seems to be correlation between 1) wanting to keep git work local to a workstation and not push anywhere with public visibility, and 2) wanting to use a laptop so you can bring that local repository state with you.

Neither desire is necessarily aligned with what benefits the company (or the quality of work), but they do drive a laptop preference.


My experience with multiple large monitors is my eyes got drier, the room got hotter and I never found anything useful to put on them.


Doesn't sound like much of a controlled study. Selection bias may have played a part :P

Even just having a physically larger display that fills your FOV appears to improve spatial task and memory performance: http://research-srv.microsoft.com/en-us/um/people/desney/pub...

From another study (http://pro.sagepub.com/content/56/1/1506.abstract) on actual satisfaction (not open access, so quoting some key bits):

First, they provide a good summary of previous findings:

"Previous research has demonstrated productivity increases for users performing tasks on larger or multiple screens. For instance, a 9% productivity increase was noted while using wider screens (15” flat panel vs. 46.5” curved screen) (Czerwinski, et al., 2003)."

"Similarly, a 3.1% increase was noted using dual screens over a single screen (Poder, Godbout, & Bellemare, 2007). Task success has been found to increase for tasks performed on 4 – 17” monitors vs. a single 17” monitor (Truemper, et al., 2008). Likewise, tasks were performed faster and with less workload while using 2 – 17” monitors over a single 17” monitor (Kang & Stasko, 2007)."

"Other studies have noted increased window interaction and open windows while using multi-monitor configurations (Hutchings, Smith, Meyers, Czerwinski, & Robertson, 2004)."

"Issues with single monitor use have been well documented. Generally, higher mental workload, more window switching, repositioning, resizing, inadvertent opening, and closing of files have been associated with single small monitor usage (Czerwinski, et al., 2003). Users generally perceive small single monitors as requiring more workload (Grudin, 2001; Hashizume, Kurosu, Kaneko, 2007)."

Second, their own findings:

"Participants felt more rushed, that they worked harder, and were more frustrated [with a single monitor configuration]."

"Participants spent more active time in the PDF reference document while completing tasks on the single monitor configurations"

"Participants were observed to leave the PDF reference document open and viewable (but not in focus) while working with other source documents when in the dual monitor configurations."

"Participants clicked less during tasks on the dual 22” monitors than the dual 17” monitors and single 17” and 22” monitors."

"Participants switched between windows more frequently during tasks on a single 17” and 22” monitors than the dual 22” monitors."


> We have about 150+ engineers with Macs - and precisely zero of them asked for a desktop versus a MacBook Pro.

Given that they're using glossy-screened laptops — maybe they care more about looks and hipness than about getting the job done?


>> Given that they're using glossy-screened laptops — maybe they care more about looks and hipness than about getting the job done?

Why the hate against glossy screens? The colors pop more and unless you're sitting in a spot that's prone to glare, you're just as productive as anyone else. I've never had an issue with a Macbook's glossy screen affecting my ability to work.


If I need to look at it the whole day, I find glossy hurts my eyes more. If it is about having something pretty as a status symbol, then yeah glossy makes sense.


>> I find glossy hurts my eyes more.

Sure, that's valid. But everyone's eyes are different, which is why I don't get why there are a lot of glossy screen haters out there. I never had that problem sitting in front of my glossy screened MBP for 8+ hours a day for 3+ years (it died but not because it had a glossy screen).

>> If it is about having something pretty as a status symbol, then yeah glossy makes sense.

Aren't glossy haters mocking/judging people basically using matte screens as status symbols? i.e., you must be a hipster or shallow and not a real programmer/{insert other profession here}?


You made them make a choice regarding portability. They chose portability. That doesn't mean it was the most productive or efficient choice.


Indeed. For me, it comes down to ergonomics. Monitor at eye height, keyboard comfortably positioned. Laptops force you to hunch in, eyes down, hands squished. It's a recipe for RSI. I'm sure some people can manage it fine, but I can't.


The 15-inch MacBook Pro supports 4x 4K monitors (up to 4096x2304) at the same time as the built-in display, FWIW.


Both the Apple and Microsoft presentations were underwhelming for companies of their size. However, the latter introduced a completely new product which seems innovative. Meanwhile, Apple just gave their MacBook Pros a Touch Bar and basically neglected the long-demanded updates for their other product lines (e.g. the MacBook Air).

And honestly, the MBP is not the best purchase for specs. You can get a cheaper Dell XPS 15.


> nobody uses desktops anymore

?? What profession doesn't use -- doesn't absolutely rely on desktop PCs??


by far.

This is Microsoft making a product that designers love.

The behemoth which everyone considered dead is finding ways to make inroads into Apple's old stronghold, with design innovations.

Yeah, it's going to have to deal with Windows, but this is a huge turn for a creature with that kind of momentum to overcome.

With the internal stack re-organization (rationalization), having to build their own reference PCs, and buying their own pen tech, they are doing what a good firm should do: work hard on improving.

They have far to go, but there are lots of signs of them moving in the right direction.

Whereas in sharp contrast you have

the iPad Pro, which still runs iOS, and the Touch Bar. And let's not forget their moment of courage, when they decided to obsolete the entire world of headphones.


Everyone has already chewed you apart for this but

> nobody uses desktops anymore

That's just not true. Maybe you only use computers for social media and YouTube, but other people need things that can actually run resource-intensive programs.


No one uses desktop PCs? I don't know what type of hipster country you live in, but they are certainly still used heavily. The product that really hit both of them out of the park was the Razer Blade Pro. It is legitimately better in every single way than both the Apple and Microsoft offerings. http://www.techradar.com/reviews/razer-blade-pro-2016


> in an era when nobody uses desktops anymore.

Lots of people use desktops. And this is more than a desktop PC, the "studio" functionality makes that clear.

And last-gen/overpriced current hardware is what Apple has been doing for a long time, yet it's easy to see by their success that it's not pure specs but rather the actual experience that matters.


Microsoft announced a $3,000 monitor+computer that is a better alternative to a $2,500 (3,000 euros, depending on where you live) monitor alone (the Cintiq).


There are plenty of existing laptops that can outperform a MacBook Pro ALREADY. Meanwhile, Microsoft is innovating. Apple replaced their laptop lineup; there are no new products, and for my job the lineup is virtually the same. Hell, the lack of an Escape key has me looking at ThinkPads + an iPad for music.


> in an era when nobody uses desktops anymore.

For general computing (e.g. web browsing), yes. But that doesn't cover all the possible uses of a computing device. Gamers, designers, scientists, video/media professionals are some of the people who use primarily desktops.


Maybe it's a matter of relative perception. Apple has a historical relationship with landmark personal workstations for content creation, so a mild event is a letdown; Microsoft is new to this, so triggering excitement there is a very positive surprise.


> when nobody uses desktops anymore

I'd argue that this isn't true, especially for high-end artists, developers, gamers, and... well, my anec-data (gathered in only the finest biased areas) shows that this is blatantly false.


I want the Microsoft thingie, but I know that the app scaling issues and other MSFT funkery will still be there, under the polished sheen. If only that thing ran some Linux flavor, and some Linux flavor ran Adobe Creative Suite, we'd finally be free from Mac/Windows forever! It's the new Bernie Sanders.


A new kind of tool? All-in-one PCs with touch screens have been around for ages. The Microsoft event was just cool branding, taking a page out of Apple's manual: selling something that has existed forever, packaging it neatly, and saying it's innovating, when it really, really isn't.


And a much better graphics card (we think; I don't think we know what is in the Surface Book).


>an era when nobody uses desktops anymore

Any data to support this?

Sent from my desktop PC.


> ...in an era when nobody uses desktops anymore.

Speak for yourself.


Welcome to the world - a place where perception matters more than reality.

This is why every single company has giant marketing departments and this perception informs so many of our decisions both consciously and unconsciously.

This is the real danger that Apple is in: how it's perceived is changing, even if its products continue to be superior.



