Post PC (asymco.com)
85 points by ingve on Dec 1, 2020 | hide | past | favorite | 141 comments


>For example connectivity to cellular networks (still missing in even the fanciest laptop but available on my watch.)

Most "Fancy" PC laptops have cellular

>PC cameras are comical in comparison.

Microsoft's cheapest ($400) Surface device has two 1080p cameras, one 5MP and one 8MP. Great cameras, actually; the company I work for uses them for all its tutorial video recording, and it looks so much more professional than the current MacBook Pro camera.

I don't think this writer has any experience outside of the MacBook ecosystem; it's only Apple that is woefully trailing behind in these areas. I fully expect them to catch up by Q2 next year, but to pretend these things haven't been standard for years in the rest of the laptop ecosystem is silly.


> Most "Fancy" PC laptops have cellular

Yeah, I would diagnose this as a clear case of "Apple lock-in syndrome" on the author's part...

But he does have a point! I recently bought a Lenovo Ideapad Duet Chromebook as a more flexible alternative to an Android tablet, and was a bit surprised that it doesn't have GPS - which you take for granted from all but the cheapest tablets these days, but of course this is a Chromebook with a detachable keyboard, not a tablet...


Might already know this, but your phone's hotspot should pass through the location data. We used my laptop on a couple of roadtrips where we wanted to pull the map up larger.


> your phone's hotspot should pass through the location data

What is the protocol for that?


NMEA perhaps?
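For context: NMEA 0183 is the line-oriented text format GPS receivers emit, and the GGA sentence carries the position fix. A minimal sketch of pulling latitude/longitude out of one (the sample sentence below is illustrative, not from any real device):

```python
def parse_gga(sentence: str):
    """Parse latitude/longitude out of an NMEA GGA sentence.

    NMEA encodes latitude as ddmm.mmmm and longitude as dddmm.mmmm,
    with separate N/S and E/W hemisphere fields.
    """
    fields = sentence.split(",")
    if not fields[0].endswith("GGA"):
        raise ValueError("not a GGA sentence")

    def to_degrees(value: str, hemisphere: str, deg_digits: int) -> float:
        # Split "ddmm.mmmm" into whole degrees plus decimal minutes.
        degrees = int(value[:deg_digits]) + float(value[deg_digits:]) / 60.0
        return -degrees if hemisphere in ("S", "W") else degrees

    lat = to_degrees(fields[2], fields[3], 2)
    lon = to_degrees(fields[4], fields[5], 3)
    return lat, lon

# Illustrative sample sentence:
lat, lon = parse_gga("$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47")
print(round(lat, 4), round(lon, 4))  # 48.1173 11.5167
```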


Good to know - there is actually a feature to connect your Android phone to your Chromebook to share the internet connection, and I tried that a few months ago, but the location data wasn't shared. After that I realized while on vacation that using OsmAnd on the phone was enough for all practical purposes, so I forgot about this issue. But maybe in the meantime Google published an update which also shares the phone's location data? Would definitely make sense to have something like that...


Fwiw I have a fancy (ok, expensive/powerful but not “fancy” like a Surface) Dell and its camera is terrible. I’m guessing it adds around $3 to the BOM of a $3k laptop. I have seen similar crap on similar models from HP/Lenovo. Hopefully this pandemic changes that for new models, now that everyone is doing video calls and noticing.


Laptop cameras are terrible because they are embedded in screens, and laptop screens are very thin. It's a physics issue, not a cost one. The Surface can have a better camera because the screen is the thick part (although this makes it much worse as a laptop).


My laptop screen is about the same thickness as my phone (OnePlus 3), which has a front-facing camera that is unimaginably better than the one in my laptop. The border at the top of the screen is also wide enough to fit it, and in fact the camera inside the laptop is only marginally smaller than the one in my phone (I've unfortunately had to take both apart enough times that I'm quite certain of this).

It's definitely not a physics issue. Just like the painfully slow, barely-USB-2.0 SD card reader that laptops include despite costing a couple grand and having a 20Gbps link literally right next to it, it is a cost issue.


>My laptop screen is about the same thickness as my phone (OnePlus 3)

The OnePlus 3 is 7.4mm thick. You would have to measure your laptop screen's thickness; most modern laptops don't have a lid that is anywhere near that.


Just took some calipers to my 2017 Lenovo Yoga and the screen (glass to aluminium) is just over 6mm, so I was off by quite a bit. I also measured a similarly old but low-end Lenovo laptop I had around and some 2019 HP business laptop, and both were close to 7mm.

But the OP3 is also not the thinnest device these days. I know both Samsung and Apple have made flagship devices 6.9mm thick and their cameras were excellent. I've read about 5mm phones as well, but I don't know a thing about their cameras (I'd imagine still better than most laptops though).

And besides, even if there were no way to fit a decent camera in 6mm, I'm certain most business users couldn't care less about an extra mm of thickness, or even a small bump on the back of the screen, if it meant they'd be getting actually usable video out of their absurdly expensive laptops.


Here [1] is an image of a 2017 Yoga I found. I don't see how its lid is 6mm thick (rather than that being the measured width), considering the whole laptop is only 17mm thick.

The thinnest part of the MacBook Air (it is a wedge shape) is 4.1mm, and that is the base and the lid combined. The old MacBook Pro lid I have now is only 3mm thick, glass to aluminium back. Basically we are talking about fitting a camera on top of a circuit board with under 2mm of space.

[1] https://www.amazon.com/Lenovo-15-6-Touchscreen-Bluetooth-Fin...


I’m confused why you keep your magical laptop screen a secret??


What?


I for one would not want a 4K front-facing camera on a laptop for video calls even if it were possible. 720p is perfect in my book; people don’t need to see every pore on my face or my nose hairs when on a call.


> Most "Fancy" PC laptops have cellular

what

Just checked one of the biggest laptop retailers in my country, and out of 651 laptops listed, only 10 have 3G/4G support. I don't see any with GPS.


Yeah, no. I've had an HP EliteBook that had a 3G modem, but that was very much an enterprise spec; I don't recall a single high-end consumer laptop with a cellular modem. I think you could get the XPS with one years ago, but that option is long gone now.


There were modems for mPCIe and mSATA interfaces, and there are for M.2 now. All HP Pro/EliteBooks I've seen had a second mPCIe/mSATA interface on the motherboard for WWAN, but the cheapest models might not have an actual slot soldered on.


My 10 year old thinkpad has an empty bay where you can add a cellular modem. Cellular in laptops is not new.


Cell connectivity has been available on laptops for a while, but it's rarely actually present (IME) because the OEMs overcharge for it, and tethering plans exist for phones, so why pay twice?


Yeah this is a post written by someone who hasn't touched a non-Apple PC in years.


"Desktop" software hasn't caught up to the idea that you can have a network connection that a) goes away and b) can be expensive. What's to stop a native application from downloading a huge update as soon as you connect over 3/4/5G? iOS and Android have ways of controlling this, and mobile apps are written with this expectation from the ground up. Which is why I expect macOS will continue to converge with iOS/iPadOS and provide a more managed environment for apps to run in.


I haven't looked into it too deeply, but my (Win 10) laptop can tell when I'm tethering and goes into a "metered connection" mode, where at least the Windows native apps don't connect without explicit interaction.


I have a Surface Pro 7 from work and honestly it's not even half a MacBook. The OS is particularly poor, as is the battery life. The Mac is a better frontend for Windows sessions through Citrix than a literal Windows device is-- gestures, scrolling, trackpad predictability, etc. I could not give two craps about the cameras. I don't use my laptop as a camera. If I need a high-quality camera, I'll attach one. If I have to do a video call without preparation, the receiver can deal with a "mere" 720p. Would I welcome a 4k cam on a Mac? Sure, if it's free, why not? I'd rather have a microSD reader tho.


Curious why you feel the need to jump in to defend companies like this. He said nothing about Macs other than that they're trailing behind in video and cellular; nothing about the OS or battery life.


Another thing missing from the Mac is touch screens and pen input. If you've ever used a Surface, it just feels like there's something missing from your Mac.

It boggles my mind that Apple shows all these musicians, designers and artists in their ads, and pretends that a one-dimensional strip above their keyboard is allowing them to express themselves in a completely new way...



> But what about the Mac? The Mac is the PC from the company that more or less defined the Post-PC era. Would it not be the first to be sacrificed for the new future?

This to me is one of the really big but less talked-about implications of Apple Silicon. 12 months ago - before M1, before Big Sur, before the Mac Pro reboot, before the MBP 16 - the Mac had atrophied. Hardware quality had gone down, the software felt neglected (the UX update in Yosemite felt like a half-hearted skin). The Mac felt like a side-project. It felt unimportant to Apple; there was speculation that it might be turned into an iPad with a keyboard, or be dropped altogether with all the new energy surrounding the iPad Pro. Apple seemed like it might just exit the workstation business completely.

In the past year that trend has been turned on its head. First they fixed the keyboard and refreshed the MBP form-factor and internals. Then they came out with the radically-powerful (and expandable!) machine that pros had been asking for for years. Then, most importantly, they came out with a huge financial commitment to the future of the Mac, making it a seriously compelling workstation again, and the cherry on top is a new UX that, for the first time since OS X (nearly 20 years ago!), looks like it was designed cohesively from the ground-up. No matter how you feel about Big Sur's design choices, it shows that Apple cares about the Mac again. And they're just getting started with the new silicon.

For an ecosystem that asks a lot in terms of commitment, it is hugely encouraging to see such an unambiguously renewed focus on this product line.


Counterpoint: Moving to ARM and custom silicon is a way to unify their iOS device line with their Mac lines.

The Mac may not be an afterthought, but I suspect it will eventually become an "iPad XL" with a keyboard. The Mac will move further from being a general-purpose computing device and more toward being an appliance. I'm sure they'll still allow you to run Xcode and Photoshop, but maybe not Wireshark or VPN software[0]?

[0] https://thenextweb.com/plugged/2020/11/16/apple-apps-on-big-...


This again.

I've been using Macs for 20 years (and Linux for 20, and Windows for 30). There's no evidence I can see that Apple is going to move the Mac away from being a general-purpose computer. People LOOOOOVE to say they will, though.


We're barely a week removed from offline 3rd-party applications not working on a Mac due to Apple's servers being overloaded.

To dismiss the idea that Apple may move in this direction is way too hasty, especially when nearly every improvement Apple has made has brought the Mac closer to its iOS counterparts. In fact, it's hard to think of a change Apple has made to Macs over the last several years that would make such a transition even slightly harder.


You're confusing "when online, Apple checks binaries" for "Apple by default blocks 3rd party apps."

>To dismiss the idea that Apple may move in this direction is way too hasty

This is just a silly sentence. It's not on me to prove they're NOT doing this. It's on YOU to show me things that definitely say they ARE.

Apple (and MSFT) have both taken steps to button up the OS more in line with the level of expertise of the baseline user, which is WAY closer to your 60 year old aunt Millie than to anyone who posts here. Apple does a better job here; enabling non-AppStore binaries is a single checkbox. Microsoft's S mode is a gesture in the same direction, but it's hilariously bad in implementation -- harder to disable, can't be turned back on once disabled, etc.

Weirdly, though, nobody is yelling that MSFT is going to move to an app-store-only model.


That was a dumb failure mode of a certificate-validity check that didn't handle the server being up but not responding. If you were fully offline, the check failed silently with no issue. Adding the certificate server to your hosts file fixed the issue.
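The general lesson here - and this is a generic sketch, not Apple's actual code - is that this kind of online check needs to treat "server unreachable or unresponsive" as a soft pass rather than blocking on it:

```python
import socket

def check_revocation(host: str, port: int = 443, timeout: float = 2.0) -> bool:
    """Hypothetical 'is this binary still OK?' check against a remote server.

    Fail open: if the server can't be reached (machine offline) OR doesn't
    answer within the timeout (server up but overloaded), allow the launch
    rather than blocking it. Only an explicit negative verdict should deny.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout) as conn:
            conn.settimeout(timeout)
            # ... send the query and read the verdict here; elided ...
            return True
    except (socket.timeout, OSError):
        # Unreachable, slow, or overloaded: soft-fail, allow the launch.
        return True
```

The bug described above is what happens when only the "connection refused / offline" branch soft-fails and a hung-but-open connection has no timeout at all.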

I guarantee you that code has been updated already with the recent updates and anything similar is also under review. It's always easy to Monday morning quarterback about edge cases after they happen :p

It absolutely is NOT some sort of "tell" for Apple locking the Mac down. They don't need to - if you want an appliance device they already have that with iOS.

There is zero upside for Apple to try to lock the Mac down; it's all negative. Apple can do some dumb things, but they are NOT stupid. Used a Mac since 1986, owned one since 1987 (thanks to my crazy parents - could have had a good used car instead!) and dunno why people love to focus on the negative with Apple but unfortunately it's nothing new.


Can you run unsigned apps on recent MacOS?


Of course.

To even go further `sudo spctl --master-disable` to disable Gatekeeper, and `csrutil disable` in Recovery Mode to disable System Integrity Protection and you have a Mac that is just as open (including the loss of security, which may or may not just be theatre, I am not going into that one) as any old OS X/Windows/*nix machine of yore.

I personally just tend to use cheap-o machines myself these days, I just use web terminals for the most part personally.


Yes.


VPNs are required for corporate use of the machines. That's a fairly big market, I doubt it will go away quickly.


Substitute "VPN" with "any creative use of the hardware that requires low-level access to do something Apple might not endorse".

But you're right, corporate VPNs will certainly still be allowed. Maybe you'll need to install it through the app store, and your company IT will have to submit their VPN configuration for Apple security review or something, but they won't block all VPNs.


They will provide a standard Apple VPN solution. Because they're big enough now, because they've started to be adopted by enterprises, and because they don't care whether adoption is immediate, network device vendors will start following.

Then once they're ready they'll pull the plug on third party VPN solutions.


They already have VPN support built-in, you can add a VPN connection via the Network system preferences screen.


This, and considering edge security is on a massive rise due to Covid-19, they won’t disallow it. Heck, WireGuard and OpenVPN are available on iOS as well.


Corps might not care that much that everything but Apple telemetry is going through the VPN, though.


People keep drawing a very long line between the chipset and the OS flexibility/philosophy. These things have nothing whatsoever to do with each other. You can have a locked-down OS on x86 hardware and a power-user OS on ARM. The M1 has very little to do with the "What does the future of macOS as a platform look like?" discussion, aside from providing strong evidence that there is going to be a future for the Mac.


Indeed - this is the first time in over 20 years I'm excited about new Mac hardware.

I was all set to replace my 2015 MBA with an M1 MacBook Air. Then I started watching all the videos of people running games on the Air (the Air!) - even Windows games, many of which I play - under Crossover (the commercial version of WINE from CodeWeavers - more than worth the $60 for the integration work they have done), and I'm starting to realize that I may not care about keeping a Windows gaming machine any more. My only problem is that I'm addicted to way too many assets in Cities: Skylines to get by with 16GB of RAM, so now I'm seriously thinking about waiting for round 2 for something that can get to at least 32GB of RAM.

And by the time those machines are out I suspect a few of the games I care about that don't run on the M1 may have a solution of some sort. It's nuts, and not a path I thought I might go down - but the performance I'm seeing from non-native code already simply blows me away.

It's only going to get better from here. This is the _ground floor_. I have to keep reminding myself that this is the start.


>This to me is one of the really big but less talked-about implications of Apple Silicon

Yes. Everyone is too busy hyping. No one is asking questions. (The hype has died down a bit and we are now starting to see some quality information and discussions [1].)

It was basically Apple admitting defeat. I mean, even Steve Jobs didn't write off the PC completely[2]. Or Intel is partly to blame, because their roadmap was off by four years. The MacBook 12" in 2015 was designed with the hypothetical 10nm in mind.

People keep talking about the tablet taking over the PC, the phone taking over the tablet, etc. It really isn't the form factor. It is the default input devices, and what they are optimised for. The PC is strictly optimised for keyboard and mouse. And for lots of tasks, business-conducting activities, keyboard and mouse remain the best option. Businesses are not switching to tablets, or tablets with keyboards. In fact they are switching back to keyboard and mouse for lots of the tasks they thought tablets were better at. In other sectors, the AutoCAD CEO mentioned (sorry, can't find the video; it was about 2-3 years ago, asked by Emily Chang on Bloomberg) that in certain fields (I remember it was construction) literally all of their customers have moved to tablets/iPads. So it really isn't a one-size-fits-all solution.

And the PC market is still growing. At a very low single-digit rate, so 2020 will be an outlier. But it didn't slump to 150M or 100M units a year as many predicted. Something somehow even Horace Dediu refuses to admit.

And since the PC market (worldwide, so you could argue China and India or other countries are picking things up) is not shrinking by any definition, there are still innovations to be made. There are an estimated 1.5B PCs worldwide (you won't find an actual real number, since even Microsoft doesn't have a clue and keeps changing it), and 750M of those are business machines. Compared to ~110M Macs, there is plenty of space to grow. In 2015 I wrote that the Mac should be at least at 150M and really should be pushing for 200M by 2020 if they continued with the momentum, which they didn't. Luckily Apple was only really competing with itself.

The new Macs will have lots of users switching or coming back. There seems to be enough margin for an even cheaper MacBook Air at 11" or 12". Depending on whether Apple follows the iPhone playbook, where they have 16" and 14" for the Pro, the MacBook Air would then be 14" and 12". A 12" MacBook Air at $799, and $699 for students, is going to be quite disruptive.

Having said all that, whether Apple remains committed to the Mac as, ideally, a personal computer, or as a locked-down iPad-style appliance, remains to be seen. It seems Apple execs are unsure. I can only wish that a Mac remains a Mac.

[1] https://news.ycombinator.com/item?id=25257932

[2] https://www.youtube.com/watch?v=YfJ3QxJYsw8


I think you're overselling the input device argument here.

I work all day on a regular computer -- big monitor, mechanical keyboard, mouse of my choice, etc. BUT when I do personal travel, or leave my office, I can do almost anything I want or need to do with my iPad + Magic Keyboard, which includes a trackpad.

It's not the lack of, or default positioning of, input choices that make a general computer superior to a tablet for many tasks; it's the whole way the OS works.

MacOS, Windows, and (I suppose) most Linux window managers are predicated on having lots of persistent windows you can move around, reference, etc. Tablets and phones assume a fullscreen approach. Modern iOS has the ability to show multiple things at once, some of the time, but it pales next to what you can do on the general purpose OS of your choice. THAT, to me, is the productivity gap.

This is NOT to say that I want iOS to change and become more like MacOS in this regard. I think there's a real place for both modes. It just depends on what you need to do (and, to a point, where you need to do it).


> Apple cares about the Mac again

I don't think Apple cares about macOS users though. It's a huge difference. The way Apple drops support for existing things easily drives people away from them.


I wish this were true, but every day I see more dev shops moving from Windows to Mac than the opposite. I don't see as many moving to Linux.

Moving applications to the cloud and the slow deprecation of old windows apps made this possible.


I've seen people dropping macOS and switching to Linux for a development computer. Can't really quantify that.

But it's even more pronounced for gamers about whom Apple doesn't care at all. Linux is no doubt better for development than macOS, but for gaming it's glaringly better.


> For example connectivity to cellular networks (still missing in even the fanciest laptop but available on my watch.)

What? Lots of laptops have a sim card slot.

My fairly long in the tooth Thinkpad Helix 2 does and so far as I can tell so do lots of the modern Lenovos.

A quick google suggests that many Dell Latitudes do as well.

I wonder if the author is accidentally conflating "fancy laptop" and "MacBook".

Edited to add: I was getting 12h+ on ThinkPads going back as far as 2005, so the "all day battery life" thing isn't that new to me either, though I'm sure the M1-based machines would last 18h+ given a similar level of use, so it's still not at all shabby.


> Edited to add: I was getting 12h+ on thinkpads going back as far as 2005, the "all day battery life" thing isn't that new to me either though

Pretty sure you weren't watching or editing 4k video for that amount of time, which is what people are doing (and more) with these M1 Macs that last all day.


Pretty sure I acknowledged that in the second half of the sentence you quoted :D


Verizon was selling cheap HP laptops with cellular years ago. We bought 120 of them for one grant.


You paid less than $10 per laptop?


They said Grant, with a T, not Grand.


Yeah, so he's paying 40 cents per. Sounds about what an HP is worth.


grant as in government grant program - sorry for the confusion


Who even cares? You always have your phone. WiFi hotspots are trivial. It's a feature few would use.


I think this is one of those things that's more likely to be found in cheap devices, or devices intended for poorer markets - similar to dual sims in phones. It's something that's more useful in areas with poor internet connectivity.


They're very common here in NZ, where our internet and WiFi connectivity beats the pants off USA.


What https://www.glimp.co.nz/countries-with-the-fastest-internet-....

> On the global list, the average internet speed in New Zealand put us in 27th place. However, the average internet speed in NZ was 14.7 Mbps, still faster than the Australian average speed of 11.1 Mbps.

The US is 10th on average

When I visited, I had the hardest time finding free WiFi. A place gave me 5MB of data over 1 hour, which I had to redeem using a paper-based password system. I mean, at that point, why even bother giving me hope.


Thinkpads tend to be road warrior machines, so the higher end ones having that feature makes sense for that reason.

Certainly I've used 4G on trains before now because the on-train wifi was frigging useless.


Cellular is a niche feature. I would disagree with how categorical that quoted statement was but not the general point.

Of course laptops with cellular exist but they are hard to find and it is anything but typical that you can just add that to some laptop you want to buy. You have to search them out.

Contrast that with the Apple Watch or iPads. Every Apple Watch except the lowest end model and every single iPad Apple sells offers the option to order it with cellular. For these kinds of devices it’s just obvious that this is possible. It’s a foundational, central feature, not just some add-on you can get with some models.

So, yeah. That is actually a difference that matters. PC makers have been much better at moving their PCs in this direction than Apple with their Macs, but they still suck at this (with it being relegated to a niche feature).


> PC cameras are comical in comparison.

PCs usually do not have cameras. Laptops do, sometimes. But they are not meant to take photos - would you carry your laptop around for that?

The article does not get the facts right. There is huge R&D investment on the side of the big PC hardware manufacturers, just look at the big ones, AMD, Intel and Nvidia. There are also really nice Windows/Linux laptops. And great peripherals, with big changes over the years, just compare modern cases like a Lancool II 215 (which isn't even expensive) to what was common a decade ago. And of course the PC market did not grow as much as the smartphone market in the last decade - smartphones started from zero (well, add a few years), while many people that need a laptop or a PC do have one already.

Nonetheless the available market numbers for PC components and systems are not as bad as articles like this make it seem - and that was true even before the pandemic. Writers like this just always forget that when they want to paint a negative picture of PCs in favour of smartphones/tablets or when there is another round of "consoles will end PC gaming", which was never true in the slightest. There is no need for a closed off hardware platform like the Mac to save the PC.


"There is huge R&D investment on the side of the big PC hardware manufacturers, just look at the big ones, AMD, Intel and Nvidia."

Those are the only significant sources of R&D for PC hardware. The big vendors - Dell, HP, Acer - are NOT doing basic R&D. Crap, HP is where good technology goes to die or wither away. All the stuff from DEC flushed down the toilet, including Alpha (amazing processor - loved our NT Alpha systems back in the day), StrongARM (how ironic), Compaq (I still have my Personal Jukebox PJB-100 - the first hard-drive MP3 player, with a solid lead on Creative and their awful thing, years before Apple and the iPod [and the PJB guys came from DEC, now that I think about it]), Palm - don't get me started on webOS. Itanium? Talk about making a bad partner choice, Intel. Didn't pay attention with Alpha? Then again Intel got StrongARM from DEC and then pissed that away too. But I digress...

The problem with the PC vendors is they are all pulling parts out of the same bin of least-common-denominator parts.

Apple no longer is. We haven't seen the chips aimed at desktop use, where battery life and thermal management will be far more flexible. I don't think Apple will focus on one single aspect like single-core performance (where they are crushing all but AMD's newest with a 10-watt mobile part - 10 watts vs 50-watt desktop parts!) but on overall system integration. Hardware accelerators. Neural Engine. ML cores. Stuff like that. Stuff we haven't even thought of yet.

There is much speculation that Rosetta 2 got specialized hardware assistance, which would explain its ridiculous performance, especially compared to the original Rosetta. Why not? Apple owns every transistor. The M1 runs Intel code faster than the machines it replaces. Apple can add what they want - and leave out what they don't need. Not just in the CPU but ANYWHERE in the system now.

I think people are dramatically underestimating just how powerful having full vertical control really is. I suspect we are going to see it in a BIG way when they release the desktops. Remember, they really didn't need to do this processor transition; the only reason to totally disrupt their ecosystem is if the benefits DRAMATICALLY outweigh the disruption. They wouldn't commit to replacing all Macs, including the Mac Pro, unless they thought they could carry the performance difference through all models. It just can't be marginally better.

There will be no Hackintosh with Apple Silicon. If Apple follows through and the desktop performance scales to what we have seen from what they introduced on their entry-level hardware, I guarantee you very few people are going to care AT ALL that these Macs don't run Windows or really old software - especially in professional settings, where computers aren't a hobby but tools to earn you money, and time is something you can't make more of. Needing less time is a competitive advantage. Religious debates will go out the freaking window in a heartbeat if these new Macs are as dramatically different from everything else as I fully expect them to be. Otherwise why bother with Apple Silicon? If you can't create such a stark difference that detractors just look silly, then why bother?

But let's say they aren't embarrassingly better in performance and instead stay above M1 performance but not embarrassingly so (say multicore doesn't scale that well); they will still be the best Macs made and still best the majority of PCs out there.

Also, just because the M1 Macs don't support PCI Express or external GPUs doesn't mean no chips will. Many people didn't think we would see Thunderbolt, yet these chips have it. And with AMD's latest graphics cards, I really don't care about Apple's rift with Nvidia any more.


We have different definitions of what counts as R&D expenses. Just look at the regular stuff, like the tooling for a new case - we have lots of that, and that stuff is expensive. Mice, keyboards, displays - nothing stood still. It's normal, and a positive thing, that the field became more stable after a while, like when all the alternative processor architectures vanished. Until now, of course.

> Apple can add what they want - and leave out what they don't need. Not just in the CPU but ANYWHERE in the system now.

Yeah. The "courage" headphone-jack company can now make even more decisions I will disagree with and create unupgradeable systems that are unrepairable. You can't even change the RAM in those things; pure throwaway garbage. Apple is the very last company I want in a decision-making position over my desktop system. This is not a positive development.

And yes, they have thunderbolt, but you can't run GPUs with those ports. How useful.

> Otherwise why bother with Apple Silicon? If you can't have such a stark difference that detractors will just look silly, then why bother?

Money. And being able to release new systems when they are able to, not when Intel gets something out of the door for once. Which is a plus on its own, they won't and do not need to be radically different.


"But I'm also willing to bet you're reading this on a phone."

How much are you willing to bet?

What is interesting about this website is that it isn't accessible via https, archive.org nor search engine cache.

These increasingly rare holdout sites cause me to take note of just how much I have had to change the software and configurations I use in order to accommodate the rise of mandatory TLS use (at least on sites posted to HN).


"But I'm also willing to bet you're reading this on a phone."

Web use from phones surpassed desktop several years ago. Anyone with a general purpose news site can see from their server logs/Google analytics they get more traffic from phones than PCs.

https://leftronic.com/mobile-vs-desktop-usage/
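Counting that split is a few lines of scripting against an access log; a rough sketch (the log lines and the set of user-agent tokens here are illustrative, not from any real site):

```python
import re

# Tokens that commonly mark a mobile browser's User-Agent string.
MOBILE_RE = re.compile(r"Mobile|Android|iPhone|iPad", re.IGNORECASE)

def mobile_share(log_lines):
    """Return the fraction of requests whose User-Agent looks mobile.

    Assumes combined-log-format lines, where the User-Agent is the
    last double-quoted field.
    """
    mobile = total = 0
    for line in log_lines:
        fields = re.findall(r'"([^"]*)"', line)
        if not fields:
            continue
        total += 1
        if MOBILE_RE.search(fields[-1]):
            mobile += 1
    return mobile / total if total else 0.0

# Illustrative sample: two mobile requests, one desktop.
sample = [
    '1.2.3.4 - - [01/Dec/2020] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0 (iPhone; CPU iPhone OS 14_2 like Mac OS X) Mobile/15E148"',
    '1.2.3.5 - - [01/Dec/2020] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0 (Linux; Android 11) Mobile Safari/537.36"',
    '1.2.3.6 - - [01/Dec/2020] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/87.0"',
]
print(mobile_share(sample))  # two of three requests look mobile
```

Real UA classification is messier than a four-token regex (tablets, bots, UA freezing), but the log math itself is this simple.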


> How much are you willing to bet?

I bet if (post-facto) they got $1 for each person reading on a phone and paid $1 for each person reading on a not-phone, they would come out ahead.


The data for the website I work on shows four out of five people on mobile and one out of five on desktop. So I certainly would take your bet.


Horace Dediu is a very well regarded analyst who certainly reviews viewer stats for his site Asymco. I wouldn't back your bet.


He is an Apple-is-the-best analyst; he has a bias.


The only headers I sent were Connection and Host. But it does not matter if he guesses correctly. His bet is I am using a phone. My bet is I am not. Fact is I am not using a phone. I use a phone as a cellular or wifi phone and for GPS, but never for web browsing.


> "But I'm also willing to bet you're reading this on a phone."

Chuckled a little as well as I read this on my PC.


In a bet on total readership split, you would lose.


Every time I try to use non-PCs like I use a PC I run smack into one brick wall or another.

Let me give you an example. I wanted to relax while playing a game and listening to a podcast. My phone has plenty of horsepower for doing both. Oopsie, there's a problem: I can't hear my podcast over the game, and Android has no way of setting the volume of different apps to different levels unless the app includes its own internal volume control.

Mobile devices are littered top to bottom with this sort of "I can run but not walk" horseshit.


You just need to choose the right phone. This works fine on iOS, especially if the game offers in app audio volume selection, although some games deliberately block other audio sources. It’s a per game issue though, not a limitation of the system.


The fact that this requires in-app volume controls, or that an app can block other audio sources, feels like a limitation of the system.


>or that an app can block other audio sources

I think that's a useful feature, and the real issue is that it can't be overridden or reconfigured.


What desktop OS provides per application volume controls, outside the apps?


Both Windows and Linux (via PulseAudio) provide this. I'm not as familiar with the Windows side of things, but on Linux I can also redirect each application's sound to different audio devices.
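For anyone curious, the PulseAudio side of this is scriptable too. A rough sketch with `pactl` (the stream index `42` and the sink name below are placeholders; get the real values from the list commands first):

```shell
# List running audio streams ("sink inputs") and output devices ("sinks")
pactl list short sink-inputs
pactl list short sinks

# Set one stream's volume to 40% without touching the app's own slider
# (replace 42 with a real sink-input index from the list above)
pactl set-sink-input-volume 42 40%

# Route that stream to a different output device, e.g. a USB headset
# (replace the sink name with one from "pactl list short sinks")
pactl move-sink-input 42 alsa_output.usb-headset.analog-stereo
```

pavucontrol exposes the same controls in a GUI if you'd rather not script it.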


Yup, this is extra useful when I'm using e.g. an external USB headset for calls. I can have media played through my speakers while using the headset for calls, regardless of whether or not the application supports this itself. (Admittedly VoIP applications generally do.)


Windows does out of the box. Linux and macOS can definitely be configured to, even if it's not built in. On mobile, you cannot configure or add OS-level features the way you can with desktops, at least not without jailbreaking.


Linux distributions released in the last 10 years or so come with PulseAudio, which does exactly that out of the box. No need for any configuring.


Depending on the DE it might not be exposed to the user at that level of granularity despite being supported under the hood. You need a shell plugin to expose per-app volume controls in Gnome for example. (You can always install pavucontrol, of course.)

Plasma provides it right out of the box.


That's literally a limitation of the system. Actual operating systems like Windows and macOS control program volume independently of what the program's internal volume is set to.


Can Windows or macOS do that either?

It looks like there might be some third party options for this, which is maybe your point, but it's not a particularly common use case as far as I can tell.


Windows can, and so can Linux. (I make use of it fairly often when I'm e.g. playing a game and using an external voice chat program.)

macOS cannot without third-party tools, as far as I can tell.


Yes. Out of the box, and for multiple decades now. Controlling how loud your programs are is as basic a function as controlling where they are on your screen.


Interesting. It's a function I've never seen or used. I guess because most software that I would want to do that with provides its own volume controls. Definitely useful.


Most of them do, sure, but why bother with remembering where everything keeps its volume setting when you have one central icon that lets you balance the entire system at once, right on the taskbar?


At first the only computers you could use were trains (mainframes)

Then they became buses (mini computers)

Then they became pickup trucks (desktop PCs)

Then they became vans (laptops)

And now finally, they are sedans & SUVs (smartphones & tablets)

One day, the motorcycle and electric scooter will become popular (watches, AR glasses)

The vast majority of car owners, though, given the choice, will buy a sedan or an SUV. Pickup trucks are popular for some subcultures and some forms of professional work; they will always be needed and will never go away. Mainframes / trains have morphed into datacenters, but are functionally similar. None of these form factors will go away, except maybe minicomputers, but the vast majority of the populace will move onto the smallest, safest thing, unless they need the extra space that a pickup truck gives them :)


This is a well thought out analogy, but by the same token: even though it might be nice to shrink the form factor down to watch size if you could, a lot of the space required isn't the machine at all; it's the screen and keyboard, where the pickup-truck or van style of input has an enormous advantage.


> The term Post-PC was first coined by MIT scientist David D. Clark in 1999 and was quickly adopted by both Microsoft’s and Apple’s former CEOs.

Up to a point, Lord Copper. See (once again) Jobs' discussion of the "Digital Hub" concept at Macworld 2001 San Francisco https://youtu.be/AnrM4n6S3CU?t=2585 . This January 2000 Fortune interview https://money.cnn.com/magazines/fortune/fortune_archive/2000... is also relevant. It also turns out that Walt Mossberg described the Newton as 'a sort of “post-PC” device' in 1993: http://allthingsd.com/19930812/the-apple-newton-messagepad-r... .


> What Apple is doing once again is ratcheting up the bar for the PC to make sure it’s the top choice for creative professionals.

This just isn't true, unless one means laptops.

Creative professionals, as I understand the term, generally need all the power they can get (video, audio etc.)

The Apple machines are nowhere near the top of the line from this perspective. If you care about CPU/watt then the M1 looks pretty awesome. If you care about max CPU, the M1 is currently not particularly interesting.


> The Apple machines are nowhere near the top of the line from this perspective. If you care about CPU/watt then the M1 looks pretty awesome. If you care about max CPU, the M1 is currently not particularly interesting.

That is true today.

A very simple way forward for Apple is to follow in the footsteps of AMD with a chiplet design.

All modern CPUs so far have been thermally limited, so being really good at CPU/watt is a huge advantage. If they can put two M2 (next-gen) chips in a package for power users and four in their Mac Pro, they could be on top of the world in very short order.


If you want to compare apples to apples, Zen 3 is supposed to offer 20% more performance than Zen 2 with the same 64-core maximum as the current generation. Granted, they are expensive, with the current 64-core CPU part coming in at $5800 and the whole machine easily closer to $8k. But if prices remain similar, the top-end Mac Pro will probably perform less ably than a machine based on the AMD part and cost about $15k.

Apple will of course retain its crown as the fastest OEM of hardware that runs its own OS since it will remain the only one legally allowed to do so.


The M1 is the first go at it, and it's been relegated to all of the lowest-tier machines, which creative professionals were probably not buying anyway. There are already rumors of a 12-core "M1X" in 16-inch MacBook Pros with double the number of high-performance cores.


The Apple machines are nowhere near the top of the line from this perspective.

All of the benchmarks and testimonials are showing a $699 Mac mini running on an M1 beating a $4000 iMac Pro and pretty much all Intel desktops.

Apparently Hollywood doesn't agree with you; a new Mac mini is faster than the Mac Pros they've had for years: "Hollywood thinks new Mac mini 'could be huge' for video editors"—https://appleinsider.com/articles/20/11/12/hollywood-thinks-...


The new mac mini will be much better than the old mac mini. I have no doubt that for some it will be a game-changer: the ones who will never use anything except a mac but ran into performance limitations with the intel-based models.

But for people who do video editing on ThreadRippers and such, the new Mac Mini is a little, not-too-fat nothingburger of a machine. The fact the the "$4000 iMac Pro" was woefully overpriced and underpowered doesn't change that.

Hint: I can run a faster Mac in a VM on my 16-core Threadripper than I could ever have bought from Apple.


But for people who do video editing on ThreadRippers and such, the new Mac Mini is a little, not-too-fat nothingburger of a machine.

Um, no pun intended, but this isn't an apples-to-apples comparison. Most Threadripper CPUs cost more by themselves than the entire Mac mini does.

A 16-core ThreadRipper starts at 180 watts and costs more than the Mac mini at $699. Duh—it's supposed to be faster.

On the other hand, the Mac mini and the other M1 Macs are entry level machines that put almost all Intel and some AMD processors to shame, running at 20-24 watts TDP—so cool that the MacBook Air doesn't need a fan and the one in the MacBook Pro is silent.

From the AnandTech review [1]: "The M1 undisputedly outperforms the core performance of everything Intel has to offer, and battles it with AMD’s new Zen3, winning some, losing some. And in the mobile space in particular, there doesn’t seem to be an equivalent in either ST or MT performance – at least within the same power budgets."

That's pretty good for entry-level, consumer hardware. The professional level processors should make things quite interesting.

[1]: https://www.anandtech.com/show/16252/mac-mini-apple-m1-teste...


Everything you've said here is consistent with my entire point: M1 is an awesome processor, but it does not "reset the bar" for high performance systems used by creative professionals.


M1 is an awesome processor, but it does not "reset the bar" for high performance systems used by creative professionals.

Sure it does.

The article I previously posted talks about Hollywood studios replacing high-end hardware with M1 Mac minis.

For editing 8k video with Final Cut Pro, a 16GB M1 Mac mini is going to be faster than machines that cost 2-4x as much.

That's a big deal. Remember, the software professionals use for video effects and graphics editing (Final Cut Pro, Logic, Affinity Photo, etc.) takes advantage of the 16-core Neural Engine. Adobe's apps aren't far behind.

And that's on top of the 8 core CPU and 8 core GPU which rivals dedicated graphics cards.

A college student now has access to the kind of video, graphics and sound production and editing capability that required a $4000 or $5000 workstation a few years ago.

However you want to phrase it, these Macs are in the process of changing the game.


Here's a quote from the end of that article:

Miller believes that the Mac mini "could be huge" for the industry. He says the main issues for all editors, regardless of what software they use, are rendering times and preparing viewable videos for clients to review — and that the M1 should help with both.

"They are computer-processing-intensive actions," he said. "We have to see if this new M1 chip speeds that up. It should. Dramatically. Then Avid, Adobe and DaVinci... have to use those hardware and OS advances in their software development."

--------------

Notice all the hedging. "We'll have to see..." and "it should ..."

> A college student now has access to the kind of video, graphics and sound production and editing capability that required a $4000 or $5000 workstation a few years ago.

Sure. That's the usual march forward of computer technology for the last 30 years. But note: the fastest standard architecture system you can buy STILL costs about $5000, and it does mind-bogglingly more than that college student will be able to do on their M1.

So this isn't changing the game at all, it's just a continuation. A few years ago, almost nobody talked about being able to edit 8k video, but now that's more or less a given. The requirements expand, the machines get faster.

Don't get me wrong, I'm tremendously impressed by what Apple appears to have done with the M1 (there's me hedging too). I just get tired of it being described in a context where the comparisons don't include a wide enough range of possible options. Before the M1 there were much more powerful systems available than anything Apple offered. After the M1, there continue to be more powerful systems available than anything Apple is offering. That doesn't mean that an M1 Mac Mini is a dumb choice - it might indeed be the perfect combination of CPU/Watt/$. But it's not changing what the high end is (yet).


You're disagreeing with the OP, but presenting a completely different argument. By all accounts, the M1 mini seems to be a neat little machine, and I've placed an order myself to replace a super old machine at home. However, the OP's argument that the M1 (at present) does nothing to challenge top-end processing power is accurate.


Interestingly enough, in the past 3 years I have purchased 4 phones, 1 tablet, 1 watch and one iMac. However, I have upgraded 6 PCs with new graphics cards, SSDs, motherboards, and processors. My children have gone from phones to PCs as their main method of consuming content. It's just richer on a PC than anywhere else.


> The software and hardware efforts are uninspiring because the talent has moved elsewhere. The creativity, R&D budget have gone the same way.

Then what have Intel, AMD and Nvidia been doing the last few years? We've got increased core counts, more IPC, and massive graphics chips with specialised cores for AI, raytracing and fast memory. It started with gaming and birthed the deep learning revolution. Those may not be entirely new paradigms, but the advances, especially recently, have been large.

There are now so many new chips launching at the same time (Xbox, PS5, Zen 3, Big Navi, Ampere, M1) that you can't really buy them at reasonable prices, because production capacity has become a bottleneck and demand far exceeds supply. This is not what decline looks like.


> They also don’t see performance along new dimensions which devices offer with glee. For example connectivity to cellular networks (still missing in even the fanciest laptop but available on my watch.)

What? Laptops with cellular connectivity have been available for years.


I have (and have had) laptops with a SIM card slot for years, but I never bothered to buy a SIM card. I have a gigabit Ethernet port at work and at home, and fast, low-latency wifi everywhere...


I have never wished my laptop had cellular connectivity, I have always used a portable hotspot for that.


Portable hotspots are nice, sure - I do have one. Still feel like it'd be more convenient to have cellular connectivity built right in though; you always have some connectivity then provided you're not too remote.

I've considered switching to Google Fi solely because they provide data-only SIMs associated with your normal plan that I could then pop into a laptop. (Rather than making you pay twice as most providers do.)

Haven't been able to justify it yet though, I'm on a pretty good grandfathered plan at the moment.


The extra price per device that carriers charge makes this cost-prohibitive. I'd be very interested in a laptop with cell service if I had a shared data plan across N personal devices.


> I'd be very interested in a laptop with cell service if I had a shared data plan across N personal devices.

My understanding is that's what Google Fi does: https://support.google.com/fi/answer/6330195?hl=en

They seem to be nearly the only ones, at least that I've found. (But I'd be glad to find more!)


Here in the UK we're starting to see data-only SIMs with 100GB+ per month limits for only £20 p/m. To me this suggests that the era of 'always connected' laptops is almost upon us.


So you have wished for it, you just used another device as a bridge to get it.


It's true that Mac revenue exceeds that for iPads, but on the other hand Apple sells almost twice as many iPads as Macs. So from a business perspective the Mac might win out, but as a cultural phenomenon arguably the iPad has a much bigger impact.

Already at work, iPads with keyboard cases often outnumber laptops in meetings. That shouldn't be surprising, as people already have a full computer at their desks. My eldest is in 6th form (high school) and is a gamer, so she has a decently powerful laptop in her room, and her daily carry at school is an iPad and Pencil. When she goes to university we'll consider a keyboard case to go with it.

If you don't have a use for a powerful laptop, a skinny M1 Macbook Air is certainly very tempting and together with a powerful phone could cover some people's needs fine. I think the takeaway is that we now have a very granular, highly flexible set of devices available, covering a huge range of different combinations of use cases. It's not just about the individual tool anymore, it's about the right toolkit to meet our specific set of needs.


Funny thing is that an Apple computer or any smartphone deserves the title "Personal Computer" more than my Linux PC (which has multiple accounts which can be used simultaneously).

I think we'll see more Personal Computers and devices in the future. For example, Oculus Quest is only usable with a personal Facebook account, so I would say it's a PC by definition.


Maybe you simply misunderstand the P in PC. PCs have offered multiple user accounts historically because they were expensive machines that might be used by multiple people. Most devices at this point are liable to be single-user, but they aren't less personal for having a function you don't need.

In fact even the iphone has a guided access feature https://support.apple.com/en-us/HT202612 presumably partially so you can hand your kid the phone in a game and they can't accidentally share all your photos or spend your money.


> Maybe you simply misunderstand the P in PC.

Ok, what does the P stand for in your opinion?

> In fact even the iphone has a guided access feature

Imho, this is a much weaker form of device sharing than e.g. Linux accounts, where every user is treated equally.


P means you get all the CPU time of a tabletop machine, as opposed to mainframes or minicomputers, where you literally share a single CPU with multiple other users over VT100 terminals.


A personal computer is one which regular people can own and control. At the dawn of the PC era this was a novelty with the contrast being very expensive systems that many users access over dumb terminals that weren't available to the general public.


When the term PC was coined by IBM they ran PC DOS and were entirely single-user and incapable of multitasking of any kind.


Yes, but it isn't defined by the lack of multitasking.


I mean technically it was, but only in the sense that computers at that time were generally shared systems and the pitch was that a worker/user would have their own individual (personal) computer at their desk and wouldn't have to share resources with other users, and the IBM salespeople I talked to at the time loved to talk about this. The fact that multitasking personal systems weren't feasible at the time meant that a machine could be either multitasking/multiuser or personal. That distinction died alongside DOS though.

None of that really has much to do with what the above user said about their Linux system however.


I don't think anyone has sold a consumer OS in several decades that doesn't fully support multiple accounts?


> The software and hardware efforts are uninspiring because the talent has moved elsewhere.

Really? I find AMD's efforts in desktop CPUs and GPUs quite interesting.

Same goes for Mesa, radv/aco, dxvk, vkd3d-proton, etc. Quite inspiring projects, and they are all about the Linux desktop use case for the most part (especially gaming).

PC ≠ Windows.


The evolution seems to be that devices get smaller and aim to take over as many of the things the user otherwise had to do as possible, and once that is out of the way, the next step is to control the individual. Once an individual's life completely depends on the device, you can exert control over humans by various means. Suddenly big tech companies will be able to tell people what to do, and if they don't, they'll be excluded from "society". We need to stop this in its tracks.


I hope that some day Apple revives the idea of the PowerBook Duo (https://lowendmac.com/2007/apples-first-subnotebooks-powerbo...) in which you could dock your phone to something akin to a brainless iMac: a screen, mouse and keyboard. Your main computer is always your phone, which you can slide into different form factors (tablets, laptops, desktop PCs).


Works on Android, check out https://nexdock.com/touch/


PCs are still the preferred business platform and I don't see that changing any time soon. PCs may not be "cool" or "sexy", but they run the world. Their gaming power also helps support the platform.

Mobile conventions essentially cripple the UIs when productivity matters. Mousing and large-screen-oriented UIs are simply more productive on average.


You know what 2000's era cliche has really come to fruition? "The Network is the Computer".


I can't make a WhatsApp call from my Windows PC, or control my Tuya WiFi thermostat and its ilk. The days of near-guaranteed "Windows compatible" have suddenly ended, and an era is fading with it.


Microsoft is working on implementing Android userland in Windows 10, so you might soon be able to. You're still limited to WhatsApp's "one device per account" policy, but that's more of a WhatsApp issue than a Windows issue.

As for the Tuya thermostat, the terrible IoT landscape is to blame here, not MS. With IPv6 support any device in your home can be controlled by any device you own, except that IoT manufacturers are not capable of securing their devices and would much rather pull everything to their own private little cloud, letting them permanently pull the plug on their old hardware whenever they feel like it. There's very little preventing Tuya from making a Windows application, but they just choose not to.

I can make Telegram calls from any device with a microphone and/or camera just fine and there are plenty of IoT devices that can show a web interface to control the heat. It's all about the vendors you choose. The Windows/PC ecosystem has nothing to do with vendors' mediocre design decisions.


>"But I’m also willing to bet you’re reading this on a phone."

I am reading this on my PC, on one of two 4K monitors. I use my phone for phone calls, offline GPS and a couple of specialized offline apps.


Congratulations, you are not a statistical average.


Congratulations, for covering everyone with blanket statements.


I have been post-PC since 2006. Working on laptops with docking stations has been a fact of life since then.


If measured by that, then yeah, same; pretty much all companies I've worked for have made laptops mandatory. It saves a ton in insurance costs and break-in risk.

There was one exception where we worked on desktop PC's at a bank, because they were very reluctant about developers taking code outside of their premises. Less of a risk of burglary as well because of better locks and a full time security staff. They slackened that policy off a bit after risk analysis and making full disk encryption mandatory at least.


In similar cases we had to work on virtualized desktops, Citrix, or EC2 instances, after connecting to the customer's VPN.


Post PC

First thing that came into my brain was Post Political Correctness.

I thought this was going to be JUICY... :(


I've been taking PCs for granted my whole life. My school did actually have Acorn/BBC Micro computers but I was too young to really take advantage of them. The only reason they had them was I went to the worst school in one of the most deprived areas of the UK. The school had already acquired its first PC by then and after I left I never saw anything but PCs in other schools.

Our family computer was a PC and I remember my dad installing a modem and a CD burner and probably other things (I later upgraded the memory from 32MiB to 64MiB and the hard disk to 20GB). It was so exciting. But I just thought that was how computers were. I didn't realise that this was just how the PC is. I got a smartphone, but I got it as a phone, not a computer. I got a netbook, but even that can be tinkered with within reason given its form factor. But I never stopped using my PC.

It makes me quite sad to think that this might go away. Microsoft already tried to cripple the PC with UEFI Secure Boot. I feel like the whole PC market is mainly propped up by gamers right now. I don't game, but I'm thankful that I can still take advantage of the hardware. Imagine how boring it would all be with disposable appliances and wireless networks. I'll have to move with the times. It's my job. But I'll have to find a new hobby.



