Even though I work in technology, I often find it hard to understand the push to involve technology in everything. Recently, our refrigerator failed, for about the third time in a year. That means, of course, shopping for a replacement. These days, they sell fridges with screens embedded in them, fridges that connect to wifi and let you do things with the refrigerator remotely. They come with embedded cameras and food management software that tells you when things are approaching their expiration dates, or when it’s time to throw them out.
Maybe some folks like that stuff, or feel the need to pay more for it. More than likely, it’s a keepin’ up with the Joneses thing. You go into a house, and it has all the latest fancy gadgets and whatzits, all covered in stainless steel. Or, perhaps, the new rage: “black stainless” or “dark slate.” It all seems rather silly.
However, with recent revelations surrounding the Alphabet Agencies and the strong possibility that they’ve been spying on American citizens, it is no longer merely silly.
It’s utterly stupid.
Even if the Alphabet Agencies are ultimately absolved of this charge, it is clear that backdoors have been built into devices for quite some time now. And you will find that it is not merely manufacturers, software companies, and the government that are using them.
So-called Smart TVs are becoming a problem as well, as hackers can brick them, or turn microphones and cameras (should your smart TV come equipped with them) against you. The “Internet-of-Things” is proving to be a sieve.
The hacks underscore the risks of so-called “Internet of Things” devices, the vast majority of which are given network access and computing functionalities without being adequately secured. TVs and other Internet-connected appliances almost universally lack application sandboxing and other exploit mitigations that are a standard part of computer and mobile operating systems. Even worse, most devices run old versions of Linux and open source browsers that contain critical vulnerabilities. While patches are generally available on the Internet for the individual components, manufacturers rarely give customers a way to install them on the devices in a timely way.
Think about it. When is the last time most folks even bothered to update the apps on their phone? Now consider that there are refrigerators that would now need to be considered in security terms. Your average John Doe does not think to update his fridge, or worry overmuch about whether or not it is secure.
Consider, also, The Fappening, when various celebrity cloud accounts were hacked, and the nudes distributed across the Internet.
Now we have the proliferation of devices like the Amazon Echo, with its Alexa assistant, which are designed to listen for your commands and do things with that data. Are people going to be fastidious about checking the security of their smart speakers?
Some of these devices, of course, automatically update themselves, and remain reasonably secure from casual hacking. But then you have to consider a different threat for those devices which are secure: the company selling you the device, or providing you the service.
The GOP tells us that this is a case of regulatory overreach, and they may actually be correct about that, because the existence of the regulatory regime has done little to nothing to stop this behavior from occurring. That said, I will admit right away that the optics of this bill are very worrisome.
But whether or not the bill will have an effect, positive or negative, the fact remains that your service providers have already been caught selling this data, or using it in ways you didn’t expect. You can’t trust them.
Now, imagine they have your browsing history, they know how much food is in your fridge, what you watch on TV, who you call, and who you text… “Go buy some more Pepsi,” says the ad on your fridge, “because we know you’re out.”
This is a gold mine, for companies, for government, for Alphabet Agencies within the government (who may very well be at odds with the elected government), foreign governments (the Left likes to blabber about Russia, but I’d be more concerned about the Chinese), and for black hat hackers looking to screw you over.
Is all of that risk really worth your fridge telling you that three-week-old leftover Chinese takeout should go in the garbage? I’d argue not. Do a simple risk/reward calculation on this. It’s not worth it.
So what do you do? Here are a few ideas:
1. Buy “dumb” hardware. Dumb fridges, dumb TVs (or buy smart TVs where the “smart” portion can be disabled – at the very least, don’t connect them to wifi).
2. If you must have Netflix, Hulu, Amazon Prime, Kodi, Plex, or anything similar on your TV, consider getting a separate device like a Fire Stick, or a Roku, or a “Compute Stick” from Intel. They are cheap, and if a hacker bricks it, at least you aren’t out a whole TV. Power it off when not in use. Occasionally clear it, reset it back to factory specs and reload your apps.
3. Clear your phones of pretty much everything extra installed by the manufacturer. If you’ve some technical skill, consider wiping the OS and installing from scratch. CyanogenMod used to be my preferred choice in the Android ecosystem. It’s gone now, but LineageOS was forked from it in the dim mists of Android history. Consider that. If you don’t have the skill (don’t even try this if you have to ask), just clear everything optional you can from the phone.
4. Carefully screen new applications and software for possible hidden monitoring. Companies like to bury this in their disclaimers, but you can usually find information on the software you want to use on the Internet.
5. Don’t buy any of those smart home systems and smart speakers like the Echo. That’s a disaster waiting to happen.
6. If you don’t have a very compelling reason to buy a “smart” device, don’t do it.
7. Make sure you use strong passwords, both on your accounts and on your wifi router.
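On the password point in particular: a password manager will generate strong ones for you, but if you’d rather roll your own, Python’s standard `secrets` module does the job in a few lines. A minimal sketch; the length and character set here are just reasonable defaults:

```python
import secrets
import string

def make_password(length=20):
    """Generate a random password from letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(make_password())  # different every run
```

For a wifi passphrase, bump the length up; every extra character buys you more than clever character substitutions do.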
This won’t stop every possible way someone with malicious intent could screw with you, but it will severely limit the damage. In the same way that a car with a few anti-theft devices deters casual thieves, these steps will deter casual data theft, spying, and hacking.
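If you want to see what a gadget on your network is already exposing, a quick TCP port check is a low-effort audit. Here’s a minimal sketch using only Python’s standard library; the address and the port list are illustrative, so point it at a device you actually own:

```python
import socket

# Ports that "smart" appliances commonly leave open (illustrative list).
IOT_PORTS = [23, 80, 443, 8008, 8080, 8443, 9000]

def open_ports(host, ports, timeout=0.5):
    """Return the subset of `ports` accepting TCP connections on `host`."""
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            if sock.connect_ex((host, port)) == 0:  # 0 means connected
                found.append(port)
    return found

if __name__ == "__main__":
    # Replace with your smart TV's (or fridge's) LAN address.
    print(open_ports("192.168.1.50", IOT_PORTS))
```

Anything answering on telnet (23) or an unauthenticated web port is a red flag worth chasing down.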
The Internet of Things is a spaghetti strainer when it comes to security. It’s a mess. Best not to dive too deep into it, if you can avoid it. After all, three-week-old Chinese food is generally pretty good about notifying you it’s gone bad all on its own.
So the vendor kept notifying me that I should review my computer part purchases, which I did… except for the power supply. Because it’s a power supply. Really, who cares? The vendor, apparently. So I decided to give the vendor a review for the product (we’ll see if it passes the censors), but Hell, why not have a little fun, right? Be careful what you wish for…
– It’s a power supply
– It works
– It’s a power supply
– It didn’t come with a bevy of hot girls
In my lifetime, I’ve reviewed a lot of tech products. And for some of them you might discuss the performance metrics, the compatibility issues, or even how much LED bling is plastered all over it — because, as everybody knows, the PC market desperately needed to turn into a close facsimile of the ricer community. Next, these folks will put chrome tips on their exhaust ports, and some enterprising wannabe Jedi will come along to deliver a proton torpedo straight up the…
…Well, you get the idea.
So what can I say about this power supply? The Corsair RMX850X works properly. It has modular cables, which are black, and that’s good, because we all know what system building was like during the days of IDE cables and power supply rat’s nests. What idiot thought up sticking random useless cables on every power supply they sold, anyway? And why did they have to come in multicolor, like Picasso smoking weed and throwing up all over the canvas? “What’s that a painting of, O master of incomprehensible art?” “Why, my young apprentice, some time in the distant future, engineers will make ratty looking contraptions called ‘power supplies’ that will look something like multicolored wire vomit.” “O Master… can I have some of that [expletive deleted] weed you’ve been smoking?”
But hey, you buy this power supply, and you don’t have to deal with it. The wiring is so black, it’s speaking Samuel L. Jackson to me. “I’m tired of these [expletive deleted] snakes on this [expletive deleted] plane!” That’s right, you buy yourself an RMX850X, and your PSU is Samuel L. Jackson.
Now, let me tell you what you get when you open the box. This power supply is so Samuel L. Jackson, it comes with a fancy black drawstring bag. So you’ve got a perfect place to store your bling, and your cash, if you ever take a job doing whatever Samuel L. Jackson was doing in Pulp Fiction. Not every power supply comes in a velvet bag, you know. And then you get some cables, which are nice if you actually plan to build a computer with it.
The voltages looked good. But unless you’re buying bargain basement stuff that even the Chinese outsource because it costs too much to make, you’ll get decent voltages. So that’s kind of expected these days. I guess there are no real cons with the RMX850X, except that when you buy a Samuel L. Jackson power supply that comes in a velvet bag, inside a box *that* big, you kind of expect more. Like, if you opened that bag and a bevy of hot girls modeled your brand new PSU, delivered on a silver platter, with angels singing Pulp Fiction in the background like “Blessed is he, who in the name of charity and good will, shepherds the weak through the valley of darkness, for he is truly his brother’s keeper and the finder of lost children.”
But Corsair didn’t see fit to supply said hot bikini girls, for which I am mildly disappointed. For a moment, I thought I was going to get a cosmic experience, greater understanding of the universe, and some scantily clad supermodels begging to date me, because I was awesome enough to choose Corsair for my PSU business. And all I got was a working power supply in a fancy velvet bag.
Browsing around Liberty’s Torch, as I often do, I am reminded of something in my own life. Francis discusses those who don’t look very far and those who do. And he references Zen and the Art of Motorcycle Maintenance.
Oddly enough, this is a book I’ve never read, something I ought to rectify one of these days. My reading list is always long, and always growing faster than I can read. But perhaps this merits skipping a few, because I am told that my view of computing is much like the view expressed in the book.
I like to understand things, and get into the nitty-gritty. Nothing frustrates me more than being helpless, not understanding what is going on, having to rely on someone else not merely to do a thing, but to understand that thing.
You see, when you know something well enough, but contract it out to another because you do not have the time to deal with it, you know what a fair price for the work might be. You are not ripped off or taken advantage of.
For example, I know how to change the oil in my car, but I often pay someone else to do it, because I don’t have the time. Nonetheless, knowing how to do it means I am not ripped off, and if I do have the time, I can do it myself.
Although I must say, I find working with machines to be quite therapeutic most of the time. Even when they frustrate me, I am, paradoxically, enjoying myself. My wife would say that when I am cursing at the machine the most, I am also the happiest.
Lately, I decided to build a new computer. It’s about time, as the last new build was back in 2011, and for me, that is a very long time. I’ve been building my own machines since the mid-90s, when I put together a bizarre 486-based system out of leftover parts from my father’s computers.
Since then, I’d usually do a new build every 2 or 3 years, with small refreshes in between (usually a GPU or RAM upgrade). Recently, the release of AMD’s Ryzen CPU gave me the motivation to finally upgrade again, the prospect of building a relatively inexpensive, well-performing 8 core/16 thread machine finally making it worth the price.
Only this time, the build was smooth as butter. No problems. In a way, it was almost disappointing. I say almost, in case the computing gods are in a mischievous mood. There were no logic problems to solve. The build was easy, the OS installed on the first attempt (that’s a rarity), and given how fast SSDs are these days, the whole machine was fully up and running in a couple of hours.
Still, it was fun, and I enjoyed it. I often wonder why so many people seem almost afraid of learning how things work, how to work on them, or understand them. I see people afraid of changing a tire, or utterly flabbergasted by the simplest of computer issues. They don’t know how to wire an outlet in their home, or install a ceiling fan.
They’ve no clue how to do much of anything, really, and many of them are, paradoxically, proud of their ignorance. I’ve met people who laugh about being unable to balance their checkbooks. It’s utterly bizarre. Fixing things, building things, doing things… these are not the tasks of the anointed, I suppose.
They are romantics, I guess. The world is just supposed to work the way they want it to… because.
Either way, I had fun. Enjoy some pics of the build, if you like computer pr0n:
Occasionally, I feel the need to discuss technology, as my day job depends on it. Actually, my DJing does too, to some extent. But still, I really abuse computers with my work: everything from running local virtual machines for development, to Photoshop, to video editing in After Effects and 3D rendering. Then, of course, I still game on occasion.
Now, it used to be that I would build a new computer every 2 or 3 years. That was about the time when a build would start to have some issues, mainly because it was so heavily abused, and also the point at which performance increases would justify the cost. Every 2 or 3 years, I could double performance with a new build.
That hasn’t been the case for a long time. The current machine I’m running was built in 2011, and since then, the only component that’s seen an upgrade is the graphics card. Back when Bitcoin mining was a thing, I picked up a pair of Radeon 7950s and swapped them in, and made a nice tidy profit from that.
The reason I haven’t upgraded can basically be laid on the doorstep of Intel. AMD stopped competing in the high end CPU space sometime around 2008 or so. The company had decided that competition with Intel was too expensive, and relegated future designs to the low end market. This, in turn, made Intel very lazy. The i7-2600k in this build remained a competitive processor for a very long time. Even today’s i7-7700k is, perhaps, 40% faster. For 6 years of CPU development, that is really poor.
Meanwhile, my old build has been brutalized for nearly 6 years, and components are starting to fail. I used to have 24GB of RAM in this machine. Two DIMMs died a few months back, so I’m down to 16GB. The power supply is… twitchy, and the system itself just feels less stable across the board, for which blame can probably be laid upon the motherboard. This system is giving me indications that there’s a time limit on how much longer I can beat it to death.
At the same time, the only Intel options that looked interesting were completely unaffordable. The i7-7700k is pretty well priced, of course, but it’s a 4 core/8 thread CPU like my 2600k, and the IPC and clock speed improvements are not very impressive. Intel did release the 6900k 8 core/16 thread CPU, and the 6950x 10 core/20 thread monster. At least in development and content creation, these would be more than twice as fast as my 2600k. But they cost around $1100 and $1600, respectively, and require a much more expensive motherboard to boot.
Intel has enjoyed the fruits of being an effective monopoly in the high-end CPU space. High prices, and unimpressive performance gains were the order of the day. Monopolies really do stink. On the other hand, I have to think that even as a monopoly, this behavior was kind of stupid on Intel’s part. After all, they could have sold me a CPU or two in the last few years if they hadn’t acted this way. How many sales were lost because people didn’t see any persuasive reason to upgrade?
About time, AMD. Where were you for the last decade?
Fortunately, for whatever reason, AMD has decided to reenter the high-end CPU market with the Ryzen 7 series. I won’t go into too much detail on the benchmarks, as others have done a much better job of that, but the verdict is really fascinating. The Ryzen 7 1800x is nearly as fast as the high-end Intel 6900k and 6950x chips. In content creation, it appears to be slightly ahead of the 6900k, and somewhat behind the 6950x, which makes sense given that the latter is a 10 core part, and the Ryzen 7 line is merely 8 core/16 thread.
Nonetheless, this is very impressive, because even the most expensive Ryzen CPU is $499. The 1700 and 1700x are, of course, even less expensive. The 1700x may be the sweet spot, in that a modest overclock will grant performance on par with the 1800x.
In gaming, the verdict is more mixed. The Ryzen is competitive with the Intel 8 core chips in gaming, but not as competitive with the i7-7700k 4 core part. The reason for this is, of course, that games are less optimized for multi-threading. So the higher-clocked 7700k actually beats out its 8 core siblings and the Ryzen lineup by around 10% or so, across the board. There is a lot of speculation surrounding Ryzen performance in gaming, however. Brad Wardell, the CEO of Stardock, had this to say on the matter:
“Oxide games is incredibly excited with what we are seeing from the Ryzen CPU. Using our Nitrous game engine, we are working to scale our existing and future game title performance to take full advantage of Ryzen and its 8-core, 16-thread architecture, and the results thus far are impressive. These optimizations are not yet available for Ryzen benchmarking. However, expect updates soon to enhance the performance of games like Ashes of the Singularity on Ryzen CPUs, as well as our future game releases.”
The Ryzen may actually be able to gain some low hanging performance in this area. Its spectacular content creation benchmarks, and synthetic benchmarks show that AMD is not bullshitting about the CPU’s performance, as they’ve done in the past with the utterly garbage Bulldozer and Piledriver lineups. So in this case, the fact that developers have been optimizing more or less exclusively for Intel’s chips for nearly a decade — because AMD’s offerings were crappy — may have given Ryzen a handicap in those benchmarks. If so, we should expect to see a modest increase in gaming performance in the coming year.
There were two other teething problems for Ryzen. First, the Windows 10 scheduler conflates physical cores with logical threads, and attempts to give both a similar workload. In workstation applications, this is more or less a non-issue. In games, however, it has become a handicap for Ryzen. AMD claims to be working with Microsoft on updates to the Windows 10 scheduler to fix this. Ironically enough, Windows 7 does not have this issue (it’s also no longer officially supported on Ryzen, thank Microsoft for that idiot decision). Gaming benchmarks in Windows 7 appear to be significantly better as a result, with early adopters seeing a roughly 8% improvement with the Windows 7 scheduler over the Windows 10 scheduler. So if Microsoft deigns to fix this, which is by no means a guarantee, we should see Ryzen gaming performance jump to within a percent or two of the 7700k.
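If you’re curious, the physical-versus-logical distinction the scheduler has to get right is easy to inspect yourself. A small stdlib-only sketch; it parses /proc/cpuinfo, so it falls back to the logical count anywhere that file doesn’t exist:

```python
import os

def physical_cores():
    """Count distinct (physical id, core id) pairs in /proc/cpuinfo.

    Falls back to the logical count when the file isn't there (non-Linux).
    """
    cores = set()
    phys = None
    try:
        with open("/proc/cpuinfo") as f:
            for line in f:
                key, _, value = line.partition(":")
                key = key.strip()
                if key == "physical id":
                    phys = value.strip()
                elif key == "core id":
                    cores.add((phys, value.strip()))
    except OSError:
        return os.cpu_count()
    return len(cores) or os.cpu_count()

# On an SMT-enabled Ryzen 7, the logical count is twice the core count.
print(os.cpu_count(), physical_cores())
```

A scheduler that treats all of those logical CPUs as equal will happily put two heavy threads on the same physical core while another core idles, which is exactly the mistake described above.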
The last hiccup is the memory controller. Unusually for a CPU that is otherwise very fast and competitive with the Intel parts, the memory controller is rather weak. It is dual-channel only, and getting maximum memory bandwidth, at the moment, requires using only two DIMMs, and single rank memory. Using two DIMMs is generally not an issue at the moment, but the single rank memory issue is a major problem. It takes a lot of research to find out which memory kits are single rank, as this is not commonly listed in the specifications. The usual method is to visually inspect the memory. If the chips are on one side only, you can be about 99% certain it is a single rank part. But this is hard to do when shopping online, as the heatspreaders cover the chips, and the spec sheet doesn’t list whether it’s single rank or not. This is explained in great detail at Legit Reviews.
This will likely change in the near future, as more UEFI code comes out of AMD, and they optimize the memory controller somewhat. For now, if you are building a Ryzen box, be sure to check the motherboard maker’s QVL (Qualified Vendor List) for compatible memory. Asus, at least, has done a lot of research on this, and has even specified which memory kits are single rank, and thus best for Ryzen. The memory manufacturers themselves often don’t even do this, or bother to specify it, so kudos to Asus for taking the time to do that right.
Take careful notice of the memory speeds and latencies available to you, as Ryzen’s weaker memory controller and lack of quad-channel support make it much more sensitive to RAM speeds than Intel’s chips. Maximize memory performance to avoid a bottleneck here.
What we have here is a part from AMD that occupies a unique market position. It was a brilliant move on their part. If you want the fastest gaming-only chip, the i7-7700k is still your best bet, as its higher clock speed gives it an unbeatable advantage there, and having 8 cores doesn’t do much for you in gaming (yet, anyway). Four is enough, for now at least. If you want the absolute fastest workstation chip, the i7-6950x is still the fastest thing out there… if you have $1600 to pay for it.
But what if you’re like me, and your primary computer is a mixed-use machine? Something that sees use as a graphics and video workstation, a gaming box, and even, on occasion, a testing and development server? There the Ryzen shines. It’s cheaper than the Intel workstation chips by a huge margin, while offering broadly similar performance, and it handles games just fine, even if it’s not quite the fastest there either. It won’t double the performance of my 2600k in gaming, but it will more than double it in workstation and dev duty. And it will dominate the 7700k in workstation and dev duty.
So you sacrifice maybe 10% of gaming performance, and even then, only in situations where you are not GPU bound, and gain 50%+ in productivity and content creation compared to the 7700k. That’s a trade I’d make all day long. I suspect a lot of folks will be thinking similarly.
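That trade is easy to sanity-check with a weighted score. A rough sketch, where the workload split is a pure assumption and the relative performance numbers are just the illustrative figures from above, not measured benchmarks:

```python
# Relative performance vs. the i7-7700k (illustrative figures from the text:
# roughly 10% slower in games, roughly 50% faster in productivity work).
ryzen  = {"gaming": 0.90, "productivity": 1.50}
i7700k = {"gaming": 1.00, "productivity": 1.00}

# Assumed mixed-use workload split (a pure assumption; tune to taste).
weights = {"gaming": 0.4, "productivity": 0.6}

def score(perf):
    """Workload-weighted average of relative performance."""
    return sum(weights[k] * perf[k] for k in weights)

print(score(ryzen), score(i7700k))  # roughly 1.26 vs 1.0
```

Under that mix the Ryzen comes out about 26% ahead overall; shift the weights toward pure gaming and the 7700k closes the gap.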
AMD created what’s probably the best general-use CPU on the market today.
So I’ll be building a Ryzen box this time around. But even if you don’t want to roll the dice with AMD, I imagine Intel will feel some competitive pressure again, and maybe we can get back to the market working like it ought to.
Ironically, however, where my last build used an Intel CPU and an AMD GPU, my new build will be the reverse: an AMD CPU and an Nvidia GPU (the GeForce GTX 1080 Ti is the king right now). Maybe AMD should apply the same level of dedication to their next Radeon release that they did to Ryzen (they claim the Vega release will be good, but we’ll have to see). Either way, folks ought to be thanking them for giving us an alternative to Intel that doesn’t require sacrificing your first-born son to buy.
Update: A great explanation of what’s going on with the mixed gaming performance from Ryzen. As it turns out, the decision by AMD to split the CPU into two CCXs (Core Complexes) created a latency issue between the two core complexes when there is a lot of thread-to-thread communication. It looks like this is actually a pretty easy fix, overall. Windows 10’s scheduler needs to treat them almost like two quad-core chips, rather than a single octa-core chip, which ought to distribute the workload better. I remember this was an issue for early Intel quad-core chips, which similarly split into two complexes of two cores each.
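Until such a fix lands, you can approximate the two-quad-core treatment by hand on Linux: pin a latency-sensitive process to a single CCX so its threads never pay the cross-CCX penalty. A sketch using the standard library; the assumption that cores 0-3 make up the first CCX depends on your topology and SMT settings, so verify before relying on it:

```python
import os

def pin_to_cores(wanted):
    """Restrict the current process to `wanted` CPU cores (Linux only)."""
    if not hasattr(os, "sched_setaffinity"):   # not available off Linux
        return None
    available = os.sched_getaffinity(0)        # cores we're allowed to use
    target = set(wanted) & available
    if target:                                 # never pin to an empty set
        os.sched_setaffinity(0, target)
    return os.sched_getaffinity(0)

# Keep a chatty, thread-heavy process on one CCX (cores 0-3 assumed).
print(pin_to_cores(range(4)))
```

This is exactly what the hoped-for scheduler update would do automatically: keep tightly coupled threads on one side of the chip.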
This video explains it:
This means a Windows update should fix most of the bizarre, split-personality of Ryzen. Let’s hope Microsoft actually bothers.