Women, Progress, and Automobiles in Space

Folks, I’m tired. I’m cranky. I’m operating off very little sleep and I’ve overbooked myself on work yet again. You’d think I’d learn. Or take a break. Sigh…

But a thread on Twitter caught my attention yesterday, and I couldn’t let it pass without some commentary. And that, in turn, opened up a whole new level of offensive wrongthink. Underneath it is, in this blogger’s opinion, one of the most important issues of our time.

A progressive woman laments how Elon Musk is, in her view, wasting his money launching automobiles into space. Instead, she tells us, he should spend his billions on Flint, a Democrat-run shithole (thanks, Mr. President, I’m using that term a lot more from now on) that can’t even deliver clean water to its residents.

As tired as my newborn has made me in recent days, this is worse. Progressive arguments like this are legion. Instead of using your money for anything else, you should dedicate all of it to wealth redistribution. Forget your personal desires, your dreams, your aspirations for the very future of mankind itself… instead give everything to politicians.

Dear readers, enough is enough. I’m usually at least somewhat polite on this blog, so take it to heart when I say this. Grasp my full meaning: fuck them all. Toss them out of helicopters, or into woodchippers, or just air drop them into Somalia. I don’t even care anymore. But this is the last straw, folks.

Their short-sighted, Dunning-Kruger infused, elementary school worldview is beyond evil and into a realm of nonsensical mental masturbation. It’s a Lovecraftian horror, except it is inverted. Instead of greater beings beyond our comprehension filling us with horror because of our relative smallness, we have lesser beings whose sheer stupidity and ignorance of all sense is such that to even try to empathize with them causes headaches of massive proportions. To even comprehend such idiocy is horror-inducing.

Not that any of this is new. Back during the days of the Apollo Program, there were plenty of people lamenting that the money that sent a man to the moon would be better spent putting an end to poverty. The more things change, the more they stay the same, I guess.



Another cretinous fool on Twitter explained that Carl Sagan told us not to perform such crazy marketing stunts in space. This esteemed idiot supplied the following quotation:


So, what? Only Communists should go to space? Earth is for everybody, but space is for Communists! No Capitalists allowed. I know a lot of folks revere Carl Sagan, but this one certainly ranks among his dumbest quotes.

I started noticing a pattern. Most of the detractors of Elon Musk’s little stunt were women. Most of those singing his praises were men. That reinforces an old, but correct notion that women trend more Progressive than men. Look, I know a lot of smart women (Sarah Hoyt comes to mind right away), but at the same time there is something very, very wrong here.

Why are so many women fixated on first-order problems, without any conception of second-order effects? I mean, imagine if the folks at early Intel who pioneered CPU design had instead donated all of their money to some politically-correct victim class and bowed out? What would benefit poor people more: a few thousand dollars' worth of bread, or technological progress that eventually put the entire sum of human knowledge at their fingertips within seconds?

Our intrepid Progressive women cannot imagine the future benefits to everyone, the poor included, of opening up space to mankind. The entire future of our species, the whole universe there waiting for us, and this chick is fixated on Flint’s water supply? It’s maddening! It gives me a headache. It’s exhausting even trying to understand how a human brain could possibly operate that way.

But I see a lot of modern women who have this kind of thought process. Take a look at this feminist whopper:



Her complete inability to understand anything past her own immediate worldview is simply staggering to contemplate. This is extreme solipsism. Does she not understand what men say to each other on a regular basis? My daily conversations with my male friends would send your average feminist woman into conniption fits. They might give her seizures. She claims to want to be treated just like a man, but no woman actually wants this.

So hey, Thales, how is this related to Elon Musk launching a car into space? I’m getting to that.

I started thinking this morning that perhaps we aren’t dealing with an inability to see beyond first-order effects so much as an inability to virtue-signal them rhetorically. It’s rhetorically easy to point to a crazy publicity stunt and call it stupid. It’s rhetorically difficult to demonstrate the larger utility of the affair, the technological milestones SpaceX achieved in the process. It’s rhetorically easy to give a man a fish and say “look at me, I’m a good person for giving this man a fish!” It’s rhetorically difficult to demonstrate how your long-term plan to teach the man to fish will benefit mankind.

Virtue-signalling is subject to the rule of laziness. If there are two ways you can virtue-signal your moral superiority, Progressives will invariably pick the easiest one. Here’s your fish, come back tomorrow for another. Of what possible use is a microchip to a poor person? No cars in space. And why land a man on the moon? What a waste, am I right?

In other words, it’s easy to drop a first-order effect into a rhetorical conversation and ignore all other effects of that decision, as if everything was in isolation. Nothing in the universe works that way.

And for some reason, women appear more vulnerable to this tactic than men. The irony of the no billboards in space comment? It's usually women who are more consumerist than men. Marketing? Commercials? All targeted toward women. That's why almost every man is a doofus in TV commercials, and every woman is a wise, sage-like being of feminine supremacy. Who spends all the money? The answer to that is obvious enough. And yet, women trend anti-Capitalist? It makes no sense, until you start looking at everything from a first-order effect perspective. She wants X, and so she buys X without any thought to the greater ramifications… unless, of course, political correctness rears its ugly head, hijacks her mind, and pushes its claptrap. Then it's save the whales, don't buy fur, and no automobiles in space. Those, too, are considered without thought for greater ramifications.

Don't ask me if this is innate to female biology, or just a product of a century or so of Radical Feminist thinking and conditioning, or some combination thereof. I don't know, and my headache is too great to speculate at the moment. But whatever the cause, I'm tired of it. When I was a kid, I remember there being a little optimism left in the world. Not much, for Progressivism was already on the march even then, and had been for a long time. But still, it was worth looking up and thinking "someday, I'm going to achieve something great."

Not anymore, I guess. You can launch your car into space, and some idiotic denizen of Twitter will stand up and say “laaaaaame, you should have given your money to some Democrats instead. What are you, some kind of racist?”

Woodchipper them all. I don’t even care anymore.

Smart Homes. Dumb People.

Even though I work in technology, I often find it hard to understand the push to involve technology in everything. Recently, our refrigerator failed, for about the third time in a year. That means, of course, shopping for a replacement. These days, they sell fridges with screens embedded in them that connect to wifi and let you do things with the refrigerator. They come with embedded cameras and food-management software that tells you when you need to throw things out, or when things are approaching their expiration dates.


Maybe some folks like that stuff, or feel the need to pay more for it. More than likely, it’s a keepin’ up with the Joneses thing. You go into a house, and it has all the latest fancy gadgets and whatzits, all covered in stainless steel. Or, perhaps, the new rage “black stainless” or “dark slate” stainless. It all seems rather silly.

However, with recent revelations surrounding the Alphabet Agencies and the strong possibility that they’ve been spying on American citizens, it is no longer merely silly.

It’s utterly stupid.

Even if the Alphabet Agencies are ultimately absolved of this charge, it is clear that backdoors have been built into devices for quite some time now. And you will find that it is not merely manufacturers, software companies, and the government that are using them.

Take a gander at this: Smart TV hack embeds attack code into broadcast signal—no access required

So-called Smart TVs are becoming a problem as well, as hackers can brick them, or turn microphones and cameras (should your smart TV come equipped with them) against you. The “Internet-of-Things” is proving to be a sieve.

The hacks underscore the risks of so-called “Internet of Things” devices, the vast majority of which are given network access and computing functionalities without being adequately secured. TVs and other Internet-connected appliances almost universally lack application sandboxing and other exploit mitigations that are a standard part of computer and mobile operating systems. Even worse, most devices run old versions of Linux and open source browsers that contain critical vulnerabilities. While patches are generally available on the Internet for the individual components, manufacturers rarely give customers a way to install them on the devices in a timely way.

Think about it. When was the last time most folks even bothered to update the apps on their phones? Now consider that there are refrigerators that must now be considered in security terms. Your average John Doe does not think to update his fridge, or worry overmuch about whether it is secure.

Take the Samsung Smartcam, which recently suffered a major security vulnerability. A casual buyer is likely to trust the Samsung brand.

Consider, also, The Fappening, when various celebrity cloud accounts were hacked, and the nudes distributed across the Internet.

Now we have the proliferation of devices like the Amazon Echo, with its Alexa assistant, which are designed to listen to your commands and do things with that data. Are people going to be fastidious about checking on the security of their smart speakers?

Some of these devices, of course, automatically update themselves, and remain reasonably secure from casual hacking. But then you have to consider a different threat for those devices which are secure: the company selling you the device, or providing you the service.

Right now, there is a bill that passed Congress which supposedly allows ISPs to sell your data to the highest bidder. Here’s the catch, though, according to the EFF: these companies were already doing it.

The GOP tells us that this is a case of regulatory overreach, and they may actually be correct about this, because the existence of the regulatory regime has done little to nothing to stop this behavior from occurring. That said, I will admit right away that the optics of this bill are very worrisome.

But whether or not the bill will have an effect, positive or negative, the fact remains that your service providers have already been caught selling this data, or using it in ways you didn’t expect. You can’t trust them.

Now, imagine they have your browsing history, they know how much food is in your fridge, what you watch on TV, who you call, and who you text… "Go buy some more Pepsi," says the ad on your fridge, "because we know you're out."

This is a gold mine, for companies, for government, for Alphabet Agencies within the government (who may very well be at odds with the elected government), foreign governments (the Left likes to blabber about Russia, but I’d be more concerned about the Chinese), and for black hat hackers looking to screw you over.

Is all of that risk really worth your fridge telling you that three-week-old leftover Chinese takeout should go in the garbage? I'd argue not. Do a simple risk/reward calculation on this. It's not worth it.
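That calculation fits in a few lines. Every number below is a made-up illustration of the shape of the trade-off, not data:

```python
# Toy risk/reward sketch for a "smart" fridge. All figures are
# illustrative guesses, not measurements.

def expected_cost(breach_probability, breach_cost, annual_benefit):
    """Net annual value of the smart features: benefit minus expected loss."""
    return annual_benefit - breach_probability * breach_cost

# Guess: expiration alerts save you $20/year in spoiled food, a
# compromised home network costs $2,000 to clean up, and an unpatched
# appliance has a 5% chance per year of being popped.
net = expected_cost(breach_probability=0.05, breach_cost=2000, annual_benefit=20)
print(net)  # -80.0 -- the expected loss swamps the convenience
```

Plug in your own guesses; the convenience number has to be huge, or the breach odds tiny, before the math turns positive.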

So what do you do? Here are a few ideas:

1. Buy “Dumb” hardware. Dumb fridges, dumb TVs (or buy Smart TVs where the “smart” portion can be disabled – at the very least, don’t connect it to wifi).

2. If you must have Netflix, Hulu, Amazon Prime, Kodi, Plex, or anything similar on your TV, consider getting a separate device like a Fire Stick, or a Roku, or a “Compute Stick” from Intel. They are cheap, and if a hacker bricks it, at least you aren’t out a whole TV. Power it off when not in use. Occasionally clear it, reset it back to factory specs and reload your apps.

3. Clear your phones of pretty much everything extra installed by the manufacturer. If you've some technical skill, consider wiping the OS and installing from scratch. CyanogenMod used to be my preferred choice in the Android ecosystem. It's gone now, but LineageOS was forked from it in the dim mists of Android history. Consider that. If you don't have the skill (don't even try it if you question this), just clear everything optional you can from the phone.

4. Use proxies for your Internet browsing. Tor is reasonably easy to use these days.

5. Make sure you carefully screen new applications and software for possible hidden monitoring. Companies like to bury this in their disclaimers. Usually you can find information on the software you want to use on the Internet.

6. Don't buy any of those smart home systems or "smart speakers" like the Amazon Echo. That's a disaster waiting to happen.

7. If you don’t have a very compelling reason to buy any “smart” device, don’t do it.

8. Make sure you use strong passwords, both on your accounts and on your wifi router.
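On that last point, a naive entropy estimate is a quick way to sanity-check a password's brute-force resistance. This is a rough sketch, not a substitute for a password manager, and it overestimates for dictionary words; treat it as a floor check:

```python
import math
import string

def entropy_bits(password: str) -> float:
    """Rough brute-force entropy: length * log2(size of character pool used).
    Overestimates for dictionary words -- a floor check, not a guarantee."""
    pool = 0
    if any(c in string.ascii_lowercase for c in password):
        pool += 26
    if any(c in string.ascii_uppercase for c in password):
        pool += 26
    if any(c in string.digits for c in password):
        pool += 10
    if any(c in string.punctuation for c in password):
        pool += len(string.punctuation)  # 32 ASCII symbols
    return len(password) * math.log2(pool) if pool else 0.0

print(entropy_bits("hunter2") < 60)            # short lowercase+digit: weak
print(entropy_bits("v8#Kq2!zL0pW@r5m") >= 80)  # long, mixed charset: strong
```

Length matters far more than cleverness here; a long passphrase beats a short "leetspeak" word every time.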

This won't stop every possible way someone with malicious intent could screw with you, but it will severely limit the damage. In the same way that a car with a few anti-theft devices deters casual thieves, this will deter casual data theft, spying, and hacking.

The Internet of Things is a spaghetti strainer when it comes to security. It's a mess. Best not to dive too deep into it, if you can avoid it. After all, three-week-old Chinese food is generally pretty good about notifying you it's gone bad all on its own.

Some Humor for the Day: A Power Supply Review

So the vendor kept notifying me that I should review my computer part purchases, which I did… except for the power supply. Because it’s a power supply. Really, who cares? The vendor, apparently. So I decided to give the vendor a review for the product (we’ll see if it passes the censors), but Hell, why not have a little fun, right? Be careful what you wish for…


Pros:
– It's a power supply
– It works

Cons:
– It's a power supply
– It didn't come with a bevy of hot girls

Other Thoughts:
In my lifetime, I’ve reviewed a lot of tech products. And for some of them you might discuss the performance metrics, the compatibility issues, or even how much LED bling is plastered all over it — because, as everybody knows, the PC market desperately needed to turn into a close facsimile of the ricer community. Next, these folks will put chrome tips on their exhaust ports, and some enterprising wannabe Jedi will come along to deliver a proton torpedo straight up the…

…Well, you get the idea.

So what can I say about this power supply? The Corsair RMX850X works properly. It has modular cables, which are black, and that's good, because we all know what system building was like during the days of IDE cables and power-supply rat's nests. What idiot thought up sticking random useless cables on every power supply they sold, anyway? And why did they have to come in multicolor, like Picasso smoking weed and throwing up all over the canvas? "What's that a painting of, O master of incomprehensible art?" "Why, my young apprentice, some time in the distant future, engineers will make ratty looking contraptions called 'power supplies' that will look something like multicolored wire vomit." "O Master… can I have some of that [expletive deleted] weed you've been smoking?"

But hey, you buy this power supply, and you don’t have to deal with it. The wiring is so black, it’s speaking Samuel L. Jackson to me. “I’m tired of these [expletive deleted] snakes on this [expletive deleted] plane!” That’s right, you buy yourself an RMX850X, and your PSU is Samuel L. Jackson.

Now, let me tell you what you get when you open the box. This power supply is so Samuel L. Jackson, it comes in a fancy black drawstring bag. So you've got a perfect place to store your bling, and your cash, if you ever take a job doing whatever Samuel L. Jackson was doing in Pulp Fiction. Not every power supply comes in a velvet bag, you know. And then you get some cables, which are nice if you actually plan to build a computer with it.

The voltages looked good. But unless you're buying bargain-basement stuff that even the Chinese outsource because it costs too much to make, you'll get decent voltages. So that's kind of expected these days. I guess there are no real cons with the RMX850X, except that when you buy a Samuel L. Jackson power supply that comes in a velvet bag, inside a box *that* big, you kind of expect more. Like, if you opened that bag and a bevy of hot girls modeled your brand new PSU, delivered on a silver platter, with angels singing Pulp Fiction in the background: "Blessed is he, who in the name of charity and good will, shepherds the weak through the valley of darkness, for he is truly his brother's keeper and the finder of lost children."

But Corsair didn’t see fit to supply said hot bikini girls, for which I am mildly disappointed. For a moment, I thought I was going to get a cosmic experience, greater understanding of the universe, and some scantily clad supermodels begging to date me, because I was awesome enough to choose Corsair for my PSU business. And all I got was a working power supply in a fancy velvet bag.

But hey, it’s a good power supply.

New PC Build

Browsing around Liberty’s Torch, as I often do, I am reminded of something in my own life. Francis discusses those who don’t look very far and those who do. And he references Zen and the Art of Motorcycle Maintenance.

Oddly enough, this is a book I’ve never read, something I ought to rectify one of these days. My reading list is always long, and always growing faster than I can read. But perhaps this merits skipping a few, because I am told that my view of computing is much like the view expressed in the book.

I like to understand things, and get into the nitty-gritty. Nothing frustrates me more than being helpless, not understanding what is going on, having to rely on someone else not merely to do a thing, but to understand that thing.

You see, when you know something well enough, but contract it out to another because you do not have the time to deal with it, you know what a fair price for the work might be. You are not ripped off or taken advantage of.

For example, I know how to change the oil in my car, but I often pay someone else to do it, because I don’t have the time. Nonetheless, knowing how to do it means I am not ripped off, and if I do have the time, I can do it myself.

Although I must say, I find working with machines to be quite therapeutic most of the time. Even when they frustrate me, I am, paradoxically, enjoying myself. My wife would say that when I am cursing at the machine the most, I am also the happiest.

Lately, I decided to build a new computer. It’s about time, as the last new build was back in 2011, and for me, that is a very long time. I’ve been building my own machines since the mid-90s, when I put together a bizarre 486-based system out of leftover parts from my father’s computers.

Since then, I'd usually do a new build every 2 or 3 years, with small refreshes in between (usually a GPU or RAM upgrade). Recently, the release of AMD's Ryzen CPU gave me the motivation to finally upgrade again, the prospect of building a relatively inexpensive, well-performing 8 core/16 thread machine making it worth the price.

Only this time, the build was smooth as butter. No problems. In a way, it was almost disappointing. I say almost, in case the computing gods are in a mischievous mood. There were no logic problems to solve. The build was easy, the OS installed on the first attempt (that's a rarity), and given how fast SSDs are these days, the whole machine was fully up and running in a couple of hours.

Still, it was fun, and I enjoyed it. I often wonder why so many people seem almost afraid of learning how things work, how to work on them, or understand them. I see people afraid of changing a tire, or utterly flabbergasted by the simplest of computer issues. They don’t know how to wire an outlet in their home, or install a ceiling fan.

They’ve no clue how to do much of anything, really, and many of them are, paradoxically, proud of their ignorance. I’ve met people who laugh about being unable to balance their checkbooks. It’s utterly bizarre. Fixing things, building things, doing things… these are not the tasks of the anointed, I suppose.

They are romantics, I guess. The world is just supposed to work the way they want it to… because.

Either way, I had fun. Enjoy some pics of the build, if you like computer pr0n:




Technology Post: AMD Ryzen Launch

Today, we finally have competition again in the desktop CPU market. It has been far too long since that claim could be made. For me, personally, this is very welcome news. I really abuse computers with my work, everything from running local virtual machines for development, to Photoshop, video editing in After Effects, and 3D Rendering. Then, of course, I still game on occasion.

Back in the day, I would build a new computer every 2 or 3 years. That was about the time when a build would start to have some issues, mainly because it was so heavily abused, and also the point at which performance increases would justify the cost. Every 2 or 3 years, I could double performance with a new build.

That hasn’t been the case for a long time. The current machine I’m running was built in 2011, and since then, the only component that’s seen an upgrade is the GPU. Back when Bitcoin mining was a thing, I picked up a pair of Radeon 7950s, ran them in Crossfire mode, and made a nice tidy profit from that.

The reason I haven’t upgraded can basically be laid on the doorstep of Intel and AMD. AMD stopped competing in the high end CPU space sometime around 2008 or so, and by 2011’s Bulldozer release, had become something of a laughing stock in the enthusiast market. Perhaps they had simply given up competing, or maybe a series of critically bad design calls had been made. Whatever the reason, this relegated AMD CPUs to the low-end market. This, in turn, made Intel very lazy. The i7-2600k in this build remained a competitive processor for a very long time. Even today’s i7-7700k is, perhaps, 40% faster. For 6 years of CPU development, that is really poor.
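To put a number on "really poor," the arithmetic behind that complaint is easy to check, using the rough 40%-in-6-years figure above:

```python
# Annualized performance gain implied by "40% faster after 6 years",
# versus the old cadence of roughly doubling every 2-3 years.
years = 6
total_gain = 1.40  # i7-7700k vs i7-2600k, per the rough figure above

annual = total_gain ** (1 / years) - 1
doubling_fast = 2 ** (1 / 2) - 1   # doubling every 2 years
doubling_slow = 2 ** (1 / 3) - 1   # doubling every 3 years

print(f"{annual:.1%}")                                # 5.8% per year
print(f"{doubling_slow:.1%} to {doubling_fast:.1%}")  # 26.0% to 41.4% per year
```

Under 6% per year, against the 26-41% per year the old cadence delivered. That's the monopoly tax in one line of math.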

Meanwhile, my old build has been brutalized for nearly 6 years, and components are starting to fail. I used to have 24GB of RAM in this machine. Two DIMMs died a few months back, so I'm down to 16GB. The power supply is… twitchy, and the system as a whole just feels less stable by almost every measure, for which blame can probably be laid upon the motherboard. This system is giving me indications that I've a time limit on how much longer I can beat it to death. It is quite unusual to be in this position as an enthusiast. Usually a system is replaced for performance reasons before components just start to fail. Yes, even with ridiculous overclocking schemes.

Meanwhile, the only Intel options that looked interesting were completely unaffordable. The i7-7700k is pretty well priced, of course, but it's a 4 core/8 thread CPU like my 2600k, and the IPC and clock speed improvements are not very impressive. Meanwhile, Intel released the i7-6900K 8 core/16 thread CPU, and the i7-6950X 10 core/20 thread monster. At least in development and content creation, these would be more than twice as fast as my 2600k. But they cost around $1100 and $1600, respectively, and require a much more expensive motherboard to boot.

Intel has enjoyed the fruits of being an effective monopoly in the high-end CPU space. High prices and unimpressive performance gains were the order of the day. On the other hand, I have to think that even as a monopoly, this behavior was kind of shortsighted on Intel’s part. After all, they could have sold me a CPU or two in the last few years if they hadn’t acted this way. How many sales were lost because people didn’t see any persuasive reason to upgrade?


About time, AMD. Where were you for the last decade?


Fortunately, for whatever reason, AMD has decided to reenter the high-end CPU market with the Ryzen 7 series. I won't go into too much detail on the benchmarks, as others have beaten that horse to death over the last week, but the verdict is really fascinating. The Ryzen 1800x is nearly as fast as the high-end Intel 6900K and 6950X chips. In content creation, it appears to be slightly ahead of the 6900K, and somewhat behind the 6950X, which makes sense given that the latter is a 10 core part, and the Ryzen 7 line is merely 8 core/16 thread.

Nonetheless, this is very impressive, because even the most expensive Ryzen CPU is $499. The 1700 and 1700x are, of course, even less expensive. The 1700x may be the sweet spot, in that a modest overclock will grant performance on par with the 1800x.

In gaming, the verdict is more mixed. Ryzen is competitive with the Intel 8 core chips in gaming, but not as competitive with the i7-7700k 4 core part. The reason for this is, of course, that games are less optimized for multi-threading. Gaming tends to rely on low latency, L2 and L3 cache, and high clock speeds more than anything. So the higher-clocked 7700k actually beats out its 8 core siblings and the Ryzen lineup by around 15% or so, across the board, in CPU-limited scenarios. There is a lot of speculation surrounding Ryzen performance in gaming, however. Brad Wardell, the CEO of Stardock, had this to say on the matter:

“Oxide games is incredibly excited with what we are seeing from the Ryzen CPU. Using our Nitrous game engine, we are working to scale our existing and future game title performance to take full advantage of Ryzen and its 8-core, 16-thread architecture, and the results thus far are impressive. These optimizations are not yet available for Ryzen benchmarking. However, expect updates soon to enhance the performance of games like Ashes of the Singularity on Ryzen CPUs, as well as our future game releases.”

Ryzen may actually be able to pick some low-hanging performance fruit in this area. Its spectacular content creation and synthetic benchmarks show that AMD is not lying about the CPU's raw performance, as they did in the past with the utterly garbage Bulldozer and Piledriver lineups. So the fact that developers have been optimizing more or less exclusively for Intel's chips for nearly a decade – because AMD's offerings were usually quite poor – may have handicapped Ryzen in those gaming benchmarks. If so, we should expect to see a modest increase in gaming performance in the coming year. We can also expect that as AMD pushes higher core count CPUs, developers may attempt to use that horsepower. Up until now, there has been no reason to bother trying – most mainstream CPUs in gaming machines were 4 cores or less.

There were two other teething problems for Ryzen. First, the Windows 10 scheduler conflates Ryzen's logical SMT threads with physical cores, and attempts to give both a similar workload. In workstation applications, this is more or less a non-issue. In games, however, it has become a handicap for Ryzen. AMD claims to be working with Microsoft on updates to the Windows 10 scheduler to fix this problem. Ironically enough, Windows 7 does not have this issue (it's also not officially supported on Ryzen, though). Gaming benchmarks in Windows 7 appear to be significantly better as a result, with early adopters seeing a roughly 6% improvement over the Windows 10 scheduler. So if Microsoft deigns to fix this, which is by no means guaranteed, we should see Ryzen gaming performance jump a little.
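The core-versus-thread distinction the scheduler has to get right is visible in how Linux enumerates CPUs. Here's a small sketch that parses /proc/cpuinfo-style text and groups logical processors by the physical core they share (Windows exposes the same topology through its own APIs; the distinction is identical):

```python
from collections import defaultdict

def core_topology(cpuinfo_text: str) -> dict:
    """Group logical processors by (physical id, core id) -- the distinction
    a scheduler must respect. Parses /proc/cpuinfo-style text."""
    cores = defaultdict(list)
    proc = phys = core = None
    for line in cpuinfo_text.splitlines():
        if ":" not in line:
            continue
        key, _, value = line.partition(":")
        key, value = key.strip(), value.strip()
        if key == "processor":
            proc = int(value)        # logical CPU number
        elif key == "physical id":
            phys = int(value)        # socket
        elif key == "core id":
            core = int(value)        # physical core within the socket
            cores[(phys, core)].append(proc)
    return dict(cores)

# Two SMT threads sharing one physical core look like this:
sample = """processor : 0
physical id : 0
core id : 0

processor : 1
physical id : 0
core id : 0
"""
print(core_topology(sample))  # {(0, 0): [0, 1]} -- one core, two logical CPUs
```

A scheduler that treats those two logical CPUs as independent cores will happily load both while other physical cores sit idle, which is exactly the complaint against early Windows 10 behavior on Ryzen.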

But make no mistake: Ryzen still loses in this area. If you're building an exclusively gaming machine, Ryzen doesn't change the game for you. Intel is still your best play. Where Ryzen gets interesting is the mixed-use scenario. If you do development, content creation, rendering, encoding, etc… and you game, Ryzen offers you 6900K-level performance for about a third of the price.

The last hiccup is the memory controller. Unusually for a CPU that is otherwise very fast and competitive with the Intel parts, the memory controller is rather weak. It is dual-channel only, and getting maximum memory bandwidth, at the moment, requires using only two DIMMs (if you attempt to use four, you will lose memory clockspeed), and single rank memory. Using two DIMMs is generally not an issue at the moment, but the single rank memory issue is a major problem. It takes a lot of research to find out which memory kits are single rank, as this is not commonly listed in the specifications. The usual method is to visually inspect the memory. If the chips are on one side only, you can be about 99% certain it is a single rank part. But this is hard to do when shopping online, as the heatspreaders cover the chips, and the spec sheet doesn’t list whether it’s single rank or not. This is explained in great detail at Legit Reviews.
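For RAM you already own, Linux can tell you the rank directly, no heatspreader removal required. A sketch that pulls the Rank field out of `dmidecode -t memory` output (run as root; the field layout is assumed from recent dmidecode versions, so verify against your own output):

```python
import re

def dimm_ranks(dmidecode_output: str) -> list:
    """Extract the Rank value from each populated Memory Device section
    of `dmidecode -t memory` output. Rank 1 = single rank."""
    ranks = []
    for section in dmidecode_output.split("Memory Device"):
        m = re.search(r"^\s*Rank:\s*(\d+)", section, re.MULTILINE)
        if m:
            ranks.append(int(m.group(1)))
    return ranks

# Abridged example of what the tool prints (real output has many more fields):
sample = """Memory Device
        Size: 8192 MB
        Locator: DIMM_A1
        Rank: 1
Memory Device
        Size: 8192 MB
        Locator: DIMM_B1
        Rank: 2
"""
print(dimm_ranks(sample))  # [1, 2] -- the first DIMM is single rank
```

That only helps after the fact, of course; when shopping, you're still stuck with the QVL and vendor spec sheets.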

Memory problems may change in the near future, as more UEFI code comes out of AMD, and they optimize the memory controller somewhat. For now, if you are building a Ryzen box, be sure to check the motherboard maker’s QVL (Qualified Vendor List) for compatible memory. Asus, at least, has done a lot of research on this, and has even specified which memory kits are single rank, and thus best for Ryzen. The memory manufacturers themselves often don’t even do this, or bother to specify it, so kudos to Asus for taking the time to do that right.

Take careful notice of the memory speeds and latencies available to you, as Ryzen’s weaker memory controller and lack of quad-channel support make it much more sensitive to RAM speeds than Intel’s HEDT chips. Maximize memory performance to avoid a bottleneck here.

What we have here is a part from AMD that occupies a unique market position. It was a brilliant move on their part. If you want the fastest gaming-only chip, the i7-7700k is still your best bet, as its higher clock speed gives it an unbeatable advantage here, and having 8 cores doesn't do much for you in gaming (yet, anyway). Four is enough there, for now at least. If you want the absolute fastest workstation chip, the i7-6950X still holds that crown… if you have $1600 to pay for it.

But what if you’re like me, and your primary computer is a mixed-use machine? Something that sees use as a graphics and video workstation, a gaming box, and even, on occasion, a testing and development server? There Ryzen shines. It’s cheaper than the Intel workstation chips by a huge margin, while offering broadly similar performance, and it handles games just fine, even if it’s not quite the fastest there either. It won’t double the performance of my 2600k in gaming, but it will more than double it in workstation and dev duty. And it will dominate the 7700k in workstation and dev duty.

So you sacrifice maybe 10-15% of gaming performance, and even then only in situations where you are not GPU bound, and gain 50%+ in productivity and content creation compared to the 7700k. That’s a trade I’d make all day long. I suspect a lot of folks will be thinking similarly.

AMD created what’s probably the best general-use CPU on the market today.

So I’ll be building a Ryzen box this time around. But even if you don’t want to roll the dice with AMD, I imagine Intel will feel some competitive pressure again, and maybe we can get back to the market working like it ought to.

Ironically, however, where my last build used an Intel CPU and an AMD GPU, my new build will be the reverse: an AMD CPU and an Nvidia GPU (the GeForce GTX 1080 Ti is the king right now). Maybe AMD should apply the same level of dedication they brought to the Ryzen project to their next Radeon release (they claim the Vega release will be good – but we'll have to see). Either way, though, folks ought to be thanking them for giving us an alternative to Intel that doesn't require sacrificing your first-born son.

Update: A great explanation of what’s going on with the mixed gaming performance from Ryzen. As it turns out, the decision by AMD to split the CPU into two CCXs (Core Complexes) created a latency issue between the two core complexes when there is a lot of thread-to-thread communication. It looks like this is actually a pretty easy fix, overall. Windows 10’s scheduler needs to treat them almost like two quad-core chips, rather than a single octa-core chip, which ought to distribute the workload better. I remember this was an issue for early Intel quad-core chips, which similarly split into two complexes of two cores each.

This video explains it:

This means a Windows update should fix most of Ryzen's bizarre split personality. Let's hope Microsoft actually bothers.
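To make "treat them almost like two quad-core chips" concrete, here's a sketch of the affinity math involved: compute which logical CPUs belong to one CCX, then pin a latency-sensitive process there so its threads never pay the cross-CCX penalty. The CPU numbering is an assumption (logical CPUs 0-7 as physical cores, 8-15 as their SMT siblings); check /proc/cpuinfo on a real box, and note the pinning call shown is Linux-only (Windows would use SetProcessAffinityMask).

```python
import os

CPUS_PER_CCX = 4  # Ryzen 7: two CCXs of four cores each

def ccx_cpus(ccx_index: int, smt: bool = True) -> set:
    """Logical CPU numbers belonging to one CCX, assuming CPUs 0-7 are
    physical cores and 8-15 their SMT siblings. Enumeration varies by
    kernel and BIOS, so verify on real hardware."""
    physical = set(range(ccx_index * CPUS_PER_CCX, (ccx_index + 1) * CPUS_PER_CCX))
    siblings = {cpu + 8 for cpu in physical} if smt else set()
    return physical | siblings

# Keep a latency-sensitive process entirely on CCX 0 (Linux only):
# os.sched_setaffinity(0, ccx_cpus(0))
print(sorted(ccx_cpus(0)))  # [0, 1, 2, 3, 8, 9, 10, 11]
print(sorted(ccx_cpus(1)))  # [4, 5, 6, 7, 12, 13, 14, 15]
```

A scheduler doing its job makes this hack unnecessary, which is the whole point of the hoped-for Windows fix.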
