Several years ago I thought X5 was a pretty good, stable version. But then a Windows update (if I remember correctly, it was a big update to Windows 8.1) totally "broke" X5, which forced an upgrade to X6, the first version of CorelDRAW to require activation.
A fresh install of X5 actually worked on 8.1, but an upgrade, for whatever reason, broke the install on my end. An upgrade to 8.1 also borked the install of Ps CS6 on my mom's computer. Now a fresh install of one of my digitizing programs on 8.1 didn't work out either, but it did give an error message telling me to move a specific DLL file (and into what location), and that fixed the install.
I fear the policy of ending traditional perpetual license upgrades, particularly the loss of any setup where users can skip a lame, buggy, or just plain bad version, is really going to come back around and do severe damage to Corel itself. They're likely to see many users leave and adopt rival applications in response, even if those applications represent a serious downgrade.
This isn't a Corel-specific issue. Any program that has adopted the subscription-based approach has this problem. Even Adobe carries this risk (I'm not saying they have shipped buggy versions, just that the risk is still there; Corel simply had the unfortunate reality of having a buggy version on both platforms), especially since they instituted the "no version older than x - 1" policy. I know the public reason why, and that doesn't matter; the point is that the policy exists now. Although, if they have removed the offending blob of Dolby code in these later versions, I don't know why they couldn't just say "no version older than x" instead, but oh well.
Although, it's not just buggy versions that people want to skip; it's also versions that have deprecated and removed functionality a user may want or need. The program itself may not be buggy, it may just have lost functionality a user depends on, yet that user is now forced to upgrade.
I suppose if worse came to worst you could create a virtual Windows 7 machine on a partition of a new computer and run X5 that way.
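For anyone who wants to try that route, here's a minimal sketch of scripting VirtualBox's VBoxManage CLI to stand up a Windows 7 guest for legacy software. The VM name, disk size, and ISO path are just placeholders I made up, and it assumes VirtualBox is installed and on your PATH and that you have your own Windows 7 install media and license; adjust to taste.

# Sketch: create and boot a Windows 7 VM with VirtualBox's VBoxManage CLI.
import subprocess

VM = "Win7-X5"            # hypothetical VM name
DISK = "win7-x5.vdi"      # hypothetical virtual disk path
ISO = "Win7_install.iso"  # placeholder path to your own install media

def vbox(*args):
    # Run a VBoxManage subcommand and stop on the first failure.
    subprocess.run(["VBoxManage", *args], check=True)

# Register the VM and give it reasonable (not minimum) resources.
vbox("createvm", "--name", VM, "--ostype", "Windows7_64", "--register")
vbox("modifyvm", VM, "--memory", "4096", "--cpus", "2", "--vram", "128")

# Create a 60 GB virtual disk and attach it plus the install ISO.
vbox("createmedium", "disk", "--filename", DISK, "--size", "61440")
vbox("storagectl", VM, "--name", "SATA", "--add", "sata", "--controller", "IntelAhci")
vbox("storageattach", VM, "--storagectl", "SATA", "--port", "0",
     "--device", "0", "--type", "hdd", "--medium", DISK)
vbox("storageattach", VM, "--storagectl", "SATA", "--port", "1",
     "--device", "0", "--type", "dvddrive", "--medium", ISO)

# Boot the VM, run the Windows 7 installer, then install X5 inside the guest.
vbox("startvm", VM)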
A properly spec'd-out computer for VMing can be just as good as having the OS on bare metal. I can run VMs within VMs, and that nested VM runs damn near at parity with bare metal, even better than some people expected, though the bar was low in their minds; I never had a problem with it back in the day.
One doesn't want to skimp on the specs of a computer that's going to be used for VMing. VMing can give you damn near native performance (especially with what VMs are now capable of doing) as long as you get the appropriate computer for the job. Don't just shoot for minimum requirements: if you skimp on the specs, VMing more often than not turns into a worst-case scenario. Not all the time, but more often than not.
The one downside, especially if you're VMing to use current and still-supported software, is that most software vendors will not support a program they are told runs in a VM. Programs developed not only in low-level languages but in languages that target specific platforms are designed to run on bare metal, and because of that most vendors won't support a program running in a VM, even a current version. So if support is a big thing for you and you can't or won't do web searches to find answers and workarounds, stick with bare metal.