Generally I just don't trust computers for storing any important data, regardless of whether the computer is connected to the Internet or not. I'm pretty anal retentive about backing up data I want to protect/keep on 2 or more external hard disc drives. If you don't have the data stored safely in 2 or more places, you don't have it stored. That includes keeping a copy stored off site. You can make multiple back-ups, but all of those back-ups will be lost if your workplace burns to the ground.
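For what it's worth, a minimal sketch of what I mean looks something like this (Python, with hypothetical paths; a real setup would use proper backup software or rsync rather than a plain copy):

import shutil
from pathlib import Path

# Hypothetical locations; adjust to your own working folder and drives.
SOURCE = Path("/home/me/work")
BACKUPS = [Path("/mnt/backup_drive_1/work"),   # first external disc
           Path("/mnt/backup_drive_2/work")]   # second external disc (rotate one off site)

def mirror(src: Path, dest: Path) -> None:
    """Copy the working folder to a backup destination, replacing the old copy."""
    if dest.exists():
        shutil.rmtree(dest)        # drop the previous snapshot
    shutil.copytree(src, dest)     # full copy; fine for modest data sets

for target in BACKUPS:
    mirror(SOURCE, target)
    print(f"Backed up {SOURCE} -> {target}")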
When I was talking about keeping it off the Internet, I was talking more about software, not necessarily working files. While this update seems to affect files more than anything, I was just using it as yet another reason why always installing updates is not a good thing.
It's bad enough that a Not Ready For Prime Time operating system update can gum up the works. Windows is not the only OS afflicted by this. But hardware failures are always a threat too. If I have to keep art/production files on a disc inside a work computer, I usually make sure it is a different physical hard disc from the one holding the operating system. If an errant update or something else hoses your OS, it's not hard to start over with a clean "system restore." The only thing you lose is maybe a couple hours of time.
No, updates are a double-edged sword for any OS and/or program (depending on what is getting updated). I do believe that Windows tends to be the most afflicted, which I can understand to a point. Forcing updates erodes that understanding, though.
2 hrs for a system restore? Man, I'm truly loving not running Windows (or Mac) on bare metal anymore. Within 2 hrs, even on a fresh OS install, I'm firmly back to work; it only takes me 20-30 minutes to get everything up and running after a fresh install. If I were better at scripts, I could probably trim that down even further.
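As a rough idea of what such a post-install script could look like (a sketch only; the package list is hypothetical and an apt-based distro is assumed):

import subprocess

# Hypothetical list of what I'd want back after a fresh install.
PACKAGES = ["inkscape", "gimp", "scribus"]

# Assumes a Debian/Ubuntu-style system with apt; swap in your distro's package manager.
subprocess.run(["sudo", "apt-get", "update"], check=True)
subprocess.run(["sudo", "apt-get", "install", "-y", *PACKAGES], check=True)
print("Base production apps installed.")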
External flash drives are popular for their portability, but they absolutely stink for storage reliability. I won't spend any more than what it costs for a modest 32GB USB stick. There's no telling when one of those things will go bad. One stick will work like a champ for several years while another goes bad in less than a couple of months. That even goes for the highest-priced "ultra" versions.
Jump drives are designed to transport files from point A to point B, not really for long-term storage. Having said that (I do get higher storage capacities than what you mention), I do use jump drives for portable programs (all my production programs are portable and sand-boxed) and for large files. If I'm traveling with the family, most hotel TVs have USB ports, so I'll use them for video files as well. I also tend to run live OS installs from USB sticks.
If you use Adobe software, particularly Creative Cloud, there's little choice but to have your computer connected to the Internet. The Typekit service (which is a pretty awesome bonus) only works with an "always on" Internet connection, and the rest of the software you install has to be able to occasionally "phone home." Other applications are doing some of the same stuff. There are a lot of cloud storage services that demand Internet connections. I use an iPad Pro and Adobe's "mobile" apps, and I can only use my Creative Cloud folder, Dropbox folder or iCloud folder to move data between my iPad and work computers. That requires an Internet connection.
I stopped with CS6, and I stopped using Windows when Win 10 came out. I only have one tablet (a Cintiq) that has Windows installed on bare metal. All of the others run Linux, and I do have a VM or two running Windows (isolated from any outside connection).
All the applications I run are portable apps (AppImages, even for programs that don't come in a portable Linux version from the devs, like Inkscape; the joys of open source), and I have them further sandboxed beyond that. The libraries they use are bundled in the AppImage itself, so they don't even call on the system libraries.
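To give a rough idea, launching an AppImage inside a sandbox can look something like this (a sketch only, assuming firejail as the sandbox; the AppImage path is hypothetical, not my exact setup):

import subprocess
from pathlib import Path

# Hypothetical AppImage location.
APP = Path("/home/me/apps/Inkscape.AppImage")

# firejail's --appimage mode mounts the image and confines the process;
# the AppImage itself already carries its own libraries.
subprocess.run(["firejail", "--appimage", str(APP)], check=True)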
I tend to use my own NAS storage; I'm not a fan of 3rd party cloud backups. Maybe as a tertiary redundant backup, but not as one of the 2 copies you need for something to count as backed up. I run NAS boxes at my place (3 of them, actually), at my sister's, my niece's and my parents'. The one at mine does not read NTFS file systems (I made sure of that), so any Windows computers that may be at my place won't even pick it up as a DLNA server.
The payload can be delivered via a USB memory stick or CD-R provided by a customer. Or it can infiltrate your home or work network.
Those are a non-issue, as I don't get jump drives or CD-Rs from customers (and with optical drives going by the wayside, I doubt CD-Rs will be much of an issue for most people for much longer).
Malware can certainly attack your desktop PC (by which I mean any x86 computer that can perform tasks locally, which does include Macs), but that same malware can go after your router as well. I'm more worried about IoT vectors than about traditional means, as end users are totally at the mercy of the vendor to support those devices and keep them up to date.
Having said that, I actually have 2 networks at my place: one has WAN access, and anything that comes in there is scanned, put on a USB drive and then moved to a computer that is part of the LAN-only network. Those are the production computers. The process is reversed if a file heads back out.
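Roughly, the scan-and-transfer step amounts to something like this (a sketch assuming ClamAV's clamscan is installed; the paths are hypothetical):

import shutil
import subprocess
from pathlib import Path

INCOMING = Path("/home/me/incoming")     # files pulled down on the WAN-side machine
USB = Path("/media/me/transfer_stick")   # the stick that walks over to the LAN-only box

for f in INCOMING.iterdir():
    if not f.is_file():
        continue
    # clamscan exits 0 when a file is clean, 1 when it finds something.
    result = subprocess.run(["clamscan", "--no-summary", str(f)])
    if result.returncode == 0:
        shutil.copy2(f, USB / f.name)
        print(f"Clean, copied: {f.name}")
    else:
        print(f"Flagged, NOT copied: {f.name}")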
With the USBs and the CD-Rs and how those are handled, I blame MS on that one. Over the years, they have always put convenience over security: for USBs, CDs, even printers. I have things set up so that there are some hoops to jump through just to change a printer setting from B&W to color.
If I got hit with a ransomware attack right now I would only be angry and annoyed for the time it takes to reformat the hard drive and re-install software. I wouldn't lose any important data.
That's really the way it should be. Ransomware should not be a thing. Period.
The fact that it was able to cripple hospitals and businesses kinda ticks me off. That tells me their IT staff wasn't doing their job.