Maybe one day every platform will be as secure as Apple

The Biden administration’s recently updated National Cybersecurity Strategy document appears to reflect some of the approaches to cybercrime Apple already employs.

Take privacy, for example. The proposal suggests that privacy protection will no longer be something big tech can argue against – companies will be required to prioritize privacy. That’s fine if you run a business that does not require wholesale collection and analysis of user information, which has always been Apple’s approach. The best way to keep information private, the company argues, is not to collect it at all.

That approach isn’t absolute; you don’t need to probe Apple’s activation servers very hard to recognize that at least some information about you and your devices is visible. But most of your personal information is not. Apple’s recent decision to extend the protections it makes available to iCloud data also seems to reflect some of the commitments made in the NCS document.

Just as App Store apps are required to disclose privacy policies and admit what they do with your information, the new security strategy will require software makers and service providers to take much more responsibility for the security of their products.

“We must rebalance the responsibility to defend cyberspace by shifting the burden for cybersecurity away from individuals, small businesses, and local governments, and onto the organizations that are most capable and best-positioned to reduce risks for all of us,” explains a White House briefing statement.

Apple’s track record of building a secure platform shows that it is possible to create and maintain such platforms. And while security protection is never perfect, the fact that the company has managed it at all means other companies can follow suit.

That (and more) is effectively what the new proposals require. As you might expect, this is prompting pushback from some industry players, since it means they will be held responsible if their software or services are found to be vulnerable.

The Information Technology Industry Council, for example, seems to think these arrangements threaten the private contracts made between developers and customers.

At the same time, as CNN reports, the proposal reflects what the US government sees as a failure by market forces to keep the nation safe. Light-touch regulation should not equate to complacency. There’s also the argument that negligence isn’t always the reason security protections fail.

Aaron Kiemele, CISO at Apple-focused MDM and security company Jamf, says: “All software is vulnerable in some way to future exploitation. If a new issue arises and causes widespread impact, that doesn’t mean that the software vendor was negligent. You can do everything right and still be impacted by a security incident.

“That being said, there are plenty of old vulnerabilities that remain unpatched for years as well as companies that are truly not prioritizing security and privacy,” he said. “How to take the outcome (often a poor indicator of the underlying security capabilities of the company) and drive reform without this becoming a punitive punishment for a security environment that cannot reasonably be predicted is going to be tricky.

“The most interesting piece for me continues to be that this sounds like a good-faith effort to impose appropriate liability on software companies who are not currently doing the right thing to protect their data and their customers,” said Kiemele.

“It will be nice to be held to account more fully knowing that we will be rewarded for our good practices while others in the industry will be required to do the bare minimum to secure the digital ecosystem.”

Jamf last year launched a fund to invest in Apple-related security start-ups.

Apple’s robust approach to securing its platforms may well lead it to want to make a similar statement.

Then there’s the consideration around connected devices. Think back over the history of Apple’s smart home solution, HomeKit, and you can see that its adoption was never as rapid as expected. Apple history watchers will know that one of the reasons was that Apple insisted on manufacturers meeting security standards and using its own silicon. Others didn’t require the same stringent protection, and we’ve seen plenty of evidence of how that can be abused. Even Apple abused this trust when it set Siri to snooping.

But when it comes to national security, the vulnerabilities extend beyond home speaker systems listening in on what you say. We know Industry 4.0 is rolling out globally, even as connected healthcare systems see deployment accelerate.

All those connected devices rely on software and services, and the move to make vendors in those spaces more responsible for the systems they ship seems logical.

We’ve known since the infamous HVAC attack against Target how even a seemingly peripheral connected system can be targeted. No one should purchase a connected device that can’t be secured or updated, and no manufacturer should sell products with a weak default passcode such as 0000 installed out of the box.
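To make that last point concrete, here is a minimal, hypothetical sketch in Swift of the kind of behavior regulators appear to have in mind: a device setup flow that refuses to finish provisioning while a known factory-default passcode is still in place. The default list and the function name are illustrative only, not any vendor’s real firmware API.

```swift
// Hypothetical provisioning check: reject factory-default credentials at setup.
let knownFactoryDefaults: Set<String> = ["0000", "1234", "admin", "password"]

enum SetupError: Error {
    case defaultCredentialStillSet
    case passcodeTooShort
}

func completeProvisioning(with passcode: String) throws {
    // Refuse to finish setup until the owner replaces the shipped default.
    guard !knownFactoryDefaults.contains(passcode) else {
        throw SetupError.defaultCredentialStillSet
    }
    // Enforce a minimal length before accepting the new passcode.
    guard passcode.count >= 8 else {
        throw SetupError.passcodeTooShort
    }
    // ...store a salted hash of the passcode and unlock the rest of setup...
}
```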

Making vendors responsible for hardening those systems makes sense because we’ve seen too many instances of failure.

The White House security proposals also look to future threats, such as the impact of quantum computing on traditional perimeter and endpoint security protection. You could argue that Apple has some answers here, with biometric ID and its support for password-free passkeys, but there are many more miles to go on that journey; the industry has needed to move beyond passwords for years.
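For a sense of what password-free sign-in looks like in practice, here is a minimal sketch using Apple’s AuthenticationServices framework to request a passkey assertion. The relying-party identifier and the challenge are placeholders; a real app would fetch the challenge from its own server and also supply a presentation context provider before calling this.

```swift
import AuthenticationServices

// Minimal passkey sign-in sketch. "example.com" is a hypothetical relying party.
final class PasskeySignIn: NSObject, ASAuthorizationControllerDelegate {

    func signIn(challenge: Data) {
        let provider = ASAuthorizationPlatformPublicKeyCredentialProvider(
            relyingPartyIdentifier: "example.com")
        // Build an assertion request; the system matches it against saved passkeys.
        let request = provider.createCredentialAssertionRequest(challenge: challenge)

        let controller = ASAuthorizationController(authorizationRequests: [request])
        controller.delegate = self
        controller.performRequests() // Prompts Face ID / Touch ID; no password is entered.
    }

    func authorizationController(controller: ASAuthorizationController,
                                 didCompleteWithAuthorization authorization: ASAuthorization) {
        if let assertion = authorization.credential
            as? ASAuthorizationPlatformPublicKeyCredentialAssertion {
            // Send assertion.rawAuthenticatorData, assertion.signature and
            // assertion.credentialID to the server for verification.
            print("Signed in with passkey:", assertion.credentialID.base64EncodedString())
        }
    }

    func authorizationController(controller: ASAuthorizationController,
                                 didCompleteWithError error: Error) {
        print("Passkey sign-in failed:", error.localizedDescription)
    }
}
```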

But at least the proposals should mean that everyone involved in that space will be more motivated to work toward securing their products, rather than waiting for someone else to do it.

And that is the big positive in these proposals. In essence, telling software and service providers to take more responsibility for security will probably drive most to toughen up. There will be glaring inconsistencies along the way — for example, is the regulatory drive to force every smartphone vendor to support every app store compatible with the need to secure platforms and services?

If security and privacy are so important, how is it right that Apple be forced to reduce the security and privacy of the products and services it provides?

The National Cybersecurity Strategy doesn’t have all the answers to this complex web of shifting problems, but it does offer a stronger starting point from which to move forward. Social media firms can expect a great deal of scrutiny, at last.

It calls to mind a Steve Jobs quote that may be relevant here:

“When you first start off trying to solve a problem, the first solutions you come up with are very complex, and most people stop there. But if you keep going and live with the problem and peel more layers of the onion off, you can often times arrive at some very elegant and simple solutions. Most people don’t put in the time or energy to get there.”

While there will be much work to do, the proposals put some urgency in place for tech to accelerate its efforts to make security simple, and they certainly suggest that the days in which laissez-faire tech firms could sell insecurity as a service are numbered.

That’s a really good thing.

Please follow me on Mastodon, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.
