Maybe one day every platform will be as secure as Apple

A look at the Biden administration’s recently updated National Cybersecurity Strategy suggests the document reflects some of the approaches to cybercrime Apple is already taking.

Take privacy, for example. The proposal suggests that privacy protections are no longer something big tech can argue against – companies will have to put privacy first. That’s fine if you run a business that doesn’t require the large-scale collection and analysis of user information, which has always been Apple’s approach. The best way to keep information private, the company argues, is not to collect it at all.

While that approach isn’t exhaustive (you don’t have to poke at Apple’s activation servers very hard to recognize that at least some information about you and your devices is visible), most of your personal information isn’t. Apple’s recent decision to expand the protection available for iCloud also appears to reflect some of the commitments made in the NCS document.

Just as App Store apps are required to disclose privacy policies and declare what they do with your information, the new security strategy will require software makers and service providers to take far more responsibility for the security of their products.

“We must rebalance the responsibility of defending cyberspace by shifting the burden of cybersecurity away from individuals, small businesses, and local governments, and onto the organizations most able and best positioned to reduce the risks for all of us,” explains a White House briefing statement.

But nobody’s perfect

Apple’s reputation for creating a secure platform shows that it is possible to build and maintain such platforms. And while security is never perfect, the fact that the company has largely succeeded means other companies can follow suit.

That (and more) is, in fact, what the new proposals require. As you might expect, this has drawn resistance from some players in the industry, as it means they’ll be held accountable if their software or services turn out to be vulnerable.

The Information Technology Industry Council, for example, seems to think these regulations threaten private contracts between developers and customers.

At the same time, as CNN reports, the proposal reflects what the US government sees as a failure of market forces to keep the nation safe. Light-touch regulation should not equate to complacency. There is also the argument that negligence is not always the reason security measures fail.

Aaron Kiemele, CISO at Apple-focused MDM and security firm Jamf, says, “All software is in some way vulnerable to future exploitation. If a new problem emerges and has widespread repercussions, it doesn’t mean the software vendor was negligent. You can do everything right and still be affected by a security incident.

“That being said, there are plenty of old vulnerabilities that go unpatched for years, as well as companies that really don’t prioritize security and privacy,” he said. Holding those companies accountable, and implementing reforms without penalizing vendors for a security environment that cannot reasonably be predicted, will be difficult.

“The most interesting piece to me remains that this sounds like a good faith attempt to impose appropriate accountability on software companies that are not currently doing the right thing to protect their data and their customers,” Kiemele said.

“It will be nice to be held more accountable, knowing that we will be rewarded for our good practices, while others in the industry will at least have to do the bare minimum to secure the digital ecosystem.”

Jamf launched a fund last year to invest in Apple-related security start-ups.

Given its firm approach to securing its platforms, Apple might well make a similar statement.

Increasing responsibility

Then there’s the consideration around connected devices. Think back to the history of Apple’s smart home solution, HomeKit, and you can see that its adoption has never been as fast as expected. Apple history watchers will know that one reason for this was Apple’s insistence that manufacturers comply with its security standards and use Apple’s own silicon. Other platforms didn’t demand the same strict protections, and we’ve seen plenty of evidence of how that can be abused. Even Apple abused this trust with its Siri snooping.

But when it comes to national security, the vulnerabilities go beyond home speaker systems that listen in on what you say. We know that Industry 4.0 is rolling out globally, even as connected healthcare systems see adoption accelerate.

All of those connected devices depend on software and services, and the move to make vendors in those spaces more responsible for those systems seems logical.

We’ve known since the infamous HVAC attack on Target how even a minor connected system can be targeted. And while no one should buy a connected device that cannot be secured or updated, no manufacturer should be allowed to sell items with a weak passcode such as 0000 set by default.

It makes sense to make suppliers responsible for strengthening those systems because we’ve seen too many failures.

The White House security proposals also look at future threats, such as the impact of quantum computing on traditional perimeter and endpoint security. You could argue that Apple has some answers here, with biometric ID and support for passwordless passkeys, but there are many more miles to go on that journey, and we’ve needed to move beyond passwords for years.

But at least the proposals should mean that everyone involved in that space will be more motivated to work on securing their products, rather than waiting for someone else to do it.

We need to destroy the market for designed-in insecurity

And that is the great positive in these proposals. Essentially, telling software and service providers to take more responsibility for security will probably push most of them to toughen up. There will be glaring inconsistencies along the way: is the regulatory drive to force every smartphone vendor to support every app store compatible with the need to secure platforms and services?

If security and privacy are so important, how is it right for Apple to be forced to reduce the security and privacy of the products and services it provides?

The National Cybersecurity Strategy does not have all the answers to this complex web of shifting problems, but it does provide a stronger starting point to move forward. Social media companies can finally expect a lot of attention.

It brings to mind a quote from Steve Jobs, which may be relevant here:

“When you first start off trying to solve a problem, the first solutions you come up with are very complex, and most people stop there. But if you keep going, and live with the problem and peel more layers of the onion off, you can oftentimes arrive at some very elegant and simple solutions. Most people just don’t put in the time or energy to get there.”

While there’s still a lot of work to be done, the proposals mean the technology industry urgently needs to accelerate its efforts to make security simple.

That’s a really good thing.

Please follow me on Mastodon, or join me at AppleHolic’s bar & grill and Apple Discussions groups on MeWe.

