California’s privacy law means it’s time to add security to IoT
The California Consumer Privacy Act (CCPA) takes effect on January 1, giving residents of California the right to learn what data companies collect about them and governing how companies must protect that data. A companion measure, SB-327, requires "reasonable security features" in connected devices. Together, they mean it's time for the internet of things to become secure.
Jack Ogawa, senior director of marketing (and IoT security guru) for Cypress Semiconductor, spoke with me about the implications of the law for everyone who is making software, hardware, chips, and systems for the internet of things, which makes everyday objects smart and connected.
Cypress, a major chip maker for IoT devices, is in the process of being acquired by Infineon for about $10 billion. Like nearly every other tech company, Cypress will likely have to abide by the California law across the U.S. and help ensure that its products comply with basic security measures. The law is a warning shot, and it means that tech designers have to take security into account in their software and hardware designs for the future.
Here’s an edited transcript of our interview.
VentureBeat: The law that’s going into effect in California, how concerned are you about how that’s going to affect IoT? What are the consequences for the industry?
Jack Ogawa: We’re excited about the law coming into effect. Not necessarily because it’s a perfect law, if there is such a thing as a perfect law. I personally don’t have any illusions that the law is perfect. But it is an indication of what’s needed for IoT.
If you think about the evolution of the marketplace, we’re at a state now where the technology has gotten us to a certain point. We have connectivity. Wi-Fi has gotten to a point where it’s ubiquitous. Access to the internet is pretty pervasive around the world. That’s spun this billions-of-units vision, saying that everything is going to be connected.
That’s interesting, and from a technology perspective, we’re seeing that in our houses. We see the ubiquity of these connected devices in our homes. But what quickly happens is you get what I call a normative period where societal issues come to the fore, the biggest one being privacy. You go into this normative period now where everyone says we need privacy, and then you have to have some sort of governance over the devices to create an environment where you can deliver that capability in a cost-effective way.
What I’m saying specifically, as it relates to privacy today, is that there are no standards. There is no threshold. Therefore, these devices can be anywhere from having zero security capability to everything in between, across the spectrum. The problem with that, one, as a consumer you don’t know what you’re getting, because each device delivers things in a different way in terms of privacy. But there’s also a commercial aspect to the problem. In a fragmented state like this, it turns out it’s cost-inefficient for everyone to have their own answer.
The reason why we’re actually applauding the legislation coming forth is not because it’s a perfect law, but because it will start the normative cycle where the answer to the question of “How do you support privacy?” will start to become standard and ubiquitous. When that happens, it becomes affordable, and it becomes approachable by consumers.
VentureBeat: What are some things that the law will require?
Ogawa: There are basic things. You have to have a password to join a connected network, and you can no longer silently keep the default password — if you want to keep it, you’re confronted with that choice and have to explicitly opt in. The default will be to force you to create a new password. That’s interesting, because if you look at a lot of the data, that simple thing — unchanged default passwords — is what’s exploited in a lot of attacks.
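To make the password provision concrete, here is a hypothetical sketch in Python — not language from the law itself, and the default list and length rule are illustrative only: a device setup flow that refuses to finish provisioning until the factory default credential has been replaced.

```python
# Hypothetical sketch of the unique-password idea (not the law's actual
# text): a device setup flow that refuses to keep a factory default.
FACTORY_DEFAULTS = {"admin", "password", "12345678"}  # illustrative list

def provision(new_password: str) -> bool:
    """Accept a credential only if it is not a known factory default
    and meets a minimal length bar."""
    if new_password in FACTORY_DEFAULTS:
        return False  # force the user to pick something unique
    return len(new_password) >= 8

assert provision("admin") is False            # default credential rejected
assert provision("correct-horse-battery") is True
```

The point is the shape of the flow, not the specifics: the secure choice is the default, and keeping a shared credential is the path that takes extra effort.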
As a chip maker, there’s another dimension we think is important, which is being able to install a secret and immutable identity in your hardware. That stance is built around the concept of being able to confer trust, just like being able to have a unique user password.
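A common way to use such a per-device secret is a challenge-response handshake. This is a generic sketch of the idea — not Cypress’s actual mechanism — using an HMAC over a server nonce; in real devices the secret lives in secure hardware and an asymmetric signing key is more typical.

```python
import hashlib
import hmac
import secrets

# Illustrative challenge-response with a per-device secret provisioned at
# manufacture. Real devices keep the key in secure hardware and usually
# sign with an asymmetric key; this only sketches the trust concept.
DEVICE_SECRET = secrets.token_bytes(32)  # "burned in" at the factory

def device_respond(challenge: bytes) -> bytes:
    # The device proves it holds the secret without ever revealing it.
    return hmac.new(DEVICE_SECRET, challenge, hashlib.sha256).digest()

def server_verify(challenge: bytes, response: bytes) -> bool:
    # The server holds its copy of the secret and recomputes the MAC.
    expected = hmac.new(DEVICE_SECRET, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

nonce = secrets.token_bytes(16)  # fresh challenge per session
assert server_verify(nonce, device_respond(nonce))
```

Because the challenge is fresh each time, a recorded response can’t be replayed — which is exactly the "confer trust" property Ogawa describes.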
VentureBeat: Is that simply a user identity, or do they call that something specific?
Ogawa: It’s a user identity, yeah, an end user identity.
VentureBeat: Is multi-factor authentication a requirement as well?
Ogawa: No, but it’s suggested in the law. The hard requirement is a unique credential for the user. The law then goes into some other statements that — if you’re familiar with networking at all — try to cover the hardware as well. One aspect is user authentication, and the other is how to authenticate the hardware itself.
That’s the other piece that’s often forgotten. So much energy is focused on authenticating you as a person, but a big percentage of attacks actually spoof the hardware to get into your network. When you spoof the hardware, you adopt its permissions as it enters the network. Protecting your hardware is as important for privacy as uniquely identifying you as a person.
At a high level those are the biggest things in the law. It’s the requirement to better identify a person, and then the requirement to protect data as well. That fundamentally comes down to being able to encrypt data. Then the third dimension is to be able to protect the device itself.
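In practice, the data-protection dimension usually means encrypting data in motion with TLS. As a generic illustration — not anything specific to the law’s text — Python’s standard `ssl` module shows what a properly configured client looks like: certificate verification and hostname checking are on by default, and legacy protocol versions can be refused.

```python
import ssl

# Generic sketch of the "protect the data" requirement: a TLS client
# context with certificate verification and hostname checking, which is
# what create_default_context() configures out of the box.
ctx = ssl.create_default_context()
assert ctx.verify_mode == ssl.CERT_REQUIRED   # peer must present a valid cert
assert ctx.check_hostname                     # and it must match the hostname
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy protocol versions
# A device would then wrap its socket with
# ctx.wrap_socket(sock, server_hostname="example.com") before sending data.
```

The notable design point is that the secure settings are the defaults; a device maker has to go out of its way to disable them.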
VentureBeat: How long were you tracking this? Was there some public history for all of this that the industry got to weigh in on?
Ogawa: Particular to SB-327, I don’t know how the public commentary got factored in — I don’t have that background handy. It predates some of the things that I’ve been tracking.
VentureBeat: Is it a situation where if California requires it, pretty much everyone has to abide by it?
Ogawa: That’s the thesis. The California legislature has decided to take leadership on this. The analogy I’ve read is to automobile emissions. The state felt like taking a leadership position on this was important. We are starting to get customers asking us about that.
VentureBeat: When I think about security for IoT, what it reminded me of was when I used to go to the Black Hat conferences. They always talked about these non-tech companies and non-tech industries that would design a new product, but there would be no security in it — especially at first, because it was never intended to be connected. And then you have IoT, which is connecting these things that were never connected before. They mostly went out as — well, let’s see if people want to use this type of thing, so let’s build connectivity into everyday things and see what happens.
There was another constraint around CPU power and battery life, where the restrictions were often so tight that you didn’t have the capacity to build encryption or security technology into the device. The pattern for introducing these things went that way: first you made it not connected, then you made it connected, and then you thought about security once you had enough processing power and battery life to do so. But you never did that at the beginning. It felt to me like that’s the way IoT developed, like almost any other connected product.
Ogawa: You’re absolutely right. That’s exactly the trajectory we’ve been on. Our mantra, if you will, and one of the reasons why we applaud the legislation, is that it will force IoT devices to be secure by design. In order to do this correctly — and do it economically, which is just as important — embracing security during the design phase is critical.
That’s how the industry will respond to this, I think. The evolution of this will be — first I couldn’t have connectivity. Now I can go to market with connectivity, but no one’s asked me about security. OK, now there’s regulation around security, so I have to comply. Now what’s the most cost-effective way to achieve compliance? Classically, that will unleash the engineers. If you had to start over, how would you incorporate security?
VentureBeat: I do see different companies talking more about it. It was a big topic at the Arm TechCon conference, for example. They’ve done a lot of industry-wide attempts to improve security. It feels like we’re at a stage where people don’t care as much about processing power anymore. They care about privacy and security and making it better. Is that the new stage we’re at right now? Is that satisfactory for you?
Ogawa: It’s an interesting question. For the IoT industry overall, it’s a pretty fragmented marketplace. In aggregate, you can claim millions or billions of devices shipped, but that’s scattered across smart locks, thermostats, fitness trackers, you name it. The challenge for these guys is to make sure that what they’re doing is going to appeal to their end constituency.
When you talk about processing and power, those are a couple of the horizontal capabilities that affect everybody. I think security is going to be another one of those horizontal things that affect everyone. Whether a given smart lock guy is going to run a million units, that question is sort of orthogonal to the basic question of, is he going to have to comply with security and privacy laws? Because he does. And by the way, the thermostat guy does, and the fitness tracker guy does too. But it’s one of those horizontal topics that’s defining the segment overall.
VentureBeat: If companies aren’t ready for the California law, I suppose they have to get moving.
Ogawa: This is one of those things. You’ve followed technology long enough. The lawmakers take a shot at this, but there’s so much ambiguity in how it’s enforced and how the vendors achieve compliance. Whether there will be any sort of legal or civil ramifications to the particular law is questionable, to be honest. But like I said, a lot of customers look at it and say, “This is coming.” They don’t want to be the one who becomes liable, legally or otherwise, because they don’t support the law. It’ll be interesting to see how vendors react to this, but I do expect a reaction. I don’t think people will be able to ignore it.
VentureBeat: As far as broader standards for connecting everything, does that seem to be coming together? I know there are things like Samsung’s SmartThings. I don’t know if everything is interchangeable yet, or if there’s still a long way to go there.
Ogawa: From an end customer perspective, there’s still a way to go. Connectivity tends to follow the use case. From an underlying protocol and standardization perspective, things are still the way they were. Wi-Fi has turned out to be a strong, ubiquitous protocol, and we believe it will be the winner for IoT. But that doesn’t mean all these other protocols will suddenly disappear. There are other use cases, like Bluetooth mesh, that will continue.
VentureBeat: As far as issues like cost, do you have pushback from people who say that security costs too much, or someone else should have to pay for security? How does that conversation go?
Ogawa: It’s an interesting problem. If you look at the hardware cost — I always use the example of a washing machine, because that’s easy to understand. If you look at the cost of a washing machine, trying to add Wi-Fi to that — it takes away from the profit margin of a washing machine. People are not going to pay another $100 for Wi-Fi, not in the broader population. There’s a real problem.
There are two dimensions to the question. One is, in general, IoT device makers have to answer the question of “Why?” A lot of that question is answered by more processing, like you alluded to earlier, and being able to be smarter. An example of that might be machine learning for preventative maintenance on the motor inside the washing machine. There’s some intelligence in the machine that detects a weird vibration, so you should pay attention to that before the whole thing dies. We can see that happening.
On the security side, I believe it’s more of a cost-of-ownership problem that IoT device makers have to work around. Just like most companies today have an IT department that handles all the networking issues, IoT devices have a similar requirement — an IT administrator and dev ops managing all those millions of devices going out. Security falls into the efficiency bucket relative to defraying your networking costs: if you figure out a standard way to deploy security, it will save you money in the end versus each one of your products having a different version of security.
It’s similar to the IT challenge. Most companies won’t allow their employees to just mix and match their PCs, because it drives the IT guys crazy and drives costs up. What most of these companies realize is, when they build in security by design, the network management efficiency will be a big gain.