As of January 2020, new legislation mandates that devices employ stronger safeguards. The first of their kind, California's and Oregon's bills mark a new wave of consumer tech law: they require connected, Internet-of-Things devices to ship with built-in security protections.
Yet there is controversy. On one hand, device manufacturers have never been keen on legislative requirements, and mandated security controls are no exception. On the other hand, security experts fear the new laws don't go far enough: Bill 327 and Bill 2395 read like a band-aid slapped on an increasingly complex problem.
If your company is ready to revolutionize the world with its new smart solution, listen up. Compliance with the new wave will require more than a quick fix. Grab your engineering and design teams to learn how the new regulations will change the way you do business.
What are California and Oregon’s new security laws for connected devices?
Signed by California Governor Jerry Brown in September 2018, Bill 327 aims to improve IoT security. Once in force, manufacturers must build security into their products. The bill requires them to equip devices with a “reasonable security feature or features” that protect against “unauthorized access, destruction, use, modification, or disclosure.” In other words, IoT devices must protect data confidentiality and integrity. While that is a broad definition of security, manufacturers would be wise to start on compliance early. After all, safeguards like encryption can be difficult to add to a finished product.
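To see why retrofitting is painful, consider what baking confidentiality in from the start might look like. The sketch below is purely illustrative: it assumes a Python-based device that encrypts sensor readings with the third-party cryptography library before they ever touch local storage. Neither bill names a specific library, algorithm, or approach.

```python
# Illustrative sketch only: encrypt sensor readings at rest before storage.
# Assumes the third-party "cryptography" package (pip install cryptography);
# neither Bill 327 nor Bill 2395 mandates a specific library or algorithm.
import json
from cryptography.fernet import Fernet

# In a real device this key would be provisioned per unit at the factory
# and kept in secure storage, never generated or hard-coded like this.
device_key = Fernet.generate_key()
cipher = Fernet(device_key)

def store_reading(reading: dict, path: str = "readings.log") -> None:
    """Serialize a sensor reading and append it to disk encrypted."""
    token = cipher.encrypt(json.dumps(reading).encode("utf-8"))
    with open(path, "ab") as f:
        f.write(token + b"\n")

def load_readings(path: str = "readings.log") -> list[dict]:
    """Decrypt and return all stored readings."""
    readings = []
    with open(path, "rb") as f:
        for line in f:
            readings.append(json.loads(cipher.decrypt(line.strip())))
    return readings

if __name__ == "__main__":
    store_reading({"sensor": "thermostat", "temp_c": 21.5})
    print(load_readings())
```

A device designed this way carries its protection with it from day one; bolting the same encryption onto firmware that already stores plaintext readings is a far messier job.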
Bill 2395, signed by Oregon Governor Kate Brown in May 2019, also goes into force in January 2020 with a similar premise: safeguards should defend against “unauthorized access, destruction, use, modification or disclosure” of information.
What are the requirements for compliance?
Both bills go into effect on the first of January, 2020; however, those looking for an exact plan on how to comply will find both lacking. Neither Bill 327 nor Bill 2395 offers much detail on safeguard requirements. Common security standards, including encryption, monitoring access to information, and the ability to accept software updates, go unmentioned. Instead, both bills emphasize security measures that are ‘appropriate’ given the nature of the device. Manufacturers must apply safeguards fitting “the information it may collect, contain, or transmit.”
The vagueness of these requirements allows the laws to age without becoming obsolete as technology continues to advance. They also leave a lot of room for interpretation, however, which could allow manufacturers to find loopholes and cut corners.
There is, however, one marked exception. All connected devices, according to both bills, must come with passwords that are “unique to each device manufactured”. Alternatively, they must require users to set their own passwords at first setup. It makes sense: default device passwords are a known bane of cybersecurity experts and can cause massive problems. A notable example is the 2017 Equifax data breach. As Paul Lilly at PC Gamer reports, the default password ‘admin’ was used to protect vast troves of sensitive personal information.
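What satisfying the password clause looks like in practice is left to manufacturers. Below is a minimal, hypothetical Python sketch of the two routes the bills allow: provisioning a unique random password for each unit at manufacture, or refusing access until the user sets their own credential at first setup. The function names, password policy, and storage format are assumptions for illustration, not anything the statutes specify.

```python
# Hypothetical sketch of the two compliance paths the bills allow:
# (1) a password unique to each device manufactured, or (2) requiring the
# user to set their own credential before first use. Details are illustrative.
import hashlib
import os
import secrets

def provision_unique_password(serial_number: str) -> str:
    """Option 1: generate a per-device password at the factory.

    A securely random value means no two units ship with the same default.
    """
    return f"{serial_number}-{secrets.token_urlsafe(12)}"

def first_boot_setup(credential_file: str = "credential.bin") -> None:
    """Option 2: block access until the user creates their own password."""
    if os.path.exists(credential_file):
        return  # already configured, nothing to do
    while True:
        password = input("Set a password for this device: ").strip()
        if len(password) >= 8:
            break
        print("Password must be at least 8 characters.")
    # Store only a salted hash, never the plaintext password.
    salt = secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    with open(credential_file, "wb") as f:
        f.write(salt + digest)

if __name__ == "__main__":
    print(provision_unique_password("SN-000123"))
```

Either route avoids the ‘admin/admin’ trap: no two devices leave the factory sharing a guessable credential.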
How many IoT products will the laws affect?
Quite a lot, as it turns out. Bill 327 defines “connected device” as “any device, or other physical object that is capable of connecting to the Internet, directly or indirectly, and that is assigned an Internet Protocol address or Bluetooth address.” That can mean anything from smart home heaters to vacuums, doorbells and even power cords. If a product can connect to the internet or Bluetooth, Bill 327 will apply.
Like its California predecessor, Bill 2395 applies to devices that can connect via the Internet or Bluetooth. However, it adds a unique twist: it covers devices “used primarily for personal, family, or household purposes.” Sheryl Falk of Winston & Strawn comments that manufacturers should be aware “personal” may be broadly defined. For the past few years, many companies have strongly encouraged staff to use fitness wearables and tracking applications. In 2016, Lisa Schencker checked in with readers of the Chicago Tribune on the trend of employers encouraging Fitbits at work. If an employee uses a device that tracks their health at the request of an employer, will the law view it as serving personal or business purposes?
Any exceptions?
Legal exemptions to the laws do exist. Critically, devices already subject to stronger data protection requirements are exempt. Bill 327 and Bill 2395 clearly state they do not override HIPAA, which imposes stricter safeguards on medical data. The legislation will also not apply to devices subject to federal law. Should the federal government enact an Internet of Things security law, it will take precedence.
Interestingly enough, Bill 327 also exempts devices that have “unaffiliated third-party software”. This makes sense, as it would be difficult, if not near-impossible, to safeguard devices once users add their own modifications. There is no way a manufacturer could audit the security of every third-party app a user might try.
What is interesting, however, is that the law doesn’t prohibit users from exercising full control over their device. Specifically, manufacturers cannot use the law as a justification for preventing user modifications. Does this mean Bill 327 is also giving a subtle allowance for legal device jailbreaking?
What about device retailers? How do they deal with devices manufactured before the laws, but sold afterward?
If you sell electronics in California or Oregon and are worried about the laws’ impact on your business, relax. Both Bill 327 and Bill 2395 apply to manufacturers, not retailers. The laws are clear that they cannot “be construed to impose any duty” upon stores or marketplaces to review or enforce compliance. In other words, if you have products left over from December 2019, they can remain on the shelves unless the manufacturer issues a recall.
Compliance concerns
Thus far, enthusiasm for these IoT laws is lukewarm at best among security specialists. They are a good start, but many agree there are far too many loopholes. For starters, will organizations follow the legislation to include better security in overall device design, or will they focus only on authentication? Writing for Dark Reading, Robert Lemos illustrates some of the confusion and differing attorney interpretations, noting that some companies may choose to wait and measure “whether there is any risk to them under the statute”.
Going into more detail, cybersecurity expert Robert Graham argues the bills get the language wrong when they require devices to have unique passwords. Many devices don’t have a single password; they have multiple authentication systems, each with its own credentials. A device might authenticate the user account for web apps or online access, yet still run other, insecure application protocols such as Telnet that remain targets for hackers.
“It’s like dieting, where people insist you should eat more kale, which does little to address the problem you are pigging out on potato chips. The key to dieting is not eating more but eating less. The same is true of cybersecurity, where the point is not to add ‘security features’ but to remove ‘insecure features’.”
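Graham’s “remove insecure features” framing can be made concrete. The rough Python sketch below, an assumption of this article rather than anything drawn from Graham or the bills, audits a device for listening ports commonly tied to plaintext legacy protocols such as Telnet and FTP; the port list and example address are placeholders.

```python
# Rough illustration of auditing a device for insecure legacy services,
# in the spirit of removing "insecure features" rather than adding new ones.
# The port list and example host are assumptions for demonstration only.
import socket

INSECURE_PORTS = {
    21: "FTP (plaintext credentials)",
    23: "Telnet (plaintext sessions)",
    80: "HTTP (unencrypted web interface)",
}

def audit_device(host: str, timeout: float = 1.0) -> list[str]:
    """Return a list of insecure services the device appears to expose."""
    findings = []
    for port, description in INSECURE_PORTS.items():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            if sock.connect_ex((host, port)) == 0:  # 0 means the port is open
                findings.append(f"port {port}: {description}")
    return findings

if __name__ == "__main__":
    for finding in audit_device("192.168.1.50"):
        print("Consider disabling:", finding)
```

A unique password on the web login does nothing for a device that still answers on Telnet; auditing and shutting down those services addresses the weakness the laws leave unnamed.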
Slowly stepping up the hill
While the rules may not offer full security protections, for consumers they’re a start. As Adi Robertson of the Verge reports, device-makers who sell products in California or Oregon “would pass the benefits on to customers elsewhere.” At a minimum, these bills may get manufacturers to take device security more seriously. The laws give security teams and designers another business case for building better safeguards into products. They won’t make every device sold in California or Oregon 100% secure, but a little protection is much better than none at all.