The proliferation of new privacy and security laws imposes diverse, complicated,
and at times inconsistent compliance requirements on organizations that handle personal
data. Not only do these laws require the protection of individual privacy from
intrusion, but many also address public concerns such as national security and
defense, protecting critical infrastructure, social interest, community safety, and
law enforcement access. Adding to the complexity, satisfying some of these
requirements (e.g., data localization and sovereignty) often encroaches on
privacy, diminishes security, and impedes business opportunities.
In addition to legislation, we are also seeing governments issue procurement requirements related
to privacy and security. These requirements are intended to ensure that critical
infrastructure providers and government agencies use only safe and secure
goods, software, and services.
Against the backdrop of these laws and procurement requirements, we now have legislation influencing
(and at times dictating) how products are built, at least setting some
baseline elements that must be included in every product.
Consider California’s IoT Security law, enacted without
much fanfare in September 2018 and effective as of January 1, 2020. In a nutshell, the law
requires manufacturers of internet-connected devices sold or used in California
to build reasonable security into those products. More specifically, it
prohibits the use of generic default passwords.
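The default-password rule can be sketched as a first-boot credential gate. This is an illustration only, with hypothetical names; the statute treats security as reasonable if the preprogrammed password is unique to each device, or if the device forces the user to set new credentials before first access:

```python
# Illustrative sketch; identifiers are invented, not from any real device SDK.
COMMON_DEFAULTS = {"admin", "password", "12345", "root", ""}

def first_boot_gate(password: str, unique_per_device: bool = False) -> str:
    """Grant first access only with acceptable credentials."""
    if unique_per_device:
        return "access granted"       # unique preprogrammed password suffices
    if password in COMMON_DEFAULTS:
        return "set a new password"   # generic default rejected at first boot
    return "access granted"           # user-chosen, non-default password
```

A device following this pattern never ships reachable behind a shared factory password.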
With this law (and likely others to come), companies have a clear business incentive to
ensure their devices are properly secured and to implement protocols that help
users better protect themselves. Think about the many headlines regarding IoT hacks
causing massive DDoS attacks that took down parts of the internet, or the
unauthorized access to webcams and connected home security systems that were exploited
to invade privacy and terrorize children. In these situations, insecure systems
hurt the provider’s business and brand, and negatively impacted others too.
For companies that haven’t gotten the message, California’s new law tips the privacy
ROI scales with the threat of “stop ship” orders, negative publicity, enforcement
actions, fines, and consumer lawsuits. A legal requirement for selling
products into California (the world’s fifth-largest economy) will no doubt
impact the way products are built from now on.
Meanwhile, procurement requirements in both the private and public sectors are influencing
manufacturers’ decisions about whether to sell into particular markets, and
impacting the way products are designed and built.
Cloud service providers (CSPs) wishing to sell to the US federal government, for
example, must be FedRAMP certified. Unlike
California’s IoT Security law, which applies to manufacturers, FedRAMP is a
procurement requirement that US federal agencies must follow. Those agencies
are not permitted to buy or use cloud offerings that are not FedRAMP certified. The
FedRAMP certification process is rigorous and time-consuming. Even for
sophisticated security companies, it can take over a year to complete the audit
and demonstrate that all requisite controls are in place.
While FedRAMP is strong on risk management and security, it hardly mentions “privacy” at all.
Instead, federal agencies such as the Department of the Interior have noted
that, in addition to FedRAMP, CSPs must satisfy separate and distinct privacy
requirements embodied in a variety of other legislation and policies (e.g., the
Federal Information Security Management Act of 2002, the E-Government Act of
2002, and Office of Management
and Budget memorandum M-03-22). A CSP that builds its product to meet FedRAMP
alone would meet the security requirements but not satisfy the privacy
requirements of US federal law, let alone the requirements of the 120+
countries with omnibus privacy legislation.
How then does
a company adopt a consistent security and privacy approach that will enable the
most market opportunities? Complex does not have to be complicated. When you
boil down the various laws, regulations and procurement requirements, you get
three core principles that have been the foundation of privacy for decades: transparency, fairness, and accountability.
It starts with understanding the data in play and being transparent with users, data
subjects, and other stakeholders about what’s going on with it: the who, what, where, when, how, and
why of the data. This effort should begin at the design phase of the
data-collecting product, service, or process being built. Each data element that
will be captured must be evaluated for what it is and why it’s needed throughout
its lifecycle. Considering both the intended and unintended consequences and
risks of the data set is key to a robust privacy impact assessment (PIA). PIAs
are no longer just best practice; they’re legally required.
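That per-element evaluation can be sketched as a simple data inventory with review flags. The fields and thresholds below are assumptions for illustration, not a standard PIA template:

```python
from dataclasses import dataclass

# Hypothetical PIA data-inventory entry; field names are illustrative.
@dataclass
class DataElement:
    name: str
    category: str            # e.g., "identifier", "telemetry"
    purpose: str             # why this element is needed
    retention_days: int      # how long it is kept
    shared_with: tuple = ()  # downstream recipients, if any

def pia_flags(element: DataElement) -> list:
    """Flag elements that need closer review in the PIA."""
    flags = []
    if not element.purpose:
        flags.append("no articulated purpose")
    if element.retention_days > 365:
        flags.append("long retention: justify")
    if element.shared_with:
        flags.append("third-party sharing: assess risk")
    return flags
```

An element with a clear purpose, short retention, and no sharing passes cleanly; anything else surfaces for the intended-versus-unintended-consequences discussion.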
As part of
the PIA process, the customer’s objectives must be a top priority. It’s not
illegal to make a product nobody wants to buy; a developer’s legal obligation
is to be truthful and transparent about what the product can and can’t do. “Customer-centric
innovation” means that we focus on what the customer needs the product to do, and
what features and functionality they need to be successful (and compliant).
It also means designing products to be customizable and configurable:
setting a default configuration that meets the vast majority
of the market’s requirements while offering customization options for customers
with unique and stringent regulatory or market constraints.
These trade-offs are investment choices that must be made based on the business case.
In some situations, it may not be cost-effective for a vendor to allow
customization for a limited set of customers, forcing a decision to
forgo certain markets (or to sell custom-build services).
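A minimal sketch of that default-plus-customization pattern, assuming invented field names rather than any real product schema:

```python
from dataclasses import dataclass, replace

# Illustrative only: defaults fit most of the market; stricter regimes override.
@dataclass(frozen=True)
class ProductConfig:
    retention_days: int = 365        # default retention for most markets
    telemetry_enabled: bool = True   # usage telemetry on by default
    data_residency: str = "global"   # no localization by default

MARKET_DEFAULT = ProductConfig()

def for_customer(**overrides) -> ProductConfig:
    """Start from the market default and apply a customer's stricter
    regulatory or contractual constraints."""
    return replace(MARKET_DEFAULT, **overrides)

# A customer subject to data localization and short retention limits:
eu_public_sector = for_customer(retention_days=90,
                                telemetry_enabled=False,
                                data_residency="eu")
```

The frozen default is the product the vendor tests and ships; overrides are the bounded customization surface offered to constrained customers.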
One of the areas
where companies traditionally struggle is with data retention and deletion
practices. The availability of inexpensive storage long encouraged companies to
keep all of the data they collected in perpetuity. This was in part driven by a
“we’ll need it someday” mentality along with data scientists wanting as much
data as possible to churn through and extract emerging patterns and meaning. It
was also simply easier not to have to decide what to keep and what to expunge.
Privacy laws, however, are now pushing back against those old models. Countering
records management requirements that set minimum periods for which records must be
retained, privacy law is pushing for outer limits on the maximum period data should be
kept. In this changing legal climate, we’re now seeing products being built with retention,
expiry, export/portability, and on-demand deletion capabilities. We’re seeing more
discipline in organizations’ data retention periods, with justification
required for the retention periods selected. While not (yet) overly prescriptive,
privacy laws require that data be kept “only as long as it is needed” to
serve the purpose for which it was collected. If the purpose and use
aren’t well understood or clearly articulated, retention should be cut off.
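The “only as long as it is needed” principle reduces to a purpose-bound retention check. The purpose-to-limit mapping below is invented for illustration; real limits come from each purpose’s business and legal justification:

```python
from datetime import date, timedelta

# Hypothetical maximum retention per stated collection purpose.
RETENTION_LIMITS = {
    "billing": timedelta(days=7 * 365),  # e.g., tax record-keeping rules
    "support": timedelta(days=90),
}

def should_expire(purpose: str, collected: date, today: date) -> bool:
    """Keep data only as long as needed for its stated purpose; if no
    purpose is clearly articulated, cut retention off."""
    limit = RETENTION_LIMITS.get(purpose)
    if limit is None:
        return True  # no articulated purpose: expire the data
    return today - collected > limit
```

A periodic sweep calling this check is what the retention and expiry capabilities described above amount to in practice.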
Privacy laws establishing rights such as the ‘right to be forgotten’ are triggering data subject demands
for all personal data to be deleted. While some have over-rotated on that
right, a more careful read of the legislation and court cases reveals that the
data subject has the right to request deletion and get a response, but they
don’t always get the data deleted.
They receive either deletion or an explanation as to why deletion is not technically
feasible or why further retention is required. For instance, a business may
have legal retention requirements for tax purposes or for employment
verification. Individuals have a fundamental right to privacy that must be
respected, but that right is not an absolute right. It must be balanced against
the rights of others, including the right of a company to conduct legitimate
business and comply with its own legal obligations.
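That request-and-response flow can be sketched as a handler that always answers the data subject, either by deleting or by explaining the basis for continued retention. Record IDs and hold reasons below are made up for illustration:

```python
# Hypothetical legal holds keyed by record ID; a real system would look
# these up from a records-management service.
LEGAL_HOLDS = {
    "invoice-2019": "retained for tax purposes",
    "employment-record-77": "retained for employment verification",
}

def handle_deletion_request(record_id: str, store: dict) -> str:
    """Always respond: delete, or explain why deletion can't happen."""
    hold = LEGAL_HOLDS.get(record_id)
    if hold is not None:
        return f"not deleted: {hold}"  # further retention is required
    if store.pop(record_id, None) is not None:
        return "deleted"               # request honored
    return "no personal data held"
```

The key design point is that no branch goes silent: the subject’s right to a response is honored even when the right to deletion is outweighed by other obligations.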
Transparency here means that you explain your data retention and expiry practices, disclose
why you keep data for the period you keep it, and
explain what can or can’t be deleted: what are the business and legal
justifications for your practices? The market (and regulators) will decide
whether those practices are fair and hold you accountable for living up to
promises made. Ultimately, being transparent, fair, and accountable honors
privacy rights and builds trust, making it a sound business and marketing practice.
As part of
organizational accountability, companies must also have processes in place to
document decisions being made to ensure a consistent, repeatable and defensible
approach. For example, at Cisco, our products and offers all go through the Cisco
Secure Development Lifecycle (CSDL) to ensure we are building trustworthy
solutions with privacy and security in mind from the start.
CSDL is a
product launch “gate” that defines and codifies the baseline security and
privacy requirements all products must meet before they are given the green
light to ship. It combines tools, awareness, training, and processes such as Privacy
Impact Assessments, to promote defense-in-depth and provide a holistic approach
to product security, privacy and resiliency.
CSDL guides business judgment calls and design decisions, and provides the documentation
for transparency and explanation of our practices.
CSDL requirements and product approvals go through periodic reviews, both annual and
ad hoc, as new requirements (legal, industry standards, best practices, etc.)
are published. This enables us to verify that CSDL remains timely, relevant, and effective.

The complexity of today’s privacy considerations gives businesses a lot to factor when creating
product development and market strategies: deeply
understanding their data, clearly defining their customer needs, tuning in to market
sentiment and balancing business goals in the regulatory environment.
Success depends on leveraging existing processes and partnering closely with the privacy,
security, and development teams to co-design solutions
with a focus on customer-centric, transparent data handling. Ultimately, being
transparent, fair, and accountable in data practices strengthens trust and
builds market opportunity.
Harvey Jang, Vice President, Chief Privacy Officer, Cisco