Why Security Isn’t Taught in Schools, and What We Can Do About It

The recent highly publicized hacks of private and government entities have ignited discussion about expanding the role of cybersecurity in traditional CS curricula. This is a good thing. What isn’t is that most colleges don’t require computer science students to take security courses. In fact, many don’t even offer them. Why? Because most developers think that security people should handle security problems, and so do the vast majority of their college instructors. After all, developers want to focus on features and functionality, not on fixing problems.

Per an oft-cited study by CloudPassage conducted earlier this year, only one of the top 36 undergraduate computer science programs in the United States made passing a cybersecurity course a graduation requirement. After college, retrofitting these young developers with a security-focused utility belt isn’t really an option, not just because it’s too costly, but because no one has found a way to do it that actually works.

So how can things get to the point where twenty-somethings come off the assembly line having been fed a well-balanced diet of development and security? With the know-how to build solid applications, not just functional ones?

The answer is simple: We need to flip the script and start teaching professors and mentors about the benefits of weaving security into every aspect of developer education. Understanding the best way to do that, of course, starts with understanding why it’s not already being done.

Reason #1: The Current Academic Model Doesn’t Work When It Comes to Security

Traditionally, computer science programs have done an excellent job with theory. Students are taught to be great computer scientists—not great developers.

IEEE member Roy Wattanasin points to one of the main problems with cybersecurity education: there is no template or guide for instructors to follow. They have to create their own course curricula, and with new zero-days and online exploit kits popping up daily, it’s almost impossible to know where to begin. Wattanasin surveyed computer science programs in the Boston area and found that no two programs approached security the same way, and that no school offered a course specifically on cybersecurity. This is a huge issue for employers: with such fragmented education, there is no telling how well equipped, if at all, these graduates are to tackle security issues.

Security knowledge, unlike high-level theory, requires developers to get their hands dirty. This isn’t to say traditional methods aren’t useful to a fledgling developer; they just don’t compare to actually building something with security baked in from the get-go.

Understanding security issues requires developers to:

Care about the thing they’re protecting, whether it be user data, the ability to control an app, etc.
Think like an attacker and understand how they would gain access
Have a clear understanding of what a security issue looks like in the code base, how it manifests, and how it’s exploited (a short sketch of what that can look like follows below)
And for many developers, it’s just not practical to go through the entire process—especially when security issues can be someone else’s problem.
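To make that last point concrete, here is a minimal, hypothetical sketch in Python of what one classic issue, SQL injection, looks like in a code base and how it’s fixed. The table, columns, and function names are invented for illustration; only the pattern matters.

```python
import sqlite3

def find_user_unsafe(conn: sqlite3.Connection, username: str):
    # Manifestation: user input is spliced directly into the query string.
    # Exploitation: passing  "x' OR '1'='1"  turns the WHERE clause into a
    # tautology and returns every row instead of one.
    query = "SELECT id, email FROM users WHERE username = '" + username + "'"
    return conn.execute(query).fetchall()

def find_user_safe(conn: sqlite3.Connection, username: str):
    # The fix: a parameterized query keeps user data out of the SQL grammar,
    # so a crafted username is just an unusual string, not executable SQL.
    return conn.execute(
        "SELECT id, email FROM users WHERE username = ?", (username,)
    ).fetchall()
```

The difference between the two functions is three lines of code, but seeing it, and exploiting the first version in a lab, teaches more than a lecture on injection ever could.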

Reason #2: Cybersecurity Degrees—Great For Security People, But Developers?

The way security people think, react to problems, and reach solutions is foreign to developers. It’s almost like they speak different languages. So while undergraduate cybersecurity degrees might very well result in high-quality, focused professionals, they aren’t useful for developers. “You know the situation is bad when companies like Bloomberg, Facebook, Google, and Microsoft are creating their own cybersecurity programs to train employees,” says Ming Chow, a professor at Tufts University and close friend of Codiscope.

What the world really needs isn’t a new degree. It’s a developer-education reboot. Why? Without dovetailing security into developer education from the start, developers won’t become truly bilingual. They’ll always be “developers who know something about security,” not “security-driven developers” who incorporate the principles of security into their work and ensure nothing gets lost in translation.

Reason #3: Non-Traditional & DIY Approaches Aren’t Always Reliable

Bootcamps and online courses do a fantastic job of preparing new developers to write code. Graduates of these programs may very well have an edge over CS students because of the project-based, hands-on nature of their education. But despite their explosive growth, devs coming out of these programs don’t get even the level of security training they would in traditional academic programs, and the skills they learn in these crash-course environments need to be continually developed.

Even if we assume it were possible, in principle, for a self-taught developer to focus on writing secure code from the start, it doesn’t really work in practice. A Google search or a trip down a Stack Overflow rabbit hole can do more harm than good. Per our CEO, Gary Jackson: “Most of what people vote up on Stack Overflow is wrong.”
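As a hypothetical illustration of the kind of snippet that earns upvotes but ages badly, consider password hashing: a bare, fast hash next to a salted, deliberately slow key-derivation function from Python’s standard library. The example is ours, invented for illustration, not drawn from any particular answer.

```python
import hashlib
import os

def hash_password_weak(password: str) -> str:
    # Commonly copy-pasted pattern: a single fast, unsalted hash.
    # Fast hashes make offline brute-forcing cheap, and the lack of a salt
    # means identical passwords produce identical hashes.
    return hashlib.md5(password.encode()).hexdigest()

def hash_password_better(password: str) -> bytes:
    # Safer: a per-user random salt plus PBKDF2 with a high iteration count,
    # which slows attackers down by design.
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt + digest  # store the salt alongside the digest for later verification
```

Both versions “work,” which is exactly the problem: nothing in a quick test run tells the self-taught developer that the first one is a liability.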

This also impacts veteran devs looking to bolster their own security knowledge when on-the-job training doesn’t cut it. The lack of solid resources actually disincentivizes learning, and it reinforces the mistaken idea that security issues should be handled by security people.

Change Is A-Comin’

Pockets of the academic world have recognized that the CS curriculum in its current form is failing to prepare students for the workforce, and they’re trying to fix it one course at a time.

Ming Chow offered colleagues the following manifesto at the end of his New England Security Day presentation “Chipping Away at the Security Education Problem.”

There is no excuse not to integrate security into computer science courses, especially systems and application-based courses
Inform students of the security and privacy problems and opportunities; ask students to be good citizens
Encourage and challenge students to develop the curiosity and mindset of the bad guy
Do not use only traditional teaching and learning techniques for courses. Learning how to take tests isn’t helping
Provide mentorship and networking opportunities
By adhering to these principles, professors can create stronger developers (and better job candidates).

Provided the professors themselves have an understanding of security, small tweaks to existing projects won’t be difficult to implement. Baby steps will pay major dividends: simply encourage CS students to think about security at the beginning of each assignment, and discuss potential threats, where they come from, and how to prevent them.
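As a rough sketch of what such a baby step might look like, here is a hypothetical tweak to a routine intro-course exercise, written in Python with invented names: the assignment stays the same, but students are asked to consider what could go wrong with the file path they’re handed.

```python
from pathlib import Path

# The only directory the exercise is supposed to read from.
DATA_DIR = Path("course_data").resolve()

def read_student_file(filename: str) -> str:
    """Return the contents of a file inside DATA_DIR.

    The original assignment just opened `filename`. The added prompt for
    students: what happens if someone passes '../../etc/passwd'? Resolving
    the path and checking where it actually lands blocks that traversal.
    """
    requested = (DATA_DIR / filename).resolve()
    if DATA_DIR not in requested.parents:
        raise ValueError("refusing to read files outside the data directory")
    return requested.read_text()
```

The functional requirements haven’t changed; the only addition is one question and a few lines of defensive code, which is exactly the kind of habit the manifesto above is asking professors to build.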

Educators—academic or otherwise—are in the best possible position to incite change. A shift toward more secure code begins with recognition of its importance and subsequent inclusion in curricula. There isn’t a single CS professor or bootcamp instructor on the planet incapable of asking his or her students “what could go wrong,” and having a conversation.

Where Does This Leave Us?

Hopeful!

If we want students to become developers who write secure code, the answer is simple: put them in an environment where secure code is the only kind of code. Unlike security professionals, developers are in the unique position to fix security issues before they start. Like, before the toasters and smart cribs start spying on us. Clearly, application security is evolving and developer education needs to evolve with it.

If proper training means delaying the inevitable rise of Skynet, we’re all for it. So—Professors! Developers! Bootcamp mentors! Stack Overflow contributors! Rise up and help us eliminate the plague of application insecurity by equipping young developers with the skills they need to write secure code.

Source: https://www.codiscope.com/cybersecurity-not-taught-schools/

