
Tuesday, October 15, 2024

Exploring Secure Software Development w/ Dr. Joye Purser and Walter Angerer (a live blog from PNSQC)

Okay, so let's get to why I am here. My goal is to focus on areas I know less about and where I can see actionable efforts I can be effective with (and, again, to look for things I can put into play without needing permission or money from my company).

Dr. Joye Purser is the Global Lead for Field Cybersecurity at Veritas Technologies. Walter Angerer is Senior Vice President for Engineering at Veritas and co-author of the paper. To be clear, Dr. Purser is the one delivering the talk.

Creating secure software involves a lot of moving parts. So says someone who labels herself as "at the forefront of global data protection."

High-profile security incidents are on the rise, and secure software development is more critical than ever. With so many of these cases ending up in the news, Dr. Purser shared her experiences at Veritas, a well-established data protection company, and their journey of ensuring software security.

Veritas has a seven-step SecDevOps process, demonstrating how they aim to control and secure software at every stage.

1. Design and Planning: Building security in from the outset, not bolting it on as an afterthought.

2. Threat Modeling: Identifying potential threats and mitigating them before they can become problems.

3. Code Analysis: Veritas uses advanced code analysis tools to identify vulnerabilities early in the process.

4. Automated Testing: Leveraging automation to continuously test for weaknesses (see the sketch after this list).

5. Chaos Engineering: Veritas has a system called REDLab, which simulates failures and tests the system’s robustness under stress.

6. Continuous Monitoring: Ensuring that the software remains secure throughout its lifecycle.

7. Incident Response: Being prepared to respond quickly and effectively when issues do arise.
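
The talk didn't spell out what these steps look like in code, so here is a rough sketch of what an automated security check in step 4 might look like, assuming a pytest-style suite run on every build; the staging URL and header expectations are my own illustrative assumptions, not Veritas specifics.

# Hypothetical automated security check that could run in CI on each build.
# The staging URL and the expected headers are assumptions for illustration.
import requests

BASE_URL = "https://staging.example.com"

def test_security_headers_present():
    """Responses should carry basic hardening headers."""
    resp = requests.get(BASE_URL, timeout=10)
    assert resp.headers.get("Strict-Transport-Security"), "HSTS header missing"
    assert resp.headers.get("X-Content-Type-Options") == "nosniff"
    assert "Content-Security-Policy" in resp.headers

def test_server_header_does_not_leak_version():
    """The Server header should not advertise a version number."""
    resp = requests.get(BASE_URL, timeout=10)
    server = resp.headers.get("Server", "")
    assert not any(ch.isdigit() for ch in server), f"version leaked: {server}"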


A little more on chaos engineering. This technique actively injects failures and disruptions into the system to see how it responds, with the idea that systems are only as strong as their weakest points under pressure. Veritas' REDLab is central to this effort, putting systems under tremendous stress with controlled chaos experiments. The result is a more resilient product that can withstand real-world failures.
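
REDLab's internals weren't shared in detail, but a minimal chaos-style experiment follows the same shape: check a steady-state hypothesis, inject a controlled failure, then check it again. The health endpoint, failure hook, and latency threshold below are my own assumptions for the sake of a sketch.

# Minimal chaos-style experiment (illustrative; not REDLab itself).
# Steady state -> inject failure -> verify steady state still holds.
import time
import requests

HEALTH_URL = "https://staging.example.com/health"   # assumed endpoint

def steady_state_ok(max_latency_s=2.0):
    """Steady-state hypothesis: the health endpoint answers quickly with 200."""
    start = time.monotonic()
    resp = requests.get(HEALTH_URL, timeout=max_latency_s)
    return resp.status_code == 200 and (time.monotonic() - start) < max_latency_s

def inject_failure():
    """Placeholder for a controlled disruption, e.g. stopping a replica
    or adding network latency with a fault-injection tool."""
    print("Injecting failure: stopping one backend replica (simulated)")

if __name__ == "__main__":
    assert steady_state_ok(), "system unhealthy before the experiment"
    inject_failure()
    time.sleep(5)   # give the disruption time to take effect
    assert steady_state_ok(), "system did not absorb the injected fault"
    print("Experiment passed: steady state held under failure")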

Veritas also focuses on ways to validate and verify that code generation is done securely, along with a variety of ways to stress test software during multiple stages of the build process. The talk also touched on the importance of keeping technical teams motivated: examples of role-playing scenarios, movie stars, and innovative content add a touch of fun and can help keep development teams engaged.

As technologies evolve, so do the techniques required to keep software safe. Security is needed at every stage of the software development lifecycle. Using techniques like chaos engineering, along with creative team engagement, has helped Veritas stay at the forefront of secure software development.

Tuesday, October 11, 2022

Does Low Code Mean Low Testing? A #PNSQC2022 Live Blog



There has been an increase in the number of software development and deployment options referred to as "low code" or "no code". What this usually means is that the development tools in question provide systems and abstractions that either hide the code created or minimize the amount of new code that needs to be written, relying on built-in methods and implementations. Intriguing, but does that methodology limit our ability to test or interact with these systems?

Jan Jaap Cannegieter
Jan Jaap Cannegieter argues that many of these systems do have benefits and ways of interacting with how a system is pieced together. By dragging and dropping elements that have already been constructed and implemented, lots of reusable pieces can be put together more like Lego blocks than by writing individual code blocks and methods. The idea of reuse and repurposing is not new. Animated programs have been doing this for decades. Heck, Hanna-Barbera was famous for reusing whole blocks of animation and repurposing them in different scenes (ever notice that when The Brady Kids and The Archies perform their "songs" on their respective cartoons, they have the exact same movements? ;) ).

Again, the idea and benefit of low-code platforms is that they have four layers: processes, screen flows, business logic, and data model. The top two are likely the easiest to plug together, while the bottom two probably present the biggest challenges to implement and make reusable and pluggable. However, I would assume that, if the business logic and data model were "understood", there would be an easier way to plug everything together. I can't help but ask... who tests the business logic and the data model? How do we know that these are correct? With code, we can research and figure out whether the implementation is effective, whether we are missing key areas, or whether we have areas that can be exploited. If these are abstracted away, I'd argue that those areas become harder to test because we cannot verify (or would have difficulty verifying) what that business logic actually is. That's not to say we can't test the logic gates, but I feel we leave a lot on the table that doesn't get properly examined.
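
One way I could see approaching that is to treat the low-code app as a black box and check its business rules through whatever interface the platform exposes. A rough sketch, with an invented quote endpoint and an invented discount rule standing in for the real business logic:

# Black-box check of business logic we can't read directly.
# The endpoint and the discount rule are invented for illustration.
import requests

API = "https://lowcode-app.example.com/api/quote"   # assumed endpoint

def get_quote(amount, customer_type):
    resp = requests.post(API, json={"amount": amount, "customer": customer_type}, timeout=10)
    resp.raise_for_status()
    return resp.json()

def test_loyal_customer_discount_applied():
    """Assumed rule: loyal customers get 10% off orders over 100."""
    assert get_quote(200, "loyal")["total"] == 180

def test_no_discount_below_threshold():
    assert get_quote(50, "loyal")["total"] == 50

At least a failing check like that would tell us the abstraction is hiding a rule we didn't expect, even if we still can't read the rule itself.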

Anyway, I picked this talk because I specifically do not have a lot of experience with this topic or these tools. Does my skepticism hold up to scrutiny? I honestly do not know but I'm curious to explore more of these options so I can see if I'm right or wrong. Let's just say the jury's out at the moment but I freely confess my biases and doubts ;)