What is the first step of application security? And what is that step regardless of whether the process involved is DevOps or traditional silos? We have heard many answers before: architecture, code analysis, hardening, risk analysis, and so on. But we have not really talked about the intersection of the user, the application, the data, and the system. Perhaps this is part of architecture, but I see it as a need for all applications. Security must be able to protect the data and, simultaneously, the user. Security today encompasses the traditional confidentiality, integrity, and availability, as well as privacy.
To have good application security, we must not only architect the system well; the architecture must also be iterative. But before we even put pen to paper on an architecture, we need to think well outside the box about how the application will be used. In essence, we need to follow the user through the application to the data they seek, which currently resides on a system in our control. We need to determine where that data will end up: will it be downloadable to an end user computing (EUC) device such as a smart device, a point-of-sale (POS) device, an Internet of Things (IoT) device, or a desktop (virtual or physical)? In addition, what data will we take in from those devices, how will that data be protected, and where will it live?
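The "follow the user to the data" exercise can be sketched as a simple data-flow inventory: for each data element, record where it originates, where it comes to rest, how it is protected, and why it is kept at all. A minimal illustration in Python follows; all names and fields are hypothetical, not a prescribed schema:

```python
from dataclasses import dataclass, field

@dataclass
class DataFlow:
    """One data element traced from the user's device to storage (hypothetical model)."""
    name: str
    source_device: str                 # e.g. "smartphone", "POS terminal", "IoT sensor"
    destination: str                   # where the data comes to rest
    protections: list = field(default_factory=list)  # controls in transit and at rest
    retention_reason: str = ""         # the "why" of storage; empty means no stated need

    def is_justified(self) -> bool:
        # Data should not be stored just because we can.
        return bool(self.retention_reason)

# Trace a sample flow: geolocation from a mobile game.
flows = [
    DataFlow(
        name="geolocation",
        source_device="smartphone",
        destination="analytics database",
        protections=["TLS in transit"],
        retention_reason="",  # no stated need: flag for review
    ),
]

unjustified = [f.name for f in flows if not f.is_justified()]
print(unjustified)  # flows being stored without a stated need
```

Even a toy inventory like this forces the questions the architecture phase should answer before any code is written: what comes in, where it goes, and whether it needs to be kept.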
When you think about the recent instances of data mining and breaches, the real issue in my mind is not that they happen (which we have known about for years); rather, it is the scope of the data available to mine within databases or on systems without adequate protection. To me, this implies that the application does not have a handle on the security and privacy of user data transferred from the EUC device or the organizational data transferred to it. Once an application is out in the wild, there is a good chance it will be used in unfamiliar ways. These experiences should be tracked and analyzed as they happen.
This burden is carried not only by the application developers, but also by the application's users. Users need to protect themselves as much as developers need to protect users' privacy. But perhaps there are just too many dialogs asking whether you want to accept something; after all, we have been trained by everyone to agree to those requests. Just look at how folks use browsers, or at the demos that are done: the prompts get accepted and the data flows freely. To protect people's privacy, those dialogs should not appear; the functionality should be enabled purposefully, perhaps through a setting within the application. Perhaps it is time for us to review all of our settings to ensure that we are not leaking data about ourselves, such as sharing our geographic locations in EUC-based games: Angry Birds, anyone?
So, we started off discussing application security and ended up with privacy. The two go hand in hand. Application designers, developers, and security folks need to be cognizant of where their data is going, where user data is going, how data is stored, how it is protected, and for how long. Perhaps they should even know the why of the storage: data should not be stored just because we can; there should be some need for it. Users, on the other hand, also have a responsibility to protect their privacy and to ensure that applications do not receive more information than users want to give up.
A bit of homework for users: check your smartphone and tablet settings, and see what you are sharing. What access have you already granted to specific applications? Ask yourself: does the application really need access to the microphone, camera, location, sound, and other data?
Edward Haletky