Leaking sensitive data costs companies, on average, more than 3.25 million euros. With its Data Loss Prevention solution, which it develops and integrates deep into the operating system, Brno-based Safetica protects companies around the world against the loss of their sensitive data.

According to Safetica's CTO Zbyněk Sopuch, data protection means not only developing a perfect solution but also communicating with users and educating them. He says: “They need feedback on what they are doing and what they intended, and they need to learn it prudently. We are actually educating them in security.”

What technologies can developers at Safetica work with?

Safetica develops a solution that protects authorized users from errors in data handling. It solves the problems “between the chair and the keyboard”, so it needs to be wherever the data and the user meet: on the end device, on the mobile phone, in the cloud service, and so on. The technological domain is so broad that it is easier to define what we do not do than the other way around. Our solution is not just a mobile application or a website with databases, as many other services are. We are on all platforms. We integrate deep into the system, we write drivers, we interface with the user, we do data analytics. I like to joke that we do everything except manufacturing hardware, although we actually considered even that for a short time, for example in the form of encrypted flash drives.

What exactly is your technology stack? 

Our technology stack was initially strongly influenced by Microsoft. We are their Gold partner, and where it made sense, we built on their technologies, which meant the whole Windows platform (C++, C#) and a cloud backend on Azure. Generally speaking, though, we choose native technologies: we build the web tier in Angular, on macOS we use Objective-C, C++, and Swift, on iOS Swift, and on Android Java. Data analytics, for example, is in Python.

We do not work with Java and PHP, which would bring additional platform dependencies and security risks into the customer environment.

You are involved directly in the operating
system. Can you describe it in more detail?

For example, many companies create applications for macOS, but we go deeper and study how the system works: where its weaknesses are and how they can be addressed. Many developers did not realize that when Apple released macOS Catalina, the entire security model around the kernel changed and third-party kernel drivers were essentially banned. We knew it.

We know how the system works with files and what the user is doing on the network, and we try to give them a helping hand when necessary. We need to know what they are doing when, for example, they send data into the Apple ecosystem or Office 365, features that are more and more integrated with the operating system. Getting back to macOS, this practically means working at the Linux level, so we are recruiting “Linuxers” for macOS as well.

How do you approach different versions of operating systems? 

We have to support not only all mainstream versions of the operating systems but also third-party applications. That forces us into defensive coding, but above all it demands QA. We would not be able to do it manually; everything runs under robust automation that does most of the work for us overnight. But sometimes we run into a problem in the field that surprises us. For example, we have found a bug in an Adobe product that mishandled files, a bug in Microsoft's printing API, and even bugs in YSoft's drivers. Sometimes it is enough to report the bugs and they get corrected, but sometimes the companies neglect them and we have to deal with it ourselves. The customer expects that from us.
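To illustrate that defensive-coding style, here is a minimal Python sketch (the scenario and buffer format are hypothetical, not Safetica's actual code): a third-party printing API is supposed to hand back a NUL-terminated UTF-16 job name, but a buggy driver may return the wrong type, invalid bytes, or no terminator at all, so the caller trusts nothing and always has a fallback.

```python
def safe_decode_job_name(raw, max_len=255):
    """Defensively decode a job name from a (hypothetical) third-party
    print API that should return NUL-terminated UTF-16LE bytes."""
    fallback = "<unknown job>"
    if not isinstance(raw, (bytes, bytearray)):
        return fallback                      # wrong type: fall back, don't crash
    raw = bytes(raw[: max_len * 2])          # cap the length before anything else
    # Honour a UTF-16 NUL terminator if one exists on a code-unit boundary.
    for i in range(0, len(raw) - 1, 2):
        if raw[i] == 0 and raw[i + 1] == 0:
            raw = raw[:i]
            break
    try:
        name = raw.decode("utf-16-le", errors="replace").strip()
    except Exception:
        return fallback                      # never let a bad buffer propagate
    return name or fallback
```

The point is not the decoding itself but the posture: every assumption about the third-party input is checked, and every failure path degrades to a harmless default instead of crashing inside the customer's process.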

How do you differ technologically from mainstream
antivirus companies like Avast or ESET?

We have a similar technology stack, but simply put, they watch what comes in, while we deal primarily with authorized users, i.e. with what goes out. It is about inspecting outgoing data, detecting file content, and integrating with email clients and web browsers. We deal with the relation between the outgoing data and the company or legislation, such as the GDPR. We work deep in the network, decrypting SSL and watching what flows through it to make sure that company data does not leak out.
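As a toy illustration of that kind of outbound content inspection (not Safetica's actual detection engine), outgoing text can be scanned for payment-card numbers, with candidates validated by the standard Luhn checksum to cut down false positives:

```python
import re

# Candidate: 13-16 digits, optionally separated by spaces or hyphens.
CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_ok(digits):
    """Standard Luhn checksum: double every second digit from the right."""
    total, parity = 0, len(digits) % 2
    for i, ch in enumerate(digits):
        d = int(ch)
        if i % 2 == parity:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def find_card_numbers(text):
    """Return Luhn-valid card numbers found in outgoing text."""
    hits = []
    for m in CARD_RE.finditer(text):
        digits = re.sub(r"[ -]", "", m.group())
        if 13 <= len(digits) <= 16 and luhn_ok(digits):
            hits.append(digits)
    return hits
```

A real DLP engine layers many such detectors (document fingerprints, classifiers, regulated-data patterns) and applies them at the point where traffic is decrypted; the sketch only shows the basic shape of one pattern check.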

Another difference is the focus on user behavior. Nobody cares that thousands of operations have been performed on the disk. An antivirus does not care either; it checks the essentials and omits the rest. We, on the other hand, are expected to tell system behavior apart from the user's intention and check it contextually. Where appropriate, we explain to the user what is wrong. Can you imagine confronting users with what the corporate backup software does, or telling them you have found sensitive data in temporary application files? The user does not understand this and, most importantly, did not cause it. That is why we place the end user and communication with them at the top of our corporate priorities. Sometimes it is as much about psychology as it is about IT.
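A crude sketch of that kind of contextual filtering, assuming purely illustrative process names and path patterns (real products use far richer signals than a static list): events caused by automated agents or landing in temporary files are suppressed before anything reaches the user.

```python
# Illustrative, hard-coded examples; a real engine learns and configures these.
AUTOMATED_PROCESSES = {"backupd", "mdworker", "OneDrive"}
TEMP_HINTS = ("/tmp/", "/private/var/", "~$")

def worth_alerting(process_name, path):
    """Only confront the user with actions they actually initiated."""
    if process_name in AUTOMATED_PROCESSES:
        return False   # system behaviour (e.g. backup agent), not user intention
    if any(hint in path for hint in TEMP_HINTS):
        return False   # temporary application files the user did not cause
    return True
```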

Are you supervising traffic at the end stations, or in cloud services as well? 

We do the same in the cloud, but it is far more challenging, because ninety percent of what you can do at the end station cannot be done there. In addition, scaling is much more abrupt. Our solutions on end stations can be installed gradually, and each takes its computing power directly from the station. Now imagine Office 365, where turning the feature on can bring thousands of users into the cloud at once.

We monitor the impact on users a lot. Copying data from one server to another is fast. When we have to check it, and the check takes place in a cloud service, it takes ten times longer. The check itself is fast, but the data travels a different way. Then it is a question of how to solve it: caching, components in the right place, launching dialogs from the operating system, and the like.
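Caching is the most natural of those levers: if a scan verdict is keyed by a hash of the content, repeated transfers of the same file skip the expensive cloud round trip. A minimal sketch, assuming a pluggable `scan` callable (this is not Safetica's actual design):

```python
import hashlib

class ScanCache:
    """Cache DLP scan verdicts by SHA-256 of the content, so identical
    data is inspected only once instead of on every copy."""

    def __init__(self):
        self._verdicts = {}
        self.hits = 0
        self.misses = 0

    def verdict(self, data, scan):
        key = hashlib.sha256(data).hexdigest()
        if key in self._verdicts:
            self.hits += 1                 # cheap path: no re-scan needed
            return self._verdicts[key]
        self.misses += 1
        result = scan(data)                # expensive path: run the real check
        self._verdicts[key] = result
        return result
```

In practice such a cache also needs eviction and invalidation when detection rules change, but the hash-keyed lookup is the core idea.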

What is your approach to QA? 

With great emphasis. On the one hand, we directly influence the work of each customer’s employees, and we can very easily block them. And then there is the security aspect; we must not fail there. We work with the client's most sensitive data, and if we did not think through the consequences, we could expose it. We work for clients with extreme security requirements, such as banks or security services.

Therefore, our release process is thoroughly elaborated, with an emphasis on eliminating the risk of human error. This allows us to react and release quickly even under such strict conditions. We also have a mandatory code review for everything that goes out, and we strive to share information and experience as much as possible. Basically, every one of us is involved in QA, from the product manager to the developers, who write their unit tests and do the aforementioned code reviews. The QA engineers are responsible for how a feature is designed conceptually, but also for the entire automation architecture and the team's DevOps. They are not mere “clickers” but full-fledged engineers who have built our entire automation product.

How are your teams organised?  

They are organized as business-oriented agile teams, with related areas joined together. For example, one team focuses on data security for Windows, so it also handles the integration with Office 365 and is responsible for the entire security model around the end user. Another team handles the cloud backend, endpoint communication, and data processing, and is therefore responsible for reporting and interacting with the admin. The Mac team, for example, also takes care of data detection and analysis, which is a separate business area.

Our intent is for our people to grow not only in their technical skills but also as humans and in terms of competencies. Our people have a very broad overview: they work across the business vertical, from networks and low-level code to cloud and front end, but they also go very deep in their favorite area of expertise. We need a balance between substitutability and high-end expertise.

Which technological challenges are you expecting to face? 

I will start from the bottom, where the most specific challenges lie. Concerning end-station technology, we already have the necessary integration with every OS except Linux; that will come later. The big challenge is the cloud. There are several approaches, but it is a rapidly evolving area. And then there are the more predictable areas: we are building a robust, truly native cloud SaaS system, and we are gradually going to automate more with machine learning and data analysis. The only trend we have no use for yet is perhaps blockchain. But if anyone has a good idea, come to us with it!

Petra Voženílková
HR Manager @Safetica

She devotes herself to the people who make up Safetica and enthusiastically introduces the company to those who might become a part of it one day.
