SUCCESS CASE #IoT #BigData
Multi-tenant data platform for IoT event historization and real-time analytics
Knolar is a business unit created by Cepsa (Compañía Española de Petróleos SA) in 2021 to address the needs of Industry 4.0: monitoring industrial facilities and making that data available to analysts and data scientists.
Knolar is already in use at Cepsa, where it provides fast, visual access to real-time statistics with the help of artificial intelligence. It is also a solution for exploiting industrial data in sectors such as automotive, food, transport, chemicals, utilities, smart cities…
Knolar meets the data needs of your entire team: by democratizing access to data, it lets every level of your business exploit the tool’s potential.
Knolar connects and integrates, in a simple, secure, agile, scalable and low-cost way, the data coming from the sensors of your industrial facilities, home-automation environments or any IIoT/IoT device, giving you full control over what is happening in real time in your plants, industrial sites, or any asset in the operational technology (OT) domain.
In addition, Knolar ensures convergence with the data coming from your IT systems, such as CRMs, ERPs, applications or data warehouses, providing a global view of your company.
Knolar removes the complexity of IIoT/OT and IT source systems by providing real-time, one-stop access to sophisticated, complex and previously unexplored data sources, generating real value for your business.
- The user experience had to hide what was under the hood: users would not interact directly with AWS services, but through a web portal exposing all the functionality for data ingestion, storage and consumption.
- The platform had to be multi-tenant, so that each customer account would have its own set of users with predefined profiles within the platform, and its data would be isolated from the rest of the accounts.
- At the data ingestion level, different levels of latency were required, from real-time ingestion keeping the data hot and available for sub-second consumption, to batch file ingestion. These ingestions had to be contextualized, so that a person with no programming knowledge could use them (see the ingestion sketch after this list).
- Data enrichment with other metadata had to be supported.
- At the consumption level, there were different requirements to support:
- Descriptive analytics: business users needed to consume data via Excel directly from the database or data lake, plus a standard ODBC interface to connect a BI tool (see the query sketch after this list).
- Advanced analytics: The platform was to be the centerpiece on which an ecosystem of predictive (predictive maintenance, demand forecasting…) or prescriptive (next-best action, forensic analysis of asset symptomatology…) analytics data products could be built.
- SaaS model: the architecture should be standard across all customers without losing flexibility, scaling with each customer’s demand for data ingestion and consumption.
- As the platform evolves, canary releases should be possible for certain environments, along with a gradual rollout of functionality to different clients.
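As an illustration of the contextualized, tenant-aware real-time ingestion described above, the sketch below shows how a sensor event could be pushed into a hot-path stream. This is a minimal example under assumptions, not Knolar’s actual implementation: the stream name, tenant identifier, context fields and the use of Amazon Kinesis are all hypothetical, and the real platform hides this behind its web portal.

```python
import json
import time

import boto3

# Hypothetical stream name; the real platform abstracts this behind its portal.
KINESIS_STREAM = "knolar-ingest-hot"
kinesis = boto3.client("kinesis")


def ingest_event(tenant_id: str, sensor_id: str, value: float, context: dict) -> None:
    """Push a single sensor reading, enriched with tenant and asset context,
    so downstream consumers can query it without touching the source system."""
    record = {
        "tenant_id": tenant_id,            # keeps each customer's data logically isolated
        "sensor_id": sensor_id,
        "value": value,
        "event_ts": int(time.time() * 1000),
        "context": context,                # e.g. plant, unit, asset hierarchy metadata
    }
    kinesis.put_record(
        StreamName=KINESIS_STREAM,
        Data=json.dumps(record).encode("utf-8"),
        # Partitioning by tenant keeps a tenant's events ordered within a shard.
        PartitionKey=tenant_id,
    )


# Example call from an edge gateway or ingestion API:
ingest_event(
    tenant_id="acme-chemicals",
    sensor_id="reactor-3/temperature",
    value=187.4,
    context={"plant": "huelva", "unit": "reactor-3"},
)
```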
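For the descriptive-analytics requirement, business users connect through Excel or an ODBC driver; programmatically, the same data-lake tables can be queried with standard SQL. The sketch below assumes an Amazon Athena database, a `sensor_events` table and a results bucket per tenant; all names are illustrative.

```python
import boto3

athena = boto3.client("athena")

# Hypothetical per-tenant database and results location.
DATABASE = "knolar_acme_chemicals"
RESULTS = "s3://knolar-acme-chemicals-athena-results/"

# Standard SQL over the data lake: hourly average per sensor for one day.
query = """
SELECT sensor_id,
       date_trunc('hour', from_unixtime(event_ts / 1000)) AS hour,
       avg(value) AS avg_value
FROM sensor_events
WHERE date(from_unixtime(event_ts / 1000)) = date '2021-06-01'
GROUP BY 1, 2
ORDER BY 1, 2
"""

response = athena.start_query_execution(
    QueryString=query,
    QueryExecutionContext={"Database": DATABASE},
    ResultConfiguration={"OutputLocation": RESULTS},
)
print("Query started:", response["QueryExecutionId"])
```

The same tables sit behind the ODBC endpoint, so a spreadsheet, a BI tool and an advanced-analytics notebook all read one copy of the data.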
To meet the platform’s multi-tenancy needs and isolate data, environments and services for each client, we opted to build a SaaS. Keepler relied on the AWS SaaS Factory best-practices framework and services such as AWS Control Tower, deploying the solution with a combined siloed + pooled tenancy model.
The result is centralized governance of the customer accounts deployed under the landing zone and automatic provisioning of new accounts at the click of a button (see the provisioning sketch below). Audit logs are also centralized and can be exploited at account and user level, enabling cross-cutting security audits.
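One common way to automate account creation under Control Tower is to launch the Account Factory product registered in AWS Service Catalog. The sketch below illustrates that pattern only; the product and artifact IDs, emails and organizational unit are placeholders, and the actual portal would collect these values from the operator.

```python
import boto3

servicecatalog = boto3.client("servicecatalog")

# Placeholder identifiers: look these up from the Account Factory product
# that Control Tower registers in Service Catalog.
PRODUCT_ID = "prod-xxxxxxxxxxxxx"
ARTIFACT_ID = "pa-xxxxxxxxxxxxx"


def provision_tenant_account(tenant_name: str, account_email: str, sso_email: str) -> str:
    """Launch the Account Factory product to create a new, governed tenant
    account inside the landing zone (siloed part of the tenancy model)."""
    response = servicecatalog.provision_product(
        ProductId=PRODUCT_ID,
        ProvisioningArtifactId=ARTIFACT_ID,
        ProvisionedProductName=f"tenant-{tenant_name}",
        ProvisioningParameters=[
            {"Key": "AccountName", "Value": tenant_name},
            {"Key": "AccountEmail", "Value": account_email},
            {"Key": "SSOUserEmail", "Value": sso_email},
            {"Key": "SSOUserFirstName", "Value": "Platform"},
            {"Key": "SSOUserLastName", "Value": "Admin"},
            {"Key": "ManagedOrganizationalUnit", "Value": "Tenants"},
        ],
    )
    # Track the launch via the returned provisioning record.
    return response["RecordDetail"]["RecordId"]
```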
Once the platform scaffolding was provisioned, the development of the client tenant architecture consisted of the following layers:

Keepler is a boutique professional technology services company specialized in the design, construction, deployment and operation of Big Data and Machine Learning software solutions for large clients. It uses Agile and DevOps methodologies and native public cloud services to build sophisticated, data-focused business applications integrated with different sources in batch and real time. Keepler holds the AWS Advanced Consulting Partner tier, with 90% of its technical workforce certified in AWS, and currently works for large clients in markets such as financial services, industry, energy, telecommunications and media.