As companies accelerate their digitalisation, the main short-term strategies for improving margins and reducing costs revolve around three areas: improving operational and resource efficiency, optimising time-to-market, and differentiating products.

Sectors such as Energy, Healthcare, Manufacturing, Retail, Insurance and Fintech have seen a boom in solutions based on big data and artificial intelligence: personalisation of products and services, security and privacy of PII, large-scale transactional speed, fraud detection, process automation, predictive maintenance and the ubiquitous chatbots, among others.

The Data Product approach is helping to focus such solutions on a specific business challenge to be solved through data, enabling them to deliver high business impact quickly and demonstrably in areas such as improving customer experience and employee productivity, creating new products and strengthening competitiveness.

Relying on technology partners or collaborators with specialised knowledge and experience is the most reliable way to address these challenges, especially in organisations that are just starting to work with data and need to establish appropriate Data Management and Data Governance practices and policies. For this development to be sustainable, however, it requires investment in training by companies and a personal effort by individuals to acquire new skills and maintain an open attitude towards continuous learning.

Companies should continue to adopt best practices, in the form of regulation and commitments, to ensure that this developing technology is applied as transparently, ethically and fairly as possible. This means building datasets that are as representative as possible, checking for bias by defining metrics across different subgroups, conducting model sensitivity analyses, and prioritising interpretability by applying the simplest models that meet the objectives.
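
As an illustration of the subgroup-metric check mentioned above, here is a minimal Python sketch; the data and column names ("group", "label", "prediction") are hypothetical placeholders, and in practice the metric and subgroups would come from the specific use case:

```python
# Minimal sketch: comparing a classification metric across subgroups.
# Data and column names are hypothetical.
import pandas as pd
from sklearn.metrics import recall_score

df = pd.DataFrame({
    "group":      ["A", "A", "A", "B", "B", "B"],
    "label":      [1, 0, 1, 1, 0, 1],
    "prediction": [1, 0, 0, 1, 1, 1],
})

# Recall (true positive rate) per subgroup; large gaps hint at bias.
per_group = {
    grp: recall_score(sub["label"], sub["prediction"])
    for grp, sub in df.groupby("group")
}
print(per_group)
print("max gap between subgroups:", max(per_group.values()) - min(per_group.values()))
```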

Challenges

To move forward, companies will have to overcome challenges such as the following:

  • Adopting a more data-centric perspective in AI projects, where the priority is not to accumulate data but to improve data quality and eliminate inconsistent records. Data quality labelling, data augmentation strategies, data versioning and feature stores will accelerate this process (a minimal data-quality check is sketched after this list).
  • Taking privacy and security into account from the moment AI projects are defined: beyond securing information, teams will also need practices that make models more robust and reliable, applying techniques such as adversarial training so that models do not misbehave on corrupted inputs or rare scenarios (see the adversarial training sketch after this list).
  • Automating cognitive processes, incorporating services available on the different cloud platforms (voice, image, text or decision) or using pre-trained state-of-the-art multimodal models (DALL·E or CLIP, for example) and text models (GPT-4) to solve creative tasks such as semantic synthesis, generating new textual and visual content, or answering questions interactively (a zero-shot CLIP example follows this list).
  • Scaling big data capabilities for increasingly demanding processes, taking into account the rise of quantum computing for large-scale simulations and optimisation challenges, among others.
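
To make the data-centric point concrete, below is a minimal sketch of automated data-quality checks on a tabular dataset; the columns and thresholds are hypothetical assumptions, and in practice this logic would usually live in a dedicated data-quality tool or feature store:

```python
# Minimal sketch of data-centric quality checks (hypothetical columns and data).
import pandas as pd

def quality_report(df: pd.DataFrame) -> dict:
    """Return simple data-quality indicators used to gate training data."""
    return {
        "rows": len(df),
        "duplicate_rows": int(df.duplicated().sum()),
        "missing_ratio": df.isna().mean().to_dict(),   # per-column missingness
        "constant_columns": [c for c in df.columns if df[c].nunique(dropna=True) <= 1],
    }

df = pd.DataFrame({"sensor": [1.0, 1.0, None, 3.2], "site": ["a", "a", "a", "a"]})
print(quality_report(df))

# Prefer removing inconsistent records over accumulating more raw data.
clean = df.drop_duplicates().dropna(subset=["sensor"])
```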
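For the robustness point, here is a minimal sketch of adversarial training in PyTorch using FGSM-style perturbations; the toy data, model architecture and epsilon value are illustrative assumptions, not a production recipe:

```python
# Minimal FGSM-style adversarial training sketch (toy data and model; epsilon is illustrative).
import torch
import torch.nn as nn

torch.manual_seed(0)
X = torch.randn(256, 10)                      # toy features
y = (X.sum(dim=1) > 0).long()                 # toy binary labels

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
epsilon = 0.1                                 # perturbation budget

for epoch in range(5):
    # 1) Build adversarial examples with the fast gradient sign method.
    X_adv = X.clone().requires_grad_(True)
    loss_fn(model(X_adv), y).backward()
    X_adv = (X_adv + epsilon * X_adv.grad.sign()).detach()

    # 2) Train on a mix of clean and adversarial samples.
    optimizer.zero_grad()
    loss = loss_fn(model(X), y) + loss_fn(model(X_adv), y)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss={loss.item():.3f}")
```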
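And for the pre-trained multimodal models, a minimal zero-shot image classification sketch using the public openai/clip-vit-base-patch32 checkpoint via Hugging Face transformers; the image path and candidate labels are hypothetical:

```python
# Minimal zero-shot classification sketch with a pre-trained CLIP model.
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open("example.jpg")             # hypothetical local image
labels = ["a photo of a cat", "a photo of a dog", "a photo of a factory"]

inputs = processor(text=labels, images=image, return_tensors="pt", padding=True)
outputs = model(**inputs)
probs = outputs.logits_per_image.softmax(dim=-1)   # image-text similarity as probabilities

for label, p in zip(labels, probs[0].tolist()):
    print(f"{label}: {p:.2f}")
```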


Image: Freepik

Author

  • Javier Pacheco

    Data Scientist at Keepler Data Tech: "Live full, die empty" defines my state of mind. It has become my lifestyle, taking me out of my comfort zone and driving my voracious appetite for learning about different aspects of Data Science. I love learning by teaching and am always open to new challenges that push my understanding further.