Technology

This is the place to find out about the technology we develop, the assets we bring to new projects and our R&D activity.


Data Science

Data Science is a fast-growing field at the crossroads of mathematics and computer science. We have built a team of data scientists to redefine how Data Science can transform operations processes, and we run continuous R&D to develop and validate innovative ways to manage the complexity of operations.

  • We leverage the latest algorithms and Machine Learning libraries such as scikit-learn, PyTorch, GluonTS, SHAP, spaCy, etc. (a minimal illustrative sketch follows this list)
  • We are multi-cloud and trained to work on different cloud platforms such as GCP, Azure or AWS
  • We leverage MLOps tools such as Docker, Kubeflow, Airflow and MLflow
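
As a purely illustrative sketch (not a client deliverable), the snippet below shows the kind of pattern these libraries support: a scikit-learn gradient-boosted model trained on a small synthetic demand dataset. All column names and figures are made up.

```python
# Illustrative only: a minimal scikit-learn workflow on synthetic demand data.
import numpy as np
import pandas as pd
from sklearn.ensemble import HistGradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
data = pd.DataFrame({
    "price": rng.uniform(5, 50, size=400),
    "promo": rng.integers(0, 2, size=400),
    "week_of_year": rng.integers(1, 53, size=400),
})
# Synthetic target: demand falls with price and rises during promotions.
data["demand"] = 100 - 1.5 * data["price"] + 30 * data["promo"] + rng.normal(0, 5, size=400)

X_train, X_test, y_train, y_test = train_test_split(
    data.drop(columns="demand"), data["demand"], random_state=0
)

model = HistGradientBoostingRegressor().fit(X_train, y_train)
print(f"Held-out R^2: {model.score(X_test, y_test):.2f}")
```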

Data Science expertise

  • Tier-1 engineering schools and leading Data Science Master's programmes (MVA, ENSAE, Stanford, X-HEC, École Centrale…)
  • We use agile principles to develop data solutions

Senior resources with a mix of data & operations experience

  • Project managers with experience working with operations teams
  • We organize and oversee the rituals of data teams

Domains of expertise:

  • Machine Learning and Deep Learning
  • Reinforcement Learning
  • Data engineering
  • Data lakes and distributed systems
  • Data visualization and Business Intelligence
  • Operations Research
  • Complex modeling

IoT - Internet of Things

The emergence of dedicated global, low-cost IoT networks (Sigfox, LoRa, NB-IoT, etc.) has significantly reduced the total cost of IoT data collection (position, temperature, shock, light, humidity, etc.) compared to previous generations.

IoT sensors are now low cost, can run autonomously for several years and carry a subscription cost of a few dollars per year through cloud-based solutions. This enables the automated tracking of millions of fixed or mobile objects: finished and semi-finished products, containers, durable containers, tools, etc.


Our expertise:

  • Identify the use cases of IoT in your organization
  • Build the business case with an assessment of expected benefits and costs
  • Experiment quickly, thanks to agile solutions that allow us to measure the relevance of the tested solution (proof of value)
  • Produce an initial concept of the optimal sensor and network solution, together with an implementation strategy (off-the-shelf solution or custom development)
  • Anticipate the impact on the organization, business and systems and support the associated transformation
  • Develop and deploy IoT solutions (tracker + platform) with a particular focus on quality and time-to-market

Data platform

We have developed our expertise in data platforms. This layer of data-centric components has proven to be a key enabler for accelerating the development of analytics use cases, while ensuring synergies and applying governance principles. It speeds up data ingestion, transformation and activation.

A key success factor in supporting an ambitious data roadmap is a technological layer, the data platform, which brings together all business data (structured and unstructured), Machine Learning engines, a data dictionary, development processes, connectivity (such as APIs), etc. Kept in sync with all internal and external transactional data sources, it becomes the “single source of truth” for every business use case developed. A minimal orchestration sketch follows the component list below.

Key components are:

  • Data governance and lineage
  • Data ingestion, quality control and storage
  • Self-service BI
  • Machine learning pipelines
  • Automation
  • Sandbox & production environments
  • APIs
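
As a purely illustrative sketch of the automation component (not our actual platform code), the example below shows how a pipeline orchestrator such as Airflow, mentioned above, can chain ingestion, quality control and transformation. The DAG and task names are hypothetical and the task bodies are placeholders.

```python
# Illustrative Airflow DAG: chain ingestion, quality control and transformation daily.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_sales_data():
    """Pull raw files from a source system into the landing zone (placeholder)."""
    ...


def check_data_quality():
    """Run basic quality rules: row counts, null rates, referential checks (placeholder)."""
    ...


def transform_to_analytics_layer():
    """Build the curated tables used by BI and ML pipelines (placeholder)."""
    ...


with DAG(
    dag_id="data_platform_daily_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest", python_callable=ingest_sales_data)
    quality = PythonOperator(task_id="quality_control", python_callable=check_data_quality)
    transform = PythonOperator(task_id="transform", python_callable=transform_to_analytics_layer)

    ingest >> quality >> transform
```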

Assets
R&D

We develop our own approaches and tools to accelerate our work with clients. These range from sets of Machine Learning code libraries for forecasting, text analysis, etc. to packaged solutions with their own data architecture, visuals and user interface.

Our products and assets guide our long-term roadmap. They are either open-sourced or made available as part of the work we do for our clients.

Supply Control Tower

Argon & Co Supply Control Tower is a proprietary solution that enables all supply chain stakeholders to work in a customer-centric way with shared priorities, and to anticipate stock-outs.

The solution implements end-to-end pegging between demand and supply resources to reveal the interdependencies between all purchase orders, production orders and transfer orders, and their consequences for final customers.
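
To illustrate the idea of end-to-end pegging (this is not the Supply Control Tower's implementation), the short sketch below represents pegging links as a small directed graph and traces which customer orders are impacted by a given upstream order. All order IDs are made up.

```python
# Illustrative pegging graph: supply orders point to the downstream orders they feed.
from collections import deque

pegging = {
    "PO-1001": ["WO-2001"],             # purchase order feeds a production order
    "WO-2001": ["TO-3001", "TO-3002"],  # production order feeds two transfer orders
    "TO-3001": ["CO-9001"],             # transfer orders feed customer orders
    "TO-3002": ["CO-9002"],
}


def impacted_customer_orders(order_id: str) -> set[str]:
    """Walk the pegging graph downstream and collect the customer orders reached."""
    impacted, queue = set(), deque([order_id])
    while queue:
        current = queue.popleft()
        for downstream in pegging.get(current, []):
            if downstream.startswith("CO-"):
                impacted.add(downstream)
            else:
                queue.append(downstream)
    return impacted


print(impacted_customer_orders("PO-1001"))  # {'CO-9001', 'CO-9002'}
```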

ForecastCore

ForecastCore is a proprietary set of algorithmic tools, developed in Python, that accelerates feature creation and the use of Python Machine Learning libraries. It was built to benchmark a variety of ML approaches and identify the most relevant features and algorithms to use as a base, allowing us to quickly develop Machine Learning scenarios specific to your organization.

ForecastCore leverages four years of R&D and keeps being improved every day.
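
As a purely illustrative example of the general pattern ForecastCore industrializes (this is not ForecastCore's own code), the sketch below generates lag and rolling-window features from a synthetic weekly sales history and fits a standard regressor on them.

```python
# Illustrative only: lag and rolling-window feature creation on synthetic weekly sales.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
sales = pd.Series(
    100 + 10 * np.sin(np.arange(156) * 2 * np.pi / 52) + rng.normal(0, 3, 156),
    index=pd.date_range("2021-01-03", periods=156, freq="W"),
    name="sales",
)

# Feature creation: lags and a rolling mean, the kind of transformation the toolkit automates.
features = pd.DataFrame({
    "lag_1": sales.shift(1),
    "lag_52": sales.shift(52),
    "rolling_mean_4": sales.shift(1).rolling(4).mean(),
}).dropna()
target = sales.loc[features.index]

train, test = features.index[:-12], features.index[-12:]
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(features.loc[train], target.loc[train])
print(f"Test R^2 over the last 12 weeks: {model.score(features.loc[test], target.loc[test]):.2f}")
```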

Horizon

Horizon is a forecasting and data visualization tool. It takes basic business data as input (sales, stock, product data, actual sales and latest forecasts) and provides statistical forecast simulations based on a variety of algorithms. The results are analyzed in a data visualization tool (Power BI) that combines activity and forecast dashboarding features.

Ecosystem

From open-source libraries to all-inclusive data lab solutions, the data and analytics solution market is a fast-evolving environment.

We are connected to most key players and certified on a set of solutions and frameworks to ensure we can deliver use cases in your selected environment. We also operate our own data platform within Argon & Co.
