Managing Inconsistent, Duplicate, and Non-Equivalent Spare Parts in a Busy Factory

In a busy factory environment, managing spare parts inventory is complex. You deal with parts from multiple manufacturers and distributors, each with different naming conventions, leading to duplicate records, over-purchasing, and operational inefficiencies. This can result in significant amounts of capital tied up in excess stock, inaccurate procurement, and production delays.

As you embark on a new capital project, these challenges increase, making it harder to control inventory and potentially leading to over-purchasing of the same parts under different names.

KOIOS Solution: A Data-Driven Approach to Spare Parts Management

KOIOS offers a data cleansing and governance solution that standardizes spare parts data in your ERP system, reducing duplication, identifying equivalent parts, and optimizing your procurement process. By creating a single source of truth, KOIOS enables you to reduce working capital, improve operational efficiency, and make better decisions based on clean, real-time data.

Key Benefits of KOIOS for Spare Parts Management in a Factory Setting

1. Reduction in Working Capital

Minimized Overstock Through Equivalence Matching: By identifying duplicate and equivalent parts in your system, KOIOS reduces unnecessary stock. In a typical manufacturing setting, inventory levels can be reduced by 10% to 20%. For example, in a factory with $2 million worth of spare parts in stock, KOIOS could free up $200,000 to $400,000 in working capital.

Eliminate Redundant Purchases: In a busy factory, ordering duplicate parts is common due to different naming conventions across suppliers. KOIOS standardizes this data, reducing duplicate orders and freeing up an additional 5% to 10% of your procurement budget, potentially saving $50,000 to $100,000 annually in unnecessary purchases.
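
As an illustration of how naming conventions create duplicates, here is a minimal sketch of collapsing supplier-specific formatting before comparing part numbers; the catalogue entries and the normalisation rule are illustrative assumptions, not the KOIOS matching logic:

```python
# A minimal sketch of duplicate detection across naming conventions:
# normalise manufacturer part numbers before comparing them.
# The catalogue entries and the rule are illustrative assumptions.
import re

def normalise(part_number: str) -> str:
    """Uppercase and strip separators so 'skf 6204-2rs' matches 'SKF6204 2RS'."""
    return re.sub(r"[\s\-./]", "", part_number.upper())

catalogue = [
    ("SKF 6204-2RS", "Supplier A"),
    ("skf6204 2rs", "Supplier B"),   # the same bearing, different convention
    ("NSK 6204DDU", "Supplier C"),
]

seen: dict[str, str] = {}
for raw, supplier in catalogue:
    key = normalise(raw)
    if key in seen:
        print(f"Duplicate: {raw!r} ({supplier}) matches an entry from {seen[key]}")
    else:
        seen[key] = supplier
```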

2. Operational Efficiencies in a Fast-Paced Environment

Accurate Bills of Materials (BoMs): KOIOS generates clean, standardized BoMs, ensuring that only necessary parts are ordered for projects. By reducing over-ordering and ensuring the right parts are available, KOIOS can reduce procurement lead times by 15% to 30%, which translates into faster project completions and fewer production delays.

Reduced Manual Effort: Automated data cleansing and the identification of equivalent parts can reduce the time spent on manual data management by 50%, allowing your procurement and warehouse teams to focus on higher-value activities.

Supplier Optimization: With clear equivalence data, KOIOS enables you to compare prices and lead times across suppliers more effectively, improving your negotiation position and reducing part costs by 5% to 15%.

3. Improved Data Quality and Governance

Single Source of Truth: KOIOS eliminates inconsistencies and duplicates, ensuring that both warehouse and procurement teams work from accurate, standardized information. Improved data quality can reduce ordering errors by 20%, leading to smoother operations and fewer unexpected shortages or delays.

Ongoing Data Governance: KOIOS provides continuous monitoring and updates to your data, ensuring that new parts introduced during your capital project are standardized from the outset, preventing future duplications and improving long-term inventory management.

4. Cost Savings in a Factory Context

Procurement Savings from Equivalence Substitution: KOIOS’ ability to identify equivalent parts across suppliers allows you to choose the most cost-effective option. This flexibility can reduce procurement costs by 5% to 10%, potentially saving $50,000 to $100,000 annually, depending on the scale of your operation.

Optimized Warehouse Space: By reducing redundant and overstocked inventory, KOIOS helps you optimize your warehouse space, potentially reducing storage costs by 10% to 15%. In a factory where warehouse costs total $200,000 annually, this can mean a $20,000 to $30,000 reduction.

5. Decision-Making Control and Flexibility

Empowered Teams with Real-Time Data: KOIOS provides real-time, accurate data to both warehouse and procurement teams, giving them the insights they need to make informed decisions. With clean data and equivalence matching, they can act quickly to substitute parts or negotiate with suppliers, reducing downtime by up to 20%.

Future-Proof for Capital Projects: As new parts are introduced during your capital project, KOIOS ensures that they are seamlessly integrated into your ERP system, preventing future issues of mismanagement and duplication. This reduces the administrative burden by 30%, freeing up time for more strategic activities.

Equivalence: A Game Changer in Spare Parts Management

In the real-world operations of a busy factory, recognizing equivalent parts is crucial for reducing unnecessary purchases and ensuring operational efficiency. KOIOS identifies interchangeable parts, allowing you to make faster, more informed decisions when sourcing from different suppliers. This reduces your reliance on single suppliers and improves your flexibility in managing unexpected shortages or price fluctuations.

Conclusion: Significant Financial and Operational Gains with KOIOS

KOIOS is a powerful solution for the real-world challenges of managing spare parts in a factory environment. By cleansing and standardizing your ERP data, identifying equivalences, and improving governance, KOIOS can reduce your working capital requirements, improve procurement efficiency, and save significant costs. For a factory with a $2 million spare parts inventory, KOIOS could deliver savings of $300,000 to $500,000 through reduced inventory levels, better procurement choices, and optimized warehouse space.

With KOIOS, you gain control over your spare parts data, allowing your teams to make smarter, faster decisions, and ensuring that your operations run smoothly — both during day-to-day activities and new capital projects.

Contact us

Give us a call and find out how we can help you.

+44 (0)23 9387 7599

info@koiosmasterdata.com

The Power of Data Quality in the Global Supply Chain

In today’s interconnected world, data quality is a critical factor in the success of global supply chains. As supply networks become more complex, businesses need more than just access to data—they require high-quality, structured, and standardized data to ensure interoperability and transparency across industries and borders.

The Case for a Global Product Data Library

A Global Product Data Library is the cornerstone of modern supply chains. By consolidating verified, manufacturer-sourced product data into one trusted repository, this library creates a single source of truth. This reduces discrepancies, minimizes fragmentation, and ensures that manufacturers, distributors, and end-users all access consistent and reliable data.

The benefits of a centralized data library extend beyond data sharing:

1 – Consistency: Standardizing product information across suppliers prevents costly errors from miscommunication and data misalignment.

2 – Accuracy: Manufacturer-verified data ensures businesses are making informed, precise decisions with the most up-to-date information.

3 – Global Accessibility: Data is available in multilingual, machine-readable formats, ensuring seamless integration and eliminating misinterpretation.

The Impact of Structured, Interoperable Data

Structured, interoperable, and semantic data unlocks the full potential of supply chain efficiency. When data is standardized and accessible, businesses can achieve several transformative benefits:

1 – Enhanced Decision-Making: Accurate data empowers businesses to make real-time, data-driven decisions—whether optimizing inventory, predicting demand, or automating procurement.

2 – Reduced Operational Costs: Poor data quality often leads to inefficiencies, excess inventory, and increased overhead. Interoperable data minimizes these challenges, streamlining operations and cutting costs.

3 – Improved Visibility and Resilience: Seamless data exchange enhances supply chain transparency and coordination, reducing risks and making the entire chain more adaptable to disruptions.

4 – Sustainability: High-quality data optimizes procurement and logistics, reducing waste and environmental impact. Companies can maintain accurate inventories, avoid unnecessary shipments, and contribute to a more sustainable supply chain.

Conclusion: The Time for Data Quality is Now

As global supply chains continue to evolve, the need for structured, quality, and interoperable data becomes more critical. A Global Product Data Library, like KOIOS, ensures that businesses can operate on a unified, reliable foundation. Investing in data quality doesn’t just improve internal operations—it builds a resilient, efficient, and sustainable future for the entire global supply chain.

Contact us

Give us a call and find out how we can help you.

+44 (0)23 9387 7599

info@koiosmasterdata.com

Flying Blind: How Bad Data Undermines Business and How KOIOS Solves It

As businesses increasingly adopt data-driven systems and processes, leaders are striving to “Moneyball” everything from operations to decision-making. But as data volumes continue to soar, the quality of that data has become a critical challenge. KOIOS offers a revolutionary approach to tackling this issue, transforming fragmented, unstructured data into high-quality, machine-readable information, creating a single source of truth for global supply chains.

With AI, machine learning (ML), and the Internet of Things (IoT) now contributing massive volumes of data daily, the need for robust data quality standards is more urgent than ever. KOIOS addresses this head-on by using international data standards like ISO 8000, and by housing data within its Product Data Library, ensuring that product information is accurate, standardized, and interoperable across global systems.

The High Cost of Low-Quality Data

The cost of bad data is staggering. According to Gartner, businesses lose an average of $15 million per year due to poor data quality. Meanwhile, IBM estimates that poor data quality drains a whopping $3.1 trillion from the U.S. economy annually. These losses stem from inefficiencies such as lower productivity, system outages, and higher maintenance costs, all of which KOIOS mitigates through its structured, interoperable data solutions.

The problem is exacerbated by the lack of trust in existing business data. A Vanson Bourne study found that 91% of IT decision-makers recognize the need to improve data quality, with 77% reporting a lack of trust in their organization’s data. KOIOS directly addresses these concerns by offering a robust platform where data is consistently verified against a trusted single source of truth—the KOIOS Product Data Library.

The KOIOS Approach: Data Quality and a Single Source of Truth

KOIOS distinguishes itself through its rigorous approach to data governance. Unlike traditional systems, KOIOS builds data quality from the ground up by leveraging AI and ML to onboard, cleanse, and validate data against its Product Data Library. This library acts as a central repository of authoritative product data, ensuring all data is compliant with international standards and ready for real-time, global use.

By focusing on data interoperability, KOIOS removes the silos and inconsistencies that often plague supply chain and enterprise data systems. Every product, process, and actor is mapped to global standards (such as GS1, ISO 22745, and ISO 8000) to ensure the data can flow seamlessly between different systems, eliminating the operational inefficiencies caused by bad data.

A Case for Data Quality Standards

Consider the airline industry, where poor data quality often results in “mistake fares,” causing airlines to either lose revenue or face public backlash. In global supply chains, the cost of bad data can be even higher—misclassified products, incorrect tariffs, and regulatory fines are just a few of the consequences. KOIOS provides businesses with the tools they need to prevent these costly errors by delivering real-time, accurate data that meets the highest quality standards.

For example, financial services depend heavily on accurate credit scoring algorithms. When bad data feeds into these systems, it can cause consumers to be misclassified as risky borrowers, leading to what’s known as “algorithmic bias.” KOIOS prevents these scenarios by ensuring that all data within the system is properly cleansed, standardized, and validated, creating trust across the supply chain.

Ensuring Data Integrity with KOIOS

KOIOS goes beyond just fixing data; it ensures that businesses never face bad data issues in the first place by establishing a robust data governance framework. Here’s how KOIOS ensures data integrity:

1 – Single Source of Truth: All data is validated against the KOIOS Product Data Library, a repository that serves as the authoritative source for product and supply chain information. This eliminates the discrepancies often found in unverified or manually entered data (see the sketch after this list).

2 – Interoperable Data: KOIOS uses global standards, ensuring that product data can be seamlessly exchanged across different systems, regions, and supply chains. This interoperability enhances data consistency and eliminates the bottlenecks caused by incompatible formats.

3 – AI-Driven Data Cleansing and Governance: KOIOS leverages AI and ML technologies to continuously cleanse and govern data, ensuring ongoing compliance with international standards such as ISO 8000. This prevents data decay over time, ensuring businesses maintain long-term data quality.
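
To make the first of these points concrete, here is a minimal sketch of validating incoming records against an authoritative reference; the library content, identifier format, and record fields are illustrative assumptions, not the actual KOIOS API:

```python
# A minimal sketch of validating incoming product records against an
# authoritative reference library. The identifiers, fields, and values
# are illustrative assumptions, not the KOIOS Product Data Library API.
AUTHORITATIVE_LIBRARY = {
    "GTIN:05012345678900": {"description": "Ball bearing, 20 mm bore, sealed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of discrepancies between a record and the library."""
    issues = []
    reference = AUTHORITATIVE_LIBRARY.get(record.get("id", ""))
    if reference is None:
        issues.append("no authoritative entry for this identifier")
    elif record.get("description") != reference["description"]:
        issues.append("description differs from the authoritative source")
    return issues

# A record whose description has drifted from the authoritative entry.
print(validate({"id": "GTIN:05012345678900", "description": "Ball bearing"}))
```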

KOIOS: The Solution to Low Data Utilization

Forrester Research estimates that less than 0.5% of all data is ever analyzed and used, but even a 10% improvement in data accessibility could add $65 million in additional net income for a typical Fortune 1000 company. KOIOS enables this by improving both the quality and accessibility of data across the enterprise. With KOIOS’ Product Data Library, businesses can unlock insights from their data without the need for costly manual cleansing efforts.

The Future of Data Quality with KOIOS

KOIOS’ AI-powered platform ensures that businesses don’t fall victim to the high costs of bad data. By delivering interoperable, high-quality data directly from the source—whether manufacturers, suppliers, or distributors—KOIOS enables organizations to maintain a high level of trust and accuracy in their data.

Data is more than just an asset; it’s the foundation of strategic decision-making. With KOIOS, businesses can ensure that their data is not just accessible, but also trustworthy, compliant, and ready for the digital future. KOIOS prevents the costly repercussions of bad data by delivering a comprehensive solution that focuses on real-time accuracy, compliance, and long-term data governance.

By using KOIOS, companies can avoid flying blind and instead make data-driven decisions with confidence, knowing that their data is validated, interoperable, and aligned with the highest quality standards.

Contact us

Give us a call and find out how we can help you.

+44 (0)23 9387 7599

info@koiosmasterdata.com

Why you should launch a data management program – A (data) message to the C-suite

How do you create quality information from data silos?

If you are an executive in a rapidly growing organization whose stated mission is “to be the most successful business in the world”, you will be aware that rapid growth brings a lot of excitement, and that this very growth changes the way the company operates.

Instinctive understanding of the business becomes more challenging as more operating sites and businesses are added to the group. Executives can no longer rely solely on their knowledge of individual sites, and as the business grows, they rely more on reports compiled from the different businesses to keep a grip on the business metrics.

Any significant increase in new data brings new dynamics. Data silos multiply, making it more difficult to aggregate data across departments, sites, regions, and national borders. Recently acquired business units will inevitably lack a common language. Some common phrases may even have a different meaning in different parts of the organization. This lack of a common understanding makes it more likely that business opportunities will be missed.

Good quality master data is the foundation for making sound business decisions, and also for identifying potential risks to the business. Why is master data important? Master data is the data that identifies and describes individuals, organizations, locations, goods, services, processes, rules, and regulations, so master data runs right through the business.

All departments in the business need structured master data. What is often misunderstood is that the customer relationship management (CRM) system, the enterprise resource planning (ERP) system, the payroll system, and the finance and accounting systems are not going to fix poor master data. Software performance always suffers because of poor quality data, but the software itself will not cure that problem.

The key is to recognise what master data is not: master data is not an information technology (IT) function; master data is a business function. Improving master data is not a project. Managing master data is a program and a function that should be at the heart of the business process.

Foundational master data that is well structured and of good quality is a necessity if your business is going to process commercial data, transaction reporting, and business activity efficiently and effectively. The reality is that well-structured, good quality master data is also the most effective way to connect multiple systems and processes, both internally and externally.

Good quality data is the pathway to good quality information. With insight from good quality information, business leaders can identify – and no longer need to accept – the basic business inefficiencies that they know exist, but cannot pin down. For a manufacturing company making 5% profit on sales, every $50,000 in operational savings is equivalent to $1,000,000 in sales. It follows, then, that if you can identify and implement $50,000 of efficiency savings as a result of better quality information, you have solved a million-dollar problem.
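
As a quick check of that arithmetic, a minimal sketch assuming the 5% margin above:

```python
# The savings-to-sales equivalence from the paragraph above,
# assuming a 5% profit margin on sales.
profit_margin = 0.05
operational_saving = 50_000

# Sales required to generate the same profit as the saving.
equivalent_sales = operational_saving / profit_margin
print(f"${equivalent_sales:,.0f}")  # prints $1,000,000
```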

If you, as an executive, are unsure of the integrity and accuracy of the very data that is the foundation for the reports you rely on in your organization, launching a master data management program is the best course of action you can take.

Contact us

Give us a call and find out how we can help you.

+44 (0)23 9387 7599

info@koiosmasterdata.com

About the author

Peter Eales is a subject matter expert on MRO (maintenance, repair, and operations) material management and industrial data quality. Peter is an experienced consultant, trainer, writer, and speaker on these subjects. Peter is recognised by BSI and ISO as an expert in the subject of industrial data. Peter is a member of ISO/TC 184/SC 4/WG 13, the ISO standards development committee that develops standards for industrial data and industrial interfaces, including ISO 8000, ISO 29002, and ISO 22745. Peter is the project leader for edition 2 of ISO 29002, due to be published in late 2020. Peter is also a committee member of ISO/TC 184/WG 6, which published ISO 18101, the standard for asset-intensive industry interoperability.

Peter has previously held positions as the global technical authority for materials management at a global EPC, and as the global subject matter expert for master data at a major oil and gas owner/operator. Peter is currently chief executive of MRO Insyte, and chairman of KOIOS Master Data.

KOIOS Master Data is a world-leading cloud MDM solution enabling ISO 8000 compliant data exchange

Data quality: How do you quantify yours?

Being able to measure the quality of your data is vital to the success of any data management programme. Here, Peter Eales, Chairman of KOIOS Master Data, explores how you can define what data quality means to your organization, and how you can quantify the quality of your dataset.

In the business world today, it is important to provide evidence of what we do, so, let me pose this question to you: how do you currently quantify the quality of your data?

If you have recently undertaken an outsourced data cleansing project, it is quite likely that you underestimated the internal resource it takes to check this data when you are preparing to onboard it. Whether that data is presented to you in the form of a load file, or viewed in the data cleansing software the outsourced party used, you are faced with thousands of records whose quality you must check. How did you do that? Did you start by using statistical sampling? Did you randomly check some records in each category? Either way, what were you checking for? Were you just scanning to see if it looked right?
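
If statistical sampling was your approach, a minimal sketch of a stratified check might look like the following; the file name and column name are illustrative assumptions:

```python
# A minimal sketch of drawing a stratified random sample for manual
# quality checks: a fixed share of records from each category.
# The file name and column name are illustrative assumptions.
import pandas as pd

records = pd.read_csv("cleansed_parts.csv")  # hypothetical load file

# Sample 5% of records within every category, reproducibly.
sample = records.groupby("category").sample(frac=0.05, random_state=42)
print(f"Checking {len(sample)} of {len(records)} records")
```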

The answer to these questions lies in understanding what, in your organization, constitutes good quality data, and then understanding what that means in ways that can be measured efficiently and effectively.

The Greek philosophers Aristotle and Plato captured and shaped many of the ideas we have adopted today for managing data quality. Plato’s Theory of Forms tells us that whilst we have never seen a perfectly straight line, we know what one would look like, whilst Aristotle’s Categories showed us the value of categorising the world around us. In the modern world of data quality management, we know what good data should look like, and we categorise our data in order to help us break down the larger datasets into manageable groups.

In order to quantify the quality of the data, you need to understand, then define, the properties (attributes or characteristics) of the data you plan to measure. Data quality properties are frequently termed “dimensions”. Many organizations have set out what they regard as the key data quality dimensions, and there are plenty of scholarly and business articles on the subject. Two of the most commonly cited sources for lists of dimensions are DAMA International and ISO, in the international standard ISO/IEC 25012.

There are a number of published books on the subject of data quality. In her seminal work Executing Data Quality Projects: Ten Steps to Quality Data and Trusted Information™ (Morgan Kaufmann, 2008), Danette McGilvray emphasises the importance of understanding what these dimensions are and how to use them in the context of executing data quality projects. A key call-out in the book emphasises this concept:

“A data quality dimension is a characteristic, aspect, or feature of data. Data quality dimensions provide a way to classify information and data quality needs. Dimensions are used to define, measure, improve, and manage the quality of data and information.

The data quality dimensions in The Ten Steps methodology are categorized roughly by the techniques or approach used to assess each dimension. This helps to better scope and plan a project by providing input when estimating the time, money, tools, and human resources needed to do the data quality work.

Differentiating the data quality dimensions in this way helps to:
1) match dimensions to business needs and data quality issues;
2) prioritize which dimensions to assess and in which order;
3) understand what you will (and will not) learn from assessing each data quality dimension; and
4) better define and manage the sequence of activities in your project plan within time and resource constraints.”

Laura Sebastian-Coleman, in her work Measuring Data Quality for Ongoing Improvement (2013), sums up the use of dimensions as follows:

“if a quality is a distinctive attribute or characteristic possessed by someone or something, then a data quality dimension is a general, measurable category for a distinctive characteristic (quality) possessed by data.

Data quality dimensions function in the way that length, width, and height function to express the size of a physical object. They allow us to understand quality in relation to a scale or different scales whose relation is defined. A set of data quality dimensions can be used to define expectations (the standard against which to measure) for the quality of a desired dataset, as well as to measure the condition of an existing dataset”.

Tim King and Julian Schwarzenbach, in their work Managing Data Quality – A practical guide (2020), include a short section on data characteristics that reminds readers that any set of dimensions depends on the perspective of the user; back to Plato and his Theory of Forms, from which the phrase “beauty is in the eye of the beholder” is derived. According to King and Schwarzenbach, quoting DAMA UK (2013), the six most common dimensions to consider are:

  • Accuracy
  • Completeness
  • Consistency
  • Validity
  • Timeliness
  • Uniqueness

The book also offers a timely reminder that the international standard ISO 8000-8 is an important standard to reference when looking at how to measure data quality. ISO 8000-8 describes fundamental concepts of information and data quality, and how these concepts apply to quality management processes and quality management systems. The standard specifies prerequisites for measuring information and data quality and identifies three types of data quality: syntactic, semantic, and pragmatic. Measuring syntactic and semantic quality is performed through a verification process, while measuring pragmatic quality is performed through a validation process.
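
To make the dimensions concrete, here is a minimal sketch of scoring three of them over a small parts dataset; the column names and the part-number format rule are illustrative assumptions, not prescriptions from DAMA or ISO 8000:

```python
# A minimal sketch of measuring three common data quality dimensions
# (completeness, uniqueness, validity) over a spare-parts dataset.
# Column names and the part-number pattern are illustrative assumptions.
import re
import pandas as pd

records = pd.DataFrame({
    "part_number": ["BRG-6204", "BRG-6204", "CPL-0012", None],
    "description": ["Ball bearing, 20 mm bore", None, "Jaw coupling", "Gasket"],
})

# Completeness: share of non-null values in each column.
completeness = records.notna().mean()

# Uniqueness: share of part numbers that are not duplicates.
uniqueness = 1 - records["part_number"].dropna().duplicated().mean()

# Validity: share of part numbers matching an assumed format rule.
pattern = re.compile(r"^[A-Z]{3}-\d{4}$")
validity = records["part_number"].dropna().map(
    lambda p: bool(pattern.match(p))
).mean()

print(completeness, uniqueness, validity, sep="\n")
```

Syntactic checks like the validity rule above correspond to the verification processes described in ISO 8000-8; whether a value is also true of the real part (semantic quality) still requires validation against an authoritative source.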

In summary, there is plenty of resource out there to help you understand how to measure the quality of your data, and at KOIOS Master Data, we are experts in this field.

Contact us

Give us a call and find out how we can help you.

+44 (0)23 9387 7599

info@koiosmasterdata.com

About the author

Peter Eales is a subject matter expert on MRO (maintenance, repair, and operations) material management and industrial data quality. Peter is an experienced consultant, trainer, writer, and speaker on these subjects. Peter is recognised by BSI and ISO as an expert in the subject of industrial data. Peter is a member of ISO/TC 184/SC 4/WG 13, the ISO standards development committee that develops standards for industrial data and industrial interfaces, including ISO 8000, ISO 29002, and ISO 22745. Peter is the project leader for edition 2 of ISO 29002, due to be published in late 2020. Peter is also a committee member of ISO/TC 184/WG 6, which published ISO 18101, the standard for asset-intensive industry interoperability.

Peter has previously held positions as the global technical authority for materials management at a global EPC, and as the global subject matter expert for master data at a major oil and gas owner/operator. Peter is currently chief executive of MRO Insyte, and chairman of KOIOS Master Data.

KOIOS Master Data is a world-leading cloud MDM solution enabling ISO 8000 compliant data exchange

International trade and counterfeiting challenges: a new digital solution that will traverse the borders – Part 2

Part 2 – Introducing K:blok – the digital solution to international trade and counterfeit challenges

Introduction

In February 2019, we (KOIOS Master Data) embarked on a successful year-long research and development project focusing on “Using ISO 8000 Authoritative Identifiers and machine-readable data to address international trade and counterfeiting challenges”. This project was funded by Innovate UK, part of UK Research and Innovation. ISO 8000 is the international standard for data quality.

Part one of this article explains the challenges HMRC and the UK PLC face due to counterfeiting and misclassification when importing into the UK, and outlines the digital solution upon which we won our Innovate UK grant.

This part of the article (part two) outlines the development progress made towards building a digital solution, how machine learning and natural language processing techniques were used during the year-long project, and how the project can move forward.

K:blok – technology to traverse borders

To tackle the challenges outlined in part one, we developed a new software product, K:blok.

K:blok is a cloud application that allows importers to create a digital contract between the parties involved in the cross-border movement of goods from the manufacturer to the importer/buyer. These parties can include manufacturers, shippers, freighters, insurers, and lawyers, amongst others.

The contract brings together, in a single source, the various pieces of data required to import a product into the UK successfully and efficiently, including data that is not currently captured in any software system:

  • ISO 8000 compliant, machine readable, multilingual product descriptions produced by the manufacturer of the products;
  • ISO 8000 compliant Authoritative Legal Entity Identifiers (ALEIs) for each organisation that participates in the trade;
  • Accurate commodity codes for each product, the quantity of products, serial numbers and anti-counterfeit information (only visible to the manufacturer, the buyer and HMRC) to help validate the authenticity of the product;
  • Trade specific information required for insurance and accountability, for example: the trade incoterm;
  • Licensing and trading information about the parties in the contract, for example: Economic Operators Registration and Identification (EORI) number;
  • Information regarding the route the product is taking, for example: the port of import into the UK, port of export from the original country of export, vessel/aircraft numbers and locations of the change of custody of the consignments.

The contract is digital, machine readable, can be exchanged without loss of meaning and is suitable for interoperating with distributed ledger technology, like blockchain.
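
As a rough illustration of the kind of record such a contract could carry, consider the sketch below; the field names, identifier formats, and values are illustrative assumptions, not the actual K:blok schema:

```python
# A minimal sketch of a digital trade contract record. All field names,
# identifier formats, and values are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class TradeContract:
    buyer_alei: str                # ISO 8000 authoritative legal entity identifier
    manufacturer_alei: str
    commodity_code: str            # tariff/commodity code for the product
    quantity: int
    serial_numbers: list[str]      # visible only to manufacturer, buyer, and HMRC
    incoterm: str                  # e.g. "CIF", "FOB"
    eori_numbers: dict[str, str]   # party role -> EORI number
    route: list[str] = field(default_factory=list)  # ports and custody changes

contract = TradeContract(
    buyer_alei="GB-COMPANIESHOUSE:01234567",     # hypothetical ALEI
    manufacturer_alei="DE-HRB:7654321",          # hypothetical ALEI
    commodity_code="8482.10",
    quantity=500,
    serial_numbers=["SN-0001", "SN-0002"],
    incoterm="CIF",
    eori_numbers={"importer": "GB123456789000"},
    route=["Port of Hamburg", "Port of Southampton"],
)
```

Because the record is plain structured data, it can be serialised, hashed, and exchanged between the parties without loss of meaning.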

This data can be accessed and used by any of the participants of the contract and analysed by HMRC. All of this data is captured before the goods are moved which, in turn, provides an intelligence layer and pre-arrival data on goods for HMRC analytics, to enable resources to be targeted at consignments deemed high risk.

This single source of data also provides buyers with an audit trail for their purchased products that begins with the original manufacturer, which assists with the authentication of the product received and can form the basis of an efficient global trusted trader scheme.

Natural language processing will help avoid misclassification

As discussed in part one, misclassification leads to the UK losing billions in tax revenue. Misclassification is both intentional and unintentional. Reducing the unintentional misclassification could save the UK millions in tax revenue.

There is a fundamental flaw in the current process of tariff code assignment. The party that assigns the tariff code is not usually the manufacturer of the product, and therefore often lacks the technical knowledge to classify the product correctly. This party also rarely has a full description of the product and resorts to using a basic description from an invoice to assign the code.

Currently, HMRC provides an online lookup and email service to enable UK businesses to assign the correct tariff code. However, there are concerns that the service is not time-efficient. This concern will only get worse as more companies may need to classify their goods once the UK leaves the European Union (EU).

Therefore, as part of our project, we worked with two students from the University of Southampton, studying Computer Science with Machine Learning, to create an additional application programming interface (API) that links with the government tariff code API and uses natural language processing techniques to score the similarity between an input product description and candidate tariff codes.

This is accessible by manufacturers using the KOIOS software to link their ISO 8000 compliant product specifications to the correct commodity code for trading with the UK.

Techniques such as term frequency-inverse document frequency (tf-idf) and K-means were integrated into this API. Support Vector Machines (SVM), Random Forests, and a two-layer deep neural network were also explored to improve the accuracy of the algorithm.

The API successfully improves on the search capabilities of the government online lookup service within the product areas explored in this project: bearings and couplings.
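
A minimal sketch of the tf-idf similarity scoring might look like the following, assuming scikit-learn and a toy set of headings; the heading texts and codes are illustrative, and the real project linked to the government tariff API rather than a hard-coded dictionary:

```python
# A minimal sketch of tf-idf similarity scoring between a product
# description and candidate tariff headings. Heading texts and codes
# are illustrative, not the live UK tariff data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

tariff_headings = {
    "8482.10": "ball bearings",
    "8482.20": "tapered roller bearings, including cone and tapered roller assemblies",
    "8483.60": "clutches and shaft couplings, including universal joints",
}

query = "deep groove ball bearings, single row, 40 mm bore"

vectorizer = TfidfVectorizer(ngram_range=(1, 2), stop_words="english")
matrix = vectorizer.fit_transform(list(tariff_headings.values()) + [query])

# Score the query (last row) against every heading and rank the matches.
scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
for code, score in sorted(zip(tariff_headings, scores), key=lambda x: -x[1]):
    print(f"{code}: {score:.3f}")
```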

KOIOS are uniquely positioned to continue the development of digital solutions for the UK PLC

Our Innovate UK project provides a foundation to achieve more efficient, cost-effective, cross-border trading and to reduce counterfeit activities. We believe that data standards, including ISO 8000, can play a huge part in digitising and automating this process further.

We are ideally suited and uniquely positioned to continue the research and development of both the K:blok platform and the machine learning tariff classifier.

We also believe there is an opportunity to digitise the outdated, human readable tariff classification into a digital classification, using the international standards ISO 22745 and ISO 29002. These data standards sit at the core of all of the products in the KOIOS Software Suite. A digital version of the tariff classification will improve the accuracy, speed and reliability of computer automation.

Join us in our vision

Our successful Innovate UK project was a step in the right direction to improving international trade and reducing counterfeiting. Brexit also provides a great opportunity for the UK to become a world leader in using technology across borders and to set the standard for countries to follow.

In the coming months, we will continue to engage with the UK Government/HMRC and continue to look for opportunities to fund our research and development.

If you think that you can add value to this project and would like to explore how we could collaborate then please get in touch at info@koiosmasterdata.com

Contact us

Give us a call and find out how we can help you.

+44 (0)23 9387 7599

info@koiosmasterdata.com