In the latest episode of The Data Malarkey Podcast, host Sam Knowles engages in a compelling conversation with Ian Makgill, the founder of Spend Network. Makgill and his team are dedicated to enhancing transparency and efficiency in the global public sector procurement market, which is estimated to be worth approximately $13 trillion, representing about 13% of the global economy.
The mission of Spend Network
Spend Network aims to “unlock the $13 trillion global procurement market through the world’s leading tender, contract, spend, and grant data.” Makgill, with two decades of experience in building databases and six years working with artificial intelligence, is a firm believer in the transformative power of data. He emphasises that, while data often highlights negative aspects or failures, it also holds the potential to drive positive change and shape the future.
Challenges in data aggregation
Aggregating procurement data from over 700 diverse sources every day presents significant challenges. Makgill notes that the data is often “unruly, different, misaligned”, and not easily comparable. Despite these obstacles, he maintains that “all data is bad; all data is dirty”, but that with proper processing most of it can be rendered useful. This perspective echoes statistician George Box’s adage that “all models are wrong, but some are useful”.
The importance of data transparency
Makgill underscores the critical role of transparency in government procurement. By analysing, cleaning, augmenting, validating, and verifying approximately 220 million lines of spend data from various government departments worldwide, Spend Network strives to level the playing field. This transparency enables businesses of all sizes to access opportunities that were previously obscured, fostering a more competitive and fair marketplace.
Utilising artificial intelligence
Since 2017, Spend Network has incorporated artificial intelligence into its data processing workflows. AI aids in managing the vast and complex datasets involved in global procurement, enhancing the accuracy and efficiency of data analysis. Makgill highlights that the integration of AI is not about replacing human judgment but augmenting it, allowing for deeper insights and more informed decision-making.
Influences and inspirations
In the realm of data storytelling and visualisation, Makgill cites John Burn-Murdoch, Chief Data Reporter at the Financial Times, as a significant influence. Burn-Murdoch’s work during the COVID-19 pandemic, particularly his visualisations of excess mortality, exemplifies the power of clear and effective data communication. Conversely, Makgill criticises political figures who manipulate data to serve their agendas, citing Jacob Rees-Mogg’s consultation on imperial measures as an example of data being used to mislead or obfuscate.
The future of procurement data
Looking ahead, Makgill envisions a future where procurement data is not only transparent but also standardised across jurisdictions. Such standardisation would facilitate easier comparisons and analyses, leading to more informed policy decisions and efficient allocation of resources. He advocates for continued collaboration between governments, private sector entities, and data professionals to achieve this goal.
This episode sheds light on the complexities of global government procurement and the pivotal role that data plays in promoting transparency and fairness. Makgill’s insights offer a nuanced understanding of how meticulous data management and analysis can transform a traditionally opaque sector into one that is accessible and equitable for all stakeholders, from multinational suppliers to small businesses bidding for their first public contract.
—
This blog was created using ChatGPT.