• Case Studies

  • Our Experience

    We have worked with executive and senior management at numerous institutions to define data visions, principles, strategy and governance, through to operational data management standards and processes, in particular data quality frameworks.

    We were commissioned by a UK bank regulator to articulate, as guidance for the UK banking industry, the future steady-state vision for its major data regulation, the Firm Data Submission Framework (FDSF). This resulted in internal and industry roadmaps for implementing the key pillars of the vision.

    Innovate Solutions

    Our open approach means we use data science to build novel solutions that help businesses address their needs and adapt to a changing environment.


    Our data solutions work across the data lifecycle. We have particularly strong semantic experience: developing ontologies, building dictionaries, defining data relationships, associating business rules, constructing domain-specific taxonomies and using description logic to enable reasoning.
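    To illustrate the kind of taxonomy-based reasoning described above, here is a minimal sketch in plain Python. The concept names and hierarchy are hypothetical illustrations, not taken from any client engagement, and a production ontology would use a description logic reasoner rather than this hand-rolled closure.

```python
# Hypothetical domain-specific taxonomy: each concept maps to its
# set of direct parent concepts.
TAXONOMY = {
    "Mortgage": {"RetailLoan", "SecuredProduct"},
    "RetailLoan": {"Loan"},
    "Loan": {"CreditProduct"},
    "SecuredProduct": {"CreditProduct"},
    "CreditProduct": set(),
}

def ancestors(concept):
    """Return every concept that subsumes `concept` (transitive closure)."""
    seen = set()
    frontier = set(TAXONOMY.get(concept, set()))
    while frontier:
        parent = frontier.pop()
        if parent not in seen:
            seen.add(parent)
            frontier |= TAXONOMY.get(parent, set())
    return seen

def is_a(concept, candidate):
    """Subsumption check: does `candidate` subsume `concept`?"""
    return candidate == concept or candidate in ancestors(concept)

print(is_a("Mortgage", "CreditProduct"))  # True
```

    Subsumption checks like this are what allow business rules attached to a broad concept (here, a hypothetical "CreditProduct") to apply automatically to every narrower concept beneath it.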


    We built the Firm Data Submission Framework (FDSF) for a UK bank regulator to re-engineer data submission from UK banks. The original data requirements were unclear, non-repeatable and error-prone, taking anywhere from 6 to 18 labour-intensive months to complete for a single bank. The first phase of the FDSF tackled this by building a knowledge base of dictionaries, data relationships, business rules, structured templates, an operating model, change management processes and roadmaps. The regulator can now manage and analyse seven firms submitting simultaneously, and the FDSF has been expanded to cover thirteen international banks for market risk.

    JC Chapman, in partnership with the University of York, has been awarded a grant from the UK Government’s innovation agency, Innovate UK, to develop an ‘Open Rules Platform’. This will enable any community of users to develop and exchange business rules easily, particularly for data quality. The implementation will focus on natural language so that business users can directly author and implement rules.
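    The natural-language rule idea can be sketched as a tiny interpreter that compiles rule sentences into executable data-quality checks. The grammar, field names and sentence forms below are invented for illustration; they are not the Open Rules Platform's actual design.

```python
import re

# Each pattern pairs a hypothetical rule sentence shape with a builder
# that produces a validator callable for one data record (a dict).
RULE_PATTERNS = [
    (re.compile(r"^(\w+) must not be null$"),
     lambda field: lambda record: record.get(field) is not None),
    (re.compile(r"^(\w+) must be at least (\d+)$"),
     lambda field, minimum: lambda record:
         record.get(field) is not None and record[field] >= int(minimum)),
]

def compile_rule(sentence):
    """Translate one rule sentence into a callable validator."""
    for pattern, builder in RULE_PATTERNS:
        match = pattern.match(sentence.strip().lower())
        if match:
            return builder(*match.groups())
    raise ValueError(f"Unrecognised rule: {sentence!r}")

rules = [compile_rule("balance must not be null"),
         compile_rule("balance must be at least 0")]
record = {"balance": 150}
print(all(rule(record) for rule in rules))  # True
```

    The point of the sketch is the separation of concerns: business users author sentences, while the pattern table (maintained once) supplies the executable semantics.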

    Create and Leverage Technology

    ConTXT is our reg-tech solution, which streamlines regulatory and standards-based reporting processes, reduces cost, increases compliance and improves business productivity.


    A major global bank needed to rapidly incorporate new requirements released by a UK regulator. To supplement the bank’s existing solution, ConTXT automated the transformation of the regulator’s schema into relational tables, along with 1,000+ validation rules built to the bank’s specifications. We also imported all terms, definitions and enumerations from regulator-provided Word documents and Excel templates into system tables, and produced thousands of rows of test data, incorporated into regulatory forms, to test the bank’s data aggregation workflow.
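    A schema-to-relational transformation of the kind described can be sketched as follows. The schema structure, field names and type mapping are illustrative assumptions, not the regulator's actual schema or ConTXT's internals.

```python
# Hypothetical mapping from schema types to SQL column types.
TYPE_MAP = {"string": "VARCHAR(255)", "integer": "INTEGER",
            "decimal": "DECIMAL(18,2)", "date": "DATE"}

def to_ddl(table_name, fields):
    """Render a simple schema description as a CREATE TABLE statement."""
    columns = []
    for name, spec in fields.items():
        column = f"    {name} {TYPE_MAP[spec['type']]}"
        if spec.get("required"):
            column += " NOT NULL"  # required fields become NOT NULL constraints
        columns.append(column)
    return f"CREATE TABLE {table_name} (\n" + ",\n".join(columns) + "\n);"

# Illustrative schema fragment, not real regulatory content.
schema = {
    "firm_id": {"type": "string", "required": True},
    "reporting_date": {"type": "date", "required": True},
    "exposure": {"type": "decimal"},
}
print(to_ddl("capital_submission", schema))
```

    In practice the same schema description can also drive the generation of validation rules, so tables and checks never drift apart.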


    A UK bank regulator endured a painful and resource-intensive process of manually crafting and maintaining regulatory data requirements for banks, including data schemas, 500+ pages of Word documents, 30+ Excel workbook templates and thousands of validation rules. Using ConTXT, we imported all associated metadata from the regulator’s documents and templates. This enabled a team to maintain the data collaboratively and generate all documentation, templates and schemas automatically, eliminating expensive manual effort and inconsistencies between documents.


    A major financial services organisation required an in-Excel tool to enable them to flexibly model and stress capital requirements. We built an innovative add-in combining data and analytics modelling capabilities. Excel served simply as the user interface, pulling the desired data into worksheets where business analysts could define formulas to perform the required model calculations. All data, workbook structure and formulas were then saved to a central repository so they could be loaded into any other user’s Excel interface.
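    The save/load round trip at the heart of that design can be sketched as below. The repository here is a plain in-memory dict standing in for a central store, and the structure and field names are assumptions for illustration only.

```python
import json

def save_workbook(repository, workbook_id, cells, formulas):
    """Persist cell values and user-defined formulas to the repository."""
    repository[workbook_id] = json.dumps(
        {"cells": cells, "formulas": formulas})

def load_workbook(repository, workbook_id):
    """Reload a workbook definition, e.g. into another user's session."""
    return json.loads(repository[workbook_id])

repo = {}  # stand-in for a central repository
save_workbook(repo, "stress-model-v1",
              cells={"A1": 100, "A2": 250},
              formulas={"A3": "=A1+A2"})
print(load_workbook(repo, "stress-model-v1")["formulas"]["A3"])  # =A1+A2
```

    Keeping the canonical copy of data and formulas outside the spreadsheet is what lets every analyst load the same model into their own Excel session.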


    Following the Financial Conduct Authority’s release of the Senior Managers Regime, firms must have a clear and accessible understanding of their organisational structure, roles and responsibilities. As a result, two large global banks expressed a need to use ConTXT to model these. We implemented W3C’s organisation data model standard to capture all associated organisation data, including related elements such as roles, responsibilities, committees, policies and terms of reference.
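    Organisation data of this kind is naturally captured as triples. The sketch below uses term names from the W3C Organization Ontology (`org:`), but the bank, committee, role and person names are hypothetical, and a real implementation would use an RDF store rather than a Python list.

```python
# Illustrative triples (subject, predicate, object) using W3C
# Organization Ontology vocabulary; all ex: identifiers are invented.
triples = [
    ("ex:BankPlc",       "rdf:type",         "org:Organization"),
    ("ex:RiskCommittee", "rdf:type",         "org:OrganizationalUnit"),
    ("ex:BankPlc",       "org:hasUnit",      "ex:RiskCommittee"),
    ("ex:HeadOfRisk",    "rdf:type",         "org:Role"),
    ("ex:m1",            "rdf:type",         "org:Membership"),
    ("ex:m1",            "org:member",       "ex:JaneDoe"),
    ("ex:m1",            "org:organization", "ex:RiskCommittee"),
    ("ex:m1",            "org:role",         "ex:HeadOfRisk"),
]

def roles_of(person):
    """Find the roles a person holds via org:Membership resources."""
    memberships = {s for s, p, o in triples
                   if p == "org:member" and o == person}
    return {o for s, p, o in triples
            if s in memberships and p == "org:role"}

print(roles_of("ex:JaneDoe"))  # {'ex:HeadOfRisk'}
```

    Modelling membership as its own resource, rather than a direct person-to-unit link, is what lets the same structure carry roles, committees and terms of reference without schema changes.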


    We have provided a universal bank with regulatory content (FDSF) through ConTXT Content, and have installed our ConTXT Platform technology as its next-generation solution for visualising and controlling data flows for reporting.

    Manage Data Demands

    Our passion is grounded in deep knowledge and experience of managing a complex array of requirements and stakeholders under constrained resources and timelines.


    This knowledge and experience allows us to help clients build processes and capabilities to aggressively prioritise and manage open-ended demand on data capabilities from internal and external stakeholders. For the financial industry, this has been key for helping clients leverage internal implementations to meet the avalanche of regulatory demands while maximising return on investment.


    As the architect for the UK banking industry’s FDSF, JC Chapman developed the required operating, change and governance models and processes. Many of these addressed internal users as well as interactions between the regulator and each of the major UK banks, such as a roadmap for each bank describing the required tasks, milestones and timelines.


    We have also worked alongside business and data experts at a large UK financial institution to examine their existing trajectory towards defining and changing internal operations and business processes. We helped them articulate a plan to transition and better align their efforts, processes and architectures with regulatory demands while keeping to budget.


    A Tier 1 bank with strong data requirements needed to robustly define the key aspects of data quality necessary to boost user confidence in the organisation’s data. We worked with client staff to capture their needs and articulate those in terms of clear data quality objectives supported by industry research.


    We developed a data dictionary for the stress testing division of a major global bank, working between all related business areas and the bank's global data office. We ensured the dictionary complied with the data policies set by the central data office, advised the data office on an underlying data model for organising the dictionary and related hierarchies, coordinated a review of the proposed definitions across all related business areas, and delivered the dictionary in a format enabling upload into the bank's glossary system.