July 7, 2022

Q&A Interview with Gilles Scuttenaire, Senior Software Engineer

In a presentation at Bio-IT World 2022, Cognivia’s Gilles Scuttenaire, Senior Software Engineer, discussed our custom, end-to-end data management solution for questionnaire data from clinical trials.

What led Cognivia to develop Unik-me, our customized software for administration of e-questionnaires and management of questionnaire data, and how does it work? Find out in this Q&A with Gilles.

#1. What was the tipping point that led to the in-house development of an end-to-end data management solution? 

At the beginning, management of e-questionnaire data – collected when clinical trial patients were administered our MPsQ psychological questionnaire, a critical component of the Placebell©™ methodology – was handled with an off-the-shelf solution. When we initiated our first projects, we didn’t have the expertise to develop and manage a solution on our own, and the off-the-shelf solution was the easiest and most direct way to get what we needed.

However, as we gained more experience with the off-the-shelf system across our different projects, we made several observations:

  • The off-the-shelf solution was too complex for our needs. We were using only a small subset of the functionalities.
  • To interact with the system, customize our reporting and automate our activities, extra development had to be done internally.
  • Our inability to directly solve problems reported by our customers was frustrating. At that time, any customization or technical issue had to be managed and prioritized by the software provider.

#2. Why did you develop a solution in-house instead of finding a replacement off-the-shelf solution?

To come to this decision, we made a complete gap analysis internally by considering different aspects of the problem: 

  • How technically complex is the system we need to perform our data management? 
  • Are there any technical specifications that would prevent us from succeeding in the development? 
  • What are the challenges in terms of internal staff training, skills and processes to develop to achieve regulatory compliance (FDA 21 CFR Part 11, Annex 11, GDPR, HIPAA, etc.)? 
  • What are the technical costs of implementing compliance?
  • What level of automation/customization do we want, and is it achievable with an off-the-shelf solution? 
  • Regarding past experiences with customers, what could have been better in terms of end-to-end support?

Ultimately, these questions led to the decision to develop the next-generation software in-house. We were able to use common, well-known and stable technologies that fit our development needs and allow us to maintain the software lifecycle. A lot of training was needed, but in the end, we were still convinced that going for an in-house development was feasible and worth it.

#3. Who was responsible for the technical development of the solution? 

My colleague Guillaume Bernard, PhD, IT and Software Engineering Director, and I share responsibility for the technical in-house development. Julie Carbonelle recently joined the team to help support development.


I am responsible for the development of the software that supports the development and management of our Multi-Dimensional Psychological Questionnaires (MPsQ), and Guillaume manages the development of our online platform, unik-me, that administers the MPsQ to clinical trial patients. 

Technical choices regarding the system architecture are always discussed with the broader team, and we work with experts to help and train us in domains where outside expertise is needed. Finally, since a system like this involves not only technical aspects but also business aspects, we receive valuable input and expertise from our colleagues on other teams: Operations, R&D, IT and QA.

#4. What was the development process like? 

At Cognivia, we implement a software development lifecycle based on modern software development methodologies, like Agile and DevOps. We cherry-picked the best concepts and tools from each of those methodologies to align with our software and business goals. 

We chose to limit the tooling landscape as much as possible in support of our development lifecycle, capitalizing on the functionality already in place to achieve the optimal level of automation for code management, code quality analysis, deployment and testing.

We also needed to adapt these well-known methodologies to our company’s specific requirements and to the inherent constraints of the biomedical and pharmaceutical industry, where regulatory requirements for software validation are very demanding (and can sometimes seem to conflict with modern software methodologies).

But my philosophy is this: Finding an efficient methodology that suits both your technical expectations and the business requirements is the interesting part of the job. 

#5. How did the implementation process go? 

The development of the solution is just one step of the continuous system lifecycle. We released the original version in August 2020 and have built and released new features on a continuous basis ever since. Our goal is to release small features more regularly instead of making one big release every year.

Before releasing it to production, we put a lot of effort into the validation of the system. Our validation procedures follow GAMP 5 guidance, which provides a risk-based approach to validating GxP computerized systems. We also optimized the way we technically manage computer system validation by automating the traceability between requirements, system design and testing, as well as the creation of the documents required in the validation file for GxP software. This is extremely useful during external audits to quickly show the validation state of the system.
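To illustrate what automated requirement-to-test traceability can look like in practice (the requirement-ID scheme and function below are a hypothetical sketch, not Cognivia’s actual tooling), a small script can scan test sources for requirement identifiers, build a traceability matrix, and flag requirements with no covering test:

```python
import re
from collections import defaultdict

# Hypothetical requirement IDs (e.g. "REQ-001") referenced in test docstrings.
REQ_PATTERN = re.compile(r"REQ-\d{3}")

def build_traceability_matrix(requirements, test_sources):
    """Map each requirement ID to the tests that reference it,
    and list requirements not covered by any test."""
    matrix = defaultdict(list)
    for test_name, source in test_sources.items():
        for req in REQ_PATTERN.findall(source):
            matrix[req].append(test_name)
    uncovered = [r for r in requirements if r not in matrix]
    return dict(matrix), uncovered

requirements = ["REQ-001", "REQ-002", "REQ-003"]
tests = {
    "test_login": "Covers REQ-001: user authentication.",
    "test_export": "Covers REQ-001 and REQ-002: data export.",
}
matrix, uncovered = build_traceability_matrix(requirements, tests)
```

A matrix like this can then feed directly into the auto-generated validation documents, so auditors see at a glance which requirements are tested and which are not.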

We also worked with the operational team to adapt our existing operational processes to this new software. We worked continuously with them to ensure that the tool and features were consistent with their requirements. The development of the system was driven by the business process. Because of this, while training was necessary, the implementation and adaptation process has been very smooth. 

#6. How does your end-to-end e-questionnaire data management software work?

Our end-to-end, custom data management software comprises two major components: one internal, the other external.

Part 1: Electronic (e-) Questionnaire Management System

First is the management of our e-questionnaire. At Cognivia, we have developed a custom questionnaire – the MPsQ – that we use in studies to collect participants’ psychological data. This questionnaire data is an integral part of the predictive algorithms we use to understand patient response and behavior in clinical trials. With the growing number of studies we support, we quickly identified a strong need for software dedicated to the management of our proprietary questionnaires.

This questionnaire development process is complex and requires a development lifecycle of its own.  Questionnaires must first be created by the R&D team to ensure that they are scientifically valid. Our data management team then prepares necessary translations and generates the files to be integrated in our eQuestionnaire platform, unik-me. The data management system currently manages 476 questionnaire versions and 24 translations, and this database is rapidly growing.
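To give a sense of how a catalog of hundreds of questionnaire versions and translations might be keyed so the platform can resolve exactly one artifact per study setup (the data model below is a hypothetical sketch, not Cognivia’s actual schema):

```python
from dataclasses import dataclass

# Illustrative model only; field names are assumptions, not the real schema.
@dataclass(frozen=True)
class QuestionnaireVersion:
    questionnaire: str   # e.g. "MPsQ"
    version: str         # validated scientific version, e.g. "2.1"
    language: str        # ISO language code of the translation, e.g. "fr"

    def key(self) -> str:
        # A unique key per (questionnaire, version, translation) triple.
        return f"{self.questionnaire}-v{self.version}-{self.language}"

# The catalog indexes every released version/translation by its unique key.
catalog = {
    v.key(): v
    for v in [
        QuestionnaireVersion("MPsQ", "2.1", "en"),
        QuestionnaireVersion("MPsQ", "2.1", "fr"),
    ]
}
```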

Part 2: e-Questionnaire Platform, Unik-me

This brings us to the second step of our in-house solution: administering the questionnaire to patients in clinical studies. 

Our data managers first set up the study in unik-me by customizing specific parameters to meet the study needs. Once ready, the study-specific software setup is pushed to our system and then goes through the QA validation process, using documentation generated automatically by the system. When quality controls pass, the production instance of the study is released.

Through the unik-me interface, site-based study coordinators can manage the administration of our questionnaires to the study participants. The platform is also used by our data management team to generate different types of reporting:

  • Documents supporting QA verification of the study setup
  • Annotated CRFs for the sponsor, as well as monitoring reports during the study
  • SDTM-format exports for the statistical team and sponsor at the end of the study

During the whole study lifecycle, our data manager generates automated monitoring reports that are regularly communicated to the sponsor. At the end of the study, the study is locked and the study data are communicated in the SDTM format to our statisticians to be used in the generation of predictive analytics, including the Placebell©™ Covariate, and the creation of reports for the sponsor.
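As a rough sketch of what generating an SDTM-style export can involve, the snippet below maps raw questionnaire responses to rows resembling the SDTM QS (Questionnaires) domain. The variable names (STUDYID, DOMAIN, USUBJID, QSSEQ, QSTESTCD, QSORRES) come from the CDISC SDTM standard, but the input format and mapping function are illustrative assumptions, not Cognivia’s actual export code:

```python
# Minimal sketch: map raw questionnaire responses to SDTM QS-domain-like rows.
def to_sdtm_qs(study_id, responses):
    rows = []
    seq_by_subject = {}  # QSSEQ must be unique per subject within the domain
    for r in responses:
        subj = r["subject_id"]
        seq_by_subject[subj] = seq_by_subject.get(subj, 0) + 1
        rows.append({
            "STUDYID": study_id,
            "DOMAIN": "QS",
            "USUBJID": f"{study_id}-{subj}",   # unique subject ID across the study
            "QSSEQ": seq_by_subject[subj],
            "QSTESTCD": r["item_code"],        # short code for the questionnaire item
            "QSORRES": str(r["answer"]),       # result as originally collected
        })
    return rows

rows = to_sdtm_qs("ABC123", [
    {"subject_id": "0001", "item_code": "MPSQ01", "answer": 4},
    {"subject_id": "0001", "item_code": "MPSQ02", "answer": 2},
])
```

A tabular structure like this is what statisticians and sponsors receive at database lock, ready for downstream analysis.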

#7. What are key features of your solution?

One of the most important features is that it is easy to use for sponsors and clinical site staff. We chose to prioritize the external end-user’s experience and provide a clear, easy-to-use web interface that minimizes the burden for both study coordinators and participants. We also focused on improving the internal user experience and the backend features used by our data managers.

Another important feature is automation. Globally at the system level, we wanted the different components to be well integrated with one another. We also put an important focus on the automation of our internal and external reporting. All documents required during the study lifecycle are generated automatically by our software.

#8. What differentiates this from off-the-shelf solutions?  

A key difference from an off-the-shelf solution is that it exactly meets our data management needs. If our needs change, we can plan new releases that focus solely on new functionality, without being bogged down by unnecessary features.

Another key aspect is the support we can now directly provide to our customers. If a problem occurs, we take full ownership and can provide rapid resolution. It is far more satisfying to control the resolution of a problem directly than to be an intermediary.

Conclusion

The MPsQ is a critical component of Cognivia’s innovative method to predict individual patient placebo responsiveness and reduce variability in clinical trial data. As such, the technology we use to not only develop these questionnaires but also administer and report on them is essential. Internal development of custom data management and e-questionnaire administration software was the best option to ensure optimal results and an easy-to-use interface for our partners. 

Find out how to easily apply our predictive methodologies in your study by getting in touch today.
