9 common concerns market researchers have when considering new software tools.
by Michael Howard on 04 Jun 2024
Hi there, welcome to this latest episode of Now that’s Significant, a market research podcast. This is your host Michael Howard, the head of marketing at Infotools.
Today, I’m sharing something that I’ve been mulling over for a while now, and this is around the buying journey people take when purchasing market research software. This isn’t really limited to software, you could actually also apply this to other vendors or suppliers in the market research space as well.
For those out there who may not know about us, Infotools has been working in the market research world for over 34 years now, helping insights teams to gain a greater understanding of consumers, then feeding that understanding back into their business so it can make more informed decisions.
As you’d expect, we’ve come across probably every concern or consideration you could think of when it comes to new software tools. We started off as a desktop-based application, and now we’re cloud-based, in the form of our data analysis and reporting platform called Harmoni. While the delivery and means of access to our IP have changed, many of the historical concerns are still relevant. And this was reinforced to us as part of some consulting work we did alongside SwayTech.
They were able to pull out a comprehensive list of concerns, which have been funneled into three categories – the first being Usability, Learning Curve, and Team Adoption, the second Impact on Workflow and Productivity, and the third Technical Capabilities and Requirements.
So, what’s the significant thing that I’m going to share in this podcast? Well, all change can be hard and there are hundreds of reasons to avoid it, but we should never feel that it’s insurmountable. If we know that change needs to take place, if we can see that a different way of working is on the horizon, then addressing those reasons to change, even one at a time, is well worth the effort. So, in this episode, I’ll work through some of those concerns, and hopefully, it will help you in your next software decision. The first category we’re going to look at is Usability, Learning Curve, and Team Adoption.
Usability, Learning Curve, and Team Adoption
I’m worried the tool might be too complex and that I won’t be able to use it effectively.
Great software is getting harder and harder to make. Changes to operating systems, legislation, user expectations – it’s such a fluid environment to operate in, and even one change to one of the things I’ve just mentioned can cause all sorts of headaches. On top of that, it’s really easy for software providers to keep adding features without being mindful of the underlying task at hand. This adds to complexity, and the tool feels bigger than it needs to be. As a result, there’s more to maintain, and more that can go wrong.
- Simplicity should never be overlooked. A tool that does one thing really well, rather than trying to be all things to all people, is a very rare thing indeed and is well worth considering. It means a software vendor can focus on making a robust set of features for that narrower use case, as well as improving the user experience.
- Software that offers a user-friendly interface and intuitive design is important, but consider the complexity and size of what you’re asking it to accomplish. Sometimes, accepting what you’re actually able to achieve with the software compared to doing it manually can help you be a bit more forgiving when it comes to usability. If you choose well, you may have something that looks really good, is on brand, and is very usable. But if it’s not quite up to scratch, ask the vendor what is in the pipeline for improving the user experience. Hopefully that should shed some light on what your preferred tool may be.
- One last thing that can help mitigate the actual or perceived complexity of a tool is ensuring the vendor provides comprehensive training materials and readily available customer support. It may be table stakes, but some providers do offer more comprehensive support than others.
What if the learning curve disrupts my current projects?
There’s rarely a perfect time to implement a software solution; there will always be some disruption, but you have to consider the upside. You often hear companies paraphrasing Warren Buffett: the best time to embrace technology was twenty years ago, and the second best time is today. So how can you help with potential disruption?
- One thing that can help is to look to implement the new software during a less busy period in your work calendar.
- Phasing in adoption gradually can be a good option here too. Starting with a smaller team or pilot program to test usability before a wider rollout can minimize the risk of wider disruption. As Bryan Smith, one of our previous guests on the podcast once said, don’t try to boil the ocean.
- And on pilot programmes, you can also run them in parallel with your existing setup for a time, to test the new platform’s capabilities. You may choose to completely outsource that pilot so you can test the numbers alone, or get involved to whatever level you’re able. Just make sure you’re not simply comparing apples with apples; the point of purchasing a new solution is, after all, to generate even more value for your organization than you’re currently able to.
How do I know if my team will adopt this new tool or resist the change?
There will always be resistance to change. Expecting it is generally a wise thing to do. Another thing you should expect is to get tired of communicating the drivers, the benefits, and the functionality of the new tool. It’s often been said that it’s only once you’re getting sick of communicating the message that the message is just starting to get through. So don’t give up too early. People are busy, there’s generally a flood of comms they receive on a daily basis, only a fraction from you, so allow plenty of time and repetition for everything to click.
- One thing you can do to get more people on board earlier is to include your team members and other stakeholders in the selection process. It’s a good idea to solicit their feedback along the way too.
- You can also identify and train internal champions who can provide ongoing support to their colleagues.
- Remember people can also be emotionally connected to their current ways of working – whether that’s using another tool or doing things more manually than what you’re suggesting. Acknowledging this emotional connection is wise. As is emphasizing the benefits of the new software and how it can enhance their overall productivity and job satisfaction.
- Uptake of your new tool or process will seem slow initially, as only the early adopters tend to embrace the new approach straight away. But keep persisting: the early majority will get on board in time, then the late majority and the laggards will follow.
Impact on Workflow and Productivity
Will I actually save time or will this just add another layer to my workflow?
The prospect of someone adding yet another layer to their workflow is daunting to say the least. So many software products out there claim efficiencies and productivity gains, but how accurate are those claims? From our experience, one of the best gauges of how effective a tool will be is the very first demonstration you have of the platform. Are you saying “Wow, that’s great” or “Ooh, that’s cool” as you see the tool in action? That’s a good guide.
- Considering the effort to change tools, you should never try to replace like-for-like. As a general rule, you should always gain far more than you had to give up. Once you’ve established the proposed tool can do the fundamentals of the task you need to conduct, think about the ways your current way of working frustrates you or the things it doesn’t do well. You’ll then be able to see whether or not the alternative option outshines the incumbent.
- Conduct a thorough time analysis of your current processes and compare it to the software's proposed workflow. See what gains you get from the new tool, as you might need to spend some of those gains plugging the gaps where the new tool falls short. Having some difference is understandable.
- Request a demo or a trial period to assess the software's impact on real-world tasks. If you’re strapped for time but know something needs to change, you could even see if they could set up a small project that you can use as a comparison.
- Software that integrates with your existing tools can also help to minimize disruptions. But don’t feel you have to do this yourself either, the vendor support and success teams should be able to help streamline the process here as well.
I’m skeptical about whether or not this tool will provide more actionable insights than what we’re currently getting.
At the end of the day, market researchers are judged on the quality of insights they deliver to the business and the ROI they are able to provide. Before insights teams can even begin to properly assess this, it would be ideal to have a benchmark of the value that’s currently being generated. This is where it comes down to what the new solution offers above and beyond the incumbent (also known as the value-add).
- As mentioned before, setting up a personalized demo environment with a project that you have already completed can be a great way to assess your options. One thing to be mindful of is that when comparing current and potential options, you may well see a disconnect between the numbers. We have had instances where it was revealed the old way of working wasn’t as accurate as it should have been. When this is the case, it’s a sign that you’re on to a good thing.
- It’s pretty normal to read case studies and testimonials from similar organizations that have successfully used the software. If you wanted to go beyond what’s on the surface, you could also privately reach out to people from these case studies or testimonials to hear from them directly. They may be able to provide some more specific feedback than what was presented by the vendor.
- If you’ve clearly defined the desired insights and metrics for success upfront, you’ll be in a better position to weigh up your options. Ensure the fundamentals are ticked off, then you can compare other features and the nice-to-haves separately.
- Make sure you give the trial period a thorough test. Remember, while you’re comparing the platforms and using your new systems, you might just discover some insights that could help you move the needle of success. If it’s there for the taking, why not take full advantage of it – even before you’re officially a customer? That would make for a stronger business case as well.
I don’t have time to review all possible tools out there.
Time has that uncanny ability to get away from us. Not only does time fly by, the problem is compounded by the ever-increasing number of options available on the market. There simply aren’t enough daylight (or nighttime) hours to assess them all. So, what can you do?
- Have a really clear idea of the outcome, the ideal future state of your organization. From this, you’ll have a much easier task of prioritizing essential features and requirements. This should help to narrow down your options and make the task seem less onerous. For instance, does it need to have end-to-end capability across the entire market research process? Or is it ok to be a best-in-class option in, say, data analysis and reporting? Some may prefer the ease of a one-size-fits-all platform, even if it does limit them to a single source of data collection, whilst others may like the ability to analyze multiple data sources really well.
- You don’t have to do it all. Rather than take on all the weight of the initial research phase, why not delegate this step and the screening process to a team or an individual? They may bring a perspective that you would have overlooked.
- Leverage online software review platforms and industry reports. Insights Association have the PAIR Network, Greenbook have theirs, ESOMAR have one, Insight Platforms have some great resources, as does Quirk’s. And don’t forget the likes of broader comparison sites like Capterra, G2, and Software Advice. Just remember, these last three don’t necessarily serve the needs and nuances of the market research sector.
Technical Capabilities and Requirements
How can I be sure the data is accurate and reliable?
While the promise of big data is well and truly here, there are still plenty of good reminders of why we need to respect and take proper care of consumer data. Major research associations from around the world have each published a Code of Conduct, which generally provides a great foundation for assessing the accuracy of the data you will receive from your proposed vendor. You may wonder how taking care of data applies to accuracy and reliability. In our opinion, it’s everything. If a software provider adheres to industry specific codes of conduct, then that’s a great sign. It also gives you a good idea that they know what it means to operate in your industry.
- One of the key things in terms of reliability of the data you’re using comes down to a matter of statistics. Market researchers use specific statistical methods as part of their analyses and investigations. If you want to know if a potential vendor really knows the insights industry, ask them what statistical methods they use in their tools. Do they know their Bayesian from their bases? The difference between regression and conjoint analysis, multivariate techniques, and so forth? It’s even better if they can show them to you in action.
- Ask yourself whether you need access to respondent-level data. If you do, then you need to ensure you have a platform that maintains the respondent-level attributes, as there are platforms out there, like some of the major BI players, that aggregate data. This has its uses, but with market research, aggregating data can quickly deplete a data set of its value, especially if you need to drill down and generate respondent-level insights (see the short sketch after this list).
- It’s a good idea to inquire about the software's data sources, validation methods, and quality control measures. What development processes do they follow to ensure the software’s functionality does what they claim it can do?
- If you have a major research project that represents a majority of your resources as an insights function, focus on that. Assess whether or not the software vendor has credentials in that space. Claiming they do is not the same as actually being able to carry it out for you. Requesting access to some sample data or testing the software with a subset of your own data can give you the confidence of knowing they can deliver on their promises.
- It’s also worth verifying the software's compliance with relevant industry standards and regulations. Does the software comply with GDPR or CCPA? Does the platform adhere to SOC or ISO standards? Documentation of these and their security credentials is generally available.
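To make that respondent-level point a little more concrete, here is a minimal, hypothetical sketch in Python using pandas. The data and column names are made up for illustration; this isn’t taken from the podcast or from any particular platform.

```python
# A hypothetical example of why aggregation depletes a survey data set:
# respondent-level records vs. a pre-aggregated summary.
import pandas as pd

# Made-up respondent-level data: one row per respondent.
respondents = pd.DataFrame({
    "respondent_id": [1, 2, 3, 4, 5, 6],
    "region":        ["North", "North", "South", "South", "South", "North"],
    "age_group":     ["18-34", "35-54", "18-34", "55+", "35-54", "55+"],
    "satisfied":     [1, 0, 1, 1, 0, 1],
})

# With respondent-level data you can cut the results any way you need,
# even by a combination of variables you hadn't planned for up front.
by_region_and_age = respondents.groupby(["region", "age_group"])["satisfied"].mean()
print(by_region_and_age)

# A BI-style pre-aggregated table keeps only the summary, e.g. by region...
aggregated = respondents.groupby("region", as_index=False)["satisfied"].mean()
print(aggregated)

# ...so a later question like "how satisfied are 18-34s in the South?"
# can no longer be answered, because the respondent-level attributes are gone.
```

The aggregated table still answers the question it was built for, but any new cut of the data has to go back to the respondent-level records.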
I’m nervous about whether this tool can handle our specific research needs.
Organizations are entitled to be nervous about whether or not a software platform can meet their needs. Software companies are notorious for jumping on the bandwagon of the next big thing. QR codes were all the rage in the mid-2010s but didn’t get true uptake until the early 2020s, thanks to QR codes being integrated natively into smartphone camera apps. Chatbots came into the world at a similar time as QR codes, yet it wasn’t until 2022 that they truly found their place, thanks to generative AI. So what do we have to say about this?
- Technology should, in theory, be invisible. What I mean by this is that you shouldn’t feel like you’re using a tool; it should be a natural extension of your daily job. This can only happen if you have a really clear picture of what it is you want to achieve as a market research function. A defined strategic direction and clear operational processes can help you as you assess what tools will be right for your organization. A software vendor worth their weight in gold will actually call out any situations where their platform isn’t the right fit for your needs. It doesn’t happen often, but there are instances where there is a mismatch, and that is counterproductive for both the vendor and the organization.
- Providing potential software vendors with a detailed overview of your research requirements and data complexities can help them understand your needs. It’s very unlikely that they will have a perfect fit for your organization, so ask them in full transparency where the gaps in their system are compared to your needs. Be wary if they don’t come back with anything. Good vendors may suggest workarounds for any gaps that become apparent, but don’t assume those gaps will be addressed in the future development of their platform.
- If you’re needing customized dashboards and reporting, these can obviously be done once a contract is signed, but sometimes proofs of concept can be worked up to give stakeholders a sense of what’s to come. The key is to focus on the data, the numbers, and the distribution channels for getting the reports out. It is also worth noting that your needs might change, so having a software platform with some flexibility is also worth considering.
I’m worried about our data migration from our current system to this new tool.
Moving from one platform to another will have its challenges, but remember that you’re transitioning for a reason. It’s always worth remembering that the benefit of changing should outweigh the cost of doing nothing; otherwise you would simply keep operating as you were.
- If you’re worried about moving to a tool, think two steps ahead. If you find it hard now to move your data from one platform to another, make it one of your key criteria to be able to export data from your new tool. It may not be perfect or as foolproof as a fully functioning API, but even exports to industry-standard file types can make life much easier for you five or ten years down the track.
- Is the tool you’re considering agency or data collector agnostic? What you don’t want to happen, when you’ve invested all this time and energy into one vendor, is to be locked in to one agency or data provider. There may be some studies that you receive from one agency, and others that you want to run with a different one. Keeping your options open here can help you or your team in the future.
As we close, we should remember one thing, our role as market researchers in the world is a vital one. We’re in the privileged position of helping organizations gain a greater understanding of people. There have been countless examples of how market research has helped create better products, services, advertising, brands. And thanks to the technology and tools we now have access to, there’s never been a better time to be in the insights industry. We’re excited about what we’re capable of now. And even more so for what’s in store in the coming years.
With that, if you’ve been assessing options for software that can increase the effectiveness of your market research, I hope this episode has been helpful for you.
Thanks also to those who have listened today. If you’ve liked what you heard, please subscribe to the podcast, feel free to share it with others, and also leave a review. Until next time, thanks for listening to Now that’s Significant.