An interview with Bradley Hossack, implementation manager at Dimebox
Bradley Hossack has a technical consulting background and years of experience helping tech companies execute the right strategies. His affinity with software platform architecture led him to related roles in product design and customer implementation. He now uses these skills at Dimebox to create scalable implementation procedures and to help development teams focus on the key technical features each client requires.
You have worked at Dimebox for a couple of months now. How have you experienced this first period?
When I started I noticed a lot of drive and energy about what we’re doing and where we think we can go. There are still a lot of challenges concerning growth, and I came in to take the technical load off the customer-facing side. We’re finding ways to better serve customers, reduce questions and move towards advising clients instead of reacting to support tickets.
Where do you think implementation should be heading? What are the next steps?
We have to keep in mind how our processes would work with a thousand customers or more. Our training guides, processes and documentation have to reflect a long-term scaling strategy. We can support that with videos and interactive tools that walk a user through the dashboard screens, for example.
We currently have analytics set up to see when accessing the documentation prevents a ticket from being opened. That way we can track where most questions arise and what we can do to stay one step ahead of new ones. That also involves educating our customers about the platform on a technical level.
To figure out our next steps, we catch up with existing customers and work with new prospective customers to gather feedback. Existing customers often just want to focus on a couple of key areas and don’t need to hear everything again. New customers almost always need a broader approach before going into more detail.
As we grow, the challenge is to minimize tickets by automating as much as possible, but also invest in a personal and human approach when a customer needs it.
Let’s talk about minimizing tickets. What processes help you achieve this?
There are a couple of different things we can do. As I mentioned, proper documentation gives customers the ability to find the answers they’re looking for by themselves. One step ahead of that, the onboarding process should educate new customers to understand the platform, and the platform itself needs to be both intuitive and functional. The most important part is delivering a product that works in the first place. That’s why testing is crucial during the development process. I play a central role in translating customer priorities into key testing priorities so that the customer is satisfied with the delivered product.
How do you interact with the testing that happens during development?
There are a number of tests we can do. There are technical unit tests, where developers check whether a certain isolated connection or feature works, and end-to-end tests, where QA can test the whole flow. The end-to-end flow of a whole payment includes Initiate, Authorize, Capture, Settlement Completed, Refund, etc., where a unit test might cover a specific status we’re trying to reach at a certain stage.
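The distinction Bradley draws can be sketched as a small state machine: a unit test exercises one transition rule in isolation, while an end-to-end test walks the whole flow. This is a hypothetical illustration only; the status names come from the interview, but the `PaymentFlow` class and its transition table are not Dimebox's actual model.

```python
# Illustrative payment lifecycle based on the statuses named above.
# The transition table and class are hypothetical, not Dimebox's API.
ALLOWED_TRANSITIONS = {
    "Initiate": {"Authorize"},
    "Authorize": {"Capture"},
    "Capture": {"Settlement Completed"},
    "Settlement Completed": {"Refund"},
    "Refund": set(),
}

class PaymentFlow:
    def __init__(self):
        self.status = "Initiate"

    def advance(self, new_status):
        # A unit test would target exactly this one rule in isolation.
        if new_status not in ALLOWED_TRANSITIONS[self.status]:
            raise ValueError(f"illegal transition {self.status} -> {new_status}")
        self.status = new_status

# An end-to-end test walks the whole flow instead of a single step:
flow = PaymentFlow()
for step in ["Authorize", "Capture", "Settlement Completed", "Refund"]:
    flow.advance(step)
```

The point of the split is scope: when the unit-level rule breaks, the failure pinpoints one transition, while an end-to-end failure tells you the overall customer journey is broken.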
Then there are functional tests, which cover the business logic and errors that are not usually technical (e.g. decline codes). Those types of tests require tight collaboration between the development and product teams, and I make sure that what is tested lines up with the customer’s expectations.
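A functional test of decline handling checks the business rule, not the wire protocol. As a minimal sketch: the codes below are common ISO 8583-style response codes used for illustration, and the `customer_message` helper is hypothetical, not part of Dimebox's platform.

```python
# Illustrative decline-code mapping; codes are common ISO 8583-style
# examples, and this helper is hypothetical, not Dimebox's API.
DECLINE_REASONS = {
    "05": "Do not honour",
    "51": "Insufficient funds",
    "54": "Expired card",
}

def customer_message(decline_code):
    # Business rule under test: never expose raw codes to the shopper,
    # and fall back to a generic message for unknown codes.
    return DECLINE_REASONS.get(
        decline_code, "Payment declined, please try another method"
    )
```

A functional test here asserts on the mapping and the fallback behaviour, which is exactly the kind of non-technical expectation a product team defines and a development team implements.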
There are also integration tests, where we test our platform in a production-like scenario. For this we can use a checkout page across different types of browsers with a connection to a 3D Secure MPI. Lastly, we are also enhancing our performance testing capabilities to improve the efficiency of our platform.
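A performance test typically measures latency over repeated calls and asserts a budget. The sketch below uses a stand-in stub for the real call; the function names and the 500 ms budget are illustrative assumptions, not Dimebox's actual targets.

```python
import time

def authorize_payment():
    # Stand-in stub for a real authorization request; a production-like
    # test would call the actual platform endpoint instead.
    time.sleep(0.001)
    return "Authorize"

def p95_latency(fn, runs=50):
    # Time repeated calls and return the 95th-percentile latency.
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        samples.append(time.perf_counter() - start)
    samples.sort()
    return samples[int(0.95 * (len(samples) - 1))]

# A performance test asserts the measured p95 stays under a budget
# (0.5 s here is an arbitrary illustrative threshold):
assert p95_latency(authorize_payment) < 0.5
```

Tracking a percentile rather than an average matters in payments, because a slow tail directly translates into abandoned checkouts even when the mean looks healthy.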
We’re seeing a big opportunity to help customers at the testing stage because they have different testing strategies of their own. Each customer is responsible for a final sanity check before we upgrade the production environment. In the end we are responsible for delivering something that works, so we need to prioritize accordingly.
Our implementation team supports our testing efforts by determining key test points for each customer. If they use specific processors most often, those connections have priority.
Some clients just want to use their dashboard to see all payment requests. These different needs inspire different testing strategies, which we identify and execute in a reliable and scalable way.
If we see that a customer only uses a small part of our functionalities, we’ll try to engage them more. We show them how to employ our fraud tools more effectively or use better acquirer routing options and help them get the most out of our platform.
How do you collaborate with the client/merchant to prioritize what to test?
The first step is to determine the key functional points that apply to each customer. We hand example test cases off to them to use as a starting point. If we are connecting PayPal for a new customer, we give them a base template to which they can add whatever they need, problems they need to solve or issues they have encountered in the past. Maybe they’ve experienced a frustrated customer scenario where a payment was initiated a hundred times over. The merchant knows what type of things go wrong with their payments better than we do, so that helps us prioritize.
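The repeated-initiation scenario Bradley mentions is commonly guarded with idempotency keys: retrying the same request returns the original payment instead of creating duplicates. This is a hypothetical sketch of that pattern; the `PaymentGateway` class, in-memory store, and key scheme are illustrative, not Dimebox's implementation.

```python
# Hypothetical idempotency guard for the "initiated a hundred times
# over" scenario; the class and storage here are illustrative only.
class PaymentGateway:
    def __init__(self):
        self._seen = {}

    def initiate(self, idempotency_key, amount):
        # Replaying the same key returns the original payment
        # instead of creating a duplicate charge.
        if idempotency_key in self._seen:
            return self._seen[idempotency_key]
        payment = {"id": len(self._seen) + 1, "amount": amount,
                   "status": "Initiate"}
        self._seen[idempotency_key] = payment
        return payment

gateway = PaymentGateway()
first = gateway.initiate("order-42", 100)
# A hundred retries still yield the same single payment:
for _ in range(100):
    assert gateway.initiate("order-42", 100) is first
```

A merchant-supplied test case for this scenario would assert exactly that property: many initiations with one key produce one charge.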
After we deliver our product they do their own testing. So the implementation team plays a key role in translating the business requirements of the customer into actionable development strategies. In the end it’s about customer satisfaction. Some things need to work perfectly from day one, and it’s our job to deliver.