Marcus Brown

Can someone write some release notes?

Updated: Apr 26, 2022

Our software releases take some time, but that’s because we are careful to run a battery of tests and validations to make sure the outcome is great for our users.


A software release is, in general, a complicated process that no developer looks forward to. In our early years, things were moving so fast that we would regularly release every other week. This was possible because we had two developers, and both of us pretty much knew everything that was going on in the product. Sadly (or maybe not?), those days are gone. Now, to put it lightly, there are a lot of cooks in the kitchen! I genuinely do not understand every aspect of our code base, but this is a normal part of a product’s life cycle as it grows in popularity and complexity.


To make matters more challenging, our dedication to ongoing third-party validation makes process management an integral part of our release cycle. Our customers buy according to our track record, as well as the validation work our partners publish. When we release, all the validation work needs to be updated; so, effectively redone! Here, we typically encounter two challenges. Firstly, partners don’t have the cycles to revalidate every time we push a new release. Secondly, even if they did, once the validation has been accepted to a journal as an article, they really do not like updating preprints. In another universe, I might have elected to only push preprints to avoid this very problem, but that’s another story altogether.


So, how do we manage this? How do we know that we are actually releasing a version that meets the very high standards of the scientific community, sports analytics customers, and clinical users, all of whom perform different measurements and have different requirements for accuracy? Well, it’s obviously challenging, and it’s likely why most of our developers take vacation after we push a release.


For starters, we need a gold standard to compare against. We have divided this into three categories: our validation data (collected alongside markers), purely markerless data that has worked in the past (that is, it has tracked well), and markerless data that has not worked in previous releases.
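For the curious, here’s a rough sketch of how a gold-standard library like that could be organized. The folder names, the Category labels, and the use of .c3d files are illustrative assumptions, not our actual layout:

```python
from dataclasses import dataclass
from enum import Enum, auto
from pathlib import Path

class Category(Enum):
    """The three gold-standard buckets described above."""
    MARKER_VALIDATION = auto()  # markerless trials with synchronized marker-based data
    TRACKED_WELL = auto()       # purely markerless trials that tracked well before
    TRACKED_POORLY = auto()     # markerless trials that did not track well before

@dataclass
class Trial:
    path: Path
    category: Category

def load_trials(root: Path) -> list[Trial]:
    """Collect trials from one folder per category (hypothetical layout)."""
    folders = {
        "validation": Category.MARKER_VALIDATION,
        "tracked_well": Category.TRACKED_WELL,
        "tracked_poorly": Category.TRACKED_POORLY,
    }
    return [
        Trial(path, category)
        for name, category in folders.items()
        for path in sorted((root / name).glob("*.c3d"))
    ]
```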


The first test is fairly simple. We rerun all the validation data and compare the new version to the marker-based data as well as to the previous release. Here, we are making sure that the results are still very comparable, and hopefully improving on signals that were not so perfect in the initial validation data.
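In spirit, that check looks something like the sketch below. The per-signal RMSE metric, the dictionary-of-signals format, and the 0.5-degree tolerance are hypothetical stand-ins for illustration, not our actual pipeline:

```python
import numpy as np

def rmse(a: np.ndarray, b: np.ndarray) -> float:
    """Root-mean-square error between two equal-length signals."""
    return float(np.sqrt(np.mean((a - b) ** 2)))

def compare_release(marker: dict[str, np.ndarray],
                    old: dict[str, np.ndarray],
                    new: dict[str, np.ndarray],
                    tolerance: float = 0.5) -> list[str]:
    """Flag signals where the new release drifts further from the
    marker-based gold standard than the old release did, beyond a
    (hypothetical) tolerance in degrees."""
    regressions = []
    for name, gold in marker.items():
        err_old = rmse(old[name], gold)
        err_new = rmse(new[name], gold)
        if err_new > err_old + tolerance:
            regressions.append(f"{name}: {err_old:.2f} -> {err_new:.2f} deg RMSE")
    return regressions
```

A check like this is deliberately one-sided: getting closer to the marker-based data is always welcome, while drifting away from it is what gets flagged.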


The second comparison is a bit more comprehensive. We compare poses from the previous release to a giant set of collections we have acquired through our partners. This comparison is markerless versus markerless: new version versus old. As I mentioned earlier, we compare against trials that have worked well in the past, and some that haven’t. The idea is that we should achieve the same results for the trials that have tracked well, and hopefully also track some trials that previously didn’t. How do we know if a trial is tracking well? Sadly, without instrumentation, we do that all by hand, by reviewing each frame from each view. It’s actually not as bad as it sounds, as long as you have a lot of people working on it. However, I caution anyone who wants to do this alone: it will age you fast.
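To give a flavour of the automated side, here’s a sketch of how one might triage trials for that manual review by measuring how much the tracked poses moved between versions. The trial objects with old_pose/new_pose attributes, the (frames, joints, 3) pose arrays, and the 5 mm threshold are all assumptions for illustration:

```python
import numpy as np

def mean_joint_displacement(old_pose: np.ndarray, new_pose: np.ndarray) -> float:
    """Mean per-joint displacement between two pose sequences,
    each shaped (frames, joints, 3), in the same units (e.g., mm)."""
    return float(np.mean(np.linalg.norm(new_pose - old_pose, axis=-1)))

def triage(trials: list, threshold: float = 5.0) -> tuple[list, list]:
    """Split trials into 'unchanged' (results match the old release
    closely, so a spot check suffices) and 'review' (send to a human
    for frame-by-frame inspection), biggest movers first."""
    unchanged, review = [], []
    for trial in trials:
        d = mean_joint_displacement(trial.old_pose, trial.new_pose)
        (unchanged if d < threshold else review).append((trial, d))
    review.sort(key=lambda pair: -pair[1])
    return unchanged, review
```

Even a crude metric like this only prioritizes the manual review; it doesn’t replace actually looking at the frames.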


Even though a lot of this is more or less automated, this process and the usability testing take our team about one to two weeks if all goes well. And although it’s a lot of work and can be stressful at times, I don’t believe any team member would push to remove any of these steps; if anything, we are always looking to add more. Maybe it’s the eternal optimist in me, but I think that a super comprehensive suite of tests may actually help users feel comfortable updating to a version that we can support.


(Although we currently don’t make these three tests public, we are planning to set up something that summarizes them for each version. Stay tuned for that!)
