At DriveWorks World we briefly mentioned some of the testing that we’ve been doing here at DriveWorks, and I wanted to dive into more detail on a particular testing project called WatchTower, which we’ve been trialing during the development of DriveWorks Solo.
WatchTower is a suite of tools that we’ve been developing internally specifically to test DriveWorks’ model generation capabilities – both the “interactive” model generation we refer to as “OnDemand” and, more recently, our traditional bottom-up model generation, which we call “Queued”.
That adds up to a total of three different (bear with me, I can count, honest!) forms of model generation in DriveWorks Solo/Pro:
- OnDemand in Preview mode (DriveWorks Solo and Pro) – this is where we perform OnDemand with temporary file names, suppressing rather than deleting anything from the model.
- OnDemand in Finalize mode (DriveWorks Solo and Pro) – this is where we perform OnDemand with proper file names, deleting requested components and features from the model.
- Queued (DriveWorks Pro only)
On the face of it, model generation seems like quite a simple process – apply a few dimension values, update some custom properties, and job done – right?
That couldn’t really be much further from the truth. Focusing for a moment on just controlling SolidWorks instances (a feature that is coming to DriveWorks Solo in the near future), when we started the WatchTower project we identified over 400 unique tests just for controlling instances by suppressing, unsuppressing, and deleting them – and that isn’t even taking into account replacing instances with other components.
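To give a rough feel for why the numbers climb so quickly, here is an illustrative sketch of how a combinatorial test matrix can be enumerated. The dimensions and values below are purely hypothetical – they are not WatchTower’s actual parameters – but they show how a handful of variables multiply into hundreds of cases:

```python
from itertools import product

# Hypothetical dimensions for an instance-control test matrix.
# These names and values are illustrative only, not WatchTower's
# actual test parameters.
generation_modes = ["OnDemand Preview", "OnDemand Finalize", "Queued"]
initial_states = ["resolved", "suppressed", "lightweight"]
requested_ops = ["suppress", "unsuppress", "delete", "no-op"]
component_kinds = ["part", "sub-assembly", "patterned instance"]
nesting_levels = ["top level", "nested once", "nested twice"]

# Every combination of the dimensions above is a candidate test case.
cases = list(product(generation_modes, initial_states, requested_ops,
                     component_kinds, nesting_levels))

print(len(cases))  # 3 * 3 * 4 * 3 * 3 = 324 combinations, before edge cases
```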
Currently WatchTower encompasses over 2,000 fully automated tests covering core DriveWorks Solo model generation, takes about 4 hours to run, and produces around 1.5GB of data in over 16,000 files (about half of which are SolidWorks models) per test run.
For us though, this only represents the beginning.
We’re currently in the process of building a new test rig for WatchTower which will allow us to run tests on 12 different environments simultaneously – 3 major versions of SolidWorks, 2 major versions of Windows, and 2 platforms (32-bit and 64-bit). This will allow us to root out and work around differences between platforms, so we can make sure we deliver a consistent model generation experience. The rig is going to run around the clock on builds of DriveWorks to ensure that new features don’t break existing capabilities, and that we continue to work with new versions of SolidWorks before they are generally released.
For anyone keeping track of the numbers, that means a single run across all 12 platforms is going to produce in excess of 24,000 test results, 18GB of data, and 192,000 files.
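Those totals come straight from multiplying the single-environment figures above by the size of the environment matrix. A minimal sketch, using placeholder labels for the versions rather than the real ones:

```python
from itertools import product

# Per-run figures for a single environment, as quoted above.
TESTS_PER_RUN = 2_000
DATA_GB_PER_RUN = 1.5
FILES_PER_RUN = 16_000

# Placeholder labels -- the specific versions aren't the point here.
solidworks_versions = ["SolidWorks A", "SolidWorks B", "SolidWorks C"]
windows_versions = ["Windows X", "Windows Y"]
platforms = ["32-bit", "64-bit"]

environments = list(product(solidworks_versions, windows_versions, platforms))
count = len(environments)  # 3 * 2 * 2 = 12

print(count * TESTS_PER_RUN)    # 24,000 test results
print(count * DATA_GB_PER_RUN)  # 18.0 GB of data
print(count * FILES_PER_RUN)    # 192,000 files
```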
But obviously testing the different environments is only part of the story; we’re also making an ongoing commitment to creating WatchTower tests for:
- Functionality that isn’t currently covered – such as Advanced Feature Parameters
- Functionality that is coming in future releases
- Any and all customer reported issues involving model generation.
I hope you can see that WatchTower is a big deal for us. Automating SolidWorks models is what we do; it is the bread and butter of DriveWorks, and we’re committed to making sure everyone gets the most out of it.
Before I sign off: WatchTower is just one of a number of testing initiatives currently in progress at DriveWorks, and automated testing is something we continue to invest in heavily (in terms of the number of automated tests, WatchTower currently represents less than half of our overall automated test suite).
Philip, Lead Developer @ DriveWorks