
An interesting thought experiment (9 days to go)

It’s been a while since my last blog post, and for good reason: I’ve had my head down deep in the architecture of www.software-monitor.com. Changing software architecture and reworking vital parts of the code becomes more expensive in time and resources the later it occurs in the life of the software, so I have been reworking and refactoring constantly over the past five days. It is vitally important that every design decision I make gives the very best possible performance.

www.software-monitor.com is developed using ASP.NET, and hence I’m using ADO.NET. Sticking with the Microsoft technologies, I’m running SQL Server 2005 Enterprise Edition on a Windows Server 2003 dedicated server powered by an Intel dual-core E2160 with 1GB of RAM and 160GB of RAID 1 storage, on a 100Mbit connection in Canada. I have a mirrored secondary server in the UK running Windows Server 2008 and SQL Server 2005 Enterprise. Both servers are currently running and interoperate to maximise uptime. I’ve arranged with my hosting providers to increase hardware capacity in the short term and to bring additional servers online as the service grows. Of course, if nobody signs up I’ll have a bunch of expensive equipment idling away, but it’s better to be safe than sorry and plan for scalability from the start.

So I’ve parameterised all my SQL queries and moved much of the processing into stored procedures. I’ve eliminated every DataSet, DataTable and TableAdapter from the code and am using the far more efficient SqlDataReader object instead (see the sketch after the thought experiment below). This has meant that I’ve lost the ability to bind data sources directly to the Infragistics components. As beautiful as they are, they mostly benefit the cubicle-based developer who has the luxury of using these powerful but inefficient disconnected data objects. I don’t have the luxury of a big server serving only intranet users; let me explain by asking a question to all the developers out there:

How much system resource does your application use, on each of your customers’ machines at this very moment? Several gigabytes of objects in RAM? Several gigabytes of data in a database?

Now imagine hosting all those objects in memory on just a single machine. Every customer’s session must remain responsive.

(Those big data manipulation objects and flexible datatypes aren’t looking so good now, are they?)
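
For the curious, here’s the shape of the data access pattern I’ve settled on: a parameterised query streamed through a SqlDataReader. This is a minimal sketch only; the table and column names are made up for illustration, not Software Monitor’s real schema.

    // Minimal sketch: parameterised query + forward-only SqlDataReader.
    // FeatureUsage, FeatureName, HitCount and @ProductId are hypothetical.
    using System;
    using System.Data;
    using System.Data.SqlClient;

    class FeatureUsageReader
    {
        static void PrintUsage(string connectionString, int productId)
        {
            using (SqlConnection conn = new SqlConnection(connectionString))
            using (SqlCommand cmd = new SqlCommand(
                "SELECT FeatureName, HitCount FROM FeatureUsage WHERE ProductId = @ProductId",
                conn))
            {
                // A typed parameter: no string concatenation, no injection risk,
                // and SQL Server can cache and reuse the query plan.
                cmd.Parameters.Add("@ProductId", SqlDbType.Int).Value = productId;

                conn.Open();
                // The reader streams rows forward-only off the connection, so
                // nothing like a DataSet is ever materialised in server memory.
                using (SqlDataReader reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        Console.WriteLine("{0}: {1}",
                            reader.GetString(0), reader.GetInt32(1));
                    }
                }
            }
        }
    }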

This has proved to be the biggest technical challenge of the 30 days so far, and something whose depth I didn’t fully appreciate at the start of the challenge. It’s been an expensive task to give serious development time to debating the performance of each and every design decision: to rework the database, the indexes, the data types (int or bigint? char, varchar, nvarchar or text?), the stored procedures and the in-memory objects, and then to load test the effectiveness of the service. It’s cost me several days, but ultimately that is preferable to having to rework once live customer data has entered the system.
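
To give a flavour of how those type decisions surface in the data access code, here’s a hypothetical stored procedure call with explicitly typed parameters (usp_LogEvent and its parameters are illustrative only, not the real schema):

    // Hypothetical sketch: calling a stored procedure with parameters whose
    // SqlDbTypes match the column types chosen in the database.
    using System.Data;
    using System.Data.SqlClient;

    static class EventLogger
    {
        public static void LogEvent(SqlConnection conn, long sessionId, string eventCode)
        {
            using (SqlCommand cmd = new SqlCommand("usp_LogEvent", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;

                // bigint maps to Int64: a busy logging table could outgrow
                // int's ~2.1 billion ceiling.
                cmd.Parameters.Add("@SessionId", SqlDbType.BigInt).Value = sessionId;

                // varchar(16) rather than text: short codes don't need LOB
                // storage, and the explicit length keeps cached plans consistent.
                cmd.Parameters.Add("@EventCode", SqlDbType.VarChar, 16).Value = eventCode;

                cmd.ExecuteNonQuery();
            }
        }
    }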

Happy Summer Solstice!
(Was Stonehenge built in 30 days?)

I do have screenshots to post, and www.software-monitor.com certainly looks better than any website I’ve ever written. It uses basic cascading style sheets (CSS), and all the HTML has been written and inspected by hand, from scratch. There’s a good helping of AJAX and JavaScript present too. I’m about two weeks “behind” where I’d like to have been by now, and although the website is a work in progress I don’t want to open the doors until the API is completely ready for consumption. Then I’m going to put together a screenshot and screencast tour, and who knows, maybe I’ll make up for the delay with a candid video tour of software-monitor in person 🙂

I’m keen to get as much feedback on my idea as possible. If you have a software product and would like to integrate with a quick and easy web service to provide analytics and statistics, what would you like to see? I’ve got several bases covered: feature use, error logs, licensing, subscriptions and version checking. What else would you like to see, or is there anything specific that you’d like to see me blog about?

Yours,

Mike

3 comments
  • ColinM 25th June 2008, 2:38 am

    Seems like a B2B thing to me. Not sure how, oooh, I dunno, a bingo card creator would benefit from this.

    Maybe it can say how “form X” takes 10 minutes compared to the new “form Y” which takes 8 minutes, saving $Z. That could be a powerful feature.

    Just out of personal interest, any plans to support Silverlight? Not sure what information could be gathered that would be different from Google Analytics?

  • Steve Cholerton 25th June 2008, 9:24 am

    Your software sounds great. It’s early for my business to need something like this, but I can certainly see the benefits for someone who is selling a significant amount of software. The website looks great as well, simple and clean. – Steve

  • mike 25th June 2008, 1:59 pm

    @Colin; Thanks for your comment. I could certainly get the component part to perform software timing analysis. That would be cool, I’ll add it to my whiteboard 🙂

    Silverlight is supported. The information that differs from Google Analytics is the error reporting. What makes Software Monitor so useful to developers is the two-way integration: for example, you can check license key validity and expiry via the API (which in itself can be linked with your payment processors).
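
    Something along these lines, purely illustrative (SoftwareMonitorClient, LicenseStatus and CheckLicense are hypothetical names, not the final API):

        // Hypothetical sketch only; none of these names are the published API.
        static bool LicenseIsCurrent(string apiKey, string licenseKey)
        {
            SoftwareMonitorClient client = new SoftwareMonitorClient(apiKey);
            LicenseStatus status = client.CheckLicense(licenseKey);

            // A key is usable only if it's valid and not yet expired.
            return status.IsValid && status.Expiry > System.DateTime.UtcNow;
        }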

    @Steve; thanks for the compliment. I’m planning to provide some of the features for free (with a limited log size), so please consider the service once the doors have opened 🙂
