Category Archives: RemObjects

Using RemObjects Train for Automated Testing and Building

I was recently tasked with coming up with an automated build process for one of my projects. Given the available free and commercial tools on the market, I decided to give RemObjects Train a try. Train is an Open Source project from RemObjects Software (one of their many Open Source offerings). Benefits of Train include:

  • It is Open Source
  • It has a simple, extensive, extensible API
  • Scripts are friendly to version control (based on ECMAScript rather than, say, XML)
  • The tool is simple and lightweight
  • It’s free – as in beer

My requirements for the build process were limited (one of the reasons I wanted a free, simple tool rather than a capable commercial tool such as FinalBuilder or Automated Build Studio):

  1. Support for running and failing based on unit tests (using MSTest)
  2. Support for building a release configuration of several .NET assemblies
  3. Support for packaging assemblies using Inno Setup
  4. Configurable version number used for assemblies and installer

Train ended up being a very elegant and usable solution. The scripts are easy to create and maintain – I used Sublime Text 2 with its JavaScript syntax highlighting.

In the end I created two scripts: RunTests.train and BuildRelease.train. By naming my scripts with a .train extension, I was able to associate them with Train.exe, so I can run my unit tests and build releases with a simple double-click.

My RunTests script looks like this:

//rebuild unit tests
msbuild.rebuild("../ExchangePurge.UnitTests/ExchangePurge.UnitTests.csproj", { configuration: "Release" });
msbuild.rebuild("../ExchangeSync.UnitTests/ExchangeSync.UnitTests.csproj", { configuration: "Release" });

//function for running MSTest unit tests, detecting failures
function runTest(testAssemblyPath) {
	var mstestPath = "C:/Program Files/Microsoft Visual Studio 11.0/Common7/IDE/MSTest.exe";
	shell.exec(mstestPath, "/testcontainer:" + testAssemblyPath);
}

//run unit tests
runTest("../ExchangePurge.UnitTests/bin/Release/ExchangePurge.UnitTests.dll");
runTest("../ExchangeSync.UnitTests/bin/Release/ExchangeSync.UnitTests.dll");

The script is pretty straightforward. It builds release versions of my unit test assemblies using the msbuild API provided by Train (API documentation can be found here). It then defines a simple function – runTest – for running unit tests with MSTest.exe using the shell API. Finally, the function is called for each unit test assembly.
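
If you also want to keep the test results around, a small variation of runTest could ask MSTest to write a .trx results file for each run (a sketch – /resultsfile is a standard MSTest.exe switch, but note that MSTest refuses to overwrite an existing results file):

//variation of runTest that also writes a .trx results file
function runTestWithResults(testAssemblyPath, resultsFilePath) {
	var mstestPath = "C:/Program Files/Microsoft Visual Studio 11.0/Common7/IDE/MSTest.exe";
	//MSTest.exe fails if resultsFilePath already exists, so use a fresh path per run
	shell.exec(mstestPath, "/testcontainer:" + testAssemblyPath + " /resultsfile:" + resultsFilePath);
}

runTestWithResults("../ExchangePurge.UnitTests/bin/Release/ExchangePurge.UnitTests.dll", "ExchangePurge.trx");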

My BuildRelease script looks like this:

//run unit tests
run("RunTests.train");

//get version info from VersionInfo.ini
var versionInfoIni = ini.fromFile("VersionInfo.ini");
var appVersion = versionInfoIni.getValue("VersionInfo", "AppVersion", "1.0");

//update the assembly versions for each .NET project using the above version info
msbuild.updateAssemblyVersion("../ExchangePurge/Properties/AssemblyInfo.cs", appVersion);
msbuild.updateAssemblyVersion("../ExchangeSync/Properties/AssemblyInfo.cs", appVersion);
msbuild.updateAssemblyVersion("../ExchangeSyncAdmin/Properties/AssemblyInfo.cs", appVersion);

//rebuild each .NET project
msbuild.rebuild("../ExchangePurge/ExchangePurge.csproj", { configuration: "Release" });
msbuild.rebuild("../ExchangeSync/ExchangeSync.csproj", { configuration: "Release" });
msbuild.rebuild("../ExchangeSyncAdmin/ExchangeSyncAdmin.csproj", { configuration: "Release" });

//export environment variable for InnoSetup to use for app version
export("ES_AppVersion", appVersion);

//build an InnoSetup installer
inno.build("Installer.iss", { });

//copy setup.exe to folder for this version
folder.create("Output/" + appVersion);
file.copy("Output/setup.exe", "Output/" + appVersion);

This script is a bit more complex but still pretty easy to understand. It starts by using the Train API to run the RunTests script. Because of the nature of MSTest.exe and Train.exe, if the RunTests.train script fails then the BuildRelease.train script will be aborted.

The script then uses Train’s ini API to read a configurable version from VersionInfo.ini. Next, the msbuild API is used to both update the assembly version information and build release versions of each assembly.

Next, I used the Train API to export an environment variable – ES_AppVersion – with the version information read earlier from the INI file. My Inno Setup script uses ISPP, which is installed by default if you download an Inno Setup QuickStart Pack. At the top of my Installer.iss Inno Setup script I used the following code (thanks to Carlo from RemObjects):

#define MyAppVersion GetEnv("ES_AppVersion")
#if MyAppVersion == ""
#define MyAppVersion "1.0"
#endif

And then, further down, I use the MyAppVersion ISPP macro:

[Setup]
AppVersion={#MyAppVersion}

This bit of ISPP macro code will make use of the ES_AppVersion environment variable set by the BuildRelease.train script to define the installer’s version.

Finally, the last lines of the BuildRelease.train script use the folder and file Train APIs to copy the created installer to a folder named after the version read from the INI file.

So far I’m very happy with Train as a simple, capable tool for automated testing and building. If you’d like to read more about Train you can refer to Marc Hoffman’s post on the project as well as the official documentation.

Objective-C Literals with Data Abstract

Xcode 4.4 introduced Objective-C Literals, which I think are great. Objective-C Literals are a collection of language enhancements that let you express many common Objective-C practices and patterns with less code. This includes boxing numbers, working with arrays, and working with dictionaries. For instance, here’s some sample code for writing and reading an array without using Objective-C Literals:

[array setObject:@"one" atIndexedSubscript:0];
[array setObject:@"two" atIndexedSubscript:1];
[array setObject:@"three" atIndexedSubscript:2];

NSLog(@"%@", [array objectAtIndex:0]);
NSLog(@"%@", [array objectAtIndex:1]);
NSLog(@"%@", [array objectAtIndex:2]);

Here is the same code, now using the new enhancements for working with arrays:

array[0] = @"one";
array[1] = @"two";
array[2] = @"three";

NSLog(@"%@", array[0]);
NSLog(@"%@", array[1]);
NSLog(@"%@", array[2]);

Xcode even includes a refactoring called Convert to Modern Objective-C Syntax that will help you convert most of your code to this newer format.

Data Abstract, a multi-platform library for data access by RemObjects Software, also supports this new syntax when dealing with rows of data. The syntax was added to Data Abstract, according to Marc Hoffman, right after the Objective-C Literals support was announced at WWDC. Here is an example of working with a row of data before this change:

[row setValue:@"one" forKey:@"firstColumn"];
[row setValue:@"two" forKey:@"secondColumn"];
[row setValue:@"three" forKey:@"thirdColumn"];

NSLog(@"%@", [row valueForKey:@"firstColumn"]);
NSLog(@"%@", [row valueForKey:@"secondColumn"]);
NSLog(@"%@", [row valueForKey:@"thirdColumn"]);

And here is the newer syntax:

row[@"firstColumn"] = @"one";
row[@"secondColumn"] = @"two";
row[@"thirdColumn"] = @"three";

NSLog(@"%@", row[@"firstColumn"]);
NSLog(@"%@", row[@"secondColumn"]);
NSLog(@"%@", row[@"thirdColumn"]);

Fantastic, I think! Easier to write, easier to read, and a very consistent and familiar way to access data by subscript. However, there is no built-in refactoring in Xcode to change your old DADataTableRow code to this new format.

Stuck doing this by hand? I think not!

You can migrate your code to the newer syntax using Xcode’s Find & Replace. Find & Replace in Xcode supports two powerful features we’ll use here: Regular Expression (regex) patterns and the ability to preview and reject changes one-by-one.

To get started, load up your project in Xcode. On the left, select the Search Navigator (or press ⌘3). At the top of the Navigator, select Replace if Find is selected. Click the magnifying glass with the down arrow to the left of the search box and click Show Find Options. Under Style, select Regular Expression.


To replace the instances of code where values are being written, use the following search pattern:

\[(.*) setValue:(.*) forKey:(.*)\];

And use the following replace pattern:

\1[\3] = \2;

To replace the instances of code where row values are being read, use the following search pattern:

\[(.*) valueForKey:(.*)\]

And the following replace pattern:

\1[\2]

Click the Preview button to preview changes first. You can use the sliders in the middle of the preview window to accept and reject individual changes.


Once you’ve reviewed the changes, click Replace and you’re done! I’m happy any time I can use Regular Expressions without ending up with more problems than I started with.

Using XtraRichEdit to Edit HTML in VCL Applications

One request heard frequently from users of DevExpress’s VCL controls is for an HTML editing control. So far this has been deferred by the folks at DevExpress. However, they do make a bang-up .NET control called XtraRichEdit which has great HTML editing functionality. Wouldn’t it be great if we could easily embed that control in a VCL application?

It turns out this isn’t too tough using RemObjects’ Hydra product. Hydra is a plugin framework for both Delphi and .NET that lets you mix and match both managed and native hosts and plugins, visual and non-visual. This means you can make use of .NET visual controls in Delphi applications and vice-versa.

Let’s take a look at how easy this is to do (hint – very easy).

You’ll need Visual Studio, Delphi, the DevExpress WinForms controls, and Hydra all installed in the same environment. While it’s possible to do this with Delphi and Visual Studio on separate machines, it’s beyond the scope of this post.

Create a new project in Visual Studio using the “Plugin Module” template in the “RemObjects Hydra” category. Afterward, use the “Add>New Item” dialog to create a new “Visual Plugin”. Add a RichEditControl to the new design surface and dock it in its parent.

This alone is enough to yield a plugin module that can be loaded in a Delphi application and display a working RichEditControl. However, there would be no way to access the richedit’s HTML text. For this we will need to use a custom interface. This is really easy to do.

Add the following interface declaration to the top of the code-behind:

    [Guid("4033C8A9-8A7C-4DE4-864C-B8F60EFFBDD7")]
    public interface IRichEditPlugin : IHYCrossPlatformInterface
    {
        string GetHtmlText();
        void SetHtmlText(string text);
    }

This will require using RemObjects.Hydra.CrossPlatform and System.Runtime.InteropServices. Both the Guid attribute and descending from IHYCrossPlatformInterface are required by Hydra. Finally, implement the interface:

    public partial class RichEditPlugin : VisualPlugin, IRichEditPlugin
    ...
        public string GetHtmlText()
        {
            return richEditControl1.HtmlText;
        }

        public void SetHtmlText(string text)
        {
            richEditControl1.HtmlText = text;
        }

And that’s it for the .NET side of things.

On the Delphi side of things, create a new VCL forms project. Add a THYModuleManager, TPanel, TMemo, and two TButtons.

Click the Hydra menu followed by Import Interfaces from .NET Assemblies. Browse for the assembly created above and click Import on the dialog listing the IRichEditPlugin interface. Then, click the Hydra menu followed by Hydra Package Settings. Select “Build with Hydra Core Packages” and click OK.

Adding a minimal amount of code will get the RichEditControl showing:

procedure TForm2.FormCreate(Sender: TObject);
begin
  HYModuleManager1.LoadModule('RichEditModule.dll');
  HYModuleManager1.CreateVisualPlugin('RichEditModule.RichEditPlugin', FRichEditPlugin);
  FRichEditPlugin.ShowParented(Panel1);
end;

procedure TForm2.FormDestroy(Sender: TObject);
begin
  FRichEditPlugin := nil;
  HYModuleManager1.UnloadModules;
end;

FRichEditPlugin is a private field declared as IHYVisualPlugin.
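
For reference, the relevant parts of the form declaration look something like this (a sketch – the component names are simply the designer defaults):

type
  TForm2 = class(TForm)
    HYModuleManager1: THYModuleManager;
    Panel1: TPanel;
    Memo1: TMemo;
    Button1: TButton;
    Button2: TButton;
    procedure FormCreate(Sender: TObject);
    procedure FormDestroy(Sender: TObject);
    procedure Button1Click(Sender: TObject);
    procedure Button2Click(Sender: TObject);
  private
    FRichEditPlugin: IHYVisualPlugin; //holds the hosted .NET plugin instance
  end;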

After you copy the assembly created in .NET (RichEditModule.dll in my example) to the same path as your debug executable, running the Delphi application will now show the .NET RichEditControl embedded and fully functional.

Adding a bit more code will allow getting and setting the HTML text in the .NET RichEditControl using the text entered in the TMemo:

//Set button
procedure TForm2.Button1Click(Sender: TObject);
var
  RichEditPlugin: IRichEditPlugin;
begin
  if Supports(FRichEditPlugin, IRichEditPlugin, RichEditPlugin) then
    RichEditPlugin.SetHtmlText(Memo1.Text);
end;

//Get button
procedure TForm2.Button2Click(Sender: TObject);
var
  RichEditPlugin: IRichEditPlugin;
begin
  if Supports(FRichEditPlugin, IRichEditPlugin, RichEditPlugin) then
    Memo1.Text := RichEditPlugin.GetHtmlText;
end;

This new code will require adding the "_Import.pas" unit, generated when you imported the .NET interface, to your uses clause.

And that’s it. Press the run button and then try the Get and Set buttons along with editing text in the RichEditControl.

XtraRichEdit in a VCL App

This is pretty powerful stuff thanks to Hydra and the XtraRichEdit, and very easy to do! You can download the C# and Delphi source code here.

Delay Applying Updates in Data Abstract iOS Apps

Even though the code generated by the Data Abstract for Xcode template for iOS apps is mostly asynchronous, you may notice delays in the UI when applying updates to your server. If you check out the asyncRequest:didFinishApplyingChangesForTables:withErrors: method in DataAccess.m, you’ll see that, once an asynchronous request finishes, your data will be saved to the local briefcase file:

- (void)asyncRequest:(DAAsyncRequest *)request didFinishApplyingChangesForTables:(NSArray *)tables withErrors:(NSArray *)errors
{
	[self saveData];
	[self setBusy:NO];
...

This happens on the main thread and, given a large enough set of data, will cause your app to stutter.

Luckily this is pretty straightforward to address. Cocoa makes it easy enough to run code on a background thread, and that’s exactly what Alexander from RemObjects suggests:

- (void)saveDataInBackground {
    [self performSelectorInBackground:@selector(saveData) withObject:nil];
}

- (void)asyncRequest:(DAAsyncRequest *)request didFinishApplyingChangesForTables:(NSArray *)tables withErrors:(NSArray *)errors
{
    [self saveDataInBackground];
    [self setBusy:NO];
...

I’ve been using a form of the above solution for a while now and can confirm that things are running smoothly, even with a larger set of data. Alexander let me know that this would be addressed in the Xcode templates for a future DA update.
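
As an aside, Grand Central Dispatch works just as well as performSelectorInBackground: here; a GCD-based variation (my own, not from the template) would look like this:

- (void)saveDataInBackground {
    //push the (potentially slow) briefcase save onto a background queue
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        [self saveData];
    });
}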

Offline Mode with Data Abstract for Xcode

Offline Mode Sample App

Data Abstract for Xcode makes it simple to support disconnected n-tier iPhone applications. The briefcase support is a snap, requiring only a few extra lines to be uncommented from the templates installed into Xcode by the Data Abstract installer.

I recently looked into adding support for an “offline” mode to a Data Abstract application I’ve been developing for the iPhone. Basically, the application should function whether the iOS device has an active network connection or not. If there is no connection, any changes to data should be cached until a network connection becomes available.

This ended up being pretty easy to do. I started by adding Reachability support to the application. You can read detailed instructions about how to do that on Stack Overflow.
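
In short, the setup amounts to registering for reachability change notifications and starting the notifier (a sketch based on Apple’s Reachability sample; internetReachable is the Reachability instance variable, and checkNetworkStatus: is the handler containing the switch shown further down):

//register for reachability changes and start watching for them
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(checkNetworkStatus:)
                                             name:kReachabilityChangedNotification
                                           object:nil];
internetReachable = [Reachability reachabilityForInternetConnection];
[internetReachable startNotifier];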

Once Reachability support has been added, it’s trivial to alter the beginApplyUpdates code in DataAccess.m to only apply updates when a network connection is available:

- (DAAsyncRequest *)beginApplyUpdates
{
	if (!internetActive) {
		return nil;
	}
        ...
}

Then, modify the Reachability usage to apply any unsaved updates when a connection becomes available:

		case ReachableViaWiFi:
		case ReachableViaWWAN: {
			NSLog(@"The internet is working via WIFI.");
			internetActive = YES;
			//have we received data before?
			if (self.daTestTable) {
				//setup data adapter (to get a new connection)
				[self setupDataAdapter];
				//commit any pending changes to the server
				[self beginApplyUpdates];
			}

			break;
		}

The only tricky bit I found was the need to re-create the DARemoteDataAdapter after a network connection becomes available. This is because you may have an old connection that will cause an error if you try to reuse it.

You can download a sample application here. The archive includes a SQL script for generating the sample DB table.

Using RemObjects Olympia Server with a Custom Session Class

Today I started playing with Olympia Server for out-of-process session management. We have a RemObjects SDK service that can currently use either in-memory or in-database session state. The in-memory option is faster, but limits our pooling options.

The various session managers in RemObjects SDK are pluggable and swappable, so adding support for Olympia Server to our service was pretty straightforward. However, I quickly ran into a limitation when using Olympia Server for session management: if you make use of a custom TROSession descendant by handling CustomCreateSession (discussed here), your custom properties will not be available after they make the trip to Olympia Server and back.

After looking into the source for TROSession and TROOlympiaSessionManager, I was able to come up with a fairly simple solution that allows you to continue using a custom TROSession descendant while also making use of Olympia for out-of-process session management.

The key is to override both SaveToStream and LoadFromStream in your custom TROSession descendant. There, serialize and deserialize your custom property values using the Values name-value pair property found on TROSession.

procedure LoadFromStream(aStream: TStream; OnlyValues: Boolean = False); override;
procedure SaveToStream(aStream: TStream; OnlyValues: Boolean = False); override;

...

procedure TMySession.LoadFromStream(aStream: TStream; OnlyValues: Boolean);
begin
  inherited;
  MyProperty := Values['MyProperty'];
  MyStrings.CommaText := Values['MyStrings'];
end;

procedure TMySession.SaveToStream(aStream: TStream; OnlyValues: Boolean);
begin
  Values['MyProperty'] := MyProperty;
  Values['MyStrings'] := MyStrings.CommaText;
  inherited;
end;
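
For completeness, the surrounding class declaration might look something like this (a sketch – only the two overridden methods and the Values usage are the essential parts):

type
  TMySession = class(TROSession)
  private
    FMyProperty: string;
    FMyStrings: TStrings; //created in the session's constructor (not shown)
  public
    procedure LoadFromStream(aStream: TStream; OnlyValues: Boolean = False); override;
    procedure SaveToStream(aStream: TStream; OnlyValues: Boolean = False); override;
    property MyProperty: string read FMyProperty write FMyProperty;
    property MyStrings: TStrings read FMyStrings;
  end;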

That Values property is one of the properties that will be persisted round trip with Olympia, and will allow your service to continue working as-is while making use of both a custom session class and Olympia Server.

Domain Business Processors with Data Abstract for .NET

One of the nice things you can do out-of-the-box with Data Abstract for Delphi is have your server-side business logic segmented into separate classes for each logical data table. This technique is described here:

Strongly Typed DataTables and Business Helper Classes

I’ve recently started using Data Abstract for .NET, and love the new features it brings, such as DA LINQ and DA SQL. While this per-data-table business logic isn’t available out of the box in the .NET version, it’s pretty straightforward to cook up.

Here is an example base class that encapsulates the standard Data Abstract BusinessProcessor, and here’s a sample CustomerBusinessProcessor that ensures a name is specified.

Using this class, you can write your own descendants, and then new them up in your service’s implementation unit like so:

FProcessors := new List<DomainBusinessProcessor>;
FProcessors.Add(new CustomerBusinessProcessor(self.components));

The essential bit here is passing in the components collection to the constructor, which is passed on to the constructing call for the internal BusinessProcessor. This is necessary as it is how Data Abstract finds BusinessProcessors for a given service.
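
The linked source isn’t reproduced here, but the shape of the base class is roughly the following (an Oxygene sketch under a couple of assumptions: BusinessProcessor accepts an IContainer in its constructor, as described above, and lives in the RemObjects.DataAbstract.Server namespace):

uses
  System.ComponentModel,           // IContainer
  RemObjects.DataAbstract.Server;  // BusinessProcessor (namespace assumed)

type
  DomainBusinessProcessor = public class
  private
    fProcessor: BusinessProcessor;
  protected
    //descendants (such as CustomerBusinessProcessor) hook the processor's
    //validation/delta events here to implement their per-table rules
    property Processor: BusinessProcessor read fProcessor;
  public
    constructor(aContainer: IContainer);
  end;

constructor DomainBusinessProcessor(aContainer: IContainer);
begin
  //handing the service's components collection to the inner BusinessProcessor
  //is what lets Data Abstract associate the processor with the service
  fProcessor := new BusinessProcessor(aContainer);
end;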

You can download a sample here, which uses the SQLite PCTrade DB that ships with Data Abstract.

tide.Northwind.Customers – Part 2

In my previous post on the Customers plugin for the tide Northwind demo, we created an IMasterWindow plugin that displayed the Northwind customers, and allowed for inline editing within DevExpress’ .NET GridControl. This enabled us to create, edit, and delete customers from the Northwind database. However, as I expressed before, I’m not a huge fan of editing within a grid, unless I know the target audience is very savvy. With that in mind, we’ll now create an IDetailWindow plugin that allows us to inspect the details of a row, and make changes within that window, rather than in the grid control itself.

First, let’s look at the key methods within tide that we’ll either be calling or implementing within our plugins:

  • void IHostApplication.DisplayMasterDetails(IMasterWindow sender)
    Called from IMasterWindow plugins to display relevant detail windows.
  • void IHostApplication.DisplayDetailWindow(IMasterWindow parent, IDetailWindow plugin)
    Called from IDetailWindow plugins to display themselves, parented in an IMasterWindow plugin.
  • void IDetailWindow.DisplayDetails(IMasterWindow sender)
    Implemented by IDetailWindow plugins, to handle displaying details for relevant IMasterWindow plugins.

Let’s get started. We’ll start by adding a new Hydra Visual Plugin to our existing Northwind.Customers Hydra Plugin Module project. Then, we’ll copy the customers binding source from our previously created BrowsePlugin form to our new plugin form, add several DevExpress TextEdits to a LayoutControl instance, and bind them to the customers binding source.

As before, we’ll add a couple of tide namespaces to our using clause:

using tide.PluginInterfaces;
using tide.Utilities.Plugins;

The tide.PluginInterfaces namespace gives us access to the various plugin interfaces that can be implemented. The tide.Utilities.Plugins namespace provides some base classes we can descend from that give us some nice built-in functionality. We’ll call this plugin DetailsPlugin and adjust its declaration as shown below:

public partial class DetailsPlugin : DetailWindow

The DetailWindow class, declared in tide.Utilities.Plugins, implements IDetailWindow with default methods. It also descends from BaseWindow, which has some niceties such as automatically persisting DevExpress GridControl changes and LayoutControl customizations.

IDetailWindow works hand-in-hand with IMasterWindow, providing a detail view for the data presented in a class that implements IMasterWindow:

public interface IDetailWindow
{
    void DisplayDetails(IMasterWindow sender);
    bool BeforeSaveDetails(IMasterWindow sender);
    bool AfterSaveDetails(IMasterWindow sender);
    string Caption();
    int Order();
    Image Glyph();
}

Some of IDetailWindow’s methods relate to how the view is presented in the application, such as Caption(), Order(), and Glyph(). Others, such as DisplayDetails(), BeforeSaveDetails(), and AfterSaveDetails(), allow our class to respond to certain events that happen to the IMasterWindow, such as displaying that IMasterWindow’s DetailsObject(), and taking certain actions before or after the IMasterWindow’s DetailsObject() is saved.

We’ll override a few of the DetailWindow methods in our DetailsPlugin, giving us what we need to view the details of a Northwind customer:

public override void DisplayDetails(IMasterWindow sender)
{
    if (sender is BrowsePlugin)
    {
        RemoveControlErrors();

        customersBindingSource.SuspendBinding();
        customersBindingSource.DataSource = sender.DetailsObject();
        customersBindingSource.ResumeBinding();

        (Host as IHostApplication).DisplayDetailWindow(sender, this);
    }
}

public override bool BeforeSaveDetails(IMasterWindow sender)
{
    return dxValidationProvider1.Validate();
}

public override string Caption()
{
    return "Customer Details";
}

public override Image Glyph()
{
    return imageCollection1.Images[0];
}

Our DisplayDetails() implementation checks to see if the calling IMasterWindow is our BrowsePlugin. If so, it simply binds the IMasterWindow’s DetailsObject() to the binding source on our form, and then calls the host’s DisplayDetailWindow() method, which shows the detail window parented in the specified master window.

We also override BeforeSaveDetails() so that we can use a dxValidationProvider to ensure the necessary values are filled in for our customer. The call to RemoveControlErrors() in DisplayDetails() is simply a helper method that removes any existing validation errors when a new object’s details are shown.

Now, we’ll alter the code we wrote before in our BrowsePlugin so that, instead of editing within the grid, our IDetailWindow implementation is displayed instead. We’ll be using a second dataset, editingNorthwindDataSet, as a temporary holding place for the row we are editing. This allows us to make changes in our IDetailWindow plugin without directly affecting the list in the IMasterWindow plugin until the changes are saved.

First, we’ll change our NewItem() method from:

public override void NewItem(object refId)
{
    customersBindingSource.AddNew();
}

to:

public override void NewItem(object refId)
{
    editingCustomersRow = editingNorthwindDataSet.Customers.NewCustomersRow();
    InitializeCustomerRow(editingCustomersRow);

    editingNorthwindDataSet.Customers.Rows.Clear();
    editingNorthwindDataSet.Customers.Rows.Add(editingCustomersRow);

    (Host as IHostApplication).DisplayMasterDetails(this);
}

Here, we create a new row, initialize it with default values, and add it to our editing dataset. Then, we call the host’s DisplayMasterDetails() method, which will iterate through plugins that implement IDetailWindow, calling DisplayDetails() on them, passing in the calling IMasterWindow.
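
As an aside, InitializeCustomerRow just fills in whatever defaults the data needs; a purely illustrative sketch (the Northwind Customers table at least requires a CustomerID) might look like this:

private void InitializeCustomerRow(NorthwindDataSet.CustomersRow customersRow)
{
    //illustrative defaults only – set whatever your application requires
    customersRow.CustomerID = "NEWID";
    customersRow.CompanyName = string.Empty;
}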

In order for this to work (as seen in our IDetailWindow plugin’s DisplayDetails() implementation above), we need our BrowsePlugin to override DetailsObject(), returning the object that our IDetailWindow plugin is interested in. In this case, it’s our editing row:

public override object DetailsObject()
{
    return editingCustomersRow;
}

Then, we’ll change our implementations of SaveItemEnabled() and SaveItem() from:

public override bool SaveItemEnabled()
{
    return northwindDataSet.HasChanges();
}

public override bool SaveItem()
{
    customersTableAdapter.Update(northwindDataSet.Customers);
    return true;
}

to:

public override bool SaveItemEnabled()
{
    return editingNorthwindDataSet.HasChanges();
}

public override bool SaveItem()
{
    customersTableAdapter.Update(editingNorthwindDataSet);
    northwindDataSet.Customers.LoadDataRow(editingCustomersRow.ItemArray, System.Data.LoadOption.OverwriteChanges);
    return true;
}

Here, we’re simply checking our editing dataset for changes rather than our browsing dataset. With SaveItem(), we save our editing dataset rather than our browse dataset, and we load the added/updated row back into our browsing dataset. Easy enough!

Now, we’ll change our DeleteItem() implementation slightly. Before, since we were editing within the grid, we simply removed the current row in our DeleteItem() implementation:

public override void DeleteItem()
{
    customersBindingSource.RemoveCurrent();
}

These changes would be saved in our previous SaveItem() implementation. However, we’ve now changed the paradigm. DeleteItem() needs to apply the changes, so we’ll make some slight modifications, adding a message dialog to confirm our deletion and then applying the changes:

public override void DeleteItem()
{
    if (MessageBox.Show("Delete the selected customer?", "Delete", MessageBoxButtons.YesNo, MessageBoxIcon.Question) == DialogResult.Yes)
    {
        customersBindingSource.RemoveCurrent();
        customersTableAdapter.Update(northwindDataSet);
    }
}

The final piece to making this all work is disabling editing in our GridControl by setting the view’s OptionsBehavior.Editable option to false, and then handling the GridControl’s DoubleClick event with the following code:

private void customersView_DoubleClick(object sender, EventArgs e)
{
    NorthwindDataSet.CustomersRow customersRow = (NorthwindDataSet.CustomersRow)northwindDataSet.Customers.Rows[customersBindingSource.Position];

    editingNorthwindDataSet.Customers.Rows.Clear();
    editingNorthwindDataSet.Customers.ImportRow(customersRow);
    editingCustomersRow = (NorthwindDataSet.CustomersRow)editingNorthwindDataSet.Customers.Rows[0];

    (Host as IHostApplication).DisplayMasterDetails(this);
}

Similar to our NewItem() implementation, we’re mostly just juggling dataset rows. We take the current customer row in our browsing dataset and import it into our editing dataset. We save a reference to our editing row (as it is used by our DetailsObject() implementation), and then call the host’s DisplayMasterDetails() method (as in our NewItem() implementation). As stated above, this will display our IDetailWindow plugin, parented in our IMasterWindow plugin.

And that’s it! You can now create and edit customers within a dedicated detail window:

Widgets and Gadgets Northwind Client

You can have as many IDetailWindow implementations as you like. For each one that calls IHostApplication’s DisplayDetailWindow method for a given IMasterWindow, a tab will be created to display that IDetailWindow plugin (that’s where the Order() method comes into play). You can also have completely separate Hydra Plugin Modules with IDetailWindow plugins that parent in IMasterWindow plugins from other modules. It’s all fairly loosely coupled, while allowing for a nice degree of integration between plugins.

In the next post, we’ll get into creating a plugin for Northwind Orders, and show how to do things such as showing customer details from an order’s details or creating a new order from a selected customer’s details.

DevExpress Gems – The VCL Filter Control

Recently I was tasked with creating a visual query builder for reporting. The existing query builder was pretty simplistic and did not allow for grouping conditions. After talking to a few colleagues and looking into some recommended solutions, a consultant and coworker (and close friend) of mine suggested something that is now obvious, but at the time caught me off guard: why not try utilizing the same filter control shown when you customize the filter of the DevExpress VCL Quantum Grid?

Using the filter control from the Quantum Grid component would immediately carry several benefits:

  • We wouldn’t have to purchase an additional control
  • Our query builder for reports would be the same control customers already use to customize our grids
  • Our query builder for reports would visually match the rest of our application

I was optimistic about being able to reuse the filter control, but honestly thought I’d be digging into the source for the existing Quantum Grid filter customization dialog, hoping for some source I could use. Once I sat back down to my development environment, I was delighted (to say the least) to find both TcxFilterControl and TcxDBFilterControl in my tool palette. This was not the first time (nor, I suspect, the last) that I was completely surprised to find some controls in my DevExpress arsenal of which I was previously unaware.

Did these controls allow us to provide a flexible query builder capable of creating complex SQL? You betcha.

Out of the box, TcxFilterControl is meant to be wired up to something like a TcxGrid table view, while the TcxDBFilterControl is meant to be wired up to a TDataSet descendant. While I normally shy away from database-aware controls in the VCL, the TcxDBFilterControl seemed to be more suited for our needs.

To get started using a TcxDBFilterControl, you can simply drop the control on a form and hook it up to a TDataSet descendant. Immediately, the control is capable of creating complex filters, with the field selections in the filter control populated with the field definitions from the TDataSet descendant.

If you are working with a standards-compliant SQL engine, using the TcxDBFilterControl to query data requires only a handful of lines of code. Given a simple VCL project with an ADO connection to the Northwind database, a TcxDBFilterControl hooked up to the EmployeeDefsQuery component, and a grid control hooked up to the EmployeeResultsQuery component, the following code is all you need:

procedure TDemoForm.FormCreate(Sender: TObject);
begin
  EmployeeDefsQuery.SQL.Text := 'Select * From Employees Where 0 = 1';
  EmployeeDefsQuery.Open;
end;

procedure TDemoForm.FetchButtonClick(Sender: TObject);
var
  SqlStatement, WhereClause: string;
begin
  EmployeeResultsQuery.Close;
  SqlStatement := 'Select * From Employees';
  WhereClause := EmployeeFilterControl.FilterText;
  if WhereClause <> '' then
    SqlStatement := SqlStatement + ' Where ' + WhereClause;
  EmployeeResultsQuery.SQL.Text := SqlStatement;
  EmployeeResultsQuery.Open;
  if EmployeesTableView.ItemCount = 0 then
    EmployeesTableView.DataController.CreateAllItems;
end;

Our database engine is not fully SQL compliant, plus there were some niceties that our previous report query builder supported that had to be preserved (such as not requiring the end-user to use the % mask character with LIKE conditions). While crafting custom SQL from the filter control is not trivial, it is entirely possible and we were able to meet all of our requirements with this very useful control.

To create your own SQL from the supplied filter control, you’ll need to parse the TcxFilterControl.Criteria.Root property. The items within the criteria list can either be criteria items or additional criteria lists (TcxFilterCriteriaItem and TcxFilterCriteriaItemList). The item lists have a BoolOperatorKind property, while the criteria items have an Operator property, along with additional properties, that allow you to fully inspect the filter specified in the filter control and create your own SQL statement. While this is outside the scope of this post, feel free to contact me for more details on how to accomplish this.
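
To give a flavor of what that parsing looks like, here is a heavily simplified sketch of a recursive walk over the criteria tree. GetFieldName and FormatValue are hypothetical helpers that map a criteria item’s linked column to a field name and quote its value for your SQL dialect, only two operators are handled, and the property and enum names follow the ones mentioned above – check them against your cxFilter version:

function TDemoForm.BuildWhereClause(AList: TcxFilterCriteriaItemList): string;
var
  I: Integer;
  Item: TcxCustomFilterCriteriaItem;
  Part, Joiner: string;
begin
  Result := '';
  //BoolOperatorKind tells us how the children of this list are combined
  case AList.BoolOperatorKind of
    fboOr: Joiner := ' Or ';
  else
    Joiner := ' And '; //fboAnd; the Not variants are ignored in this sketch
  end;
  for I := 0 to AList.Count - 1 do
  begin
    Part := '';
    Item := AList.Items[I];
    if Item is TcxFilterCriteriaItemList then
      //a nested group: recurse and wrap it in parentheses
      Part := '(' + BuildWhereClause(TcxFilterCriteriaItemList(Item)) + ')'
    else if Item is TcxFilterCriteriaItem then
      //a single condition: translate the operator ourselves
      case TcxFilterCriteriaItem(Item).Operator of
        foEqual:
          Part := GetFieldName(Item) + ' = ' + FormatValue(Item);
        foLike:
          //add the % masks here so the end-user doesn't have to
          Part := GetFieldName(Item) + ' Like ' + FormatValue(Item);
      end;
    if Part <> '' then
    begin
      if Result <> '' then
        Result := Result + Joiner;
      Result := Result + Part;
    end;
  end;
end;

The FetchButtonClick handler shown earlier would then call BuildWhereClause(EmployeeFilterControl.Criteria.Root) instead of reading FilterText.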

I love being surprised by existing controls, new to me, that DevExpress has provided in their suites, allowing me to get my job done better and faster without additional investments in time and money. To be honest, when I thought visual query builder, I just did not think DevExpress. To me this is a showcase control (as seen from other vendors), but DevExpress’s implementation provided everything we needed to get our new report query builder going in no time.

A License for Tide

I’ve had some really good communications with Ron Grove concerning tide. After expressing some interest in the project, I sent him the source code for tide and the Northwind client, which prompted some discussion of the licensing I’d be using for the project.

After a couple of nights reading about many different licensing options, and some great advice from Ron, I think I’ve finally settled on using the FreeBSD/Simplified BSD license. This seems like the least restrictive permissive license that maintains GPL compatibility. I was also considering the MS-PL, but it is not GPL compatible. Now, I don’t personally think that’s a huge stopping point, but I figured if I can use a GPL-compatible license, why not?

Hopefully this licensing choice will suit all who wish to download, examine, learn from, use, and possibly even modify or redistribute tide. I’ll be wrapping up the second part of the customers plugin soon, and hope to have a blog post up about that this weekend.

Thanks to Ron for helping with my licensing questions!