11. Client-Server Application Design

Various bottlenecks need to be addressed when designing an application that is optimised for client-server operation. This section discusses the broad issues and indicates how the application should be designed for maximum performance.

Performance Bottlenecks

Client-server designs built with graphical front-end tools must contend with user expectations of flexible, on-demand access to data. Users expect an interface as responsive as a local spreadsheet, while the system may need to support hundreds of simultaneous users over the network, so considerable care must be taken over performance.

The final user interface needs to be a trade-off between performance and the usability that is built into the application. It is possible to build mainframe-style applications that support hundreds of users at the cost of usability, or very usable database applications that support only ten.

Some of the performance issues are detailed below:

· Network traffic is the greatest bottleneck. In high-volume transaction processing systems, a minimum amount of data should be transferred between client and server, in a minimum number of transactions.

· Many validations can be performed locally within the application, avoiding the loop of an update being attempted on the server, an error message passed back, the change made locally, and the update reissued to the server.

· Server resources can quickly be consumed by certain application designs and front-end programming tools. For example, each user requires at least one connection on the server, and each connection consumes server memory. Some applications use several connections per user, overloading the server.

· Some database servers can perform a considerable amount of processing on the server if appropriate triggers and stored procedures are set up. This functionality is server specific and prevents the application from working with all servers, but it significantly reduces traffic for many types of transaction.

· Pass-through queries can exploit high-performance features of a particular server, moving load off the client and onto the server.

Parameterised Views

Parameterised views are of great importance in reducing the amount of data retrieved from the server at any one time. Care should be taken that appropriate indexes are set up so that the server can optimise the remote view, and that the view parameters always specify a small amount of data.

Only the required fields (and primary keys) need be specified in a remote view. Be careful with memo and text fields and use the view parameters to reduce the network traffic for these large fields.
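
As a sketch, a parameterised remote view in Visual FoxPro might be defined as follows (the connection name, table, and parameter are illustrative):

```foxpro
* Assumes a named connection "conSales" already exists in the database.
* Only the columns the form needs are selected, and the ?pcCustId
* parameter limits the rows fetched from the server.
CREATE SQL VIEW vCustOrders ;
   REMOTE CONNECTION conSales ;
   AS SELECT ord_id, cust_id, ord_date, total ;
      FROM orders ;
      WHERE cust_id = ?pcCustId

* At run time, set the parameter before opening the view.
pcCustId = "C1001"
USE vCustOrders IN 0
```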

Some servers, including SQL Server, place read locks on records when they are read as part of a transaction that may be followed by updates. In a high-volume situation, keep the number of records read to a minimum, or terminate the transaction promptly to release the locks and reduce server overheads.

Preparing the view on the server will improve performance for a view that is repeatedly queried many times for a single user.
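
One way to do this in Visual FoxPro (a sketch; the view name and query are illustrative) is to mark the view as prepared, or to prepare an equivalent pass-through statement with SQLPREPARE():

```foxpro
* Mark a remote view as prepared so the server compiles the query
* once and reuses the execution plan on every REQUERY().
DBSETPROP("vCustOrders", "View", "Prepared", .T.)

* Pass-through equivalent: prepare once, execute many times.
lnHandle = SQLCONNECT("conSales")
SQLPREPARE(lnHandle, ;
   "SELECT ord_id, total FROM orders WHERE cust_id = ?pcCustId", "csOrders")
pcCustId = "C1001"
SQLEXEC(lnHandle)                && runs the prepared statement
pcCustId = "C1002"
SQLEXEC(lnHandle)                && re-executed without re-parsing
```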

Local Validations

Performing some validations programmatically at the workstation instead of relying on the server validations will reduce the number of transactions passing over the network due to incorrect user entry.

Field and Record level validation can be implemented using the local data dictionary or application logic on the workstation. These validations should not require any checks with data from the server so no network traffic is generated.

Field level validation is specified in the View Designer whilst Record level validation rules may be specified for a remote view using the DBSETPROP() command to set the RuleExpression property for the View.
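
For example (property names as in the DBSETPROP() documentation; the view and rules are illustrative):

```foxpro
* Field-level rule: the order total may not be negative.
DBSETPROP("vCustOrders.total", "Field", "RuleExpression", "total >= 0")

* Record-level rule on the view itself, checked before an update
* is sent to the server.
DBSETPROP("vCustOrders", "View", "RuleExpression", ;
   "NOT EMPTY(cust_id) AND total >= 0")
```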

Validations that require checks on server data are better performed on the server. This applies particularly to referential integrity checks and also to complex record level validations that can be performed with a Trigger.

In some cases a local copy of the server data should be made to reduce network traffic. For example, if a product list is relatively static, a local copy of the table could be made at the beginning of processing so that all validation is performed locally.
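
A sketch of taking a one-off local copy at start-up (connection and table names are illustrative):

```foxpro
* Copy the near-static product list to a local table at start-up
* so that later lookups generate no network traffic.
lnHandle = SQLCONNECT("conSales")
IF SQLEXEC(lnHandle, "SELECT prod_id, descr, price FROM products", "csProducts") > 0
   SELECT csProducts
   COPY TO (SYS(2023) + "\products")      && SYS(2023) = temp directory
   USE (SYS(2023) + "\products") ALIAS products IN 0
ENDIF
SQLDISCONNECT(lnHandle)
```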

An offline view can be used to create a hybrid system in which near-static data remains updateable; the user sees only local changes to the tables until the view is next refreshed from the server.
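
In Visual FoxPro this can be sketched with CREATEOFFLINE() (the view name is illustrative):

```foxpro
* Take the near-static products view offline for local editing.
CREATEOFFLINE("vProducts")
USE vProducts IN 0               && opens the local offline copy

* ... the user edits; changes are visible only locally ...

* When ready, close the view, reconnect, and send the stored changes.
USE IN SELECT("vProducts")
USE vProducts ONLINE EXCLUSIVE
TABLEUPDATE(1, .T., "vProducts")
* DROPOFFLINE("vProducts") would instead discard the offline data.
```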

Validations that are checked locally will also need to be implemented on the server if other applications are accessing server data. The cost of this approach is that changes need to be made to the server and client data dictionaries each time a change is made to the business rules affecting the application. The workstation application will also need to be upgraded each time a change is made.


Transactions

The frequency and size of transactions between the client and server can be controlled by the design of the application. Try to retrieve a small amount of data to the user and update in one transaction wherever possible to reduce network traffic.

Encourage the user to select a small number of records by providing a usable selection window that allows pinpointing of a small number of records before retrieving the data from the server.

Many data entry forms may search for information on a parent table and only show child related information if required. For example, an accounts system might allow the user to search for an invoice and then display the line items. Two views could be used in this case so that the line items are retrieved only if required for display to the user.

Some applications have several users constantly entering data. This data may not be required immediately on the server and the number of transactions may be reduced by updating the server after ten records have been entered instead of each time one record is entered.

Similarly, changes to multiple records that form part of a single transaction can be stored on the workstation and issued to the server in one transaction using table buffering, rather than issuing several transactions followed by a commit or rollback.
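
A sketch of this pattern with table buffering and a manual server transaction (view and connection names are illustrative):

```foxpro
* Buffer edits locally and send them to the server in one batch.
CURSORSETPROP("Buffering", 5, "vCustOrders")   && 5 = optimistic table buffering

* ... the user edits several records in vCustOrders ...

* Wrap the batch in a single server transaction.
lnHandle = CURSORGETPROP("ConnectHandle", "vCustOrders")
SQLSETPROP(lnHandle, "Transactions", 2)        && 2 = manual transactions
IF TABLEUPDATE(1, .T., "vCustOrders")          && send all buffered rows
   SQLCOMMIT(lnHandle)
ELSE
   SQLROLLBACK(lnHandle)
   TABLEREVERT(.T., "vCustOrders")             && discard the local changes
ENDIF
```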

Stored Procedures

Stored procedures run directly on the server and can use whatever performance features the server supports. For example, a SQL Server application might use a stored procedure to return the required information.

One example that yields large performance gains involves the sequences of postings common in accounting applications, where a transaction in one account must be matched by corresponding entries in other ledgers for double-entry bookkeeping.

Many applications issue a begin transaction on the server, send the first posting, send the matching double-entry postings to the other tables, and then commit. All of this work can be performed in a single stored procedure by passing the amount and the ledger names to the procedure: only one statement crosses the network, and the server performs all the subsequent postings locally. In a high-volume scenario this improves performance considerably.
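
A sketch of the pass-through call (the procedure name and its parameters are hypothetical):

```foxpro
* One round trip: the server-side procedure posts the debit and the
* matching double-entry credits inside its own transaction.
lnHandle = SQLCONNECT("conAccounts")
lnAmount = 150.00
lcDebit  = "SALES"
lcCredit = "DEBTORS"
SQLEXEC(lnHandle, "EXEC usp_PostLedger ?lnAmount, ?lcDebit, ?lcCredit")
```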

Other examples might involve the creation of a temporary table on the server and running various procedures before returning the final results set to the workstation.

SQL Server can also call extended stored procedures implemented as compiled DLLs on the server to integrate with mail, Internet, or other infrastructure systems in the organisation. This requires only one connection to the resource for all database users and may reduce overall network load.

SQL Pass Through

The server may have extensions to standard SQL that allow better server performance through the use of proprietary language features. If performance is critical, these may improve performance whilst tying the application to the particular database platform.

There are some ODBC escape sequences that can help in creating pass-through queries that work across several servers. These are particularly useful for date arithmetic and left outer joins.

One clever trick to improve performance might be to create long strings of commands that can be sent to the database server as a single command. A series of one hundred INSERT statements separated with semi-colons and sent to the server as a SQL pass through statement might be faster than appending records with a remote view.
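
A sketch of the batching trick (connection, table, and values are illustrative):

```foxpro
* Build one long command string instead of 100 separate round trips.
lnHandle = SQLCONNECT("conSales")
lcSQL = ""
FOR lnI = 1 TO 100
   lcSQL = lcSQL + "INSERT INTO audit_log (seq, msg) " ;
         + "VALUES (" + TRANSFORM(lnI) + ", 'batch test');"
ENDFOR
SQLEXEC(lnHandle, lcSQL)         && one statement, one network transaction
```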


Connections

Some applications will use more than one connection to retrieve data from the server. A grid might retrieve data using one connection, a form using another, and a combo control on a form yet another.

Care should be taken to reduce the connections per user (by using shared connections) as these consume valuable resources on the server. Remember that SQL pass through statements can use the same connection handle as remote views.
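
A sketch of sharing one connection between remote views and pass-through queries (names are illustrative):

```foxpro
* Let remote views share one connection per user...
DBSETPROP("vCustOrders", "View", "ShareConnection", .T.)
USE vCustOrders IN 0

* ...and reuse that same handle for ad hoc pass-through queries.
lnHandle = CURSORGETPROP("ConnectHandle", "vCustOrders")
SQLEXEC(lnHandle, "SELECT COUNT(*) AS n FROM orders", "csCount")
```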

Microsoft Transaction Server

The 3-tier model of programming promoted by Microsoft can work very well in many situations. In this model an application server runs the business logic as small components that client workstations can instantiate remotely. All communication with the database server passes through this middle layer of software components.

Microsoft Transaction Server handles all the software configuration issues and the software behaves as if it is running on the local workstation. This allows Visual Basic or Microsoft Office applications access to complex database functions written in FoxPro without requiring FoxPro to be installed locally. It is also the recommended method for implementing web pages with ASP.
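
A sketch of instantiating a remote business component (the ProgID, server name, and method are hypothetical):

```foxpro
* Instantiate a business component on the application server via
* DCOM; under MTS it runs in the middle tier and talks to the
* database server on the client's behalf.
loOrders = CREATEOBJECTEX("MyApp.OrderServer", "APPSERVER1")
lnTotal  = loOrders.GetInvoiceTotal("INV-1001")   && hypothetical method
```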

However, it is likely that a native FoxPro application will perform better by accessing the ODBC driver directly from the workstation than it would by using a three-tier software architecture to access a remote automation server.
