Full Set of IBM Business Process Manager Classes Available

We have been releasing various courses on IBM Business Process Manager 8.5 lately and have several options for clients using this product.  Which course applies depends a bit on someone's role, so below are the options we have:

Business Analysts:



“Whole Team”:

  • WA2219 BPM Bootcamp with IBM Business Process Manager Advanced 8.5 – We also have a 5 day “bootcamp” class that is intended to show multiple roles the capabilities of the IBM Business Process Manager platform and how to most effectively leverage the many features.  This class combines the full 3 days of the process modeling course and then includes 2 days on the “programming” that would be most commonly used to support process applications.

For those wanting to learn more about how to leverage the features of the IBM BPM platform, it might also help to view the recorded webinar below.  Even though the webinar was originally given for 8.0, it still applies to 8.5 as well.

Webinar – Effective BPM with IBM Business Process Manager 8.0

So for those clients looking to use the many features of IBM Business Process Manager 8.5, let Web Age Solutions help you learn these capabilities!


WebSphere Portal 8.0 Programming Class Released – Now with Mobile!

We have recently updated our WebSphere Portal 8.0 Programming class (WA2089) with some new topics.  The most notable of these is coverage of the new mobile features that IBM has included in WebSphere Portal 8.0.  This includes the WebSphere Portal Mobile theme and “device classes”, a framework provided by WebSphere Portal that lets you easily determine what type of device a user is viewing the portal on and, if desired, adjust the content based on that information.

In recent versions of the class we have also added or expanded coverage of some topics popular with clients which include the following:

  • Spring MVC portlet framework
  • Customizing WebSphere Portal themes and skins (the mechanism for this changed drastically in WebSphere Portal 7.0 and we show the new way)
  • Using the Dojo client-side JavaScript library loaded by WebSphere Portal
  • WebSphere Portal “iWidget” framework for client-side components loaded by the portal
  • Basic WebSphere Portal administration tasks like deploying portlets and creating portal pages
  • WebSphere Portal Personalization Framework

Although basic portlet programming hasn’t changed in several WebSphere Portal versions, there are certainly lots of important things that have changed recently, so this updated WebSphere Portal 8.0 Programming class will help get you up to date!


WebSphere Liberty Profile Webinar – Wednesday, May 28, 2PM Eastern

Although WebSphere Application Server is one of the most robust Java Enterprise Application Servers for deployment of mission-critical applications, it is not always easy to use in development. Often we see clients who deploy to WebSphere in production but use Tomcat or some other server in development because it is "easier". The complaint is that the full WebSphere Application Server takes too long to start or redeploy applications and is not intuitive for developers to configure. To address these issues, IBM has created the WebSphere "Liberty Profile" server. This is a lightweight server, certified for Java EE 6, that starts much faster and is easier to configure.
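To give a flavor of that easier configuration, here is a minimal sketch of a Liberty server.xml (the feature list and port numbers are illustrative of the defaults):

```xml
<!-- Minimal Liberty server.xml sketch; features and ports are illustrative -->
<server description="development server">
  <featureManager>
    <feature>servlet-3.0</feature>
    <feature>jsp-2.2</feature>
  </featureManager>
  <httpEndpoint id="defaultHttpEndpoint" httpPort="9080" httpsPort="9443" />
</server>
```

One short file like this replaces the many configuration panels of the full profile, which is a big part of Liberty's appeal for development.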


In this webinar we will look at the features of the WebSphere Liberty Profile server, how it compares to the "full" WebSphere Application Server, and how you can use it to simplify the development and testing of Java EE applications. We will even show that version 8.5.5 of the WebSphere Liberty Profile adds some intriguing new features that let you run the server as part of a cluster and use it for some production deployment scenarios. We will also highlight FREE Eclipse development tools that are available, since the cost of development tools for WebSphere Application Server has historically been an issue.


Register for the webinar here


Time for Spring …. 4!

On the first day of Spring 2014 (even though some of you may feel winter will never end) I think it is a good time to talk about what is going on with the Spring Java Framework.

Over the past year there have been some big things going on with Spring, chief among them the move in April 2013 to place Spring under the control of the new company GoPivotal.  This spin-off from VMware, along with an investment from GE, is meant to support a new breed of applications where cloud and big data are a given, not an afterthought on a platform not built for them.  With the ever-expanding ecosystem of Spring-related projects, in addition to the immense popularity of the core Spring Framework, Spring seems a natural fit as an application framework to support this.  Probably the most immediate change for those already using the Spring Framework day to day, though, was a new web site for Spring documentation, downloads, resources, etc:


Fast forward to the end of 2013 and we had the release of the Spring 4 Framework in December.  The release of Spring 4 fits nicely with the desire to support more modern applications, as support for many new technologies has been added.  These include support for Java EE 6 & 7, Java SE 8 (which was just released), and more recent versions of many optional third-party libraries.  Spring 4 is also a new foundation for the expanding list of Spring-related projects, the following being just a few key ones to mention now:

  • Spring Boot – Jumpstart on rapid development of Spring applications
  • Spring Data – An umbrella project with a number of useful utilities for popular data access technologies like MongoDB, Hadoop, and JPA
  • Spring XD – Unified system for big data ingestion, analytics, batch processing and export
  • Spring Security – Application security framework
  • Spring Mobile & Spring for Android – Support for developing mobile applications
  • Spring Integration – Implementation of well-known enterprise integration patterns
  • Spring Batch – Comprehensive batch application framework
  • and several more

As you start to look at the Spring 4 Framework and what it can do for you, we at Web Age Solutions would like to assist you in that discovery.  Below are some links to a webinar we will be giving next week on the changes in the Spring 4 release and a link to the new training category we have posted with Spring 4 training classes.

WEBINAR – What’s new in Spring 4, Thursday March 27th 1:30-2:30 PM Eastern

TRAINING CLASSES – Spring 4 Framework Training Classes

Here’s to hoping that your wait for using Spring 4 will not be as long as the wait for Spring 2014 has seemed!



SharePoint Development– Object Model

For SharePoint development we rely on a bunch of classes, collectively known as the Object Model. Using these classes we can perform various activities, e.g. create site collections / sub-sites, libraries, and lists, upload documents, add and delete list items, and create taxonomy / metadata / content types.


The object model comes in two main flavors:

  1. Server Object Model
  2. Client Object Model

Server Object Model

With the Server Object Model we write code in C# / VB.NET and deploy it to the SharePoint server as a .wsp package.  In short, the code runs on the same server where SharePoint is deployed. We can also use PowerShell to access the server object model.

Here’s the server object model hierarchy:

* SPFarm (enumerate services, solutions, CurrentUserIsAdministrator)
    * SPService (represents a service. e.g. Excel Service, InfoPath Form service etc.)
        * SPWebApplication (represents the IIS web application)
            * SPSite (represents site collection)
                * SPWeb (represents site / sub-site)
                    * SPList (represents list / library)
                        * SPListItem (represents list item or document)
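As a rough C# sketch of walking this hierarchy (the site collection URL is a placeholder, and the code must run inside some method on the SharePoint server itself):

```csharp
using System;
using Microsoft.SharePoint;

// Placeholder site collection URL
using (SPSite site = new SPSite("http://sharepoint:100"))
using (SPWeb web = site.OpenWeb())              // root web of the site collection
{
    foreach (SPList list in web.Lists)          // lists and libraries
    {
        foreach (SPListItem item in list.Items) // list items / documents
        {
            Console.WriteLine(item.Title);
        }
    }
}
```

Note the using blocks: SPSite and SPWeb hold unmanaged resources and should always be disposed.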

Here are some more useful classes:

SPField[type] (represents field / column in a list or library. e.g. Link, Choice, …)
SPContentType (represents content type)
SPUser (represents SharePoint user)
SPSecurity (security,  elevated permissions, exception handling)
SPQuery (used for running CAML queries for querying lists and libraries)
SPContext (easier way to retrieve current site)
SPGroup (represents SharePoint security group)
UserProfileManager (useful for manipulating user profiles and mysite)
Taxonomy (Term Store. Hierarchical metadata)
    * TaxonomySession
    * TermStore
    * Group
    * TermSet
    * Term

Client Object Model

The Client Object Model allows us to write code in C# / VB.NET / JavaScript (e.g. jQuery), etc. The code can reside on any machine and can still access SharePoint on a remote server.  Here are some useful client object model classes:

ClientContext (specify the SharePoint site we want to connect to. Pass credentials to the site)
Web (represents an existing site / sub-site)
WebCreationInformation (used for creating a new site /sub-site)
NavigationNodeCreationInformation (used for configuring navigation / the top link bar)
List (represents an existing list or library)
ListCreationInformation (used for creating a new list or library)
ListItemCreationInformation (used for creating a new list item)
ListItem (represents an existing list item)
FileCreationInformation (used for creating a new document in a library)
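A hedged C# sketch showing several of these classes together, creating a list item from a remote machine (the site URL and list title are placeholders):

```csharp
using Microsoft.SharePoint.Client;

// Placeholder site URL; this code can run on any machine with the CSOM assemblies
var ctx = new ClientContext("http://sharepoint:100");
List list = ctx.Web.Lists.GetByTitle("Documents");   // "Documents" is illustrative

var itemInfo = new ListItemCreationInformation();
ListItem item = list.AddItem(itemInfo);
item["Title"] = "Hello from the client object model";
item.Update();

ctx.ExecuteQuery();   // nothing is sent to the server until this call
```

The batching behavior is the key design point: operations are queued locally and only travel over the wire when ExecuteQuery() runs.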



SharePoint–Some useful OOB Application Pages

SharePoint comes with a boatload of application pages that reside in the 14 or 15 hive folder structure (depending on the SharePoint version).


For example, in SharePoint 2010 you can find them in:

%CommonProgramFiles%\Microsoft Shared\Web Server Extensions\14\TEMPLATE\LAYOUTS

In this hive folder you will find a bunch of .aspx pages. These pages are used by “Site Settings” and similar features. You can also create your own custom application pages using Visual Studio. A business case for a custom application page would be a form that allows end-users / site admins to submit a request for creating a departmental sub-site or a set of libraries / lists with custom content types.

Here are some useful OOB application pages along with sample scenarios where you can use them.

Signing out the user

Say you want to create a custom link that allows the user to sign out. You can do so by using this url:
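In SharePoint 2010 the out-of-the-box sign-out page lives in the layouts folder, so the link target looks like this (the host name is a placeholder):

```
http://<site>/_layouts/SignOut.aspx
```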


Signing in as a different user
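In SharePoint 2010 this is handled by the closeConnection application page (the host name is a placeholder):

```
http://<site>/_layouts/closeConnection.aspx?loginasanotheruser=true
```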


Display access denied page

Say you have some code in Visual Studio and the operation you are trying to perform isn’t valid for the current user due to lack of permissions. We can redirect the user to access denied page.
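The out-of-the-box access denied page can serve as the redirect target (the host name is a placeholder):

```
http://<site>/_layouts/AccessDenied.aspx
```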


Removing faulty web parts from a web page

When you add web parts to a page, sometimes due to unhandled exceptions or configuration issues a web part can cause the whole page to become inaccessible. To remove the faulty web part we can use the following application page:
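The Web Part Maintenance page is reached by appending ?contents=1 to the URL of the broken page (the host name and page are placeholders):

```
http://<site>/SitePages/Home.aspx?contents=1
```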


Here /SitePages/Home.aspx is the relative URL of the page that contains the faulty web part.


Using the k-Nearest Neighbors Algorithm in R

k-Nearest Neighbors is a supervised machine learning algorithm for object classification that is widely used in data science and business analytics.

In this post, I will show how to use R’s knn() function, which implements the k-Nearest Neighbors (kNN) algorithm, in a simple scenario that you can extend to cover more complex and practical scenarios. R is free and kNN has not been patented by some evil patent trolls (“patent assertion entities”), so there are no legal or other restrictions on going ahead with the demonstration.
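As a teaser, here is a minimal sketch of calling knn() from R's class package (the data points and the choice of k are made up purely for illustration):

```r
library(class)   # provides knn()

# Illustrative data: two training points per class, one test point
train  <- rbind(c(1, 1), c(1.2, 0.8), c(5, 5), c(5.1, 4.9))
labels <- factor(c("A", "A", "B", "B"))
test   <- rbind(c(1.1, 0.9))

# Predicted class of the test point, by majority vote of its k nearest neighbors
knn(train, test, cl = labels, k = 3)
```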


SharePoint–Displaying Site Collection Size Using PowerShell

Let’s say we have to display a SharePoint site collection’s size using PowerShell. We can easily use the SharePoint Management Shell cmdlets (server object model) to accomplish that. Let’s do it step by step:


PS > $spsite = Get-SPSite "http://sharepoint:100"

PS > $spsite.Usage

It will display data like this:


Hmmm, interesting. It shows the size and also some other usage information like visits and bandwidth. But we just care about the size, so let’s modify the code:

PS > $spsite.Usage.Storage

Now the output looks like this:


Ok, we have the size but it’s in bytes. Let’s convert that to MB.

PS > $spsite.Usage.Storage / 1000000

or alternatively we can use this code:

PS > $spsite | select @{Name="Size"; Expression={$_.Usage.Storage / 1000000 }}

The output looks like this:


Still doesn’t look good. Now we can use the expression formatting to make it human friendly:

PS > $spsite | select @{Name="Size"; Expression={"{0:N2} MB" -f ($_.Usage.Storage/1000000)}}

Bingo, now it shows up as:

3.51 MB

{0:N2} is a standard .NET format string. The 0 is the argument position and N2 means a number formatted with 2 decimal places. The -f operator passes the argument to the format string.


Data Quality Services (DQS)–SQL Server 2012

Business Intelligence is all about making better decisions, and decisions can only be as accurate as the data we have at the time of decision making. Data quality is mostly about ensuring there are no typos in the data. E.g. if a province name is “British Columbia” then it should be consistent everywhere, not sometimes “BC”. Data can become inconsistent when, e.g.:

  1. Data is received in a txt / csv / Excel / xml file where an operator entered the data by hand.
  2. The data entry application has lots of open-ended text boxes without field validation in place.
  3. Data is received from a 3rd party company that has different conventions.

DQS makes life a lot easier for both data stewards and developers. A data steward is a person responsible for manually fixing the typos. A developer is a person who can use a bunch of tools to automate data cleansing. DQS provides tools that cater to both.

DQS comes with various components:

  1. A server-side knowledgebase.
  2. A client-side tool for creating the knowledgebase and adding data to it.
  3. An SSIS Data Cleansing Transformation that SSIS developers can use to automate data cleansing.

Working with DQS involves several steps:

Step 1: Create Knowledgebase

A knowledgebase is a repository for domains. A domain is basically a column or field for which you define the possible values. To create the knowledgebase we can use the DQS Client tool, which looks like this:

[Screenshot: DQS Client tool]

Step 2: Add Domains to the Knowledgebase

Add at least one domain to the knowledgebase. Here I have added two domains.

[Screenshot: two domains added to the knowledgebase]

The knowledgebase resides in SQL Server databases that are created by DQS, usually DQS_MAIN, DQS_PROJECTS, and DQS_STAGING_DATA.


Step 3: Populate Domains

After creating the domains we have to add data to them, either manually or by importing it from an Excel spreadsheet or SQL Server. If you want to import data from Oracle or some other source, you would first convert it to Excel / SQL using SSIS or any ETL tool. Here I have populated the domain by hand:

[Screenshot: domain values entered manually]

If you want to import data from Excel / SQL, you can use the following option:

[Screenshot: import option for domain values]

Step 4: Perform Data Cleansing

Once we have the knowledgebase in place we can perform data cleansing, either manually or automated with SSIS. To clean data manually we can use the same DQS client tool that we used for creating the knowledgebase: create a project and specify Excel / a SQL database as the source. When we run the project it will output both the original column value and the fixed column value. From there we can choose to save the fixed data to Excel / a SQL database.

[Screenshot: cleansing project results in the DQS client]

If you want to automate data cleansing as part of your ETL operation, you can use the DQS Cleansing Transformation, which is new in SSIS 2012. Here's how it looks:

[Screenshot: DQS Cleansing Transformation in an SSIS data flow]

The DQS Cleansing Transformation makes use of the knowledgebase that we defined earlier. It's never 100% automation, since we do have to update the knowledgebase whenever new typos are discovered. It's an ongoing activity, and over time the knowledgebase matures enough to fix most of the typos.

DQS Pattern Matching

On top of fixing typos, DQS can also be used for finding patterns, e.g. in a mailing address that's a combination of various fields like street number, avenue, and postal code. I will leave the details for some other time.


SQL Server 2012 received the first iteration of DQS and it's really cool. It makes fixing typos a lot easier for both data stewards and developers. It will be exciting to see further improvements in the next version.



Transparent Data Encryption in SQL Server 2008, 2008R2, 2012 (Part 2)

If you haven’t read Part 1 of this series, you can read it here. Data encryption allows us to encrypt data at the field level. It’s done by developers, either in stored procedures or in in-line T-SQL code.

Transparent Data Encryption (TDE) allows DBAs to encrypt a whole database. Once TDE is enabled at the database level, data is transparently encrypted and there’s little noticeable performance overhead. Developers don’t have to write any code on their end to encrypt / decrypt data. TDE encrypts both the database data files and the backups. TDE sort of binds the database to the SQL instance where TDE was enabled, not to the physical machine.  This means that if someone steals the mdf / ndf / ldf files or the .bak files, it’s pointless, since no one can attach / restore those files on any other instance of SQL Server.

Before we discuss the technical details of enabling TDE it’s very important to know that both data encryption and TDE secure the data at the physical layer, i.e. the hard drive. Data is not encrypted while it’s being transmitted over the network. For encrypting the communication / network layer we have to encrypt the connections to the database engine, which is primarily done using Configuration Manager and certificates. I will leave network / communication level encryption for some other day. For now we are going to encrypt data at the physical layer, so that if the physical files (mdf / ndf / ldf / bak) are stolen or “misplaced” they aren’t usable on any other instance / machine.

The overall process for enabling TDE requires creating a master key and certificates and turning on various TDE-related options. Here I will guide you through the whole process. If you are following along then please ensure you perform the steps on a test / dev machine. DISCLAIMER: If these steps fry your machine, make you bald headed or get your dog abducted by aliens then I won’t be held responsible.

Step 1:

To work with TDE your first step is to create a master key at the SQL instance level, i.e. in the master database. The master key is used for creating certificates at the instance level.

USE master
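A sketch of the statement itself (the password shown is a placeholder; use your own strong one):

```sql
-- Create the database master key in the master database
-- (password is a placeholder)
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'MasterKeyPa$$w0rd';
```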




Step 2:

Create the certificate that will be used for enabling TDE. Note: the certificate is initially created in the master database and is then used by a user database.
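A sketch, using the TestCert name that the backup statement in Step 3 refers to (the subject text is illustrative):

```sql
USE master;
GO
-- Certificate protected by the master key created in Step 1
CREATE CERTIFICATE TestCert WITH SUBJECT = 'TDE Certificate';
```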



Step 3:

Now, before we proceed with the actual encryption, we should back up the certificate and its private key right away. Otherwise, if we lose the instance hosting the TDE-enabled database, the database will be nothing more than a virtual paperweight, since we can’t decrypt it without the right certificate. Notice we are backing up the private key and encrypting that backup with a password, so that if the key backup is stolen it can’t be used without the password.

BACKUP CERTIFICATE TestCert TO FILE = 'D:\backup\cert.bak'

WITH PRIVATE KEY ( FILE = 'D:\backup\key.bak' , ENCRYPTION BY PASSWORD = 'KeyPa$$w0rd' )

Step 4:

Now we are going to create the database encryption key in our custom database (SalesDB) using the certificate that was created at the instance level. The encryption key in our database is protected by the certificate that lives at the instance level, and this is what sort of binds the database to the instance.
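A sketch of the statement (AES_256 is an illustrative, commonly chosen algorithm):

```sql
USE SalesDB;
GO
-- Database encryption key protected by the instance-level certificate
CREATE DATABASE ENCRYPTION KEY
WITH ALGORITHM = AES_256
ENCRYPTION BY SERVER CERTIFICATE TestCert;
```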







Step 5:

Time to flip the switch and turn on TDE at the database level.
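For the SalesDB example from the previous step, the switch is a one-line statement:

```sql
ALTER DATABASE SalesDB SET ENCRYPTION ON;
```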




Houston, we have liftoff. TDE has been enabled. All applications will be able to access data from the database seamlessly, without requiring manual encryption / decryption. To test TDE, try these:

1. Take the database offline, copy the mdf / ldf files to some other instance, and try to attach them. The attach will fail because the certificate is not found on that instance.

2. Take a backup of the TDE-enabled database and try to restore it on some other instance. Again you will be out of luck and won’t be able to restore, since the second instance doesn’t have the certificate that was used for encrypting the data.


Restoring certificate / legitimately copying TDE enabled database to another instance

Say there was a disaster and the whole SQL instance where the TDE-enabled database was residing is lost. You will end up installing SQL Server and then want to restore your TDE-enabled database. Or say you want to legitimately copy the TDE-enabled database to some other SQL Server instance. How do we accomplish that? We have to ensure that the certificate that was used to encrypt the database exists at the instance level. Here are the steps required to successfully restore / copy the database onto another instance.


Step 1:

We have to restore the certificate that was backed up previously. (If the new instance doesn’t yet have a master key in the master database, create one first as in Step 1.)

USE master


CREATE CERTIFICATE TestCert FROM FILE = 'D:\backup\cert.bak' WITH PRIVATE KEY ( FILE = 'D:\backup\key.bak', DECRYPTION BY PASSWORD = 'KeyPa$$w0rd' )


And that should do it. Now you should be able to attach your TDE-enabled mdf / ldf files or restore your .bak file.


TDE is not for me. Get me out of here

If you don’t like TDE or if you want to disable TDE at the database level then run the following script:
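As a sketch, again for the SalesDB example:

```sql
-- Turn off TDE; wait for decryption to finish before dropping the key
ALTER DATABASE SalesDB SET ENCRYPTION OFF;
GO
USE SalesDB;
GO
DROP DATABASE ENCRYPTION KEY;
```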





TDE Pros & Cons

Pros:

  1. Encrypts the whole database and sort of binds it to the instance.
  2. There’s almost no performance overhead.
  3. Encryption / decryption is done seamlessly by SQL Server. No manual coding is required.

Cons:

  1. It doesn’t prevent “man in the middle” attacks. Data is not encrypted while it’s being transmitted over the network.
  2. Requires Enterprise edition.
  3. Requires discipline on behalf of the DBA. The certificate backup is mandatory; without it there’s no way to recover the TDE-enabled database.
  4. Older hardware might show some performance overhead since encryption / decryption is taking place in the background.
