Generate PDF with ASP.NET and iTextSharp


PDF is a popular file format for documents. Due to its ubiquity and layout capabilities, it is not uncommon for websites to use PDF. A variety of .NET libraries are available to programmatically create PDF documents. Perhaps the most popular is iTextSharp, which is also free to use.

Using iTextSharp we can render HTML, CSS and images into a PDF document. In this article I use iTextSharp to print a DataList in PDF format. The DataList is loaded dynamically with values containing text and images to print in the PDF document.

Below is the code:

Web Form: PDF.aspx

Below is the DataList whose content is rendered directly into an HtmlTextWriter object. Sometimes you may need to display the DataList differently on the web page and in the PDF. In that case you can keep a duplicate DataList control with its visibility turned off, formatted with the HTML you want in the PDF.

<asp:DataList ID="m_ctlPdfGuide_DataList" runat="server">
    <ItemTemplate>
        <asp:Label CssClass="largeName" runat="server" ID="m_ctlHotelName" Text='<%#Eval("BrochureItemName")%>'></asp:Label>
        <asp:Image ID="m_ctlBrochureItemImage" Height="50px" Width="100px" ImageUrl='<%#GetImageUrlPDF(Eval("BrochureItemImageName"))%>' Visible="true" runat="server" />
        <asp:Label ID="Label1" CssClass="address" runat="server" Text='<%#Eval("BrochureItemAddress")%>'></asp:Label>
        <asp:Label ID="Label2" CssClass="description" runat="server" Text='<%#Eval("BrochureItemDescription")%>'></asp:Label>
    </ItemTemplate>
</asp:DataList>

<asp:Button ID="m_ctlPrintGuide_Button" CssClass="search_btn" runat="server" Text="Save Guide" OnClick="m_ctlPrintGuide_Button_Click" ToolTip="Save Guide To Your Desktop" />


Code-Behind: PDF.CS

On button click we create the PDF document, which the user can then save anywhere, such as the desktop.

protected void m_ctlPrintGuide_Button_Click(object sender, EventArgs e)
{
    string attachment = "attachment; filename=TravelGuide.pdf";
    Response.AddHeader("content-disposition", attachment);
    Response.ContentType = "application/pdf";

    // Rendering DataList into HtmlTextWriter
    StringWriter swHtml = new StringWriter();
    HtmlTextWriter hTextWriter = new HtmlTextWriter(swHtml);
    m_ctlPdfGuide_DataList.RenderControl(hTextWriter);

    try
    {
        using (Document doc = new Document(PageSize.A4))
        using (PdfWriter w = PdfWriter.GetInstance(doc, Response.OutputStream))
        {
            // Set margins on the document (must happen before Open)
            doc.SetMargins(0f, 8f, 8f, 10f);
            doc.Open();

            // Getting city name from Session
            string cityName = string.Empty;
            if (!string.IsNullOrEmpty(SessionUtil.GetCurrentCityNameFromSession()))
                cityName = " to " + SessionUtil.GetCurrentCityNameFromSession();

            // Add guide header
            Chunk c = new Chunk("Your Guide" + cityName + "\n", FontFactory.GetFont("Verdana", 15));
            Paragraph p = new Paragraph();
            p.Alignment = Element.HEADER;
            p.Add(c);
            doc.Add(p);

            // Add columns
            MultiColumnText columns = new MultiColumnText();
            columns.AddRegularColumns(36f, doc.PageSize.Width - 36f, 25f, 2);

            // Apply CSS classes used in the DataList for style
            StyleSheet style = new StyleSheet();
            style.LoadTagStyle(HtmlTags.IMG, HtmlTags.WIDTH, "220px");
            style.LoadTagStyle(HtmlTags.IMG, HtmlTags.HEIGHT, "80px");
            style.LoadStyle("address", "style", "font-size: 8px; text-align: justify; font-family: Arial, Helvetica, sans-serif;");
            style.LoadStyle("largeName", "style", "font-size: 10px; text-align: justify; font-family: Arial, Helvetica, sans-serif;");
            style.LoadStyle("description", "style", "font-size: 8px; text-align: justify; font-family: Arial, Helvetica, sans-serif;");

            // Reading the rendered DataList markup and writing it to the PDF document
            using (StringReader sr = new StringReader(swHtml.ToString()))
            {
                List<IElement> list = HTMLWorker.ParseToList(sr, style);
                foreach (IElement elm in list)
                    columns.AddElement(elm);
            }

            doc.Add(columns);
            doc.Close();
        }
    }
    catch (Exception ex)
    {
        //throw new ApplicationException("Oops! Error occurred while downloading the Travel Guide! " + ex.ToString());
    }
    finally
    {
        swHtml = null;
        hTextWriter = null;
    }
}



When the document is created, it looks like the screenshot below.


Happy Coding!!

Internationalization of application

The concept:
The process of making an application ready for multi-lingual support is called internationalization of the application. It includes the following three steps:
  1. Globalization is the process of designing and developing applications that function for multiple cultures. Globalization underlines the fact that the application needs to be culture-neutral: nothing related to culture is hard-coded. The primary task in this phase is to identify the various locale-sensitive resources and to isolate them from the executable code. The following are key points to make your application globalized:
    1. Do not hard-code culture-specific values such as date and currency formats.
    2. Do not hard-code text; put it in resource files.
  2. Localizability: an application that has been globalized must be tested to ensure that its executable code is independent of culture- and language-specific data. This is called localizability testing. Its focus is to check how the application adapts itself to different locales.
  3. Localization is the process of customizing your application for a given culture and locale. Localization consists primarily of translating the user interface.
Let's now talk about implementation:
ASP.NET enables you to create a page that can obtain content and other data based either:
  1. on the preferred language setting of the browser, or
  2. on the user's explicit choice of language.
Content and other data are referred to as resources, and such data can be stored in resource files or other sources. In an ASP.NET web page, you configure controls to get their property values from resources. At run time, the resource expressions are replaced by resources from the appropriate resource file.
Resource files:
A resource file is an XML file that contains the strings that have to be translated into different languages, or paths to images. The resource file contains key/value pairs. Each pair is an individual resource.
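For example, a single key/value pair in a .resx file looks like the fragment below (the key name and value are illustrative; a real .resx file also contains header and schema elements around its <data> entries):

```xml
<!-- One resource: the key is "Greeting", the value is the localizable text -->
<data name="Greeting" xml:space="preserve">
  <value>Welcome to our site!</value>
</data>
```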
A separate resource file has to be created either
  1. For each language (Hindi and English)
  2. For each language and culture (UK English, US English)
Resource files in ASP.NET have a .resx extension. At run time, the .resx file is compiled into an assembly, which is sometimes referred to as a satellite assembly. Because the .resx files are compiled dynamically, like ASP.NET web pages, you do not have to create the resource assemblies yourself.
When you create resource files, you start by creating a base .resx file. For each language that you want to support, create a new file that has the same file name. But in the name, include the language or the language and culture (culture name). For example:
  1. WebResources.resx - The base resource file. This is the default (fallback) resource file.
  2. WebResources.en.resx - A resource file for English. This is also called a neutral-culture file.
  3. WebResources.en-GB.resx - A resource file specifically for English (United Kingdom).
Please note: Culture is used to format and localize non-UI items such as dates, numbers and currency, while UICulture specifies the localized UI information (which UI resource set to use); it controls resource loading, translated text and localized control properties such as color or font.
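A common setup is to let ASP.NET pick both values automatically from the browser's preferred language. As a sketch, the Web.config fragment for that looks like:

```xml
<!-- Web.config (fragment): "auto" takes the culture from the browser's
     Accept-Language header at the start of each request -->
<system.web>
  <globalization culture="auto" uiCulture="auto" />
</system.web>
```

To honour a user's explicit choice of language instead, you can override the page's InitializeCulture() method and set Culture/UICulture there before the page loads.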
Generating resource files:
Resource files are created based on the scope required.
  1. Global resource file - To create a global resource file, put the resource file in the App_GlobalResources folder at the root of the application. These files can be read from any page or code in the application.
  2. Local resource file - To create a local resource file, we may put files either in the reserved folder App_LocalResources or in any other folder in the application. You associate a set of resource files with a specific web page by using the name of the resource file. For example, for a page named index.aspx:
  1. Index.resx - The base resource file. This is the default (fallback) resource file.
  2. Index.en.resx - A resource file for English. This is also called a neutral-culture file.
  3. Index.en-GB.resx - A resource file specifically for English (UK).
You can use any combination of global and local resource files in the web application. Generally, you add resources to a global resource file when you want to share them between pages. However, global resource files can become large if you store all localized resources in them, and they can be more difficult to manage if more than one developer is working on different pages but in a single resource file; in that case local resources are preferable.
Using resource files in Pages:
There are two ways to use resource files in web pages:
  1. Implicit localization - it works with local resources and lets you automatically set control properties to matching resources.
  2. Explicit localization - it lets you use a resource expression to set a control property to a specific resource in a local or global resource file.
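As a sketch, the two approaches look like this in page markup (the control names and resource keys here are illustrative):

```aspx
<%-- Implicit localization: the control's localizable properties are filled
     from the page's local resource file (e.g. Index.aspx.resx), matched by
     the meta:resourcekey value, such as an entry named lblWelcome.Text --%>
<asp:Label ID="lblWelcome" runat="server" meta:resourcekey="lblWelcome" />

<%-- Explicit localization: a resource expression pulls one named resource
     from the global WebResources file --%>
<asp:Label ID="lblGreeting" runat="server"
           Text="<%$ Resources:WebResources, Greeting %>" />
```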

Unique Visitor Counter In Asp.Net


It is sometimes useful to display the number of visits a page on a web site has received, using a counter displayed as part of the page's content. While adding a counter to a page written in ASP or ASP.NET is relatively straightforward, adding a counter to a plain HTML page (no scripting of any kind) is a little different.

This article describes a counter I created (see the following figure) using ASP.NET. The code also maintains its own little "database" so that you don't have to FTP to your web server to create a new counter. I designed and use this counter for my site listings and various other pages; it uses an XML file to store the number of counts.

The counter (a "Visitor Count" display) looks like the image below.


Creating The Counter:

First, create a project in Visual Studio and add an XML file named HitCounter.xml to store the hit counts. In the Global.asax file, write the following code:

using System;
using System.Data;
using System.Configuration;
using System.Collections;
using System.Web;
using System.Web.Security;
using System.Web.SessionState;

namespace HitCounter
{
    public class Global : System.Web.HttpApplication
    {
        protected void Application_Start(object sender, EventArgs e)
        {
            // Code that runs on application start
        }

        protected void Application_End(object sender, EventArgs e)
        {
            // Code that runs on application shutdown
        }

        void Session_Start(object sender, EventArgs e)
        {
            // Get & increment visitor count
            UserVisitCount();

            DataSet tmpDs = new DataSet();
            try
            {
                // The XML "database" is assumed to live at the application root
                tmpDs.ReadXml(Server.MapPath("~/HitCounter.xml"));
            }
            catch (System.IO.FileNotFoundException)
            {
                // no need to throw exception for this
            }

            // set in Session
            Session["hits"] = tmpDs.Tables[0].Rows[0]["hits"].ToString();
        }

        /// <summary>
        /// Increments visit count on every user session
        /// </summary>
        private void UserVisitCount()
        {
            try
            {
                DataSet tmpDs = new DataSet();
                tmpDs.ReadXml(Server.MapPath("~/HitCounter.xml"));

                // read hit count
                int hits = Int32.Parse(tmpDs.Tables[0].Rows[0]["hits"].ToString());

                hits += 1;

                // write hit count
                tmpDs.Tables[0].Rows[0]["hits"] = hits.ToString();
                tmpDs.WriteXml(Server.MapPath("~/HitCounter.xml"));
            }
            catch (System.IO.FileNotFoundException)
            {
                // no need to throw for this
            }
        }
    }
}

After writing this code, add a user control to render the hit counter and name it HitCounterControl.ascx. In the markup and the code-behind file, write the following code to generate the table that shows the counter on the web form.


<%@ Control Language="C#" AutoEventWireup="true" CodeBehind="HitCounterControl.ascx.cs"
    Inherits="RebelsWeb.SiteControls.HitCounterControl" %>
<div style="padding-top: 15px; text-align: center">
    <span style="font-size:small">Visitor Count </span>
    <asp:Literal ID="ltlCounter" runat="server"></asp:Literal>
</div>


using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Text;

namespace RebelsWeb.SiteControls
{
    public partial class HitCounterControl : System.Web.UI.UserControl
    {
        int totalDigits = 6;

        protected void Page_Load(object sender, EventArgs e)
        {
            if (!IsPostBack)
            {
                ltlCounter.Text = FormatHTMLCounter(Session["hits"].ToString());
            }
        }

        /// <summary>
        /// Calculates and formats the visitor count number
        /// </summary>
        /// <param name="visitCount"></param>
        /// <returns></returns>
        private string FormatHTMLCounter(string visitCount)
        {
            int noOfDigits = visitCount.Length;
            int zeroesToPrefix = totalDigits - noOfDigits;
            StringBuilder strCounterHtml = new StringBuilder();
            strCounterHtml.Append(@"<table align=""center"" style=""width: 120px; background-color: Black; color: White; text-align: center; font-weight: bold;""><tr>");

            // pad with leading zeroes so the counter is always totalDigits wide
            for (int i = 0; i < zeroesToPrefix; i++)
                strCounterHtml.Append(@"<td style=""border:1px solid white; padding:2px"">0</td>");

            // one cell per digit of the actual visit count
            char[] visitCountChar = visitCount.ToCharArray();
            for (int i = 0; i < visitCountChar.Length; i++)
                strCounterHtml.Append(@"<td style=""border:1px solid white; padding:2px"">" + visitCountChar[i] + "</td>");

            strCounterHtml.Append("</tr></table>");
            return strCounterHtml.ToString();
        }
    }
}
Now you can place this user control on any web form to display the counter.


This article described how to create a fast, easy-to-use page hit counter. The code uses ASP.NET and the C# programming language, creates new counters automatically, and prevents users from artificially inflating the value of the counter.

Agra & Mathura tour


Perhaps for the first time, my wife and I toured so many places in one go. We visited the following places:

  • Taj Mahal
  • Agra Fort
  • Dwarkadhish - Mathura
  • Yamuna Mandir - Mathura
  • Janmbhoomi - Mathura
  • Brindavan

It was a great tour and we enjoyed it a lot. There were many experiences which I think would be useful for people reading this, and perhaps for myself as well in future. If you only have time to read some useful instructions, read just the bolded ones below.

Firstly, we went to Agra. The first place we visited was the Taj Mahal, in the morning. It being the weekend, we had to wait in the queue for about 2 hours to get into the Taj Mahal. Anyway, the wait was fruitful, since we got to see one of the wonders of the world. Even though I had my camera, I went for some professional photos by a licensed photographer inside, at Rs. 40 each. Then we visited Agra Fort; it is one of the largest red forts I have ever seen. Then we visited Sadar Bazar, which is a good place to shop.

Do not forget to take the famous Agra petha and dal moth from Sadar Bazar. Make sure to buy them only from the original Panchi Petha House, since you will find identically named petha stores all over the place there.

Then we went to Mathura. There I kept my bag at the railway station cloak room. It's a good and safe option, but do not forget to have your bags locked, otherwise they won't accept them. In Mathura, you will have to arrive before 11 am to visit Dwarkadhish or Janmbhoomi. Both are closed by 11 am and then reopened in the evening at 4:00 PM. To reach Dwarkadhish, take a cycle rickshaw, since auto rickshaws are not allowed to go close to it. We had food in a Gujarati restaurant named "Babu bhai", just near Dwarkadhish. The food was really good and home-like, at Rs. 50 each. You should try that as well. Janmbhoomi has a really great Krishna mandir to visit. You will need to leave your belongings, including camera and mobile, in the cloak room outside the mandir. Don't worry, it's really safe there.

Just near the Janmbhoomi mandir, there is a Brijwasi Peda Store; take pedas only from there. Again, there are lots of Brijwasi peda stores there, but only one is the original: it's the big one with glass doors that looks like a showroom of pedas, going by the standards of the other peda stores there.

Then we went to Brindavan, to the Bankebihari mandir. Unfortunately, we could not get into the mandir due to heavy rain and water logging, and had to turn back from the mandir door.

I hope you find some useful information from my experiences here while travelling to Agra and Mathura.

Thanks for reading this!

Creating a Windows Service Application

Creating a Service

To create and configure your service
  1. On the File menu, click New Project.
    The New Project dialog box opens.
  2. Select the Windows Service project from the list of Visual Basic, Visual C#, or Visual C++ project templates, and name it MyNewService. Click OK.
  3. Click the designer to select Service1. Then, in the Properties window, set the ServiceName and the (Name) property for Service1 to MyNewService.
  4. Set the AutoLog property to true.
  5. On the View menu, click Code to open the Code Editor. Edit the Main method to create an instance of MyNewService. When you renamed the service in step 3, the class name was not modified in the Main method. In Visual C# applications, the Main method is located in the Program.cs file.

    // Change the following line to match.
    ServicesToRun = new System.ServiceProcess.ServiceBase[]
       { new MyNewService() };

Adding Features to the Service

In the next section, you add a custom event log to the Windows service. Event logs are not associated in any way with Windows services. Here the EventLog component is used as an example of the type of component you could add to a Windows service. To add custom event log functionality to your service
  1. In Solution Explorer, right-click Service1.vb or Service1.cs and select View Designer.
  2. From the Components tab of the Toolbox, drag an EventLog component to the designer.
  3. In Solution Explorer, right-click Service1.vb or Service1.cs and select View Code.
  4. Edit the constructor to define a custom event log.

    public MyNewService()
    {
        InitializeComponent();
        if (!System.Diagnostics.EventLog.SourceExists("MySource"))
        {
            System.Diagnostics.EventLog.CreateEventSource("MySource", "MyNewLog");
        }
        eventLog1.Source = "MySource";
        eventLog1.Log = "MyNewLog";
    }

To define what occurs when the service starts
  • In the Code Editor, locate the OnStart method that was automatically overridden when you created the project, and write code to determine what occurs when the service starts running:

    protected override void OnStart(string[] args)
    {
        eventLog1.WriteEntry("In OnStart");
    }

To define what occurs when the service is stopped
  • In the Code Editor, select the OnStop procedure from the Method Name drop-down list, which was automatically overridden when you created the project. Write code to determine what occurs when the service is stopped:

    protected override void OnStop()
    {
        eventLog1.WriteEntry("In OnStop.");
    }

Some custom actions have to occur when a Windows service is installed, which can be done by the Installer class. Visual Studio can create these installers specifically for a Windows service and add them to your project.

To create the installers for your service

  1. In Solution Explorer, right-click Service1.vb or Service1.cs and select View Designer.

  2. Click the background of the designer to select the service itself, instead of any of its contents.

  3. With the designer in focus, right-click, and then click Add Installer.

    By default, a component class that contains two installers is added to your project. The component is named ProjectInstaller, and the installers it contains are the installer for your service and the installer for the service's associated process.

  4. In Design view for ProjectInstaller, click ServiceInstaller1 or serviceInstaller1.

  5. In the Properties window, set the ServiceName property to MyNewService.

  6. Set the StartType property to Automatic.

  7. In the designer, click ServiceProcessInstaller1 (for a Visual Basic project), or serviceProcessInstaller1 (for a Visual C# project). Set the Account property to LocalSystem. This will cause the service to be installed and to run on a local service account.

To build your service project

  1. In Solution Explorer, right-click your project and then click Properties. The project's Property Designer appears.

  2. On the Application page, from the Startup object list, click MyNewService.

  3. Press CTRL+SHIFT+B to build the project.

Now that the project is built, it can be deployed. A setup project will install the compiled project files and run the installers that are required to run the Windows service. To create a complete setup project you will have to add the project output, MyNewService.exe, to the setup project and then add a custom action to have MyNewService.exe installed. For more information about setup projects, see Setup Projects. For more information about custom actions, see Walkthrough: Creating a Custom Action.

To create a setup project for your service

  1. In Solution Explorer, right-click to select your solution, point to Add, and then click New Project.

  2. In the Project Types pane, select the Setup and Deployment Projects folder.

  3. In the Templates pane, select Setup Project. Name the project MyServiceSetup. Click OK.

    A setup project is added to the solution.

Next you will add the output from the Windows service project, MyNewService.exe, to the setup.

To add MyNewService.exe to the setup project

  1. In Solution Explorer, right-click MyServiceSetup, point to Add, and then click Project Output.

    The Add Project Output Group dialog box appears.

  2. MyNewService is selected in the Project box.

  3. From the list, select Primary Output, and click OK.

    A project item for the primary output of MyNewService is added to the setup project.

Now add a custom action to install the MyNewService.exe file.

To add a custom action to the setup project
  1. In Solution Explorer, right-click the setup project, point to View, and then click Custom Actions.

    The Custom Actions editor appears.

  2. In the Custom Actions editor, right-click the Custom Actions node and click Add Custom Action.

    The Select Item in Project dialog box appears.

  3. Double-click the Application Folder in the list to open it, select Primary Output from MyNewService (Active), and click OK.

    The primary output is added to all four nodes of the custom actions — Install, Commit, Rollback, and Uninstall.

  4. In Solution Explorer, right-click the MyServiceSetup project and click Build.

To install the Windows Service

  1. To install MyNewService.exe, right-click the setup project in Solution Explorer and select Install.

  2. Follow the steps in the Setup Wizard. Build and save your solution.

Postback vs Callback


A postback occurs when the data (the whole page) is posted from the client to the server; the data is posted back to the server, and thus the page is refreshed (redrawn). Think of it as 'sending the server the whole page, full of data'.

On the other hand, a callback is a special kind of postback: it is just a quick round-trip to the server to get a (normally) small set of data, and thus the page is not refreshed, unlike with the postback. Think of it as 'calling the server and receiving some data back'.

With ASP.NET, the ViewState is not refreshed when a callback is invoked, unlike with a postback.

The reason the whole page is posted with ASP.NET is that ASP.NET encloses the whole page in a <form> with a post method, so when a submit button is clicked, the form is sent to the server with all of the fields it contains - basically the whole page itself.

If you are using Firebug (for Firefox), you can actually see callbacks being invoked to the server in the console. That way, you will see what specific data is being sent to the server (the request) and also the data the server sends back (the response).

The image below illustrates the page life cycles of both a postback and a callback in an ASP.NET-based website:

ASP.NET Page Life Cycles


Thus, a postback occurs when a request is sent from the client to the server for the same page as the one the user is currently viewing. When a postback occurs, the entire page is refreshed and you can see the typical progression on the progress bar at the bottom of the browser.

A callback, generally used with AJAX, occurs when a request is sent from the client to the server such that the page is not refreshed; only a part of it is updated, without any flickering in the browser.
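In ASP.NET Web Forms, this kind of callback is exposed on the server through the ICallbackEventHandler interface. The sketch below is illustrative (the class name and the echoed string are made up); it shows only the server side of the round trip, with the client side wired up via ClientScript.GetCallbackEventReference:

```csharp
// Sketch: a page that answers client callbacks without a full postback.
public partial class PricePage : System.Web.UI.Page,
                                 System.Web.UI.ICallbackEventHandler
{
    private string result;

    // Called first, with the argument the client script sent up
    public void RaiseCallbackEvent(string eventArgument)
    {
        result = "You asked about: " + eventArgument;
    }

    // Called next; its return value goes back to the client script.
    // The page itself is never re-rendered and ViewState is untouched.
    public string GetCallbackResult()
    {
        return result;
    }
}
```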

Mobile: WAP vs. Native Application Development (Apps)


There are development platform options to consider when building a mobile application. Here is a comparison of WAP vs. native platform solutions (apps), including areas to consider not only at initial deployment but also post-launch.

WAP vs. Native Application Decision (Apps)
Apps Advantages:
  • Library update
  • Direct technical support
  • User has more control
  • App store and device portal solution
  • Can apply existing User Interface (UI) standard for mobile users
  • Better UI design result, can take full advantage of each platform
  • Data persistence – data can be available even with no network connection; more options for caching data

WAP Advantages:

  • Open Source solution
  • One programming language or solution could be applied for different browsers
  • Centralized – more control – easy to update
  • UI design takes less time and code to implement
  • Fast development
  • No code size and memory limitations
  • Better integrated tracking
Apps Disadvantages:
  • Frequent library updates – Fluid technology
  • Not all are open-source solutions
  • Different programming languages
  • Different UI design pattern
  • Slow development time
  • Code size and memory limitation
  • Update depends on the app store or user to update

WAP Disadvantages:

  • Library updates can be slow from time to time
  • Limited direct technical support
  • Libraries are still limited in their ability to use native features
  • Browser rendering issues across different versions and browsers
  • Data persistence – limited options, and less data can be cached

A mahatma announces a fast unto death just for you and me.


Anna Hazare has given an ultimatum to the Prime Minister, Dr Manmohan Singh, to enact a stringent anti-corruption law – the people's "Jan Lokpal Bill"! Jail for the corrupt must happen! We have been betrayed by those that are leading us!

When & Where?

From 5th April at Jantar Mantar, New Delhi.

Who is Anna Hazare?

A soldier and the lone survivor of his unit in the 1965 Indo-Pak war, Anna dedicated his life to the well-being of society. A bachelor and an ascetic, he has no possessions, no bank balance, and lives in a temple. He is a living Mahatma Gandhi!
In Maharashtra, Anna has single-handedly transformed barren and dry regions into green and food-surplus areas. He has fasted unto death on several earlier occasions. He forced the Maharashtra government to dismiss the corrupt - 6 ministers and 400 officers. Due to his fast, the government enacted the Maharashtra RTI Act. In 2006, when the government of India tried to amend the central RTI Act, he again went on an indefinite fast and forced the Indian government not to amend the Act.
Leaders, organizations and the common man from across India will be with him. This is a do or die moment – let us make it happen!
Across India, join Swami Ramdev, Sri Sri Ravishankar, Swami Agnivesh, Arch Bishop Vincent Concessao, Mahmood A Madani, Kiran Bedi, J M Lyngdoh, Shanti Bhushan, Prashant Bhushan, Arvind Kejriwal, Mufti Shamoom Qasmi, Mallika Sarabhai, Arun Bhatia, Sunita Godara, All India Bank Employees Federation, PAN IIT Alumni Association, Common Cause and many other prominent organizations and leaders, as India comes out on the streets! 100+ CITIES WILL RALLY BEHIND ANNA!
This is a defining moment for India that can give our children a better future! Let us unite and stand by him! Thousands will be there to support him. Will you be there?

Anna’s appeal to the people:

When I sit on fast from 5 April, I urge my fellow countrymen to
  • Join me in fasting for one, two, three or whatever number of days you can comfortably fast.
  • Along with fast, please pray to God (whoever you believe in) for better and corruption free India. Collective prayers from all the people of India would definitely have a huge impact.
  • Write an impassioned plea to our Prime Minister that we look upon him to pass “Jan Lokpal Bill”, else we will be compelled not to vote for his party in next elections.
  • Remain calm and peaceful and develop the courage to go to jail, if required, in this next freedom movement.
If possible, come and stay with Anna at Jantar Mantar for a few days from April 5th.
Would you like to stay updated about this movement? Let us know your contact details at

Salient features of Jan Lokpal Bill

Drafted by Justice Santosh Hegde, Prashant Bhushan and Arvind Kejriwal, this Bill has been refined on the basis of feedback received from public on website and after series of public consultations. It has also been vetted by and is supported by Shanti Bhushan, J M Lyngdoh, Kiran Bedi, Anna Hazare etc. It was sent to the PM and all CMs on 1st December. However, there is no response.
  1. An institution called LOKPAL at the centre and LOKAYUKTA in each state will be set up
  2. Like Supreme Court and Election Commission, they will be completely independent of the governments. No minister or bureaucrat will be able to influence their investigations.
  3. Cases against corrupt people will not linger on for years anymore: Investigations in any case will have to be completed in one year. Trial should be completed in next one year so that the corrupt politician, officer or judge is sent to jail within two years.
  4. The loss that a corrupt person caused to the government will be recovered at the time of conviction.
  5. How will it help a common citizen: If any work of any citizen is not done in prescribed time in any government office, Lokpal will impose financial penalty on guilty officers, which will be given as compensation to the complainant.
  6. So, you could approach Lokpal if your ration card or passport or voter card is not being made or if police is not registering your case or any other work is not being done in prescribed time. Lokpal will have to get it done in a month’s time. You could also report any case of corruption to Lokpal like ration being siphoned off, poor quality roads been constructed or panchayat funds being siphoned off. Lokpal will have to complete its investigations in a year, trial will be over in next one year and the guilty will go to jail within two years.
  7. But won’t the government appoint corrupt and weak people as Lokpal members? That won’t be possible because its members will be selected by judges, citizens and constitutional authorities and not by politicians, through a completely transparent and participatory process.
  8. What if some officer in Lokpal becomes corrupt? The entire functioning of Lokpal/ Lokayukta will be completely transparent. Any complaint against any officer of Lokpal shall be investigated and the officer dismissed within two months.
  9. What will happen to existing anti-corruption agencies? CVC, departmental vigilance and anti-corruption branch of CBI will be merged into Lokpal. Lokpal will have complete powers and machinery to independently investigate and prosecute any officer, judge or politician.

JAN LOKPAL BILL will act as deterrent and instill fear against corruption.

A mahatma is fasting unto death just for you and me; can we not even support him?

(This movement is neither affiliated nor aligned to any political party)
India Against Corruption: A-119, Kaushambi, Ghaziabad, UP | 09718500606 | |
Please circulate this mail as widely as possible

Basics of Database Design



When Bob started his Global Supplies business, things were nice and simple. He had some stock, a copy of Microsoft Office, and a basic web site that allowed anyone with access to the Internet to purchase something from his growing range of novelty gifts.

Each day, Bob would go through the list of orders, package them up, fill out a Sales Receipt, and send the items off to his eagerly-awaiting customers.

Bob was particularly pleased with his receipts: they were based on one of the new Microsoft Word templates, and looked quite the part. He kept the files on his laptop, regularly backed them up to an external drive, and filed printed copies in a large old filing cabinet - just in case.


As time went by, and business picked up, Bob looked at his growing stack of Sales Receipts, and wondered if he ought to be making more use of the information they contained. Of course, he had a feeling for who his best customers were, and sometimes flicked through the papers held in the filing cabinet to answer some of the other questions that had popped into his head from time to time.

This could be frustrating at times, because the receipts were stored in a way that wasn't always conducive to answering Bob's questions quickly. Sometimes he thought about reorganising things - perhaps placing each customer's receipts in separate folders, or sorting them in a different way, perhaps by name.

The problem was that each potential arrangement seemed to have advantages and disadvantages: sorting by order value would help with some questions, but not others, for example. In the end, Bob came to the conclusion that what he really needed was an Excel spreadsheet.

With a spreadsheet, Bob could record the information held on his receipts in a much more organised way:


This was much better. Bob could now see a lot more information all at once, and he quickly found ways to filter the data, add calculations, and copy bits to new sheets so he could look at his data in different ways.

Even Spreadsheets Have Limitations

As much as Bob loved his spreadsheet, it still had its problems. He found himself copy-and-pasting a lot. Product descriptions and customer addresses were particularly tiresome. He sometimes couldn't be bothered with that when entering new data, which resulted in slightly different spellings and inconsistent abbreviations.

His spreadsheet also quite often required a fair amount of work to answer his questions. Sometimes, he found himself giving up completely. He realised he wasn't a spreadsheet expert, by any means, but it was still frustrating.

Discussing the situation one evening with his friend Alice, she suggested that perhaps what Bob needed was a database - something like Access, perhaps. She explained that databases were much better at answering the sorts of questions Bob had, and they typically required much less copy-and-pasting.

This sounded good to Bob, but he found that his edition of the Office suite (Home & Business) did not include Access. He briefly considered buying Access separately, or downloading a trial edition, but Alice pointed him instead to a product called SQL Server.

The Express Edition is free, she said, and it comes with something called a Query Optimizer, which Bob thought was a great idea. After all, no-one wants to write sub-optimal queries, do they?

When Alice mentioned that this Query Optimizer thing had been worked on by a guy called Conor Cunningham, after he left the Access team, Bob was completely sold on the idea. If he was going to replace his spreadsheet with a database, he might as well use one where the developers had learned from their past mistakes!

A Spreadsheet in SQL Server

Bob downloaded SQL Server 2008 Express Edition and quickly created a Sales table:


Bob wasn't quite sure he had made the right choices in the Data Type column, but he figured he would ask Alice about that later. He started copying data across from his spreadsheet:


That didn't seem nearly as neat as his spreadsheet - there were all these NULL things, where his spreadsheet just left a blank space. Worse, when Bob used the query designer to show information for customer 123, it showed the wrong results:


There should be five items, not one! Bob thought the problem might be related to those odd-looking NULLs in his table. He decided to get rid of them by copying information from other rows:


Making these changes meant that Bob's query now returned the correct five rows, but he was far from happy. Using a database seemed to mean he did rather more copy-and-paste than he ever needed to do with his trusty spreadsheet.
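The behaviour Bob ran into is easy to reproduce. Here is a minimal sketch, using SQLite from Python rather than SQL Server so the example is self-contained (the NULL semantics are the same): an equality comparison never matches a NULL, so the flat table returns only one of the five rows.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# A flat, spreadsheet-style Sales table: repeated values were left blank (NULL).
conn.execute("""
    CREATE TABLE Sales (
        ReceiptNo  INTEGER,
        CustomerNo INTEGER,   -- NULL on all but the first row of a receipt
        ProductNo  INTEGER,
        Quantity   INTEGER
    )
""")
conn.executemany(
    "INSERT INTO Sales VALUES (?, ?, ?, ?)",
    [
        (1, 123, 10, 2),   # first row of the receipt carries the customer number
        (1, None, 11, 1),  # ...the rest were left blank, like the spreadsheet
        (1, None, 12, 5),
        (1, None, 13, 1),
        (1, None, 14, 3),
    ],
)

# NULL is never equal to anything, so only one of the five rows comes back.
rows = conn.execute("SELECT * FROM Sales WHERE CustomerNo = 123").fetchall()
print(len(rows))  # 1, not 5
```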

The Problem with Duplication

When Bob next saw Alice, he took his laptop along to show what he had done, and expressed his disappointment at the need for so much duplication. What Bob would really like, he explained, was to record each piece of information from his receipts once.

Alice agreed that the duplication was a big problem. Not only did Bob spend a lot of time entering the same information over and over again, the duplicates wasted a lot of space, and introduced the possibility of conflicting information: if Bob made a mistake, and two rows ended up disagreeing about a customer's address, which one should be believed?

Another, more subtle problem, was that there was no neat way for Bob to record information about products he had not sold yet, or customers that had not yet been issued with a receipt.

A Little Design Theory

Alice had briefly studied database design at university, and while that was some time ago now, she agreed to help Bob with his problems as best she could. She explained that although database design was still regarded by many as more art than science, there were some core principles that might be usefully applied.

Her first idea was to use more than one table, with each table representing a single type of real-world thing. This meant recognising that receipts, customers, and product items were fundamentally different things, and each should be stored in its own table.

Bob agreed that this seemed like an idea worth pursuing. Intuitively, it made sense to him that his database should mirror the real world as much as possible, though he was intrigued to see how it would work in practice. Bob had never much enjoyed theory - he wasn't much of an academic - so Alice agreed to try to explain what she was doing in practical terms.


Alice explained that the idea of breaking up one large table into several to reduce duplication is called normalization. Bob raised an eyebrow, and Alice took the hint - she would try to cut out the technical jargon from here on in. She picked up a printed copy of Bob's table, and added some shading to show what she had in mind:


She used green to highlight receipt-related columns, orange for customer information, and purple for data which applied to products. Alice said that the red columns showed a more subtle type of duplication: their values could be calculated from other columns in the table, so they didn't represent anything new. Bob could see this was true, and was all in favour of removing duplication wherever they could.

With that agreed, Alice set about creating the three new tables, one each for Receipts, Customers, and Products. As she typed, Bob asked her about the mysterious data types he had encountered earlier. Alice explained that each column had to have a data type, and it was important to choose an appropriate one from the wide range offered by SQL Server.

Generally speaking, Alice said she would choose the most compact type that could contain all the values the column would ever need to hold. Bob had a look at the list of data types, and quickly realized that his initial choices had been less than ideal! He was particularly embarrassed about choosing the text data type, because there was a big warning on the Microsoft web site discouraging its use.

The Three Tables

The first version of Alice's design looked like this:


Bob noticed that Alice had used more appropriate data types, and all the ticks from the Allow Nulls boxes had gone. He was quite pleased about that, since he had no idea what a NULL was, and he had certainly never seen one on a real receipt.

Bob agreed that the three table design removed a lot of the duplication. He could now make a single entry per receipt in the Receipts table, one entry per customer in the Customers table, and one entry per product in the Products table. This was definitely progress!

Alice said that no table should ever contain duplicate rows, and there should always be something called a primary key to enforce that. Seeing Bob's puzzled expression, Alice pointed to the little golden key symbols in her diagram. The primary key, she explained, is a column (or a minimal combination of columns) with values that are different for each row in the table.

SQL Server would enforce the key's uniqueness, so anyone trying to enter a new row with a duplicate key would receive an error message. Bob thought this was a very useful idea, and made a note to never create a table without also adding a primary key.
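That enforcement is easy to demonstrate. A minimal sketch using SQLite from Python (SQL Server raises an analogous error), with the table and column names borrowed from Alice's Customers design:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE Customers (
        CustomerNo INTEGER PRIMARY KEY,  -- the little golden key
        Name       TEXT NOT NULL,
        Address    TEXT NOT NULL,
        Town       TEXT NOT NULL
    )
""")
conn.execute("INSERT INTO Customers VALUES (123, 'Bob', '1 High St', 'Oxford')")

# A second row with the same key is rejected outright.
try:
    conn.execute("INSERT INTO Customers VALUES (123, 'Robert', '2 Low St', 'Leeds')")
except sqlite3.IntegrityError as err:
    print("Rejected:", err)
```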

One thing was puzzling him though: in Alice's diagram, why did the Quantity column in the Products table have slightly different shading?

Dependencies and Keys

Alice explained that an important part of table design is to ensure that each column in a table states a single fact about the key, and that was not the case with the Quantity column. Bob's eyes glazed over slightly when she said that, so Alice gave an example:

In the Customers table, she said, notice how each column contains a single fact about a single customer (identified by the primary key: CustomerNo). Now look at the Products table: the Description and UnitPrice columns each state a single fact about a particular product (again identified by the primary key: ProductNo).

The Quantity column is a bit different, remarked Alice. The quantity of product does not depend solely on ProductNo - the quantity will often be different for the same product on different receipts. So, the quantity is determined by a combination of ProductNo and ReceiptNo - not ProductNo alone.

Bob asked if Alice could just change the primary key on the Products table to include both the ProductNo and ReceiptNo columns. She said no, because then the other columns (Description and UnitPrice) would depend on just part of the new key. It was essential, she said, for each column in a table to depend on a key, the whole key, and nothing but the key.

This all sounded rather technical to Bob, but he grasped the basic idea. The question was, what should he do about the troublesome Quantity column? It seemed not to belong in the Products table at all - it appeared to belong in a table with ReceiptNo and ProductNo as a primary key, but they didn't have one of those! Not yet, said Alice, and smiled.

Table Relationships

Before we find a permanent home for the Quantity column, said Alice, we need to look at how the tables we already have relate to each other. After all, we will need to find a way to link them together if we ever want to ask questions about more than one type of thing (customer, receipt and product).

Bob said he had been wondering about that: it was all very well dividing his table up into three, but how could he match the rows up? How could he see which receipts belonged to which customer, for example?

Alice replied by asking him to look at the Customers and Receipts tables. These tables were an example of a one-to-many relationship, she said. One customer might have many receipts, but each receipt only ever related to one customer. Bob confirmed that this was indeed true, but wondered how that helped.

One-to-Many Relationships

Modelling a one-to-many relationship between two tables was easy, said Alice. All they had to do was to add something to each row of the Receipts table, linking back to the parent customer. Bob thought about this, and asked if this pointer would just be the customer's primary key. Alice confirmed that he was correct, and updated her design accordingly:


Alice explained that the new CustomerNo column in the Receipts table was known as a foreign key. Bob noticed that the new column did not break Alice's rule (columns should state a single fact about the whole key). Each row in the Receipts table was indeed associated with a single customer.

Alice drew Bob's attention to the new connecting line in her diagram, which she said represented a foreign key relationship. SQL Server would enforce this relationship, similar to the way it enforced primary key uniqueness. The practical consequence was that Bob could not add a receipt to his database without first creating a customer record for it to link to.

Bob thought this was a very cool feature - SQL Server would make sure he could not add information that didn't link up properly.
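A sketch of that enforcement, again in SQLite from Python (note that SQLite only checks foreign keys when the pragma is switched on, whereas SQL Server enforces them by default):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite needs this enabled per connection
conn.executescript("""
    CREATE TABLE Customers (
        CustomerNo INTEGER PRIMARY KEY,
        Name       TEXT NOT NULL
    );
    CREATE TABLE Receipts (
        ReceiptNo  INTEGER PRIMARY KEY,
        CustomerNo INTEGER NOT NULL REFERENCES Customers(CustomerNo)  -- foreign key
    );
""")

# A receipt cannot exist without its parent customer...
try:
    conn.execute("INSERT INTO Receipts VALUES (1, 999)")
except sqlite3.IntegrityError:
    print("No such customer - receipt rejected")

# ...but once the customer exists, the receipt links up fine.
conn.execute("INSERT INTO Customers VALUES (123, 'Jim')")
conn.execute("INSERT INTO Receipts VALUES (1, 123)")
```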

Many-to-Many Relationships

Alice now turned to the task of linking the Receipts and Products tables. The link between these tables was going to be a bit different because each product could appear on many different receipts, and each receipt could contain many different products. This, she said, was known as a many-to-many relationship.

Bob started thinking about how they might add foreign keys to each table to represent this kind of link, and it made his head hurt almost immediately. Each table would need to hold a variable number of foreign keys in each row - was that even possible?

Alice said that to model this relationship correctly, she would need to add another table to the design. She called this a bridge table, and it would sit between the Products and Receipts tables.

She explained that bridge tables were necessary whenever two tables had a many-to-many logical relationship. Because the new table modelled the link between the Receipts and Products tables, its primary key would be a combination of the primary keys of the related tables (she called this a compound key).

Bob wasn't sure about this at all, but Alice seemed to know what she was talking about, so he waited to see how this new idea would look before passing judgement.

The Final Design

Alice called the new table ReceiptProducts, and added it to the overall design:


The Customers and Receipts tables had not changed, but Bob noticed that the problematic Quantity column had found a home in ReceiptProducts. He remembered that the Quantity column was a single fact about a receipt/product combination, so it made sense to see it in a table with that combination as its primary key.

Bob also saw that the new table was linked by two one-to-many foreign key relationships: one to the Receipts table, and one to the Products table. These new foreign key relationships ensured that Bob could not accidentally add a Quantity without first entering a related product and receipt.

Entering Data

Returning home for the evening, Bob found that the new design made it easy to add new information in a logical order, and without the duplication he was used to.

He started by adding a complete list of products to the Products table. He was happy that he would only have to enter the descriptions and prices once, and working from his price list made the task quick and easy. Whenever Bob lost his place and tried to enter a product for a second time, SQL Server replied with an error message, which Bob found very reassuring.

Next, Bob reached for his notebook and started adding customer information to the Customers table. Again, this was quick and easy, and the information only needed to be typed in once.


When he came to add data to the Receipts table, Bob was relieved to be able to refer to the customer by number, rather than retyping the name, address, and town information each time as he used to with the spreadsheet. The ReceiptProducts table was just as easy; he just needed to link products to receipts and enter the quantity.
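The data-entry order Bob followed can be sketched end to end. This is a simplified version of the four-table design in SQLite from Python; the data types are reduced and the rows are made up for illustration (a real SQL Server design would use more precise types):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite-specific; SQL Server checks by default
conn.executescript("""
    CREATE TABLE Customers (CustomerNo INTEGER PRIMARY KEY, Name TEXT NOT NULL);
    CREATE TABLE Products  (ProductNo  INTEGER PRIMARY KEY,
                            Description TEXT NOT NULL, UnitPrice REAL NOT NULL);
    CREATE TABLE Receipts  (ReceiptNo  INTEGER PRIMARY KEY,
                            CustomerNo INTEGER NOT NULL REFERENCES Customers(CustomerNo));
    -- The bridge table: its compound primary key combines the two related keys.
    CREATE TABLE ReceiptProducts (
        ReceiptNo INTEGER NOT NULL REFERENCES Receipts(ReceiptNo),
        ProductNo INTEGER NOT NULL REFERENCES Products(ProductNo),
        Quantity  INTEGER NOT NULL,
        PRIMARY KEY (ReceiptNo, ProductNo)
    );
""")

# Data goes in a logical order: products and customers first, then receipts,
# and finally the links with their quantities.
conn.execute("INSERT INTO Products VALUES (10, 'Novelty mug', 4.99)")
conn.execute("INSERT INTO Customers VALUES (123, 'Jim Smith')")
conn.execute("INSERT INTO Receipts VALUES (1, 123)")
conn.execute("INSERT INTO ReceiptProducts VALUES (1, 10, 2)")
```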


Writing Queries

Though he started off using the visual query designer in Management Studio, Bob quickly picked up the basics of the T-SQL query language. The relationship diagram Alice had given him made it easy to see how his tables should be joined together, and Bob quickly found himself preferring the SELECT, JOIN, and WHERE syntax over the designer.

Bob was delighted to find that his database could answer a very wide range of questions about his business. He thanked Alice for helping him understand some of the fundamentals of good database design - and particularly how to avoid duplication.
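The kind of question the joined tables can answer - "what has each customer spent?" - can be sketched as follows, using SQLite from Python with a couple of made-up rows. The JOIN clauses simply retrace the foreign key relationships in Alice's diagram:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE Customers (CustomerNo INTEGER PRIMARY KEY, Name TEXT NOT NULL);
    CREATE TABLE Products  (ProductNo INTEGER PRIMARY KEY,
                            Description TEXT NOT NULL, UnitPrice REAL NOT NULL);
    CREATE TABLE Receipts  (ReceiptNo INTEGER PRIMARY KEY, CustomerNo INTEGER NOT NULL);
    CREATE TABLE ReceiptProducts (ReceiptNo INTEGER NOT NULL, ProductNo INTEGER NOT NULL,
                                  Quantity INTEGER NOT NULL,
                                  PRIMARY KEY (ReceiptNo, ProductNo));
    INSERT INTO Customers VALUES (123, 'Jim Smith');
    INSERT INTO Products  VALUES (10, 'Novelty mug', 4.99), (11, 'Gift box', 2.50);
    INSERT INTO Receipts  VALUES (1, 123);
    INSERT INTO ReceiptProducts VALUES (1, 10, 2), (1, 11, 3);
""")

# Each JOIN follows a foreign key relationship from the diagram.
query = """
    SELECT c.Name, SUM(rp.Quantity * p.UnitPrice) AS Total
    FROM Customers c
    JOIN Receipts r         ON r.CustomerNo = c.CustomerNo
    JOIN ReceiptProducts rp ON rp.ReceiptNo = r.ReceiptNo
    JOIN Products p         ON p.ProductNo  = rp.ProductNo
    GROUP BY c.Name
"""
for name, total in conn.execute(query):
    print(name, round(total, 2))  # Jim Smith 17.48
```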


  • Every table needs a key to uniquely identify each row
  • Use appropriate column data types
  • Each column in a table should state a single fact about the whole of the key
  • Break tables that violate that rule into logical parts, based on real-world objects where possible
  • Enforce one-to-many relationships using foreign keys and constraints
  • Model many-to-many relationships using bridge tables


Data Profiling before loading source data into Data warehouse – A real Time saver!


Understanding the source data before loading it into the data warehouse is critical, since it has a significant impact on how you design the structures and build the ETL.

Data profiling is a great way to analyze the source data to understand what condition it is in, in terms of patterns, nulls, dirtiness, and so on. SSIS 2008 has a new task called the 'Data Profiling Task', which we can leverage for data warehouse ETL.

There are three main parts to the data profiling task.

  1. Running the Data Profiling Task in SSIS
  2. Providing the output generated by the task to the 'Data Profile Viewer'
  3. Observing and analyzing the report, and then designing the ETL on the basis of it

Running the Data Profiling Task in SSIS:

  1. The Data Profiling Task needs a connection to the source database, so we need to create one in the Connection Manager window.
  2. Create a connection to the server that hosts the database.
  3. Save the connection information and return to the SSIS package designer.
  4. Drag a Data Profiling Task from the Toolbox onto the Control Flow and double-click the new task to open the Data Profiling Task Editor.


  5. The Data Profiling Task includes a wizard that will create your profiling scenario quickly, so click the Quick Profile button on the General tab.


  6. In the Single Table Quick Profile Form, choose the connection, and in the Table Or View drop-down, select the desired table or view from the list. Enable all the checkboxes in the Compute list and change the Functional Dependency Profile to use 2 columns as determinant columns.


  7. Select OK to save the changes, which will populate the Requests list in the Data Profiling Task Editor as shown in the figure below.


  8. Return to the General tab of the editor and, in the Destination property box, choose New File Connection. This is where you will define the XML file where the profile output is stored.
  9. In the File Connection Manager Editor, change the Usage type drop-down to "Create file" and enter the name of the XML file in the File text box. Select OK to save your changes to the connection, and OK to save your changes in the Data Profiling Task Editor.
  10. Run this simple package in BIDS, which will initiate several queries against the source table or view. This may take several minutes depending on the number of records: I had about 30,000 records and it took about half an hour to run the package.
  11. The results will be stored in the XML file created above.

Providing the output generated by the task to the 'Data Profile Viewer'

After running the data profile task, we need to evaluate the results.

  1. Open the Data Profile Viewer from Start > All Programs > SQL Server 2008 > Integration Services > Data Profile Viewer.
  2. Click Open and select the XML file generated above.



  3. In the Profiles navigation tree, first click the table icon at the top left to put the tree viewer into “Column View”. Second, drill down into the details by expanding the Data Sources, server (local), Databases, and Columns objects.
  4. The first profiling output to observe is the Candidate Key Profile, so click this item under the Columns list, which will open the results in the viewer on the right. You will see that the Data Profiling Task has identified six columns that are unique across the entire table (with 100% uniqueness), as shown below.


  5. Next, in the profile tree, expand the NationalIDNumber column and highlight the Column Length Distribution, then in the distribution profile on the right, double-click the length distribution of 1, as shown below. Here we can see the minimum and maximum length of NationalIDNumber, its distribution across the field, and the actual data.



  6. Under NationalIDNumber in the profile browser, select the Column Null Ratio Profile and then double-click the row in the profile viewer on the right to view the detail rows. As seen, there are no null values here, but if there were, the distribution would be shown accordingly.



  7. Next, we can select the Column Value Distribution Profile on the left, under Title, and observe the output in the results viewer. How many unique values are there in the entire table? It’s 67 here.


  8. This can also help in creating regexes for validations, as given below, using Column Pattern Profiles. The Column Pattern Profile uses regular expression syntax to display the pattern, or range of patterns, that the data in the column contains.


  9. The final data profiling type to review is the Column Statistics Profile. This is only applicable to data types related to numbers (integer, float, decimal, numeric) and dates (dates only allow minimum and maximum calculations). The max, min, mean, and standard deviation for the field are calculated using this profile.
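Under the hood, the Data Profiling Task issues ordinary queries against the source. For a quick check without SSIS, the same kinds of statistics can be approximated with plain SQL. A sketch using SQLite from Python, with a hypothetical Employees table standing in for the real source (the column names echo the profiles above; the rows are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE Employees (NationalIDNumber TEXT, Title TEXT);
    INSERT INTO Employees VALUES ('509647174', 'Production Technician'),
                                 ('245797967', 'Marketing Assistant'),
                                 (NULL,        'Production Technician');
""")

# Column Null Ratio - what fraction of rows have no value?
null_ratio, = conn.execute(
    "SELECT AVG(NationalIDNumber IS NULL) FROM Employees").fetchone()

# Column Length Distribution - min and max lengths of the stored values
min_len, max_len = conn.execute(
    "SELECT MIN(LENGTH(NationalIDNumber)), MAX(LENGTH(NationalIDNumber)) "
    "FROM Employees").fetchone()

# Column Value Distribution - how many distinct values?
distinct_titles, = conn.execute(
    "SELECT COUNT(DISTINCT Title) FROM Employees").fetchone()

print(null_ratio, min_len, max_len, distinct_titles)
```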


Thus, as seen above, we can use the output of the Data Profiling Task to identify the range, quality, and patterns in the source data, which helps us formulate the ETL design quickly and reliably. Isn’t it a real time saver? Please let me know your feedback by commenting on this article.

Happy Datawarehousing :) !