Frameworks in Architecture Definition

Many frameworks have disappeared over time, while a few have changed the way applications are developed. These successful frameworks have helped give Java itself a new look, and there are good reasons behind their continued success. These days Java application development involves considerable use of proven frameworks. Below are some of the benefits we get by using them.

* Ready-made components that speed up development
* Developer community support when help is needed
* Most are open source, so they come free of cost
* Proven technology that reduces project risk

Now we will explore the architecture trends arising from the wide usage of these ready frameworks in application development, assuming readers are familiar with popular Java-based open source frameworks such as Struts, Spring, and Hibernate.

Layered Architectures:

Architecture definition involves identifying tiers, and within them layers, so that each unit has a clear responsibility. The driver is the business requirement, which leads us to separate presentation from business logic and business logic from data access. Let us take a simple example with three tiers – a web tier, a business tier, and a database tier – each with a clear responsibility.

The presentation tier, also called the web tier, handles presentation along with navigation from one screen to another. Additional features such as managing the user session and caching information are also expected in this tier.

The business tier encapsulates the complete business logic. Business validations, processing, and calculations are handled here. Enterprise features such as transaction management and security are also expected in this tier.

The database tier implements database interaction and the processing of data that is retrieved or is to be persisted. Database session management and transaction propagation are among the features expected here.

When we select ready frameworks to implement each of these tiers, the layers within a tier are largely identified by the selected framework already. We can customize these layers, but only to a certain extent.

Predefined Layers with different Frameworks:

The word framework broadly means ready components and a skeleton on which we can construct a larger program, with some flexibility to choose the alternatives best suited to the application. To explore this further, let us take each tier and identify the layers that emerge when we use different frameworks.

Presentation Layer:

As we move from one framework to another, the types of objects we use change. Actions, Controllers, and Page Components encapsulate the navigation logic, which means that with these frameworks we need a layer that contains everything related to navigation. If you remember the days of plain servlets, everything related to navigation, along with request data extraction and population into data objects, used to live inside servlet code. Now the framework populates the request data into mapped POJOs, and all we need to do is identify the next navigation point in these action/controller/page-component classes; the framework takes care of the rest. Thus our options in this tier are largely fixed once we choose one of these frameworks.

Business Layer:

This layer contains everything related to the business. An application might require exposing business functions as a service, and we might decide to use Spring Web Services for this purpose. This implementation will require the following simple layers.

Here the first layer deals only with web service request, response, and fault handling, including marshalling and unmarshalling, while the second contains the actual business function implementation. These two layers can be clearly identified when using Spring Web Services.

Database Layer:

For this layer, let us take the example of the Hibernate framework accessing an Oracle database. We require a DAO (data access object) layer that offers create, read, update, and delete (CRUD) operations on data objects (which map directly to database entities).

From the above discussion we can conclude that each (open source) framework has its own required layers; when we select the framework we also fix these layers in our architecture, with only a little flexibility left to customize them to our needs.

Framework Usage View:

From this preliminary discussion, we can clearly see the need for a representation that shows how the frameworks in different tiers interact with each other – not just the interaction itself, but the interaction in the following context:

* Thread management
* Transaction management
* Data transformation (e.g. if you are using Struts, an action form is required, but it cannot be passed directly to Hibernate; it first has to be converted into the object that represents the database table)
* Exception management support
* Individual framework benefits and constraints

This list can grow longer depending on the application and the frameworks considered.

Generating Word documents from PHP

PHPDOCX is a PHP library that allows its client code to generate Microsoft Word documents in the .docx format from PHP scripts. PHP is increasingly being used for disparate goals, dealing with data that comes from varied sources and has to be produced in even more varied formats. An off-the-shelf solution for creating Word documents from an arbitrary source — be it a database, an Excel file, or a CSV file — is indeed a good tool to keep at hand.

Starting with version 1.5, released on July 12th, PHPDOCX is now compatible with PHP 5.3. Adoption of PHP 5.3 by operating systems is growing, and it will eventually replace the previous versions of PHP on the servers of hosting providers as well.


PHPDOCX provides the standard features you would commonly use when generating a document dynamically: managing text, lists, tables, images, and graphic elements are all basic operations of document editing.
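
To give a feel for the client code, here is a minimal sketch in the style of the library's own examples; the include path and the CreateDocx, addText, and createDocx names may differ between versions, so treat this as illustrative rather than authoritative.

<?php
// Illustrative sketch only: adjust the include path to your PHPDOCX installation.
require_once 'phpdocx/classes/CreateDocx.inc';

$docx = new CreateDocx();                   // start a new, empty .docx document
$docx->addText('Monthly sales report');     // add a plain paragraph of text
$docx->createDocx('report');                // write report.docx to disk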

More useful features are included in the library, which come in handy when dealing with long documents: for instance, insertion of headers, footers, page numbering, and a table of contents are all supported.

A final note on the feature list: it is also possible to output PDF and HTML from a given Word document. The library is intended for report generation, and being able to switch the output format at will is a strong point.


PHPDOCX does not require a working installation of MS Word, except for generating legacy versions of documents (the .doc format for Word 2004 or before).

The library does require the zip and xsl PHP extensions to work, but they are probably already installed on your server of choice, or easily available. Apart from that, a generic installation of PHP and Apache will suffice.


Like many libraries for web development, PHPDOCX comes with more than one license.

The first possibility is to use the library under the LGPL license, which covers the free version. It has somewhat limited features in comparison to the Pro one, but it adds no watermarks to the produced documents and has no time limits.

The Pro version has greater capabilities, such as the insertion of charts and MathML constructs for scientific documents. It also comes with technical support, which may be the most compelling reason for its adoption.

In conclusion, PHPDOCX is a solid tool for producing documents in one of the most widespread formats in the world. It also handles PDF and HTML, which guarantees interoperability with any end user's machine.

EclEmma – Java Code Coverage for Eclipse

EclEmma is a free Java code coverage tool for Eclipse, available under the Eclipse Public License. Internally it is based on the great EMMA Java code coverage tool, and it tries to adapt EMMA's philosophy to the Eclipse workbench:

* Fast develop/test cycle: Launches from within the workbench, such as JUnit test runs, can be directly analyzed for code coverage.
* Rich coverage analysis: Coverage results are immediately summarized and highlighted in the Java source code editors.
* Non-invasive: EclEmma does not require modifying your projects or performing any other setup.

The Eclipse integration focuses on supporting the individual developer in a highly interactive way.



EclEmma adds a so-called launch mode to the Eclipse workbench. It is called Coverage mode and works exactly like the existing Run and Debug modes. The Coverage launch mode can be activated from the Run menu or the workbench's toolbar.

Simply launch your applications or unit tests in the Coverage mode to collect coverage information. Currently the following launch types are supported:

* Local Java application
* Eclipse/RCP application
* Equinox OSGi framework
* JUnit test
* TestNG test
* JUnit plug-in test
* JUnit RAP test
* SWTBot test


After your application or unit test has terminated, code coverage information is automatically available in the Eclipse workbench:

* Coverage overview: The Coverage view lists coverage summaries for your Java projects, allowing drill-down to method level.
* Source highlighting: The result of a coverage session is also directly visible in the Java source editors. A customizable color code highlights fully covered, partly covered, and uncovered lines. This works for your own source code as well as for source attached to instrumented external libraries.

Additional features support analysis for your test coverage:

* Different counters: Select whether instructions, lines, basic blocks, methods or loaded types should be summarized.
* Multiple coverage sessions: Switching between coverage data from multiple sessions is possible.
* Merge sessions: If multiple different test runs should be considered for analysis, coverage sessions can easily be merged.


While EclEmma is primarily designed for test runs and analysis within the Eclipse workbench, it provides some import/export features.

* Coverage data import: A wizard allows you to import *.ec coverage data files from external launches.
* Coverage report export: Coverage data can be exported as a *.ec file or in XML or HTML format.

Never Use $_GET Again

You don't need to use $_GET or $_POST anymore. In fact, we probably shouldn't use $_GET and $_POST at all anymore. Since PHP 5.2, there is a new and better way to safely retrieve user-submitted data.

Clever developers have long built libraries that analyze data and escape it appropriately, but validating and sanitizing input remains a substantial problem, and many seasoned PHP developers still spend precious development cycles building custom code to filter input.

PHP (from 5.2 onward) has a built-in filtering system that makes the tasks of validating and sanitizing data trivially easy. Rather than accessing the $_GET and $_POST superglobals directly, you can make use of PHP functions like filter_input() and filter_input_array(). Let’s take a quick look at an example:

$my_string = filter_input(INPUT_GET, 'my_string', FILTER_SANITIZE_STRING);

The code above is roughly the equivalent of retrieving $_GET['my_string'] and then running it through a filter that strips HTML and other undesirable characters. This is data sanitization, one of the two things the filtering system can do. The two tasks of the filtering system are:

* Validation: Making sure the supplied data complies with specific expectations. In this mode, the filtering system will indicate (as a boolean) whether or not the data matches some criterion.
* Sanitizing: Removing unwanted data from the input and performing any necessary type coercion. In this mode the filtering system returns the sanitized data. Both modes are contrasted in the short sketch after this list.
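
A minimal sketch of the difference, assuming a hypothetical query string such as ?age=29&age_raw=29abc:

<?php
// Validation: returns the integer on success, false if the value fails the
// filter, and null if the variable is missing from the request.
$age = filter_input(INPUT_GET, 'age', FILTER_VALIDATE_INT);

// Sanitization: strips everything except digits and plus/minus signs and
// returns the cleaned-up value instead of a pass/fail answer.
$age_clean = filter_input(INPUT_GET, 'age_raw', FILTER_SANITIZE_NUMBER_INT);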

By default, the filter system provides a menagerie of filters ranging from validation and sanitization of basic types (booleans, integers, floats, etc.) to more advanced filters which allow regular expressions or even custom callbacks.
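
For several inputs at once, filter_input_array() accepts a map from variable names to filters; the field names below are hypothetical and only meant to show the shape of the call:

<?php
// Hypothetical form fields; adapt the names and rules to your application.
$rules = array(
    'user_id' => array(
        'filter'  => FILTER_VALIDATE_INT,
        'options' => array('min_range' => 1),
    ),
    'email'   => FILTER_VALIDATE_EMAIL,
    'comment' => FILTER_SANITIZE_STRING,
);

// Failed validations come back as false, missing variables as null.
$input = filter_input_array(INPUT_POST, $rules);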

The utility of this library should be obvious. Gone are the days of rolling our own input checking tools. We can use a standard (and better performing) built-in system.

Let's take things one step further than merely presenting this as an option: we should no longer directly access superglobals containing user input. There is simply no reason to, and the plethora of security issues related to unfiltered input provides more than sufficient justification for this claim. Always use the filtering system. Make it mandatory.

“But,” one might object, “what if I don’t want my data filtered?” The filtering system provides a null filter (FILTER_UNSAFE_RAW). In cases where the data needn’t be filtered (and these cases are rare), one ought to use something like this:

$unfiltered_data = filter_input(INPUT_GET, 'unfiltered_data', FILTER_UNSAFE_RAW);

Following this pattern provides a boon: we can very quickly discover all of the unfiltered variables in the code by running a simple search for the FILTER_UNSAFE_RAW constant. This is much easier than hunting through calls to $_GET to find those that are not correctly validated or sanitized. Risky handling of input can be managed much more efficiently this way.

Filters won’t solve every security-related problem, but they are a tremendous step in the right direction when it comes to writing safe (and performant) code. It’s also simpler. Sure, the function call is longer, but it relieves developers of the need to write their own filtering systems. These are darn good reasons to never use $_GET (or $_POST and the others) again.

10 Ways to Increase Hard Disk Life and Performance

Hard disk performance has always been an underrated aspect of overall system performance. The hard disk was considered only as a place to store data, and people paid little heed to how it affected the operation of the PC as a whole. Some performance is lost every time you read from or write to the hard disk, because the disk subsystem is the slowest component in your computer. There is not much to be done about that fact, but you can take a number of actions to make certain the computer's hard disk is always running in tip-top shape. Here are 10 ways to increase hard disk life and performance.

1. Remove duplicate files from hard disk

The first step towards enhancing performance is removing duplicate files from the hard disk. There are several free duplicate file finders that easily find all sorts of duplicate files on the disk; users can then remove the duplicate copies and keep just a single one. You might use a duplicate file finder such as Duplicate Cleaner, which can even find files that share content when the file names are different.

This is an important step in any hard disk cleaning exercise: removing all the duplicate files can considerably reduce the space occupied on the drive.

2. Defragment Hard Disk

Defragmentation is one of the most widely known ways to speed up the hard disk and improve performance.

Step 1: Open My Computer. Right-click the disk that you want to defragment and click Properties. On the Tools tab, click the Defragment Now button.
Step 2: The Disk Defragmenter window appears. Click the Analyze button.
Step 3: An analysis of the drive is performed, and a message appears telling you whether or not you should defragment the drive.
Step 4: If the drive needs to be defragmented, click the Defragment button. The defragmentation process begins and may take some time, depending on how badly the drive is fragmented.

3. Check for disk errors

Windows XP provides another useful tool that makes it easy to check the disk for errors. The tool is available on the Tools tab of the hard disk's properties sheet and offers simple check box options to check for file system errors and recover bad sectors. The error-checking tool needs complete access to the disk to do its work, so open applications may have to be closed, and sometimes the user needs to reboot before the check starts so that the tool gains complete access to the disk. If you use the computer a lot, it is a good idea to run this tool once a month to ensure your disk is working properly.

To use the error-checking utility:

Step 1: Go to Start > My Computer
Step 2: Right click on the hard disk or partition that you want to check for errors
Step 3: Click on properties and then “Tools”
Step 4: Under “Error checking” click on “Check Now”
Step 5: Select the Scan for and attempt recovery of bad sectors
Step 6: Click on Start

This will scan the disk for errors and mark any bad sectors.

4. Compression/Encryption

On NTFS you can compress folders, and encrypt files and folders to stop unauthorized access to them. Although the compression feature is impressive, compressed files take longer to open and resave. If you want the maximum possible speed from the system, avoid compressing the drives.

Encryption also reduces performance when opening files, because the encryption process has to verify that you are authorized to view the file. The general rule is to encrypt only the files or folders that need it; don't get in the habit of encrypting everything.

5. Reduce NTFS overhead by disabling 8.3 filenames

NTFS is a feature-packed file system that Windows XP users can work with. For compatibility with MS-DOS and old Windows 3.x systems, NTFS supports 8.3 filenames, meaning that each file also gets a name of eight characters followed by a dot and a three-character extension. There is nothing wrong with that, but the overhead is unnecessary when you are not supporting older programs and systems. Note that some programs still depend on 8.3 filenames, so if you turn the feature off they might not work properly, although that is unlikely at this point in time.

To reduce this NTFS overhead, follow these steps:

Step 1: Click Start>Run. Type regedit and click OK

Step 2: In the Registry Editor, navigate to
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\FileSystem

Step 3: Locate the NtfsDisable8dot3NameCreation entry and change its value to 1. This will disable the creation of 8.3 filenames.
Step 4: Close Registry Editor

6. Master File Table

The NTFS Master File Table (MFT) keeps track of the files on a disk. This file records every file stored on a given disk, including an entry for the MFT itself. It works like an index of everything on the hard disk, much like an address book, so files are easy to locate for defragmentation and for use by applications.

You can add a registry entry to ensure the table is large enough and has the space it requires. It will take up more space on the hard disk, but it reduces overall NTFS overhead and helps general performance.

Step 1: Click Start>Run. Type regedit and click OK

Step 2: In the Registry Editor, navigate to

HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\FileSystem

Step 3: Create a REG_DWORD entry and name it NtfsMftZoneReservation

Step 4: Set the value of the entry to 2.
Step 5: Close the Registry Editor.

7. Stop Hibernation

In Windows systems like XP the Hibernation feature is quite handy, but switching it off is one way to optimize Windows XP disk performance, since hibernation writes the entire contents of memory to the hard disk. To turn Hibernation off, follow the steps below:

Step 1: Click Start
Step 2: Control Panel
Step 3: Power Options Properties
Step 4: Click on the Hibernate tab and clear the Enable Hibernation check box

8. Clean up unnecessary files and optimize the Recycle Bin

Windows XP generates a large number of temporary files, temporary Internet files, recent-document entries in the Start menu, downloaded files, and log files. If you want the hard disk to perform at its best, delete all this junk: an overstuffed hard drive makes Windows XP work harder. Make sure you have a routine in place to keep old files and junk cleaned up and removed.

Also make sure you optimize the Recycle Bin, whose size is configured as a percentage of the hard drive.

Step 1: Right click on the Recycle Bin and choose Properties
Step 2: In the Recycle Bin properties, move the Recycle Bin size slider from 10 percent down to 3 or even 1 percent. That is still a decent amount of storage, since you now have a larger disk to work with.
Step 3: Click OK

9. Convert to NTFS

NTFS is better than FAT/FAT32 and gives users access to management features of Windows XP that FAT/FAT32 does not support, so try to convert any FAT/FAT32 drives to NTFS. The only exception to this rule is a dual-boot system that also boots an earlier version of Windows that doesn't support NTFS, such as Windows 98 or Windows Me.

To convert a FAT or FAT32 drive to NTFS, follow the steps below:

Step 1: Click Start>Run. Type command and click OK.

Step 2: At the command prompt, use the Convert command to convert the FAT drive to NTFS. The conversion process is designed to preserve your data, though backing up first is always wise. The command and syntax are as follows:

convert D: /FS:NTFS
Press Enter. (Replace D: with the letter of the drive you want to convert.)

Step 3: Conversion may take several minutes, depending on the size of the drive. When the process is complete, simply exit the command interface. If you converted the boot partition, you will be prompted to reboot the computer.

10. Remove the temporary files

Windows creates a lot of temporary files during normal operation. It tries to clean them up, but some stay behind and keep accumulating: file fragments, browser cache, memory dumps, log files, cookies, and the Recycle Bin, for instance.

Windows provides a built-in tool to clear such files, but it is not as thorough as some of the free disk cleaners available on the market, such as FCleaner, CCleaner, and Comodo System Cleaner.

MS Office 2010 vs MS Office 2007

Microsoft is leaving no stone unturned to promote its new Office suite. Recently Microsoft announced a "technology guarantee" program that read "Sell Office 2007 today and your customer can download Office 2010 free". Customers purchasing and activating Office Home and Student 2007, Office Standard 2007, Office Small Business 2007, or Microsoft Publisher 2007 between March 5 and September 30, 2010 are eligible for a free upgrade to a comparable version of Office 2010. However, before moving to an upgraded Microsoft Office version you would naturally look for the advantages of installing it over the previous version, and Microsoft wants to ensure that those considering an upgrade from Office 2007 have enough reasons to do so. We draw a comparison to elucidate how MS Office 2010 betters Office 2007.

Updated Ribbon

Office 2007 made headlines with its innovative Ribbon menu system; Office 2010 takes it to the next level with a more intuitive Ribbon. It also provides a new file menu system: instead of opening a dropdown, the entire window changes color and presents the save, open, close, preview, and other options. The enhanced Ribbon across Office 2010 applications allows users to access commands quickly and customize tabs to match their own working style.

The new file menu offers detailed information on modifications, authors, file size, and permissions. The new print and print preview menu definitely changes the layout most PC users are used to, and it is simply cleaner than that of Office 2007.

Better multimedia Editing

It is clear that Microsoft has upped the multimedia editing options from the last iteration. Image editing has been improved in Office 2010: there are new tools for screen capture in Word 2010 and PowerPoint 2010, and you can even remove image backgrounds. PowerPoint now has built-in video editing as well. All this means you can do much more without ever opening Photoshop.

These features take Office 14 to a new level. Suppose you want to trim parts of a video clip before a presentation, or you are looking to apply professional styles to a video, such as a reflection coupled with 3D rotation: PowerPoint 2010 now includes video editing features powerful enough for both.

Broadcast Slideshows within PowerPoint

This is one of the most exciting new features of PowerPoint 2010. Now you can deliver live PowerPoint presentations over the web, and anyone in any part of the world can follow along in a web browser.

Distribute the slides as video

In PowerPoint 2010 you can convert your presentation into a video file that can be uploaded to YouTube or distributed on a portable media player such as the iPod. Moreover, the video conversion runs in the background, which allows you to keep using PowerPoint while the video is created.

More visual enhancements

Office 2010 comes with an array of design options to help you present your ideas better. It includes new and improved picture formatting tools, such as color saturation and artistic effects, that allow you to transform your document visuals into a work of art. Office 2010 also offers new SmartArt® graphic layouts and a wide range of new pre-built Office themes for creating striking designs in Word.

Real time collaboration and communication

Not just the web version: the desktop version of Office 14 also sports a real-time buddy list of sorts that shows which individuals are currently editing a document. Now you can see who is online and who is working on what. This is a great revamp compared to Office 2007.

Stronger Security Settings

Office 2010 improves upon Office 2007 with respect to author settings, restricted editing, and a protected mode that stops you from accidentally editing a downloaded file until editing is enabled.

Embed Web Videos in the Presentation

Office 2010 provides a significant advantage over Office 2007 by allowing users to embed video clips from the Internet into a PowerPoint presentation. Users can simply copy the embed code from YouTube or any other video sharing site and paste the video anywhere on a slide.

Quick Steps in Outlook

Microsoft Outlook 2010 includes a new Quick Steps feature. With Quick Steps you can create a sequence of commands and apply it to an Outlook item with a single click. For example, you can use a Send and Delete quick step that deletes an email from your inbox after you reply to it.
Users can also compress long e-mail threads into a few conversations that can be categorized, filed, ignored, or cleaned up.

Built-in PDF Writer

The Office 2010 programs include a built-in PDF writer that lets you save documents in PDF format with a click. In Office 2007 you had to install a separate add-on, but now PDF support is native.

Simpler Document Printing

Microsoft Word 2010 completely revamps the print dialog. For instance, you can tweak printer settings such as margins and preview the changes side by side.

Creating better data insights and visuals

Excel 2010 introduces Sparklines, small charts drawn inside worksheet cells that track and highlight important trends in a clear and compact visual form. You can also filter and segment PivotTable data in multiple layers using Slicers, so you spend more time analyzing and less time formatting.

Export MS Access Database To MySQL Database

Here are a few tips (including undocumented features) to help you export a huge MS Access database (a production database with real data) to MySQL.

Use the mdbtools utilities as described below:

mdb-tables displays the list of tables in the MS Access database.

mdb-schema exports the MS Access schema for a MySQL database. The documentation doesn't mention that MySQL is supported, but you can use the following command:

mdb-schema [-S] database mysql

You may need to replace column names if they collide with MySQL reserved keywords, for example GROUP, PRIMARY, and CROSS.

The generated schema contains DROP TABLE statements; you can use grep -v to exclude them. Additionally, you should add a DROP DATABASE command followed by CREATE DATABASE at the very top of the file when you are importing into a new database. This is essential because you may have to go through multiple passes: a DROP TABLE causes problems when the table is not there, while leaving it out causes problems when the table has already been loaded. So I remove the DROP TABLE statements and instead drop and re-create the database for each run, as described above.

mdb-export exports MS Access tables to CSV format. What is not emphasized is that it can directly create INSERT statements instead of a CSV file, which is very helpful, especially for converting data types. Use it as shown in the sample:

mdb-export -I -R';\n' MSAccessDatabase.mdb Table > Table.sql

Note: Replace MSAccessDatabase.mdb with the full path of your MS Access database, and Table with the table name. You can save the output to any file; I just chose Table.sql for convenience.

The -R';\n' option ensures that each SQL INSERT is followed by a semicolon and then a newline.

You may also need to substitute column names in this output if you previously changed them while importing the schema to avoid conflicts with reserved MySQL keywords.