Continuous code re-factoring – part 3: An Introduction to Selenium web automation testing


Introduction:

Browser automation testing plays a major role in integration testing. To measure the quality of our web application we need certain quality metrics. We should be able to mimic the behavior of the end user: the way they navigate and the way they consume information.

Why do we need browser automation?

Simply put, the evolution of front-end technologies. There are some pretty good JavaScript frameworks such as Backbone.js, Knockout.js and Google's very own AngularJS that offer excellent support for the MVC and MVVM patterns. These frameworks redefine the user experience of web applications, so it is important to have browser-based automation testing to measure product quality.

Selenium:

Selenium was originally developed by Jason Huggins in 2004. It is open source software and consists of the following components.

Selenium IDE:

An Integrated Development Environment, developed as a Firefox extension. It can record, edit and debug tests, and it can also produce executable code. We can export client code in Java, Ruby and other popular languages as well.

Selenium Client API:

A scripting SDK (available for most of the popular languages) that provides more control for testers. We will see a Java example later in this article.

Selenium WebDriver:

The successor to Selenium RC, WebDriver accepts commands from the Client API and sends them to the browser. It is the main interface between your test cases and the browser. Selenium has web drivers for almost all the major web browsers, such as Firefox, Chrome and so on.

Hello World Selenium:

The following program will open the Google home page and search for "Java complete reference". We are using the Firefox driver to test it out.

Maven Dependency:


<dependencies>
    <dependency>
        <groupId>org.seleniumhq.selenium</groupId>
        <artifactId>selenium-firefox-driver</artifactId>
        <version>2.32.0</version>
    </dependency>
    <dependency>
        <groupId>org.seleniumhq.selenium</groupId>
        <artifactId>selenium-server</artifactId>
        <version>2.32.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.httpcomponents</groupId>
        <artifactId>httpcore</artifactId>
        <version>4.2.3</version>
    </dependency>
    <dependency>
        <groupId>org.seleniumhq.selenium</groupId>
        <artifactId>selenium-java</artifactId>
        <version>2.35.0</version>
    </dependency>
    <dependency>
        <groupId>commons-lang</groupId>
        <artifactId>commons-lang</artifactId>
        <version>2.6</version>
    </dependency>
    <dependency>
        <groupId>xml-apis</groupId>
        <artifactId>xml-apis</artifactId>
        <version>1.4.01</version>
    </dependency>
</dependencies>

 

Sample Java code:


package com.example.tests;

import java.util.regex.Pattern;
import java.util.concurrent.TimeUnit;
import org.junit.*;
import static org.junit.Assert.*;
import static org.hamcrest.CoreMatchers.*;
import org.openqa.selenium.*;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.openqa.selenium.support.ui.Select;

public class Google {
    private WebDriver driver;
    private String baseUrl;
    private boolean acceptNextAlert = true;
    private StringBuffer verificationErrors = new StringBuffer();

    @Before
    public void setUp() throws Exception {
        driver = new FirefoxDriver();
        baseUrl = "https://www.google.co.in/";
        driver.manage().timeouts().implicitlyWait(30, TimeUnit.SECONDS);
    }

    @Test
    public void testGoogle() throws Exception {
        driver.get(baseUrl);
        driver.findElement(By.id("lst-ib")).clear();
        driver.findElement(By.id("lst-ib")).sendKeys("java complete reference");
        driver.findElement(By.xpath("//ol[@id='rso']/li/div/h3/a/em[2]")).click();
    }

    @After
    public void tearDown() throws Exception {
        driver.quit();
        String verificationErrorString = verificationErrors.toString();
        if (!"".equals(verificationErrorString)) {
            fail(verificationErrorString);
        }
    }

    private boolean isElementPresent(By by) {
        try {
            driver.findElement(by);
            return true;
        } catch (NoSuchElementException e) {
            return false;
        }
    }

    private boolean isAlertPresent() {
        try {
            driver.switchTo().alert();
            return true;
        } catch (NoAlertPresentException e) {
            return false;
        }
    }

    private String closeAlertAndGetItsText() {
        try {
            Alert alert = driver.switchTo().alert();
            String alertText = alert.getText();
            if (acceptNextAlert) {
                alert.accept();
            } else {
                alert.dismiss();
            }
            return alertText;
        } finally {
            acceptNextAlert = true;
        }
    }
}
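The recorded locators above (the lst-ib id and the result XPath) are tied to Google's markup and tend to break. Below is a minimal hand-written sketch of the same search against the Selenium 2.x Client API used above; the By.name("q") locator, the submit() call and the 10-second explicit wait are assumptions added for illustration, not part of the original recording.

import java.util.concurrent.TimeUnit;

import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

public class GoogleSearchTest {

    private WebDriver driver;

    @Before
    public void setUp() {
        driver = new FirefoxDriver();
        driver.manage().timeouts().implicitlyWait(30, TimeUnit.SECONDS);
    }

    @Test
    public void searchShowsResults() {
        driver.get("https://www.google.co.in/");

        // Locate the search box by its (more stable) name attribute.
        WebElement searchBox = driver.findElement(By.name("q"));
        searchBox.clear();
        searchBox.sendKeys("java complete reference");
        searchBox.submit();

        // Explicitly wait until the result container is present instead of
        // clicking a recorded, markup-dependent XPath.
        new WebDriverWait(driver, 10).until(
                ExpectedConditions.presenceOfElementLocated(By.id("rso")));
    }

    @After
    public void tearDown() {
        driver.quit();
    }
}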


Tech Co-founder – Look before you leap.

Have you heard this many times: "I've got this killer idea; I just need a technical co-founder. Will you jump onto the ship for a hard voyage?" Then keep reading.

Keep this in mind: the idea is just a part of a successful product. Maybe you can say it contributes 5% of a product. When someone says "this is a unique idea, don't tell anyone", he is probably the most clueless person you have ever faced. Just carry on with your business.

There is no such thing as a unique idea. There were lots of search engines before and after Google. There were lots of social networks before and after Facebook. (Remember, LinkedIn started a year before Facebook.) The idea is not your product; the ability to sustain and keep innovating is what yields your product. In short, it is all about implementation. Remember, the tech co-founder is that implementation guy.

 

Before you jump in to be part of any voyage, here is a checklist for you.

Ask the following questions.

 

What does the "idea" guy bring to the table?

The non-tech co-founder should have the following skills.

  1. Strong domain understanding and the ability to innovate in their domain.
  2. Strong connections and a network within their domain.
  3. Strong sales and marketing experience. (This is a must.)
  4. A strong vision for the product and where they want to take it in the next three years.
  5. The ability to understand and respect technical difficulties.
  6. A strong sense of user experience and the ability to create wireframes.

 

What is that "idea"?

  1. Do you feel the idea is sustainable? As a user or a potential user, would you use it?
  2. Does the idea solve any real-world problem?
  3. How realistic is the idea?
  4. Will your life become a lot better with that product?

 

What is the benefit for me?

 

  1. "You are my tech co-founder, the CTO. You will hold a 2-3% stake in the company." Just get lost. There is no point being part of it for such a marginal share while spending your valuable time. Instead you could solve challenging problems on kaggle.com or help people on stackoverflow.com.
  2. Are you getting good pay for your consulting? Is that money worth your time? If not, keep moving on.

 

If your checklist gives negative marks all the way through, remember there are enough technical challenges to be solved in the world to better human life. Just don't waste your time.


Continuous code re-factoring – part 2: Unit testing on the container, the Arquillian way


The Problem:

JUnit is probably one of the top 5 open source tools developed by the Java community. Until IoC and Dependency Injection came into the picture, all was well. Once containers took over injecting objects for you, product development became a lot easier: programmers just describe how an object should be created and the container takes care of injecting it. On the downside, it makes unit testing a lot harder. To unit test your methods, you need to mock all the objects a particular method depends on, and one of the most common of those objects is your EntityManager. Mocking every object for your unit tests is a time-consuming effort that eventually eats into your development time. Developers tend to move away from writing unit test cases, which hurts code coverage. The lower the code coverage, the more the code breaks. So your continuous code re-factoring is at huge risk.
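To make that pain concrete, here is a minimal sketch of what mocking an EntityManager typically looks like with a mocking library such as Mockito; the Mockito calls are standard, but the EmployeeService and Employee classes are purely illustrative assumptions, not part of this article's project.

import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import javax.persistence.EntityManager;
import org.junit.Test;

public class EmployeeServiceMockTest {

    // Hypothetical service under test; in real code the EntityManager
    // field would be injected by the container.
    static class EmployeeService {
        EntityManager em;
        String findName(long id) {
            Employee e = em.find(Employee.class, id);
            return e != null ? e.getName() : null;
        }
    }

    static class Employee {
        private final String name;
        Employee(String name) { this.name = name; }
        String getName() { return name; }
    }

    @Test
    public void findName_returnsNameFromEntityManager() {
        // Every collaborator the container would normally inject has to be mocked by hand.
        EntityManager em = mock(EntityManager.class);
        when(em.find(Employee.class, 1L)).thenReturn(new Employee("Alice"));

        EmployeeService service = new EmployeeService();
        service.em = em;

        assertEquals("Alice", service.findName(1L));
        verify(em).find(Employee.class, 1L);
    }
}

Multiply this boilerplate by every injected collaborator in every class under test and the cost becomes obvious.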

The saviour – Arquillian:

Arquillian is an open source test framework from JBoss. It integrates with JUnit and makes your existing test cases run inside a container. The way Arquillian works is that instead of mocking the dependent objects, you describe which objects your test depends on. Arquillian bundles the dependent classes, creates a WAR, deploys it to the JBoss server and runs the unit tests on your actual JBoss server.

The Advantage :

1. Developers do not need to mock the objects.

2. We are actually testing container-injected objects, just like your code will run after you deploy the WAR file.

3. Since only the required classes get deployed, you are not deploying the entire WAR file, so the tests run faster.

4. It takes the pain away from developers and speeds up development.

The Hello World:

Let's see how we can run a simple Hello World test case that injects the EntityManager. I'm taking the EntityManager as an example because it is a common use case and I found very few resources for it on the web.

Step 1:

Configure your JBoss home directory as an environment variable.

JBOSS_HOME=<jboss home directory>

Step 2:

Maven dependency configuration


<properties>
    <version.shrinkwrap.resolvers>2.0.0-beta-5</version.shrinkwrap.resolvers>
</properties>


<dependency>
    <groupId>org.jboss.as</groupId>
    <artifactId>jboss-as-arquillian-container-managed</artifactId>
    <version>7.1.1.Final</version>
    <scope>test</scope>
    <exclusions>
        <exclusion>
            <groupId>org.apache.felix</groupId>
            <artifactId>org.apache.felix.resolver</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.apache.felix</groupId>
    <artifactId>org.apache.felix.resolver</artifactId>
    <version>1.0.0</version>
</dependency>
<dependency>
    <groupId>org.jboss.arquillian.protocol</groupId>
    <artifactId>arquillian-protocol-servlet</artifactId>
    <version>1.0.4.Final</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.jboss.arquillian.junit</groupId>
    <artifactId>arquillian-junit-container</artifactId>
    <version>1.0.4.Final</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.jboss.spec</groupId>
    <artifactId>jboss-javaee-web-6.0</artifactId>
    <version>3.0.2.Final</version>
    <type>pom</type>
    <scope>provided</scope>
    <exclusions>
        <exclusion>
            <groupId>xalan</groupId>
            <artifactId>xalan</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.jboss.shrinkwrap.resolver</groupId>
    <artifactId>shrinkwrap-resolver-api</artifactId>
    <version>${version.shrinkwrap.resolvers}</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.jboss.shrinkwrap.resolver</groupId>
    <artifactId>shrinkwrap-resolver-spi</artifactId>
    <version>${version.shrinkwrap.resolvers}</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.jboss.shrinkwrap.resolver</groupId>
    <artifactId>shrinkwrap-resolver-api-maven</artifactId>
    <version>${version.shrinkwrap.resolvers}</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.jboss.shrinkwrap.resolver</groupId>
    <artifactId>shrinkwrap-resolver-spi-maven</artifactId>
    <version>${version.shrinkwrap.resolvers}</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.jboss.shrinkwrap.resolver</groupId>
    <artifactId>shrinkwrap-resolver-impl-maven</artifactId>
    <version>${version.shrinkwrap.resolvers}</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.jboss.shrinkwrap.resolver</groupId>
    <artifactId>shrinkwrap-resolver-impl-maven-archive</artifactId>
    <version>${version.shrinkwrap.resolvers}</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>xml-apis</groupId>
    <artifactId>xml-apis</artifactId>
    <version>1.4.01</version>
</dependency>
<dependency>
    <groupId>mysql</groupId>
    <artifactId>mysql-connector-java</artifactId>
    <version>5.1.21</version>
</dependency>

Step 3:

In src/test/resources, create an XML file named arquillian.xml:

<?xml version="1.0" encoding="UTF-8"?>
<arquillian xmlns="http://jboss.org/schema/arquillian"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://jboss.org/schema/arquillian
        http://jboss.org/schema/arquillian/arquillian_1_0.xsd">

    <!-- Uncomment to have test archives exported to the file system for inspection -->
    <!-- <engine> -->
    <!--     <property name="deploymentExportPath">target/</property> -->
    <!-- </engine> -->

    <!-- Force the use of the Servlet 3.0 protocol with all containers, as it is the most mature -->
    <defaultProtocol type="Servlet 3.0" />

    <container qualifier="jboss">
        <protocol type="jmx-as7">
            <property name="executionType">REMOTE</property>
        </protocol>
    </container>
</arquillian>

Step 4:
In src/test/resources/META-INF, create a file named test-persistence.xml like this:


<?xml version="1.0" encoding="UTF-8"?>
<persistence version="2.0"
    xmlns="http://java.sun.com/xml/ns/persistence"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="
        http://java.sun.com/xml/ns/persistence
        http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd">

    <persistence-unit name="stats-demo" transaction-type="JTA">
        <provider>org.hibernate.ejb.HibernatePersistence</provider>
        <!-- Declare here, with a <jta-data-source> element, your JTA data source
             configured in standalone.xml | domain.xml -->
        <properties>
            <property name="hibernate.dialect" value="org.hibernate.dialect.MySQLDialect" />
            <property name="hibernate.show_sql" value="true" />
            <property name="hibernate.format_sql" value="true" />
            <property name="use_sql_comments" value="true" />
            <property name="hibernate.connection.provider_class"
                value="org.hibernate.connection.DatasourceConnectionProvider" />
            <property name="transaction.factory_class"
                value="org.hibernate.transaction.JTATransactionFactory" />
            <property name="hibernate.cache.provider_class" value="org.hibernate.cache.HashtableCacheProvider" />
        </properties>
    </persistence-unit>

</persistence>

Step 5:
We have completed all the XML configuration needed to get started with Arquillian. Now for the JUnit test class:

package com.ananth.learning.unittest.arquillian;

import java.io.File;
import java.util.ArrayList;
import java.util.List;

import javax.inject.Inject;
import javax.persistence.EntityManager;

import org.jboss.arquillian.container.test.api.Deployment;
import org.jboss.arquillian.junit.Arquillian;
import org.jboss.shrinkwrap.api.Archive;
import org.jboss.shrinkwrap.api.ShrinkWrap;
import org.jboss.shrinkwrap.api.asset.EmptyAsset;
import org.jboss.shrinkwrap.api.spec.WebArchive;
import org.jboss.shrinkwrap.resolver.api.maven.Maven;
import org.junit.Test;
import org.junit.runner.RunWith;

/**
 * @RunWith(Arquillian.class) makes sure the test case runs as an Arquillian test case.
 */
@RunWith(Arquillian.class)
public class HelloWorld {

    @Deployment
    public static Archive<?> createDeployment() {
        // Resolve any libraries declared as dependencies in your Maven configuration (pom.xml).
        File[] libs = Maven.resolver().loadPomFromFile("pom.xml")
                .resolve(getDependencyLibs()).withTransitivity().asFile();

        // System.out.println(libs);

        /*
         * Add the dependent classes and libs into the web archive.
         * Check the API: addClass() adds a single class;
         * addClasses(), addPackage() and addPackages() also work for you.
         * Note: you don't need a beans.xml; JBoss will create it by itself.
         */
        WebArchive jar = ShrinkWrap
                .create(WebArchive.class, "test.war")
                .addClass(HelloWorld.class)
                .addAsLibraries(libs)
                .addAsResource("META-INF/test-persistence.xml",
                        "META-INF/persistence.xml")
                .addAsManifestResource(EmptyAsset.INSTANCE, "beans.xml");

        // System.out.println(jar.toString(true));

        return jar;
    }

    // Let the container inject it for you.
    @Inject
    private EntityManager em;

    /**
     * Your actual test case.
     */
    @Test
    public void testHello() {
        System.out.println(em.toString());
    }

    /**
     * List of dependency modules from pom.xml that you may need,
     * e.g. your common module or persistence module.
     */
    private static List<String> getDependencyLibs() {
        List<String> list = new ArrayList<>();
        list.add("mysql:mysql-connector-java");
        return list;
    }
}

Now all you need to do is run the test case just like you run a normal JUnit test. You will see the WAR being generated and the unit test running on your JBoss server!

Continuous code re-factoring – why we need to do it – part 1

Continuous code re-factoring is an important part of software development. With the growing adoption of Agile development, code is often written to move fast. The problem with this approach is that the code may lose re-usability, since it looks to solve the immediate problem rather than deliver re-usability. Business problems often change abruptly; the problem you solved a year back may not be relevant now, or may need a better solution. This increases the "technical debt" in the code and eventually eats up your entire budget.

The benefits:

I've listed some of the key benefits of doing continuous code re-factoring.

  •  You will know which part of the code has more technical debt. Fixing it continuously improves code quality and stabilises the product.
  •  You can be sure there is no redundant code in your product, and that you are following the DRY principle.

Remember: "The code you've not written is the code you don't need to debug."

  •  Re-factoring code together as a team improves the team's domain understanding. It helps the product through innovation and continuous improvement of quality.
  •  Continuous code re-factoring also helps you understand your code better. This lets you estimate the cost and time of future business requirements more accurately, so there are no surprises in product delivery.

Reasons people give for not doing continuous code re-factoring:

1. The most common excuse for not re-factoring is that the code is already running fine: re-factoring may introduce bugs, and the cost and effort to test it again is very high. But as you can see, adding more technical debt eventually increases the maintenance cost of the product. I believe almost every product spends around 80% of its investment on maintenance rather than development.

How can I do it? – The challenge and the solution:

The question will eventually arise: okay, I want to follow the continuous code re-factoring principle, but at the same time I want to reduce the risk of the product breaking. I don't want to burn out my testers every time I re-factor the code.

The best way is, as you guessed, "automation testing". Code coverage is an important principle in software development. It makes sure each and every piece of code is covered by your unit test cases, so that you won't face any surprises during the testing phase.

Well, my experience is all about software development using Java technologies. One of the recent trends is Dependency Injection. CDI (Contexts and Dependency Injection) was included as part of Java EE 6 and it is growing strongly. The container takes responsibility for injecting an object when it is needed, rather than us creating it. It reduces the burden of development, but increases the pain of unit testing. The standard JUnit test framework runs standalone and has no connection with Java containers, so we need to mock all the objects the container would inject. Mocking leads to errors; you can't really test the real object. That is where the Arquillian testing framework helps you out. Arquillian is built around the JUnit framework and is able to run inside the container. So instead of building and deploying the entire infrastructure, you can deploy just the class or object you want to test into the server and test it out. That way you can automate most of your code base.

But is that enough? No. Web applications often consist of well-defined web flows; we need to navigate through certain flows to achieve certain functionality. This is more of an integration test. How do we automate it? How can we automate a test case that behaves like an end user using the product, and capture that journey? Selenium is the answer. Again it is built on top of the JUnit framework, and it can invoke a browser on its own and execute the flow. It has wonderful plugins for all the leading browsers. The Selenium browser plugin records the flow / URLs you access in your browser and exports them as auto-generated JUnit test cases. Beautiful, isn't it?

In part 2 I have covered how to get started with Arquillian. In part 3 I have covered how we can automate browser testing for an end-to-end application using Selenium.

Hadoop MapReduce – Best Practices


Here are some of the best practices for developing MapReduce programs. You can see why we need Hadoop here.

1. Use a Combiner:
A combiner is a mini-reducer. Most of the time the code for the combiner will be the same as the reducer; the combiner also extends your Reducer class to implement the reduce functionality. The major advantage of a combiner is that it reduces the data on each map task itself, which reduces network I/O and speeds up execution.
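As a sketch of how this is wired up (assuming Hadoop 2.x's new mapreduce API; the class and input/output paths are illustrative), here is a classic word-count driver that reuses its reducer as the combiner:

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    public static class TokenizerMapper
            extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, ONE);
            }
        }
    }

    public static class IntSumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        // Reuse the reducer as the combiner: partial sums are computed on each
        // map task, so far less data crosses the network to the reducers.
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}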

2. Use Data Compression:
Hadoop supports common compression formats like DEFLATE, gzip, bzip2, LZO and Snappy. Since Hadoop splits files into blocks, it is always best to use a splittable compression format like bzip2. Hadoop also supports the Avro file format, which standardises the serialization of objects, and the sequence file format is a good fit for compressed input. Compression also helps to handle the small-files problem in Hadoop.
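Continuing the word-count driver sketched above (and assuming Hadoop 2.x property names), compressing the job output with a splittable codec and switching on map-output compression looks roughly like this:

import org.apache.hadoop.io.compress.BZip2Codec;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Compress the final job output with a splittable codec (bzip2).
FileOutputFormat.setCompressOutput(job, true);
FileOutputFormat.setOutputCompressorClass(job, BZip2Codec.class);

// Also compress intermediate map output to cut shuffle traffic
// (property name assumed for Hadoop 2.x).
job.getConfiguration().setBoolean("mapreduce.map.output.compress", true);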

3. Distributed Cache: use it only for small files.
The distributed cache is a way of distributing side data. You often need lookup data to perform a computation, for instance using an employee id to write the employee name in the output, or using the IP address in an Apache log to get the city / country name. If the cached file is large, HDFS will store it across blocks and it will be time-consuming to read that data just for a lookup.
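A minimal sketch of the pattern, assuming the Hadoop 2.x Job.addCacheFile() / getCacheFiles() API and a hypothetical lookup file path:

// In the driver: ship a small lookup file to every task node.
job.addCacheFile(new java.net.URI("/lookup/ip-to-city.txt"));

// In the mapper: load the cached copy once, in setup(), into an in-memory map.
@Override
protected void setup(Context context) throws IOException, InterruptedException {
    java.net.URI[] cacheFiles = context.getCacheFiles();
    // Parse the (small) file referenced by cacheFiles[0] into a HashMap here,
    // then do per-record lookups in map() without touching HDFS again.
}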

4. Choose a larger HDFS block size:
The HDFS block size is typically set between 64 MB and 512 MB. Hadoop is designed to work on large amounts of data, reducing disk seek time and increasing computation speed. So always define the HDFS block size large enough to allow Hadoop to compute effectively.

5. Set reducers to zero if you are not using them:
Sometimes we don't really need reducers, for example when filtering or reducing noise in data. When you are not using a reducer, always make sure to set the reducer count to zero, since sorting and shuffling are expensive operations.
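In the driver this is a single setting (a sketch, continuing the job object from the earlier example):

// Map-only job (e.g. pure filtering): skip the sort/shuffle/reduce phase entirely;
// each map task then writes its output straight to HDFS.
job.setNumReduceTasks(0);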

6. Chain the Jobs:
The fundamental idea of MapReduce is modularity: modularise your problem and think of solving it the MapReduce way. Chain your jobs so that in a complex problem, if a failure happens midway, you can still carry on from the last completed job. It also simplifies the problem and makes it easier to solve.
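One simple way to chain two jobs sequentially (a sketch, assuming job1 and job2 are Job instances configured elsewhere and that job2 reads job1's output directory):

// Run the first job and only start the second one if it succeeded,
// feeding job1's output directory in as job2's input.
Path intermediate = new Path("/tmp/job1-output");
FileOutputFormat.setOutputPath(job1, intermediate);
if (job1.waitForCompletion(true)) {
    FileInputFormat.addInputPath(job2, intermediate);
    System.exit(job2.waitForCompletion(true) ? 0 : 1);
}
System.exit(1);

For longer pipelines, keeping each intermediate directory around means a failed step can be re-run without repeating the earlier jobs.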

7. Always write unit tests and run on a small data set:
This is best practice in any kind of programming. Hadoop comes with good support for unit testing your mapper as well as your reducer.
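For example, with the MRUnit library (an Apache project built for exactly this; the dependency and the TokenizerMapper class from the earlier word-count sketch are assumptions), a mapper can be tested without any cluster:

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mrunit.mapreduce.MapDriver;
import org.junit.Before;
import org.junit.Test;

public class TokenizerMapperTest {

    private MapDriver<LongWritable, Text, Text, IntWritable> mapDriver;

    @Before
    public void setUp() {
        // Wrap the mapper in an MRUnit driver; no HDFS or cluster needed.
        mapDriver = MapDriver.newMapDriver(new WordCount.TokenizerMapper());
    }

    @Test
    public void emitsOneCountPerToken() throws Exception {
        mapDriver.withInput(new LongWritable(0), new Text("hadoop hadoop"))
                 .withOutput(new Text("hadoop"), new IntWritable(1))
                 .withOutput(new Text("hadoop"), new IntWritable(1))
                 .runTest();
    }
}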

8. Choose the number of mappers and reducers wisely:
The general rule of thumb for choosing the number of mappers and reducers is
total mappers or reducers = number of nodes * maximum number of tasks per node
and
maximum number of tasks per node = number of processors per node - 1 (since the data node and task tracker take one processor)

Say we have 50 nodes and each node has 4 processors, then
total number of mappers or reducers = 50 * (4 - 1) = 150.

A walk through of Grails - Part 2 - Maven Integration


This is part 2 of the walk through of Grails. In this post we will see how Maven can be integrated with Grails. You can see part 1 here.

Why I like Maven
I like Maven because of its dependency management and because it builds on top of a typical build life cycle, so it is easy to manage the dependency libraries as well as the entire build life cycle. Profile-based configuration is another important feature, letting you manage your dependencies and build process differently for each environment.

How to take advantage of Maven in Grails?
As we saw in the first post of this Grails introduction, convention over configuration makes Grails application development fast and reliable. The power of Maven can be a significant advantage in a Grails-based application. There are two stages of a Grails project at which we can integrate Maven:

  • Existing Grails Project
  • Grails project from scratch

Existing Grails Project
Grails comes with a pretty good command line tool that makes your job easier. Just typing the command below in your Grails project home will add Maven support.

grails create-pom <group-id> 

The create-pom command expects the group id of your project.

Grails Project from scratch
The traditional archetype command will do the job for you.

mvn archetype:generate -DarchetypeGroupId=org.grails \
    -DarchetypeArtifactId=grails-maven-archetype \
    -DarchetypeVersion=1.0 \
    -DgroupId=grails-app -DartifactId=grails

Note:
Once you have generated the pom.xml, you need to change

<plugin>
  <artifactId>maven-compiler-plugin</artifactId>
  <configuration>
    <source>1.5</source>
    <target>1.5</target>
  </configuration>
</plugin>

into

<plugin>
  <artifactId>maven-compiler-plugin</artifactId>
  <configuration>
    <source>1.6</source>
    <target>1.6</target>
  </configuration>
</plugin>

Then go to the project home and issue

mvn initialize

Job done. You have integrated Maven with your Grails application. You can still use all the Grails commands as Maven goals; all you need to do is add the Grails commands as goals of the grails-maven-plugin. Some of the commands I have integrated are:


<plugin>
        <groupId>org.grails</groupId>
        <artifactId>grails-maven-plugin</artifactId>
        <version>${grails.version}</version>
        <extensions>true</extensions>
        <executions>
          <execution>
            <goals>
              <goal>init</goal>
              <goal>maven-clean</goal>
              <goal>validate</goal>
              <goal>config-directories</goal>
              <goal>maven-compile</goal>
              <goal>maven-test</goal>
              <goal>maven-war</goal>
              <goal>maven-functional-test</goal>
               <goal>exec</goal>
               <goal>create-controller</goal>
                <goal>run-app</goal>
                <goal>console</goal>
            </goals>
          </execution>
        </executions>
      </plugin>

So if you want to run the application with the in-built server, all you need to do is:

mvn grails:run-app

A walk through of Grails - Part 1 - Introduction


This is part 1 of the Grails articles (note: click here to see why it is not called Groovy on Grails), in which I will cover the flexibility of Grails. You can see part 2 here.

One of my friends threw this at me, asking me to try out Grails. I have explored Ruby on Rails and CodeIgniter, and I have always been fond of convention over configuration (or code over configuration) right from the age of Java Beans. That simplicity reduces development time and increases productivity.

With all these years in web development, especially with Java, I have found these to be the important features any Java web framework should support.

I will explore them one by one with examples in my forthcoming blog posts. Another interesting aspect is the command line tool Grails provides, which makes our job much easier. The only disappointment I had is that it does not support the Servlet 3.0 spec, so you can't really run it on Tomcat 7 and higher.

Setting up Grails:

Grails runs on Groovy, so we need Groovy set up on the machine. (Note: this setup is for Ubuntu.)
Groovy can be installed from the Ubuntu repository.

>sudo apt-get install groovy

To test the installation

>groovysh

This should open a Python-like interactive prompt to test your scripts quickly.

>groovyconsole

This will open a GUI-based Swing tool to test your scripts quickly.

Installing Grails:

The simplest way to install Grails is from the Ubuntu repositories.

>sudo apt-cache search grails

and get the latest version of Grails. In my demo I have used

> sudo apt-get install grails-1.3.7

But you can't get the latest version from the repository. You need to do either a manual installation or integrate with Maven, which I will cover in my next blog post.
To start a new project, simply issue the command below.

 >grails create-app grails-app

The folder structure of the hello world application looks like this:

Grails folder Structure

conf:
The conf folder has the configuration files for Hibernate and Spring, if you don't want to use GORM and want to override it.
controllers:
Holds all the controller Groovy files.
domains:
All your domain objects, which interact with the data source, go here.
view:
The view part, Groovy Server Pages (GSP), goes here.
services:
If you want to expose the app as services, all your web service code goes here.
i18n:
The internationalization support goes here.
taglib:
Grails supports developing your own taglibs as well.
plugins:
Grails allows you to develop plugins. You can find some here.
src:
If you want to develop any components in Java or Groovy, you can create them here and reuse them with Grails.
test:
All your test cases for the Grails app go here.
webapp:
The typical Java web application folder, which holds all the js, image, css and container configuration files.

The Hello World:
Let's create a simple controller and run our hello world. Grails has an interesting set of command line tools which help speed up the process. You can check the help by executing:

 grails help 

Now, to create a controller, issue:

grails create-controller com.test.hello

This will create a HelloController.groovy file inside the com.test package, with an index() action.

package com.test
class HelloController {
    def index() {
         def myname = "hello groovy"         
         render "This is cool ${myname}"
    }
   
}

Now run the built-in server to test it out.

> grails run-app

You can see

server running on http://localhost:8080/grails-app

Now in your browser, just try

http://localhost:8080/grails-app/hello/index

There you are, you cooked your first fish on Grails.