Build a REST API in under five minutes

The computing paradigm is changing rapidly; this is the era of mobile computing. What does that mean? Pretty simple: all you need is a smartphone, be it an Android, an iPhone or a Lumia, and the entire world is yours. The app stores and markets are flooded with millions of apps uploaded or updated every day. Clearly, mobile apps are the developer’s paradise today.

That brings us to the next part of our discussion. The mobile can do almost everything our computer does. Take an example: an app called Foursquare. It exists as a web service as well as a mobile app, and both do pretty much the same thing. The website runs in a browser on a PC or Mac with, say, an i5 processor and 4 GB of RAM, while the app runs on a mobile device with, say, a Qualcomm processor and 512 MB of RAM. Here’s how they look on the two platforms.

Foursquare PC vs Mobile version.

The trick here is that the client apps do not do the heavy processing. They simply talk to the server, which performs most of the processing and returns the response to the mobile apps. That brings us to the obvious question: how do they interact? Answer: via an API.

Here, we are only interested in RESTful APIs. A REST API is a very simple form of API (Application Programming Interface) that interacts with the server via explicit URL patterns, used either to fetch data from the server (GET) or to send data to the server (POST). You can study REST further here. A REST request is just a URL, such as GET http://yourserver/user, and you are probably quite familiar with the form:


The point of using REST is that when the mobile app sends a request to your server to GET the details of a particular user, the server returns the data in a portable format, be it XML or JSON. It typically looks like this:

   "_id": "xyz",
   "code": "N/A",
   "email": "",
   "joinDate": "2015-09-20T05:59:30.675Z",
   "lat": "22.64999833333333",
   "lon": "88.45",
   "name": "XYZ",
   "token": "86f7a143bc4f1f4c"

Now it’s the job of the mobile app code to parse the JSON response and format it accordingly. Thus, the entire architecture looks like this:

Architecture of Workflow

Now, how can you build such an API in under 5 minutes? Easy. You need three things:

  1. NodeJS
  2. Restify framework
  3. MongoDB
A server using Restify and Node.js looks as simple as this:

var mongojs = require('mongojs');
var restify = require('restify');
var request = require('request');
var fs = require('fs');

var ip_addr = '';
var port = process.env.PORT || 1337;

/*****************************Modules req**************************/
var config = require('./modules/config.js');

var server = restify.createServer({
 name : "API started..."
});

// parse query strings and POST bodies into req.params
server.use(restify.queryParser());
server.use(restify.bodyParser());

server.listen(port , function(){
 console.log('started %s',;
});
Create a folder called modules in the root folder and, inside it, a file called config.js where we will store all connection strings, like the one for the MongoDB connection.

module.exports = {
 mongoString : "mongodb://"
};
You can get a free 500 MB MongoDB database from MongoLab. Replace mongoString with your connection string. The DB is now set up. All we need to do is handle POST and GET. We don’t need to write explicit models, views and controllers, as restify takes care of those in Node.js.
Creating routes for user:

/*************************MongoDB Server****************************/
var connection_string = config.mongoString;
var db = mongojs(connection_string, ['dbname'], {authMechanism: 'ScramSHA1'});
var user = db.collection("user");

//USER Routes
var USER_PATH = '/user';
server.get({path : USER_PATH , version : '0.0.1'} , findAllUsers);{path : USER_PATH , version : '0.0.1'} , createUser);



function createUser(req , res , next){
 var _user = {};
 _user._id = req.params._id; =; =;
 _user.joinDate = new Date();

 res.setHeader('Access-Control-Allow-Origin','*'); , function(err , success){
  if (err) {
   console.log('Response error ' + err);
   return next(err);
  }
  console.log('Response success ' + success);
  res.send(201 , _user);
  return next();
 });
}

function findAllUsers(req, res , next){ //finds all users listed
 res.setHeader('Access-Control-Allow-Origin','*'); //header set for CORS requests
 user.find().limit(20).sort({postedOn : -1} , function(err , success){ //limit of 20 users
  if (err) {
   console.log('Response error ' + err);
   return next(err);
  }
  res.end(JSON.stringify(success,null,3)); //pretty-printed JSON response
  return next();
 });
}

The coding part is done. Now run the server with node app.js; it will be listening at http://localhost:1337.
Download Postman from the Chrome Web Store.
Open Postman, enter http://localhost:1337/user as the URL, and fill in the user fields as parameters:


Choose method type as POST and hit Send.

And there you go. This is how you create a data entry. Now simply point your browser at
http://localhost:1337/user and you will get the JSON response listing users.

This is just a tutorial, and I hope it helps you. Building an API is not rocket science anymore. It’s easy, and you can do it in under 5 minutes.


Build a weather forecast system leveraging the power of Windows Azure

In this section, we are going to build a weather forecast system using the power of Windows Azure, Microsoft’s cloud platform. Before we even start with building our application we need to understand the architecture of the application.

1. Understanding the basics of Azure:

Windows Azure is a cloud platform, which means we can scale applications built on it up and out. Say we have a website that handles user requests. When we had 100 users per minute, a single IIS server was sufficient. But as requests grow to 100,000 per minute, that one server becomes the bottleneck. To remove such restrictions, all good cloud platforms offer scalability: you can add processing capability as and when required. Azure offers a great deal of scalability, including scaling out your apps or virtual machines/servers. Suppose we host our web applications on 10 servers/VMs, each equipped with a 2 GHz processor and 2 GB of RAM. We can use this configuration, or we can have 20 VMs with 1.4 GHz processors and 1 GB of RAM without hindering capacity. In fact, fault tolerance and robustness increase as we increase the number of VMs, and the pricing can drop drastically. Here’s an analysis below:
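To make the trade-off concrete, here is a back-of-the-envelope comparison in Python. The per-hour prices are placeholders, not real Azure rates (those appear in the pricing screenshots below):

```python
# Hypothetical hourly prices in cents, chosen only to illustrate the comparison.
MEDIUM_VM_CENTS_PER_HOUR = 12   # assumed: 2 GHz, 2 GB RAM
SMALL_VM_CENTS_PER_HOUR = 6     # assumed: 1.4 GHz, 1 GB RAM

config1 = 4 * MEDIUM_VM_CENTS_PER_HOUR   # 4 medium VMs
config2 = 6 * SMALL_VM_CENTS_PER_HOUR    # 6 small VMs

print(config1, config2)  # 48 36
```

With these made-up numbers, the second configuration is cheaper per hour while also tolerating more individual VM failures.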

Configuration 1 with 4 Medium Size VMs:
Azure Pricing Model 1
Configuration 2 with 6 small sized VMs:
Azure Pricing Model 2

So the second model helps a lot as far as pricing is concerned.

2. Architecture of the Application

Two core concepts are used here. First, we have to understand the topology of the application/service:

  1. The user logs into his account and requests weather data.
  2. User queries are handled by a web service/server, called a Web Role in Azure terms. We shall name it ‘Web Role DataServe’.
  3. The Web Role queries data from a storage service called a Table in Azure.
  4. The weather data found in the storage table is returned and shown to the user.

Front end Looks like this:

Frontend UI

These are the steps as seen from the front end, i.e., what the user sees. But how does the system fetch and process the data in real time?
The back-end view goes like this:

  1. We receive data from a standard open source weather API, the OpenWeatherMap API. We run a Python script that periodically fetches and stores data for the users’ cities. The data obtained is in JSON format. This periodic engine is called a Worker Role. We name it ‘Worker Role DataFetch’.
  2. Once received, the data is stored in Table storage via the standard Python API provided under the azure package.
  3. Thus, whenever the user asks for data it is already there, and we avoid fetching on demand and creating unnecessary waiting time.

The architecture is printed below:

architecture and dataflow

3. Code for Worker Role DataFetch:

For the Worker Role we use a Python script that helps us easily parse our JSON data.

from import TableService  # module path varies across Azure SDK versions
import datetime
import re
import urllib2

table_service = TableService(account_name='*********', account_key='**********')

class Temperature(object):
    def to_celsius(self, deg_f):
        # note the 5.0: plain 5/9 would truncate to 0 in Python 2
        return (deg_f - 32) * 5.0 / 9

url = ""

# fetch the weather page, then search for patterns using regular
# expressions of the form (.+?); the exact patterns below are
# illustrative, matching fields of the OpenWeatherMap response
htmltext = urllib2.urlopen(url).read()
weather_cond = re.findall('"main":"(.+?)","description":"(.+?)"', htmltext)
weather_temp = re.findall('"temp":(.+?),"temp_min":(.+?),"temp_max":(.+?)[,}]', htmltext)
weather_humid = re.findall('"humidity":(.+?)[,}]', htmltext)
weather_windspeed = re.findall('"speed":(.+?),"deg":(.+?)[,}]', htmltext)
weather_winddirection = weather_windspeed  # speed and direction share the "wind" block

#print "Overall Weather status: ", weather_cond[0][1]
# one row per fetch time: RowKey encodes day-month-year-hour-minute
RowKey ="%d%m%Y%H%M")
temperature = weather_temp[0][0]
#print "Minimum Temperature: ", weather_temp[0][1]
#print "Maximum Temperature: ", weather_temp[0][2]
print "Wind speed: ", weather_windspeed[0][0]

def checkduplicate():
    tasks = table_service.query_entities('WeatherFetch', "PartitionKey eq 'data'")

    for task in tasks:
        if task.RowKey == RowKey:  # this fetch time is already stored
            return True
    return False

if(not checkduplicate()):
    fetched_data = {'PartitionKey': 'data', 'RowKey': RowKey, 'temperature': temperature,
                    'min_temperature': weather_temp[0][1], 'humidity': weather_humid[0],
                    'windspeed': weather_windspeed[0][0],
                    'winddirection': weather_winddirection[0][1],
                    'condition': weather_cond[0][1]}
    table_service.insert_entity('WeatherFetch', fetched_data)
    print 'Stored new weather entity'
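The Fahrenheit-to-Celsius helper in the script is worth a quick sanity check (note that OpenWeatherMap actually returns Kelvin by default, so treating the input as Fahrenheit is an assumption of the original script):

```python
class Temperature(object):
    def to_celsius(self, deg_f):
        # 5.0 forces float division, which matters under Python 2
        return (deg_f - 32) * 5.0 / 9

t = Temperature()
print(t.to_celsius(212))  # 100.0 (boiling point)
print(t.to_celsius(32))   # 0.0 (freezing point)
```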

This stores weather forecast data for Kolkata. The script can be run hourly on a small Linux VM via a simple shell script:

#!/bin/bash
# checks whether the time is a full hour like 5:00, 8:00
# and runs the Python script hourly

while true
    now=$(date +"%M")
    if [ "$now" -eq "0" ]; then
        python /home/abhishek/AzureWorkerRole/
        sleep 3540   # skip ahead so the script fires only once per hour
    sleep 30

This script calls the Python Script hourly and stores weather information in the table.

The table details, i.e., the account name and primary key, can be found in the Azure Portal under Storage:

authorization for table storage

4. Fetching Data from table using Web Role DataServe:

This entire documentation can be found at:
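No code is listed for this step, but what the Web Role's query amounts to can be sketched with plain Python structures standing in for the table (the real call would be table_service.query_entities('WeatherFetch', "PartitionKey eq 'data'"), exactly as in the worker role's duplicate check; all data below is hypothetical):

```python
# In-memory stand-in for the WeatherFetch table; RowKey encodes the fetch time.
table = [
    {"PartitionKey": "data", "RowKey": "200920150600", "temperature": "302.15"},
    {"PartitionKey": "data", "RowKey": "200920150700", "temperature": "301.48"},
]

def query_entities(entities, partition_key):
    """Mimics a "PartitionKey eq '...'" filter of TableService.query_entities."""
    return [e for e in entities if e["PartitionKey"] == partition_key]

# DataServe serves the most recent reading to the user.
latest = max(query_entities(table, "data"), key=lambda e: e["RowKey"])
print(latest["temperature"])  # 301.48
```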

So, this wraps up another tutorial in Windows Azure.

Entire Code is available on:

Happy Coding.

Python Lists and Tuples

All the programming languages we are quite familiar with, i.e., C, Java, C#, PHP, etc., support arrays, while the object-oriented ones also offer ArrayList, an interesting feature. But is that all?

If the answer is no, let’s switch to Python, a language built with a certain positive vibe: not to make a programmer’s life a purgatory. With Python, we get code readability; 100 lines of Java can be equivalent to 10 lines of meaningful Python. It’s open source and reliable. The best part is that you can leave the plumbing to the language and work only with what you need.

Coming to the topic, apart from arrays, Python provides several other quintessential data storage elements:


Lists

Lists are pretty much like ArrayLists. You can insert strings, variables, and more, and remove them as you wish. Let’s see some examples:

  • Open IDLE for Python 2.7. You can download it here.
  • First we create a list named Movies containing two movie names:
    >>> Movies=["Terminator 3","Titanic"]
    >>> print(Movies)
    ['Terminator 3', 'Titanic']
  • Now we’ll add another element to this list. <listname>.append(x) inserts x at the end of the list:
    >>> Movies.append("Avatar")
    >>> print(Movies)
    ['Terminator 3', 'Titanic', 'Avatar']
  • What if we want to insert at some other position? <listname>.insert(i, x) inserts x at index i:
    >>> Movies.insert(1,"Psycho")
    >>> print(Movies)
    ['Terminator 3', 'Psycho', 'Titanic', 'Avatar']
  • We can even sort the list:
    >>> Movies.sort()
    >>> print(Movies)
    ['Avatar', 'Psycho', 'Terminator 3', 'Titanic']
  • We can remove an element too:
    >>> Movies.remove("Psycho")
    >>> print(Movies)
    ['Avatar', 'Terminator 3', 'Titanic']


Tuples

Tuples can be categorized as read-only lists. A list is declared with [], whereas a tuple is denoted by (). Tuples are often very useful when we want a read-only archive.

  • Let’s create a tuple first. Note that a single-item tuple is written ("Titanic",) rather than ("Titanic"):
    >>> Movies=("Titanic","Avatar")
    >>> print(Movies)
    ('Titanic', 'Avatar')
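The read-only nature of tuples is easy to verify: item assignment raises a TypeError, and the trailing comma really is what makes a one-item tuple:

```python
movies = ("Titanic", "Avatar")
try:
    movies[0] = "Psycho"          # tuples don't support item assignment
except TypeError:
    print("tuples are read-only")

single = ("Titanic",)             # a one-item tuple needs the trailing comma
not_a_tuple = ("Titanic")         # without it, this is just a string
print(type(single).__name__, type(not_a_tuple).__name__)  # tuple str
```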

Fade animation using WPF

This is a sample based on the rich animation libraries available in WPF. Here we’ll mainly focus on the fade animations. First of all, you need the ‘System.Windows.Media.Animation’ namespace in order to use the animations.

  • First of all we need to include ‘System.Windows.Media.Animation’.
  • Then we’ll fade the element out when the mouse leaves its area and fade it back in when the mouse enters.
  • The target can be a canvas, an ellipse, any geometric shape, or even a grid.

For building or downloading the sample visit my post at MSDN.

Before animation starts :



After fade effect is activated:


Code for entering mouse :

private void canvas1_MouseEnter(object sender, MouseEventArgs e)
{
    Canvas c = (Canvas)sender;
    // animate Opacity back up over 5 seconds (WPF clamps opacity to the 0–1 range)
    DoubleAnimation animation = new DoubleAnimation(1, TimeSpan.FromSeconds(5));
    c.BeginAnimation(Canvas.OpacityProperty, animation);
    textBlock1.Visibility = Visibility.Hidden;
    textBlock2.Visibility = Visibility.Visible;
}

Code for leaving mouse :

private void canvas1_MouseLeave(object sender, MouseEventArgs e)
{
    Canvas c = (Canvas)sender;
    // animate Opacity down to 0 over 5 seconds (the fade-out)
    DoubleAnimation animation = new DoubleAnimation(0, TimeSpan.FromSeconds(5));
    c.BeginAnimation(Canvas.OpacityProperty, animation);
    textBlock2.Visibility = Visibility.Hidden;
    textBlock1.Visibility = Visibility.Visible;
}

Here we create the Canvas and animation objects and then apply the properties.
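A DoubleAnimation given only a target value and a duration performs linear interpolation from the current opacity to the target. The arithmetic it carries out each frame can be sketched as follows (in Python, purely for illustration):

```python
def animate_opacity(start, end, duration_s, t):
    """Value of a linear DoubleAnimation at time t (clamped at the duration)."""
    if t >= duration_s:
        return end
    return start + (end - start) * (t / duration_s)

# Fading out from opacity 1 to 0 over 5 seconds:
print(animate_opacity(1.0, 0.0, 5.0, 0.0))   # 1.0
print(animate_opacity(1.0, 0.0, 5.0, 2.5))   # 0.5 (halfway through)
print(animate_opacity(1.0, 0.0, 5.0, 5.0))   # 0.0
```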

using System;
using System.Collections.Generic;
using System.Linq;
using System.ServiceModel;
using System.Text;
using System.Threading.Tasks;

namespace WcfServiceLibrary1
{
    [ServiceBehavior(InstanceContextMode = InstanceContextMode.Single)]
    public class DataService : IDataService
    {
        List<Data> datas = new List<Data>();

        #region IDataService members

        public void submit_data(Data data)
        {
            data.roll = Guid.NewGuid().ToString();
            datas.Add(data);
        }

        public List<Data> GetData()
        {
            return datas;
        }

        public void remove_data(string roll)
        {
            datas.Remove(datas.Find(e => e.roll.Equals(roll)));
        }

        #endregion
    }
}

High Energy Physics Data Handling using Cloud Computing

We show that distributed Infrastructure-as-a-Service (IaaS) compute clouds can be effectively used for the analysis of high energy physics data. We have designed a distributed cloud system that works with any application using large input data sets and requiring a high-throughput computing environment. The system uses IaaS-enabled science and commercial clusters situated at different sites. We describe the process in which a user prepares an analysis virtual machine (VM) and submits batch jobs to a central scheduler. The system boots the user-specific VM on one of the IaaS clouds, runs the jobs and returns the output to the user. The user application accesses a central database for calibration data during execution. Similarly, the data is held at a central location and streamed to the running application. The system can easily run one hundred simultaneous jobs in an efficient manner and is scalable to many hundreds and possibly thousands of user jobs.


Infrastructure as a Service (IaaS) cloud computing is emerging as a new and efficient way to provide computing to the research community. The growing interest in clouds can be attributed, in part, to the ease of encapsulating complex research applications in Virtual Machines (VMs) with little or no performance degradation [1]. Studies have shown that high energy physics application code runs equally well in a VM. Virtualization not only offers advantages such as abstraction from the underlying hardware and simplified application deployment; in situations where traditional computing clusters have hardware and software configurations incompatible with the scientific application’s requirements, it is the only option available. A key question is how to manage large data sets in a cloud or distributed cloud environment. We have developed a system for running high-throughput batch processing applications using any number of IaaS clouds. This system uses software such as Nebula [2] and Nimbus [3], in addition to custom components such as a cloud scheduling element and a VM image repository. The results presented in this work use IaaS clouds based on Amazon EC2. The total memory and CPU of each computational cluster in the clouds are divided evenly into what we call VM slots, each of which can be assigned to run a VM. When a VM has finished running, that slot’s resources are released and become available to run another VM. The input data and analysis software are located on one of the clouds, and the VM images are stored in a repository on the other cloud. The sites are connected by a research network, while the commodity network connects the clouds to Amazon EC2. Users are provided with a set of VMs configured with the application software. The user submits jobs to a scheduler, where each job script contains a link to the required VM.
A cloud scheduling component (called Cloud Scheduler) searches the job queue, identifies the VM required for each queued job, and sends a request to one of the clouds to boot the user-specific VM. Once the VM is booted, the scheduler submits the user job to the running VM. The job runs and returns any output to a user-specified location. If there are no further jobs requiring that specific VM, Cloud Scheduler shuts it down. The system has been demonstrated to work well for applications with modest I/O requirements, such as the production of simulated data [4]. The input files for this type of application are small and the rate of production of the output data is modest (though the files can be large). In this work, we focus on data-intensive high energy physics applications where the job reads large sets of input data at higher rates. In particular, we use the analysis application of the BaBar experiment [5], which recorded electron-positron collisions at the SLAC National Accelerator Laboratory from 2000 to 2008. We show that the data can be quickly and efficiently streamed from a single data storage location to each of the clouds. We also describe the issues that have arisen and the potential for scaling the system to many hundreds or thousands of simultaneous user jobs.
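The scheduling cycle described above can be sketched in a few lines of Python (all names and data are hypothetical; the real Cloud Scheduler of course talks to a batch scheduler and the cloud APIs):

```python
# Toy model of the Cloud Scheduler cycle: scan the queue, boot the VM a
# job needs on a free slot, then dispatch the job to that VM.
job_queue = [{"job": 1, "vm": "analysis-vm"}, {"job": 2, "vm": "analysis-vm"}]
free_slots = ["cloud-a/slot-0", "cloud-b/slot-0"]
running = []

while job_queue and free_slots:
    job = job_queue.pop(0)      # identify the VM required for the queued job
    slot = free_slots.pop(0)    # request a cloud slot and boot the VM there
    running.append({"slot": slot, "vm": job["vm"], "job": job["job"]})

print(len(running), len(job_queue))  # 2 0 -- both jobs dispatched
```

When a job finishes, its slot would be returned to the free pool, and a VM with no remaining jobs would be shut down, as the text describes.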



Data Management:

Analysis jobs in high energy physics typically require two inputs: event data and configuration data. The configuration data includes the BaBar conditions database, which contains time-dependent information about the conditions under which the events were recorded. The event data can be the real data recorded by the detector or simulated data. Each event contains information about the particles seen in the detector, such as their trajectories and energies. The real and simulated data are nearly identical in format; the simulated data contains additional information describing how it was generated. The user analysis code analyzes one event at a time. In the BaBar experiment the total size of the real and simulated data is approximately 2 PB, but users typically read a small fraction of this sample. In this work we use a subset containing approximately 8 TB of simulated and real data. The event data for this analysis was stored in a distributed file system at one cloud. The file system is hosted on a cluster of six nodes, consisting of a Management/Metadata Server (MGS/MDS) and five Object Storage Servers (OSS). It uses a single gigabit interface/VLAN to communicate both internally and externally. This is an important consideration for the test results presented, because these same nodes also host the IaaS frontend (on the MGS/MDS server) and the Virtual Machine Monitors (on the OSS servers) for the cloud.

The jobs use Xrootd to read the data. Xrootd [6] is a file server providing byte-level access and is used by many high energy physics experiments. It provides read-only access to the distributed data (read/write access is also possible). Though deploying Xrootd is fairly straightforward, some optimization was necessary to achieve good performance across the network: a read-ahead value of 1 MB and a read-ahead cache size of 10 MB were set on each Xrootd client.

The VM images are stored at the other cloud and propagated to the worker nodes over HTTP. For analysis runs that include the Amazon EC2 cloud, we store another copy of the VM images on Amazon EC2.

In addition to transferring the input data on demand using Xrootd, the BaBar software is staged to the VMs on demand using a specialized network file system. This reduces the size of the VM images transferred from the image repository to each cloud site, and hence the amount of data initially transferred when a VM starts. Not only do the VMs start faster, but network saturation after job submission is also mitigated, since some of the data transfer is postponed until after the job has started.


A typical user job in high energy physics reads one event at a time, where an event contains the information of a single particle collision. Electrons and positrons circulate in opposite directions in a storage ring and are made to collide millions of times per second in the center of the BaBar detector. The BaBar detector is a cylindrical detector approximately 5 meters in each dimension. It measures the trajectories of charged particles and the energy of both neutral and charged particles. A fraction of those events are considered interesting from a scientific standpoint, and the information in the detector is written to a storage medium. The size of an event in BaBar is a few kilobytes, depending on the number of particles produced in the collision. One of the features of the system is its ability to recover from faults arising either from local system problems at each of the clouds or from network issues. We list some of the problems we identified in the processing of the jobs. For example, cloud resources can be brought down for maintenance and brought back up again. In our test, the NRC cloud resources were added to the pool after the set of jobs was submitted; Cloud Scheduler automatically detected the newly available resources and successfully scheduled jobs to them without affecting already-running jobs.

Conclusions: From the users’ perspective, the system is robust and handles intermittent network issues gracefully. We have shown that the use of distributed compute clouds can be an effective way of analyzing large research data sets, made possible by the power of cloud computing and distributed file systems.



Mark sheet maintenance using WCF

Before I start with this article, I’d like you to know certain things about WCF, the Windows Communication Foundation. We have all heard about servers and applications running at the server end (the back end), like a Java servlet. The basic idea of this infrastructure is that certain parts of an application are deployed as services across hosts; that is, the application does not run on a single native machine but as a service across several machines connected through a network and sharing data among themselves.

WCF is a system for creating connections between applications using services and endpoints. It is, more than anything, an infrastructure technology for messages. Just as roads carry cars, electricity travels over wires and cables, and pipes convey water, WCF exists to transfer messages between any two endpoints. And it does so securely: messages can be encrypted to keep your information safe from tampering. A standard example would be a data-integration service for a Windows Forms or WPF application that you develop, or even a Silverlight-based RIA (Rich Internet Application).

So, before we start, let’s get the basics clear :

  • We have a DataContract where we add the data members.
  • We have a ServiceContract where we declare the operations to be performed on the data, i.e., the methods.
  • Finally, we have a ServiceBehavior where we specify how the service should be executed, i.e., how our WCF application should behave.

With this clear, we shall move on to our project.

  1. First create a WCF Service Library in Visual Studio :

2. Add a Data.cs class to the project and enter the following code:

[Screenshot: the Data.cs code]

3. Now add another class to the project named IDataService.cs, change it to an interface (instead of a class), and enter the code:

[Screenshot: the IDataService.cs interface]

4. Now add another class named DataService.cs and add the following code:

[Screenshot: the DataService.cs code]

5. Now build the project.

6. Now that our model is ready, we need to make certain changes to the App.config file so that our application works as required in the host.

First, edit the WCF configuration of app.config by right-clicking it. Check the Services tab; in the browse window, go to e:\programming\c#\marksheet\marksheet\bin\Debug (or your project location) and choose the appropriate service DLL.

[Screenshot: choosing the service DLL in the WCF configuration editor]


Now, let’s set up some endpoints: choose the empty endpoint name, and in the service endpoint window browse to the appropriate contract, as above.

[Screenshot: selecting the endpoint contract]


Now close, save all if prompted, and deploy (Ctrl+F5).

[Screenshot: the deployed service showing its methods]


Now there you have your methods. Let’s invoke the Submit_Data() method to add an entry to our database:

[Screenshot: invoking Submit_Data()]


Continue with the other methods, and that’s it. Simple and easy.


Virtualization with XEN : The backend of the Cloud

What is XEN ?

Xen is the most popular open source virtualization software. It allows multiple operating systems to run concurrently on the same computer hardware, thereby improving the effective usage and efficiency of the underlying hardware. It gives enterprises the power of consolidation, increased utilization and rapid provisioning.

The back end of our cloud setup runs the Xen hypervisor to support virtualization of instances, or nodes. The eucalyptus-nc package is installed on the node controller(s) running our back end.

Steps for BACK END setup:

1. Prepare a raw Ubuntu 12.04 system, preferably the server edition.

2. Install the Xen hypervisor following these steps:

    sudo sed -i 's/GRUB_DEFAULT=.*\+/GRUB_DEFAULT="Xen 4.1-amd64"/' /etc/default/grub
    sudo update-grub
    sudo sed -i 's/TOOLSTACK=.*\+/TOOLSTACK="xm"/' /etc/default/xen
    sudo reboot

3. Check that the hypervisor is running:

    sudo xm list

The following output is obtained:

    Name                      ID  Mem  VCPUs   State       Time(s)
    Domain-0                   0  945     1    r-----      11.3

A simple Mail client using C#

Every now and then we need to e-mail our friends, family and more. But the available suites are not only complicated but also time-consuming. They are mainly targeted at business use, like Microsoft Office Outlook or Windows Live Mail. Those of you who use such mail clients know that they can feel sluggish and keep you waiting a minute or so just to synchronize (send/receive) your mail accounts. What if I just need to send a mail within a couple of seconds, to a friend or to a mailbox for a subscription closing in a minute? We need faster client-side apps for mailing. I understood the need for such a piece of software, and I’m glad to present it to you.

You can download it from here

A screenshot of Mammail.


Building the Sample

We need to understand the parts of the program. First, develop a UI suitable for a mail client; it must contain:

i. A sender field and a receiver field.

ii. A credentials panel to sign in with your Gmail username and password.

iii. A subject box to type the subject.

iv. An attachment box to attach files.

v. A message box to fill in the mail body.


The UI would look something like this :


Here’s the code for all developers :

using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Linq;
using System.Text;
using System.Windows.Forms;
using System.Net.Mail;
using System.Net.Mime;

namespace mail_client
{
    public partial class Form1 : Form
    {
        String path;
        MailMessage mail = new MailMessage();

        public Form1()
        {
            InitializeComponent();
        }

        // "Login" button: only validates that credentials were typed in.
        private void button5_Click(object sender, EventArgs e)
        {
            if (textBox4.Text == "" || textBox5.Text == "")
                MessageBox.Show("Please enter proper credentials\n");
            else
                MessageBox.Show("Successfully logged in");
        }

        // "Send" button: builds the message and sends it over SMTP.
        private void button3_Click(object sender, EventArgs e)
        {
            try
            {
                SmtpClient SmtpServer = new SmtpClient();
                SmtpServer.Credentials = new System.Net.NetworkCredential(textBox4.Text, textBox5.Text);
                SmtpServer.Port = 587;
                SmtpServer.Host = "";
                SmtpServer.EnableSsl = true;

                mail = new MailMessage();
                mail.From = new MailAddress(textBox4.Text, textBox4.Text, System.Text.Encoding.UTF8);

                // textBox1 holds a comma-separated list of recipients
                String[] recipients = textBox1.Text.Split(',');
                for (int i = 0; i < recipients.Length; i++)
                    mail.To.Add(new MailAddress(recipients[i].Trim()));

                mail.Subject = textBox3.Text;
                mail.Body = richTextBox1.Text;

                // attachments were collected in listBox1 via the file dialog
                for (int i = 0; i < listBox1.Items.Count; i++)
                    mail.Attachments.Add(new Attachment(listBox1.Items[i].ToString()));

                string page = "<html><body><table border=2><tr width=100%><td></body></html>";
                AlternateView aview1 = AlternateView.CreateAlternateViewFromString(
                    page + richTextBox1.Text, null, MediaTypeNames.Text.RichText);
                mail.IsBodyHtml = true;
                mail.DeliveryNotificationOptions = DeliveryNotificationOptions.OnSuccess;
                mail.ReplyTo = new MailAddress(recipients[0].Trim());

                SmtpServer.Send(mail);
                MessageBox.Show("Mail has been sent to: " + textBox1.Text);
            }
            catch (Exception x)
            {
                MessageBox.Show(x.Message);
            }
        }

        // "Attach" button: pick a file and add it to the attachment list.
        private void button1_Click(object sender, EventArgs e)
        {
            OpenFileDialog dialogue1 = new OpenFileDialog();
            if (dialogue1.ShowDialog() == DialogResult.OK)
            {
                path = dialogue1.FileName;
                listBox1.Items.Add(path);
            }
        }

        private void button4_Click(object sender, EventArgs e) { }

        private void textBox3_MouseEnter(object sender, EventArgs e) { }

        // Hover effects: highlight each button while the mouse is over it.
        private void button1_MouseEnter(object sender, EventArgs e) { button1.BackColor = Color.Aqua; }
        private void button1_MouseLeave(object sender, EventArgs e) { button1.BackColor = Control.DefaultBackColor; }
        private void button2_MouseEnter(object sender, EventArgs e) { button2.BackColor = Color.Aqua; }
        private void button2_MouseLeave(object sender, EventArgs e) { button2.BackColor = Control.DefaultBackColor; }
        private void button3_MouseEnter(object sender, EventArgs e) { button3.BackColor = Color.Aqua; }
        private void button3_MouseLeave(object sender, EventArgs e) { button3.BackColor = Control.DefaultBackColor; }
        private void button4_MouseEnter(object sender, EventArgs e) { button4.BackColor = Color.Aqua; }
        private void button4_MouseLeave(object sender, EventArgs e) { button4.BackColor = Control.DefaultBackColor; }
        private void button5_MouseEnter(object sender, EventArgs e) { button5.BackColor = Color.Aqua; }
        private void button5_MouseLeave(object sender, EventArgs e) { button5.BackColor = Control.DefaultBackColor; }

        private void button1_MouseClick(object sender, EventArgs e) { button1.BackColor = Color.Gold; }

        private void textBox4_TextChanged(object sender, EventArgs e)
        {
            // mirror the login address into the "from" display box
            textBox2.Text = textBox4.Text;
        }
    }
}
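For comparison, the same send flow takes only a few lines with Python's standard-library email and smtplib modules. This is a sketch: the host, credentials and addresses are placeholders, and the actual network calls are left commented out:

```python
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = ""                   # placeholder sender
msg["To"] = ","   # comma-separated recipients
msg["Subject"] = "Hello"
msg.set_content("Sent from a minimal mail client.")

# The actual send would mirror the C# client (port 587 + STARTTLS):
# with smtplib.SMTP("smtp.example.invalid", 587) as server:
#     server.starttls()
#     server.login("user", "password")
#     server.send_message(msg)

print(msg["Subject"], msg["To"])
```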

Happy coding and development! 🙂

Surviving 15 days with(or without) SMS

Time and again, TRAI has imposed bans and limits on the norms of telecommunications, covering calls, SMS, MMS and even data. Now, backed by the Prime Minister of our country, they have imposed a 15-day regulation under which an individual can send only 5 SMS per day, of 20 KB of data each. It has been imposed to limit the bulk text messaging that led to an upsurge, followed by an exodus of North-East Indians from southern India, especially Karnataka and Maharashtra. You can read the full TOI (Times of India) article here.

Now the big question: how will you survive?

With our cell phones no longer ringing, vibrating, or beeping every few minutes with an incoming text message, these are tough times indeed. So, I shall recommend some smart moves:

1. Normal Symbian S40 users:

Switch to Nimbuzz, eBuddy, Google Talk, or any messaging service built into your phone. You can download:

Nimbuzz.

eBuddy (for Nokia customers) or for others.


2. Symbian S60 smartphone users:

Switch to Skype, Google Talk, or WhatsApp Messenger. You can download:



3. Android users:

Switch to Viber, Skype, or WhatsApp. You can download:




4. iPhone users:

Switch to Skype or WhatsApp. You can download:



5. Windows Phone Users:

Switch to Skype or WhatsApp. You can download:



6. BlackBerry Boys:

Switch to WhatsApp. You can download:


Google Talk.

Windows Live Messenger.

Enjoy these 15 days with the freedom of the internet!

Encryption software: The Art of encrypting.

Welcome again! Returning visitors and coders, today I'm going to show you an interesting topic: how to build an encryption program using C# 4.0 and Windows Forms.

Prerequisites:

1. Microsoft Visual Studio (mine is the Professional Edition).

2. A little knowledge of C# and .NET; otherwise, go through my post ‘Class on Classes and Objects’.

First of all, let us understand the need for encryption software. Imagine you send a message of extreme confidentiality, and someone else intercepts it and reads it through. A breach of security! We can't allow that to happen. That is why virtually all data sent over a network nowadays is encrypted.

Take an example: what I have written so far is normal, human-readable English, conveying some meaning. Now look at this:

Vdmbnld!`f`ho !Sdutsohof!whrhunsr!`oe!bnedsr-!une`x!H&l!fnhof!un!rinv!xnt!`o!houdsdruhof!unqhb!ui`u!hr-!inv!un!qsdq`sd!`o!dobsxquhno!rnguv`sd!trhof!B”!5/1!`oe!Vhoenvr!Gnslr/ Qsdsdpthrhudr!; 0/!Lhbsnrngu!Whrt`m!Rutehn/!)

Lhod!hr!Qsngdrrhno`m!Dehuhno/( 3/!@!mhuumd!jonvmdefd!ho!B”!`oe!/ODU:!dmrd!fn!uisntfi!lx!unqhb!&Bm`rr!no!Bm`rrdr!`oe!Nckdbur&/ Ghsru!ng!`

mm!mdu!tr!toedsru`oe!uid!odde!gns!`o!dobsxquhno!rnguv`sd/!Ktru!uihoj!ng!xnt!rdoehof!`!ldrr`fd!ng!dyusdld!bnoghedouh`mhux-!rnldnod!dmrd!gdubidr!hu!`oe!fndr!uisntfi!hu/!@!csd`bi!ho!uid!rdbtshux !Vd!b`o&u!`mmnv!ui`u!un!i`qqdo/!Rn-!whsut`mmx!dwdsx!e`u`!rdoe!nwds!`!oduvnsj!hr!dobsxqude!onv,`,e`xr/!

This is an absolute fuzzy mess! That's ‘encryption’ according to my software. Now feed this data back into the software and you get the usual form back: plain English.

Screenshot of the software.

This is how it works.
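Under the hood, the trick is that the transform is an involution: every even character code moves up by one and every odd code moves down by one (the same as XOR-ing the code with 1), so applying it a second time undoes it. Here is a minimal sketch of just that transform, outside the Windows Forms UI; the class and method names are my own, not part of the software's code:

```csharp
using System;
using System.Linq;

class CipherDemo
{
    // Same per-character rule as the software: even character codes shift
    // up by one, odd codes shift down by one -- equivalent to (char)(c ^ 1).
    public static string Transform(string s) =>
        new string(s.Select(c => (char)(c % 2 == 0 ? c + 1 : c - 1)).ToArray());

    static void Main()
    {
        string plain = "Welcome again!";
        string cipher = Transform(plain);
        Console.WriteLine(cipher);                     // prints: Vdmbnld!`f`ho
        Console.WriteLine(Transform(cipher) == plain); // True -- applying it twice restores the text
    }
}
```

Running it on the opening words of this post reproduces the start of the scrambled sample above.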

For all who want to download my software, click here. Then click Setup → Install → Run.

For interested coders:

The code:

using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Linq;
using System.Text;
using System.Windows.Forms;

namespace encoder
{
    public partial class Form1 : Form
    {
        string str1, str2;
        char[] input; // holds the characters of the current message

        public Form1()
        {
            InitializeComponent();
        }

        private void textBox1_TextChanged(object sender, EventArgs e)
        {
            str1 = textBox1.Text;
        }

        private void button1_Click(object sender, EventArgs e)
        {
            input = str1.ToCharArray(); /* converts the string to a character array */
            if (input.Length < 100)
                progressBar1.Value = (progressBar1.Maximum / 100) * input.Length;
            else
                progressBar1.Value = progressBar1.Maximum;
            for (int i = 0; i < input.Length; i++)
            {
                if (input[i] % 2 == 0)
                    input[i] += Convert.ToChar(1); // even character code: shift up by one
                else
                    input[i] -= Convert.ToChar(1); // odd character code: shift down by one
            }
            str2 = new string(input);
            textBox2.Text = str2;
        }

        private void textBox2_TextChanged(object sender, EventArgs e)
        {
            textBox3.Visible = true;
            textBox3.Text = "Do you want to decipher the code?";
            button3.Visible = true;
            button4.Visible = true;
        }

        private void button3_Click(object sender, EventArgs e)
        {
            button2.Visible = true;
            textBox1.Text = textBox2.Text;
            textBox2.Text = "";
            button1.Visible = false;
        }

        private void button2_Click(object sender, EventArgs e)
        {
            // The transform is its own inverse, so deciphering reuses the same logic.
            input = str1.ToCharArray();
            if (input.Length < 100)
                progressBar1.Value = (progressBar1.Maximum / 100) * input.Length;
            else
                progressBar1.Value = progressBar1.Maximum;
            for (int i = 0; i < input.Length; i++)
            {
                if (input[i] % 2 == 0)
                    input[i] += Convert.ToChar(1);
                else
                    input[i] -= Convert.ToChar(1);
            }
            str2 = new string(input);
            textBox2.Text = str2;
            textBox3.Visible = false;
            button3.Visible = false;
            button4.Visible = false;
            textBox1.Text = "";
        }
    }
}

That's it. Happy coding!