Posted by: Vikas Sahni | January 30, 2017

2017 calendar using photographs taken by me

During 2016, I developed a passion for photography.  A friend suggested sharing one or two of the photographs; I decided to be ambitious and do a calendar.  So here it is. Feel free to share and/or download and print it for your own use, as long as you don’t modify it.

Click here to download the PDF file: calendar2017


For those who are interested in following up on Scott’s talk – this is the link to the sample code and an eBook based on the talk:

The talk may become an academic book sooner rather than later, generalising the 13 patterns to a meta-model.  Hopefully Scott will announce a co-author to complete the work and maintain it as a living summary of how to design and develop IT applications and systems.

The facilities staff asked who this star was…the last time the hall was this full was when the Irish football player Paul McGrath visited!

Scott spoke for nearly three hours without a break.  He had promised that he would attempt to melt our brains by the end of the evening, and boy…was he successful!

Here are some of the photos that I took that evening:






Posted by: Vikas Sahni | July 19, 2013

My first AWS Boot Camp

I got an opportunity to learn about the Amazon offerings in the Cloud Computing arena from an Amazon architect recently. As many of you know, I love to share what I have learned. Therefore, having delivered a number of Windows Azure boot camps in Dublin, Cork and Galway during the past year, I decided to do one on AWS last month.

Over 40 people turned up on Saturday 22nd June at the National College of Ireland in Dublin to learn about AWS and do some hands-on work.
I divided the day into three sessions: an overview lecture to start, then a whistle-stop demo, and finally a hands-on lab.
For the overview session, I used the Overview document available at

The breadth of the AWS offerings is so vast that most of this 90-minute session was spent just describing the graphic on page 2 of the document, reproduced below for your convenience:

AWS Overview

After a well-deserved coffee break, I logged into the AWS Management Console at and gave a quick demo of EC2, S3, RDS and CloudWatch.  Oh yes…before starting the demo, I asked the attendees to just watch, and not to try the steps at the same time – that was for the afternoon hands-on session.  Some of the participants were more management and less hands-on, so this format of the boot camp allowed them to get the knowledge they were looking for and leave during the lunch break.

After lunch, I was pleasantly surprised by the attendance – only a couple of people excused themselves; the rest were all there!  In the morning, I had asked people what they were looking to get out of the day.  Based on their responses, I had identified three distinct areas of interest – spinning up virtual machines in EC2, creating static web sites in Amazon S3, and using AWS Elastic Beanstalk for development work.  The participants therefore split into three groups and started off using AWS.

Within a couple of minutes, the developers were the first ones to call me over – they were evenly split between PHP and .Net!  The solution was simple: I just asked the group to split into two!!!

The rest of the afternoon went smoothly.  By 4 pm, a number of instances using different AMIs (Amazon Machine Images) were up and running, two static web sites had been deployed, and one PHP application and one .Net application were live.

At this point, I asked people to share their learning experience with the other teams.  Before I could put some structure to this, the participants decided that the best way was for them to take turns going over to other tables.

By the end of the day, everybody had done some hands-on work on AWS and had learnt from each other how easy it was to use Amazon in other areas.

A Saturday well spent getting over forty people started with hands-on AWS!  There may be more events in the near future, so follow me (@sahnivi) on Twitter if you are interested.

Posted by: Vikas Sahni | June 16, 2013

A comparison of Windows Azure SQL Database and Amazon RDS

Rohit Sharma, a Masters student at NCI Dublin, worked with me over the past few months to compare Windows Azure SQL Database and Amazon RDS. The reason for choosing these two cloud offerings was simply commercial – they are the two mainstream commercial services that offer an SLA.

We measured the basic CRUD operations on Windows Azure SQL Database and Amazon RDS, going up to 10 million rows. This was done by using the To-Do list sample application included in the Windows Azure Training Kit. For Amazon RDS, a micro instance was used to do the measurements, so you can get better performance by using a bigger instance if need be.
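The measurement approach can be sketched in a few lines. This is only an illustrative stand-in: it uses SQLite instead of Windows Azure SQL Database or Amazon RDS, and the `todo` table and row counts here are assumptions, not the actual To-Do list schema from the Training Kit. Pointing the connection at a real cloud database (and raising the row counts towards 10 million) would approximate the actual experiment.

```python
import sqlite3
import time

def time_crud(conn, n_rows):
    """Time basic CRUD operations over n_rows rows; returns seconds per operation."""
    cur = conn.cursor()
    timings = {}

    t0 = time.perf_counter()
    cur.executemany("INSERT INTO todo (title) VALUES (?)",
                    ((f"task {i}",) for i in range(n_rows)))
    conn.commit()
    timings["create"] = time.perf_counter() - t0

    t0 = time.perf_counter()
    cur.execute("SELECT COUNT(*) FROM todo")
    cur.fetchone()
    timings["read"] = time.perf_counter() - t0

    t0 = time.perf_counter()
    cur.execute("UPDATE todo SET title = title || '!'")  # touch every row
    conn.commit()
    timings["update"] = time.perf_counter() - t0

    t0 = time.perf_counter()
    cur.execute("DELETE FROM todo")
    conn.commit()
    timings["delete"] = time.perf_counter() - t0
    return timings

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE todo (id INTEGER PRIMARY KEY, title TEXT)")

# Scale up the row count, as the study did (up to 10 million rows there).
for n in (1_000, 10_000):
    t = time_crud(conn, n)
    print(n, {k: round(v, 4) for k, v in t.items()})
```

Plotting these timings against the row count is what exposes the scaling behaviour discussed below.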

In our measurements, Windows Azure SQL Database clearly demonstrated lower overall performance than Amazon RDS. Being a pure DBaaS (Database as a Service), Windows Azure SQL Database has a number of disadvantages that contribute to this, summarised below:
• Data privacy and data security – because of the multi-tenant architecture, data must be protected from other users.
• SQL queries over encrypted data – to use a DBaaS with full encryption, the strategy should be to process as much of the query as possible in the service provider’s data centre without having to decrypt the data.
• Limited transaction log size
• Cannot handle massive transactional workloads
• Performance degradation

DBaaS has not yet matured, but it can replace Infrastructure as a Service in scenarios where performance is not critical and cost matters more. In such cases, a pure DBaaS like Windows Azure SQL Database is the better approach because of its lower administrative overhead.

The Amazon RDS service, being a hybrid IaaS, has several advantages in comparison to Windows Azure SQL Database, which is a pure DBaaS. These advantages, resulting in higher performance, are listed below:
• Flexibility and scalability
• High throughput and simplified query processing
• Large transaction log size
• Complete control over the physical infrastructure

If a user wishes to build and host business applications with large database requirements, then Amazon RDS is the better option. It was twice as fast even with just a micro instance. As the number of rows increased, Amazon RDS scaled more or less linearly, while Windows Azure SQL Database slowed down.

On the other hand, a user who wants to get up and running quickly and is not too concerned about optimising performance would be better off using Windows Azure SQL Database, as it has very little administrative overhead.

NOTE: This work was carried out in January to March 2013, and compares only basic operations on a single table.

Posted by: Vikas Sahni | June 9, 2013

Cloud Computing 101

I have been asked a number of times to talk about what Cloud Computing really means. The people who attended a session yesterday requested that I post the deck…so here it is…

Posted by: Vikas Sahni | April 16, 2013

How to use an Academic Pass to open a Windows Azure Account

A Step by Step Guide

The INITIAL STEP can be done ONLY by faculty members.  You have to apply for an Educator Grant of Windows Azure Academic Passes for faculty and students.  Go to and enter the details, then click Submit.  The Azure University team will contact you about the next steps.  On approval, you will receive a set of redemption codes that you can share with colleagues and students.

STEP ONE: When you get a code from your lecturer, visit and click Have a code already? Redeem it here.


STEP TWO: On the next screen, select Country from the dropdown list and enter your promo code. Click Submit.


STEP THREE: Sign In using your Windows Live ID.  If you don’t have one, you can create it by clicking on the Sign Up link on the next screen.


STEP FOUR: Enter your details on the following screen:


STEP FIVE:  Now sign the Trial Agreement with Microsoft.  Type in your First Name and Last Name.  Tick the checkbox labelled By checking the box, you are accepting the Microsoft Windows Azure Trial Agreement. Click Accept.


The message You have successfully requested your Windows Azure Pass appears!


Posted by: Vikas Sahni | October 4, 2011

Business Intelligence and the Cloud

The term Business Intelligence (BI) was defined in a 1958 article by IBM researcher Hans Peter Luhn as ‘the ability to apprehend the interrelationships of presented facts in such a way as to guide action towards a desired goal.’  BI evolved from the Decision Support Systems (DSS) that began in the 1960s.  DSS originated in the computer-aided models created to assist with decision making and planning.  From DSS, data warehouses, Executive Information Systems, OLAP and business intelligence came into focus beginning in the late 1980s.

BI applications usually use the company repository of data, supplemented by public data, and transform it into information that gives insight into business trends that would otherwise not be visible.  BI applications enable the company to take well informed strategic decisions.  A good suite of BI applications, backed by high quality raw data, can give a company a significant edge over its competitors.

The Wikipedia entry on the topic states that Business Intelligence can be applied to five distinct purposes (MARCKM), in order to drive business value:

  1. Measurement – program that creates a hierarchy of Performance metrics (see also Metrics Reference Model) and Benchmarking that informs business leaders about progress towards business goals (AKA Business process management).
  2. Analytics – program that builds quantitative processes for a business to arrive at optimal decisions and to perform Business Knowledge Discovery. Frequently involves: data mining, process mining, statistical analysis, Predictive analytics, Predictive modeling, Business process modeling
  3. Reporting/Enterprise Reporting – program that builds infrastructure for Strategic Reporting to serve the Strategic management of a business, NOT Operational Reporting. Frequently involves: Data visualization, Executive information system, OLAP
  4. Collaboration/Collaboration platform – program that gets different areas (both inside and outside the business) to work together through Data sharing and Electronic Data Interchange.
  5. Knowledge Management – program to make the company data driven through strategies and practices to identify, create, represent, distribute, and enable adoption of insights and experiences that are true business knowledge. Knowledge Management leads to Learning Management and Regulatory compliance/Compliance.

The above five areas can be broadly grouped into two categories – the first three are Quantitative, and the other two are Qualitative.  Good knowledge management and collaboration practices are essential to ensure that the data used for BI is of high quality (we all know ‘Garbage In Garbage Out’).  The data then needs to be analysed to produce quantitative information, usually presented pictorially so that it is easy to understand.

The quantity of data that is available within an organisation is increasing exponentially.  External data that needs to be taken into account is exploding even faster.  Storing and processing this data to produce meaningful insights is becoming a challenge for the practitioners of BI because of increasing costs and the current economic climate.

Cloud Computing presents a simple answer to these challenges.  A company, instead of purchasing its own hardware and building a data centre whose average utilisation is in the single digits or low teens at best, can simply use the Cloud to rent processing power that is needed only for a day or two to run those monthly reports!  Confidential data can be kept on-premises, and even proprietary algorithms can be executed on the company’s own infrastructure…build a hybrid solution that just offloads the non-confidential bits to a public cloud.  This can lead to significant savings, and empower a company to use algorithms and process data that would simply not be affordable with traditional on-premises solutions.

There are extreme examples from Scientific Computing and Data Visualization where the Cloud has already delivered cost reductions from millions to thousands of dollars.  BI in the Cloud certainly has the potential to reduce costs by an order of magnitude and simultaneously make it possible to run complex analytical algorithms.

Posted by: Vikas Sahni | June 6, 2011

RIC – A Practical Virtual Research Environment

A Virtual Research Environment (VRE) is a collection of tools and services to complement the activities of researchers and address their “pain points” in day to day activities.  A VRE should enhance the research process by facilitating collaboration among researchers and providing them a more effective means to work together.  A VRE should be available as a service with minimal barriers to entry.  Most of the technology comprising a VRE is evolutionary; it is the convergence of these technologies that is unique.

There are an increasing number of Virtual Research Environment projects and programs underway around the world, as outlined in the recent report emanating from a JISC funded project[i].  As the report found, there has been “a great deal of activity over the past few years in terms of prototype and demonstration systems moving into the mainstream of research practice.”  However, to date no VRE is ready for practical deployment on a large scale.

Research Information Centre Framework

Microsoft Research, being fully aware of the potential of VREs, has been developing a VRE framework called the RIC (Research Information Centre) based on Microsoft technologies in collaboration with the British Library since 2007.  The RIC is a virtual research environment framework, offering an integrated suite of tools for collaboration, and for finding, creating, managing, sharing and disseminating all types of information associated with a research project.  It provides a core set of functionalities supporting a research lifecycle organised around the four phases of idea discovery, funding, researching and dissemination.  These functions can be used to build domain specific VREs into which additional modules can be added.

The RIC Framework is a collection of server-based tools that run on top of Microsoft® Office SharePoint® Server.  While a SharePoint license is necessary to run it, the actual source code for V1.0 has been released under an open source license and is available on CodePlex.  The environment can be accessed through a web browser.

The RIC aims to reduce the time researchers spend on administrative tasks, to support collaborative research, provide easy access to relevant information, to facilitate networking and to help preserve not only project outcomes, but the whole process of research.  A particular focus is to reduce inefficiencies in knowledge management over the research lifecycle by, for example, providing domain specific access to information, ensuring researchers are aware of relevant resources.  Templates for projects can be created and specific project sites set up based on those templates.  The RIC offers a range of features such as access control, workflows, sharing and annotation of resources, RSS feed integration, federated search over domain specific literature sources, full-text search over local resources, blogging, wikis, networking, creation of project groups and archiving of project sites.

The Research Information Centre Pilot at Trinity College Dublin

In 2009, Microsoft Research and Trinity College Dublin agreed to collaborate on a pilot project to develop and implement a working version of RIC suitable for humanities researchers to act as a ‘proof of concept’.   The purpose of the pilot project was to build a VRE suitable for humanities researchers at Trinity College Dublin, and in parallel to make recommendations on the future development of a Humanities VRE in Trinity College Dublin.  This was a collaborative venture between Trinity College Dublin, Softedge (Irish SME) and Microsoft Research involving the extension of proven technologies to bring together existing collaboration, communication, search and social networking technologies into a single coherent framework.   The VRE was created by enhancing the core Microsoft Research Information Centre (RIC) 1.0 framework for use by humanities researchers[ii].

The focus of the Trinity College Dublin pilot project was on the idea discovery and research phases of the research lifecycle within the humanities domain.

Exciting opportunities exist to expand the VRE to cover all areas of the research lifecycle, for example, research planning and administration, scholarly communication, data management, and data preservation, and to integrate the VRE with other college systems already supporting these areas, for example the Research Support System and the Institutional Repository.

The RIC Development Community

A RIC development community has been established, one of whose aims is to gain a better understanding of how VREs can support future research.  A forum for sharing information has been set up on LinkedIn, and user community workshops regularly take place at the British Library or at Microsoft Research, Cambridge.  The community participants include Oxford, Southampton, King’s College, Trinity College Dublin, Leiden (Netherlands) and La Trobe (Australia), among others.

Current Status

The RIC 2.0 Beta Toolkit is undergoing final testing, and will be available for download by the end of June 2011.  RIC 2.0 is built on top of SharePoint® 2010, a proven collaboration platform from Microsoft.  RIC 2.0 is free; however, a licensed SharePoint 2010 installation is required to deploy it.  The source code will also be available under an open source license for free download from CodePlex.

The core tools of Version 2.0 of the RIC Framework have been developed by Softedge in collaboration with the British Library and Microsoft Research.  Invest NI has provided financial support for this effort.  A number of other institutions are actively developing additional tools for the RIC V2.0.

La Trobe University, Australia is working with Softedge to deploy RIC 2.0 in time for the next academic year.

Considering the significant effort that has been put into it by the community, the RIC is now ready as a practical VRE that can be deployed easily by any research institution.

[i] Carusi, A., Reimer, T. “Virtual Research Environment Collaborative Landscape Study: A JISC funded project.” January 2010.

[ii] Research Information Centre Framework –

Posted by: Vikas Sahni | February 21, 2011

The Future is Smartphones

The Experiment

Fifty days into the New Year, and what started as an experiment on 1st January 2011 is now my way of life.  The laptop no longer goes on the road with me every day! 

If a heavy user like me can survive – no, actually do more – outside the office with a Smartphone instead of a laptop, I am willing to lay a bet that the laptop is on the way out!

The Experience

Before you dismiss this post, shaking your head and thinking that I am mad, look at the table below, which summarises my experience:

| Activity / Task | Laptop | Smartphone | Comment |
| --- | --- | --- | --- |
| Reading emails | Yes | Yes | Practically no difference |
| Replying to emails | Yes | Short – Yes / Long – No | If I need to send a detailed reply, a short note saying that I am away and will get back by a certain time is usually good enough |
| Composing emails | Yes | Short – Yes / Long – No | If I have to compose a long mail, I anyway wait till I am back at my desk and can work without distraction |
| Casual browsing | Yes | Yes | More and more sites detect that you are using a mobile browser and serve up a version suitable for the phone |
| High-bandwidth browsing | Yes | No | Researching on the net, downloading software etc. is anyway something that I do at my desk, not when I am at a seminar or conference, or commuting or waiting for a meeting to begin |
| Note taking | Yes | Yes | The Smartphone wins this one hands down.  As a developer / architect, a white board gets used heavily in many of my meetings and I just take photos! |
| Presentations | Yes | No | Simple workaround: I just mail them in advance to the meeting/workshop/seminar organiser and carry a backup on USB |
| Demos | Yes | No | The only person who should be legally allowed to give live demos is Scott Guthrie.  Ordinary people like me should use videos and screen grabs |
| Taking photos | No | Yes | |
| Routes / Directions | No | Yes | |
| Carrying it around | Pain | Easy | Carrying a laptop around is painful.  It is heavy, has to be lugged everywhere, and the risk of loss or theft is omnipresent.  The Smartphone is just a bit bigger than my last phone and much smaller than my first mobile!  The risks are still there, but the chances are lower as the device fits into my shirt pocket |


Try it for a week and see for yourself…

The Next Generation Smartphone

Here is what I think should be added:

  1. A USB port, so that people can plug in a USB hub to connect devices.  Keyboard, mouse, hard disk, DVD, printer… almost all devices are now available with a USB interface
  2. An HDMI (video) port, so that people can plug in a monitor or TV

With just these two additions, most people will no longer require a laptop or even a desktop!  Just plug your Smartphone into the USB hub when you are in the office or at home!

Instead of adding the ports, another way could be to just extend the OS to use Wi-Fi or Bluetooth to handle devices…

Of course either of these alternatives will require OS support.  However, none of the things listed above are rocket science, they have all been done.  All that is needed is someone with the right resources to execute this, and laptops could become history within five years!

Apple / Google / Microsoft…are you reading this?  The first one off the block has an opportunity to introduce a revolutionary change…

Power Users

For software developers, designers, engineers and gamers, the above Next Generation Smartphone will not work.  The only reason is that these users need more processing power.

Is there a solution for us?  Yes, and again using proven technologies!  It will, however, require more work than the fix described above for ordinary users.  An operating system that can automatically handle connecting and disconnecting additional CPUs would do the job quite well.  There are plenty of distributed systems in the industrial / military domain that do this all the time.

A second alternative, which may even support current operating systems, is also conceivable…AMD / Intel…are you reading this???  Multi-core CPUs on one chip…how about the same across two chips?  Look into your archives: such chipsets were produced earlier!

Note – What do I mean by ‘Heavy User’

Just to explain ‘heavy user’ in plain English – my laptop is more powerful than the desktop that most power users have.  For the more technically inclined among my readers – I am a software developer / architect who works with the latest technologies.  My laptop has an i7 processor, 12GB RAM, a 3GB graphics card, a 640GB HDD etc.  I do not use a desktop; this is my only machine.

Posted by: Vikas Sahni | February 10, 2011

Porting existing applications to Windows Azure

To take full advantage of the Windows Azure platform, you have to design applications for it from the start.  That way, you can architect your application to use Windows Azure blobs and tables and partition it between different roles.  However, in the present economic climate, it is likely that you may not have the budget to redevelop an application from scratch.

VM Role to the rescue

Do not be disappointed after reading the above – the new VM role can help you to port a large percentage of applications that are currently running on Windows Server. 

To use a VM role to port an app to Azure, follow these simple steps:

  1. Make sure that your app runs on Windows Server 2008 R2
  2. Create a VHD (virtual hard disk) from the machine running Windows Server 2008 R2
  3. Upload this image to a VM Role in Windows Azure
  4. Connect the database – you can leave it as is on premises or move it across to SQL Azure

Multiple Instances

Remember that you need at least two instances of a role for the Microsoft SLA to kick in, and that the Azure Fabric automatically does load balancing between the instances for you.  This is not an issue for most apps running on Windows Server, as they are likely to be running in load-balanced multiple server environments.

However, it may be that your application is not able to handle multiple instances running at the same time.  If this is the case, some rework will be necessary.
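The multi-instance issue is easy to demonstrate.  Here is a minimal sketch (the class and names are illustrative, not Azure APIs): two role instances sit behind a round-robin load balancer, and a counter each instance keeps in its own memory diverges from a counter kept in a shared store such as SQL Azure.

```python
from itertools import cycle

class Instance:
    """A role instance that keeps a visit counter in local memory."""
    def __init__(self, shared_store=None):
        self.local_count = 0
        self.shared_store = shared_store

    def handle_request(self):
        self.local_count += 1                    # in-memory state: per instance
        if self.shared_store is not None:
            self.shared_store["count"] += 1      # shared state (e.g. a database)

shared = {"count": 0}
instances = [Instance(shared), Instance(shared)]
balancer = cycle(instances)                      # round-robin, like the Azure Fabric

for _ in range(10):
    next(balancer).handle_request()

# Each instance saw only half the traffic...
print([i.local_count for i in instances])  # [5, 5]
# ...but the shared store saw all of it.
print(shared["count"])                     # 10
```

An app that keeps session state in memory like this behaves differently depending on which instance answers a request; moving that state into shared storage is the kind of rework meant here.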

Other Migration Alternatives

Applications that are stateless, or maintain state in SQL Server, can be ported to Azure with minor rework.  You should be able to copy the code across to an Azure project and deploy it with minor changes.  The data can be moved to SQL Azure.  This will not give you the full benefit of migrating to Azure, chiefly the native storage structures (Blobs, Tables and Queues).  However, this may not be an issue for many applications.  Also, remember that if the original app was not designed to work with multiple instances, testing and rework will be required.

The next Azure post will be about the scenarios where it makes business sense to move to Azure.  But before that, expect something about Smartphones.

Older Posts »