Configure Point-to-Site connectivity for Windows Azure VMs

For DBAs, networking is generally a gray area. I have been trying my hand at a little networking while learning Azure, and I wanted to share it so it can help you whenever required. Before we get into the point-to-site connectivity configuration, it's important to understand its usage: it's generally used to access Azure resources from your on-premises machine.

Point-to-site connectivity is meant for small operations because the gateway bandwidth is limited to about 80 Mbps. It can be used for small operations like troubleshooting and monitoring. It's something like our office VPN, which we use to connect to the office infrastructure from anywhere. When you want to connect your Azure infrastructure to your local datacenter, e.g. when machines need to be connected to Azure resources all the time and the upload and download volumes are really high, you need site-to-site connectivity instead. For that solution, you need a VPN device that supports the required bandwidth, or you can optionally use the Windows Server 2012 RRAS feature.
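To put that 80 Mbps ceiling in perspective, here is a quick back-of-the-envelope calculation. This is only a sketch: it assumes the full documented gateway limit is available and ignores protocol overhead, so real transfers will be slower.

```python
def transfer_time_seconds(size_gb: float, link_mbps: float = 80.0) -> float:
    """Rough best-case time to move size_gb over a link_mbps link.

    Ignores protocol overhead and latency, so real transfers take longer.
    """
    size_megabits = size_gb * 1024 * 8  # GB -> MB -> megabits
    return size_megabits / link_mbps

# A 50 GB database backup over an 80 Mbps point-to-site gateway:
# 50 * 1024 * 8 / 80 = 5120 seconds, i.e. about 85 minutes best case.
print(round(transfer_time_seconds(50) / 60))  # ~85 minutes
```

That is why point-to-site fits troubleshooting and monitoring, while sustained heavy traffic belongs on a site-to-site link.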

Let's discuss how we can access Azure virtual machine resources from an on-premises machine using point-to-site connectivity:

1. Create a network:


2. Enter the IP address for the DNS server and click on point-to-site connectivity:


3. Click on Add gateway subnet – this subnet provides the IP addresses handed out to the VPN clients used by the on-premises machines:


4. Once the Network is created, click on the network name and the dashboard will look like this:


5. Create the gateway for the VPN connectivity: Click on create gateway:


Once that is done, the color will change 🙂 :


6. Now, it’s time to create the certificates for the VPN connectivity. If you click on the certificates tab, as seen in the picture above, it will ask you for:


Let’s create certificates to upload here:

Create self-signed root certificate – to be uploaded to the site:

makecert -sky exchange -r -n "CN=dbcouncil" -pe -a sha1 -len 2048 -ss My "c:\work\dbcouncil1.cer"



7. Create the client certificate – it will be installed on the client machine that needs VPN connectivity:

makecert.exe -n "CN=dbcouncil" -pe -sky exchange -m 96 -ss My -in "dbcouncil" -is my -a sha1


8. Once the certificate is uploaded, download the VPN client:


9. Once you install the client, you can find it under Connections on your system like this:


10. Once you click on it, you will get a connect option – just connect to it:


11. Once the connection is established, it looks like:


Azure Management Studio – a great tool for Azure operations

I just got to know about this tool – Azure Management Studio – which can be used to manage Azure resources through a GUI.

Here are the steps you need to follow:

  1. Download the tool
  2. Download the .publishsettings file for your Azure subscription
  3. Load it into the tool when it first starts
  4. Once the tool is loaded, it looks like:



It’s really easy to use for general cloud operations.

Azure Vidyapeeth Day – 8th Aug, 2015 – Session PPT

Hello Team

As discussed in one of my previous posts, there is an initiative called Azure Vidyapeeth where we help our community with Azure learning. We call the community to various Microsoft locations and present the latest content on Azure. You can download the Azure Vidyapeeth Windows Phone app for the latest updates or check this page.



Please check my session’s PPT for the event here. Additionally, please check the post I had written on Azure BCDR.


Azure Vidyapeeth Day – 8th August , 2015

We are hosting a full-day Azure education event for our community on 8th August! At this event, we are going to discuss in detail the most common question about Azure, i.e. security. In our general day-to-day discussions with infra and DB professionals, the biggest question we get is how to ensure data safety when we host data off-premises.

Moreover, we are going to have a Service Engineer from the Microsoft IT team, helping us understand how MSIT is leveraging Azure and what kind of challenges they faced while migrating to Azure. During this event, we are going to touch upon many of the things Microsoft Azure can do for us, both for infra and DB professionals.

Just click on the hyperlinks to see the agenda of the event –


Please take time to join this event – the registration link is as follows –

Registration link –

Entry will be based on the registration only.


For a SQL server DBA – How to start Azure Learning – Part 5 (SQL Azure Database Monitoring)

In continuation of this series, I will write about how we can monitor SQL Azure DB. Before I get into the monitoring of SQL Azure DB, I just want to discuss the major work a DBA does on a day-to-day basis:

1. Database Management e.g. creating/adding/removing files
2. Backup/Restore
3. Database Security
4. Performance Tuning and Optimization
5. Performance Monitoring
6. Database maintenance e.g. Rebuild Index/Update statistics etc.

Let’s see how many of these activities are reduced if the database is moved to SQL Azure DB.

1. Database Management – we no longer need to add/remove any files; once the database is moved to Azure, this option is not available. The database size depends on the performance tier selected.
2. Backup/Restore – it is managed by Microsoft itself, so this task is reduced. You just need to use the GUI for the restore and get your work done.
3. HADR – you just need to configure the HADR options as discussed in my post, and the rest is managed by Microsoft.

For the rest of the activities, the method changes slightly, though logically things remain the same. It’s still SQL Server underneath.

4. Database maintenance – activities like rebuilding indexes/updating statistics still need to be managed and scheduled regularly. Just remember, there are some restrictions on T-log consumption. I will come up with a separate post on the limitations we hit in Azure DB due to some resource constraints.
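As a sketch of what that scheduled maintenance decision usually looks like, the snippet below generates the maintenance statement per index. The 5%/30% thresholds follow the common fragmentation rule of thumb and are an assumption for illustration, not an Azure-specific recommendation; the index and table names are hypothetical.

```python
from typing import Optional

def maintenance_statement(index_name: str, table_name: str,
                          fragmentation_pct: float) -> Optional[str]:
    """Pick REBUILD/REORGANIZE based on fragmentation, per the common
    5%/30% rule of thumb; below 5%, no action is worth the T-log cost."""
    if fragmentation_pct > 30:
        return f"ALTER INDEX {index_name} ON {table_name} REBUILD;"
    if fragmentation_pct > 5:
        return f"ALTER INDEX {index_name} ON {table_name} REORGANIZE;"
    return None

print(maintenance_statement("IX_Orders_Date", "dbo.Orders", 42.0))
# ALTER INDEX IX_Orders_Date ON dbo.Orders REBUILD;
```

In Azure DB, the REORGANIZE branch matters more than on-premises because it is interruptible and generates less log at a time.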

5. Database Security – it will still need to be managed, though the way it is managed differs from the on-premises environment.

6. Performance Tuning and Optimization – we will no longer be able to manage SQL configuration using sp_configure, or the hardware, or database files, etc. Generally these areas used to be looked at before getting into query tuning. DTUs become the prime focus when it comes to IO/CPU throughput. Query tuning remains the same: we still need to look for index/table scans, implicit conversions, joins, etc. in the execution plans.

7. Performance Monitoring – finally, we are at today’s topic of discussion. If we talk about what we monitor today for on-premises SQL Server:

1. Memory
2. CPU
3. IO
4. Network
5. SQL Access Methods
6. SQL Memory Clerks
7. SQL Blocking/Deadlocks
8. SQL Transactions
9. DB Size/Growth
10. SQL Configuration changes
11. SQL Connections
12. SQL Wait Statistics
13. Connection failures/Attentions

We generally use various tools like Perfmon, Management Data Warehouse, DMVs and Activity Monitor to watch these resources. For Azure DBs, we will have a monitoring dashboard and DMVs to serve the purpose. The monitoring dashboard is detailed enough to cover most of the resources, and for delving deeper we still have the luxury of the DMVs as required. DMVs related to features we don’t have, or that are not required, won’t be there.
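On the DMV side, `sys.dm_db_resource_stats` in SQL Azure DB returns one row per 15-second interval with CPU, data IO and log write percentages. A sketch of summarizing rows you have already fetched (the sample rows below are fabricated for illustration; the column names follow that DMV):

```python
def summarize(rows):
    """Max and average of each resource percentage across the sampled
    intervals; rows are dicts keyed by DMV column name."""
    metrics = ["avg_cpu_percent", "avg_data_io_percent", "avg_log_write_percent"]
    return {m: {"max": max(r[m] for r in rows),
                "avg": sum(r[m] for r in rows) / len(rows)}
            for m in metrics}

# Two hypothetical 15-second samples:
rows = [
    {"avg_cpu_percent": 20.0, "avg_data_io_percent": 55.0, "avg_log_write_percent": 10.0},
    {"avg_cpu_percent": 80.0, "avg_data_io_percent": 35.0, "avg_log_write_percent": 30.0},
]
print(summarize(rows)["avg_cpu_percent"])  # {'max': 80.0, 'avg': 50.0}
```

The dashboard gives you the same picture graphically; the DMV is what you reach for when you need the raw intervals.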

Just log in to the portal and you will see your database as follows:


Just click on the database name and you will get an option to select the dashboard; here you can see the database size:


If you want to use PowerShell to see the DB size, check this post.
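Whichever tool you use, the underlying arithmetic is the same: SQL Server sizes are page counts, and each page is 8 KB. A small sketch of that conversion (the DMV/PowerShell plumbing is left out):

```python
def pages_to_mb(page_count: int) -> float:
    """SQL Server pages are 8 KB each, so MB = pages * 8 / 1024."""
    return page_count * 8 / 1024

# e.g. a DMV such as sys.dm_db_partition_stats reporting 128,000 reserved pages:
print(pages_to_mb(128000))  # 1000.0 MB
```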

Just above the database size there is a monitoring dashboard, but for the detailed options, just click on the Monitor option:


It’s a test database, so I don’t have much data there. If you want to add more counters, simply select “Add Metrics” and you will see:


Most of the counters are self-explanatory, so I will come straight to the point. What changes, and what you need to look at, is:

DTU percentage
Log IO Percentage
Data IO percentage

These counters are really important to understand. I personally think DTU is a little tricky to articulate for a DB professional working on on-premises SQL Server. Instead of thinking about CPU/memory/storage, we just need to focus on transactions, concurrent connections, concurrent logins, etc. As discussed in the previous posts, Microsoft guarantees the below-mentioned metrics for each performance tier, and you just need to move your thinking away from how much CPU/disk you need. Instead, focus on how many transactions/connections the application will have.


Once the application is deployed on a specific tier, we need to monitor what percentage is being used. If it’s 80–90% utilized, we will either need to tune the queries using the same methods we used for on-premises SQL Server, or we will need to scale up to the next level. This can be scheduled for automatic scale up and scale down as well. For more information on resource limits check –
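That tune-or-scale decision can be sketched as a simple threshold check. The 80% trigger, the sample values and the tier ladder below are assumptions for illustration only, not Azure's official numbers:

```python
TIERS = ["Basic", "S0", "S1", "S2", "P1", "P2", "P3"]  # illustrative ladder

def next_action(current_tier: str, recent_dtu_pcts: list,
                threshold: float = 80.0) -> str:
    """If sustained DTU usage crosses the threshold, first try query
    tuning; if already tuned, scale to the next tier."""
    avg = sum(recent_dtu_pcts) / len(recent_dtu_pcts)
    if avg < threshold:
        return "stay"
    i = TIERS.index(current_tier)
    if i + 1 < len(TIERS):
        return f"tune queries or scale up to {TIERS[i + 1]}"
    return "tune queries"  # already on the top tier

print(next_action("S1", [85.0, 92.0, 88.0]))  # tune queries or scale up to S2
```

In practice you would feed this from the dashboard metrics or `sys.dm_db_resource_stats` rather than hard-coded samples.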


For information on DTU – please check this video:


References –

Benchmark Overview –

Resource Limits  –

I will put the PaaS series on hold for now and move a little into IaaS as well. I will write my next post on storage for IaaS.