Configure Point-to-Site connectivity for Windows Azure VMs

For us DBAs, networking is generally a gray area. I have been trying my hand at a bit of networking while learning Azure, and I wanted to share it so that it can help you whenever required. Before we start getting into point-to-site connectivity configuration, it's important to understand its usage: it's generally used to access Azure resources from your on-premises machine.

Point-to-site connectivity is meant for small operations because the gateway bandwidth is limited to up to 80 Mbps. It can be used for light tasks like troubleshooting and monitoring. It's something like our office VPN, which we use to connect to office infrastructure from anywhere. When you want to connect your Azure infrastructure to your local datacenter, e.g. when the machines need to be connected to Azure resources all the time and the download and upload volumes will be really high, you need to use site-to-site connectivity. For that solution, you need a VPN device that supports the required bandwidth, or you can optionally use the Windows Server 2012 RRAS feature.

Let's discuss how we can access Azure virtual machine resources from an on-premises machine using point-to-site connectivity:

1. Create a network:

image

2. Enter the IP address for the DNS server and click on point-to-site connectivity:

image

3. Click on Add gateway subnet – this subnet is used to provide IP addresses to the VPN clients on the on-premises machines:

image

4. Once the Network is created, click on the network name and the dashboard will look like this:

image

5. Create the gateway for VPN connectivity by clicking on Create Gateway:

image

Once that is done, the color will change 🙂 :

image

6. Now it's time to create the certificates for VPN connectivity. If you click on the certificates tab, as seen in the picture above, it will ask you for the following:

image

Let’s create certificates to upload here:

Create self-signed root certificate – to be uploaded to the site:

makecert -sky exchange -r -n "CN=dbcouncil" -pe -a sha1 -len 2048 -ss My c:\work\dbcouncil1.cer

image

image

7. Create the client certificate; it will be kept on the client machine that needs VPN connectivity:

makecert.exe -n "CN=dbcouncil" -pe -sky exchange -m 96 -ss My -in "dbcouncil" -is my -a sha1

image

8. Once the root certificate is uploaded, download the VPN client:

image

9. Once you install the client, you can find it under connections on your system, like this:

image

10. Once you click on it, you will get a connect option – just connect to it:

image

11. Once the connection is established, it looks like:
image

HTH!


Azure Management Studio – a great tool for Azure operations

I just got to know about this tool, Azure Management Studio, which can be used to manage Azure resources through a GUI.

Here are the steps you need to follow:

  1. Download the tool
  2. Download the PublishSettings file for your Azure subscription
  3. Load it into the tool when it first starts
  4. Once the tool is loaded, it looks like:

image

 

It’s really easy to use for general cloud operations.
HTH!

Azure Vidyapeeth Day – 8th Aug, 2015 – Session PPT

Hello Team

As discussed in one of my previous posts, there is an initiative called Azure Vidyapeeth, where we help our community with Azure learning. We invite the community to various Microsoft locations and present the latest content on Azure. You can download the Azure Vidyapeeth Windows Phone app for the latest updates or check this page.

image

 

Please check my session's PPT for the event here. Additionally, please check the post I wrote on Azure BCDR.

HTH!

Azure Vidyapeeth Day – 8th August, 2015

We are hosting a full-day Azure education event for our community on 8th August! At this event, we are going to discuss in detail the most common question about Azure, i.e. security. In our day-to-day discussions with infra and DB professionals, the biggest question we get is how to ensure data safety when we host data off-premises.

Moreover, we are going to have a Service Engineer from the Microsoft IT team helping us understand how MSIT is leveraging Azure and what kinds of challenges they faced while migrating to Azure. During this event, we are going to touch upon many things Microsoft Azure can do for us, both for infra and DB professionals.

Just click on the hyperlinks below for the details; the agenda of the event is:

image

Please take time to join this event; the registration link is as follows:

Registration link –  http://www.microsoft.com/click/services/Redirect2.ashx?CR_EAC=300309398

Entry will be based on registration only.

HTH!

For a SQL Server DBA – How to start Azure Learning – Part 5 (SQL Azure Database Monitoring)

In continuation of this series, I will write about how we can monitor SQL Azure DB. Before I get into monitoring SQL Azure DB, I just wanted to discuss the major work a DBA does on a day-to-day basis:

1. Database Management e.g. creating/adding/removing files
2. Backup/Restore
3. Database Security
4. HADR
5. Performance Tuning and Optimization
6. Performance Monitoring
7. Database maintenance e.g. Rebuild Index/Update statistics etc.

Let's see how many of these activities are reduced if the database is moved to SQL Azure DB.

1. Database Management – we don't need to add or remove any files; once the database is moved to Azure, we don't have this option. The database size depends on the performance tier selected.
2. Backup/Restore – it will be managed by Microsoft itself, so this task is reduced. You just need to use the GUI for the restore and get your work done.
3. HADR – you just need to configure the HADR options as discussed in my post, and the rest will be managed by Microsoft.

For the rest of the activities, the method changes slightly, though logically things remain the same. After all, it's still SQL Server underneath.

4. Database maintenance – activities like rebuilding indexes and updating statistics will still need to be managed and scheduled regularly. Just remember, there are some restrictions around transaction log consumption; I will come up with a separate post on the limitations we hit in Azure DB due to resource constraints.
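
As an illustration, here is a minimal T-SQL sketch of such a maintenance task; the table dbo.Orders is hypothetical:

-- A minimal sketch: rebuild all indexes on a hypothetical table,
-- then refresh its column statistics as well.
ALTER INDEX ALL ON dbo.Orders REBUILD;
UPDATE STATISTICS dbo.Orders;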

5. Database Security – it will still need to be managed, though there is a change from the way it used to be managed in an on-premises environment.

6. Performance Tuning and Optimization – we will no longer be able to manage the SQL configuration using sp_configure, the hardware, or the database files, which are the areas generally looked at before getting into query tuning. DTUs become the prime focus when it comes to IO/CPU throughput. Query tuning remains the same: we still need to look for index/table scans, implicit conversions, joins, etc. in the execution plans, for example using the sketch below.
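
A minimal sketch using sys.dm_exec_query_stats, which works the same way against SQL Azure DB, to list the top statements by average CPU time as a starting point for tuning:

-- A minimal sketch: top 10 statements by average CPU time.
SELECT TOP 10
    qs.total_worker_time / qs.execution_count AS avg_cpu_time_us,
    qs.execution_count,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
            WHEN -1 THEN DATALENGTH(st.text)
            ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY avg_cpu_time_us DESC;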

7. Performance Monitoring – finally, we are at today's topic of discussion. If we talk about what we monitor today for on-premises SQL Server:

1. Memory
2. CPU
3. IO
4. Network
5. SQL Access Methods
6. SQL Memory Clerks
7. SQL Blocking/Deadlocks
8. SQL Transactions
9. DB Size/Growth
10. SQL Configuration changes
11. SQL Connections
12. SQL Wait Statistics
13. Connection failures/Attentions

We generally use various tools like Perfmon, Management Data Warehouse, DMVs, and Activity Monitor to watch these resources. But for Azure DBs, we have a monitoring dashboard and DMVs to serve the purpose. The monitoring dashboard is really detailed and covers most of these resources, and for delving deeper we still have the luxury of DMVs as required. The DMVs related to features we don't have, or that are not required, won't be there.
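
For example, the blocking check many of us already run on-premises (item 7 in the list above) carries over as-is; a minimal sketch:

-- A minimal sketch: list requests that are currently blocked,
-- along with the blocking session and the wait details.
SELECT session_id,
       blocking_session_id,
       wait_type,
       wait_time AS wait_time_ms,
       command
FROM sys.dm_exec_requests
WHERE blocking_session_id <> 0;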

Just log in to https://manage.windowsazure.com and you will see your database as follows:

image

Just click on the database name and you will get an option to select the dashboard, where you can see the database size:

image

If you want to use PowerShell to see the DB size, check this post.
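
If you prefer T-SQL, here is a minimal sketch (run inside the user database) that approximates the size from reserved pages:

-- A minimal sketch: approximate database size in MB
-- (reserved pages * 8 KB per page, converted to MB).
SELECT SUM(reserved_page_count) * 8.0 / 1024 AS database_size_mb
FROM sys.dm_db_partition_stats;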

Just above the database size, there is a monitoring dashboard, but for the detailed view, just click on the Monitor option:

image

It's a testing database, so I don't have much data there. If you want to add more counters, simply select "Add Metrics" and you will see:

image

Most of the counters are self-explanatory, so I will come straight to the point. The counters that will be new to look at are:

DTU percentage
Log IO percentage
Storage
Data IO percentage

These four counters are really important to understand. I personally think DTU is a little tricky to articulate for a DB professional used to working on on-premises SQL Server. Instead of thinking in terms of CPU/memory/storage, we just need to focus on transactions, concurrent connections, concurrent logins, etc. As discussed in previous posts, Microsoft guarantees the metrics mentioned below for each performance tier, so you just need to move your thinking away from how much CPU/disk you need and instead focus on how many transactions and connections the application will have.

image
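
On the DMV side, a minimal sketch using sys.dm_db_resource_stats (assuming a database on the Basic/Standard/Premium tiers, where this DMV is available) shows the same consumption as percentages of the tier's limits:

-- A minimal sketch: CPU, data IO, log write and memory consumption
-- as a percentage of the tier's limits, sampled roughly every
-- 15 seconds for about the last hour.
SELECT end_time,
       avg_cpu_percent,
       avg_data_io_percent,
       avg_log_write_percent,
       avg_memory_usage_percent
FROM sys.dm_db_resource_stats
ORDER BY end_time DESC;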

Once the application is deployed on a specific tier, we need to monitor how much of that tier's capacity is being used. If it's 80-90% utilized, we either need to tune the queries using the same methods we used for on-premises SQL Server, or we need to scale up to the next level. Scale-up and scale-down can be scheduled to happen automatically as well. For more information on resource limits, check https://azure.microsoft.com/en-us/documentation/articles/sql-database-resource-limits/
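
The scale-up itself can also be done with plain T-SQL; a minimal sketch, assuming a hypothetical database named mydb being moved to Standard S2 (run while connected to the master database of the logical server):

-- A minimal sketch: move a hypothetical database to the Standard S2
-- service objective; the database stays online while the change applies.
ALTER DATABASE mydb MODIFY (EDITION = 'Standard', SERVICE_OBJECTIVE = 'S2');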

 

For information on DTU – please check this video:

https://channel9.msdn.com/Series/Windows-Azure-Storage-SQL-Database-Tutorials/Scott-Klein-Video-02/player

 

References – https://msdn.microsoft.com/en-us/library/azure/dn369873.aspx

Benchmark Overview – https://msdn.microsoft.com/en-us/library/azure/dn741327.aspx

Resource Limits – https://azure.microsoft.com/en-us/documentation/articles/sql-database-resource-limits/

I will put the PaaS series on hold for now and move a little bit into IaaS as well. My next post will be on storage for IaaS.

HTH!

Journey to SQL Server Community Delhi NCR

It's always a pleasure to interact with the community. I remember that from the first day of my job I was interested in public speaking and wanted to talk to an audience. Ever since, I have been trying to find opportunities to present on (mostly technical) subjects in various forums. I started my career as an Oracle DBA, and I used to present best practices for writing code to the application team. It's something that has always been a passion for me, even without any prompting from my mentors.

When I started my career with Microsoft 7 years back, I got lots of opportunities to present and lead initiatives internally. There was a lot of learning in terms of how to present and be an effective speaker. It motivated me to take some courses on presentation skills from my language coaches Nathan, Patricia, and Jaicy. Learning never stops, but they really helped me polish this skill, and I am really thankful to them.

In time, I got a chance to record a video on Best Practices Analyzer for SQL 2008 R2, and I was so happy to see it posted on the Microsoft website. After that, I got an opportunity to manage the GTSC website http://blogs.msdn.com/sqlserverfaq with Balmukund Lakhani, handling management and publishing. It was a great opportunity to learn about blogging, SEO, and the power of social media to increase the reach of a blog. We created Facebook and LinkedIn groups, SQLserverfaq, and a Twitter handle, @sqlserverfaqs, through which we used to share all new posts on social media.

Initially, there were only text blogs on the website, and after I recorded the video for BPA, I realized video blogging could be something new for it. We then started a video blogging initiative where we posted videos of new SQL features along with SQL troubleshooting. It was very well received by the SQL audience.

I even started my own blog site, https://dbcouncil.net, where I keep sharing my experiences with SQL Server with the community. It feels great when people read it and learn something new. At times, it even helps me refer back to a few topics when needed.

Then I transitioned to the PFE role, and I got an opportunity to be a speaker at TechEd 2014, hosted in Bangalore. With the help of my friends and mentors Arvind Shyamsundar, Amit Banerjee, and Sourabh Agarwal, I got to present a session on Columnstore Indexes. It was a big hit at TechEd in terms of participation and response from the audience.

After I established myself in the role, I focused my energy on setting up a community for Delhi NCR with Raju Kumar and Gurwinderjit Singh. 20th December, 2014 was the day we started this community. It has always been a pleasure, and it gives a lot of satisfaction to interact with various DBAs and consultants. We have been putting in a lot of effort to make it even more fruitful and worth our audience's time. We always pick common technical challenges from the field and present on those subjects so that they can help people move to the next level. Not only this, we have also started talking about the future of the market: SQL Azure, SQL on Azure VMs, and Big Data, which can help us be more innovative on the job.

I hope this passion continues and I keep contributing more to the SQL community!!