Tag Archives: Junaith Haja

A New Challenge at Crane Worldwide

Summers are meant for relaxing and exploring the sunny East Coast, but this summer wasn't one of those for me, because I made a life-changing decision: yes, I accepted a new position with Crane Worldwide Logistics as a BI Architect and moved to Houston, Texas.

Why did I accept a new position?

Well, I had been working with my prior employer and client for about four years, and I felt I had reached a saturation point in terms of challenge and growth. I have seen my skill set grow from purely technical to business and analytical, and the credit goes to my previous boss (Jason C), who pushed me to learn beyond my boundaries. When this offer came, my first instinct was that the job would be a great opportunity to put all that learning into practice. The move was one of the toughest decisions I have had to make so far in my life; I thought long and hard before leaving such a friendly employer and a boss who was a true mentor, and I am grateful to them both.

What’s Crane Worldwide?

Crane Worldwide is a full-service air, ocean, customs brokerage, and logistics company. It has been in business for the last eight years and has achieved significant growth in both revenue and headcount. The company is now at a tipping point, and business intelligence is one of the key areas it is investing in.

How’s new job?

It's been three months, and it has been fast-paced and challenging. Logistics is a new industry to me; there has been a lot of learning, and I have still covered only a drop in the ocean. I was very much intrigued by the startup-like work culture and the responsibility they trust you with. I was able to carry out my first upgrade deployment and to draft an execution strategy and vision. Part of my job is to evangelize the BI initiative through trainings, so that all the users are on board with us. I see a lot of momentum for BI and am looking forward to keeping it up and moving forward!

How’s Houston?

Houston is hot and humid, and it always reminds me of my hometown, Chennai. It's a large, busy city where you spend a considerable amount of time on the road, whether you like it or not :). We are slowly getting accustomed to the new city; it's extremely different from Baltimore and its cold climate. I will surely miss (and happily escape) the snowy days, and now I can never escape the rainy ones :( I miss all the good hearts I met in Baltimore, and I hope all my friends and colleagues reading this blog are doing well. Take care :)

-Junaith Haja

 

Becoming a Speaker

One of my goals for 2016 was to speak at a local SQL Server user group and at a SQL Saturday event. I am very glad both goals came to fruition before the end of Q2.

I wanted to pick a unique topic for my presentation, and Power BI Desktop Fundamentals was an apt choice. It was a new tool, very few people knew about it, and it hadn't been presented at my local user group. I focused on developing a 101 course on Power BI covering the basics: top features, creating dashboards, and storytelling with it.

I got the opportunity to present it on April 19, 2016, at the Charles I. Ecker Business Training Center of Howard Community College for the Baltimore SQL Server User Group.

This was how I looked while presenting:)

Junaith at BSSUG

I had to rush to the meeting right after work, and the projector didn't work while I was setting up. I got very tense but found a workaround... phew!! Finally, once I started presenting, everything settled down!

Humility aside, I should say it was a huge success, and no one slept in the room :) There were a lot of good questions from the audience, and I was surprised to learn how many companies have already started using the tool.

I was deeply moved by the comments.

Mark

 

Vakul

Thank you, Jeremy Kadlec and the sponsors, for giving me the opportunity.

Speaking at SQLSaturday Baltimore

On April 30, 2016, I got the chance to speak at SQL Saturday Baltimore BI Edition. It was a dream-come-true moment to share the stage with MVPs and other experts.

We had a nice speakers' dinner at a Turkish restaurant, and hundreds of people attended the conference the next day. The whole event came together through the huge efforts of Slava Murgyin and Ravi Kumar; kudos to their hard work.

My Badge!!


With Ravi


I was really motivated by these opportunities and want to keep the momentum going. During Q3, I want to take it to the next level and present an advanced Power BI topic.

Hope I do it!!

Creating a SSRS report to show SSIS package run time statistics

In most companies, developers are restricted from accessing the MSDB database, so they rarely know how their packages perform in a production environment unless they have access to third-party tools or a friendly DBA. This happened to me once when I wanted to know how long my packages ran in production and had no access to the MSDB database to look at the sysjobs and sysschedules tables. The workaround is to enable SQL Server logging in the SSIS packages and to build an SSRS report from the sysssislog table.

The logic behind this solution is to enable SQL Server logging in the SSIS packages while we create and develop them, before sending them for deployment.

Follow the rest of the article from here
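As a sketch of the query such a report could be built on (assuming SQL Server logging was enabled and writes to a dbo.sysssislog table you can read), package run times can be derived from the PackageStart and PackageEnd events:

```sql
-- Run-time statistics per package execution, from the SSIS log table.
-- The event and column names are the ones SQL Server logging writes;
-- adjust the database/schema to wherever your logging was pointed.
SELECT
    s.source                                           AS PackageName,
    MIN(s.starttime)                                   AS StartTime,
    MAX(s.endtime)                                     AS EndTime,
    DATEDIFF(SECOND, MIN(s.starttime), MAX(s.endtime)) AS DurationInSeconds
FROM dbo.sysssislog AS s
WHERE s.event IN ('PackageStart', 'PackageEnd')
GROUP BY s.executionid, s.source
ORDER BY StartTime DESC;
```

A dataset like this can be dropped straight into an SSRS report and filtered by package name or date.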

Back From Vacation Syndrome and SSMS Shortcuts

I was just back in the United States from a long vacation and started work again. The first two days were filled with replying to pending emails and settling issues reported while I was away. Slowly the jet lag wore off, and my routine work life resumed. My client gave me a new requirement to build a report. I came to my cubicle, opened SQL Server Management Studio (SSMS), and started building the query for it. Having been away from work for a few weeks, I thought I might have forgotten the tables, the databases, and the relationships that exist between the different database objects.

To my surprise, I hadn't forgotten any of it; I painted the whole picture of the report from the different tables in my mind as my client gave the requirement. While building the query, though, I noticed something strange: I kept stumbling over the keyboard shortcuts I use in SSMS. I took a breath and yelled at myself (how could I?). Shortcuts are a cool SSMS feature, and I love using them. I knew I should get into my notes and brush up on my shortcuts.

This time I wanted to know why I had forgotten them, and I started googling. The two-word answer is brain plasticity. The ability of our brain to stay flexible and adapt to new changes is called brain plasticity, and our brain survives by forgetting, too. The article I read noted that the human brain doesn't forget intentionally; changes in the outside environment affect it, and it lets go of old material to store new events. Think of it this way: if I had been thinking about SQL and databases on my vacation, it would have been terrible, right? Luckily I wasn't. My brain was in vacation mode, and now that I am back at work, it will slowly store my work and daily-life events and let the vacation mode fade. Amazing, our brain!!!

I just call it Back from Vacation Syndrome :)

Here are the shortcuts I brushed up on from my notes. I would like every SQL developer to be aware of them, and to forget them only while on vacation :P

Ctrl + U – Change the database connection

Ctrl + F6 – Toggle between query windows

Ctrl + F5 – Parse the code

Ctrl + R – Toggle the results pane

Ctrl + K, Ctrl + C – Comment the selected code

Ctrl + K, Ctrl + U – Uncomment the selected code

F8 – View Object Explorer

Shift + Alt + Enter – Toggle full-screen view

Ctrl + Shift + U – Convert the selected code to uppercase

Ctrl + Shift + L – Convert the selected code to lowercase

Ctrl + Shift + Home – Select from the current cursor location to the beginning of the query

Ctrl + Shift + End – Select from the current cursor location to the end of the query

Share with me if you find more shortcuts that are essential.

#JunaithHaja

Download and Install Adventure Works 2014

Microsoft has released the AdventureWorks 2014 sample database for SQL Server 2014. Up through 2012, Microsoft provided the sample databases as mdf and ldf file downloads: a developer would download the mdf and ldf files and attach them to install AdventureWorks 2012. With 2014 the style has changed completely; the sample now ships as a .bak backup file that you restore, as the script below shows. Do watch the video to learn more about it.

Script:

USE [master]

RESTORE DATABASE AdventureWorks2014
FROM DISK = 'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\Backup\AdventureWorks2014.bak'
WITH MOVE 'AdventureWorks2014_data'
TO 'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\AdventureWorks2014.mdf',
MOVE 'AdventureWorks2014_Log'
TO 'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\AdventureWorks2014.ldf',
REPLACE

References and Downloads: https://msftdbprodsamples.codeplex.com/releases/view/125550

 

Configure Delayed Transaction Durability in SQL Server 2014

After reading my previous article on Delayed Transaction Durability, one of my Twitter followers sent me this question: How do we configure our server to work with Delayed Transaction Durability? Does it come by default?

First thing to note: Delayed Transaction Durability is applied at the database level, not at the server level. Here is how you configure it in SQL Server 2014.

Step 1: Open SSMS, right-click the database on which you want to set the feature, and open the Properties window.

Step 2: Go to Options and set the Delayed Durability property to Allowed under the Miscellaneous section.

Delayed Transaction Durability in SQL Server 2014

Now you are all set:)

Coming to his second question: by default, Delayed Transaction Durability is Disabled in SQL Server 2014, and we can set the property to "Allowed", "Forced", or "Disabled".
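If you prefer scripting over the Properties dialog, the same setting can be changed with T-SQL (a sketch; [YourDatabase] is a placeholder for your database name):

```sql
-- Allow individual transactions to opt in to delayed durability.
ALTER DATABASE [YourDatabase] SET DELAYED_DURABILITY = ALLOWED;

-- The other two values match the dialog options:
-- ALTER DATABASE [YourDatabase] SET DELAYED_DURABILITY = FORCED;
-- ALTER DATABASE [YourDatabase] SET DELAYED_DURABILITY = DISABLED;
```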

Hope it Helps!!

#JunaithHaja

 

Why you should upgrade to SQL Server 2014:

I shared my blog with my brother Zulfi Haja; he is a SQL Server consultant too and works for an investment bank in London, UK. Having reviewed the top 5 features of SQL Server 2014, he shot me an email saying, "Bro, the effort you put into summarizing the top 5 features was worth reading, and it's a neat approach; I would definitely recommend it to my colleagues. But I am still not convinced about an upgrade to SQL Server 2014. Can you help?"

I thought only a tech evangelist from Microsoft would be able to answer him. I took his question with a little hesitation and replied, "I don't know how long it will take, but I will surely get back to you once I find a convincing answer." He said, "All the best!!"

I looked for users' experiences with the 2014 Enterprise edition across my network and the blogs, to no avail. At the beginning of May, I joined a local SQL Server user group, and they invited me to their monthly meeting. To my surprise, there was a presentation on In-Memory OLTP and SQL Server 2014 by a product specialist from Microsoft. I said to myself, I've found the guy; this product specialist and tech evangelist should be the one to answer my question. His presentation was very convincing, and I found the answer in his talk. When the Q&A session opened afterward, I asked him the question, and he gave the answer I expected from the presentation: SQL Server 2014 can now support Tier 1 applications.

Let me elaborate. If you have hung out with developer crowds from different domains, they carry a preconceived notion, usually expressed as an analogy: Oracle is like the business class of a flight, where the major critical applications are built, and SQL Server is like economy class, used by mid-tier companies. In fact, in 60-70% of my projects the front-end applications were supported by Oracle, and the data was replicated to SQL Server environments for our reporting and analysis purposes. With SQL Server 2014, that notion no longer holds: as per my conversation with the tech evangelist, SQL Server 2014 is good enough to support Tier 1 applications and compete with Oracle-backed systems because of its new database engine design. He also hinted, if NASDAQ uses SQL Server, why not your applications?

So, Oracle folks, better be aware of it :)

#JunaithHaja

Delayed Transaction Durability In SQL Server 2014

Microsoft has introduced a new feature called Delayed Transaction Durability in SQL Server 2014. It aims to reduce the delays that happen at the transaction level and to keep the database fully available to the front-end client application. By default, transactions in SQL Server are durable, which means committed transactions remain in the system even after a system failure. With the 2014 version, Microsoft distinguishes two types of transaction durability:

1. Full Transaction Durability and 2. Delayed Transaction Durability.

Full Transaction Durability:

Consider a data entry user on a front-end application built in ASP.NET and backed by a SQL Server database. As the operator enters 100 records, the data is written into the database's log file, and when he hits the submit button in the client application, the 100 records are moved from the log file into the data file. During this process, the user does not get control of the client application until the write from the log file to the data file completes, and he cannot process or enter any data at the front end (this corresponds to the processing screen that appears when we submit data on a website; annoying, right?). Once the write completes, the log file becomes available again, the user regains the front end, and he continues his work. This is Full Transaction Durability. On the pro side there is no data loss; on the con side there is significant latency in the client application.

Delayed Transaction Durability:

In this method, a buffer is used for the log file, and the data is sent to the data file periodically, whenever the buffer fills up. Imagine the buffer can hold up to 25 records and flushes to the data file each time it fills. Replaying the data entry scenario above: data starts moving to the data file as soon as the user enters the 26th record, so by the time he enters the 100th record and hits submit, 75 records have already been moved and only 25 remain, which the buffer flush takes care of. The client application stays available to the user while the write happens, meaning no wait time. This is Delayed Transaction Durability. Client-side latency is significantly reduced, and the user gets full-time availability of the client application. The DBA has to configure the database to handle fully durable or delayed durable transactions.
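When the database is configured to allow it, an individual transaction opts in at commit time. The sketch below uses a made-up dbo.Orders table purely for illustration:

```sql
-- Hypothetical table and values, for illustration only.
BEGIN TRANSACTION;

INSERT INTO dbo.Orders (OrderID, Amount)
VALUES (101, 49.99);

-- The commit returns control to the client before the log records
-- are hardened to disk; they are flushed later in the background.
COMMIT TRANSACTION WITH (DELAYED_DURABILITY = ON);
```

The trade-off to keep in mind: any transactions still sitting in the log buffer are lost if the server crashes before the flush.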

A couple of months back, one of my colleagues had an issue with the log file filling up at a rapid pace and the nightly jobs failing. I hope that configuring the database with Delayed Transaction Durability, with its periodic flushing of data from the log file to the data file, will keep the log file from filling up. I am going to recommend it to my colleague; I hope you will too :-)

#JunaithHaja

Inline Specification of Indexes

We know from experience with previous versions of SQL Server that there are two ways to create an index on a table.

One is to right-click the table in Object Explorer and create it there.

The second is to explicitly write a T-SQL CREATE INDEX statement on the table after the CREATE TABLE statement.

Both methods share a single pitfall: the index cannot be created within the table definition itself, the way a primary key or foreign key constraint can. SQL Server 2014 overcomes this drawback by letting you create an index inside the CREATE TABLE definition, which is referred to as inline specification of indexes.

Hence the index can be defined within the CREATE TABLE statement as below:

CREATE TABLE DBO.FRUITSHOP (
ITEMNAME VARCHAR (25) NULL,
QUANTITY INT NOT NULL,
UNITPRICE FLOAT NOT NULL,
INDEX IX_RATE NONCLUSTERED (ITEMNAME, QUANTITY)
)

The above code creates a nonclustered index named IX_RATE, which can be verified against the SYS.INDEXES catalog view with the query below.

SELECT * FROM SYS.INDEXES WHERE NAME = 'IX_RATE'


However, the above CREATE TABLE code will fail in SQL Server 2012 and earlier versions with the following error.

Msg 1018, Level 15, State 1, Line 7

Incorrect syntax near 'INDEX'. If this is intended as a part of a table hint, A WITH keyword and parenthesis are now required. See SQL Server Books Online for proper syntax.

Hence 2014 makes our coding easier. The inline specification extends to clustered and nonclustered indexes with as many columns as needed; the details can be found on the MSDN site.
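As a quick illustration of that flexibility, a single CREATE TABLE can declare both a clustered and a nonclustered index inline (a sketch; the table and index names here are made up for the example):

```sql
-- Hypothetical table with two inline indexes:
-- a clustered index on the ID and a nonclustered covering pair.
CREATE TABLE DBO.BOOKSHOP (
BOOKID INT NOT NULL,
TITLE VARCHAR (50) NOT NULL,
PRICE FLOAT NOT NULL,
INDEX IX_BOOKSHOP_CL CLUSTERED (BOOKID),
INDEX IX_TITLE_PRICE NONCLUSTERED (TITLE, PRICE)
)
```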

Hope it helps!