This blog is meant to provide information, thoughts, and links that a computer programmer may find useful. There is no set strategy or limit on what topics will be discussed.

May 23, 2008

Review: C# & VB.Net Conversion

I'm not sure about others, but I quite often find I need to convert from one language to the other. Now, I feel I'm fairly well versed in both C# and VB.Net; however, VB.Net is more natural for me because I originally started programming with VB 3.0.

I've taken the time to learn C# and apply some of my knowledge to it, and I'll admit there are some very nifty parts to C# that you won't find in VB.Net. But the same goes for VB.Net, which has some features that C# lacks. I'm not writing this to debate which language is better; in my opinion both have their strong and weak points, and depending on the situation you may find one language handles the requirements better than the other. It all seems to come down to personal taste; I've yet to meet a challenge that one language could meet and the other could not. Now, I've had challenges where something was easier to do in one language than the other, but the other language could still perform the same feat with a little more ingenuity.
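
To make that concrete, here's one small example of my own (not from the book): the division operators don't behave the same way in the two languages, so a character-for-character conversion can silently change results.

    // In VB.Net, "Dim result = 7 / 2" yields 3.5, because VB.Net's "/" always
    // performs floating-point division (VB.Net uses "\" for integer division).
    // A literal translation to C# silently changes the meaning:
    int wrong = 7 / 2;              // C# integer division: yields 3, not 3.5
    // The faithful conversion needs an explicit cast to keep VB.Net's behavior:
    double right = (double)7 / 2;   // yields 3.5, matching the VB.Net original

Catching mismatches like this one is exactly where a side-by-side reference earns its keep.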

This all leads up to the fact that each language has strong examples for certain features or code samples, and there are times when it is difficult to convert code from one language to the other. The other day I picked up a book from Amazon.com called "C# & VB.Net Conversion". To my surprise this book covers .Net 1.1 and earlier; yet it still very much applies to converting code used in .Net 3.5 and earlier!

This book is well laid out; once you glance over the conventions used and the table of contents, you can quickly find information for one language and then immediately see the equivalent feature (or the method to accomplish that feature) in the other language.

I'd give this book a rating of 3.5 out of 5, only because I couldn't find a more recent edition. There have been some changes in both languages that make less than 10% of this book obsolete; so this isn't the 'solve all conversion problems' book. But my research indicates that, as of this writing, there isn't really a book that will do any more than the one I picked up.

I wanted to pass along this find to anyone who reads this blog and needs to learn to convert between the two languages; it's well worth the time to read and will make the process easier. Especially since you can get used copies of this book in great condition from Amazon.com for around $6 or less (including shipping).

Until next time, Happy Coding!

May 22, 2008

Virtualization (in a nutshell)

The other day a friend of mine and I were discussing how virtualization works; to help her out I broke it down to a simple analogy. In fact, I think it will help out most people who are not at all familiar with virtualization. Here is a snippet of the e-mail; some content has been modified to remove personal discussions and to make the conversation clearer to a third-party reader.

...You'd use virtualization if you wanted to run X number of instances of SQL Server and each instance (or many of the instances) needed to be on its own operating system. The reasons for doing this vary: some believe a dedicated OS will handle a single instance of SQL better than running 2 instances, while others might choose this route because they have 15 server boxes running SQL and, to lower the amount of reconfiguring, it's easier to put each box into its own virtual hard drive. Basically, you could use virtualization software to accomplish sharing (and dedicating) memory between SQL Server and a separate OS. Some people will purchase a single box for their SQL Server 'virtual worlds' (my terminology there) with the maximum RAM and CPUs installed, and then set up each virtual hard drive to use enough RAM for that particular instance. It's also worth noting that a 'virtual world' is limited to the maximum amount of RAM that its own OS will allow. Even if the server box is running Windows Server 2008 (64-bit) with the maximum allowed memory, you would still be limited to 4 GB if you installed Windows XP Home Edition as the OS for the 'virtual world'. You could always tell it to use more RAM, but the OS will only recognize the maximum limit it is allowed (or configured for). This is an important part to understand, because some new administrators will try to cut corners (or costs) by using a lesser OS, thinking that the Server OS will pick up the slack; this is NOT the case, as the underlying Server OS is simply a host that allows the 'virtual world' to run...it has no say in the actual running of the 'virtual world', other than the resources that will be made available to it.
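
If it helps to see that point as code, here's a tiny C# sketch of my own (the types are made up purely for illustration; this is not a real virtualization API) showing why the guest OS, not the host, sets the memory ceiling:

    using System;

    class VirtualWorld
    {
        public string GuestOs;
        public int GuestOsRamCapGb;   // hard limit imposed by the guest OS itself
        public int AssignedRamGb;     // whatever the administrator configured

        // The guest only ever recognizes the smaller of the two numbers.
        public int UsableRamGb()
        {
            return Math.Min(AssignedRamGb, GuestOsRamCapGb);
        }
    }

    class Demo
    {
        static void Main()
        {
            VirtualWorld vw = new VirtualWorld();
            vw.GuestOs = "Windows XP Home";
            vw.GuestOsRamCapGb = 4;    // the 4 GB cap from the example above
            vw.AssignedRamGb = 16;     // the host offers 16 GB anyway

            // Prints: Windows XP Home recognizes 4 of 16 GB assigned.
            Console.WriteLine("{0} recognizes {1} of {2} GB assigned.",
                vw.GuestOs, vw.UsableRamGb(), vw.AssignedRamGb);
        }
    }

No matter how much RAM the host hands over, the guest's own limit wins.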

An example would be if you had a 64-bit system with 4 dual-core processors running Windows Server 2003 with the maximum RAM (I think that's 32 GB, but I could be wrong on the max amount). Now, let's say your business wants a SQL Server instance that holds valuable company information (such as Human Resources records, accounting records, employee personal information, etc.). In this case you'd probably want to make sure this instance of SQL Server is not on the same machine that your users (whether they're employees of the company or customers of the company) are using. So, instead of purchasing a separate box for it, you could use virtualization to accomplish this feat (though, as with anything else, there are security concerns and procedures that should be followed when doing this). You would simply create a new virtual hard drive ('virtual world') that contained this SQL Server instance. And when creating this 'virtual world' you can specify how much RAM it can use, which would be determined by the amount of data flowing through the SQL instance and the number of users. What's really ingenious about virtualization is that you can go into an existing virtual hard drive (I believe even while it is running) and tell it to use less RAM or other resources, without ever shutting down the server. So, let's say on the same box your primary business SQL instance is running and using 28 GB of the 32 GB, and the other 4 GB is reserved for the Server OS. Well, you just go into the first virtual hard drive and tell it to use 24 GB; this frees up 4 GB, which you then tell the virtual hard drive being created for the company-sensitive info (SQL instance) to use. Then just monitor both 'virtual worlds' for a while (maybe a couple of days or weeks) and adjust the RAM, as well as any other resources, as you see fit.
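
Just to make the arithmetic in that example explicit, here's a quick C# sketch (the numbers are the hypothetical ones above, nothing more):

    class RamBudget
    {
        static void Main()
        {
            const int totalRam = 32;       // GB installed in the box
            const int hostOsReserve = 4;   // GB set aside for the host Server OS

            int vhd1 = 28;                 // primary business SQL instance
            int free = totalRam - hostOsReserve - vhd1;
            System.Console.WriteLine("Free before: {0} GB", free);   // 0 GB

            vhd1 = 24;                     // dial the first VHD down, no reboot needed
            free = totalRam - hostOsReserve - vhd1;
            System.Console.WriteLine("Free after: {0} GB", free);    // 4 GB for the new VHD
        }
    }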

As you can see, virtualization can be handy. But there are also drawbacks, which Windows Server 2008 addresses. I think Windows Server 2008 did a very good job of addressing the issues, but that is my opinion. The basic drawback is memory allocation/sharing; I'll cover that in a second. If you are interested in this type of scenario (using virtual hard drives) and see a business requirement you can fill with it to improve your systems, then you'd want to read up on virtualization and look at applications like VMWare or Microsoft's Virtual Server. I've heard VMWare is very good, especially in cases where you are consolidating 100s of servers into a few. VMWare, I believe, was the first true virtual environment...I could be wrong on this; I don't know the whole history of virtualization. My personal experience has been with Microsoft's Virtual Server; it's fairly simple to set up and use. I chose to go with MS over VMWare because I personally feel that MS would design their Virtual Server to utilize every nook and cranny of the Windows OS; I'd honestly look into VMWare if I were running an OS that isn't MS based...

...Hyper-V is a term you will want to familiarize yourself with if you are getting into virtualization. Hyper-V is a technology created by Microsoft that addresses memory allocation within the OS to improve the performance of virtual hard drives/OSes. Hyper-V was designed specifically for virtualization; it cleans up a lot of the memory issues that occurred with Windows Server 2003 and earlier versions. People would see problems when running 15 virtual hard drives at once; mainly, that their expensive RAM performed no better than common RAM picked up at a local computer shop! Microsoft's solution was Hyper-V! If you ever get bored and want to geek out, read up on Hyper-V. The concepts are actually quite interesting, and even if you're not familiar with virtualization you will still see how Hyper-V improves it. Amazing stuff!

The basic idea, in a nutshell, is that prior to Windows Server 2008 the virtual hard drives would compete with each other and the OS for memory. Think of memory like a card catalog in a library. If you want to find a book, you need to open a drawer, look up the location of the book, then go to the book. Now, if you want to find another book, you have to close the open catalog drawer, then open another drawer and find the location of that book. Now, let's say there are only 5 catalog drawers and they hold the locations of all the books in the world; and let's even say that everyone knows these catalog drawers have this information. So, you might be looking for a book on virtualization, and the person next in line wants a book about vacuums. They obviously can't look in the catalog at the same time as you, because there isn't enough room to move the catalog cards around so both people can view them. So, the next person MUST WAIT for you to finish; then, when you are finished and leave to get your book, they can look for theirs. Now, what if you realize you also want to look up something about SQL? Now you must WAIT for that person to finish looking in the catalog, PLUS anyone else that got in line before you returned.

This is also how memory worked; it was more of a linear type of thing. The OS would be accessing part of the RAM, then Virtual Hard Drive (VHD) #1 would access a part after the OS was done, then VHD #2 would access another part. So when switching between VHDs you'd run into WAITs for the OS or a VHD to finish accessing the memory. The first attempts to resolve this problem divided memory into sections; but even so, the OS alone uses up so much memory that the VHDs had to compete just to get enough to run, let alone to actually perform any operations (such as simply logging in to the VHD).

To resolve this, MS introduced Hyper-V, which basically holds references to (or copies of) memory in a temporary memory table. So when VHD #1 is done, the accessed memory isn't actually released; it's stored in a temporary table just in case VHD #1 needs to come back and access it again. The same goes for the OS and VHD #2. This doesn't necessarily resolve every potential memory-sharing issue…but it sure does greatly reduce them! Going back to the catalog analogy: if the section of catalog cards you were looking at was copied and placed into a separate temporary box, and you then realized you wanted a specific topic under virtualization, you could go straight to the temporary box of catalog cards that held info about virtualization books. This obviously saves you some time because you don't have to wait in that long line to access the original cards. It has its own limits, too (storing some cards in boxes): it takes additional time and effort to copy the information, you must find a temporary box (or create one), you have to keep others from using the box, and this only works if you are searching for a book that is relatively similar to your original one. If you want to search for a book about car engines, then obviously you still have to get back in line to access the original cards. See how there are some drawbacks?
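
If the analogy clicks better as code, here's a rough C# sketch of my own showing the general caching idea (the concept only; this is not how Hyper-V is actually implemented):

    using System;
    using System.Collections.Generic;

    class CatalogCache
    {
        // The "temporary box": copies of recently used catalog cards (think memory pages).
        private Dictionary<string, string> recentCards = new Dictionary<string, string>();

        public string LookUp(string topic)
        {
            string location;
            if (recentCards.TryGetValue(topic, out location))
                return location;                     // in the temporary box: no waiting in line

            location = WaitInLineForCatalog(topic);  // the slow, one-at-a-time shared catalog
            recentCards[topic] = location;           // copy the card into the temporary box
            return location;
        }

        private string WaitInLineForCatalog(string topic)
        {
            // Stand-in for the expensive lookup that everyone must queue up for.
            return "Aisle 7, Shelf 3 (" + topic + ")";
        }
    }

The second request for the same topic never touches the shared catalog, which is the whole trick: repeat visitors skip the line, at the cost of the extra box and the effort of keeping it current.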

That's about the best I can do at explaining this subject without getting into great technical detail. Virtualization is quite an amazing concept and can help businesses that use lots of databases and servers. Hardware technology, and operating system technology, are quickly adapting to handle virtualization more elegantly. It will probably be a few more releases before we truly start seeing virtualization perform near its peak potential; but now would be a good time to familiarize yourself with the topic while it's still being developed and refined. As with any other technology, the longer you wait to learn about it, the more complex it becomes, which means the more confusing it will be to learn. Imagine if you had learned SQL Server when it was simple, just a database with commands (before all of the DTS, SSIS, etc.); then today you'd only have to learn the new options offered in SQL Server. Imagine if you were to walk into using SQL Server today for the first time ever and be told to run a multi-national company's data structure...that'd be a daunting task, because there is just so much to learn with SQL Server alone!

I hope you get a lot of information out of this!

Until next time, Happy Coding!!

March 6, 2008

Should you use old or new technology?

Overview:

With the release of Visual Studio 2008, Windows Server 2008, and SQL Server 2008 (due in the 3rd quarter of 2008, as of this writing), I get asked whether I will be utilizing the new technology or staying with the old. This question requires some thought before deciding, and for many people the considerations can be very complex. I'm going to cover the considerations that an everyday programmer or administrator commonly comes across at a small- to medium-sized business. I am not covering Enterprise structures, because most have some sort of business IT roadmap/plan they will follow and require much more authorization and testing than smaller businesses.

Understanding the complexities involved:

First, to make the best decision, the IT admin/programmer will need to understand the overall objective of the business, the requirements that are expected to be met, and, foremost, the actual practicality of the business needs.

An example is a small business with fewer than 20 employees and probably only 1 (do-it-all) IT person; here the decision can be much simpler. This type of scenario will typically involve very few custom-made applications, typically applications that can be upgraded using built-in upgrade wizards, and, in most cases, limited compatibility problems with operating systems (i.e. most, if not all, systems run 1 OS version, like Windows XP Professional). In this case, let's assume the IT personnel has been authorized to make the upgrade decision on their own and has financially been given the 'green light'. So, what do you do? First, what OSes are being supported? In this case, Windows XP (no Vista…yet) is the prime OS. Next, how about the servers being supported? They probably have a few servers: maybe 1 using Windows 2000, 1 using Windows 2003, and a third using Windows 2003 to back up important files on a nightly basis. Next, how many applications are custom made (i.e. written in-house in a programming language such as C# or Visual Basic)? Maybe half a dozen custom apps exist. Next, what programming language was used? The applications were probably developed using Visual Studio .Net (any release) with Framework 2.0.

Another example would be a medium-sized business with 200+ employees and a small IT department of maybe 10 people. Going through the same line of questions might reveal that some employees are on Windows XP and some are on Windows Vista. You may also find that you have more than a couple dozen custom applications, produced in programming languages ranging from Visual Basic 6.0 to Visual Studio .Net 2005. They might have a small server cluster with redundant clusters.

Deciding what to do:

In the first scenario, with the small business, the decision is quite simple; you can upgrade if you want to, but you don't have to, because of the low number of custom applications and no immediate worries about the OS changing. In my personal opinion I would go ahead and make the conversion to the newly released software versions when possible; this will minimize the headache of upgrading at a later time, when you no longer have a choice. There are two areas of caution here. The first is that the custom applications were developed with the intention of running on Windows XP; it can take more time to upgrade them to run on Vista, but with the low number of supported applications they can most likely be upgraded using a wizard and then revised by hand over a small amount of time. The other caution is upgrading from Windows Server 2000; this one is a strong candidate for being upgraded as soon as possible, for two reasons: 1) the longer you wait, the larger the leap in OS versions, which means the more chance of problems, and 2) Windows Server 2000 may become unsupported during this year, which will mean retrieving support via websites only and getting only vital Security Updates…no more patches for bugs, and especially no more Service Packs!

In the second scenario, with the medium-sized business, the decision is much more difficult. The first consideration is what to do with the legacy applications (the Visual Basic 6.0 ones). These require quite a large amount of conversion work, which will be tedious and call for custom upgrading of the code. The next consideration is the servers: how to plan the migration path and test the migration before implementation. Then there is the support aspect for the older Windows 2000 servers; with these slated to be no longer actively supported, and fixes limited to security only (unless you have an agreement with Microsoft for continued patches), this becomes a race against the clock, or a daily gamble with the security of your system (not that the more current versions don't have their own security gambles as it is). In this scenario I would go ahead and make the switch to the latest programming language, because the upgrade path is minimal for most applications made with Visual Studio .Net, and immediately find a solution for the Visual Basic 6.0 applications. I would, at minimum, upgrade the Windows Server 2000 system, because the thought of minimal support from Microsoft is scary; it's already hard enough to get patches for products that are fully supported, and I can't imagine the problems with a product that is no longer supported in the mainstream.

What about SQL Server 2008?

This was not addressed in the above scenarios because it is pretty much the same decision process in both cases. The first question is: do you want to have the latest technology, and can you risk (in the more immediate future) a newly released software version? Most people have already decided to wait until Service Pack 1 is released before even testing the new technology. Larger organizations that already have 1 or 2 SQL Servers in production environments will most likely want to avoid upgrading those, due to the complexity of the upgrade testing and path; and I'm a believer in 'if it isn't broken, don't fix it'. I'd personally rather use SQL Server 2008 for newer developments, then start the migration process for the older databases as needed or as opportunity presents itself. You may find one day that you have to do a complete restore (I hope you never do), and at that point you might just decide to restore onto SQL Server 2008 instead and deal with compatibility issues then (this assumes you have already tested the upgrade requirements and functionality previously).

So, what do I do?

Well, it all comes down to two simple questions:

  1. Are you willing to put in the time to test current compatibility and perform upgrades manually where needed?
  2. Do you want to get the upgrade done now while you have a smaller upgrade path, as opposed to waiting another 5 years?

If you answered NO to either of these questions, then stay right where you are and keep your fingers crossed that the next version of these software titles keeps the upgrade path simple. If you answered YES to either, then determine your needs and start testing before actually upgrading (there are changes; some you might not like and others you may love...be prepared to read and learn about the new features in all 3 products). If you answered YES to both questions, then you will want to start testing immediately and pour in the resources available to you; this will be a bumpy ride in the beginning, but it will get much smoother as the dust settles and you get familiar with the new technology.

Conclusion:

Deciding whether or not to upgrade is not an easy decision to make by any means; I have only addressed two common scenarios, and no one person's scenario will fit perfectly into the ones above. Also, these are my personal opinions and thoughts on the matter; I'm in no way a guru in this subject and would caution you heavily to do all the research you can and carefully plan your steps. Don't be afraid to hire a consultant if you are confused about the best path to take. Microsoft works very closely with many consultants and vendors to prepare them for this upgrade and for handling many, many different scenarios.

In my overall opinion, if you can spend the time and effort, then upgrading to the current/new technology is much better than sitting on the old technology that feels comfortable and works for you; before you know it, you'll find that technology has blown right by you, and now an upgrade is no longer an option: you'll either have to rebuild your structure or hire someone to come in, fix everything, and deal with all of the headaches (I know this because I have been hired by a company to fix their 10+ year old technology problems…and it really is a headache, because of the drastic changes in technology architecture and methods)!

February 27, 2008

Blog’n Again…

Well, it's been a little over 6 months since my last entry for this blog. Life's been a challenge and required my absence from this blog (and my other blogs).

Now things have settled down. My health is stable, and my career has become stable. Now it's time to get this blog back to being stable! I hope to continue along the path I was headed down with this blog.

You may, or may not, know that Microsoft is in the process of releasing its 2008 versions of SQL Server, Windows Server, and Visual Studio. I will remain focused on the 2005 versions for the most part; however, I will try to touch upon some of the new 2008 features and enhancements. This blog may evolve toward covering the 2008 versions as they become more mainstream…and as I evolve to using those versions primarily. As with any newly released products, I will go through rigorous testing and validation before I convert over.

My current objective is to write an entry for this blog about once a week; however, it could be more or less often. I hope not to go any longer than two weeks between entries from here on.

Until next time…Happy Coding!