Sunday, December 03, 2006

Take Control of High-Level Userids

It's been a while since I published anything, so I figured I would drop a quick tip/suggestion for account administration that works well. One of the most overlooked and dangerous habits of system administrators, development staff, etc., is the lack of a good plan for safe usage of high-access user accounts. Obviously it's bad practice to allow users to perform all of their day-to-day responsibilities (email, web browsing, etc.) while logged in as an administrator. These accounts should be reserved for the duties for which they were created. If you allow otherwise, one of your power users will eventually succumb to a virus or browse some illegitimate website and wreak much more havoc on your infrastructure than they would have logged in as a normal user. Not a good plan.

So, how is your staff supposed to do their jobs without an administrator account? One simple solution that works well is to create two separate accounts for these users. The first account should have very minimal access, allowing just the basics; users should log in with this account every day. Make the second IDs similar to the first, but prefix them with a standard naming convention like "admin" to make them easier to manage. These second IDs should carry all of the permissions appropriate for each employee's tasks.

Now, forcing your users to log in and log out all day long will make them go bananas, and truthfully you will not get anyone to abide by this for very long without enforcing it, such as by denying things like email on the administrator accounts. To make everyone's life easier, you can create a batch file for each of the users that executes the RUNAS command and fires up a command prompt running as the admin user.

Example of the batch file contents:
runas /user:domain\administratorid cmd

Drop a shortcut to the batch file on their desktops via AD to make it really easy for them. There are also a lot of good options in the runas command that you can take advantage of, like using alternate profiles if need be.
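As a sketch of those options, here is a slightly fleshed-out version of the batch file. The account name domain\adminjsmith is a made-up placeholder; substitute your own naming convention:

```batch
@echo off
rem Launch a command prompt under the admin account.
rem /noprofile skips loading the admin user's profile, so the prompt starts faster.
runas /noprofile /user:domain\adminjsmith cmd

rem Alternatively, load the admin account's own profile if its tools depend on it:
rem runas /profile /user:domain\adminjsmith cmd
```

Run "runas /?" to see the full list of switches available on your particular Windows version.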

Now that we have our user logged in with a stripped-down account and running the custom command prompt as the administrator, you should be all set. The user can actually drag and drop any program into this command prompt (Computer Management, SQL manager, etc.) and voila - it fires up under the admin account. The user can keep this prompt running all day and use it over and over whenever they need to access an admin-level application.

In closing - having a good, organized strategy for account access is paramount to creating a safe, secure, and happy infrastructure.

Tuesday, February 28, 2006

Standards in enterprise level intranets

This is more of a best practices blog than a technical one. After seeing several large enterprise level intranets grind themselves to near uselessness, I figured it was time to shed some light on why standards can be so important.

Defined standards are an often overlooked part of a company's internal computing strategy, yet in my opinion a very important one. Introducing standards into web systems will, in the long run, save user frustration, save time, save money, and ensure that an organization's investment in its information remains accessible.

Keeping a few simple things in mind when laying out your design will inevitably create a better end user experience.

Successful enterprise-level intranets should contain usable, organized information. Feel free to babble on about the history of your company on your extranet, but remember to keep your intranet environment concise and to the point. The key is that the intranet is a tool, and when users' brains are hijacked by a lack of organization and extraneous information, its effectiveness is lost. End users should be able to retrieve what they are looking for quickly, and then move on.

Early intranet adopters usually have chaotic web structures. Many larger companies have a disorganized or nonexistent web structure because their strategy was (and is) to piece together all their departments' homemade websites. Every department has a self-proclaimed web aficionado, and that person was typically tapped to "put together" and maintain that department's intranet site. This leads to a host of issues, including lack of central management, unbalanced traffic loads (both physical and "political"), and my personal pet peeve - departmental branding, which I will get into a bit later. All of these things lend themselves to an inefficient end-user experience. It may sound harsh, but taking the design liberties away from your rogue developers will foster a user-centric and standard web experience. Corporate intranets should be centrally managed in regard to design and function; actual content should be delegated.

Drop the fancy logos. One thing that I have seen over the years in most if not all of the patched-together intranet systems is custom departmental logos popping up. Some facets of an organization will in fact need their own branding, but keep in mind most don't, and when they don't, those logos add to the confusion factor. Adopt a rendition of your corporate logo, and create a clear background for sublevels to modify with a picture explaining what it is that they do. Your company has already spent millions of dollars developing an image for itself; it may hurt, but that image is better than the fancy new logo you made in Photoshop. Sub-branding also throws off new users. I speak from experience when I say an intranet with a different header image and logo on each site makes a new employee wonder how many different companies are involved. Sure, departmental pride is a good thing, but who do you actually work for? Creating sub-logos projects that you are on a different team altogether, not working toward a common goal.

In closing, it is easy to see why we need standards. Designing the superstructure of your intranet intelligently will make your investment yield a much higher return. So, develop your design standards in regard to look/feel and navigation, and keep them user-focused! Long story short - all development, including back-end systems, graphics, and applications, should be agreed upon at a corporate level by development staff and management. Delegate the content management tasks to the guys in each department whose experience consists of making a website for their local church. Good luck, you're gonna need it.

If you read this far,  you should follow me on Twitter!

Wednesday, February 01, 2006

Classic ASP on 2003 Server with disabled Connection Pooling.

I thought that this experience was blog-worthy due to the highly undocumented nature of this problem. I hope that it can shed some light on why your new MS2003 web migration isn't exactly holding its own under a moderate traffic load.

The day starts like this: you take the initiative and migrate all of your rusty old NT web farm to a happy new Win2003 environment. You're running very expensive and highly trafficked custom ASP code with a remote SQL backend. Expecting huge performance gains from the cutting-edge systems, you brief everyone on the IT staff about the new direction the neglected web systems are headed.

You migrate all of the systems over in one fell swoop, do a bit of testing, and after deciding that everything checks out, you swap the DNS entries; now the applications are live on the new hardware.

At 4 a.m. the phone rings; it's Tokyo, and they want to know why their business-critical web applications are not serving pages.

Sparing you my ranting soliloquy, this is an issue that I recently ran into. Everything ran smooth as a whistle until there was a mild, and I stress mild, load (<300 concurrent connections) on the servers. But why were cutting-edge servers running the latest Microsoft web-serving technology outperformed by old NT machines? The answer is in the way that 2003 Server talks to SQL backends with connection pooling disabled.

Basically, the web server was running out of available ports to communicate with the back-end SQL server. This became evident by running a basic netstat on the test web server during one of the load tests and monitoring the connection traffic. By default, w2k3 server reserves 4000 ports for communicating with SQL. The netstat showed me that all 4000 ports were quickly opened to the SQL server simultaneously. When the server runs over this allotment, it starts denying connections. Each opened port is reserved for a default of 4 minutes, so even if the connection is idle, it sits in a "TIME_WAIT" status, essentially unusable by another request. Every request after this limit gets denied until ports are freed. This is a very "obscure" feature of 2003 and classic ASP.

There are essentially two ways to fix the problem. In this case, partly due to the nature of the applications, I chose to increase the port allotment on the web server to allow more simultaneous connections. Here is the quick and dirty fix:
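To see the pile-up for yourself, a couple of one-liners from a Windows command prompt will do it. The SQL server address 10.0.0.5:1433 below is a made-up placeholder; substitute your own backend's IP and port:

```batch
rem List every connection to the SQL server stuck in TIME_WAIT
netstat -an | find "10.0.0.5:1433" | find "TIME_WAIT"

rem Or just count them
netstat -an | find "10.0.0.5:1433" | find /c "TIME_WAIT"
```

If the count climbs toward your ephemeral port ceiling under load, you are watching this exact problem happen.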

In the afflicted web server's registry, add the value

Value Name: MaxUserPort
Data Type: REG_DWORD
Value: 65534

to

HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters

This essentially gives the client (in this case, a web server) roughly 60,000 more ports to play with. After applying the changes, the load tests showed that this successfully resolved our issue.
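If you have several web servers to patch, the same change can be packaged as a .reg file. This is only a sketch; back up your registry first and reboot after applying. The second value, TcpTimedWaitDelay, shortens the 4-minute TIME_WAIT hold mentioned above; the 30-second setting shown is just an illustration, not a recommendation for your environment:

```reg
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters]
; 0xFFFE = 65534 - raise the highest usable ephemeral port
"MaxUserPort"=dword:0000fffe
; 0x1E = 30 seconds - optionally release TIME_WAIT ports sooner (default is 240)
"TcpTimedWaitDelay"=dword:0000001e
```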

You should, under just about all circumstances, be using connection pooling; however, there are some circumstances where this is not feasible. In my own opinion, I am not confident that connection pooling works very well, based on some of the load tests I have done on 2003 Server communicating with a SQL backend (in my case, a SQL cluster). This may or may not have been fixed by the time this is read, if it was an issue at all.
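For what it's worth, with classic ADO the OLE DB resource pooling behavior is typically steered from the connection string via the "OLE DB Services" attribute. Here is a hypothetical classic ASP sketch; the server, database, and credentials are all made-up placeholders:

```
' Pooling enabled is the SQLOLEDB default; "OLE DB Services=-1" makes it explicit
Dim conn
Set conn = Server.CreateObject("ADODB.Connection")
conn.Open "Provider=SQLOLEDB;Data Source=mysqlbox;Initial Catalog=appdb;" & _
          "User ID=webuser;Password=changeme;OLE DB Services=-1"

' "OLE DB Services=-2" disables pooling (and auto-enlistment) - the scenario above
```

With pooling disabled, every Open/Close pair burns a fresh ephemeral port, which is exactly what exhausts the default allotment under load.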

After spending hours testing and researching this issue, we happened upon this fix. Since then, MS has published this KB article explaining most of what is going on here:

http://support.microsoft.com/kb/328476

Please note that this may or may not be the best solution for your setup. This type of issue can also appear when there are other underlying problems, such as connections that are never properly closed in code, or code that opens and closes connections too rapidly, both of which put high stress on your database servers.

I hope this is useful info, I know it would have been to me had I run across it.

Friday, January 20, 2006

The Second E-Revolution

I look at starting an e-business like a treasure hunt, with an idea as the map and the successful venture as the treasure. You have to follow your treasure map before someone else gets a chance to photocopy it and claim the gold for themselves. You have to protect your ideas at all costs, because in many cases they can be your only true asset until you have a large online following.

Your product or service has to be new, original, witty, and it helps if it is something that the media would have a field day with. That's one thing that many people don't understand about the internet and starting a business on it. The internet was built to share information, so in turn, anything that is new or original on the net has the tendency to spread like wildfire, whether it makes financial sense or not. Internet businesses have been won and lost over a single article, but the overall longevity of the venture depends on the effort put forth to keep the site on top in terms of technology, ease of use, and services. The media is a powerful force when it comes to e-commerce, but without a concerted effort to stay first, you will find that spotlight burns out very quickly.

Many skeptics say that the dot-com era has financially come and gone, but in my opinion it hasn't even started yet. The rise and fall of the dot-coms of the late 90's was due to over-hyped, overvalued stocks, not because the internet was not a sturdy platform for business. The dot-coms came and went because we had the ideas for using the internet to make money, but the general population didn't yet have the dependency on or trust in the net. This left the startups having to sell the public on why using the internet was better than driving to their local Best Buy. The net still has inherent qualities which make it perfect for business in the future:

1. Relatively low overhead
2. It's always upgradeable
3. The public's growing dependency on the net

Devices (phones, PDAs, cars, etc.) are becoming more and more 'smart,' or connected. This in turn is moving us more and more into a wired (and wireless) world. As the world becomes more used to the idea of information at the click of a mouse or the touch of a button, people are also becoming more dependent on these services. Imagine trading your TV in for a radio. Crazy? Of course it is, but with wireless connections on the rise, trading a web-enabled smartphone in for an analog cell phone would be just as ridiculous.

We are just now physically catching up with all the promises we got about the internet and how it was going to change our lives. The general public is using powerful internet tools to book their flights, sell old junk, and to communicate more and more every day. Trust in the internet is finally growing and with that trust people are spending more money for products and services found solely online.

The idea is the same: have a great idea for a website and become a millionaire. But this time around the stakes are much higher and the rewards much greater for those who can succeed in the second e-revolution.

Thursday, December 15, 2005

Greetings Earthlings

Hello World, and welcome, finally, to my blog. I have been meaning to put all of my rants into a blog for years, and finally this comes to fruition. No more procrastination, no more excuses. Please feel free to post your own comments and questions. I vow to update the site at least twice a month, perhaps more, but I'll attempt to spare you from useless drivel. So sit back, relax - and don't forget to bookmark ryanbutcher.com, a sometimes personal, sometimes professional blog...