The Revolving IT Door Starts to Spin

Recently our IT department lost two of its top leaders.  One was the head of IT, who left for a position at a university in Texas.  The other was essentially the second in command, who left for a different position within the organization, but not in IT.  In the past few months we have also had more people change positions than in any other period I can remember over the last eight years.  Most of the moves have been to get more money, not because they really wanted the new position or were unhappy in their old one.  I've had the opportunity (so to speak) to sit back and watch objectively while waiting in doctors' offices as my wife investigates alternatives that might help her live with her cancer (there may not be a cure, but there may be a way to manage this disease for an extended time).  While I have an MBA and thus some background in looking at business issues and not just technical issues (which is why I alternate between a business blog and a technical blog), not many of my co-workers recognize that I just might have some insight into business issues.  But just in case they are reading, here is what I think is going on.

Over the last three years, salaries at our organization have been essentially frozen.  I know some of you might be thinking that at least salaries were not cut, and I get that.  However, salaries at local public institutions like school districts have never been on par with business in general.  The gap still exists, but it is not quite as wide.  At one time people came here to work because of the better benefits and the near guarantee of a job.  Well, some dents have been put in those arguments lately as well, although they are still better than at most businesses.  At one time, there was very little job stress in these positions.  Again, not true anymore.  Long days and extended workweeks are not uncommon.  In fact, to use the terminology of the 'Carrot a Day' people (Adrian Gostick and Chester Elton), there are not a lot of carrots going around.  Some might say there is a famine.  Some of this is because there is little to no discretionary spending, especially in the public sector, where local newspapers are ready to pounce on the hint of any money spent to reward individual employees.  Some of this is because there is a fear of being taken to Human Resources for treating some employees unfairly compared to others.  Some of it may even be because management just doesn't recognize who their best performers are, because they don't really understand the work they do or the effort expended to do it.  It doesn't matter what the reason is, just that good performance often goes unrecognized and unrewarded.

As a result, good employees are faced with two options if they want to grow their career.  One option is to find someone within the organization to help them move to another position.  The other option is to look for a job in some other organization.  Both options are viable even in today’s economy, but both options result in a loss of knowledge and capability in their old department that could take months or years to replace.

A quick example: during the eight years I've been with our organization, I have seen four different heads of our IT department.  That is about one every two years.  Moreover, between each one there has been a gap of six months to a year before the position was filled.  How beneficial is that for our organization's strategic effectiveness?  Did some of them leave because similar positions in other organizations paid more for the same level of commitment?  Did some leave because there were not enough funds to 'make a difference' in our organization?  Or were there other reasons, such as less stress or not being on call 24/7 for the same pay?

Any organization today that artificially holds IT salaries below the industry average for a given position, regardless of performance, especially when there have been few or no raises over the last several years and no year-end bonuses, cannot be surprised when its best employees start to leave.  I know the arguments.  They tell you times are tough and there are few jobs out there.  But honestly, that is just not as true for IT as it may be for other areas of business.  IT always rallies first as businesses try to leverage their data to gain an advantage while the economy improves.  'Ride the wave', so to speak.  For example, anyone with some experience looking for a job in BI (Business Intelligence) can probably get a new job fairly quickly.  So perhaps one indicator that times are improving is how many of your top performers are leaving.

Top performers also look for new challenges.  If your organization has cut back on exciting new projects and relegated its best people to maintenance tasks while giving the few fun new projects to consultants, is it really any surprise that they will leave as soon as the economy appears to pick up?  Do you have consultants come in to write applications and then turn the maintenance over to your internal staff?  Money is still tight.  But investment in the future, as the economy begins to dig out of the doldrums it has been in for the last three years, can separate the winners from the losers.  It is sort of like playing Monopoly.  One strategy is to buy as many of the 'better' properties as possible early in the game, even if you don't have a lot of money, because holding those properties will give you an edge later.  Yes, that means buying certain blocks of properties, utilities, and/or railroads depending on your personal strategy.  Employees want to be on a winning team, not just a team that 'gets by'.

I've also heard, both within our organization and from other organizations, that there just isn't money for training, or alternatively that training employees just makes them more attractive to other organizations.  While money spent on training is a tricky subject, it is important to employees, especially in IT, to feel that their skills are not being left behind.  I've lost count of the number of times I've upgraded my skill set over the years, and almost every time I paid to learn those skills with my own time and money.  If I hadn't, I would still be looking for a job writing FORTRAN applications on large mainframe computers, and there are not many of those.  The point is that employees will just as easily leave your organization if you don't provide training as if you do.  Maybe they are even more likely to leave.  Think of training as a benefit.  Yes, you might have some employees leave after the training, but your best performers would probably have left anyway if you did not provide it.  At least by training employees, you might reap the benefits from those who do stay because they feel some loyalty to an organization that at least tries to help them be happy in their jobs.  And being happy, to a technical person, means using the newest tools, languages, and hardware to get the job done more efficiently, faster, and with fewer problems so they can have a life outside of work too.  Yes, even IT people sometimes want to have a life outside of work.  That is really best for their health and wellness, or as some may say, sanity.

It is too easy for management to hide behind the opinion that technical people like to work 24/7.  Most really don't.  At the same time, they can and will put in extra time for interesting challenges or in exchange for some promised reward such as a bonus, time off, more training, etc.  But don't make promises without delivering on them, because that is a sure way to guarantee they will leave or adopt an 'I don't care' attitude the next time you need something special.

One last point: if your technical people are constantly on call 24/7, to the point where they feel they are really working two jobs and getting paid for one just because management uses the phrase 'other tasks as assigned', don't expect loyalty.  Occasional overtime is one thing, and even extended overtime with a promised reward can be tolerated.  However, when overtime becomes the norm and a daily expectation, you can expect to see an exodus when the economy starts to improve.

So what are your plans?


Programming in Microsoft Excel – Macros – Part 1

Many people use Excel, but few of those people know how to add programs within Excel either to be part of the spreadsheet or just as a tool to help build the spreadsheet.  Over a series of blog posts, I will examine how to work with macros and create your own UDFs (User Defined Functions).  My goal is not to teach you everything about how to use these two features because there are books out there devoted to that, but just to show you how to get started.

The first thing you have to do is activate the Developer ribbon.  While Microsoft provides a Developer ribbon with options to develop macros and UDFs, it does not display this ribbon by default.  I guess that is another example of my opening statement: even Microsoft does not believe that the 'average' Excel user will need these features.  But then, you are not average, right?

To turn on the Developer ribbon, open the File menu and click Options toward the bottom of the menu.  This opens the Excel Options dialog, which consists of a menu on the left and the options for the selected menu item on the right.  Find the Customize Ribbon item in this menu and click it.  There are many controls on the right side of the dialog to help you customize the ribbon.  For now, the only control you need to focus on is the large list box on the right side that shows the main ribbon tabs.  Unless you have turned it on previously, the only ribbon not selected (with a check in the checkbox to the left of the name) is Developer.  Click this checkbox and then click OK.  You should now see the Developer tab at the top of Excel, and if you click this tab, you should see the Developer ribbon shown below.

Now that you are ready to access the developer tools, such as the ability to record macros or create your own UDFs, let's suppose you have the product inventory spreadsheet shown on the right.  It shows the product code, product name, and current count.  However, the data appears as a single column of information, and you want to turn it into rows.  You could manually copy and paste data into multiple columns, but that would be too time-consuming.  Because the structure of the data is consistent (three rows of data sequentially displaying the code, name, and count), you realize that a macro could perform those repetitive steps for you much more easily.

Position your cursor in the first cell of the data and open the Developer ribbon.  Because we are going to be moving data around, first you want to make sure that you select the option Use Relative References in the Code group of this ribbon.  When building a macro that must move between cells, there are two things that you must remember:

1)      Never use the mouse to move between cells.  When you click on cells with the mouse to move to a new cell, it adds the absolute cell reference to the macro.  This is rarely what you want.  In fact, it only really makes sense if you want to access a specific cell every time you run the macro no matter where in your spreadsheet you started the macro.  Instead, you want to use the keyboard arrows to move between cells.

2)      Even if you move between cells with the keyboard, you will not get a relative reference (to allow you to apply the macro on different rows or columns in the spreadsheet) unless you first click the Use Relative References option in the Code group of the Developer ribbon.  You will know when you have selected this option because Excel highlights it.

Failure to consider these two points can lead to a very frustrating experience for the first-time macro builder and may even convince them that macros are not worth learning.  Unfortunately, I do not know of a way to make relative references the default setting for all new worksheets, even though that seems like a logical option to include in the Excel Options dialog we looked at earlier.
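To make the difference concrete, here is a minimal sketch of the kind of code the recorder produces in each mode.  The macro names and cell addresses are just examples for illustration, not anything Excel generates for you:

Sub AbsoluteMove()
    ' Recorded with absolute references: always jumps to cell B2,
    ' no matter which cell was active when the macro started.
    Range("B2").Select
End Sub

Sub RelativeMove()
    ' Recorded with Use Relative References turned on: moves one cell
    ' down from whatever cell is active when the macro runs.
    ActiveCell.Offset(1, 0).Select
End Sub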

Check again that you have the first cell and only the first cell selected and click the Record Macro option in the Code group.  A dialog box appears as shown below.  You must provide a name for the macro.  Excel suggests a simple name that begins with the word Macro followed by a sequential number.  You do not have to use this name.  You can provide your own name, but it cannot use spaces.  If you want a name that has multiple words in it, use an underscore rather than blanks between the words.  You can then provide a shortcut key to quickly execute the macro.  While not required, a shortcut key makes it easier to execute the macro without going through the Developer ribbon, opening the Macros dialog, selecting the macro you want to execute and then clicking Run.  You also have some options about where you can store your macro.  For now, just store the macro in: This Workbook.  Finally, the Description field is also optional but can be useful if you create many macros or if you use the Personal Macro Workbook to collect all the macros that you typically use.  I’ll discuss macro storage options in a future blog entry.

From this point forward until you stop recording, Excel translates all of your actions into the macro, which is essentially a small Visual Basic program.  If you are following along in your own version of this data, follow these steps:

1)      Press the down arrow key once.

2)      Press CTRL-C to copy the product name to the clipboard.

3)      Press the up arrow key once, then the right arrow key once.

4)      Press CTRL-V to paste the clipboard contents in the second column of the row having the product number.

5)      Press the left arrow key once and the down arrow key twice.

6)      Press Ctrl-C to copy the product quantity to the clipboard.

7)      Press the up arrow key twice, then the right arrow key twice.

8)      Press CTRL-V to paste the quantity in the third column.

9)      Press the left arrow key twice and the down arrow key once.

10)   Hold the Shift key while pressing the down arrow key once more to select the two rows we now need to remove.

11)   Right click the mouse button to display a dropdown menu and select Delete.

12)   From the popup Delete dialog, select Shift Cells up and click OK.

13)   At this point, two cells are still selected: the product code and product name of the next item.  You only want the product code cell selected.  To correct this, press the up arrow key once followed by the down arrow key once.

14)   The product ID should now be the only selected cell so we can stop recording by clicking either the Stop Recording option in the Code group of the Developer ribbon or by clicking the Stop Recording button just below the sheet tabs in the bottom left corner of the worksheet.

You can now repeat these steps for each product item by making sure the product ID cell is selected and then pressing the shortcut key combination you created in the Record Macro dialog until you have the entire table converted as shown to the right.
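If you are curious about what Excel recorded, open the Visual Basic editor (the Visual Basic button in the Code group of the Developer ribbon) and look at the module Excel added to your workbook.  Your recording will differ in the details, and the macro name and exact syntax below are only illustrative, but a cleaned-up version of a macro recorded from the steps above might look something like this:

Sub Build_Product_Row()
    ' Assumes the active cell holds a product code, with the product
    ' name and count in the two cells directly below it.
    ActiveCell.Offset(1, 0).Select               ' step 1: down to the product name
    Selection.Copy                               ' step 2
    ActiveCell.Offset(-1, 1).Select              ' step 3: up and right to column 2
    ActiveSheet.Paste                            ' step 4: paste the name
    ActiveCell.Offset(2, -1).Select              ' step 5: down to the product count
    Selection.Copy                               ' step 6
    ActiveCell.Offset(-2, 2).Select              ' step 7: up and right to column 3
    ActiveSheet.Paste                            ' step 8: paste the count
    ActiveCell.Offset(1, -2).Range("A1:A2").Select   ' steps 9-10: select the two leftover cells
    Selection.Delete Shift:=xlUp                 ' steps 11-12: remove them, shifting data up
    ActiveCell.Offset(-1, 0).Select              ' step 13: reposition on the next
    ActiveCell.Offset(1, 0).Select               '          product code
End Sub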

Well, that's a quick start to creating macros that replace a series of repetitive keystrokes with a single executable program.  In future blog entries I'll talk about different ways you can access your macros, different ways you can save macros depending on how you use them, and how to edit your macros to customize them further.

Real Programmers DO Use Wizards

Many years ago, when I was working on my first FoxPro book, the FoxPro 2 Programming Guide, I wanted to include several chapters on how to use the wizards provided by FoxPro to create everything from forms and reports to menus.  However, my editor, Lance Leventhal, insisted that real programmers DO NOT use wizards and that he would not allow me to include those chapters in the book.  I continued to argue for the inclusion of these chapters and eventually won, but I think he was disappointed to have to include them.

Lance's background was in assembly language programming, and he wrote quite a few books on assembly language for many of the early microcomputer processors.  His experience as a writer was much greater than mine, since at the time I had written only one previous book, on the Apple II.  However, I learned a lot about how to write technical books from Lance and will always be grateful for his mentoring.

In fairness to his opinion on wizards, though, I suppose their use was not a big thing in his programming world, at least not as big as it was in FoxPro.  However, I totally bought into the concept of using the FoxPro wizards to help build my applications.

So why do my co-workers now hear me complain about the use of wizards?  Well, it is not so much their use as their overuse that concerns me.  A few years ago we brought in a consultant to help with some programming projects because our regular team was fully engaged in the planning and roll-out of our public-facing, SharePoint-based school web sites.

While we were busy with our project, the consultant decided, rather than just writing the application that was needed, to write a wizard that would write the application.  Basically, every form and every report was created through his wizard interface.  Any change to a form or report was accomplished by first changing the wizard and then regenerating the application.  That might sound great at first, but several problems surfaced only after the consultant moved on and we were left supporting the application on our own.

The first problem was that the team member assigned to take over the application not only had to learn the processes the application supported, which were very complex and deserve a story of their own, but also had to learn how to work either within the wizard or around the wizard to make changes to the code.  If he tried to work around the wizard, the next time the wizard was used to regenerate the forms or reports, all of the custom changes were lost.  On the other hand, the wizard generated such complex code that it was hard to make changes manually in the first place.  Either method of changing the code therefore took more time than I felt it should for what in many cases should have been a simple program update.

Of course, most programmers don't think about time; they think about the challenge of the task.  But because this application was in a constant state of flux, the changes quickly overwhelmed the programming.  We finally reached a point where we no longer believed the current application could be effectively maintained and decided it needed a major overhaul.  That meant either rewriting the entire application from scratch or looking for a third-party application that would handle most if not all of the functions needed.  In the end, management decided to look for a third-party application.

The moral of this story is that this failed project attempted to make wizards do too much.  It tried to make the wizard the application itself rather than using the wizard to save some initial design time.  Wizards can be great for getting as much as 80 percent of the work done quickly, but then the programmer must take over the remaining 20 percent to customize the code to meet the specific application's needs.

Quite honestly, though, the wizards in Visual Studio and SQL Server Reporting Services already meet this need most of the time.  Most programmers don't realize it, but much of the programming needed to build a window, a dialog, or even a report is now done for you as you visually design the user interface.  After the basic visual design is complete, the programmer can open individual controls and add code to their events to make the program actually do what it needs to do.  Writing another wizard to perform these same tasks is redundant.  Writing a wizard to help write the code for the control events seems to me to be automating too much effort for too little return.  If a programmer can write the code for a wizard that writes the code for control events, why not just write the control event code in the first place and save yourself a lot of time?

The bottom line is that I'm all for using wizards, but only up to a certain point.  Wizards can help build code for easily repeatable activities such as building the basic form or report infrastructure.  These are activities that every application needs and that can easily be generalized.  Custom programming activities should remain custom programming activities.  Don't try to create tools to create custom code, because all you have done is add another layer of complexity that can break and that must itself be maintained.

I'm sure that over time wizards will continue to take over more of the programming activities that must or should be done manually today.  Does that mean the need for programmers will disappear?  I doubt it.  For every advance in interface design I've seen over the years, from dBase II's tiny dot prompt to today's very rich user interfaces, the need for programmers to customize the code generated by wizards has, if anything, grown.

With that said, I'm about to launch a series of Saturday blog posts on using macros in Excel.  Macros are wizards.  You can have Excel record a series of keystrokes and automatically generate the corresponding code to perform those actions.  (That's the wizard part.)  However, sometimes you cannot do everything you want just by recording your keystrokes.  You need to get into the generated code and tweak it a bit to suit your needs.  This series of articles will follow that distinction, using the built-in Excel recorder to build as much of the code as possible, but then going into the resulting code to make adjustments rather than attempting to make the wizard do everything we need.

Sorting Columns in Excel

Almost everyone who uses Excel knows how to sort rows, but how many know how to sort columns?  I didn't, until I happened across an option dialog for sort that I hadn't looked at before, or if I had looked at it, I dismissed it as unimportant.  For the last several years we have been helping users collect data from other systems to add to SharePoint lists.  We would often export the data from the source system to a CSV file because we knew that we could easily import the CSV file into Excel and then create a new list in SharePoint from that Excel spreadsheet.  We could even copy and paste data from the Excel spreadsheet into an existing SharePoint list as long as the column order was the same and the data types matched.  The problem was that the data source often had columns in a totally different order from the SharePoint list columns.  So we would copy and paste columns.  Yes, I know you can drag and drop columns as well, but sometimes it can be really tricky trying to grab the selected column to drag it to a new position.  I had wished for a way to just identify the column order I wanted and, with a click of a button, magically rearrange the columns in my spreadsheet.

Well, wishes and magic sometimes occur.  Let’s take a simple example of the spreadsheet data in the figure below.  This example only has 6 columns, but the technique will work for any number of columns.  The data begins with Last Name and then displays First Name, Office, Title, Phone, and Email.  Perhaps what we really want is to display the columns in the order Office, First Name, Last Name, Title, Email, and Phone.  While this would only require a couple of moves by dragging and dropping columns into new positions, imagine if you had a spreadsheet with a hundred or more columns and you needed to move columns throughout the sheet.  It would be a little more difficult to accomplish manually.

Suppose you could add a row across the top of your spreadsheet and identify the column order there.  Well, that is exactly how it works.  In the figure below, I added a new row across the top of the spreadsheet and numbered the columns '1' through '6'.  There is nothing special about these numbers.  In fact, I could number the columns by 10s or even 100s so that I would have room between the numbers I had already assigned to slot in columns that may be farther off to the right, even off the screen.  I could even 'number' the columns using letters of the alphabet or words.  In fact, anything that can be ranked will work, including combinations of letters and numbers.

After ranking the columns, go to the Data ribbon and click on Sort within the Sort & Filter group.  This button displays the Sort dialog shown below.  Normally I would just use the Sort By dropdown which by default shows me the names of the columns or column letters on which I can sort.  However, the button I kept skipping over was the Options button found in the top center of the Sort dialog. 

 

Clicking this button displays the Sort Options window, which consists of only two options.  The first option lets you decide whether you want a case-sensitive sort.  Most people do not, so this checkbox is not selected by default.  However, if you ever need to sort text data in which case matters, here is where you can adjust your sort.  The second option lets you select the sort orientation.  Normally, the orientation is set to top to bottom for sorting rows.  But to sort columns, you must change the sort orientation to left to right.

 

With the sort orientation changed, click OK to return to the Sort dialog and open the Sort By dropdown.  Rather than the column names you may be used to seeing, you will see row numbers.  Since the temporary row in which we defined our desired sort order is row 1, we select that option from the dropdown menu.

 

Continuing in the Sort dialog, you would probably still sort on values, but you may want to change the Sort Order to read Smallest to Largest if it is not already set, since we want the first column to be on the far left.

 

When we press OK, Excel almost instantly redefines the column order based on our numbers as shown in the next figure.                                                                                                                             

 

Now the only thing left to do is to select Row 1 and right click on the row number along the left side of the screen.  Select Delete from the menu.  For sorting a large number of columns before importing them into SharePoint or any other system, this method is by far faster than manually dragging and dropping columns or even cutting and pasting.
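If you do this often enough, the same left-to-right sort can be captured in a short macro.  The following is just a sketch that assumes the ordering numbers are in row 1 and the data is contiguous starting at cell A1; adjust the ranges to fit your own sheet:

Sub SortColumnsLeftToRight()
    With ActiveSheet
        ' Sort the block of data around A1 from left to right,
        ' using the values in row 1 to order the columns.
        .Range("A1").CurrentRegion.Sort _
            Key1:=.Range("A1"), Order1:=xlAscending, _
            Header:=xlNo, Orientation:=xlLeftToRight
        ' Remove the temporary numbering row when done.
        .Rows(1).Delete
    End With
End Sub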

Hope this saves you some time.

Are Your SharePoint Libraries File Dumps?

When we first started to use SharePoint five years ago, our focus was to replace our external Internet-facing sites with a more consistent look-and-feel experience for end users.  We did not have a lot of collaboration sites, but we did support a few, particularly among the groups within IT.  We definitely did not spend a lot of time providing training on how SharePoint libraries should be used.  Looking back, that was probably because we did not have a really good idea of what worked and what should be avoided.  Therefore, since most of the collaboration site users were IT veterans, they designed their own libraries around a paradigm they knew well: a hierarchy of folders within folders within folders…

It took a few months, but we soon started to hear about problems such as:

  • I know I saved my file, but I just don't remember where.  Can you help me find it?
  • We just discovered that there are two copies of the monthly status report stored in different folders in our site.  Each one has updates that are unique.  Can you help us merge the documents back into a single document?
  • I saved our documents in our project folder, but now when I attempt to access the file with its URL, the browser cannot find it.  What is wrong?

Many of these problems were the direct result of the users having created a complex folder hierarchy that not everyone was familiar with.  We also found several cases where someone saved a document in a different folder specifically because they could not save the file back to the original library.  Perhaps they could not find the original directory again or maybe someone else had the file locked.  Okay, the situation here was that they turned off automatic checkout for the library because checking files out to edit them and then checking them back in was ‘just too much trouble’.

SharePoint 2007 with Microsoft Office 2007 did provide a 'default' lock on the file.  However, that default lock released after about 20 minutes or so, depending on the client's OS, so conflicts could easily occur.  One of the advantages of moving to SharePoint 2010 and Microsoft Office 2010 is that this dynamic duo now supports multiple editors in the same document at the same time.  Yes, you can now have two or more people edit the same document at the same time; they just cannot edit the same paragraph at the same time.  My book, Office and SharePoint 2010 User's Guide from Apress, covers this topic in detail.  (BTW, to the person who gave the book a bad review because some copies went out from the publisher with the wrong cover: thanks for letting us know, and I'm sure Apress would have sent you a corrected copy had you contacted them.)

Another solution to the problem of not finding a document would be to use Search.  However, our users had such a bad experience with the web search engine prior to our switch to SharePoint that they did not even try it.  They would step manually through the library folders to find their documents.  I've seen libraries with 8 or more levels of folders.  Even users who tried Search were not really sure how to fine-tune their search string and so were often left with hundreds of references in their search results.

Ultimately, after stumbling through several attempted solutions, we converged on an approach that many others had proposed and decided to give it a try.  That solution was to flatten the structure of the library, eliminating most if not all of the folders, and to replace that structure with a set of metadata columns so users could sort and filter the visible documents using the sort and filter options found in the column headers of the library list view.

We experimented with a library that had a limited number of document types, but one in which every person in every department had to create new instances of those documents on a fairly regular basis.  We created one metadata column to identify the department, another to identify the type of document, and a third to identify the year the document was created.  Then we got rid of the folders and defined a set of views for each document type with groupings on department and year.  Now, instead of hunting through hundreds of documents, users could find any document of a specific type for a specific department and year within seconds.

Next we took some other libraries and did the same type of thing.  One library held our monthly newsletters.  In that library, we saved both the newsletter source documents, which happen to be Microsoft Publisher files, and the published version of the newsletter, which is a PDF.  We wanted to display the PDFs on a web page so that employees could just go out and click a PDF link to view the newsletter.  We also decided not to send copies of the PDF to our newsletter subscribers.  Rather, we would just send them a link to the current issue, reducing the storage needed on the Exchange server, which no longer had to hold emails with large newsletter attachments.

But then we ran into a problem.  In creating the page to display the newsletters, we decided to use a Content Query web part.  With this web part, we could just point to a library and display all of its contents on the page as links.  The cool part of creating the page this way is that when we add new documents to the library, they automatically appear on the web page without our having to edit the page.  Obviously, we did not want to display all of the files from the library; we only wanted to include references to the PDF files, not the PUB source files.  The obvious answer was to filter the web part on a metadata column that contained either the word Newsletter or the word Source.  Okay, the actual word does not matter.  What mattered is that we subsequently found out that simple user-defined columns could not be used to filter the data in the Content Query web part.

If we created the metadata column for the library by simply creating a new column for the library and populating the column with one of the two document types (Newsletter or Source), we could not reference that field in the filter portion of the content query web part.  After a lot of experimentation, we found that a field called Category located in the Site Collection columns could be added to our library and when we used it, the Category field appeared in the filter selection of the web part and allowed us to correctly filter the data.

For some time, we thought that was the solution.  By using this Site Collection column, we could then filter the documents in the Content Query web part.  We could even rename the column and it would still work.  One day while working on a different library, I just happened to have a field in the library defined at the Site Collection level.  I was surprised to see that field appear in the list of possible fields I could filter on in the Content Query web part.

So I tried a couple of other libraries, adding fields first to the Site Collection columns and then adding the field to the library as a new metadata column.  Each time it worked.  So for some reason that I have not been able to discover, it appeared to us that the Content Query web part filter only works on columns defined at the site collection level.

Whatever the reason, it has formed the basis of our current technique for defining metadata columns in libraries.  Now, when our group works with other teams both in and outside of IT, we help them define the metadata columns that will minimize or eliminate folders while helping them organize their documents.  If a column has the potential of being used as a filter in a Content Query web part, we first define the metadata column as a Site Collection column.  This has worked well for us since then.

Does this mean that we never use folders?  No.  However, the only reason we currently resort to folders within a library is to group files that need special permissions.  While we strongly recommend that our users not set special permissions on individual documents in a library, because this has a known negative effect on performance, we do recommend that they apply special permissions to folders and then place documents that need those permissions in that folder.

The bottom line for us is that folders are 'bad', except to provide special permissions within a library when multiple libraries are not an option.  A better way to find your documents is to use filters and sorts operating against metadata columns that categorize those files.  Furthermore, this approach minimizes the chance of creating duplicate copies of the same file.

See you next time.

Making Search a Best Bet – Part 2

Last time we talked about creating Best Bets by using the Best Bet Suggestions report.  While this is a fast way to create many Best Bets, at some point you will find yourself tweaking your Best Bets manually.  To do this, first navigate to the site collection where you want to define, or have already defined, your Best Bet.  Open the Site Actions menu and select Site Settings.  In the Site Collection Administration group click Search Keywords to open the Manage Keywords page.  Perform a search for the keyword first to make sure it is not already being used.  If the search does not find it, click the Add Keyword link to display the Edit Keyword dialog shown below.

You must enter a keyword phrase which can consist of one or more words.  You can also enter synonyms that users might use instead of the keyword to get to the same page.  Next click on the Add Best Bets link.  This shows the following dialog.

Make sure to select the radio button to add a new best bet.  Then supply the full URL to the page where you want to take the user when they select the Best Bet.  Next you must supply a Title for the Best Bet.  This can be as simple as the keyword phrase, the title of the page you are redirecting the user to, or a short description of the link.  I emphasize short because the output from this field on the search results page appears to be limited to 50 characters even though you can enter more.  You can also supply a description which appears immediately beneath the Best Bet title on the results page.  Here is where you can be creative and define in your own words where this link will take the user.  You can also simply copy the first few sentences from the redirected page if you are not interested in creating your own description.  Note that you can leave this field blank because a description is not required.

Click OK to close the Best Bet definition.  The other fields in the keyword definition area begin with a freeform multi-line text area (see Problem 5 below) and then a field for a contact person and some dates.  The Contact field is really only necessary if you implement a review policy on keywords to automatically notify the person listed here on the review date to see if the keyword is still valid, needs some changes, or should be deleted.  I also use this field to record the name of the person who recommended a keyword, whether there is a review date or not, just to document who suggested the keyword should conflicts arise with other users wanting to use the same keyword.

The dates in the publishing section allow you to define a Start and End date.  These fields are useful for creating Best Bets for special event pages or for emphasizing specific pages in the search engine based on the time of year.  The third date is the review date mentioned in the previous paragraph along with the contact person, so keywords can be periodically reviewed.  This feature is also useful for Best Bets pointing to URLs that periodically get replaced with new pages having a different name.

Once you are done defining the Best Bet, click the OK button to save the definition.  Sounds easy?  It is, but problems can still arise.  The rest of this blog describes five common problems with suggestions on how to handle them.

Problem 1: Two or more Best Bets cannot refer to the same URL (page)

This problem occurs when a page, especially the home page of a site or subsite, has more than one topic or area that users may want to search for.  The solution is to add the keyword for the new Best Bet you tried to create as a synonym of the keyword in the existing Best Bet that references the same URL.  Sort of changes the definition of synonym, doesn't it?  But it works.  So what we are saying is to use the synonym field for any word that might describe the page.

Problem 2: I need to reference more than 1 URL for a Best Bet

Any Best Bet can reference more than one URL.  To do this, go to Site Settings and select Search Keywords under the Site Collection Administration group.  Select Edit from the dropdown for the keyword to which you want to add a second URL.  On the Edit Keyword page click the Add Best Bet link in the Best Bet section of the page.  In the resulting dialog supply the second URL along with a title.  After clicking OK to close this dialog, you will see two URLs in the list of Best Bets.  Notice the Order column to the right; it determines the order in which these Best Bet URLs appear.

Problem 3: How do I fix a mistake I made in the Best Bet dialog?

Suppose that after you create your first or second Best Bet and close the dialog where you define the URL and Title, you realize you made a mistake, but there is no apparent option to go back and edit the URL or Title, only a link to remove it.  You could remove the Best Bet and start all over, but if the correction is minor you really don't want to start over.  Instead, click OK to save the definition.  Then, from the Manage Keywords page, select the primary keyword again.  This time the Best Bet definition displays both a Remove and an Edit link.  That's right: the Edit link does not appear until after you have saved something.  Just click the Edit link to display the Best Bet dialog again, make your correction, and save it again by clicking OK.

Problem 4: How do I remove a keyword I no longer want?

Navigate to the keyword and from the dropdown menu, select Delete.  If you only want to replace the keyword with one of the synonyms, edit the keyword instead to change the keyword and synonym list appropriately.  Note that moving an existing synonym phrase to the keyword field means you must also delete the phrase from the synonym field.

Problem 5: I have a Best Bet that appears in black letters instead of the hyperlink colors and it does not show a hyperlink.  How do I remove it?

This situation occurs when you supply a description on the keyword page rather than in the Best Bet dialog.  This keyword definition gets displayed at the top of the search results page along with the keyword.  They are not links because they have no associated URL.  Rather, all the Best Bets defined for the keyword appear beneath this information.  More of a concern is that even if you delete the keyword description, the keyword in black text remains at the top of the Search Results page with the other Best Bets listed beneath it, as shown in the following figure:

So how can you remove this keyword displayed above the other Best Bets?  Honestly, I have not found a good solution.  What I have been doing is documenting the best bet definition for that keyword that I want to keep, deleting that keyword, and rebuilding the keyword without the keyword definition, only defining the Best Bets.

Beware: When defining a Best Bet, SharePoint does not validate the URL entered.  Therefore it is possible to enter and save an invalid URL that may not be noticed until a user clicks on it and gets the message: The webpage cannot be found.  This is also true if the webpage is removed before removing the Best Bet.

Together with last week’s blog entry, I hope you are now ready to go to your SharePoint web sites and start defining Best Bets to help make your user’s search experience more productive.  Remember a good search experience increases user satisfaction with SharePoint and thus makes them happier about the job you are doing.

If you like what you’ve read in this site, be sure to tell your friends.  See you next time.

Who Controls Permissions On Your Site?

Today I ran into an interesting problem, but to understand it, I need to give you a little background.  As many of you know, I work for a large school district in Florida, the 10th largest school district in the country, with over 180,000 students and over 20,000 employees.  From that you can probably guess that our web site infrastructure is fairly large.  We have over 4300 sites and subsites spread across the district and school external-facing web sites, internal employee sites, and collaboration sites.  However, what I want to focus on here is just the externally facing web sites.

Our small team of 5 people obviously does not do the content for all of these sites.  We instead implemented a decentralized approach to content creation with each department and each school responsible for their own content.  In fact, the head of each department and the principal at each school is ultimately responsible for all the content on their site even though they may not be the ones to actually enter that content.

Therefore, from the very beginning, we implemented a workflow process for publishing all content that appears on public-facing web sites.  Many people in a department or school can have content creation rights, but only a select few may have content approval rights.  To control who has the ability to create content for public-facing sites, the site owner (department head or principal) has full authority over who can and cannot be content creators and approvers.  Of course, the approver role is the most important, but even content creators have the potential of overwhelming the approvers with additions and changes if everyone at a school could make changes.  However, our group does not make the decision over who can or cannot have rights.  We merely act on the requests of the owners.

You might ask, why don’t the owners of the sites manage the permissions to their own sites?  That is an excellent question.  However, from the beginning most department heads and principals did not want to be bothered with supporting that task.  This is something we are looking at changing over the balance of this year, but in the meantime, we require a simple email from the owner to identify the site URL, the name/employee ID of the person (yes, we have multiple people with the same name) who should get rights and what those rights should be.

If you are following me so far, you may say, ‘That makes sense to protect the integrity of the information displayed to the public.’  So what happened that was so interesting?

Today we had an Area Technology Coordinator argue that they should be able to tell us who to give rights to.  Okay, the term Area Technology Coordinator needs definition.  Our 170+ schools are divided into 5 areas to facilitate management.  Each middle and high school has one or more technology coordinators dedicated to that school.  At the elementary school level, one technology coordinator is shared by two schools.  Technology coordinators at a school are generally responsible for all technology at the school, mostly the computers, software, and network.  At some schools, the principal has also made them responsible for the content of the school's web site.  However, at other schools, the principal has assigned this task to other members of their staff, even teachers in some cases, since maintaining content on a SharePoint web site is typically no more difficult than working with most Microsoft Office products.  (Okay, no jokes about how hard that can be.)  Sometimes the principal changes who they want to be responsible for the web site.  For that reason, it is imperative for us to receive an email from the principal for any changes to permissions.

So now we get to the Area Technology Coordinator, who is basically a floating technology coordinator providing additional support to any school in one of our five areas.  One of these area technology coordinators wants to be the one to define who should be given rights.  Furthermore, he sent us an email asking us to give a person at one of the schools in his area rights without saying whether those rights should be just content creator, approver, or site administrator (which is a little less than owner).  When I responded that the request must come from the principal, he sent a request to the principal.  So far so good.  But then he started to give me grief about the fact that we had still not implemented the rights change even though we had yet to hear from the principal.  He further stated that we were not cooperating with him and that he wanted to know who my supervisor is.

In a way I welcome him to push his agenda forward, because I don't think the principal at that school, or the principals at any of the 170+ other schools, would want the precedent that would establish.  On the other hand, it is just one more reason why our group wants to get out of the permissions business and let the site owners take full responsibility for either managing the permissions for their site themselves or designating one or two individuals in their department or school to have that responsibility.  Our group has plenty of other work to keep us busy.  We really don't need to be the ones to manage the permissions to the sites and ensure that only owners are making permission requests.

So here is what I want to know from those faithful readers of my blog.  What do you do at your organization?  What works for you?  Ok, I know what works for small organizations may not apply to large organizations.  We encounter that all of the time.  But surely some of you either work for, consult to, or know of other large organizations and may have some suggestions.  If you do, post a comment to this blog.  I’d really like to hear what you may have to say.

Making Search a Best Bet

One of the things that can make or break a web site is how well its search works.  Five years ago, before we began our SharePoint implementation, our HTML web site had a horrible search engine.  You could search on anything and basically get the same result… nothing!  So when we introduced SharePoint, even the out-of-the-box search was a great improvement over what we had before.  But it still wasn't always giving us the results we wanted.  Of course, part of the problem is that our website is huge compared to many companies'.  Being the 10th largest school district in the country means that there are lots of departments and lots of documents stored in hundreds of libraries across the Internet and intranet sites.  In addition to those sites, we also have sites for over 180 schools.  Therefore, search often returns hundreds of results, all of which are valid, but not all equally relevant, making it a challenge to find the one you want.

So what did we do?  We started looking at creating Best Bets in SharePoint 2007.  Basically, a Best Bet is one or more keywords that, when entered into the search box, will return a pre-defined URL along with a Title and an optional description.  Best Bets appear at the top of the search results page and have a gold (yellow) star to their immediate left to call attention to them.

While they worked great, the problem in SharePoint 2007 was that they required first generating a search report which shows queries that were submitted over the report period ranked in order from the terms with the most occurrences to those with the least.  You then could take the terms you wanted to define as Best Bets and perform your own search to find where you might want to direct users when they entered those search terms.

Then you would go into Search Keywords in the Site Collection Administration group of the site collection's Site Settings.  In the resulting dialog, you could define a primary Keyword Phrase along with multiple synonyms for the search term, each separated with a semi-colon.  You could then define one or more Best Bets, or URL links, for those terms.  Best Bets could be created with no expiration, or you could place an expiration date on a Best Bet, for example one tied to an event.

Sounds great?  It was, but it also was a lot of work that our small team did not really have time for.  Along came SharePoint 2010 which we finally installed this past summer (remember we are a school district and have limited times when we can update major software applications).  With all of the upgrade activity and all the new functionality of SharePoint 2010, we did not have time to look at Best Bets until just this past week.  What we found was much of the same infrastructure for building the Best Bets, but more importantly, we found a set of built-in reports that would help us choose which Best Bets to create.

To begin, open your top-level site and click on Site Actions.  At the bottom of the dropdown menu, find and click Site Settings.  Under the Site Actions group, look for Site Collection Web Analytics Reports.  (Note: If you click on Site Web Analytics Reports instead, you will not see the reports you need.)  Notice that there is a Search section with a group of reports (as shown on the right).  Let’s first take a look at a report named Best Bet Suggestions.

This report (a sample is shown below) shows search query text and commonly selected links based on user history.  This does not necessarily mean that the Best Bet must use the suggested link, only that the suggested link has been selected in the past by users who entered that search query text.  You still need to determine whether to accept or reject the suggestion.

So what does accept the suggestion mean?  Perhaps you noticed the check box before each search query text.  The cool part of this report is that by checking the checkbox for the search query text that has a corresponding Search Result URL that looks correct to you, you can automatically create a basic Best Bet definition by opening the Analysis ribbon and clicking the Accept button in the Best Bet Action group. (Yes, there are other options in this ribbon to control the report’s output, export the report to Excel, and schedule alerts or reports on a regular basis, but we will not discuss those options here.)

If you have multiple pages in your report, you must accept best bets on a page-by-page basis.  If you move to another page, you lose the selections made on the previous pages.

After you create a basic Best Bet, you can edit it to add synonyms, descriptions, and even secondary URL links.  We will cover more on these topics next time.  But for now, let’s look at another report.

The Failed Queries report is very important because it tells you which search terms did not return any suggested URLs at all and, for those terms that did return one or more URLs, how often the user did not select any of them.  The higher the Percentage Abandoned value, the more important the need for a Best Bet for that search term, especially if the number of queries using that term is high.  The default order for this report begins with the search query text that users entered the most.  For that reason, terms near the top of the list are the most important to investigate, especially those with high abandonment rates.  The following figure is an example of this report.

As with almost all of the analytic reports, you can control the date range of the report which defaults to the last 30 days.  You can change the report period by clicking the Change Settings link in the Date Range bar at the top of the report.  You may find the top failed queries will change when you consider different date ranges for the report.  However, the top failed queries near the beginning of the report should always be your focus.  If you check this report monthly, you should see different failed queries as you add more and more Best Bets.

For failed queries that need a Best Bet and for Best Bets created using the Best Bet Suggestions report, you need to go to the Search Keywords page.  A link for Search Keywords also has been conveniently placed at the bottom of the Search section in the Site Collection Analytic reports left menu.  You can also get there by opening the Site Actions menu again and clicking on Site Settings.  You should see a section named Site Collection Administration.  Within this group are Search Settings, Search Scopes and Search Keywords.  Click Search Keywords.

The resulting list of keywords unfortunately is not sorted by keyword.  However, because best bets have not only a primary keyword, but also a set of synonyms, a better way to find whether a best bet has been defined for a keyword is to use the search option at the top of the report (see below).  For example, the search in the figure below shows a search for the word ‘Jobs’ in the list of Synonyms.  Note that you may also need to search the Keyword column (by changing the selection in the Find Keywords dropdown) to cover all possibilities.

To create a new keyword and set its properties click the Add Keyword link in the header of the report.  To edit a keyword and its properties, click on the keyword itself.

Next week in part 2, I’ll go more into some of the ‘rules’ of defining keywords and their synonyms.  Until then, have a great week.

2012 – A Year For Innovation

According to Wikipedia, innovation is the creation of better or more effective products, processes, services, technologies, or ideas.  When thinking about a New Year's resolution, I thought that making this a year of innovation would be perfect.

Think about this.  Is your job merely responding to the needs and wants of others?  Do you find yourself doing nothing more than responding to other people’s problems?  When was the last time you came up with a new idea, a new process, or a new technique that went beyond merely helping someone get their job done the same way that it has been done for months or even years?  When was the last time you were able to transform the way they do their job to save you or them time or money?

If you cannot remember, it has been too long.

Perhaps another way to look at this question is to ask whether management in your organization thinks of you, assuming they think of you at all, as merely someone who comes in and gets the job done, or as an innovator coming up with new ways to do things that can save time and money.  Why is this important?  In an economy that is still struggling, management is always looking for ways to cut costs.  If you are not providing innovation to the organization, they might just decide to outsource the work you are doing to someone who can do it cheaper or faster and eliminate the overhead of your position.  That's a scary thought.  So what can you do?

First, take a look at the job you are doing.  Is there a way you can do things better?  For example, if part of your job is to help other users with their problems using various systems in your organization, there are two ways you can change your job.  One way is to build better systems that are easier to use and have been tested more thoroughly so that they do not fail as often.  If you don't have the ability or authority to make changes to systems, can you create a self-help online document where users can find their own answers?  Perhaps all you need is a Frequently Asked Questions page.  Maybe a SharePoint wiki site could serve as a living, growing source of help for people's questions.  Publishing an in-house newsletter with tips and tricks can also, over time, help people solve their own problems.  Consider that innovation means bringing to your company something that is new to them.  You can certainly learn from what other companies and people in your industry are doing and apply what you learned to your situation.  To the rest of your company, that is still innovation.  And who knows, maybe by using some of the things you learn from others, you will come up with your own twist on a technique or procedure that goes beyond what anyone else has done.  But you have to start somewhere.

One of my New Year's resolutions this year is to focus on helping others within our organization learn how to use Pivot Tables and Pivot Charts effectively.  I intend to show them how to analyze their own data rather than rely on reports that need to be coded by a developer, assuming that development time can even be scheduled within a reasonable time frame.  Even though pivot tables and pivot charts have been around for several years, few people know how to use them.  Even fewer know how to use PowerPivot to create their own end-user business intelligence platform.  In fact, in the week before I left for Christmas vacation, I laid the groundwork for one department to look at using PowerPivot tables to replace reports for one of their systems.  We will be meeting early in January to discuss this project.  To help illustrate how pivot tables can change their world, I created a 'sample' database filled with 'generated' data in a SQL Server development system, which I then imported into a PowerPivot table using Excel 2010.  I easily came up with over a dozen different views of their data by changing the dimensions I used for the columns and rows as well as switching the measures calculated in the body of the table, all of which required no programming on the part of the 'customer'.

Another resolution is to help more people in our organization build simple forms with InfoPath to collect data for applications rather than using one of the .NET languages to write application-based forms to collect data.  I'm not saying that application development is never needed.  However, for many simpler systems, or in cases where people already have to fill in paper forms, the use of electronic forms can save time, validate entered data instantaneously, and avoid development backlogs to get the product to production sooner.  I've been in discussions with several groups to help them eliminate paper forms by going to Word and InfoPath forms to collect data.  By eliminating paper forms, we can make it easier to transfer information between people and processes.  We can eliminate the need to allocate floor space and buy additional file cabinets.  We can save time by making it easier to search for completed forms using SharePoint search.  Of course, we also reduce the amount of paper purchased and the cost of creating pre-printed forms.  And best of all, most forms don't require programming experience to develop.

Finally, I’m continuing with my resolution from last year to move people away from using folders with SharePoint libraries more than 2 levels deep.  I’ve made some progress in the past year, but not as much as I wanted.  This is a major paradigm shift for many people who have been trained over the years to store documents in nested folders.  However, I have a few ‘converts’ who are spreading the word to other departments that libraries with metadata make finding your information far easier than nested folders.

No other department in most organizations is in a better position to promote changes like the ones mentioned above than IT, which is, after all, responsible for gathering, manipulating, and reporting on data throughout the organization.  IT can use many of the tools it already has, without having to buy anything new, to turn raw data into knowledge that helps management make decisions.  In our case, the primary tools we need are Microsoft Office, SQL Server, and SharePoint, all of which we already own.

If your IT organization can provide value to your organization and not just be a cost center, then management will look at IT and you as a valuable partner in its attempt to dig out of the current economic doldrums we all seem to be in.  IT can lead your organization out of the current recession by bringing innovation to all products, processes, services, technologies, and ideas.  Won’t you join me with your own resolutions this year to innovate?